Sample records for seismic processing techniques

  1. 1D Seismic reflection technique to increase depth information in surface seismic investigations

    NASA Astrophysics Data System (ADS)

    Camilletti, Stefano; Fiera, Francesco; Umberto Pacini, Lando; Perini, Massimiliano; Prosperi, Andrea

    2017-04-01

    1D seismic methods, such as MASW, Re.Mi. and HVSR, have been extensively used in engineering investigations, bedrock research, Vs profiling and, to some extent, hydrologic applications during the past 20 years. Recent advances in equipment, sound sources and computer interpretation techniques make 1D seismic methods highly effective in shallow subsoil modeling. Classical 1D seismic surveys allow economical collection of subsurface data; however, they fail to return accurate information for depths greater than 50 meters. Using a particular acquisition technique it is possible to collect data that can be quickly processed with the reflection technique in order to obtain more accurate velocity information at depth. Furthermore, data processing returns a narrow stratigraphic section, alongside the 1D velocity model, where lithological boundaries are represented. This work shows how to collect a single CMP to determine: (1) depth of bedrock; (2) gravel layers in clayey domains; (3) an accurate Vs profile. Seismic traces were processed by means of new software developed in collaboration with SARA electronics instruments S.r.l., Perugia, Italy. This software has the great advantage that it can be used directly in the field, reducing the time elapsing between acquisition and processing.

  2. Aerospace technology can be applied to exploration 'back on earth' [offshore petroleum resources]

    NASA Technical Reports Server (NTRS)

    Jaffe, L. D.

    1977-01-01

    Applications of aerospace technology to petroleum exploration are described. Attention is given to seismic reflection techniques, sea-floor mapping, remote geochemical sensing, improved drilling methods and down-hole acoustic concepts, such as down-hole seismic tomography. The seismic reflection techniques include monitoring of swept-frequency explosive or solid-propellant seismic sources, as well as aerial seismic surveys. Telemetry and processing of seismic data may also be performed through use of aerospace technology. Sea-floor sonar imaging and a computer-aided system of geologic analogies for petroleum exploration are also considered.

  3. Fuzzy logic and image processing techniques for the interpretation of seismic data

    NASA Astrophysics Data System (ADS)

    Orozco-del-Castillo, M. G.; Ortiz-Alemán, C.; Urrutia-Fucugauchi, J.; Rodríguez-Castellanos, A.

    2011-06-01

    Since interpretation of seismic data is usually a tedious and repetitive task, the ability to do so automatically or semi-automatically has become an important objective of recent research. We believe that the vagueness and uncertainty in the interpretation process makes fuzzy logic an appropriate tool to deal with seismic data. In this work we developed a semi-automated fuzzy inference system to detect the internal architecture of a mass transport complex (MTC) in seismic images. We propose that the observed characteristics of a MTC can be expressed as fuzzy if-then rules consisting of linguistic values associated with fuzzy membership functions. The construction of the fuzzy inference system and various image processing techniques are presented. We conclude that this is a well-suited problem for fuzzy logic since the application of the proposed methodology yields a semi-automatically interpreted MTC which closely resembles the MTC from expert manual interpretation.
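
    To make the fuzzy if-then idea concrete, the following is a minimal sketch of a Mamdani-style rule applied to per-pixel seismic attributes. The attribute names ("amplitude", "chaos") and the membership breakpoints are hypothetical illustrations, not the parameters used by the authors.

```python
# Minimal fuzzy-rule sketch: IF amplitude is low AND texture is chaotic THEN MTC.
# Attribute names and membership breakpoints are assumptions for illustration.
import numpy as np

def ramp_down(x, hi):
    """Membership equal to 1 at x = 0, falling linearly to 0 at x = hi."""
    return np.clip(1.0 - x / hi, 0.0, 1.0)

def ramp_up(x, lo):
    """Membership equal to 0 at x = lo, rising linearly to 1 at x = 1."""
    return np.clip((x - lo) / (1.0 - lo), 0.0, 1.0)

def mtc_membership(amplitude, chaos):
    """Combine the two fuzzy premises with the min operator (fuzzy AND)."""
    amp_low = ramp_down(amplitude, 0.4)      # "low amplitude" fuzzy set
    chaos_high = ramp_up(chaos, 0.6)         # "chaotic texture" fuzzy set
    return np.minimum(amp_low, chaos_high)

# Per-pixel attributes normalised to [0, 1]:
amplitude = np.array([0.1, 0.5, 0.9])
chaos = np.array([0.9, 0.7, 0.2])
print(mtc_membership(amplitude, chaos))      # high only where both premises hold
```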

  4. The use of vertical seismic profiles in seismic investigations of the earth

    USGS Publications Warehouse

    Balch, Alfred H.; Lee, M.W.; Miller, J.J.; Ryder, Robert T.

    1982-01-01

    During the past 8 years, the U.S. Geological Survey has conducted an extensive investigation on the use of vertical seismic profiles (VSP) in a variety of seismic exploration applications. Seismic sources used were surface air guns, vibrators, explosives, marine air guns, and downhole air guns. Source offsets have ranged from 100 to 7800 ft. Well depths have been from 1200 to over 10,000 ft. We have found three specific ways in which VSPs can be applied to seismic exploration. First, seismic events observed at the surface of the ground can be traced, level by level, to their point of origin within the earth. Thus, one can tie a surface profile to a well log with an extraordinarily high degree of confidence. Second, one can establish the detectability of a target horizon, such as a porous zone. One can determine (either before or after surface profiling) whether or not a given horizon or layered sequence returns a detectable reflection to the surface. The amplitude and character of the reflection can also be observed. Third, acoustic properties of a stratigraphic sequence can be measured and sometimes correlated to important exploration parameters. For example, sometimes a relationship between apparent attenuation and sand percentage can be established. The technique shows additional promise of aiding surface exploration indirectly through studies of the evolution of the seismic pulse, studies of ghosts and multiples, and studies of seismic trace inversion techniques. Nearly all current seismic data-processing techniques are adaptable to the processing of VSP data, such as normal moveout (NMO) corrections, stacking, single- and multiple-channel filtering, deconvolution, and wavelet shaping.
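
    As an illustration of one of the standard processing steps named above, here is a minimal normal-moveout (NMO) correction sketch for a CMP gather. The constant-velocity assumption and array layout are simplifications for illustration only.

```python
# Minimal NMO correction sketch: flatten hyperbolic reflections in a CMP gather.
import numpy as np

def nmo_correct(gather, offsets, dt, v_nmo):
    """Shift each trace by its hyperbolic moveout time.

    gather  : (n_samples, n_traces) CMP gather
    offsets : (n_traces,) source-receiver offsets in metres
    dt      : sample interval in seconds
    v_nmo   : NMO velocity in m/s (scalar here; normally time-variant)
    """
    n_samples, n_traces = gather.shape
    t0 = np.arange(n_samples) * dt                  # zero-offset two-way times
    corrected = np.zeros_like(gather)
    for j, x in enumerate(offsets):
        t_x = np.sqrt(t0 ** 2 + (x / v_nmo) ** 2)   # hyperbolic traveltime
        corrected[:, j] = np.interp(t_x, t0, gather[:, j])
    return corrected
```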

  5. Integrating long-offset transient electromagnetics (LOTEM) with seismics in an exploration environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strack, K.M.; Vozoff, K.

    The applications of electromagnetics have increased in the past two decades because of an improved understanding of the methods, improved service availability, and the increased focus of exploration on more complex reservoir characterization issues. For electromagnetic methods, surface applications for hydrocarbon exploration and production are still a special case, while applications in borehole and airborne research and for engineering and environmental objectives are routine. In the past, electromagnetic techniques, in particular deep transient electromagnetics, made up a completely different discipline in geophysics, although many of the principles are similar to those of seismics. With an understanding of the specific problems related first to data processing and then to acquisition, the inclusion of principles learned from seismics happened almost naturally. Initially, the data processing was very similar to seismic full-waveform processing. The hardware was also changed to include multichannel acquisition systems, and the field procedures became very similar to seismic surveying. As a consequence, the integration and synergism of the interpretation process is becoming almost automatic. The long-offset transient electromagnetic (LOTEM) technique is summarized from the viewpoint of its similarity to seismics. The complete concept of the method is also reviewed. An interpretation case history that integrates seismic and LOTEM data from a hydrocarbon area in China clearly demonstrates the limitations and benefits of the method.

  6. Seismic reflection response from cross-correlations of ambient vibrations on a non-conventional hydrocarbon reservoir

    NASA Astrophysics Data System (ADS)

    Huerta, F. V.; Granados, I.; Aguirre, J.; Carrera, R. Á.

    2017-12-01

    Nowadays, the hydrocarbon industry needs to optimize and reduce exploration costs for different types of reservoirs, motivating the specialized community to search for and develop alternative exploration geophysical methods. This study shows the reflection response obtained from a shale gas/oil deposit through seismic interferometry of ambient vibrations in combination with Wavelet analysis and conventional seismic reflection techniques (CMP and NMO). The method generates seismic responses from virtual sources through cross-correlation of records of Ambient Seismic Vibrations (ASV) collected at different receivers. The seismic response obtained is interpreted as the response that would be measured at one of the receivers if a virtual source acted at the other. The ASV records were acquired in northern Mexico with semi-rectangular arrays of multi-component geophones with an instrumental response of 10 Hz. The in-line distance between geophones was 40 m and the cross-line distance was 280 m; the sampling interval was 2 ms and the total duration of the records was 6 hours. The results show the reflection response of two lines in the in-line direction and two in the cross-line direction, for which the continuity of coherent events has been identified and interpreted as reflectors. There is certainty that the identified events correspond to reflections because the time-frequency analysis performed with the Wavelet Transform has allowed us to identify the frequency band in which body waves are present. On the other hand, the CMP and NMO techniques have allowed us to emphasize and correct the reflection response obtained during the correlation process in the frequency band of interest. The processing and analysis of ASV records through the seismic interferometry method, in combination with Wavelet analysis and conventional seismic reflection techniques, yielded interesting results: it was possible to recover the seismic response for each analyzed source-receiver pair, allowing us to obtain the reflection response of each analyzed seismic line.
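
    The virtual-source construction by cross-correlation can be sketched as follows for two geophone channels sampled at the same rate. Windowing, normalisation and stacking parameters are illustrative choices, not those of the study.

```python
# Sketch of ambient-vibration interferometry: stack windowed cross-correlations
# of two records to estimate the response at B to a virtual source at A.
import numpy as np

def virtual_source_response(rec_a, rec_b, win_len, fs):
    """rec_a, rec_b: 1-D ambient-noise records; win_len in samples; fs in Hz."""
    n_win = min(len(rec_a), len(rec_b)) // win_len
    stack = np.zeros(2 * win_len - 1)
    for k in range(n_win):
        a = rec_a[k * win_len:(k + 1) * win_len]
        b = rec_b[k * win_len:(k + 1) * win_len]
        a = (a - a.mean()) / (a.std() + 1e-12)       # simple trace normalisation
        b = (b - b.mean()) / (b.std() + 1e-12)
        stack += np.correlate(b, a, mode="full")     # cross-correlate this window
    lags = np.arange(-(win_len - 1), win_len) / fs   # lag axis in seconds
    return stack / n_win, lags
```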

  7. Regional seismic lines reprocessed using post-stack processing techniques; National Petroleum Reserve, Alaska

    USGS Publications Warehouse

    Miller, John J.; Agena, W.F.; Lee, M.W.; Zihlman, F.N.; Grow, J.A.; Taylor, D.J.; Killgore, Michele; Oliver, H.L.

    2000-01-01

    This CD-ROM contains stacked, migrated, 2-Dimensional seismic reflection data and associated support information for 22 regional seismic lines (3,470 line-miles) recorded in the National Petroleum Reserve - Alaska (NPRA) from 1974 through 1981. Together, these lines constitute about one-quarter of the seismic data collected as part of the Federal Government's program to evaluate the petroleum potential of the Reserve. The regional lines, which form a grid covering the entire NPRA, were created by combining various individual lines recorded in different years using different recording parameters. These data were reprocessed by the USGS using modern, post-stack processing techniques, to create a data set suitable for interpretation on interactive seismic interpretation computer workstations. Reprocessing was done in support of ongoing petroleum resource studies by the USGS Energy Program. The CD-ROM contains the following files: 1) 22 files containing the digital seismic data in standard SEG-Y format; 2) 1 file containing navigation data for the 22 lines in standard SEG-P1 format; 3) 22 small scale graphic images of each seismic line in Adobe Acrobat PDF format; 4) a graphic image of the location map, generated from the navigation file, with hyperlinks to the graphic images of the seismic lines; 5) an ASCII text file with cross-reference information for relating the sequential trace numbers on each regional line to the line number and shotpoint number of the original component lines; and 6) an explanation of the processing used to create the final seismic sections (this document). The SEG-Y format seismic files and SEG-P1 format navigation file contain all the information necessary for loading the data onto a seismic interpretation workstation.

  8. Visualization of volumetric seismic data

    NASA Astrophysics Data System (ADS)

    Spickermann, Dela; Böttinger, Michael; Ashfaq Ahmed, Khawar; Gajewski, Dirk

    2015-04-01

    Mostly driven by demands of high quality subsurface imaging, highly specialized tools and methods have been developed to support the processing, visualization and interpretation of seismic data. 3D seismic data acquisition and 4D time-lapse seismic monitoring are well-established techniques in academia and industry, producing large amounts of data to be processed, visualized and interpreted. In this context, interactive 3D visualization methods have proved to be valuable for the analysis of 3D seismic data cubes, especially for sedimentary environments with continuous horizons. In crystalline and hard rock environments, where hydraulic stimulation techniques may be applied to produce geothermal energy, interpretation of the seismic data is a more challenging problem. Instead of continuous reflection horizons, the imaging targets are often steeply dipping faults, causing many diffractions. Without further preprocessing these geological structures are often hidden behind the noise in the data. In this PICO presentation we will present a workflow consisting of data processing steps, which enhance the signal-to-noise ratio, followed by a visualization step based on the use of the commercially available general purpose 3D visualization system Avizo. Specifically, we have used Avizo Earth, an extension to Avizo, which supports the import of seismic data in SEG-Y format and offers easy access to state-of-the-art 3D visualization methods at interactive frame rates, even for large seismic data cubes. In seismic interpretation using visualization, interactivity is a key requirement for understanding complex 3D structures. In order to enable an easy communication of the insights gained during the interactive visualization process, animations of the visualized data were created which support the spatial understanding of the data.

  9. Noise suppression in surface microseismic data by τ-p transform

    USGS Publications Warehouse

    Forghani-Arani, Farnoush; Batzle, Mike; Behura, Jyoti; Willis, Mark; Haines, Seth; Davidson, Michael

    2013-01-01

    Surface passive seismic methods are receiving increased attention for monitoring changes in reservoirs during the production of unconventional oil and gas. However, in passive seismic data the strong cultural and ambient noise (mainly surface-waves) decreases the effectiveness of these techniques. Hence, suppression of surface-waves is a critical step in surface microseismic monitoring. We apply a noise suppression technique, based on the τ-p transform, to a surface passive seismic dataset recorded over a Barnett Shale reservoir undergoing a hydraulic fracturing process. This technique not only improves the signal-to-noise ratios of added synthetic microseismic events, but it also preserves the event waveforms.
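
    A minimal linear τ-p (slant-stack) sketch of the kind used to separate surface waves from body waves is given below; the slowness range and any muting strategy are illustrative choices, not those of the cited study.

```python
# Forward slant stack: sum each trace along lines t = tau + p * x.
# Surface waves map to large slownesses p, where they can be muted before
# an inverse transform reconstructs the de-noised gather.
import numpy as np

def tau_p_transform(gather, offsets, dt, slownesses):
    """gather: (n_samples, n_traces); offsets in metres; slownesses in s/m."""
    n_samples, n_traces = gather.shape
    t = np.arange(n_samples) * dt
    panel = np.zeros((n_samples, len(slownesses)))
    for ip, p in enumerate(slownesses):
        for j, x in enumerate(offsets):
            # read each trace at the shifted times tau + p * x
            shifted = np.interp(t + p * x, t, gather[:, j], left=0.0, right=0.0)
            panel[:, ip] += shifted
    return panel
```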

  10. Sub-basalt Imaging of Hydrocarbon-Bearing Mesozoic Sediments Using Ray-Trace Inversion of First-Arrival Seismic Data and Elastic Finite-Difference Full-Wave Modeling Along Sinor-Valod Profile of Deccan Syneclise, India

    NASA Astrophysics Data System (ADS)

    Talukdar, Karabi; Behera, Laxmidhar

    2018-03-01

    Imaging below the basalt for hydrocarbon exploration is a global problem because of poor penetration and significant loss of seismic energy due to scattering, attenuation, absorption and mode-conversion when the seismic waves encounter a highly heterogeneous and rugose basalt layer. The conventional (short offset) seismic data acquisition, processing and modeling techniques adopted by the oil industry generally fail to image hydrocarbon-bearing sub-trappean Mesozoic sediments hidden below the basalt, and this is considered a serious problem for hydrocarbon exploration worldwide. To overcome this difficulty of sub-basalt imaging, we have generated dense synthetic seismic data with the help of elastic finite-difference full-wave modeling using a staggered-grid scheme for the model derived from ray-trace inversion of sparse wide-angle seismic data acquired along the Sinor-Valod profile in the Deccan Volcanic Province of India. The full-wave synthetic seismic data generated have been processed and imaged using conventional seismic data processing techniques with Kirchhoff pre-stack time and depth migrations. The seismic image obtained correlates with all the structural features of the model obtained through ray-trace inversion of wide-angle seismic data, validating the effectiveness of the robust elastic finite-difference full-wave modeling approach for imaging below thick basalts. Using the full-wave modeling also allows us to decipher small-scale heterogeneities imposed in the model as a measure of the rugose basalt interfaces, which could not be dealt with by ray-trace inversion. Furthermore, we were able to accurately image thin low-velocity hydrocarbon-bearing Mesozoic sediments sandwiched between and hidden below two thick sequences of high-velocity basalt layers lying above the basement.
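
    The time-stepping idea behind staggered-grid velocity-stress modelling can be illustrated with a minimal 1-D sketch; the actual study uses 2-D/3-D elastic modelling with absorbing boundaries, so the grid size, wavelet and homogeneous medium here are assumptions for illustration only.

```python
# Minimal 1-D velocity-stress staggered-grid finite-difference sketch.
import numpy as np

def fd_1d_staggered(nx=2000, nt=3000, dx=5.0, dt=5e-4, vp=3000.0, rho=2200.0):
    """Propagate a P wave through a homogeneous 1-D medium (no absorbing edges)."""
    pmod = rho * vp ** 2                 # P-wave modulus (lambda + 2*mu)
    v = np.zeros(nx)                     # particle velocity at integer nodes
    s = np.zeros(nx)                     # stress at half-integer nodes
    src, rec = nx // 2, nx // 2 + 200    # source and "receiver" indices
    f0, t0 = 25.0, 0.04                  # Ricker centre frequency and delay
    seismogram = np.zeros(nt)
    for it in range(nt):
        arg = (np.pi * f0 * (it * dt - t0)) ** 2
        s[src] += (1.0 - 2.0 * arg) * np.exp(-arg)       # inject Ricker wavelet
        v[1:] += dt / rho * (s[1:] - s[:-1]) / dx        # velocity update
        s[:-1] += dt * pmod * (v[1:] - v[:-1]) / dx      # stress update
        seismogram[it] = v[rec]                          # record at the receiver
    return seismogram
```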

  11. Mapping Diffuse Seismicity Using Empirical Matched Field Processing Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, J; Templeton, D C; Harris, D B

    The objective of this project is to detect and locate more microearthquakes using the empirical matched field processing (MFP) method than can be detected using only conventional earthquake detection techniques. We propose that empirical MFP can complement existing catalogs and techniques. We test our method on continuous seismic data collected at the Salton Sea Geothermal Field during November 2009 and January 2010. In the Southern California Earthquake Data Center (SCEDC) earthquake catalog, 619 events were identified in our study area during this time frame, whereas our MFP technique identified 1094 events. Therefore, we believe that the empirical MFP method combined with conventional methods significantly improves the network detection ability in an efficient manner.
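
    Conceptually, matched field processing compares the observed array data vector at a given frequency against "replica" vectors for candidate source locations; in the empirical variant the replicas are built from master events. The sketch below uses a Bartlett processor and placeholder data, and is an assumption-laden illustration rather than the authors' implementation.

```python
# Bartlett matched-field sketch: pick the candidate location whose replica
# vector best matches the observed array spectrum at one frequency.
import numpy as np

def mfp_locate(data_fft, replicas):
    """data_fft: (n_stations,) complex spectrum of the observed window.
    replicas : (n_locations, n_stations) complex replica vectors."""
    d = data_fft / (np.linalg.norm(data_fft) + 1e-12)
    powers = []
    for r in replicas:
        r = r / (np.linalg.norm(r) + 1e-12)
        powers.append(np.abs(np.vdot(r, d)) ** 2)    # Bartlett power for this location
    powers = np.array(powers)
    return int(np.argmax(powers)), powers
```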

  12. Instant Variations in Velocity and Attenuation of Seismic Waves in a Friable Medium Under a Vibrational Dynamic Loading

    NASA Astrophysics Data System (ADS)

    Geza, N.; Yushin, V.

    2007-12-01

    Instant variations of the velocities and attenuation of seismic waves in a friable medium subjected to dynamic loading have been studied with new experimental techniques using a powerful seismic vibrator. The half-space below the operating vibrator baseplate was scanned by high-frequency elastic waves, and the recorded fluctuations were subjected to a stroboscopic analysis. It was found that the variations of seismic velocities and attenuation are synchronous with the external vibrational load but have a phase shift relative to it. Instant variations of the seismic wave parameters depend on the magnitude and absolute value of deformation, which generally results in a decrease of the elastic-wave velocities. The new experimental techniques have a high sensitivity to dynamic disturbances in the medium and allow one to detect weak seismic boundaries. The relaxation process after dynamic vibrational loading was investigated, and the results of this research are presented.

  13. Free Surface Downgoing VSP Multiple Imaging

    NASA Astrophysics Data System (ADS)

    Maula, Fahdi; Dac, Nguyen

    2018-03-01

    The common usage of a vertical seismic profile (VSP) is to capture the reflection wavefield (upgoing wavefield) so that it can be used for well ties or other interpretations. Borehole seismic (VSP) receivers capture reflections from below the well trajectory; traditionally, no seismic image information is obtained above the trajectory. A non-traditional way of processing VSP multiples can be used to expand the imaging above the well trajectory. This paper presents a case study of using VSP downgoing multiples for such non-traditional imaging applications. In general VSP processing, upgoing and downgoing arrivals are separated. The upgoing wavefield is used for subsurface illumination, whereas the downgoing wavefield and multiples are normally excluded from the processing. In a situation where the downgoing wavefield passes the reflectors several times (multiples), the downgoing wavefield carries reflection information. Its benefit is that it can be used for seismic ties up to the seabed and, potentially, for shallow hazard identification. One of the concepts of downgoing imaging is widely known as the mirror-imaging technique. This paper presents a case study from deep water offshore Vietnam, demonstrating the robustness of the technique and the limitations encountered during its processing.

  14. Investigation of the detection of shallow tunnels using electromagnetic and seismic waves

    NASA Astrophysics Data System (ADS)

    Counts, Tegan; Larson, Gregg; Gürbüz, Ali Cafer; McClellan, James H.; Scott, Waymond R., Jr.

    2007-04-01

    Multimodal detection of subsurface targets such as tunnels, pipes, reinforcement bars, and structures has been investigated using both ground-penetrating radar (GPR) and seismic sensors with signal processing techniques to enhance localization capabilities. Both systems have been tested in bi-static configurations but the GPR has been expanded to a multi-static configuration for improved performance. The use of two compatible sensors that sense different phenomena (GPR detects changes in electrical properties while the seismic system measures mechanical properties) increases the overall system's effectiveness in a wider range of soils and conditions. Two experimental scenarios have been investigated in a laboratory model with nearly homogeneous sand. Images formed from the raw data have been enhanced using beamforming inversion techniques and Hough Transform techniques to specifically address the detection of linear targets. The processed data clearly indicate the locations of the buried targets of various sizes at a range of depths.

  15. Time-lapse seismic waveform modelling and attribute analysis using hydromechanical models for a deep reservoir undergoing depletion

    NASA Astrophysics Data System (ADS)

    He, Y.-X.; Angus, D. A.; Blanchard, T. D.; Wang, G.-L.; Yuan, S.-Y.; Garcia, A.

    2016-04-01

    Extraction of fluids from subsurface reservoirs induces changes in pore pressure, leading not only to geomechanical changes, but also perturbations in seismic velocities and hence observable seismic attributes. Time-lapse seismic analysis can be used to estimate changes in subsurface hydromechanical properties and thus act as a monitoring tool for geological reservoirs. The ability to observe and quantify changes in fluid, stress and strain using seismic techniques has important implications for monitoring risk not only for petroleum applications but also for geological storage of CO2 and nuclear waste scenarios. In this paper, we integrate hydromechanical simulation results with rock physics models and full-waveform seismic modelling to assess time-lapse seismic attribute resolution for dynamic reservoir characterization and hydromechanical model calibration. The time-lapse seismic simulations use a dynamic elastic reservoir model based on a North Sea deep reservoir undergoing large pressure changes. The time-lapse seismic traveltime shifts and time strains calculated from the modelled and processed synthetic data sets (i.e. pre-stack and post-stack data) are in a reasonable agreement with the true earth models, indicating the feasibility of using 1-D strain rock physics transform and time-lapse seismic processing methodology. Estimated vertical traveltime shifts for the overburden and the majority of the reservoir are within ±1 ms of the true earth model values, indicating that the time-lapse technique is sufficiently accurate for predicting overburden velocity changes and hence geomechanical effects. Characterization of deeper structure below the overburden becomes less accurate, where more advanced time-lapse seismic processing and migration is needed to handle the complex geometry and strong lateral induced velocity changes. Nevertheless, both migrated full-offset pre-stack and near-offset post-stack data image the general features of both the overburden and reservoir units. More importantly, the results from this study indicate that integrated seismic and hydromechanical modelling can help constrain time-lapse uncertainty and hence reduce risk due to fluid extraction and injection.
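
    The traveltime shifts central to the analysis above are commonly estimated by windowed cross-correlation of baseline and monitor traces. The sketch below illustrates that idea; the window length and parabolic sub-sample refinement are illustrative choices, not the authors' exact workflow.

```python
# Estimate time-lapse traveltime shifts by windowed cross-correlation of a
# baseline and a monitor trace.
import numpy as np

def traveltime_shifts(base, monitor, dt, win=64, step=32):
    """Return window-centre times and estimated shifts (both in seconds)."""
    centres, shifts = [], []
    for i0 in range(0, len(base) - win, step):
        b = base[i0:i0 + win]
        m = monitor[i0:i0 + win]
        xc = np.correlate(m, b, mode="full")
        k = int(np.argmax(xc))
        # parabolic interpolation around the peak for sub-sample precision
        if 0 < k < len(xc) - 1:
            denom = xc[k - 1] - 2 * xc[k] + xc[k + 1]
            frac = 0.5 * (xc[k - 1] - xc[k + 1]) / denom if denom != 0 else 0.0
        else:
            frac = 0.0
        shifts.append(((k - (win - 1)) + frac) * dt)
        centres.append((i0 + win / 2) * dt)
    return np.array(centres), np.array(shifts)
```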

  16. Improving Vintage Seismic Data Quality through Implementation of Advance Processing Techniques

    NASA Astrophysics Data System (ADS)

    Latiff, A. H. Abdul; Boon Hong, P. G.; Jamaludin, S. N. F.

    2017-10-01

    It is essential in petroleum exploration to have high-resolution subsurface images, both vertically and horizontally, to uncover new geological and geophysical aspects of the subsurface. A lack of success may stem from poor imaging quality, which leads to inaccurate analysis and interpretation. In this work, we re-processed an existing seismic dataset with an emphasis on two objectives. The first was to produce better 3D seismic data quality with full retention of relative amplitudes and a significant reduction of seismic and structural uncertainty. The second was to facilitate further prospect delineation through enhanced data resolution, fault definition and event continuity, particularly in the syn-rift section and at basement-cover contacts, and in turn to better understand the geology of the subsurface, especially the distribution of the fluvial and channel sands. By adding recent, state-of-the-art broadband processing techniques such as source and receiver de-ghosting, high-density velocity analysis and shallow-water de-multiple, the final results showed better overall reflection detail and frequency content in specific target zones, particularly in the deeper section.

  17. Fallon, Nevada FORGE Seismic Reflection Profiles

    DOE Data Explorer

    Blankenship, Doug; Faulds, James; Queen, John; Fortuna, Mark

    2018-02-01

    Newly reprocessed Naval Air Station Fallon (1994) seismic lines: pre-stack depth migrations, with interpretations to support the Fallon FORGE (Phase 2B) 3D Geologic model. Data along seven profiles (>100 km of total profile length) through and adjacent to the Fallon site were re-processed. The most up-to-date, industry-tested seismic processing techniques were utilized to improve the signal strength and coherency in the sedimentary, volcanic, and Mesozoic crystalline basement sections, in conjunction with fault diffractions in order to improve the identification and definition of faults within the study area.

  18. Automatic Classification of volcano-seismic events based on Deep Neural Networks.

    NASA Astrophysics Data System (ADS)

    Titos Luzón, M.; Bueno Rodriguez, A.; Garcia Martinez, L.; Benitez, C.; Ibáñez, J. M.

    2017-12-01

    Seismic monitoring of active volcanoes is a popular remote sensing technique to detect seismic activity, often associated with energy exchanges between the volcano and the environment. As a result, seismographs register a wide range of volcano-seismic signals that reflect the nature and underlying physics of volcanic processes. Machine learning and signal processing techniques provide an appropriate framework to analyze such data. In this research, we propose a new classification framework for seismic events based on deep neural networks. Deep neural networks are composed of multiple processing layers and can discover intrinsic patterns from the data itself. Internal parameters can be initialized using a greedy unsupervised pre-training stage, leading to efficient training of fully connected architectures. We aim to determine the robustness of these architectures as classifiers of seven different types of seismic events recorded at "Volcán de Fuego" (Colima, Mexico). Two deep neural networks with different pre-training strategies are studied: stacked denoising autoencoders and deep belief networks. Results are compared to existing machine learning algorithms (SVM, Random Forest, Multilayer Perceptron). We used 5 LPC coefficients over three non-overlapping segments as training features in order to characterize temporal evolution, avoid redundancy and encode the signal regardless of its duration. Experimental results show that deep architectures can classify seismic events with higher accuracy than classical algorithms, attaining up to 92% recognition accuracy. Pre-training initialization helps these models to detect events that occur simultaneously in time (such as explosions and rockfalls), increases robustness against noisy inputs, and provides better generalization. These results demonstrate that deep neural networks are robust classifiers and can be deployed in real environments to monitor the seismicity of restless volcanoes.
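
    The feature extraction described above (5 LPC coefficients per non-overlapping segment) can be sketched as follows; the classifier placed on top is only hinted at in a comment, since the paper's architectures are stacked denoising autoencoders and deep belief networks rather than the stand-in shown here.

```python
# Autocorrelation-method LPC features over non-overlapping segments of a trace.
import numpy as np

def lpc_coeffs(x, order=5):
    """LPC coefficients of a 1-D signal segment (autocorrelation method)."""
    x = x - x.mean()
    r = np.correlate(x, x, mode="full")[len(x) - 1:len(x) + order]   # lags 0..order
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R + 1e-9 * np.eye(order), r[1:order + 1])

def event_features(trace, n_segments=3, order=5):
    """Concatenate LPC coefficients from non-overlapping thirds of the trace."""
    segs = np.array_split(trace, n_segments)
    return np.concatenate([lpc_coeffs(s, order) for s in segs])

# A hypothetical classifier on top of these features, for illustration only:
# from sklearn.neural_network import MLPClassifier
# clf = MLPClassifier(hidden_layer_sizes=(64, 64)).fit(X_train, y_train)
```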

  19. Characterization of a complex near-surface structure using well logging and passive seismic measurements

    NASA Astrophysics Data System (ADS)

    Benjumea, Beatriz; Macau, Albert; Gabàs, Anna; Figueras, Sara

    2016-04-01

    We combine geophysical well logging and passive seismic measurements to characterize the near-surface geology of an area located in Hontomin, Burgos (Spain). This area poses some near-surface challenges for a geophysical study. The irregular topography is characterized by limestone outcrops and areas of unconsolidated sediments. Additionally, the near-surface geology includes an upper layer of pure limestones overlying marly limestones and marls (Upper Cretaceous). These materials lie on top of Lower Cretaceous siliciclastic sediments (sandstones, clays, gravels). In any case, a layer with reduced velocity is expected. The geophysical data sets used in this study include sonic and gamma-ray logs at two boreholes and passive seismic measurements: three arrays and 224 seismic stations for applying the horizontal-to-vertical amplitude spectral ratio method (H/V). Well-logging data define two significant changes in the P-wave-velocity log within the Upper Cretaceous layer and one more at the Upper to Lower Cretaceous contact. This technique has also been used for refining the geological interpretation. The passive seismic measurements provide a map of sediment thickness, with a maximum of around 40 m, and shear-wave velocity profiles from the array technique. A comparison between the seismic velocities from well logging and from the array measurements defines the resolution limits of the passive seismic techniques and aids their interpretation. This study shows how these low-cost techniques can provide useful information about near-surface complexity that could be used for designing a geophysical field survey or for seismic processing steps such as statics or imaging.
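
    A minimal H/V spectral ratio sketch for a single three-component record is shown below; smoothing, windowing and window selection are simplified relative to standard practice.

```python
# Horizontal-to-vertical spectral ratio (H/V) for one three-component record.
import numpy as np

def hv_ratio(north, east, vertical, fs):
    """Return frequencies and the H/V amplitude spectral ratio."""
    n = len(vertical)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spec_n = np.abs(np.fft.rfft(north * np.hanning(n)))
    spec_e = np.abs(np.fft.rfft(east * np.hanning(n)))
    spec_v = np.abs(np.fft.rfft(vertical * np.hanning(n)))
    horizontal = np.sqrt(0.5 * (spec_n ** 2 + spec_e ** 2))  # quadratic mean of H
    return freqs, horizontal / (spec_v + 1e-12)

# The frequency of the main H/V peak relates to sediment thickness h through
# f0 ~ Vs / (4 h) for a soft layer of shear velocity Vs over stiffer basement.
```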

  20. Virtual and super-virtual refraction method: Application to synthetic data and 2012 Karangsambung survey data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nugraha, Andri Dian; Adisatrio, Philipus Ronnie

    2013-09-09

    Seismic refraction surveying is a geophysical method useful for imaging the Earth's interior, particularly the near surface. One of the common problems in seismic refraction surveys is weak amplitude due to attenuation at far offsets. This phenomenon makes it difficult to pick the first refraction arrival and hence challenging to produce the near-surface image. Seismic interferometry is a new technique that manipulates seismic traces to obtain the Green's function of a receiver pair. One of its uses is improving first refraction arrival quality at far offsets. This research shows that we could estimate physical properties such as seismic velocity and thickness from virtual refraction processing. Also, virtual refraction can enhance the far-offset signal amplitude since a stacking procedure is involved. Our results show that super-virtual refraction processing produces a seismic image with a higher signal-to-noise ratio than the raw seismic image. In the end, the number of reliable first-arrival picks is also increased.

  1. Pre-processing ambient noise cross-correlations with equalizing the covariance matrix eigenspectrum

    NASA Astrophysics Data System (ADS)

    Seydoux, Léonard; de Rosny, Julien; Shapiro, Nikolai M.

    2017-09-01

    Passive imaging techniques from ambient seismic noise require a nearly isotropic distribution of the noise sources in order to ensure reliable traveltime measurements between seismic stations. However, real ambient seismic noise often only partially fulfils this condition. It is generated in preferential areas (in the deep ocean or near continental shores), and some highly coherent pulse-like signals may be present in the data, such as those generated by earthquakes. Several pre-processing techniques have been developed in order to attenuate the directional and deterministic behaviour of this real ambient noise. Most of them are applied to individual seismograms before cross-correlation computation. The most widely used techniques are spectral whitening and temporal smoothing of the individual seismic traces. We here propose an additional pre-processing step to be used together with the classical ones, based on the spatial analysis of the seismic wavefield. We compute the cross-spectra between all available station pairs in the spectral domain, leading to the data covariance matrix. We apply a one-bit normalization to the covariance matrix eigenspectrum before extracting the cross-correlations in the time domain. The efficiency of the method is shown with several numerical tests. We apply the method to data collected by USArray when the M8.8 Maule earthquake occurred on 2010 February 27. The method shows a clear improvement compared with classical equalization in attenuating the highly energetic and coherent waves incoming from the earthquake, and allows reliable traveltime measurements even in the presence of the earthquake.
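
    The covariance-matrix equalization step can be sketched as below for one frequency: build the cross-spectral matrix from sub-window spectra, flatten (one-bit normalize) its eigenvalue spectrum, and rebuild the matrix before extracting time-domain cross-correlations. Averaging over frequencies and the sub-window bookkeeping are omitted here.

```python
# Equalize the eigenspectrum of the array covariance (cross-spectral) matrix.
import numpy as np

def equalize_covariance(spectra, tol=1e-8):
    """spectra: (n_stations, n_windows) complex Fourier coefficients at one frequency."""
    n_windows = spectra.shape[1]
    cov = spectra @ spectra.conj().T / n_windows   # data covariance matrix
    w, v = np.linalg.eigh(cov)                     # Hermitian eigen-decomposition
    w_eq = (w > tol * w.max()).astype(float)       # one-bit (flattened) eigenspectrum
    return (v * w_eq) @ v.conj().T                 # equalized covariance matrix
```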

  2. Development of seismic tomography software for hybrid supercomputers

    NASA Astrophysics Data System (ADS)

    Nikitin, Alexandr; Serdyukov, Alexandr; Duchkov, Anton

    2015-04-01

    Seismic tomography is a technique used for computing velocity model of geologic structure from first arrival travel times of seismic waves. The technique is used in processing of regional and global seismic data, in seismic exploration for prospecting and exploration of mineral and hydrocarbon deposits, and in seismic engineering for monitoring the condition of engineering structures and the surrounding host medium. As a consequence of development of seismic monitoring systems and increasing volume of seismic data, there is a growing need for new, more effective computational algorithms for use in seismic tomography applications with improved performance, accuracy and resolution. To achieve this goal, it is necessary to use modern high performance computing systems, such as supercomputers with hybrid architecture that use not only CPUs, but also accelerators and co-processors for computation. The goal of this research is the development of parallel seismic tomography algorithms and software package for such systems, to be used in processing of large volumes of seismic data (hundreds of gigabytes and more). These algorithms and software package will be optimized for the most common computing devices used in modern hybrid supercomputers, such as Intel Xeon CPUs, NVIDIA Tesla accelerators and Intel Xeon Phi co-processors. In this work, the following general scheme of seismic tomography is utilized. Using the eikonal equation solver, arrival times of seismic waves are computed based on assumed velocity model of geologic structure being analyzed. In order to solve the linearized inverse problem, tomographic matrix is computed that connects model adjustments with travel time residuals, and the resulting system of linear equations is regularized and solved to adjust the model. The effectiveness of parallel implementations of existing algorithms on target architectures is considered. During the first stage of this work, algorithms were developed for execution on supercomputers using multicore CPUs only, with preliminary performance tests showing good parallel efficiency on large numerical grids. Porting of the algorithms to hybrid supercomputers is currently ongoing.
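
    The regularized linear update described above (a tomographic matrix linking model adjustments to traveltime residuals) can be written, in its simplest damped least-squares form, as the sketch below; the damping value and dense solve are illustrative simplifications of what a production code on a hybrid supercomputer would use.

```python
# Damped least-squares tomographic update: solve (G^T G + eps^2 I) dm = G^T dt.
import numpy as np

def damped_lsq_update(G, residuals, eps=0.1):
    """G: (n_rays, n_cells) ray-path lengths; residuals: (n_rays,) traveltime misfits."""
    n_cells = G.shape[1]
    lhs = G.T @ G + (eps ** 2) * np.eye(n_cells)
    rhs = G.T @ residuals
    return np.linalg.solve(lhs, rhs)   # slowness perturbation per model cell
```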

  3. Integral Analysis of Seismic Refraction and Ambient Vibration Survey for Subsurface Profile Evaluation

    NASA Astrophysics Data System (ADS)

    Hazreek, Z. A. M.; Kamarudin, A. F.; Rosli, S.; Fauziah, A.; Akmal, M. A. K.; Aziman, M.; Azhar, A. T. S.; Ashraf, M. I. M.; Shaylinda, M. Z. N.; Rais, Y.; Ishak, M. F.; Alel, M. N. A.

    2018-04-01

    Geotechnical site investigation, also known as subsurface profile evaluation, is the process of determining subsurface layer characteristics that are finally used for the design and construction phases. Traditionally, site investigation was performed using drilling techniques and thus suffers from several limitations related to cost, time, data coverage and sustainability. In order to overcome those problems, this study adopted surface techniques using the seismic refraction and ambient vibration methods for subsurface profile depth evaluation. Seismic refraction data acquisition and processing were performed using ABEM Terraloc and OPTIM software, respectively. Ambient vibration data acquisition and processing were performed using CityShark II, Lennartz and GEOPSY software, respectively. It was found that the studied area consists of two layers representing overburden and bedrock geomaterials, based on the p-wave velocity values (vp = 300 – 2500 m/s and vp > 2500 m/s) and natural frequency values (Fo = 3.37 – 3.90 Hz) analyzed. Further analysis found that both methods show good similarity in terms of depth and thickness, with percentage accuracy at 60 – 97%. Consequently, this study has demonstrated that the application of the seismic refraction and ambient vibration methods is suitable for subsurface profile depth and thickness estimation. Moreover, the surface techniques adopted in this study, which are considered non-destructive methods, were able to complement the conventional drilling method in terms of cost, time, data coverage and environmental sustainability.

  4. Data Processing Methods for 3D Seismic Imaging of Subsurface Volcanoes: Applications to the Tarim Flood Basalt.

    PubMed

    Wang, Lei; Tian, Wei; Shi, Yongmin

    2017-08-07

    The morphology and structure of plumbing systems can provide key information on the eruption rate and style of basalt lava fields. The most powerful way to study subsurface geo-bodies is to use industrial 3D reflection seismological imaging. However, strategies to image subsurface volcanoes are very different from those for oil and gas reservoirs. In this study, we process seismic data cubes from the Northern Tarim Basin, China, to illustrate how to visualize sills through opacity rendering techniques and how to image the conduits by time-slicing. In the first case, we isolated probes by the seismic horizons marking the contacts between sills and encasing strata, applying opacity rendering techniques to extract sills from the seismic cube. The resulting detailed sill morphology shows that the flow direction is from the dome center to the rim. In the second seismic cube, we use time-slices to image the conduits, which correspond to marked discontinuities within the encasing rocks. A set of time-slices obtained at different depths shows that the Tarim flood basalts erupted from central volcanoes, fed by separate pipe-like conduits.

  5. Broadband seismic: case study modeling and data processing

    NASA Astrophysics Data System (ADS)

    Cahyaningtyas, M. B.; Bahar, A.

    2018-03-01

    Seismic data with a wide range of frequencies are needed because of the close relation of frequency content to resolution and to the depth of the target. Low frequencies provide deeper penetration for imaging deep targets. In addition, the wider the frequency bandwidth, the sharper the wavelet. A sharp wavelet is responsible for high-resolution imaging and is very helpful for resolving thin beds. As a result, the demand for broadband seismic data is rising, spurring the development of broadband seismic technology in the oil and gas industry. An obstacle frequently found in marine seismic data is the existence of ghosts, which limit the frequency bandwidth contained in the seismic data. To reduce the ghost effect and acquire broadband seismic data, many approaches are used, both in the acquisition and in the processing of seismic data. One acquisition technique applied is the multi-level streamer, where streamers are towed at several depth levels. A multi-level streamer yields data with varied ghost notches in the frequency domain. If the ghost notches do not overlap, the summation of multi-level streamer data reduces the ghost effect. The result of the multi-level streamer data processing shows that a reduction of ghost notches in the frequency domain indeed takes place.

  6. Optimized suppression of coherent noise from seismic data using the Karhunen-Loève transform

    NASA Astrophysics Data System (ADS)

    Montagne, Raúl; Vasconcelos, Giovani L.

    2006-07-01

    Signals obtained in land seismic surveys are usually contaminated with coherent noise, among which the ground roll (Rayleigh surface waves) is of major concern for it can severely degrade the quality of the information obtained from the seismic record. This paper presents an optimized filter based on the Karhunen-Loève transform for processing seismic images contaminated with ground roll. In this method, the contaminated region of the seismic record, to be processed by the filter, is selected in such way as to correspond to the maximum of a properly defined coherence index. The main advantages of the method are that the ground roll is suppressed with negligible distortion of the remnant reflection signals and that the filtering procedure can be automated. The image processing technique described in this study should also be relevant for other applications where coherent structures embedded in a complex spatiotemporal pattern need to be identified in a more refined way. In particular, it is argued that the method is appropriate for processing optical coherence tomography images whose quality is often degraded by coherent noise (speckle).
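
    The eigenimage idea behind Karhunen-Loève filtering can be sketched as follows: within the selected window, the most coherent energy (assumed here to be the aligned ground roll) is captured by the leading eigenimages and subtracted. The number of eigenimages removed is an illustrative choice, and the coherence-index window selection described in the paper is not reproduced.

```python
# Karhunen-Loève (eigenimage) filtering of a windowed seismic panel via SVD.
import numpy as np

def kl_filter(panel, n_remove=2):
    """panel: (n_samples, n_traces) window of the record; returns filtered window."""
    U, s, Vt = np.linalg.svd(panel, full_matrices=False)
    s_noise = s.copy()
    s_noise[n_remove:] = 0.0            # keep only the leading eigenimages
    coherent = (U * s_noise) @ Vt       # estimate of the coherent (ground-roll) energy
    return panel - coherent             # suppress it from the window
```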

  7. A Fiber-Optic Borehole Seismic Vector Sensor System for Geothermal Site Characterization and Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paulsson, Bjorn N.P.; Thornburg, Jon A.; He, Ruiqing

    2015-04-21

    Seismic techniques are the dominant geophysical techniques for the characterization of subsurface structures and stratigraphy. The seismic techniques also dominate the monitoring and mapping of reservoir injection and production processes. Borehole seismology, of all the seismic techniques, despite its current shortcomings, has been shown to provide the highest resolution characterization and most precise monitoring results because it generates higher signal-to-noise ratio and higher frequency data than surface seismic techniques. The operational environments for borehole seismic instruments are, however, much more demanding than for surface seismic instruments, making both the instruments and the installation much more expensive. The current state-of-the-art borehole seismic instruments have not been robust enough for long term monitoring, compounding the problems with expensive instruments and installations. Furthermore, they have not been able to record the large bandwidth data available in boreholes or had the sensitivity to record small high-frequency microseismic events with high vector fidelity. To reliably achieve high resolution characterization and long term monitoring of Enhanced Geothermal Systems (EGS) sites, a new generation of borehole seismic instruments must therefore be developed and deployed. To address the critical site characterization and monitoring needs for EGS programs, the US Department of Energy (DOE) funded Paulsson, Inc. in 2010 to develop a fiber optic based ultra-large bandwidth clamped borehole seismic vector array capable of deploying up to one thousand 3C sensor pods suitable for deployment into ultra-high temperature and high pressure boreholes. Tests of the fiber optic seismic vector sensors developed with the DOE funding have shown that the new borehole seismic sensor technology is capable of generating outstanding high vector fidelity data with extremely large bandwidth: 0.01 – 6,000 Hz. Field tests have shown that the system can record events at magnitudes much smaller than M-2.6 at frequencies up to 2,000 Hz. The sensors have also proved to be about 100 times more sensitive than the regular coil geophones that are used in borehole seismic systems today. The fiber optic seismic sensors have furthermore been qualified to operate at temperatures over 300°C (572°F). Simultaneously with the fiber optic based seismic 3C vector sensors, we are using the lead-in fiber to acquire Distributed Acoustic Sensing (DAS) data from the surface to the bottom of the vector array. While the DAS data is of much lower quality than the vector sensor data, it provides a 1 m spatial sampling of the downgoing wavefield, which will be used to build the high resolution velocity model that is an essential component in high resolution imaging and monitoring.

  8. Processing of single channel air and water gun data for imaging an impact structure at the Chesapeake Bay

    USGS Publications Warehouse

    Lee, Myung W.

    1999-01-01

    Processing of 20 seismic profiles acquired in the Chesapeake Bay area aided in analysis of the details of an impact structure and allowed more accurate mapping of the depression caused by a bolide impact. Particular emphasis was placed on enhancement of seismic reflections from the basement. Application of wavelet deconvolution after a second zero-crossing predictive deconvolution improved the resolution of shallow reflections, and application of a match filter enhanced the basement reflections. The use of deconvolution and match filtering with a two-dimensional signal enhancement technique (F-X filtering) significantly improved the interpretability of seismic sections.

  9. High resolution seismic reflection profiling at Aberdeen Proving Grounds, Maryland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, R.D.; Xia, Jianghai; Swartzel, S.

    1996-11-01

    The effectiveness of shallow high resolution seismic reflection (i.e., resolution potential) to image geologic interfaces between about 70 and 750 ft at the Aberdeen Proving Grounds, Maryland (APG), appears to vary locally with the geometric complexity of the unconsolidated sediments that overlie crystalline bedrock. The bedrock surface (which represents the primary geologic target of this study) was imaged at each of three test areas on walkaway noise tests and CDP (common depth point) stacked data. Proven high resolution techniques were used to design and acquire data on this survey. Feasibility of the technique and minimum acquisition requirements were determined through evaluation and correlation of walkaway noise tests, CDP survey lines, and a downhole velocity check shot survey. Data processing and analysis revealed several critical attributes of shallow seismic data from APG that need careful consideration and compensation on reflection data sets. This survey determined: (1) the feasibility of the technique, (2) the resolution potential (both horizontal and vertical) of the technique, (3) the optimum source for this site, (4) the optimum acquisition geometries, (5) a general processing flow, and (6) a basic idea of the acoustic variability across this site. Source testing involved an accelerated weight drop, land air gun, downhole black powder charge, sledge hammer/plate, and high frequency vibrator. Shallow seismic reflection profiles provided a more detailed picture of the geometric complexity and variability of the distinct clay sequences (aquitards) previously inferred from drilling to be present, based on sparse drill holes and basewide conceptual models. The seismic data also reveal a clear explanation for the difficulties previously noted in correlating individual, borehole-identified sand or clay units over even short distances.

  10. Pattern recognition in volcano seismology - Reducing spectral dimensionality

    NASA Astrophysics Data System (ADS)

    Unglert, K.; Radic, V.; Jellinek, M.

    2015-12-01

    Variations in the spectral content of volcano seismicity can relate to changes in volcanic activity. Low-frequency seismic signals often precede or accompany volcanic eruptions. However, they are commonly manually identified in spectra or spectrograms, and their definition in spectral space differs from one volcanic setting to the next. Increasingly long time series of monitoring data at volcano observatories require automated tools to facilitate rapid processing and aid with pattern identification related to impending eruptions. Furthermore, knowledge transfer between volcanic settings is difficult if the methods to identify and analyze the characteristics of seismic signals differ. To address these challenges we evaluate whether a machine learning technique called Self-Organizing Maps (SOMs) can be used to characterize the dominant spectral components of volcano seismicity without the need for any a priori knowledge of different signal classes. This could reduce the dimensions of the spectral space typically analyzed by orders of magnitude, and enable rapid processing and visualization. Preliminary results suggest that the temporal evolution of volcano seismicity at Kilauea Volcano, Hawai`i, can be reduced to as few as 2 spectral components by using a combination of SOMs and cluster analysis. We will further refine our methodology with several datasets from Hawai`i and Alaska, among others, and compare it to other techniques.
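
    A minimal self-organizing map (SOM) sketch trained on spectral vectors, of the kind used to reduce volcano-seismic spectra to a few representative components, is given below. The map size, learning-rate schedule and neighbourhood width are illustrative assumptions, not the authors' settings.

```python
# Minimal SOM trained on normalised amplitude spectra; node weight vectors can
# afterwards be grouped by cluster analysis into a few spectral components.
import numpy as np

def train_som(spectra, map_size=(6, 6), n_iter=2000, seed=0):
    """spectra: (n_samples, n_freqs) normalised amplitude spectra."""
    rng = np.random.default_rng(seed)
    n_nodes = map_size[0] * map_size[1]
    nodes = spectra[rng.integers(0, len(spectra), n_nodes)].copy()  # init from data
    grid = np.array([(i, j) for i in range(map_size[0]) for j in range(map_size[1])])
    for t in range(n_iter):
        x = spectra[rng.integers(0, len(spectra))]
        bmu = np.argmin(np.linalg.norm(nodes - x, axis=1))          # best-matching unit
        lr = 0.5 * (1.0 - t / n_iter)                               # decaying learning rate
        sigma = max(map_size) / 2.0 * (1.0 - t / n_iter) + 0.5      # neighbourhood width
        dist2 = np.sum((grid - grid[bmu]) ** 2, axis=1)
        h = np.exp(-dist2 / (2.0 * sigma ** 2))                     # neighbourhood function
        nodes += lr * h[:, None] * (x - nodes)                      # pull nodes toward sample
    return nodes
```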

  11. Statistical methods for investigating quiescence and other temporal seismicity patterns

    USGS Publications Warehouse

    Matthews, M.V.; Reasenberg, P.A.

    1988-01-01

    We propose a statistical model and a technique for objective recognition of one of the most commonly cited seismicity patterns: microearthquake quiescence. We use a Poisson process model for seismicity and define a process with quiescence as one with a particular type of piece-wise constant intensity function. From this model, we derive a statistic for testing stationarity against a 'quiescence' alternative. The large-sample null distribution of this statistic is approximated from simulated distributions of appropriate functionals applied to Brownian bridge processes. We point out the restrictiveness of the particular model we propose and of the quiescence idea in general. The fact that there are many point processes which have neither constant nor quiescent rate functions underscores the need to test for and describe nonuniformity thoroughly. We advocate the use of the quiescence test in conjunction with various other tests for nonuniformity and with graphical methods such as density estimation. Ideally, these methods may promote accurate description of temporal seismicity distributions and useful characterizations of interesting patterns. © 1988 Birkhäuser Verlag.

  12. SIG-VISA: Signal-based Vertically Integrated Seismic Monitoring

    NASA Astrophysics Data System (ADS)

    Moore, D.; Mayeda, K. M.; Myers, S. C.; Russell, S.

    2013-12-01

    Traditional seismic monitoring systems rely on discrete detections produced by station processing software; however, while such detections may constitute a useful summary of station activity, they discard large amounts of information present in the original recorded signal. We present SIG-VISA (Signal-based Vertically Integrated Seismic Analysis), a system for seismic monitoring through Bayesian inference on seismic signals. By directly modeling the recorded signal, our approach incorporates additional information unavailable to detection-based methods, enabling higher sensitivity and more accurate localization using techniques such as waveform matching. SIG-VISA's Bayesian forward model of seismic signal envelopes includes physically-derived models of travel times and source characteristics as well as Gaussian process (kriging) statistical models of signal properties that combine interpolation of historical data with extrapolation of learned physical trends. Applying Bayesian inference, we evaluate the model on earthquakes as well as the 2009 DPRK test event, demonstrating a waveform matching effect as part of the probabilistic inference, along with results on event localization and sensitivity. In particular, we demonstrate increased sensitivity from signal-based modeling, in which the SIG-VISA signal model finds statistical evidence for arrivals even at stations for which the IMS station processing failed to register any detection.

  13. Comprehensive Seismological Monitoring of Geomorphic Processes in Taiwan

    NASA Astrophysics Data System (ADS)

    Chao, W. A.; Chen, C. H.

    2016-12-01

    Geomorphic processes such as hillslope mass wasting and river sediment transport are important for studying landscape dynamics. Mass movements induced from geomorphic events can generate seismic waves and be recorded by seismometers. Recent studies demonstrate that seismic monitoring techniques not only fully map the spatiotemporal patterns of geomorphic activity but also allow for exploration of the dynamic links between hillslope failures and channel processes, which may not be resolved by conventional techniques (e.g., optical remote sensing). We have recently developed a real-time landquake monitoring system (RLMS, here we use the term `landquake' to represent all hillslope failures such as rockfall, rock avalanche and landslide), which has been continuously monitoring landquake activities in Taiwan since June 2015 based on broadband seismic records, yielding source information (e.g., location, occurrence time, magnitude and mechanism) for large-sized events (http://140.112.57.117/main.html). Several seismic arrays have also been deployed over the past few years around the catchments and along the river channels in Taiwan for monitoring erosion processes at catchment scale, improving the spatiotemporal resolution in exploring the interaction between geomorphic events and specific meteorological conditions. Based on a forward model accounting for the impulsive impacts of saltating particles, we can further invert for the sediment load flux, a critical parameter in landscape evolution studies, by fitting the seismic observations only. To test the validity of the seismologically determined sediment load flux, we conduct a series of controlled dam breaking experiments that are advantageous in well constraining the spatiotemporal variations of the sediment transport. Incorporating the seismological constrains on geomorphic processes with the effects of tectonic and/or climate perturbations can provide valuable and quantitative information for more fully understanding and modeling of the dynamics of erosional mountain landscapes. Comprehensive seismic monitoring also yields important information for the evaluation, assessment and emergency response of hazardous geomorphic events.

  14. Single station monitoring of volcanoes using seismic ambient noise

    NASA Astrophysics Data System (ADS)

    De Plaen, R. S.; Lecocq, T.; Caudron, C.; Ferrazzini, V.; Francis, O.

    2016-12-01

    During volcanic eruptions, magma transport causes gas release, pressure perturbations and fracturing in the plumbing system. The potential subsequent surface deformation can be detected using geodetic techniques, while the deep mechanical processes associated with magma pressurization and/or migration and their spatio-temporal evolution can be monitored with volcanic seismicity. However, these techniques respectively suffer from limited sensitivity to deep changes and from a temporal distribution that is too short-term to expose early aseismic processes such as magma pressurization. Seismic ambient noise cross-correlation uses the multiple scattering of seismic vibrations by heterogeneities in the crust to retrieve the Green's function for surface waves between two stations by cross-correlating these diffuse wavefields. Seismic velocity changes are then typically measured from the cross-correlation functions, with applications to volcanoes, large-magnitude earthquakes in the far field and smaller-magnitude earthquakes at smaller distances. This technique is increasingly used as a non-destructive way to continuously monitor small seismic velocity changes (~0.1%) associated with volcanic activity, although it is usually limited to volcanoes equipped with large and dense networks of broadband stations. The single-station approach may provide a powerful and reliable alternative to the classical "cross-station" approach when measuring variations of seismic velocities. We implemented it on Piton de la Fournaise in Reunion Island, a very active volcano with remarkable multi-disciplinary continuous monitoring. Over the past decade, this volcano was increasingly studied using the traditional cross-station approach and therefore represents a unique laboratory to validate our approach. Our results, obtained on stations located up to 3.5 km from the eruptive site, performed as well as the classical approach in detecting the volcanic eruption in the 1-2 Hz frequency band. This opens new perspectives for successfully forecasting volcanic activity at volcanoes equipped with a single 3-component seismometer.

  15. Detection of buried mines with seismic sonar

    NASA Astrophysics Data System (ADS)

    Muir, Thomas G.; Baker, Steven R.; Gaghan, Frederick E.; Fitzpatrick, Sean M.; Hall, Patrick W.; Sheetz, Kraig E.; Guy, Jeremie

    2003-10-01

    Prior research on seismo-acoustic sonar for detection of buried targets [J. Acoust. Soc. Am. 103, 2333-2343 (1998)] has continued with examination of the target strengths of buried test targets as well as targets of interest, and has also examined detection and confirmatory classification of these, all using arrays of seismic sources and receivers as well as signal processing techniques to enhance target recognition. The target strengths of two test targets (one a steel gas bottle, the other an aluminum powder keg), buried in a sand beach, were examined as a function of internal mass load, to evaluate theory developed for seismic sonar target strength [J. Acoust. Soc. Am. 103, 2344-2353 (1998)]. The detection of buried naval and military targets of interest was achieved with an array of 7 shaker sources and 5 three-axis seismometers, at a range of 5 m. Vector polarization filtering was the main signal processing technique for detection. It capitalizes on the fact that the vertical and horizontal components in Rayleigh wave echoes are 90 deg out of phase, enabling complex-variable processing to obtain the imaginary component of the signal power versus time, which is unique to Rayleigh waves. Gabor matrix processing of this signal component was the main technique used to determine whether the target was man-made or just a natural target in the environment. [Work sponsored by ONR.]
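
    A minimal sketch of one way such a vector polarization filter could be implemented, using analytic signals to isolate the 90-degree phase-shifted (elliptical) Rayleigh-wave motion, is shown below; it illustrates the general idea under assumed synthetic inputs rather than the processing chain used in the cited work.

```python
import numpy as np
from scipy.signal import hilbert

def rayleigh_polarization_power(vertical, radial):
    """Imaginary part of the instantaneous vertical-radial cross power.

    For Rayleigh waves the vertical and radial components are ~90 degrees
    out of phase, so their cross power is mostly imaginary; linearly
    polarized arrivals contribute mostly to the real part.
    """
    zv = hilbert(vertical)        # analytic signal of the vertical component
    zr = hilbert(radial)          # analytic signal of the radial component
    cross = zv * np.conj(zr)      # instantaneous cross power
    return np.imag(cross)         # emphasized for elliptical particle motion

# toy test (synthetic): elliptical (90 deg shifted) vs. linear motion
t = np.linspace(0, 1, 1000)
v = np.sin(2 * np.pi * 30 * t)
r_elliptical = np.cos(2 * np.pi * 30 * t)     # 90 deg out of phase
r_linear = v.copy()                           # in phase
print(np.mean(rayleigh_polarization_power(v, r_elliptical)))  # ~ -1 (strong)
print(np.mean(rayleigh_polarization_power(v, r_linear)))      # ~ 0 (suppressed)
```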

  16. Passive Seismic Monitoring for Rockfall at Yucca Mountain: Concept Tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, J; Twilley, K; Murvosh, H

    2003-03-03

    For the purpose of proof-testing a system intended to remotely monitor rockfall inside a potential radioactive waste repository at Yucca Mountain, a system of seismic sub-arrays will be deployed and tested on the surface of the mountain. The goal is to identify and locate rockfall events remotely using automated data collecting and processing techniques. We install seismometers on the ground surface, generate seismic energy to simulate rockfall in underground space beneath the array, and interpret the surface response to discriminate and locate the event. Data will be analyzed using matched-field processing, a generalized beam-forming method for localizing discrete signals. Software is being developed to facilitate the processing. To date, a three-component sub-array has been installed and successfully tested.
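
    As an illustration of the matched-field idea described above (not the project's actual software), the hedged sketch below grid-searches candidate source locations by correlating phase-delayed replica vectors with narrowband array data, i.e. a Bartlett processor; the geometry, velocity and frequency are invented for the example.

```python
import numpy as np

def bartlett_mfp(spectra, station_xy, grid_xy, freq, velocity):
    """Bartlett matched-field power over a grid of candidate locations.

    spectra    : complex array (n_stations,) - narrowband data at `freq`
    station_xy : (n_stations, 2) receiver coordinates [m]
    grid_xy    : (n_grid, 2) candidate source locations [m]
    velocity   : assumed propagation speed [m/s]
    """
    power = np.zeros(len(grid_xy))
    for i, src in enumerate(grid_xy):
        dist = np.linalg.norm(station_xy - src, axis=1)
        replica = np.exp(-2j * np.pi * freq * dist / velocity)  # modeled phases
        replica /= np.linalg.norm(replica)
        power[i] = np.abs(np.vdot(replica, spectra)) ** 2       # |w^H d|^2
    return power

# invented example: 5 stations, true source at (120, 80) m, 30 Hz, 800 m/s
rng = np.random.default_rng(0)
stations = rng.uniform(0, 300, size=(5, 2))
true_src = np.array([120.0, 80.0])
d = np.exp(-2j * np.pi * 30 * np.linalg.norm(stations - true_src, axis=1) / 800)
grid = np.array([[x, y] for x in range(0, 300, 10) for y in range(0, 300, 10)])
p = bartlett_mfp(d, stations, grid, freq=30.0, velocity=800.0)
print(grid[np.argmax(p)])   # near (120, 80), up to grid spacing and ambiguity
```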

  17. Detecting and Locating Seismic Events Without Phase Picks or Velocity Models

    NASA Astrophysics Data System (ADS)

    Arrowsmith, S.; Young, C. J.; Ballard, S.; Slinkard, M.

    2015-12-01

    The standard paradigm for seismic event monitoring is to scan waveforms from a network of stations and identify the arrival time of various seismic phases. A signal association algorithm then groups the picks to form events, which are subsequently located by minimizing residuals between measured travel times and travel times predicted by an Earth model. Many of these steps are prone to significant errors which can lead to erroneous arrival associations and event locations. Here, we revisit a concept for event detection that does not require phase picks or travel time curves and fuses detection, association and location into a single algorithm. Our pickless event detector exploits existing catalog and waveform data to build an empirical stack of the full regional seismic wavefield, which is subsequently used to detect and locate events at a network level using correlation techniques. Because the technique uses more of the information content of the original waveforms, the concept is particularly powerful for detecting weak events that would be missed by conventional methods. We apply our detector to seismic data from the University of Utah Seismograph Stations network and compare our results with the earthquake catalog published by the University of Utah. We demonstrate that the pickless detector can detect and locate significant numbers of events previously missed by standard data processing techniques.
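
    As a very rough, hedged illustration of correlation-based detection at the network level (not the authors' pickless detector itself), the sketch below slides a master waveform along continuous data at each station, averages the normalized correlations across the network, and declares a detection where the stack exceeds a threshold; all waveforms and the threshold are synthetic assumptions.

```python
import numpy as np

def normalized_xcorr(data, template):
    """Normalized cross-correlation of `template` slid along `data`."""
    n = len(template)
    tpl = (template - template.mean()) / (template.std() * n)
    out = np.empty(len(data) - n + 1)
    for i in range(len(out)):
        win = data[i:i + n]
        std = win.std()
        out[i] = 0.0 if std == 0 else np.sum(tpl * (win - win.mean())) / std
    return out

rng = np.random.default_rng(7)
n_sta, n_samp, n_tpl = 4, 5000, 200
templates = [np.sin(2 * np.pi * 5 * np.arange(n_tpl) / 100.0) *
             np.exp(-np.arange(n_tpl) / 150.0) for _ in range(n_sta)]

# synthetic continuous noise with a weak event buried at sample 3000
streams = []
for tpl in templates:
    trace = rng.normal(0.0, 1.0, n_samp)
    trace[3000:3000 + n_tpl] += tpl            # event rms SNR well below 1
    streams.append(trace)

# network stack of per-station correlation functions
stack = np.mean([normalized_xcorr(d, t) for d, t in zip(streams, templates)],
                axis=0)
detections = np.where(stack > 0.25)[0]         # assumed detection threshold
print(detections)                              # should cluster near sample 3000
```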

  18. Surface Deformation and Source Model at Semisopochnoi Volcano from InSAR and Seismic Analysis During the 2014 and 2015 Seismic Swarms

    NASA Astrophysics Data System (ADS)

    DeGrandpre, K.; Pesicek, J. D.; Lu, Z.

    2016-12-01

    During the summer of 2014 and the early spring of 2015, two notable increases in seismic activity at Semisopochnoi volcano in the western Aleutian islands were recorded on AVO seismometers on Semisopochnoi and neighboring islands. These seismic swarms did not lead to an eruption. This study employs differential SAR techniques using TerraSAR-X images, in conjunction with more accurate relocation of the recorded seismic events through simultaneous inversion of event travel times and a three-dimensional velocity model using tomoDD. The interferograms created from the SAR images exhibit surprising coherence and an island-wide spatial distribution of inflation that is then used in a Mogi model to define the three-dimensional location and volume change required for a source at Semisopochnoi to produce the observed surface deformation. The tomoDD relocations provide a more accurate and realistic three-dimensional velocity model as well as a tighter clustering of events for both swarms that clearly outlines a linear seismic void within the larger group of shallow (<10 km) seismicity. While no direct conclusions as to the relationship of these seismic events and the observed surface deformation can be made at this time, these techniques are complementary and efficient forms of remote monitoring of volcanic activity that provide much deeper insight into the processes involved without the risk and cost of hazardous field work.
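
    For reference, the hedged sketch below evaluates the standard Mogi point-source expressions for surface displacement in an elastic half-space; the source depth, volume change and Poisson's ratio are placeholder values, not the parameters estimated for Semisopochnoi.

```python
import numpy as np

def mogi_displacement(x, y, xs, ys, depth, dV, nu=0.25):
    """Surface displacements (ux, uy, uz) for a Mogi point source.

    Standard half-space expressions: u_r = C * r / R^3, u_z = C * d / R^3,
    with C = (1 - nu) * dV / pi and R = sqrt(r^2 + d^2).
    """
    dx, dy = x - xs, y - ys
    r = np.hypot(dx, dy)
    R3 = (r**2 + depth**2) ** 1.5
    C = (1.0 - nu) * dV / np.pi
    ur = C * r / R3
    uz = C * depth / R3
    # project radial displacement onto x and y (guard against r = 0)
    ux = np.where(r > 0, ur * dx / np.where(r > 0, r, 1.0), 0.0)
    uy = np.where(r > 0, ur * dy / np.where(r > 0, r, 1.0), 0.0)
    return ux, uy, uz

# placeholder example: 3 km deep source, +1e6 m^3 volume change
x = np.linspace(-10e3, 10e3, 5)
ux, uy, uz = mogi_displacement(x, 0.0, 0.0, 0.0, depth=3e3, dV=1e6)
print(uz)   # uplift in metres along a profile through the source
```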

  19. COMPARISON OF SEISMIC SIGNATURES OF FLARES OBTAINED BY SOHO/MICHELSON DOPPLER IMAGER AND GONG INSTRUMENTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zharkov, S.; Matthews, S. A.; Zharkova, V. V.

    2011-10-01

    The first observations of seismic responses to solar flares were carried out using time-distance (TD) and holography techniques applied to SOHO/Michelson Doppler Imager (MDI) Dopplergrams obtained from space and unaffected by terrestrial atmospheric disturbances. However, the ground-based network GONG is potentially a very valuable source of sunquake observations, especially in cases where space observations are unavailable. In this paper, we present an updated technique for pre-processing of GONG observations for the application of subjacent vantage holography. Using this method and TD diagrams, we investigate several sunquakes observed in association with M- and X-class solar flares and compare the outcomes with those reported earlier using MDI data. In both GONG and MDI data sets, for the first time, we also detect the TD ridge associated with the 2001 September 9 flare. Our results show reassuringly positive identification of sunquakes from GONG data that can provide further information about the physics of seismic processes associated with solar flares.

  20. A High-Sensitivity Broad-Band Seismic Sensor for Shallow Seismic Sounding of the Lunar Regolith

    NASA Technical Reports Server (NTRS)

    Pike, W. Thomas; Standley, Ian M.; Banerdt, W. Bruce

    2005-01-01

    The recently undertaken Space Exploration Initiative has prompted a renewed interest in techniques for characterizing the surface and shallow subsurface (0-10s of meters depth) of the Moon. There are several reasons for this. First, there is an intrinsic scientific interest in the subsurface structure. For example, the stratigraphy, depth to bedrock, density/porosity, and block size distribution all have implications for the formation of, and geological processes affecting, the surface, such as sequential crater ejecta deposition, impact gardening, and seismic settling. In some permanently shadowed craters there may be ice deposits just below the surface. Second, the geotechnical properties of the lunar surface layers are of keen interest to future mission planners. Regolith thickness, strength, density, grain size and compaction will affect construction of exploration infrastructure in terms of foundation strength and stability, ease of excavation, radiation shielding effectiveness, as well as raw material handling and processing techniques for resource extraction.

  1. Monitoring southwest Greenland's ice sheet melt with ambient seismic noise.

    PubMed

    Mordret, Aurélien; Mikesell, T Dylan; Harig, Christopher; Lipovsky, Bradley P; Prieto, Germán A

    2016-05-01

    The Greenland ice sheet presently accounts for ~70% of global ice sheet mass loss. Because this mass loss is associated with sea-level rise at a rate of 0.7 mm/year, the development of improved monitoring techniques to observe ongoing changes in ice sheet mass balance is of paramount concern. Spaceborne mass balance techniques are commonly used; however, they are inadequate for many purposes because of their low spatial and/or temporal resolution. We demonstrate that small variations in seismic wave speed in Earth's crust, as measured with the correlation of seismic noise, may be used to infer seasonal ice sheet mass balance. Seasonal loading and unloading of glacial mass induces strain in the crust, and these strains then result in seismic velocity changes due to poroelastic processes. Our method provides a new and independent way of monitoring (in near real time) ice sheet mass balance, yielding new constraints on ice sheet evolution and its contribution to global sea-level changes. An increased number of seismic stations in the vicinity of ice sheets will enhance our ability to create detailed space-time records of ice mass variations.

  2. 3D Seismic Experimentation and Advanced Processing/Inversion Development for Investigations of the Shallow Subsurface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levander, Alan Richard; Zelt, Colin A.

    2015-03-17

    The work plan for this project was to develop and apply advanced seismic reflection and wide-angle processing and inversion techniques to high resolution seismic data for the shallow subsurface to seismically characterize the shallow subsurface at hazardous waste sites as an aid to containment and cleanup activities. We proposed to continue work on seismic data that we had already acquired under a previous DoE grant, as well as to acquire additional new datasets for analysis. The project successfully developed and/or implemented the use of 3D reflection seismology algorithms, waveform tomography and finite-frequency tomography using compressional and shear waves for high resolution characterization of the shallow subsurface at two waste sites. These two sites have markedly different near-surface structures, groundwater flow patterns, and hazardous waste problems. This is documented in the list of refereed documents, conference proceedings, and Rice graduate theses, listed below.

  3. Tutorial review of seismic surface waves' phenomenology

    NASA Astrophysics Data System (ADS)

    Levshin, A. L.; Barmin, M. P.; Ritzwoller, M. H.

    2018-03-01

    In recent years, surface wave seismology has become one of the leading directions in seismological investigations of the Earth's structure and seismic sources. Various applications cover a wide spectrum of goals, dealing with differences in sources of seismic excitation, penetration depths, frequency ranges, and interpretation techniques. Observed seismic data demonstrate a great variability of phenomenology, which can produce difficulties in interpretation for beginners. This tutorial review is based on the authors' many years of experience in the processing and interpretation of seismic surface wave observations and on the lectures of one of the authors (ALL) at the Workshops on Seismic Wave Excitation, Propagation and Interpretation held at the Abdus Salam International Center for Theoretical Physics (Trieste, Italy) in 1990-2012. We present some typical examples of wave patterns which could be encountered in different applications and which can serve as a guide to the analysis of observed seismograms.

  4. Continuous monitoring of the lunar or Martian subsurface using on-board pattern recognition and neural processing of Rover geophysical data

    NASA Technical Reports Server (NTRS)

    Mcgill, J. W.; Glass, C. E.; Sternberg, B. K.

    1990-01-01

    The ultimate goal is to create an extraterrestrial unmanned system for subsurface mapping and exploration. Neural networks are to be used to recognize anomalies in the profiles that correspond to potentially exploitable subsurface features. The processing of ground penetrating radar (GPR) and seismic data is closely analogous; hence, the preliminary research focus on GPR systems will be directly applicable to seismic systems once such systems can be designed for continuous operation. The original GPR profile may be very complex due to the electrical behavior of the background, targets, and antennas, much as the seismic record is made complex by multiple reflections, ghosting, and ringing. Because the format of the GPR data is similar to the format of seismic data, seismic processing software may be applied to GPR data to help enhance the data. A neural network may then be trained to more accurately identify anomalies from the processed record than from the original record.

  5. Support Vector Machine Model for Automatic Detection and Classification of Seismic Events

    NASA Astrophysics Data System (ADS)

    Barros, Vesna; Barros, Lucas

    2016-04-01

    The automated processing of multiple seismic signals to detect, localize and classify seismic events is a central tool in both natural hazards monitoring and nuclear treaty verification. However, false detections and missed detections caused by station noise and incorrect classification of arrivals are still an issue, and events are often unclassified or poorly classified. Thus, machine learning techniques can be used in automatic processing to classify the huge database of seismic recordings and provide more confidence in the final output. Applied in the context of the International Monitoring System (IMS) - a global sensor network developed for the Comprehensive Nuclear-Test-Ban Treaty (CTBT) - we propose a fully automatic method for seismic event detection and classification based on a supervised pattern recognition technique called the Support Vector Machine (SVM). According to Kortström et al. (2015), the advantages of using an SVM include its ability to handle a large number of features and its effectiveness in high-dimensional spaces. Our objective is to detect seismic events from one IMS seismic station located in an area of high seismicity and mining activity and classify them as earthquakes or quarry blasts. We expect to create a flexible and easily adjustable SVM method that can be applied to different regions and datasets. Taken a step further, accurate results for seismic stations could lead to a modification of the model and its parameters to make it applicable to other waveform technologies used to monitor nuclear explosions, such as infrasound and hydroacoustic waveforms. As an authorized user, we have direct access to all IMS data and bulletins through a secure signatory account. A set of significant seismic waveforms containing different types of events (e.g., earthquakes, quarry blasts) and noise is being analysed to train the model and learn the typical pattern of the signal from these events. Moreover, comparing the performance of the support-vector network to various classical learning algorithms previously used in seismic detection and classification is an essential final step in analyzing the advantages and disadvantages of the model.
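
    A minimal, hedged sketch of this kind of supervised classification with scikit-learn is shown below; the feature choices (a spectral ratio and the local hour) and the synthetic labels are illustrative assumptions, not the features or data used in the IMS study.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)

# illustrative feature vectors: [low/high frequency spectral ratio, local hour]
n = 400
quarry = np.column_stack([rng.normal(2.5, 0.6, n), rng.normal(13, 2.0, n)])
quake = np.column_stack([rng.normal(1.0, 0.6, n), rng.uniform(0, 24, n)])
X = np.vstack([quarry, quake])
y = np.hstack([np.ones(n), np.zeros(n)])   # 1 = quarry blast, 0 = earthquake

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# RBF-kernel SVM with feature standardization
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```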

  6. Limitations of quantitative analysis of deep crustal seismic reflection data: Examples from GLIMPCE

    USGS Publications Warehouse

    Lee, Myung W.; Hutchinson, Deborah R.

    1992-01-01

    Amplitude preservation in seismic reflection data can be obtained by a relative true amplitude (RTA) processing technique in which the relative strength of reflection amplitudes is preserved vertically as well as horizontally, after compensating for amplitude distortion by near-surface effects and propagation effects. Quantitative analysis of relative true amplitudes of the Great Lakes International Multidisciplinary Program on Crustal Evolution seismic data is hampered by large uncertainties in estimates of the water bottom reflection coefficient and the vertical amplitude correction and by inadequate noise suppression. Processing techniques such as deconvolution, F-K filtering, and migration significantly change the overall shape of amplitude curves and hence calculation of reflection coefficients and average reflectance. Thus lithological interpretation of deep crustal seismic data based on the absolute value of estimated reflection strength alone is meaningless. The relative strength of individual events, however, is preserved on curves generated at different stages in the processing. We suggest that qualitative comparisons of relative strength, if used carefully, provide a meaningful measure of variations in reflectivity. Simple theoretical models indicate that peg-leg multiples rather than water bottom multiples are the most severe source of noise contamination. These multiples are extremely difficult to remove when the water bottom reflection coefficient is large (>0.6), a condition that exists beneath parts of Lake Superior and most of Lake Huron.

  7. Evaluation of seismic design spectrum based on UHS implementing fourth-generation seismic hazard maps of Canada

    NASA Astrophysics Data System (ADS)

    Ahmed, Ali; Hasan, Rafiq; Pekau, Oscar A.

    2016-12-01

    Two recent developments have come to the forefront with reference to updating the seismic design provisions of codes: (1) publication of new seismic hazard maps for Canada by the Geological Survey of Canada, and (2) emergence of a new spectral format that supersedes the conventional standardized spectral format. The fourth-generation seismic hazard maps are based on enriched seismic data, enhanced knowledge of regional seismicity and improved seismic hazard modeling techniques. Therefore, the new maps are more accurate and need to be incorporated into the next edition of the Canadian Highway Bridge Design Code (CHBDC), as was done for its building counterpart, the National Building Code of Canada (NBCC). In fact, the code writers expressed similar intentions in the commentary of CHBDC 2006. During their updating processes, the NBCC and the AASHTO Guide Specifications for LRFD Seismic Bridge Design (American Association of State Highway and Transportation Officials, Washington, 2009) lowered the probability level from 10% to 2% and from 10% to 5%, respectively. This study investigates five sets of hazard maps developed by the GSC, corresponding to 2%, 5% and 10% probabilities of exceedance in 50 years. To allow sound statistical inference, 389 Canadian cities are selected. This study shows the implications of the new hazard maps for the design process (i.e., the extent of magnification or reduction of the design forces).

  8. Ground Truth Events with Source Geometry in Eurasia and the Middle East

    DTIC Science & Technology

    2016-06-02

    source properties, including seismic moment, corner frequency, radiated energy, and stress drop have been obtained using spectra for S waves following... Other source parameters, including radiated energy, corner frequency, seismic moment, and static stress drop were calculated using a spectral... technique (Richardson & Jordan, 2002; Andrews, 1986). The process entails separating event and station spectra and median-stacking each event's

  9. Seismic Characterization of EGS Reservoirs

    NASA Astrophysics Data System (ADS)

    Templeton, D. C.; Pyle, M. L.; Matzel, E.; Myers, S.; Johannesson, G.

    2014-12-01

    To aid in the seismic characterization of Engineered Geothermal Systems (EGS), we enhance the traditional microearthquake detection and location methodologies at two EGS systems. We apply the Matched Field Processing (MFP) seismic imaging technique to detect new seismic events using known discrete microearthquake sources. Events identified using MFP are typically smaller-magnitude events or events that occur within the coda of a larger event. Additionally, we apply a Bayesian multiple-event seismic location algorithm, called MicroBayesLoc, to estimate the 95% probability ellipsoids for events with high signal-to-noise ratios (SNR). Such probability ellipsoid information can provide evidence for determining whether a seismic lineation could be real or simply within the anticipated error range. We apply this methodology to the Basel EGS data set and compare it to another EGS dataset. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  10. Seismic imaging of post-glacial sediments - test study before Spitsbergen expedition

    NASA Astrophysics Data System (ADS)

    Szalas, Joanna; Grzyb, Jaroslaw; Majdanski, Mariusz

    2017-04-01

    This work presents results of the analysis of reflection seismic data acquired from a test area in central Poland. For this experiment we used a total of 147 vertical-component seismic stations (DATA-CUBE and Reftek "Texan") with an accelerated weight drop (PEG-40). The profile was 350 metres long. It is part of a pilot study for a future research project on Spitsbergen. The purpose of the study is to recognise the characteristics of the seismic response of post-glacial sediments in order to design the most adequate survey acquisition parameters and processing sequence for data from Spitsbergen. Multiple tests and comparisons have been performed to obtain the best possible quality of seismic image. In this research we examine the influence of receiver interval size, front mute application and surface wave attenuation attempts. Although seismic imaging is the main technique, we plan to support this analysis with additional data from traveltime tomography, MASW and other a priori information.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mellors, R J

    The Comprehensive Nuclear Test Ban Treaty (CTBT) includes provisions for an on-site inspection (OSI), which allows the use of specific techniques to detect underground anomalies including cavities and rubble zones. One permitted technique is active seismic surveys such as seismic refraction or reflection. The purpose of this report is to conduct some simple modeling to evaluate the potential use of seismic reflection in detecting cavities and to test the use of open-source software in modeling possible scenarios. It should be noted that OSI inspections are conducted under specific constraints regarding duration and logistics. These constraints are likely to significantly impact active seismic surveying, as a seismic survey typically requires considerable equipment, effort, and expertise. For the purposes of this study, which is a first-order feasibility study, these issues will not be considered. This report provides a brief description of the seismic reflection method along with some commonly used software packages. This is followed by an outline of a simple processing stream based on a synthetic model, along with results from a set of models representing underground cavities. A set of scripts used to generate the models is presented in an appendix. We do not consider detection of underground facilities in this work, and the geologic setting used in these tests is an extremely simple one.
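
    As an illustration of the kind of simple synthetic modeling described above (not the report's actual scripts), the hedged sketch below builds a 1-D convolutional synthetic seismogram: an assumed layered velocity/density model containing a low-impedance cavity-like layer is converted to reflection coefficients and convolved with a Ricker wavelet.

```python
import numpy as np

def ricker(f0, dt, length=0.128):
    """Zero-phase Ricker wavelet with peak frequency f0 [Hz]."""
    t = np.arange(-length / 2, length / 2, dt)
    a = (np.pi * f0 * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

# assumed two-way-time model (illustrative values, not from the report)
dt = 0.001                       # sample interval [s]
n = 1000                         # 1 s of two-way time
vel = np.full(n, 2000.0)         # background velocity [m/s]
rho = np.full(n, 2200.0)         # background density [kg/m^3]
vel[400:420] = 340.0             # air-filled "cavity" layer
rho[400:420] = 1.2

impedance = vel * rho
rc = np.zeros(n)
rc[1:] = (impedance[1:] - impedance[:-1]) / (impedance[1:] + impedance[:-1])

trace = np.convolve(rc, ricker(40.0, dt), mode="same")   # synthetic seismogram
print("strongest reflection near samples:",
      np.argsort(np.abs(trace))[-2:])                    # near the cavity (~400-420)
```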

  12. Post-Seismic Deformation from the 2009 Mw 6.3 Dachaidan Earthquake in the Northern Qaidam Basin Detected by Small Baseline Subset InSAR Technique

    PubMed Central

    Liu, Yang; Xu, Caijun; Wen, Yangmao; Li, Zhicai

    2016-01-01

    On 28 August 2009, a thrust-faulting Mw 6.3 earthquake struck the northern Qaidam basin, China. Due to the lack of ground observations in this remote region, this study presents a high-precision, high spatio-temporal resolution post-seismic deformation series obtained with a small baseline subset InSAR technique. Temporally, the deformation slows with time, with a maximum uplift of up to 7.4 cm along the line of sight 334 days after the event. Spatially, the deformation is more pronounced on the hanging wall than on the footwall, and on the hanging wall it decreases from the middle toward both sides. We then propose a method to calculate the correlation coefficient between co-seismic and post-seismic deformation by normalizing them. The correlation coefficient is found to be 0.73, indicating a similar subsurface process occurring during both phases. The results indicate that afterslip may dominate the post-seismic deformation during 19–334 days after the event; the afterslip mainly occurs on a fault geometry and at depths similar to those of the co-seismic rupture, and partly extends to shallower and greater depths. PMID:26861330

  13. Post-Seismic Deformation from the 2009 Mw 6.3 Dachaidan Earthquake in the Northern Qaidam Basin Detected by Small Baseline Subset InSAR Technique.

    PubMed

    Liu, Yang; Xu, Caijun; Wen, Yangmao; Li, Zhicai

    2016-02-05

    On 28 August 2009, a thrust-faulting Mw 6.3 earthquake struck the northern Qaidam basin, China. Due to the lack of ground observations in this remote region, this study presents a high-precision, high spatio-temporal resolution post-seismic deformation series obtained with a small baseline subset InSAR technique. Temporally, the deformation slows with time, with a maximum uplift of up to 7.4 cm along the line of sight 334 days after the event. Spatially, the deformation is more pronounced on the hanging wall than on the footwall, and on the hanging wall it decreases from the middle toward both sides. We then propose a method to calculate the correlation coefficient between co-seismic and post-seismic deformation by normalizing them. The correlation coefficient is found to be 0.73, indicating a similar subsurface process occurring during both phases. The results indicate that afterslip may dominate the post-seismic deformation during 19-334 days after the event; the afterslip mainly occurs on a fault geometry and at depths similar to those of the co-seismic rupture, and partly extends to shallower and greater depths.

  14. Swept Impact Seismic Technique (SIST)

    USGS Publications Warehouse

    Park, C.B.; Miller, R.D.; Steeples, D.W.; Black, R.A.

    1996-01-01

    A coded seismic technique is developed that can result in a higher signal-to-noise ratio than a conventional single-pulse method does. The technique is cost-effective and time-efficient and therefore well suited for shallow-reflection surveys where high resolution and cost-effectiveness are critical. A low-power impact source transmits a few to several hundred high-frequency broad-band seismic pulses during several seconds of recording time according to a deterministic coding scheme. The coding scheme consists of a time-encoded impact sequence in which the rate of impact (cycles/s) changes linearly with time providing a broad range of impact rates. Impact times used during the decoding process are recorded on one channel of the seismograph. The coding concept combines the vibroseis swept-frequency and the Mini-Sosie random impact concepts. The swept-frequency concept greatly improves the suppression of correlation noise with much fewer impacts than normally used in the Mini-Sosie technique. The impact concept makes the technique simple and efficient in generating high-resolution seismic data especially in the presence of noise. The transfer function of the impact sequence simulates a low-cut filter with the cutoff frequency the same as the lowest impact rate. This property can be used to attenuate low-frequency ground-roll noise without using an analog low-cut filter or a spatial source (or receiver) array as is necessary with a conventional single-pulse method. Because of the discontinuous coding scheme, the decoding process is accomplished by a "shift-and-stacking" method that is much simpler and quicker than cross-correlation. The simplicity of the coding allows the mechanical design of the source to remain simple. Several different types of mechanical systems could be adapted to generate a linear impact sweep. In addition, the simplicity of the coding also allows the technique to be used with conventional acquisition systems, with only minor modifications.
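
    The hedged sketch below illustrates the decoding idea in the abstract: a synthetic record built from impacts whose rate increases linearly with time is decoded by shifting the record to each recorded impact time and stacking. The wavelet, sweep rates and earth response are invented for the illustration and are not parameters from the SIST paper.

```python
import numpy as np

dt = 0.001                      # sample interval [s]
record_len = 10.0               # recording time [s]
n = int(record_len / dt)

# impact times with a rate sweeping linearly from 5 to 15 impacts/s (assumed)
times, t = [], 0.0
while t < record_len - 1.0:
    rate = 5.0 + (15.0 - 5.0) * t / record_len
    t += 1.0 / rate
    times.append(t)
times = np.array(times)

# assumed earth impulse response: two reflections within a 1 s window
impulse = np.zeros(int(1.0 / dt))
impulse[200], impulse[600] = 1.0, -0.6
rng = np.random.default_rng(1)

record = rng.normal(0.0, 0.5, n)              # ambient noise
for ti in times:                              # superpose delayed responses
    i0 = int(ti / dt)
    seg = min(len(impulse), n - i0)
    record[i0:i0 + seg] += impulse[:seg]

# decode by shift-and-stack at the recorded impact times
stack = np.zeros(len(impulse))
for ti in times:
    i0 = int(ti / dt)
    seg = min(len(impulse), n - i0)
    stack[:seg] += record[i0:i0 + seg]
stack /= len(times)

print("reflections recovered near samples:", np.argsort(np.abs(stack))[-2:])
```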

  15. Development of a low cost method to estimate the seismic signature of a geothermal field from ambient seismic noise analysis

    NASA Astrophysics Data System (ADS)

    Tibuleac, I. M.; Iovenitti, J. L.; Pullammanappallil, S. K.; von Seggern, D. H.; Ibser, H.; Shaw, D.; McLachlan, H.

    2015-12-01

    A new, cost-effective and non-invasive exploration method using ambient seismic noise has been tested at Soda Lake, NV, with promising results. Seismic interferometry was used to extract Green's functions (P and surface waves) from 21 days of continuous ambient seismic noise. Aided by S-velocity models estimated from surface waves, an ambient-noise seismic reflection survey along a line (named Line 2) reproduced the results of the active survey, although at lower resolution, where the ambient seismic noise was not contaminated by strong cultural noise. Ambient-noise resolution was lower at depth (below 1000 m) compared to the active survey. Useful information could be recovered from ambient seismic noise, including dipping features and fault locations. Processing method tests were developed, with potential to improve the virtual reflection survey results. Through innovative signal processing techniques, periods not typically analyzed with high-frequency sensors were used in this study to obtain seismic velocity model information to a depth of 1.4 km. New seismic parameters such as Green's function reflection component lateral variations, waveform entropy, stochastic parameters (correlation length and Hurst number) and spectral frequency content extracted from active and passive surveys showed potential to indicate geothermal favorability through their correlation with high-temperature anomalies, and showed potential as fault indicators, thus reducing the uncertainty in fault identification. Geothermal favorability maps along ambient seismic Line 2 were generated considering temperature, lithology and the seismic parameters investigated in this study and compared to the active Line 2 results. Pseudo-favorability maps were also generated using only the seismic parameters analyzed in this study.

  16. Infrasound as a Geophysical Probe Using Earth as a Venus Analog

    NASA Astrophysics Data System (ADS)

    Komjathy, Attila; Cutts, James; Pauken, Michael; Kedar, Sharon; Smrekar, Suzanne

    2016-10-01

    JPL is in the process of developing an instrument to measure seismic activity on Venus by detecting infrasonic waves in the atmosphere. The overall objective of this research is to demonstrate the feasibility of using sensitive barometers to detect infrasonic signals from seismic and explosive activity on Venus from a balloon platform. Because of Venus' dense atmosphere, seismic signatures from even small quakes (magnitude ~3) are effectively coupled into the atmosphere. The seismic signals are known to couple about 60 times more efficiently into the atmosphere on Venus than on Earth. It was found that there is almost no attenuation below 80 km altitude on Venus for frequencies less than 1 Hz. Whereas wind noise is a major source of background noise for terrestrial infrasonic arrays, it is expected that a balloon platform, which drifts with the winds, will be capable of very sensitive measurements with low noise. In our research we will demonstrate and apply techniques for discriminating upward-propagating waves from a seismic event by making measurements with two or more infrasonic sensors using very sensitive barometers on a tether deployed from the balloon in a series of Earth-based tests. We will first demonstrate and validate the technique using an artificial infrasound source in a deployment from a hot air balloon on Earth and then extend it with longer-duration flights in the troposphere and stratosphere. We will report results on the first flight experiment, which will focus on using the barometer instruments on a tethered helium-filled balloon. The balloon flight will be conducted in the vicinity of a known seismic source generated by a seismic hammer. Earlier tests conducted by Sandia National Laboratory demonstrated that this is a highly reproducible source of seismic and acoustic energy for infrasound sensors. The results of the experiments are intended to validate the two-barometer signal processing approach using a well-characterized point signal source.

  17. Infrasound as a Geophysical Probe Using Earth as a Venus Analog

    NASA Astrophysics Data System (ADS)

    Komjathy, A.; Cutts, J. A.; Pauken, M.; Kedar, S.; Smrekar, S. E.; Hall, J. R.

    2016-12-01

    JPL is in the process of developing an instrument to measure seismic activity on Venus by detecting infrasonic waves in the atmosphere. The overall objective of this research is to demonstrate the feasibility of using sensitive barometers to detect infrasonic signals from seismic and explosive activity on Venus from a balloon platform. Because of Venus' dense atmosphere, seismic signatures from even small quakes (magnitude ~3) are effectively coupled into the atmosphere. The seismic signals are known to couple about 60 times more efficiently into the atmosphere on Venus than on Earth. It was found that there is almost no attenuation below 80 km altitude on Venus for frequencies less than 1 Hz. Whereas wind noise is a major source of background noise for terrestrial infrasonic arrays, it is expected that a balloon platform, which drifts with the winds, will be capable of very sensitive measurements with low noise. In our research we will demonstrate and apply techniques for discriminating upward-propagating waves from a seismic event by making measurements with two or more infrasonic sensors using very sensitive barometers on a tether deployed from the balloon in a series of Earth-based tests. We will first demonstrate and validate the technique using an artificial infrasound source in a deployment from a hot air balloon on Earth and then extend it with longer-duration flights in the troposphere and stratosphere. We will report results on the first flight experiment, which will focus on using the barometer instruments on a tethered helium-filled balloon. The balloon flight will be conducted in the vicinity of a known seismic source generated by a seismic hammer. Earlier tests conducted by Sandia National Laboratory demonstrated that this is a highly reproducible source of seismic and acoustic energy for infrasound sensors. The results of the experiments are intended to validate the two-barometer signal processing approach using a well-characterized point signal source.

  18. Glacier seismology: eavesdropping on the ice-bed interface

    NASA Astrophysics Data System (ADS)

    Walter, F.; Röösli, C.

    2015-12-01

    Glacier sliding plays a central role in ice dynamics. A number of remote sensing and deep drilling initiatives have therefore focused on the ice-bed interface. Although these techniques have provided valuable insights into bed properties, they do not supply theorists with data of sufficient temporal and spatial resolution to rigorously test mathematical sliding laws. As an alternative, passive seismic techniques have gained popularity in glacier monitoring. Analysis of glacier-related seismic sources ('icequakes') has become a useful technique to study inaccessible regions of the cryosphere, including the ice-bed interface. Seismic monitoring networks on the polar ice sheets have shown that ice sliding is not only a smooth process involving viscous deformation and regelation of basal ice layers. Instead, ice streams exhibit sudden slip episodes over their beds and intermittent phases of partial or complete stagnation. Here we discuss new and recently published discoveries of basal seismic sources beneath various glacial bodies. We revisit basal seismicity of hard-bedded Alpine glaciers, which is not the result of pure stick-slip motion. Sudden changes in seismicity suggest that the local configuration of the subglacial drainage system undergoes changes on sub-daily time scales. Accordingly, such observations place constraints on basal resistance and sliding of hard-bedded glaciers. In contrast, certain clusters of stick-slip dislocations associated with microseismicity beneath the Greenland ice sheet undergo diurnal variations in magnitudes and inter-event times. This is best explained with a soft till bed, which hosts the shear dislocations and whose strength varies in response to changes in subglacial water pressure. These results suggest that analysis of basal icequakes is well suited for characterizing glacier and ice sheet beds. Future studies should address the relative importance between "smooth" and seismogenic sliding in different glacial environments.

  19. Hydrogeologic Controls on Water Dynamics in a Discontinuous Permafrost, Lake-Rich Landscape

    NASA Astrophysics Data System (ADS)

    Walvoord, M. A.; Briggs, M. A.; Day-Lewis, F. D.; Jepsen, S. M.; Lane, J. W., Jr.; McKenzie, J. M.; Minsley, B. J.; Striegl, R. G.; Voss, C. I.; Wellman, T. P.

    2014-12-01


  20. Semi-automatic mapping for identifying complex geobodies in seismic images

    NASA Astrophysics Data System (ADS)

    Domínguez-C, Raymundo; Romero-Salcedo, Manuel; Velasquillo-Martínez, Luis G.; Shemeretov, Leonid

    2017-03-01

    Seismic images are composed of positive and negative seismic wave traces with different amplitudes (Robein 2010 Seismic Imaging: A Review of the Techniques, their Principles, Merits and Limitations (Houten: EAGE)). The association of these amplitudes together with a color palette forms complex visual patterns. The color intensity of such patterns is directly related to impedance contrasts: the higher the contrast, the higher the color intensity. Generally speaking, low impedance contrasts are depicted with low-tone colors, creating zones with different patterns whose features are not evident to the 3D automated mapping options available in commercial software. In this work, we propose a workflow for semi-automatic mapping of seismic images, focused on areas with low-intensity colored zones that may be associated with geobodies of petroleum interest. The CIE L*A*B* color space was used to perform the seismic image processing, which helped find small but significant differences between pixel tones. This process generated binary masks that bound regions of low-intensity color. The three-dimensional mask projection allowed the construction of 3D structures for such zones (geobodies). The proposed method was applied to a set of digital images from a seismic cube and tested on four representative study cases. The obtained results are encouraging because interesting geobodies are obtained with a minimum of information.
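
    A hedged sketch of the image-processing step described above (conversion to CIE L*A*B* and masking of weakly colored zones) is given below using scikit-image; the threshold values and the synthetic test image are illustrative assumptions, not the ones used by the authors.

```python
import numpy as np
from skimage import color

def low_intensity_mask(rgb_image, l_min=35.0, l_max=65.0, chroma_max=15.0):
    """Binary mask of weakly colored (low impedance-contrast) zones.

    Pixels with mid-range lightness L* and low chroma sqrt(a*^2 + b*^2)
    are taken as candidate low-amplitude zones (assumed criteria).
    """
    lab = color.rgb2lab(rgb_image)                 # shape (ny, nx, 3)
    L, a, b = lab[..., 0], lab[..., 1], lab[..., 2]
    chroma = np.hypot(a, b)
    return (L > l_min) & (L < l_max) & (chroma < chroma_max)

# synthetic test image: strong red/blue "reflections" over a gray background
img = np.full((64, 64, 3), 0.5)
img[10:20, :, :] = [1.0, 0.0, 0.0]    # strong positive amplitude (red)
img[40:50, :, :] = [0.0, 0.0, 1.0]    # strong negative amplitude (blue)
mask = low_intensity_mask(img)
print(mask[15, 32], mask[30, 32])      # False (strong color), True (weak zone)
```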

  1. Array seismological investigation of the South Atlantic 'Superplume'

    NASA Astrophysics Data System (ADS)

    Hempel, Stefanie; Gassmöller, Rene; Thomas, Christine

    2015-04-01

    We apply the axisymmetric, spherical-Earth spectral-element code AxiSEM to model seismic compressional waves that sample complex `superplume' structures in the lower mantle. High-resolution array seismological stacking techniques are evaluated regarding their capability to resolve large-scale high-density low-velocity bodies, including interior structure such as inner upwellings, high-density lenses, ultra-low velocity zones (ULVZs), neighboring remnant slabs and adjacent small-scale uprisings. Synthetic seismograms are also computed and processed for models of the Earth resulting from geodynamic modelling of the South Atlantic mantle including plate reconstruction. We discuss the interference and suppression of the resulting seismic signals and the implications for a seismic data study in terms of visibility of the South Atlantic `superplume' structure. This knowledge is used to process, invert and interpret our data set of seismic sources from the Andes and the South Sandwich Islands detected at seismic arrays spanning from Ethiopia over Cameroon to South Africa, mapping the South Atlantic `superplume' structure including its interior structure. In order to present the model of the South Atlantic `superplume' structure that best fits the seismic data set, we iteratively compute synthetic seismograms while adjusting the model according to the dependencies found in the parameter study.

  2. Earthquake Building Damage Mapping Based on Feature Analyzing Method from Synthetic Aperture Radar Data

    NASA Astrophysics Data System (ADS)

    An, L.; Zhang, J.; Gong, L.

    2018-04-01

    Playing an important role in gathering information on social infrastructure damage, Synthetic Aperture Radar (SAR) remote sensing is a useful tool for monitoring earthquake disasters. With the wide application of this technique, a standard method of comparing post-seismic to pre-seismic data has become common. However, multi-temporal SAR processing is not always achievable. Developing a method for building damage detection that relies on post-seismic data only is therefore of great importance. In this paper, the authors initiate an experimental investigation to establish an object-based, feature-analysing classification method for building damage recognition.

  3. Deviant Earthquakes: Data-driven Constraints on the Variability in Earthquake Source Properties and Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Trugman, Daniel Taylor

    The complexity of the earthquake rupture process makes earthquakes inherently unpredictable. Seismic hazard forecasts often presume that the rate of earthquake occurrence can be adequately modeled as a space-time homogeneous or stationary Poisson process and that the dynamical source properties of small and large earthquakes obey self-similar scaling relations. While these simplified models provide useful approximations and encapsulate the first-order statistical features of the historical seismic record, they are inconsistent with the complexity underlying earthquake occurrence and can lead to misleading assessments of seismic hazard when applied in practice. The six principal chapters of this thesis explore the extent to which the behavior of real earthquakes deviates from these simplified models, and the implications that the observed deviations have for our understanding of earthquake rupture processes and seismic hazard. Chapter 1 provides a brief thematic overview and introduction to the scope of this thesis. Chapter 2 examines the complexity of the 2010 M7.2 El Mayor-Cucapah earthquake, focusing on the relation between its unexpected and unprecedented occurrence and anthropogenic stresses from the nearby Cerro Prieto Geothermal Field. Chapter 3 compares long-term changes in seismicity within California's three largest geothermal fields in an effort to characterize the relative influence of natural and anthropogenic stress transients on local seismic hazard. Chapter 4 describes a hybrid, hierarchical clustering algorithm that can be used to relocate earthquakes using waveform cross-correlation, and applies the new algorithm to study the spatiotemporal evolution of two recent seismic swarms in western Nevada. Chapter 5 describes a new spectral decomposition technique that can be used to analyze the dynamic source properties of large datasets of earthquakes, and applies this approach to revisit the question of self-similar scaling of southern California seismicity. Chapter 6 builds upon these results and applies the same spectral decomposition technique to examine the source properties of several thousand recent earthquakes in southern Kansas that were likely induced by massive oil and gas operations in the region. Chapter 7 studies the connection between source spectral properties and earthquake hazard, focusing on spatial variations in dynamic stress drop and its influence on ground motion amplitudes. Finally, Chapter 8 provides a summary of the key findings of, and relations between, these studies, and outlines potential avenues of future research.

  4. 30 CFR 280.51 - What types of geophysical data and information must I submit to MMS?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., shallow and deep subbottom profiles, bathymetry, sidescan sonar, gravity and magnetic surveys, and special... and of a quality suitable for processing; (c) Processed geophysical information derived from seismic... interpretive evaluation, reflecting state-of-the-art processing techniques; and (d) Other geophysical data...

  5. Enhancement of seismic monitoring in hydrocarbon reservoirs

    NASA Astrophysics Data System (ADS)

    Caffagni, Enrico; Bokelmann, Götz

    2017-04-01

    Hydraulic Fracturing (HF) is widely considered one of the most significant enablers of the successful exploitation of hydrocarbons in North America. Massive usage of HF is currently adopted to increase the permeability in shale and tight-sand deep reservoirs, despite the economic downturn. The success of exploitation owes less to the subsurface geology than to technology that improves exploration, production, and decision-making. This includes monitoring of the reservoir, which is vital. Indeed, the general mindset in the industry is to keep enhancing seismic monitoring. It allows understanding and tracking of processes in hydrocarbon reservoirs, which serves two purposes: a) to optimize recovery, and b) to help minimize environmental impact. This raises the question of how monitoring, and especially seismic techniques, could be more efficient. There is a pressing demand on the seismic service industry to evolve quickly and to meet the oil and gas industry's changing needs. Nonetheless, to achieve this purpose, innovative monitoring techniques must enhance the characterization of the reservoir or the visualization of superior-quality images of it. We discuss recent applications of seismic monitoring in hydrocarbon reservoirs, detailing potential enhancements and eventual limitations. The aim is to test the validity of these seismic monitoring techniques and to qualitatively discuss their potential application to energy fields not limited to HF. Outcomes from our investigation may benefit operators and regulators in case of future massive HF applications in Europe as well. This work is part of the FracRisk consortium (www.fracrisk.eu), funded by the Horizon2020 research programme, whose aim is to help minimize the environmental footprint of shale-gas exploration and exploitation.

  6. An active seismic experiment at Tenerife Island (Canary Island, Spain): Imaging an active volcano edifice

    NASA Astrophysics Data System (ADS)

    Garcia-Yeguas, A.; Ibañez, J. M.; Rietbrock, A.; Tom-Teidevs, G.

    2008-12-01

    An active seismic experiment to study the internal structure of Teide Volcano was carried out on Tenerife, a volcanic island in Spain's Canary Islands. The main objective of the TOM-TEIDEVS experiment is to obtain a 3-dimensional structural image of Teide Volcano using seismic tomography and seismic reflection/refraction imaging techniques. At present, knowledge of the deeper structure of Teide and Tenerife is very limited, with proposed structural models mainly based on sparse geophysical and geological data. This multinational experiment, which involves institutes from Spain, Italy, the United Kingdom, Ireland, and Mexico, will generate a unique high-resolution structural image of the active volcano edifice and will further our understanding of volcanic processes.

  7. Processing grounded-wire TEM signal in time-frequency-pseudo-seismic domain: A new paradigm

    NASA Astrophysics Data System (ADS)

    Khan, M. Y.; Xue, G. Q.; Chen, W.; Huasen, Z.

    2017-12-01

    Grounded-wire TEM has received great attention in mineral, hydrocarbon and hydrogeological investigations over the last several years. Conventionally, TEM soundings have been presented as apparent resistivity curves as a function of time. With the development of sophisticated computational algorithms, it became possible to extract more realistic geoelectric information by applying inversion programs to 1-D and 3-D problems. Here, we analyze grounded-wire TEM data by carrying out analysis in the time, frequency and pseudo-seismic domains, supported by borehole information. At first, H, K, A and Q type geoelectric models are processed using a proven inversion program (1-D Occam inversion). Second, a time-to-frequency transformation is conducted from TEM ρa(t) curves to magnetotelluric (MT) ρa(f) curves for the same models based on all-time apparent resistivity curves. Third, the 1-D Bostick algorithm was applied to the transformed resistivity. Finally, the EM diffusion field is transformed into a propagating wave field obeying the standard wave equation using a wavelet transformation technique, and a pseudo-seismic section is constructed. The transformed seismic-like wave indicates that some reflection and refraction phenomena appear when the EM wave field interacts with geoelectric interfaces at different depth intervals due to contrasts in resistivity. The resolution of the transformed TEM data is significantly improved in comparison to apparent resistivity plots. A case study illustrates the successful hydrogeophysical application of the proposed approach in recovering a water-filled mined-out area in a coal field located in Ye county, Henan province, China. The results support the introduction of pseudo-seismic imaging technology in the short-offset version of TEM, which can also be a useful aid if integrated with the seismic reflection technique to explore possibilities for high-resolution EM imaging in the future.
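
    For the third step above, a hedged sketch of the standard Niblett-Bostick transformation (apparent resistivity versus period converted to an approximate resistivity-depth profile) is given below; the sample sounding curve is invented for illustration, and the formula shown is the textbook form, which may differ in detail from the implementation used by the authors.

```python
import numpy as np

MU0 = 4e-7 * np.pi   # vacuum permeability [H/m]

def bostick_transform(period, rho_a):
    """Niblett-Bostick depth-resistivity estimate from an MT sounding curve.

    depth D(T)  = sqrt(rho_a * T / (2 * pi * mu0))
    rho_B(D)    = rho_a * (1 + m) / (1 - m),  m = d(log rho_a)/d(log T)
    """
    log_T, log_rho = np.log(period), np.log(rho_a)
    m = np.gradient(log_rho, log_T)               # slope of the sounding curve
    depth = np.sqrt(rho_a * period / (2.0 * np.pi * MU0))
    rho_b = rho_a * (1.0 + m) / (1.0 - m)
    return depth, rho_b

# invented sounding curve: resistivity rising with period (deeper = more resistive)
T = np.logspace(-3, 1, 30)          # periods [s]
rho_a = 50.0 * (1.0 + T**0.3)       # apparent resistivity [ohm-m]
d, rho = bostick_transform(T, rho_a)
print(d[::10])     # depths [m] sampled along the curve
print(rho[::10])   # Bostick resistivity estimates [ohm-m]
```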

  8. Improved Phase Characterization of Far-Regional Body Wave Arrivals in Central Asia

    DTIC Science & Technology

    2009-09-30

    array processing techniques. The regional seismic arrays that have been built in the last fifteen years should be a rich data source for the study of ... far-regional phase behavior. The arrays are composed of high-quality borehole seismometers that make high fidelity, low-noise recordings. However ... that propagate from the different seismic regions of South-Central Asia, utilizing recordings from the Makanchi (MKAR) and Karatau (KKAR) arrays in

  9. Time-resolved seismic tomography at the EGS geothermal reservoir of Soultz-Sous-Forêts (France) during hydraulic stimulations. A comparison between different injection tests

    NASA Astrophysics Data System (ADS)

    Dorbath, C.; Calo, M.; Cornet, F.; Frogneux, M.

    2011-12-01

    One major goal of monitoring the seismicity accompanying hydraulic fracturing of a reservoir is to recover the seismic velocity field in and around the geothermal site. Several studies have shown that 4D (time-dependent) seismic tomography is very useful for illustrating and studying the temporal variation of seismic velocities conditioned by injected fluids. However, only an appropriate separation of the data into subsets and a reliable tomographic method allow representative variations of the seismic velocities to be studied during and after the injection periods. We present here new 4D seismic tomographies performed using datasets from stimulation tests performed at the Enhanced Geothermal System (EGS) site of Soultz-sous-Forêts (Alsace, France). The data used were recorded during the stimulation tests of 2000, 2003 and 2004, which involved the wells GPK2, GPK3 and GPK4. For each set of events, the subsetting of the data was performed by taking into account the injection parameters of the stimulation tests (namely the injected flow rate and the wellhead pressure). The velocity models have been obtained using the double-difference tomographic method (Zhang and Thurber, 2003) and further improved with the post-processing WAM technique (Calo' et al., 2009, 2011). This technique proved very powerful because it combines high resolution and reliability of the calculated seismic velocity fields, even with small datasets. In this work we show the complete sequence of the time-lapse tomographies and their variations in time and between different stimulation tests.

  10. Seismic and acoustic signal identification algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LADD,MARK D.; ALAM,M. KATHLEEN; SLEEFE,GERARD E.

    2000-04-03

    This paper will describe an algorithm for detecting and classifying seismic and acoustic signals for unattended ground sensors. The algorithm must be computationally efficient and continuously process a data stream in order to establish whether or not a desired signal has changed state (turned on or off). The paper will focus on describing a Fourier-based technique that compares the running power spectral density estimate of the data to a predetermined signature in order to determine if the desired signal has changed state. How to establish the signature and the detection thresholds will be discussed, as well as the theoretical statistics of the algorithm for the Gaussian noise case with results from simulated data. Actual seismic data results will also be discussed, along with techniques used to reduce false alarms due to the inherent nonstationary noise environments found with actual data.
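
    A hedged sketch of the Fourier-based comparison described above is given below: a running Welch power-spectral-density estimate is compared to a stored signature spectrum with a simple spectral-correlation statistic. The window length, threshold and synthetic "source-on" signal are assumptions made for the illustration, not the paper's actual signature or thresholds.

```python
import numpy as np
from scipy.signal import welch

FS = 1000.0          # sample rate [Hz] (assumed)
WIN = 2.0            # analysis window [s] (assumed)

def psd(x):
    f, p = welch(x, fs=FS, nperseg=512)
    return f, p

def matches_signature(window, sig_psd, threshold=0.8):
    """Correlate the running PSD with the signature PSD (in log space)."""
    _, p = psd(window)
    cc = np.corrcoef(np.log(p + 1e-20), np.log(sig_psd + 1e-20))[0, 1]
    return cc > threshold

rng = np.random.default_rng(3)
t = np.arange(0, WIN, 1.0 / FS)
machine = lambda: (np.sin(2*np.pi*60*t) + 0.5*np.sin(2*np.pi*120*t)
                   + 0.2*rng.normal(size=t.size))      # synthetic "on" signal
noise = lambda: rng.normal(size=t.size)                # background only

_, signature = psd(machine())      # predetermined signature (training window)
print(matches_signature(machine(), signature))   # expected True  -> source on
print(matches_signature(noise(), signature))     # expected False -> source off
```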

  11. Analysis of volcano-related seismicity to constrain the magmatic plumbing system beneath Fogo, Cape Verde, by (multi-)array techniques

    NASA Astrophysics Data System (ADS)

    Dietrich, Carola; Wölbern, Ingo; Faria, Bruno; Rümpker, Georg

    2017-04-01

    Fogo is the only island of the Cape Verde archipelago with regularly occurring volcanic eruptions since its discovery in the 15th century. The volcanism of the archipelago originates from a mantle plume beneath an almost stationary tectonic plate. With an eruption interval of approximately 20 years, Fogo is among the most active oceanic volcanoes. The latest eruption started in November 2014 and ceased in February 2015. This study aims to characterize and investigate the seismic activity and the magmatic plumbing system of Fogo, which is believed to be related to a magmatic source close to the neighboring island of Brava. According to previous studies using conventional seismic network configurations, most of the seismic activity occurs offshore. Therefore, seismological array techniques represent powerful tools for investigating earthquakes and other volcano-related events located outside the networks. Another advantage of seismic arrays is their ability to detect events of relatively small magnitude and to locate seismic signals without a clear onset of phases, such as volcanic tremors. Since October 2015 we have been operating a test array on Fogo as part of a pilot study. This array consists of 10 seismic stations, distributed in a circular shape with an aperture of 700 m. The stations are equipped with Omnirecs CUBE dataloggers and either 4.5 Hz geophones (7 stations) or Trillium-Compact broad-band seismometers (3 stations). In January 2016 we installed three additional broad-band stations distributed across the island of Fogo to improve the capabilities for event localization. The data of the pilot study are dominated by seismic activity around Brava, but also exhibit tremors and hybrid events of unknown origin within the caldera of Fogo volcano. The preliminary analysis of these events includes the characterization and localization of the different event types using seismic array processing in combination with conventional localization methods. At the beginning of August 2016, a "seismic crisis" occurred on the island of Brava, which led to the evacuation of a village. The seismic activity recorded by our instruments on Fogo includes more than 40 earthquakes during this time. Locations and magnitudes of these events will be presented. In January 2017 the pilot project discussed here will be complemented by three additional seismic arrays (two on Fogo, one on Brava) to improve seismic event localization and structural imaging based on scattered seismic phases using multi-array techniques. Initial recordings from the new arrays are expected to be available by April 2017.

  12. Application of seismic-refraction techniques to hydrologic studies

    USGS Publications Warehouse

    Haeni, F.P.

    1986-01-01

    During the past 30 years, seismic-refraction methods have been used extensively in petroleum, mineral, and engineering investigations, and to some extent for hydrologic applications. Recent advances in equipment, sound sources, and computer interpretation techniques make seismic refraction a highly effective and economical means of obtaining subsurface data in hydrologic studies. Aquifers that can be defined by one or more high seismic-velocity surfaces, such as (1) alluvial or glacial deposits in consolidated rock valleys, (2) limestone or sandstone underlain by metamorphic or igneous rock, or (3) saturated unconsolidated deposits overlain by unsaturated unconsolidated deposits, are ideally suited for applying seismic-refraction methods. These methods allow the economical collection of subsurface data, provide the basis for more efficient collection of data by test drilling or aquifer tests, and result in improved hydrologic studies. This manual briefly reviews the basics of seismic-refraction theory and principles. It emphasizes the use of this technique in hydrologic investigations and describes the planning, equipment, field procedures, and interpretation techniques needed for this type of study. Examples of the use of seismic-refraction techniques in a wide variety of hydrologic studies are presented.
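
    For orientation, the basic two-layer interpretation step that such refraction studies rely on can be sketched from standard intercept-time theory (the velocities, intercept time, and helper name below are illustrative, not taken from the manual):

        import numpy as np

        def refractor_depth(v1, v2, t_intercept):
            """Depth to a planar refractor from the head-wave intercept time (two layers, v2 > v1)."""
            if v2 <= v1:
                raise ValueError("refraction requires a velocity increase with depth")
            return t_intercept * v1 * v2 / (2.0 * np.sqrt(v2**2 - v1**2))

        # e.g. unsaturated over saturated unconsolidated deposits
        print(refractor_depth(600.0, 1800.0, 0.02))   # ~6.4 m to the water table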

  13. Application of seismic-refraction techniques to hydrologic studies

    USGS Publications Warehouse

    Haeni, F.P.

    1988-01-01

    During the past 30 years, seismic-refraction methods have been used extensively in petroleum, mineral, and engineering investigations and to some extent for hydrologic applications. Recent advances in equipment, sound sources, and computer interpretation techniques make seismic refraction a highly effective and economical means of obtaining subsurface data in hydrologic studies. Aquifers that can be defined by one or more high-seismic-velocity surfaces, such as (1) alluvial or glacial deposits in consolidated rock valleys, (2) limestone or sandstone underlain by metamorphic or igneous rock, or (3) saturated unconsolidated deposits overlain by unsaturated unconsolidated deposits, are ideally suited for seismic-refraction methods. These methods allow economical collection of subsurface data, provide the basis for more efficient collection of data by test drilling or aquifer tests, and result in improved hydrologic studies. This manual briefly reviews the basics of seismic-refraction theory and principles. It emphasizes the use of these techniques in hydrologic investigations and describes the planning, equipment, field procedures, and interpretation techniques needed for this type of study. Furthermore, examples of the use of seismic-refraction techniques in a wide variety of hydrologic studies are presented.

  14. Improved microseismic event locations through large-N arrays and wave-equation imaging and inversion

    NASA Astrophysics Data System (ADS)

    Witten, B.; Shragge, J. C.

    2016-12-01

    The recent increased focus on small-scale seismicity (Mw < 4) has come about primarily for two reasons. First, there is an increase in induced seismicity related to injection operations, primarily wastewater disposal, hydraulic fracturing for oil and gas recovery, and geothermal energy production. While the seismicity associated with injection is sometimes felt, it is more often weak. Some weak events are detected on current sparse arrays; however, accurate location of the events often requires a larger number of (multi-component) sensors. This leads to the second reason for an increased focus on small-magnitude seismicity: a greater number of seismometers are being deployed in large-N arrays. The greater number of sensors decreases the detection threshold and therefore significantly increases the number of weak events found. Overall, these two factors bring new challenges and opportunities. Many standard seismological location and inversion techniques are geared toward large, easily identifiable events recorded on a small number of stations. However, with large-N arrays we can detect small events by utilizing multi-trace processing techniques, and increased processing power equips us with tools that employ more complete physics for simultaneously locating events and inverting for P- and S-wave velocity structure. We present a method that uses large-N arrays and wave-equation-based imaging and inversion to jointly locate earthquakes and estimate the elastic velocities of the earth. The technique requires no picking and is thus suitable for weak events. We validate the methodology through synthetic and field data examples.

  15. Monitoring the englacial fracture state using virtual-reflector seismology

    NASA Astrophysics Data System (ADS)

    Lindner, F.; Weemstra, C.; Walter, F.; Hadziioannou, C.

    2017-12-01

    Fracturing and changes in the englacial macroscopic water content change the elastic bulk properties of ice bodies. Small seismic velocity variations, resulting from such changes, can be measured using a technique called coda-wave interferometry. Here, coda refers to the later-arriving, multiply scattered waves. Often, this technique is applied to so-called virtual-source responses, which can be obtained using seismic interferometry (a simple crosscorrelation process). Compared to other media (e.g., the Earth's crust), however, ice bodies exhibit relatively little scattering. This complicates the application of coda-wave interferometry to the retrieved virtual-source responses. In this work, we therefore investigate the applicability of coda-wave interferometry to virtual-source responses obtained using two alternative seismic interferometric techniques, namely, seismic interferometry by multidimensional deconvolution (SI by MDD), and virtual-reflector seismology (VRS). To that end, we use synthetic data, as well as active-source glacier data acquired on Glacier de la Plaine Morte, Switzerland. Both SI by MDD and VRS allow the retrieval of more accurate virtual-source responses. In particular, the dependence of the retrieved virtual-source responses on the illumination pattern is reduced. We find that this results in more accurate glacial phase-velocity estimates. In addition, VRS introduces virtual reflections from a receiver contour (partly) enclosing the medium of interest. By acting as a sort of virtual reverberation, the coda resulting from the application of VRS significantly increases seismic monitoring capabilities, in particular in cases where natural scattering coda is not available.
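
    A minimal sketch of the "simple crosscorrelation" form of seismic interferometry mentioned above, turning one receiver into a virtual source for another (the variable names and lag window are illustrative; SI by MDD and VRS replace the correlation by a deconvolution-type step and are not shown):

        import numpy as np

        def virtual_source(trace_a, trace_b, fs, max_lag_sec=2.0):
            """Crosscorrelate two receivers: the result approximates the response at B
            to a virtual source at A (causal and acausal parts at positive/negative lags)."""
            n = len(trace_a)
            cc = np.correlate(trace_b, trace_a, mode="full")      # lags -(n-1) .. (n-1)
            lags = np.arange(-(n - 1), n) / fs
            keep = np.abs(lags) <= max_lag_sec
            return lags[keep], cc[keep]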

  16. Passive seismic monitoring of natural and induced earthquakes: case studies, future directions and socio-economic relevance

    USGS Publications Warehouse

    Bohnhoff, Marco; Dresen, Georg; Ellsworth, William L.; Ito, Hisao; Cloetingh, Sierd; Negendank, Jörg

    2010-01-01

    An important discovery in crustal mechanics has been that the Earth’s crust is commonly stressed close to failure, even in tectonically quiet areas. As a result, small natural or man-made perturbations to the local stress field may trigger earthquakes. To understand these processes, Passive Seismic Monitoring (PSM) with seismometer arrays is a widely used technique that has been successfully applied to study seismicity at different magnitude levels ranging from acoustic emissions generated in the laboratory under controlled conditions, to seismicity induced by hydraulic stimulations in geological reservoirs, and up to great earthquakes occurring along plate boundaries. In all these environments the appropriate deployment of seismic sensors, i.e., directly on the rock sample, at the earth’s surface or in boreholes close to the seismic sources allows for the detection and location of brittle failure processes at sufficiently low magnitude-detection threshold and with adequate spatial resolution for further analysis. One principal aim is to develop an improved understanding of the physical processes occurring at the seismic source and their relationship to the host geologic environment. In this paper we review selected case studies and future directions of PSM efforts across a wide range of scales and environments. These include induced failure within small rock samples, hydrocarbon reservoirs, and natural seismicity at convergent and transform plate boundaries. Each example represents a milestone with regard to bridging the gap between laboratory-scale experiments under controlled boundary conditions and large-scale field studies. The common motivation for all studies is to refine the understanding of how earthquakes nucleate, how they proceed and how they interact in space and time. This is of special relevance at the larger end of the magnitude scale, i.e., for large devastating earthquakes due to their severe socio-economic impact.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bame, D.

    To determine if seismic signals at frequencies up to 50 Hz are useful for detecting events and discriminating between earthquakes and explosions, approximately 180 events from the three-component high-frequency seismic element (HFSE) installed at the center of the Norwegian Regional Seismic Array (NRSA) have been analyzed. The attenuation of high-frequency signals in Scandinavia varies with distance, azimuth, magnitude, and source effects. Most of the events were detected with the HFSE, although detections were better on the NRSA, where signal processing techniques were used. Based on a preliminary analysis, high-frequency data do not appear to be a useful discriminant in Scandinavia. 21 refs., 29 figs., 3 tabs.

  18. Cluster Computing For Real Time Seismic Array Analysis.

    NASA Astrophysics Data System (ADS)

    Martini, M.; Giudicepietro, F.

    A seismic array is an instrument composed of a dense distribution of seismic sensors that allows measurement of the directional properties of the wavefield (slowness or wavenumber vector) radiated by a seismic source. Over the last years arrays have been widely used in different fields of seismological research. In particular they are applied in the investigation of seismic sources on volcanoes, where they can be successfully used for studying the volcanic microtremor and long-period events which are critical for getting information on the evolution of volcanic systems. For this reason arrays could be usefully employed for volcano monitoring; however, the huge amount of data produced by this type of instrument and the quite time-consuming processing techniques have limited their potential for this application. In order to favor a direct application of array techniques to continuous volcano monitoring we designed and built a small PC cluster able to compute in near real time the kinematic properties of the wavefield (slowness or wavenumber vector) produced by local seismic sources. The cluster is composed of 8 dual-processor Intel Pentium-III PCs working at 550 MHz, and has 4 Gigabytes of RAM memory. It runs under the Linux operating system. The developed analysis software package is based on the Multiple SIgnal Classification (MUSIC) algorithm and is written in Fortran. The message-passing part is based upon the LAM programming environment package, an open-source implementation of the Message Passing Interface (MPI). The developed software system includes modules devoted to receiving data over the Internet and graphical applications for continuously displaying the processing results. The system has been tested with a data set collected during a seismic experiment conducted on Etna in 1999, when two dense seismic arrays were deployed on the northeast and the southeast flanks of this volcano. A real-time continuous acquisition system was simulated by a program which reads data from disk files and sends them to a remote host using the Internet protocols.
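
    The kinematic wavefield properties mentioned above (the slowness vector) are estimated with the MUSIC algorithm; a narrow-band, single-window sketch of that estimator is shown below (the grid spacing, analysis frequency, and assumption of one plane-wave source are illustrative; the cluster implementation is in Fortran/MPI and averages many windows):

        import numpy as np

        def music_slowness(data, coords, fs, f0, s_max=1.0, n_grid=81, n_src=1):
            """Narrow-band MUSIC pseudo-spectrum over a horizontal slowness grid.
            data: (n_sta, n_samp) window, coords: (n_sta, 2) in km, f0 in Hz, slowness in s/km."""
            n_sta, n_samp = data.shape
            freqs = np.fft.rfftfreq(n_samp, d=1.0 / fs)
            X = np.fft.rfft(data, axis=1)[:, np.argmin(np.abs(freqs - f0))]
            R = np.outer(X, X.conj())                 # single-window covariance; average in practice
            w, V = np.linalg.eigh(R)
            En = V[:, :n_sta - n_src]                 # noise subspace (smallest eigenvalues first)
            s_grid = np.linspace(-s_max, s_max, n_grid)
            P = np.zeros((n_grid, n_grid))
            for i, sx in enumerate(s_grid):
                for j, sy in enumerate(s_grid):
                    a = np.exp(-2j * np.pi * f0 * (coords @ np.array([sx, sy])))
                    P[j, i] = 1.0 / np.real(a.conj() @ En @ En.conj().T @ a)
            j, i = np.unravel_index(np.argmax(P), P.shape)
            return s_grid[i], s_grid[j], P            # slowness vector (sx, sy) of the peak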

  19. Towards Quantification of Glacier Dynamic Ice Loss through Passive Seismic Monitoring

    NASA Astrophysics Data System (ADS)

    Köhler, A.; Nuth, C.; Weidle, C.; Schweitzer, J.; Kohler, J.; Buscaino, G.

    2015-12-01

    Global glaciers and ice caps lose mass through calving, while existing models are currently not equipped to realistically predict dynamic ice loss. This is mainly because long-term continuous calving records, which would help to better understand fine-scale processes and key climatic-dynamic feedbacks between calving, climate, terminus evolution and marine conditions, do not exist. Combined passive seismic/acoustic strategies are the only technique able to capture rapid calving events continuously, independent of daylight or meteorological conditions. We have produced such a continuous calving record for Kronebreen, a tidewater glacier in Svalbard, using data from permanent seismic stations between 2001 and 2014. However, currently no method has been established in cryo-seismology to quantify the calving ice loss directly from seismic data. Independent calibration data are required to derive 1) a realistic estimation of the dynamic ice loss unobserved due to seismic noise and 2) a robust scaling of seismic calving signals to ice volumes. Here, we analyze the seismic calving record at Kronebreen and independent calving data in a first attempt to quantify ice loss directly from seismic records. We make use of a) calving flux data with weekly to monthly resolution obtained from satellite remote sensing and GPS data between 2007 and 2013, and b) direct, visual calving observations during two weeks in 2009 and 2010. Furthermore, the magnitude-scaling property of seismic calving events is analyzed. We derive and discuss an empirical relation between seismic calving events and calving flux which for the first time allows estimation of a time series of calving volumes more than a decade back in time. Improving our model requires incorporating more precise, high-resolution calibration data. A new field campaign will combine innovative, multi-disciplinary monitoring techniques to measure calving ice volumes and dynamic ice-ocean interactions simultaneously with terrestrial laser scanning and a temporary seismic/underwater-acoustic network.

  20. Digital processing of array seismic recordings

    USGS Publications Warehouse

    Ryall, Alan; Birtill, John

    1962-01-01

    This technical letter contains a brief review of the operations which are involved in digital processing of array seismic recordings by the methods of velocity filtering, summation, cross-multiplication and integration, and by combinations of these operations (the "UK Method" and multiple correlation). Examples are presented of analyses by the several techniques on array recordings which were obtained by the U.S. Geological Survey during chemical and nuclear explosions in the western United States. Seismograms are synthesized using actual noise and Pn-signal recordings, such that the signal-to-noise ratio, onset time and velocity of the signal are predetermined for the synthetic record. These records are then analyzed by summation, cross-multiplication, multiple correlation and the UK technique, and the results are compared. For all of the examples presented, analyses by the non-linear techniques of multiple correlation and cross-multiplication of the traces on an array recording are preferred to analyses by the linear operations involved in summation and the UK Method.
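
    The summation operation reviewed in the letter amounts to removing a trial moveout across the array and stacking (delay-and-sum beamforming); a minimal sketch for an in-line array is given below (the variable names and the use of sample-rounded, wrap-around shifts are illustrative simplifications):

        import numpy as np

        def delay_and_sum(traces, offsets, fs, app_velocity):
            """Steer an in-line array to a trial apparent velocity (km/s) and stack.
            traces: (n_sta, n_samp), offsets in km along the line."""
            beam = np.zeros(traces.shape[1])
            for tr, x in zip(traces, offsets):
                shift = int(round(fs * x / app_velocity))   # samples of moveout to remove
                beam += np.roll(tr, -shift)                 # np.roll wraps; pad in real use
            return beam / len(traces)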

  1. A Comparison of seismic instrument noise coherence analysis techniques

    USGS Publications Warehouse

    Ringler, A.T.; Hutt, C.R.; Evans, J.R.; Sandoval, L.D.

    2011-01-01

    The self-noise of a seismic instrument is a fundamental characteristic used to evaluate the quality of the instrument. It is important to be able to measure this self-noise robustly, to understand how differences among test configurations affect the tests, and to understand how different processing techniques and isolation methods (from nonseismic sources) can contribute to differences in results. We compare two popular coherence methods used for calculating incoherent noise, which is widely used as an estimate of instrument self-noise (incoherent noise and self-noise are not strictly identical but in observatory practice are approximately equivalent; Holcomb, 1989; Sleeman et al., 2006). Beyond directly comparing these two coherence methods on similar models of seismometers, we compare how small changes in test conditions can contribute to incoherent-noise estimates. These conditions include timing errors, signal-to-noise ratio changes (ratios between background noise and instrument incoherent noise), relative sensor locations, misalignment errors, processing techniques, and different configurations of sensor types.
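
    A minimal two-sensor sketch of the coherence idea behind such self-noise estimates, attributing the part of one sensor's PSD that is incoherent with a co-located reference to instrument noise (the Welch parameters are illustrative; the Sleeman et al. three-sensor method compared in the paper differs in detail):

        import numpy as np
        from scipy.signal import welch, coherence

        def incoherent_noise(x, y, fs, nperseg=4096):
            """Part of sensor x's PSD that is incoherent with a co-located reference y,
            taken as an estimate of x's instrument noise."""
            f, pxx = welch(x, fs=fs, nperseg=nperseg)
            _, cxy = coherence(x, y, fs=fs, nperseg=nperseg)   # magnitude-squared coherence
            return f, pxx * (1.0 - cxy)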

  2. Patterns in Seismicity at Mt St Helens and Mt Unzen

    NASA Astrophysics Data System (ADS)

    Lamb, Oliver; De Angelis, Silvio; Lavallee, Yan

    2014-05-01

    Cyclic behaviour on a range of timescales is a well-documented feature of many dome-forming volcanoes. Previous work on Soufrière Hills volcano (Montserrat) and Volcán de Colima (Mexico) revealed broad-scale similarities in behaviour, implying the potential to develop general physical models of sub-surface processes [1]. Using volcano-seismic data from Mt St Helens (USA) and Mt Unzen (Japan), this study explores parallels in the long-term behaviour of seismicity at two dome-forming systems. Within the last twenty years both systems underwent extended dome-forming episodes accompanied by large Vulcanian explosions or dome collapses. This study uses a suite of quantitative and analytical techniques which can highlight differences or similarities in volcano-seismic behaviour, and compares that behaviour to changes in activity during the eruptive episodes. Seismic events were automatically detected and characterized on a single short-period seismometer station located 1.5 km from the 2004-2008 vent at Mt St Helens. A total of 714 826 individual events were identified from continuous recording of seismic data from 22 October 2004 to 28 February 2006 (average 60.2 events per hour) using a short-term/long-term average algorithm. An equivalent count will be produced from seismometer recordings over the later stages of the 1991-1995 eruption at Mt Unzen. The event-count time series from Mt St Helens is then analysed using the multi-taper method and the short-time Fourier transform to explore temporal variations in activity. Preliminary analysis of seismicity from Mt St Helens suggests cyclic behaviour on a subannual timescale, similar to that described at Volcán de Colima and Soufrière Hills volcano [1]. Frequency Index and waveform correlation tools will be implemented to analyse changes in the frequency content of the seismicity and to explore their relations to different phases of activity at the volcano. A single-station approach is used to gain a fine-scale view of variations in seismic behaviour at both volcanoes, with a focus on comparisons with changes in activity, in the hope of gaining a greater understanding of sub-surface processes occurring within the volcanic systems. This approach and the techniques above were successfully implemented at Redoubt Volcano (USA) [2], which also concluded that these techniques may serve an important role in future real-time eruption monitoring efforts. [1] Lamb O., Varley N., Mather T. et al., in prep. Similar Cyclic Behaviour at two lava domes, Volcán de Colima (Mexico) and Soufrière Hills volcano (Montserrat), with implications for monitoring. [2] Ketner, D. & Power, J., 2013. Characterization of seismic events during the 2009 eruption of Redoubt Volcano, Alaska. Journal of Volcanology and Geothermal Research, 259, pp. 45-62.
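
    The event counts described above come from a short-term/long-term average detector; a minimal numpy sketch of such a trigger is given below (the window lengths and threshold are illustrative and are not the parameters used in the study):

        import numpy as np

        def sta_lta_triggers(trace, fs, sta_sec=1.0, lta_sec=30.0, on=3.5):
            """Sample indices where a classical STA/LTA energy ratio first exceeds `on`."""
            sta_n, lta_n = int(sta_sec * fs), int(lta_sec * fs)
            csum = np.cumsum(trace.astype(float) ** 2)
            sta = (csum[sta_n:] - csum[:-sta_n]) / sta_n
            lta = (csum[lta_n:] - csum[:-lta_n]) / lta_n
            n = min(len(sta), len(lta))                       # align both series to the trace end
            ratio = sta[-n:] / np.maximum(lta[-n:], 1e-12)
            onsets = np.where((ratio[1:] >= on) & (ratio[:-1] < on))[0] + 1
            return onsets + (len(trace) - n)                  # convert back to trace samples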

  3. Porosity Estimation By Artificial Neural Networks Inversion . Application to Algerian South Field

    NASA Astrophysics Data System (ADS)

    Eladj, Said; Aliouane, Leila; Ouadfeul, Sid-Ali

    2017-04-01

    One of the main current challenges for geophysicists is the discovery and study of stratigraphic traps, a difficult task that requires very fine analysis of the seismic data. Seismic data inversion allows lithological and stratigraphic information to be obtained for reservoir characterization. However, when solving the inverse problem we encounter difficulties such as non-existence and non-uniqueness of the solution, compounded by the instability of the processing algorithm. Therefore, uncertainties in the data and the non-linearity of the relationship between the data and the parameters must be taken seriously. In this case, artificial intelligence techniques such as Artificial Neural Networks (ANN) are used to resolve this ambiguity; this can be done by integrating data on different physical properties, which requires supervised learning methods. In this work, we first invert the 3D seismic cube for acoustic impedance using the colored inversion method; the acoustic impedance volume resulting from this step is then introduced as an input to a model-based inversion to calculate the porosity volume using a Multilayer Perceptron Artificial Neural Network. Application to an Algerian South hydrocarbon field clearly demonstrates the power of the proposed processing technique to predict porosity from seismic data; the results obtained can be used for reserves estimation, permeability prediction, recovery factor and reservoir monitoring. Keywords: Artificial Neural Networks, inversion, non-uniqueness, nonlinear, 3D porosity volume, reservoir characterization.
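
    A minimal sketch of the supervised step described above, training a multilayer perceptron to map inverted acoustic impedance to porosity (the synthetic training relationship, network size, and library choice are illustrative assumptions, not the field workflow):

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.preprocessing import StandardScaler

        np.random.seed(0)
        # Hypothetical well-location training pairs: inverted acoustic impedance vs. log porosity
        ai_train = np.random.uniform(4.0e6, 9.0e6, 500).reshape(-1, 1)
        phi_train = 0.4 - 3.5e-8 * ai_train.ravel() + 0.01 * np.random.randn(500)

        scaler = StandardScaler().fit(ai_train)
        mlp = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
        mlp.fit(scaler.transform(ai_train), phi_train)

        # Apply trace by trace to the inverted impedance cube (dummy trace here)
        ai_trace = np.linspace(4.5e6, 8.5e6, 300).reshape(-1, 1)
        porosity_trace = mlp.predict(scaler.transform(ai_trace))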

  4. Sand dune effects on seismic data

    NASA Astrophysics Data System (ADS)

    Arran, M.; Vriend, N. M.; Muyzert, E. J.

    2017-12-01

    Ground roll is a significant source of noise in land seismic data, with cross-line scattered ground roll particularly difficult to suppress. This noise arises from surface heterogeneities lateral to the receiver spread, and in desert regions sand dunes are a major contributor. However, the nature of this noise is poorly understood, preventing the design of more effective data acquisition or processing techniques. Here, we present numerical simulations demonstrating that sand dunes can act as resonators, scattering a seismic signal over an extensive period of time. We introduce a mathematical framework that quantitatively describes the properties of noise scattered by a barchan dune, and we discuss the relevance of heterogeneities within the dune. Having identified regions in time, space, and frequency space at which noise will be more significant, we propose the possibility of reducing dune-scattered noise through careful survey design and data processing.

  5. Improved 3D seismic images of dynamic deformation in the Nankai Trough off Kumano

    NASA Astrophysics Data System (ADS)

    Shiraishi, K.; Moore, G. F.; Yamada, Y.; Kinoshita, M.; Sanada, Y.; Kimura, G.

    2016-12-01

    In order to improve the seismic reflection image of dynamic deformation and seismogenic faults in the Nankai Trough, the 2006 Kumano 3D seismic dataset was reprocessed from the original field records by applying advanced technologies a decade after the data acquisition and initial processing. The 3D seismic survey revealed the geometry of the megasplay fault system. However, unclear regions remained in the accretionary prism beneath the area from the Kumano Basin to the outer ridge, because of seafloor multiple reflections and noise caused by the Kuroshio Current. For the next stage of deep scientific drilling into the Nankai Trough seismogenic zone, it is essential to know exactly the shape and depth of the megasplay, and the fine structures around the drilling site. Three important improvements were achieved in data processing before imaging. First, full deghosting and optimized zero-phasing techniques recovered broadband signals, especially at low frequencies, by compensating for ghost effects at both source and receiver and removing source bubbles. Second, the multiple reflections were better attenuated by applying advanced techniques in combination, and the strong noise caused by the Kuroshio Current was carefully attenuated. Third, data regularization by means of optimized 4D trace interpolation was effective both in mitigating the non-uniform fold distribution and in improving data quality. Further imaging steps led to clear improvements over previous results by applying PSTM with higher-order correction of VTI anisotropy, and PSDM based on a velocity model built by reflection tomography with TTI anisotropy. The final reflection images show new geological features, such as clear steeply dipping faults around the "notch" and fine-scale faults related to the main thrusts in the frontal thrust zone. The improved images will contribute greatly to understanding the deformation process in the old accretionary prism and the seismogenic features related to the megasplay faults.

  6. Outstanding challenges in the seismological study of volcanic processes: Results from recent U.S. and European community-wide discussion workshops

    NASA Astrophysics Data System (ADS)

    Roman, D. C.; Rodgers, M.; Mather, T. A.; Power, J. A.; Pyle, D. M.

    2014-12-01

    Observations of volcanically induced seismicity are essential for eruption forecasting and for real-time and near-real-time warnings of hazardous volcanic activity. Studies of volcanic seismicity and of seismic wave propagation also provide critical understanding of subsurface magmatic systems and the physical processes associated with magma genesis, transport, and eruption. However, despite significant advances in recent years, our ability to successfully forecast volcanic eruptions and fully understand subsurface volcanic processes is limited by our current understanding of the source processes of volcano-seismic events, the effects of volcanic structures on seismic wave propagation, limited data, and even the non-standardized terminology used to describe seismic waveforms. Progress in volcano seismology is further hampered by inconsistent data formats and standards, lack of state-of-the-art hardware and professional technical staff, as well as a lack of widely adopted analysis techniques and software. Addressing these challenges will not only advance scientific understanding of volcanoes, but will also lead to more accurate forecasts and warnings of hazardous volcanic eruptions that would ultimately save lives and property world-wide. Two recent workshops held in Anchorage, Alaska, and Oxford, UK, represent important steps towards developing a relationship among members of the academic community and government agencies, focused around a shared, long-term vision for volcano seismology. Recommendations arising from the two workshops fall into six categories: 1) ongoing and enhanced community-wide discussions, 2) data and code curation and dissemination, 3) code development, 4) development of resources for more comprehensive data mining, 5) enhanced strategic seismic data collection, and 6) enhanced integration of multiple datasets (including seismicity) to understand all states of volcano activity through space and time. As presented sequentially above, these steps can be regarded as a road map for galvanizing and strengthening the volcano seismological community to drive new scientific and technical progress over the next 5-10 years.

  7. Insight into subdecimeter fracturing processes during hydraulic fracture experiment in Äspö hard rock laboratory, Sweden

    NASA Astrophysics Data System (ADS)

    Kwiatek, Grzegorz; Martínez-Garzón, Patricia; Plenkers, Katrin; Leonhardt, Maria; Zang, Arno; Dresen, Georg; Bohnhoff, Marco

    2017-04-01

    We analyze the nano- and picoseismicity recorded during a hydraulic fracturing in-situ experiment performed in the Äspö Hard Rock Laboratory, Sweden. The fracturing experiment included six fracture stages driven by three different water injection schemes (continuous, progressive and pulse pressurization) and was performed inside a 28 m long, horizontal borehole located at 410 m depth. The fracturing process was monitored with two different seismic networks covering a wide frequency band between 0.01 Hz and 100,000 Hz, which included broadband seismometers, geophones, high-frequency accelerometers and acoustic emission sensors. The combined seismic network allowed for detection and detailed analysis of seismicity with moment magnitudes MW < -4 (source sizes approximately on the cm scale) that occurred solely during the hydraulic fracturing and refracturing stages. We relocated the seismicity catalog using the double-difference technique and calculated the source parameters (seismic moment, source size, stress drop, focal mechanism and seismic moment tensors). The physical characteristics of the induced seismicity are compared to the stimulation parameters and to the formation parameters of the site. The seismic activity varies significantly depending on the stimulation strategy, with conventional, continuous stimulation being the most seismogenic. We find a systematic spatio-temporal migration of microseismic events (propagation away from and towards the wellbore injection interval) and temporal transitions in source mechanisms (opening - shearing - collapse), both being controlled by changes in fluid injection pressure. The derived focal mechanism parameters are in accordance with the local stress field orientation, and signify the reactivation of pre-existing rock flaws. The seismicity follows statistical and source scaling relations observed at different scales elsewhere, however at an extremely low level of seismic efficiency.

  8. Locating hydrothermal acoustic sources at Old Faithful Geyser using Matched Field Processing

    NASA Astrophysics Data System (ADS)

    Cros, E.; Roux, P.; Vandemeulebrouck, J.; Kedar, S.

    2011-10-01

    In 1992, a large and dense array of geophones was placed around the geyser vent of Old Faithful, in Yellowstone National Park, to determine the origin of the seismic hydrothermal noise recorded at the surface of the geyser and to understand its dynamics. Old Faithful Geyser (OFG) is a small-scale hydrothermal system where a two-phase flow mixture erupts every 40 to 100 min in a high, continuous vertical jet. Using Matched Field Processing (MFP) techniques on 10-min-long signals, we localize the source of the seismic pulses recorded at the surface of the geyser. Several MFP approaches are compared in this study: the frequency-incoherent and frequency-coherent approaches, as well as the linear Bartlett processing and the non-linear Minimum Variance Distortionless Response (MVDR) processing. The different MFP techniques used give the same source position, with better focusing in the case of the MVDR processing. The retrieved source position corresponds to the geyser conduit at a depth of 12 m, and the localization is in good agreement with in situ measurements made at Old Faithful in past studies.
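
    A minimal single-frequency sketch of the linear Bartlett processor used here, scanning candidate source positions with homogeneous-medium replica vectors (the assumed velocity and variable names are illustrative; the MVDR variant replaces the quadratic form by 1/(w^H K^-1 w)):

        import numpy as np

        def bartlett_mfp(csdm, freq, rec_xyz, grid_xyz, c=1500.0):
            """Bartlett matched-field power over candidate source points.
            csdm: (n_rec, n_rec) cross-spectral density matrix at `freq`; coordinates in m."""
            power = np.zeros(len(grid_xyz))
            for i, src in enumerate(grid_xyz):
                t = np.linalg.norm(rec_xyz - src, axis=1) / c   # travel times for the replica model
                w = np.exp(-2j * np.pi * freq * t)
                w /= np.linalg.norm(w)                          # normalized replica vector
                power[i] = np.real(w.conj() @ csdm @ w)         # the peak marks the likely source
            return power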

  9. From Geodesy to Tectonics: Observing Earthquake Processes from Space (Augustus Love Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Parsons, Barry

    2017-04-01

    A suite of powerful satellite-based techniques has been developed over the past two decades allowing us to measure and interpret variations in the deformation around active continental faults occurring in earthquakes, before the earthquakes as strain accumulates, and immediately following them. The techniques include radar interferometry and the measurement of vertical and horizontal surface displacements using very high-resolution (VHR) satellite imagery. They provide near-field measurements of earthquake deformation facilitating the association with the corresponding active faults and their topographic expression. The techniques also enable pre- and post-seismic deformation to be determined and hence allow the response of the fault and surrounding medium to changes in stress to be investigated. The talk illustrates both the techniques and the applications with examples from recent earthquakes. These include the 2013 Balochistan earthquake, a predominantly strike-slip event, that occurred on the arcuate Hoshab fault in the eastern Makran linking an area of mainly left-lateral shear in the east to one of shortening in the west. The difficulty of reconciling predominantly strike-slip motion with this shortening has led to a wide range of unconventional kinematic and dynamic models. Using pre-and post-seismic VHR satellite imagery, we are able to determine a 3-dimensional deformation field for the earthquake; Sentinel-1 interferometry shows an increase in the rate of creep on a creeping section bounding the northern end of the rupture in response to the earthquake. In addition, we will look at the 1978 Tabas earthquake for which no measurements of deformation were possible at the time. By combining pre-seismic 'spy' satellite images with modern imagery, and pre-seismic aerial stereo images with post-seismic satellite stereo images, we can determine vertical and horizontal displacements from the earthquake and subsequent post-seismic deformation. These observations suggest post-seismic slip concentrated on a thrust ramp at the end of the likely earthquake fault and, together with new radar measurements, can be modeled with slip rates declining approximately inversely with time from the earthquake. Measurements such as these examples provide the basis for investigating the dynamic response to the earthquakes to changes in stress occurring in them.

  10. Wave equation datuming applied to S-wave reflection seismic data

    NASA Astrophysics Data System (ADS)

    Tinivella, U.; Giustiniani, M.; Nicolich, R.

    2018-05-01

    S-wave high-resolution reflection seismic data was processed using Wave Equation Datuming technique in order to improve signal/noise ratio, attenuating coherent noise, and seismic resolution and to solve static corrections problems. The application of this algorithm allowed obtaining a good image of the shallow subsurface geological features. Wave Equation Datuming moves shots and receivers from a surface to another datum (the datum plane), removing time shifts originated by elevation variation and/or velocity changes in the shallow subsoil. This algorithm has been developed and currently applied to P wave, but it reveals the capacity to highlight S-waves images when used to resolve thin layers in high-resolution prospecting. A good S-wave image facilitates correlation with well stratigraphies, optimizing cost/benefit ratio of any drilling. The application of Wave Equation Datuming requires a reliable velocity field, so refraction tomography was adopted. The new seismic image highlights the details of the subsoil reflectors and allows an easier integration with borehole information and geological surveys than the seismic section obtained by conventional CMP reflection processing. In conclusion, the analysis of S-wave let to characterize the shallow subsurface recognizing levels with limited thickness once we have clearly attenuated ground roll, wind and environmental noise.

  11. Updated Tomographic Seismic Imaging at Kilauea Volcano, Hawaii

    NASA Astrophysics Data System (ADS)

    Okubo, P.; Johnson, J.; Felts, E. S.; Flores, N.

    2013-12-01

    Improved and more detailed geophysical, geological, and geochemical observations and measurements at Kilauea, along with prolonged eruptions at its summit caldera and east rift zone, are encouraging more ambitious interpretation and modeling of volcanic processes over a range of temporal and spatial scales. We are updating three-dimensional models of seismic wave-speed distributions within Kilauea using local earthquake arrival time tomography to support waveform-based modeling of seismic source mechanisms. We start from a tomographic model derived from a combination of permanent seismic stations comprising the Hawaiian Volcano Observatory (HVO) seismographic network and a dense deployment of temporary stations in the Kilauea caldera region in 1996. Using P- and S-wave arrival times measured from the HVO network for local earthquakes from 1997 through 2012, we compute velocity models with the finite difference tomographic seismic imaging technique implemented by Benz and others (1996), and applied to numerous volcanoes including Kilauea. Particular impetus to our current modeling was derived from a focused effort to review seismicity occurring in Kilauea's summit caldera and adjoining regions in 2012. Our results reveal clear P-wave low-velocity features at and slightly below sea level beneath Kilauea's summit caldera, lying between Halemaumau Crater and the north-facing scarps that mark the southern caldera boundary. The results are also suggestive of changes in seismic velocity distributions between 1996 and 2012. One example of such a change is an apparent decrease in the size and southeastward extent, compared to the earlier model, of the low VP feature imaged with the more recent data. However, we recognize the distinct possibility that these changes are reflective of differences in earthquake and seismic station distributions in the respective datasets, and we need to further populate the more recent HVO seismicity catalogs to possibly address this concern. We also look forward to more complete implementation at HVO of seismic imaging techniques that use ambient seismic noise retrieved from continuous seismic recordings, and to using earthquake arrival times and ambient seismic noise jointly to tomographically image Kilauea.

  12. A robust calibration technique for acoustic emission systems based on momentum transfer from a ball drop

    USGS Publications Warehouse

    McLaskey, Gregory C.; Lockner, David A.; Kilgore, Brian D.; Beeler, Nicholas M.

    2015-01-01

    We describe a technique to estimate the seismic moment of acoustic emissions and other extremely small seismic events. Unlike previous calibration techniques, it does not require modeling of the wave propagation, sensor response, or signal conditioning. Rather, this technique calibrates the recording system as a whole and uses a ball impact as a reference source or empirical Green's function. To correctly apply this technique, we develop mathematical expressions that link the seismic moment $M_0$ of internal seismic sources (i.e., earthquakes and acoustic emissions) to the impulse, or change in momentum $\Delta p$, of externally applied seismic sources (i.e., meteor impacts or, in this case, ball impact). We find that, at low frequencies, moment and impulse are linked by a constant, which we call the force-moment-rate scale factor $C_{F\dot{M}} = M_0/\Delta p$. This constant is equal to twice the speed of sound in the material from which the seismic sources were generated. Next, we demonstrate the calibration technique on two different experimental rock mechanics facilities. The first example is a saw-cut cylindrical granite sample that is loaded in a triaxial apparatus at 40 MPa confining pressure. The second example is a 2 m long fault cut in a granite sample and deformed in a large biaxial apparatus at lower stress levels. Using the empirical calibration technique, we are able to determine absolute source parameters including the seismic moment, corner frequency, stress drop, and radiated energy of these magnitude −2.5 to −7 seismic events.
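
    Using the force-moment-rate scale factor quoted above, the reference moment of a ball-impact source follows directly from its change in momentum; a small worked example is given below (the ball mass, drop height, restitution coefficient, and wave speed are hypothetical values, not those of the experiments):

        import numpy as np

        c_p = 5500.0              # m/s, assumed wave speed of the granite sample
        m_ball = 0.002            # kg, hypothetical ball mass
        h_drop = 0.30             # m, hypothetical drop height
        restitution = 0.6         # hypothetical coefficient of restitution

        v_impact = np.sqrt(2.0 * 9.81 * h_drop)
        dp = m_ball * v_impact * (1.0 + restitution)   # momentum change of the ball, N*s
        M0_ref = 2.0 * c_p * dp                        # reference moment via M0 = C * dp, C = 2c
        print(M0_ref)                                  # ~85 N*m for these illustrative numbers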

  13. In-situ Planetary Subsurface Imaging System

    NASA Astrophysics Data System (ADS)

    Song, W.; Weber, R. C.; Dimech, J. L.; Kedar, S.; Neal, C. R.; Siegler, M.

    2017-12-01

    Geophysical and seismic instruments are considered the most effective tools for studying the detailed global structures of planetary interiors. A planet's interior bears the geochemical markers of its evolutionary history, as well as its present state of activity, which has direct implications to habitability. On Earth, subsurface imaging often involves massive data collection from hundreds to thousands of geophysical sensors (seismic, acoustic, etc) followed by transfer by hard links or wirelessly to a central location for post processing and computing, which will not be possible in planetary environments due to imposed mission constraints on mass, power, and bandwidth. Emerging opportunities for geophysical exploration of the solar system from Venus to the icy Ocean Worlds of Jupiter and Saturn dictate that subsurface imaging of the deep interior will require substantial data reduction and processing in-situ. The Real-time In-situ Subsurface Imaging (RISI) technology is a mesh network that senses and processes geophysical signals. Instead of data collection then post processing, the mesh network performs the distributed data processing and computing in-situ, and generates an evolving 3D subsurface image in real-time that can be transmitted under bandwidth and resource constraints. Seismic imaging algorithms (including traveltime tomography, ambient noise imaging, and microseismic imaging) have been successfully developed and validated using both synthetic and real-world terrestrial seismic data sets. The prototype hardware system has been implemented and can be extended as a general field instrumentation platform tailored specifically for a wide variety of planetary uses, including crustal mapping, ice and ocean structure, and geothermal systems. The team is applying the RISI technology to real off-world seismic datasets. For example, the Lunar Seismic Profiling Experiment (LSPE) deployed during the Apollo 17 Moon mission consisted of four geophone instruments spaced up to 100 meters apart, which in essence forms a small aperture seismic network. A pattern recognition technique based on Hidden Markov Models was able to characterize this dataset, and we are exploring how the RISI technology can be adapted for this dataset.

  14. Development of Vertical Cable Seismic System

    NASA Astrophysics Data System (ADS)

    Asakawa, E.; Murakami, F.; Sekino, Y.; Okamoto, T.; Ishikawa, K.; Tsukahara, H.; Shimura, T.

    2011-12-01

    In 2009, the Ministry of Education, Culture, Sports, Science and Technology (MEXT) started development of a survey system for hydrothermal deposits. We proposed Vertical Cable Seismic (VCS), a reflection seismic survey using vertical cables above the seabottom. VCS has the following advantages for hydrothermal deposit surveys. (1) VCS is an efficient high-resolution 3D seismic survey in a limited area. (2) It achieves a high-resolution image because the sensors are located close to the target. (3) It avoids the coupling problems between sensor and seabottom that seriously degrade seismic data quality. (4) Because of the autonomous recording system on the sea floor, various types of marine sources are applicable with VCS, such as sea-surface sources (GI gun, etc.) and deep-towed or ocean-bottom sources. Our first 2D/3D VCS surveys were carried out in Lake Biwa, Japan, in November 2009. The 2D VCS data processing follows the walk-away VSP, including wavefield separation and depth migration. A seismic interferometry technique is also applied. The results give a much clearer image than conventional surface seismic data. Prestack depth migration is applied to the 3D data to obtain a good-quality 3D depth volume. Seismic interferometry is applied to obtain a high-resolution image of the very shallow zone. Based on the feasibility study, we developed the autonomous recording VCS system and carried out a trial experiment in the open ocean at a water depth of about 400 m to establish the deployment/recovery procedures and to examine the vertical-cable position and its fluctuation at the seabottom. The result shows that the cable position is estimated with sufficient accuracy and that very little fluctuation is observed. The Institute of Industrial Science, the University of Tokyo, conducted research cruise NT11-02 on the JAMSTEC R/V Natsushima in February 2011. During cruise NT11-02, JGI carried out the second VCS survey using the autonomous VCS recording system with the deep-towed source provided by the Institute of Industrial Science, the University of Tokyo. It generates high-frequency acoustic waves around 1 kHz. The acquired VCS data clearly show the reflections and are currently being processed to image the subsurface structure.

  15. Big Data solution for CTBT monitoring: CEA-IDC joint global cross correlation project

    NASA Astrophysics Data System (ADS)

    Bobrov, Dmitry; Bell, Randy; Brachet, Nicolas; Gaillard, Pierre; Kitov, Ivan; Rozhkov, Mikhail

    2014-05-01

    Waveform cross-correlation, when applied to historical datasets of seismic records, provides dramatic improvements in detection, location, and magnitude estimation of natural and manmade seismic events. With correlation techniques, the amplitude threshold of signal detection can be reduced globally by a factor of 2 to 3 relative to the currently standard beamforming and STA/LTA detectors. The gain in sensitivity corresponds to a body-wave magnitude reduction of 0.3 to 0.4 units and doubles the number of events meeting high quality requirements (e.g., detected by three or more seismic stations of the International Monitoring System (IMS)). This gain is crucial for seismic monitoring under the Comprehensive Nuclear-Test-Ban Treaty. The International Data Centre (IDC) dataset includes more than 450,000 seismic events, tens of millions of raw detections, and continuous seismic data from the primary IMS stations since 2000. This high-quality dataset is a natural candidate for an extensive cross-correlation study and the basis of further enhancements in monitoring capabilities. Without this historical dataset recorded by the permanent IMS seismic network, any improvements would not be feasible. However, due to the mismatch between the volume of data and the performance of standard information technology infrastructure, it becomes impossible to process all the data within a tolerable elapsed time. To tackle this "Big Data" problem, the CEA/DASE is part of the French project "DataScale". One objective is to reanalyze 10 years of waveform data from the IMS network with the cross-correlation technique, thanks to a dedicated High Performance Computing (HPC) infrastructure operated by the Centre de Calcul Recherche et Technologie (CCRT) at the CEA of Bruyères-le-Châtel. Within 2 years we are planning to enhance detection and phase association algorithms (also using machine learning and automatic classification) and process about 30 terabytes of data provided by the IDC to update the world seismicity map. From the new events and those in the IDC Reviewed Event Bulletin, we will automatically create various sets of master-event templates that will be used for global event location by the CTBTO and CEA.
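
    A minimal sketch of the master-event correlation detector underlying this approach, sliding one template over continuous data and triggering on the normalized correlation coefficient (the threshold and the brute-force loop are illustrative; operational systems use FFT-based correlation over many templates and stations):

        import numpy as np

        def cc_detections(template, stream, fs, threshold=0.7):
            """Times (s) where the normalized correlation with a master-event template
            exceeds `threshold`."""
            nt = len(template)
            t0 = (template - template.mean()) / (np.std(template) * nt)
            picks = []
            for i in range(len(stream) - nt):
                w = stream[i:i + nt]
                cc = np.sum(t0 * (w - w.mean())) / max(np.std(w), 1e-12)
                if cc >= threshold:
                    picks.append(i / fs)
            return picks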

  16. Seismic source parameters of the induced seismicity at The Geysers geothermal area, California, by a generalized inversion approach

    NASA Astrophysics Data System (ADS)

    Picozzi, Matteo; Oth, Adrien; Parolai, Stefano; Bindi, Dino; De Landro, Grazia; Amoroso, Ortensia

    2017-04-01

    The accurate determination of stress drop, seismic efficiency and how source parameters scale with earthquake size is an important issue for seismic hazard assessment of induced seismicity. We propose an improved non-parametric, data-driven strategy suitable for monitoring induced seismicity, which combines the generalized inversion technique with genetic algorithms. In the first step of the analysis the generalized inversion technique allows for an effective correction of waveforms for the attenuation and site contributions. Then, the retrieved source spectra are inverted by a non-linear, sensitivity-driven inversion scheme that allows accurate estimation of source parameters. We investigate the earthquake source characteristics of 633 induced earthquakes (ML 2-4.5) recorded at The Geysers geothermal field (California) by a dense seismic network (i.e., 32 stations of the Lawrence Berkeley National Laboratory Geysers/Calpine surface seismic network, more than 17,000 velocity records). We find for most of the events a non-self-similar behavior, empirical source spectra that require an ω^γ source model with γ > 2 to be well fitted, and small radiation efficiency ηSW. All these findings suggest different dynamic rupture processes for smaller and larger earthquakes, and that the proportion of high-frequency energy radiation and the amount of energy required to overcome the friction or to create new fracture surfaces changes with earthquake size. Furthermore, we observe two distinct families of events with peculiar source parameters that, in one case, suggest the reactivation of deep structures linked to the regional tectonics, while in the other support the idea of an important role of steeply dipping faults in fluid pressure diffusion.
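
    A minimal sketch of the spectral-fitting step described above, fitting a generalized omega-gamma source model to an attenuation- and site-corrected source spectrum (the synthetic spectrum and starting values are illustrative; the study uses a sensitivity-driven genetic-algorithm inversion rather than simple least squares):

        import numpy as np
        from scipy.optimize import curve_fit

        def source_model(f, omega0, fc, gamma):
            """Generalized (omega-gamma) displacement source spectrum."""
            return omega0 / (1.0 + (f / fc) ** gamma)

        np.random.seed(0)
        freqs = np.linspace(0.5, 50.0, 200)                     # Hz
        spec = source_model(freqs, 1e-3, 6.0, 2.4) * (1 + 0.05 * np.random.randn(freqs.size))

        (omega0, fc, gamma), _ = curve_fit(source_model, freqs, spec, p0=(spec[0], 5.0, 2.0))
        # gamma > 2 would indicate departure from the standard omega-squared model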

  17. Integrated reservoir characterization for unconventional reservoirs using seismic, microseismic and well log data

    NASA Astrophysics Data System (ADS)

    Maity, Debotyam

    This study is aimed at an improved understanding of unconventional reservoirs which include tight reservoirs (such as shale oil and gas plays), geothermal developments, etc. We provide a framework for improved fracture zone identification and mapping of the subsurface for a geothermal system by integrating data from different sources. The proposed ideas and methods were tested primarily on data obtained from North Brawley geothermal field and the Geysers geothermal field apart from synthetic datasets which were used to test new algorithms before actual application on the real datasets. The study has resulted in novel or improved algorithms for use at specific stages of data acquisition and analysis including improved phase detection technique for passive seismic (and teleseismic) data as well as optimization of passive seismic surveys for best possible processing results. The proposed workflow makes use of novel integration methods as a means of making best use of the available geophysical data for fracture characterization. The methodology incorporates soft computing tools such as hybrid neural networks (neuro-evolutionary algorithms) as well as geostatistical simulation techniques to improve the property estimates as well as overall characterization efficacy. The basic elements of the proposed characterization workflow involves using seismic and microseismic data to characterize structural and geomechanical features within the subsurface. We use passive seismic data to model geomechanical properties which are combined with other properties evaluated from seismic and well logs to derive both qualitative and quantitative fracture zone identifiers. The study has resulted in a broad framework highlighting a new technique for utilizing geophysical data (seismic and microseismic) for unconventional reservoir characterization. It provides an opportunity to optimally develop the resources in question by incorporating data from different sources and using their temporal and spatial variability as a means to better understand the reservoir behavior. As part of this study, we have developed the following elements which are discussed in the subsequent chapters: 1. An integrated characterization framework for unconventional settings with adaptable workflows for all stages of data processing, interpretation and analysis. 2. A novel autopicking workflow for noisy passive seismic data used for improved accuracy in event picking as well as for improved velocity model building. 3. Improved passive seismic survey design optimization framework for better data collection and improved property estimation. 4. Extensive post-stack seismic attribute studies incorporating robust schemes applicable in complex reservoir settings. 5. Uncertainty quantification and analysis to better quantify property estimates over and above the qualitative interpretations made and to validate observations independently with quantified uncertainties to prevent erroneous interpretations. 6. Property mapping from microseismic data including stress and anisotropic weakness estimates for integrated reservoir characterization and analysis. 7. Integration of results (seismic, microseismic and well logs) from analysis of individual data sets for integrated interpretation using predefined integration framework and soft computing tools.

  18. Magma migration at the onset of the 2012-13 Tolbachik eruption revealed by Seismic Amplitude Ratio Analysis

    NASA Astrophysics Data System (ADS)

    Caudron, Corentin; Taisne, Benoit; Kugaenko, Yulia; Saltykov, Vadim

    2015-12-01

    In contrast to the 1975-76 Tolbachik eruption, the 2012-13 Tolbachik eruption was not preceded by any striking change in seismic activity. By processing the Klyuchevskoy volcano group seismic data with the Seismic Amplitude Ratio Analysis (SARA) method, we gain insights into the dynamics of magma movement prior to this important eruption. A clear seismic migration within the seismic swarm started 20 hours before the reported eruption onset (05:15 UTC, 26 November 2012). This migration proceeded in different phases and ended when eruptive tremor, corresponding to lava flows, was recorded (at 11:00 UTC, 27 November 2012). In order to get a first-order approximation of the magma location, we compare the calculated seismic intensity ratios with theoretical ones. As expected, the observations suggest that the seismicity migrated toward the eruption location. However, we explain the pre-eruptive observed ratios by a vertical migration under the northern slope of Plosky Tolbachik volcano followed by a lateral migration toward the eruptive vents. Another migration is also captured by this technique and coincides with a seismic swarm that started 16-20 km to the south of Plosky Tolbachik at 20:31 UTC on 28 November and lasted for more than 2 days. This seismic swarm is very similar to the seismicity preceding the 1975-76 Tolbachik eruption and can be considered a possible aborted eruption.
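
    A minimal sketch of the comparison step described above, predicting the inter-station amplitude ratio for a trial source position under simple body-wave spreading and attenuation assumptions (the frequency, Q, velocity, and spreading exponent are illustrative values, not those of the study):

        import numpy as np

        def predicted_ratio(src, sta1, sta2, f=5.0, Q=100.0, c=2000.0, n=1.0):
            """Predicted amplitude ratio A1/A2 for a trial source, with r^-n geometric
            spreading and anelastic attenuation exp(-pi*f*r/(Q*c)); coordinates in m."""
            r1 = np.linalg.norm(np.asarray(sta1, float) - np.asarray(src, float))
            r2 = np.linalg.norm(np.asarray(sta2, float) - np.asarray(src, float))
            b = np.pi * f / (Q * c)
            return (r2 / r1) ** n * np.exp(-b * (r1 - r2))

        # Observed band-passed RMS amplitude ratios are compared with these predictions over a
        # grid of trial positions; the best-fitting position tracks the migrating seismicity.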

  19. Seismic imaging: From classical to adjoint tomography

    NASA Astrophysics Data System (ADS)

    Liu, Q.; Gu, Y. J.

    2012-09-01

    Seismic tomography has been a vital tool in probing the Earth's internal structure and enhancing our knowledge of dynamical processes in the Earth's crust and mantle. While various tomographic techniques differ in data types utilized (e.g., body vs. surface waves), data sensitivity (ray vs. finite-frequency approximations), and choices of model parameterization and regularization, most global mantle tomographic models agree well at long wavelengths, owing to the presence and typical dimensions of cold subducted oceanic lithospheres and hot, ascending mantle plumes (e.g., in central Pacific and Africa). Structures at relatively small length scales remain controversial, though, as will be discussed in this paper, they are becoming increasingly resolvable with the fast expanding global and regional seismic networks and improved forward modeling and inversion techniques. This review paper aims to provide an overview of classical tomography methods, key debates pertaining to the resolution of mantle tomographic models, as well as to highlight recent theoretical and computational advances in forward-modeling methods that spearheaded the developments in accurate computation of sensitivity kernels and adjoint tomography. The first part of the paper is devoted to traditional traveltime and waveform tomography. While these approaches established a firm foundation for global and regional seismic tomography, data coverage and the use of approximate sensitivity kernels remained as key limiting factors in the resolution of the targeted structures. In comparison to classical tomography, adjoint tomography takes advantage of full 3D numerical simulations in forward modeling and, in many ways, revolutionizes the seismic imaging of heterogeneous structures with strong velocity contrasts. For this reason, this review provides details of the implementation, resolution and potential challenges of adjoint tomography. Further discussions of techniques that are presently popular in seismic array analysis, such as noise correlation functions, receiver functions, inverse scattering imaging, and the adaptation of adjoint tomography to these different datasets highlight the promising future of seismic tomography.

  20. Accurate estimation of seismic source parameters of induced seismicity by a combined approach of generalized inversion and genetic algorithm: Application to The Geysers geothermal area, California

    NASA Astrophysics Data System (ADS)

    Picozzi, M.; Oth, A.; Parolai, S.; Bindi, D.; De Landro, G.; Amoroso, O.

    2017-05-01

    The accurate determination of stress drop, seismic efficiency, and how source parameters scale with earthquake size is an important issue for seismic hazard assessment of induced seismicity. We propose an improved nonparametric, data-driven strategy suitable for monitoring induced seismicity, which combines the generalized inversion technique with genetic algorithms. In the first step of the analysis, the generalized inversion technique allows for an effective correction of waveforms for attenuation and site contributions. Then, the retrieved source spectra are inverted by a nonlinear sensitivity-driven inversion scheme that allows accurate estimation of source parameters. We investigate the earthquake source characteristics of 633 induced earthquakes (Mw 2-3.8) recorded at The Geysers geothermal field (California) by a dense seismic network (32 stations, more than 17,000 velocity records). We find non-self-similar behavior, empirical source spectra that require an ω^γ source model with γ > 2 to be well fit, and small radiation efficiency ηSW. All these findings suggest different dynamic rupture processes for smaller and larger earthquakes, and that the proportion of high-frequency energy radiation and the amount of energy required to overcome friction or to create new fracture surfaces change with earthquake size. Furthermore, we also observe two distinct families of events with peculiar source parameters, which in one case suggest the reactivation of deep structures linked to the regional tectonics, and in the other support the idea of an important role of steeply dipping faults in fluid pressure diffusion.

  1. Subspace Dimensionality: A Tool for Automated QC in Seismic Array Processing

    NASA Astrophysics Data System (ADS)

    Rowe, C. A.; Stead, R. J.; Begnaud, M. L.

    2013-12-01

    Because of the great resolving power of seismic arrays, the application of automated processing to array data is critically important in treaty verification work. A significant problem in array analysis is the inclusion of bad sensor channels in the beamforming process. We are testing an approach to automated, on-the-fly quality control (QC) to aid in the identification of poorly performing sensor channels prior to beamforming in routine event detection or location processing. The idea stems from methods used for large computer servers, where monitoring traffic at enormous numbers of nodes is impractical on a node-by-node basis, so the dimensionality of the node traffic is instead monitored for anomalies that could represent malware, cyber-attacks or other problems. The technique relies upon the use of subspace dimensionality or principal components of the overall system traffic. The subspace technique is not new to seismology, but its most common application has been limited to comparing waveforms to an a priori collection of templates for detecting highly similar events in a swarm or seismic cluster. In the established template application, a detector functions in a manner analogous to waveform cross-correlation, applying a statistical test to assess the similarity of the incoming data stream to known templates for events of interest. In our approach, we seek not to detect matching signals; instead, we examine the signal subspace dimensionality in much the same way that the method addresses node traffic anomalies in large computer systems. Signal anomalies recorded on seismic arrays affect the dimensional structure of the array-wide time series. We have shown previously that this observation is useful in identifying real seismic events, either by looking at the raw signal or derivatives thereof (entropy, kurtosis). Here we explore the effects of malfunctioning channels on the dimension of the data and its derivatives, and how to leverage this effect, through a jackknifing process that isolates the anomalous channels, so that an automated analysis system can discard them prior to FK analysis and beamforming on events of interest.
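
    A minimal sketch of the underlying idea, under our own simplifications (synthetic data, a plain SVD rather than a streaming estimator, and an arbitrary 95% variance criterion): the effective dimensionality of the channels-by-samples matrix is estimated from its principal components, and a leave-one-out (jackknife) pass shows which channel's removal changes that dimensionality, flagging it as anomalous.

    ```python
    import numpy as np

    def effective_dimension(X, var_frac=0.95):
        """Number of principal components needed to explain var_frac of the
        variance of the channels-by-samples matrix X."""
        X = X - X.mean(axis=1, keepdims=True)
        s = np.linalg.svd(X, compute_uv=False)
        energy = np.cumsum(s**2) / np.sum(s**2)
        return int(np.searchsorted(energy, var_frac) + 1)

    def jackknife_channels(X):
        """Dimension of the array data with each channel left out in turn.
        A channel whose removal changes the dimension markedly is suspect."""
        dims = []
        for k in range(X.shape[0]):
            keep = [i for i in range(X.shape[0]) if i != k]
            dims.append(effective_dimension(X[keep]))
        return np.array(dims)

    # Synthetic example: 9 coherent channels plus one malfunctioning channel
    rng = np.random.default_rng(0)
    t = np.linspace(0, 10, 2000)
    signal = np.sin(2 * np.pi * 1.5 * t)
    X = np.vstack([signal + 0.1 * rng.standard_normal(t.size) for _ in range(9)])
    X = np.vstack([X, 5.0 * rng.standard_normal(t.size)])   # bad channel

    print("full-array dimension:", effective_dimension(X))
    print("leave-one-out dimensions:", jackknife_channels(X))
    ```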

  2. Performance of 3-Component Nodes in the IRIS Community Wavefield Demonstration Experiment

    NASA Astrophysics Data System (ADS)

    Sweet, J. R.; Anderson, K. R.; Woodward, R.

    2017-12-01

    In June 2016, a field crew of 50 students, faculty, industry personnel, and IRIS staff deployed a total of 390 stations as part of a community seismic experiment above an active seismic lineament in north-central Oklahoma. The goals of the experiment were to test new instrumentation and deployment strategies that record the full seismic wavefield, and to advance understanding of earthquake source processes and regional lithospheric structure. The crew deployed 363 3-component, 5 Hz Generation 2 Fairfield Z-Land nodes along three seismic lines and in a seven-layer nested gradiometer array. The seismic lines spanned a region 13 km long by 5 km wide. A broadband, 18-station "Golay 3x6" array with an aperture of approximately 5 km was deployed around the gradiometer and seismic lines to collect waveform data from local and regional events. In addition, 9 infrasound stations were deployed in order to capture and identify acoustic events that might be recorded by the seismic array. The variety and geometry of instrumentation deployed were intended to capture the full seismic wavefield generated by the local and regional seismicity beneath the array and the surrounding region. Additional details on the instrumentation and how it was deployed can be found by visiting our website www.iris.edu/wavefields. We present a detailed analysis of noise across the array, including station performance as well as noise from nearby sources (wind turbines, automobiles, etc.). We report a clear reduction in noise for buried 3-component nodes compared to co-located surface nodes. Using the IRIS DMC's ISPAQ client, we present a variety of metrics to evaluate the network's performance. We also present highlights from student projects at the recently held IRIS advanced data processing short course, which focused on analyzing the wavefield dataset using array processing techniques.

  3. Exploring the relative contribution of mineralogy and CPO to the seismic velocity anisotropy of evaporites

    NASA Astrophysics Data System (ADS)

    Vargas-Meleza, Liliana; Healy, David; Alsop, G. Ian; Timms, Nicholas E.

    2015-01-01

    We investigate the influence of mineralogy and microstructure on the seismic velocity anisotropy of evaporites. Bulk elastic properties and seismic velocities are calculated for a suite of 20 natural evaporite samples, which consist mainly of halite, anhydrite, and gypsum. They exhibit strong fabrics as a result of tectonic and diagenetic processes. Sample mineralogy and crystallographic preferred orientation (CPO) were obtained with the electron backscatter diffraction (EBSD) technique, and the data were used for seismic velocity calculations. Bulk seismic properties for polymineralic evaporites were evaluated with a rock recipe approach. Ultrasonic velocity measurements were also taken on cube-shaped samples to assess the contribution of grain-scale shape preferred orientation (SPO) to the total seismic anisotropy. The sample results suggest that CPO is responsible for a significant fraction of the bulk seismic properties, in agreement with observations from previous studies. Results from the rock recipe indicate that increasing the modal proportion of anhydrite grains can lead to a greater seismic anisotropy of a halite-dominated rock. Conversely, it can reduce the seismic anisotropy of a gypsum-dominated rock up to an estimated threshold proportion, after which anisotropy increases again. The difference between the predicted anisotropy due to CPO and the anisotropy measured with ultrasonic velocities is attributed to the SPO and grain boundary effects in these evaporites.
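
    For readers unfamiliar with mixing mineral moduli, the sketch below shows one common way to estimate isotropic bulk properties of a polymineralic aggregate from modal proportions, a Voigt-Reuss-Hill average. The single-crystal values are rounded literature numbers used for illustration only, and the rock-recipe approach used in the study operates on full CPO-weighted elastic tensors rather than these isotropic moduli.

    ```python
    import numpy as np

    # Illustrative isotropic moduli (GPa) and densities (kg/m^3); rounded values
    minerals = {
        #             K      G     rho
        "halite":    (24.0, 15.0, 2160.0),
        "anhydrite": (54.0, 29.0, 2960.0),
        "gypsum":    (42.0, 15.0, 2310.0),
    }

    def vrh_velocities(fractions):
        """Voigt-Reuss-Hill average of bulk (K) and shear (G) moduli for the
        given volume fractions, plus the resulting isotropic Vp and Vs."""
        f = np.array([fractions[m] for m in minerals])
        K = np.array([minerals[m][0] for m in minerals]) * 1e9
        G = np.array([minerals[m][1] for m in minerals]) * 1e9
        rho = np.sum(f * np.array([minerals[m][2] for m in minerals]))
        K_vrh = 0.5 * (np.sum(f * K) + 1.0 / np.sum(f / K))
        G_vrh = 0.5 * (np.sum(f * G) + 1.0 / np.sum(f / G))
        vp = np.sqrt((K_vrh + 4.0 * G_vrh / 3.0) / rho)
        vs = np.sqrt(G_vrh / rho)
        return vp, vs

    vp, vs = vrh_velocities({"halite": 0.7, "anhydrite": 0.2, "gypsum": 0.1})
    print(f"Vp = {vp:.0f} m/s, Vs = {vs:.0f} m/s")
    ```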

  4. Seismic migration for SAR focusing: Interferometrical applications

    NASA Astrophysics Data System (ADS)

    Prati, C.; Montiguarnieri, A.; Damonti, E.; Rocca, F.

    SAR (Synthetic Aperture Radar) data focusing is analyzed from a theoretical point of view. Two applications of a SAR data processing algorithm are presented, where the phases of the returns are used for the recovery of interesting parameters of the observed scenes. Migration techniques, similar to those used in seismic signal processing for oil prospecting, were implemented for the determination of the terrain altitude map from a satellite and the evaluation of the sensor attitude for an airplane. Satisfactory precision was achieved: the interferometric system was shown to detect variations in the airplane roll angle of a small fraction of a degree.

  5. Navigating Earthquake Physics with High-Resolution Array Back-Projection

    NASA Astrophysics Data System (ADS)

    Meng, Lingsen

    Understanding earthquake source dynamics is a fundamental goal of geophysics. Progress toward this goal has been slow due to the gap between state-of-art earthquake simulations and the limited source imaging techniques based on conventional low-frequency finite fault inversions. Seismic array processing is an alternative source imaging technique that employs the higher frequency content of the earthquakes and provides finer detail of the source process with few prior assumptions. While back-projection provides key observations of previous large earthquakes, the standard beamforming back-projection suffers from low resolution and severe artifacts. This thesis introduces the MUSIC technique, a high-resolution array processing method that aims to narrow the gap between the seismic observations and earthquake simulations. MUSIC is a high-resolution method that takes advantage of higher-order signal statistics. The method has not been widely used in seismology yet because of the nonstationary and incoherent nature of the seismic signal. We adapt MUSIC to transient seismic signals by incorporating multitaper cross-spectrum estimates. We also adopt a "reference window" strategy that mitigates the "swimming artifact," a systematic drift effect in back projection. The improved MUSIC back projections allow the imaging of recent large earthquakes in finer detail, which gives rise to new perspectives on dynamic simulations. In the 2011 Tohoku-Oki earthquake, we observe frequency-dependent rupture behaviors which relate to the material variation along the dip of the subduction interface. In the 2012 off-Sumatra earthquake, we image the complicated ruptures involving an orthogonal fault system and an unusual branching direction. This result, along with our complementary dynamic simulations, probes the pressure-insensitive strength of the deep oceanic lithosphere. In another example, back projection is applied to the 2010 M7 Haiti earthquake recorded at regional distance. The high-frequency subevents are located at the edges of geodetic slip regions, which correlate with the stopping phases associated with rupture-speed reduction as the earthquake arrests.
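
    The following is a minimal narrow-band plane-wave MUSIC sketch for a small array, under our own simplifications (single frequency bin, plain FFT snapshots instead of the multitaper cross-spectra and reference-window correction used in the thesis). Array geometry, frequency and slowness grid are hypothetical.

    ```python
    import numpy as np

    def music_map(data, coords, dt, freq, n_sources, sx, sy):
        """Narrow-band MUSIC pseudospectrum over a grid of horizontal slownesses.

        data   : (n_channels, n_samples) windowed seismograms
        coords : (n_channels, 2) receiver coordinates in metres
        freq   : analysis frequency in Hz
        sx, sy : 1-D arrays of trial slownesses (s/m)
        """
        n_ch, n = data.shape
        nseg = 8                                  # snapshots for the covariance
        seg = n // nseg
        fbin = np.argmin(np.abs(np.fft.rfftfreq(seg, dt) - freq))
        snaps = np.stack([np.fft.rfft(data[:, i*seg:(i+1)*seg], axis=1)[:, fbin]
                          for i in range(nseg)], axis=1)      # (n_ch, nseg)
        R = snaps @ snaps.conj().T / nseg                     # cross-spectral matrix
        w, v = np.linalg.eigh(R)                              # ascending eigenvalues
        En = v[:, :n_ch - n_sources]                          # noise subspace
        P = np.zeros((sy.size, sx.size))
        for iy, syy in enumerate(sy):
            for ix, sxx in enumerate(sx):
                delay = coords[:, 0] * sxx + coords[:, 1] * syy
                a = np.exp(-2j * np.pi * freq * delay) / np.sqrt(n_ch)  # steering vector
                P[iy, ix] = 1.0 / np.real(a.conj() @ En @ En.conj().T @ a)
        return P

    # Synthetic test: one plane wave crossing a 5-station array (hypothetical geometry)
    rng = np.random.default_rng(1)
    coords = rng.uniform(-1000, 1000, size=(5, 2))
    dt, n, f0 = 0.05, 1024, 1.25
    s_true = np.array([2.0e-4, -1.0e-4])                      # s/m
    t = np.arange(n) * dt
    data = np.array([np.cos(2 * np.pi * f0 * (t - c @ s_true)) for c in coords])
    data += 0.05 * rng.standard_normal(data.shape)

    s_axis = np.linspace(-3e-4, 3e-4, 61)
    P = music_map(data, coords, dt, f0, n_sources=1, sx=s_axis, sy=s_axis)
    iy, ix = np.unravel_index(np.argmax(P), P.shape)
    print(f"recovered slowness: sx={s_axis[ix]:.1e} s/m, sy={s_axis[iy]:.1e} s/m")
    ```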

  6. Characteristic Analysis of Air-gun Source Wavelet based on the Vertical Cable Data

    NASA Astrophysics Data System (ADS)

    Xing, L.

    2016-12-01

    Air guns are important sources for marine seismic exploration. Far-field wavelets of air gun arrays, a necessary input for pre-stack processing and source models, play an important role during marine seismic data processing and interpretation. When an air gun fires, it generates a series of air bubbles. As in onshore seismic exploration, the water behaves as a plastic fluid near the bubble; the farther the measurement point is from the air gun, the more stable and more accurately represented the wavelet will be. In practice, hydrophones should be placed more than 100 m from the air gun; however, traditional seismic cables cannot meet this requirement. On the other hand, vertical cables provide a viable solution to this problem. This study uses a vertical cable to record the wavelets of 38 air guns, with data collected offshore Southeast Qiong, where the water depth is over 1000 m. In this study, the wavelets measured using this technique coincide very well with the simulated wavelets and can therefore represent the real shape of the wavelets. This experiment fills a technology gap in China.

  7. MSNoise: a Python Package for Monitoring Seismic Velocity Changes using Ambient Seismic Noise

    NASA Astrophysics Data System (ADS)

    Lecocq, T.; Caudron, C.; Brenguier, F.

    2013-12-01

    Earthquakes occur every day all around the world and are recorded by thousands of seismic stations. In between earthquakes, stations are recording "noise". In the last 10 years, the understanding of this noise and its potential usage have been increasing rapidly. The method, called "seismic interferometry", uses the principle that seismic waves travel between two recorders and are multiple-scattered in the medium. By cross-correlating the two records, one obtains information about the medium below/between the stations. The cross-correlation function (CCF) is a proxy for the Green's function of the medium. Recent developments of the technique have shown that these CCFs can be used to image the earth at depth (3D seismic tomography) or to study changes of the medium with time. We present MSNoise, a complete software suite to compute relative seismic velocity changes under a seismic network, using ambient seismic noise. The whole suite is written in Python, from the monitoring of data archives to the production of high-quality figures. All steps have been optimized to compute only what is necessary and to use 'job'-based processing. We present a validation of the software on a dataset acquired during the UnderVolc[1] project on the Piton de la Fournaise Volcano, La Réunion Island, France, for which precursory relative changes of seismic velocity are visible for three eruptions between 2009 and 2011.
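
    The sketch below is not MSNoise's API; it is a generic numpy illustration of the core step the package automates: one-bit normalisation, spectral whitening and cross-correlation of two noise records to approximate the inter-station Green's function (the dv/v measurement itself is omitted). The synthetic traces and parameters are placeholders.

    ```python
    import numpy as np

    def noise_ccf(tr_a, tr_b, dt, max_lag):
        """Cross-correlation function between two noise records, after one-bit
        normalisation and spectral whitening, for lags within +/- max_lag seconds."""
        n = len(tr_a)
        a = np.sign(tr_a - np.mean(tr_a))
        b = np.sign(tr_b - np.mean(tr_b))
        A, B = np.fft.rfft(a, 2 * n), np.fft.rfft(b, 2 * n)   # zero-padded FFTs
        A /= np.abs(A) + 1e-12                                # crude whitening
        B /= np.abs(B) + 1e-12
        cc = np.fft.irfft(A * np.conj(B), 2 * n)
        cc = np.concatenate((cc[-n:], cc[:n]))                # reorder to lags -n..n-1
        lags = (np.arange(2 * n) - n) * dt
        sel = np.abs(lags) <= max_lag
        return lags[sel], cc[sel]

    # Hypothetical example: two stations recording the same scattered wavefield
    rng = np.random.default_rng(2)
    dt, n = 0.01, 360000                     # one "hour" of 100 Hz data, illustrative
    common = rng.standard_normal(n)
    tr_a = common + 0.5 * rng.standard_normal(n)
    tr_b = np.roll(common, 300) + 0.5 * rng.standard_normal(n)   # B lags A by 3 s
    lags, cc = noise_ccf(tr_a, tr_b, dt, max_lag=10.0)
    # the peak lag reflects the 3 s travel time; its sign depends on the convention
    print("peak at lag %.2f s" % lags[np.argmax(np.abs(cc))])
    ```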

  8. CMP reflection imaging via interferometry of distributed subsurface sources

    NASA Astrophysics Data System (ADS)

    Kim, D.; Brown, L. D.; Quiros, D. A.

    2015-12-01

    The theoretical foundations of recovering body wave energy via seismic interferometry are well established. However, in practice such recovery remains problematic. Here, synthetic seismograms computed for subsurface sources are used to evaluate the geometrical combinations of realistic ambient source and receiver distributions that result in useful recovery of virtual body waves. This study illustrates how surface receiver arrays that span a limited distribution of sources can be processed to produce virtual shot gathers that result in CMP gathers, which can be effectively stacked with traditional normal moveout corrections. To verify the feasibility of the approach in practice, seismic recordings of 50 aftershocks of the magnitude 5.8 Virginia earthquake of August 2011 were processed using seismic interferometry to produce seismic reflection images of the crustal structure above and beneath the aftershock cluster. Although monotonic noise proved to be problematic by significantly reducing the number of usable recordings, the edited dataset resulted in stacked seismic sections characterized by coherent reflections that resemble those seen on a nearby conventional reflection survey. In particular, "virtual" reflections at travel times of 3 to 4 seconds suggest reflectors at approximately 7 to 12 km depth that would seem to correspond to imbricate thrust structures formed during the Appalachian orogeny. The approach described here represents a promising new means of body wave imaging of 3D structure that can be applied to a wide array of geologic and energy problems. Unlike other imaging techniques using natural sources, this technique does not require precise source locations or times. It can thus exploit aftershocks too small for conventional analyses. This method can be applied to any type of microseismic cloud, whether tectonic, volcanic or man-made.

  9. Spatial wavefield gradient-based seismic wavefield separation

    NASA Astrophysics Data System (ADS)

    Van Renterghem, C.; Schmelzbach, C.; Sollberger, D.; Robertsson, J. OA

    2018-03-01

    Measurements of the horizontal and vertical components of particle motion combined with estimates of the spatial gradients of the seismic wavefield enable seismic data to be acquired and processed using single dedicated multicomponent stations (e.g. rotational sensors) and/or small receiver groups instead of large receiver arrays. Here, we present seismic wavefield decomposition techniques that use spatial wavefield gradient data to separate land and ocean bottom data into their upgoing/downgoing and P/S constituents. Our method is based on the elastodynamic representation theorem with the derived filters requiring local measurements of the wavefield and its spatial gradients only. We demonstrate with synthetic data and a land seismic field data example that combining translational measurements with spatial wavefield gradient estimates allows separating seismic data recorded either at the Earth's free-surface or at the sea bottom into upgoing/downgoing and P/S wavefield constituents for typical incidence angle ranges of body waves. A key finding is that the filter application only requires knowledge of the elastic properties exactly at the recording locations and is valid for a wide elastic property range.

  10. New Geophysical Techniques for Offshore Exploration.

    ERIC Educational Resources Information Center

    Talwani, Manik

    1983-01-01

    New seismic techniques have been developed recently that borrow theory from academic institutions and technology from industry, allowing scientists to explore deeper into the earth with much greater precision than possible with older seismic methods. Several of these methods are discussed, including the seismic reflection common-depth-point…

  11. Learnings from the Monitoring of Induced Seismicity in Western Canada over the Past Three Years

    NASA Astrophysics Data System (ADS)

    Yenier, E.; Moores, A. O.; Baturan, D.; Spriggs, N.

    2017-12-01

    In response to induced seismicity observed in western Canada, existing public networks have been densified and a number of private networks have been deployed to closely monitor the earthquakes induced by hydraulic fracturing operations in the region. These networks have produced an unprecedented volume of seismic data, which can be used to map pre-existing geological structures and understand their activation mechanisms. Here, we present insights gained over the past three years from induced seismicity monitoring (ISM) for some of the most active operators in Canada. First, we discuss the benefits of high-quality ISM data sets for making operational decisions and how their value largely depends on choice of instrumentation, seismic network design and data processing techniques. Using examples from recent research studies, we illustrate the key role of robust modeling of regional source, attenuation and site attributes on the accuracy of event magnitudes, ground motion estimates and induced seismicity hazard assessment. Finally, acknowledging that the ultimate goal of ISM networks is assisting operators to manage induced seismic risk, we share some examples of how ISM data products can be integrated into existing protocols for developing effective risk management strategies.

  12. Aleutian Array of Arrays (A-cubed) to probe a broad spectrum of fault slip under the Aleutian Islands

    NASA Astrophysics Data System (ADS)

    Ghosh, A.; LI, B.

    2016-12-01

    The Alaska-Aleutian subduction zone is one of the most seismically active subduction zones on the planet. It is characterized by remarkable along-strike variations in seismic behavior and more than 50 active volcanoes, and it presents a unique opportunity to serve as a natural laboratory for studying subduction zone processes, including fault dynamics. Yet details of the seismicity pattern, the spatiotemporal distribution of slow earthquakes, the nature of interaction between slow and fast earthquakes and their implication for tectonic behavior remain unknown. We use a hybrid seismic network approach and install 3 mini seismic arrays and 5 stand-alone stations to simultaneously image the subduction fault and a nearby volcanic system (Makushin). The arrays and stations are strategically located on Unalaska Island, where prolific tremor activity was detected and located by a solo pilot array in summer 2012. The hybrid network was operational in continuous mode between summer 2015 and summer 2016. One of the three arrays started in summer 2014 and provides additional data covering a longer time span. The pilot array on Akutan Island recorded continuous seismic data for 2 months. An automatic beam-backprojection analysis detects almost daily tremor activity, with an average of more than an hour per day. We imaged two active sources separated by a tremor gap. The western source, right under Unalaska Island, shows the most prolific activity, with a hint of steady migration. In addition, we are able to identify more than 10 families of low frequency earthquakes (LFEs) in this area. They are located within the tremor source area as imaged by the beam-backprojection technique. Application of a matched filter technique reveals that intervals between LFE activity are shorter during tremor episodes and longer during quiet periods. We expect to present new results from freshly obtained data. The experiment A-cubed is illuminating subduction zone processes under Unalaska Island in unprecedented detail.

  13. Uniquely Acquired Vintage Seismic Reflection Data Reveal the Stratigraphic and Tectonic History of the Montana Disturbed Belt, USA

    NASA Astrophysics Data System (ADS)

    Speece, M. A.; Link, C. A.; Stickney, M.

    2011-12-01

    In 1983 and 1984 Techco of Denver, Colorado, acquired approximately 302 linear kilometers of two-dimensional (2D) seismic reflection data in Flathead and Lake Counties, Montana, USA, as part of an initiative to identify potential drilling targets beneath the Swan and Whitefish Mountain Ranges and adjacent basins of northwestern Montana. These seismic lines were collected in the Montana Disturbed Belt (MDB) or Montana thrust belt along the western edge of Glacier National Park in mountainous terrain with complicated subsurface structures including thrust faults and folds. These structures formed during the Laramide Orogeny as sedimentary rocks of the Precambrian Belt Supergroup were thrust eastward. Later, during the Cenozoic, high-angle normal faults produced prominent west-facing mountain scarps of the Mission, Swan and Whitefish mountains. The 1983 data set consisted of two profiles of 24-fold (96 channels) Vibroseis data and four profiles of 24-fold (96 channels) helicopter-assisted dynamite data. The dynamite data were collected using the Poulter Method, in which explosives were placed on poles and air shots were recorded. The 1983 dynamite profiles extend from southwest to northeast across the Whitefish Mountain Range to the edge of Glacier National Park, and the Vibroseis data were collected along nearby roadways. The 1984 data set consists of four profiles of 30-fold (120 channels) helicopter-assisted dynamite data that were also collected using the Poulter Method. The 1984 profiles cross the Swan Mountain Range between Flathead Lake and Glacier National Park. All of these data sets were recently donated to Montana Tech and subsequently recovered from nine-track tape. Seismic stacked sections of these data, conventionally processed in the 1980s, show evidence of a basement decollement that separates relatively undeformed basement from overlying structures of the MDB. Unfortunately, these data sets have not been processed using modern seismic processing techniques including linear noise suppression of the air wave and ground roll, refraction statics, and prestack migration. Reprocessing of these data using state-of-the-art seismic reflection processing techniques will provide a detailed picture of the stratigraphy and tectonic framework for this region. Moreover, extended correlations of the Vibroseis records to Moho depths might reveal new insights on crustal thickness and provide a framework for understanding crustal thickening during the Laramide Orogeny as well as later Cenozoic extension.

  14. Unsupervised seismic facies analysis with spatial constraints using regularized fuzzy c-means

    NASA Astrophysics Data System (ADS)

    Song, Chengyun; Liu, Zhining; Cai, Hanpeng; Wang, Yaojun; Li, Xingming; Hu, Guangmin

    2017-12-01

    Seismic facies analysis techniques combine classification algorithms and seismic attributes to generate a map that describes the main reservoir heterogeneities. However, most current classification algorithms view the seismic attributes as isolated data regardless of their spatial locations, and the resulting map is generally sensitive to noise. In this paper, a regularized fuzzy c-means (RegFCM) algorithm is used for unsupervised seismic facies analysis. Due to the regularized term of the RegFCM algorithm, data whose adjacent locations belong to the same class play a more important role in the iterative process than other data. Therefore, this method can reduce the effect of seismic data noise present in discontinuous regions. Synthetic data with different signal-to-noise ratios are used to demonstrate the noise tolerance of the RegFCM algorithm. Meanwhile, the fuzzy factor, the neighbour window size and the regularization weight are tested over a range of values to provide guidance on how to set these parameters. The new approach is also applied to a real seismic data set from the F3 block of the Netherlands. The results show improved spatial continuity, with clear facies boundaries and channel morphology, which reveals that the method is an effective seismic facies analysis tool.
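
    As a point of reference, the sketch below implements the plain fuzzy c-means iteration on an attribute matrix; the spatial regularization term that distinguishes RegFCM (a neighbourhood penalty added to the membership update) is deliberately left out, and the toy attributes and parameters are hypothetical.

    ```python
    import numpy as np

    def fuzzy_c_means(X, n_clusters, m=2.0, n_iter=100, tol=1e-5, seed=0):
        """Plain fuzzy c-means on an (n_samples, n_features) attribute matrix X.
        Returns the membership matrix U (n_samples, n_clusters) and the centres."""
        rng = np.random.default_rng(seed)
        U = rng.random((X.shape[0], n_clusters))
        U /= U.sum(axis=1, keepdims=True)
        for _ in range(n_iter):
            Um = U ** m
            centres = (Um.T @ X) / Um.sum(axis=0)[:, None]
            # squared distances between every sample and every centre
            d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2) + 1e-12
            w = d2 ** (1.0 / (m - 1))
            U_new = (1.0 / w) / (1.0 / w).sum(axis=1, keepdims=True)
            if np.max(np.abs(U_new - U)) < tol:
                U = U_new
                break
            U = U_new
        return U, centres

    # Toy example: two seismic attributes, three facies-like clusters
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(loc, 0.3, size=(200, 2))
                   for loc in ([0, 0], [2, 2], [0, 3])])
    U, centres = fuzzy_c_means(X, n_clusters=3)
    labels = np.argmax(U, axis=1)
    print("cluster centres:\n", np.round(centres, 2))
    ```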

  15. The Utility of the Extended Images in Ambient Seismic Wavefield Migration

    NASA Astrophysics Data System (ADS)

    Girard, A. J.; Shragge, J. C.

    2015-12-01

    Active-source 3D seismic migration and migration velocity analysis (MVA) are robust and highly used methods for imaging Earth structure. One class of migration methods uses extended images constructed by incorporating spatial and/or temporal wavefield correlation lags to the imaging conditions. These extended images allow users to directly assess whether images focus better with different parameters, which leads to MVA techniques that are based on the tenets of adjoint-state theory. Under certain conditions (e.g., geographical, cultural or financial), however, active-source methods can prove impractical. Utilizing ambient seismic energy that naturally propagates through the Earth is an alternate method currently used in the scientific community. Thus, an open question is whether extended images are similarly useful for ambient seismic migration processing and verifying subsurface velocity models, and whether one can similarly apply adjoint-state methods to perform ambient migration velocity analysis (AMVA). Herein, we conduct a number of numerical experiments that construct extended images from ambient seismic recordings. We demonstrate that, similar to active-source methods, there is a sensitivity to velocity in ambient seismic recordings in the migrated extended image domain. In synthetic ambient imaging tests with varying degrees of error introduced to the velocity model, the extended images are sensitive to velocity model errors. To determine the extent of this sensitivity, we utilize acoustic wave-equation propagation and cross-correlation-based migration methods to image weak body-wave signals present in the recordings. Importantly, we have also observed scenarios where non-zero correlation lags show signal while zero-lags show none. This may be a valuable missing piece for ambient migration techniques that have yielded largely inconclusive results, and might be an important piece of information for performing AMVA from ambient seismic recordings.

  16. Systematic detection of seismic events at Mount St. Helens with an ultra-dense array

    NASA Astrophysics Data System (ADS)

    Meng, X.; Hartog, J. R.; Schmandt, B.; Hotovec-Ellis, A. J.; Hansen, S. M.; Vidale, J. E.; Vanderplas, J.

    2016-12-01

    During the summer of 2014, an ultra-dense array of 900 geophones was deployed around the crater of Mount St. Helens and continuously operated for 15 days. This dataset provides us with an unprecedented opportunity to systematically detect seismic events around an active volcano and study their underlying mechanisms. We use a waveform-based matched filter technique to detect seismic events from this dataset. Due to the large volume of continuous data (approximately 1 TB), we performed the detection on the GPU cluster Stampede (https://www.tacc.utexas.edu/systems/stampede). We build a suite of template events from three catalogs: 1) the standard Pacific Northwest Seismic Network (PNSN) catalog (45 events); 2) the catalog from Hansen & Schmandt (2015) obtained with a reverse-time imaging method (212 events); and 3) the catalog identified with a matched filter technique using the PNSN permanent stations (190 events). By searching for template matches in the ultra-dense array, we find 2237 events. We then calibrate precise relative magnitudes for template and detected events, using a principal component fit to measure waveform amplitude ratios. The magnitude of completeness and b-value of the detected catalog are -0.5 and 1.1, respectively. Our detected catalog shows several intense swarms, which are likely driven by fluid pressure transients in conduits or slip transients on faults underneath the volcano. We are currently relocating the detected catalog with HypoDD and measuring seismic velocity changes at Mount St. Helens using the coda wave interferometry of detected repeating earthquakes. The accurate spatiotemporal migration pattern of seismicity and the seismic property changes should shed light on the physical processes beneath Mount St. Helens.
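
    A minimal single-channel version of the matched-filter step (the study itself stacks correlations over many channels and uses network-based thresholds): slide a template along continuous data, compute the normalised cross-correlation, and keep peaks above a threshold. The template, trace and threshold below are synthetic placeholders.

    ```python
    import numpy as np

    def normalized_xcorr(template, data):
        """Normalised cross-correlation of a short template against a long trace.
        Returns one correlation coefficient per candidate start sample."""
        nt, nd = len(template), len(data)
        t = (template - template.mean()) / (template.std() * nt)
        cc = np.empty(nd - nt + 1)
        for i in range(nd - nt + 1):
            win = data[i:i + nt]
            s = win.std()
            cc[i] = 0.0 if s == 0 else np.sum(t * (win - win.mean())) / s
        return cc

    def detect(cc, threshold=0.7, min_sep=100):
        """Indices of correlation maxima above the threshold,
        separated by at least min_sep samples."""
        picks = []
        for i in np.argsort(cc)[::-1]:
            if cc[i] < threshold:
                break
            if all(abs(i - p) >= min_sep for p in picks):
                picks.append(i)
        return sorted(picks)

    # Toy continuous trace containing two noisy copies of the template
    rng = np.random.default_rng(3)
    template = np.sin(2 * np.pi * 5 * np.arange(0, 1, 0.01)) * np.hanning(100)
    data = 0.2 * rng.standard_normal(5000)
    for onset in (1200, 3400):
        data[onset:onset + 100] += template
    cc = normalized_xcorr(template, data)
    print("detections at samples:", detect(cc))
    ```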

  17. Weighted stacking of seismic AVO data using hybrid AB semblance and local similarity

    NASA Astrophysics Data System (ADS)

    Deng, Pan; Chen, Yangkang; Zhang, Yu; Zhou, Hua-Wei

    2016-04-01

    The common-midpoint (CMP) stacking technique plays an important role in enhancing the signal-to-noise ratio (SNR) in seismic data processing and imaging. Weighted stacking is often used to improve the performance of conventional equal-weight stacking in further attenuating random noise and handling the amplitude variations in real seismic data. In this study, we propose a hybrid framework combining AB semblance and a local-similarity-weighted stacking scheme. The objective is to achieve an optimal stacking of CMP gathers with class II amplitude-variation-with-offset (AVO) polarity-reversal anomalies. The selection of a high-quality near-offset reference trace is another innovation of this work because it better preserves useful energy. Applications to synthetic and field seismic data demonstrate that our method greatly improves the ability to capture the true locations of weak reflections, distinguish thin-bed tuning artifacts, and effectively attenuate random noise.

  18. Geodetic survey as a means of improving fast MASW (Multichannel Analysis Of Surface Waves) profiling in difficult terrain/land conditions

    NASA Astrophysics Data System (ADS)

    Matuła, Rafał; Lewińska, Paulina

    2018-01-01

    This paper revolves around a newly designed and constructed system that can make 2D seismic measurements in natural subsoil conditions, and around the role of land surveying in obtaining accurate results and linking them to 3D surface maps. A new type of land streamer, designed for shallow subsurface exploration, is described in this paper. In land seismic data acquisition, a vehicle tows a line of seismic cable lying on a construction called a streamer. Measurements of points and shots are taken while the line is stationary, arbitrarily placed on the seismic profile. The presented land streamer consists of 24 innovative gimballed 10 Hz geophones. It eliminates the need for hand 'planting' of geophones, reducing time and costs. With the use of current survey techniques, all data obtained with this instrument are transferred into 2D and 3D maps. This process is becoming more automatic.

  19. Exploring Hawaiian volcanism

    USGS Publications Warehouse

    Poland, Michael P.; Okubo, Paul G.; Hon, Ken

    2013-01-01

    In 1912 the Hawaiian Volcano Observatory (HVO) was established by Massachusetts Institute of Technology professor Thomas A. Jaggar Jr. on the island of Hawaii. Driven by the devastation he observed while investigating the volcanic disasters of 1902 at Montagne Pelée in the Caribbean, Jaggar conducted a worldwide search and decided that Hawai‘i provided an excellent natural laboratory for systematic study of earthquake and volcano processes toward better understanding of seismic and volcanic hazards. In the 100 years since HVO’s founding, surveillance and investigation of Hawaiian volcanoes have spurred advances in volcano and seismic monitoring techniques, extended scientists’ understanding of eruptive activity and processes, and contributed to development of global theories about hot spots and mantle plumes.

  20. Exploring Hawaiian Volcanism

    NASA Astrophysics Data System (ADS)

    Poland, Michael P.; Okubo, Paul G.; Hon, Ken

    2013-02-01

    In 1912 the Hawaiian Volcano Observatory (HVO) was established by Massachusetts Institute of Technology professor Thomas A. Jaggar Jr. on the island of Hawaii. Driven by the devastation he observed while investigating the volcanic disasters of 1902 at Montagne Pelée in the Caribbean, Jaggar conducted a worldwide search and decided that Hawai`i provided an excellent natural laboratory for systematic study of earthquake and volcano processes toward better understanding of seismic and volcanic hazards. In the 100 years since HVO's founding, surveillance and investigation of Hawaiian volcanoes have spurred advances in volcano and seismic monitoring techniques, extended scientists' understanding of eruptive activity and processes, and contributed to development of global theories about hot spots and mantle plumes.

  1. Imaging an Active Volcano Edifice at Tenerife Island, Spain

    NASA Astrophysics Data System (ADS)

    Ibáñez, Jesús M.; Rietbrock, Andreas; García-Yeguas, Araceli

    2008-08-01

    An active seismic experiment to study the internal structure of Teide volcano is being carried out on Tenerife, a volcanic island in Spain's Canary Islands archipelago. The main objective of the Tomography at Teide Volcano Spain (TOM-TEIDEVS) experiment, begun in January 2007, is to obtain a three-dimensional (3-D) structural image of Teide volcano using seismic tomography and seismic reflection/refraction imaging techniques. At present, knowledge of the deeper structure of Teide and Tenerife is very limited, with proposed structural models based mainly on sparse geophysical and geological data. The multinational experiment, involving institutes from Spain, the United Kingdom, Italy, Ireland, and Mexico, will generate a unique high-resolution structural image of the active volcano edifice and will further our understanding of volcanic processes.

  2. Discriminating Induced-Microearthquakes Using New Seismic Features

    NASA Astrophysics Data System (ADS)

    Mousavi, S. M.; Horton, S.

    2016-12-01

    We studied characteristics of induced microearthquakes on the basis of the waveforms recorded on a limited number of surface receivers using machine-learning techniques. Forty features in the time, frequency, and time-frequency domains were measured on each waveform, and several techniques such as correlation-based feature selection, Artificial Neural Networks (ANNs), Logistic Regression (LR) and X-means were used as research tools to explore the relationship between these seismic features and source parameters. The results show that spectral features have the highest correlation to source depth. Two new measurements developed as seismic features for this study, spectral centroids and 2D cross-correlations in the time-frequency domain, performed better than the common seismic measurements. These features can be used by machine-learning techniques for efficient automatic classification of low-energy signals recorded at one or more seismic stations. We applied the technique to 440 microearthquakes. Reference: Mousavi, S. M., S. P. Horton, C. A. Langston, and B. Samei (2016), Seismic features and automatic discrimination of deep and shallow induced-microearthquakes using neural network and logistic regression, Geophys. J. Int., doi: 10.1093/gji/ggw258.
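
    One of the two new features named above is a spectral centroid; the sketch below shows one plausible way to compute such a feature from a single waveform (the paper's exact definition, e.g. a centroid of the time-frequency distribution, may differ). The toy waveforms are purely illustrative.

    ```python
    import numpy as np

    def spectral_centroid(waveform, fs):
        """Amplitude-weighted mean frequency of a waveform's spectrum (Hz)."""
        spec = np.abs(np.fft.rfft(waveform * np.hanning(len(waveform))))
        freqs = np.fft.rfftfreq(len(waveform), d=1.0 / fs)
        return np.sum(freqs * spec) / np.sum(spec)

    # Toy comparison: a "deep" event richer in low frequencies vs a "shallow"
    # one richer in high frequencies (purely illustrative waveforms)
    fs, t = 200.0, np.arange(0, 2, 1 / 200.0)
    deep = np.exp(-2 * t) * np.sin(2 * np.pi * 5 * t)
    shallow = np.exp(-2 * t) * np.sin(2 * np.pi * 25 * t)
    print("deep centroid    %.1f Hz" % spectral_centroid(deep, fs))
    print("shallow centroid %.1f Hz" % spectral_centroid(shallow, fs))
    ```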

  3. Testing seismic amplitude source location for fast debris-flow detection at Illgraben, Switzerland

    NASA Astrophysics Data System (ADS)

    Walter, Fabian; Burtin, Arnaud; McArdell, Brian W.; Hovius, Niels; Weder, Bianca; Turowski, Jens M.

    2017-06-01

    Heavy precipitation can mobilize tens to hundreds of thousands of cubic meters of sediment in steep Alpine torrents in a short time. The resulting debris flows (mixtures of water, sediment and boulders) move downstream with velocities of several meters per second and have a high destruction potential. Warning protocols for affected communities rely on raising awareness about the debris-flow threat, precipitation monitoring and rapid detection methods. The latter, in particular, is a challenge because debris-flow-prone torrents have their catchments in steep and inaccessible terrain, where instrumentation is difficult to install and maintain. Here we test amplitude source location (ASL) as a processing scheme for seismic network data for early warning purposes. We use debris-flow and noise seismograms from the Illgraben catchment, Switzerland, a torrent system which produces several debris-flow events per year. Automatic in situ detection is currently based on geophones mounted on concrete check dams and radar stage sensors suspended above the channel. The ASL approach has the advantage that it uses seismometers, which can be installed at more accessible locations where a stable connection to mobile phone networks is available for data communication. Our ASL processing uses time-averaged ground vibration amplitudes to estimate the location of the debris-flow front. Applied to continuous data streams, inversion of the seismic amplitude decay throughout the network is robust and efficient, requires no manual identification of seismic phase arrivals and eliminates the need for a local seismic velocity model. We apply the ASL technique to a small debris-flow event on 19 July 2011, which was captured with a temporary seismic monitoring network. The processing rapidly detects the debris-flow event half an hour before arrival at the outlet of the torrent and several minutes before detection by the in situ alarm system. An analysis of continuous seismic records furthermore indicates that detectability of Illgraben debris flows of this size is unaffected by changing environmental and anthropogenic seismic noise and that false detections can be greatly reduced with simple processing steps.
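
    A minimal sketch of the amplitude-source-location principle (not the study's exact implementation): assume amplitudes decay with distance through surface-wave spreading and attenuation, and grid-search the source position whose predicted amplitude pattern best matches the time-averaged network amplitudes. The station geometry, Q, frequency and velocity below are hypothetical.

    ```python
    import numpy as np

    def asl_locate(amps, stations, grid_x, grid_y, f=5.0, Q=20.0, beta=1000.0):
        """Grid search of the source position that best explains time-averaged
        amplitudes, assuming surface-wave spreading 1/sqrt(r) plus attenuation."""
        B = np.pi * f / (Q * beta)
        log_a = np.log(amps)
        best = (np.inf, None)
        for x in grid_x:
            for y in grid_y:
                r = np.sqrt((stations[:, 0] - x) ** 2 + (stations[:, 1] - y) ** 2) + 1.0
                # log A_i = log A0 - B r_i - 0.5 log r_i ; fit log A0, keep residual
                pred = -B * r - 0.5 * np.log(r)
                log_a0 = np.mean(log_a - pred)
                misfit = np.sum((log_a - (log_a0 + pred)) ** 2)
                if misfit < best[0]:
                    best = (misfit, (x, y))
        return best[1]

    # Hypothetical 4-station network and a synthetic source at (2300, 900) m
    stations = np.array([[0, 0], [4000, 0], [0, 4000], [4000, 4000]], float)
    src = np.array([2300.0, 900.0])
    B = np.pi * 5.0 / (20.0 * 1000.0)
    r = np.linalg.norm(stations - src, axis=1)
    amps = 50.0 * np.exp(-B * r) / np.sqrt(r)      # noise-free synthetic amplitudes
    grid = np.arange(0, 4001, 100.0)
    print("estimated source:", asl_locate(amps, stations, grid, grid))
    ```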

  4. Multiple timescales of cyclical behaviour observed at two dome-forming eruptions

    NASA Astrophysics Data System (ADS)

    Lamb, Oliver D.; Varley, Nick R.; Mather, Tamsin A.; Pyle, David M.; Smith, Patrick J.; Liu, Emma J.

    2014-09-01

    Cyclic behaviour over a range of timescales is a well-documented feature of many dome-forming volcanoes, but has not previously been identified in high-resolution seismic data from Volcán de Colima (Mexico). Using daily seismic count datasets from Volcán de Colima and Soufrière Hills volcano (Montserrat), this study explores parallels in the long-term behaviour of seismicity at two long-lived systems. Datasets are examined using multiple techniques, including Fast Fourier Transform, Detrended Fluctuation Analysis and Probabilistic Distribution Analysis, and the comparison of results from the two systems reveals interesting parallels in sub-surface processes operating at both systems. Patterns of seismicity at both systems reveal complex but broadly similar long-term temporal patterns with cycles on the order of ~50 to ~200 days. These patterns are consistent with previously published spectral analyses of SO2 flux time-series at Soufrière Hills volcano, and are attributed to variations in the movement of magma in each system. Detrended Fluctuation Analysis determined that both volcanic systems showed a systematic relationship between the number of seismic events and the relative 'roughness' of the time series, and explosions at Volcán de Colima showed a 1.5-2 year cycle; neither observation has a clear explanatory mechanism. At Volcán de Colima, analysis of repose intervals between seismic events shows long-term behaviour that responds to changes in activity at the system. Similar patterns for both volcanic systems suggest a common process or processes driving the observed signal, but it is not clear from these results alone what those processes may be. Further attempts to model conduit processes at each volcano must account for the similarities and differences in activity within each system. The identification of some commonalities in the patterns of behaviour during long-lived dome-forming eruptions at andesitic volcanoes provides a motivation for investigating further use of time-series analysis as a monitoring tool.

  5. Tectonic Inversion Along the Algerian and Ligurian Margins: On the Insight Provided By Latest Seismic Processing Techniques Applied to Recent and Vintage 2D Offshore Multichannel Seismic Data

    NASA Astrophysics Data System (ADS)

    Schenini, L.; Beslier, M. O.; Sage, F.; Badji, R.; Galibert, P. Y.; Lepretre, A.; Dessa, J. X.; Aidi, C.; Watremez, L.

    2014-12-01

    Recent studies on the Algerian and the North-Ligurian margins in the Western Mediterranean have evidenced inversion-related superficial structures, such as folds and asymmetric sedimentary perched basins whose geometry hints at deep compressive structures dipping towards the continent. Deep seismic imaging of these margins is difficult due to steep slope and superficial multiples, and, in the Mediterranean context, to the highly diffractive Messinian evaporitic series in the basin. During the Algerian-French SPIRAL survey (2009, R/V Atalante), 2D marine multi-channel seismic (MCS) reflection data were collected along the Algerian Margin using a 4.5 km, 360 channel digital streamer and a 3040 cu. in. air-gun array. An advanced processing workflow has been laid out using Geocluster CGG software, which includes noise attenuation, 2D SRME multiple attenuation, surface consistent deconvolution, Kirchhoff pre-stack time migration. This processing produces satisfactory seismic images of the whole sedimentary cover, and of southward dipping reflectors in the acoustic basement along the central part of the margin offshore Great Kabylia, that are interpreted as inversion-related blind thrusts as part of flat-ramp systems. We applied this successful processing workflow to old 2D marine MCS data acquired on the North-Ligurian Margin (Malis survey, 1995, R/V Le Nadir), using a 2.5 km, 96 channel streamer and a 1140 cu. in. air-gun array. Particular attention was paid to multiple attenuation in adapting our workflow. The resulting reprocessed seismic images, interpreted with a coincident velocity model obtained by wide-angle data tomography, provide (1) enhanced imaging of the sedimentary cover down to the top of the acoustic basement, including the base of the Messinian evaporites and the sub-salt Miocene series, which appear to be tectonized as far as in the mid-basin, and (2) new evidence of deep crustal structures in the margin which the initial processing had failed to reveal.

  6. Capabilities of seismic and georadar 2D/3D imaging of shallow subsurface of transport route using the Seismobile system

    NASA Astrophysics Data System (ADS)

    Pilecki, Zenon; Isakow, Zbigniew; Czarny, Rafał; Pilecka, Elżbieta; Harba, Paulina; Barnaś, Maciej

    2017-08-01

    In this work, the capabilities of the Seismobile system for shallow subsurface imaging of transport routes, such as roads, railways, and airport runways, in different geological conditions are presented. The Seismobile system combines the advantages of seismic profiling using a landstreamer and georadar (GPR) profiling. It consists of up to four seismic measuring lines and a carriage with a suspended GPR antenna. Shallow subsurface recognition may be achieved to a maximum width of 10.5 m for a distance of 3.5 m between the measurement lines. GPR measurement is performed in the axis of the construction. Seismobile allows measurement time, labour and costs to be reduced owing to its easy installation, remote data transmission from geophones to accompanying measuring modules, automated location of the system based on GPS, and a highly automated method of seismic wave excitation. In this paper, the results of field tests carried out in different geological conditions are presented. The methodologies of acquisition, processing and interpretation of seismic and GPR measurements are broadly described. Seismograms and their spectra recorded by the Seismobile system were compared to those recorded by a Geode seismograph (Geometrics). The seismic data processing and interpretation software allows 2D/3D models of P- and S-wave velocities to be obtained. Combined seismic and GPR results achieved sufficient imaging of the shallow subsurface to depths of over a dozen metres. The obtained geophysical information correlated well with geological information from the boreholes. The results of the performed tests proved the efficiency of the Seismobile system in seismic and GPR imaging of the shallow subsurface of transport routes under complex conditions.

  7. Discovering new events beyond the catalogue—application of empirical matched field processing to Salton Sea geothermal field seismicity

    DOE PAGES

    Wang, Jingbo; Templeton, Dennise C.; Harris, David B.

    2015-07-30

    Using empirical matched field processing (MFP), we compare 4 yr of continuous seismic data to a set of 195 master templates from within an active geothermal field and identify over 140 per cent more events than were identified using traditional detection and location techniques alone. In managed underground reservoirs, a substantial fraction of seismic events can be excluded from the official catalogue due to an inability to clearly identify seismic-phase onsets. Empirical MFP can improve the effectiveness of current seismic detection and location methodologies by using conventionally located events with higher signal-to-noise ratios as master events to define wavefield templates that could then be used to map normally discarded indistinct seismicity. Since MFP does not require picking, it can be carried out automatically and rapidly once suitable templates are defined. In this application, we extend MFP by constructing local-distance empirical master templates using Southern California Earthquake Data Center archived waveform data of events originating within the Salton Sea Geothermal Field. We compare the empirical templates to continuous seismic data collected between 1 January 2008 and 31 December 2011. The empirical MFP method successfully identifies 6249 additional events, while the original catalogue reported 4352 events. The majority of these new events are lower-magnitude events with magnitudes between M0.2–M0.8. Here, the increased spatial-temporal resolution of the microseismicity map within the geothermal field illustrates how empirical MFP, when combined with conventional methods, can significantly improve seismic network detection capabilities, which can aid in long-term sustainability and monitoring of managed underground reservoirs.

  8. The Use of Signal Dimensionality for Automatic QC of Seismic Array Data

    NASA Astrophysics Data System (ADS)

    Rowe, C. A.; Stead, R. J.; Begnaud, M. L.; Draganov, D.; Maceira, M.; Gomez, M.

    2014-12-01

    A significant problem in seismic array analysis is the inclusion of bad sensor channels in the beamforming process. We are testing an approach to automated, on-the-fly quality control (QC) to aid in the identification of poorly performing sensor channels prior to beamforming in routine event detection or location processing. The idea stems from methods used for large computer servers, where monitoring traffic at enormous numbers of nodes is impractical on a node-by-node basis, so the dimensionality of the node traffic is instead monitored for anomalies that could represent malware, cyber-attacks or other problems. The technique relies upon the use of subspace dimensionality or principal components of the overall system traffic. The subspace technique is not new to seismology, but its most common application has been limited to comparing waveforms to an a priori collection of templates for detecting highly similar events in a swarm or seismic cluster. We examine the signal dimension in a similar way to the method that addresses node traffic anomalies in large computer systems. We explore the effects of malfunctioning channels on the dimension of the data and its derivatives, and how to leverage this effect for identifying bad array elements. We show preliminary results applied to arrays in Kazakhstan (Makanchi) and Argentina (Malargue).

  9. Single-station 6C beamforming

    NASA Astrophysics Data System (ADS)

    Nakata, N.; Hadziioannou, C.; Igel, H.

    2017-12-01

    Six-component measurements of seismic ground motion provide a unique opportunity to identify and decompose seismic wavefields into different wave types and incoming azimuths, as well as to estimate structural information (e.g., phase velocity). By using the relationship between the transverse component and vertical rotational motion for Love waves, we can find the incident azimuth of the wave and the phase velocity. Therefore, when we scan the entire range of azimuths and slownesses, we can process the seismic waves in a similar way to conventional beamforming, without using a station array. To further improve the beam resolution, we use the distribution of the amplitude ratio between translational and rotational motions at each time sample. With this beamforming, we decompose multiple incoming waves by azimuth and phase velocity using only one station. We demonstrate this technique using data observed at Wettzell (vertical rotational motion and 3C translational motions). The beamforming results are encouraging for extracting phase velocity at the station location, for applications to oceanic microseisms, and for identifying complicated SH wave arrivals. We also discuss single-station beamforming using other components (vertical translational and horizontal rotational components). For future work, we need to understand the resolution limit of this technique, suitable lengths of time windows, and the sensitivity to weak motion.
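
    A toy illustration of the single-station principle for Love waves, under the textbook relation a_T = -2c·Ω_z between transverse acceleration and vertical rotation rate: scan azimuth for the rotation that maximises the correlation between the two, then estimate phase velocity from their amplitude ratio. Sign conventions differ between instruments; the synthetic example below is only internally consistent.

    ```python
    import numpy as np

    def love_wave_scan(acc_n, acc_e, rot_z, azimuths_deg):
        """For a plane Love wave, transverse acceleration and vertical rotation
        rate satisfy a_T = -2*c*rot_z. Scanning azimuth for the largest absolute
        correlation between the two gives the propagation azimuth (with a 180 deg
        ambiguity resolved by the correlation sign), and the amplitude ratio
        gives the phase velocity c."""
        best_az, best_corr = None, 0.0
        for az in azimuths_deg:
            th = np.radians(az)
            a_t = -np.sin(th) * acc_n + np.cos(th) * acc_e   # transverse component
            corr = np.corrcoef(a_t, rot_z)[0, 1]
            if abs(corr) > abs(best_corr):
                best_az, best_corr = az, corr
        th = np.radians(best_az)
        a_t = -np.sin(th) * acc_n + np.cos(th) * acc_e
        c = 0.5 * np.std(a_t) / np.std(rot_z)
        return best_az, best_corr, c

    # Internally consistent toy data: Love wave from azimuth 60 deg, c = 3000 m/s
    rng = np.random.default_rng(8)
    t = np.linspace(0.0, 20.0, 4000)
    az_true, c_true = 60.0, 3000.0
    a_t_true = np.sin(2 * np.pi * 0.5 * t)                   # transverse acceleration
    th = np.radians(az_true)
    acc_n = -np.sin(th) * a_t_true + 0.05 * rng.standard_normal(t.size)
    acc_e = np.cos(th) * a_t_true + 0.05 * rng.standard_normal(t.size)
    rot_z = -a_t_true / (2.0 * c_true) + 1e-5 * rng.standard_normal(t.size)

    az, corr, c = love_wave_scan(acc_n, acc_e, rot_z, np.arange(0.0, 180.0, 1.0))
    print(f"azimuth {az:.0f} deg (corr {corr:+.2f}), phase velocity {c:.0f} m/s")
    ```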

  10. Automated Processing Workflow for Ambient Seismic Recordings

    NASA Astrophysics Data System (ADS)

    Girard, A. J.; Shragge, J.

    2017-12-01

    Structural imaging using body-wave energy present in ambient seismic data remains a challenging task, largely because these wave modes are commonly much weaker than surface wave energy. In a number of situations body-wave energy has been extracted successfully; however, nearly all successful body-wave extraction and imaging approaches have focused on cross-correlation processing. While this is useful for interferometric purposes, it can also lead to the inclusion of unwanted noise events that dominate the resulting stack, leaving body-wave energy overpowered by the coherent noise. Conversely, wave-equation imaging can be applied directly to non-correlated ambient data that have been preprocessed to mitigate unwanted energy (i.e., surface waves, burst-like and electromechanical noise) and enhance body-wave arrivals. Following this approach, though, requires a significant preprocessing effort on what are often terabytes of ambient seismic data, which is expensive and requires automation to be feasible. In this work we outline an automated processing workflow designed to optimize the recovery of body-wave energy from an ambient seismic data set acquired on a large-N array at a mine site near Lalor Lake, Manitoba, Canada. We show that processing ambient seismic data in the recording domain, rather than the cross-correlation domain, allows us to mitigate energy that is inappropriate for body-wave imaging. We first develop a method for window selection that automatically identifies and removes data contaminated by coherent high-energy bursts. We then apply time- and frequency-domain debursting techniques to mitigate the effects of remaining strong-amplitude and/or monochromatic energy without severely degrading the overall waveforms. After each processing step we implement a QC check to investigate improvements in the convergence rates, and the emergence of reflection events, in the cross-correlation-plus-stack waveforms over hour-long windows. Overall, the QC analyses suggest that automated preprocessing of ambient seismic recordings in the recording domain successfully mitigates unwanted coherent noise events in both the time and frequency domain. Accordingly, we assert that this method is beneficial for direct wave-equation imaging with ambient seismic recordings.
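
    A minimal sketch of the kind of automated window selection described (our own simplification, not the authors' workflow): split the record into windows, compute an RMS energy per window, and reject windows exceeding a robust median-plus-MAD threshold before any correlation or stacking. The window length and threshold factor are placeholders.

    ```python
    import numpy as np

    def select_quiet_windows(trace, fs, win_s=60.0, k=5.0):
        """Boolean mask of windows whose RMS amplitude is below
        median + k * MAD; high-energy (burst-contaminated) windows are rejected."""
        n_win = int(len(trace) // (win_s * fs))
        wins = trace[: int(n_win * win_s * fs)].reshape(n_win, -1)
        rms = np.sqrt(np.mean(wins ** 2, axis=1))
        med = np.median(rms)
        mad = np.median(np.abs(rms - med)) + 1e-20
        return rms < med + k * mad, rms

    # Toy record: background noise with two strong bursts
    rng = np.random.default_rng(4)
    fs = 100.0
    trace = rng.standard_normal(int(fs * 3600))               # one hour of data
    trace[100000:101000] += 30.0 * rng.standard_normal(1000)  # burst 1
    trace[250000:252000] += 20.0 * rng.standard_normal(2000)  # burst 2
    mask, rms = select_quiet_windows(trace, fs)
    print(f"kept {mask.sum()} of {mask.size} windows; "
          f"rejected window indices: {np.where(~mask)[0]}")
    ```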

  11. Seismic reflection imaging of shallow oceanographic structures

    NASA Astrophysics Data System (ADS)

    Piété, Helen; Marié, Louis; Marsset, Bruno; Thomas, Yannick; Gutscher, Marc-André

    2013-05-01

    Multichannel seismic (MCS) reflection profiling can provide high lateral resolution images of deep ocean thermohaline fine structure. However, the shallowest layers of the water column (z < 150 m) have remained unexplored by this technique until recently. In order to explore the feasibility of shallow seismic oceanography (SO), we reprocessed and analyzed four multichannel seismic reflection sections featuring reflectors at depths between 10 and 150 m. The influence of the acquisition parameters was quantified. Seismic data processing dedicated to SO was also investigated. Conventional seismic acquisition systems were found to be ill-suited to the imaging of shallow oceanographic structures, because of a high antenna filter effect induced by large offsets and seismic trace lengths, and sources that typically cannot provide both a high level of emission and fine vertical resolution. We considered a test case, the imagery of the seasonal thermocline on the western Brittany continental shelf. New oceanographic data acquired in this area allowed simulation of the seismic acquisition. Sea trials of a specifically designed system were performed during the ASPEX survey, conducted in early summer 2012. The seismic device featured: (i) four seismic streamers, each consisting of six traces of 1.80 m; (ii) a 1000 J SIG sparker source, providing a 400 Hz signal with a level of emission of 205 dB re 1 μPa @ 1 m. This survey captured the 15 m thick, 30 m deep seasonal thermocline in unprecedented detail, showing images of vertical displacements most probably induced by internal waves.

  12. Data Mining for Tectonic Tremor in a Large Global Seismogram Database using Preprocessed Data Quality Measurements

    NASA Astrophysics Data System (ADS)

    Rasor, B. A.; Brudzinski, M. R.

    2013-12-01

    The collision of plates at subduction zones yields the potential for disastrous earthquakes, yet the processes that lead up to these events are still largely unclear and make them difficult to forecast. Recent advancements in seismic monitoring have revealed subtle ground vibrations termed tectonic tremor that occur as long-lived swarms of narrow-bandwidth activity, different from local earthquakes of comparable amplitude that create brief signals of broader, higher frequency. The close proximity of detected tremor events to the lower edge of the seismogenic zone along the subduction interface suggests a potential triggering relationship between tremor and megathrust earthquakes. Most tremor catalogs are constructed with detection methods that involve an exhaustive download of years of high-sample-rate seismic data, as well as large computational power to process the large data volume and identify temporal patterns of tremor activity. We have developed a tremor detection method that employs the underutilized Quality Analysis Control Kit (QuACK), originally built to analyze station performance and identify instrument problems across the many seismic networks that contribute data to one of the largest seismogram databases in the world (IRIS DMC). The QuACK dataset stores seismogram amplitudes at a wide range of frequencies calculated every hour since 2005 for most stations archived in the IRIS DMC. Such a preprocessed dataset is advantageous considering several tremor detection techniques use hourly seismic amplitudes in the frequency band where tremor is most active (2-5 Hz) to characterize the time history of tremor. Yet these previous detection techniques have relied on downloading years of 40-100 sample-per-second data to make the calculations, which typically takes several days on a 36-node high-performance cluster to calculate the amplitude variations for a single station. Processing times are even longer for a recently developed detection algorithm that utilizes the ratio of amplitudes between tremor frequencies and those of local earthquakes (10-15 Hz) and surface waves (0.02-0.1 Hz). Using the QuACK dataset, we can make the more advanced calculations in a fraction of the time. This method works well to quickly detect tremor in the Cascadia region by finding similar times of increased tremor activity when comparing across a variety of stations within a 100 km radius of a reference station. We confirm the legitimacy of this method by demonstrating comparable results to several previously developed tremor detection techniques despite a much shorter processing time. The rapid processing time has allowed us to refine the detection algorithm by seeking more optimal frequency bands, comparing results from our technique and others using several stations across the Cascadia subduction zone. As we move forward, we will apply the method to other subduction zones, and ultimately to the vast set of seismic data stored at the IRIS DMC for which tremor has not been previously investigated.
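
    A minimal sketch of the band-amplitude-ratio statistic that such hourly detections rely on (a simplification of the published algorithms, with hypothetical band edges and filter choices): compare the RMS amplitude in the tremor band with the earthquake and microseism bands for an hour of data.

    ```python
    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    def band_rms(trace, fs, fmin, fmax):
        """RMS amplitude of the trace in the given frequency band."""
        sos = butter(2, [fmin, fmax], btype="band", fs=fs, output="sos")
        return np.sqrt(np.mean(sosfiltfilt(sos, trace) ** 2))

    def tremor_ratio(hour_trace, fs):
        """Ratio of tremor-band (2-5 Hz) amplitude to the geometric mean of
        earthquake-band (10-15 Hz) and microseism-band (0.02-0.1 Hz) amplitudes."""
        tremor = band_rms(hour_trace, fs, 2.0, 5.0)
        quake = band_rms(hour_trace, fs, 10.0, 15.0)
        surf = band_rms(hour_trace, fs, 0.02, 0.1)
        return tremor / np.sqrt(quake * surf)

    # Toy hour of data: background noise plus a band-limited "tremor" burst
    rng = np.random.default_rng(5)
    fs = 40.0
    trace = rng.standard_normal(int(fs * 3600))
    sos = butter(2, [2.0, 5.0], btype="band", fs=fs, output="sos")
    trace[40000:80000] += 3.0 * sosfiltfilt(sos, rng.standard_normal(40000))
    print("tremor ratio for this hour: %.2f" % tremor_ratio(trace, fs))
    ```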

  13. Studies of short and long memory in mining-induced seismic processes

    NASA Astrophysics Data System (ADS)

    Węglarczyk, Stanisław; Lasocki, Stanisław

    2009-09-01

    Memory of a stochastic process implies its predictability, understood as a possibility to gain information on the future above the random-guess level. Here we search for memory in the mining-induced seismic process (MIS), that is, a process induced or triggered by mining operations. Long memory is investigated by means of the Hurst rescaled range analysis, and the autocorrelation function estimate is used to test for short memory. Both methods are complemented with result-uncertainty analyses based on different resampling techniques. The analyzed data comprise event series from the Rudna copper mine in Poland. The studies show that the interevent time and interevent distance processes have both long and short memory. MIS occurrences and locations are internally interrelated. Internal relations among the sizes of MIS events are apparently weaker than those of the other two studied parameterizations and are limited to long-term interactions.
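
    A minimal sketch of the rescaled-range (R/S) analysis mentioned above is given below, assuming a plain 1-D series such as interevent times; the window sizes, doubling scheme and synthetic example are illustrative choices rather than the authors' exact procedure.

```python
import numpy as np

def rescaled_range(x):
    """R/S statistic of a 1-D series: range of the cumulative
    mean-adjusted sum divided by the standard deviation."""
    x = np.asarray(x, dtype=float)
    z = np.cumsum(x - x.mean())
    r = z.max() - z.min()
    s = x.std(ddof=0)
    return r / s if s > 0 else np.nan

def hurst_exponent(x, min_chunk=8):
    """Estimate the Hurst exponent from the slope of log(R/S) vs log(n),
    averaging R/S over non-overlapping windows of increasing length n."""
    x = np.asarray(x, dtype=float)
    sizes, rs = [], []
    n = min_chunk
    while n <= len(x) // 2:
        chunks = [x[i:i + n] for i in range(0, len(x) - n + 1, n)]
        vals = [rescaled_range(c) for c in chunks]
        vals = [v for v in vals if np.isfinite(v)]
        if vals:
            sizes.append(n)
            rs.append(np.mean(vals))
        n *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope  # H > 0.5 suggests long memory (persistence)

# Example on a synthetic interevent-time series (uncorrelated -> H ~ 0.5)
rng = np.random.default_rng(1)
print(hurst_exponent(rng.exponential(1.0, 4096)))
```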

  14. The sequentially discounting autoregressive (SDAR) method for on-line automatic seismic event detection in long-term observations

    NASA Astrophysics Data System (ADS)

    Wang, L.; Toshioka, T.; Nakajima, T.; Narita, A.; Xue, Z.

    2017-12-01

    In recent years, more and more Carbon Capture and Storage (CCS) studies focus on seismicity monitoring. For the safety management of geological CO2 storage at Tomakomai, Hokkaido, Japan, an Advanced Traffic Light System (ATLS) combining different seismic information (magnitudes, phases, distributions, etc.) is proposed for injection control. The primary task for ATLS is seismic event detection in a long-term, sustained time series record. The time-varying Signal to Noise Ratio (SNR) of a long-term record and the uneven energy distribution of seismic event waveforms increase the difficulty of automatic detection; in this work, an improved probabilistic autoregressive (AR) method for automatic seismic event detection is therefore applied. This algorithm, called sequentially discounting AR learning (SDAR), can identify effective seismic events in the time series through Change Point Detection (CPD) on the seismic record. In this method, an anomalous signal (seismic event) is treated as a change point in the time series (seismic record): the statistical model of the signal in the neighborhood of the event changes because of the seismic event occurrence. This means the SDAR aims to find the statistical irregularities of the record through CPD. There are three advantages of SDAR. 1. Anti-noise ability: SDAR does not use waveform attributes (such as amplitude, energy, polarization) for signal detection, making it an appropriate technique for low-SNR data. 2. Real-time estimation: when new data appear in the record, the probability distribution models can be automatically updated by SDAR for on-line processing. 3. Discounting property: SDAR introduces a discounting parameter to decrease the influence of the present statistical values on future data, which makes SDAR a robust algorithm for non-stationary signal processing. With these three advantages, the SDAR method can handle non-stationary, time-varying, long-term series and achieve real-time monitoring. Finally, we apply SDAR to a synthetic model and to Tomakomai Ocean Bottom Cable (OBC) baseline data to demonstrate the feasibility and advantages of our method.
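
    The sketch below shows a stripped-down, AR(1)-only variant of the sequentially discounting idea: model parameters are updated with a forgetting factor and each sample is scored by its negative log-likelihood under the one-step prediction. It is an illustrative approximation, not the full SDAR formulation used in the study; the parameter r and the synthetic example are assumptions.

```python
import numpy as np

def sdar_scores(x, r=0.02, eps=1e-8):
    """Sequentially discounting AR(1) anomaly scores (simplified sketch).

    The model parameters (mean, lag-0/lag-1 covariances, residual variance)
    are updated with a discounting factor r, so older samples lose
    influence.  The score of each new sample is its negative log-likelihood
    under the one-step Gaussian prediction; change points such as seismic
    events show up as bursts of high score.
    """
    x = np.asarray(x, dtype=float)
    mu, c0, c1, sigma2 = x[0], 1.0, 0.0, 1.0
    scores = np.zeros(len(x))
    for t in range(1, len(x)):
        a = c1 / (c0 + eps)                       # AR(1) coefficient
        pred = mu + a * (x[t - 1] - mu)           # one-step prediction
        err = x[t] - pred
        scores[t] = 0.5 * (np.log(2 * np.pi * sigma2) + err**2 / sigma2)
        # discounted (forgetting) parameter updates
        mu = (1 - r) * mu + r * x[t]
        c0 = (1 - r) * c0 + r * (x[t] - mu) ** 2
        c1 = (1 - r) * c1 + r * (x[t] - mu) * (x[t - 1] - mu)
        sigma2 = (1 - r) * sigma2 + r * err**2
    return scores

# Example: a step-like "event" in noisy data produces a spike in the score.
rng = np.random.default_rng(2)
trace = rng.normal(0, 1, 2000)
trace[1000:1100] += 6.0
print(np.argmax(sdar_scores(trace)))   # near sample 1000
```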

  15. Breakup magmatism on the Vøring Margin, mid-Norway: New insight from interpretation of high-quality 2D and 3D seismic reflection data

    NASA Astrophysics Data System (ADS)

    Abdelmalak, M. M.; Planke, S.; Millett, J.; Jerram, D. A.; Maharjan, D.; Zastrozhnov, D.; Schmid, D. W.; Faleide, J. I.; Svensen, H.; Myklebust, R.

    2017-12-01

    The Vøring Margin offshore mid-Norway is a classic volcanic rifted margin, characterized by voluminous Paleogene igneous rocks present on both sides of the continent-ocean boundary. The margin displays (1) thickened transitional crust with a well-defined lower crustal high-velocity body and prominent deep crustal reflections, the so-called T-Reflection, (2) seaward dipping reflector (SDR) wedges and a prominent northeast-trending escarpment on the Vøring Marginal High, and (3) extensive sill complexes in the adjacent Cretaceous Vøring Basin. During the last decade, new 2D and 3D industry seismic data along with improved processing techniques, such as broadband processing and noise reduction processing sequences, have made it possible to image and map the breakup igneous complex in much greater detail than previously possible. Our interpretation includes a combination of (1) seismic horizon picking, (2) integrated seismic-gravity-magnetic (SGM) interpretation, (3) seismic volcanostratigraphy, and (4) igneous seismic geomorphology. The results are integrated with published wide-angle seismic data, re-analyzed borehole data including new geochronology, and new geodynamic modeling of the effects of magmatism on the thermal history and subsidence of the margin. The extensive sill complexes and associated hydrothermal vent complexes in the Vøring Basin have a Paleocene-Eocene boundary age based on high-precision U/Pb dating combined with seismic mapping constraints. On the marginal high, our results show a highly variable crustal structure, with a pre-breakup configuration consisting of large-scale structural highs and sedimentary basins. These structures were in-filled and covered by basalt flows and volcanogenic sediments during the early stages of continental breakup in the earliest Eocene. Subsequently, rift basins developed along the continent-ocean boundary and were infilled by up to ca. 6 km thick basalt sequences, currently imaged as SDRs fed by a dike swarm imaged on seismic data. The addition of magma within the crust had a prominent effect on the thermal history and hydrocarbon maturation of the sedimentary basin, causing uplift, delayed subsidence, and possibly contributing to the triggering of global warming during the Paleocene-Eocene Thermal Maximum (PETM).

  16. Seismic field measurements in Kylylahti, Finland, in support of the further development of geophysical seismic techniques for CTBT On-site Inspections

    NASA Astrophysics Data System (ADS)

    Labak, Peter; Lindblom, Pasi; Malich, Gregor

    2017-04-01

    The Integrated Field Exercise of 2014 (IFE14) was a field event held in the Hashemite Kingdom of Jordan (with concurrent activities in Austria) during which the operational and technical capabilities of a Comprehensive Test Ban Treaty (CTBT) on-site inspection (OSI) were tested in an integrated manner. Many of the inspection techniques permitted by the CTBT were applied during IFE14, including a range of geophysical techniques; however, one of the techniques foreseen by the CTBT but not yet developed is resonance seismometry. During August and September 2016, seismic field measurements were conducted in the region of Kylylahti, Finland, in support of the further development of geophysical seismic techniques for OSIs. Forty-five seismic stations were used to continuously acquire seismic signals. During that period, data from local, regional and teleseismic natural events and man-made events were acquired, including from a devastating earthquake in Italy and the nuclear explosion announced by the Democratic People's Republic of Korea on 9 September 2016. In addition, data were acquired following the small-scale use of man-made chemical explosives in the area and the operation of vibratory sources. This presentation will show examples from the data set and will discuss its use for the development of resonance seismometry for OSIs.

  17. Cross-Scale Modelling of Subduction from Minutes to Millions of Years Time Scales

    NASA Astrophysics Data System (ADS)

    Sobolev, S. V.; Muldashev, I. A.

    2015-12-01

    Subduction is an essentially multi-scale process, with time scales spanning from the geological scale to the earthquake scale with the seismic cycle in between. Modelling of such a process constitutes one of the largest challenges in geodynamic modelling today. Here we present a cross-scale thermomechanical model capable of simulating the entire subduction process from rupture (1 min) to geological time (millions of years) that employs elasticity, mineral-physics-constrained non-linear transient viscous rheology, and rate-and-state friction plasticity. The model generates spontaneous earthquake sequences. The adaptive time-step algorithm recognizes the moment of instability and drops the integration time step to its minimum value of 40 s during the earthquake. The time step is then gradually increased to its maximal value of 5 yr, following the decreasing displacement rates during postseismic relaxation. Efficient implementation of the numerical techniques allows long-term simulations with total durations of millions of years. This technique allows us to follow in detail the deformation process during an entire seismic cycle and over multiple seismic cycles. We observe various deformation patterns during the modelled seismic cycle that are consistent with surface GPS observations and demonstrate that, contrary to conventional ideas, the postseismic deformation may be controlled by viscoelastic relaxation in the mantle wedge, starting within only a few hours after great (M>9) earthquakes. Interestingly, in our model the average slip velocity at the fault closely follows a hyperbolic decay law. In natural observations, such deformation is interpreted as afterslip, while in our model it is caused by the viscoelastic relaxation of the mantle wedge, whose viscosity varies strongly with time. We demonstrate that our results are consistent with the postseismic surface displacement after the Great Tohoku Earthquake for the day-to-year time range. We will also present results of modeling the deformation of the upper plate during multiple earthquake cycles at time scales of hundreds of thousands to millions of years and discuss the effect of great earthquakes on the long-term stress field in the upper plate.

  18. Synthetic seismic monitoring using reverse-time migration and Kirchhoff migration for CO2 sequestration in Korea

    NASA Astrophysics Data System (ADS)

    Kim, W.; Kim, Y.; Min, D.; Oh, J.; Huh, C.; Kang, S.

    2012-12-01

    During the last two decades, CO2 sequestration in the subsurface has been extensively studied and has progressed as a direct tool to reduce CO2 emissions. Commercial projects such as Sleipner, In Salah and Weyburn, which inject more than one million tons of CO2 per year, are operated actively, as well as test projects such as Ketzin, to study the behavior of CO2 and the monitoring techniques. Korea has also begun a CCS (CO2 capture and storage) project. One of the prospects for CO2 sequestration in Korea is the southwestern continental margin of the Ulleung basin. To monitor the behavior of CO2 underground for the evaluation of stability and safety, several geophysical monitoring techniques should be applied. Among the various geophysical monitoring techniques, the seismic survey is considered the most effective tool. To verify CO2 migration in the subsurface more effectively, seismic numerical simulation is an essential process. Furthermore, the efficiency of seismic migration techniques should be investigated for various cases, because numerical seismic simulation and migration tests help us interpret CO2 migration accurately. In this study, we apply reverse-time migration and Kirchhoff migration to synthetic seismic monitoring data generated for a simplified model based on the geological structures of the Ulleung basin in Korea. Synthetic seismic monitoring data are generated for various cases of CO2 migration in the subsurface. From the seismic migration images, we can investigate CO2 diffusion patterns indirectly. The seismic monitoring simulations show that while reverse-time migration generates clear subsurface images when subsurface structures are steeply dipping, Kirchhoff migration has an advantage in imaging horizontally layered structures such as the depositional sediments appearing on the continental shelf. Both reverse-time migration and Kirchhoff migration produce reliable subsurface images for the potential site characterized by stratigraphic traps. In the case of vertical CO2 migration at the injection point, reverse-time migration yields better images than Kirchhoff migration does; on the other hand, Kirchhoff migration images horizontal CO2 migration more clearly than reverse-time migration does. From these results, we conclude that reverse-time migration and Kirchhoff migration can complement each other in describing the behavior of CO2 in the subsurface. Acknowledgements: This work was financially supported by the Brain Korea 21 project of Energy Systems Engineering, the "Development of Technology for CO2 Marine Geological Storage" program funded by the Ministry of Land, Transport and Maritime Affairs (MLTM) of Korea, and the Korea CCS R&D Center (KCRC) grant funded by the Korea government (Ministry of Education, Science and Technology) (No. 2012-0008926).
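
    For illustration, a minimal constant-velocity, zero-offset Kirchhoff (diffraction-stack) time migration is sketched below; the geometry, velocity and synthetic point diffractor are assumptions, and the code ignores amplitude weighting, obliquity and anti-aliasing filters that a production implementation would include.

```python
import numpy as np

def kirchhoff_time_migration(data, dx, dt, velocity):
    """Constant-velocity zero-offset Kirchhoff (diffraction-stack) time
    migration, as a minimal sketch.  data has shape (ntraces, nsamples).
    For every image point (x0, t0) the amplitudes lying on the diffraction
    hyperbola t(x) = sqrt(t0**2 + (2*(x - x0)/v)**2) are summed."""
    ntr, ns = data.shape
    image = np.zeros_like(data)
    x = np.arange(ntr) * dx
    t0 = np.arange(ns) * dt
    for ix0 in range(ntr):
        offs = (2.0 * (x - x[ix0]) / velocity) ** 2
        for it0 in range(ns):
            t_diff = np.sqrt(t0[it0] ** 2 + offs)        # hyperbola times
            samples = np.rint(t_diff / dt).astype(int)   # nearest samples
            valid = samples < ns
            image[ix0, it0] = data[np.where(valid)[0], samples[valid]].sum()
    return image

# Example: a single point diffractor collapses to a focused point.
nt, ntr, dt, dx, v = 400, 101, 0.004, 12.5, 2000.0
data = np.zeros((ntr, nt))
x = (np.arange(ntr) - 50) * dx
t_hyp = np.sqrt(0.4 ** 2 + (2 * x / v) ** 2)
data[np.arange(ntr), np.rint(t_hyp / dt).astype(int)] = 1.0
migrated = kirchhoff_time_migration(data, dx, dt, v)
print(np.unravel_index(np.argmax(migrated), migrated.shape))  # ~ (50, 100)
```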

  19. Seismic Characterization of the Newberry and Cooper Basin EGS Sites

    NASA Astrophysics Data System (ADS)

    Templeton, D. C.; Wang, J.; Goebel, M.; Johannesson, G.; Myers, S. C.; Harris, D.; Cladouhos, T. T.

    2015-12-01

    To aid in the seismic characterization of Engineered Geothermal Systems (EGS), we enhance traditional microearthquake detection and location methodologies at two EGS systems: the Newberry EGS site and the Habanero EGS site in the Cooper Basin of South Australia. We apply the Matched Field Processing (MFP) seismic imaging technique to detect new seismic events using known discrete microearthquake sources. Events identified using MFP typically have smaller magnitudes or occur within the coda of a larger event. Additionally, we apply a Bayesian multiple-event location algorithm, called MicroBayesLoc, to estimate the 95% probability ellipsoids for events with high signal-to-noise ratios (SNR). Such probability ellipsoid information can provide evidence for determining whether a seismic lineation is real or simply within the anticipated error range. At the Newberry EGS site, 235 events were reported in the original catalog. MFP identified 164 additional events (an increase of roughly 70%). For the relocated events in the Newberry catalog, we can distinguish two distinct seismic swarms that fall outside of one another's 95% probability error ellipsoids. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  20. Investigation of Potential Triggered Tremor in Latin America and the Caribbean

    NASA Astrophysics Data System (ADS)

    Gonzalez-Huizar, H.; Velasco, A. A.; Peng, Z.

    2012-12-01

    Recent observations have shown that seismic waves generate transient stresses capable of triggering earthquakes and tectonic (or non-volcanic) tremor far away from the original earthquake source. However, the mechanisms behind remotely triggered seismicity still remain unclear. Triggered tremor signals can be particularly useful in investigating remote triggering processes, since in many cases the tremor pulses are very clearly modulated by the passing surface waves. The temporal stress changes (magnitude and orientation) caused by seismic waves at the tremor source region can be calculated and correlated with tremor pulses, which allows the stresses involved in the triggering process to be explored. Some observations suggest that triggered and ambient tremor signals are generated under similar physical conditions; thus, investigating triggered tremor might also provide important clues on how and under what conditions ambient tremor signals are generated. In this work we present some of the results and techniques we employ in the research of potential cases of triggered tectonic tremor in Latin America and the Caribbean. This investigation includes: (1) triggered tremor detection, with the use of specific signal filters; (2) localization of the sources, using less common techniques such as time-reversal of signals; and (3) analysis of the stress conditions under which they are generated, by modeling the dynamic stress related to the triggering waves. Our results suggest that tremor can be dynamically triggered by both Love and Rayleigh waves and in a broad variety of tectonic environments, depending strongly on the dynamic stress amplitude and orientation. Investigating remotely triggered seismicity offers the opportunity to improve our knowledge about deformation mechanisms and the physics of rupture.

  1. Project WILAS: Seismic imaging of crustal and upper mantle structures beneath the western Iberian Peninsula by means of the receiver-function technique

    NASA Astrophysics Data System (ADS)

    Dündar, Süleyman; Dias, Nuno A.; Silveira, Graça; Vinnik, Lev; Haberland, Christian

    2013-04-01

    An accurate knowledge of the structure of the earth's interior is of great importance to our understanding of tectonic processes. The WILAS project (REF: PTDC/CTE-GIX/097946/2008) is a three-year collaborative project developed to study the subsurface structure of the western Iberian Peninsula, with the main emphasis on the lithosphere-asthenosphere system beneath mainland Portugal. The tectonic evolution of the target area has been driven by major plate-tectonic processes such as the opening of the Central Atlantic and the subsequent African-Eurasian convergence. Still, very little is known about the spatial structure of the continental collision. Within the framework of this research, a temporary network of 30 broadband three-component digital stations was operated between 2010 and 2012 in the target area. To carry out a large-scale structural analysis and achieve a dense station coverage of the area under investigation, the permanent Global Seismic Network stations and temporary broadband stations deployed within the scope of several seismic experiments (e.g. Doctar Network, Portuguese National Seismic Network) were included in the analysis. In doing so, an unprecedented volume of high-quality data, with a station spacing of ca. 60 x 60 km from a combined network of 65 temporary and permanent broadband seismic stations, is currently available for research purposes. One of the tasks of the WILAS research project has been a study of seismic velocity discontinuities beneath the western Iberian Peninsula, down to a depth of 700 km, utilizing the P- and S-receiver function techniques (PRF, SRF). Both techniques are based mainly on mode conversion of elastic body waves at an interface dividing layers with different elastic properties. In the first phase of the project, PRF analysis was conducted in order to image the crust-mantle interface (Moho) and the mantle-transition-zone discontinuities at depths of 410 km and 660 km beneath the area under investigation. By applying the common data processing steps (e.g., rotation, deconvolution and moveout correction) to the selected data set, we were able to create approximately 4,500 PRFs. The signals from the Moho, 410-km and 660-km discontinuities are clearly visible in many PRF stacks. The Moho depth ranges from 26 to 34 km, with an average value of 29 km. No significant lateral variations in the depths of the "410-km" and "660-km" discontinuities have been identified so far. In the second phase of this project, the S-receiver-function technique will be applied in order to map the thickness of the underlying mantle lithosphere. Additionally, joint inversion of PRFs and waveforms of SKS will be used to investigate depth-localized azimuthal anisotropy and the related past and present mantle flows.
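
    A minimal sketch of the P-receiver-function computation (frequency-domain water-level deconvolution of the vertical from the radial component) is given below; the water level, Gaussian width and the synthetic two-spike example are illustrative assumptions, not the processing parameters used in the WILAS project.

```python
import numpy as np

def receiver_function(radial, vertical, dt, water_level=0.01, gauss_a=2.5):
    """Frequency-domain water-level deconvolution of the vertical from the
    radial component, a common way to compute P receiver functions.  A
    Gaussian low-pass controls the frequency content of the RF."""
    n = len(radial)
    nfft = int(2 ** np.ceil(np.log2(2 * n)))
    R = np.fft.rfft(radial, nfft)
    Z = np.fft.rfft(vertical, nfft)
    denom = (Z * np.conj(Z)).real
    denom = np.maximum(denom, water_level * denom.max())   # water level
    freqs = np.fft.rfftfreq(nfft, dt)
    gauss = np.exp(-(2 * np.pi * freqs) ** 2 / (4 * gauss_a ** 2))
    return np.fft.irfft(R * np.conj(Z) / denom * gauss, nfft)[:n]

# Example: the vertical is a pulse, the radial contains the same pulse plus a
# delayed converted phase; the RF then shows spikes at 0 s and at the delay.
dt, n = 0.05, 1024
z = np.zeros(n); z[20] = 1.0
r = np.zeros(n); r[20] = 0.3; r[20 + int(4.0 / dt)] = 0.15  # Ps-like arrival
rf = receiver_function(r, z, dt)
print(np.argsort(rf)[-2:])   # two largest peaks, near samples 80 and 0
```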

  2. Processing of multichannel seismic reflection data acquired in 2013 for seismic investigations of gas hydrates in the Gulf of Mexico

    USGS Publications Warehouse

    Miller, John J.; Agena, Warren F.; Haines, Seth S.; Hart, Patrick E.

    2016-04-13

    As part of a cooperative effort among the U.S. Geological Survey (USGS), the U.S. Department of Energy, and the U.S. Department of the Interior Bureau of Ocean Energy Management, two grids of two-dimensional multichannel seismic reflection data were acquired in the Gulf of Mexico over lease blocks Green Canyon 955 and Walker Ridge 313 between April 18 and May 3, 2013. The purpose of the data acquisition was to fill knowledge gaps in an ongoing study of known gas hydrate accumulations in the area. These data were initially processed onboard the recording ship R/V Pelican for quality control during recording. The data were subsequently processed in detail by the U.S. Geological Survey in Denver, Colorado, in two phases. The first phase was to create a "kinematic" dataset that removed extensive noise present in the data but did not preserve relative amplitudes. The second phase was to create a true relative-amplitude dataset that included noise removal and "wavelet" deconvolution that preserved the amplitude information. This report describes the processing techniques used to create both datasets.

  3. Investigating Brittle Rock Failure and Associated Seismicity Using Laboratory Experiments and Numerical Simulations

    NASA Astrophysics Data System (ADS)

    Zhao, Qi

    The rock failure process is a complex phenomenon that involves elastic and plastic deformation, microscopic cracking, macroscopic fracturing, and frictional slipping of fractures. Understanding this complex behaviour has been the focus of a significant amount of research. In this work, the combined finite-discrete element method (FDEM) was first employed to study (1) the influence of rock discontinuities on hydraulic fracturing and associated seismicity and (2) the influence of in-situ stress on seismic behaviour. Simulated seismic events were analyzed using post-processing tools including the frequency-magnitude distribution (b-value), spatial fractal dimension (D-value), seismic rate, and fracture clustering. These simulations demonstrated that at the local scale, fractures tended to propagate following the rock mass discontinuities, while at reservoir scale they developed in the direction parallel to the maximum in-situ stress. Moreover, the seismic signature (i.e., b-value, D-value, and seismic rate) can help to distinguish different phases of the failure process. The FDEM modelling technique and the developed analysis tools were then coupled with laboratory experiments to further investigate the different phases of the progressive rock failure process. Firstly, a uniaxial compression experiment, monitored using a time-lapse ultrasonic tomography method, was carried out and reproduced by the numerical model. Using this combination of technologies, the entire deformation and failure processes were studied at macroscopic and microscopic scales. The results not only illustrated the rock failure and seismic behaviours at different stress levels, but also suggested several precursory behaviours indicating the catastrophic failure of the rock. Secondly, rotary shear experiments were conducted using a newly developed rock physics experimental apparatus (ERDμ-T) that was paired with X-ray micro-computed tomography (μCT). This combination of technologies has significant advantages over conventional rotary shear experiments since it allows for the direct observation of how two rough surfaces interact and deform without perturbing the experimental conditions. Some intriguing observations were made pertaining to key areas of the study of fault evolution, making possible a more comprehensive interpretation of the frictional sliding behaviour. Lastly, a carefully calibrated FDEM model built on the rotary shear experiment was utilized to investigate facets that the experiment was not able to resolve, for example the time-continuous stress condition and the seismic activity on the shear surface. The model reproduced the mechanical behaviour observed in the laboratory experiment, shedding light on the understanding of fault evolution.
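
    As an example of one of the post-processing tools mentioned above, the sketch below computes the Gutenberg-Richter b-value by maximum likelihood (Aki's estimator); the completeness magnitude and synthetic catalog are illustrative assumptions.

```python
import numpy as np

def b_value_mle(magnitudes, completeness):
    """Maximum-likelihood Gutenberg-Richter b-value (Aki 1965) for events at
    or above the completeness magnitude.  For catalogs with binned
    magnitudes a half-bin correction to the completeness is usually added."""
    m = np.asarray(magnitudes, dtype=float)
    m = m[m >= completeness]
    b = np.log10(np.e) / (m.mean() - completeness)
    return b, b / np.sqrt(len(m))      # estimate and first-order uncertainty

# Example on a synthetic catalog with a true b-value of 1.0:
rng = np.random.default_rng(3)
mags = 2.0 + rng.exponential(scale=1.0 / np.log(10), size=5000)  # Mc = 2, b = 1
print(b_value_mle(mags, completeness=2.0))
```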

  4. A seismic data compression system using subband coding

    NASA Technical Reports Server (NTRS)

    Kiely, A. B.; Pollara, F.

    1995-01-01

    This article presents a study of seismic data compression techniques and a compression algorithm based on subband coding. The algorithm includes three stages: a decorrelation stage, a quantization stage that introduces a controlled amount of distortion to allow for high compression ratios, and a lossless entropy coding stage based on a simple but efficient arithmetic coding method. Subband coding methods are particularly suited to the decorrelation of nonstationary processes such as seismic events. Adaptivity to the nonstationary behavior of the waveform is achieved by dividing the data into separate blocks that are encoded separately with an adaptive arithmetic encoder. This is done with high efficiency due to the low overhead introduced by the arithmetic encoder in specifying its parameters. The technique could be used as a progressive transmission system, where successive refinements of the data can be requested by the user. This allows seismologists to first examine a coarse version of waveforms with minimal usage of the channel and then decide where refinements are required. Rate-distortion performance results are presented and comparisons are made with two block transform methods.
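
    A toy version of the decorrelation and quantization stages is sketched below using a single-level two-band (Haar) split and a uniform scalar quantizer; it omits the adaptive arithmetic entropy coder and the block-wise adaptivity of the article, and all parameter values are illustrative.

```python
import numpy as np

def haar_analysis(x):
    """One level of a two-band (Haar) subband decomposition."""
    x = np.asarray(x, dtype=float)
    if len(x) % 2:
        x = np.append(x, x[-1])
    approx = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return approx, detail

def haar_synthesis(approx, detail):
    """Inverse of haar_analysis."""
    even = (approx + detail) / np.sqrt(2.0)
    odd = (approx - detail) / np.sqrt(2.0)
    out = np.empty(2 * len(approx))
    out[0::2], out[1::2] = even, odd
    return out

def quantize(x, step):
    """Uniform scalar quantizer: the lossy stage that buys compression."""
    return np.round(x / step) * step

# Example: keep the approximation band, coarsely quantize the detail band.
rng = np.random.default_rng(4)
t = np.linspace(0, 1, 1024)
trace = np.sin(2 * np.pi * 12 * t) + 0.05 * rng.normal(size=t.size)
a, d = haar_analysis(trace)
rec = haar_synthesis(quantize(a, 0.01), quantize(d, 0.2))[:trace.size]
print(np.sqrt(np.mean((rec - trace) ** 2)))   # small reconstruction error
```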

  5. Trans-dimensional and hierarchical Bayesian approaches toward rigorous estimation of seismic sources and structures in Northeast Asia

    NASA Astrophysics Data System (ADS)

    Kim, Seongryong; Tkalčić, Hrvoje; Mustać, Marija; Rhie, Junkee; Ford, Sean

    2016-04-01

    A framework is presented within which we provide rigorous estimations of seismic sources and structures in Northeast Asia. We use Bayesian inversion methods, which enable statistical estimation of models and their uncertainties based on the information in the data. Ambiguities in error statistics and model parameterizations are addressed by hierarchical and trans-dimensional (trans-D) techniques, which can be inherently implemented in the Bayesian inversions. Hence, reliable estimation of model parameters and their uncertainties is possible, avoiding arbitrary regularizations and parameterizations. Hierarchical and trans-D inversions are performed to develop a three-dimensional velocity model using ambient noise data. To further improve the model, we perform joint inversions with receiver function data using a newly developed Bayesian method. For the source estimation, a novel moment tensor inversion method is presented and applied to regional waveform data of the North Korean nuclear explosion tests. Through the combination of the new Bayesian techniques and the structural model, coupled with meaningful uncertainties related to each of the processes, more quantitative monitoring and discrimination of seismic events is possible.

  6. Multimodal approach to seismic pavement testing

    USGS Publications Warehouse

    Ryden, N.; Park, C.B.; Ulriksen, P.; Miller, R.D.

    2004-01-01

    A multimodal approach to nondestructive seismic pavement testing is described. The presented approach is based on multichannel analysis of all types of seismic waves propagating along the surface of the pavement. The multichannel data acquisition method is replaced by multichannel simulation with one receiver. This method uses only one accelerometer receiver and a light hammer source to generate a synthetic receiver array. The data acquisition technique is made possible through careful triggering of the source and simplifies the method enough to make it generally accessible. Multiple dispersion curves are automatically and objectively extracted using the multichannel analysis of surface waves processing scheme, which is described. The resulting dispersion curves in the high-frequency range match theoretical Lamb waves in a free plate. At lower frequencies there are several branches of dispersion curves corresponding to the lower layers of different stiffness in the pavement system. The observed behavior of the multimodal dispersion curves is in agreement with theory, which has been validated through both numerical modeling and the transfer matrix method, solving for complex wave numbers.
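
    A minimal sketch of the frequency-domain phase-shift style of dispersion imaging that underlies such multichannel surface-wave processing is given below; the receiver geometry, velocity scan and single-mode synthetic are illustrative assumptions, not the authors' exact scheme.

```python
import numpy as np

def phase_shift_dispersion(gather, dt, offsets, velocities):
    """Phase-shift (frequency-domain slant-stack) dispersion imaging in the
    style of multichannel analysis of surface waves.
    gather: (ntraces, nsamples); returns an image of shape (nvel, nfreq)."""
    spec = np.fft.rfft(gather, axis=1)
    freqs = np.fft.rfftfreq(gather.shape[1], dt)
    spec_norm = spec / (np.abs(spec) + 1e-12)       # keep phase only
    image = np.zeros((len(velocities), len(freqs)))
    for iv, c in enumerate(velocities):
        # phase shift that would flatten a wave travelling at velocity c
        shifts = np.exp(2j * np.pi * np.outer(offsets, freqs) / c)
        image[iv] = np.abs(np.sum(spec_norm * shifts, axis=0))
    return image, freqs

# Example: a single non-dispersive mode at 300 m/s peaks at that velocity.
dt, n = 0.001, 2048
offsets = np.arange(24) * 1.0 + 5.0
t = np.arange(n) * dt
freq0, c0 = 30.0, 300.0
gather = np.array([np.sin(2 * np.pi * freq0 * (t - x / c0)) for x in offsets])
vels = np.arange(100.0, 601.0, 10.0)
img, freqs = phase_shift_dispersion(gather, dt, offsets, vels)
ifreq = np.argmin(np.abs(freqs - freq0))
print(vels[np.argmax(img[:, ifreq])])   # ~300 m/s
```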

  7. Magma migration at the onset of the 2012-13 Tolbachik eruption revealed by Seismic Amplitude Ratio Analyses

    NASA Astrophysics Data System (ADS)

    Taisne, B.; Caudron, C.; Kugaenko, Y.; Saltykov, V.

    2015-12-01

    In contrast to the 1975-76 Tolbachik eruption, the 2012-2013 Tolbachik eruption was not preceded by any striking change in seismic activity. By processing the Klyuchevskoy volcano group seismic data with the Seismic Amplitude Ratio Analysis (SARA) method, we gain insight into the dynamics of magma transfer prior to this important eruption. We highlight a clear migration of the source of the microseismicity within the seismic swarm, starting 20 hours before the reported eruption onset (05:15 UTC, 26 November 2012). This migration proceeded in different phases and ended when eruptive tremor, corresponding to lava extrusion, was recorded (at ~11:00 UTC, 27 November 2012). In order to get a first-order approximation of the location of the magma, we compare the calculated seismic intensity ratios with the theoretical ones. As expected, the observations suggest a migration toward the eruptive vent. However, we explain the pre-eruptive observed ratios by a vertical migration under the northern slope of Plosky Tolbachik volcano that would interact at shallower depth with an intermediate storage region and initiate the lateral migration toward the eruptive vents. Another migration is also captured by this technique and coincides with a seismic swarm that started 16-20 km to the south of Plosky Tolbachik at 20:31 UTC on 28 November and lasted for more than 2 days. This seismic swarm is very similar to the seismicity preceding the 1975-76 Tolbachik eruption and can be considered a possible aborted eruption.

  8. Monitoring the Earthquake source process in North America

    USGS Publications Warehouse

    Herrmann, Robert B.; Benz, H.; Ammon, C.J.

    2011-01-01

    With the implementation of the USGS National Earthquake Information Center Prompt Assessment of Global Earthquakes for Response system (PAGER), rapid determination of earthquake moment magnitude is essential, especially for earthquakes that are felt within the contiguous United States. We report an implementation of moment tensor processing for application to broad, seismically active areas of North America. This effort focuses on the selection of regional crustal velocity models, codification of data quality tests, and the development of procedures for rapid computation of the seismic moment tensor. We systematically apply these techniques to earthquakes with reported magnitude greater than 3.5 in continental North America that are not associated with a tectonic plate boundary. Using the 0.02-0.10 Hz passband, we can usually determine, with few exceptions, moment tensor solutions for earthquakes with Mw as small as 3.7. The threshold is significantly influenced by the density of stations, the location of the earthquake relative to the seismic stations and, of course, the signal-to-noise ratio. With the existing permanent broadband stations in North America operated for rapid earthquake response, the seismic moment tensor of most earthquakes that are Mw 4 or larger can be routinely computed. As expected, the nonuniform spatial pattern of these solutions reflects the seismicity pattern. However, the orientation of the direction of maximum compressive stress and the predominant style of faulting are spatially coherent across large regions of the continent.

  9. Probabilistic Reasoning Over Seismic Time Series: Volcano Monitoring by Hidden Markov Models at Mt. Etna

    NASA Astrophysics Data System (ADS)

    Cassisi, Carmelo; Prestifilippo, Michele; Cannata, Andrea; Montalto, Placido; Patanè, Domenico; Privitera, Eugenio

    2016-07-01

    From January 2011 to December 2015, Mt. Etna was mainly characterized by cyclic eruptive behavior, with more than 40 lava fountains from the New South-East Crater. Using the RMS (root mean square) of the seismic signal recorded by stations close to the summit area, automatic recognition of the different states of volcanic activity (QUIET, PRE-FOUNTAIN, FOUNTAIN, POST-FOUNTAIN) has been applied for monitoring purposes. Since the values of the RMS time series calculated from the seismic signal are generated by a stochastic process, we can try to model the system generating its sampled values, assumed to be a Markov process, using Hidden Markov Models (HMMs). HMM analysis seeks to recover the sequence of hidden states from the observations. In our framework, observations are characters generated by the Symbolic Aggregate approXimation (SAX) technique, which maps RMS time series values to symbols of a pre-defined alphabet. The main advantages of the proposed framework, based on HMMs and SAX, with respect to other automatic systems applied to seismic signals at Mt. Etna, are the use of multiple stations and static thresholds to characterize the volcano states well. Its application to a large seismic dataset of Etna volcano shows that the volcano states can be inferred. The experimental results show that, in most cases, we detected lava fountains in advance.
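
    The SAX step itself is compact enough to sketch: the version below z-normalizes an RMS series, averages it into segments (piecewise aggregate approximation) and maps the segment means to symbols using equiprobable Gaussian breakpoints. Alphabet size, word length and the toy series are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

def sax_symbols(series, word_length, alphabet="abcd"):
    """Symbolic Aggregate approXimation: z-normalize, piecewise-aggregate
    into word_length segments, and map each segment mean to a symbol using
    equiprobable breakpoints of the standard normal distribution."""
    x = np.asarray(series, dtype=float)
    x = (x - x.mean()) / (x.std() + 1e-12)
    segments = np.array_split(x, word_length)            # PAA
    paa = np.array([s.mean() for s in segments])
    k = len(alphabet)
    breakpoints = norm.ppf(np.arange(1, k) / k)           # equiprobable bins
    indices = np.searchsorted(breakpoints, paa)
    return "".join(alphabet[i] for i in indices)

# Example: a rising RMS level maps to increasing symbols (e.g. "aabbdd").
rms = np.concatenate([np.full(50, 1.0), np.full(50, 2.0), np.full(50, 5.0)])
rms = rms + 0.01 * np.random.default_rng(5).normal(size=rms.size)
print(sax_symbols(rms, word_length=6))
```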

  10. An extended stochastic method for seismic hazard estimation

    NASA Astrophysics Data System (ADS)

    Abd el-aal, A. K.; El-Eraki, M. A.; Mostafa, S. I.

    2015-12-01

    In this contribution, we develop an extended stochastic technique for seismic hazard assessment purposes. This technique builds on the stochastic method of Boore (2003), "Simulation of ground motion using the stochastic method," Pure Appl. Geophys. 160:635-676. The essential aim of the extended stochastic technique is to obtain and simulate ground motion in order to minimize future earthquake consequences. The first step of this technique is defining the seismic sources that most affect the study area. Then, the maximum expected magnitude is defined for each of these seismic sources. This is followed by estimating the ground motion using an empirical attenuation relationship. Finally, the site amplification is implemented in calculating the peak ground acceleration (PGA) at each site of interest. We tested and applied this technique at Cairo, Suez, Port Said, Ismailia, Zagazig and Damietta cities to predict the ground motion. It was also applied at Cairo, Zagazig and Damietta cities to estimate the maximum peak ground acceleration under actual soil conditions. In addition, 0.5, 1, 5, 10 and 20% damping median response spectra are estimated using the extended stochastic simulation technique. The highest calculated acceleration values at bedrock conditions are found at Suez city, with a value of 44 cm/s2. These acceleration values decrease towards the north of the study area, reaching 14.1 cm/s2 at Damietta city. This is in agreement with, and comparable to, the results of previous seismic hazard studies in northern Egypt. This work can be used for seismic risk mitigation and earthquake engineering purposes.
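
    The core of the workflow described above (an empirical attenuation relationship followed by site amplification to give PGA) can be sketched as follows; the functional form and every coefficient below are purely illustrative placeholders, not the relations or values used in this study.

```python
import numpy as np

def pga_from_attenuation(magnitude, distance_km, site_amplification=1.0,
                         a=-1.0, b=0.5, c=1.3):
    """Skeleton of the hazard workflow: an empirical attenuation
    (ground-motion) relation of the generic form
        log10(PGA) = a + b*M - c*log10(R)
    followed by a site-amplification factor.  The coefficients a, b, c are
    illustrative placeholders only; the units of PGA follow from them."""
    log_pga = a + b * magnitude - c * np.log10(distance_km)
    return site_amplification * 10.0 ** log_pga

# Example: maximum expected magnitude per source zone -> PGA at a site.
sources = [{"name": "zone A", "Mmax": 6.5, "R": 40.0},
           {"name": "zone B", "Mmax": 5.8, "R": 15.0}]
for s in sources:
    print(s["name"], pga_from_attenuation(s["Mmax"], s["R"],
                                          site_amplification=1.5))
```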

  11. Seismic While Drilling Case Study in Shengli Oilfield, Eastern China

    NASA Astrophysics Data System (ADS)

    Wang, L.; Liu, H.; Tong, S.; Zou, Z.

    2015-12-01

    Seismic while drilling (SWD) is a promising borehole seismic technique offering reduced drilling risk, cost savings and increased efficiency. To evaluate the technical and economic benefits of this new technique, we carried out an SWD survey at well G130 in the Shengli Oilfield of Eastern China. Well G130 is an evaluation well located in the Dongying depression, drilled to a depth of more than 3500 m. We used an array of portable seismometers to record the surface SWD data during the whole drilling process. The pilot signal was recorded continuously by an accelerometer mounted on the top of the drill string. There were also two seismometers buried in the drill yard, one near the diesel engine and another near the derrick; all data were recorded continuously. Guided by the mud-logging data, we processed and analyzed all the records. The analysis demonstrates that drill-yard noise is the primary component of the surface wavefield, with a dominant frequency of about 20 Hz. Cross-correlation of the surface records with the pilot signal shows a severely low SNR, with no obvious drill-bit events. Fortunately, the autocorrelation of the pilot signal shows clear BHA and drill-string multiples. The period of the drill-string multiple can be used for establishing the reference time (so-called zero time). We identified and removed the different noises from the surface SWD data, taking advantage of wavefield analysis. The drill-bit signal was retrieved from the surface SWD data using seismic interferometry, and a reverse vertical seismic profile (RVSP) data set covering the continuous drilling depths was established. The subsurface images derived from these data compare well with the corresponding images of the 3D surface seismic survey across the well.
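
    The pilot cross-correlation step described above can be sketched as follows; the synthetic pilot, delay and noise level are illustrative assumptions, and real SWD processing would add filtering and deconvolution of the drill-string multiples.

```python
import numpy as np

def crosscorrelate_with_pilot(trace, pilot):
    """Cross-correlate a surface trace with the pilot signal recorded on the
    drill string; for SWD data this compresses the continuous drill-bit
    signature into an impulsive arrival whose lag gives the travel time."""
    trace = np.asarray(trace, dtype=float)
    pilot = np.asarray(pilot, dtype=float)
    cc = np.correlate(trace, pilot, mode="full")
    lags = np.arange(-len(pilot) + 1, len(trace))
    return lags, cc

# Example: a noisy, delayed copy of the pilot is recovered at the right lag.
rng = np.random.default_rng(6)
pilot = rng.normal(size=4000)                  # extended drill-bit signature
delay = 250
trace = np.concatenate([np.zeros(delay), 0.2 * pilot])
trace = trace + rng.normal(scale=0.5, size=trace.size)
lags, cc = crosscorrelate_with_pilot(trace, pilot)
print(lags[np.argmax(cc)])                      # ~250 samples
```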

  12. Study of a prehistoric landslide using seismic reflection methods integrated with geological data in the Wasatch Mountains, Utah, USA

    USGS Publications Warehouse

    Tingey, B.E.; McBride, J.H.; Thompson, T.J.; Stephenson, W.J.; South, J.V.; Bushman, M.

    2007-01-01

    An integration of geological and geophysical techniques characterizes the internal and basal structure of a landslide along the western margin of the Wasatch Mountains in northern Utah, USA. The study area is within a region of planned and continuing residential development. The Little Valley Landslide is a prehistoric landslide as old as 13 ka B.P. Drilling and trenching at the site indicate that the landslide consists of chaotic and disturbed weathered volcanic material derived from Tertiary-age volcanic rocks that comprise a great portion of the Wasatch Range. Five short high-resolution common-midpoint seismic reflection profiles over selected portions of the site examine the feasibility of using seismic reflection to study prehistoric landslides in the Wasatch Mountain region. Due to the expected complexity of the near-surface geology, we have pursued an experimental approach in the data processing, examining the effects of muting first arrivals, frequency filtering, model-based static corrections, and seismic migration. The results provide a framework for understanding the overall configuration of the landslide, its basal (failure) surface, and the structure immediately underlying this surface. A glide surface or décollement is interpreted to underlie the landslide, suggesting a large mass movement. The interpretation of a glide surface is based on the onset of coherent reflectivity, calibrated by information from a borehole located along one of the seismic profiles. The glide surface is deepest in the center portion of the landslide and shallows up slope, suggesting a trough-like feature. This study shows that seismic reflection techniques can be successfully used in complex alpine landslide regions to (1) provide a framework in which to link geological data and (2) reduce the need for an extensive trenching and drilling program.

  13. Anomalous Induced Seismicity due to Hydraulic Fracturing: A Case Study in the Montney Formation, Northeast British Columbia

    NASA Astrophysics Data System (ADS)

    Longobardi, M.; Bustin, A. M. M.; Johansen, K.; Bustin, R. M.

    2017-12-01

    One of our goals is to investigate the variables and processes controlling anomalous induced seismicity (AIS) and its associated ground motions, to better understand AIS due to hydraulic fracturing in Northeast British Columbia. Our other main objective is to optimize completions and well design. Although the vast majority of earthquakes that occur in the world each year have natural causes, some earthquakes and a number of lesser-magnitude seismic events are induced by human activities. The induced seismicity resulting from fluid injection during hydraulic fracturing is generally small in magnitude (< M 1). Shale gas operations in Northeast British Columbia (BC) have, however, induced the largest recorded occurrence and magnitudes of AIS attributed to hydraulic fracturing. Anomalous induced seismicity has been recorded in seven clusters within the Montney area, with magnitudes up to ML 4.6; five of these clusters have been linked to hydraulic fracturing. To analyze our AIS data, we first calculated the earthquake hypocenters. The data were recorded on an array of real-time accelerometers, built on a modified design of the early earthquake detectors installed in BC schools for the Earthquake Early Warning System for British Columbia. We have developed a new technique for locating hypocenters and applied it to our dataset. The technique will enable near real-time event location, aiding both in mitigating induced events and in adjusting completions to optimize the stimulation. Our hypocenter program assumes an S-wave speed, fits the arrival times to the hypocenter, and uses a multivariate "amoeba" (downhill simplex) method; we use this method because it is well suited to minimizing the chi-squared function of the arrival-time deviations. We show some preliminary results for the Montney dataset.
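
    A minimal sketch of such an "amoeba" (Nelder-Mead) hypocenter search, minimizing the chi-squared misfit of S-wave arrival times, is given below; the station layout, velocity and starting model are illustrative assumptions, not the authors' program. With all stations at the surface the depth sign is inherently ambiguous, which the synthetic example illustrates.

```python
import numpy as np
from scipy.optimize import minimize

def locate_hypocenter(station_xyz, arrival_times, v_s, x0):
    """Grid-free hypocenter location sketch: minimize the chi-squared misfit
    of S-wave arrival times with the Nelder-Mead ('amoeba') simplex method.
    The unknowns are the hypocenter coordinates and the origin time."""
    station_xyz = np.asarray(station_xyz, dtype=float)
    arrival_times = np.asarray(arrival_times, dtype=float)

    def chi2(params):
        hypo, t0 = params[:3], params[3]
        predicted = t0 + np.linalg.norm(station_xyz - hypo, axis=1) / v_s
        return np.sum((arrival_times - predicted) ** 2)

    result = minimize(chi2, x0, method="Nelder-Mead",
                      options={"xatol": 1e-3, "fatol": 1e-9, "maxiter": 20000})
    return result.x

# Example: recover a synthetic event recorded on a small surface array.
stations = np.array([[0, 0, 0], [5000, 0, 0], [0, 5000, 0],
                     [5000, 5000, 0], [2500, 2500, 0]], dtype=float)
true_hypo, true_t0, v_s = np.array([2000.0, 3000.0, -4000.0]), 1.0, 3500.0
picks = true_t0 + np.linalg.norm(stations - true_hypo, axis=1) / v_s
x_start = np.array([2500.0, 2500.0, -1000.0, 0.0])     # initial guess
print(locate_hypocenter(stations, picks, v_s, x_start))
# ~ x 2000, y 3000, |z| 4000 (depth sign ambiguous with a surface array), t0 1
```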

  14. Exploring Large-Scale Cross-Correlation for Teleseismic and Regional Seismic Event Characterization

    NASA Astrophysics Data System (ADS)

    Dodge, Doug; Walter, William; Myers, Steve; Ford, Sean; Harris, Dave; Ruppert, Stan; Buttler, Dave; Hauk, Terri

    2013-04-01

    The decrease in costs of both digital storage space and computation power invites new methods of seismic data processing. At Lawrence Livermore National Laboratory (LLNL) we operate a growing research database of seismic events and waveforms for nuclear explosion monitoring and other applications. Currently the LLNL database contains several million events associated with tens of millions of waveforms at thousands of stations. We are making use of this database to explore the power of seismic waveform correlation to quantify signal similarities, to discover new events not in catalogs, and to more accurately locate events and identify source types. Building on the very efficient correlation methodologies of Harris and Dodge (2011) we computed the waveform correlation for event pairs in the LLNL database in two ways. First we performed entire waveform cross-correlation over seven distinct frequency bands. The correlation coefficient exceeds 0.6 for more than 40 million waveform pairs for several hundred thousand events at more than a thousand stations. These correlations reveal clusters of mining events and aftershock sequences, which can be used to readily identify and locate events. Second we determine relative pick times by correlating signals in time windows for distinct seismic phases. These correlated picks are then used to perform very high accuracy event relocations. We are examining the percentage of events that correlate as a function of magnitude and observing station distance in selected high seismicity regions. Combining these empirical results and those using synthetic data, we are working to quantify relationships between correlation and event pair separation (in epicenter and depth) as well as mechanism differences. Our exploration of these techniques on a large seismic database is in progress and we will report on our findings in more detail at the meeting.

  15. Exploring Large-Scale Cross-Correlation for Teleseismic and Regional Seismic Event Characterization

    NASA Astrophysics Data System (ADS)

    Dodge, D.; Walter, W. R.; Myers, S. C.; Ford, S. R.; Harris, D.; Ruppert, S.; Buttler, D.; Hauk, T. F.

    2012-12-01

    The decrease in costs of both digital storage space and computation power invites new methods of seismic data processing. At Lawrence Livermore National Laboratory (LLNL) we operate a growing research database of seismic events and waveforms for nuclear explosion monitoring and other applications. Currently the LLNL database contains several million events associated with tens of millions of waveforms at thousands of stations. We are making use of this database to explore the power of seismic waveform correlation to quantify signal similarities, to discover new events not in catalogs, and to more accurately locate events and identify source types. Building on the very efficient correlation methodologies of Harris and Dodge (2011) we computed the waveform correlation for event pairs in the LLNL database in two ways. First we performed entire waveform cross-correlation over seven distinct frequency bands. The correlation coefficient exceeds 0.6 for more than 40 million waveform pairs for several hundred thousand events at more than a thousand stations. These correlations reveal clusters of mining events and aftershock sequences, which can be used to readily identify and locate events. Second we determine relative pick times by correlating signals in time windows for distinct seismic phases. These correlated picks are then used to perform very high accuracy event relocations. We are examining the percentage of events that correlate as a function of magnitude and observing station distance in selected high seismicity regions. Combining these empirical results and those using synthetic data, we are working to quantify relationships between correlation and event pair separation (in epicenter and depth) as well as mechanism differences. Our exploration of these techniques on a large seismic database is in process and we will report on our findings in more detail at the meeting.
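
    A sketch of the relative pick-time measurement used for such high-accuracy relocations is shown below: two similar phase windows are cross-correlated and the peak lag is refined with a parabolic fit. The window choice, normalization and synthetic wavelet are illustrative assumptions rather than the LLNL implementation.

```python
import numpy as np

def relative_pick_delay(ref_win, test_win, dt):
    """Relative arrival-time measurement for a pair of similar events:
    cross-correlate two equal-length phase windows and convert the lag of
    the peak (refined with a parabolic fit) into a time shift in seconds."""
    ref = (ref_win - ref_win.mean()) / ref_win.std()
    test = (test_win - test_win.mean()) / test_win.std()
    cc = np.correlate(test, ref, mode="full")
    k = np.argmax(cc)
    # sub-sample refinement by fitting a parabola through the peak
    if 0 < k < len(cc) - 1:
        denom = cc[k - 1] - 2 * cc[k] + cc[k + 1]
        frac = 0.5 * (cc[k - 1] - cc[k + 1]) / denom if denom != 0 else 0.0
    else:
        frac = 0.0
    lag = (k - (len(ref) - 1)) + frac
    return lag * dt, cc[k] / len(ref)   # delay [s], normalized peak CC

# Example: a 12.3-sample shift between two copies of the same wavelet.
dt = 0.01
t = np.arange(0, 5, dt)
wavelet = np.exp(-((t - 2.0) / 0.1) ** 2) * np.sin(2 * np.pi * 8 * (t - 2.0))
shifted = np.interp(t - 12.3 * dt, t, wavelet)
delay, peak = relative_pick_delay(wavelet, shifted, dt)
print(round(delay, 3), round(peak, 2))   # ~0.123 s, CC near 1
```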

  16. An Integrated Monitoring System of Pre-earthquake Processes in Peloponnese, Greece

    NASA Astrophysics Data System (ADS)

    Karastathis, V. K.; Tsinganos, K.; Kafatos, M.; Eleftheriou, G.; Ouzounov, D.; Mouzakiotis, E.; Papadopoulos, G. A.; Voulgaris, N.; Bocchini, G. M.; Liakopoulos, S.; Aspiotis, T.; Gika, F.; Tselentis, A.; Moshou, A.; Psiloglou, B.

    2017-12-01

    One of the controversial issues in contemporary seismology is the ability of radon accumulation monitoring to provide reliable earthquake forecasting. Although there are many examples in the literature showing radon increases before earthquakes, skepticism arises from the instability of the measurements, false alarms, difficulties in interpretation caused by weather influences (e.g., rainfall), and difficulties in establishing an irrefutable theoretical background for the phenomenon. We have developed and extensively tested a multi-parameter network aimed at studying pre-earthquake processes and operating as part of an integrated monitoring system in the high-seismicity area of the Western Hellenic Arc (SW Peloponnese, Greece). The prototype consists of four components: (1) a real-time monitoring system of radon accumulation, consisting of three gamma-radiation detectors [NaI(Tl) scintillators]; (2) a nine-station seismic array to monitor the microseismicity in the offshore area of the Hellenic arc, with data processing based on F-K and beam-forming techniques; (3) real-time weather monitoring systems for air temperature, relative humidity, precipitation and pressure; and (4) thermal radiation emission from AVHRR/NOAA-18 polar-orbit satellite observations. The project revolves around the idea of jointly studying the emission of radon, which has proven in many cases to be a reliable indicator of the possible time of an event, and the accurate location of foreshock activity detected by the seismic array, which can be a more reliable indicator of the possible position of an event. In parallel, a satellite thermal anomaly detection technique has been used for monitoring larger magnitude events (a possible indicator for strong events, M ≥ 5.0). The first year of operations revealed a number of pre-seismic radon anomalies before several local earthquakes (M > 3.6); the radon increases systematically before the larger events. Details about the overall performance in registering pre-seismic signals in the Peloponnese region, along with two distant but very strong earthquakes in Greece (June 12, 2017, M6.3 and July 20, 2017, M6.6), will be discussed.

  17. An application of LOTEM around salt dome near Houston, Texas

    NASA Astrophysics Data System (ADS)

    Paembonan, Andri Yadi; Arjwech, Rungroj; Davydycheva, Sofia; Smirnov, Maxim; Strack, Kurt M.

    2017-07-01

    A salt dome is an important large geologic structure for hydrocarbon exploration. It may seal porous reservoir rocks that form petroleum reservoirs. Several techniques, such as seismic, gravity, and electromagnetic (including magnetotelluric) methods, have successfully been used for salt dome interpretation. Seismic has difficulty seeing through salt because the seismic energy gets trapped by the salt due to its high velocity. Gravity and electromagnetics are therefore well-suited complementary methods. Long Offset Transient Electromagnetic (LOTEM) and Focused Source Electromagnetic (FSEM) surveys were tested over a salt dome near Houston, Texas. LOTEM data were recorded at several stations with varying offset, and FSEM tests were also made at some receiver locations near a suspected salt overhang. The data were processed using KMS's processing software: first, quality assurance, including calibration and header checking; then, merging of transmitter and receiver data and separation of the microseismic data; finally, data analysis and processing. LOTEM processing leads to inversion or, in the FSEM case, to 3D modeling. Various 3D models verify the sensitivity under the salt dome. In addition, the processing was conducted at pre-stack, stack, and post-stack stages. Pre-stack processing reduced the noise but showed ringing due to a low-pass filter. Stacking and post-stack processing with a recursive average reduced the Gibbs effect and produced smooth data.

  18. Unsupervised Approaches for Post-Processing in Computationally Efficient Waveform-Similarity-Based Earthquake Detection

    NASA Astrophysics Data System (ADS)

    Bergen, K.; Yoon, C. E.; O'Reilly, O. J.; Beroza, G. C.

    2015-12-01

    Recent improvements in computational efficiency for waveform correlation-based detections achieved by new methods such as Fingerprint and Similarity Thresholding (FAST) promise to allow large-scale blind search for similar waveforms in long-duration continuous seismic data. Waveform similarity search applied to datasets of months to years of continuous seismic data will identify significantly more events than traditional detection methods. With the anticipated increase in number of detections and associated increase in false positives, manual inspection of the detection results will become infeasible. This motivates the need for new approaches to process the output of similarity-based detection. We explore data mining techniques for improved detection post-processing. We approach this by considering similarity-detector output as a sparse similarity graph with candidate events as vertices and similarities as weighted edges. Image processing techniques are leveraged to define candidate events and combine results individually processed at multiple stations. Clustering and graph analysis methods are used to identify groups of similar waveforms and assign a confidence score to candidate detections. Anomaly detection and classification are applied to waveform data for additional false detection removal. A comparison of methods will be presented and their performance will be demonstrated on a suspected induced and non-induced earthquake sequence.
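
    One of the graph-based steps described above can be sketched simply: candidate detections become vertices, similarities above a threshold become edges, and connected components define event groups. The threshold and toy similarity list below are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import connected_components

def cluster_detections(pairs, similarities, n_events, threshold=0.6):
    """Post-processing sketch: treat candidate events as vertices of a sparse
    similarity graph (edges weighted by waveform similarity), keep edges
    above a threshold, and group events by connected components."""
    pairs = np.asarray(pairs)
    similarities = np.asarray(similarities, dtype=float)
    keep = similarities >= threshold
    i, j = pairs[keep, 0], pairs[keep, 1]
    adj = coo_matrix((np.ones(keep.sum()), (i, j)), shape=(n_events, n_events))
    n_clusters, labels = connected_components(adj, directed=False)
    return n_clusters, labels

# Example: two groups of similar candidates plus one isolated detection.
pairs = [(0, 1), (1, 2), (3, 4), (2, 5)]
sims = [0.9, 0.8, 0.7, 0.3]          # the last pair falls below threshold
print(cluster_detections(pairs, sims, n_events=6))
# -> 3 clusters: {0, 1, 2}, {3, 4}, {5}
```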

  19. Source characteristics of 2000 small earthquakes nucleating on the Alto Tiberina fault system (central Italy).

    NASA Astrophysics Data System (ADS)

    Munafo, I.; Malagnini, L.; Tinti, E.; Chiaraluce, L.; Di Stefano, R.; Valoroso, L.

    2014-12-01

    The Alto Tiberina Fault (ATF) is a 60 km long, east-dipping, low-angle normal fault located in a sector of the Northern Apennines (Italy) undergoing active extension since the Quaternary. The ATF has been imaged by analyzing active-source seismic reflection profiles and the instrumentally recorded persistent background seismicity. The present study is an attempt to separate the contributions of source, site, and crustal attenuation, in order to focus on the mechanics of the seismic sources on the ATF, as well as on the synthetic and antithetic structures within the ATF hanging wall (i.e. the Colfiorito fault, Gubbio fault and Umbria Valley fault). In order to compute source spectra, we perform a set of regressions over the seismograms of 2000 small earthquakes (-0.8 < ML < 4) recorded between 2010 and 2014 at 50 permanent seismic stations deployed in the framework of the Alto Tiberina Near Fault Observatory project (TABOO) and equipped with three-component seismometers, three of which are located in shallow boreholes. Because we deal with some very small earthquakes, we maximize the signal-to-noise ratio (SNR) with a technique based on the analysis of peak values of bandpass-filtered time histories, in addition to the same processing performed on Fourier amplitudes. We rely on a tool called Random Vibration Theory (RVT) to switch from peak values in the time domain to Fourier spectral amplitudes. The low-frequency spectral plateaus of the source terms are used to compute moment magnitudes (Mw) of all the events, whereas a source spectral ratio technique is used to estimate the corner frequencies (Brune spectral model) of a subset of events chosen based on the analysis of the noise affecting the spectral ratios. So far, the described approach provides high accuracy in the spectral parameters of the localized seismicity, and may be used to gain insight into the underlying mechanics of faulting and the earthquake processes.
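
    For illustration, the sketch below fits a Brune (omega-squared) model to a source spectrum to recover the low-frequency plateau and corner frequency; the fitting strategy and synthetic spectrum are assumptions and do not reproduce the regression and spectral-ratio machinery described above.

```python
import numpy as np
from scipy.optimize import curve_fit

def brune_spectrum(f, omega0, fc):
    """Brune (omega-squared) source displacement spectrum."""
    return omega0 / (1.0 + (f / fc) ** 2)

def fit_source_spectrum(freqs, amplitudes):
    """Fit the low-frequency plateau (omega0) and corner frequency (fc) of a
    source spectrum; fitting in log-amplitude keeps the plateau and the
    high-frequency fall-off equally weighted."""
    def log_model(f, log_omega0, log_fc):
        return np.log10(brune_spectrum(f, 10 ** log_omega0, 10 ** log_fc))
    p0 = [np.log10(amplitudes[:5].mean()), np.log10(np.median(freqs))]
    popt, _ = curve_fit(log_model, freqs, np.log10(amplitudes), p0=p0)
    return 10 ** popt[0], 10 ** popt[1]   # omega0, fc

# Example: recover the parameters of a noisy synthetic spectrum.
rng = np.random.default_rng(7)
freqs = np.logspace(-1, 2, 200)               # 0.1 - 100 Hz
true = brune_spectrum(freqs, omega0=3e-7, fc=8.0)
noisy = true * 10 ** (0.05 * rng.normal(size=freqs.size))
print(fit_source_spectrum(freqs, noisy))      # ~ (3e-7, 8.0)
```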

  20. Combined interpretation of 3D seismic reflection attributes for geothermal exploration in the Polish Basin using self-organizing maps

    NASA Astrophysics Data System (ADS)

    Bauer, Klaus; Pussak, Marcin; Stiller, Manfred; Bujakowski, Wieslaw

    2014-05-01

    Self-organizing maps (SOM) are neural network techniques which can be used for the joint interpretation of multi-disciplinary data sets. In this investigation we apply SOM within a geothermal exploration project using 3D seismic reflection data. The study area is located in the central part of the Polish basin. Several sedimentary target horizons were identified at this location based on fluid flow rate measurements in the geothermal research well Kompina-2. The general objective is a seismic facies analysis and characterization of the major geothermal target reservoir. A 3D seismic reflection experiment with a sparse acquisition geometry was carried out around well Kompina-2. Conventional signal processing (amplitude corrections, filtering, spectral whitening, deconvolution, static corrections, muting) was followed by normal-moveout (NMO) stacking, and, alternatively, by common-reflection-surface (CRS) stacking. Different signal attributes were then derived from the stacked images including root-mean-square (RMS) amplitude, instantaneous frequency and coherency. Furthermore, spectral decomposition attributes were calculated based on the continuous wavelet transform. The resulting attribute maps along major target horizons appear noisy after the NMO stack and clearly structured after the CRS stack. Consequently, the following SOM-based multi-parameter signal attribute analysis was applied only to the CRS images. We applied our SOM work flow, which includes data preparation, unsupervised learning, segmentation of the trained SOM using image processing techniques, and final application of the learned knowledge. For the Lower Jurassic target horizon Ja1 we derived four different clusters with distinct seismic attribute signatures. As the most striking feature, a corridor parallel to a fault system was identified, which is characterized by decreased RMS amplitudes and low frequencies. In our interpretation we assume that this combination of signal properties can be explained by increased fracture porosity and enhanced fluid saturation within this part of the Lower Jurassic sandstone horizon. Hence, we suggest that a future drilling should be carried out within this compartment of the reservoir.
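
    A minimal self-organizing map, of the kind that could cluster multi-attribute vectors such as those described above (RMS amplitude, instantaneous frequency, coherency), is sketched below; the grid size, learning schedule and synthetic attributes are illustrative assumptions, not the authors' SOM workflow.

```python
import numpy as np

def train_som(samples, grid=(8, 8), n_iter=2000, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal self-organizing map: competitive learning on a 2-D grid of
    nodes with a shrinking Gaussian neighbourhood, suitable for clustering
    multi-attribute vectors extracted at each trace location."""
    rng = np.random.default_rng(seed)
    samples = np.asarray(samples, dtype=float)
    rows, cols = grid
    weights = rng.normal(size=(rows * cols, samples.shape[1]))
    rr, cc = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
    node_xy = np.column_stack([rr.ravel(), cc.ravel()]).astype(float)
    for it in range(n_iter):
        frac = it / n_iter
        lr = lr0 * (1 - frac)                                  # learning rate
        sigma = sigma0 * (1 - frac) + 0.5                      # neighbourhood
        x = samples[rng.integers(len(samples))]
        bmu = np.argmin(np.sum((weights - x) ** 2, axis=1))    # best match
        dist2 = np.sum((node_xy - node_xy[bmu]) ** 2, axis=1)
        h = np.exp(-dist2 / (2 * sigma ** 2))
        weights += lr * h[:, None] * (x - weights)
    return weights

def map_to_bmu(samples, weights):
    """Assign each attribute vector to its best-matching SOM node."""
    d = ((samples[:, None, :] - weights[None, :, :]) ** 2).sum(axis=2)
    return np.argmin(d, axis=1)

# Example: three synthetic attribute populations occupy distinct map regions.
rng = np.random.default_rng(8)
attrs = np.vstack([rng.normal(m, 0.2, size=(200, 3)) for m in (-1.0, 0.0, 1.0)])
w = train_som(attrs)
print(len(np.unique(map_to_bmu(attrs, w))))   # several occupied nodes
```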

  1. Towards monitoring the englacial fracture state using virtual-reflector seismology

    NASA Astrophysics Data System (ADS)

    Lindner, F.; Weemstra, C.; Walter, F.; Hadziioannou, C.

    2018-04-01

    In seismology, coda wave interferometry (CWI) is an effective tool to monitor time-lapse changes using later arriving, multiply scattered coda waves. Typically, CWI relies on an estimate of the medium's impulse response. The latter is retrieved through simple time-averaging of receiver-receiver cross-correlations of the ambient field, i.e. seismic interferometry (SI). In general, the coda are induced by heterogeneities in the Earth. Being comparatively homogeneous, however, ice bodies such as glaciers and ice sheets exhibit little scattering. In addition, the temporal stability of the time-averaged cross-correlations suffers from temporal variations in the distribution and amplitude of the passive seismic sources. Consequently, application of CWI to ice bodies is currently limited. Nevertheless, fracturing and changes in the englacial macroscopic water content alter the bulk elastic properties of ice bodies, which can be monitored with cryoseismological measurements. To overcome the current limited applicability of CWI to ice bodies, we therefore introduce virtual-reflector seismology (VRS). VRS relies on a so-called multidimensional deconvolution (MDD) process of the time-averaged crosscorrelations. The technique results in the retrieval of a medium response that includes virtual reflections from a contour of receivers enclosing the region of interest (i.e., the region to be monitored). The virtual reflections can be interpreted as artificial coda replacing the (lacking) natural scattered coda. Hence, this artificial coda might be exploited for the purpose of CWI. From an implementation point of view, VRS is similar to SI by MDD, which, as its name suggests, also relies on a multidimensional deconvolution process. SI by MDD, however, does not generate additional virtual reflections. Advantageously, both techniques mitigate spurious coda changes associated with temporal variations in the distribution and amplitude of the passive seismic sources. In this work, we apply SI by MDD and VRS to synthetic and active seismic surface-wave data. The active seismic data were acquired on Glacier de la Plaine Morte, Switzerland. We successfully retrieve virtual reflections through the application of VRS to this active seismic data. In application to both synthetic and active seismic data, we show the potential of VRS to monitor time-lapse changes. In addition, we find that SI by MDD allows for a more accurate determination of phase velocity.
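    Both SI by MDD and VRS start from time-averaged receiver-receiver cross-correlations of the ambient field. A minimal sketch of that first step is given below; the window length, normalization and maximum lag are assumed values, and the MDD/VRS deconvolution itself is not shown.

```python
import numpy as np

def noise_crosscorrelation(u_a, u_b, fs, max_lag_s=5.0, win_s=60.0):
    """Estimate the inter-receiver response by averaging cross-correlations
    of successive noise windows recorded at receivers A and B (the classical
    SI step that both SI by MDD and VRS build on). Illustrative sketch."""
    n_win = int(win_s * fs)
    n_lag = int(max_lag_s * fs)
    lags = np.arange(-n_lag, n_lag + 1)
    acc = np.zeros(lags.size)
    n_segments = min(len(u_a), len(u_b)) // n_win
    for k in range(n_segments):
        a = u_a[k * n_win:(k + 1) * n_win]
        b = u_b[k * n_win:(k + 1) * n_win]
        a = (a - a.mean()) / (a.std() + 1e-12)   # simple pre-processing
        b = (b - b.mean()) / (b.std() + 1e-12)
        full = np.correlate(a, b, mode="full")   # lags -(n-1) ... +(n-1)
        mid = n_win - 1                          # zero-lag index
        acc += full[mid - n_lag:mid + n_lag + 1]
    return lags / fs, acc / max(n_segments, 1)
```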

  2. Applying the seismic interferometry method to vertical seismic profile data using tunnel excavation noise as source

    NASA Astrophysics Data System (ADS)

    Jurado, Maria Jose; Teixido, Teresa; Martin, Elena; Segarra, Miguel; Segura, Carlos

    2013-04-01

    In the frame of research conducted to develop efficient strategies for investigating rock properties and fluids ahead of tunnel excavations, the seismic interferometry method was applied to analyze data acquired in boreholes instrumented with geophone strings. The results obtained confirmed that seismic interferometry provided an improved resolution of petrophysical properties to identify heterogeneities and geological structures ahead of the excavation. These features are beyond the resolution of other conventional geophysical methods but can cause severe problems during the excavation of tunnels. Geophone strings were used to record different types of seismic noise generated at the tunnel head during excavation with a tunnelling machine and also during the placement of the rings covering the tunnel excavation. In this study we show how tunnel construction activities have been characterized as a source of seismic signal and used in our research as the seismic source for generating a 3D reflection seismic survey. The data were recorded in a vertical, water-filled borehole with a borehole seismic string at a distance of 60 m from the tunnel trace. A reference pilot signal was obtained from seismograms acquired close to the tunnel face in order to obtain the best signal-to-noise ratio for the interferometry processing (Poletto et al., 2010). The seismic interferometry method (Claerbout, 1968) was successfully applied to image the subsurface geological structure using the seismic wavefield generated by tunneling (tunnelling machine and construction activities) recorded with geophone strings. The technique was applied by simulating virtual shot records, one for each receiver in the borehole, from the transmitted seismic events, and processing the data as a reflection seismic survey. The pseudo-reflection wavefield was obtained by cross-correlation of the transmitted wave data. We applied the relationship between the transmission response and the reflection response first for a 1D multilayer structure and then in a 3D approach (Wapenaar et al., 2004). As a result of this seismic interferometry experiment the 3D reflectivity model (frequencies and resolution ranges) was obtained. We also showed that the seismic interferometry approach can be applied in asynchronous seismic auscultation. The reflections detected in the virtual seismic sections are in agreement with the geological features encountered during the excavation of the tunnel and also with the petrophysical properties and parameters measured in previous geophysical borehole logging. References: Claerbout, J.F., 1968. Synthesis of a layered medium from its acoustic transmission response. Geophysics, 33, 264-269. Poletto, F., Corubolo, P., and Comeli, P., 2010. Drill-bit seismic interferometry with and without pilot signals. Geophysical Prospecting, 58, 257-265. Wapenaar, K., Thorbecke, J., and Draganov, D., 2004. Relations between reflection and transmission responses of three-dimensional inhomogeneous media. Geophysical Journal International, 156, 179-194.

  3. The nature of noise wavefield and its applications for site effects studies: A literature review

    NASA Astrophysics Data System (ADS)

    Bonnefoy-Claudet, Sylvette; Cotton, Fabrice; Bard, Pierre-Yves

    2006-12-01

    The aim of this paper is to discuss the existing scientific literature in order to gather all the available information dealing with the origin and nature of the ambient seismic noise wavefield. This issue is essential because the use of seismic noise is increasingly popular for seismic hazard purposes, with a growing number of processing techniques based on the assumption that the noise wavefield consists predominantly of fundamental-mode Rayleigh waves. This survey reveals an overall agreement about the origin of seismic noise and its frequency dependence. At frequencies higher than 1 Hz, seismic noise systematically exhibits daily and weekly variations linked to human activities, whereas at lower frequencies (between 0.005 and 0.3 Hz) the variation of seismic noise is correlated with natural activities (oceanic, meteorological…). Such a surface origin clearly supports the interpretation of the seismic noise wavefield as consisting primarily of surface waves. However, the further, very common (though often implicit) assumption according to which almost all the noise energy would be carried by fundamental-mode Rayleigh waves is not supported by the few available data: no "average" number can be given for the actual proportion between surface and body waves, Love and Rayleigh waves (horizontal components), or fundamental and higher modes (vertical components), since the few available investigations report a significant variability, which might be related to site conditions and noise source properties.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brian Toelle

    This project, 'Application of Time-Lapse Seismic Monitoring for the Control and Optimization of CO2 Enhanced Oil Recovery Operations', investigated the potential for monitoring CO2 floods in carbonate reservoirs through the use of standard p-wave seismic data. This primarily involved the use of 4D seismic (time-lapse seismic) in an attempt to observe and map the movement of the injected CO2 through a carbonate reservoir. The differences between certain seismic attributes, such as amplitude, were used for this purpose. This technique has recently been shown to be effective in CO2 monitoring in Enhanced Oil Recovery (EOR) projects, such as Weyburn. This study was conducted in the Charlton 30/31 field in the northern Michigan Basin, which is a Silurian pinnacle reef that completed its primary production in 1997 and was scheduled for enhanced oil recovery using injected CO2. Prior to injection an initial 'Base' 3D survey was obtained over the field and was then processed and interpreted. CO2 injection within the main portion of the reef was conducted intermittently during 13 months starting in August 2005. During this time, 29,000 tons of CO2 was injected into the Guelph formation, historically known as the Niagaran Brown formation. By September 2006, the reservoir pressure within the reef had risen to approximately 2,000 psi and oil and water production from the one producing well within the field had increased significantly. The determination of the reservoir's porosity distribution, a critical aspect of reservoir characterization and simulation, proved to be a significant portion of this project. In order to relate the differences observed between the seismic attributes seen on the multiple 3D seismic surveys and the actual location of the CO2, a predictive reservoir simulation model was developed based on seismic attributes obtained from the Base 3D seismic survey and available well data. This simulation predicted that the CO2 injected into the reef would remain in the northern portion of the field. Two new wells, the State Charlton 4-30 and the Larsen 3-31, were drilled into the field in 2006 and 2008, respectively, and supported this assessment. A second (or 'Monitor') 3D seismic survey was acquired during September 2007 over most of the field and duplicated the first (Base) survey as closely as possible. However, as the simulation and new well data available at that time indicated that the CO2 was concentrated in the northern portion of the field, the second seismic survey was not acquired over the extreme southern end of the area covered by the original (or Base) 3D survey. Basic processing was performed on the second 3D seismic survey and, finally, 4D processing methods were applied to both the Base and the Monitor surveys. In addition to this 3D data, a shear-wave seismic data set was obtained at the same time. Interpretation of the 4D seismic data indicated that a significant amplitude change, not attributable to differences in acquisition or processing, existed at the locations within the reef predicted by the reservoir simulation. The reservoir simulation was based on the porosity distribution obtained from seismic attributes from the Base 3D survey. Using this validated reservoir simulation, the location of oil within the reef at the time the Monitor survey was acquired was estimated, and recommendations were made for the drilling of additional EOR wells.
    The economic impact of this project has been estimated in terms of both enhanced oil recovery and CO2 sequestration potential. In the northern Michigan Basin alone, the Niagaran reef play comprises over 700 Niagaran reefs with reservoirs already depleted by primary production. Potentially, over 1 billion bbl of oil (original oil in place minus primary recovery) remains in the reefs in Michigan, much of which could be mobilized more efficiently utilizing techniques similar to those employed in this study.
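    The core 4D measurement described above is a difference of amplitude-based attributes between the Base and Monitor volumes. A minimal sketch is shown below; it assumes both cubes have already been cross-equalized onto an identical grid, and the window times, sample interval and array shapes are placeholders.

```python
import numpy as np

def rms_amplitude_map(cube, t0, t1, dt):
    """RMS amplitude of a 3-D seismic cube (inline x crossline x time)
    inside a reservoir-level time window [t0, t1] in seconds."""
    i0, i1 = int(t0 / dt), int(t1 / dt)
    window = cube[:, :, i0:i1]
    return np.sqrt((window ** 2).mean(axis=2))

def time_lapse_difference(base, monitor, t0, t1, dt):
    """Monitor-minus-Base RMS amplitude map; anomalies may highlight
    injection-related changes if acquisition/processing differences have
    been removed. Hypothetical inputs of shape (n_il, n_xl, n_samples)."""
    return rms_amplitude_map(monitor, t0, t1, dt) - rms_amplitude_map(base, t0, t1, dt)
```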

  5. "Geo-statistics methods and neural networks in geophysical applications: A case study"

    NASA Astrophysics Data System (ADS)

    Rodriguez Sandoval, R.; Urrutia Fucugauchi, J.; Ramirez Cruz, L. C.

    2008-12-01

    The study focuses on the Ebano-Panuco basin of northeastern Mexico, which is being explored for hydrocarbon reservoirs. These reservoirs are in limestones and there is interest in determining porosity and permeability in the carbonate sequences. The porosity maps presented in this study are estimated from the application of multiattribute and neural network techniques, which combine geophysical logs and 3-D seismic data by means of statistical relationships. The multiattribute analysis is a process to predict a volume of any underground petrophysical measurement from well-log and seismic data. The data consist of a series of target logs from wells which tie a 3-D seismic volume. The target logs are neutron porosity logs. From the 3-D seismic volume a series of sample-based attributes is calculated. The objective of this analysis is to derive a relationship between a set of attributes and the target log values. The selected set is determined by a process of forward stepwise regression. The analysis can be linear or nonlinear. In the linear mode the method consists of a series of weights derived by least-squares minimization. In the nonlinear mode, a neural network is trained using the selected attributes as inputs. In this case we used a probabilistic neural network (PNN). The method is applied to a real data set from PEMEX. For better reservoir characterization the porosity distribution was estimated using both techniques. The case showed a continuous improvement in the prediction of porosity from the multiattribute analysis to the neural network analysis. The improvement is in the training and the validation, which are important indicators of the reliability of the results. The neural network showed an improvement in resolution over the multiattribute analysis. The final maps provide more realistic results of the porosity distribution.
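    In the linear mode described above, the multiattribute transform reduces to a least-squares fit of attribute weights against the target porosity log. The sketch below illustrates that step only (the forward stepwise attribute selection and the PNN are omitted); the variable names and shapes are assumptions.

```python
import numpy as np

def fit_multiattribute(attributes, target_log):
    """Least-squares weights relating seismic attributes (n_samples x n_attr)
    to a target porosity log sampled at the same locations. Illustrative
    sketch of the linear mode only."""
    A = np.column_stack([np.ones(len(attributes)), attributes])  # intercept + attributes
    weights, *_ = np.linalg.lstsq(A, target_log, rcond=None)
    return weights

def predict_porosity(attributes, weights):
    """Apply the fitted weights to attributes extracted anywhere in the volume."""
    A = np.column_stack([np.ones(len(attributes)), attributes])
    return A @ weights
```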

  6. Infrasound and Seismic Recordings of Rocket Launches from Kennedy Space Center, 2016-2017

    NASA Astrophysics Data System (ADS)

    McNutt, S. R.; Thompson, G.; Brown, R. G.; Braunmiller, J.; Farrell, A. K.; Mehta, C.

    2017-12-01

    We installed a temporary 3-station seismic-infrasound network at Kennedy Space Center (KSC) in February 2016 to test sensor calibrations and train students in field deployment and data acquisition techniques. Each station featured a single broadband 3-component seismometer and a 3-element infrasound array. In May 2016 the network was scaled back to a single station due to other projects competing for equipment. To date, 8 rocket launches have been recorded by the infrasound array, as well as 2 static tests, 1 aborted launch and 1 rocket explosion (see next abstract). Of the rocket launches recorded, 4 were SpaceX Falcon-9, 2 were ULA Atlas-5 and 2 were ULA Delta-IV. A question we attempt to answer is whether the rocket engine type and launch trajectory can be estimated with appropriate travel-time, amplitude-ratio and spectral techniques. For example, there is a clear Doppler shift in seismic and infrasound spectrograms from all launches, with lower frequencies occurring later in the recorded signal as the rocket accelerates away from the array. Another question of interest is whether there are relationships between jet noise frequency, thrust and/or nozzle velocity. Infrasound data may help answer these questions. We are now in the process of deploying a permanent seismic and infrasound array at the Astronaut Beach House. Ten more rocket launches are scheduled before AGU. NASA is also conducting a series of 33 sonic booms over KSC beginning on Aug 21st. Launches and other events at KSC have provided rich sources of signals that are useful to characterize and gain insight into physical processes and wave generation from man-made sources.
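    The downward frequency glide mentioned above is the classical Doppler shift for a receding source. The few lines below illustrate the expected received frequency at a stationary array for an assumed source tone; the numbers are round illustrative values, not measurements from the KSC recordings.

```python
# Stationary infrasound array, source receding at speed v: the received
# frequency is f_obs = f_src / (1 + v / c), so tones glide downward as the
# rocket accelerates away. Hypothetical round-number values only.
c = 340.0          # speed of sound in air, m/s
f_src = 20.0       # assumed jet-noise tone, Hz
for v in (0.0, 100.0, 300.0, 600.0):
    print(v, round(f_src / (1.0 + v / c), 2))
```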

  7. Seismic lateral prediction in chalky limestone reservoirs offshore Qatar

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rubbens, I.B.H.M.; Murat, R.C.; Vankeulen, J.

    Following the discovery of non-structurally trapped oil accumulations in Cretaceous chalky reservoirs on the northern flank of the North Dome offshore QATAR, a seismic lateral prediction study was carried out for QATAR GENERAL PETROLEUM CORPORATION (Offshore Operations). The objectives of this study were to assist in the appraisal of these oil accumulations by predicting their possible lateral extent and to investigate if the technique applied could be used as a basis for further exploration of similar oil prospects in the area. Wireline logs of eight wells and some 1000 km of high quality seismic data were processed into acoustic impedance (A.I.) logs and seismic A.I. sections. Having obtained a satisfactory match of the A.I. well logs and the A.I. of the seismic traces at the well locations, relationships were established by the use of well log data which allowed the interpretation of the seismic A.I. in terms of reservoir quality. Measurements of the relevant A.I. characteristics were then carried out by computer along all seismic lines and porosity distribution maps prepared for some of the reservoirs. These maps, combined with detailed seismic depth contour maps at reservoir tops, led to the definition of good reservoir development areas downdip from poor reservoir quality zones, i.e. of the stratigraphic trap areas, and drilling locations could thus be proposed. The system remains to be adequately calibrated when core material becomes available in the area of study.

  8. Seismic Anisotropy from Surface Refraction Measurements

    NASA Astrophysics Data System (ADS)

    Vilhelm, J.; Hrdá, J.; Klíma, K.; Lokajícek, T.; Pros, Z.

    2003-04-01

    The contribution deals with methods of determining P- and S-wave velocities in shallow refraction seismics. A comparison of P-wave anisotropy from samples and from field surface measurements is performed. The laboratory measurement of the P-wave velocity is realized as an omnidirectional ultrasound measurement on oriented spherical samples (diameter 5 cm) under hydrostatic pressures up to 400 MPa. The field measurement is based on the processing of at least one pair of reversed time-distance curves of refracted waves. Different velocity calculation techniques are involved, including a tomographic approach from the surface. It is shown that field seismic measurements can reflect internal rock fabric (lineation, mineral anisotropy) as well as effects connected with fracturing and weathering. The elastic constants derived from laboratory measurements exhibit transverse isotropy. For the estimation of the anisotropy influence we perform ray tracing with the software package ANRAY (Consortium Seismic Waves in Complex 3-D Structures). The use of P- and S-wave anisotropy measurements to delineate a hard-rock hydrogeological collector (water resource) is presented. In a relatively homogeneous lutaceous sedimentary medium we identified a transversely isotropic layer which exhibits an increased value of permeability (transmissivity). The seismic measurement is realized with three-component geophones and both vertical and shear seismic sources. VLF and resistivity profiling accompany the field survey.

  9. Dynamic strain and rotation ground motions of the 2011 Tohoku earthquake from dense high-rate GPS observations in Taiwan

    NASA Astrophysics Data System (ADS)

    Huang, B. S.; Rau, R. J.; Lin, C. J.; Kuo, L. C.

    2017-12-01

    Seismic waves generated by the 2011 Mw 9.0 Tohoku, Japan earthquake were well recorded by continuous GPS in Taiwan. These GPS stations operated at a one-hertz sampling rate and were densely distributed across Taiwan Island. The continuous GPS observations and the precise point positioning technique provide an opportunity to estimate spatial derivatives from the absolute ground motions of this giant teleseismic event. In this study, we process and investigate more than 150 high-rate GPS displacement records and their spatial derivatives, i.e. strains and rotations, and compare them with broadband seismic and rotational sensor observations. It is shown that continuous GPS observations are highly consistent with broadband seismic observations during the passage of the surface waves across Taiwan Island. Several standard geodetic and seismic array analysis techniques for spatial gradients have been applied to these continuous GPS time series to determine dynamic strain and rotation time histories. Results show that the GPS-derived vertical-axis ground rotations are consistent with seismic-array-determined rotations. However, vertical rotation-rate observations from the R1 rotational sensors have low resolution and could not be compared with GPS observations for this particular event. Owing to the dense spatial distribution of GPS stations on Taiwan Island, not only were wavefield gradient time histories obtained at individual sites, but 2-D spatial ground motion fields were also determined in this study. We report the analysis of these spatial gradient wavefields of the 2011 Tohoku earthquake across Taiwan Island and discuss their geological implications.
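    A common way to obtain such spatial derivatives from a cluster of stations is a least-squares plane fit of the displacements at each time sample, from which horizontal strains and the vertical-axis rotation follow. The sketch below illustrates that idea under assumed inputs; it is not the specific geodetic or array-processing implementation used in the study.

```python
import numpy as np

def gradients_from_array(x, y, ux, uy):
    """Least-squares plane fit of horizontal displacements over a small
    station cluster; returns horizontal strains and the vertical-axis
    rotation at one time sample. Coordinates in metres, displacements in
    metres. Illustrative sketch of the array-gradient approach."""
    G = np.column_stack([np.ones_like(x), x, y])
    cx, *_ = np.linalg.lstsq(G, ux, rcond=None)   # ux ~ cx0 + cx1*x + cx2*y
    cy, *_ = np.linalg.lstsq(G, uy, rcond=None)
    dux_dx, dux_dy = cx[1], cx[2]
    duy_dx, duy_dy = cy[1], cy[2]
    exx, eyy = dux_dx, duy_dy
    exy = 0.5 * (dux_dy + duy_dx)
    rot_z = 0.5 * (duy_dx - dux_dy)               # rotation about the vertical axis
    return exx, eyy, exy, rot_z
```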

  10. Application of underground microseismic monitoring for ground failure and secure longwall coal mining operation: A case study in an Indian mine

    NASA Astrophysics Data System (ADS)

    Ghosh, G. K.; Sivakumar, C.

    2018-03-01

    The longwall mining technique has been widely used around the globe due to its relatively safe mining process. However, mining operations are suspended when problems arise such as roof falls, crack and fracture propagation in the roof, and complex roof strata behavior. To overcome these problems, an underground real-time microseismic monitoring technique has been implemented in the working panel-P2 in the Rajendra longwall underground coal mine at South Eastern Coalfields Limited (SECL), India. The target coal seam at panel P-2 lies at a depth of 70 m to 76 m. In this process, 10 to 15 uniaxial geophones were placed inside a borehole at depths of 40 m to 60 m located above the working panel-P2, in rock with a high rock quality designation value, for better seismic signal. Various microseismic events were recorded with magnitudes ranging from -5 to 2 on the Richter scale. Time-series processing was carried out to obtain various seismic parameters such as activity rate, potential energy, viscosity rate, seismic moment, energy index, and apparent volume with respect to time. The use of these parameters helped in tracing the events, understanding crack and fracture propagation, and locating both high- and low-stress distribution zones prior to roof fall occurrence. In most of the cases, the events were divided into three-stage processes: initial or preliminary, middle or building, and final or falling. The results of this study reveal that underground microseismic monitoring provides sufficient prior information on underground weighting events. The information gathered during the study was conveyed to the mining personnel in advance of roof fall events. This permitted appropriate action to be taken for safer mining operations and risk reduction during longwall operation.

  11. Seismic reflection constraints on the glacial dynamics of Johnsons Glacier, Antarctica

    NASA Astrophysics Data System (ADS)

    Benjumea, Beatriz; Teixidó, Teresa

    2001-01-01

    During two Antarctic summers (1996-1997 and 1997-1998), five seismic refraction and two reflection profiles were acquired on the Johnsons Glacier (Livingston Island, Antarctica) in order to obtain information about the structure of the ice, characteristics of the ice-bed contact and basement topography. An innovative technique has been used for the acquisition of reflection data to optimise the field survey schedule. Different shallow seismic sources were used during each field season: Seismic Impulse Source System (SISSY) for the first field survey and low-energy explosives (pyrotechnic noisemakers) during the second one. A comparison between these two shallow seismic sources has been performed, showing that the explosives are a better seismic source in this ice environment. This is one of the first studies where this type of source has been used. The analysis of seismic data corresponding to one of the reflection profiles (L3) allows us to delineate sectors with different glacier structure (accumulation and ablation zones) without using glaciological data. Moreover, vertical discontinuities were detected by the presence of back-scattered energy and the abrupt change in frequency content of first arrivals shown in shot records. After the raw data analysis, standard processing led us to a clear seismic image of the underlying bed topography, which can be correlated with the ice flow velocity anomalies. The information obtained from seismic data on the internal structure of the glacier, location of fracture zones and the topography of the ice-bed interface constrains the glacial dynamics of Johnsons Glacier.

  12. The R-package eseis - A toolbox to weld geomorphic, seismologic, spatial, and time series analysis

    NASA Astrophysics Data System (ADS)

    Dietze, Michael

    2017-04-01

    Environmental seismology is the science of investigating the seismic signals that are emitted by Earth surface processes. This emerging field provides unique opportunities to identify, locate, track and inspect a wide range of the processes that shape our planet. Modern broadband seismometers are sensitive enough to detect signals from sources as weak as wind interacting with the ground and as powerful as collapsing mountains. This places the field of environmental seismology at the seams of many geoscientific disciplines and requires integration of a series of specialised analysis techniques. R provides the perfect environment for this challenge. The package eseis uses the foundations laid by a series of existing packages and data types tailored to solve specialised problems (e.g., signal, sp, rgdal, Rcpp, matrixStats) and thus allows efficient handling of large streams of seismic data (> 300 million samples per station and day). It supports standard data formats (mseed, sac), preparation techniques (deconvolution, filtering, rotation), processing methods (spectra, spectrograms, event picking, migration for localisation) and data visualisation. Thus, eseis provides a seamless approach to the entire workflow of environmental seismology and passes the output to related analysis fields with temporal, spatial and modelling focus in R.

  13. Global high-frequency source imaging accounting for complexity in Green's functions

    NASA Astrophysics Data System (ADS)

    Lambert, V.; Zhan, Z.

    2017-12-01

    The general characterization of earthquake source processes at long periods has seen great success via seismic finite fault inversion/modeling. Complementary techniques, such as seismic back-projection, extend the capabilities of source imaging to higher frequencies and reveal finer details of the rupture process. However, such high frequency methods are limited by the implicit assumption of simple Green's functions, which restricts the use of global arrays and introduces artifacts (e.g., sweeping effects, depth/water phases) that require careful attention. This motivates the implementation of an imaging technique that considers the potential complexity of Green's functions at high frequencies. We propose an alternative inversion approach based on the modest assumption that the path effects contributing to signals within high-coherency subarrays share a similar form. Under this assumption, we develop a method that can combine multiple high-coherency subarrays to invert for a sparse set of subevents. By accounting for potential variability in the Green's functions among subarrays, our method allows for the utilization of heterogeneous global networks for robust high resolution imaging of the complex rupture process. The approach also provides a consistent framework for examining frequency-dependent radiation across a broad frequency spectrum.

  14. Designing and Implementing a Retrospective Earthquake Detection Framework at the U.S. Geological Survey National Earthquake Information Center

    NASA Astrophysics Data System (ADS)

    Patton, J.; Yeck, W.; Benz, H.

    2017-12-01

    The U.S. Geological Survey National Earthquake Information Center (USGS NEIC) is implementing and integrating new signal detection methods such as subspace correlation, continuous beamforming, multi-band picking and automatic phase identification into near-real-time monitoring operations. Leveraging the additional information from these techniques helps the NEIC utilize a large and varied network on local to global scales. The NEIC is developing an ordered, rapid, robust, and decentralized framework for distributing seismic detection data as well as a set of formalized formatting standards. These frameworks and standards enable the NEIC to implement a seismic event detection framework that supports basic tasks, including automatic arrival time picking, social-media-based event detections, and automatic association of different seismic detection data into seismic earthquake events. In addition, this framework enables retrospective detection processing such as automated S-wave arrival time picking given a detected event, discrimination and classification of detected events by type, back-azimuth and slowness calculations, and ensuring aftershock and induced sequence detection completeness. These processes and infrastructure improve the NEIC's capabilities, accuracy, and speed of response. In addition, this same infrastructure provides an improved and convenient structure to support access to automatic detection data for both research and algorithmic development.
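    As a concrete, generic example of automatic arrival detection, the sketch below implements a classical STA/LTA trigger on a single trace. It is a textbook detector with assumed window lengths and threshold, shown only for illustration; it is not the NEIC's production picker.

```python
import numpy as np

def sta_lta(trace, fs, sta_s=1.0, lta_s=30.0, threshold=4.0):
    """Classical STA/LTA detector: flag samples where the short-term average
    of the squared trace exceeds `threshold` times the long-term average.
    Window lengths and threshold are assumed illustrative values."""
    n_sta, n_lta = int(sta_s * fs), int(lta_s * fs)
    energy = trace.astype(float) ** 2
    csum = np.cumsum(energy)
    sta = (csum[n_sta:] - csum[:-n_sta]) / n_sta      # short-term moving average
    lta = (csum[n_lta:] - csum[:-n_lta]) / n_lta      # long-term moving average
    m = min(len(sta), len(lta))                       # align both series at the trace end
    ratio = sta[-m:] / (lta[-m:] + 1e-20)
    triggers = np.flatnonzero(ratio > threshold)      # indices into `ratio`
    return triggers, ratio
```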

  15. The assessment of seismic hazard for Gori, (Georgia) and preliminary studies of seismic microzonation

    NASA Astrophysics Data System (ADS)

    Gogoladze, Z.; Moscatelli, M.; Giallini, S.; Avalle, A.; Gventsadze, A.; Kvavadze, N.; Tsereteli, N.

    2016-12-01

    Seismic risk is a crucial issue for the South Caucasus, which is the main gateway between Asia and Europe. The goal of this work is to propose new methods and criteria for defining an overall approach aimed at assessing and mitigating seismic risk in Georgia. In this regard, seismic microzonation represents a highly useful tool for seismic risk assessment in land management, for the design of buildings or structures and for emergency planning. Seismic microzonation assesses the local seismic hazard, which is the component of seismicity resulting from specific local characteristics that cause local amplification and soil instability, through the identification of zones with seismically homogeneous behavior. This paper presents the results of a preliminary study of seismic microzonation of Gori, Georgia. Gori is located in the Shida Kartli region, on both sides of the Liachvi and Mtkvari rivers, with an area of about 135 km2 around the Gori fortress. Gori lies in the Achara-Trialeti fold-thrust belt, which is tectonically unstable. Half of all earthquakes in the Gori area with magnitude M≥3.5 have happened along this fault zone and, on the basis of damage caused by previous earthquakes, this territory shows the highest level of risk (the maximum value of direct losses) in the central part of the town. The level 1 seismic microzonation map of Gori was produced using: 1) already available data (i.e., topographic maps and borehole data), 2) results of new geological surveys and 3) geophysical measurements (i.e., MASW and noise measurements processed with the HVSR technique). Our preliminary results highlight the presence of both stable zones susceptible to local amplification and unstable zones susceptible to geological instability. Our results are directed at establishing a set of actions aimed at risk mitigation before the initial onset of an emergency, and at management of the emergency once the seismic event has occurred. The products obtained will contain the basic elements of an integrated system aimed at reducing risk and improving the overall safety of people and infrastructure in Georgia.
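    The HVSR processing mentioned above divides the horizontal by the vertical amplitude spectrum of three-component ambient noise. A bare-bones, single-window sketch is given below; real processing averages many windows, smooths the spectra and applies anti-trigger criteria, and the FFT length here is an assumed value.

```python
import numpy as np

def hvsr(north, east, vertical, fs, nfft=4096):
    """Single-window horizontal-to-vertical spectral ratio from three-
    component ambient noise. Bare-bones illustrative sketch only."""
    freqs = np.fft.rfftfreq(nfft, d=1.0 / fs)
    n_spec = np.abs(np.fft.rfft(north, nfft))
    e_spec = np.abs(np.fft.rfft(east, nfft))
    v_spec = np.abs(np.fft.rfft(vertical, nfft))
    h_spec = np.sqrt(0.5 * (n_spec ** 2 + e_spec ** 2))   # quadratic mean of horizontals
    ratio = h_spec / (v_spec + 1e-20)
    return freqs, ratio

# The frequency of the main HVSR peak is commonly read as the fundamental
# resonance frequency of the soil column at the measurement site.
```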

  16. 3D AUV Microseismic Implementation for Deepwater Seabed Investigations

    NASA Astrophysics Data System (ADS)

    George, R.; Taylor, M. W.; Gravely, J. G.

    2005-05-01

    Autonomous Underwater Vehicle (AUV) technology, developed commercially over the past 5 years, allows for the geophysical investigation of the seabed on the deepwater continental slope at resolutions, data densities and timelines not previously attainable. High-resolution geophysical systems normally employed on deepwater survey AUVs consist of multibeam bathymetry, side scan sonar and subbottom profiler. Inertial navigation allows positioning accuracies on the order of plus or minus 3 meters in depths up to 2,000 meters. C & C Technologies, Inc. owns and operates the C-Surveyor I AUV, which has collected more than 40,000 km of geohazard survey data on the continental slopes of the Gulf of Mexico, Mediterranean Sea, Brazil and West Africa. The oil and gas industry routinely engineers deepwater platform-mooring systems and other bottom founded subsea systems for exploration and production developments. Detailed subbottom imaging of the foundation zone to identify the near-seafloor geologic conditions at these deepwater development sites is critical to maintaining system integrity. The paper describes the methodology and post-processing techniques used to create a high-resolution (2-8 kHz) 3D seismic cube from subbottom profiler data collected from an AUV system. Data examples of the multibeam bathymetry, side scan sonar and 2D seismic profiles will be provided to complement the results of the 3D seismic cube processing. Examples of inlines, crosslines, arbitrary lines, seafloor amplitude extraction and time slices are presented for the 4-meter binned data set. Advantages, disadvantages and suggested improvements for the survey acquisition technique and post-processing are discussed.

  17. Passive (Micro-) Seismic Event Detection by Identifying Embedded "Event" Anomalies Within Statistically Describable Background Noise

    NASA Astrophysics Data System (ADS)

    Baziw, Erick; Verbeek, Gerald

    2012-12-01

    Among engineers there is considerable interest in the real-time identification of "events" within time series data with a low signal to noise ratio. This is especially true for acoustic emission analysis, which is utilized to assess the integrity and safety of many structures and is also applied in the field of passive seismic monitoring (PSM). Here an array of seismic receivers is used to acquire acoustic signals to monitor locations where seismic activity is expected: underground excavations, deep open pits and quarries, reservoirs into which fluids are injected or from which fluids are produced, permeable subsurface formations, or sites of large underground explosions. The most important element of PSM is event detection: the monitoring of seismic acoustic emissions is a continuous, real-time process which typically runs 24 h a day, 7 days a week, and therefore a PSM system with poor event detection can easily acquire terabytes of useless data as it does not identify crucial acoustic events. This paper outlines a new algorithm developed for this application, the so-called SEED™ (Signal Enhancement and Event Detection) algorithm. The SEED™ algorithm uses real-time Bayesian recursive estimation digital filtering techniques for PSM signal enhancement and event detection.

  18. Seismic Analysis Capability in NASTRAN

    NASA Technical Reports Server (NTRS)

    Butler, T. G.; Strang, R. F.

    1984-01-01

    Seismic analysis is a technique which pertains to loading described in terms of boundary accelerations. Earthquake shocks to buildings are the type of excitation which usually comes to mind when one hears the word seismic, but the technique also applies to a broad class of acceleration excitations applied at the base of a structure, such as vibration shaker testing or shocks to machinery foundations. Four different solution paths are available in NASTRAN for seismic analysis. They are: Direct Seismic Frequency Response, Direct Seismic Transient Response, Modal Seismic Frequency Response, and Modal Seismic Transient Response. This capability, at present, is invoked not as separate rigid formats, but as pre-packaged ALTER packets to existing RIGID Formats 8, 9, 11, and 12. These ALTER packets are included with the delivery of the NASTRAN program and are stored on the computer as a library of callable utilities. The user calls one of these utilities and merges it into the Executive Control Section of the data deck; any of the four options are then invoked by setting parameter values in the bulk data.

  19. GEOMORPHIC CONTROLS ON MEADOW ECOSYSTEMS – INSIGHTS INTO LOCAL PROCESSES USING NEAR-SURFACE SEISMIC TECHNIQUES AND GROUND PENETRATING RADAR

    EPA Science Inventory

    Geomorphic controls on riparian meadows in the Central Great Basin of Nevada are an important aspect in determining the formation of and planning the management of these systems. The current hypothesis is that both alluvial fan sediment and faulted bedrock steps interact to cont...

  20. Volcanic Centers in the East Africa Rift: Volcanic Processes with Seismic Stresses to Identify Potential Hydrothermal Vents

    NASA Astrophysics Data System (ADS)

    Patlan, E.; Wamalwa, A. M.; Kaip, G.; Velasco, A. A.

    2015-12-01

    The Geothermal Development Company (GDC) in Kenya actively seeks to produce geothermal energy from resources that lie within the East African Rift System (EARS). The EARS, an active continental rift zone, appears to be a developing tectonic plate boundary and thus has a number of active as well as dormant volcanoes throughout its extent. These volcanic centers can be used as potential sources for geothermal energy. The University of Texas at El Paso (UTEP) and the GDC deployed seismic sensors to monitor several volcanic centers: Menengai, Silali, Paka, and Korosi. We identify microseismic events, local events, and tilt-like events using automatic detection algorithms and manual review to find potential local earthquakes within our seismic network. We then apply the double-difference location method to events of local magnitude less than two to image the boundary of the magma chamber and the conduits feeding the volcanoes. In the process of locating local seismicity, we also identify long-period, explosion, and tremor signals that we interpret as magma passing through conduits of the magma chamber and/or fluid being transported as a function of magma movement or hydrothermal activity. We used waveform inversion and shear-wave splitting to approximate the orientation of the local stresses from the vent or fissure-like conduit of the volcano. The microseismic events and long-period events will help us interpret the activity of the volcanoes. Our goal is to investigate basement structures beneath the volcanoes and identify the extent of magmatic modifications of the crust. Overall, these seismic techniques will help us understand magma movement and volcanic processes in the region.

  1. Tunnel Detection Using Seismic Methods

    NASA Astrophysics Data System (ADS)

    Miller, R.; Park, C. B.; Xia, J.; Ivanov, J.; Steeples, D. W.; Ryden, N.; Ballard, R. F.; Llopis, J. L.; Anderson, T. S.; Moran, M. L.; Ketcham, S. A.

    2006-05-01

    Surface seismic methods have shown great promise for use in detecting clandestine tunnels in areas where unauthorized movement beneath secure boundaries has been or is a matter of concern for authorities. Unauthorized infiltration beneath national borders and into or out of secure facilities is possible at many sites by tunneling. Developments in acquisition, processing, and analysis techniques using multi-channel seismic imaging have opened the door to a vast number of near-surface applications including anomaly detection and delineation, specifically tunnels. Body waves have great potential based on modeling and very preliminary empirical studies trying to capitalize on diffracted energy. A primary limitation of all seismic energy is the natural attenuation of high-frequency energy by earth materials and the difficulty in transmitting a high-amplitude source pulse with a broad spectrum above 500 Hz into the earth. Surface waves have shown great potential since the development of multi-channel analysis methods (e.g., MASW). Both shear-wave velocity and backscatter energy from surface waves have been shown through modeling and empirical studies to have great promise in detecting the presence of anomalies, such as tunnels. Success in developing and evaluating various seismic approaches for detecting tunnels relies on investigations at known tunnel locations, in a variety of geologic settings, employing a wide range of seismic methods, and targeting a range of uniquely different tunnel geometries, characteristics, and host lithologies. Body-wave research at the Moffat tunnels in Winter Park, Colorado, provided well-defined diffraction-looking events that correlated with the subsurface location of the tunnel complex. Natural voids related to karst have been studied in Kansas, Oklahoma, Alabama, and Florida using shear-wave velocity imaging techniques based on the MASW approach. Manmade tunnels, culverts, and crawl spaces have been the target of multi-modal analysis in Kansas and California. Clandestine tunnels used for illegal entry into the U.S. from Mexico were studied at two different sites along the southern border of California. All these studies represent the empirical basis for suggesting surface seismic has a significant role to play in tunnel detection and that methods are under development and very nearly at hand that will provide an effective tool in appraising and maintaining perimeter security. As broadband sources, gravity-coupled towed spreads, and automated analysis software continue to make advancements, so does the applicability of routine deployment of seismic imaging systems that can be operated by technicians with interpretation aids for nearly real-time target selection. Key to making these systems commercial is the development of enhanced imaging techniques in geologically noisy areas and highly variable surface terrain.

  2. From intuition to statistics in building subsurface structural models

    USGS Publications Warehouse

    Brandenburg, J.P.; Alpak, F.O.; Naruk, S.; Solum, J.

    2011-01-01

    Experts associated with the oil and gas exploration industry suggest that combining forward trishear models with stochastic global optimization algorithms allows a quantitative assessment of the uncertainty associated with a given structural model. The methodology is applied to incompletely imaged structures related to deepwater hydrocarbon reservoirs and results are compared to prior manual palinspastic restorations and borehole data. This methodology is also useful for extending structural interpretations into other areas of limited resolution, such as subsalt in addition to extrapolating existing data into seismic data gaps. This technique can be used for rapid reservoir appraisal and potentially have other applications for seismic processing, well planning, and borehole stability analysis.

  3. Seismic hazard assessment of Syria using seismicity, DEM, slope, active tectonic and GIS

    NASA Astrophysics Data System (ADS)

    Ahmad, Raed; Adris, Ahmad; Singh, Ramesh

    2016-07-01

    In the present work, we discuss the use of integrated remote sensing and Geographical Information System (GIS) techniques for the evaluation of seismic hazard areas in Syria. The present study is the first effort to create a seismic hazard map of Syria with the help of GIS. In the proposed approach, we have used ASTER satellite data, digital elevation data (30 m resolution), earthquake data, and active tectonic maps. Many important factors for the evaluation of seismic hazard were identified and corresponding thematic data layers (past earthquake epicenters, active faults, digital elevation model, and slope) were generated. A numerical rating scheme has been developed for spatial data analysis using GIS to rank the parameters to be included in the evaluation of seismic hazard. The resulting earthquake potential map delineates the area into different relative susceptibility classes: high, moderate, low and very low. The potential earthquake map was validated by correlating the obtained classes with the local probability produced using conventional analysis of observed earthquakes. Earthquake data for Syria and peak ground acceleration (PGA) data were then introduced into the model to develop the final seismic hazard map, based on the Gutenberg-Richter parameters (a and b values) and using the concepts of local probability and recurrence time. The application of the proposed technique in the Syrian region indicates that this method provides a good estimate of the seismic hazard compared to maps developed from traditional techniques (deterministic (DSHA) and probabilistic (PSHA) seismic hazard analysis). For the first time, we have used numerous parameters from remote sensing and GIS in the preparation of a seismic hazard map, which is found to be very realistic.
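    The Gutenberg-Richter parameters mentioned above can be estimated from an earthquake catalogue with the standard maximum-likelihood formula for the b-value and the corresponding a-value. The sketch below is a generic illustration with assumed completeness magnitude and bin width; it does not reproduce the parameters derived for Syria.

```python
import numpy as np

def gutenberg_richter(mags, m_c, bin_width=0.1):
    """Maximum-likelihood b-value (Aki, 1965) and corresponding a-value for
    events above the completeness magnitude m_c. Illustrative only."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= m_c]
    b = np.log10(np.e) / (m.mean() - (m_c - bin_width / 2.0))
    a = np.log10(len(m)) + b * m_c           # from log10 N(>=m_c) = a - b*m_c
    return a, b

def annual_rate(a, b, magnitude, years):
    """Mean annual rate of events with M >= magnitude for a catalogue
    spanning `years` years (hypothetical inputs)."""
    return 10 ** (a - b * magnitude) / years
```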

  4. Ray Tracing Methods in Seismic Emission Tomography

    NASA Astrophysics Data System (ADS)

    Chebotareva, I. Ya.

    2018-03-01

    Highly efficient approximate ray tracing techniques which can be used in seismic emission tomography and in other methods requiring a large number of raypaths are described. The techniques are applicable for the gradient and plane-layered velocity sections of the medium and for the models with a complicated geometry of contrasting boundaries. The empirical results obtained with the use of the discussed ray tracing technologies and seismic emission tomography results, as well as the results of numerical modeling, are presented.

  5. Paleobathymetric Reconstruction of Ross Sea: seismic data processing and regional reflectors mapping

    NASA Astrophysics Data System (ADS)

    Olivo, Elisabetta; De Santis, Laura; Wardell, Nigel; Geletti, Riccardo; Busetti, Martina; Sauli, Chiara; Bergamasco, Andrea; Colleoni, Florence; Vanzella, Walter; Sorlien, Christopher; Wilson, Doug; De Conto, Robert; Powell, Ross; Bart, Phil; Luyendyk, Bruce

    2017-04-01

    PURPOSE: New maps of some major unconformities of the Ross Sea have been reconstructed using seismic data grids from new and reprocessed seismic profiles, combined with acoustic velocities from previous works. This work is carried out with the support of PNRA and in the frame of the bilateral Italy-USA project GLAISS (Global Sea Level Rise & Antarctic Ice Sheet Stability predictions), funded by the Ministry of Foreign Affairs. We produce paleobathymetric maps for 30, 14 and 4 million years ago, three 'key moments' in the glacial history of the Antarctic Ice Sheet coinciding with global climatic changes. The paleobathymetric maps will then be used for numerical simulations focused on the width and thickness of the Ross Sea Ice Sheet. PRELIMINARY RESULTS: The first step was to create two-way-time (TWT) maps of three main unconformities (RSU6, RSU4, and RSU2) of the Ross Sea, revisiting and updating the ANTOSTRAT maps. Through the interpretation of sedimentary bodies and erosional features, used to infer active or old processes along the slope, we identified the main seismic unconformities. We used the IHS Kingdom software under an academic license. The different groups contributed the analysis of the eastern Ross Sea continental slope and rise (OGS), of the Central Basin (KOPRI), and of the western and central Ross Sea (Univ. of Santa Barbara and OGS), where new drill sites and seismic profiles were collected after the publication of the ANTOSTRAT maps. Then we merged our interpretations with the previous ones. We examined the previous processing of several seismic lines and all the old acoustic velocity analyses. In addition, we reprocessed some lines in order to obtain higher data coverage. Then, combining the TWT maps of the unconformities with the old and new velocity data, we created new depth maps of the study area. The new depth maps will then be used for reconstructing the paleobathymetry of the Ross Sea by applying the backstripping technique.
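    The depth-conversion step described above combines the two-way-time grids with acoustic velocities. The sketch below shows the simplest possible version, a two-layer (water plus constant sub-seafloor interval velocity) conversion with placeholder velocities; the actual study used the full set of old and new velocity analyses.

```python
import numpy as np

def twt_to_depth(twt_s, v_water=1500.0, v_sediment=2000.0, seafloor_twt_s=None):
    """Convert a two-way-time horizon grid (seconds) to depth (metres) with
    a simple two-layer velocity model: water above the seafloor pick and a
    constant interval velocity below it. Velocities here are placeholders,
    not the values used in the study."""
    twt = np.asarray(twt_s, dtype=float)
    if seafloor_twt_s is None:
        return v_water * twt / 2.0
    sf = np.asarray(seafloor_twt_s, dtype=float)
    water_depth = v_water * sf / 2.0
    sub_seafloor = v_sediment * np.clip(twt - sf, 0.0, None) / 2.0
    return water_depth + sub_seafloor
```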

  6. Damage detection and quantification in a structural model under seismic excitation using time-frequency analysis

    NASA Astrophysics Data System (ADS)

    Chan, Chun-Kai; Loh, Chin-Hsiung; Wu, Tzu-Hsiu

    2015-04-01

    In civil engineering, health monitoring and damage detection are typically carried out using a large number of sensors. Most methods require global measurements to extract the properties of the structure. However, some sensors, such as LVDTs, cannot be used due to in situ limitations, so that the global deformation remains unknown. An experiment is used to demonstrate the proposed algorithms: a one-story, 2-bay reinforced concrete frame under weak and strong seismic excitation. In this paper, signal processing techniques and nonlinear identification are applied to the measured seismic response of reinforced concrete structures subjected to different levels of earthquake excitation. Both modal-based and signal-based system identification and feature extraction techniques are used to study the nonlinear inelastic response of the RC frame, using both input and output response data or output-only measurements. The signal-based damage identification methods include enhanced time-frequency analysis of the acceleration responses and the estimation of permanent deformation directly from acceleration response data. Finally, local deformation measurements from a dense optical tracker are also used to quantify the damage of the RC frame structure.

  7. Observations of coupled seismicity and ground deformation at El Hierro Island (2011-2014)

    NASA Astrophysics Data System (ADS)

    Gonzalez, P. J.

    2015-12-01

    New insights into magma storage and evolution at oceanic island volcanoes are now being achieved using remotely sensed space geodetic techniques, namely satellite radar interferometry. Differential radar interferometry is a technique that tracks, at high spatial resolution, changes in the travel time (distance) from the satellite to the ground surface, and it has wide applications in the Earth sciences. Volcanic activity is usually accompanied by surface ground deformation. In many instances, modelling of surface deformation has the great advantage of allowing the magma volume change to be estimated, a particularly interesting parameter prior to eruptions. Jointly interpreted with petrology, degassing and seismicity, it helps in understanding crustal magmatic systems as a whole. Current (and near-future) radar satellite missions will reduce the revisit time over global sub-aerial volcanoes to a sub-weekly basis, which will increase the potential for their operational use. Time-series and filtering techniques applied to such streaming data would allow subsurface magma migration to be tracked with high precision and frequently updated over vast areas (volcanic arcs, large caldera systems, etc.). As an example of this potential future monitoring scenario, we analyze multiple satellite radar data sets over El Hierro Island (Canary Islands, Spain) to measure and model surface ground deformation. El Hierro has been active for more than 3 years (2011 to 2014). Initial phases of the unrest culminated in a submarine eruption (late 2011 - early 2012). However, after the submarine eruption ended, the magmatic system remained active and was affected by quasi-regular, energetic seismic swarms, accompanied by surface deformation without renewed eruptions. Such an example is a great opportunity to understand crustal magmatic systems in oceanic island volcanoes with low magma supply rates. This new approach to measuring surface deformation processes is yielding an ever richer level of information, from volcanology to engineering and meteorological monitoring problems.

  8. Recent developments in seismic seabed oil reservoir monitoring applications using fibre-optic sensing networks

    NASA Astrophysics Data System (ADS)

    De Freitas, J. M.

    2011-05-01

    This review looks at recent developments in seismic seabed oil reservoir monitoring techniques using fibre-optic sensing networks. After a brief introduction covering the background and scope of the review, the following section focuses on state-of-the-art fibre-optic hydrophones and accelerometers used for seismic applications. Related metrology aspects of the sensor such as measurement of sensitivity, noise and cross-axis performance are addressed. The third section focuses on interrogation systems. Two main phase-based competing systems have emerged over the past two decades for seismic applications, with a third technique showing much promise; these have been compared in terms of general performance.

  9. Passive monitoring for near surface void detection using traffic as a seismic source

    NASA Astrophysics Data System (ADS)

    Zhao, Y.; Kuzma, H. A.; Rector, J.; Nazari, S.

    2009-12-01

    In this poster we present preliminary results based on several field experiments in which we study seismic detection of voids using a passive array of surface geophones. The source of seismic excitation is vehicle traffic on nearby roads, which we model as a continuous line source of seismic energy. Our passive seismic technique is based on cross-correlation of surface wave fields and study of the resulting power spectra, looking for "shadows" caused by the scattering effect of a void. High-frequency noise masks this effect in the time domain, so it is difficult to see on conventional traces. Our technique does not rely on phase distortions caused by small voids because they are generally too tiny to measure. Unlike traditional impulsive seismic sources, which generate highly coherent broadband signals, perfect for resolving phase but too weak for resolving amplitude, vehicle traffic affords a high-power signal in a frequency range which is optimal for finding shallow structures. Our technique results in clear detections of an abandoned railroad tunnel and a septic tank. The ultimate goal of this project is to develop a technology for the simultaneous imaging of shallow underground structures and traffic monitoring near these structures.

  10. Seismic Hazard Analysis as a Controlling Technique of Induced Seismicity in Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Convertito, V.; Sharma, N.; Maercklin, N.; Emolo, A.; Zollo, A.

    2011-12-01

    The effects of induced seismicity in geothermal systems during stimulation and fluid circulation can cover a wide range, from light and unfelt to severe and damaging. If a modern geothermal system is to be operated with the greatest efficiency and remain acceptable from the social point of view, it must be possible to manage the system so as to reduce possible impacts in advance. In this framework, automatic control of the seismic response of the stimulated reservoir is nowadays mandatory, particularly in proximity to densely populated areas. Recently, techniques have been proposed for this purpose, mainly based on the traffic-light concept. This system provides a tool to decide the level of stimulation rate based on the real-time analysis of the induced seismicity and the ongoing ground motion values. However, in some cases the induced effect can be delayed with respect to the time when the reservoir is stimulated. Thus, a controlling technique able to estimate ground motion levels over different time scales can help to better control the geothermal system. Here we present an adaptation of the classical probabilistic seismic hazard analysis to the case where the seismicity rate as well as the propagation medium properties are not constant with time. We use a non-homogeneous seismicity model for modeling purposes, in which the seismicity rate and b-value of the recurrence relationship change with time. Additionally, as a further controlling procedure, we propose a moving time window analysis of the recorded peak ground-motion values aimed at monitoring the changes in the propagation medium. In fact, for the same set of magnitude values recorded at the same stations, we expect that on average peak ground motion values attenuate in the same way. As a consequence, the residual differences can be reasonably ascribed to changes in medium properties. These changes can be modeled and directly introduced in the hazard integral. We applied the proposed technique to a training dataset of induced earthquakes recorded by the Berkeley-Geysers network, which is installed in The Geysers geothermal area in Northern California. The reliability of the technique is then tested using a different dataset, performing seismic hazard analysis in a time-evolving approach that provides ground-motion values having fixed probabilities of exceedance. Those values can be finally compared with the observations by using appropriate statistical tests.
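    Once the time-dependent rate of exceedance of a chosen ground-motion level has been computed from the hazard integral, the probability of exceedance in a forecast window follows from the Poisson assumption. The sketch below illustrates only that last step, with assumed rate and window values.

```python
import numpy as np

def exceedance_probability(rate_of_exceedance, window_days):
    """Probability of at least one exceedance of a ground-motion level in a
    forecast window, assuming a Poisson process whose rate has been updated
    from the current induced-seismicity rate and b-value. Sketch of the
    time-evolving use of the hazard results described above."""
    lam = rate_of_exceedance * window_days   # expected number of exceedances
    return 1.0 - np.exp(-lam)

# e.g. an assumed 0.002 exceedances/day of the chosen PGA level, 30-day window
print(round(exceedance_probability(0.002, 30), 3))
```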

  11. Seismo-acoustic analysis of the near quarry blasts using Plostina small aperture array

    NASA Astrophysics Data System (ADS)

    Ghica, Daniela; Stancu, Iulian; Ionescu, Constantin

    2013-04-01

    Seismic and acoustic signals are important for recognizing different types of industrial blasting sources and for discriminating between them and natural earthquakes. We have analyzed the seismic events listed in the Romanian catalogue (Romplus) that occurred in the Dobrogea region between 2011 and 2012, in order to assess the detection of seismo-acoustic signals from quarry blasts by the Plostina array stations. Dobrogea is known as a seismic region characterized by crustal earthquakes with low magnitudes; at the same time, over 40 quarry mines are located in the area and are sources of blasts recorded by both the seismic and infrasound sensors of the Romanian Seismic Network. The Plostina seismo-acoustic array, deployed in the central part of Romania, consists of 7 seismic sites (3C broad-band instruments and accelerometers) collocated with 7 infrasound instruments. The array is particularly used for the seismic monitoring of local and regional events, as well as for the detection of infrasonic signals produced by various sources. Considering the characteristics of the infrasound sensors (frequency range, dynamic range, sensitivity), the array has proved efficient in observing signals produced by explosions, mine blasts and quarry blasts. The quarry mines included in this study lie at distances of up to two hundred kilometers from the array and routinely generate explosions that are detected as seismic and infrasonic signals with the Plostina array. The combined seismo-acoustic analysis uses two types of detectors for signal identification: one, applied for seismic signal identification, is based on array processing techniques (beamforming and frequency-wavenumber analysis), while the other, used for infrasound detection and characterization, is the automatic detector DFX-PMCC (Progressive Multi-Channel Correlation Method). Infrasonic waves generated by quarry blasts have frequencies ranging from 0.05 Hz up to at least 6 Hz and amplitudes below 5 Pa. Seismic data analysis shows that the frequency content of the signals is above 2 Hz. Surface explosions such as quarry blasts are useful sources for checking detection and location efficiency when seismic measurements are added. The process is crucial for discrimination purposes and for establishing a set of ground-truth infrasound events. Ground-truth information plays a key role in the interpretation of infrasound signals, by including near-field observations from industrial blasts.

  12. Signal Quality and the Reliability of Seismic Observations

    NASA Astrophysics Data System (ADS)

    Zeiler, C. P.; Velasco, A. A.; Pingitore, N. E.

    2009-12-01

    The ability to detect, time and measure seismic phases depends on the location, size, and quality of the recorded signals. Additional constraints are an analyst’s familiarity with a seismogenic zone and with the seismic stations that record the energy. Quantification and qualification of an analyst’s ability to detect, time and measure seismic signals have not been fully assessed. The fundamental measurement for computing the accuracy of a seismic measurement is the signal quality. Several methods have been proposed to measure signal quality; the most widely adopted is the signal-to-noise ratio (SNR), computed as a short-term average over a long-term average (STA/LTA). While the standard SNR is an easy and computationally inexpensive measure, its overall statistical significance for seismic measurement analysis has not been established. The prospect of standardizing the process of cataloging seismic arrivals hinges on the ability to repeat measurements made by different methods and analysts. The first step in standardizing phase measurements has been taken by IASPEI, which established a reference for accepted practices in naming seismic phases. The New Manual for Seismological Observatory Practices (NMSOP, 2002) outlines key observations for seismic phases recorded at different distances and proposes to quantify timing uncertainty with a user-specified windowing technique. However, this added measurement would not completely remove bias introduced by the different techniques used by analysts to time seismic arrivals. The general guideline for timing a seismic arrival is to record the time where a noted change in frequency and/or amplitude begins. This is generally achieved by enhancing the arrivals through filtering or beam forming. However, these enhancements can alter the characteristics of the arrival and how the arrival will be measured. Furthermore, each enhancement has user-specified parameters that can vary between analysts, which reduces the ability to repeat measurements between analysts. The SPEAR project (Zeiler and Velasco, 2009) has started to explore the effects of comparing measurements from the same seismograms. Initial results showed that experience and signal quality are the leading contributors to pick differences. However, the traditional SNR method of measuring signal quality was replaced by a Wide-band Spectral Ratio (WSR) due to a decrease in scatter. This observation raises the important question of how best to measure signal quality. We compare various methods (traditional SNR, WSR, power spectral density plots, Allan Variance) that have been proposed to measure signal quality and discuss which method provides the best tool to compare arrival-time uncertainty.
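
    The STA/LTA-style SNR referred to above can be written in a few lines. The sketch below is a generic illustration (the window lengths, sampling rate and synthetic trace are assumptions, not values from the SPEAR project): it evaluates the ratio of short-term signal energy after a pick to the long-term noise energy before it.

```python
import numpy as np

def sta_lta_snr(trace, fs, pick_idx, sta_win=1.0, lta_win=10.0):
    """Signal quality at a pick: short-term average energy after the pick
    divided by the long-term average energy of the preceding noise."""
    sta_n = int(sta_win * fs)
    lta_n = int(lta_win * fs)
    signal = trace[pick_idx:pick_idx + sta_n]
    noise = trace[max(0, pick_idx - lta_n):pick_idx]
    return np.mean(signal ** 2) / np.mean(noise ** 2)

# Hypothetical example: a noisy trace with an "arrival" at 30 s, fs = 100 Hz.
fs = 100.0
rng = np.random.default_rng(1)
trace = rng.standard_normal(int(60 * fs))
pick = int(30 * fs)
trace[pick:pick + 200] += 5.0 * rng.standard_normal(200)   # emulate an arrival
print(f"STA/LTA SNR at the pick: {sta_lta_snr(trace, fs, pick):.1f}")
```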

  13. Prospective testing of neo-deterministic seismic hazard scenarios for the Italian territory

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Vaccari, Franco; Kossobokov, Vladimir; Panza, Giuliano F.

    2013-04-01

    A reliable and comprehensive characterization of expected seismic ground shaking, including, where possible, the related time information, is essential in order to develop effective mitigation strategies and increase earthquake preparedness. Moreover, any effective tool for SHA must demonstrate its capability in anticipating the ground shaking related to large earthquake occurrences, a result that can be attained only through a rigorous verification and validation process. So far, the major problems in classical probabilistic methods for seismic hazard assessment, PSHA, have been the adequate description of earthquake recurrence, particularly for the largest and sporadic events, and of the attenuation models, which may be unable to account for the complexity of the medium and of the seismic sources and are often weakly constrained by the available observations. Current computational resources and physical knowledge of seismic wave generation and propagation processes now allow for viable numerical and analytical alternatives to the use of attenuation relations. Accordingly, a scenario-based neo-deterministic approach to seismic hazard assessment, NDSHA, has been proposed, which allows a wide range of possible seismic sources to be considered as the starting point for deriving scenarios by means of full waveform modeling. The method does not make use of attenuation relations and naturally supplies realistic time series of ground shaking, including reliable estimates of ground displacement readily applicable to seismic isolation techniques. Based on NDSHA, an operational integrated procedure for seismic hazard assessment has been developed that allows for the definition of time-dependent scenarios of ground shaking through the routine updating of formally defined earthquake predictions. The integrated NDSHA procedure for seismic input definition, which is currently applied to the Italian territory, combines different pattern recognition techniques, designed for the space-time identification of strong earthquakes, with algorithms for the realistic modeling of ground motion. Accordingly, a set of deterministic scenarios of ground motion at bedrock, which refers to the time interval when a strong event is likely to occur within the alerted area, can be defined by means of full waveform modeling, both at regional and local scale. CN and M8S predictions, as well as the related time-dependent ground motion scenarios associated with the alarmed areas, have been regularly updated every two months since 2006. The routine application of the time-dependent NDSHA approach provides information that can be useful in assigning priorities for timely mitigation actions and, at the same time, allows for rigorous prospective testing and validation of the proposed methodology. As an example, for sites where ground shaking values greater than 0.2 g are estimated at bedrock, further investigations can be performed taking into account the local soil conditions, to assess the performance of relevant structures, such as historical and strategic buildings. The issues related to prospective testing and validation of the time-dependent NDSHA scenarios will be discussed, illustrating the results obtained for the recent strong earthquakes in Italy, including the May 20, 2012 Emilia earthquake.

  14. Coherent Waves in Seismic Researches

    NASA Astrophysics Data System (ADS)

    Emanov, A.; Seleznev, V. S.

    2013-05-01

    The development of digital algorithms for processing seismic wave fields, aimed at picking useful events in order to study the subsurface and other objects, is the basis for establishing new seismic techniques. The paper relies on a fundamental property of seismic wave fields: coherence. The authors extend the notion of coherence types in observed wave fields and devise a technique for selecting coherent components from an observed wave field. Time coherence and space coherence are widely known; here the concept of "parameter coherence" is added. The parameter with respect to which a wave field is coherent can be of many kinds, because the wave field is a multivariate process described by a set of parameters; coherence here refers, first of all, to a linear connection within the wave field with respect to a parameter. Time-coherent standing waves are formed in seismic wave fields recorded in confined spaces, in buildings and in stratified media. In prospecting seismology, with observation systems with multiple overlapping, head waves are coherent along the correlation direction, in other words with respect to one coordinate on the generalized plane of the observation system. For detailed prospecting seismology with such observation systems, algorithms based on this coherence property have been developed that convert seismic records into head-wave time sections containing neither reflected nor other types of waves. The conversion into a time section can be executed on any specified observation base. Energy stacking of head waves relative to noise, exploiting the multiplicity of the observation system, is achieved within the area of head-wave recording; conversion on a base smaller than the area of wave tracking incurs a loss of signal-to-noise ratio relative to the maximum that the observation system allows. The construction of head-wave time sections and dynamic plots forms the basis of an automatic processing scheme, similar to the CDP procedure of the reflected-wave method. Using the developed algorithms for converting head waves into time sections, studies of refracting boundaries in Siberia have been carried out. Beyond refraction studies proper, the conversion of head waves into time sections, applied to seismograms of the reflected-wave method, makes it possible to obtain information about refracting horizons in the upper part of the section in addition to the reflecting-horizon data. The recovery of coherent wave-field components is also the basis of engineering seismology at this level of accuracy and detail. In seismic microzoning, resonance frequencies of the upper part of the section are determined with this method, and maps of oscillation amplification and of result accuracy are constructed for each frequency. The same method makes it possible to study the standing-wave field in buildings and constructions with high accuracy and detail, providing diagnostics of their physical state from the set of natural frequencies and modes of self-oscillation examined in high detail. The standing-wave method permits the seismic stability of a structure to be estimated at a new level of accuracy.

  15. Full waveform inversion of combined towed streamer and limited OBS seismic data: a theoretical study

    NASA Astrophysics Data System (ADS)

    Yang, Huachen; Zhang, Jianzhong

    2018-06-01

    In marine seismic oil exploration, full waveform inversion (FWI) of towed-streamer data is used to reconstruct velocity models. However, FWI of towed-streamer data easily converges to a local-minimum solution due to the lack of low-frequency content. In this paper, we propose a new FWI technique using towed-streamer data, its integrated data sets and limited OBS data. Both the integrated towed-streamer seismic data and the OBS data have low-frequency components. Therefore, at early iterations of the new FWI technique, the OBS data combined with the integrated towed-streamer data sets reconstruct an appropriate background model, while the towed-streamer seismic data play the major role in later iterations to improve the resolution of the model. The new FWI technique is tested on numerical examples. The results show that when starting models are not accurate enough, the models inverted using the new FWI technique are superior to those inverted using conventional FWI.

  16. (Multi)fractality of Earthquakes by use of Wavelet Analysis

    NASA Astrophysics Data System (ADS)

    Enescu, B.; Ito, K.; Struzik, Z. R.

    2002-12-01

    The fractal character of earthquake occurrence, in time, space or energy, has by now been established beyond doubt and is in agreement with modern models of seismicity. Moreover, the cascade-like generation process of earthquakes, with one "main" shock followed by many aftershocks that have their own aftershocks, may well be described through multifractal analysis, which is well suited for dealing with such multiplicative processes. The (multi)fractal character of seismicity has been analysed so far by using traditional techniques, like the box-counting and correlation-function algorithms. This work introduces a new approach for characterising the multifractal patterns of seismicity. The application of wavelet analysis, in particular of the wavelet transform modulus maxima, to multifractal analysis was pioneered by Arneodo et al. (1991, 1995) and applied successfully in diverse fields, such as the study of turbulence, DNA sequences and heart-rate dynamics. The wavelets act like a microscope, revealing details about the analysed data at different times and scales. We introduce and perform such an analysis on the occurrence times of earthquakes and show its advantages. In particular, we analyse shallow seismicity, characterised by a high aftershock "productivity", as well as intermediate and deep seismic activity, known for its scarcity of aftershocks. We also examine declustered (aftershocks removed) versions of the seismic catalogues. Our preliminary results show some degree of multifractality for the undeclustered, shallow seismicity. On the other hand, at large scales, we detect a monofractal scaling behaviour, most clearly evident for the declustered, shallow seismic activity. Moreover, some of the declustered sequences show a long-range dependent (LRD) behaviour, characterised by a Hurst exponent H > 0.5, in contrast with the memory-less, Poissonian model. We demonstrate that the LRD is a genuine characteristic and is not an effect of the time-series probability distribution function. One of the most attractive features of wavelet analysis is its ability to determine a local Hurst exponent. We show that this feature, together with the possibility of extending the analysis to spatial patterns, may constitute a valuable approach to search for anomalous (precursory?) patterns of seismic activity.
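
    A Hurst-type exponent of the kind mentioned above can be estimated from the scaling of wavelet-detail variances across dyadic scales. The sketch below is a generic wavelet-variance (Abry-Veitch style) estimator, not the WTMM formalism used in the study; it assumes the PyWavelets package and synthetic inter-event times.

```python
import numpy as np
import pywt

def hurst_wavelet(x, wavelet='db2', max_level=None):
    """Estimate a Hurst-type scaling exponent from the scaling of the
    wavelet-detail variance across dyadic levels.  For fractional Gaussian
    noise, Var(d_j) ~ 2**(j*(2H - 1)), so the slope of log2(variance)
    versus level j gives H = (slope + 1) / 2."""
    max_level = max_level or pywt.dwt_max_level(len(x), pywt.Wavelet(wavelet).dec_len)
    coeffs = pywt.wavedec(x, wavelet, level=max_level)
    details = coeffs[1:][::-1]                 # order from level 1 (finest) upward
    levels = np.arange(1, len(details) + 1)
    log_var = np.array([np.log2(np.var(d)) for d in details])
    slope, _ = np.polyfit(levels, log_var, 1)
    return (slope + 1.0) / 2.0

# Illustrative use on synthetic inter-event times (white noise, so H ~ 0.5).
rng = np.random.default_rng(2)
inter_event_times = rng.exponential(scale=1.0, size=4096)
print(f"Estimated H = {hurst_wavelet(inter_event_times):.2f}")
```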

  17. Seismic site-response characterization of high-velocity sites using advanced geophysical techniques: application to the NAGRA-Net

    NASA Astrophysics Data System (ADS)

    Poggi, V.; Burjanek, J.; Michel, C.; Fäh, D.

    2017-08-01

    The Swiss Seismological Service (SED) has recently finalised the installation of ten new seismological broadband stations in northern Switzerland. The project was led in cooperation with the National Cooperative for the Disposal of Radioactive Waste (Nagra) and Swissnuclear to monitor microseismicity at potential locations of nuclear-waste repositories. To further improve the quality and usability of the seismic recordings, an extensive characterization of the sites surrounding the installation area was performed following a standardised investigation protocol. State-of-the-art geophysical techniques have been used, including advanced active and passive seismic methods. The results of all analyses converged to the definition of a set of best-representative 1-D velocity profiles for each site, which are the input for the computation of engineering soil proxies (traveltime-averaged velocity and quarter-wavelength parameters) and numerical amplification models. The computed site response is then validated through comparison with empirical site amplification, which is currently available for any station connected to the Swiss seismic networks. With the goal of a high-sensitivity network, most of the NAGRA stations have been installed on stiff-soil sites of rather high seismic velocity. Seismic characterization of such sites has always been considered challenging, due to the lack of a significant velocity contrast and the large wavelengths required to investigate the frequency range of engineering interest. We describe how ambient vibration techniques can successfully be applied in these particular conditions, providing practical recommendations for best practice in seismic site characterization of high-velocity sites.
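
    The traveltime-averaged velocity proxy mentioned above has a simple closed form: the velocity down to depth z is z divided by the summed vertical travel time through the layers above z. A minimal sketch follows; the layered profile in the example is purely illustrative and not one of the NAGRA sites.

```python
def travel_time_average_vs(thicknesses, velocities, depth=30.0):
    """Traveltime-averaged shear-wave velocity down to `depth` (e.g. Vs30):
    Vs_z = z / sum(h_i / Vs_i), truncating the last layer at the target depth."""
    total_time, covered = 0.0, 0.0
    for h, v in zip(thicknesses, velocities):
        h_used = min(h, depth - covered)
        total_time += h_used / v
        covered += h_used
        if covered >= depth:
            break
    if covered < depth:                       # half-space extends to the target depth
        total_time += (depth - covered) / velocities[-1]
    return depth / total_time

# Illustrative stiff-soil profile (thicknesses in m, Vs in m/s).
print(f"Vs30 = {travel_time_average_vs([5, 10, 40], [400, 800, 1500]):.0f} m/s")
```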

  18. Automated detection and characterization of harmonic tremor in continuous seismic data

    NASA Astrophysics Data System (ADS)

    Roman, Diana C.

    2017-06-01

    Harmonic tremor is a common feature of volcanic, hydrothermal, and ice sheet seismicity and is thus an important proxy for monitoring changes in these systems. However, no automated methods for detecting harmonic tremor currently exist. Because harmonic tremor shares characteristics with speech and music, digital signal processing techniques for analyzing these signals can be adapted. I develop a novel pitch-detection-based algorithm to automatically identify occurrences of harmonic tremor and characterize their frequency content. The algorithm is applied to seismic data from Popocatepetl Volcano, Mexico, and benchmarked against a monthlong manually detected catalog of harmonic tremor events. During a period of heightened eruptive activity from December 2014 to May 2015, the algorithm detects 1465 min of harmonic tremor, which generally precede periods of heightened explosive activity. These results demonstrate the algorithm's ability to accurately characterize harmonic tremor while highlighting the need for additional work to understand its causes and implications at restless volcanoes.
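
    The published detector is pitch-detection based; as a generic illustration of that idea (not a reproduction of the author's algorithm or parameters), the sketch below estimates the fundamental frequency of one window of a trace with a harmonic product spectrum, assuming scipy is available and using a synthetic tremor signal.

```python
import numpy as np
from scipy.signal import welch

def tremor_fundamental(trace, fs, n_harmonics=4, fmin=0.5, fmax=10.0):
    """Estimate the fundamental frequency of harmonic tremor in one window
    using a harmonic product spectrum: multiply the spectrum by its own
    downsampled copies so that only a frequency whose harmonics are all
    present keeps a large product."""
    freqs, psd = welch(trace, fs=fs, nperseg=min(len(trace), 4096))
    hps = psd.copy()
    for k in range(2, n_harmonics + 1):
        decimated = psd[::k]
        hps[:len(decimated)] *= decimated
    band = (freqs >= fmin) & (freqs <= fmax)
    return freqs[band][np.argmax(hps[band])]

# Hypothetical example: a synthetic tremor with a 1.2 Hz fundamental plus noise.
fs = 100.0
t = np.arange(0, 120, 1 / fs)
rng = np.random.default_rng(3)
trace = sum(np.sin(2 * np.pi * k * 1.2 * t) / k for k in range(1, 5))
trace += 0.5 * rng.standard_normal(t.size)
print(f"Estimated fundamental: {tremor_fundamental(trace, fs):.2f} Hz")
```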

  19. Instantaneous phase estimation to measure weak velocity variations: application to noise correlation on seismic data at the exploration scale

    NASA Astrophysics Data System (ADS)

    Corciulo, M.; Roux, P.; Campillo, M.; Dubucq, D.

    2010-12-01

    Passive imaging from noise cross-correlation is a well-established analysis at continental and regional scales, whereas its use at the local scale for seismic exploration purposes is still uncertain. The development of passive imaging by cross-correlation analysis is based on the extraction of the Green's function from seismic noise data. For a field that is completely random in time and space, cross-correlation allows retrieval of the complete Green's function whatever the complexity of the medium. At the exploration scale and at frequencies above 2 Hz, however, the noise sources are not ideally distributed around the stations, which strongly affects the extraction of the direct arrivals from the noise cross-correlation process. To overcome this problem, the coda waves extracted from noise correlation can be useful. Coda waves follow long, scattered paths that sample the medium in different ways, so they become sensitive to weak velocity variations without being dependent on the noise-source distribution. Indeed, scatterers in the medium behave as a set of secondary noise sources, which randomizes the spatial distribution of the noise sources contributing to the coda waves in the correlation process. We developed a new technique to measure weak velocity changes based on the computation of the local phase variations (instantaneous phase variation, or IPV) of the cross-correlated signals. This newly developed technique builds on the doublet and stretching techniques classically used to monitor weak velocity variations from coda waves. We apply IPV to data acquired in Northern America (Canada) on a 1-km-side square seismic network consisting of 397 stations. The data used to study temporal variations are cross-correlated signals computed on 10 minutes of ambient noise in the 2-5 Hz frequency band. As the data set was acquired over five days, about 660 files are processed to perform a complete temporal analysis for each station pair. IPV allows the phase shift to be estimated over the whole signal length without any assumption on the medium velocity. The instantaneous phase is computed using the Hilbert transform of the signal. For each station pair, we measure the phase difference between successive correlation functions calculated for 10 minutes of ambient noise. We then fit the instantaneous phase shift with a first-order polynomial function; the measured velocity variation corresponds to the slope of this fit. Compared to other techniques, the advantage of IPV is that it is a very fast procedure that efficiently provides velocity-variation measurements on large data sets. Both experimental results and numerical tests on synthetic signals will be presented to assess the reliability of the IPV technique, with comparison to the doublet and stretching methods.
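
    A minimal sketch of the instantaneous-phase measurement described above, assuming scipy: the unwrapped phase difference between a current and a reference correlation function is computed with the Hilbert transform and fitted with a first-order polynomial, whose slope is proportional to the relative velocity change. The dominant frequency, sampling rate and synthetic stretch used here are illustrative assumptions, not values from the experiment.

```python
import numpy as np
from scipy.signal import hilbert

def ipv_dv_over_v(ref, cur, fs, f_dominant):
    """Instantaneous-phase variation (IPV) estimate of a relative velocity change.

    For a homogeneous velocity change the unwrapped phase difference between
    the current and reference correlation functions grows roughly linearly
    with lapse time, dphi(t) ~ -2*pi*f_dominant*(dv/v)*t, so the slope of a
    first-order polynomial fit gives dv/v."""
    phase_ref = np.unwrap(np.angle(hilbert(ref)))
    phase_cur = np.unwrap(np.angle(hilbert(cur)))
    dphi = phase_cur - phase_ref
    t = np.arange(len(ref)) / fs
    slope, _ = np.polyfit(t, dphi, 1)
    return -slope / (2 * np.pi * f_dominant)

# Synthetic check: stretch a reference "coda" by 0.5 % (i.e. dv/v = +0.5 %).
fs, f0 = 100.0, 3.0
t = np.arange(0, 30, 1 / fs)
ref = np.sin(2 * np.pi * f0 * t) * np.exp(-t / 15.0)
cur = np.sin(2 * np.pi * f0 * t * (1 - 0.005)) * np.exp(-t / 15.0)
print(f"Recovered dv/v = {100 * ipv_dv_over_v(ref, cur, fs, f0):.2f} %")
```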

  20. Using Seismic Interferometry to Investigate Seismic Swarms

    NASA Astrophysics Data System (ADS)

    Matzel, E.; Morency, C.; Templeton, D. C.

    2017-12-01

    Seismicity provides a direct means of measuring the physical characteristics of active tectonic features such as fault zones. Hundreds of small earthquakes often occur along a fault during a seismic swarm, and this seismicity helps define the tectonically active region. When processed using novel geophysical techniques, we can isolate the energy sensitive to the fault itself. Here we focus on two methods of seismic interferometry, ambient noise correlation (ANC) and the virtual seismometer method (VSM). ANC is based on the observation that the Earth's background noise includes coherent energy, which can be recovered by observing over long time periods and allowing the incoherent energy to cancel out. The cross-correlation of ambient noise between a pair of stations results in a waveform that is identical to the seismogram that would result if an impulsive source located at one of the stations was recorded at the other, the Green function (GF). The calculation of the GF is often stable after a few weeks of continuous data correlation; any perturbations to the GF after that point are directly related to changes in the subsurface and can be used for 4D monitoring. VSM is a style of seismic interferometry that provides fast, precise, high-frequency estimates of the GF between earthquakes. VSM illuminates the subsurface precisely where the pressures are changing and has the potential to image the evolution of seismicity over time, including changes in the style of faulting. With hundreds of earthquakes, we can calculate thousands of waveforms. At the same time, VSM collapses the computational domain, often by 2-3 orders of magnitude, which allows us to do high-frequency 3D modeling in the fault region. Using data from a swarm of earthquakes near the Salton Sea, we demonstrate the power of these techniques, illustrating our ability to scale from the far field, where sources are well separated, to the near field, where their locations fall within each other's uncertainty ellipse. We use ANC to create a 3D model of the crust in the region; VSM provides better illumination of the active fault zone. Measures of amplitude and shape are used to refine source properties and locations in space, and waveform modeling allows us to estimate near-fault seismic structure.
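
    A compact sketch of the ANC step described above, with synthetic "noise" and illustrative window lengths (none of this is the authors' processing chain): each window is spectrally whitened, the two stations are cross-correlated, and the windows are stacked so that the stack approaches the inter-station Green function.

```python
import numpy as np
from scipy.signal import correlate

def whiten(x, eps=1e-10):
    """Spectral whitening: keep the phase at each frequency, flatten the amplitude."""
    X = np.fft.rfft(x)
    return np.fft.irfft(X / (np.abs(X) + eps), n=len(x))

def noise_correlation_stack(sta1, sta2, fs, win_sec=600.0, max_lag_sec=30.0):
    """Stack cross-correlations of whitened noise windows from two stations;
    the stack converges toward the inter-station Green function."""
    n_win, n_lag = int(win_sec * fs), int(max_lag_sec * fs)
    stack = np.zeros(2 * n_lag + 1)
    n_used = 0
    for start in range(0, len(sta1) - n_win + 1, n_win):
        a = whiten(sta1[start:start + n_win])
        b = whiten(sta2[start:start + n_win])
        full = correlate(a, b, mode='full', method='fft')
        mid = n_win - 1                                   # zero-lag index
        stack += full[mid - n_lag:mid + n_lag + 1]
        n_used += 1
    return stack / max(n_used, 1)

# Hypothetical example: two hours of 20 Hz noise at a station pair; station 2
# records the coherent part of the field with a 2 s delay.
fs = 20.0
rng = np.random.default_rng(4)
common = rng.standard_normal(int(7200 * fs))
sta1 = common + 0.5 * rng.standard_normal(common.size)
sta2 = np.roll(common, int(2 * fs)) + 0.5 * rng.standard_normal(common.size)
ncf = noise_correlation_stack(sta1, sta2, fs)
lags = np.arange(-int(30 * fs), int(30 * fs) + 1) / fs
print(f"peak at lag {lags[np.argmax(np.abs(ncf))]:.2f} s")   # about +/- 2 s
```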

  1. Evaluation of seismic testing for quality assurance of lime-stabilized soil.

    DOT National Transportation Integrated Search

    2013-08-01

    This study sought to determine the technical feasibility of using seismic techniques to measure the laboratory and field seismic modulus of lime-stabilized soils (LSS), and to compare/correlate test results from bench-top (free-free resonance) se...

  2. A simple algorithm for sequentially incorporating gravity observations in seismic traveltime tomography

    USGS Publications Warehouse

    Parsons, T.; Blakely, R.J.; Brocher, T.M.

    2001-01-01

    The geologic structure of the Earth's upper crust can be revealed by modeling variation in seismic arrival times and in potential field measurements. We demonstrate a simple method for sequentially satisfying seismic traveltime and observed gravity residuals in an iterative 3-D inversion. The algorithm is portable to any seismic analysis method that uses a gridded representation of velocity structure. Our technique calculates the gravity anomaly resulting from a velocity model by converting to density with Gardner's rule. The residual between calculated and observed gravity is minimized by weighted adjustments to the model velocity-depth gradient where the gradient is steepest and where seismic coverage is least. The adjustments are scaled by the sign and magnitude of the gravity residuals, and a smoothing step is performed to minimize vertical streaking. The adjusted model is then used as a starting model in the next seismic traveltime iteration. The process is repeated until one velocity model can simultaneously satisfy both the gravity anomaly and seismic traveltime observations within acceptable misfits. We test our algorithm with data gathered in the Puget Lowland of Washington state, USA (Seismic Hazards Investigation in Puget Sound [SHIPS] experiment). We perform resolution tests with synthetic traveltime and gravity observations calculated with a checkerboard velocity model using the SHIPS experiment geometry, and show that the addition of gravity significantly enhances resolution. We calculate a new velocity model for the region using SHIPS traveltimes and observed gravity, and show examples where correlation between surface geology and modeled subsurface velocity structure is enhanced.
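
    A hedged sketch of the velocity-to-density step named above: Gardner's rule converts P-wave velocity to density, and, as a gross simplification of the full 3-D gravity forward calculation used by the authors, an infinite-slab (Bouguer) term turns the density contrast of each model cell into a surface gravity contribution. The column values are illustrative only.

```python
import numpy as np

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
MGAL = 1e5             # conversion from m/s^2 to mGal

def gardner_density(vp_mps):
    """Gardner's rule: density in g/cc from P-wave velocity in m/s."""
    return 0.31 * vp_mps ** 0.25

def slab_gravity_anomaly(vp_column, dz, vp_ref_column):
    """Very rough forward gravity for one model column: convert velocity to
    density with Gardner's rule and sum infinite-slab (Bouguer) contributions
    of the density contrast with a reference column, cell by cell."""
    drho_kg = (gardner_density(vp_column) - gardner_density(vp_ref_column)) * 1000.0
    return np.sum(2.0 * np.pi * G * drho_kg * dz) * MGAL

# Illustrative column: a 500 m low-velocity basin (4000 -> 3000 m/s) in the top cells.
dz = 250.0                                        # cell thickness, m
vp_ref = np.full(8, 4000.0)                       # reference column, m/s
vp_mod = vp_ref.copy()
vp_mod[:2] = 3000.0                               # slower, lighter basin fill
print(f"Predicted anomaly: {slab_gravity_anomaly(vp_mod, dz, vp_ref):.1f} mGal")
```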

  3. Seismicity within a propagating ice shelf rift: the relationship between icequake locations and ice shelf structure

    USGS Publications Warehouse

    Heeszel, David S.; Fricker, Helen A.; Bassis, Jeremy N.; O'Neel, Shad; Walter, Fabian

    2014-01-01

    Iceberg calving is a dominant mass loss mechanism for Antarctic ice shelves, second only to basal melting. An important known process involved in calving is the initiation and propagation of through-penetrating fractures called rifts; however, the mechanisms controlling rift propagation remain poorly understood. To investigate the mechanics of ice-shelf rifting, we analyzed seismicity associated with a propagating rift tip on the Amery Ice Shelf, using data collected during the Austral summers of 2004-2007. We investigated seismicity associated with fracture propagation using a suite of passive seismological techniques including icequake locations, back projection, and moment tensor inversion. We confirm previous results that show that seismicity is characterized by periods of relative quiescence punctuated by swarms of intense seismicity of one to three hours. However, even during periods of quiescence, we find significant seismic deformation around the rift tip. Moment tensors, calculated for a subset of the largest icequakes (MW > -2.0) located near the rift tip, show steeply dipping fault planes, horizontal or shallowly plunging stress orientations, and often have a significant volumetric component. They also reveal that much of the observed seismicity is limited to the upper 50 m of the ice shelf. This suggests a complex system of deformation that involves the propagating rift, the region behind the rift tip, and a system of rift-transverse crevasses. Small-scale variations in the mechanical structure of the ice shelf, especially rift-transverse crevasses and accreted marine ice, play an important role in modulating the rate and location of seismicity associated with propagating ice shelf rifts.

  4. Effect of strong elastic contrasts on the propagation of seismic wave in hard-rock environments

    NASA Astrophysics Data System (ADS)

    Saleh, R.; Zheng, L.; Liu, Q.; Milkereit, B.

    2013-12-01

    Understanding the propagation of seismic waves in the presence of strong elastic contrasts, such as topography, tunnels and ore bodies, is still a challenge. Safety in mining is a major concern, and seismic monitoring is the main tool for addressing it. For engineering purposes, amplitudes (peak particle velocity/acceleration) and travel times of seismic events (mostly blasts or microseismic events) are critical parameters that have to be determined at various locations in a mine. These parameters are useful in preparing risk maps and in better understanding the process of spatial and temporal stress distribution in a mine. Simple constant-velocity models used for monitoring studies in mining cannot explain the observed complexities in scattered seismic waves. In hard-rock environments, modeling of the elastic seismic wavefield requires detailed 3D petrophysical, infrastructure and topographical data to simulate the propagation of seismic waves with frequencies up to a few kilohertz. With the development of efficient numerical techniques and parallel computation facilities, a solution to such a problem is achievable. In this study, the effects of strong elastic contrasts such as ore bodies, rough topography and tunnels are illustrated using 3D modeling methods. The main tools here are the finite-difference code SOFI3D [1], which has been benchmarked for engineering studies, and the spectral-element code SPECFEM [2], which was developed for global seismology problems. The modeling results show locally enhanced peak particle velocity due to the presence of strong elastic contrasts and topography in the models. [1] Bohlen, T. Parallel 3-D viscoelastic finite difference seismic modeling. Computers & Geosciences 28 (2002) 887-899. [2] Komatitsch, D., and J. Tromp, Introduction to the spectral-element method for 3-D seismic wave propagation, Geophys. J. Int., 139, 806-822, 1999.

  5. Ambient seismic noise monitoring of the Super-Sauze landslide from a very dense temporary seismic array

    NASA Astrophysics Data System (ADS)

    Chtouki, Toufik; Vergne, Jerome; Provost, Floriane; Malet, Jean-Philippe; Burtin, Arnaud; Hibert, Clément

    2017-04-01

    The Super-Sauze landslide is located on the southern part of the Barcelonnette Basin (French Alps) and has developed in a soft clay-shale environment. It is one of the four sites continuously monitored through a wide variety of geophysical and hydro-geological techniques in the framework of the OMIV French national landslide observatory. From early June to mid-July 2016, a temporary dense seismic array was installed in the most active part of the landslide and in its surroundings. Fifty sites with an average inter-station distance of 50 m were instrumented with 150 miniaturized and autonomous seismic stations (Zland nodes), allowing a continuous record of the seismic signal at frequencies higher than 0.2 Hz over an almost regular grid. Concurrently, a ground-based InSAR device allowed precise and continuous monitoring of the surface deformation. Overall, this experiment is intended to better characterize the spatio-temporal evolution of the deformation processes related to various types of forcing. We analyze the continuous records of ambient seismic noise recorded by the dense array. Using power spectral densities, we characterize the various types of natural and anthropogenic seismic sources, including the effect of water turbulence and bedload transport in the small nearby torrents. We also compute the correlation of the ambient diffuse seismic noise in various frequency bands for the 2448 station pairs to recover the empirical Green functions between them. The temporal evolution of the coda part of these noise-correlation functions allows monitoring and localizing shear-wave velocity variations in the sliding mass. Here we present some preliminary results of this analysis and compare the seismic variations to meteorological data and surface deformation.

  6. NON-INVASIVE DETERMINATION OF THE LOCATION AND DISTRIBUTION OF FREE-PHASE DENSE NONAQUEOUS PHASE LIQUIDS (DNAPL) BY SEISMIC REFLECTION TECHNIQUES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael G. Waddell; William J. Domoracki; Tom J. Temples

    2001-12-01

    This annual technical progress report covers part of Task 4 (site evaluation), Task 5 (2D seismic design, acquisition, and processing), and Task 6 (2D seismic reflection, interpretation, and AVO analysis) on DOE contract number DE-AR26-98FT40369. The project had planned one additional deployment to a site other than the Savannah River Site (SRS) or the DOE Hanford Site. After the SUBCON midyear review in Albuquerque, NM, it was decided that two additional deployments would be performed. The first deployment is to test the feasibility of using non-invasive seismic reflection and AVO analysis as a monitoring tool to assist in determining the effectiveness of Dynamic Underground Stripping (DUS) in removal of DNAPL. The second deployment is to the Department of Defense (DOD) Charleston Naval Weapons Station Solid Waste Management Unit 12 (SWMU-12), Charleston, SC, to further test the technique to detect high concentrations of DNAPL. The Charleston Naval Weapons Station SWMU-12 site was selected in consultation with National Energy Technology Laboratory (NETL) and DOD Naval Facilities Engineering Command Southern Division (NAVFAC) personnel. Based upon the review of existing data and due to the shallow target depth, the project team collected three Vertical Seismic Profiles (VSP) and an experimental P-wave seismic reflection line. After preliminary analysis of the VSP data and the experimental reflection line data, it was decided to proceed with Task 5 and Task 6. Three high-resolution P-wave reflection profiles were collected with two objectives: (1) design the reflection survey to image a target depth of 20 feet below land surface to assist in determining the geologic controls on the DNAPL plume geometry, and (2) apply AVO analysis to the seismic data to locate the zone of high concentration of DNAPL. Based upon the results of the data processing and interpretation of the seismic data, the project team was able to map the channel that is controlling the DNAPL plume geometry. The AVO analysis located a major amplitude anomaly, which was tested using a Geoprobe(TM) direct-push system. The Geoprobe(TM) was equipped with a membrane interface probe (MIP) interfaced with a sorbent trap/gas chromatograph (GC) system. Both the Photo Ionization Detector (PID) and the Electron Capture Detector (ECD) on the GC exceeded their maximum measurement values through the anomaly. A well was installed to collect a water sample; the concentration of chlorinated solvents in the water sample was in excess of 500 ppm. Other amplitude anomalies located directly under an asphalt road were also tested, and both the PID and ECD readings were zero. It appears that editing of poor-quality near-offset traces during data processing caused these anomalies; not having the full range of source-to-receiver offset traces in those areas resulted in false anomalies during AVO analysis. This phenomenon was also observed at the beginning and end of each seismic profile for the same reason. Based upon the water samples and MIP probes, it appears that surface seismic and AVO analysis were able to detect the area of highest concentration of DNAPL.
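
    The AVO workflow itself is not detailed in this report; as a generic illustration (not the project's processing), the sketch below fits the two-term Shuey approximation R(theta) = A + B*sin^2(theta) to amplitudes picked along a gather, the intercept A and gradient B being the attributes usually scanned for anomalies. The angle and amplitude values are made up.

```python
import numpy as np

def shuey_intercept_gradient(angles_deg, amplitudes):
    """Least-squares fit of the two-term Shuey approximation
    R(theta) = A + B*sin(theta)**2 to picked reflection amplitudes.
    Returns the AVO intercept A and gradient B."""
    x = np.sin(np.radians(angles_deg)) ** 2
    design = np.column_stack([np.ones_like(x), x])
    (A, B), *_ = np.linalg.lstsq(design, amplitudes, rcond=None)
    return A, B

# Illustrative picks along one gather (incidence angles in degrees, relative amplitude).
angles = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
amps = np.array([0.110, 0.105, 0.092, 0.075, 0.052, 0.028])
A, B = shuey_intercept_gradient(angles, amps)
print(f"intercept A = {A:.3f}, gradient B = {B:.3f}")
# A strong gradient at a bright amplitude can flag fluid effects, but, as noted
# above, edited near-offset traces can produce false anomalies of the same kind.
```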

  7. Experience from the ECORS program in regions of complex geology

    NASA Astrophysics Data System (ADS)

    Damotte, B.

    1993-04-01

    The French ECORS program was launched in 1983 by a cooperation agreement between universities and petroleum companies. Crustal surveys have tried to find explanations for the formation of geological features such as rifts, mountain ranges or subsidence in sedimentary basins. Several seismic surveys were carried out, some across areas with complex geological structures. The seismic techniques and equipment used were those developed by petroleum geophysicists, adapted to the depth aimed at (30-50 km) and to the various physical constraints encountered in the field. In France, ECORS has recorded 850 km of deep seismic lines onshore across plains and mountains, on various kinds of geological formations. Different variants of the seismic method (reflection, refraction, long-offset seismic) were used, often simultaneously. Multiple-coverage profiling constitutes the essential part of this data acquisition. Vibrators and dynamite shots were employed with a spread generally 15 km long, but sometimes 100 km long. Some typical seismic examples show that obtaining crustal reflections essentially depends on two factors: (1) the type and structure of the shallow formations, and (2) the sources used. Thus, when seismic energy is strongly absorbed in the shallow formations across the first kilometers, or when these formations are highly structured, standard multiple-coverage profiling is not able to provide results beyond a few seconds. In this case, it is recommended to simultaneously carry out long-offset seismic at low multiple coverage. Other, more methodological examples show: how the impact of a surface fault on the crust may be evaluated according to the seismic method implemented (VIBROSEIS 96-fold coverage or a single dynamite shot); that vibrators make it possible to implement wide-angle seismic surveying with an offset of 80 km; and how to implement the seismic reflection method on complex formations in high mountains. All data were processed using industrial seismic software, which was not always appropriate for records at least 20 s long. Therefore, a specific procedure adapted to deep seismic surveys was developed for several processing steps. The long duration of the VIBROSEIS sweeps often makes it impossible to perform correlation and stack in the recording truck in the field. Such field records were first preprocessed, in order to be later correlated and stacked in the processing center. Because of the long duration of the recordings and the great length of the spread, several types of final sections were replayed, such as: (1) detailed surface sections (0-5 s), (2) entire sections (0-20 s) after data compression, and (3) near-trace sections and far-trace sections, which often yield complementary information. Standard methods of reflection migration gave unsatisfactory results: velocities at depth are inaccurate, the many diffractions do not all come from the vertical plane of the line, and the migration software is poorly adapted to deep crustal reflections. Therefore, migration is often performed graphically from arrivals picked in the time section. Line drawings of various onshore lines, especially those across the Alps and the Pyrenees, make it possible to judge the results obtained by ECORS.

  8. What defines an Expert? - Uncertainty in the interpretation of seismic data

    NASA Astrophysics Data System (ADS)

    Bond, C. E.

    2008-12-01

    Studies focusing on the elicitation of information from experts are concentrated primarily in economics and world markets, medical practice and expert-witness testimony. Expert elicitation theory has been applied in the natural sciences, most notably in the prediction of fluid flow in hydrological studies. In the geological sciences, expert elicitation has been limited to theoretical analysis, with studies focusing on the elicitation element, gaining expert opinion rather than necessarily understanding the basis behind the expert view. In these cases experts are defined in a traditional sense, based for example on standing in the field, number of years of experience, number of peer-reviewed publications, or the expert's position in a company hierarchy or academia. Here, traditional indicators of expertise have been compared for their significance in effective seismic interpretation. Polytomous regression analysis has been used to assess the relative significance of length and type of experience on the outcome of a seismic interpretation exercise. Following the initial analysis, the techniques used by participants to interpret the seismic image were added as additional variables to the analysis. Specific technical skills and techniques were found to be more important for the effective geological interpretation of seismic data than the traditional indicators of expertise. The results of a seismic interpretation exercise, the techniques used to interpret the seismic data and the participants' prior experience have been combined and analysed to answer the question: who is, and what defines, an expert?

  9. The application of refraction seismics in alpine permafrost studies

    NASA Astrophysics Data System (ADS)

    Draebing, Daniel

    2017-04-01

    Permafrost studies in alpine environments focus on landslides from permafrost-affected rockwalls, landslide deposits or periglacial sediment dynamics. Mechanical properties of soils and rocks are influenced by permafrost, and changed strength properties affect these periglacial processes. To assess the effects of permafrost thaw and degradation, monitoring techniques for permafrost distribution and active-layer thaw are required. Seismic wave velocities are sensitive to freezing and, therefore, refraction seismics presents a valuable tool to investigate permafrost in alpine environments. In this study, (1) laboratory and field applications of refraction seismics in alpine environments are reviewed and (2) data are used to quantify the effects of rock properties (e.g. lithology, porosity, anisotropy, saturation) on P-wave velocities. In the next step, (3) the influence of environmental factors is evaluated and conclusions are drawn on permafrost differentiation within alpine periglacial landforms. This study shows that the P-wave velocity increase on freezing is sensitive to porosity, an effect that is pronounced in high-porosity rocks. In low-porosity rocks, the P-wave velocity increase is controlled by the decrease in anisotropy due to ice pressure (Draebing and Krautblatter, 2012), which enables active-layer and permafrost differentiation at the rockwall scale (Krautblatter and Draebing, 2014; Draebing et al., 2016). However, the discontinuity distribution can result in strong anisotropy effects on seismic velocities, which can impede permafrost differentiation (Phillips et al., 2016). Due to production or deposition history, porosity can show large spatial differences in deposited landforms. Landforms with large boulders, such as rock glaciers and moraines, show the highest P-wave velocity differences between active layer and permafrost, which facilitates differentiation (Draebing, 2016). Saturation with water is essential for the successful application of refraction seismics for permafrost detection and can be controlled at the laboratory scale. At the landform scale, saturation shows temporal and spatial variation, which is partially reflected in the variation of seismic velocities of the active layer (Draebing, 2016). Environmental factors result in a high spatial variation of rock and soil properties that affect seismic velocities. However, in landforms such as rock glaciers and moraines, active layer and permafrost can be distinguished based on seismic velocities alone, while the P-wave velocity differences of these layers in talus slopes and debris-covered slopes decrease and, therefore, require additional geophysical techniques or boreholes for layer differentiation (Draebing, 2016). Draebing, D., Krautblatter, M. 2012. P-wave velocity changes in freezing hard low-porosity rocks: a laboratory-based time-average model. The Cryosphere 6, 1163-1174. Draebing, D. 2016. Application of refraction seismics in alpine permafrost studies: A review. Earth-Science Reviews 155, 136-152. Draebing, D., Haberkorn, A., Krautblatter, M., Kenner, R., Phillips, M. 2016. Spatial and temporal snow cover variability and resulting thermal and mechanical response in a permafrost rock wall. Permafrost and Periglacial Processes. Krautblatter, M., Draebing, D. 2014. Pseudo 3D - P-wave refraction seismic monitoring of permafrost in steep unstable bedrock. Journal of Geophysical Research: Earth Surface 119, 287-299. Phillips, M., Haberkorn, A., Draebing, D., Krautblatter, M., Rhyner, H., Kenner, R. 2016. Seasonally intermittent water flow through deep fractures in an Alpine rock ridge: Gemsstock, central Swiss Alps. Cold Regions Science and Technology 125, 117-127.

  10. Ground roll attenuation by synchrosqueezed curvelet transform

    NASA Astrophysics Data System (ADS)

    Liu, Zhao; Chen, Yangkang; Ma, Jianwei

    2018-04-01

    Ground roll is a type of coherent noise in land seismic data that has low frequency, low velocity and high amplitude. It damages reflection events that contain important information about subsurface structures, hence the removal of ground roll is a crucial step in seismic data processing. A suitable transform is needed for removal of ground roll. Curvelet transform is an effective sparse transform that optimally represents seismic events. In addition, the curvelets can provide a multiscale and multidirectional decomposition of the input data in time-frequency and angular domain, which can help distinguish between ground roll and useful signals. In this paper, we apply synchrosqueezed curvelet transform (SSCT) for ground roll attenuation. The synchrosqueezing technique in SSCT is used to precisely reallocate the energy of local wave vectors in order to separate ground roll from the original data with higher resolution and higher fidelity. Examples of synthetic and field seismic data reveal that SSCT performs well in the suppression of aliased and non-aliased ground roll while preserving reflection waves, in comparison with high-pass filtering, wavelet and curvelet methods.

  11. Physical modeling of the formation and evolution of seismically active fault zones

    USGS Publications Warehouse

    Ponomarev, A.V.; Zavyalov, A.D.; Smirnov, V.B.; Lockner, D.A.

    1997-01-01

    Acoustic emission (AE) in rocks is studied as a model of natural seismicity. A special rock-loading technique has been used to help study the processes that control the development of AE during brittle deformation. This technique allows fault growth, which would normally occur very rapidly, to be extended over hours; in this way, the period of most intense interaction of acoustic events can be studied in detail. Characteristics of the acoustic regime (AR) include the Gutenberg-Richter b-value, the spatial distribution of hypocenters with characteristic fractal (correlation) dimension d, the Hurst exponent H, and the crack concentration parameter Pc. The fractal structure of the AR changes with the onset of the drop in differential stress during sample deformation. The change results from the active interaction of microcracks. This transition of the spatial distribution of AE hypocenters is accompanied by a corresponding change in the temporal correlation of events and in the distribution of event amplitudes, as signified by a decrease of the b-value. The characteristic structure that develops in the low-energy background AE is similar to that of the sequence of the strongest microfracture events. When the AR fractal structure develops, the variations of d and b are synchronous and d = 3b. This relation, which occurs once the fractal structure is formed, only holds for average values of d and b; time variations of d and b are anticorrelated. The degree of temporal correlation of the AR shows time variations that are similar to the d and b variations. The observed variations in laboratory AE experiments are compared with natural seismicity parameters. The close correspondence between laboratory-scale observations and naturally occurring seismicity suggests a possible new approach for understanding the evolution of complex seismicity patterns in nature.
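
    The b-value referred to above is commonly estimated with the Aki/Utsu maximum-likelihood formula; the sketch below applies it to a synthetic catalogue (the magnitudes, completeness level and bin width are illustrative, not the AE data from this experiment).

```python
import numpy as np

def b_value_mle(magnitudes, m_c, bin_width=0.1):
    """Aki/Utsu maximum-likelihood b-value for events above the completeness
    magnitude m_c, with the standard half-bin correction for binned magnitudes:
    b = log10(e) / (mean(M) - (m_c - bin_width/2))."""
    m = np.asarray(magnitudes)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - (m_c - bin_width / 2.0))

# Synthetic catalogue drawn from a G-R law with b = 1.0 above the lowest bin
# edge (-3.05); AE "magnitudes" are small and negative, as in laboratory tests.
rng = np.random.default_rng(5)
mags = -3.05 + rng.exponential(scale=np.log10(np.e) / 1.0, size=5000)
mags = np.round(mags, 1)                              # bin to 0.1 magnitude units
print(f"b = {b_value_mle(mags, m_c=-3.0):.2f}")        # close to 1.0
```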

  12. Rigorous Approach in Investigation of Seismic Structure and Source Characteristics in Northeast Asia: Hierarchical and Trans-dimensional Bayesian Inversion

    NASA Astrophysics Data System (ADS)

    Mustac, M.; Kim, S.; Tkalcic, H.; Rhie, J.; Chen, Y.; Ford, S. R.; Sebastian, N.

    2015-12-01

    Conventional approaches to inverse problems suffer from non-linearity and non-uniqueness in estimations of seismic structures and source properties. Estimated results and associated uncertainties are often biased by applied regularizations and additional constraints, which are commonly introduced to solve such problems. Bayesian methods, however, provide statistically meaningful estimations of models and their uncertainties constrained by data information. In addition, hierarchical and trans-dimensional (trans-D) techniques are inherently implemented in the Bayesian framework to account for involved error statistics and model parameterizations, and, in turn, allow more rigorous estimations of the same. Here, we apply Bayesian methods throughout the entire inference process to estimate seismic structures and source properties in Northeast Asia including east China, the Korean peninsula, and the Japanese islands. Ambient noise analysis is first performed to obtain a base three-dimensional (3-D) heterogeneity model using continuous broadband waveforms from more than 300 stations. As for the tomography of surface wave group and phase velocities in the 5-70 s band, we adopt a hierarchical and trans-D Bayesian inversion method using Voronoi partition. The 3-D heterogeneity model is further improved by joint inversions of teleseismic receiver functions and dispersion data using a newly developed high-efficiency Bayesian technique. The obtained model is subsequently used to prepare 3-D structural Green's functions for the source characterization. A hierarchical Bayesian method for point source inversion using regional complete waveform data is applied to selected events from the region. The seismic structure and source characteristics with rigorously estimated uncertainties from the novel Bayesian methods provide enhanced monitoring and discrimination of seismic events in northeast Asia.

  13. 4-D High-Resolution Seismic Reflection Monitoring of Miscible CO2 Injected into a Carbonate Reservoir

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richard D. Miller; Abdelmoneam E. Raef; Alan P. Byrnes

    2007-06-30

    The objective of this research project was to acquire, process, and interpret multiple high-resolution 3-D compressional-wave and 2-D, 2-C shear-wave seismic data in the hope of observing changes in fluid characteristics in an oil field before, during, and after the miscible carbon dioxide (CO2) flood that began around December 1, 2003, as part of the DOE-sponsored Class Revisit Project (DOE No. DE-AC26-00BC15124). Unique and key to this imaging activity are the high-resolution nature of the seismic data, the minimal deployment design, and the temporal sampling throughout the flood. The 900-m-deep test reservoir is located in central Kansas oomoldic limestones of the Lansing-Kansas City Group, deposited on a shallow marine shelf in Pennsylvanian time. After 30 months of seismic monitoring, one baseline and eight monitor surveys clearly detected changes that appear consistent with movement of CO2 as modeled with fluid simulators and observed in production data. Attribute analysis was a very useful tool in enhancing changes in seismic character that are present, but difficult to interpret, on time-amplitude slices. Lessons learned from, and tools/techniques developed during, this project will allow high-resolution seismic imaging to be routinely applied to many CO2 injection programs in a large percentage of shallow carbonate oil fields in the midcontinent.

  14. Automatic classification of seismic events within a regional seismograph network

    NASA Astrophysics Data System (ADS)

    Tiira, Timo; Kortström, Jari; Uski, Marja

    2015-04-01

    A fully automatic method for seismic event classification within a sparse regional seismograph network is presented. The tool is based on a supervised pattern recognition technique, the Support Vector Machine (SVM), trained here to distinguish weak local earthquakes from a bulk of human-made or spurious seismic events. The classification rules rely on differences in signal energy distribution between natural and artificial seismic sources. Seismic records are divided into four windows: P, P coda, S, and S coda. For each signal window, the short-term average (STA) is computed in 20 narrow frequency bands between 1 and 41 Hz. The resulting 80 discrimination parameters are used as training data for the SVM. SVM models are calculated for 19 on-line seismic stations in Finland. The event data are compiled mainly from fully automatic event solutions that are manually classified after the automatic location process. The station-specific SVM training sets include 11-302 positive (earthquake) and 227-1048 negative (non-earthquake) examples. The best voting rules for combining results from different stations are determined during an independent testing period. Finally, the network processing rules are applied to an independent evaluation period comprising 4681 fully automatic event determinations, of which 98% have been manually identified as explosions or noise and 2% as earthquakes. The SVM method correctly identifies 94% of the non-earthquakes and all of the earthquakes. The results imply that the SVM tool can identify and filter out blasts and spurious events from fully automatic event solutions with a high level of confidence. The tool helps to reduce the workload in manual seismic analysis by leaving only ~5% of the automatic event determinations, i.e. the probable earthquakes, for more detailed seismological analysis. The approach presented is easy to adjust to the requirements of a denser or wider high-frequency network, once enough training examples for building a station-specific data set are available.
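
    The station-specific classifiers described above can be reproduced in outline with scikit-learn. The sketch below uses random stand-in features rather than the Finnish network data, and the kernel, regularisation value and class sizes are assumptions; it only shows the shape of the workflow (80 band-energy features per event, one binary SVM, hold-out evaluation).

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

# Stand-in feature matrix: 80 band-energy parameters (4 windows x 20 bands)
# per event.  Real features would come from the P, P-coda, S and S-coda windows.
rng = np.random.default_rng(6)
n_eq, n_noise = 300, 1000
X_eq = rng.normal(loc=1.0, scale=0.8, size=(n_eq, 80))       # "earthquakes"
X_no = rng.normal(loc=0.0, scale=1.0, size=(n_noise, 80))     # "blasts / noise"
X = np.vstack([X_eq, X_no])
y = np.array([1] * n_eq + [0] * n_noise)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          random_state=0, stratify=y)

# RBF-kernel SVM with feature standardisation; in practice one model per station.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X_tr, y_tr)
print(f"hold-out accuracy: {clf.score(X_te, y_te):.2f}")
```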

  15. Reconstructing the Seismic Wavefield using Curvelets and Distributed Acoustic Sensing

    NASA Astrophysics Data System (ADS)

    Muir, J. B.; Zhan, Z.

    2017-12-01

    Distributed Acoustic Sensing (DAS) offers an opportunity to produce cost-effective and uniquely dense images of the surface seismic wavefield - DAS also produces extremely large data volumes that require innovative methods of data reduction and seismic parameter inversion to handle efficiently. We leverage DAS and the super-Nyquist sampling enabled by compressed sensing of the wavefield in the curvelet domain to produce accurate images of the horizontal velocity within a target region, using only short (1-10 minutes) records of either active seismic sources or ambient seismic signals. Once the wavefield has been fully described, modern "tomographic" techniques, such as Helmholtz tomography or wavefield gradiometry, can be employed to determine seismic parameters of interest such as phase velocity. An additional practical benefit of employing a wavefield reconstruction step is that multiple heterogeneous forms of instrumentation can be naturally combined; therefore, in this study we also explore the addition of three-component nodal seismic data into the reconstructed wavefield. We illustrate these techniques using both synthetic examples and data taken from the Brady Geothermal Field in Nevada during the PoroTomo (U. Wisconsin Madison) experiment of 2016.

  16. A Technique to Determine the Self-Noise of Seismic Sensors for Performance Screening

    NASA Astrophysics Data System (ADS)

    Rademacher, H.; Hart, D.; Guralp, C.

    2012-04-01

    Seismic noise affects the performance of a seismic sensor and is thereby a limiting factor for the detection threshold of monitoring networks. Among the various sources of noise, the intrinsic self-noise of a seismic sensor is the most difficult to determine, because it is mostly masked by natural and anthropogenic ground noise and is also affected by the noise characteristics of the digitizer. Here we present a new technique to determine the self-noise of a seismic system (digitizer + sensors). It is based on a method introduced by Sleeman et al. (2005) to test the noise performance of digitizers. We infer the self-noise of a triplet of identical sensors by comparing coherent waveforms over a wide spectral band across the set-up. We will show first results from a proof-of-concept study done in a vault near Albuquerque, New Mexico, and show how various methods of shielding the sensors affect the results of this technique. This method can also be used for quality control during sensor production, because poorly performing sensors can easily be identified.
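
    A minimal sketch of the three-channel coherence idea (after Sleeman et al.) for estimating instrument self-noise from three co-located, nominally identical sensors recording the same ground motion. The input traces, sampling rate and segment length are placeholders, and relative transfer functions between the sensors are assumed to be close to unity.

    import numpy as np
    from scipy.signal import csd

    def three_channel_self_noise(x1, x2, x3, fs, nperseg=4096):
        chans = [x1, x2, x3]
        P = {}
        for i in range(3):
            for j in range(3):
                f, P[(i, j)] = csd(chans[i], chans[j], fs=fs, nperseg=nperseg)
        noise = []
        for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
            # self-noise PSD of channel i: auto-power minus the coherent (common) power
            Nii = P[(i, i)] - P[(j, i)] * P[(i, k)] / P[(j, k)]
            noise.append(np.real(Nii))
        return f, noise   # one self-noise PSD estimate per sensor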

  17. Rayleigh Wave Group Velocity Tomography from Microseisms in the Acambay Graben

    NASA Astrophysics Data System (ADS)

    Valderrama Membrillo, S.; Aguirre, J.; Zuñiga-Davila, R.; Iglesias, A.

    2017-12-01

    The Acambay graben is one of the most outstanding structures of the Trans-Mexican Volcanic Belt. It is about 80 km long and 15 to 18 km wide, and reaches a maximum relief of 400 m in its central part. We obtained group velocity seismic tomography for the Acambay graben at three frequencies (f = 0.1, 0.2 and 0.3 Hz). For the tomography the graben was divided into 6 x 6 km cells covering a total area of 1008 km2. Seismic noise data from 10 broadband stations near the Acambay graben were used to extract surface wave arrival times between all station pairs. The Green's function for each station pair was recovered by the cross-correlation technique, applied to continuous vertical-component recordings spanning 5 months. Data processing consisted of removing the instrument response, mean and trend; we then applied time-domain normalization, spectral whitening and band-pass filtering between 0.1 and 1 Hz. Shallow studies of the Acambay graben exist, but little is known about the distribution of its deep structures, and this study estimates the deep surface wave velocity structure. The structures imaged at 0.3 Hz correspond to shallower depths than those at the other frequencies. The results at this frequency are consistent with previous gravimetric and resistivity studies and also delineate the Temascalcingo fault system.
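
    An illustrative sketch of the ambient-noise preprocessing and cross-correlation workflow described above, for a single station pair and component. The filter corners follow the 0.1-1 Hz band quoted in the abstract; the normalization choice (one-bit), window lengths and input arrays are assumptions.

    import numpy as np
    from scipy.signal import butter, sosfiltfilt, detrend

    def preprocess(trace, fs):
        x = detrend(trace, type="linear")                      # remove mean and trend
        sos = butter(4, [0.1, 1.0], btype="bandpass", fs=fs, output="sos")
        x = sosfiltfilt(sos, x)                                # 0.1-1 Hz band-pass
        x = np.sign(x)                                         # one-bit time-domain normalization
        X = np.fft.rfft(x)
        X = X / (np.abs(X) + 1e-12)                            # spectral whitening
        return np.fft.irfft(X, n=len(x))

    def noise_cross_correlation(tr_a, tr_b, fs, max_lag_s=100.0):
        a, b = preprocess(tr_a, fs), preprocess(tr_b, fs)
        cc = np.correlate(a, b, mode="full") / len(a)          # empirical Green's function proxy
        lags = np.arange(-len(a) + 1, len(a)) / fs
        keep = np.abs(lags) <= max_lag_s
        return lags[keep], cc[keep]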

  18. Applications of seismic spatial wavefield gradient and rotation data in exploration seismology

    NASA Astrophysics Data System (ADS)

    Schmelzbach, C.; Van Renterghem, C.; Sollberger, D.; Häusler, M.; Robertsson, J. O. A.

    2017-12-01

    Seismic spatial wavefield gradient and rotation data have the potential to open up new ways to address long-standing problems in land-seismic exploration, such as identifying and separating P-, S-, and surface waves. Gradient-based acquisition and processing techniques could enable replacing large arrays of densely spaced receivers by sparse, spatially compact receiver layouts or even a single multicomponent station with dedicated instruments (e.g., rotational seismometers). Such approaches to maximizing the information content of single-station recordings are also of significant interest for seismic measurements at sites with limited access, such as boreholes, the sea bottom, and extraterrestrial seismology. Arrays of conventional three-component (3C) geophones enable not only measuring the particle velocity in three dimensions but also estimating its spatial gradients. Because the free-surface condition makes it possible to express vertical derivatives in terms of horizontal derivatives, the full gradient tensor and, hence, the curl and divergence of the wavefield can be computed. In total, the three particle velocity components, three rotational components, and the divergence result in seven-component (7C) seismic data. Combined particle velocity and gradient data can be used to isolate the incident P- or S-waves at the land surface or the sea bottom using filtering techniques based on the elastodynamic representation theorem. Alternatively, as only S-waves exhibit rotational motion, rotational measurements can be used directly to identify S-waves. We discuss the derivations of the gradient-based filters as well as their application to synthetic and field data, demonstrating that rotational data can be of particular interest for S-wave reflection and P-to-S-wave conversion imaging. The concept of array-derived gradient estimation can be extended to source arrays as well; source arrays therefore allow us to emulate rotational (curl) and dilatational (divergence) sources. Combined with 7C recordings, a total of 49 components of the seismic wavefield can be excited and recorded. Such data potentially allow further improvement of wavefield separation and may find application in directional imaging and coherent noise suppression.
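
    A minimal sketch of array-derived spatial gradient estimation: a plane (first-order Taylor expansion) is fitted by least squares to each particle-velocity component recorded by a small group of 3C geophones, giving horizontal derivatives at the array centre, from which a rotation component can be formed. The station coordinates and records are hypothetical.

    import numpy as np

    def horizontal_gradients(coords, records):
        """coords: (n_sta, 2) x/y in metres; records: (n_sta, 3, n_t) vx, vy, vz.
        Returns d/dx and d/dy of each component at the array centre, shape (2, 3, n_t)."""
        centre = coords.mean(axis=0)
        G = np.hstack([np.ones((len(coords), 1)), coords - centre])   # [1, dx, dy]
        n_sta, n_comp, n_t = records.shape
        coeffs, *_ = np.linalg.lstsq(G, records.reshape(n_sta, -1), rcond=None)
        coeffs = coeffs.reshape(3, n_comp, n_t)
        return coeffs[1:]          # drop the constant term, keep d/dx and d/dy

    # Vertical rotation rate (one curl component) then follows from the horizontal gradients:
    # rot_z = 0.5 * (dvy/dx - dvx/dy)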

  19. Unveiling the signals from extremely noisy microseismic data for high-resolution hydraulic fracturing monitoring.

    PubMed

    Huang, Weilin; Wang, Runqiu; Li, Huijian; Chen, Yangkang

    2017-09-20

    The microseismic method is an essential technique for monitoring the dynamic status of hydraulic fracturing during the development of unconventional reservoirs. However, one of the challenges in microseismic monitoring is that the seismic signals generated by microseismicity have extremely low amplitude. We develop a methodology to unveil signals that are smeared in strong ambient noise and thus facilitate more accurate arrival-time picking, which will ultimately improve the localization accuracy. In the proposed technique, we decompose the recorded data into several morphological multi-scale components. In order to unveil the weak signal, we propose an orthogonalization operator which acts as a time-varying weighting in the morphological reconstruction. The orthogonalization operator is obtained through an inversion process. This orthogonalized morphological reconstruction can be interpreted as a projection of a higher-dimensional vector. We first test the proposed technique using a synthetic dataset. Then the proposed technique is applied to a field dataset recorded in a project in China, in which the signals induced by hydraulic fracturing are recorded by twelve three-component (3-C) geophones in a monitoring well. The result demonstrates that the orthogonalized morphological reconstruction can make extremely weak microseismic signals detectable.
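
    An illustrative sketch of a time-varying (local) orthogonalization weight used to rescale a reconstructed signal component against the raw record, in the spirit of the orthogonalized reconstruction described above. This is a generic sliding-window least-squares weight with regularization, not the authors' exact inversion operator; the window length and damping are assumed values.

    import numpy as np

    def local_orthogonalization_weight(signal, data, win=51, eps=1e-8):
        """Weight w(t) minimizing ||data - w*signal|| within a sliding window."""
        kern = np.ones(win) / win
        num = np.convolve(data * signal, kern, mode="same")
        den = np.convolve(signal * signal, kern, mode="same") + eps
        return num / den

    def orthogonalized_reconstruction(signal_component, raw_data):
        w = local_orthogonalization_weight(signal_component, raw_data)
        enhanced = w * signal_component            # weak signal restored from the component
        residual = raw_data - enhanced             # the remainder is treated as noise
        return enhanced, residual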

  20. Earthquake Hazards.

    ERIC Educational Resources Information Center

    Donovan, Neville

    1979-01-01

    Provides a survey and a review of earthquake activity and global tectonics from the advancement of the theory of continental drift to the present. Topics include: an identification of the major seismic regions of the earth, seismic measurement techniques, seismic design criteria for buildings, and the prediction of earthquakes. (BT)

  1. Estimating tectonic history through basin simulation-enhanced seismic inversion: Geoinformatics for sedimentary basins

    USGS Publications Warehouse

    Tandon, K.; Tuncay, K.; Hubbard, K.; Comer, J.; Ortoleva, P.

    2004-01-01

    A data assimilation approach is demonstrated whereby seismic inversion is both automated and enhanced using a comprehensive numerical sedimentary basin simulator to study the physics and chemistry of sedimentary basin processes in response to the geothermal gradient in much greater detail than previously attempted. The approach not only reduces costs by integrating the basin analysis and seismic inversion activities to understand sedimentary basin evolution with respect to geodynamic parameters, but also has the potential to serve as a geoinformatics platform for understanding various physical and chemical processes operating at different scales within a sedimentary basin. Tectonic history has a first-order effect on the physical and chemical processes that govern the evolution of sedimentary basins. We demonstrate how such tectonic parameters may be estimated by minimizing the difference between observed seismic reflection data and synthetic data constructed from the output of a reaction, transport, mechanical (RTM) basin model. We demonstrate the method by reconstructing the geothermal gradient. As thermal history strongly affects the rate of RTM processes operating in a sedimentary basin, variations in geothermal gradient history alter the present-day fluid pressure, effective stress, porosity, fracture statistics and hydrocarbon distribution. All these properties, in turn, affect the mechanical wave velocity and sediment density profiles of a sedimentary basin. The present-day state of the sedimentary basin is imaged by reflection seismology data to a high degree of resolution, but this gives no indication of the processes that contributed to the evolution of the basin or of the causes of the heterogeneities being imaged within it. Using texture and fluid properties predicted by our Basin RTM simulator, we generate synthetic seismograms. Linear correlation using power spectra as an error measure and an efficient quadratic optimization technique are found to be most effective in determining the optimal values of the tectonic parameters. Preliminary 1-D studies indicate that one can determine the geothermal gradient even in the presence of observational and numerical uncertainties. The algorithm succeeds even when the synthetic data contain detailed information only in a limited depth interval and when the synthetic and observed seismograms have different dominant frequencies. The methodology presented here works even when the basin input data contain only 75 per cent of the stratigraphic layering information of the actual basin in a limited depth interval.
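
    A sketch of the misfit-minimization idea: the geothermal gradient is estimated by minimizing a power-spectrum-based error between observed and synthetic seismograms. Here basin_rtm_synthetic stands in for the Basin RTM forward model and is purely hypothetical, a scalar minimizer from scipy replaces the quadratic optimizer used in the study, and the search bounds are assumed values.

    import numpy as np
    from scipy.optimize import minimize_scalar

    def power_spectrum(trace):
        return np.abs(np.fft.rfft(trace)) ** 2

    def misfit(geotherm, observed, basin_rtm_synthetic):
        synthetic = basin_rtm_synthetic(geotherm)        # hypothetical forward model
        po, ps = power_spectrum(observed), power_spectrum(synthetic)
        # 1 - linear correlation between power spectra as the error measure
        return 1.0 - np.corrcoef(po, ps)[0, 1]

    def estimate_geothermal_gradient(observed, basin_rtm_synthetic):
        res = minimize_scalar(misfit, bounds=(15.0, 45.0), method="bounded",
                              args=(observed, basin_rtm_synthetic))
        return res.x    # degrees C per km, within the assumed search bounds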

  2. New insights into the North Taranaki Basin from New Zealand's first broadband 3D survey

    NASA Astrophysics Data System (ADS)

    Uzcategui, Marjosbet; Francis, Malcolm; Kong, Wai Tin Vincent; Patenall, Richard; Fell, Dominic; Paxton, Andrea; Allen, Tristan

    2016-06-01

    The Taranaki Basin is the only hydrocarbon-producing basin in New Zealand. The North Taranaki Basin has widespread two-dimensional (2D) seismic coverage and numerous wells that have not encountered commercial accumulations; this is attributed to the structural complexity of the central graben and the absence of the information needed to understand the basin's evolution. An active petroleum system has been confirmed by hydrocarbon shows and non-commercial oil and gas discoveries (Karewa-1 and Kora-1). A broadband long-offset three-dimensional (3D) seismic survey was acquired and processed by Schlumberger in 2013 to evaluate the hydrocarbon potential of the North Taranaki Basin. Innovative acquisition techniques were combined with advanced processing and imaging methods. Raypath distortions and depth uncertainty were significantly reduced by processing through tilted transverse isotropy (TTI) anisotropic Kirchhoff prestack depth migration with a geologically constrained velocity model. The survey provided the information needed to understand the petroleum system and provides evidence for material hydrocarbon accumulations. In this investigation, we assessed the hydrocarbon potential of the North Taranaki Basin using the newly acquired data. 3D seismic interpretation and amplitude-versus-offset (AVO) analysis support the renewed potential of the basin and demonstrate the effectiveness of these technologies, which together achieve encouraging results for hydrocarbon exploration.

  3. A PC-based computer package for automatic detection and location of earthquakes: Application to a seismic network in eastern Sicily (Italy)

    NASA Astrophysics Data System (ADS)

    Patanè, Domenico; Ferrari, Ferruccio; Giampiccolo, Elisabetta; Gresta, Stefano

    Few automated data acquisition and processing systems operate on mainframes; some run on UNIX-based workstations and others on personal computers equipped with either DOS/WINDOWS or UNIX-derived operating systems. Several large and complex software packages for automatic and interactive analysis of seismic data have been developed in recent years (mainly for UNIX-based systems), and some of these programs use a variety of artificial intelligence techniques. The first operational version of a new software package, named PC-Seism, for analyzing seismic data from a local network is presented in Patanè et al. (1999). This package, composed of three separate modules, provides an example of a new generation of visual object-oriented programs for interactive and automatic seismic data processing running on a personal computer. In this work, we mainly discuss the automatic procedures implemented in the ASDP (Automatic Seismic Data-Processing) module and a real-time application to data acquired by a seismic network running in eastern Sicily. This software uses a multi-algorithm approach and a new procedure, MSA (multi-station analysis), for signal detection, phase grouping, and event identification and location. It is designed for efficient and accurate processing of local earthquake records provided by single-site and array stations. Results from ASDP processing of two different data sets recorded at Mt. Etna volcano by a regional network are analyzed to evaluate its performance. By comparing the ASDP pickings with those revised manually, the detection and subsequently the location capabilities of this software are assessed. The first data set is composed of 330 local earthquakes recorded in the Mt. Etna area during 1997 by the telemetered analog seismic network. The second data set comprises about 970 automatic locations of more than 2600 local events recorded at Mt. Etna during the last eruption (July 2001) with the present network. For the former data set, a comparison of the automatic results with the manual picks indicates that the ASDP module can accurately pick 80% of the P-waves and 65% of the S-waves. The on-line application to the latter data set shows that the automatic locations are affected by larger errors, due to the preliminary setting of the configuration parameters in the program. However, the automatic ASDP and manual hypocenter locations are comparable within the estimated error bounds. New improvements of the PC-Seism software for on-line analysis are also discussed.
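
    A minimal sketch of the kind of automatic detection step underlying a package such as ASDP: a classical STA/LTA trigger per station followed by a simple network coincidence check. The window lengths, thresholds and input layout are assumed values, not the actual PC-Seism parameters.

    import numpy as np

    def sta_lta(trace, fs, sta_s=1.0, lta_s=30.0):
        nsta, nlta = int(sta_s * fs), int(lta_s * fs)
        energy = trace.astype(float) ** 2
        csum = np.cumsum(energy)
        sta = (csum[nsta:] - csum[:-nsta]) / nsta
        lta = (csum[nlta:] - csum[:-nlta]) / nlta
        n = min(len(sta), len(lta))
        return sta[-n:] / (lta[-n:] + 1e-12)          # ratio trace, aligned at the record end

    def network_detection(traces, fs, threshold=4.0, min_stations=3):
        """traces: dict station -> 1-D array. Declares an event if enough stations trigger."""
        triggered = [name for name, tr in traces.items()
                     if np.max(sta_lta(tr, fs)) > threshold]
        return len(triggered) >= min_stations, triggered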

  4. Seismic & Infrasound Integrated Array "Apatity": techniques, data processing, and first results of observations

    NASA Astrophysics Data System (ADS)

    Vinogradov, Y.; Baryshnikov, A.

    2003-04-01

    Since September 2001, three infrasound membrane-type sensors ("K-304 AM") have been installed on the territory of the seismic array "Apatity" near Lake Imandra. The seismic array comprises 11 short-period sensors (type "Geotech S-500") arranged on small and large circles (0.4 and 1 km diameter); the infrasound sensors are located on the small circle near the seismographs. The combined facility is called the Seismic & Infrasound Integrated Array (SISIA) "Apatity". All data are digitized at the array site and transmitted in real time to a processing center at the Kola Regional Seismological Centre (KRSC) in Apatity. The "NEWNORAC" program was created to support temporary storage of the transmitted data in a disk loop and access to the data; it replaced the "NORAC" system developed by the Norwegian institute NORSAR, which was previously in use at KRSC. A program package EL (event locator) for display and processing of the data has been modified. It now includes: quick access to the data stored in the disk loop (the last two weeks); conversion of data from the disk-loop format to CSS 3.0 format; data filtering using band-pass, high-pass, low-pass, adaptive or band-reject filters; calculation of spectra and sonograms (spectral diagrams); location of seismic events with plotting on a map; calculation of the backazimuth and apparent velocity of acoustic waves from similar parts of the wave recordings; and loading and processing of CSS 3.0 seismic and acoustic data from the KRSC archive. To store the acoustic data permanently, the program BARCSS was written; it rewrites the data from the disk loop to the KRSC archive in CSS 3.0 format. For comparison of the acoustic noise level with wind, we use data from the meteorological station in Kandalaksha, sampled every 3 hours. During the period from October 2001 to October 2002, more than 745 seismic events were registered, mainly connected with the mining activity of the large mining enterprises on the Kola Peninsula. Most of the events caused by surface explosions were also registered by the infrasound part of SISIA "Apatity"; their sources were at distances from 38 to 220 km. The results of observations during the first year enabled us to estimate the frequency range and main directions of arrival of acoustic waves, as well as the noise level at the observation site. Based on these results and the local relief, a 4-ray wind-noise-reducing pipe array will be installed at all three sensors in May 2003 to improve detectability during windy conditions. Schemes of the SISIA "Apatity" array, of the data transmission and processing, and samples of detected signals are shown in the presentation.
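
    A sketch of backazimuth and apparent-velocity estimation across a small array, as listed among the EL package functions above: relative arrival times picked on similar parts of the recordings are fitted with a plane wave crossing the array. The sensor coordinates and delays are placeholder inputs.

    import numpy as np

    def plane_wave_fit(coords, delays):
        """coords: (n, 2) east/north in metres; delays: (n,) relative arrival times in s.
        Returns backazimuth (degrees from north) and apparent velocity (m/s)."""
        G = coords - coords.mean(axis=0)
        d = delays - delays.mean()
        slowness, *_ = np.linalg.lstsq(G, d, rcond=None)     # horizontal slowness (s/m)
        v_app = 1.0 / np.linalg.norm(slowness)
        backazimuth = np.degrees(np.arctan2(-slowness[0], -slowness[1])) % 360.0
        return backazimuth, v_app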

  5. Repeating ice-earthquakes beneath David Glacier from the 2012-2015 TAMNNET array

    NASA Astrophysics Data System (ADS)

    Walter, J. I.; Peng, Z.; Hansen, S. E.

    2017-12-01

    The continent of Antarctica has approximately the same surface area as the continental United States, though we know significantly less about its underlying geology and seismic activity. In recent years, improvements in seismic instrumentation, battery technology, and field deployment practices have allowed continuous operation of broadband stations throughout the dark Antarctic winter. We utilize broadband seismic data from a recent experiment (TAMNNET), originally proposed as a structural seismology experiment, for seismic event detection. Our target is to address fundamental questions about regional-scale crustal and environmental seismicity in the study region, which comprises the Transantarctic Mountain area of Victoria and Oates Land. We identify most seismicity as emanating from David Glacier, upstream of the Drygalski Ice Tongue, as documented by several other studies. In order to improve the catalog completeness for the David Glacier area, we utilize a matched-filter technique to identify potential missing earthquakes that may not have been originally detected. This technique uses existing catalogued waveforms as templates to scan through continuous data and identify repeating or nearby earthquakes. With a more robust catalog, we evaluate relative changes in icequake positions, recurrence intervals, and other first-order information. In addition, we attempt to further refine locations of other regional seismicity using a variety of methods including body- and surface-wave polarization, beamforming, surface wave dispersion, and other seismological methods. This project highlights the usefulness of archiving raw datasets (i.e., passive seismic continuous data), so that researchers may apply new algorithms or techniques to test hypotheses not originally or specifically targeted by the original experimental design.
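
    A minimal sketch of matched-filter (template-matching) detection as described above: a catalogued waveform is slid along continuous data and detections are declared where the normalized cross-correlation exceeds a threshold. The threshold, de-clustering rule and inputs are assumed values.

    import numpy as np

    def matched_filter(continuous, template, threshold=0.7):
        nt = len(template)
        t = (template - template.mean()) / (template.std() + 1e-12)
        cc = np.zeros(len(continuous) - nt + 1)
        for i in range(len(cc)):
            w = continuous[i:i + nt]
            w = (w - w.mean()) / (w.std() + 1e-12)
            cc[i] = np.dot(w, t) / nt                      # normalized cross-correlation
        detections = []
        for p in np.where(cc > threshold)[0]:
            if not detections or p - detections[-1] > nt:  # simple de-clustering
                detections.append(p)
        return np.array(detections), cc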

  6. SiSeRHMap v1.0: a simulator for mapped seismic response using a hybrid model

    NASA Astrophysics Data System (ADS)

    Grelle, G.; Bonito, L.; Lampasi, A.; Revellino, P.; Guerriero, L.; Sappa, G.; Guadagno, F. M.

    2015-06-01

    SiSeRHMap is a computerized methodology capable of drawing up prediction maps of seismic response. It was realized on the basis of a hybrid model which combines different approaches and models in a new and non-conventional way. These approaches and models are organized in a code architecture composed of five interdependent modules. A GIS (Geographic Information System) Cubic Model (GCM), which is a layered computational structure based on the concept of lithodynamic units and zones, aims at reproducing a parameterized layered subsoil model. A metamodelling process confers a hybrid nature on the methodology: one-dimensional linear equivalent analysis produces acceleration response spectra for shear wave velocity-thickness profiles, defined as trainers, which are randomly selected in each zone. Subsequently, a numerical adaptive simulation model (Spectra) is optimized on the above trainer acceleration response spectra by means of a dedicated Evolutionary Algorithm (EA) and the Levenberg-Marquardt Algorithm (LMA) as the final optimizer. In the final step, the GCM Maps Executor module produces a serial map-set of stratigraphic seismic response at different periods by grid-solving the calibrated Spectra model. In addition, the spectral topographic amplification is computed by means of a numerical prediction model, built to match the results of numerical simulations of isolated reliefs using GIS topographic attributes. In this way, different sets of seismic response maps are produced, from which maps of seismic design response spectra are also derived by means of an enveloping technique.

  7. Signal frequency distribution and natural-time analyses from acoustic emission monitoring of an arched structure in the Castle of Racconigi

    NASA Astrophysics Data System (ADS)

    Niccolini, Gianni; Manuello, Amedeo; Marchis, Elena; Carpinteri, Alberto

    2017-07-01

    The stability of an arch as a structural element in the thermal bath of King Charles Albert (Carlo Alberto) in the Royal Castle of Racconigi (on the UNESCO World Heritage List since 1997) was assessed by the acoustic emission (AE) monitoring technique with application of classical inversion methods to recorded AE data. First, damage source location by means of triangulation techniques and signal frequency analysis were carried out. Then, the recently introduced method of natural-time analysis was preliminarily applied to the AE time series in order to reveal a possible entrance point to a critical state of the monitored structural element. Finally, possible influence of the local seismic and microseismic activity on the stability of the monitored structure was investigated. The criterion for selecting relevant earthquakes was based on the estimation of the size of earthquake preparation zones. The presented results suggest the use of the AE technique as a tool for detecting both ongoing structural damage processes and microseismic activity during preparation stages of seismic events.

  8. Applying Transmission Kikuchi Diffraction (TKD) to Understand Nanogranular Fault Rock Materials

    NASA Astrophysics Data System (ADS)

    Smith, S. A. F.; Demurtas, M.; Prior, D. J.; Di Toro, G.

    2017-12-01

    Nanoparticles (<< 1 µm) form in the localized slip zones of natural and experimental faults, but their origin (e.g. seismic vs. aseismic slip) and mechanical behaviour are still debated. Understanding the deformation processes that produce nanoparticles in faults requires knowledge of grain sizes, shapes and crystallographic orientations at higher spatial resolution than is currently possible using standard EBSD techniques. Transmission Kikuchi Diffraction (TKD) in the SEM is a technique that makes it possible to overcome this spatial-resolution limit by performing orientation mapping in a commercial EBSD system on electron-transparent foils with resolutions that can be below 10 nm. The potential of TKD for understanding deformation processes in nanoparticles is therefore very high. We present results of TKD analysis performed on mixed calcite-dolomite gouges deformed in a rotary-shear apparatus at slip rates ranging from sub-seismic to co-seismic (30 µm/s to 1 m/s). Samples for TKD were prepared by argon ion slicing, a method that yields relatively large (10^4 µm²) electron-transparent areas, as well as by standard argon ion milling. Coupled TKD-EDS analysis allows quantification of elemental contents at a scale of tens of nanometres. Preliminary results show that at a slip velocity of 1 m/s, the localized slip zone that forms in the gouges during shearing is composed of recrystallized grains of calcite and Mg-calcite (the latter being a decarbonation product of dolomite) with an average grain size of c. 300 nm. Individual grains are characterized by relatively straight boundaries, and many triple and quadruple grain junctions are present. The nanogranular aggregates show a polygonised texture with no clear porosity or shape preferred orientation. Orientation data show a random distribution of the calcite c-axes. Further investigation will help to obtain new insights into the deformation mechanisms active during seismic faulting in carbonate-bearing faults. The integration of grain size, grain shape and crystallographic information into flow laws will help to describe and predict the rheological behaviour of carbonate faults during seismic sliding.

  9. Seismic and mechanical studies of the artificially triggered rockfall at the Mount Néron (French Alps, December 2011)

    NASA Astrophysics Data System (ADS)

    Bottelin, P.; Jongmans, D.; Daudon, D.; Mathy, A.; Helmstetter, A.; Bonilla-Sierra, V.; Cadet, H.; Amitrano, D.; Richefeu, V.; Lorier, L.; Baillet, L.; Villard, P.; Donzé, F.

    2014-02-01

    The eastern limestone cliff of Mount Néron (French Alps) was the theatre of two medium-size rockfalls between summer and winter 2011. On 14 August 2011, a ~2000 m³ rock compartment detached from the cliff, fell 100 m and propagated down the slope. Although most of the fallen rocks were deposited in the upper part of the slope, about 15 metre-size blocks were stopped by a ditch and an earthen barrier after a runout of 800 m. An unstable overhanging ~2600 m³ compartment remained attached to the cliff and was blasted on 13 December 2011. During this artificially triggered event, 7 blocks reached the same ditch, with volumes ranging from 0.8 to 12 m³. A semi-permanent seismic array located about 2.5 km from the site recorded the two events, providing a unique opportunity to understand and compare the seismic phases generated during natural and artificially triggered rockfalls. Both events have signal durations of ~100 s with comparable maximum amplitudes recorded at large distances (computed local magnitudes of 1.14 and 1.05, respectively), with most of the energy lying below 20 Hz. Remote sensing techniques (photogrammetry and LiDAR) were employed before and after the provoked rockfall, allowing the volume and fracturing to be characterized. This event was filmed by two video cameras, and the generated ground motions were recorded using two temporary 3C seismic sensors and 3 seismic arrays deployed at the slope toe. Movie and seismogram processing provided estimates of the propagation velocity during the successive rockfall phases, which ranges from 12 m s⁻¹ to 30 m s⁻¹. The main seismic phases were identified from combined video and seismic signal analyses. The two most energetic phases are related to the ground impact of fallen material after free fall, and to individual rock block impacts into the ditch and the earthen barrier. These two phases are characterized by similar low-frequency content but show very different particle motions. The discrete element technique made it possible to reproduce the key features of the rockfall dynamics, yielding propagation velocities compatible with the experimental observations.

  10. Continuous Seismic Threshold Monitoring

    DTIC Science & Technology

    1992-05-31

    Continuous threshold monitoring is a technique for using a seismic network to monitor a geographical area continuously in time. The method provides...area. Two approaches are presented. Site-specific monitoring: By focusing a seismic network on a specific target site, continuous threshold monitoring...recorded events at the site. We define the threshold trace for the network as the continuous time trace of computed upper magnitude limits of seismic

  11. Exploration Geophysics

    ERIC Educational Resources Information Center

    Espey, H. R.

    1977-01-01

    Describes geophysical techniques such as seismic, gravity, and magnetic surveys of offshore acreage, and land-data gathering from a three-dimensional representation made from closely spaced seismic lines. (MLH)

  12. Detecting Seismic Infrasound Signals on Balloon Platforms

    NASA Astrophysics Data System (ADS)

    Krishnamoorthy, S.; Komjathy, A.; Cutts, J. A.; Pauken, M.; Garcia, R.; Mimoun, D.; Jackson, J. M.; Kedar, S.; Smrekar, S. E.; Hall, J. L.

    2017-12-01

    The determination of the interior structure of a planet requires detailed seismic investigations - a process that entails the detection and characterization of seismic waves due to geological activity (e.g., earthquakes, volcanoes, etc.). For decades, this task has primarily been performed on Earth by an ever-expanding network of terrestrial seismic stations. On planets such as Venus, however, where the surface pressure and temperature can reach as high as 90 atmospheres and 450 degrees Celsius respectively, placing seismometers on the planet's surface poses a vexing technological challenge. In contrast, the upper layers of the Venusian atmosphere are more benign and capable of hosting geophysical payloads for longer mission lifetimes. In order to perform geophysical experiments from an atmospheric platform, JPL and its partners (ISAE-SUPAERO and the California Institute of Technology) are developing technologies for the detection of infrasonic waves generated by earthquakes from a balloon. The coupling of seismic energy into the atmosphere depends critically on the density contrast between the surface of the planet and the atmosphere. Therefore, a successful demonstration of this technique on Earth would provide ample reason to expect success on Venus, where the atmospheric impedance is approximately 60 times that of Earth. In this presentation, we share results from the first set of Earth-based balloon experiments performed in Pahrump, Nevada in June 2017. These tests involved the generation of artificial sources of known intensity using a seismic hammer and their detection using a network of sensors, including highly sensitive micro-barometers suspended from balloons, GPS receivers, geophones, microphones, and seismometers. This experiment was the first of its kind and succeeded in detecting infrasonic waves from the earthquakes generated by the seismic hammer. We present the first comprehensive analysis of the data obtained from these sensors and use them to characterize the infrasound signal created by earthquakes. These data will also inform the design of future experiments, which will involve tropospheric and stratospheric flights above naturally occurring areas of high seismicity.

  13. The Complex Cepstrum - Revisited

    NASA Astrophysics Data System (ADS)

    Kemerait, R. C., Sr.

    2016-12-01

    Since this paper comes at the twilight of my career, it is appropriate to share my views on a subject very dear to my heart and to my long career. In 2004, "From Frequency to Quefrency: A History of the Cepstrum" was published in the IEEE Signal Processing Magazine. There is no question that the authors, Alan V. Oppenheim and Ronald W. Schafer, were pioneers in this area of research, and this publication documents their involvement quite nicely. In parallel research also performed in the 1960s, Childers et al. renamed the original "Cepstrum" the "Power Cepstrum" to avoid confusion with the principal topic of their research, the "Complex Cepstrum." The term "Power Cepstrum" has become widely used in the literature since that time. The Childers team, including Dr. Kemerait, published a summary of their work to that date in the IEEE Proceedings of October 1977, in an article titled "The Cepstrum: A Guide to Processing." In the subsequent 40 years, Dr. Kemerait has continued to research cepstral techniques applied to many diverse problems; his primary research, however, has been on estimating the depth of underground and underwater events. He has also applied these techniques to biomedical data (EEG, EKG, and visual-evoked responses) as well as to hydroacoustic data, thereby determining the "bubble pulse frequency," the depth of the explosion, and the ocean depth at the explosion point. He has also used cepstral techniques in the processing of ground penetrating radar, speech, machine diagnostics, and, throughout these years, seismic data. This paper emphasizes his recent improvements in processing primarily seismic and infrasound data associated with nuclear treaty monitoring. The emphasis is mainly on the recent improvements and the automation of the Complex Cepstrum process.
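
    A minimal sketch of the power cepstrum applied to echo-delay estimation, the operation underlying cepstral depth estimation (where the delay of a surface-reflected phase, or of a bubble pulse, appears as a peak in quefrency). The synthetic signal, sampling rate and delay below are placeholder values.

    import numpy as np

    def power_cepstrum(x):
        spectrum = np.abs(np.fft.rfft(x)) ** 2
        return np.fft.irfft(np.log(spectrum + 1e-20))

    fs = 100.0                                   # samples per second (assumed)
    t = np.arange(0, 20.0, 1.0 / fs)
    wavelet = np.exp(-5 * t) * np.sin(2 * np.pi * 4 * t)
    delay_s = 1.2                                # echo delay to recover
    x = wavelet + 0.6 * np.roll(wavelet, int(delay_s * fs))
    ceps = power_cepstrum(x)
    quefrency = np.arange(len(ceps)) / fs
    lo, hi = int(0.5 * fs), int(5 * fs)          # search window in quefrency samples
    peak = quefrency[np.argmax(ceps[lo:hi]) + lo]
    print(f"estimated echo delay ~ {peak:.2f} s")   # should be close to 1.2 s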

  14. Understanding Seismic Anisotropy in Hunt Well of Fort McMurray, Canada

    NASA Astrophysics Data System (ADS)

    Malehmir, R.; Schmitt, D. R.; Chan, J.

    2014-12-01

    Seismic imaging plays a vital role in developing geothermal systems as a sustainable energy resource. In this study we acquired and processed zero-offset and walk-away VSP data and well logs, as well as surface seismic data, in the Athabasca oil sands area of Alberta. The seismic data were carefully processed to better image the geothermal system. Through the processing, properties of natural fractures such as orientation and width were studied, and highly probable permeable zones were mapped along the well, which was drilled to a depth of 2363 m into crystalline basement rocks. In addition to the logging data, the seismic data were processed to build a reliable image of the subsurface. High-resolution velocity analysis of the multi-component walk-away VSP informed us about the elastic anisotropy in place. Study of the natural and induced fractures, as well as of the elastic anisotropy in the seismic data, allowed us to better map the stress regime around the borehole. The seismic image and the fracture map help optimize enhanced geothermal system stages developed through hydraulic stimulation. Keywords: geothermal, anisotropy, VSP, logging, Hunt well, seismic

  15. Seismic data compression speeds exploration projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galibert, P.Y.

    As part of an ongoing commitment to ensure industry-wide distribution of its revolutionary seismic data compression technology, Chevron Petroleum Technology Co. (CPTC) has entered into licensing agreements with Compagnie Generale de Geophysique (CGG) and other seismic contractors for use of its software in oil and gas exploration programs. CPTC expects use of the technology to be far-reaching to all of its industry partners involved in seismic data collection, processing, analysis and storage. Here, CGG--one of the world's leading seismic acquisition and processing companies--talks about its success in applying the new methodology to replace full on-board seismic processing. Chevron's technology is already being applied on large off-shore 3-D seismic surveys. Worldwide, CGG has acquired more than 80,000 km of seismic data using the data compression technology.

  16. A New Moonquake Catalog from Apollo 17 Seismic Data II: Lunar Surface Gravimeter: Implications of Expanding the Passive Seismic Array

    NASA Astrophysics Data System (ADS)

    Phillips, D.; Dimech, J. L.; Weber, R. C.

    2017-12-01

    Apollo 17's Lunar Surface Gravimeter (LSG) was deployed on the Moon in 1972, and was originally intended to detect gravitational waves as a confirmation of Einstein's general theory of relativity. Due to a design problem, the instrument did not function as intended. However, remotely-issued reconfiguration commands permitted the instrument to act effectively as a passive seismometer. LSG recorded continuously until Sept. 1977, when all surface data recording was terminated. Because the instrument did not meet its primary science objective, little effort was made to archive the data. Most of it was eventually lost, with the exception of data spanning the period March 1976 until Sept. 1977, and a recent investigation demonstrated that LSG data do contain moonquake signals (Kawamura et al., 2015). The addition of useable seismic data at the Apollo 17 site has important implications for event location schemes, which improve with increasing data coverage. All previous seismic event location attempts were limited to the four stations deployed at the Apollo 12, 14, 15, and 16 sites. Apollo 17 extends the functional aperture of the seismic array significantly to the east, permitting more accurate moonquake locations and improved probing of the lunar interior. Using the standard location technique of linearized arrival time inversion through a known velocity model, Kawamura et al. (2015) used moonquake signals detected in the LSG data to refine location estimates for 49 deep moonquake clusters, and constrained new locations for five previously un-located clusters. Recent efforts of the Apollo Lunar Surface Experiments Package Data Recovery Focus Group have recovered some of the previously lost LSG data, spanning the time period April 2, 1975 to June 30, 1975. In this study, we expand Kawamura's analysis to the newly recovered data, which contain over 200 known seismic signals, including deep moonquakes, shallow moonquakes, and meteorite impacts. We have completed initial data processing and verified the presence of deep moonquake signals in the recovered data. This positions us well for the application of automated event-detection techniques that have been successfully applied to the Apollo 16 Passive Seismic Experiment data as well as the Apollo 17 Lunar Seismic Profiling Experiment data.

  17. The Time-Frequency Signatures of Advanced Seismic Signals Generated by Debris Flows

    NASA Astrophysics Data System (ADS)

    Chu, C. R.; Huang, C. J.; Lin, C. R.; Wang, C. C.; Kuo, B. Y.; Yin, H. Y.

    2014-12-01

    Seismic monitoring is expected to reveal the process of a debris flow from the initiation area to the alluvial fan, because other field monitoring techniques, such as video cameras and ultrasonic sensors, are limited in detection range. For this reason, seismic approaches have been used as debris-flow detection systems over the past few decades. Analysis of the signatures of the seismic signals in the time and frequency domains can be used to identify the different phases of a debris flow. This study investigates the different stages of the seismic signals generated by debris flows, including the advanced signal, the main front, and the decaying tail. Moreover, the characteristics of the advanced signals preceding the arrival of the main front are discussed for warning purposes. We present a permanent system, composed of two seismometers, deployed along the bank of Ai-Yu-Zi Creek in Nantou County, one of the streams in Taiwan with active debris flows. The three-axis seismometer, with a frequency response of 7 s - 200 Hz, was developed by the Institute of Earth Sciences (IES), Academia Sinica, for the purpose of detecting debris flows. The original idea of replacing the geophone system with the seismometer technique was to catch, thanks to the higher sensitivity, the advanced signals propagating from the upper reach of the stream before the debris flow arrives; low-frequency seismic waves can also be detected early because of their low attenuation. However, to avoid unnecessary ambient vibrations, the sensitivity was set lower than that of a general seismometer used to detect teleseisms. Three debris flows with different mean velocities were detected in 2013 and 2014. The typical triangular shape was clearly seen in the time series data and in the spectrograms of the seismic signals from the three events. The frequency analysis showed that an enormous debris flow carrying huge boulders induces low-frequency seismic waves. Owing to the lower attenuation of low-frequency waves, advanced signals mainly in the range of 2-10 Hz were detected several minutes prior to the arrival of the main surge of a debris flow. As a result, the lead time of the advanced signals could be used not only to extend the warning time, but also to identify the initiation location of a developing debris flow.
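
    A minimal sketch of the time-frequency analysis used to characterize debris-flow signals: a spectrogram of a seismic trace, from which the power in the low-frequency (2-10 Hz) band of the advanced signal can be tracked over time. The sampling rate, window length and input trace are assumed values.

    import numpy as np
    from scipy.signal import spectrogram

    def debris_flow_spectrogram(trace, fs=200.0, win_s=10.0):
        f, t, Sxx = spectrogram(trace, fs=fs, nperseg=int(win_s * fs),
                                noverlap=int(0.5 * win_s * fs))
        band = (f >= 2.0) & (f <= 10.0)
        advanced_band_power = Sxx[band].mean(axis=0)       # 2-10 Hz band power vs time
        return f, t, 10 * np.log10(Sxx + 1e-20), advanced_band_power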

  18. Multisensor of Remotely Sensed Data for Characterizing Seismotectonic Activities in Malaysia

    NASA Astrophysics Data System (ADS)

    Abu Bakar, Rabieahtul; Azahari Razak, Khamarrul; Anuar Jamaludin, Tajul; Tongkul, Felix; Mohamad, Zakaria; Ramli, Zamri; Abd Manap, Mohamad; Rahman, Muhammad Zulkarnain Abdul

    2015-04-01

    Seismically induced events pose serious hazards yet are difficult to predict. Despite remarkable efforts in mapping, monitoring and modelling such events at regional or local scales, the understanding of the processes in the Earth's dynamic system remains elusive. Although Malaysia lies in a relatively low seismic hazard zone, the current trend and pattern of seismotectonic activity triggered a series of fundamental studies to better understand the relationship between earthquakes, recent tectonics and seismically active fault zones. Several conventional mapping techniques have been used intensively but have shown limitations. Remote sensing is the preferred means of quantifying seismic activity accurately over a large area within a short period, yet only a few such studies have been carried out in this subduction region. Characterizing seismotectonic activity from space in a tropical environment is very challenging given the complexity of its physiographic, climatic and geologic conditions and of anthropogenic activities. Many factors control the success rate of the implementation, mainly the lack of historical earthquakes, of geomorphological evidence, and of proper identification of regional tectonic patterns. In this study, we aim to provide better insight for extracting and characterizing seismotectonic activity by integrating passive and active remotely sensed data, geodetic data, historical records, GIS-based data analysis and in-situ measurements, and to quantify them based on field investigation and expert knowledge. It is crucial to perform spatiotemporal analysis of this activity in the most seismically active region of north-western Sabah. A comprehensive geodatabase of seismotectonic events is developed, allowing us to analyse the spatiotemporal activity. A novel object-based image analysis method for extracting tropical seismically active faults and related seismotectonic features is introduced and evaluated. We aim to develop an exchangeable and transferable rule-set with optimal parameterization for these tasks. A geomorphometric remotely sensed approach is used to understand the tectonic geomorphology and the processes affecting the environment at different spatial scales. As a result of this study, questions related to cascading natural disasters, e.g. landslides, can be answered quantitatively. The development and application of seismically induced landslide hazard and risk zonation at different scales are conceptually presented and critically discussed. So far, quantitative evaluation of the uncertainties associated with spatial seismic hazard and risk prediction remains very challenging and is a focus of on-going research. In the near future, it will be crucial to address changes in climate and land use/land cover in relation to the temporal and spatial pattern of seismically induced landslides, and to assess, model and incorporate changes due to natural disasters into sustainable risk management. In conclusion, the characteristics, development and function of tectonic movement, as one component of the geomorphological process-response system, are crucial for a regional seismic study. Newly emerging multi-sensor remotely sensed data, coupled with satellite positioning systems, promise better mapping and monitoring tools for seismotectonic activity, which can be used to map, monitor and model related seismically induced processes for a comprehensive hazard and associated risk assessment.

  19. Assessment of offshore New Jersey sources of Beach replenishment sand by diversified application of geologic and geophysical methods

    USGS Publications Warehouse

    Waldner, J.S.; Hall, D.W.; Uptegrove, J.; Sheridan, R.E.; Ashley, G.M.; Esker, D.

    1999-01-01

    Beach replenishment serves the dual purpose of maintaining a source of tourism and recreation while protecting life and property. For New Jersey, sources for beach sand supply are increasingly found offshore. To meet present and future needs, geologic and geophysical techniques can be used to improve the identification, volume estimation, and determination of suitability, thereby making the mining and managing of this resource more effective. Current research has improved both data collection and interpretation of seismic surveys and vibracore analysis for projects investigating sand ridges offshore of New Jersey. The New Jersey Geological Survey in cooperation with Rutgers University is evaluating the capabilities of digital seismic data (in addition to analog data) to analyze sand ridges. The printing density of analog systems limits the dynamic range to about 24 dB. Digital acquisition systems with dynamic ranges above 100 dB can permit enhanced seismic profiles by trace static correction, deconvolution, automatic gain scaling, horizontal stacking and digital filtering. Problems common to analog data, such as wave-motion effects of surface sources, water-bottom reverberation, and bubble-pulse-width can be addressed by processing. More than 160 line miles of digital high-resolution continuous profiling seismic data have been collected at sand ridges off Avalon, Beach Haven, and Barnegat Inlet. Digital multichannel data collection has recently been employed to map sand resources within the Port of New York/New Jersey expanded dredge-spoil site located 3 mi offshore of Sandy Hook, New Jersey. Multichannel data processing can reduce multiples, improve signal-to-noise calculations, enable source deconvolution, and generate sediment acoustic velocities and acoustic impedance analysis. Synthetic seismograms based on empirical relationships among grain size distribution, density, and velocity from vibracores are used to calculate proxy values for density and velocity. The seismograms are then correlated to the digital seismic profile to confirm reflected events. They are particularly useful where individual reflection events cannot be detected but a waveform generated by several thin lithologic units can be recognized. Progress in application of geologic and geophysical methods provides advantages in detailed sediment analysis and volumetric estimation of offshore sand ridges. New techniques for current and ongoing beach replenishment projects not only expand our knowledge of the geologic processes involved in sand ridge origin and development, but also improve our assessment of these valuable resources. These reconnaissance studies provide extensive data to the engineer regarding the suitability and quantity of sand and can optimize placement and analysis of vibracore samples.
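
    A minimal sketch of building a synthetic seismogram from vibracore-derived proxy density and velocity logs, as described above: reflection coefficients are computed from acoustic impedance and convolved with a wavelet, to be correlated against the digital seismic profile. The logs, sample interval and wavelet frequency are placeholder values.

    import numpy as np

    def ricker(f0, dt, length_s=0.128):
        t = np.arange(-length_s / 2, length_s / 2, dt)
        a = (np.pi * f0 * t) ** 2
        return (1 - 2 * a) * np.exp(-a)

    def synthetic_seismogram(velocity, density, dt=0.0005, f0=500.0):
        """velocity, density: arrays sampled in two-way time at interval dt (s)."""
        impedance = velocity * density
        rc = np.diff(impedance) / (impedance[1:] + impedance[:-1])   # reflection coefficients
        return np.convolve(rc, ricker(f0, dt), mode="same")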

  20. Seismicity associated with quiescent-explosive transitions at dome forming eruptions: The July 2008 Vulcanian Explosion of Soufrière Hills Volcano, Montserrat

    NASA Astrophysics Data System (ADS)

    Rodgers, Mel; Smith, Patrick; Mather, Tamsin A.; Pyle, David M.

    2017-04-01

    During long-lived dome-forming eruptions volcanoes often transition between quiescent, effusive, and explosive behaviour. Soufrière Hills Volcano (SHV), Montserrat, has been erupting since 1995 and has repeatedly transitioned between these different phases of activity. At SHV many of the largest explosions have occurred either during periods of dome growth, or as major dome collapse events at the end of extrusion phases. However, on the 29th July 2008 a vulcanian explosion marked the transition from a quiescent phase (Pause 3) to explosion and then extrusion. This was one of the largest explosions by volume and the largest to occur outside a period of lava extrusion. The eruption was preceded by one of the most intense seismic swarms ever recorded at SHV. In this study we analysed precursory seismic data to investigate the subsurface volcanic processes that culminated in this eruption. We used spectral and multiplet analysis techniques, and applied a simple parameterization approach to relate monitoring observations (seismic, SO2, visual) to subsurface interpretations. These techniques would be available to most volcano observatories. Our study suggests that an initial VT swarm, coincident with ash-venting events, can be triggered by ascent of decoupled gas ahead of rising magma. A subsequent large LF swarm shows a coincident decrease in spectral content that we interpret as magma ascent through the upper conduit system. An ash-venting event on 27 July (a few hours before peak event rate) may have triggered rapid microlite growth. We observe an increase in the spectral content of the LF swarm that is concurrent with a decrease in event rates, suggesting pressurization of the magmatic system due to inhibited magmatic outgassing. Our results suggest that pressurization of the magmatic system may have occurred in the final 24 h before the vulcanian explosion. We also observe LP and Hybrid events within the same multiplet, suggesting that these events have very similar source processes and should be considered part of the same classification at SHV. Our study demonstrates the potential for using spectral and multiplet analysis to understand subsurface magmatic processes and for investigating the transition between quiescence and eruption.
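
    An illustrative sketch of a simple spectral-content parameterization for swarm events (a frequency index comparing high- and low-frequency energy), of the kind that can track the changes in spectral content through a seismic swarm described above. The band limits and input waveform are assumed values, not the study's exact parameters.

    import numpy as np

    def frequency_index(trace, fs, low_band=(1.0, 3.0), high_band=(6.0, 12.0)):
        freqs = np.fft.rfftfreq(len(trace), d=1.0 / fs)
        power = np.abs(np.fft.rfft(trace)) ** 2
        lo = power[(freqs >= low_band[0]) & (freqs < low_band[1])].mean()
        hi = power[(freqs >= high_band[0]) & (freqs < high_band[1])].mean()
        return np.log10(hi / (lo + 1e-20) + 1e-20)   # more negative = more low-frequency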

  1. A stochastic approach for model reduction and memory function design in hydrogeophysical inversion

    NASA Astrophysics Data System (ADS)

    Hou, Z.; Kellogg, A.; Terry, N.

    2009-12-01

    Geophysical (e.g., seismic, electromagnetic, radar) techniques and statistical methods are essential for research related to subsurface characterization, including monitoring subsurface flow and transport processes, oil/gas reservoir identification, etc. For deep subsurface characterization such as petroleum reservoir exploration, seismic methods have been widely used. Recently, electromagnetic (EM) methods have drawn great attention in the area of reservoir characterization. However, given the enormous computational demand of seismic and EM forward modeling, having too many unknown parameters in the modeling domain is usually prohibitive. For shallow subsurface applications, the characterization can be very complicated considering the complexity and nonlinearity of flow and transport processes in the unsaturated zone. It is therefore warranted to reduce the dimension of the parameter space to a reasonable level. Another common concern is how to make the best use of time-lapse data with spatial-temporal correlations. This is even more critical when we try to monitor subsurface processes using geophysical data collected at different times. The normal practice is to obtain the inverse images individually; these images are not necessarily continuous or even reasonably related, because of the non-uniqueness of hydrogeophysical inversion. We propose a stochastic framework integrating the minimum-relative-entropy concept, quasi-Monte Carlo sampling techniques, and statistical tests. The approach allows efficient and sufficient exploration of all possibilities of model parameters and evaluation of their significance for geophysical responses. The analyses enable us to reduce the parameter space significantly. The approach can be combined with Bayesian updating, allowing us to treat the updated ‘posterior’ pdf as a memory function, which stores all the information to date about the distributions of soil/field attributes/properties; we then consider the memory function as a new prior and generate samples from it for further updating when more geophysical data become available. We applied this approach to deep oil reservoir characterization and to shallow subsurface flow monitoring. The model reduction approach reliably helps reduce the joint seismic/EM/radar inversion computational time to reasonable levels. Continuous inversion images are obtained using time-lapse data with the “memory function” applied in the Bayesian inversion.

  2. Vertical Cable Seismic Survey for SMS exploration

    NASA Astrophysics Data System (ADS)

    Asakawa, Eiichi; Murakami, Fumitoshi; Tsukahara, Hotoshi; Mizohata, Shigeharu

    2014-05-01

    The Vertical Cable Seismic (VCS) survey is a reflection seismic method that uses hydrophone arrays vertically moored from the seafloor to record acoustic waves generated by sea-surface, deep-towed or ocean-bottom sources. By analyzing the reflections from the sub-seabed, we can image the subsurface structure. Because the VCS is an efficient high-resolution 3D seismic survey method for a spatially bounded area, we proposed it for the SMS survey tool development program that the Ministry of Education, Culture, Sports, Science and Technology (MEXT) started in 2009. We have been developing the VCS survey system, including not only the data acquisition hardware but also the data processing and analysis techniques. We carried out several VCS surveys combined with surface-towed, deep-towed and ocean-bottom sources, in water depths ranging from 100 m to 2100 m. Through these experiments, our VCS data acquisition system has been completed, but the data processing techniques are still under development. One of the most critical issues is positioning in the water: uncertainty in the positions of the source and of the hydrophones degrades the quality of the subsurface image. GPS navigation is available at the sea surface, but in the case of a deep-towed or ocean-bottom source, the accuracy of the shot position obtained with SSBL/USBL is not sufficient for very high-resolution imaging. We have therefore developed a new approach to determine positions in the water using the travel-time data from the source to the VCS hydrophones. In 2013, we carried out a second VCS survey using a surface-towed high-voltage sparker and an ocean-bottom source in the Izena Cauldron, one of the most promising SMS areas around Japan. The positions of the ocean-bottom source estimated by this method are consistent with the VCS field records. The VCS data acquired with the sparker have been processed with 3D PSTM, giving a very high-resolution 3D volume to depths greater than two hundred meters. Our VCS system has thus been demonstrated to be a promising survey tool for SMS exploration.
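
    A sketch of the travel-time-based positioning idea: the (x, y, z) position of an ocean-bottom or deep-towed source is estimated by least squares from direct-wave travel times to hydrophones at known positions, assuming a constant water velocity. The hydrophone geometry, velocity, picks and starting point are placeholder values, not the survey's actual configuration.

    import numpy as np
    from scipy.optimize import least_squares

    def locate_source(hydrophones, travel_times, v_water=1500.0, x0=(0.0, 0.0, 1000.0)):
        """hydrophones: (n, 3) x/y/z in metres; travel_times: (n,) seconds."""
        def residuals(p):
            dist = np.linalg.norm(hydrophones - p, axis=1)
            return dist / v_water - travel_times
        sol = least_squares(residuals, x0=np.asarray(x0))
        return sol.x

    # Purely illustrative example with hydrophones on three vertical cables:
    hyd = np.array([[0, 0, 1900], [0, 0, 1700],
                    [400, 0, 1900], [400, 0, 1700],
                    [200, 350, 1900], [200, 350, 1700]], dtype=float)
    true_src = np.array([150.0, -80.0, 2050.0])
    tt = np.linalg.norm(hyd - true_src, axis=1) / 1500.0
    print(locate_source(hyd, tt))        # recovers approximately [150, -80, 2050]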

  3. Probabilistic reasoning over seismic RMS time series: volcano monitoring through HMMs and SAX technique

    NASA Astrophysics Data System (ADS)

    Aliotta, M. A.; Cassisi, C.; Prestifilippo, M.; Cannata, A.; Montalto, P.; Patanè, D.

    2014-12-01

    During recent years, volcanic activity at Mt. Etna has often been characterized by cyclic occurrences of lava fountains. In the period between January 2011 and June 2013, 38 lava fountain episodes were observed. Automatic recognition of the volcano's states related to lava fountain episodes (Quiet, Pre-Fountaining, Fountaining, Post-Fountaining) is very useful for monitoring purposes. We found that such states are strongly related to the trend of the RMS (Root Mean Square) of the seismic signal recorded in the summit area. Within the framework of the PON SIGMA project (Integrated Cloud-Sensor System for Advanced Multirisk Management), we modelled the system generating the sampled values (treating the RMS time series as a stochastic process with Markov dynamics) by using Hidden Markov Models (HMMs), which are a powerful tool for modelling any time-varying series. HMM analysis seeks to discover the sequence of hidden states from the observed emissions. In our framework, the observed emissions are characters generated by the SAX (Symbolic Aggregate approXimation) technique, which maps RMS time-series values to discrete literal emissions. Our experiments showed how to predict volcano states by means of SAX and HMMs.
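
    A minimal sketch of the SAX (Symbolic Aggregate approXimation) step that converts an RMS time series into literal emissions, the input to the HMM stage. The alphabet size, number of segments and toy series below are assumed values following the standard SAX recipe (z-normalization, piecewise aggregate approximation, Gaussian breakpoints).

    import numpy as np
    from scipy.stats import norm

    def sax(series, n_segments=50, alphabet="abcd"):
        x = (series - series.mean()) / (series.std() + 1e-12)                   # z-normalize
        paa = np.array([seg.mean() for seg in np.array_split(x, n_segments)])   # PAA
        k = len(alphabet)
        breakpoints = norm.ppf(np.arange(1, k) / k)     # equiprobable regions under N(0, 1)
        symbols = np.searchsorted(breakpoints, paa)
        return "".join(alphabet[s] for s in symbols)

    rms = np.abs(np.random.default_rng(1).normal(size=5000)).cumsum() / 1000.0  # toy RMS trend
    print(sax(rms))    # a string such as "aaabbbcc...", fed to the HMM as emissions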

  4. Earthquake source imaging by high-resolution array analysis at regional distances: the 2010 M7 Haiti earthquake as seen by the Venezuela National Seismic Network

    NASA Astrophysics Data System (ADS)

    Meng, L.; Ampuero, J. P.; Rendon, H.

    2010-12-01

    Back projection of teleseismic waves based on array processing has become a popular technique for earthquake source imaging, in particular to track the areas of the source that generate the strongest high-frequency radiation. The technique has previously been applied to study the rupture process of the Sumatra earthquake and the supershear rupture of the Kunlun earthquake. Here we attempt to image the Haiti earthquake using data recorded by the Venezuela National Seismic Network (VNSN). The network is composed of 22 broad-band stations with an East-West oriented geometry, and is located approximately 10 degrees away from Haiti in the direction perpendicular to the Enriquillo fault strike. This is the first opportunity to exploit the privileged position of the VNSN to study large earthquake ruptures in the Caribbean region. It is also a great opportunity to explore back projection of the crustal Pn phase at regional distances, which provides unique complementary insights to teleseismic source inversions. The challenge in the analysis of the 2010 M7.0 Haiti earthquake is its very compact source region, possibly shorter than 30 km, which is below the resolution limit of standard back projection techniques based on beamforming. Results of back projection analysis using teleseismic USArray data reveal few details of the rupture process. To overcome the classical resolution limit we explored the Multiple Signal Classification (MUSIC) method, a high-resolution array processing technique based on the signal-noise orthogonality in the eigenspace of the data covariance, which achieves both enhanced resolution and a better ability to resolve closely spaced sources. We experimented with various synthetic earthquake scenarios to test the resolution and find that MUSIC provides at least 3 times higher resolution than beamforming. We also studied the inherent bias due to the interference of coherent Green's functions, which leads to a potential quantification of the bias-related uncertainty of back projection. Preliminary results from the Venezuela data set show an east-to-west rupture propagation along the fault with sub-Rayleigh rupture speed, consistent with a compact source with two significant asperities, which are confirmed by the source time function obtained from Green's function deconvolution and by other source inversion results. These efforts could lead the Venezuela National Seismic Network to play a prominent role in the timely characterization of the rupture process of large earthquakes in the Caribbean, including future ruptures along the yet unbroken segments of the Enriquillo fault system.
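
    For readers unfamiliar with MUSIC, a minimal narrowband sketch for a uniform linear array is given below: the sample covariance is eigendecomposed and the pseudospectrum is the inverse of each steering vector's projection onto the noise subspace. The array geometry, element spacing and source angles are hypothetical and unrelated to the Haiti dataset or the Pn-phase implementation.

    ```python
    import numpy as np

    def music_spectrum(snapshots, n_sources, d_over_lambda, angles_deg):
        """Narrowband MUSIC pseudospectrum for a uniform linear array.
        snapshots: (n_sensors, n_snapshots) complex analytic signals."""
        n_sensors = snapshots.shape[0]
        R = snapshots @ snapshots.conj().T / snapshots.shape[1]   # sample covariance
        eigval, eigvec = np.linalg.eigh(R)                        # ascending eigenvalues
        En = eigvec[:, : n_sensors - n_sources]                   # noise subspace
        k = np.arange(n_sensors)
        power = []
        for theta in np.deg2rad(angles_deg):
            a = np.exp(2j * np.pi * d_over_lambda * k * np.sin(theta))  # steering vector
            power.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)
        return np.array(power)

    # Synthetic test: two closely spaced plane waves at 8 and 12 degrees
    rng = np.random.default_rng(1)
    n_sensors, n_snap = 16, 200
    k = np.arange(n_sensors)[:, None]
    s = np.exp(2j * np.pi * rng.random((2, n_snap)))              # uncorrelated sources
    A = np.exp(2j * np.pi * 0.5 * k * np.sin(np.deg2rad([8.0, 12.0])))
    data = A @ s + 0.1 * (rng.standard_normal((n_sensors, n_snap))
                          + 1j * rng.standard_normal((n_sensors, n_snap)))
    angles = np.linspace(0.0, 20.0, 201)
    p = music_spectrum(data, n_sources=2, d_over_lambda=0.5, angles_deg=angles)
    print("pseudospectrum maximum near %.1f degrees" % angles[np.argmax(p)])
    ```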

  5. Seismic Methods of Identifying Explosions and Estimating Their Yield

    NASA Astrophysics Data System (ADS)

    Walter, W. R.; Ford, S. R.; Pasyanos, M.; Pyle, M. L.; Myers, S. C.; Mellors, R. J.; Pitarka, A.; Rodgers, A. J.; Hauk, T. F.

    2014-12-01

    Seismology plays a key national security role in detecting, locating, identifying and determining the yield of explosions from a variety of causes, including accidents, terrorist attacks and nuclear testing treaty violations (e.g. Koper et al., 2003, 1999; Walter et al. 1995). A collection of mainly empirical forensic techniques has been successfully developed over many years to obtain source information on explosions from their seismic signatures (e.g. Bowers and Selby, 2009). However, a lesson from the three declared DPRK nuclear explosions since 2006 is that our historic collection of data may not be representative of future nuclear test signatures (e.g. Selby et al., 2012). To have confidence in identifying future explosions amongst the background of other seismic signals, and in accurately estimating their yield, we need to put our empirical methods on a firmer physical footing. The goals of current research are to improve our physical understanding of the mechanisms by which explosions generate S- and surface waves, and to advance our ability to numerically model and predict them. As part of that process we are re-examining regional seismic data from a variety of nuclear test sites, including the DPRK and the former Nevada Test Site (now the Nevada National Security Site, NNSS). Newer relative location and amplitude techniques can be employed to better quantify differences between explosions and to understand those differences in terms of depth, media and other properties. We are also making use of the Source Physics Experiments (SPE) at NNSS. The SPE chemical explosions are explicitly designed to improve our understanding of emplacement and source material effects on the generation of shear and surface waves (e.g. Snelson et al., 2013). Finally, we are also exploring the value of combining seismic information with other technologies, including acoustic and InSAR techniques, to better understand the source characteristics. Our goal is to improve our explosion models and our ability to understand and predict where methods of identifying explosions and estimating their yield work well, and any circumstances where they may not.

  6. Recent progress and application on seismic isolation energy dissipation and control for structures in China

    NASA Astrophysics Data System (ADS)

    Zhou, Fulin; Tan, Ping

    2018-01-01

    China is a country where 100% of the territory is located in a seismic zone. Most strong earthquakes exceed prediction, and most fatalities are caused by structural collapse. Earthquakes not only cause severe damage to structures, but can also damage non-structural elements on and inside facilities. This can halt city life and disrupt hospitals, airports, bridges, power plants, and other infrastructure. Designers need to use new techniques to protect structures and the facilities inside them. Isolation, energy dissipation and control systems have been more and more widely used in recent years in China. Currently, there are nearly 6,500 structures with isolation and about 3,000 structures with passive energy dissipation or hybrid control in China. The mitigation techniques are applied to structures such as residential buildings, large or complex structures, bridges, underwater tunnels, historical or cultural relic sites, and industrial facilities, and are used for retrofitting of existing structures. This paper introduces design rules and some new and innovative devices for seismic isolation, energy dissipation and hybrid control for civil and industrial structures. This paper also discusses the development trends for seismic resistance, seismic isolation, and passive and active control techniques for the future in China and in the world.

  7. Ambient Seismic Noise Interferometry on the Island of Hawai`i

    NASA Astrophysics Data System (ADS)

    Ballmer, Silke

    Ambient seismic noise interferometry has been successfully applied in a variety of tectonic settings to gain information about the subsurface. As a passive seismic technique, it extracts the coherent part of ambient seismic noise in-between pairs of seismic receivers. Measurements of subtle temporal changes in seismic velocities and high-resolution tomographic imaging are then possible - two applications of particular interest for volcano monitoring. Promising results from other volcanic settings motivate its application in Hawai'i, with this work being the first to explore its potential. The dataset used for this purpose was recorded by the Hawaiian Volcano Observatory's permanent seismic network on the Island of Hawai'i. It spans 2.5 years from 5/2007 to 12/2009 and covers two distinct sources of volcanic tremor. After applying standard processing for ambient seismic noise interferometry, we find that volcanic tremor strongly affects the extracted noise information not only close to the tremor source but, unexpectedly, throughout the island-wide network. Besides demonstrating how this long-range observability of volcanic tremor can be used to monitor volcanic activity in the absence of a dense seismic array, our results suggest that care must be taken when applying ambient seismic noise interferometry in volcanic settings. In a second step, we thus exclude days that show signs of volcanic tremor, reducing the dataset to three months, and perform ambient seismic noise tomography. The resulting two-dimensional Rayleigh wave group velocity maps for 0.1 - 0.9 Hz compare very well with images from previous travel time tomography, both for the main volcanic structures at low frequencies and for smaller features at mid-to-high frequencies - a remarkable observation for the temporally truncated dataset. These robust results suggest that ambient seismic noise tomography in Hawai'i is suitable 1) to provide a three-dimensional S-wave model for the volcanoes and 2) to be used for repeated time-sensitive tomography, even though volcanic tremor frequently obscures ambient noise analyses. However, the noise characteristics and the wavefield in Hawai'i in general remain to be investigated in more detail in order to measure unbiased temporal velocity changes.
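
    The core of the noise-interferometry workflow, cross-correlating continuous records between a station pair to approximate the inter-station Green's function, can be sketched as below. One-bit normalization and spectral whitening are one common pre-processing choice (not necessarily the one used in this study), and the synthetic "common wavefield" test is purely illustrative.

    ```python
    import numpy as np

    def noise_cross_correlation(trace_a, trace_b, max_lag, dt):
        """Noise cross-correlation between two stations (a minimal sketch),
        with one-bit normalization and spectral whitening."""
        def whiten(x):
            spec = np.fft.rfft(np.sign(x))          # one-bit normalization
            return spec / (np.abs(spec) + 1e-12)    # spectral whitening

        n = len(trace_a)
        cc = np.fft.irfft(whiten(trace_a) * np.conj(whiten(trace_b)), n)
        cc = np.roll(cc, n // 2)                    # put zero lag in the middle
        lags = (np.arange(n) - n // 2) * dt
        keep = np.abs(lags) <= max_lag
        return lags[keep], cc[keep]

    # Synthetic test: the same random wavefield recorded with a 2 s delay at B
    rng = np.random.default_rng(2)
    dt, delay = 0.05, 2.0
    noise = rng.standard_normal(20000)
    trace_a = noise
    trace_b = np.roll(noise, int(delay / dt))
    lags, cc = noise_cross_correlation(trace_a, trace_b, max_lag=10.0, dt=dt)
    print("correlation peak at |lag| = %.2f s" % abs(lags[np.argmax(np.abs(cc))]))
    ```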

  8. Classifying seismic waveforms from scratch: a case study in the alpine environment

    NASA Astrophysics Data System (ADS)

    Hammer, C.; Ohrnberger, M.; Fäh, D.

    2013-01-01

    Nowadays, an increasing amount of seismic data is collected by daily observatory routines. The basic step for successfully analyzing those data is the correct detection of various event types. However, visual scanning is a time-consuming task. Applying standard detection techniques such as the STA/LTA trigger still requires manual control for classification. Here, we present a useful alternative. The incoming data stream is scanned automatically for events of interest. A stochastic classifier, called a hidden Markov model, is learned for each class of interest, enabling the recognition of highly variable waveforms. In contrast to other automatic techniques such as neural networks or support vector machines, the algorithm allows the classification to be started from scratch as soon as interesting events are identified. Neither the tedious process of collecting training samples nor a time-consuming configuration of the classifier is required. An approach originally introduced for the volcanic task force action allows classifier properties to be learned from a single waveform example and some hours of background recording. Besides reducing the required workload, this also enables the detection of very rare events. Especially the latter feature is a milestone for the use of seismic devices in alpine warning systems. Furthermore, the system offers the opportunity to flag new signal classes that have not been defined before. We demonstrate the application of the classification system using a data set from the Swiss Seismological Service, achieving very high recognition rates. In detail, we document all refinements of the classifier, providing a step-by-step guide for the fast set-up of a well-working classification system.
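
    As a point of reference, the standard STA/LTA trigger that the abstract contrasts with the HMM classifier can be run in a few lines with ObsPy; the window lengths and thresholds below are ad-hoc choices for ObsPy's bundled example waveform, not values from this study.

    ```python
    import obspy
    from obspy.signal.trigger import classic_sta_lta, trigger_onset

    # ObsPy ships a small example stream; real use would read observatory data.
    st = obspy.read()                     # example waveform (BW.RJOB, 100 Hz)
    tr = st[0]
    df = tr.stats.sampling_rate

    # Characteristic function with 1 s short-term and 10 s long-term windows
    cft = classic_sta_lta(tr.data, int(1 * df), int(10 * df))

    # Switch-on / switch-off thresholds are tuning parameters, chosen here ad hoc
    for on, off in trigger_onset(cft, 1.5, 0.5):
        print("event candidate: %.2f - %.2f s" % (on / df, off / df))
    ```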

  9. Origin and nature of crustal reflections: Results from integrated seismic measurements at the KTB superdeep drilling site

    NASA Astrophysics Data System (ADS)

    Harjes, H.-P.; Bram, K.; Dürbaum, H.-J.; Gebrande, H.; Hirschmann, G.; Janik, M.; KlöCkner, M.; Lüschen, E.; Rabbel, W.; Simon, M.; Thomas, R.; Tormann, J.; Wenzel, F.

    1997-08-01

    For almost 10 years the KTB superdeep drilling project has offered an excellent field laboratory for adapting seismic techniques to crystalline environments and for testing new ideas for interpreting seismic reflections in terms of lithological or textural properties of metamorphic rock units. The seismic investigations culminated in a three-dimensional (3-D) reflection survey on a 19×19 km area with the drill site at its center. Interpretation of these data resulted in a detailed structural model of the German Continental Deep Drilling Program (KTB) location with dominant steep faults in the upper crust. The 3-D reflection survey was part of a suite of seismic experiments, ranging from wide-angle reflection and refraction profiles to standard vertical seismic profiles (VSP) and more sophisticated surface-to-borehole observations. It was predicted that the drill bit would meet the most prominent, steeply dipping crustal reflector at a depth of about 6500-7000 m, and indeed, the borehole penetrated a major fault zone in the depth interval between 6850 and 7300 m. This reflector offered the rare opportunity to relate logging results, reflective properties, and geology to observed and modeled data. Post-Variscan thrusting caused cataclastic deformation, with partial, strong alteration within a steeply dipping reverse fault zone. This process generated impedance contrasts within the fault zone on a lateral scale large enough to cause seismic reflections. This was confirmed by borehole measurements along the whole 9.1 km deep KTB profile. The strongest reflected signals originated from fluid-filled fractures and cataclastic fracture zones rather than from lithological boundaries (i.e., first-order discontinuities between different rock types) or from texture- and/or foliation-induced anisotropy. During the interpretation of seismic data at KTB several lessons were learned: Conventional processing of two-dimensional (2-D) reflection data from a presite survey showed predominantly subhorizontal layering in the upper crust with reflectivity striking in the Variscan direction. Drilling, however, revealed that all rock units are steeply dipping. This confirms that surface common depth point (CDP) seismics strongly enhances subhorizontal reflectivity and may thus produce a very misleading crustal image. Although this was shown for synthetic examples earlier, the KTB provides the experimental proof of how crucial this insight can be.

  10. Volcano seismology

    USGS Publications Warehouse

    Chouet, B.

    2003-01-01

    A fundamental goal of volcano seismology is to understand active magmatic systems, to characterize the configuration of such systems, and to determine the extent and evolution of source regions of magmatic energy. Such understanding is critical to our assessment of eruptive behavior and its hazardous impacts. With the emergence of portable broadband seismic instrumentation, availability of digital networks with wide dynamic range, and development of new powerful analysis techniques, rapid progress is being made toward a synthesis of high-quality seismic data to develop a coherent model of eruption mechanics. Examples of recent advances are: (1) high-resolution tomography to image subsurface volcanic structures at scales of a few hundred meters; (2) use of small-aperture seismic antennas to map the spatio-temporal properties of long-period (LP) seismicity; (3) moment tensor inversions of very-long-period (VLP) data to derive the source geometry and mass-transport budget of magmatic fluids; (4) spectral analyses of LP events to determine the acoustic properties of magmatic and associated hydrothermal fluids; and (5) experimental modeling of the source dynamics of volcanic tremor. These promising advances provide new insights into the mechanical properties of volcanic fluids and subvolcanic mass-transport dynamics. As new seismic methods refine our understanding of seismic sources, and geochemical methods better constrain mass balance and magma behavior, we face new challenges in elucidating the physico-chemical processes that cause volcanic unrest and its seismic and gas-discharge manifestations. Much work remains to be done toward a synthesis of seismological, geochemical, and petrological observations into an integrated model of volcanic behavior. Future important goals must include: (1) interpreting the key types of magma movement, degassing and boiling events that produce characteristic seismic phenomena; (2) characterizing multiphase fluids in subvolcanic regimes and determining their physical and chemical properties; and (3) quantitatively understanding multiphase fluid flow behavior under dynamic volcanic conditions. To realize these goals, not only must we learn how to translate seismic observations into quantitative information about fluid dynamics, but we also must determine the underlying physics that governs vesiculation, fragmentation, and the collapse of bubble-rich suspensions to form separate melt and vapor. Refined understanding of such processes, which is essential for quantitative short-term eruption forecasts, will require multidisciplinary research involving detailed field measurements, laboratory experiments, and numerical modeling.

  11. Fast principal component analysis for stacking seismic data

    NASA Astrophysics Data System (ADS)

    Wu, Juan; Bai, Min

    2018-04-01

    Stacking seismic data plays an indispensable role in many steps of the seismic data processing and imaging workflow. Optimal stacking of seismic data can help mitigate seismic noise and enhance the principal components to a great extent. Traditional average-based seismic stacking methods cannot obtain optimal performance when the ambient noise is extremely strong. We propose a principal component analysis (PCA) algorithm for stacking seismic data that is not sensitive to the noise level. Considering the computational bottleneck of the classic PCA algorithm in processing massive seismic data, we propose an efficient PCA algorithm that makes the method readily applicable for industrial applications. Two synthetic examples and one real seismic dataset are used to demonstrate the performance of the presented method.
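
    The idea of PCA-based stacking can be illustrated with a plain SVD on a small synthetic gather. This is a minimal sketch of the general concept, not the fast algorithm proposed in the paper; the wavelet, noise level and trace count are arbitrary.

    ```python
    import numpy as np

    def pca_stack(gather):
        """Stack an NMO-corrected gather along its first principal component.
        gather: (n_traces, n_samples); returns one stacked trace."""
        u, s, vt = np.linalg.svd(gather, full_matrices=False)
        # Rank-1 approximation: the dominant coherent component across traces
        return s[0] * np.mean(u[:, 0]) * vt[0]

    # Synthetic gather: a Ricker-like wavelet common to all traces plus noise
    rng = np.random.default_rng(3)
    t = np.linspace(-1.0, 1.0, 500)
    wavelet = (1 - 2 * (np.pi * 5 * t) ** 2) * np.exp(-(np.pi * 5 * t) ** 2)
    gather = wavelet + 0.3 * rng.standard_normal((30, t.size))

    for name, stack in [("mean", gather.mean(axis=0)), ("pca", pca_stack(gather))]:
        r = np.corrcoef(stack, wavelet)[0, 1]
        print("%s stack correlation with true wavelet: %.3f" % (name, abs(r)))
    ```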

  12. Microearthquake Studies at the Salton Sea Geothermal Field

    DOE Data Explorer

    Templeton, Dennise

    2013-10-01

    The objective of this project is to detect and locate microearthquakes to aid in the characterization of reservoir fracture networks. Accurate identification and mapping of the large numbers of microearthquakes induced in EGS provides diagnostic information for determining the location, orientation and length of underground crack systems for use in reservoir development and management applications. Conventional earthquake location techniques are often employed to locate microearthquakes. However, these techniques require labor-intensive picking of individual seismic phase onsets across a network of sensors. For this project we adapt the Matched Field Processing (MFP) technique to the elastic propagation problem in geothermal reservoirs to identify more and smaller events than traditional methods alone.
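
    Matched Field Processing in its simplest (Bartlett) form correlates the observed cross-spectral density matrix with modeled replica vectors over a grid of candidate source locations; the best-matching grid point is the estimated source. The sketch below assumes a homogeneous acoustic medium and a 1-D sensor line purely for illustration; the project's elastic-propagation formulation is considerably more involved.

    ```python
    import numpy as np

    def bartlett_mfp(csdm, replicas):
        """Bartlett matched-field processor at one frequency.
        csdm: (n_sensors, n_sensors) cross-spectral density matrix.
        replicas: (n_grid, n_sensors) modeled replica vectors per grid node."""
        power = np.empty(replicas.shape[0])
        for i, w in enumerate(replicas):
            w = w / np.linalg.norm(w)
            power[i] = np.real(w.conj() @ csdm @ w)
        return power

    # Hypothetical 1-D example: sensors on a line, homogeneous velocity (assumed)
    freq, velocity = 5.0, 3000.0
    sensors = np.linspace(0.0, 2000.0, 8)
    grid = np.linspace(0.0, 2000.0, 201)

    def replica(src_x):
        delays = np.abs(sensors - src_x) / velocity
        return np.exp(-2j * np.pi * freq * delays)

    true_x = 730.0
    d = replica(true_x)                      # noise-free data vector at one frequency
    csdm = np.outer(d, d.conj())             # rank-1 CSDM from a single snapshot
    power = bartlett_mfp(csdm, np.array([replica(x) for x in grid]))
    print("MFP maximum at x = %.0f m (true %.0f m)" % (grid[np.argmax(power)], true_x))
    ```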

  13. Levee evaluation using MASW: Preliminary findings from the Citrus Lakefront Levee, New Orleans, Louisiana

    USGS Publications Warehouse

    Lane, John W.; Ivanov, Julian M.; Day-Lewis, Frederick D.; Clemens, Drew; Patev, Robert; Miller, Richard D.

    2008-01-01

    The utility of the multi‐channel analysis of surface waves (MASW) seismic method for non‐invasive assessment of earthen levees was evaluated for a section of the Citrus Lakefront Levee, New Orleans, Louisiana. This test was conducted after the New Orleans' area levee system had been stressed by Hurricane Katrina in 2005. The MASW data were acquired in a seismically noisy, urban environment using an accelerated weight‐drop seismic source and a towed seismic land streamer. Much of the seismic data were contaminated with higher‐order mode guided‐waves, requiring application of muting filtering techniques to improve interpretability of the dispersion curves. Comparison of shear‐wave velocity sections with boring logs suggests the existence of four distinct horizontal layers within and beneath the levee: (1) the levee core, (2) the levee basal layer of fat clay, (3) a sublevel layer of silty sand, and (4) underlying Pleistocene deposits of sandy lean clay. Along the surveyed section of levee, lateral variations in shear‐wave velocity are interpreted as changes in material rigidity, suggestive of construction or geologic heterogeneity, or possibly, that dynamic processes (such as differential settlement) are affecting discrete levee areas. The results of this study suggest that the MASW method is a geophysical tool with significant potential for non‐invasive characterization of vertical and horizontal variations in levee material shear strength. Additional work, however, is needed to fully understand and address the complex seismic wave propagation in levee structures.

  14. Classifying seismic noise and sources from OBS data using unsupervised machine learning

    NASA Astrophysics Data System (ADS)

    Mosher, S. G.; Audet, P.

    2017-12-01

    The paradigm of plate tectonics was established mainly by recognizing the central role of oceanic plates in the production and destruction of tectonic plates at their boundaries. Since that realization, however, seismic studies of tectonic plates and their associated deformation have slowly shifted their attention toward continental plates due to the ease of installation and maintenance of high-quality seismic networks on land. The result has been a much more detailed understanding of the seismicity patterns associated with continental plate deformation in comparison with the low-magnitude deformation patterns within oceanic plates and at their boundaries. While the number of high-quality ocean-bottom seismometer (OBS) deployments within the past decade has demonstrated the potential to significantly increase our understanding of tectonic systems in oceanic settings, OBS data poses significant challenges to many of the traditional data processing techniques in seismology. In particular, problems involving the detection, location, and classification of seismic sources occurring within oceanic settings are much more difficult due to the extremely noisy seafloor environment in which data are recorded. However, classifying data without a priori constraints is a problem that is routinely pursued via unsupervised machine learning algorithms, which remain robust even in cases involving complicated datasets. In this research, we apply simple unsupervised machine learning algorithms (e.g., clustering) to OBS data from the Cascadia Initiative in an attempt to classify and detect a broad range of seismic sources, including various noise sources and tremor signals occurring within ocean settings.
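
    A minimal sketch of the unsupervised-classification idea: describe each detected signal window with a few simple attributes and cluster the feature vectors without any labels. The features, their values and the three synthetic populations below are entirely hypothetical and are not drawn from the Cascadia Initiative data.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    # Hypothetical feature table: rows are signal windows, columns are
    # [dominant frequency (Hz), duration (s), kurtosis]; values are synthetic.
    rng = np.random.default_rng(4)
    earthquakes = np.column_stack([rng.normal(8.0, 1.0, 100),
                                   rng.normal(20.0, 5.0, 100),
                                   rng.normal(3.0, 0.5, 100)])
    tremor      = np.column_stack([rng.normal(3.0, 0.5, 100),
                                   rng.normal(300.0, 50.0, 100),
                                   rng.normal(0.5, 0.2, 100)])
    ship_noise  = np.column_stack([rng.normal(15.0, 2.0, 100),
                                   rng.normal(600.0, 100.0, 100),
                                   rng.normal(0.2, 0.1, 100)])
    features = np.vstack([earthquakes, tremor, ship_noise])

    # Standardize, then cluster without labels (unsupervised)
    z = (features - features.mean(axis=0)) / features.std(axis=0)
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(z)
    print("cluster sizes:", np.bincount(labels))
    ```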

  15. Nature of the seismic crust at the Aegir Ridge: A downward continuation approach

    NASA Astrophysics Data System (ADS)

    Rai, Abhishek; Breivik, Asbjørn; Mjelde, Rolf; Hanan, Barry; Ito, Garrett; Sayit, Kaan; Howell, Sam; Vogt, Peter; Pedersen, Rolf-Birger

    2013-04-01

    Marine seismic data are influenced by variations in the thickness and velocity of the water column, which cause fluctuations in the arrival times of seismic phases. Downward continuation of ocean-bottom seismometer (OBS) data is used to remove the contribution of the water column by bringing the shots and receivers to a common datum, such as the seafloor. Additionally, downward continuation focuses the seismic energy and hence improves the resolution. We apply the downward continuation technique to analyze OBS data collected along the eastern shoulder of the Aegir Ridge, an extinct spreading ridge in the North-East Atlantic Ocean. Its proximity to the active Iceland hotspot makes it important for understanding the process of hotspot-ridge interaction during the Oligocene. We present results of an OBS experiment, supported by single-channel streamer, gravity and magnetic observations. Usable seismic data from 20 OBSs distributed along the ~550 km long profile reveal variations in crustal thickness and seismic velocities. Regional magnetic anomalies show a faster spreading rate towards the north and slower spreading towards the southern end, near the Iceland hotspot, during the active period of the ridge. However, the observed and the predicted crustal thickness show an opposite trend. We interpret this anti-correlation between the seafloor spreading rate and the crustal thickness as a result of the interaction between the Iceland hotspot and the Aegir Ridge.

  16. Seismic Regionalization of Michoacan, Mexico and Recurrence Periods for Earthquakes

    NASA Astrophysics Data System (ADS)

    Magaña García, N.; Figueroa-Soto, Á.; Garduño-Monroy, V. H.; Zúñiga, R.

    2017-12-01

    Michoacán is one of the Mexican states with the highest occurrence of earthquakes. It lies along the convergent margin on the Pacific coast where the Cocos plate subducts beneath the North American plate, and it also hosts active faults within the state, such as the Morelia-Acambay Fault System (MAFS). It is important to combine seismic, paleoseismological and geological studies to support planning and development of urban areas and to mitigate disasters from destructive earthquakes. With statistical seismology it is possible to characterize the degree of seismic activity as well as to estimate recurrence periods for earthquakes. For this work, the seismicity catalog of Michoacán was compiled and homogenized in time and magnitude. This information was obtained from international and national agencies (SSN, CMT, etc.), from data published by Mendoza and Martínez-López (2016), and from the seismic catalog homogenized by F. R. Zúñiga (personal communication). From the analysis of the different focal mechanisms reported in the literature and from geological studies, the seismic regionalization of the state of Michoacán complements the one presented by Vázquez-Rosas (2012), and recurrence periods were estimated for earthquakes within the four seismotectonic regions. In addition, stable periods were determined for the b value of the Gutenberg-Richter (1944) relation using the Maximum Curvature (MAXC) and EMR (Entire Magnitude Range, 2005) techniques, which allowed us to determine recurrence periods, with both the EMR and MAXC techniques, for earthquakes of magnitude greater than 7.5 in the subduction zone (zone A), greater than 5 in zone B1, greater than 7.0 in zone B2, and greater than 5 in the Morelia-Acambay Fault System zone (zone C). These recurrence periods are very similar to the periods calculated by Garduño-Monroy (2009) and Sunye-Puchol (2015) using paleoseismological methods. If we consider that the MAFS crosses Zacapu, Pátzcuaro, Morelia, Cuitzeo, Maravatío and Acambay, the affected population would be around 1,132,807 inhabitants.
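
    The b-value and completeness-magnitude estimation mentioned above can be sketched with the Aki maximum-likelihood estimator and a simple maximum-curvature (MAXC) pick of Mc. The synthetic catalog and bin width below are illustrative only, and the EMR method is not reproduced here.

    ```python
    import numpy as np

    def b_value_aki(magnitudes, mc):
        """Aki (1965) maximum-likelihood b-value for events with M >= Mc.
        (For catalogs binned to width dM, Mc is usually replaced by Mc - dM/2.)"""
        m = magnitudes[magnitudes >= mc]
        return np.log10(np.e) / (m.mean() - mc)

    def mc_max_curvature(magnitudes, dm=0.1):
        """Completeness magnitude from the maximum-curvature (MAXC) criterion:
        the magnitude bin holding the largest number of events."""
        bins = np.arange(magnitudes.min(), magnitudes.max() + dm, dm)
        counts, edges = np.histogram(magnitudes, bins)
        return edges[np.argmax(counts)]

    # Synthetic catalog: Gutenberg-Richter distributed magnitudes with b = 1.0
    rng = np.random.default_rng(5)
    mags = 2.0 + rng.exponential(1.0 / np.log(10), 5000)

    mc = mc_max_curvature(mags)
    print("Mc (MAXC): %.1f,  b-value: %.2f" % (mc, b_value_aki(mags, mc)))
    ```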

  17. Probing dynamic hydrologic system of slowly-creeping landslides with passive seismic imaging: A comprehensive landslide monitoring site at Lantai, Ilan area in Taiwan

    NASA Astrophysics Data System (ADS)

    Huang, H. H.; Hsu, Y. J.; Kuo, C. Y.; Chen, C. C.; Kuo, L. W.; Chen, R. F.; Lin, C. R.; Lin, P. P.; Lin, C. W.; Lin, M. L.; Wang, K. L.

    2017-12-01

    A unique landslide monitoring project integrating multidisciplinary geophysical experiments such as GPS, inclinometer, piezometer, and spontaneous potential logging has been established at Lantai, in the Ilan area, to investigate the possible detachment depth range and the physical mechanism of a slowly creeping landslide. In parallel with this, a recently deployed local seismic network also provides an opportunity to employ passive seismic imaging to detect time-lapse changes of seismic velocity in and around the landslide area. This technique, which retrieves Green's functions by cross-correlation of continuous ambient noise, has in recent years opened new opportunities for seismological monitoring of environmental and tectonic processes such as groundwater variation, magma intrusion under volcanoes, and co-seismic medium damage. Integrating these geophysical observations, we explore the primary controls on the derived seismic velocity changes and especially the hydrological response of the landslide to the passage of Typhoon Megi in September 2016, which could further our understanding of the dynamic system of landslides and in turn help hazard mitigation.

  18. Applying new seismic analysis techniques to the lunar seismic dataset: New information about the Moon and planetary seismology on the eve of InSight

    NASA Astrophysics Data System (ADS)

    Dimech, J. L.; Weber, R. C.; Knapmeyer-Endrun, B.; Arnold, R.; Savage, M. K.

    2016-12-01

    The field of planetary science is poised for a major advance with the upcoming InSight mission to Mars, due to launch in May 2018. Seismic analysis techniques adapted for use on planetary data are therefore highly relevant to the field. The heart of this project is the application of new seismic analysis techniques to the lunar seismic dataset to learn more about the Moon's crust and mantle structure, with particular emphasis on 'deep' moonquakes, which are situated roughly half-way between the lunar surface and the core and have no surface expression. Techniques proven to work on the Moon might also be beneficial for InSight and future planetary seismology missions, which face similar technical challenges. The techniques include: (1) an event-detection and classification algorithm based on 'Hidden Markov Models' to reclassify known moonquakes and look for new ones; Apollo 17 gravimeter and geophone data will also be included in this effort. (2) Measurements of anisotropy in the lunar mantle and crust using 'shear-wave splitting'. Preliminary measurements on deep moonquakes using the MFAST program are encouraging, and continued evaluation may reveal new structural information on the Moon's mantle. (3) Probabilistic moonquake locations using NonLinLoc, a non-linear hypocenter location technique, with a modified version of the codes designed to work with the Moon's radius. Successful application may provide a new catalog of moonquake locations with rigorous uncertainty information, which would be a valuable input into (4) new fault-plane constraints from focal mechanisms using a novel approach to Bayes' theorem which factors in uncertainties in hypocenter coordinates and S-P amplitude ratios. Preliminary results, such as shear-wave splitting measurements, will be presented and discussed.

  19. Terahertz reflection imaging using Kirchhoff migration.

    PubMed

    Dorney, T D; Johnson, J L; Van Rudd, J; Baraniuk, R G; Symes, W W; Mittleman, D M

    2001-10-01

    We describe a new imaging method that uses single-cycle pulses of terahertz (THz) radiation. This technique emulates data-collection and image-processing procedures developed for geophysical prospecting and is made possible by the availability of fiber-coupled THz receiver antennas. We use a simple migration procedure to solve the inverse problem; this permits us to reconstruct the location and shape of targets. These results demonstrate the feasibility of the THz system as a test-bed for the exploration of new seismic processing methods involving complex model systems.
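
    Kirchhoff (diffraction-stack) migration, the seismic-processing procedure borrowed here, sums each image point's contribution along the diffraction travel-time curve it would produce in the data. A constant-velocity, zero-offset sketch is given below; the velocity, geometry and single point scatterer are hypothetical, and the amplitude-weighting and anti-aliasing terms of a full Kirchhoff operator are omitted.

    ```python
    import numpy as np

    def kirchhoff_migrate(data, xs, ts, xi, zi, velocity):
        """Constant-velocity, zero-offset diffraction-stack migration.
        data: (n_traces, n_samples) section recorded at positions xs, times ts.
        Returns an image on the (zi, xi) grid."""
        dt = ts[1] - ts[0]
        image = np.zeros((len(zi), len(xi)))
        for itr, x_src in enumerate(xs):
            for ix, x in enumerate(xi):
                # two-way travel time from every depth point to this trace
                t = 2.0 * np.sqrt((x - x_src) ** 2 + zi ** 2) / velocity
                idx = np.round((t - ts[0]) / dt).astype(int)
                valid = idx < len(ts)
                image[valid, ix] += data[itr, idx[valid]]
        return image

    # Synthetic test: one point diffractor under a 21-trace zero-offset section
    velocity = 2000.0
    xs = np.linspace(0.0, 400.0, 21)
    ts = np.arange(0.0, 0.5, 0.001)
    xd, zd = 200.0, 150.0                          # true diffractor position
    data = np.zeros((len(xs), len(ts)))
    for itr, x_src in enumerate(xs):
        t = 2.0 * np.sqrt((x_src - xd) ** 2 + zd ** 2) / velocity
        data[itr, int(round(t / 0.001))] = 1.0     # spike on the diffraction hyperbola

    xi, zi = np.linspace(0.0, 400.0, 81), np.linspace(10.0, 300.0, 59)
    image = kirchhoff_migrate(data, xs, ts, xi, zi, velocity)
    iz, ix = np.unravel_index(np.argmax(image), image.shape)
    print("image maximum at x = %.0f m, z = %.0f m" % (xi[ix], zi[iz]))
    ```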

  20. Automated classification of seismic sources in a large database: a comparison of Random Forests and Deep Neural Networks.

    NASA Astrophysics Data System (ADS)

    Hibert, Clement; Stumpf, André; Provost, Floriane; Malet, Jean-Philippe

    2017-04-01

    In the past decades, the increasing quality of seismic sensors and the capability to transfer large quantities of data remotely have led to a fast densification of local, regional and global seismic networks for near real-time monitoring of crustal and surface processes. This technological advance permits the use of seismology to document geological and natural/anthropogenic processes (volcanoes, ice-calving, landslides, snow and rock avalanches, geothermal fields), but has also led to an ever-growing quantity of seismic data. This wealth of seismic data makes the construction of complete seismicity catalogs, which include earthquakes but also other sources of seismic waves, more challenging and very time-consuming, as this critical pre-processing stage is classically done by human operators and because hundreds of thousands of seismic signals have to be processed. To overcome this issue, the development of automatic methods for the processing of continuous seismic data appears to be a necessity. The classification algorithm should satisfy the need for a method that is robust, precise and versatile enough to be deployed to monitor seismicity in very different contexts. In this study, we evaluate the ability of two machine learning algorithms, Random Forest and Deep Neural Network classifiers, to analyze seismic sources at the Piton de la Fournaise volcano. We gather a catalog of more than 20,000 events belonging to 8 classes of seismic sources. We define 60 attributes, based on the waveform, the frequency content and the polarization of the seismic waves, to parameterize the recorded seismic signals. We show that both algorithms provide similar positive classification rates, with values exceeding 90% of the events. When trained with a sufficient number of events, the rate of positive identification can reach 99%. These very high rates of positive identification open the perspective of an operational implementation of these algorithms for near real-time monitoring of mass movements and other environmental sources at the local, regional and even global scale.
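
    A Random Forest classifier of the kind evaluated here can be set up in a few lines with scikit-learn. The synthetic feature matrix below (events described by 60 attributes, 4 well-separated classes) is only a stand-in for the real attribute catalog, and the hyperparameters are illustrative.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    # Hypothetical feature matrix: rows are events, columns are waveform,
    # spectral and polarization attributes; values and labels are synthetic.
    rng = np.random.default_rng(6)
    n_per_class, n_features, n_classes = 300, 60, 4
    X = np.vstack([rng.normal(loc=c, scale=1.0, size=(n_per_class, n_features))
                   for c in range(n_classes)])
    y = np.repeat(np.arange(n_classes), n_per_class)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0, stratify=y)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_train, y_train)
    print("positive classification rate: %.3f"
          % accuracy_score(y_test, clf.predict(X_test)))
    ```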

  1. 2D Seismic Reflection Data across Central Illinois

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Valerie; Leetaru, Hannes

    In a continuing collaboration with the Midwest Geologic Sequestration Consortium (MGSC) on the Evaluation of the Carbon Sequestration Potential of the Cambro-Ordovician Strata of the Illinois and Michigan Basins project, Schlumberger Carbon Services and WesternGeco acquired two-dimensional (2D) seismic data in the Illinois Basin. This work included the design, acquisition and processing of approximately 125 miles of 2D seismic reflection surveys running west to east in the central Illinois Basin. Schlumberger Carbon Services and WesternGeco oversaw the management of the field operations (including pre-shoot planning, mobilization, acquisition and de-mobilization of the field personnel and equipment), procurement of the necessary permits to conduct the survey, post-shoot closure, and processing of the raw data, and provided expert consultation as needed in the interpretation of the delivered product. Three 2D seismic lines were acquired across central Illinois during November and December 2010 and January 2011. Traversing the Illinois Basin, this 2D seismic survey was designed to image the stratigraphy of the Cambro-Ordovician sections and also to discern the basement topography. Prior to this survey, there were no regionally extensive 2D seismic data spanning this section of the Illinois Basin. Between the NW side of Morgan County and the northwestern border of Douglas County, these seismic lines ran through very rural portions of the state. Starting in Morgan County, Line 101 was the longest at 93 miles in length and ended NE of Decatur, Illinois. Line 501 ran W-E from the Illinois Basin – Decatur Project (IBDP) site to northwestern Douglas County and was 25 miles in length. Line 601 was the shortest; it ran N-S past the IBDP site and connected lines 101 and 501. All three lines are correlated to well logs at the IBDP site. Originally processed in 2011, the 2D seismic profiles exhibited a degradation of signal quality below ~400 milliseconds (ms), which made interpretation of the Mt. Simon and Knox sections difficult. The data quality also gradually decreased moving westward across the state. To meet evolving project objectives, in 2012 the seismic data were re-processed using different techniques to enhance the signal quality, thereby rendering a more coherent seismic profile for interpreters. It is believed that the seismic degradation could be caused by shallow natural gas deposits and Quaternary sediments (which include abandoned river and stream channels, former ponds, and swamps with peat deposits) that may have complicated or changed the seismic wavelet. Where previously limited by seismic coverage, the seismic profiles have provided valuable subsurface information across central Illinois. Some of the interpretations based on this survey include, but are not limited to:
    - Stratigraphy generally dips gently to the east from Morgan to Douglas County.
    - The Knox Supergroup roughly maintains its thickness. There is little evidence for faulting in the Knox; however, at least one resolvable fault penetrates the entire Knox section.
    - The Eau Claire Formation, the primary seal for the Mt. Simon Sandstone, appears to be continuous across the entire seismic profile.
    - The Mt. Simon Sandstone thins towards the western edge of the basin. As a result, the highly porous lowermost Mt. Simon section is absent in the western part of the state.
    - Overall basement dip is from west to east.
    - Basement topography shows evidence of basement highs with on-lapping patterns by Mt. Simon sediments.
    - There is evidence of faults within the lower Mt. Simon Sandstone and basement rock that are contemporaneous with Mt. Simon Sandstone deposition. These faults are not active and do not penetrate the Eau Claire Shale. It is believed that these faults are associated with a possible failed rifting event 750 to 560 million years ago during the breakup of the supercontinent Rodinia.

  2. Receiver Function Imaging of Crustal and Lithospheric Structure Beneath the Jalisco Block and Western Michoacan, Mexico.

    NASA Astrophysics Data System (ADS)

    Reyes Alfaro, G.; Cruz-Atienza, V. M.; Perez-Campos, X.; Reyes Dávila, G. A.

    2014-12-01

    We used a receiver function technique for imaging western Mexico, a unique area with several active seismic and volcanic zones, such as the triple junction of the Rivera, Cocos and North American plates and the Colima volcano complex (CVC), the most active in Mexico. Clear images of the crust and the lithosphere-asthenosphere boundary are obtained using P-to-S receiver functions (RF) from around 80 broadband stations of the Mapping the Rivera Subduction Zone (MARS) experiment, the Colima Volcano Deep Seismic Experiment (CODEX) and a local network (RESCO), which allowed us to considerably increase the teleseismic database used in the project. For imaging, we constructed several 2-D profiles of depth-transformed RFs to delineate the seismic discontinuities of the region. Low seismic velocities associated with the Michoacan-Guanajuato and the Mascota-Ayutla-Tapalpa volcanic fields are also observed. Most impressively, a large and well-delineated magma body 100 km underneath the CVC is recognized, along with a clearly related depression of the Moho discontinuity just above it. These results bring more tools for a better understanding of the deep processes that ultimately control eruptive behavior in the region.

  3. New Version of SeismicHandler (SHX) based on ObsPy

    NASA Astrophysics Data System (ADS)

    Stammler, Klaus; Walther, Marcus

    2016-04-01

    The command line version of SeismicHandler (SH), a scientific analysis tool for seismic waveform data developed around 1990, has been redesigned in recent years, based on a project funded by the Deutsche Forschungsgemeinschaft (DFG). The aim was to address new data access techniques, simplified metadata handling and a modularized software design. As a result, the main parts of the program were rewritten in Python, taking advantage of the simplicity of this scripting language and its variety of well-developed software libraries, including ObsPy. SHX provides easy access to waveforms and metadata via the ArcLink and FDSN web-service protocols; access to event catalogs is also implemented. With single commands, whole networks or stations within a certain area may be read in, with the metadata retrieved from the servers and stored in a local database. For data processing the large set of SH commands is available, as well as the SH scripting language. Via SH-language scripts or additional Python modules, the command set of SHX is easily extendable. The program is open source and tested on Linux operating systems; documentation and downloads are found at "https://www.seismic-handler.org/".
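
    For readers who want to see the kind of FDSN web-service access that SHX wraps, a minimal ObsPy sketch is given below; the data center, station codes and time window are placeholders and are not taken from the SHX documentation.

    ```python
    from obspy import UTCDateTime
    from obspy.clients.fdsn import Client

    # Placeholder request: one hour of broadband data from a hypothetical choice
    # of data center and station; adjust codes and times for real use.
    client = Client("IRIS")
    t0 = UTCDateTime("2011-03-11T05:46:00")
    st = client.get_waveforms(network="IU", station="ANMO", location="00",
                              channel="BHZ", starttime=t0, endtime=t0 + 3600)
    st.detrend("demean")
    st.filter("bandpass", freqmin=0.01, freqmax=1.0)
    print(st)

    # Station metadata (instrument responses) via the same FDSN service
    inv = client.get_stations(network="IU", station="ANMO", level="response")
    print(inv)
    ```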

  4. 3-D Characterization of Seismic Properties at the Smart Weapons Test Range, YPG

    NASA Astrophysics Data System (ADS)

    Miller, Richard D.; Anderson, Thomas S.; Davis, John C.; Steeples, Don W.; Moran, Mark L.

    2001-10-01

    The Smart Weapons Test Range (SWTR) lies within the Yuma Proving Ground (YPG), Arizona. SWTR is a new facility constructed specifically for the development and testing of futuristic intelligent battlefield sensor networks. In this paper, results are presented for an extensive high-resolution geophysical characterization study at the SWTR site along with validation using 3-D modeling. In this study, several shallow seismic methods and novel processing techniques were used to generate a 3-D grid of earth seismic properties, including compressional (P) and shear (S) body-wave speeds (Vp and Vs), and their associated body-wave attenuation parameters (Qp, and Qs). These experiments covered a volume of earth measuring 1500 m by 300 m by 25 m deep (11 million cubic meters), centered on the vehicle test track at the SWTR site. The study has resulted in detailed characterizations of key geophysical properties. To our knowledge, results of this kind have not been previously achieved, nor have the innovative methods developed for this effort been reported elsewhere. In addition to supporting materiel developers with important geophysical information at this test range, the data from this study will be used to validate sophisticated 3-D seismic signature models for moving vehicles.

  5. Medium effect on the characteristics of the coupled seismic and electromagnetic signals.

    PubMed

    Huang, Qinghua; Ren, Hengxin; Zhang, Dan; Chen, Y John

    2015-01-01

    Recently developed numerical simulation techniques can simulate the coupled seismic and electromagnetic signals for a double-couple point source or a finite planar fault source. Besides the source effect, the simulation results show that both the medium structure and the medium properties affect the coupled seismic and electromagnetic signals. The waveform of the coupled signals for a layered structure is more complicated than that for a simple uniform structure. Unlike the seismic signals, the electromagnetic signals are sensitive to medium properties such as fluid salinity and fluid viscosity. Therefore, the co-seismic electromagnetic signals may be more informative than seismic signals.

  6. Medium effect on the characteristics of the coupled seismic and electromagnetic signals

    PubMed Central

    HUANG, Qinghua; REN, Hengxin; ZHANG, Dan; CHEN, Y. John

    2015-01-01

    Recently developed numerical simulation techniques can simulate the coupled seismic and electromagnetic signals for a double-couple point source or a finite planar fault source. Besides the source effect, the simulation results show that both the medium structure and the medium properties affect the coupled seismic and electromagnetic signals. The waveform of the coupled signals for a layered structure is more complicated than that for a simple uniform structure. Unlike the seismic signals, the electromagnetic signals are sensitive to medium properties such as fluid salinity and fluid viscosity. Therefore, the co-seismic electromagnetic signals may be more informative than seismic signals. PMID:25743062

  7. High Resolution Near Surface 3D Seismic Experiments: A Carbonate Platform vs. a Siliciclastic Sequence

    NASA Astrophysics Data System (ADS)

    Filippidou, N.; Drijkoningen, G.; Braaksma, H.; Verwer, K.; Kenter, J.

    2005-05-01

    Interest in high-resolution 3D seismic experiments for imaging shallow targets has increased over the past years. Many case studies show that producing clear seismic images with this non-invasive method is still a challenge. We use two test sites where nearby outcrops are present, so that an accurate geological model can be built and the seismic result validated. The first so-called natural field laboratory is located in Boulonnais (N. France). It is an upper Jurassic siliciclastic sequence, the age equivalent of the source rock of the North Sea. The second is located at Cap Blanc, to the southwest of the island of Mallorca (Spain), depicting an excellent example of a Miocene prograding reef platform (Llucmajor Platform); it is a textbook analog for carbonate reservoirs. In both cases, the multidisciplinary experiment included the use of multicomponent and quasi-3D or full 3D seismic recordings. The target depth does not exceed 120 m. Vertical and shear portable vibrators were used as sources. In the center of the setups, boreholes were drilled and Vertical Seismic Profiles were shot, along with core and borehole measurements both in situ and in the laboratory. These two geologically different sites, with different seismic stratigraphy, have provided us with exceptionally high resolution seismic images. In general the seismic data were processed more or less following standard procedures; a few innovative techniques applied to the Mallorca data, such as rotation of horizontal components, 3D F-K filtering and the addition of parallel profiles, improved the seismic image. In this paper we discuss the basic differences as seen on the seismic sections. The Boulonnais data present highly continuous reflection patterns of extremely high resolution, which facilitated a high-resolution stratigraphic description. Results from the VSP showed substantial wave energy attenuation. However, the high-fold (330 traces) Mallorca seismic experiment returned a rather discontinuous pattern of possible reflectors, in contrast to the predicted seismic stratigraphy/geology of the area. The Llucmajor Platform has been buried only a few meters at most; therefore primary and secondary porosity remain intact, creating a fractal-like environment of scatterers and diffractors. We have interpreted two possible reflections, the top of the reef and the water table; the former is nicely coupled with the VSP. The seismic wave attenuation observed is believed to be predominantly due to scattering effects.

  8. SiSeRHMap v1.0: a simulator for mapped seismic response using a hybrid model

    NASA Astrophysics Data System (ADS)

    Grelle, Gerardo; Bonito, Laura; Lampasi, Alessandro; Revellino, Paola; Guerriero, Luigi; Sappa, Giuseppe; Guadagno, Francesco Maria

    2016-04-01

    The SiSeRHMap (simulator for mapped seismic response using a hybrid model) is a computerized methodology capable of elaborating prediction maps of seismic response in terms of acceleration spectra. It was realized on the basis of a hybrid model which combines different approaches and models in a new and non-conventional way. These approaches and models are organized in a code architecture composed of five interdependent modules. A GIS (geographic information system) cubic model (GCM), which is a layered computational structure based on the concept of lithodynamic units and zones, aims at reproducing a parameterized layered subsoil model. A meta-modelling process confers a hybrid nature to the methodology. In this process, the one-dimensional (1-D) linear equivalent analysis produces acceleration response spectra for a specified number of site profiles using one or more input motions. The shear wave velocity-thickness profiles, defined as trainers, are randomly selected in each zone. Subsequently, a numerical adaptive simulation model (Emul-spectra) is optimized on the above trainer acceleration response spectra by means of a dedicated evolutionary algorithm (EA) and the Levenberg-Marquardt algorithm (LMA) as the final optimizer. In the final step, the GCM maps executor module produces a serial map set of a stratigraphic seismic response at different periods, grid solving the calibrated Emul-spectra model. In addition, the spectra topographic amplification is also computed by means of a 3-D validated numerical prediction model. This model is built to match the results of the numerical simulations related to isolate reliefs using GIS morphometric data. In this way, different sets of seismic response maps are developed on which maps of design acceleration response spectra are also defined by means of an enveloping technique.

  9. Improved 3D seismic attribute mapping by CRS stacking instead of NMO stacking: Application to a geothermal reservoir in the Polish Basin

    NASA Astrophysics Data System (ADS)

    Pussak, Marcin; Bauer, Klaus; Stiller, Manfred; Bujakowski, Wieslaw

    2014-04-01

    Within a seismic reflection processing work flow, the common-reflection-surface (CRS) stack can be applied as an alternative for the conventional normal moveout (NMO) or the dip moveout (DMO) stack. The advantages of the CRS stack include (1) data-driven automatic determination of stacking operator parameters, (2) imaging of arbitrarily curved geological boundaries, and (3) significant increase in signal-to-noise (S/N) ratio by stacking far more traces than used in a conventional stack. In this paper we applied both NMO and CRS stackings to process a sparse 3D seismic data set acquired within a geothermal exploration study in the Polish Basin. The stacked images show clear enhancements in quality achieved by the CRS stack in comparison with the conventional stack. While this was expected from previous studies, we also found remarkable improvements in the quality of seismic attributes when the CRS stack was applied instead of the conventional stack. For the major geothermal target reservoir (Lower Jurassic horizon Ja1), we present a comparison between both stacking methods for a number of common attributes, including root-mean-square (RMS) amplitudes, instantaneous frequencies, coherency, and spectral decomposition attributes derived from the continuous wavelet transform. The attribute maps appear noisy and highly fluctuating after the conventional stack, and are clearly structured after the CRS stack. A seismic facies analysis was finally carried out for the Ja1 horizon using the attributes derived from the CRS stack by using self-organizing map clustering techniques. A corridor parallel to a fault system was identified, which is characterized by decreased RMS amplitudes and decreased instantaneous frequencies. In our interpretation, this region represents a fractured, fluid-bearing compartment within the sandstone reservoir, which indicates favorable conditions for geothermal exploitation.
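
    Two of the attributes compared in the study, RMS amplitude and instantaneous frequency, can be computed per trace in a few lines using the Hilbert transform. The synthetic "horizon" window and all numbers below are chosen for illustration and are not taken from the Polish Basin data.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def horizon_attributes(traces, dt, window):
        """RMS amplitude and mean instantaneous frequency in a window around a
        horizon, for each trace of a stacked section (a minimal sketch)."""
        rms = np.sqrt(np.mean(traces[:, window] ** 2, axis=1))
        phase = np.unwrap(np.angle(hilbert(traces, axis=1)), axis=1)
        inst_freq = np.diff(phase, axis=1) / (2.0 * np.pi * dt)
        return rms, inst_freq[:, window].mean(axis=1)

    # Synthetic stacked section: 50 traces of a 30 Hz burst centred at 0.5 s
    rng = np.random.default_rng(7)
    dt, n_samples = 0.002, 500
    t = np.arange(n_samples) * dt
    traces = (np.cos(2 * np.pi * 30.0 * t) * np.exp(-((t - 0.5) / 0.05) ** 2)
              + 0.05 * rng.standard_normal((50, n_samples)))
    window = slice(230, 270)            # samples bracketing the "horizon"
    rms, mean_if = horizon_attributes(traces, dt, window)
    print("mean RMS amplitude: %.2f, mean inst. frequency: %.1f Hz"
          % (rms.mean(), mean_if.mean()))
    ```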

  10. Tidal triggering of earthquakes suggests poroelastic behavior on the San Andreas Fault

    DOE PAGES

    Delorey, Andrew A.; van der Elst, Nicholas J.; Johnson, Paul Allan

    2016-12-28

    Tidal triggering of earthquakes is hypothesized to provide quantitative information regarding the fault's stress state, poroelastic properties, and may be significant for our understanding of seismic hazard. To date, studies of regional or global earthquake catalogs have had only modest successes in identifying tidal triggering. We posit that the smallest events that may provide additional evidence of triggering go unidentified and thus we developed a technique to improve the identification of very small magnitude events. We identify events applying a method known as inter-station seismic coherence where we prioritize detection and discrimination over characterization. Here we show tidal triggering of earthquakes on the San Andreas Fault. We find the complex interaction of semi-diurnal and fortnightly tidal periods exposes both stress threshold and critical state behavior. Lastly, our findings reveal earthquake nucleation processes and pore pressure conditions – properties of faults that are difficult to measure, yet extremely important for characterizing earthquake physics and seismic hazards.

  11. Report of reprocessing of reflection seismic profile X-5 Waste Isolation Pilot Plant site, Eddy County, New Mexico

    USGS Publications Warehouse

    Miller, John J.

    1983-01-01

    Seismic reflection profile X-5 exhibits a 7,700 ft long anomalous zone of poor quality to nonexistent reflections between shotpoints 100 and 170, compared to the high-quality, flat-lying, coherent reflections on either side. Results from drill holes in the area suggest 'layer cake' geology with no detectable abnormalities such as faults present. In an attempt to determine whether the anomalous zone of the seismic profile is an artifact or actually indicates a geologic condition, the data were extensively reprocessed using state-of-the-art processing techniques and the following conclusions were made: 1. The field-recorded data in the anomalous zone are of poor quality due to surface conditions and recording parameters used. 2. Reprocessing shows reflectors throughout the anomalous zone at all levels. However, it cannot prove that the reflectors are continuous throughout the anomalous zone. 3. Significant improvement in data quality may be achieved if the line is reshot using carefully determined recording parameters.

  12. Using Network Theory to Understand Seismic Noise in Dense Arrays

    NASA Astrophysics Data System (ADS)

    Riahi, N.; Gerstoft, P.

    2015-12-01

    Dense seismic arrays offer an opportunity to study anthropogenic seismic noise sources with unprecedented detail. Man-made sources typically have high frequency, low intensity, and propagate as surface waves. As a result attenuation restricts their measurable footprint to a small subset of sensors. Medium heterogeneities can further introduce wave front perturbations that limit processing based on travel time. We demonstrate a non-parametric technique that can reliably identify very local events within the array as a function of frequency and time without using travel-times. The approach estimates the non-zero support of the array covariance matrix and then uses network analysis tools to identify clusters of sensors that are sensing a common source. We verify the method on simulated data and then apply it to the Long Beach (CA) geophone array. The method exposes a helicopter traversing the array, oil production facilities with different characteristics, and the fact that noise sources near roads tend to be around 10-20 Hz.
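
    The grouping step described above, finding clusters of sensors whose records are mutually coherent, can be sketched with a graph built from a thresholded coherence matrix and its connected components. The 12-sensor matrix and threshold below are synthetic and arbitrary; this illustrates only the network-analysis idea, not the paper's estimator of the covariance support.

    ```python
    import numpy as np
    import networkx as nx

    def coherent_sensor_clusters(coherence, threshold):
        """Group sensors sharing a common local source: threshold the coherence
        matrix, build a graph, and return its non-trivial connected components."""
        adjacency = (np.abs(coherence) >= threshold).astype(int)
        np.fill_diagonal(adjacency, 0)
        graph = nx.from_numpy_array(adjacency)
        return [sorted(c) for c in nx.connected_components(graph) if len(c) > 1]

    # Synthetic 12-sensor array: sensors 0-3 see one local source, 7-9 another
    n = 12
    coherence = 0.05 * np.ones((n, n))
    for group in ([0, 1, 2, 3], [7, 8, 9]):
        for i in group:
            for j in group:
                coherence[i, j] = 0.8
    np.fill_diagonal(coherence, 1.0)

    print(coherent_sensor_clusters(coherence, threshold=0.5))
    # -> [[0, 1, 2, 3], [7, 8, 9]]
    ```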

  13. Tidal triggering of earthquakes suggests poroelastic behavior on the San Andreas Fault

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delorey, Andrew A.; van der Elst, Nicholas J.; Johnson, Paul Allan

    Tidal triggering of earthquakes is hypothesized to provide quantitative information regarding the fault's stress state, poroelastic properties, and may be significant for our understanding of seismic hazard. To date, studies of regional or global earthquake catalogs have had only modest successes in identifying tidal triggering. We posit that the smallest events that may provide additional evidence of triggering go unidentified and thus we developed a technique to improve the identification of very small magnitude events. We identify events applying a method known as inter-station seismic coherence where we prioritize detection and discrimination over characterization. Here we show tidal triggering of earthquakes on the San Andreas Fault. We find the complex interaction of semi-diurnal and fortnightly tidal periods exposes both stress threshold and critical state behavior. Lastly, our findings reveal earthquake nucleation processes and pore pressure conditions – properties of faults that are difficult to measure, yet extremely important for characterizing earthquake physics and seismic hazards.

  14. Seismic tomography: theory and practice

    USGS Publications Warehouse

    Iyer, H.M.; Hirahara, Kazuro

    1993-01-01

    Although highly theoretical and computer-orientated, seismic tomography has created spectacular images of anomalies within the Earth with dimensions of thousands of kilometers down to a few tens of meters. These images have enabled Earth scientists working in diverse areas to attack fundamental problems relating to the deep dynamical processes within our planet. Additionally, this technique is being used extensively to study the Earth's hazardous regions such as earthquake fault zones and volcanoes, as well as features beneficial to man such as oil- or mineral-bearing structures. This book has been written by world experts and describes the theories, experimental and analytical procedures, and results of applying seismic tomography from global to purely local scales. It represents the collective global perspective on the state of the art and focuses not only on the theoretical and practical aspects, but also on the uses for hydrocarbon, mineral and geothermal exploitation. Students and researchers in the Earth sciences, and research and exploration geophysicists, should find this a useful, practical reference book for all aspects of their work.

  15. Tidal triggering of earthquakes suggests poroelastic behavior on the San Andreas Fault

    USGS Publications Warehouse

    Delorey, Andrew; Van Der Elst, Nicholas; Johnson, Paul

    2017-01-01

    Tidal triggering of earthquakes is hypothesized to provide quantitative information regarding the fault's stress state, poroelastic properties, and may be significant for our understanding of seismic hazard. To date, studies of regional or global earthquake catalogs have had only modest successes in identifying tidal triggering. We posit that the smallest events that may provide additional evidence of triggering go unidentified and thus we developed a technique to improve the identification of very small magnitude events. We identify events applying a method known as inter-station seismic coherence where we prioritize detection and discrimination over characterization. Here we show tidal triggering of earthquakes on the San Andreas Fault. We find the complex interaction of semi-diurnal and fortnightly tidal periods exposes both stress threshold and critical state behavior. Our findings reveal earthquake nucleation processes and pore pressure conditions – properties of faults that are difficult to measure, yet extremely important for characterizing earthquake physics and seismic hazards.

  16. ON-SITE CAVITY LOCATION-SEISMIC PROFILING AT NEVADA TEST SITE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Forbes, C.B.; Peterson, R.A.; Heald, C.L.

    1961-10-25

    Experimental seismic studies were conducted at the Nevada Test Site for the purpose of designing and evaluating the most promising seismic techniques for on-site inspection. Post-explosion seismic profiling was done in volcanic tuff in the vicinity of the Rainier and Blanca underground explosions. Pre-explosion seismic profiling was done over granitic rock outcrops in the Climax Stock area, and over tuff at the proposed locations for Linen and Orchid. Near surface velocity profiling techniques based on measurements of seismic time-distance curves gave evidence of disturbances in near surface rock velocities over the Rainier and Blanca sites (refer also to abstract 30187). These disturbances appear to be related to near surface fracturing and spallation effects resulting from the reflection of the original intense compression wave pulse at the near surface as a tension pulse. Large tuned seismometer arrays were used for horizontal seismic ranging in an attempt to record "back-scattered" or reflected seismic waves from subsurface cavities or zones of rock fracturing around the underground explosions. Some possible seismic events were recorded from the near vicinities of the Rainier and Blanca sites. However, many more similar events were recorded from numerous other locations, presumably originating from naturally occurring underground geological features. No means was found for discriminating between artificial and natural events recorded by horizontal seismic ranging, and the results were, therefore, not immediately useful for inspection purposes. It is concluded that in some instances near surface velocity profiling methods may provide a useful tool in verifying the presence of spalled zones above underground nuclear explosion sites. In the case of horizontal seismic ranging it appears that successful application would require development of satisfactory means for recognition of and discrimination against seismic responses to naturally occurring geological features. It is further concluded that, although more sophisticated instrumentation systems can be conceived, the most promising returns for effort expended can be expected to come from increased experience, skill, and human ingenuity in applying existing techniques. The basic problem is in large part a geological one of differentiating seismic response to man-made irregularities from that of natural features which are of a similar or greater size and universally present. It would not appear realistic to consider the seismic tool as a proven routine device for giving clear answers in on-site inspection operations. Application must still be considered largely experimental. (auth)

  17. Investigating the Differences between Triggered and Background Seismicity in Italy and Southern California

    NASA Astrophysics Data System (ADS)

    Stallone, A.; Marzocchi, W.

    2017-12-01

    Earthquake occurrence may be approximated by a multidimensional Poisson clustering process, where each point of the Poisson process is replaced by a cluster of points, the latter corresponding to the well-known aftershock sequence (triggered events). Earthquake clusters and their parents are assumed to occur according to a Poisson process at a constant temporal rate proportional to the tectonic strain rate, while events within a cluster are modeled as generations of dependent events reproduced by a branching process. Although the occurrence of such space-time clusters is a general feature in different tectonic settings, seismic sequences seem to have marked differences from region to region: one example, among many others, is that seismic sequences of moderate magnitude in the Italian Apennines seem to last longer than similar seismic sequences in California. In this work we investigate possible differences in the earthquake clustering process in these two areas. First, we separate the triggered and background components of seismicity in the Italian and Southern California seismic catalogs. Then we study the space-time domain of the triggered earthquakes with the aim of identifying possible variations in the triggering properties across the two regions. In the second part of the work we focus our attention on the characteristics of the background seismicity in both seismic catalogs. The assumption of time stationarity of the background seismicity (which includes both cluster parents and isolated events) is still under debate. Some authors suggest that the independent component of seismicity could undergo transient perturbations at various time scales due to different physical mechanisms, such as viscoelastic relaxation, the presence of fluids, or non-stationary plate motion, whose impact may depend on the tectonic setting. Here we test whether the background seismicity in the two regions can be satisfactorily described by a time-homogeneous Poisson process and, if not, we quantitatively characterize the discrepancies from this reference process and the differences between the two regions.
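    As a sketch of the stationarity test mentioned above (one of several possible formulations, not necessarily the authors' implementation), declustered background event times can be compared with a time-homogeneous Poisson process by testing the inter-event times against an exponential distribution:

```python
# Kolmogorov-Smirnov test of inter-event times against an exponential law
# (a simple check of the time-homogeneous Poisson hypothesis; note that
# estimating the rate from the same data makes the test only approximate).
import numpy as np
from scipy.stats import kstest

def poisson_background_test(event_times_days):
    t = np.sort(np.asarray(event_times_days, dtype=float))
    dt = np.diff(t)
    rate = 1.0 / dt.mean()                     # maximum-likelihood constant rate
    stat, p_value = kstest(dt, "expon", args=(0.0, 1.0 / rate))
    return rate, stat, p_value
```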

  18. Seismic Risk Mitigation of Historical Minarets Using SMA Wire Dampers

    NASA Astrophysics Data System (ADS)

    El-Attar, Adel G.; Saleh, Ahmed M.; El-Habbal, Islam R.

    2008-07-01

    This paper presents the results of a research program sponsored by the European Commission through project WIND-CHIME (Wide Range Non-INtrusive Devices toward Conservation of HIstorical Monuments in the MEditerranean Area), in which the possibility of using advanced seismic protection technologies to preserve historical monuments in the Mediterranean area is investigated. In the current research, two outstanding Egyptian Mamluk-style minarets are investigated. The first is the southern minaret of Al-Sultaniya (1340 A.D., 739 Hijri Date (H.D.)); the second is the minaret of Qusun (1337 A.D., 736 H.D.); both are located within the city of Cairo. Based on previous studies of the minarets by the authors, a seismic retrofit technique is proposed. The technique utilizes shape memory alloy (SMA) wires as dampers for the upper, more flexible, parts of the minarets, in addition to vertical pre-stressing of the lower parts found to be prone to tensile cracking under ground excitation. The effectiveness of the proposed technique is numerically evaluated via nonlinear transient dynamic analyses. The results indicate the effectiveness of the technique in mitigating the seismic hazard, demonstrated by the effective reduction in stresses and in dynamic response.

  19. Landslide maps and seismic noise: Rockmass weakening caused by shallow earthquakes

    NASA Astrophysics Data System (ADS)

    Uchida, Tara; Marc, Odin; Sens-Schönfelder, Christoph; Sawazaki, Kaoru; Hobiger, Manuel; Hovius, Niels

    2015-04-01

    Some studies have suggested that the shaking and deformation associated with an earthquake result in a temporarily increased hillslope erodibility. However, very few data have been available to clarify this effect. We present integrated geomorphic data constraining an elevated landslide rate following four continental shallow earthquakes: the Mw 6.9 Finisterre (1993), the Mw 7.6 ChiChi (1999), the Mw 6.6 Niigata (2004) and the Mw 6.8 Iwate-Miyagi (2008) earthquakes. We constrained the magnitude, the recovery time and, to some extent, the mechanism at the source of this higher landslide risk. We provide evidence excluding aftershocks or rain forcing intensity as possible mechanisms, leaving subsurface weakening as the most likely. The landslide data suggest that this ground strength weakening is not limited to the soil cover but also affects the shallow bedrock. Additionally, we used ambient noise autocorrelation techniques to monitor shallow subsurface seismic velocity within the epicentral areas of three of these earthquakes. For most stations we observe a velocity drop followed by a recovery process of several years, in fair agreement with the recovery time estimated from the landslide observations. Thus a common process could alter the strength of the first 10 m of soil/rock and simultaneously drive the landslide rate increase and the seismic velocity drop. The ability to firmly demonstrate this link requires additional constraints on the seismic signal interpretation but would provide a very useful tool for post-earthquake risk management.
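    One common way to quantify such velocity drops and recoveries from noise autocorrelations is the stretching technique; the sketch below is a generic implementation under assumed parameters (stretching range and grid), not the processing actually used in this study.

```python
# Stretching method: compare the current autocorrelation with a stretched
# reference; the best-fitting stretching factor approximates dv/v.
import numpy as np

def dvv_stretching(reference, current, fs, eps_max=0.03, n_eps=201):
    t = np.arange(len(reference)) / fs
    best_eps, best_cc = 0.0, -np.inf
    for eps in np.linspace(-eps_max, eps_max, n_eps):
        stretched = np.interp(t * (1.0 + eps), t, reference)
        cc = np.corrcoef(stretched, current)[0, 1]
        if cc > best_cc:
            best_eps, best_cc = eps, cc
    return -best_eps, best_cc                  # dv/v estimate and its correlation
```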

  20. Introducing Seismic Tomography with Computational Modeling

    NASA Astrophysics Data System (ADS)

    Neves, R.; Neves, M. L.; Teodoro, V.

    2011-12-01

    Learning seismic tomography principles and techniques involves advanced physical and computational knowledge. In-depth learning of such computational skills is a difficult cognitive process that requires a strong background in physics, mathematics and computer programming. The corresponding learning environments and pedagogic methodologies should then involve sets of computational modelling activities with computer software systems which allow students to improve their mathematical or programming knowledge and simultaneously focus on the learning of seismic wave propagation and inverse theory. To reduce the level of cognitive opacity associated with mathematical or programming knowledge, several computer modelling systems have already been developed (Neves & Teodoro, 2010). Among such systems, Modellus is particularly well suited to achieve this goal because it is a domain-general environment for explorative and expressive modelling with the following main advantages: 1) an easy and intuitive creation of mathematical models using just standard mathematical notation; 2) the simultaneous exploration of images, tables, graphs and object animations; 3) the attribution of mathematical properties expressed in the models to animated objects; and finally 4) the computation and display of mathematical quantities obtained from the analysis of images and graphs. Here we describe virtual simulations and educational exercises which give students an easy grasp of the fundamentals of seismic tomography. The simulations make the lecture more interactive and allow students to overcome their lack of advanced mathematical or programming knowledge and focus on the learning of seismological concepts and processes, taking advantage of basic scientific computation methods and tools.

  1. Imaging near surface mineral targets with ambient seismic noise

    NASA Astrophysics Data System (ADS)

    Dales, P.; Audet, P.; Olivier, G.

    2017-12-01

    To keep up with global metal and mineral demand, new ore deposits have to be discovered on a regular basis. This task is becoming increasingly difficult, since easily accessible deposits have been exhausted to a large degree. The typical procedure for mineral exploration begins with geophysical surveys followed by a drilling program to investigate potential targets. Since the retrieved drill core samples are one-dimensional observations, the many holes needed to interpolate and interpret potential deposits can lead to very high costs. To reduce the amount of drilling, active seismic imaging is sometimes used as an intermediary; however, the active sources (e.g. large vibrating trucks or explosive shots) are expensive and unsuitable for operation in remote or environmentally sensitive areas. In recent years, passive seismic imaging using ambient noise has emerged as a novel, low-cost and low-impact approach for exploring the subsurface. This technique dispenses with active seismic sources and instead uses ambient seismic noise such as ocean waves, traffic or minor earthquakes. Unfortunately, at this point passive surveys are not capable of reaching the resolution required to image the vast majority of the ore bodies that are being explored. In this presentation, we will show the results of an experiment where ambient seismic noise recorded on 60 seismic stations was used to image a near-mine target. The target consists of a known ore body that was partially exhausted by mining efforts roughly 100 years ago. The experiment examined whether ambient seismic noise interferometry can be used to image the intact and exhausted ore deposit. A drilling campaign was also conducted near the target, which offers the opportunity to compare the two methods. If the accuracy and resolution of passive seismic imaging can be improved to that of active surveys (and beyond), this method could become an inexpensive intermediary step in the exploration process and result in a large decrease in the amount of drilling required to investigate and identify high-grade ore deposits.
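    The core operation in ambient noise interferometry is the cross-correlation of long noise records between station pairs; a minimal sketch for one pair, with illustrative window length, whitening and lag parameters, is given below.

```python
# Frequency-domain cross-correlation of two noise records with simple spectral
# whitening, stacked over successive windows (illustrative parameters).
import numpy as np

def noise_cross_correlation(tr_a, tr_b, fs, win_s=600.0, max_lag_s=10.0):
    n = int(win_s * fs)
    nlag = int(max_lag_s * fs)
    stack = np.zeros(2 * nlag + 1)
    nwin = min(len(tr_a), len(tr_b)) // n
    for k in range(nwin):
        a = np.asarray(tr_a[k * n:(k + 1) * n], dtype=float)
        b = np.asarray(tr_b[k * n:(k + 1) * n], dtype=float)
        a -= a.mean()
        b -= b.mean()
        fa, fb = np.fft.rfft(a), np.fft.rfft(b)
        fa /= np.abs(fa) + 1e-10               # spectral whitening
        fb /= np.abs(fb) + 1e-10
        cc = np.fft.irfft(fa * np.conj(fb))
        stack += np.concatenate((cc[-nlag:], cc[:nlag + 1]))   # centre zero lag
    return stack / max(nwin, 1)
```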

  2. Submarine seismic monitoring of El Hierro volcanic eruption with a 3C-geophone string: applying new acquisition and data processing techniques to volcano monitoring

    NASA Astrophysics Data System (ADS)

    Jurado, Maria Jose; Ripepe, Maurizio; Lopez, Carmen; Blanco, Maria Jose; Crespo, Jose

    2015-04-01

    A submarine volcanic eruption took place near the southernmost emerged land of El Hierro Island (Canary Islands, Spain), from October 2011 to February 2012. The Instituto Geografico Nacional (IGN) seismic station network evidenced seismic unrest since July 2011 and was also a reference to follow the evolution of the seismic activity associated with the volcanic eruption. Right after the eruption onset, in October 2011, a geophone string was deployed by the CSIC-IGN to monitor seismic activity. Monitoring with the seismic array continued until May 2012. The array was installed less than 2 km away from the new volcano, next to the La Restinga village shore, in the harbor at 6 to 12 m water depth. Our purpose was to record seismic activity related to the volcanic activity, continuously and with special interest in high frequency events. The seismic array consisted of a cable string of eight high-frequency, three-component, 250 Hz geophones with a separation of 6 m between them. Each geophone consists of a 3-component module based on three orthogonal independent sensors that measure ground velocity. Some of the geophones were placed directly on the seabed and some were buried, owing to different factors such as the irregular characteristics of the seafloor. The data were recorded at the surface with a seismometer and stored on a laptop computer. We show that acoustic data collected underwater correlate strongly with the seismic data recorded on land. Finally we compare our data analysis results with the observed sea surface activity (ash and lava emission and degassing). This experience points to new and innovative techniques for monitoring submarine volcanic activity. Reference: Instituto Geográfico Nacional (IGN), "Serie El Hierro." Internet: http://www.ign.es/ign/resources/volcanologia/HIERRO.html [May 17, 2013].

  3. Ultrasonic laboratory measurements of the seismic velocity changes due to CO2 injection

    NASA Astrophysics Data System (ADS)

    Park, K. G.; Choi, H.; Park, Y. C.; Hwang, S.

    2009-04-01

    Monitoring the behavior and movement of carbon dioxide (CO2) in the subsurface is quite important in the sequestration of CO2 in geological formations, because such information provides a basis for demonstrating the safety of CO2 sequestration. Several recent applications in commercial and pilot-scale projects and research show that 4D surface or borehole seismic methods are among the most promising techniques for this purpose. However, information interpreted from seismic velocity changes can be quite subjective and qualitative without a petrophysical characterization of the effect of CO2 saturation on the seismic response, since seismic wave velocity depends on various factors and parameters such as mineralogical composition, hydrogeological factors and in-situ conditions. In this respect, we have developed an ultrasonic laboratory measurement system and have carried out measurements on a porous sandstone sample to characterize the effects of CO2 injection on seismic velocity and amplitude. Measurements are made with ultrasonic piezoelectric transducers mounted on both ends of a cylindrical core sample under various pressure, temperature, and saturation conditions. According to our fundamental experiments, injected CO2 causes a decrease in seismic velocity and amplitude. We identified that the velocity decreases by about 6% or more until the sample is fully saturated by CO2, but the attenuation of seismic amplitude is more drastic than the velocity decrease. We also identified that Vs/Vp or the elastic modulus is more sensitive to CO2 saturation. We note that this means changes in seismic amplitude and elastic modulus can be alternative target anomalies for seismic techniques in CO2 sequestration monitoring. Thus, we expect that further research can establish more quantitative petrophysical relationships between the changes of seismic attributes and CO2 concentration, which can provide a basic relation for the quantitative assessment of CO2 sequestration.

  4. Toward Expanding Tremor Observations in the Northern San Andreas Fault System in the 1990s

    NASA Astrophysics Data System (ADS)

    Damiao, L. G.; Dreger, D. S.; Nadeau, R. M.; Taira, T.; Guilhem, A.; Luna, B.; Zhang, H.

    2015-12-01

    The connection between tremor activity and active fault processes continues to expand our understanding of deep fault zone properties and deformation, the tectonic process, and the relationship of tremor to the occurrence of larger earthquakes. Compared to tremors in subduction zones, known tremor signals in California are ~5 to ~10 times smaller in amplitude and duration. These characteristics, in addition to scarce geographic coverage, lack of continuous data (e.g., before mid-2001 at Parkfield), and absence of instrumentation sensitive enough to monitor these events, have stifled tremor detection. The continuous monitoring of these events over a relatively short time period in limited locations may lead to a parochial view of the tremor phenomenon and its relationship to fault, tectonic, and earthquake processes. To help overcome this, we have embarked on a project to expand the geographic and temporal scope of tremor observation along the Northern SAF system using available continuous seismic recordings from a broad array of hundreds of surface seismic stations from multiple seismic networks. Available data for most of these stations also extend back into the mid-1990s. Processing and analysis of tremor signals from this large and low signal-to-noise dataset require a heavily automated, data-science type approach and specialized techniques for identifying and extracting reliable data. We report here on the automated, envelope-based methodology we have developed. Finally, we compare our catalog results with pre-existing tremor catalogs in the Parkfield area.
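    A minimal sketch of an envelope-based tremor detector in the spirit described above (the band, smoothing length and threshold are illustrative assumptions, not the parameters of this project):

```python
# Band-pass, Hilbert envelope, smoothing and a median-based threshold.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def tremor_envelope_detections(trace, fs, band=(2.0, 8.0), smooth_s=10.0, k=3.0):
    b, a = butter(4, [f / (0.5 * fs) for f in band], btype="band")
    x = filtfilt(b, a, np.asarray(trace, dtype=float))
    env = np.abs(hilbert(x))                   # instantaneous amplitude
    win = max(int(smooth_s * fs), 1)
    smooth = np.convolve(env, np.ones(win) / win, mode="same")
    threshold = k * np.median(smooth)
    return np.where(smooth > threshold)[0] / fs   # times (s) above threshold
```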

  5. Slow Unlocking Processes Preceding the 2015 Mw 8.4 Illapel, Chile, Earthquake

    NASA Astrophysics Data System (ADS)

    Huang, Hui; Meng, Lingsen

    2018-05-01

    On 16 September 2015, the Mw 8.4 Illapel earthquake occurred in central Chile with no intense foreshock sequences documented in the regional earthquake catalog. Here we employ the matched-filter technique based on an enhanced template data set of previously catalogued events. We perform a continuous search over a 4-year period before the Illapel mainshock to recover uncatalogued small events and repeating earthquakes. Repeating earthquakes are found both to the north and south of the mainshock rupture zone. To the south of the rupture zone, the seismicity and repeater-inferred aseismic slip progressively accelerate around the Illapel epicenter starting from 140 days before the mainshock. This may indicate an unlocking process involving the interplay of seismic and aseismic slip. The acceleration culminates in a M 5.3 event of low-angle thrust mechanism, which occurred 36 days before the Mw 8.4 mainshock. It is then followed by a relative quiescence in seismicity until the mainshock occurred. This quiescence might correspond to an intermediate period of stable slip before rupture initiation. In addition, to the north of the mainshock rupture area, the last aseismic-slip episode occurs within 175 to 95 days before the mainshock and accumulates the largest amount of slip in the observation period. The simultaneous occurrence of aseismic-slip transients over a large area is consistent with large-scale slow unlocking processes preceding the Illapel mainshock.
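    The matched-filter step can be sketched as follows; the MAD threshold and normalization are standard choices and are shown here as assumptions rather than the exact implementation of the study.

```python
# Normalized cross-correlation of a template with continuous data; detections
# where the correlation exceeds a multiple of its median absolute deviation.
import numpy as np

def matched_filter_detect(continuous, template, mad_factor=9.0):
    x = np.asarray(continuous, dtype=float)
    t = np.asarray(template, dtype=float)
    nt = len(t)
    tpl = (t - t.mean()) / (t.std() * nt)      # zero-mean, scaled template
    cc = np.correlate(x, tpl, mode="valid")
    # moving variance of the continuous data for the normalization
    csum = np.cumsum(np.insert(x, 0, 0.0))
    csum2 = np.cumsum(np.insert(x ** 2, 0, 0.0))
    win_mean = (csum[nt:] - csum[:-nt]) / nt
    win_var = (csum2[nt:] - csum2[:-nt]) / nt - win_mean ** 2
    cc /= np.sqrt(np.maximum(win_var, 1e-20))
    mad = np.median(np.abs(cc - np.median(cc)))
    return np.where(cc > mad_factor * mad)[0]  # sample indices of detections
```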

  6. Seismic Experiment Ross Ice Shelf 1990/91: Characteristics of the seismic reflection data

    USGS Publications Warehouse

    1993-01-01

    The Transantarctic Mountains, with a length of 3000-3500 km and elevations of up to 4500 m, are one of the major Cenozoic mountain ranges in the world and are by far the most striking example of rift-shoulder mountains. Over the 1990-1991 austral summer Seismic Experiment Ross Ice Shelf (SERIS) was carried out across the Transantarctic Mountain front, between latitudes 82 degrees to 83 degrees S, in order to investigate the transition zone between the rifted area of the Ross Embayment and the uplifted Transantarctic Mountains. This experiment involved a 140 km long seismic reflection profile together with a 96 km long coincident wide-angle reflection/refraction profile. Gravity and relative elevation (using barometric pressure) were also measured along the profile. The primary purpose was to examine the boundary between the rift system and the uplifted rift margin (represented by the Transantarctic Mountains) using modern multi-channel crustal reflection/refraction techniques. The results provide insight into crustal structure across the plate boundary. SERIS also represented one of the first large-scale and modern multi-channel seismic experiments in the remote interior of Antarctica. As such, the project was designed to test different seismic acquisition techniques which will be involved in future seismic exploration of the continent. This report describes the results from the analysis of the acquisition tests as well as detailing some of the characteristics of the reflection seismic data. (auths.)

  7. Illuminating Asset Value through New Seismic Technology

    NASA Astrophysics Data System (ADS)

    Brandsberg-Dahl, S.

    2007-05-01

    The ability to reduce risk and uncertainty across the full life cycle of an asset is directly correlated to creating an accurate subsurface image that enhances our understanding of the geology. This presentation focuses on this objective in areas of complex overburden in deepwater. Marine 3D seismic surveys have been acquired in essentially the same way for the past decade. This configuration of towed streamer acquisition, where the boat acquires data in one azimuth, has been very effective in imaging areas in fairly benign geologic settings. As the industry has moved into more complicated geologic settings these surveys no longer meet the imaging objectives for risk reduction in exploration through production. In shallow water, we have seen increasing use of ocean bottom cables to meet this challenge. For deepwater, new breakthroughs in technology were required. This will be highlighted through examples of imaging below large salt bodies in the deep water Gulf of Mexico. GoM - Mad Dog: The Mad Dog field is located approximately 140 miles south of the Louisiana coastline in the southern Green Canyon area in water depths between 4100 feet and 6000 feet. The complex salt canopy overlying a large portion of the field results in generally poor seismic data quality. Advanced processing techniques improved the image, but gaps still remained even after several years of effort. We concluded that wide azimuth acquisition was required to illuminate the field in a new way. Results from the Wide Azimuth Towed Streamer (WATS) survey deployed at Mad Dog demonstrated the anticipated improvement in the subsalt image. GoM - Atlantis Field: An alternative approach to wide azimuth acquisition, ocean bottom seismic (OBS) node technology, was developed and tested. In 2001 deepwater practical experience was limited to a few nodes owned by academic institutions and there were no commercial solutions either available or in development. BP embarked on a program of sea trials designed to both evaluate technologies and subsequently encourage vendor activity to develop and deploy a commercial system. The 3D seismic method exploded into general usage in the 1990s. Our industry delivered 3D cheaper and faster, improving quality through improved acquisition specifications and new processing technology. The need to mitigate business risks in highly material subsalt plays led BP to explore the technical limits of the seismic method, testing novel acquisition techniques to improve illumination and signal to noise ratio. These were successful and are applicable to analogue seismic quality problems globally, providing breakthroughs in illuminating previously hidden geology and hydrocarbon reservoirs. A focused business challenge, smart risk taking, investment in people and computing capability, partnerships, and rapid implementation are key themes that will be touched on throughout the talk.

  8. Seismic and potential field studies over the East Midlands

    NASA Astrophysics Data System (ADS)

    Kirk, Wayne John

    A seismic refraction profile was undertaken to investigate the source of an aeromagnetic anomaly located above the Widmerpool Gulf, East Midlands. Ten shots were fired into 51 stations at c. 1.5 km spacing along a 70 km profile during 41 days of recording. The refraction data were processed using standard techniques to improve the data quality. A new filtering technique, known as Correlated Adaptive Noise Cancellation, was tested on synthetic data and successfully applied to controlled source and quarry blast data. Study of strong motion data reveals that the previous method of site calibration is invalid. A new calibration technique, known as the Scaled Amplitude method, is presented to provide safer charge size estimation. Raytrace modelling of the refraction data and two-dimensional gravity interpretation confirm the presence of the Widmerpool Gulf, but no support is found for the postulated intrusion. Two-dimensional magnetic interpretation revealed that the aeromagnetic anomaly could be modelled with a Carboniferous igneous source. A Lower Palaeozoic refractor with a velocity of 6.0 km/s is identified at a maximum depth of c. 2.85 km beneath the Widmerpool Gulf. Carboniferous and post-Carboniferous sediments within the gulf have velocities between 2.6-5.5 km/s with a strong vertical gradient. At the gulf margins, a refractor with a constant velocity of 5.2 km/s is identified as Dinantian limestone. A low velocity layer of proposed unaltered Lower Palaeozoics is identified beneath the limestone at the eastern edge of the Derbyshire Dome. The existence and areal extent of this layer are also determined from seismic reflection data. Image analysis of potential field data presents a model identifying three structural provinces: the Midlands Microcraton, the Welsh and English Caledonides and a central region of complex linears. This model is used to explain the distribution of basement rocks determined from seismic and gravity profiles.

  9. Variscan deformation along the Teisseyre-Tornquist Zone in SE Poland: Thick-skinned structural inheritance or thin-skinned thrusting?

    NASA Astrophysics Data System (ADS)

    Krzywiec, P.; Gągała, Ł.; Mazur, S.; Słonka, Ł.; Kufrasa, M.; Malinowski, M.; Pietsch, K.; Golonka, J.

    2017-10-01

    Recently acquired seismic reflection data provide better insight into the structural style of the extensive sedimentary series overlying the SW slope of the East European Craton (EEC) in Poland. The two main seismic datasets - the POLCRUST-01 profile and the PolandSPAN survey - yielded contrasting thick- and thin-skinned structural models for the same structural units in SE Poland. We reattempt an interpretation of the POLCRUST-01 profile using techniques of cross-section balancing and restoration aided by 2D forward seismic modelling. The outcome is a thin-skinned structural model. This solution relies on a continuous top of the EEC crystalline basement well represented in the seismic data, as well as on fragmentary, yet conclusive, seismic geometries in shallow depth intervals proving the Ediacaran-Palaeozoic series to be thrust and folded. A Variscan (late Carboniferous) compressional regime is consequently invoked to explain the thin-skinned structuring of the pre-Permian sedimentary pile and > 20 km of calculated shortening. We demonstrate the ambiguous nature of the top-basement irregularities previously used as indicators of basement-rooted vertical faulting. The tilt and abrupt increase of the top-basement taper under the thin-skinned belt are attributed to pre-Ordovician tectonic processes operating along the SW margin of the EEC. Post-rift subsidence and/or flexural loading giving rise to a broken foreland plate are invoked.

  10. Waveform classification of volcanic low-frequency earthquake swarms and its implication at Soufrière Hills Volcano, Montserrat

    NASA Astrophysics Data System (ADS)

    Green, David N.; Neuberg, Jürgen

    2006-05-01

    Low-frequency volcanic earthquakes are indicators of magma transport and activity within shallow conduit systems. At a number of volcanoes, these events exhibit a high degree of waveform similarity providing a criterion for classification. Using cross-correlation techniques to quantify the degree of similarity, we develop a method to sort events into families containing comparable waveforms. Events within a family have been triggered within one small source volume from which the seismic wave has then travelled along an identical path to the receiver. This method was applied to a series of 16 low-frequency earthquake swarms, well correlated with cyclic deformation recorded by tiltmeters, at Soufrière Hills Volcano, Montserrat, in June 1997. Nine waveform groups were identified containing more than 45 events each. The families are repeated across swarms with only small changes in waveform, indicating that the seismic source location is stable with time. The low-frequency seismic swarms begin prior to the point at which inflation starts to decelerate, suggesting that the seismicity indicates or even initiates a depressurisation process. A major dome collapse occurred within the time window considered, removing the top 100 m of the dome. This event caused activity within some families to pause for several cycles before reappearing. This shows that the collapse did not permanently disrupt the source mechanism or the path of the seismic waves.
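    A minimal sketch of this kind of similarity-based classification (the clustering method and the 0.8 cut-off are illustrative assumptions, not the exact procedure of the study): compute maximum normalized cross-correlations between all event pairs and cut a hierarchical clustering tree at a similarity threshold.

```python
# Group aligned, filtered event waveforms into families by cross-correlation
# similarity and average-linkage hierarchical clustering.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def waveform_families(waveforms, cc_threshold=0.8):
    """waveforms: (n_events, n_samples) array of pre-processed traces."""
    w = np.asarray(waveforms, dtype=float)
    n = len(w)
    cc = np.ones((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            a = w[i] - w[i].mean()
            b = w[j] - w[j].mean()
            xc = np.correlate(a, b, mode="full")
            cc[i, j] = cc[j, i] = xc.max() / (np.linalg.norm(a) * np.linalg.norm(b))
    dist = squareform(1.0 - cc, checks=False)  # condensed distance matrix
    tree = linkage(dist, method="average")
    return fcluster(tree, t=1.0 - cc_threshold, criterion="distance")
```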

  11. A review of seismoelectric data processing techniques

    NASA Astrophysics Data System (ADS)

    Warden, S. D.; Garambois, S.; Jouniaux, L.; Sailhac, P.

    2011-12-01

    Seismoelectric tomography is expected to combine the sensitivity of electromagnetic methods to hydrological properties such as water content and permeability with the high resolution of conventional seismic surveys. This innovative exploration technique seems very promising as it could characterize the fluids contained in reservoir rocks and detect thin layers invisible to other methods. However, it still needs to be improved before it can be successfully applied to real case problems. One of the main issues that needs to be addressed is the development of wave separation techniques enabling recovery of the signal of interest. Seismic waves passing through a fluid-saturated porous layered medium convert into at least two types of electromagnetic waves: the coseismic field (type I), accompanying seismic body and surface waves, and the independently propagating interface response (type II). The latter occurs when compressional waves encounter a contrast in electrical, chemical or mechanical properties in the subsurface, thus acting as a secondary source that can generally be approximated by a sum of electrical dipoles oscillating at the first Fresnel zone. Although properties of the medium in the vicinity of the receivers can be extracted from the coseismic waves, only the interface response provides subsurface information at depth, which makes it critical to separate both types of energy. This is a delicate problem, as the interface response may be several orders of magnitude weaker than the coseismic field. However, as reviewed by Haines et al. (2007), several properties of the interface response can be used to identify it: its dipolar amplitude pattern, its opposite polarity on opposite sides of the shot point and the electromagnetic velocity at which it travels, several orders of magnitude greater than seismic velocities. This latter attribute can be exploited to implement filtering techniques in the frequency-wavenumber (f-k) and Radon (tau-p) domains, which we have done on synthetic seismoelectric data created using SKB, a modeling program written by Stéphane Garambois from LGIT (Laboratoire de Géophysique Interne et Tectonophysique, Grenoble, France). We will assess the efficiency of these methods, discuss how they affect signal amplitudes and how they can be improved by sparsity-promoting approaches.
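    The velocity-based separation mentioned above can be sketched as a simple f-k mask that keeps only energy with very high apparent velocity (the cut-off value below is an illustrative assumption):

```python
# f-k filtering of a record section: keep spectral components whose apparent
# velocity exceeds v_min, i.e. the near-flat interface response.
import numpy as np

def fk_keep_fast(section, dt, dx, v_min=5.0e4):
    """section: (n_traces, n_samples) array; dt in s, dx in m, v_min in m/s."""
    data = np.asarray(section, dtype=float)
    nx, nt = data.shape
    spec = np.fft.fft2(data)
    k = np.fft.fftfreq(nx, d=dx)               # wavenumber axis (1/m)
    f = np.fft.fftfreq(nt, d=dt)               # frequency axis (Hz)
    kk, ff = np.meshgrid(k, f, indexing="ij")
    v_app = np.abs(ff) / np.maximum(np.abs(kk), 1e-12)   # apparent velocity
    return np.real(np.fft.ifft2(spec * (v_app >= v_min)))
```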

  12. Modelling Active Faults in Probabilistic Seismic Hazard Analysis (PSHA) with OpenQuake: Definition, Design and Experience

    NASA Astrophysics Data System (ADS)

    Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco

    2016-04-01

    The Global Earthquake Model (GEM) has, since its inception in 2009, made many contributions to the practice of seismic hazard modeling in different regions of the globe. The OpenQuake-engine (hereafter referred to simply as OpenQuake), GEM's open-source software for calculation of earthquake hazard and risk, has found application in many countries, spanning a diversity of tectonic environments. GEM itself has produced a database of national and regional seismic hazard models, harmonizing into OpenQuake's own definition the varied seismogenic sources found therein. The characterization of active faults in probabilistic seismic hazard analysis (PSHA) is at the centre of this process, motivating many of the developments in OpenQuake and presenting hazard modellers with the challenge of reconciling seismological, geological and geodetic information for the different regions of the world. Faced with these challenges, and from the experience gained in the process of harmonizing existing models of seismic hazard, four critical issues are addressed. The challenge GEM has faced in the development of software is how to define a representation of an active fault (both in terms of geometry and earthquake behaviour) that is sufficiently flexible to adapt to different tectonic conditions and levels of data completeness. By exploring the different fault typologies supported by OpenQuake we illustrate how seismic hazard calculations can, and do, take into account complexities such as geometrical irregularity of faults in the prediction of ground motion, highlighting some of the potential pitfalls and inconsistencies that can arise. This exploration leads to the second main challenge in active fault modeling, what elements of the fault source model impact most upon the hazard at a site, and when does this matter? Through a series of sensitivity studies we show how different configurations of fault geometry, and the corresponding characterisation of near-fault phenomena (including hanging wall and directivity effects) within modern ground motion prediction equations, can have an influence on the seismic hazard at a site. Yet we also illustrate the conditions under which these effects may be partially tempered when considering the full uncertainty in rupture behaviour within the fault system. The third challenge is the development of efficient means for representing both aleatory and epistemic uncertainties from active fault models in PSHA. In implementing state-of-the-art seismic hazard models into OpenQuake, such as those recently undertaken in California and Japan, new modeling techniques are needed that redefine how we treat interdependence of ruptures within the model (such as mutual exclusivity), and the propagation of uncertainties emerging from geology. Finally, we illustrate how OpenQuake, and GEM's additional toolkits for model preparation, can be applied to address long-standing issues in active fault modeling in PSHA. These include constraining the seismogenic coupling of a fault and the partitioning of seismic moment between the active fault surfaces and the surrounding seismogenic crust. We illustrate some of the possible roles that geodesy can play in the process, but highlight where this may introduce new uncertainties and potential biases into the seismic hazard process, and how these can be addressed.

  13. Seismic properties of fluid bearing formations in magmatic geothermal systems: can we directly detect geothermal activity with seismic methods?

    NASA Astrophysics Data System (ADS)

    Grab, Melchior; Scott, Samuel; Quintal, Beatriz; Caspari, Eva; Maurer, Hansruedi; Greenhalgh, Stewart

    2016-04-01

    Seismic methods are amongst the most common techniques to explore the earth's subsurface. Seismic properties such as velocities, impedance contrasts and attenuation enable the characterization of the rocks in a geothermal system. The most important goal of geothermal exploration, however, is to describe the enthalpy state of the pore fluids, which act as the main transport medium for the geothermal heat, and to detect permeable structures such as fracture networks, which control the movement of these pore fluids in the subsurface. Since the quantities measured with seismic methods are only indirectly related to the fluid state and the rock permeability, the interpretation of seismic datasets is difficult and usually delivers ambiguous results. To help overcome this problem, we use a numerical modeling tool that quantifies the seismic properties of fractured rock formations that are typically found in magmatic geothermal systems. We incorporate the physics of the pore fluids, ranging from the liquid to the boiling and ultimately vapor state. Furthermore, we consider the hydromechanics of permeable structures at different scales, from small cooling joints to large caldera faults as are known to be present in volcanic systems. Our modeling techniques simulate oscillatory compressibility and shear tests and yield the P- and S-wave velocities and attenuation factors of fluid-saturated fractured rock volumes. To apply this modeling technique to realistic scenarios, numerous input parameters need to be identified. The properties of the rock matrix and individual fractures were derived from extensive literature research including a large number of laboratory-based studies. The geometries of fracture networks were provided by structural geologists from their published studies of outcrops. Finally, the physical properties of the pore fluid, ranging from those at ambient pressures and temperatures up to supercritical conditions, were taken from the fluid physics literature. The results of this study allow us to describe the seismic properties as a function of hydrothermal and geological features. We use them in a forward seismic modeling study to examine how the seismic response changes with temporally and/or spatially varying fluid properties.

  14. A new method to estimate location and slip of simulated rock failure events

    NASA Astrophysics Data System (ADS)

    Heinze, Thomas; Galvan, Boris; Miller, Stephen Andrew

    2015-05-01

    At the laboratory scale, identifying and locating acoustic emissions (AEs) is a common method for short-term prediction of failure in geomaterials. Above-average AE typically precedes the failure process and is easily measured. At larger scales, an increase in micro-seismic activity sometimes precedes large earthquakes (e.g. Tohoku, L'Aquila, oceanic transforms), and can be used to assess seismic risk. The goal of this work is to develop a methodology and numerical algorithms for extracting a measurable quantity analogous to AE arising from the solution of equations governing rock deformation. Since there is no physical property to quantify AE derivable from the governing equations, an appropriate rock-mechanical analog needs to be found. In this work, we identify a general behavior of the AE generation process preceding rock failure. This behavior includes arbitrary localization of low magnitude events during the pre-failure stage, followed by an increase in number and amplitude, and finally localization around the incipient failure plane during macroscopic failure. We propose the deviatoric strain rate as the numerical analog that mimics this behavior, and develop two different algorithms designed to detect rapid increases in deviatoric strain using moving averages. The numerical model solves a fully poro-elasto-plastic continuum model and is coupled to a two-phase flow model. We test our model by comparing simulation results with experimental data from drained compression and fluid injection experiments. We find in both cases that the occurrence and amplitude of our AE analog mimic the observed general behavior of the AE generation process. Our technique can be extended to modeling at the field scale, possibly providing a mechanistic basis for seismic hazard assessment from seismicity that occasionally precedes large earthquakes.
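    One of the moving-average formulations described above can be sketched as a short-over-long average ratio applied to the modelled deviatoric strain rate; the window lengths and trigger level below are illustrative assumptions.

```python
# STA/LTA-style ratio on the deviatoric strain rate of a model cell; samples
# where the ratio exceeds a trigger level are flagged as AE-analog events.
import numpy as np

def strain_rate_events(deviatoric_strain_rate, short_win=5, long_win=50, trigger=3.0):
    x = np.abs(np.asarray(deviatoric_strain_rate, dtype=float))
    sta = np.convolve(x, np.ones(short_win) / short_win, mode="same")
    lta = np.convolve(x, np.ones(long_win) / long_win, mode="same") + 1e-20
    return np.where(sta / lta > trigger)[0]    # time-step indices of flagged events
```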

  15. A Comparative Case Study of Reflection Seismic Imaging Method

    NASA Astrophysics Data System (ADS)

    Alamooti, M.; Aydin, A.

    2017-12-01

    Seismic imaging is the most common means of gathering information about subsurface structural features. The accuracy of seismic images may be highly variable depending on the complexity of the subsurface and on how the seismic data are processed. One of the crucial steps in this process, especially in layered sequences with complicated structure, is the time and/or depth migration of the seismic data. The primary purpose of the migration is to increase the spatial resolution of seismic images by repositioning the recorded seismic signal back to its original point of reflection in time/space, which enhances information about complex structure. In this study, our objective is to process a seismic data set (courtesy of the University of South Carolina) to generate an image on which the Magruder fault near Allendale, SC, can be clearly distinguished and its attitude can be accurately depicted. The data were gathered by the common mid-point method with 60 geophones equally spaced along an approximately 550 m long traverse over nearly flat ground. The results obtained from the application of different migration algorithms (including finite-difference and Kirchhoff) are compared in the time and depth domains to investigate the efficiency of each algorithm in reducing the processing time and improving the accuracy of seismic images in reflecting the correct position of the Magruder fault.
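    For reference, a constant-velocity, zero-offset Kirchhoff time migration can be sketched as a diffraction summation; the velocity value is an illustrative assumption, and amplitude weighting and anti-aliasing, essential in practice, are omitted.

```python
# Diffraction-summation (Kirchhoff) time migration of a zero-offset section
# with a single constant velocity (illustrative only).
import numpy as np

def kirchhoff_time_migration(section, dt, dx, velocity=2000.0):
    """section: (n_traces, n_samples) zero-offset data; returns migrated image."""
    data = np.asarray(section, dtype=float)
    nx, nt = data.shape
    image = np.zeros_like(data)
    x = np.arange(nx) * dx
    t0 = np.arange(nt) * dt
    for ix0 in range(nx):                      # image trace position
        offset = x - x[ix0]
        for it0 in range(nt):                  # image (vertical) time
            # two-way diffraction traveltime to every surface trace
            t = np.sqrt(t0[it0] ** 2 + (2.0 * offset / velocity) ** 2)
            samples = np.round(t / dt).astype(int)
            valid = samples < nt
            image[ix0, it0] = data[np.where(valid)[0], samples[valid]].sum()
    return image
```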

  16. 100 years of seismic research on the Moho

    NASA Astrophysics Data System (ADS)

    Prodehl, Claus; Kennett, Brian; Artemieva, Irina M.; Thybo, Hans

    2013-12-01

    The detection of a seismic boundary, the “Moho”, between the outermost shell of the Earth, the Earth's crust, and the Earth's mantle by A. Mohorovičić was the consequence of increased insight into the propagation of seismic waves caused by earthquakes. This short history of seismic research on the Moho is primarily based on the comprehensive overview of the worldwide history of seismological studies of the Earth's crust using controlled sources from 1850 to 2005, by Prodehl and Mooney (2012). Though the art of applying explosions, so-called “artificial events”, as energy sources for studies of the uppermost crustal layers began in the early 1900s, its effective use for studying the entire crust only began at the end of World War II. From 1945 onwards, controlled-source seismology has been the major approach to study details of the crust and underlying crust-mantle boundary, the Moho. The subsequent description of history of controlled-source crustal seismology and its seminal results is subdivided into separate chapters for each decade, highlighting the major advances achieved during that decade in terms of data acquisition, processing technology, and interpretation methods. Since the late 1980s, passive seismology using distant earthquakes has played an increasingly important role in studies of crustal structure. The receiver function technique exploiting conversions between P and SV waves at discontinuities in seismic wavespeed below a seismic station has been extensively applied to the increasing numbers of permanent and portable broad-band seismic stations across the globe. Receiver function studies supplement controlled source work with improved geographic coverage and now make a significant contribution to knowledge of the nature of the crust and the depth to Moho.

  17. The 2017 Maple Creek Seismic Swarm in Yellowstone National Park

    NASA Astrophysics Data System (ADS)

    Pang, G.; Hale, J. M.; Farrell, J.; Burlacu, R.; Koper, K. D.; Smith, R. B.

    2017-12-01

    The University of Utah Seismograph Stations (UUSS) performs near-real-time monitoring of seismicity in the region around Yellowstone National Park in partnership with the United States Geological Survey and the National Park Service. UUSS operates and maintains 29 seismic stations with network code WY (short-period, strong-motion, and broadband) and records data from five other seismic networks—IW, MB, PB, TA, and US—to enhance the location capabilities in the Yellowstone region. A seismic catalog is produced using a conventional STA/LTA detector and single-event location techniques (Hypoinverse). On June 12, 2017, a seismic swarm began in Yellowstone National Park about 5 km east of Hebgen Lake. The swarm is adjacent to the source region of the 1959 MW 7.3 Hebgen Lake earthquake, in an area corresponding to positive Coulomb stress change from that event. As of Aug. 1, 2017, the swarm consists of 1481 earthquakes with 1 earthquake above magnitude 4, 8 earthquakes in the magnitude 3 range, 115 earthquakes in the magnitude 2 range, 469 earthquakes in the magnitude 1 range, 856 earthquakes in the magnitude 0 range, 22 earthquakes with negative magnitudes, and 10 earthquakes with no magnitude. Earthquake depths are mostly between 3 and 10 km, and earthquake depth increases toward the northwest. Moment tensors for the 2 largest events (3.6 MW and 4.4 MW) show strike-slip faulting with T axes oriented NE-SW, consistent with the regional stress field. We are currently using waveform cross-correlation methods to measure differential travel times that are being used with the GrowClust program to generate high-accuracy relative relocations. Those locations will be used to identify structures in the seismicity and make inferences about the tectonic and magmatic processes causing the swarm.
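    A typical way to obtain such differential times is to cross-correlate the waveforms of an event pair recorded at a common station and refine the peak with a parabolic fit; the sketch below is a generic version of that measurement, not the exact UUSS workflow.

```python
# Differential travel time between two events at one station from the peak of
# their normalized cross-correlation, with subsample (parabolic) refinement.
import numpy as np

def differential_time(wave_a, wave_b, fs):
    a = np.asarray(wave_a, dtype=float) - np.mean(wave_a)
    b = np.asarray(wave_b, dtype=float) - np.mean(wave_b)
    cc = np.correlate(a, b, mode="full") / (np.linalg.norm(a) * np.linalg.norm(b))
    i = int(np.argmax(cc))
    lag = float(i - (len(b) - 1))              # integer-sample lag
    if 0 < i < len(cc) - 1:                    # parabolic subsample refinement
        denom = cc[i - 1] - 2.0 * cc[i] + cc[i + 1]
        if denom != 0.0:
            lag += 0.5 * (cc[i - 1] - cc[i + 1]) / denom
    return lag / fs, cc[i]                     # delay (s) and peak correlation
```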

  18. Transient Electromagnetic Modelling and Imaging of Thin Resistive Structures: Applications for Gas Hydrate Assessment

    NASA Astrophysics Data System (ADS)

    Swidinsky, Andrei

    Gas hydrates are a solid, ice-like mixture of water and low molecular weight hydrocarbons. They are found under the permafrost and to a far greater extent under the ocean, usually at water depths greater than 300m. Hydrates are a potential energy resource, a possible factor in climate change, and a geohazard. For these reasons, it is critical that gas hydrate deposits are quantitatively assessed so that their concentrations, locations and distributions may be established. Due to their ice-like nature, hydrates are electrically insulating. Consequently, a method which remotely detects changes in seafloor electrical conductivity, such as marine controlled source electromagnetics (CSEM), is a useful geophysical tool for marine gas hydrate exploration. Hydrates are geometrically complex structures. Advanced electromagnetic modelling and imaging techniques are crucial for proper survey design and data interpretation. I develop a method to model thin resistive structures in conductive host media which may be useful in building approximate geological models of gas hydrate deposits using arrangements of multiple, bent sheets. I also investigate the possibility of interpreting diffusive electromagnetic data using seismic imaging techniques. To be processed in this way, such data must first be transformed into its non-diffusive, seismic-like counterpart. I examine such a transform from both an analytical and a numerical point of view, focusing on methods to overcome inherent numerical instabilities. This is the first step to applying seismic processing techniques to CSEM data to rapidly and efficiently image resistive gas hydrate structures. The University of Toronto marine electromagnetics group has deployed a permanent marine CSEM array offshore Vancouver Island, in the framework of the NEPTUNE Canada cabled observatory, for the purposes of monitoring gas hydrate deposits. In this thesis I also propose and examine a new CSEM survey technique for gas hydrate which would make use of the stationary seafloor transmitter already on the seafloor, along with a cabled receiver array, towed from a ship. I furthermore develop a modelling algorithm to examine the electromagnetic effects of conductive borehole casings which have been proposed to be placed in the vicinity of this permanent marine CSEM array, and make preliminary recommendations about their locations.

  19. Development of Deep-tow Autonomous Cable Seismic (ACS) for Seafloor Massive Sulfides (SMSs) Exploration.

    NASA Astrophysics Data System (ADS)

    Asakawa, Eiichi; Murakami, Fumitoshi; Tsukahara, Hitoshi; Saito, Shutaro; Lee, Sangkyun; Tara, Kenji; Kato, Masafumi; Jamali Hondori, Ehsan; Sumi, Tomonori; Kadoshima, Kazuyuki; Kose, Masami

    2017-04-01

    Within the EEZ of Japan, numerous surveys exploring ocean floor resources have been conducted. The exploration targets are gas hydrates, mineral resources (manganese, cobalt or rare earth) and especially seafloor massive sulphide (SMS) deposits. These resources exist in shallow subsurface areas in deep waters (>1500 m). For seismic exploration, very high resolution images are required. These cannot be effectively obtained with conventional marine seismic techniques. Therefore we have been developing autonomous seismic survey systems which record the data close to the seafloor to preserve high frequency seismic energy. A very high sampling rate (10 kHz) and highly accurate synchronization between the recording systems and shot time are necessary. We adopted a Cs-based atomic clock considering its power consumption. At first, we developed a Vertical Cable Seismic (VCS) system that uses hydrophone arrays moored vertically from the ocean bottom to record close to the target area. This system has been successfully applied to SMS exploration; specifically, it was fixed over known sites to assess the amount of reserves with the resultant 3D volume. Based on the success of VCS, we modified the VCS system into a more efficient deep-tow seismic survey system. Although there are other examples of deep-tow seismic systems, signal transmission cables present challenges in deep waters. We use our autonomous recording system to avoid these problems. Combining a high frequency piezoelectric source (Sub Bottom Profiler: SBP) that shoots automatically at a constant interval, we achieve high resolution deep-tow seismic without a data transmission/power cable to the vessel. Although the data cannot be monitored in real time, the towing system becomes very simple. We have carried out a survey trial, which showed the system's utility as a high-resolution deep-tow seismic survey system. Furthermore, the frequency ranges of the deep-towed source (SBP) and the surface-towed sparker are 700-2300 Hz and 10-200 Hz, respectively. Therefore we can use these sources simultaneously and distinguish the records of each source in the data processing stage. We have developed new marine seismic survey systems with autonomous recording for the exploration of ocean floor resources. The applications are vertical cable seismic (VCS) and deep-tow seismic (ACS). These enable recording close to the seafloor and give high resolution results with a simple, cost-effective configuration.

  20. Soft computing analysis of the possible correlation between temporal and energy release patterns in seismic activity

    NASA Astrophysics Data System (ADS)

    Konstantaras, Anthony; Katsifarakis, Emmanouil; Artzouxaltzis, Xristos; Makris, John; Vallianatos, Filippos; Varley, Martin

    2010-05-01

    This paper is a preliminary investigation of the possible correlation of temporal and energy release patterns of seismic activity involving the preparation processes of consecutive sizeable seismic events [1,2]. The background idea is that during periods of low-level seismic activity, stress processes in the crust accumulate energy at the seismogenic area whilst larger seismic events act as a decongesting mechanism releasing considerable energy [3,4]. A dynamic algorithm is being developed aiming to identify and cluster pre- and post- seismic events to the main earthquake following on research carried out by Zubkov [5] and Dobrovolsky [6,7]. This clustering technique along with energy release equations dependent on Richter's scale [8,9] allow for an estimate to be drawn regarding the amount of the energy being released by the seismic sequence. The above approach is being implemented as a monitoring tool to investigate the behaviour of the underlying energy management system by introducing this information to various neural [10,11] and soft computing models [1,12,13,14]. The incorporation of intelligent systems aims towards the detection and simulation of the possible relationship between energy release patterns and time-intervals among consecutive sizeable earthquakes [1,15]. Anticipated successful training of the imported intelligent systems may result in a real-time, on-line processing methodology [1,16] capable to dynamically approximate the time-interval between the latest and the next forthcoming sizeable seismic event by monitoring the energy release process in a specific seismogenic area. Indexing terms: pattern recognition, long-term earthquake precursors, neural networks, soft computing, earthquake occurrence intervals References [1] Konstantaras A., Vallianatos F., Varley M.R. and Makris J. P.: ‘Soft computing modelling of seismicity in the southern Hellenic arc', IEEE Geoscience and Remote Sensing Letters, vol. 5 (3), pp. 323-327, 2008 [2] Eneva M. and Ben-Zion Y.: ‘Techniques and parameters to analyze seismicity patterns associated with large earthquakes', Geophysics Res., vol. 102, pp. 17785-17795, 1997a [3] Habermann R. E.: ‘Precursory seismic quiescence: past, present and future', Pure Applied Geophysics, vol. 126, pp. 279-318, 1988 [4] Matthews M. V. and Reasenberg P. A.: ‘Statistical methods for investigating quiescence and other temporal seismicity patterns', Pure Applied Geophysics, vol. 126, pp. 357-372, 1988 [5] Zubkov S. I.: ‘The appearance times of earthquake precursors', Izv. Akad. Nauk SSSR Fiz. Zemli (Solid Earth), No. 5, pp. 87-91, 1987 [6] Dobrovolsky I. P., Zubkov S. I. and Miachkin V. I.: ‘Estimation of the size of earthquake preparation zones', Pageoph, vol. 117, pp. 1025-1044, 1979 [7] Dobrovolsky I. P., Gershenzon N. I. And Gokhberg M. B.: ‘Theory of electrokinetic effects occurring at the final stage in the preparation of a tectonic earthquake', Physics of the Earth and Planetary Interiors, vol. 57, pp. 144-156, 1989 [8] Richter C. F.: ‘Elementary Seismology', W.H.Freeman and Co., San Francisco, 1958 [9] Choy G. L. and Boatwright J. L.: ‘Global patterns of radiated seismic energy and apparent stress', Journal of Geophysical Research, vol. 84 (B5), pp. 2348-2350, 1995 [10] Haykin S.: ‘Neural Networks', 2nd Edition, Prentice Hall, 1999 [11] Jang J., Sun T. and Mizutany E.: ‘Neuro-fuzzy and soft computing', Prentice Hall, Upper Saddle River, NJ, 1997 [12] Konstantaras A., Varley M.R., Vallianatos F., Collins G. 
and Holifield P.: ‘Detection of weak seismo-electric signals upon the recordings of the electrotelluric field by means of neuron-fuzzy technology', IEEE Geoscience and Remote Sensing Letters, vol. 4 (1), 2007 [13] Konstantaras A., Varley M.R., Vallianatos F., Collins G. and Holifield P.: ‘Neuro-fuzzy prediction-based adaptive filtering applied to severely distorted magnetic field recordings', IEEE Geoscience and Remote Sensing Letters, vol. 3 (4), 2006 [14] Maravelakis E., Bilalis N., Keith J. and Antoniadis A.: ‘Measuring and Benchmarking the Innovativeness of SME's: a three dimensional Fuzzy Logic Approach', Production Planning and Control Journal, vol. 17 (3), pp. 283-292, 2006 [15] Bodri B.: ‘A neural-network model for earthquake occurrence', Geodynamics, vol. 32, pp. 289-310, 2001 [16] Skounakis E., Karagiannis V. and Vlissidis A.: ‘A Versatile System for Real-time Analyzing and Testing Objects Quality', Proceedings-CD of the 4th International Conference on "New Horizons in Industry, Business and Education" (NHIBE 2005), Corfu, Greece, pp. 701-708, 2005
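
    As an illustration of the energy bookkeeping described above, the sketch below converts catalogue magnitudes into radiated energy with the standard Gutenberg-Richter energy-magnitude relation and accumulates it over a sequence. This is a minimal sketch, not the authors' algorithm; the constant 4.8 and the factor 1.5 are the conventional values for energy in joules, and the example magnitudes are hypothetical.

```python
import numpy as np

def radiated_energy_joules(magnitude):
    """Gutenberg-Richter energy-magnitude relation: log10(E) = 1.5*M + 4.8 (E in J)."""
    return 10.0 ** (1.5 * np.asarray(magnitude, dtype=float) + 4.8)

def cumulative_energy_release(magnitudes):
    """Cumulative radiated energy (J) of a seismic sequence, event by event."""
    return np.cumsum(radiated_energy_joules(magnitudes))

# Hypothetical sequence of magnitudes in a seismogenic area
print(cumulative_energy_release([3.1, 2.8, 4.0, 3.3, 5.6]))
```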

  1. Different deformation patterns using GPS in the volcanic process of El Hierro (Canary Island) 2011-2013

    NASA Astrophysics Data System (ADS)

    García-Cañada, Laura; José García-Arias, María; Pereda de Pablo, Jorge; Lamolda, Héctor; López, Carmen

    2014-05-01

    Ground deformation is one of the most important parameters in volcano monitoring. Deformations detected in volcanic areas can be precursors of volcanic activity and contribute useful information for studying the evolution of an unrest, an eruption or any other volcanic process. GPS is the most common technique used to measure volcano deformation. It can detect slow displacement rates as well as much larger and faster deformations associated with a volcanic process. In volcanoes the deformation signal is expected to be mixed in nature: during periods of quiescence it will be slow or absent, while during increased activity slow displacement rates can be detected or much larger and faster deformations can be measured due to magma intrusion, for example in the hours to days before an eruption begins. In response to the anomalous seismicity detected at El Hierro in July 2011, the Instituto Geográfico Nacional (IGN) improved its volcano monitoring network on the island with continuous GPS, which has been used to measure the ground deformation associated with the precursory unrest since summer 2011, the submarine eruption (October 2011-March 2012) and the following unrest periods (2012-2013). The continuous GPS time series, together with other techniques, have been used to evaluate the activity and to detect changes in the process. We investigate changes in the direction and magnitude of the deformation obtained by GPS; they show different patterns in every unrest period, closely tracking the seismicity locations and migrations.

  2. Synthetic seismograms from vibracores: A case study in correlating the late quaternary seismic stratigraphy of the New Jersey inner continental shelf

    USGS Publications Warehouse

    Esker, D.; Sheridan, R.E.; Ashley, G.M.; Waldner, J.S.; Hall, D.W.

    1996-01-01

    A new technique, which uses empirical relationships between median grain size and both density and velocity to calculate proxy values for these properties, avoids many of the problems associated with the use of well logs and shipboard measurements to construct synthetic seismograms. This method was used to ground-truth and correlate both analog and digital shallow high-resolution seismic data on the New Jersey shelf. Sampling dry vibracores to determine median grain size eliminates the detrimental effects that coring disturbances and preservation variables have on the sediment and water content of the core. The link between the seismic response and lithology and bed spacing is therefore more exact. The frequency content of the field seismic data can be realistically simulated with a 10-20 cm sampling interval of the vibracores. The percentage error inherent in this technique, 12% for acoustic impedance and 24% for reflection amplitude, is calculated to one standard deviation and is within a reasonable limit for such a procedure. The synthetic seismograms of two cores, 4-6 m long, were used to correlate specific sedimentary deposits with specific seismic reflection responses. Because this technique is applicable to unconsolidated sediments, it is ideal for upper Pleistocene and Holocene strata. Copyright © 1996, SEPM (Society for Sedimentary Geology).

  3. Analysis of the impact of large scale seismic retrofitting strategies through the application of a vulnerability-based approach on traditional masonry buildings

    NASA Astrophysics Data System (ADS)

    Ferreira, Tiago Miguel; Maio, Rui; Vicente, Romeu

    2017-04-01

    The capacity of buildings to maintain minimum structural safety levels during natural disasters, such as earthquakes, is recognisably one of the aspects that most influences urban resilience. Moreover, public investment in risk mitigation strategies is fundamental, not only to promote social and urban resilience, but also to limit the consequent material, human and environmental losses. Despite the growing awareness of this issue, there is still a vast number of traditional masonry buildings spread throughout many old European city centres that lack adequate seismic resistance and therefore require urgent retrofitting interventions, both to reduce their seismic vulnerability and to cope with the increased seismic requirements of recent code standards. Thus, this paper aims to contribute to mitigating the social and economic impacts of earthquake damage scenarios through a vulnerability-based comparative analysis of some of the most popular retrofitting techniques applied after the 1998 Azores earthquake. The influence of each technique, individually and globally, is studied by resorting to a seismic vulnerability index methodology integrated into a GIS tool, and damage and loss scenarios are constructed and critically discussed. Finally, the economic balance resulting from the implementation of these techniques is also examined.

  4. The integration of gravity, magnetic and seismic data in delineating the sedimentary basins of northern Sinai and deducing their structural controls

    NASA Astrophysics Data System (ADS)

    Selim, El Sayed Ibrahim

    2016-01-01

    The Sinai Peninsula is part of the Sinai sub-plate, located between the Nubian-Arabian shield to the southeast and the southeastern Mediterranean to the north. The main objectives of this investigation are to delineate the main sedimentary basin and its subdivisions, identify the subsurface structural framework that affects the study area, and determine the thickness of the sedimentary cover above the basement surface. The total intensity magnetic map, the Bouguer gravity map and seismic data were used to achieve these aims. Structural interpretation of the gravity and magnetic data was carried out by applying advanced processing techniques, including reduction to the pole (RTP), power spectrum, tilt derivative and analytical signal methods. Two-dimensional gravity and magnetic modeling and interpretation of seismic sections were used to determine the thickness of the sedimentary cover of the study area. The integration of our interpretations suggests that the northern Sinai area consists of elongated troughs that contain many structural highs. Four major structural trends have been identified, reflecting the influence of distinct regional tectonic movements: (1) a NE-SW trend; (2) a NNW-SSE trend; (3) an ENE-WSW trend and (4) a WNW-ESE trend. There are also several minor trends (E-W, NW-SE and N-S). The main sedimentary basin of North Sinai is divided into four sub-basins: (1) Northern Maghara; (2) Northeastern Sinai; (3) Northwestern Sinai and (4) Central Sinai. The sedimentary cover ranges between 2 km and 7 km in the northern part of the study area.

  5. Using 3D visualization and seismic attributes to improve structural and stratigraphic resolution of reservoirs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kerr, J.; Jones, G.L.

    1996-01-01

    Recent advances in hardware and software have given the interpreter and engineer new ways to view 3D seismic data and well bore information. Recent papers have also highlighted the use of various statistics and seismic attributes. By combining new 3D rendering technologies with recent trends in seismic analysis, the interpreter can improve the structural and stratigraphic resolution of hydrocarbon reservoirs. This paper gives several examples using 3D visualization to better define both the structural and stratigraphic aspects of several different structural types from around the world. Statistics, 3D visualization techniques and rapid animation are used to show complex faulting and detailed channel systems. These systems would be difficult to map using either 2D or 3D data with conventional interpretation techniques.

  6. Earthquake Monitoring with the MyShake Global Smartphone Seismic Network

    NASA Astrophysics Data System (ADS)

    Inbal, A.; Kong, Q.; Allen, R. M.; Savran, W. H.

    2017-12-01

    Smartphone arrays have the potential to significantly improve seismic monitoring in sparsely instrumented urban areas. This approach benefits from the dense spatial coverage of users, as well as from the communication and computational capabilities built into smartphones, which facilitate the transfer and analysis of large volumes of seismic data. Advantages in data acquisition with smartphones trade off against factors such as the low-quality sensors installed in phones, high noise levels, and strong network heterogeneity, all of which limit effective seismic monitoring. Here we utilize network and array-processing schemes to assess event detectability with the MyShake global smartphone network. We examine the benefits of using this network in either triggered or continuous modes of operation. A global database of ground motions measured on stationary phones triggered by M2-6 events is used to establish detection probabilities. We find that the probability of detecting an M=3 event with a single phone located <10 km from the epicenter exceeds 70%. Due to the sensors' self-noise, smaller-magnitude events at short epicentral distances are very difficult to detect. To increase the signal-to-noise ratio, we employ array back-projection techniques on continuous data recorded by thousands of phones. In this class of methods, the array is used as a spatial filter that suppresses signals emitted from shallow noise sources. Filtered traces are stacked to further enhance seismic signals from deep sources. We benchmark our technique against traditional location algorithms using recordings from California, a region with a large MyShake user base. We find that locations derived from back-projection images of M 3 events recorded by >20 nearby phones closely match the regional catalog locations. We use simulated broadband seismic data to examine how location uncertainties vary with user distribution and noise levels. To this end, we have developed an empirical noise model for the metropolitan Los Angeles (LA) area. We find that densities larger than 100 stationary phones/km2 are required to accurately locate M 2 events in the LA basin. Given the projected MyShake user distribution, that condition may be met within the next few years.
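
    The back-projection idea described above can be prototyped as a delay-and-sum stack of characteristic functions (for example smoothed envelopes) over a grid of trial epicentres. The sketch below is a generic illustration, not the MyShake implementation; the apparent velocity, coordinate units and grid are assumptions.

```python
import numpy as np

def backproject(char_funcs, fs, station_xy, grid_xy, v_app=3.0):
    """
    Delay-and-sum back-projection of characteristic functions onto trial sources.
    char_funcs: (n_stations, n_samples) array; station_xy, grid_xy in km;
    v_app: assumed apparent velocity in km/s; fs: sampling rate in Hz.
    Returns the peak stack amplitude at every grid node.
    """
    n_sta, n_samp = char_funcs.shape
    image = np.zeros(len(grid_xy))
    for k, (gx, gy) in enumerate(grid_xy):
        dist = np.hypot(station_xy[:, 0] - gx, station_xy[:, 1] - gy)
        shifts = np.round(dist / v_app * fs).astype(int)   # travel-time delay in samples
        stack = np.zeros(n_samp)
        for i in range(n_sta):
            s = shifts[i]
            stack[:n_samp - s] += char_funcs[i, s:]        # align to origin time and sum
        image[k] = stack.max() / n_sta
    return image
```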

  7. Seismic Tomography

    NASA Astrophysics Data System (ADS)

    Nowack, Robert L.; Li, Cuiping

    The inversion of seismic travel-time data for radially varying media was initially investigated by Herglotz, Wiechert, and Bateman (the HWB method) in the early part of the 20th century [1]. Tomographic inversions for laterally varying media began in seismology in the 1970s. This included early work by Aki, Christoffersson, and Husebye, who developed an inversion technique for estimating lithospheric structure beneath a seismic array from distant earthquakes (the ACH method) [2]. Also, Alekseev and others in Russia performed early inversions of refraction data for laterally varying upper-mantle structure [3]. Aki and Lee [4] developed an inversion technique using travel-time data from local earthquakes.
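
    At their core, such travel-time inversions solve a linearized system relating slowness perturbations in model cells to observed travel-time residuals. A minimal damped least-squares update is sketched below; the ray-path matrix G and the damping value are hypothetical inputs, and real tomographic codes add regularization and iteration schemes not shown here.

```python
import numpy as np

def slowness_update(G, residuals, damping=0.1):
    """
    Damped least-squares solution of G * dm = dt, where G[i, j] is the length of
    ray i inside cell j and dt holds travel-time residuals. Returns the slowness
    perturbation dm for every cell.
    """
    n_cells = G.shape[1]
    A = G.T @ G + (damping ** 2) * np.eye(n_cells)
    return np.linalg.solve(A, G.T @ residuals)
```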

  8. Robust Satellite Techniques for monitoring earth emitted radiation in the Japanese seismic area by using MTSAT observations in the TIR spectral range

    NASA Astrophysics Data System (ADS)

    Genzano, Nicola; Filizzola, Carolina; Hattori, Katsumi; Lisi, Mariano; Paciello, Rossana; Pergola, Nicola; Tramutoli, Valerio

    2016-04-01

    Since the eighties, fluctuations of the Earth's thermally emitted radiation, measured by satellite sensors operating in the thermal infrared (TIR) spectral range, have been associated with the complex process of preparation for major earthquakes. But, like other claimed earthquake precursors (seismological, physical, chemical, biological, etc.), they have long been considered with some caution by the scientific community. The lack of a rigorous definition of anomalous TIR signal fluctuations, and the scarce attention paid to the possibility that causes other than seismic activity (e.g. meteorological) could be responsible for the observed TIR variations, were the main reasons for such skepticism. Compared with previously proposed approaches, the general change-detection approach named Robust Satellite Techniques (RST) has shown good ability to discriminate anomalous TIR signals possibly associated with seismic activity from the normal variability of the TIR signal due to other causes. Thanks to its full exportability to different satellite packages, since 2001 RST has been applied to TIR images acquired by polar (e.g. NOAA-AVHRR, EOS-MODIS) and geostationary (e.g. MSG-SEVIRI, NOAA-GOES/W, GMS-5/VISSR) satellite sensors, in order to verify the presence (or absence) of TIR anomalies in the presence (or absence) of earthquakes (with M>4) in different seismogenic areas around the world (e.g. Italy, Greece, Turkey, India, Taiwan). In this paper, the RST data analysis approach is applied to TIR satellite records collected over Japan by the geostationary satellite sensor MTSAT (Multifunctional Transport SATellites), and the RETIRA (Robust Estimator of TIR Anomalies) index is used to identify Significant Sequences of TIR Anomalies (SSTAs) in possible space-time relation with seismic events. The results are discussed in the perspective of a multi-parametric approach for a time-Dependent Assessment of Seismic Hazard (t-DASH).

  9. Full-waveform seismic tomography of the Vrancea, Romania, subduction region

    NASA Astrophysics Data System (ADS)

    Baron, Julie; Morelli, Andrea

    2017-12-01

    The Vrancea region is one of the few locations of deep seismicity in Europe. Seismic tomography has been able to map lithospheric downwelling, but has not yet been able to clearly discriminate between competing geodynamic interpretations of the available geological and geophysical evidence. We study the seismic structure of the Vrancea subduction zone using adjoint-based, full-waveform tomography to map the 3D vP and vS structure in detail. We use the database built during the CALIXTO (Carpathian Arc Lithosphere X-Tomography) temporary experiment, restricted to the broadband sensors and local intermediate-depth events. We fit waveforms with a cross-correlation misfit criterion in separate time windows around the expected P and S arrivals, and perform 17 iterations of vP and vS model updates (altogether requiring about 16 million CPU hours) before reaching stable convergence. Among other features, our resulting model shows a nearly vertical, high-velocity body that overlaps with the distribution of seismicity in its northeastern part. In its southwestern part, the slab appears to dip less steeply to the NW, suggestive of ongoing - or recently concluded - subduction. Joint inversion for vP and vS allows us to address the vP/vS ratio distribution, which shows high vP/vS in the crust beneath the Focsani sedimentary basin - possibly due to high fluid pressure - and a low-vP/vS edge along the lower plane of the subducting lithosphere that in other, similar environments has been attributed to dehydration of serpentine in the slab. In spite of the restricted amount of data available, and limitations on the usable frequency pass-band, full-waveform inversion shows its potential to improve the general quality of imaging with respect to other tomographic techniques, although at a considerable cost in computing resources. Our study also shows that re-analysis of legacy data sets with up-to-date techniques may bring new, useful information.

  10. A new moonquake catalog from Apollo 17 geophone data

    NASA Astrophysics Data System (ADS)

    Dimech, Jesse-Lee; Knapmeyer-Endrun, Brigitte; Weber, Renee

    2017-04-01

    New lunar seismic events have been detected in geophone data from the Apollo 17 Lunar Seismic Profiling Experiment (LSPE). This dataset is already known to contain an abundance of thermal seismic events, and potentially some meteorite impacts, but prior to this study only 26 days of LSPE "listening mode" data had been analysed. In this new analysis, additional listening-mode data collected between August 1976 and April 1977 are incorporated. To the authors' knowledge these 8 months of data have not yet been used to detect seismic moonquake events. The geophones in question are situated adjacent to the Apollo 17 site in the Taurus-Littrow valley, about 5.5 km east of the Lee-Lincoln scarp and between the North and South Massifs. Any of these features are potential seismic sources. We have used an event-detection and classification technique based on hidden Markov models to automatically detect and categorize seismic signals, in order to objectively generate a seismic event catalog. Currently, 2.5 months of the 8-month listening-mode dataset have been processed, totaling 14,338 detections. Of these, 672 detections (classification "n1") have a sharp onset with a steep rise time, suggesting they occur close to the recording geophone. These events almost all occur in association with lunar sunrise, over a span of 1-2 days. One possibility is that these events originate from the nearby Apollo 17 lunar lander due to rapid heating at sunrise. A further 10,004 detections (classification "d1") show strong diurnal periodicity, with detections increasing during the lunar day and peaking at sunset, and therefore probably represent thermal events from the lunar regolith immediately surrounding the Apollo 17 landing site. The final 3,662 detections (classification "d2") have emergent onsets and relatively long durations. These detections have peaks associated with lunar sunrise and sunset, but also sometimes peak at seemingly random times. Their source mechanism has not yet been investigated. It is possible that many of these are misclassified d1/n1 events, and further QC work needs to be undertaken; but it is also possible that many of them represent more distant thermal moonquakes, e.g. from the North and South Massifs, or even the ridge adjacent to the Lee-Lincoln scarp. The unknown event spikes will be the subject of closer inspection once the HMM technique has been refined.
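
    A hidden-Markov-model detector of the kind mentioned can be prototyped by fitting one HMM per event class to feature sequences (for example envelope or spectral features) and classifying new signals by maximum likelihood. The sketch below assumes the third-party hmmlearn package and pre-computed feature arrays; it is an illustration of the general approach, not the catalogue's actual pipeline.

```python
import numpy as np
from hmmlearn import hmm  # assumed third-party dependency

def train_class_models(sequences_by_class, n_states=4):
    """Fit one Gaussian HMM per event class, e.g. {'n1': [...], 'd1': [...], 'd2': [...]}.
    Each sequence is a (n_windows, n_features) array of signal features."""
    models = {}
    for label, sequences in sequences_by_class.items():
        X = np.vstack(sequences)                  # stack all sequences of this class
        lengths = [len(s) for s in sequences]     # remember sequence boundaries
        m = hmm.GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
        m.fit(X, lengths)
        models[label] = m
    return models

def classify(models, feature_sequence):
    """Return the class whose HMM assigns the highest log-likelihood to the sequence."""
    return max(models, key=lambda label: models[label].score(feature_sequence))
```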

  11. Constraints on temporal velocity variations associated with an underground gas storage in the Gulf of Valencia using earthquake and seismic ambient noise data

    NASA Astrophysics Data System (ADS)

    Ugalde, Arantza; Gaite, Beatriz; Villaseñor, Antonio

    2016-04-01

    During September 2013, the injection of base gas in a depleted oil reservoir used as an underground natural gas storage facility (CASTOR) caused a sudden increase in seismic activity off the eastern coast of Spain. As a result, a compact cluster of more than 550 earthquakes with magnitudes mbLg > 0.7 was located in the shallow offshore area of the Gulf of Valencia over two months. The strongest event, with a magnitude of Mw=4.2, was followed by two Mw=4.1 events the day after and took place once the gas injection activities had finished. Using the seismic data recorded by permanent stations more than 25 km from the injection well, we applied coda wave interferometry to monitor changes in the seismic velocity structure between similar earthquakes. We then solved for a continuous function of velocity change with time by combining observations from all the closely located earthquake sources. The rate of repeating events allowed measurements of relative velocity variations on a daily scale for about 30 days. To extend the analysis in time, we also processed the continuous data using the autocorrelation of band-pass-filtered ambient seismic noise. A 10-day average was required to achieve a sufficient signal-to-noise ratio in the 0.2-0.5 Hz and 0.5-1 Hz frequency bands. We quantified the time lags between two traces in the frequency and time domains by means of Moving Window Cross Spectral Analysis and a Dynamic Time Warping technique, respectively. Injection of fluids into geologic formations causes variations in seismic velocities associated with changes in fluid saturation, increases in pore pressure, or the opening or enlargement of cracks due to the injection process. Time delays associated with stress changes caused by moderate to large earthquakes have also been established. In this work, we found no velocity changes either during the gas injection period or at the time of the Mw 4.2 earthquake. The sensitivity of the method depends on the seismic network geometry and the lateral extent of the velocity anomaly. Given the network configuration, we conclude that any temporal changes in seismic velocity in the CASTOR gas storage area must be smaller than 0.05%.
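
    For context, the simplest way to quantify a relative velocity change between a reference and a current coda waveform is the stretching technique, a common alternative to the moving-window cross-spectral and dynamic-time-warping measurements used in the study. The sketch below is illustrative only; window selection, filtering and error estimation are omitted.

```python
import numpy as np

def stretching_dvv(ref, cur, fs, eps_max=0.005, n_eps=101):
    """
    Estimate dv/v by the stretching technique: the current coda is time-stretched
    until it best correlates with the reference; dv/v = -epsilon at the maximum.
    ref, cur: equally long coda windows; fs: sampling rate in Hz.
    """
    t = np.arange(len(ref)) / fs
    best_cc, best_eps = -1.0, 0.0
    for eps in np.linspace(-eps_max, eps_max, n_eps):
        stretched = np.interp(t * (1.0 + eps), t, cur)   # resample cur on stretched times
        cc = np.corrcoef(ref, stretched)[0, 1]
        if cc > best_cc:
            best_cc, best_eps = cc, eps
    return -best_eps, best_cc   # relative velocity change and correlation coefficient
```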

  12. Computer power fathoms the depths: billion-bit data processors illuminate the subsurface. [3-D Seismic techniques]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ross, J.J.

    Some of the same space-age signal technology being used to track events 200 miles above the earth is helping petroleum explorationists track down oil and natural gas two miles and more down into the earth. The breakthroughs, which have come in a technique called three-dimensional seismic work, could change the complexion of exploration for oil and natural gas. Thanks to this 3-D seismic approach, explorationists can make dynamic maps of sites miles beneath the surface. Then explorationists can throw these maps on space-age computer systems and manipulate them every which way - homing in sharply on salt domes, faults, sands and traps associated with oil and natural gas. ''The 3-D seismic scene has exploded within the last two years,'' says Peiter Tackenberg, Marathon technical consultant who deals with both domestic and international exploration. The 3-D technique has been around for more than a decade, he notes, but recent achievements in space-age computer hardware and software have unlocked its full potential.

  13. Seismic and Biological Sources of Ambient Ocean Sound

    NASA Astrophysics Data System (ADS)

    Freeman, Simon Eric

    Sound propagates more efficiently in the ocean than any other form of radiation. Sounds of seismic and biological origin contain information regarding the underlying processes that created them. A single hydrophone records summary time-frequency information from the volume within acoustic range. Beamforming using a hydrophone array additionally produces azimuthal estimates of sound sources. A two-dimensional array and acoustic focusing produce an unambiguous two-dimensional 'image' of sources. This dissertation describes the application of these techniques in three cases. The first utilizes hydrophone arrays to investigate T-phases (water-borne seismic waves) in the Philippine Sea. Ninety T-phases were recorded over a 12-day period, implying that more seismic events occur than are detected by terrestrial seismic monitoring in the region. Observation of an azimuthally migrating T-phase suggests that reverberation of such sounds from bathymetric features can occur over megameter scales. In the second case, single-hydrophone recordings from coral reefs in the Line Islands archipelago reveal that local ambient reef sound is spectrally similar to sounds produced by small, hard-shelled benthic invertebrates in captivity. Time-lapse photography of the reef reveals an increase in benthic invertebrate activity at sundown, consistent with an increase in sound level. The dominant acoustic phenomenon on these reefs may thus originate from the interaction between a large number of small invertebrates and the substrate. Such sounds could be used to take a census of hard-shelled benthic invertebrates that are otherwise extremely difficult to survey. In the third case, a two-dimensional 'map' of sound production over a coral reef in the Hawaiian Islands was obtained using a two-dimensional hydrophone array. Heterogeneously distributed bio-acoustic sources were generally co-located with rocky reef areas. Acoustically dominant snapping shrimp were largely restricted to one location within the area surveyed. This distribution of sources could reveal small-scale spatial ecological limitations, such as the availability of food and shelter. While array-based passive acoustic sensing is well established in seismoacoustics, the technique is little used in the study of ambient biological sound. With the continuance of Moore's law and advances in battery and memory technology, inferring biological processes from ambient sound may become a more accessible tool in underwater ecological evaluation and monitoring.
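
    Azimuthal estimates from a hydrophone array of the kind described are commonly obtained by plane-wave delay-and-sum beamforming. A minimal sketch follows; the water sound speed, integer-sample shifts and azimuth convention (back-azimuth toward the source, measured clockwise from north) are simplifying assumptions.

```python
import numpy as np

def plane_wave_beamform(traces, fs, sensor_xy, azimuths_deg, c=1500.0):
    """
    Delay-and-sum plane-wave beamforming for a 2-D hydrophone array.
    traces: (n_sensors, n_samples); sensor_xy in metres (x east, y north);
    c: assumed sound speed in m/s. Returns beam power versus trial azimuth.
    """
    n_sen, n_samp = traces.shape
    power = np.zeros(len(azimuths_deg))
    for k, az in enumerate(np.deg2rad(azimuths_deg)):
        u = np.array([np.sin(az), np.cos(az)])            # unit vector toward the source
        delays = sensor_xy @ u / c                        # relative arrival-time advances (s)
        shifts = np.round(delays * fs).astype(int)
        beam = np.zeros(n_samp)
        for i in range(n_sen):
            beam += np.roll(traces[i], shifts[i])         # circular shift is fine for a sketch
        power[k] = np.mean((beam / n_sen) ** 2)
    return power
```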

  14. Dissipative Intraplate Faulting During the 2016 Mw 6.2 Tottori, Japan Earthquake

    NASA Astrophysics Data System (ADS)

    Ross, Zachary E.; Kanamori, Hiroo; Hauksson, Egill; Aso, Naofumi

    2018-02-01

    The 2016 Mw 6.2 Tottori earthquake occurred on 21 October 2016 and produced thousands of aftershocks. Here we analyze high-resolution-relocated seismicity together with source properties of the mainshock to better understand the rupture process and energy budget. We use a matched-filter algorithm to detect and precisely locate >10,000 previously unidentified aftershocks, which delineate a network of sharp subparallel lineations exhibiting significant branching and segmentation. Seismicity below 8 km depth forms highly localized fault structures subparallel to the mainshock strike. Shallow seismicity near the main rupture plane forms more diffuse clusters and lineations that often are at a high angle (in map view) to the mainshock strike. An empirical Green's function technique is used to derive apparent source time functions for the mainshock, which show a large amplitude pulse 2-4 s long. We invert the apparent source time functions for a slip distribution and observe a 16 km2 patch with average slip 3.2 m. 93% of the seismic moment is below 8 km depth, which is approximately the depth below which the seismicity becomes very localized. These observations suggest that the mainshock rupture area was entirely within the lower half of the seismogenic zone. The radiated seismic energy is estimated to be 5.7 × 1013 J, while the static stress drop is estimated to be 18-27 MPa. These values yield a radiation efficiency of 5-7%, which indicates that the Tottori mainshock was extremely dissipative. We conclude that this inefficiency in energy radiation is likely a product of the immature intraplate environment and the underlying geometric complexity.
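
    The quoted energy budget can be reproduced from standard relations. The short check below uses the radiated energy and stress-drop bounds stated above; the Hanks-Kanamori moment-magnitude relation and the rigidity value of 30 GPa are conventional assumptions, not numbers from the study.

```python
# Worked check of the stated radiation efficiency (illustrative assumptions noted in comments)
Mw = 6.2
M0 = 10 ** (1.5 * Mw + 9.1)       # seismic moment in N*m (Hanks-Kanamori relation)
Er = 5.7e13                       # radiated seismic energy in J (from the abstract)
mu = 30e9                         # shear modulus in Pa (assumed typical crustal value)

for stress_drop in (18e6, 27e6):  # static stress drop bounds in Pa (from the abstract)
    available = stress_drop * M0 / (2.0 * mu)   # available strain energy ~ (delta_sigma / 2 mu) * M0
    print(f"stress drop {stress_drop / 1e6:.0f} MPa -> radiation efficiency {Er / available:.1%}")
# Prints roughly 7.6% and 5.0%, consistent with the 5-7% quoted above.
```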

  15. Large Subduction Earthquake Simulations using Finite Source Modeling and the Offshore-Onshore Ambient Seismic Field

    NASA Astrophysics Data System (ADS)

    Viens, L.; Miyake, H.; Koketsu, K.

    2016-12-01

    Large subduction earthquakes have the potential to generate strong long-period ground motions. The ambient seismic field, also called seismic noise, contains information about the elastic response of the Earth between two seismic stations that can be retrieved using seismic interferometry. The DONET1 network, composed of 20 offshore stations, has been deployed atop the Nankai subduction zone, Japan, to continuously monitor the seismotectonic activity in this highly seismically active region. The surrounding onshore area is covered by hundreds of seismic stations, operated by the National Research Institute for Earth Science and Disaster Prevention (NIED) and the Japan Meteorological Agency (JMA), with a spacing of 15-20 km. We retrieve offshore-onshore Green's functions from the ambient seismic field using the deconvolution technique and use them to simulate the long-period ground motions of moderate subduction earthquakes that occurred at shallow depth. We extend the point-source method, which is appropriate for moderate events, to finite-source modeling in order to simulate the long-period ground motions of large Mw 7 class earthquake scenarios. The source models are constructed using scaling relations between moderate and large earthquakes to discretize the fault plane of the large hypothetical events into subfaults. Offshore-onshore Green's functions are spatially interpolated over the fault plane to obtain one Green's function for each subfault. The interpolated Green's functions are finally summed considering different rupture velocities. Results show that this technique can provide additional information about earthquake ground motions that can be used together with existing physics-based simulations to improve seismic hazard assessment.

  16. OCT structure, COB location and magmatic type of the S Angolan & SE Brazilian margins from integrated quantitative analysis of deep seismic reflection and gravity anomaly data

    NASA Astrophysics Data System (ADS)

    Cowie, Leanne; Kusznir, Nick; Horn, Brian

    2014-05-01

    Integrated quantitative analysis using deep seismic reflection data and gravity inversion has been applied to the S Angolan and SE Brazilian margins to determine OCT structure, COB location and magmatic type. Knowledge of these margin parameters is of critical importance for understanding rifted continental margin formation processes and for evaluating petroleum systems in deep-water frontier oil and gas exploration. The OCT structure, COB location and magmatic type of the S Angolan and SE Brazilian rifted continental margins are much debated; exhumed and serpentinised mantle have been reported at these margins. Gravity anomaly inversion, incorporating a lithosphere thermal gravity anomaly correction, has been used to determine Moho depth, crustal basement thickness and continental lithosphere thinning. Residual Depth Anomaly (RDA) analysis has been used to investigate OCT bathymetric anomalies with respect to expected oceanic bathymetries, and subsidence analysis has been used to determine the distribution of continental lithosphere thinning. These techniques have been validated against profiles Lusigal 12 and ISE-01 on the Iberian margin. In addition, a joint inversion technique using deep seismic reflection and gravity anomaly data has been applied to the ION-GXT BS1-575 SE Brazil and ION-GXT CS1-2400 S Angola deep seismic reflection lines. The joint inversion method solves for a coincident seismic and gravity Moho in the time domain and calculates the lateral variations in crustal basement densities and velocities along the seismic profiles. Gravity inversion, RDA and subsidence analysis along the ION-GXT BS1-575 profile, which crosses the Sao Paulo Plateau and Florianopolis Ridge of the SE Brazilian margin, predict the COB to be located SE of the Florianopolis Ridge. Integrated quantitative analysis shows no evidence for exhumed mantle on this margin profile. The joint inversion technique predicts oceanic crustal thicknesses of between 7 and 8 km, with normal oceanic basement seismic velocities and densities. Beneath the Sao Paulo Plateau and Florianopolis Ridge, the joint inversion predicts crustal basement thicknesses of 10-15 km, with high basement densities and seismic velocities under the Sao Paulo Plateau which are interpreted as indicating a significant magmatic component within the crustal basement. The Sao Paulo Plateau and Florianopolis Ridge are separated by a thin region of crustal basement beneath the salt, interpreted as a regional transtensional structure. Sediment-corrected RDAs and gravity-derived "synthetic" RDAs are of similar magnitude on oceanic crust, implying negligible mantle dynamic topography. Gravity inversion, RDA and subsidence analysis along the S Angolan ION-GXT CS1-2400 profile suggest that exhumed mantle, corresponding to a magma-poor margin, is absent. The thickness of the earliest oceanic crust, derived from gravity and deep seismic reflection data, is approximately 7 km, consistent with the global average oceanic crustal thickness. The joint inversion predicts a small difference between oceanic and continental crustal basement density and seismic velocity, with the change in basement density and velocity corresponding to the COB independently determined from RDA and subsidence analysis. The difference between the sediment-corrected RDA and that predicted from the gravity inversion crustal thickness variation implies that this margin is experiencing approximately 500 m of anomalous uplift, attributed to mantle dynamic uplift.

  17. Development of a technique for long-term detection of precursors of strong earthquakes using high-resolution satellite images

    NASA Astrophysics Data System (ADS)

    Soto-Pinto, C. A.; Arellano-Baeza, A. A.; Ouzounov, D. P.

    2012-12-01

    Among the variety of processes involved in seismic activity, the principal one is the accumulation and relaxation of stress in the crust, which takes place at depths of tens of kilometers. While the Earth's surface bears at most indirect signs of the accumulation and relaxation of crustal stress, it has long been understood that there is a strong correspondence between the structure of the underlying crust and the landscape. We assume that the structure of the lineaments reflects an internal structure of the Earth's crust, and that variations in lineament number and arrangement reflect changes in the stress patterns related to seismic activity. Contrary to the existing assumption that lineament structure changes only on geological timescales, we have found that much faster seismic activity strongly affects the system of lineaments extracted from high-resolution multispectral satellite images. Previous studies have shown that the accumulation of stress in the crust prior to a strong earthquake is directly related to an increase in the number and a preferential orientation of the lineaments present in satellite images of epicenter zones. This effect increases with the earthquake magnitude and can be observed from approximately one month before the event. To study this effect in detail we have developed software based on a series of algorithms for automatic detection of lineaments. We found that the Hough transform, implemented after the application of discontinuity-detection mechanisms such as the Canny edge detector or directional filters, is the most robust technique for detecting and characterizing changes in the lineament patterns related to strong earthquakes, and can be used as a robust long-term precursor of earthquakes indicating regions of strong stress accumulation.
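
    The lineament-extraction step described (edge detection followed by a Hough transform) can be prototyped in a few lines with OpenCV; the thresholds, minimum line length and gap tolerance below are illustrative assumptions rather than the authors' tuned parameters.

```python
import numpy as np
import cv2  # OpenCV, assumed available

def extract_lineaments(image_path, canny_lo=50, canny_hi=150):
    """
    Minimal Canny + probabilistic Hough pipeline: detect edges on a grayscale
    satellite image band, then extract straight-line segments (lineaments).
    Returns a list of (x1, y1, x2, y2) segments in pixel coordinates.
    """
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    edges = cv2.Canny(img, canny_lo, canny_hi)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                            threshold=80, minLineLength=40, maxLineGap=5)
    return [] if lines is None else lines[:, 0]
```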

  18. Near Surface Seismic Hazard Characterization in the Presence of High Velocity Contrasts

    NASA Astrophysics Data System (ADS)

    Gribler, G.; Mikesell, D.; Liberty, L. M.

    2017-12-01

    We present new multicomponent surface wave processing techniques that provide accurate characterization of near-surface conditions in the presence of large lateral or vertical shear wave velocity boundaries. A common problem with vertical component Rayleigh wave analysis in the presence of high contrast subsurface conditions is Rayleigh wave propagation mode misidentification due to an overlap of frequency-phase velocity domain dispersion, leading to an overestimate of shear wave velocities. By using the vertical and horizontal inline component signals, we isolate retrograde and prograde particle motions to separate fundamental and higher mode signals, leading to more accurate and confident dispersion curve picks and shear wave velocity estimates. Shallow, high impedance scenarios, such as the case with shallow bedrock, are poorly constrained when using surface wave dispersion information alone. By using a joint inversion of dispersion and horizontal-to-vertical (H/V) curves within active source frequency ranges (down to 3 Hz), we can accurately estimate the depth to high impedance boundaries, a significant improvement compared to the estimates based on dispersion information alone. We compare our approach to body wave results that show comparable estimates of bedrock topography. For lateral velocity contrasts, we observe horizontal polarization of Rayleigh waves identified by an increase in amplitude and broadening of the horizontal spectra with little variation in the vertical component spectra. The horizontal spectra offer a means to identify and map near surface faults where there is no topographic or clear body wave expression. With these new multicomponent active source seismic data processing and inversion techniques, we better constrain a variety of near surface conditions critical to the estimation of local site response and seismic hazards.
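
    A building block of the joint dispersion and H/V inversion mentioned above is the horizontal-to-vertical spectral ratio itself, sketched here from three-component records using Welch spectra. This is a generic illustration, not the authors' processing code; the segment length and the quadratic-mean combination of the horizontals are common but assumed choices.

```python
import numpy as np
from scipy.signal import welch

def hv_curve(z, n, e, fs, nperseg=4096):
    """
    Horizontal-to-vertical (H/V) spectral ratio from three-component records:
    quadratic mean of the two horizontal power spectra over the vertical one.
    Returns (frequencies, H/V ratio).
    """
    f, pzz = welch(z, fs, nperseg=nperseg)
    _, pnn = welch(n, fs, nperseg=nperseg)
    _, pee = welch(e, fs, nperseg=nperseg)
    hv = np.sqrt((pnn + pee) / (2.0 * pzz))
    return f, hv
```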

  19. On the recovery of missing low and high frequency information from bandlimited reflectivity data

    NASA Astrophysics Data System (ADS)

    Sacchi, M. D.; Ulrych, T. J.

    2007-12-01

    During the last two decades, an important effort has been made in the seismic exploration community to retrieve broad-band seismic data by means of deconvolution and inversion. In general, the problem can be stated as a spectral reconstruction problem: given limited spectral information about the earth's reflectivity sequence, one attempts to create a broadband estimate of the Fourier spectrum of the unknown reflectivity. Techniques based on the principle of parsimony can be effectively used to retrieve a sparse spike sequence and, consequently, a broad-band signal. Alternatively, continuation methods, e.g. autoregressive modeling, can be used to extrapolate the recorded bandwidth of the seismic signal. The goal of this paper is to examine under what conditions the recovery of low and high frequencies from band-limited and noisy signals is possible. At the heart of the methods we discuss is the celebrated non-Gaussian assumption, so important in many modern signal processing methods such as ICA. Spectral recovery from limited information tends to work when the reflectivity consists of a few well-isolated events. Results degrade with the number of reflectors, decreasing SNR and decreasing bandwidth of the source wavelet. Constraints and information-based priors can be used to stabilize the recovery but, as in all inverse problems, the solution is nonunique and effort is required to understand the level of recovery that is achievable, always keeping the physics of the problem in mind. We provide in this paper a survey of methods to recover broad-band reflectivity sequences and examine the role that these techniques can play in processing and inversion as applied to exploration and global seismology.

  1. Exploring the interior of Venus with seismic and infrasonic techniques

    NASA Astrophysics Data System (ADS)

    Jackson, J. M.; Cutts, J. A.; Pauken, M.; Komjathy, A.; Smrekar, S. E.; Kedar, S.; Mimoun, D.; Garcia, R.; Schubert, G.; Lebonnois, S.; Stevenson, D. J.; Lognonne, P. H.; Zhan, Z.; Ko, J. Y. T.; Tsai, V. C.

    2016-12-01

    The dense atmosphere of Venus, which efficiently couples seismic energy into the atmosphere as infrasonic waves, enables an alternative to conventional seismology: detection of infrasonic waves in the upper atmosphere using either high-altitude balloons or orbiting spacecraft. Infrasonic techniques for probing the interior of Venus can be implemented without exposing sensors to the severe surface environment of Venus. This approach takes advantage of the fact that approximately sixty times as much energy from a seismic event on Venus is coupled into the atmosphere as would occur for a comparable event on Earth. The direct, or epicentral, wave propagates vertically above the event, and the indirect wave propagates through the planet as a Rayleigh wave and then couples to an infrasonic wave. Although there is abundant evidence of tectonic activity on Venus, questions remain as to whether the planet is still active and whether energy releases are seismic or aseismic. In recent years, seismologists have developed techniques for probing crustal and interior structure in parts of the Earth where there are very few quakes. We have begun an effort to determine whether this is also possible for Venus. Just as seismic energy propagates more efficiently upward across the surface-atmosphere interface, acoustic energy originating in the atmosphere will propagate downward more effectively. Measurements from a balloon platform in the atmosphere of Venus could assess the nature and spectral content of such sources, while having the ability to identify and discriminate signatures from volcanic events, storm activity, and meteor impacts. We will discuss our ongoing assessment of the feasibility of a balloon acoustic monitoring system. In particular, we will highlight results of a flight experiment on Earth using barometers on a tethered helium-filled balloon in the vicinity of a known seismic source generated by a seismic hammer. Implications for conducting such measurements on Venus, including seismic and aseismic energy sources and propagation through its atmosphere, will also be discussed.

  2. Patterns of significant seismic quiescence in the Pacific Mexican coast

    NASA Astrophysics Data System (ADS)

    Muñoz-Diosdado, Alejandro; Rudolf-Navarro, Adolfo; Barrera-Ferrer, Amilcar; Angulo-Brown, Fernando

    2014-05-01

    Mexico is one of the countries with the highest seismicity. During the 20th century, 8% of all earthquakes in the world of magnitude greater than or equal to 7.0 took place in Mexico; on average, an earthquake of magnitude greater than or equal to 7.0 occurred in Mexico every two and a half years. Great earthquakes in Mexico have their epicenters on the Pacific coast, where several seismic gaps have been identified; for example, there is a mature gap along the Guerrero State coast which could potentially produce an earthquake of magnitude 8.2. With the purpose of making a prognosis, some researchers study the statistical behavior of certain physical parameters that could be related to the process of stress accumulation in the Earth's crust. Other researchers study seismic catalogs, trying to find seismicity patterns that are manifested before the occurrence of great earthquakes. Many authors have proposed that the study of seismicity rates is an appropriate technique for evaluating how close a seismic gap may be to rupture. We designed an algorithm for the identification of patterns of significant seismic quiescence using the definition of seismic quiescence proposed by Schreider (1990); the algorithm delineates the area of quiescence where an earthquake of great magnitude will probably occur. We apply our algorithm to the earthquake catalogue of the Mexican Pacific coast between 14 and 21 degrees North latitude and 94 and 106 degrees West longitude, with depths less than or equal to 60 km and magnitudes greater than or equal to 4.2, for the period from September 1965 to December 2014. We have found significant patterns of seismic quiescence before the earthquakes of Oaxaca (November 1978, Mw = 7.8), Petatlán (March 1979, Mw = 7.6), Michoacán (September 1985, Mw = 8.0 and Mw = 7.6) and Colima (October 1995, Mw = 8.0). Fortunately, no earthquakes of great magnitude have occurred in Mexico in this century; however, we have identified well-defined seismic quiescence in the Guerrero seismic gap, which is apparently correlated with the occurrence of the silent earthquakes of 2002, 2006 and 2011 recently discovered by GPS technology. In fact, a possible silent earthquake with Mw = 7.6 occurred at this gap in 2002, lasted approximately 4 months, and was detected by continuous GPS receivers located over an area of ~550x250 square kilometers.
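
    A minimal sliding-window rate test gives the flavour of such quiescence algorithms: compare the event count in each window with the long-term expectation and flag significant deficits. The sketch below is illustrative and deliberately simpler than the Schreider (1990) definition used by the authors; window lengths and the Poisson-based Z statistic are assumptions.

```python
import numpy as np

def quiescence_z(event_times, window_days=365.0, step_days=30.0):
    """
    Sliding-window rate test for seismic quiescence: positive Z values flag
    windows with fewer events than expected from the long-term mean rate.
    event_times: event origin times in days (any common reference).
    """
    t = np.sort(np.asarray(event_times, dtype=float))
    t0, t1 = t[0], t[-1]
    background_rate = len(t) / (t1 - t0)                 # events per day over the catalog
    starts = np.arange(t0, t1 - window_days, step_days)
    z = []
    for s in starts:
        n_obs = np.sum((t >= s) & (t < s + window_days))
        n_exp = background_rate * window_days
        z.append((n_exp - n_obs) / np.sqrt(n_exp))       # Poisson std ~ sqrt(expected count)
    return starts, np.array(z)
```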

  3. Modelling framework developed for managing and forecasting the El Hierro 2011-2014 unrest processes based on the analysis of the seismicity and deformation data rate.

    NASA Astrophysics Data System (ADS)

    Garcia, Alicia; Fernandez-Ros, Alberto; Berrocoso, Manuel; Marrero, Jose Manuel; Prates, Gonçalo; De la Cruz-Reyna, Servando; Ortiz, Ramon

    2014-05-01

    In July 2011 a volcanic unrest was detected at El Hierro (Canary Islands, Spain), with significant deformation followed by increased seismicity. A submarine eruption started on 10 October 2011 and ceased on 5 March 2012, after the volcanic tremor signals had persistently weakened through February 2012. However, the seismic activity did not end with the eruption, as several other seismic crises have followed since. The seismic episodes presented a characteristic pattern: over a few days the number and magnitude of seismic events increased persistently, culminating in events severe enough to be felt all over the island. In all cases the seismic activity was preceded by significant deformation measured at the island's surface that continued during the whole episode. Analysis of the available GNSS-GPS and seismic data suggests that several magma injection processes occurred at depth from the beginning of the unrest. A model combining the geometry of the magma injection process and the variations in released seismic energy has allowed successful forecasting of the new-vent opening. The model presented here places special emphasis on phenomena associated with moderate eruptions, as well as on volcano-tectonic earthquakes and landslides, which in some cases, as at El Hierro, may be more destructive than an eruption itself.

  4. Spatial and temporal variation of seismic velocity during earthquakes and volcanic eruptions in western Japan: Insight into mechanism for seismic velocity variation

    NASA Astrophysics Data System (ADS)

    Tsuji, T.; Ikeda, T.; Nimiya, H.

    2017-12-01

    We report spatio-temporal variations of seismic velocity around seismogenic faults in western Japan. We focus mainly on the seismic velocity variations during (1) the 2016 Off-Mie earthquake in the Nankai subduction zone (Mw 5.8) and (2) the 2016 Kumamoto earthquake in Kyushu Island (Mw 7.0). We applied seismic interferometry and surface wave analysis to ambient noise data recorded by the Hi-net and DONET seismometers of the National Research Institute for Earth Science and Disaster Resilience (NIED). Seismic velocity near the rupture faults and the volcano decreased during the earthquakes. For example, we observed velocity reductions around the seismogenic Futagawa-Hinagu fault system and Mt. Aso during the 2016 Kumamoto earthquake. We also identified velocity increases after the eruptions of Mt. Aso. During the 2016 Off-Mie earthquake, we observed seismic velocity variations in the Nankai accretionary prism. After the earthquakes, the seismic velocity gradually returned to its pre-earthquake value. The velocity recovery (healing) process is caused by several mechanisms, such as pore pressure reduction, strain change, and crack sealing. By comparing the velocity variations obtained in different geological settings (volcano, seismogenic fault, unconsolidated sediment), we discuss the mechanism of seismic velocity variation as well as the post-seismic fault healing process.

  5. Principal component analysis vs. self-organizing maps combined with hierarchical clustering for pattern recognition in volcano seismic spectra

    NASA Astrophysics Data System (ADS)

    Unglert, K.; Radić, V.; Jellinek, A. M.

    2016-06-01

    Variations in the spectral content of volcano seismicity related to changes in volcanic activity are commonly identified manually in spectrograms. However, long time series of monitoring data at volcano observatories require tools to facilitate automated and rapid processing. Techniques such as self-organizing maps (SOM) and principal component analysis (PCA) can help to quickly and automatically identify important patterns related to impending eruptions. For the first time, we evaluate the performance of SOM and PCA on synthetic volcano seismic spectra constructed from observations during two well-studied eruptions at Kīlauea Volcano, Hawai'i, that include features observed in many volcanic settings. In particular, our objective is to test which of the techniques can best retrieve a set of three spectral patterns that we used to compose a synthetic spectrogram. We find that, without a priori knowledge of the given set of patterns, neither SOM nor PCA can directly recover the spectra. We thus test hierarchical clustering, a commonly used method, to investigate whether clustering in the space of the principal components and on the SOM, respectively, can retrieve the known patterns. Our clustering method applied to the SOM fails to detect the correct number and shape of the known input spectra. In contrast, clustering of the data reconstructed by the first three PCA modes reproduces these patterns and their occurrence in time more consistently. This result suggests that PCA in combination with hierarchical clustering is a powerful practical tool for automated identification of characteristic patterns in volcano seismic spectra. Our results indicate that, in contrast to PCA, common clustering algorithms may not be ideal to group patterns on the SOM and that it is crucial to evaluate the performance of these tools on a control dataset prior to their application to real data.
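
    The winning workflow (PCA truncation followed by hierarchical clustering of the reconstructed spectra) can be sketched with scikit-learn and SciPy as follows. Array shapes, the number of retained modes and the Ward linkage choice are assumptions for illustration, not the authors' exact settings.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

def cluster_spectra(spectrogram, n_modes=3, n_clusters=3):
    """
    PCA followed by hierarchical clustering of the reconstructed spectra.
    spectrogram: (n_time_windows, n_frequency_bins) array of spectral amplitudes.
    Returns a cluster label per time window and the retained spectral modes.
    """
    pca = PCA(n_components=n_modes)
    scores = pca.fit_transform(spectrogram)            # project windows onto leading modes
    reconstructed = pca.inverse_transform(scores)      # keep only the first n_modes
    Z = linkage(reconstructed, method="ward")          # agglomerative clustering tree
    labels = fcluster(Z, t=n_clusters, criterion="maxclust")
    return labels, pca.components_
```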

  6. Qualitative and quantitative comparison of geostatistical techniques of porosity prediction from the seismic and logging data: a case study from the Blackfoot Field, Alberta, Canada

    NASA Astrophysics Data System (ADS)

    Maurya, S. P.; Singh, K. H.; Singh, N. P.

    2018-05-01

    In the present study, three recently developed geostatistical methods - single-attribute analysis, multi-attribute analysis and the probabilistic neural network algorithm - have been used to predict porosity in the inter-well region of the Blackfoot field, Alberta, Canada. These techniques make use of seismic attributes generated by model-based inversion and colored inversion techniques. The principal objective of the study is to find the suitable combination of seismic inversion and geostatistical techniques to predict porosity and to identify prospective zones in the 3D seismic volume. The porosity estimated from these geostatistical approaches is corroborated with the well-log porosity. The results suggest that all three implemented geostatistical methods are efficient and reliable for predicting porosity, but the multi-attribute and probabilistic neural network analyses provide more accurate and higher-resolution porosity sections. A low-impedance (6000-8000 m/s g/cc) and high-porosity (> 15%) zone is interpreted from the inverted impedance and porosity sections between the 1060 and 1075 ms time interval and is characterized as a reservoir. The qualitative and quantitative results demonstrate that, of all the employed geostatistical methods, the probabilistic neural network along with model-based inversion is the most efficient method for predicting porosity in the inter-well region.
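
    In its simplest linear form, multi-attribute prediction amounts to regressing porosity measured at the wells on seismic attributes extracted at the same locations, then applying the weights throughout the seismic volume. The least-squares sketch below is a generic illustration of that idea (attribute arrays are hypothetical), not the specific workflow or the probabilistic neural network used in the study.

```python
import numpy as np

def fit_multi_attribute(attributes_at_wells, porosity_logs):
    """
    Least-squares multi-attribute transform: porosity ~ w0 + sum_i w_i * attribute_i.
    attributes_at_wells: (n_samples, n_attributes); porosity_logs: (n_samples,).
    """
    A = np.column_stack([np.ones(len(porosity_logs)), attributes_at_wells])
    weights, *_ = np.linalg.lstsq(A, porosity_logs, rcond=None)
    return weights

def predict_porosity(weights, attribute_samples):
    """Apply the transform to attributes sampled anywhere in the seismic volume."""
    A = np.column_stack([np.ones(attribute_samples.shape[0]), attribute_samples])
    return A @ weights
```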

  7. Spectral-element simulations of wave propagation in complex exploration-industry models: Imaging and adjoint tomography

    NASA Astrophysics Data System (ADS)

    Luo, Y.; Nissen-Meyer, T.; Morency, C.; Tromp, J.

    2008-12-01

    Seismic imaging in the exploration industry is often based upon ray-theoretical migration techniques (e.g., Kirchhoff) or other ideas which neglect some fraction of the seismic wavefield (e.g., wavefield continuation for acoustic-wave first arrivals) in the inversion process. In a companion paper we discuss the possibility of solving the full physical forward problem (i.e., including visco- and poroelastic, anisotropic media) using the spectral-element method. With such a tool at hand, we can readily apply the adjoint method to tomographic inversions, i.e., iteratively improving an initial 3D background model to fit the data. In the context of this inversion process, we draw connections between kernels in adjoint tomography and basic imaging principles in migration. We show that the images obtained by migration are nothing but particular kinds of adjoint kernels (mainly density kernels). Migration is basically a first step in the iterative inversion process of adjoint tomography. We apply the approach to basic 2D problems involving layered structures, overthrusting faults, topography, salt domes, and poroelastic regions.

  8. pySeismicDQA: open source post experiment data quality assessment and processing

    NASA Astrophysics Data System (ADS)

    Polkowski, Marcin

    2017-04-01

    Seismic Data Quality Assessment (pySeismicDQA) is a Python-based, open-source set of tools dedicated to data processing after passive seismic experiments. The primary goal of this toolset is the unification of data types and formats from different dataloggers, as required for further processing. This process involves additional checks for data errors, equipment malfunction, data format errors, abnormal noise levels, etc. In all such cases the user needs to decide (manually or by automatic threshold) whether data are removed from the output dataset. Additionally, the output dataset can be visualized in the form of a website with data availability charts and waveform visualization against an external earthquake catalog. Data processing can be extended with simple STA/LTA event detection. pySeismicDQA has been designed and tested for two passive seismic experiments in central Europe: PASSEQ 2006-2008 and "13 BB Star" (2013-2016). The National Science Centre Poland provided financial support for this work via NCN grant DEC-2011/02/A/ST10/00284.
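
    An STA/LTA detection step of the kind mentioned could, for instance, be built on ObsPy's classic trigger routines; the window lengths and thresholds below are illustrative defaults, not settings taken from pySeismicDQA.

```python
import numpy as np
from obspy.signal.trigger import classic_sta_lta, trigger_onset  # assumed dependency

def sta_lta_events(data, fs, sta_s=1.0, lta_s=30.0, on=3.5, off=1.0):
    """
    Simple STA/LTA detector for post-experiment QC: computes the short-term to
    long-term average ratio and returns (on, off) sample indices of candidate events.
    """
    cft = classic_sta_lta(np.asarray(data, dtype=float), int(sta_s * fs), int(lta_s * fs))
    return trigger_onset(cft, on, off)
```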

  9. Real-time Microseismic Processing for Induced Seismicity Hazard Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matzel, Eric M.

    Induced seismicity is inherently associated with underground fluid injections. If fluids are injected in proximity to a pre-existing fault or fracture system, the resulting elevated pressures can trigger dynamic earthquake slip, which could both damage surface structures and create new migration pathways. The goal of this research is to develop a fundamentally better approach to geological site characterization and early hazard detection. We combine innovative techniques for analyzing microseismic data with a physics-based inversion model to forecast microseismic cloud evolution. The key challenge is that faults at risk of slipping are often too small to detect during the site characterization phase. Our objective is to devise fast-running methodologies that will allow field operators to respond quickly to changing subsurface conditions.

  10. A new moonquake catalog from Apollo 17 seismic data I: Lunar Seismic Profiling Experiment: Thermal moonquakes and implications for surface processes

    NASA Astrophysics Data System (ADS)

    Weber, R. C.; Dimech, J. L.; Phillips, D.; Molaro, J.; Schmerr, N. C.

    2017-12-01

    The primary objective of Apollo 17's Lunar Seismic Profiling Experiment (LSPE) was to constrain the near-surface velocity structure at the landing site using active sources detected by a 100 m-wide triangular geophone array. The experiment was later operated in "listening mode," and early studies of these data revealed the presence of thermal moonquakes - short-duration seismic events associated with terminator crossings. However, the full data set has never been systematically analyzed for natural seismic signal content. In this study, we analyze 8 months of continuous LSPE data using an automated event detection technique that has previously been applied successfully to the Apollo 16 Passive Seismic Experiment data. We detected 50,000 thermal moonquakes from three distinct event templates, representing impulsive, intermediate, and emergent onset of seismic energy, which we interpret as reflecting their relative distance from the array. Impulsive events occur largely at sunrise, possibly representing the thermal "pinging" of the nearby lunar lander, while emergent events occur at sunset, possibly representing cracking or slumping in more distant surface rocks and regolith. Preliminary application of an iterative event location algorithm to a subset of the impulsive waveforms supports this interpretation. We also perform 3D modeling of the lunar surface to explore the relative contribution of the lander, known rocks and surrounding topography to the thermal state of the regolith in the vicinity of the Apollo 17 landing site over the course of the lunar diurnal cycle. Further development of both this model and the event location algorithm may permit definitive discrimination between different types of local diurnal events, e.g. lander noise, thermally-induced rock breakdown, or fault creep on the nearby Lee-Lincoln scarp. These results could place important constraints on both the contribution of seismicity to regolith production and the age of young lobate scarps.
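
    The template-based detection described above can be illustrated with a simple normalized cross-correlation scan; the sketch below is a generic matched-filter pass over a synthetic trace, not the authors' actual detector, and the template, noise level and threshold are placeholders.

        import numpy as np

        def normalized_xcorr(template, data):
            # Sliding normalized cross-correlation (Pearson correlation) of a
            # short template against a longer 1-D trace; values lie in [-1, 1].
            n = len(template)
            t = (template - template.mean()) / (template.std() * n)
            cc = np.empty(len(data) - n + 1)
            for i in range(len(cc)):
                w = data[i:i + n]
                s = w.std()
                cc[i] = 0.0 if s == 0 else np.dot(t, (w - w.mean()) / s)
            return cc

        # Toy usage: plant a template in noise and detect it above a 0.7 threshold.
        rng = np.random.default_rng(1)
        template = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 100))
        data = rng.standard_normal(5000) * 0.2
        data[2000:2100] += template
        cc = normalized_xcorr(template, data)
        print(np.where(cc > 0.7)[0])   # indices clustered near sample 2000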

  11. Improving Thin Bed Identification in Sarawak Basin Field using Short Time Fourier Transform Half Cepstrum (STFTHC) method

    NASA Astrophysics Data System (ADS)

    Nizarul, O.; Hermana, M.; Bashir, Y.; Ghosh, D. P.

    2016-02-01

    In delineating complex subsurface geological features, a broad band of frequencies is needed to unveil the often hidden features of a hydrocarbon basin such as thin bedding. The ability to resolve thin geological horizons in seismic data is recognized to be of fundamental importance for hydrocarbon exploration, seismic interpretation and reserve prediction. For thin bedding, high frequency content is needed to enable tuning, which can be achieved by applying a bandwidth extension technique. This paper shows an application of the Short Time Fourier Transform Half Cepstrum (STFTHC) method, a frequency bandwidth expansion technique for non-stationary seismic signals that increases temporal resolution to uncover thin beds and improve characterization of the basin. A wedge model and synthetic seismic data are used to assess the algorithm, and real data from the Sarawak basin are used to show the effectiveness of this method in enhancing resolution.

  12. Active and passive seismic methods for characterization and monitoring of unstable rock masses: field surveys, laboratory tests and modeling.

    NASA Astrophysics Data System (ADS)

    Colombero, Chiara; Baillet, Laurent; Comina, Cesare; Jongmans, Denis; Vinciguerra, Sergio

    2016-04-01

    Appropriate characterization and monitoring of potentially unstable rock masses may provide a better knowledge of the active processes and help to forecast the evolution to failure. Among the available geophysical methods, active seismic surveys are often suitable to infer the internal structure and the fracturing conditions of the unstable body. For monitoring purposes, although remote-sensing techniques and in-situ geotechnical measurements have been successfully tested on landslides, they may not be suitable for early forecasting of sudden, rapid rockslides. Passive seismic monitoring can help for this purpose. Detection, classification and localization of microseismic events within the prone-to-fall rock mass can provide information about the incipient failure of internal rock bridges. Acceleration to failure can be detected from an increasing microseismic event rate. The latter can be compared with meteorological data to understand the external factors controlling stability. On the other hand, seismic noise recorded on prone-to-fall rock slopes shows that the temporal variations in spectral content and correlation of ambient vibrations can be related to both reversible and irreversible changes within the rock mass. We present the results of the active and passive seismic data acquired at the potentially unstable granitic cliff of Madonna del Sasso (NW Italy). Down-hole tests, surface refraction and cross-hole tomography were carried out for the characterization of the fracturing state of the site. Field surveys were complemented with laboratory determination of physico-mechanical properties on rock samples and measurements of the ultrasonic pulse velocity. This multi-scale approach led to a lithological interpretation of the seismic velocity field obtained at the site and to a systematic correlation of the measured velocities with physical properties (density and porosity) and macroscopic features of the granitic cliff (fracturing, weathering and anisotropy). Continuous passive seismic monitoring at the site, from October 2013 to present, systematically highlighted clear energy peaks in the spectral content of seismic noise on the unstable sector, interpreted as resonant frequencies of the investigated volume. Both spectral analysis and cross-correlation of seismic noise showed seasonal reversible variation trends related to air temperature fluctuations. No irreversible changes resulting from serious damage processes within the rock mass have been detected so far. Modal analysis and geomechanical modeling of the unstable cliff are currently under investigation to better understand the vibration modes that could explain the measured amplitude and orientation of ground motion at the first resonant frequencies. Classification and location of microseismic events still remain the most challenging tasks, due to the complex structural and morphological setting of the site.

  13. Innovative Approaches for Seismic Studies of Mars (Invited)

    NASA Astrophysics Data System (ADS)

    Banerdt, B.

    2010-12-01

    In addition to its intrinsic interest, Mars is particularly well-suited for studying the full range of processes and phenomena related to early terrestrial planet evolution, from initial differentiation to the start of plate tectonics. It is large and complex enough to have undergone most of the processes that affected early Earth but, unlike the Earth, has apparently not undergone extensive plate tectonics or other major reworking that erased the imprint of early events (as evidenced by the presence of cratered surfaces older than 4 Ga). The martian mantle should have Earth-like polymorphic phase transitions and may even support a perovskite layer near the core (depending on the actual core radius), a characteristic that would have major implications for core cooling and mantle convection. Thus even the most basic measurements of planetary structure, such as crustal thickness, core radius and state (solid/liquid), and gross mantle velocity structure would provide invaluable constraints on models of early planetary evolution. Despite this strong scientific motivation (and several failed attempts), Mars remains terra incognita from a seismic standpoint. This is due to an unfortunate convergence of circumstances, prominent among which are our uncertainty in the level of seismic activity and the relatively high cost of landing multiple long-lived spacecraft on Mars to comprise a seismic network for body-wave travel-time analysis; typically four to ten stations are considered necessary for this type of experiment. In this presentation I will address both of these issues. In order to overcome the concern about a possible lack of marsquakes with which to work, it is useful to identify alternative methods for using seismic techniques to probe the interior. Seismology without quakes can be accomplished in a number of ways. “Unconventional” sources of seismic energy include meteorites (which strike the surface of Mars at a relatively high rate), artificial projectiles (which can supply up to 10^10 J of kinetic energy), seismic “hum” from meteorological forcing, and tidal deformation from Phobos (with a period around 6 hours). Another means for encouraging a seismic mission to Mars is to promote methods that can derive interior information from a single seismometer. Fortunately many such methods exist, including source location through P-S and back-azimuth, receiver functions, identification of later phases (PcP, PKP, etc.), surface wave dispersion, and normal mode analysis (from single large events, stacked events, or background noise). Such methods could enable the first successful seismic investigation of another planet since the Apollo seismometers were turned off almost 35 years ago.

  14. Using a cross correlation technique to refine the accuracy of the Failure Forecast Method: Application to Soufrière Hills volcano, Montserrat

    NASA Astrophysics Data System (ADS)

    Salvage, R. O.; Neuberg, J. W.

    2016-09-01

    Prior to many volcanic eruptions, an acceleration in seismicity has been observed, suggesting the potential for this as a forecasting tool. The Failure Forecast Method (FFM) relates an accelerating precursor to the timing of failure by an empirical power law, with failure being defined in this context as the onset of an eruption. Previous applications of the FFM have used a wide variety of accelerating time series, often generating questionable forecasts with large misfits between data and the forecast, as well as the generation of a number of different forecasts from the same data series. Here, we show an alternative approach applying the FFM in combination with a cross correlation technique which identifies seismicity from a single active source mechanism and location at depth. Isolating a single system at depth avoids additional uncertainties introduced by averaging data over a number of different accelerating phenomena, and consequently reduces the misfit between the data and the forecast. Similar seismic waveforms were identified in the precursory accelerating seismicity to dome collapses at Soufrière Hills volcano, Montserrat in June 1997, July 2003 and February 2010. These events were specifically chosen since they represent a spectrum of collapse scenarios at this volcano. The cross correlation technique generates a five-fold increase in the number of seismic events which could be identified from continuous seismic data rather than using triggered data, thus providing a more holistic understanding of the ongoing seismicity at the time. The use of similar seismicity as a forecasting tool for collapses in 1997 and 2003 greatly improved the forecasted timing of the dome collapse, as well as improving the confidence in the forecast, thereby outperforming the classical application of the FFM. We suggest that focusing on a single active seismic system at depth allows a more accurate forecast of some of the major dome collapses from the ongoing eruption at Soufrière Hills volcano, and provides a simple addition to the well-used methodology of the FFM.
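
    For reference, the Failure Forecast Method in its simplest (linear inverse-rate) form can be written in a few lines; the sketch below assumes the commonly used alpha = 2 case and uses a synthetic rate series, so it only illustrates the empirical power law mentioned above, not the authors' combined cross-correlation workflow.

        import numpy as np

        def inverse_rate_forecast(times, rates):
            # FFM with alpha = 2: fit a straight line to the inverse event rate
            # versus time and return the time at which the line reaches zero,
            # i.e. the forecast failure (eruption/collapse) time.
            inv = 1.0 / np.asarray(rates, dtype=float)
            slope, intercept = np.polyfit(times, inv, 1)
            return -intercept / slope

        # Toy usage: synthetic hyperbolic acceleration towards failure at t = 10.
        t = np.linspace(0.0, 9.0, 30)
        rate = 1.0 / (10.0 - t)            # events per unit time
        print(inverse_rate_forecast(t, rate))   # ~10.0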

  15. Seismic Techniques for Subsurface Voids Detection

    NASA Astrophysics Data System (ADS)

    Gritto, Roland; Korneev, Valeri; Elobaid Elnaiem, Ali; Mohamed, Fathelrahman; Sadooni, Fadhil

    2016-04-01

    A major hazard in Qatar is the presence of karst, which is ubiquitous throughout the country including depressions, sinkholes, and caves. Causes for the development of karst include faulting and fracturing where fluids find pathways through limestone and dissolve the host rock to form caverns. Of particular concern in rapidly growing metropolitan areas that expand into heretofore unexplored regions is the collapse of such caverns. Because Qatar has seen a recent boom in construction, including the planning and development of complete new sub-sections of metropolitan areas, the development areas need to be investigated for the presence of karst to determine their suitability for the planned project. In this paper, we present the results of a study to demonstrate a variety of seismic techniques to detect the presence of a karst analog in the form of a vertical water-collection shaft located on the campus of Qatar University, Doha, Qatar. Seismic waves are well suited for karst detection and characterization. Voids represent high-contrast seismic objects that exhibit strong responses due to incident seismic waves. However, the complex geometry of karst, including shape and size, makes their imaging nontrivial. While karst detection can be reduced to the simple problem of detecting an anomaly, karst characterization can be complicated by the 3D nature of the problem of unknown scale, where irregular surfaces can generate diffracted waves of different kinds. In our presentation we employ a variety of seismic techniques to demonstrate the detection and characterization of a vertical water collection shaft by analyzing the phase, amplitude and spectral information of seismic waves that have been scattered by the object. We used the reduction in seismic wave amplitudes and the delay in phase arrival times in the geometrical shadow of the vertical shaft to independently detect and locate the object in space. Additionally, we use narrow band-pass filtered data combining two orthogonal transmission surveys to detect and locate the object. Furthermore, we show that ambient noise recordings may generate data with sufficient signal-to-noise ratio to successfully detect and locate subsurface voids. Being able to use ambient noise recordings would eliminate the need to employ active seismic sources that are time consuming and more expensive to operate.

  16. Hydra—The National Earthquake Information Center’s 24/7 seismic monitoring, analysis, catalog production, quality analysis, and special studies tool suite

    USGS Publications Warehouse

    Patton, John M.; Guy, Michelle R.; Benz, Harley M.; Buland, Raymond P.; Erickson, Brian K.; Kragness, David S.

    2016-08-18

    This report provides an overview of the capabilities and design of Hydra, the global seismic monitoring and analysis system used for earthquake response and catalog production at the U.S. Geological Survey National Earthquake Information Center (NEIC). Hydra supports the NEIC’s worldwide earthquake monitoring mission in areas such as seismic event detection, seismic data insertion and storage, seismic data processing and analysis, and seismic data output. The Hydra system automatically identifies seismic phase arrival times and detects the occurrence of earthquakes in near-real time. The system integrates and inserts parametric and waveform seismic data into discrete events in a database for analysis. Hydra computes seismic event parameters, including locations, multiple magnitudes, moment tensors, and depth estimates. Hydra supports the NEIC’s 24/7 analyst staff with a suite of seismic analysis graphical user interfaces. In addition to the NEIC’s monitoring needs, the system supports the processing of aftershock and temporary deployment data, and supports the NEIC’s quality assurance procedures. The Hydra system continues to be developed to expand its seismic analysis and monitoring capabilities.

  17. Probabilistic properties of injection induced seismicity - implications for the seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Lasocki, Stanislaw; Urban, Pawel; Kwiatek, Grzegorz; Martinez-Garzón, Particia

    2017-04-01

    Injection induced seismicity (IIS) is an undesired dynamic rockmass response to massive fluid injections. This includes reactions, among others, to hydro-fracturing for shale gas exploitation. The complexity and changeability of the technological factors that induce IIS may result in significant deviations of the observed distributions of seismic process parameters from the models that perform well for natural, tectonic seismic processes. Classic formulations of probabilistic seismic hazard analysis in natural seismicity assume the seismic marked point process to be a stationary Poisson process, whose marks - the magnitudes - are governed by an exponential distribution derived from the Gutenberg-Richter relation. It is well known that the use of an inappropriate earthquake occurrence model and/or an inappropriate magnitude distribution model leads to significant systematic errors in hazard estimates. It is therefore of paramount importance to check whether these assumptions on the seismic process, commonly used for natural seismicity, can be safely applied to IIS hazard problems or not. Seismicity accompanying shale gas operations is widely studied in the framework of the project "Shale Gas Exploration and Exploitation Induced Risks" (SHEER). Here we present results of SHEER project investigations of such seismicity from Oklahoma and of a proxy of such seismicity - IIS data from The Geysers geothermal field. We attempt to answer the following questions: • Do IIS earthquakes follow the Gutenberg-Richter distribution law, so that the magnitude distribution can be modelled by an exponential distribution? • Is the occurrence process of IIS earthquakes Poissonian? Is it segmentally Poissonian? If yes, how are these segments linked to cycles of technological operations? Statistical tests indicate that the exponential magnitude distribution model implied by the Gutenberg-Richter relation is, in general, inappropriate. The magnitude distribution can be complex, multimodal, with no ready-to-use functional model. In this connection, we recommend using non-parametric kernel estimators of the magnitude distribution in hazard analyses. The earthquake occurrence process of IIS is not a Poisson process. When earthquake occurrences are influenced by a multitude of inducing factors, the interevent time distribution can be modelled by the Weibull distribution, supporting a negative ageing property of the process. When earthquake occurrences are due to a specific injection activity, the earthquake rate directly depends on the injection rate and responds immediately to changes of the injection rate. Furthermore, this response is not limited to correlated variations of the seismic activity but also concerns significant changes of the shape of the interevent time distribution. Unlike the event rate, the shape of the magnitude distribution does not exhibit correlation with the injection rate. This work was supported within SHEER: "Shale Gas Exploration and Exploitation Induced Risks" project funded from Horizon 2020 - R&I Framework Programme, call H2020-LCE 16-2014-1 and within statutory activities No3841/E-41/S/2016 of Ministry of Science and Higher Education of Poland.
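
    To make the contrast concrete, the sketch below fits the Gutenberg-Richter exponential model to a magnitude sample via Aki's maximum-likelihood estimator and, alongside it, builds the kind of non-parametric kernel estimate recommended above; the catalog is synthetic and all parameters are placeholders, not SHEER project values.

        import numpy as np
        from scipy.stats import gaussian_kde

        def aki_b_value(magnitudes, m_min, dm=0.0):
            # Maximum-likelihood b-value (Aki, 1965) for the Gutenberg-Richter
            # exponential model; dm is the optional half-bin correction for
            # binned magnitudes (0 for continuous magnitudes, as in this toy).
            m = np.asarray(magnitudes)
            m = m[m >= m_min]
            return np.log10(np.e) / (m.mean() - (m_min - dm / 2.0))

        # Toy usage: an exponential (G-R, b = 1) sample above completeness M 1.0.
        rng = np.random.default_rng(2)
        mags = 1.0 + rng.exponential(scale=1.0 / np.log(10), size=2000)
        print("b-value:", aki_b_value(mags, m_min=1.0))

        # Non-parametric alternative: a kernel estimate of the magnitude density,
        # which makes no assumption about the functional form.
        kde = gaussian_kde(mags)
        grid = np.linspace(1.0, 4.0, 50)
        print("KDE peak density:", kde(grid).max())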

  18. Black Thunder Coal Mine and Los Alamos National Laboratory experimental study of seismic energy generated by large scale mine blasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin, R.L.; Gross, D.; Pearson, D.C.

    In an attempt to better understand the impact that large mining shots will have on verifying compliance with the international, worldwide, Comprehensive Test Ban Treaty (CTBT, no nuclear explosion tests), a series of seismic and videographic experiments has been conducted during the past two years at the Black Thunder Coal Mine. Personnel from the mine and Los Alamos National Laboratory have cooperated closely to design and perform experiments to produce results with mutual benefit to both organizations. This paper summarizes the activities, highlighting the unique results of each. Topics which were covered in these experiments include: (1) synthesis of seismic, videographic, acoustic, and computer modeling data to improve understanding of shot performance and phenomenology; (2) development of computer generated visualizations of observed blasting techniques; (3) documentation of azimuthal variations in radiation of seismic energy from overburden casting shots; (4) identification of, as yet unexplained, out of sequence, simultaneous detonation in some shots using seismic and videographic techniques; (5) comparison of local (0.1 to 15 kilometer range) and regional (100 to 2,000 kilometer range) seismic measurements leading to determination of the relationship between local and regional seismic amplitude and explosive yield for overburden cast, coal bulking and single-fired explosions; and (6) determination of the types of mining shots triggering the prototype International Monitoring System for the CTBT.

  19. A Study on the Data Compression Technology-Based Intelligent Data Acquisition (IDAQ) System for Structural Health Monitoring of Civil Structures

    PubMed Central

    Jeon, Joonryong

    2017-01-01

    In this paper, a data compression technology-based intelligent data acquisition (IDAQ) system was developed for structural health monitoring of civil structures, and its validity was tested using random signals (El-Centro seismic waveform). The IDAQ system was structured to include a high-performance CPU with large dynamic memory for multi-input and output in a radio frequency (RF) manner. In addition, the embedded software technology (EST) has been applied to it to implement diverse logics needed in the process of acquiring, processing and transmitting data. In order to utilize the IDAQ system for the structural health monitoring of civil structures, this study developed an artificial filter bank by which structural dynamic responses (acceleration) were efficiently acquired, and also optimized it on the random El-Centro seismic waveform. All techniques developed in this study have been embedded in our system. The data compression technology-based IDAQ system was proven valid in acquiring valid signals in a compressed size. PMID:28704945

  20. A Study on the Data Compression Technology-Based Intelligent Data Acquisition (IDAQ) System for Structural Health Monitoring of Civil Structures.

    PubMed

    Heo, Gwanghee; Jeon, Joonryong

    2017-07-12

    In this paper, a data compression technology-based intelligent data acquisition (IDAQ) system was developed for structural health monitoring of civil structures, and its validity was tested using random signals (El-Centro seismic waveform). The IDAQ system was structured to include a high-performance CPU with large dynamic memory for multi-input and output in a radio frequency (RF) manner. In addition, the embedded software technology (EST) has been applied to it to implement diverse logics needed in the process of acquiring, processing and transmitting data. In order to utilize the IDAQ system for the structural health monitoring of civil structures, this study developed an artificial filter bank by which structural dynamic responses (acceleration) were efficiently acquired, and also optimized it on the random El-Centro seismic waveform. All techniques developed in this study have been embedded in our system. The data compression technology-based IDAQ system was proven valid in acquiring valid signals in a compressed size.

  1. Continuous catchment-scale monitoring of geomorphic processes with a 2-D seismological array

    NASA Astrophysics Data System (ADS)

    Burtin, A.; Hovius, N.; Milodowski, D.; Chen, Y.-G.; Wu, Y.-M.; Lin, C.-W.; Chen, H.

    2012-04-01

    The monitoring of geomorphic processes during extreme climatic events is of primary interest for estimating their impact on landscape dynamics. However, available techniques to survey the surface activity do not provide adequate time and/or space resolution. Furthermore, these methods hardly investigate the dynamics of the events, since their detection is made a posteriori. To increase our knowledge of landscape evolution and the influence of extreme climatic events on catchment dynamics, we need to develop new tools and procedures. In many past works, it has been shown that seismic signals are relevant to detect and locate surface processes (landslides, debris flows). During the 2010 typhoon season, we deployed a network of 12 seismometers dedicated to monitoring the surface processes of the Chenyoulan catchment in Taiwan. We test the ability of a two-dimensional array with small inter-station distances (~11 km) to map the geomorphic activity continuously and at catchment scale. The spectral analysis of continuous records shows high-frequency (>1 Hz) seismic energy that is coherent with the occurrence of hillslope and river processes. Using a basic detection algorithm and a location approach based on the analysis of seismic amplitudes, we manage to locate the catchment activity. We mainly observe short-duration events (>300 occurrences) associated with debris falls and bank collapses during daily convective storms, where 69% of occurrences are consistent with the time distribution of precipitation. We also identify a couple of debris flows during a large tropical storm. In contrast, the FORMOSAT imagery does not detect any activity, which somehow reflects the lack of extreme climatic conditions during the experiment. However, high resolution pictures confirm the existence of links between most geomorphic events and existing structures (landslide scars, gullies...). We thus conclude that the activity is dominated by reactivation processes. This highlights the major interest of seismic monitoring, since it allows a detailed spatial and temporal survey of events that classic approaches are not able to observe. In the future, dense two-dimensional seismological arrays will assess the landscape dynamics of an entire catchment in real time, tracking sediments from slopes to rivers.
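
    The amplitude-based location approach mentioned above can be illustrated by a simple grid search in which amplitude ratios between stations cancel the unknown source strength; the attenuation law, station geometry and grid spacing below are assumptions for the sketch, not the parameters used in the Taiwan experiment.

        import numpy as np

        def locate_by_amplitude(station_xy, amplitudes, grid_x, grid_y,
                                n=1.0, alpha=0.0):
            # Grid-search source location from peak amplitudes, assuming
            # A_i ~ r_i**(-n) * exp(-alpha * r_i). Removing the mean of the
            # log-residuals cancels the unknown source term, so only the
            # relative decay between stations constrains the location.
            obs = np.log(np.asarray(amplitudes))
            best, best_cost = None, np.inf
            for x in grid_x:
                for y in grid_y:
                    r = np.hypot(station_xy[:, 0] - x, station_xy[:, 1] - y) + 1e-6
                    resid = (obs + n * np.log(r) + alpha * r)
                    resid -= resid.mean()
                    cost = np.sum(resid ** 2)
                    if cost < best_cost:
                        best, best_cost = (x, y), cost
            return best

        # Toy usage: 4 stations, true source at (3, 2), geometrical spreading only.
        stations = np.array([[0, 0], [10, 0], [0, 10], [10, 10]], float)
        amps = 1.0 / np.hypot(stations[:, 0] - 3, stations[:, 1] - 2)
        print(locate_by_amplitude(stations, amps,
                                  np.linspace(0, 10, 51), np.linspace(0, 10, 51)))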

  2. Application of the principal component analysis (PCA) to HVSR data aimed at the seismic characterization of earthquake prone areas

    NASA Astrophysics Data System (ADS)

    Paolucci, Enrico; Lunedei, Enrico; Albarello, Dario

    2017-10-01

    In this work, we propose a procedure based on principal component analysis of data sets consisting of many horizontal to vertical spectral ratio (HVSR or H/V) curves obtained by single-station ambient vibration acquisitions. This kind of analysis aims at the seismic characterization of the investigated area by identifying sites characterized by similar HVSR curves. It also allows extraction of the typical HVSR patterns of the explored area and establishment of their relative importance, providing an estimate of the level of heterogeneity from the seismic point of view. In this way, an automatic explorative seismic characterization of the area becomes possible by only considering ambient vibration data. This also implies that the relevant outcomes can be safely compared with other available information (geological data, borehole measurements, etc.) without any conceptual trade-off. The whole algorithm is remarkably fast: on a common personal computer, the processing time is a few seconds for a data set including 100-200 HVSR measurements. The procedure has been tested in three study areas in Central-Northern Italy characterized by different geological settings. Outcomes demonstrate that this technique is effective and correlates well with the most significant seismostratigraphical heterogeneities present in each of the study areas.
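
    A minimal sketch of PCA applied to a set of HVSR curves using a plain SVD; the synthetic curves and frequency axis are placeholders, and the decomposition only illustrates how typical patterns and their relative importance can be extracted, not the authors' full procedure.

        import numpy as np

        def hvsr_pca(hvsr_curves):
            # PCA of HVSR curves via SVD. hvsr_curves has shape
            # (n_sites, n_freqs): one H/V curve per site on a common frequency
            # axis. Returns the principal patterns, the explained-variance
            # fractions, and the per-site scores used to group similar sites.
            X = hvsr_curves - hvsr_curves.mean(axis=0)      # centre the curves
            U, s, Vt = np.linalg.svd(X, full_matrices=False)
            explained = s**2 / np.sum(s**2)
            scores = U * s                                   # site coordinates
            return Vt, explained, scores

        # Toy usage: 100 synthetic curves mixing two "typical" HVSR patterns.
        freq = np.linspace(0.5, 20, 200)
        p1 = np.exp(-((freq - 2.0) / 0.5) ** 2)    # low-frequency peak
        p2 = np.exp(-((freq - 8.0) / 1.0) ** 2)    # high-frequency peak
        rng = np.random.default_rng(3)
        w = rng.random((100, 1))
        curves = w * p1 + (1 - w) * p2 + 0.02 * rng.standard_normal((100, 200))
        patterns, ev, scores = hvsr_pca(curves)
        print(ev[:3])   # the first component dominates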

  3. Synthetic Seismograms Derived from Oceanographic Data in the Campeche Canyon, Deepwater Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Gonzalez-Orduno, A.; Fucugauchi, J. U.; Monreal, M.; Perez-Cruz, G.; Salas de León, D. A.

    2013-05-01

    The seismic reflection method has been successfully applied worldwide to investigate subsurface conditions to support important business decisions in the oil industry. When applied in the marine environment, useful reflection information is limited to events on and below the sea floor; information from the water column, if any, is disregarded. Seismic oceanography is emerging as a new technique that utilizes the reflection information within the water column to infer thermal-density contrasts associated with oceanographic processes, such as cyclonic-anticyclonic eddies, ascending-descending water flows, and water flows related to rapid topographic changes on the sea floor. A seismic investigation to infer such oceanographic changes in one sector of the Campeche Canyon is in progress as a research matter at the Instituto de Ciencias del Mar y Limnologia of the University of Mexico (UNAM). The first steps of the investigation consisted of creating synthetic seismograms based on oceanographic information (temperature and density) derived from direct observation at a series of closely spaced depth points along vertical profiles. Details of the selected algorithms used for the transformation of the oceanographic data into acoustic impedance data sets, the construction of synthetic seismograms at each site and their representation as synthetic seismic sections are presented in this work, as well as the road ahead in the investigation.
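
    The kind of transformation described above, from water-column properties to a convolutional synthetic trace, can be sketched as: impedance from sound speed and density, normal-incidence reflection coefficients, and convolution with a wavelet. The profiles, sampling and wavelet frequency below are illustrative assumptions, not the algorithms actually selected in the study.

        import numpy as np

        def ricker(f0, dt, length=0.2):
            # Zero-phase Ricker wavelet of peak frequency f0 (Hz).
            t = np.arange(-length / 2, length / 2, dt)
            a = (np.pi * f0 * t) ** 2
            return (1 - 2 * a) * np.exp(-a)

        def synthetic_trace(velocity, density, dt, f0=50.0):
            # 1-D normal-incidence synthetic: acoustic impedance ->
            # reflection coefficients -> convolution with a Ricker wavelet.
            # velocity and density are sampled at a uniform two-way-time step dt.
            z = velocity * density                              # impedance
            rc = (z[1:] - z[:-1]) / (z[1:] + z[:-1])            # reflectivity
            return np.convolve(rc, ricker(f0, dt), mode="same")

        # Toy usage: a single sound-speed/density step standing in for the
        # CTD-derived water-column profiles described above.
        dt = 0.001
        c = np.full(1000, 1500.0); c[400:] = 1490.0             # m/s
        rho = np.full(1000, 1025.0); rho[400:] = 1027.0         # kg/m^3
        trace = synthetic_trace(c, rho, dt)
        print(trace.shape)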

  4. Observing Drought-Induced Groundwater Depletion in California with Seismic Noise

    NASA Astrophysics Data System (ADS)

    Clements, T.; Denolle, M.

    2017-12-01

    While heavy rainfall replenished reservoirs and snowpack recovered in winter 2016/2017, groundwater levels across much of California are still at or near all-time lows following one of the worst droughts in the state's history. Groundwater depletion in California has been studied extensively using GPS, InSAR, and GRACE. Here, we propose to monitor groundwater levels across California through measuring the temporal variation in seismic velocity (dv/v) at a regional scale. In the last decade, dv/v has emerged as a technique to investigate near surface and surficial processes such as landslides, volcanic eruptions, and earthquakes. Toward predicting groundwater levels through real-time monitoring with seismic noise, we investigate the relations between the dv/v time series and observed groundwater levels. 12 years (Jan 2006 - July 2017) of noise cross-correlation functions (CCF) are computed from continuous vertical component seismic data recorded at 100+ sites across California. Velocity changes (dv/v) are obtained by inverting all daily CCFs to produce a dv/v time series for each station pair. Our preliminary results show a seasonal variation in dv/v along with a gradual increase in dv/v throughout the drought. We interpret the increase in dv/v as a response to declining groundwater levels.
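
    One common way to measure dv/v from noise cross-correlation functions is the stretching technique; the sketch below implements it for a single reference/current pair with synthetic data, and is not tied to the station pairs or processing parameters used in this study.

        import numpy as np

        def stretching_dvv(reference, current, t, max_eps=0.02, n_eps=401):
            # Stretching technique: resample the current CCF on a stretched
            # time axis t*(1+eps); the eps maximising the correlation with the
            # reference gives dv/v = -eps, since a homogeneous velocity change
            # shifts travel times by dt/t = -dv/v.
            best_eps, best_cc = 0.0, -np.inf
            for eps in np.linspace(-max_eps, max_eps, n_eps):
                stretched = np.interp(t * (1 + eps), t, current)
                cc = np.corrcoef(reference, stretched)[0, 1]
                if cc > best_cc:
                    best_eps, best_cc = eps, cc
            return -best_eps, best_cc

        # Toy usage: a "current" CCF recorded after a 0.5% velocity increase.
        t = np.linspace(0.0, 20.0, 2001)
        reference = np.sin(2 * np.pi * t) * np.exp(-0.1 * t)
        current = np.sin(2 * np.pi * t * 1.005) * np.exp(-0.1 * t * 1.005)
        print(stretching_dvv(reference, current, t))   # dv/v ~ +0.005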

  5. Seismic Oceanography's Failure to Flourish: A Possible Solution

    NASA Astrophysics Data System (ADS)

    Ruddick, B. R.

    2018-01-01

    A recent paper in Journal of Geophysical Research: Oceans used multichannel seismic observations to map estimates of internal wave mixing in the Gulf of Mexico, finding greatly enhanced mixing over the slope region. These results suggest that the ocean margins may supply the mixing required to close the global thermohaline circulation, and the techniques demonstrated here might be used to map mixing over much of the world's continental shelves. The use of multichannel seismics to image ocean phenomena is nearly 15 years old, and despite the initial promise, the techniques have not become as broadly used as initially expected. We discuss possible reasons for this, and suggest an alternative approach that might gain broader success.

  6. Contribution of seismic processing to put up the scaffolding for the 3-dimensional study of deep sedimentary basins: the fundaments of trans-national 3D modelling in the project GeoMol

    NASA Astrophysics Data System (ADS)

    Capar, Laure

    2013-04-01

    Within the framework of the transnational project GeoMol, geophysical and geological information on the entire Molasse Basin and on the Po Basin is gathered to build consistent cross-border 3D geological models based on borehole evidence and seismic data. Benefiting from important progress in seismic processing, these new models will provide some answers to various questions regarding the usage of subsurface resources, such as geothermal energy, CO2 and gas storage, and oil and gas production, and will support decision-making by national and local administrations as well as by industry. More than 28 000 km of 2D seismic lines are compiled, reprocessed and harmonized. This work faces various problems, such as the vertical drop of more than 700 meters between the west and east of the Molasse Basin (and, to a lesser extent, in the Po Plain), the heterogeneities of the substratum, the large disparities between the periods and parameters of seismic acquisition, and, depending on their availability, the use of two types of seismic data: raw and processed. The main challenge is to harmonize all lines at the same reference level, amplitude and step of signal processing from France to Austria, spanning more than 1000 km, to avoid misfits at crossing points between seismic lines and artifacts at the country borders, facilitating the interpretation of the various geological layers in the Molasse Basin and Po Basin. A generalized stratigraphic column for the two basins is set up, representing all geological layers relevant to subsurface usage. This stratigraphy constitutes the harmonized framework for seismic reprocessing. In general, processed seismic data are available on paper at the stack stage, and the mandatory information needed to take these seismic lines to the final stage of processing, the migration step, is the datum plane and the replacement velocity. However, several datum planes and replacement velocities were used in previous processing projects. Our processing sequence is first to digitize the data into SEG-Y format. The second step is to apply some post-stack processing to obtain good data quality before the final migration step. The third step is the final migration, using optimized migration velocities, and the fourth step is the post-migration processing. In the case of raw seismic data, the mandatory information for processing is made accessible, such as observer logs, coordinates and field seismic data. The processing sequence used to obtain the final usable version of the seismic line is based on a pre-stack time migration. A complex processing sequence is applied. One main issue is dealing with the significant changes in topography along the seismic lines and in the first twenty-meter layer, the low velocity zone (LVZ) or weathered zone, where lateral velocity variations occur and disturb the wave propagation and therefore the seismic signal. In seismic processing, this is solved by applying static corrections, which remove the effects of lateral velocity variations and of topography. Another main item is the accurate determination of root-mean-square velocities for migration, to improve the final result of seismic processing. Within GeoMol, generalized 3D velocity models of stack velocities are calculated in order to perform a rapid time-depth conversion. Finally, all seismic lines of the project GeoMol will be at the same level of processing, the migration level.
    To tie all these lines, a single appropriate datum plane and replacement velocity for the entire Molasse Basin and Po Plain, respectively, have to be carefully set up to avoid mis-ties at crossing points. The reprocessing and use of these 28 000 km of seismic lines in the project GeoMol provide the pivotal database to build a 3D framework model for regional subsurface information on the Alpine foreland basins (cf. Rupf et al. 2013, EGU2013-8924). The project GeoMol is co-funded by the Alpine Space Program as part of the European Territorial Cooperation 2007-2013. The project integrates partners from Austria, France, Germany, Italy, Slovenia and Switzerland and runs from September 2012 to June 2015. Further information at www.geomol.eu. The GeoMol seismic interpretation team: Roland Baumberger (swisstopo), Agnès BRENOT (BRGM), Alessandro CAGNONI (RLB), Renaud COUËFFE (BRGM), Gabriel COURRIOUX (BRGM), Chiara D'Ambrogi (ISPRA), Chrystel Dezayes (BRGM), Charlotte Fehn (LGRB), Sunseare GABALDA (BRGM), Gregor Götzl (GBA), Andrej Lapanje (GeoZS), Stéphane MARC (BRGM), Alberto MARTINI (RER-SGSS), Fabio Carlo Molinari (RER-SGSS), Edgar Nitsch (LGRB), Robert Pamer (LfU BY), Marco PANTALONI (ISPRA), Sebastian Pfleiderer (GBA), Andrea PICCIN (RLB), Nils Oesterling (swisstopo), Isabel Rupf (LGRB), Uta Schulz (LfU BY), Yves SIMEON (BRGM), Günter SÖKOL (LGRB), Heiko Zumsprekel (LGRB)
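
    As an illustration of the datum-plane and replacement-velocity bookkeeping described above, the sketch below computes elevation (datum) statics under one common sign convention; the elevations, datum and replacement velocity are placeholders, not the GeoMol parameters.

        import numpy as np

        def elevation_static_ms(elevation, datum_elevation, replacement_velocity):
            # Two-way elevation (datum) static in milliseconds for a source or
            # receiver: the time shift needed to move it from the surface to the
            # chosen datum plane, assuming a constant replacement velocity
            # between the two (sign convention: stations above the datum get a
            # negative, i.e. time-advancing, correction).
            return -2000.0 * (elevation - datum_elevation) / replacement_velocity

        # Toy usage: stations between 350 m and 900 m elevation, a 400 m datum and
        # a 2000 m/s replacement velocity (purely illustrative values).
        elev = np.array([350.0, 500.0, 700.0, 900.0])
        print(elevation_static_ms(elev, datum_elevation=400.0,
                                  replacement_velocity=2000.0))
        # [  50. -100. -300. -500.] ms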

  7. Long-range dependence in earthquake-moment release and implications for earthquake occurrence probability.

    PubMed

    Barani, Simone; Mascandola, Claudia; Riccomagno, Eva; Spallarossa, Daniele; Albarello, Dario; Ferretti, Gabriele; Scafidi, Davide; Augliera, Paolo; Massa, Marco

    2018-03-28

    Since the beginning of the 1980s, when Mandelbrot observed that earthquakes occur on 'fractal' self-similar sets, many studies have investigated the dynamical mechanisms that lead to self-similarities in the earthquake process. Interpreting seismicity as a self-similar process is undoubtedly convenient to bypass the physical complexities related to the actual process. Self-similar processes are indeed invariant under suitable scaling of space and time. In this study, we show that long-range dependence is an inherent feature of the seismic process, and is universal. Examination of series of cumulative seismic moment both in Italy and worldwide through Hurst's rescaled range analysis shows that seismicity is a memory process with a Hurst exponent H ≈ 0.87. We observe that H is substantially space- and time-invariant, except in cases of catalog incompleteness. This has implications for earthquake forecasting. Hence, we have developed a probability model for earthquake occurrence that allows for long-range dependence in the seismic process. Unlike the Poisson model, dependent events are allowed. This model can be easily transferred to other disciplines that deal with self-similar processes.
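
    A compact version of the rescaled-range analysis used to estimate the Hurst exponent is sketched below; the window sizes and test series are arbitrary, and white noise is used only to show that the estimator returns H close to 0.5 in the memoryless case.

        import numpy as np

        def hurst_rs(series, window_sizes):
            # Rescaled-range (R/S) analysis: for each window length n, average
            # R/S over non-overlapping windows, then fit the slope of
            # log(R/S) against log(n); the slope is the Hurst exponent H.
            x = np.asarray(series, dtype=float)
            rs = []
            for n in window_sizes:
                vals = []
                for start in range(0, len(x) - n + 1, n):
                    w = x[start:start + n]
                    dev = np.cumsum(w - w.mean())
                    r = dev.max() - dev.min()    # range of cumulative deviations
                    s = w.std()
                    if s > 0:
                        vals.append(r / s)
                rs.append(np.mean(vals))
            slope, _ = np.polyfit(np.log(window_sizes), np.log(rs), 1)
            return slope

        # Toy usage: white noise should give H close to 0.5.
        rng = np.random.default_rng(4)
        print(hurst_rs(rng.standard_normal(10000), [16, 32, 64, 128, 256, 512]))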

  8. Characterization of deep geothermal energy resources using Electro-Magnetic methods, Belgium

    NASA Astrophysics Data System (ADS)

    Loveless, Sian; Harcout-Menou, Virginie; De Ridder, Fjo; Claessens, Bert; Laenen, Ben

    2014-05-01

    Sedimentary basins in Northwest Europe have significant potential for low to medium enthalpy, deep geothermal energy resources. These resources are currently assessed using standard exploration techniques (seismic investigations followed by drilling of a borehole). This has enabled identification of geothermal resources, but such techniques are extremely costly. The high cost of exploration remains one of the main barriers to geothermal project development due to the lack of capital in the geothermal industry. We will test the possibility of using Electro-Magnetic (EM) methods to aid identification of geothermal resources in conjunction with more traditional exploration methods. An EM campaign could cost a third of a seismic campaign and is also often a passive technology, resulting in smaller environmental impacts than seismic surveys or drilling. EM methods image changes in the resistivity of the earth's sub-surface using natural or induced frequency-dependent variations of electric and magnetic fields. Changes in resistivity can be interpreted as representing different subsurface properties including changes in rock type, chemistry, temperature and/or hydraulic transmissivity. While EM techniques have proven to be useful in geothermal exploration in high enthalpy areas, in the last 2-3 years only a handful of studies have assessed their applicability in low enthalpy sedimentary basins. Challenges include identifying which sub-surface features cause changes in electrical resistivity, as low enthalpy reservoirs are unlikely to exhibit the hydrothermally altered clay layer above the geothermal aquifer that is typical for high enthalpy reservoirs. Yet a principal challenge is likely to be the high level of industrialisation in the areas of interest. Infrastructure such as train tracks and power cables can create a high level of background noise that can obfuscate the relevant signal. We present our plans for an EM campaign in the Flemish region of Belgium. Field techniques will be developed to increase the signal-to-noise ratio and identify background noise. Firstly, surface noise will be filtered off by non-parametric approaches such as proper orthogonal decomposition. Secondly, the EM signal and newly acquired seismic data will be combined to obtain a multi-dimensional earth model via an inversion process. Typically, these identification procedures are non-unique, resulting in multiple possible scenarios that cannot be distinguished based on the information at hand. To this end, standard approaches use a regularisation term including an a priori model. Here, Bayesian approaches will also be used, in which expert knowledge is used to guide the outcome to reasonable solutions. We will assess the reduction in uncertainty, and therefore in risk, that EM methods can provide when used in combination with seismic surveys for geothermal exploration prior to drilling. It may also be possible to use this technique for monitoring the evolution of geothermal systems. Such techniques may prove to be extremely valuable for the future development of geothermal energy resources.
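
    The proper orthogonal decomposition mentioned above can be sketched as an SVD-based separation of coherent noise from an ensemble of records; the channel count, noise tone and number of modes below are placeholders, and the snippet is a simplified stand-in rather than the filtering scheme planned for the Belgian campaign.

        import numpy as np

        def pod_denoise(records, n_modes):
            # Proper orthogonal decomposition (SVD) of an ensemble of records
            # (channels x samples): the first n_modes capture the strongest
            # coherent (e.g. cultural) noise, which is subtracted to leave the
            # residual signal.
            X = records - records.mean(axis=1, keepdims=True)
            U, s, Vt = np.linalg.svd(X, full_matrices=False)
            coherent = (U[:, :n_modes] * s[:n_modes]) @ Vt[:n_modes, :]
            return X - coherent

        # Toy usage: 20 channels sharing a strong 16.7 Hz railway-like tone.
        t = np.linspace(0, 2, 2000)
        rng = np.random.default_rng(5)
        tone = np.sin(2 * np.pi * 16.7 * t)
        records = (5.0 * np.outer(rng.random(20), tone)
                   + 0.1 * rng.standard_normal((20, 2000)))
        cleaned = pod_denoise(records, n_modes=1)
        print(records.std(), cleaned.std())   # variance drops sharply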

  9. PRISM Software: Processing and Review Interface for Strong‐Motion Data

    USGS Publications Warehouse

    Jones, Jeanne M.; Kalkan, Erol; Stephens, Christopher D.; Ng, Peter

    2017-01-01

    A continually increasing number of high‐quality digital strong‐motion records from stations of the National Strong Motion Project (NSMP) of the U.S. Geological Survey, as well as data from regional seismic networks within the United States, calls for automated processing of strong‐motion records with human review limited to selected significant or flagged records. The NSMP has developed the Processing and Review Interface for Strong Motion data (PRISM) software to meet this need. In combination with the Advanced National Seismic System Quake Monitoring System (AQMS), PRISM automates the processing of strong‐motion records. When used without AQMS, PRISM provides batch‐processing capabilities. The PRISM software is platform independent (coded in Java), open source, and does not depend on any closed‐source or proprietary software. The software consists of two major components: a record processing engine composed of modules for each processing step, and a review tool, which is a graphical user interface for manual review, edit, and processing. To facilitate use by non‐NSMP earthquake engineers and scientists, PRISM (both its processing engine and review tool) is easy to install and run as a stand‐alone system on common operating systems such as Linux, OS X, and Windows. PRISM was designed to be flexible and extensible to accommodate implementation of new processing techniques. All the computing features have been thoroughly tested.

  10. Physics-Based Broadband Ground Motion Simulations in Near Fault Conditions: the L'Aquila (Italy) and the Upper Rhine Graben (France-Germany) Case of Studies

    NASA Astrophysics Data System (ADS)

    Del Gaudio, S.; Lancieri, M.; Hok, S.; Satriano, C.; Chartier, T.; Scotti, O.; Bernard, P.

    2016-12-01

    Predicting realistic ground motion for potential future earthquakes is an important task for seismologists and is also the main objective of seismic hazard assessment. On one hand, numerical simulations have become more and more accurate and several different techniques have been developed; on the other hand, ground motion prediction equations (GMPEs) have become a powerful instrument, thanks to the great improvement of seismic strong motion networks, which provide a large amount of data. Nevertheless, GMPEs do not represent the whole variety of source processes, and this can lead to incorrect estimates, especially in near-fault conditions, because of the lack of records of large earthquakes at short distances. In such cases, physics-based ground motion simulations can be a valid tool to complement prediction equations for scenario studies, provided that both source and propagation are accurately described. We present here a comparison between numerical simulations performed in near-fault conditions using two different kinematic source models, which are based on different assumptions and parameterizations: the "k-2 model" and the "fractal model". Wave propagation is taken into account using hybrid Green's functions (HGF), which consist of coupling numerical Green's functions with an empirical Green's function (EGF) approach. The advantage of this technique is that it does not require a very detailed knowledge of the propagation medium, but it requires the availability of high quality records of small earthquakes in the target area. The first application we show is the L'Aquila 2009 M 6.3 earthquake, where the main event records provide a benchmark for the synthetic waveforms. Here we can clearly observe the limitations of these techniques and investigate which physical parameters effectively control the ground motion level. The second application is a blind test on the Upper Rhine Graben (URG), where active faults producing microseismic activity are very close to sites of interest, requiring a careful investigation of seismic hazard. Finally, we will perform a probabilistic seismic hazard analysis (PSHA) for the URG using numerical simulations to define input ground motion for different scenarios and compare them with a classical probabilistic study based on GMPEs.

  11. Application of Carbonate Reservoir using waveform inversion and reverse-time migration methods

    NASA Astrophysics Data System (ADS)

    Kim, W.; Kim, H.; Min, D.; Keehm, Y.

    2011-12-01

    Recent exploration targets of oil and gas resources are deeper and more complicated subsurface structures, and carbonate reservoirs have become one of the attractive and challenging targets in seismic exploration. To increase the rate of success in oil and gas exploration, it is required to delineate detailed subsurface structures. Accordingly, the migration method is an increasingly important factor in seismic data processing for this delineation. Seismic migration has a long history, and many migration techniques have been developed. Among them, reverse-time migration is promising, because it can provide reliable images for complicated models even in the case of significant velocity contrasts. The reliability of seismic migration images depends on the subsurface velocity models, which can be extracted in several ways. These days, geophysicists try to obtain velocity models through seismic full waveform inversion. Since Lailly (1983) and Tarantola (1984) proposed that the adjoint state of wave equations can be used in waveform inversion, the back-propagation techniques used in reverse-time migration have also been applied to waveform inversion, which accelerated its development. In this study, we applied acoustic waveform inversion and reverse-time migration methods to carbonate reservoir models with various reservoir thicknesses to examine the feasibility of the methods in delineating such reservoirs. We first extracted subsurface material properties from acoustic waveform inversion, and then applied reverse-time migration using the inverted velocities as a background model. The waveform inversion in this study used the back-propagation technique, and the conjugate gradient method was used for optimization. The inversion was performed using the frequency-selection strategy. Waveform inversion results showed that the carbonate reservoir models are clearly recovered and that migration images based on the inversion results are quite reliable. Models with different reservoir thicknesses were also examined, and the results revealed that the lower boundary of the reservoir was not delineated because of energy loss. From these results, it was noted that carbonate reservoirs can be properly imaged and interpreted by waveform inversion and reverse-time migration methods. This work was supported by the Energy Resources R&D program of the Korea Institute of Energy Technology Evaluation and Planning (KETEP) grant funded by the Korea government Ministry of Knowledge Economy (No. 2009201030001A, No. 2010T100200133) and the Brain Korea 21 project of Energy System Engineering.

  12. Yearly report, Yucca Mountain project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brune, J.N.

    1992-09-30

    We proposed to (1) develop our data logging and analysis equipment and techniques for analyzing seismic data from the Southern Great Basin Seismic Network (SGBSN); (2) investigate the SGBSN data for evidence of seismicity patterns, depth distribution patterns, and correlations with geologic features; (3) repair and maintain our three broadband downhole digital seismograph stations at Nelson, Nevada, Troy Canyon, Nevada, and Deep Springs, California; (4) install, operate, and log data from a super-sensitive microearthquake array at Yucca Mountain; and (5) analyze data from microearthquakes relative to seismic hazard at Yucca Mountain.

  13. Quantitative Seismic Interpretation: Applying Rock Physics Tools to Reduce Interpretation Risk

    NASA Astrophysics Data System (ADS)

    Sondergeld, Carl H.

    This book is divided into seven chapters that cover rock physics, statistical rock physics, seismic inversion techniques, case studies, and work flows. On balance, the emphasis is on rock physics. Included are 56 color figures that greatly help in the interpretation of more complicated plots and displays. The domain of rock physics falls between petrophysics and seismics. It is the basis for interpreting seismic observations and therefore is pivotal to the understanding of this book. The first two chapters are dedicated to this topic (109 pages).

  14. Determining the metallicity of the solar envelope using seismic inversion techniques

    NASA Astrophysics Data System (ADS)

    Buldgen, G.; Salmon, S. J. A. J.; Noels, A.; Scuflaire, R.; Dupret, M. A.; Reese, D. R.

    2017-11-01

    The solar metallicity issue is a long-lasting problem of astrophysics, impacting multiple fields and still subject to debate and uncertainties. While spectroscopy has mostly been used to determine the solar heavy element abundance, helioseismologists have attempted to provide a seismic determination of the metallicity in the solar convective envelope. However, the puzzle remains, since two independent groups provided two radically different values for this crucial astrophysical parameter. We aim at providing an independent seismic measurement of the solar metallicity in the convective envelope. Our main goal is to help provide new information to break the current stalemate amongst seismic determinations of the solar heavy element abundance. We start by presenting the kernels, the inversion technique and the target function of the inversion we have developed. We then test our approach in multiple hare-and-hounds exercises to assess its reliability and accuracy. We then apply our technique to solar data using calibrated solar models and determine an interval of seismic measurements for the solar metallicity. We show that our inversion can indeed be used to estimate the solar metallicity thanks to our hare-and-hounds exercises. However, we also show that further dependencies in the physical ingredients of solar models lead to a low accuracy. Nevertheless, using various physical ingredients for our solar models, we determine metallicity values between 0.008 and 0.014.

  15. Detection of sinkholes or anomalies using full seismic wave fields.

    DOT National Transportation Integrated Search

    2013-04-01

    This research presents an application of two-dimensional (2-D) time-domain waveform tomography for detection of embedded sinkholes and anomalies. The measured seismic surface wave fields were inverted using a full waveform inversion (FWI) technique, ...

  16. Crustal structure of Central Sicily

    NASA Astrophysics Data System (ADS)

    Giustiniani, Michela; Tinivella, Umberta; Nicolich, Rinaldo

    2018-01-01

    We processed crustal seismic profile SIRIPRO, acquired across Central Sicily. To improve the seismic image we utilized the wave equation datuming technique, a process of upward or downward continuation of the wavefield between two arbitrarily shaped surfaces. Wave equation datuming was applied to move shots and receivers to a given datum plane, removing time shifts related to topography and to near-surface velocity variations. The datuming procedure largely contributed to attenuating ground roll, enhancing higher frequencies, increasing resolution and improving the signal/noise ratio. The processed data allow us to recognize the geometries of crustal structures, differentiate seismic facies and obtain a direct image of the ongoing tectonic setting within the variable lithologies characterizing the crust of Central Sicily. Migrated sections underline distinctive features of the Hyblean Plateau foreland and, above all, a crustal thinning towards the Caltanissetta trough, at the contact with a likely deep Permo-Triassic rifted basin or rather a zone of continent-to-ocean transition. Inhomogeneity and fragmentation of the Sicilian crust, with a distinct separation of the Central Sicily basin from the western and eastern blocks, appear to have guided the tectonic transport inside the Caltanissetta crustal-scale syncline and the accumulation of allochthonous terrains with south- and north-verging thrusts. Major tectonic stacking operated during the construction of a wide anticline of the Maghrebian chain in northern Sicily. Sequential south-verging imbrications of deep elements forming the anticline core denote a crustal wedge indenting foreland structures. Deformation processes involved multiple detachment planes down to decoupling levels located near the crust/mantle transition, supporting the presence of high-density lenses beneath the chain, related to a southward push of the Tyrrhenian mantle and asthenosphere.

  17. Seismic depth imaging of sequence boundaries beneath the New Jersey shelf

    NASA Astrophysics Data System (ADS)

    Riedel, M.; Reiche, S.; Aßhoff, K.; Buske, S.

    2018-06-01

    Numerical modelling of fluid flow and transport processes relies on a well-constrained geological model, which is usually provided by seismic reflection surveys. In the New Jersey shelf area a large number of 2D seismic profiles provide an extensive database for constructing a reliable geological model. However, for the purpose of modelling groundwater flow, the seismic data need to be depth-converted which is usually accomplished using complementary data from borehole logs. Due to the limited availability of such data in the New Jersey shelf, we propose a two-stage processing strategy with particular emphasis on reflection tomography and pre-stack depth imaging. We apply this workflow to a seismic section crossing the entire New Jersey shelf. Due to the tomography-based velocity modelling, the processing flow does not depend on the availability of borehole logging data. Nonetheless, we validate our results by comparing the migrated depths of selected geological horizons to borehole core data from the IODP expedition 313 drill sites, located at three positions along our seismic line. The comparison yields that in the top 450 m of the migrated section, most of the selected reflectors were positioned with an accuracy close to the seismic resolution limit (≈ 4 m) for that data. For deeper layers the accuracy still remains within one seismic wavelength for the majority of the tested horizons. These results demonstrate that the processed seismic data provide a reliable basis for constructing a hydrogeological model. Furthermore, the proposed workflow can be applied to other seismic profiles in the New Jersey shelf, which will lead to an even better constrained model.

  18. High frequency seismic monitoring of debris flows at Chalk Cliffs (CO), USA

    NASA Astrophysics Data System (ADS)

    Coviello, Velio; Kean, Jason; Smith, Joel; Coe, Jeffrey; Arattano, Massimo; McCoy, Scott

    2015-04-01

    A growing number of studies adopt passive seismic monitoring techniques to investigate slope instabilities and landslide processes. These techniques are attractive and convenient because large areas can be monitored from a safe distance. This is particularly true when the phenomena under investigation are rapid and infrequent mass movements like debris flows. Different types of devices are used to monitor debris flow processes, but among them ground vibration detectors (GVDs) present several specific advantages that encourage their use. These advantages include: (i) the possibility of being installed outside the channel bed, (ii) the high adaptability to different and harsh field conditions, and (iii) the capability to detect the debris flow front arrival tens of seconds earlier than contact and stage sensors. Ground vibration data can provide relevant information on the dynamics of debris flows, such as the timing and velocity of the main surges. However, processing of the raw seismic signal is usually needed, both to obtain a more effective representation of waveforms and to decrease the amount of data that need to be recorded and analyzed. With this objective, the Amplitude and Impulses methods are commonly adopted to transform the raw signal into a 1-Hz signal that allows for a more useful representation of the phenomenon. In that way, peaks and other features become more visible and comparable with data obtained from other monitoring devices. In this work, we present the first debris-flow seismic recordings gathered in the Chalk Cliffs instrumented basin, central Colorado, USA. In May 2014, two 4.5-Hz, triaxial geophones were installed in the upper part of the catchment. Seismic data are sampled at 333 Hz and then recorded by a standalone recording unit. One geophone is installed directly on bedrock; the other is mounted on a 1-m boulder partially buried in colluvium. This latter sensor is integrated with a heavily instrumented cross-section consisting of a 225 cm2 force plate recording basal impact forces at 333 Hz, a laser distance meter recording flow stage over the plate at 10 Hz, and a high-definition video camera (24 frames per second). This combination of instrumentation allows for a comparison of the amplitude and spectral response of the geophones to flow depth, impact force, and video recordings. On July 4, 2014, a debris flow event occurred in the basin and was recorded by the whole monitoring system. Both the geophone installation methods and the channel bed characteristics largely influenced the seismic records. One geophone exhibits a broad frequency response during all debris flow surges, while the energy recorded by the other is mainly concentrated in the 40-80 Hz band. Furthermore, erosion and entrainment processes have a crucial effect on the recorded waveforms. The presence of channel bed sediment damps the Amplitude waveforms during the first surges, when the flow is not yet erosive. The typical proportionality between the Amplitude curve and the flow stage is observed only after the entrainment of the channel bed sediment by the debris flow, when the flow is directly on bedrock. Processing the signal with the Impulse transformation displays the same damping effect when a high threshold is adopted. However, the use of a high threshold entails the disappearance of the first surge and causes a less effective early detection of the flow. On the contrary, the adoption of a lower threshold impedes the observation of the sediment damping effect.
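
    The Amplitude and Impulses reductions mentioned above are straightforward to sketch; the function below is a schematic illustration (not the monitoring-station software) that converts a raw 333 Hz geophone trace into a 1-Hz Amplitude series (mean absolute amplitude per second) and a 1-Hz Impulses series (threshold-crossing counts per second); the threshold value is a placeholder.

      import numpy as np

      def amplitude_and_impulses(trace, fs=333, threshold=1e-5):
          """Reduce a raw geophone trace to 1-Hz Amplitude and Impulses series.

          trace     : 1-D array of ground-vibration samples
          fs        : sampling rate in samples per second (333 Hz in the text)
          threshold : impulse-counting threshold (placeholder value)
          """
          fs = int(fs)
          n_sec = len(trace) // fs
          x = np.abs(trace[:n_sec * fs]).reshape(n_sec, fs)
          amplitude = x.mean(axis=1)               # mean absolute amplitude per second
          impulses = (x > threshold).sum(axis=1)   # samples above threshold per second
          return amplitude, impulses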

  19. Updated Colombian Seismic Hazard Map

    NASA Astrophysics Data System (ADS)

    Eraso, J.; Arcila, M.; Romero, J.; Dimate, C.; Bermúdez, M. L.; Alvarado, C.

    2013-05-01

    The Colombian seismic hazard map used by the National Building Code (NSR-98) in effect until 2009 was developed in 1996. Since then, the National Seismological Network of Colombia has improved in both coverage and technology, providing fifteen years of additional seismic records. These improvements have allowed a better understanding of the regional geology and tectonics which, in addition to the seismic activity in Colombia with destructive effects, has motivated the interest and the need to develop a new seismic hazard assessment for the country. Taking advantage of new instrumental information sources, such as new broadband stations of the National Seismological Network, new historical seismicity data and the availability of standardized global databases, and in general of advances in models and techniques, a new Colombian seismic hazard map was developed. A PSHA model was applied because it incorporates the effects of all seismic sources that may affect a particular site while handling the uncertainties caused by the parameters and assumptions defined in this kind of study. First, the seismic source geometries and a complete and homogeneous seismic catalog were defined, and the seismicity rate parameters of each source were calculated, establishing a national seismotectonic model. Several attenuation-distance relationships were selected depending on the type of seismicity considered. The seismic hazard was estimated using the CRISIS2007 software created by the Engineering Institute of the Universidad Nacional Autónoma de México - UNAM (National Autonomous University of Mexico). A uniform grid with 0.1° spacing was used to calculate the peak ground acceleration (PGA) and response spectral values at 0.1, 0.2, 0.3, 0.5, 0.75, 1, 1.5, 2, 2.5 and 3.0 seconds with return periods of 75, 225, 475, 975 and 2475 years. For each site, a uniform hazard spectrum and exceedance rate curves were calculated. With these results it is possible to determine environments and scenarios in which the seismic hazard is a function of distance and magnitude, and also the principal seismic sources that contribute to the seismic hazard at each site (disaggregation). This project was conducted by the Servicio Geológico Colombiano (Colombian Geological Survey) and the Universidad Nacional de Colombia (National University of Colombia), with the collaboration of national and foreign experts and the National System of Prevention and Attention of Disasters (SNPAD). It is important to point out that this new seismic hazard map was used in the updated national building code (NSR-10). A new effort is ongoing to improve the map and present the seismic hazard in terms of intensity. This requires new knowledge of site effects at both local and regional scales, as well as checking existing and developing new acceleration-to-intensity relationships, in order to obtain results that are more understandable and useful for a wider range of users, not only in the engineering field but also for risk assessment and management institutions, researchers and the general community.
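
    For readers unfamiliar with the PSHA recipe, the sketch below shows the basic exceedance-rate integration for a single source and a single site; the recurrence parameters and the ground-motion relation are invented placeholders, and the sketch is in no way equivalent to the CRISIS2007 computation used in the study.

      import numpy as np
      from scipy.stats import norm

      def hazard_curve(pga_levels_g, rate_m_min=0.05, b=1.0, m_min=4.5, m_max=7.5,
                       dist_km=50.0, n_mag=200):
          """Annual exceedance rate of PGA levels from one source (toy PSHA sketch)."""
          mags = np.linspace(m_min, m_max, n_mag)
          dm = mags[1] - mags[0]
          beta = b * np.log(10.0)
          # Truncated Gutenberg-Richter magnitude density, scaled by the rate of M >= m_min
          pdf = beta * np.exp(-beta * (mags - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))
          rates = rate_m_min * pdf * dm
          # Placeholder ground-motion model: ln(PGA[g]) = c0 + c1*M - c2*ln(R), sigma in ln units
          ln_median = -4.0 + 0.8 * mags - 1.0 * np.log(dist_km)
          sigma = 0.6
          exceedance = np.zeros(len(pga_levels_g))
          for lm, r in zip(ln_median, rates):
              exceedance += r * norm.sf((np.log(pga_levels_g) - lm) / sigma)
          return exceedance

      pga = np.logspace(-2, 0, 30)          # 0.01 g to 1 g
      annual_rate = hazard_curve(pga)       # compare against 1/475, 1/2475, etc.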

  20. Machine Learning Method for Pattern Recognition in Volcano Seismic Spectra

    NASA Astrophysics Data System (ADS)

    Radic, V.; Unglert, K.; Jellinek, M.

    2016-12-01

    Variations in the spectral content of volcano seismicity related to changes in volcanic activity are commonly identified manually in spectrograms. However, long time series of monitoring data at volcano observatories require tools to facilitate automated and rapid processing. Techniques such as Self-Organizing Maps (SOM), Principal Component Analysis (PCA) and clustering methods can help to quickly and automatically identify important patterns related to impending eruptions. In this study we develop and evaluate an algorithm applied on a set of synthetic volcano seismic spectra as well as observed spectra from Kılauea Volcano, Hawai`i. Our goal is to retrieve a set of known spectral patterns that are associated with dominant phases of volcanic tremor before, during, and after periods of volcanic unrest. The algorithm is based on training a SOM on the spectra and then identifying local maxima and minima on the SOM 'topography'. The topography is derived from the first two PCA modes so that the maxima represent the SOM patterns that carry most of the variance in the spectra. Patterns identified in this way reproduce the known set of spectra. Our results show that, regardless of the level of white noise in the spectra, the algorithm can accurately reproduce the characteristic spectral patterns and their occurrence in time. The ability to rapidly classify spectra of volcano seismic data without prior knowledge of the character of the seismicity at a given volcanic system holds great potential for real time or near-real time applications, and thus ultimately for eruption forecasting.
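
    A compact version of such a workflow (not the authors' algorithm) could be built from off-the-shelf tools; the sketch below assumes the third-party minisom package and scikit-learn are available, uses synthetic spectra, and reduces the "topography" step to a single global maximum for brevity.

      import numpy as np
      from minisom import MiniSom                 # third-party SOM package, assumed available
      from sklearn.decomposition import PCA

      # Synthetic "spectra": rows are normalized spectra, columns are frequency bins
      rng = np.random.default_rng(0)
      spectra = np.abs(rng.normal(size=(500, 128)))
      spectra /= spectra.sum(axis=1, keepdims=True)

      # 1. Train a small SOM on the spectra
      som = MiniSom(8, 8, spectra.shape[1], sigma=1.5, learning_rate=0.5, random_seed=0)
      som.random_weights_init(spectra)
      som.train_random(spectra, num_iteration=5000)
      nodes = som.get_weights().reshape(64, spectra.shape[1])

      # 2. Project the SOM nodes onto the first two PCA modes to build a "topography"
      scores = PCA(n_components=2).fit_transform(nodes)
      topography = np.linalg.norm(scores, axis=1).reshape(8, 8)

      # 3. Treat maxima of the topography as candidate characteristic spectral patterns
      dominant_pattern = nodes[np.argmax(topography)]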

  1. Real-Time seismic waveforms monitoring with BeiDou Navigation Satellite System (BDS) observations for the 2015 Mw 7.8 Nepal earthquake

    NASA Astrophysics Data System (ADS)

    Geng, T.

    2015-12-01

    Nowadays more and more high-rate Global Navigation Satellite System (GNSS) data become available in real time, which provides more opportunities to monitor seismic waveforms. China's GNSS, the BeiDou Navigation Satellite System (BDS), already satisfies the requirement of stand-alone precise positioning in the Asia-Pacific region with 14 in-orbit satellites, which promisingly suggests that BDS could be applied to high-precision earthquake monitoring in the same way as GPS. In the present paper, real-time monitoring of seismic waveforms using BDS measurements is assessed. We investigate a so-called "variometric" approach to measure real-time seismic waveforms with high-rate BDS observations. This approach is based on a time-difference technique and standard broadcast products which are routinely available in real time. The 1 Hz BDS data recorded by the BeiDou Experimental Tracking Stations (BETS) during the 2015 Mw 7.8 Nepal earthquake are analyzed. The results indicate that the accuracies of velocity estimation from BDS are 2-3 mm/s in the horizontal components and 8-9 mm/s in the vertical component, which are consistent with GPS. The seismic velocity waveforms during the earthquake show good agreement between BDS and GPS. Moreover, the displacement waveforms are reconstructed by integrating the velocity time series with trend removal. Displacement waveforms with an accuracy of 1-2 cm are derived, as verified by comparison with post-processed GPS precise point positioning (PPP).
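
    The final integration-with-trend-removal step can be written in a few lines; the sketch below is a generic numpy/scipy illustration (not the authors' variometric code), assuming an evenly sampled 1 Hz velocity series in m/s as input.

      import numpy as np
      from scipy.integrate import cumulative_trapezoid
      from scipy.signal import detrend

      def velocity_to_displacement(vel, fs=1.0):
          """Integrate a GNSS-derived velocity waveform and remove the linear drift.

          vel : ground velocity time series (m/s), e.g. from the variometric approach
          fs  : sampling rate in Hz (1 Hz BDS data in the text)
          """
          disp = cumulative_trapezoid(vel, dx=1.0 / fs, initial=0.0)
          return detrend(disp, type="linear")   # remove the spurious linear trend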

  2. Polarization Analysis of Ambient Seismic Noise Green's Functions for Monitoring Glacial State

    NASA Astrophysics Data System (ADS)

    Fry, B.; Horgan, H. J.; Levy, R. H.; Bertler, N. A. N.

    2017-12-01

    Analysis of continuously recorded background seismic noise has emerged as a powerful technique to monitor changes within the Earth. In a process analogous to Einstein's 'Brownian motion', seismic energy enters the Earth through a variety of mechanisms and is then dissipated through scattering processes or through a semi-random distribution of sources. Eventually, in stratified media, some of this energy assembles itself into coherent packets and propagates as seismic surface waves. Through careful analysis of these waves as recorded by two seismic stations over a short period of time, we can reconstruct Empirical Green's Functions (EGF). EGF are sensitive to the material through which the waves travel between the two stations. They can thus provide 4D estimates of material properties such as seismic velocity and anisotropy. We specifically analyze both the bulk velocity and the complex phase of these EGF to look for subtle changes in velocity with direction of propagation as well as the nature of particle polarization and ellipticity. These characteristics can then be used as a proxy for contemporaneous stress and strain or 'inherited' strain. Similar approaches have proven successful in mapping stresses and strain in the crust, on plate interface faults, on volcanoes, and on glaciers and the Greenland ice sheet. We will present results from applying this approach to continuous broadband data recorded on the West Antarctic Ice Sheet through the Polenet project. Our results suggest that we can reconstruct EGF at least between frequencies of 300 mHz and 50 mHz, providing information about the contemporary state of the ice and underlying lithosphere on a seasonal or annual basis. Our primary goals are determining glacial state by linking wave propagation to material fabric on micro (crystal orientation) and macro (strain marker) scales, as well as rebound processes in the lithosphere during glacial loading and unloading. We will present our current results, effectively 1) providing an affordable and non-invasive method for monitoring changes in ice conditions through time and space (including depth) and 2) defining a baseline for the nature of wave propagation through the upper crust and ice sheet that will be useful for future studies examining the relation between forcing and ice sheet dynamic response.
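
    The core of EGF retrieval is the stacking of many short-window cross-correlations between two stations; the sketch below is a bare-bones illustration (it omits the spectral whitening, one-bit normalization and instrument correction used in practice) with hypothetical variable names, not the Polenet workflow itself.

      import numpy as np
      from scipy.signal import fftconvolve

      def stack_noise_correlations(tr_a, tr_b, fs, win_sec=3600, max_lag_sec=200):
          """Stack windowed cross-correlations of two continuous noise traces (toy EGF)."""
          n = int(win_sec * fs)
          lag = int(max_lag_sec * fs)
          n_win = min(len(tr_a), len(tr_b)) // n
          stack = np.zeros(2 * lag + 1)
          for k in range(n_win):
              a = tr_a[k * n:(k + 1) * n]
              b = tr_b[k * n:(k + 1) * n]
              a = (a - a.mean()) / (a.std() + 1e-12)       # crude amplitude normalization
              b = (b - b.mean()) / (b.std() + 1e-12)
              cc = fftconvolve(a, b[::-1], mode="full")    # cross-correlation via FFT
              mid = n - 1                                  # zero-lag sample
              stack += cc[mid - lag:mid + lag + 1]
          return stack / max(n_win, 1)                     # causal and acausal EGF estimate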

  3. Improvements of Real Time First Motion Focal Mechanism and Noise Characteristics of New Sites at the Puerto Rico Seismic Network

    NASA Astrophysics Data System (ADS)

    Williams, D. M.; Lopez, A. M.; Huerfano, V.; Lugo, J.; Cancel, J.

    2011-12-01

    Seismic networks need quick and efficient ways to obtain information related to seismic events for the purposes of seismic activity monitoring, risk assessment, and scientific knowledge, among others. As part of an IRIS summer internship program, two projects were performed to provide a tool for quick faulting-mechanism determination and to improve seismic data at the Puerto Rico Seismic Network (PRSN). First, a simple routine to obtain a focal mechanism, the geometry of the fault, based on first motions was developed and implemented for the data analysts' routine operations at PRSN. The new tool provides the analyst with a quick way to assess the probable faulting mechanism while performing the interactive earthquake location procedure. The focal mechanism is generated on the fly when data analysts pick P-wave arrival onsets and first motions. Once first motions have been identified, an in-house PRSN utility is employed to obtain the double-couple representation, which is later plotted using GMT's psmeca utility. Second, we addressed the issue of seismic noise related to thermal fluctuations inside seismic vaults. Seismic sites can be extremely noisy due to proximity to cultural activities and unattended thermal fluctuations inside sensor housings, resulting in skewed readings. In the past, seismologists have used different insulation techniques to reduce the amount of unwanted noise that a seismometer experiences due to these thermal changes, with items such as Styrofoam and fiberglass, among others. PRSN traditionally uses Styrofoam boxes to cover its seismic sensors; however, a proper procedure to test how this method compares to other techniques had never been carried out. The difficulty in properly testing these techniques in the Caribbean, and especially Puerto Rico, is that thermal fluctuations still occur because of the intense sun and humidity. We conducted a test based on the methods employed by the IRIS Transportable Array, which rely on insulating the sensor by burying it in sand. Two Guralp CMG-3Ts connected to RefTek 150 digitizers were used at PRSN's MPR site seismic vault to compare the two types of insulation. Two temperature loggers were placed alongside each seismic sensor for a period of one week to observe how much thermal fluctuation occurs with each insulation method and then to compare their capability for noise reduction due to thermal fluctuations. With only a single degree Celsius of fluctuation inside the sand (compared to almost twice that value for the foam), the sensor buried in sand provided the better insulation for the seismic vault. In addition, the quality of the data was analyzed by comparing both sensors using PQLX. We show results of this analysis and also provide site characteristics of new stations to be included in the daily earthquake location operations at the PRSN.

  4. Lower crustal seismic activity in the Adana Basin (Eastern Mediterranean): Possible connection to gravitational flexure

    NASA Astrophysics Data System (ADS)

    Ergin, Mehmet; Aktar, Mustafa

    2018-04-01

    High quality broadband data, together with the application of the double difference relocation technique, has been used to study the characteristics of the lower crustal seismicity in the Adana Basin, in southwestern Turkey. Deep events are clearly seen to be restricted only to the Adana Basin and never extend outside its boundaries. Furthermore, the seismogenic zone is observed to align roughly with the main axis of the basin and plunges steadily in the SSW-direction, following the basement trend of the Adana Basin. Similarities between geometries of the basin evolution and the deep seismic production suggest that both processes are closely related. A flexure process is proposed related to the subsidence of the Adana Basin. The seismogenic zone, originally at a shallow depth, is assumed to have been displaced vertically into the lower crust, by flexure. The temperature evolution of the crust during the flexure has been studied in detail using finite difference modeling, with amplitude and duration parameters taken from earlier studies. It has been concluded that the physical conditions for brittle fracturing remained unchanged for an extended period of time after the flexure. The brittle layers originally at shallow depths, preserved their original thermal properties after the subsidence and will continue to produce earthquakes at considerable depths. Numerical tests using inferred parameters imply a total vertical shift of 7-8 km for the seismogenic zone. Discussions for additional processes, which may further contribute to the cooling of the crust, are also included.
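
    The style of finite-difference temperature modelling referred to above can be illustrated with an explicit 1-D conduction scheme; the grid spacing, diffusivity and time step below are placeholder values, not those of the study, and the sketch ignores heat production and advection.

      import numpy as np

      def diffuse_temperature(T, dz=500.0, kappa=1e-6, dt_yr=1000.0, n_steps=5000):
          """Explicit 1-D heat-conduction update applied to a crustal geotherm T(z).

          T      : temperature profile on a regular depth grid (deg C)
          dz     : grid spacing (m)
          kappa  : thermal diffusivity (m^2/s)
          dt_yr  : time step (years); must satisfy the explicit stability limit
          """
          dt = dt_yr * 3.15e7                                   # years -> seconds
          r = kappa * dt / dz ** 2
          assert r <= 0.5, "explicit scheme unstable; reduce dt_yr or refine dz"
          T = T.copy()
          for _ in range(n_steps):
              T[1:-1] += r * (T[2:] - 2.0 * T[1:-1] + T[:-2])   # interior update
              # surface and basal temperatures are held fixed (Dirichlet boundaries)
          return T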

  5. Near-vertical seismic reflection image using a novel acquisition technique across the Vrancea Zone and Foscani Basin, south-eastern Carpathians (Romania)

    NASA Astrophysics Data System (ADS)

    Panea, I.; Stephenson, R.; Knapp, C.; Mocanu, V.; Drijkoningen, G.; Matenco, L.; Knapp, J.; Prodehl, K.

    2005-12-01

    The DACIA PLAN (Danube and Carpathian Integrated Action on Process in the Lithosphere and Neotectonics) deep seismic sounding survey was performed in August-September 2001 in south-eastern Romania, at the same time as the regional deep refraction seismic survey VRANCEA 2001. The main goal of the experiment was to obtain new information on the deep structure of the external Carpathians nappes and the architecture of Tertiary/Quaternary basins developed within and adjacent to the seismically-active Vrancea zone, including the Focsani Basin. The seismic reflection line had a WNW-ESE orientation, running from internal East Carpathians units, across the mountainous south-eastern Carpathians, and the foreland Focsani Basin towards the Danube Delta. There were 131 shot points along the profile, with about 1 km spacing, and data were recorded with stand-alone RefTek-125s (also known as "Texans"), supplied by the University of Texas at El Paso and the PASSCAL Institute. The entire line was recorded in three deployments, using about 340 receivers in the first deployment and 640 receivers in each of the other two deployments. The resulting deep seismic reflection stacks, processed to 20 s along the entire profile and to 10 s in the eastern Focsani Basin, are presented here. The regional architecture of the latter, interpreted in the context of abundant independent constraint from exploration seismic and subsurface data, is well imaged. The image within and beneath the thrust belt is of much poorer quality. Nevertheless, there is good evidence to suggest that a thick (~10 km) sedimentary basin having the structure of a graben and of indeterminate age underlies the westernmost part of the Focsani Basin, in the depth range 10-25 km. Most of the crustal-depth seismicity observed in the Vrancea zone (as opposed to the more intense upper mantle seismicity) appears to be associated with this sedimentary basin. The sedimentary successions within this basin and other horizons visible further to the west, beneath the Carpathian nappes, suggest that the geometry of the Neogene and recent uplift observed in the Vrancea zone, likely coupled with contemporaneous rapid subsidence in the foreland, is detached from deeper levels of the crust at about 10 km depth. The Moho lies at a depth of about 40 km along the profile, its poor expression in the reflection stack being strengthened by independent estimates from the refraction data. Given the apparent thickness of the (meta)sedimentary supracrustal units, the crystalline crust beneath this area is quite thin (< 20 km), supporting the hypothesis that there may have been delamination of (lower) continental crust in this area involved in the evolution of the seismic Vrancea zone.

  6. Along-strike variations in seismic structure of the locked-sliding transition on the plate boundary beneath the southern part of Kii Peninsula, southwestern Japan

    NASA Astrophysics Data System (ADS)

    Kurashimo, E.; Iidaka, T.; Iwasaki, T.; Saiga, A.; Umeyama, E.; Tsumura, N.; Sakai, S.; Hirata, N.

    2013-12-01

    The Nankai trough region, where the Philippine Sea Plate (PHS) subducts beneath the SW Japan arc, is a well-known seismogenic zone of interplate earthquakes. A narrow zone of nonvolcanic tremor has been found in the SW Japan fore-arc, along strike of the arc (Obara, 2002). The epicentral distribution of tremor corresponds to the locked-sliding transition estimated from thermal and deformation models (Hyndman et al., 1995). The spatial distribution of the tremor is not homogeneous in a narrow belt but is spatially clustered. Obara [2002] suggested fluids as a source for tremor because of the long duration and the mobility of the tremor activity. The behavior of fluids at the plate interface is a key factor in understanding fault slip processes. Seismic reflection characteristics and seismic velocity variations can provide important information on the fluid-related heterogeneity of structure around plate interface. However, little is known about the deeper part of the plate boundary, especially the transition zone on the subducting plate. To reveal the seismic structure of the transition zone, we conducted passive and active seismic experiments in the southern part of Kii Peninsula, SW Japan. Sixty 3-component portable seismographs were installed on a 60-km-long line (SM-line) nearly perpendicular to the direction of the subduction of the PHS with approximately 1 km spacing. To improve accuracy of hypocenter locations, we additionally deployed six 3-component seismic stations around the survey line. Waveforms were continuously recorded during a five-month period from December, 2009. In October of 2010, a deep seismic profiling was also conducted. 290 seismometers were deployed on the SM-line with about 200 m spacing, on which five explosives shots were fired as controlled seismic sources. Arrival times of local earthquakes and explosive shots were used in a joint inversion for earthquake locations and 3-D Vp and Vp/Vs structures, using the iterative damped least-squares algorithm, simul2000 (Thurber and Eberhart-Phillips, 1999). To obtain the detailed structure image of the transition zone on the subducting plate, the explosive shot data recorded on the SM-line were processed using the seismic reflection technique. Seismic reflection image shows the lateral variation of the reflectivity along the top of the PHS. A clear reflection band is present where the clustered tremors occurred. The depth section of Vp/Vs structure shows the lateral variation of the Vp/Vs values along the top of the PHS. Clustered tremors are located in and around the high Vp/Vs zone. These results suggest the occurrence of the tremors may be associated with fluids dehydrated from the subducted oceanic lithosphere.

  7. Big Data GPU-Driven Parallel Processing Spatial and Spatio-Temporal Clustering Algorithms

    NASA Astrophysics Data System (ADS)

    Konstantaras, Antonios; Skounakis, Emmanouil; Kilty, James-Alexander; Frantzeskakis, Theofanis; Maravelakis, Emmanuel

    2016-04-01

    Advances in graphics processing units' technology towards encompassing parallel architectures [1], comprised of thousands of cores and multiples of parallel threads, provide the foundation in terms of hardware for the rapid processing of various parallel applications regarding seismic big data analysis. Seismic data are normally stored as collections of vectors in massive matrices, growing rapidly in size as wider areas are covered, denser recording networks are being established and decades of data are being compiled together [2]. Yet, many processes regarding seismic data analysis are performed on each seismic event independently or as distinct tiles [3] of specific grouped seismic events within a much larger data set. Such processes, independent of one another can be performed in parallel narrowing down processing times drastically [1,3]. This research work presents the development and implementation of three parallel processing algorithms using Cuda C [4] for the investigation of potentially distinct seismic regions [5,6] present in the vicinity of the southern Hellenic seismic arc. The algorithms, programmed and executed in parallel comparatively, are the: fuzzy k-means clustering with expert knowledge [7] in assigning overall clusters' number; density-based clustering [8]; and a selves-developed spatio-temporal clustering algorithm encompassing expert [9] and empirical knowledge [10] for the specific area under investigation. Indexing terms: GPU parallel programming, Cuda C, heterogeneous processing, distinct seismic regions, parallel clustering algorithms, spatio-temporal clustering References [1] Kirk, D. and Hwu, W.: 'Programming massively parallel processors - A hands-on approach', 2nd Edition, Morgan Kaufman Publisher, 2013 [2] Konstantaras, A., Valianatos, F., Varley, M.R. and Makris, J.P.: 'Soft-Computing Modelling of Seismicity in the Southern Hellenic Arc', Geoscience and Remote Sensing Letters, vol. 5 (3), pp. 323-327, 2008 [3] Papadakis, S. and Diamantaras, K.: 'Programming and architecture of parallel processing systems', 1st Edition, Eds. Kleidarithmos, 2011 [4] NVIDIA.: 'NVidia CUDA C Programming Guide', version 5.0, NVidia (reference book) [5] Konstantaras, A.: 'Classification of Distinct Seismic Regions and Regional Temporal Modelling of Seismicity in the Vicinity of the Hellenic Seismic Arc', IEEE Selected Topics in Applied Earth Observations and Remote Sensing, vol. 6 (4), pp. 1857-1863, 2013 [6] Konstantaras, A. Varley, M.R.,. Valianatos, F., Collins, G. and Holifield, P.: 'Recognition of electric earthquake precursors using neuro-fuzzy models: methodology and simulation results', Proc. IASTED International Conference on Signal Processing Pattern Recognition and Applications (SPPRA 2002), Crete, Greece, 2002, pp 303-308, 2002 [7] Konstantaras, A., Katsifarakis, E., Maravelakis, E., Skounakis, E., Kokkinos, E. and Karapidakis, E.: 'Intelligent Spatial-Clustering of Seismicity in the Vicinity of the Hellenic Seismic Arc', Earth Science Research, vol. 1 (2), pp. 1-10, 2012 [8] Georgoulas, G., Konstantaras, A., Katsifarakis, E., Stylios, C.D., Maravelakis, E. and Vachtsevanos, G.: '"Seismic-Mass" Density-based Algorithm for Spatio-Temporal Clustering', Expert Systems with Applications, vol. 40 (10), pp. 4183-4189, 2013 [9] Konstantaras, A. J.: 'Expert knowledge-based algorithm for the dynamic discrimination of interactive natural clusters', Earth Science Informatics, 2015 (In Press, see: www.scopus.com) [10] Drakatos, G. 
and Latoussakis, J.: 'A catalog of aftershock sequences in Greece (1971-1997): Their spatial and temporal characteristics', Journal of Seismology, vol. 5, pp. 137-145, 2001
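
    Of the three clustering passes compared above, the density-based one is the easiest to sketch; the example below is a CPU-side scikit-learn illustration on synthetic epicentres (the work itself uses CUDA C implementations on a GPU), with invented coordinates and parameters.

      import numpy as np
      from sklearn.cluster import DBSCAN

      # Synthetic epicentres (longitude, latitude) near the southern Hellenic arc
      rng = np.random.default_rng(1)
      events = np.vstack([rng.normal([24.0, 35.0], 0.2, size=(300, 2)),
                          rng.normal([26.5, 34.5], 0.2, size=(300, 2))])

      # Density-based clustering: eps (degrees) and min_samples are illustrative choices
      labels = DBSCAN(eps=0.15, min_samples=20).fit_predict(events)
      print("clusters found:", len(set(labels)) - (1 if -1 in labels else 0))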

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nurhandoko, Bagus Endar B.; Wely, Woen; Setiadi, Herlan

    It is already known that tomography has a great impact on analyzing and mapping unknown objects, based on travel-time as well as waveform inversion. Tomography is therefore used in a wide range of fields, not only in medicine but also in petroleum and mining. Recently, tomographic methods have been applied in several mining industries. A case study of tomographic imaging has been carried out in the DOZ (Deep Ore Zone) block caving mine, Tembagapura, Papua. Many researchers are working to investigate the properties of the DOZ cave, not only outside but also inside, which is unknown. Tomography plays a part in addressing this objective. The sources are natural seismic events caused by mining-induced seismicity and rock deformation activity, which is why the approach is called passive seismic. These microseismic travel-time data are processed with the Simultaneous Iterative Reconstruction Technique (SIRT). The result of the inversion can be used for monitoring the DOZ cave, and this information can be used to identify weak zones inside the cave. In addition, the tomography results can be used to provide DOZ and cave information to support mine activity at PT Freeport Indonesia.
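
    The SIRT update used in such travel-time inversions can be sketched as follows; this is a generic, dense-matrix illustration (a mine-scale implementation would use ray tracing, sparse matrices and damping), and all names and the starting slowness are illustrative, not taken from the study.

      import numpy as np

      def sirt(G, t_obs, n_iter=100, s0=None):
          """Simultaneous Iterative Reconstruction Technique (schematic form).

          G     : ray-path matrix, G[i, j] = length of ray i inside cell j
          t_obs : observed travel times for each ray
          Returns an updated slowness model (1 / velocity) per cell.
          """
          n_rays, n_cells = G.shape
          s = np.full(n_cells, 1.0 / 4000.0) if s0 is None else s0.copy()
          row_len = G.sum(axis=1) + 1e-12          # total length of each ray
          col_hits = (G > 0).sum(axis=0) + 1e-12   # number of rays crossing each cell
          for _ in range(n_iter):
              residual = t_obs - G @ s             # travel-time misfit per ray
              # distribute each ray's residual along its path, then average per cell
              s += (G.T @ (residual / row_len)) / col_hits
          return s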

  9. Precise Relative Earthquake Depth Determination Using Array Processing Techniques

    NASA Astrophysics Data System (ADS)

    Florez, M. A.; Prieto, G. A.

    2014-12-01

    The mechanism for intermediate depth and deep earthquakes is still under debate. The temperatures and pressures are beyond the range in which ordinary brittle fracture is expected to occur. Key to constraining this mechanism is the precise determination of hypocentral depth. It is well known that using depth phases allows for significant improvement in event depth determination; however, routinely and systematically picking such phases for teleseismic or regional arrivals is problematic due to poor signal-to-noise ratios around the pP and sP phases. To overcome this limitation we have taken advantage of the additional information carried by seismic arrays. We have used beamforming and velocity spectral analysis techniques to precisely measure pP-P and sP-P differential travel times. These techniques are further extended to achieve subsample accuracy and to allow for events where the signal-to-noise ratio is close to or even less than 1.0. The individual estimates obtained at different subarrays for a pair of earthquakes can be combined using a double-difference technique in order to precisely map seismicity in regions where it is tightly clustered. We illustrate these methods using data from the recent M 7.9 Alaska earthquake and its aftershocks, as well as data from the Bucaramanga nest in northern South America, arguably the densest and most active intermediate-depth earthquake nest in the world.
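
    The beamforming / velocity spectral analysis step can be illustrated with a simple delay-and-sum slant stack; the sketch below assumes waveforms already aligned on the direct P arrival and a plane-wave geometry, and is not the authors' code; depth phases such as pP then appear as delayed secondary maxima in the output.

      import numpy as np

      def vespagram(traces, offsets_km, fs, slownesses_s_per_km):
          """Delay-and-sum beams of an array over a range of slownesses (slant stack).

          traces     : (n_stations, n_samples) array of aligned waveforms
          offsets_km : station offsets along the great-circle direction (km)
          fs         : sampling rate (Hz)
          Returns an (n_slownesses, n_samples) vespagram.
          """
          n_sta, n_samp = traces.shape
          out = np.zeros((len(slownesses_s_per_km), n_samp))
          for i, p in enumerate(slownesses_s_per_km):
              beam = np.zeros(n_samp)
              for k in range(n_sta):
                  shift = int(round(p * offsets_km[k] * fs))    # time shift in samples
                  beam += np.roll(traces[k], -shift)            # circular shift; fine for a sketch
              out[i] = beam / n_sta
          return out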

  10. Expected damages of retrofitted bridges with RC jacketing

    NASA Astrophysics Data System (ADS)

    Montes, O.; Jara, J. M.; Jara, M.; Olmos, B. A.

    2015-07-01

    The bridge infrastructure in many countries of the world consists of medium span length structures built several decades ago and designed for very low seismic forces. Many of them are reinforced concrete structures that, according to current code regulations, have to be rehabilitated to increase their seismic capacity. One way to reduce the vulnerability of the bridges is by using retrofitting techniques that increase the strength of the structure or by incorporating devices to reduce the seismic demand. One of the most common retrofit techniques for bridge substructures is the use of RC jacketing; this research assesses the expected damages of seismically deficient medium length highway bridges retrofitted with reinforced concrete jacketing, by conducting a parametric study. We select a suite of twenty accelerograms of subduction earthquakes recorded close to the Pacific Coast in Mexico. The original structures consist of five 30 m span simply supported bridges with pier heights of 5 m, 10 m, 15 m, 20 m and 25 m, and the analyses include three different jacket thicknesses and three steel ratios. The bridges were subjected to the seismic records and non-linear time history analyses were carried out using the OpenSees platform. The results allow selecting the reinforced concrete jacketing that best improves the expected seismic behavior of the bridge models.

  11. Near-real-time information products for Mount St. Helens -- tracking the ongoing eruption: Chapter 3 in A volcano rekindled: the renewed eruption of Mount St. Helens, 2004-2006

    USGS Publications Warehouse

    Qamar, Anthony I.; Malone, Stephen; Moran, Seth C.; Steele, William P.; Thelen, Weston A.; Sherrod, David R.; Scott, William E.; Stauffer, Peter H.

    2008-01-01

    The rapid onset of energetic seismicity on September 23, 2004, at Mount St. Helens caused seismologists at the Pacific Northwest Seismic Network and the Cascades Volcano Observatory to quickly improve and develop techniques that summarized and displayed seismic parameters for use by scientists and the general public. Such techniques included webicorders (Web-based helicorder-like displays), graphs showing RSAM (real-time seismic amplitude measurements), RMS (root-mean-square) plots, spectrograms, location maps, automated seismic-event detectors, focal mechanism solutions, automated approximations of earthquake magnitudes, RSAM-based alarms, and time-depth plots for seismic events. Many of these visual-information products were made available publicly as Web pages generated and updated routinely. The graphs and maps included short written text that explained the concepts behind them, which increased their value to the nonseismologic community that was tracking the eruption. Laypeople could read online summaries of the scientific interpretations and, if they chose, review some of the basic data, thereby providing a better understanding of the data used by scientists to make interpretations about ongoing eruptive activity, as well as a better understanding of how scientists worked to monitor the volcano.
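
    RSAM itself is simple to reproduce; the sketch below is a minimal illustration (not the observatory software) that reduces a raw trace to windowed mean absolute amplitudes and flags windows exceeding a hypothetical alarm level.

      import numpy as np

      def rsam(trace, fs, window_sec=600):
          """Mean absolute amplitude of consecutive windows (RSAM-style reduction)."""
          n = int(window_sec * fs)
          n_win = len(trace) // n
          return np.abs(trace[:n_win * n]).reshape(n_win, n).mean(axis=1)

      def rsam_alarm(rsam_values, threshold):
          """Indices of windows whose RSAM value exceeds a (hypothetical) alarm threshold."""
          return np.flatnonzero(rsam_values > threshold)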

  12. Bedload transport from spectral analysis of seismic noise near rivers

    NASA Astrophysics Data System (ADS)

    Hsu, L.; Finnegan, N. J.; Brodsky, E. E.

    2010-12-01

    Channel change in rivers is driven by bedload sediment transport. However, the nonlinear nature of sediment transport combined with the difficulty of making direct observations in rivers at flood hinder prediction of the timing and magnitude of bedload movement. Recent studies have shown that spectral analysis of seismic noise from seismometers near rivers illustrate a correlation between the relative amplitude of high frequency (>1 Hz) seismic noise and conditions for bedload transport, presumably from the energy transferred from clast collisions with the channel. However, a previous study in the Himalayas did not contain extensive bedload transport or discharge measurements, and the correspondence of seismic noise with proxy variables such as regional hydrologic and meteorologic data was not exact. A more complete understanding of the relationship between bedload transport and seismic noise would be valuable for extending the spatial and temporal extent of bedload data. To explore the direct relationship between bedload transport and seismic noise, we examine data from several seismic stations near the Trinity River in California, where the fluvial morphodynamics and bedload rating curves have been studied extensively. We compare the relative amplitude of the ambient seismic noise with records of water discharge and sediment transport. We also examine the noise at hourly, daily, and seasonal timescales to determine other possible sources of noise. We report the influence of variables such as local river slope, adjacent geology, anthropogenic noise, and distance from the river. The results illustrate the feasibility of using existing seismic arrays to sense radiated energy from processes of bedload transport. In addition, the results can be used to design future seismic array campaigns to optimize information about bedload transport. This technique provides great spatial and temporal coverage, and can be performed where direct bedload measurements are difficult or impossible. In addition to supplying information about sediment transport, the measure of energy transfer to the bed is useful for other applications such as potential for channel bed scour and erosion. Preliminary calculations indicate that the radiated energy sensed by a seismometer 1 km from a large mountain stream is of order 10^2 joules/s. This is similar in magnitude to the gravitational potential energy supply per time of the river, and therefore suggests that in these steep landscapes, a significant fraction of the energy from rivers is transmitted to the bed and can be documented by seismic noise.

  13. [Correlation between the microbiological (S. aureus) and seismic activities with regard to the sun-earth interactions and neutron flux generation].

    PubMed

    Shestopalov, I P; Rogozhin, Iu A

    2005-01-01

    The study searched for interactions between solar activity, the seismic energy of the Earth and microbiological processes in the period from 1969 to 1997. Microbiological processes were found to depend on both solar and intraterrestrial (e.g. seismic) activity. The 11-year seismic and biological cycles on Earth display a positive correlation with each other and a negative one with solar activity (sunspot cycles). There is also a correlation between the Earth's seismic energy, the neutron fluxes generated at the times of earthquakes on our planet, and the microbiological parameters.

  14. Geophysical monitoring technology for CO2 sequestration

    NASA Astrophysics Data System (ADS)

    Ma, Jin-Feng; Li, Lin; Wang, Hao-Fan; Tan, Ming-You; Cui, Shi-Ling; Zhang, Yun-Yin; Qu, Zhi-Peng; Jia, Ling-Yun; Zhang, Shu-Hai

    2016-06-01

    Geophysical techniques play key roles in measuring, monitoring, and verifying the safety of CO2 sequestration and in assessing the efficiency of CO2-enhanced oil recovery. Although geophysical monitoring techniques for CO2 sequestration have grown out of conventional oil and gas exploration techniques, geophysical monitoring takes a long time to conduct, and there are many barriers and challenges. In this paper, starting from the objective of performing CO2 sequestration, we studied the geophysical tasks associated with evaluating geological storage sites and monitoring CO2 sequestration. Based on our review of the scope of geophysical monitoring techniques and our experience in domestic and international carbon capture and sequestration projects, we analyzed the inherent difficulties of geophysical monitoring, especially with respect to 4D seismic acquisition, processing, and interpretation.

  15. Correlation of engineering parameters of the presumpscot formation to the seismic cone penetration test (SCPTU).

    DOT National Transportation Integrated Search

    2015-08-01

    The seismic cone penetration test with pore pressure measurement (SCPTu) is a geotechnical investigation technique which involves pushing a sensitized cone into the subsurface at a constant rate while continuously measuring tip resistance, sleeve ...

  16. Case Studies on Application of Data Integration Techniques to Nondestructive Testing of Pavements

    DOT National Transportation Integrated Search

    2005-11-01

    The nondestructive testing devices currently in use by TxDOT are the falling weight deflectometer, the seismic pavement analyzer, the portable seismic pavement analyzer, and ground penetrating radar, which provide thickness or modulus information. In...

  17. Statistical Analysis of Time-Series from Monitoring of Active Volcanic Vents

    NASA Astrophysics Data System (ADS)

    Lachowycz, S.; Cosma, I.; Pyle, D. M.; Mather, T. A.; Rodgers, M.; Varley, N. R.

    2016-12-01

    Despite recent advances in the collection and analysis of time-series from volcano monitoring, and the resulting insights into volcanic processes, challenges remain in forecasting and interpreting activity from near real-time analysis of monitoring data. Statistical methods have potential to characterise the underlying structure and facilitate intercomparison of these time-series, and so inform interpretation of volcanic activity. We explore the utility of multiple statistical techniques that could be widely applicable to monitoring data, including Shannon entropy and detrended fluctuation analysis, by their application to various data streams from volcanic vents during periods of temporally variable activity. Each technique reveals changes through time in the structure of some of the data that were not apparent from conventional analysis. For example, we calculate the Shannon entropy (a measure of the randomness of a signal) of time-series from the recent dome-forming eruptions of Volcán de Colima (Mexico) and Soufrière Hills (Montserrat). The entropy of real-time seismic measurements and the count rate of certain volcano-seismic event types from both volcanoes is found to be temporally variable, with these data generally having higher entropy during periods of lava effusion and/or larger explosions. In some instances, the entropy shifts prior to or coincident with changes in seismic or eruptive activity, some of which were not clearly recognised by real-time monitoring. Comparison with other statistics demonstrates the sensitivity of the entropy to the data distribution, but that it is distinct from conventional statistical measures such as coefficient of variation. We conclude that each analysis technique examined could provide valuable insights for interpretation of diverse monitoring time-series.
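
    As an illustration of one of the statistics discussed, the sketch below computes the Shannon entropy of successive windows of a monitoring time-series; the binning choice and window length are arbitrary assumptions, not those used in the study.

      import numpy as np

      def shannon_entropy(window, n_bins=64):
          """Shannon entropy (in bits) of the amplitude distribution of a signal window."""
          hist, _ = np.histogram(window, bins=n_bins)
          p = hist / hist.sum()
          p = p[p > 0]
          return -np.sum(p * np.log2(p))

      # Example: entropy of hourly windows of a synthetic 1-Hz monitoring series
      rng = np.random.default_rng(0)
      series = rng.normal(size=86400)                  # one day of 1-Hz samples
      entropies = [shannon_entropy(w) for w in series.reshape(-1, 3600)]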

  18. Multi-Array Detection, Association and Location of Infrasound and Seismo-Acoustic Events in Utah

    DTIC Science & Technology

    2008-09-30

    techniques for detecting, associating, and locating infrasound signals at single and multiple arrays and then combining the processed results with... was detected and located by both infrasound and seismic instruments (Figure 3). Infrasound signals at all three arrays, from one of the explosions, are...

  19. Convolutional neural network for earthquake detection and location

    PubMed Central

    Perol, Thibaut; Gharbi, Michaël; Denolle, Marine

    2018-01-01

    The recent evolution of induced seismicity in Central United States calls for exhaustive catalogs to improve seismic hazard assessment. Over the last decades, the volume of seismic data has increased exponentially, creating a need for efficient algorithms to reliably detect and locate earthquakes. Today’s most elaborate methods scan through the plethora of continuous seismic records, searching for repeating seismic signals. We leverage the recent advances in artificial intelligence and present ConvNetQuake, a highly scalable convolutional neural network for earthquake detection and location from a single waveform. We apply our technique to study the induced seismicity in Oklahoma, USA. We detect more than 17 times more earthquakes than previously cataloged by the Oklahoma Geological Survey. Our algorithm is orders of magnitude faster than established methods. PMID:29487899
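
    A minimal 1-D convolutional classifier in the spirit of this approach (this is not the ConvNetQuake architecture or its published hyper-parameters) could look like the PyTorch sketch below; the window length, channel counts and number of classes are placeholders.

      import torch
      import torch.nn as nn

      class WaveformCNN(nn.Module):
          """Small 1-D CNN mapping a 3-component waveform window to class scores
          (class 0 = noise, classes 1..K = geographic cluster of the detected event)."""
          def __init__(self, n_classes=7, n_samples=1000):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv1d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
                  nn.Conv1d(32, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
                  nn.Conv1d(32, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
              )
              self.classifier = nn.Linear(32 * (n_samples // 8), n_classes)

          def forward(self, x):                       # x: (batch, 3, n_samples)
              h = self.features(x)
              return self.classifier(h.flatten(1))

      model = WaveformCNN()
      logits = model(torch.randn(4, 3, 1000))         # four random windows -> (4, 7) scores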

  20. Seismicity Pattern and Fault Structure in the Central Himalaya Seismic Gap Using Precise Earthquake Hypocenters and their Source Parameters

    NASA Astrophysics Data System (ADS)

    Mendoza, M.; Ghosh, A.; Rai, S. S.

    2017-12-01

    The devastation brought on by the Mw 7.8 Gorkha earthquake in Nepal on 25 April 2015 reawakened people to the high earthquake risk along the Himalayan arc. It is therefore imperative to learn from the Gorkha earthquake, and gain a better understanding of the state of stress in this fault regime, in order to identify areas that could produce the next devastating earthquake. Here, we focus on what is known as the "central Himalaya seismic gap". It is located in Uttarakhand, India, west of Nepal, where a large (> Mw 7.0) earthquake has not occurred over the past 200 years [Rajendran, C.P., & Rajendran, K., 2005]. This 500 - 800 km long along-strike seismic gap has been poorly studied, mainly due to the lack of modern and dense instrumentation. It is especially concerning since densely populated cities, such as New Delhi, lie close to it. In this study, we analyze a rich seismic dataset from a dense network consisting of 50 broadband stations that operated between 2005 and 2012. We use the STA/LTA filter technique to detect earthquake phases, and the latest tools contributed to the Antelope software environment, to develop a large and robust earthquake catalog containing thousands of precise hypocentral locations, magnitudes, and focal mechanisms. By refining those locations in HypoDD [Waldhauser & Ellsworth, 2000] using relative relocation to form tighter clusters of events, we can potentially illustrate fault structures in this region with high resolution. Additionally, using ZMAP [Wiemer, S., 2001], we perform a variety of statistical analyses to understand the variability and nature of seismicity occurring in the region. Generating a large and consistent earthquake catalog not only brings to light the physical processes controlling the earthquake cycle in a Himalayan seismogenic zone, it also illustrates how stresses are building up along the décollement and the faults that stem from it. With this new catalog, we aim to reveal fault structure, study seismicity patterns, and assess the potential seismic hazard of the central Himalaya seismic gap.
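
    The STA/LTA detection step can be reproduced with ObsPy's trigger utilities; the sketch below is a generic example with assumed window lengths and thresholds, not the catalog-building configuration used in the study.

      from obspy.signal.trigger import classic_sta_lta, trigger_onset

      def detect_phases(trace, fs, sta_sec=1.0, lta_sec=10.0, on=3.5, off=1.5):
          """Pick candidate phase arrivals with a classic STA/LTA characteristic function.

          trace : 1-D numpy array of seismogram samples
          fs    : sampling rate in Hz
          Returns a list of (trigger_on_sample, trigger_off_sample) pairs.
          """
          cft = classic_sta_lta(trace, int(sta_sec * fs), int(lta_sec * fs))
          return trigger_onset(cft, on, off)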

  1. Seismic Structure of the Antarctic Upper Mantle and Transition Zone Unearthed by Full Waveform Adjoint Tomography

    NASA Astrophysics Data System (ADS)

    Lloyd, A. J.; Wiens, D.; Zhu, H.; Tromp, J.; Nyblade, A.; Anandakrishnan, S.; Aster, R. C.; Huerta, A. D.; Winberry, J. P.; Wilson, T. J.; Dalziel, I. W. D.; Hansen, S. E.; Shore, P.

    2017-12-01

    The upper mantle and transition zone beneath Antarctica and the surrounding ocean are among the poorest seismically imaged regions of the Earth's interior. Over the last 1.5 decades researchers have deployed several large temporary broadband seismic arrays focusing on major tectonic features in the Antarctic. The broader international community has also facilitated further instrumentation of the continent, often operating stations in additional regions. As of 2016, waveforms are available from almost 300 unique station locations. Using these stations along with 26 southern mid-latitude seismic stations we have imaged the seismic structure of the upper mantle and transition zone using full waveform adjoint techniques. The full waveform adjoint inversion assimilates phase observations from 3-component seismograms containing P, S, Rayleigh, and Love waves, including reflections and overtones, from 270 earthquakes (5.5 ≤ Mw ≤ 7.0) that occurred between 2001-2003 and 2007-2016. We present the major results of the full waveform adjoint inversion following 20 iterations, resulting in a continental-scale seismic model (ANT_20) with regional-scale resolution. Within East Antarctica, ANT_20 reveals internal seismic heterogeneity and differences in lithospheric thickness. For example, fast seismic velocities extending to 200-300 km depth are imaged beneath both Wilkes Land and the Gamburtsev Subglacial Mountains, whereas fast velocities only extend to 100-200 km depth beneath the Lambert Graben and Enderby Land. Furthermore, fast velocities are not found beneath portions of Dronning Maud Land, suggesting old cratonic lithosphere may be absent. Beneath West Antarctica slow upper mantle seismic velocities are imaged extending from the Balleny Island southward along the Transantarctic Mountains front, and broaden beneath the southern and northern portion of the mountain range. In addition, slow upper mantle velocities are imaged beneath the West Antarctic coast extending from Marie Byrd Land to the Antarctic Peninsula. This region of slow velocity only extends to 150-200 km depth beneath the Antarctic Peninsula, while elsewhere it extends to deeper upper mantle depths and possibly into the transition zone as well as offshore, suggesting two different geodynamic processes are at play.

  2. An integrated study of seismic anisotropy and the natural fracture system at the Conoco Borehole Test Facility, Kay County, Oklahoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Queen, J.H.; Rizer, W.D.

    1990-07-10

    A significant body of published work has developed establishing fracture-related seismic anisotropy as an observable effect. To further the understanding of seismic birefringence techniques in characterizing natural fracture systems at depth, an integrated program of seismic and geologic measurements has been conducted at Conoco's Borehole Test Facility in Kay County, Oklahoma. Birefringence parameters inferred from the seismic data are consistent with a vertical fracture model of density 0.04 striking east-northeast. That direction is subparallel to a fracture set mapped both on the surface and from subsurface data, to the in situ maximum horizontal stress, and to the inferred microfabric.

  3. Noise-based seismic monitoring of the Campi Flegrei caldera

    NASA Astrophysics Data System (ADS)

    Zaccarelli, Lucia; Bianco, Francesca

    2017-03-01

    The Campi Flegrei caldera is one of the highest risk volcanic fields worldwide, because of its eruptive history and the large population hosted within the caldera. It experiences bradyseismic crises: sudden uplift accompanied by low-energy seismic swarms. No seismicity is recorded outside these periods of deformation rate change. Therefore, continuous seismic monitoring of the caldera is possible only by means of the ambient seismic noise. We apply a noise-based seismic monitoring technique to the cross correlations of 5 years of recordings from the mobile seismic network. The resulting relative velocity variations are compared to the temporal behavior of the geophysical and geochemical observations routinely sampled at Campi Flegrei. We discriminate between two kinds of crustal stress field variations acting at different timescales. They are related to a possible magmatic intrusion and to the gradual heating of the hydrothermal system, respectively. This study sets up the basis for future volcano monitoring strategies.

  4. Engineering geological zonation of a complex landslide system through seismic ambient noise measurements at the Selmun Promontory (Malta)

    NASA Astrophysics Data System (ADS)

    Iannucci, Roberto; Martino, Salvatore; Paciello, Antonella; D'Amico, Sebastiano; Galea, Pauline

    2018-05-01

    The cliff slope of the Selmun Promontory, located in the northern part of the island of Malta (Central Mediterranean Sea) close to the coastline, is involved in a landslide process, as exhibited by the large block-size talus at its bottom. The landslide process is related to the geological succession outcropping in the Selmun area, characterized by the superposition of a grained limestone over a plastic clay, which induces a lateral spreading phenomenon associated with the detachment and collapse of different-size rock blocks. The landslide process shapes a typical landscape with a stable plateau of stiff limestone bordered by an unstable cliff slope. The ruins of Għajn Ħadid Tower, the first of the 13 watchtowers built in 1658 by the Grand Master Martin de Redin, stand out on the Selmun Promontory. The conservation of this important heritage site, already damaged by an earthquake which struck the Maltese Archipelago on 1856 October 12, is currently threatened by a progressive retreat of the landslide process towards the inland plateau area. During 2015 and 2016, field surveys were carried out to derive an engineering geological model of the Selmun Promontory. After a high-resolution geomechanical survey, the spatial distribution of the joints affecting the limestone was obtained. At the same time, 116 single-station noise measurements were carried out to cover the inland portion and edge of the limestone plateau as well as the slope where the clays outcrop. The obtained 1-hour time histories were analysed through the horizontal-to-vertical spectral ratio technique, as well as polarization and ellipticity analysis of particle motion, to define the local seismic response in zones having different stability conditions, that is, related to the presence of unstable rock blocks characterized by different vibrational modes. The results obtained demonstrate the suitability of passive seismic geophysical techniques for zoning landslide hazard in the case of rock slopes and prove the relevance of anisotropies in conditioning the polarization of vibrational modes for dislodged rock masses.
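
    The horizontal-to-vertical spectral ratio used for the single-station measurements can be sketched as follows; this minimal version uses Welch spectra of one 3-component noise record and ignores the windowing, smoothing and averaging conventions applied in practice (e.g. the SESAME guidelines); all names are illustrative.

      import numpy as np
      from scipy.signal import welch

      def hv_spectral_ratio(north, east, vertical, fs, nperseg=4096):
          """H/V spectral ratio from a single 3-component ambient-noise record.

          The three inputs are equally long arrays of noise samples; the peak of the
          returned ratio approximates the site's fundamental resonance frequency.
          """
          f, p_n = welch(north, fs=fs, nperseg=nperseg)
          _, p_e = welch(east, fs=fs, nperseg=nperseg)
          _, p_z = welch(vertical, fs=fs, nperseg=nperseg)
          h = np.sqrt(0.5 * (p_n + p_e))          # quadratic mean of the horizontal spectra
          return f, h / np.sqrt(p_z)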

  5. Elastic Reverse Time Migration (RTM) From Surface Topography

    NASA Astrophysics Data System (ADS)

    Akram, Naveed; Chen, Xiaofei

    2017-04-01

    Seismic migration is a promising data processing technique to construct subsurface images by projecting the seismic data recorded at the surface back to their origins. There are numerous migration methods. Among them, Reverse Time Migration (RTM) is considered a robust and standard imaging technology in the present-day exploration industry as well as in academic research because of its superior performance compared to traditional migration methods. Although RTM is computationally intensive and time consuming, it can efficiently handle complex geology, steeply dipping reflectors and strong lateral velocity variation all together. RTM takes the data recorded at the surface as a boundary condition and propagates the data backwards in time until the imaging condition is met. It can use the same modeling algorithm that is used for forward modeling. Classical seismic exploration theory assumes a flat surface, which is almost never the case in practice for land data. So irregular surface topography has to be considered in the simulation of seismic wave propagation, which is not always a straightforward undertaking. In this study, the curved-grid finite-difference method (CG-FDM) is adapted to model elastic seismic wave propagation in order to investigate the effect of surface topography on RTM results and to explore its advantages and limitations with synthetic data experiments, using the Foothill model with topography as the true model. We focus on elastic rather than acoustic wave propagation because the Earth actually behaves as an elastic body. Our results strongly emphasize that irregular surface topography must be considered in the modeling of seismic wave propagation to obtain better subsurface images, especially in mountainous scenarios, and suggest that practitioners properly handle the geometry of data acquired on an irregular topographic surface in their imaging algorithms.

  7. New Ground Truth Capability from InSAR Time Series Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buckley, S; Vincent, P; Yang, D

    2005-07-13

    We demonstrate that next-generation interferometric synthetic aperture radar (InSAR) processing techniques applied to existing data provide rich InSAR ground truth content for exploitation in seismic source identification. InSAR time series analyses utilize tens of interferograms and can be implemented in different ways. In one such approach, conventional InSAR displacement maps are inverted in a final post-processing step. Alternatively, computationally intensive data reduction can be performed with specialized InSAR processing algorithms. The typical final result of these approaches is a synthesized set of cumulative displacement maps. Examples from our recent work demonstrate that these InSAR processing techniques can provide appealing new ground truth capabilities. We construct movies showing the areal and temporal evolution of deformation associated with previous nuclear tests. In other analyses, we extract time histories of centimeter-scale surface displacement associated with tunneling. The potential exists to identify millimeter-per-year surface movements when sufficient data exist for InSAR techniques to isolate and remove phase signatures associated with digital elevation model errors and the atmosphere.
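    As a rough illustration of the first approach mentioned above, in which displacement maps from individual interferogram pairs are inverted into a cumulative time series in a post-processing step, the sketch below sets up a small-baseline-style least-squares inversion. It is a toy example under simplifying assumptions (unwrapped, co-registered interferograms already converted to displacement, no atmospheric or DEM-error terms); the function name and the tiny synthetic dataset are hypothetical.

    ```python
    import numpy as np

    def invert_time_series(pairs, ifg_stack, n_dates):
        """Invert a stack of interferograms for cumulative displacement per pixel.
        pairs: list of (i, j) acquisition indices with j > i, one per interferogram
        ifg_stack: (n_ifg, n_pix) displacement between dates i and j
        Returns cumulative displacement (n_dates, n_pix), zero at the first date."""
        n_ifg = len(pairs)
        # Each interferogram is the sum of the incremental displacements it spans
        A = np.zeros((n_ifg, n_dates - 1))
        for k, (i, j) in enumerate(pairs):
            A[k, i:j] = 1.0
        incr, *_ = np.linalg.lstsq(A, ifg_stack, rcond=None)   # least squares
        return np.vstack([np.zeros(ifg_stack.shape[1]),
                          np.cumsum(incr, axis=0)])

    # Toy usage: 4 acquisitions, 4 interferograms, 3 pixels
    pairs = [(0, 1), (1, 2), (2, 3), (0, 2)]
    truth = np.array([[0, 0, 0], [1, 2, 0.5], [2, 3, 1.0], [2.5, 4, 1.5]], float)
    ifgs = np.array([truth[j] - truth[i] for i, j in pairs])
    print(invert_time_series(pairs, ifgs, n_dates=4))   # recovers `truth`
    ```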

  8. Very-long-period seismic signals - filling the gap between deformation and seismicity

    NASA Astrophysics Data System (ADS)

    Neuberg, Jurgen; Smith, Paddy

    2013-04-01

    Good broadband seismic sensors are capable of recording seismic transients with dominant wavelengths of several tens or even hundreds of seconds. This allows us to generate a multi-component record of volcanic seismic events that lie between the conventional high- to low-frequency seismic spectrum and deformation signals. With a much higher temporal resolution and accuracy than, e.g., GPS records, these signals fill the gap between seismicity and deformation studies. In this contribution we will review the non-trivial processing steps necessary to retrieve ground deformation from the original velocity seismogram and explore what role the resulting displacement signals play in the analysis of volcanic events. We use examples from Soufriere Hills volcano in Montserrat, West Indies, to discuss the benefits and shortcomings of such methods regarding new insights into volcanic processes.
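    The conversion from a broadband velocity seismogram to a displacement record typically involves detrending, tapering, numerical integration and drift suppression. The sketch below shows one plausible minimal version of those steps with NumPy/SciPy; it is not the authors' processing sequence (which would normally also include instrument response removal), and the corner frequency and filter order are illustrative.

    ```python
    import numpy as np
    from scipy import signal

    def velocity_to_displacement(vel_trace, fs, corner_hz=0.01):
        """Integrate a velocity seismogram to displacement with drift control."""
        x = signal.detrend(vel_trace, type="linear")          # remove mean and trend
        x *= signal.windows.tukey(len(x), alpha=0.05)         # taper the ends
        disp = np.cumsum(x) / fs                              # numerical integration
        sos = signal.butter(2, corner_hz, btype="highpass", fs=fs, output="sos")
        return signal.sosfiltfilt(sos, disp)                  # zero-phase high-pass
    ```

    Lowering the high-pass corner preserves more of the very-long-period content discussed above, at the cost of admitting more integration drift.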

  9. Source mechanics for monochromatic icequakes produced during iceberg calving at Columbia Glacier, AK

    USGS Publications Warehouse

    O'Neel, Shad; Pfeffer, W.T.

    2007-01-01

    Seismograms recorded during iceberg calving contain information pertaining to source processes during calving events. However, locally variable material properties may cause signal distortions, known as site and path effects, which must be eliminated prior to commenting on source mechanics. We applied the technique of horizontal/vertical spectral ratios to passive seismic data collected at Columbia Glacier, AK, and found no dominant site or path effects. Rather, the monochromatic waveforms generated by calving appear to result from source processes. We hypothesize that a fluid-filled crack source model offers a potential mechanism for the observed seismograms produced by calving and by fracture processes preceding calving.
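    The horizontal/vertical spectral ratio technique mentioned above reduces to a few operations: compute amplitude spectra of the three components, combine the horizontals, smooth, and divide by the vertical. The following sketch is a generic single-window implementation for illustration only; window selection, smoothing choices and averaging over many windows are handled more carefully in practice.

    ```python
    import numpy as np

    def hv_spectral_ratio(north, east, vert, fs, smooth_pts=11):
        """H/V spectral ratio for one window of three-component data."""
        n = len(vert)
        win = np.hanning(n)
        spec = lambda x: np.abs(np.fft.rfft((x - x.mean()) * win))
        N, E, Z = spec(north), spec(east), spec(vert)
        H = np.sqrt(0.5 * (N**2 + E**2))            # quadratic mean of horizontals
        kernel = np.ones(smooth_pts) / smooth_pts
        smooth = lambda s: np.convolve(s, kernel, mode="same")
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        return freqs, smooth(H) / (smooth(Z) + 1e-12)
    ```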

  10. Man-caused seismicity of Kuzbass

    NASA Astrophysics Data System (ADS)

    Emanov, Alexandr; Emanov, Alexey; Leskova, Ekaterina; Fateyev, Alexandr

    2010-05-01

    The natural seismicity of the Kuznetsk Basin is confined mainly to the mountain frame of the Kuznetsk depression. In this paper, materials from experimental work with local station networks within the sedimentary basin are presented. Two types of seismicity within the Kuznetsk depression have been recognized: first, man-caused (induced) seismic processes confined to mine workings and concentrated at depths of up to one and a half kilometers; second, seismic activations at depths of 2-56 km that are not spatially associated with the coal mines. Each of the studied seismic activations consists of a large number of small-magnitude earthquakes (Ms = 1-3). From one to several tens of earthquakes were recorded per day. The earthquakes near a mine working migrate in space together with the working; the seismic process intensifies when a coal-plough machine is operating and slackens when preventive works are carried out. The seismic processes near three longwall faces in the Kuznetsk Basin have been studied in detail. Reverse (uplift) faulting is the most typical focal mechanism. The activated zone near a mine working reaches 1-1.5 km in diameter. Seismic activations not linked to mine workings indicate that the subsurface of the Kuznetsk depression as a whole remains in a stressed state. The most probable man-caused influences on the depression are processes associated with changes in the physical state of the rocks due to the loss of methane from large volumes, or changes in rock water saturation over large volumes caused by mining. In this case, compacted rocks that have lost gas and water can be pressed upward, producing the reverse-fault mechanism of the earthquakes. A combination of the stressed state of the depression and man-caused action from deep mining may account for the incipient activations in the Kuznetsk Basin. At present, earthquakes occur mainly beneath mine workings; although the workings themselves are not damaged, the intense shaking at the surface calls for careful study of such dangerous phenomena. In 2009, the experiment on seismic activations in the area of the previously investigated longwall faces was repeated. A spatial displacement of the activations along with the mine workings was found, and the impact of technogenic factors on the behavior of the seismic process was investigated. It was demonstrated that industrial explosions in neighboring open pits have no pronounced effect on the seismic process near the longwall faces. Stoppage of cutting work in the faces leads to simultaneous changes in the induced seismicity: the number of technogenic earthquakes is halved, and the small-magnitude earthquakes remain, but such lulls lead to occasional, though stronger, technogenic earthquakes.

  11. Seismic multiple attenuation in the northern continent-ocean transition zone of the South China Sea

    NASA Astrophysics Data System (ADS)

    Chen, N.; Li, C. F.

    2017-12-01

    In seismic exploration, especially in marine oil and gas exploration, the presence of multiple reflections lowers the signal-to-noise ratio of seismic data and makes it difficult to analyze seismic velocities. In the northern continent-ocean transition zone of the South China Sea (SCS), low-velocity Cenozoic strata directly overlie sets of high-velocity carbonate strata, and over 1000 m of sediments were deposited on the igneous basement in the northwest SCS. These sedimentary boundaries generate quite strong impedance interfaces and strong internal multiples. Diffractions resulting from variations in seabed topography, coupled with vibration noise, free-surface multiples and refraction multiples, cause a variety of strong energy disturbances and a loss of frequency content. In this study, we process four recently acquired multichannel reflection seismic profiles from the northern continent-ocean transition zone of the SCS with a new combination of demultiple techniques. There is a variety of strong multiples in the raw data; the seabed multiple occurs between 9 and 11 s in two-way travel time (TWTT), and we apply Surface-Related Multiple Elimination (SRME) to attenuate the free-surface multiples. After SRME, we use a high-resolution Radon transform (RAMUR) to attenuate deep multiples concentrated below 10 s TWTT. Normal moveout (NMO) correction is necessary before RAMUR to flatten true reflections and turn multiples into parabolas, after which the deep multiples can be attenuated in the τ-p domain. The seabed topography varies greatly in the continent-ocean transition zone, so diffractions are well developed. However, SRME and RAMUR are not effective in attenuating diffractions and internal multiples. We therefore select diffracted multiple attenuation (DIMAT) after many trials and detailed analysis. The diffractions are extracted in decomposed frequency bands. The internal multiples below 11 s TWTT and high-amplitude noise are successfully suppressed while keeping the primary events. This combination of SRME, RAMUR and DIMAT in sequence proves to be quite effective in attenuating these types of multiples in the continent-ocean transition zone. Keywords: continent-ocean transition zone, seismic exploration, data processing, multiple attenuation
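    Because the NMO correction is the step that flattens primaries and leaves multiples with residual (roughly parabolic) moveout before the Radon transform, a minimal version of it is sketched below. This is a generic textbook-style implementation (linear interpolation, no stretch mute), not the processing code used in the study; array shapes and parameter names are assumptions.

    ```python
    import numpy as np

    def nmo_correct(gather, offsets, vel_rms, dt):
        """Normal moveout correction of a CMP gather.
        gather: (nt, noff) traces; offsets: (noff,) in m; vel_rms: (nt,) RMS velocity
        in m/s as a function of zero-offset time; dt: sample interval in s."""
        nt, noff = gather.shape
        t0 = np.arange(nt) * dt                           # zero-offset two-way times
        out = np.zeros_like(gather)
        for j, h in enumerate(offsets):
            t_h = np.sqrt(t0**2 + (h / vel_rms) ** 2)     # hyperbolic travel time
            # map each output sample back to the input trace by interpolation
            out[:, j] = np.interp(t_h, t0, gather[:, j], left=0.0, right=0.0)
        return out
    ```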

  12. Seismic instrumentation plan for the Hawaiian Volcano Observatory

    USGS Publications Warehouse

    Thelen, Weston A.

    2014-01-01

    The installation of new seismic stations is only the first part of building a volcanic early warning capability for seismicity in the State of Hawaii. Additional personnel will likely be required to study the volcanic processes at work under each volcano, analyze the current seismic activity at a level sufficient for early warning, build new tools for monitoring, maintain seismic computing resources, and maintain the new seismic stations.

  13. Multiple attenuation to reflection seismic data using Radon filter and Wave Equation Multiple Rejection (WEMR) method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erlangga, Mokhammad Puput

    Separation between signal and noise, whether incoherent or coherent, is important in seismic data processing. Even after processing, coherent noise may still be mixed with the primary signal, and multiple reflections are one kind of coherent noise. In this research, we processed seismic data to attenuate multiple reflections in both synthetic data and real seismic data from Mentawai. There are several methods to attenuate multiple reflections; one of them is the Radon filter method, which discriminates between primary and multiple reflections in the τ-p domain based on the moveout difference between them. However, in cases where the moveout difference is too small, the Radon filter is not sufficient to attenuate the multiples, and it also produces artifacts in the gathers. In addition to the Radon filter, we use the Wave Equation Multiple Rejection (WEMR) method to attenuate long-period multiple reflections. The WEMR method attenuates long-period multiples based on wave-equation inversion. From the inversion of the wave equation and the magnitude of the seismic wave amplitude observed at the free surface, we obtain the water-bottom reflectivity, which is then used to eliminate the multiple reflections. The WEMR method does not depend on the moveout difference to attenuate long-period multiples; it can therefore be applied to seismic data with small moveout differences, such as the Mentawai data, where the small moveout difference is caused by the limited far offset of only 705 m. We compared the multiple-free stacked data after processing with the Radon filter and with the WEMR method, and conclude that the WEMR method attenuates the long-period multiples on the real (Mentawai) seismic data more effectively than the Radon filter.
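    To make the discrimination between primaries and multiples by moveout difference concrete, the sketch below implements a damped least-squares parabolic Radon transform in the frequency domain and subtracts the events with large residual moveout from an NMO-corrected gather. It illustrates the general idea only; it is not the Radon filter used in the study, and the damping, q-range and cut-off values are placeholders.

    ```python
    import numpy as np

    def parabolic_radon_demultiple(gather, offsets, dt, q_values, q_cut, eps=0.1):
        """Attenuate multiples with a frequency-domain parabolic Radon transform.
        gather: NMO-corrected CMP gather (nt, noff); offsets in metres;
        q_values: residual moveouts (s) at the reference (far) offset;
        events with |q| >= q_cut are modelled as multiples and subtracted."""
        nt, noff = gather.shape
        href = np.max(np.abs(offsets))
        hh = (offsets / href) ** 2                      # normalized squared offsets
        D = np.fft.rfft(gather, axis=0)                 # (nf, noff)
        freqs = np.fft.rfftfreq(nt, d=dt)
        mult = np.zeros_like(D)
        keep = np.abs(q_values) >= q_cut                # Radon-domain multiple region
        for k, f in enumerate(freqs):
            L = np.exp(-2j * np.pi * f * np.outer(hh, q_values))   # (noff, nq)
            # damped least-squares estimate of the Radon panel at this frequency
            A = L.conj().T @ L + eps * np.eye(len(q_values))
            m = np.linalg.solve(A, L.conj().T @ D[k])
            mult[k] = L[:, keep] @ m[keep]              # model only the multiples
        multiples = np.fft.irfft(mult, n=nt, axis=0)
        return gather - multiples, multiples
    ```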

  14. Imaging the North Anatolian Fault using the scattered teleseismic wavefield

    NASA Astrophysics Data System (ADS)

    Thompson, D. A.; Rost, S.; Houseman, G. A.; Cornwell, D. G.; Turkelli, N.; Teoman, U.; Kahraman, M.; Altuncu Poyraz, S.; Gülen, L.; Utkucu, M.; Frederiksen, A. W.; Rondenay, S.

    2013-12-01

    The North Anatolian Fault Zone (NAFZ) is a major continental strike-slip fault system, similar in size and scale to the San Andreas system, that extends ~1200 km across Turkey. In 2012, a new multidisciplinary project (FaultLab) was initiated to better understand deformation throughout the entire crust in the NAFZ, in particular the expected transition from narrow zones of brittle deformation in the upper crust to possibly broader shear zones in the lower crust/upper mantle, and how these features contribute to the earthquake loading cycle. This contribution will discuss the first results from the seismic component of the project, a 73-station network encompassing the northern and southern branches of the NAFZ in the Sakarya region. The Dense Array for North Anatolia (DANA) is arranged as a 6×11 grid with a nominal station spacing of 7 km, with a further 7 stations located outside the main grid. With the excellent resolution afforded by the DANA network, we will present images of crustal structure obtained using teleseismic scattering tomography. The method uses a full waveform inversion of the teleseismic scattered wavefield, coupled with array processing techniques, to infer the properties and location of small-scale heterogeneities (with scales on the order of the seismic wavelength) within the crust. We will also present preliminary results of teleseismic scattering migration, another powerful method that benefits from the dense data coverage of the deployed seismic network. Images obtained using these methods, together with other conventional imaging techniques, will provide evidence for how deformation is distributed within the fault zone at depth, providing constraints that can be used in conjunction with structural analyses of exhumed fault segments and models of geodetic strain rate across the fault system. By linking together results from the complementary techniques employed in the FaultLab project, we aim to produce a comprehensive picture of fault structure and dynamics throughout the crust and shallow upper mantle of this major active fault zone.

  15. Seismic Moment and Recurrence using Luminescence Dating Techniques: Characterizing brittle fault zone materials suitable for luminescence dating

    NASA Astrophysics Data System (ADS)

    Tsakalos, E.; Lin, A.; Bassiakos, Y.; Kazantzaki, M.; Filippaki, E.

    2017-12-01

    During a seismic-geodynamic process, frictional heating and pressure act on sediment fragments, resulting in deformation and alteration of the minerals they contain. The luminescence signal stored in the crystal lattice of these minerals can be affected, and even zeroed, during such an event. This has been a breakthrough in geochronological studies, since it can be utilized as a chronometer for the previous seismic activity of a tectonically active area. Although the use of luminescence dating has in some cases been successfully described, a comprehensive study outlining and defining protocols for routine luminescence dating applied to neotectonic studies has not been forthcoming. This study is the experimental investigation, recording and parameterization of the effects of tectonic phenomena on the luminescence signal of minerals, and the development of detailed protocols for standardizing the luminescence methodology for directly dating deformed geological formations, so that the long-term temporal behaviour of seismically active faults can be reasonably understood and modeled. This will be achieved by: a) identifying and proposing brittle fault zone materials suitable for luminescence dating using petrological, mineralogical and chemical analyses, and b) investigating the "zeroing" potential of the luminescence signal of minerals contained in fault zone materials by means of laboratory simulations of tectonic processes, combined with luminescence measurements on samples collected from real fault zones. To this end, samples collected from four faults in four different geographical regions will be used. This preliminary first step of the study presents the microstructural and mineralogical analyses for the characterization of brittle fault zone materials that contain minerals suitable for luminescence dating (e.g., quartz and feldspar). The results show that the collected samples are seismically deformed fault zone materials (mylonites, tectonites, tectonic breccias, etc.) and contain a sufficient quantity of minerals suitable for luminescence dating.

  16. Detecting lower-mantle slabs beneath Asia and the Aleutians

    NASA Astrophysics Data System (ADS)

    Schumacher, L.; Thomas, C.

    2016-06-01

    To investigate the descent of subducted slabs, we search for and analyse seismic arrivals that are reflected off the surface of the slab. In order to distinguish such arrivals from other seismic phases, we search for waves that reach a seismic array with a backazimuth deviating from the theoretical backazimuth of the earthquake. Source-receiver combinations are chosen so that their great-circle paths do not intersect the slab region; hence the direct arrivals can serve as a reference. We focus on the North and Northwest Pacific region by using earthquakes from Japan, the Philippines and the Hindu Kush area recorded at North American networks (e.g. USArray, Alaska and Canada). Using seismic array techniques to analyse the data, we record the slowness, backazimuth and traveltime of the observed out-of-plane arrivals and use these measurements to trace each wave back through a 1-D velocity model to its scattering/reflection location. We find a number of out-of-plane reflections. Assuming only single scattering, most out-of-plane signals have to travel as P-to-P phases and only a few as S-to-P phases, owing to the length of the seismograms we processed. The located reflection points provide a view of the 3-D structures within the mantle. In the upper mantle and the transition zone they correlate well with the edges of fast-velocity regions in tomographic images. We also find reflection points in the mid- and lower mantle, and their locations generally agree with fast velocities mapped by seismic tomography models, suggesting that in the subduction regions we map, slabs enter the lower mantle. To validate our approach, we calculate and process synthetic seismograms for 3-D wavefield propagation through a model containing a slab-like heterogeneity. We show that, depending on the source-receiver geometry relative to the reflection plane, it is indeed possible to observe and back-trace out-of-plane signals.
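    The array measurement of slowness and backazimuth described above can be illustrated with a simple time-domain shift-and-stack grid search. The sketch below is a coarse, integer-sample version for a small array; real analyses typically use frequency-wavenumber or sliding-window beamforming, and the coordinate and sign conventions here are assumptions.

    ```python
    import numpy as np

    def beamform(traces, coords, fs, smax=0.5, ns=41):
        """Grid search over horizontal slowness for the best-stacking plane wave.
        traces: (nsta, npts); coords: (nsta, 2) east/north offsets in km;
        smax: max slowness in s/km. Returns (backazimuth_deg, slowness_s_per_km)."""
        nsta, npts = traces.shape
        s_grid = np.linspace(-smax, smax, ns)
        best = (-np.inf, 0.0, 0.0)
        for sx in s_grid:                        # east component of slowness
            for sy in s_grid:                    # north component of slowness
                beam = np.zeros(npts)
                for i in range(nsta):
                    delay = sx * coords[i, 0] + sy * coords[i, 1]   # seconds
                    shift = int(round(delay * fs))
                    beam += np.roll(traces[i], -shift)              # align and stack
                power = np.sum(beam**2)
                if power > best[0]:
                    best = (power, sx, sy)
        _, sx, sy = best
        baz = (np.degrees(np.arctan2(sx, sy)) + 180.0) % 360.0      # toward source
        return baz, np.hypot(sx, sy)
    ```

    An arrival whose estimated backazimuth deviates significantly from the source's great-circle backazimuth is a candidate out-of-plane reflection of the kind sought in this study.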

  17. The Seismic component of the IBERARRAY: Placing constraints on the Lithosphere and Mantle.

    NASA Astrophysics Data System (ADS)

    Carbonell, R.; Diaz, J.; Villaseñor, A.; Gallart, J.; Morales, J.; Pazos, A.; Cordoba, D.; Pulgar, J.; Garcia-Lobon, J.; Harnafi, M.

    2008-12-01

    TOPOIBERIA is a multidisciplinary large-scale research project which aims to study the links between deep and surface processes within the Iberian Peninsula. One of its main experimental components is the deployment of the IBERARRAY seismic network. This is a dense array (60x60 km) of new-generation dataloggers equipped with broadband seismometers, which will cover Iberia and northern Morocco in three successive deployments, each lasting about 18 months. The first leg, deployed since late 2007, covers the southern part of Iberia (35 stations) and northern Morocco (20 stations). Two data centers have been established, one at the CSIC Institute of Earth Sciences (CSIC-Barcelona) and a second at the Geological and Mining Institute (IGME-Madrid), and the data follow a standard, conventional flow from recovery to archival. The field teams collect the recorded hard disks in the field and send the data and metadata to a processing center, where the raw data are collected and stored and quality-control checks are performed. These include a systematic inspection of the experimental parameters (battery charge, thermal insulation, time adjustments, geophone leveling, etc.), visual verification of the seismic waveforms, and analysis of the noise level of each station using power spectral density (PSD) estimates. All this information is disseminated among the research teams involved in the project using a dedicated website, and the continuous seismic data are made accessible through FTP and CWQ servers. Some of the nodes of the theoretical network are covered by permanent stations of the national broadband network (IGN) or other networks operating in the region (IAG-UGR, ROA); data from those stations will also be integrated into the Iberarray database. The Iberarray network will provide a large database of both waveforms and catalogued events with an unprecedented resolution. Earthquake data at local, regional and teleseismic scales will be analyzed using different methodologies. The first result will be an increase in the accuracy of the location of regional seismicity and the determination of focal mechanisms. Special emphasis will be placed on seismic tomographic techniques using travel times and waveforms of P and S arrivals at different scales as well as surface waves, using dispersion measurements, and on studies dealing with background/environmental noise. In addition, receiver function analysis for seismic imaging of deep lithospheric features and splitting analysis of shear-wave arrivals will also be developed.
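    The station noise-level check via power spectral density mentioned in the quality-control step can be performed with a standard Welch estimate. The snippet below is a generic illustration, not the IBERARRAY processing code; the segment length and units are placeholders.

    ```python
    import numpy as np
    from scipy import signal

    def station_noise_psd(trace, fs, seg_seconds=600):
        """Welch power spectral density of a ground-velocity trace, in dB,
        as a simple quality-control check of a station's noise level."""
        nperseg = int(seg_seconds * fs)
        freqs, psd = signal.welch(trace, fs=fs, nperseg=nperseg,
                                  window="hann", detrend="linear")
        return freqs, 10.0 * np.log10(psd + 1e-30)   # dB rel. 1 (m/s)^2/Hz
    ```

    In routine operation, each station's PSD curve would be compared against reference noise levels to flag instruments with abnormally high noise or instrumental problems.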

  18. Observing the Microseism Source Regions from Space

    NASA Astrophysics Data System (ADS)

    Simard, M.; Kedar, S.; Rodriguez, E.; Webb, F. H.

    2005-12-01

    Correlation of the ambient seismic signal between seismic stations has recently emerged as a powerful technique for tomography of the Earth's crust, allowing continuous global monitoring of the crust to seismogenic depths without relying on the occurrence of earthquakes. The technique has the potential to resolve changes in the crust during periods of little or no earthquake activity. Since ambient seismic noise is predominantly generated by ocean wave-wave interactions known to originate in narrowly defined geographical source areas that vary with ocean swell state and season, it may be possible to derive physical constraints on the source characteristics by globally observing candidate source regions from space. At present, such observations have been confined to point measurements such as directional buoys and ocean-bottom seismometers. Using a technique formulated by Engen and Jonsen [1995], a 'field view' of the generating region can be obtained by deriving ocean directional spectra from Synthetic Aperture Radar (SAR) images through cross-correlation analysis of single-look SAR images. In November 2004, the Jet Propulsion Laboratory's (JPL) airborne SAR instrument collected data off the Alaska coast while a large storm with wave heights of ~8 m was pounding the coast. This was contemporaneous with the recording of strong microseismic activity by the Canadian National Seismograph Network (CNSN). The AirSAR collected data over a 100 km long, 10 km wide swath offshore, the region most likely to involve wave-wave interaction between the incoming swell and coast-reflected waves. JPL has implemented the cross-correlation spectral technique and applied it to the 2004 data set. We will present results of the analysis of the SAR data in conjunction with analysis of the CNSN broadband seismic data.

  19. Alsep data processing: How we processed Apollo Lunar Seismic Data

    NASA Technical Reports Server (NTRS)

    Latham, G. V.; Nakamura, Y.; Dorman, H. J.

    1979-01-01

    The Apollo lunar seismic station network gathered data continuously at a rate of 3 × 10^8 bits per day for nearly eight years until its termination in September 1977. The data were processed and analyzed using a PDP-15 minicomputer. On average, 1500 long-period seismic events were detected yearly. Automatic event detection and identification schemes proved unsuccessful because of occasional high noise levels and, above all, the risk of overlooking unusual natural events. The processing procedures finally settled on consist of first plotting all the data on a compressed time scale, visually picking events from the plots, transferring event data to separate sets of tapes, and performing detailed analyses using the latter. Many problems remain, especially for automatically processing extraterrestrial seismic signals.

  20. Seismic Interface Waves in Coastal Waters: A Review

    DTIC Science & Technology

    1980-11-15

    Being at the low-frequency end of classical sonar activity and at the high-frequency end of seismic research, the propagation of infrasonic energy ... water areas. Certainly this and other seismic detection methods will never replace the highly-developed sonar techniques but in coastal waters they ... for many sonar purposes [5, 85 to 90) shows that very simple bottom models may already be sufficient to make allowance for the influence of the sea

  1. Glutenite bodies sequence division of the upper Es4 in northern Minfeng zone of Dongying Sag, Bohai Bay Basin, China

    NASA Astrophysics Data System (ADS)

    Shao, Xupeng

    2017-04-01

    Glutenite bodies are widely developed in the northern Minfeng zone of the Dongying Sag, but their litho-electric relationship is not clear. In addition, because the conventional sequence stratigraphic research method has the drawback of involving too many subjective human factors, it has limited the deepening of regional sequence stratigraphic research. The wavelet transform technique based on logging data and the time-frequency analysis technique based on seismic data have the advantage of dividing sequence stratigraphy quantitatively compared with conventional methods. On the basis of the conventional sequence research method, this paper used the above techniques to divide the fourth-order sequences of the upper Es4 in the northern Minfeng zone of the Dongying Sag. The research shows that the wavelet transform technique based on logging data and the time-frequency analysis technique based on seismic data are essentially consistent, both dividing sequence stratigraphy quantitatively in the frequency domain. The wavelet transform technique has high resolution and is suitable for areas with wells; the seismic time-frequency analysis technique has wide applicability but low resolution, so the two techniques should be combined. The upper Es4 in the northern Minfeng zone of the Dongying Sag is a complete third-order sequence, which can be further subdivided into 5 fourth-order sequences with the depositional characteristics of a fining-upward sequence. Key words: Dongying Sag, northern Minfeng zone, wavelet transform technique, time-frequency analysis technique, the upper Es4, sequence stratigraphy
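    A minimal version of the wavelet-transform step on a well-log curve is sketched below: the curve is convolved with Ricker wavelets over a range of scales, and breaks in the large-scale coefficients can then be inspected as candidate sequence boundaries and correlated between wells. This is a generic continuous-wavelet-transform illustration, not the workflow used in the study; the toy gamma-ray-like curve and the scale range are invented for demonstration.

    ```python
    import numpy as np

    def ricker(length, a):
        """Ricker (Mexican-hat) wavelet with half-width a, sampled over `length` points."""
        t = np.arange(length) - (length - 1) / 2.0
        return (1.0 - (t / a) ** 2) * np.exp(-t**2 / (2.0 * a**2))

    def cwt_ricker(log_curve, scales):
        """Continuous wavelet transform of a log curve by convolution with Ricker
        wavelets; rows are scales, columns are depth samples."""
        x = log_curve - np.mean(log_curve)
        out = np.zeros((len(scales), len(x)))
        for i, a in enumerate(scales):
            w = ricker(min(10 * int(a), len(x)), a)
            out[i] = np.convolve(x, w / np.sqrt(a), mode="same")
        return out

    # Toy usage: a gamma-ray-like curve with alternating high/low intervals
    depth = np.arange(0, 200, 0.5)
    gr = 60 + 20 * np.sign(np.sin(2 * np.pi * depth / 50)) + np.random.randn(len(depth))
    scalogram = cwt_ricker(gr, scales=np.arange(2, 64, 2))
    ```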

  2. Surprises from the Magnetotelluric Component of the USArray in the Eastern United States: Perplexing Anticorrelations with Seismic Images and Puzzling Insights into Continental Dynamics

    NASA Astrophysics Data System (ADS)

    Murphy, B. S.; Egbert, G. D.

    2017-12-01

    In addition to its broadband seismic component, the USArray has also been collecting long-period magnetotelluric (MT) data across the continental United States. These data allow for an unprecedented three-dimensional view of the lithospheric geoelectric structure of the continent. As electrical conductivity and seismic properties provide complementary views of the Earth, synthesizing seismic and MT images can reduce ambiguity inherent in each technique and can thereby allow for tighter constraints on lithospheric properties. In the western US, comparison of MT and seismic results has clarified some issues (e.g., with regard to fluids and volatiles) and has raised some new questions, but for the most part the two techniques provide views that generally mesh well together. In sharp contrast, MT and seismic results in the eastern US lead to seemingly contradictory conclusions about lithosphere properties. The most striking example is the Piedmont region of the southeastern United States; here seismic images suggest a relatively thin, warm Phanerozoic lithosphere, while MT images show a large, deep, highly resistive body that seems to require thick, cold, even cratonic lithosphere. While these MT results shed intriguing new light onto the enigmatic post-Paleozoic history of eastern North America, the strong anticorrelation with seismic images remains a mystery. A similar anticorrelation appears to also exist in the Northern Appalachians, and preliminary views of the geoelectric signature of the well-studied Northern Appalachian Anomaly suggest that synthesizing the seismic and MT images of that region may be nontrivial. Clearly, a major challenge in continued analysis of USArray data is the reconciliation of seemingly contradictory seismic and MT images. The path forward in addressing this problem will require closer collaboration between seismologists and MT scientists and will likely require a careful reconsideration of how each group interprets the physical meaning of their respective anomalies.

  3. Characterizing the deformation of reservoirs using interferometry, gravity, and seismic analyses

    NASA Astrophysics Data System (ADS)

    Schiek, Cara Gina

    In this dissertation, I characterize how reservoirs deform using surface and subsurface techniques. The surface technique I employ is radar interferometry, also known as InSAR (Interferometric Synthetic Aperture Radar). The subsurface analyses I explore include gravity modeling and seismic techniques consisting of determining earthquake locations from a small temporary seismic network of six seismometers. These techniques were used in two different projects to determine how reservoirs deform in the subsurface and how this deformation relates to their remotely sensed surface deformation. The first project uses InSAR to determine land subsidence in the Mimbres basin near Deming, NM. The land subsidence measurements are visually compared to gravity models in order to determine the influence of near-surface faults on the subsidence and the physical properties of the aquifers in these basins. Elastic storage coefficients were calculated for the Mimbres basin to aid in determining the stress regime of the aquifers. I determine that the Mimbres basin is experiencing elastic deformation at differing compaction rates: the west side of the basin is deforming faster, at 17 mm/yr, while the east side is compacting at a rate of 11 mm/yr. The second project focuses on San Miguel volcano, El Salvador. Here, I integrate InSAR with earthquake locations using surface deformation forward modeling to investigate the explosive volcanism in this region. This investigation determined the areas around the volcano that are undergoing deformation and that could lead to volcanic hazards such as slope failure from a fractured volcano interior. I use the earthquake epicenters with field data to define the subsurface geometry of the deformation source, which I forward model to produce synthetic interferograms. Residuals between the synthetic and observed interferograms demonstrate that the observed deformation is a direct result of the seismic activity along the San Miguel Fracture Zone. Based on the large number of earthquakes concentrated in this region and the fracturing suggested by the earthquake location results, I conclude that the southwestern slope of San Miguel is the most susceptible to volcanic hazards such as landsliding and flank lava flows. Together these projects explore the dynamics of reservoir systems, both hydrologic and magmatic. They show the utility of geodetic remote sensing to constrain the relative importance of various complex subsurface processes, including faulting, fluid migration, and compaction.
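    The "forward model to produce synthetic interferograms" step can be illustrated with the simplest analytical deformation source, a Mogi point source projected into the radar line of sight. The dissertation constrains its source geometry with earthquake locations and field data, so the sketch below is only a generic stand-in; the look-vector convention and all parameter values are assumptions.

    ```python
    import numpy as np

    def mogi_los(x, y, x0, y0, depth, dvol, nu=0.25,
                 incidence_deg=34.0, heading_deg=-12.0):
        """Surface displacement from a Mogi point source, projected into a radar
        line of sight (LOS) to build a synthetic interferogram.
        x, y: grids of east/north coordinates (m); depth: source depth (m);
        dvol: volume change (m^3). Sensor geometry values are illustrative only."""
        dx, dy = x - x0, y - y0
        r = np.hypot(dx, dy)
        R3 = (r**2 + depth**2) ** 1.5
        c = (1.0 - nu) * dvol / np.pi
        ur = c * r / R3                       # radial (horizontal) displacement
        uz = c * depth / R3                   # vertical displacement (uplift if dvol>0)
        ue = ur * np.divide(dx, r, out=np.zeros_like(r), where=r > 0)
        un = ur * np.divide(dy, r, out=np.zeros_like(r), where=r > 0)
        # unit LOS vector; sign conventions depend on the sensor geometry
        inc, head = np.radians(incidence_deg), np.radians(heading_deg)
        look = np.array([-np.sin(inc) * np.cos(head),
                          np.sin(inc) * np.sin(head),
                          np.cos(inc)])
        return ue * look[0] + un * look[1] + uz * look[2]
    ```

    Residuals between such a synthetic LOS field and an observed interferogram are what drive the adjustment of the source geometry in the workflow described above.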

  4. Travel time seismic tomography on Reykjanes, SW Iceland

    NASA Astrophysics Data System (ADS)

    Jousset, Philippe; Ágústsson, Kristjan; Blanck, Hanna; Metz, Malte; Franke, Steven; Pàll Hersir, Gylfi; Bruhn, David; Flovenz, Ólafur; Friðleifsson, Guðmundur

    2017-04-01

    We present updated tomographic results obtained using seismic data recorded around geothermal reservoirs located both onshore on Reykjanes, SW Iceland, and offshore along the Reykjanes Ridge. We gathered records from a network of 234 seismic stations (including 24 ocean-bottom seismometers, OBS) deployed between April 2014 and August 2015. In order to determine the orientation of the OBS stations, we used Rayleigh-wave planar particle motions from large-magnitude earthquakes. This method proved suitable when tested on the on-land stations: the orientations determined using this method agreed with the orientations measured using a gyro-compass. We focus on 3D velocity images obtained by travel-time tomography using local earthquakes. The processing includes first-arrival picking of P and S phases using an automatic detection and picking technique based on the Akaike Information Criterion. We locate earthquakes using a non-linear localization technique, as a priori information for deriving a 1D velocity model. We then compute a 3D velocity model by joint inversion of each earthquake's location and of the lateral velocity anomalies with respect to the 1D model. Our model confirms previous models obtained in the area, with enhanced detail. In a second step, we perform an inversion of the Vp/Vs ratio. The results indicate a low Vp/Vs ratio anomaly at depth, suggesting the absence of a large magmatic body under Reykjanes, unlike results obtained at other geothermal fields such as Krafla and Hengill. We discuss the implications of these results in the light of the recent IDDP drilling in Reykjanes.
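    A common way to implement automatic picking based on the Akaike Information Criterion is the Maeda-style AIC function computed directly on a waveform window: the AIC minimum marks the most likely change point between noise and signal. The sketch below is a bare-bones single-window version for illustration; production pickers add pre-filtering, windowing around a preliminary trigger, and quality checks.

    ```python
    import numpy as np

    def aic_pick(trace):
        """Maeda-style AIC onset picker for a window containing a single arrival.
        AIC(k) = k*log(var(x[:k])) + (N-k-1)*log(var(x[k:])); the minimum marks
        the most likely noise-to-signal change point."""
        x = np.asarray(trace, dtype=float)
        n = len(x)
        aic = np.full(n, np.inf)
        for k in range(2, n - 2):
            v1, v2 = np.var(x[:k]), np.var(x[k:])
            if v1 > 0 and v2 > 0:
                aic[k] = k * np.log(v1) + (n - k - 1) * np.log(v2)
        return int(np.argmin(aic))

    # Toy usage: weak noise followed by a stronger arrival
    rng = np.random.default_rng(0)
    trace = np.concatenate([0.1 * rng.standard_normal(300),
                            rng.standard_normal(200)])
    print(aic_pick(trace))   # expected to be near sample 300
    ```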

  5. Seismic envelope-based detection and location of ground-coupled airwaves from volcanoes in Alaska

    USGS Publications Warehouse

    Fee, David; Haney, Matt; Matoza, Robin S.; Szuberla, Curt A.L.; Lyons, John; Waythomas, Christopher F.

    2016-01-01

    Volcanic explosions and other infrasonic sources frequently produce acoustic waves that are recorded by seismometers. Here we explore multiple techniques to detect, locate, and characterize ground‐coupled airwaves (GCA) on volcano seismic networks in Alaska. GCA waveforms are typically incoherent between stations, thus we use envelope‐based techniques in our analyses. For distant sources and planar waves, we use f‐k beamforming to estimate back azimuth and trace velocity parameters. For spherical waves originating within the network, we use two related time difference of arrival (TDOA) methods to detect and localize the source. We investigate a modified envelope function to enhance the signal‐to‐noise ratio and emphasize both high energies and energy contrasts within a spectrogram. We apply these methods to recent eruptions from Cleveland, Veniaminof, and Pavlof Volcanoes, Alaska. Array processing of GCA from Cleveland Volcano on 4 May 2013 produces robust detection and wave characterization. Our modified envelopes substantially improve the short‐term average/long‐term average ratios, enhancing explosion detection. We detect GCA within both the Veniaminof and Pavlof networks from the 2007 and 2013–2014 activity, indicating repeated volcanic explosions. Event clustering and forward modeling suggests that high‐resolution localization is possible for GCA on typical volcano seismic networks. These results indicate that GCA can be used to help detect, locate, characterize, and monitor volcanic eruptions, particularly in difficult‐to‐monitor regions. We have implemented these GCA detection algorithms into our operational volcano‐monitoring algorithms at the Alaska Volcano Observatory.
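    As a rough illustration of envelope-based detection, the sketch below computes a Hilbert-transform envelope and a trailing STA/LTA ratio, triggering where the ratio exceeds a threshold. The plain envelope here is a simplified stand-in for the modified envelope function described in the paper, and the window lengths and threshold are placeholders.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def sta_lta_on_envelope(trace, fs, sta_sec=1.0, lta_sec=30.0, threshold=3.0):
        """Detect emergent signals (e.g. ground-coupled airwaves) on one trace
        using an amplitude envelope followed by a classic STA/LTA ratio."""
        env = np.abs(hilbert(trace - np.mean(trace)))       # instantaneous amplitude

        def running_mean(x, n):
            c = np.cumsum(np.insert(x, 0, 0.0))
            out = np.zeros_like(x)
            out[n - 1:] = (c[n:] - c[:-n]) / n              # trailing mean, length n
            return out

        sta = running_mean(env, int(sta_sec * fs))
        lta = running_mean(env, int(lta_sec * fs))
        ratio = np.divide(sta, lta, out=np.zeros_like(sta), where=lta > 0)
        triggers = np.flatnonzero(ratio > threshold)
        return ratio, triggers
    ```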

  6. Waveform Retrieval and Phase Identification for Seismic Data from the CASS Experiment

    NASA Astrophysics Data System (ADS)

    Li, Zhiwei; You, Qingyu; Ni, Sidao; Hao, Tianyao; Wang, Hongti; Zhuang, Cantao

    2013-05-01

    The minimal damage to the deployment site and the high repeatability of the Controlled Accurate Seismic Source (CASS) show its potential for investigating seismic wave velocities in the Earth's crust. However, the difficulty of retrieving impulsive seismic waveforms from the CASS data and identifying the seismic phases substantially limits its wider application. For example, identification of seismic phases and accurate measurement of travel times are essential for resolving the spatial distribution of seismic velocities in the crust. It remains a challenging task to estimate accurate travel times of different seismic phases from CASS data, which feature extended wave trains, unlike waveforms from impulsive events such as earthquakes or explosive sources. In this study, we introduce a time-frequency analysis method to process the CASS data and try to retrieve the seismic waveforms and identify the major seismic phases traveling through the crust. We adopt the Wigner-Ville Distribution (WVD) approach, which has been used for signal detection and parameter estimation of linear frequency modulation (LFM) signals and offers the best time-frequency convergence capability. The Wigner-Hough transform (WHT) is applied to retrieve impulsive waveforms from multi-component LFM signals, which comprise seismic phases with different arrival times. We processed the seismic data of the 40-ton CASS from the field experiment around the Xinfengjiang reservoir with the WVD and WHT methods. The results demonstrate that these methods are effective for waveform retrieval and phase identification, especially for high-frequency seismic phases such as PmP and SmS with strong amplitudes at large epicentral distances of 80-120 km. Further studies are still needed to improve the accuracy of travel-time estimation, so as to further promote the applicability of the CASS for imaging the seismic velocity structure.

  7. Tectonics and seismicity of the southern Washington Cascade range

    USGS Publications Warehouse

    Stanley, W.D.; Johnson, S.Y.; Qamar, A.I.; Weaver, C.S.; Williams, J.M.

    1996-01-01

    Geophysical, geological, and seismicity data are combined to develop a transpressional strain model for the southern Washington Cascades region. We use this model to explain oblique fold and fault systems, transverse faults, and a linear seismic zone just west of Mt. Rainier known as the western Rainier zone. We also attempt to explain a concentration of earthquakes that connects the northwest-trending Mount St. Helens seismic zone to the north-trending western Rainier zone. Our tectonic model illustrates the pervasive effects of accretionary processes, combined with subsequent transpressive forces generated by oblique subduction, on Eocene to present crustal processes, such as seismicity and volcanism.

  8. Effects of salt-related mode conversions on subsalt prospecting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ogilvie, J.S.; Purnell, G.W.

    1996-03-01

    Mode conversion of waves during seismic reflection surveys has generally been considered a minor phenomenon that could be neglected in data processing and interpretation. However, in subsalt prospecting, the contrast in material properties at the salt/sediment interface is often great enough that significant P-to-S and/or S-to-P conversion occurs. The resulting converted waves can be both a help and a hindrance for subsalt prospecting. A case history from the Mississippi Canyon area of the Gulf of Mexico demonstrates strong converted-wave reflections from the base of salt that complicate the evaluation of a subsalt prospect using 3-D seismic data. Before and after stack, the converted-wave reflections are evident in 2-D and 3-D surveys across the prospect. Ray-tracing synthetic common midpoint (CMP) gathers provides some useful insights about the occurrence of these waves, but elastic wave-equation modeling is even more useful. While the latter is more time-consuming, even in 2-D, it also provides a more realistic simulated seismic survey across the prospect, which helps to reveal how some converted waves survive the processes of CMP stack and migration, and thereby present possible pitfalls to an unwary interpreter. The insights gained from the synthetic data suggest some simple techniques that can assist an interpreter in the 3-D interpretation of subsalt events.

  9. A new algorithm to detect earthquakes outside the seismic network: preliminary results

    NASA Astrophysics Data System (ADS)

    Giudicepietro, Flora; Esposito, Antonietta Maria; Ricciolino, Patrizia

    2017-04-01

    In this work we present a new technique for detecting earthquakes outside the seismic network, which are often the cause of failures in automatic analysis systems. Our goal is to develop a robust method that provides the discrimination result as quickly as possible. We discriminate local earthquakes from regional earthquakes, both recorded at SGG station, equipped with short-period sensors and operated by Osservatorio Vesuviano (INGV) in the Southern Apennines (Italy). The technique uses a Multi-Layer Perceptron (MLP) neural network with an architecture composed of an input layer, a hidden layer and a single-node output layer. We pre-processed the data using the Linear Predictive Coding (LPC) technique to extract the spectral features of the signals in a compact form. We performed several experiments by shortening the signal window length; in particular, we used windows of 4, 2 and 1 seconds containing the onset of the local and regional earthquakes. We used a dataset of 103 local earthquakes and 79 regional earthquakes, most of which occurred in Greece, Albania and Crete. We split the dataset into a training set, for network training, and a testing set to evaluate the network's discrimination capability. In order to assess the network's stability, we repeated this procedure six times, randomly changing the composition of the training and testing sets and the initial weights of the net. We estimated the performance of this method by calculating the average of the correct-detection percentages obtained for each of the six permutations. The average performances are 99.02%, 98.04% and 98.53%, for the experiments carried out on 4-, 2- and 1-second signal windows, respectively. The results show that our method is able to recognize earthquakes outside the seismic network using only the first second of the seismic records, with a suitable percentage of correct detections. Therefore, this algorithm can profitably be used to make automatic earthquake analyses more robust and reliable. Finally, with appropriate tuning, it can be integrated into multi-parametric systems for monitoring areas of high natural risk.
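    The LPC-plus-MLP pipeline described above can be sketched as follows: LPC coefficients are estimated from each onset window via the autocorrelation (Toeplitz) normal equations and then fed to a single-hidden-layer perceptron. This is a hypothetical reconstruction using SciPy and scikit-learn, not the authors' code; the prediction order, hidden-layer size and train/test split are illustrative.

    ```python
    import numpy as np
    from scipy.linalg import solve_toeplitz
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split

    def lpc_features(window, order=10):
        """LPC coefficients (autocorrelation method) as a compact spectral feature."""
        x = window - np.mean(window)
        r = np.correlate(x, x, mode="full")[len(x) - 1:]    # autocorrelation lags >= 0
        r = r / (r[0] + 1e-12)
        return solve_toeplitz(r[:order], r[1:order + 1])    # prediction coefficients

    def train_discriminator(windows, labels, order=10):
        """Fit an MLP with one hidden layer to separate local from regional onsets."""
        X = np.array([lpc_features(w, order) for w in windows])
        Xtr, Xte, ytr, yte = train_test_split(X, labels, test_size=0.3, random_state=0)
        net = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
        net.fit(Xtr, ytr)
        return net, net.score(Xte, yte)       # classifier and test accuracy
    ```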

  10. Influence of Spatial Variation in Ground Motion Peak Acceleration on Local Site Effects Estimation at Bucovina Seismic Array (BURAR) Romania

    NASA Astrophysics Data System (ADS)

    Ghica, D. V.; Radulian, M.; Popa, M.; Grecu, B.

    2006-05-01

    Array processing techniques require high signal coherency across the seismic site; therefore, the local crustal velocities below the stations, signal amplitude differences between array elements, and local noise conditions, which result in local site effects, will affect the calculation of phase arrival times, propagation velocities and ground motion amplitudes. In general, array techniques assume a homogeneous structure for all sites, and only a simple relief correction is taken into account in the data analysis. To increase the accuracy of the results, individual element corrections must be applied, based on the bias factors systematically observed. This study aims at identifying the anomalous amplitude variations recorded at the Bucovina Seismic Array (BURAR) and at explaining their influence on site-effect estimation. Maximum amplitudes of teleseismic and regional phases in four narrow frequency bands (0.25-0.5 Hz; 0.5-1 Hz; 1-2 Hz; 1.5-3 Hz) are measured. The spatial distribution of the ground-motion peak acceleration across the BURAR site is plotted for each band; a different behavior was observed at frequencies below 2 Hz. The most striking feature is the largest amplitude exhibited by BUR07 across the whole array at high frequencies (an amplification factor of about two). This can be explained by the different geology at the BUR07 site (mica schist outcrops) compared with the rest of the elements (green schist outcrops). At the lowest frequencies (0.25-0.5 Hz), BUR09 peak amplitudes dominate the other sites. Considering BUR07 as the reference site, peak acceleration ratios were investigated. The largest scattering of these ratios appears at the highest frequencies (1.5-3 Hz), where the proportion of values above unity is about 90%. No azimuth or distance dependence was found for these effects, suggesting the absence of dipping layer structures, although an increase of the ratio values is noticed for epicentral distances between 8000 and 10000 km at frequencies above 1 Hz. The results of this study are essential for further developing the calibration technique for seismic monitoring with the BURAR array, in order to improve the detection and single-array location capabilities of the system.

  11. Mobile seismic exploration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dräbenstedt, A., E-mail: a.draebenstedt@polytec.de, E-mail: rembe@iei.tu-clausthal.de, E-mail: ulrich.polom@liag-hannover.de; Seyfried, V.; Cao, X.

    2016-06-28

    Laser Doppler Vibrometry (LDV) is an established technique to measure vibrations in technical systems with picometer vibration-amplitude resolution. Especially good sensitivity and resolution can be achieved at an infrared wavelength of 1550 nm, and high-resolution vibration measurements are possible over more than 100 m distance. This advancement of the LDV technique enables new applications. The detection of seismic waves is an application which has not been investigated so far, because seismic waves outside laboratory scales are usually analyzed at low frequencies between approximately 1 Hz and 250 Hz and require velocity resolutions below 1 nm/s/√Hz. Thermal displacements and air turbulence have a critical influence on LDV measurements in this low-frequency range, leading to noise levels of several 100 nm/√Hz. Commonly, seismic waves are measured with highly sensitive inertial sensors (geophones or Micro-Electro-Mechanical Sensors (MEMS)). Approaching a laser geophone based on the LDV technique is the topic of this paper. We have assembled an actively vibration-isolated optical table in a minivan which provides a hole in its underbody. The laser beam of an infrared LDV assembled on the optical table impinges on the ground below the car through the hole. A reference geophone detected remaining vibrations on the table. We present results from the first successful experimental demonstration of contactless detection of seismic waves from a movable vehicle with an LDV serving as a laser geophone.

  12. A synthetic seismicity model for the Middle America Trench

    NASA Technical Reports Server (NTRS)

    Ward, Steven N.

    1991-01-01

    A novel iterative technique, based on the concept of fault segmentation and computed using 2D static dislocation theory, for building models of seismicity and fault interaction which are physically acceptable and geometrically and kinematically correct, is presented. The technique is applied in two steps to seismicity observed at the Middle America Trench. The first constructs generic models which randomly draw segment strengths and lengths from a 2D probability distribution. The second constructs predictive models in which segment lengths and strengths are adjusted to mimic the actual geography and timing of large historical earthquakes. Both types of models reproduce the statistics of seismicity over five units of magnitude and duplicate other aspects including foreshock and aftershock sequences, migration of foci, and the capacity to produce both characteristic and noncharacteristic earthquakes. Over a period of about 150 yr the complex interaction of fault segments and the nonlinear failure conditions conspire to transform an apparently deterministic model into a chaotic one.

  13. On the use of a laser ablation as a laboratory seismic source

    NASA Astrophysics Data System (ADS)

    Shen, Chengyi; Brito, Daniel; Diaz, Julien; Zhang, Deyuan; Poydenot, Valier; Bordes, Clarisse; Garambois, Stéphane

    2017-04-01

    Mimicking near-surface seismic imaging in well-controlled laboratory conditions is potentially a powerful tool to study large-scale wave propagation in geological media by means of upscaling. Laboratory measurements are indeed particularly suited for tests of theoretical modelling and comparisons with numerical approaches. We have developed an automated Laser Doppler Vibrometer (LDV) platform, which is able to detect and register broadband nano-scale displacements on the surface of various materials. This laboratory equipment has already been validated in experiments where piezoelectric transducers were used as seismic sources. We are currently exploring a new seismic source in our experiments, a laser ablation, in order to compensate for some drawbacks encountered with piezoelectric sources. The laser ablation source has been considered an interesting ultrasound wave generator since the 1960s, and was believed to have numerous potential applications such as Non-Destructive Testing (NDT) and the measurement of velocities and attenuation in solid samples. We aim at adapting and developing this technique for geophysical experimental investigations, in order to produce and explore complete micro-seismic data sets in the laboratory. We will first present the laser characteristics, including its mechanism, stability and reproducibility, and will evaluate in particular the directivity patterns of such a seismic source. We have started by applying the laser ablation source to the surfaces of multi-scale homogeneous aluminum samples and are now testing it on heterogeneous and fractured limestone cores. Some other results of data processing will also be shown, especially the 2D-slice Vp and Vs tomographic images obtained in limestone samples. Apart from the experimental records, numerical simulations will be carried out for both the laser source modelling and the wave propagation in different media. First attempts will be made to compare the experimental data quantitatively with simulations. Meanwhile, CT-scan X-ray images of these limestone cores will be used to check the relative pertinence of the velocity tomography images produced by this newly developed laser ablation seismic source.

  14. Reflection seismic imaging in the volcanic area of the geothermal field Wayang Windu, Indonesia

    NASA Astrophysics Data System (ADS)

    Polom, Ulrich; Wiyono, Wiyono; Pramono, Bambang; Krawczyk, CharLotte M.

    2014-05-01

    Reflection seismic exploration in volcanic areas is still a scientific challenge and requires major efforts to develop imaging workflows capable of economic utilization, e.g., for geothermal exploration. The SESaR (Seismic Exploration and Safety Risk study for decentral geothermal plants in Indonesia) project therefore tackles still unresolved issues concerning wave propagation and energy absorption in areas covered by pyroclastic sediments, using both active P-wave and S-wave seismics. Site-specific exploration procedures were tested in different tectonic and lithological regimes to compare imaging conditions. Based on the results of a small-scale active seismic pre-site survey in the area of the Wayang Windu geothermal field in November 2012, an additional medium-scale active seismic experiment using P-waves was carried out in August 2013. The latter experiment was designed to investigate local changes of the seismic subsurface response, to expand the knowledge about the capabilities of the vibroseis method for seismic surveying in regions covered by pyroclastic material, and to achieve greater depth penetration. Thus, for the first time in the Wayang Windu geothermal area, a powerful, hydraulically driven seismic mini-vibrator of 27 kN peak force (LIAG's mini-vibrator MHV2.7) was used as the seismic source instead of the weaker hammer blow applied in former field surveys. Aiming at acquiring parameter test and production data southeast of the Wayang Windu geothermal power plant, a 48-channel GEODE recording instrument of the Badan Geologi was used in a high-resolution configuration, with receiver group intervals of 5 m and source intervals of 10 m. Thereby, the LIAG field crew, Star Energy, GFZ Potsdam, and ITB Bandung acquired a nearly 600 m long profile. In general, we observe the successful applicability of the vibroseis method in such a difficult seismic acquisition environment. Taking into account the local conditions at Wayang Windu, the method is superior to the common seismic explosive source techniques, with respect to production rate as well as resolution and data quality. Source signal frequencies of 20-80 Hz are most efficient for the attempted depth penetration, even though they were influenced by the dry subsurface conditions during the experiment. Depth penetration ranges between 0.5 and 1 km. Based on these new experimental data, processing workflows can be tested for the first time for adapted imaging strategies. This will not only allow a focus on greater exploration depths covering the geothermal reservoir at the Wayang Windu power plant site itself, but also opens the possibility of transferring the lessons learned to other sites.

  15. Infrasonic and seismic signals from earthquakes and explosions observed with Plostina seismo-acoustic array

    NASA Astrophysics Data System (ADS)

    Ghica, D.; Ionescu, C.

    2012-04-01

    The Plostina seismo-acoustic array has recently been deployed by the National Institute for Earth Physics in the central part of Romania, near the Vrancea epicentral area. The array has a 2.5 km aperture and consists of 7 seismic sites (PLOR) and 7 collocated infrasound instruments (IPLOR). The array is being used to assess the importance of collocated seismic and acoustic sensors for the purposes of (1) seismic monitoring of local and regional events, and (2) acoustic measurement, consisting of detection of infrasound events (explosions, mine and quarry blasts, earthquakes, aircraft, etc.). This paper focuses on the characterization of infrasonic and seismic signals from earthquakes and explosions (accidental and mining type). Two Vrancea earthquakes with magnitude above 5.0 were selected for this study: one occurred on 1 May 2011 (MD = 5.3, h = 146 km), and the other on 4 October 2011 (MD = 5.2, h = 142 km). The infrasonic signals from the earthquakes have the appearance of the vertical component of the seismic signals. Because the mechanism of infrasonic wave formation is the coupling of seismic waves with the atmosphere, trace velocity values for such signals are compatible with the characteristics of the various seismic phases observed with the PLOR array. The study also evaluates and characterizes the infrasound and seismic data recorded from the explosion caused by the military accident at the Evangelos Florakis Naval Base in Cyprus on 11 July 2011. Additionally, seismo-acoustic signals presumed to be related to strong mine and quarry blasts were investigated; ground truth from mine observations validates this interpretation. The combined seismo-acoustic analysis uses two types of detectors for signal identification: one is the automatic detector DFX-PMCC, applied for infrasound detection and characterization, while the other, used for the seismic data, is based on array processing techniques (beamforming and frequency-wavenumber analysis). Spectrograms of the recorded infrasonic and seismic data were examined, showing that an earthquake produces acoustic signals with high energy in the 1 to 5 Hz frequency range, while for the explosion this range lies below 0.6 Hz. Using the combined analysis of seismic and acoustic data, the Plostina array can greatly enhance event detection and localization in the region. The analysis can also be particularly important in identifying sources of industrial explosions, and therefore in monitoring the hazard created both by earthquakes and by anthropogenic sources of pollution (chemical factories, nuclear and power plants, refineries, mines).

  16. Automated seismic detection of landslides at regional scales: a Random Forest based detection algorithm

    NASA Astrophysics Data System (ADS)

    Hibert, C.; Michéa, D.; Provost, F.; Malet, J. P.; Geertsema, M.

    2017-12-01

    Detection of landslide occurrences and measurement of their dynamic properties during run-out is a high research priority, but a logistical and technical challenge. Seismology has started to help in several important ways. Taking advantage of the densification of global, regional and local networks of broadband seismic stations, recent advances now permit the seismic detection and location of landslides in near-real time. This seismic detection could potentially greatly increase the spatio-temporal resolution at which we study landslide triggering, which is critical to better understand the influence of external forcings such as rainfall and earthquakes. However, automatically detecting seismic signals generated by landslides still represents a challenge, especially for events with small mass. The low signal-to-noise ratio classically observed for landslide-generated seismic signals and the difficulty of discriminating these signals from those generated by regional earthquakes or by anthropogenic and natural noise are some of the obstacles that have to be circumvented. We present a new method for automatically constructing instrumental landslide catalogues from continuous seismic data. We developed a robust and versatile solution, which can be implemented in any context where seismic detection of landslides or other mass movements is relevant. The method is based on a spectral detection of the seismic signals and the identification of the sources with a Random Forest machine-learning algorithm. The spectral detection allows signals with low signal-to-noise ratio to be detected, while the Random Forest algorithm achieves a high rate of positive identification of the seismic signals generated by landslides and other seismic sources. The processing chain is implemented on a High Performance Computing centre, which permits years of continuous seismic data to be explored rapidly. We present here the preliminary results of the application of this processing chain to years of continuous seismic records from the Alaskan permanent seismic network and the Hi-Climb trans-Himalayan seismic network. The processing chain we developed also opens the possibility of near-real-time seismic detection of landslides, in association with remote-sensing automated detection from Sentinel-2 images, for example.
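    The source-identification stage can be illustrated with a Random Forest trained on simple features extracted from detected event windows. The sketch below is a generic scikit-learn example under assumed features (band energies, duration, peak-to-mean envelope ratio); it is not the feature set or detector used in the study.

    ```python
    import numpy as np
    from scipy import signal
    from sklearn.ensemble import RandomForestClassifier

    def spectral_features(event, fs, bands=((1, 3), (3, 6), (6, 10), (10, 20))):
        """Simple features for one detected event window: log energy in a few
        frequency bands plus duration and peak-to-mean envelope ratio."""
        freqs, psd = signal.welch(event, fs=fs, nperseg=min(len(event), 512))
        feats = [np.log10(np.mean(psd[(freqs >= lo) & (freqs < hi)]) + 1e-20)
                 for lo, hi in bands]
        env = np.abs(signal.hilbert(event))
        feats += [len(event) / fs, np.max(env) / (np.mean(env) + 1e-12)]
        return np.array(feats)

    def train_classifier(event_windows, labels, fs):
        """Fit a Random Forest to separate landslide signals from earthquakes
        and noise, given labelled event windows."""
        X = np.array([spectral_features(w, fs) for w in event_windows])
        clf = RandomForestClassifier(n_estimators=300, random_state=0)
        clf.fit(X, labels)
        return clf
    ```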

  17. Investigating source processes of isotropic events

    NASA Astrophysics Data System (ADS)

    Chiang, Andrea

    This dissertation demonstrates the utility of the complete waveform regional moment tensor inversion for nuclear event discrimination. I explore the source processes and associated uncertainties for explosions and earthquakes under the effects of limited station coverage, compound seismic sources, assumptions in velocity models and the corresponding Green's functions, and the effects of shallow source depth and free-surface conditions. The motivation to develop better techniques to obtain reliable source mechanisms and assess uncertainties is not limited to nuclear monitoring; such techniques also provide quantitative information about the characteristics of seismic hazards, local and regional tectonics, and in-situ stress fields of the region. This dissertation begins with the analysis of three sparsely recorded events: the 14 September 1988 US-Soviet Joint Verification Experiment (JVE) nuclear test at the Semipalatinsk test site in Eastern Kazakhstan, and two nuclear explosions at the Chinese Lop Nor test site. We utilize a regional distance seismic waveform method fitting long-period, complete, three-component waveforms jointly with first-motion observations from regional stations and teleseismic arrays. The combination of long-period waveforms and first-motion observations provides unique discrimination of these sparsely recorded events in the context of the Hudson et al. (1989) source-type diagram. We examine the effects of the free surface on the moment tensor via synthetic testing, and apply the moment tensor-based discrimination method to well-recorded chemical explosions. These shallow chemical explosions represent a rather severe source-station geometry in terms of the vanishing traction issues. We show that the combined waveform and first-motion method enables the unique discrimination of these events, even though the data include unmodeled single force components resulting from the collapse and blowout of the quarry face immediately following the initial explosion. In contrast, recovering the announced explosive yield using seismic moment estimates from moment tensor inversion remains challenging, but we can begin to put error bounds on our moment estimates using the NSS technique. The estimation of seismic source parameters is dependent upon having a well-calibrated velocity model to compute the Green's functions for the inverse problem. Ideally, seismic velocity models are calibrated through broadband waveform modeling; however, in regions of low seismicity, velocity models derived from body- or surface-wave tomography may be employed. Whether a velocity model is 1D or 3D, or based on broadband seismic waveform modeling or the various tomographic techniques, the uncertainty in the velocity model can be the greatest source of error in moment tensor inversion. These errors have not been fully investigated for the nuclear discrimination problem. To study the effects of unmodeled structures on the moment tensor inversion, we set up a synthetic experiment where we produce synthetic seismograms for a 3D model (Moschetti et al., 2010) and invert these data using Green's functions computed with a 1D velocity model (Song et al., 1996) to evaluate the recoverability of input solutions, paying particular attention to biases in the isotropic component. The synthetic experiment results indicate that the 1D model assumption is valid for moment tensor inversions at periods as short as 10 seconds for the 1D western U.S. model (Song et al., 1996).
The correct earthquake mechanisms and source depth are recovered with statistically insignificant isotropic components as determined by the F-test. Shallow explosions are biased by the theoretical ISO-CLVD tradeoff but the tectonic release component remains low, and the tradeoff can be eliminated with constraints from P wave first motion. Path-calibration to the 1D model can reduce non-double-couple components in earthquakes, non-isotropic components in explosions and composite sources and improve the fit to the data. When we apply the 3D model to real data, at long periods (20-50 seconds), we see good agreement in the solutions between the 1D and 3D models and slight improvement in waveform fits when using the 3D velocity model Green's functions. (Abstract shortened by ProQuest.).
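
    The linearized moment tensor problem discussed above can be written as d = Gm, with m the six independent moment tensor elements and G built from Green's function derivatives. The sketch below, with entirely synthetic G and data, shows the generic least-squares solution and a simple isotropic/deviatoric split; it illustrates only the linear inverse step, not the full waveform-plus-first-motion method of the dissertation.

```python
import numpy as np

# Hypothetical linearized problem: data = G @ m, where m holds the six
# independent moment tensor elements and G contains Green's function terms.
rng = np.random.default_rng(2)
n_data = 300                                   # waveform samples over stations/components
G = rng.standard_normal((n_data, 6))
m_true = np.array([1.0, 1.0, 1.0, 0.1, -0.05, 0.02])   # explosion-like (large isotropic part)
d = G @ m_true + 0.05 * rng.standard_normal(n_data)

# Least-squares solution and a simple isotropic/deviatoric decomposition
m_est, *_ = np.linalg.lstsq(G, d, rcond=None)
M = np.array([[m_est[0], m_est[3], m_est[4]],
              [m_est[3], m_est[1], m_est[5]],
              [m_est[4], m_est[5], m_est[2]]])
iso = np.trace(M) / 3.0                        # isotropic part
deviatoric = M - iso * np.eye(3)               # deviatoric remainder
```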

  18. Instantaneous Frequency Attribute Comparison

    NASA Astrophysics Data System (ADS)

    Yedlin, M. J.; Margrave, G. F.; Ben Horin, Y.

    2013-12-01

    The instantaneous frequency attribute provides a different means of interpretation for all types of seismic data. It first came to the fore in exploration seismology in the classic paper of Taner et al (1979), entitled "Complex seismic trace analysis". Subsequently, a vast literature has accumulated on the subject, excellently reviewed by Barnes (1992). In this research we compare two different methods of computing the instantaneous frequency. The first method is based on the original idea of Taner et al (1979) and utilizes the derivative of the instantaneous phase of the analytic signal. The second method is based on the computation of the power centroid of the time-frequency spectrum, obtained using either the Gabor Transform as computed by Margrave et al (2011) or the Stockwell Transform as described by Stockwell et al (1996). We apply both methods to exploration seismic data and to the DPRK events recorded in 2006 and 2013. In applying the classical analytic signal technique, which is known to be unstable due to the division by the square of the envelope, we incorporate the stabilization and smoothing method proposed in the two papers of Fomel (2007). This method employs linear inverse theory regularization coupled with the application of an appropriate data smoother. The centroid method is straightforward to apply and is based on the very complete theoretical analysis provided in elegant fashion by Cohen (1995). While the results of the two methods are very similar, noticeable differences are seen at the data edges. This is most likely due to the edge effects of the smoothing operator in the Fomel method, which is more computationally intensive when an optimal search of the regularization parameter is done. An advantage of the centroid method is the intrinsic smoothing of the data, which is inherent in the sliding window used in all Short-Time Fourier Transform methods. The Fomel technique has a larger CPU run-time, resulting from the necessary matrix inversion. Barnes, Arthur E. "The calculation of instantaneous frequency and instantaneous bandwidth.", Geophysics, 57.11 (1992): 1520-1524. Fomel, Sergey. "Local seismic attributes.", Geophysics, 72.3 (2007): A29-A33. Fomel, Sergey. "Shaping regularization in geophysical-estimation problems.", Geophysics, 72.2 (2007): R29-R36. Stockwell, Robert Glenn, Lalu Mansinha, and R. P. Lowe. "Localization of the complex spectrum: the S transform." Signal Processing, IEEE Transactions on, 44.4 (1996): 998-1001. Taner, M. Turhan, Fulton Koehler, and R. E. Sheriff. "Complex seismic trace analysis." Geophysics, 44.6 (1979): 1041-1063. Cohen, Leon. "Time frequency analysis theory and applications." USA: Prentice Hall, (1995). Margrave, Gary F., Michael P. Lamoureux, and David C. Henley. "Gabor deconvolution: Estimating reflectivity by nonstationary deconvolution of seismic data." Geophysics, 76.3 (2011): W15-W30.
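
    The two approaches compared in the abstract can be sketched in a few lines: the analytic-signal estimate differentiates the unwrapped instantaneous phase, while the centroid estimate takes the power-weighted mean frequency of a time-frequency spectrum. The example below uses a synthetic chirp and a plain STFT spectrogram as a stand-in for the Gabor or Stockwell transforms, and omits the Fomel-style regularization.

```python
import numpy as np
from scipy.signal import hilbert, spectrogram

fs = 250.0
t = np.arange(0, 4, 1 / fs)
trace = np.sin(2 * np.pi * (10 + 5 * t) * t)       # synthetic chirp as a stand-in trace

# Method 1: derivative of the instantaneous phase of the analytic signal
analytic = hilbert(trace)
phase = np.unwrap(np.angle(analytic))
f_inst = np.gradient(phase) * fs / (2 * np.pi)      # instantaneous frequency (Hz)

# Method 2: power centroid of a time-frequency spectrum
f, tt, Sxx = spectrogram(trace, fs=fs, nperseg=128, noverlap=120)
f_centroid = np.sum(f[:, None] * Sxx, axis=0) / np.sum(Sxx, axis=0)
```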

  19. Processing Approaches for DAS-Enabled Continuous Seismic Monitoring

    NASA Astrophysics Data System (ADS)

    Dou, S.; Wood, T.; Freifeld, B. M.; Robertson, M.; McDonald, S.; Pevzner, R.; Lindsey, N.; Gelvin, A.; Saari, S.; Morales, A.; Ekblaw, I.; Wagner, A. M.; Ulrich, C.; Daley, T. M.; Ajo Franklin, J. B.

    2017-12-01

    Distributed Acoustic Sensing (DAS) is creating a "field as laboratory" capability for seismic monitoring of subsurface changes. By providing unprecedented spatial and temporal sampling at a relatively low cost, DAS enables field-scale seismic monitoring to have durations and temporal resolutions that are comparable to those of laboratory experiments. Here we report on seismic processing approaches developed during data analyses of three case studies, all using DAS-enabled seismic monitoring with applications ranging from shallow permafrost to deep reservoirs: (1) 10-hour downhole monitoring of cement curing at Otway, Australia; (2) 2-month surface monitoring of controlled permafrost thaw at Fairbanks, Alaska; (3) multi-month downhole and surface monitoring of carbon sequestration at Decatur, Illinois. We emphasize the data management and processing components relevant to DAS-based seismic monitoring, which include scalable approaches to data management, pre-processing, denoising, filtering, and wavefield decomposition. DAS has dramatically increased the data volume to the extent that terabyte-per-day data loads are now typical, straining conventional approaches to data storage and processing. To achieve more efficient use of disk space and network bandwidth, we explore improved file structures and data compression schemes. Because the noise floor of DAS measurements is higher than that of conventional sensors, optimal processing workflows involving advanced denoising, deconvolution (of the source signatures), and stacking approaches are being established to maximize the signal content of DAS data. The resulting workflow of data management and processing could accelerate the broader adoption of DAS for continuous monitoring of critical processes.
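
    A minimal sketch of one pre-processing pass over a single data block (bandpass filtering, stacking of adjacent channels to suppress incoherent noise, and decimation to reduce data volume); the block size, sampling rate, and parameter values are hypothetical, and the actual workflows described above involve considerably more elaborate denoising and deconvolution.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, decimate

def preprocess_das_block(block, fs, fmin=1.0, fmax=50.0, n_stack=4, q=2):
    """One pre-processing pass over a DAS block of shape (n_channels, n_samples):
    bandpass, average groups of adjacent channels, then decimate in time."""
    sos = butter(4, [fmin, fmax], btype="bandpass", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, block, axis=1)
    n_ch = (filtered.shape[0] // n_stack) * n_stack
    stacked = filtered[:n_ch].reshape(-1, n_stack, filtered.shape[1]).mean(axis=1)
    return decimate(stacked, q, axis=1, zero_phase=True)

# Hypothetical block: 400 channels, 10 s at 500 Hz
block = np.random.default_rng(3).standard_normal((400, 5000))
small = preprocess_das_block(block, fs=500.0)     # reduced-volume output block
```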

  20. Work flow of signal processing data of ground penetrating radar case of rigid pavement measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Handayani, Gunawan

    The signal processing of Ground Penetrating Radar (GPR) data requires a certain workflow to obtain good results. Although GPR data look similar to seismic reflection data, they contain particular signatures that seismic reflection data do not have, related to the coupling between the antennae and the ground surface. Because of this, GPR data should be treated differently from the seismic reflection processing workflow, even though most of the processing steps still follow it (filtering, predictive deconvolution, etc.). This paper presents a workflow for processing GPR data from rigid pavement measurements. The processing starts from the raw data, applies a de-Wow process and DC removal, and continues with the standard steps for suppressing noise, i.e., filtering. Some radargram features particular to rigid pavement, along with pile foundations, are presented.
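
    The de-Wow and DC-removal steps mentioned above are simple trace-by-trace operations; a minimal sketch, assuming a hypothetical radargram array and a running-mean implementation of de-Wow, is:

```python
import numpy as np

def remove_dc(traces):
    """Subtract the per-trace mean (DC component); traces: (n_traces, n_samples)."""
    return traces - traces.mean(axis=1, keepdims=True)

def dewow(traces, window_samples):
    """Remove the low-frequency 'wow' by subtracting a running mean
    along each trace (one common, simple implementation)."""
    kernel = np.ones(window_samples) / window_samples
    background = np.apply_along_axis(
        lambda tr: np.convolve(tr, kernel, mode="same"), 1, traces)
    return traces - background

# Hypothetical radargram: 200 traces, 512 samples each
radargram = np.random.default_rng(4).standard_normal((200, 512))
clean = dewow(remove_dc(radargram), window_samples=31)
```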

  1. Processing and review interface for strong motion data (PRISM) software, version 1.0.0—Methodology and automated processing

    USGS Publications Warehouse

    Jones, Jeanne; Kalkan, Erol; Stephens, Christopher

    2017-02-23

    A continually increasing number of high-quality digital strong-motion records from stations of the National Strong-Motion Project (NSMP) of the U.S. Geological Survey (USGS), as well as data from regional seismic networks within the United States, call for automated processing of strong-motion records with human review limited to selected significant or flagged records. The NSMP has developed the Processing and Review Interface for Strong Motion data (PRISM) software to meet this need. In combination with the Advanced National Seismic System Quake Monitoring System (AQMS), PRISM automates the processing of strong-motion records. When used without AQMS, PRISM provides batch-processing capabilities. The PRISM version 1.0.0 is platform independent (coded in Java), open source, and does not depend on any closed-source or proprietary software. The software consists of two major components: a record processing engine and a review tool that has a graphical user interface (GUI) to manually review, edit, and process records. To facilitate use by non-NSMP earthquake engineers and scientists, PRISM (both its processing engine and review tool) is easy to install and run as a stand-alone system on common operating systems such as Linux, OS X, and Windows. PRISM was designed to be flexible and extensible in order to accommodate new processing techniques. This report provides a thorough description and examples of the record processing features supported by PRISM. All the computing features of PRISM have been thoroughly tested.

  2. Methods and benefits of experimental seismic evaluation of nuclear power plants. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1979-07-01

    This study reviews experimental techniques, instrumentation requirements, safety considerations, and benefits of performing vibration tests on nuclear power plant containments and internal components. The emphasis is on testing to improve seismic structural models. Techniques for identification of resonant frequencies, damping, and mode shapes are discussed. The benefits of testing with regard to increased damping and more accurate computer models are outlined. A test plan, schedule and budget are presented for a typical PWR nuclear power plant.

  3. Subband Coding Methods for Seismic Data Compression

    NASA Technical Reports Server (NTRS)

    Kiely, A.; Pollara, F.

    1995-01-01

    This paper presents a study of seismic data compression techniques and a compression algorithm based on subband coding. The compression technique described could be used as a progressive transmission system, where successive refinements of the data can be requested by the user. This allows seismologists to first examine a coarse version of waveforms with minimal usage of the channel and then decide where refinements are required. Rate-distortion performance results are presented and comparisons are made with two block transform methods.
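
    A minimal sketch of the subband idea behind such a progressive scheme, using a single-level Haar filter bank (the paper's actual filter bank and rate-distortion machinery are more involved): the coarse band can be transmitted first, and the detail band sent later, possibly quantized, as a refinement.

```python
import numpy as np

def haar_analysis(x):
    """One level of a Haar subband split: low-pass (coarse) and
    high-pass (detail) bands, each at half the original rate."""
    x = x[: len(x) // 2 * 2]
    low = (x[0::2] + x[1::2]) / np.sqrt(2)
    high = (x[0::2] - x[1::2]) / np.sqrt(2)
    return low, high

def haar_synthesis(low, high):
    """Exact inverse of haar_analysis."""
    x = np.empty(2 * len(low))
    x[0::2] = (low + high) / np.sqrt(2)
    x[1::2] = (low - high) / np.sqrt(2)
    return x

# Progressive idea: send the coarse band first, refine with quantized details
trace = np.cumsum(np.random.default_rng(5).standard_normal(1024))
low, high = haar_analysis(trace)
coarse_only = haar_synthesis(low, np.zeros_like(high))      # first, coarse pass
refined = haar_synthesis(low, np.round(high * 4) / 4)       # later refinement
```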

  4. Mapping Fluid Injection and Associated Induced Seismicity Using InSAR Analysis

    NASA Astrophysics Data System (ADS)

    Thorpe, S. D.; Tiampo, K. F.

    2016-12-01

    In recent years there has been a rise in unconventional oil and gas production in western North America, which has been coupled with an increase in the number of earthquakes recorded in these regions, commonly referred to as "induced seismicity" (Ellsworth, 2013). As fluid is pumped into the subsurface during hydraulic fracturing or fluid disposal, the state of stress within the subsurface changes, potentially reactivating pre-existing faults and/or causing subsidence or uplift of the surface. This anthropogenic surface deformation also poses a significant hazard to communities and structures surrounding these hydraulic fracturing or fluid disposal sites (Barnhart et al., 2014; Shirzaei et al., 2016). This study aims to relate, both spatially and temporally, this surface deformation to hydraulic fracturing and fluid disposal operations in Alberta (AB) and British Columbia (BC) using Differential Interferometric Synthetic Aperture Radar (InSAR) analysis. Satellite-based geodetic methods such as InSAR provide frequent measurements of ground deformation at high spatial resolution. Based on locations of previously identified induced seismicity in areas throughout AB and BC, images were acquired for multiple locations from the Canadian RADARSAT-2 satellite, including Fort St. John and Fox Creek, AB (Atkinson et al., 2016). Using advanced processing techniques, these images then were stacked to generate coherent interferograms. We present results from this processing as a set of time series that are correlated with both hydraulic fracturing and fluid disposal sites at each location. These results reveal the temporal and spatial relationship between well injection activity and associated induced seismicity in western Canada. Future work will utilise these time series to model subsurface fluid flow, providing important information regarding the nature of the subsurface structure and associated aquifer due to fluid injection and withdrawal.

  5. Overview and First Results of an In-situ Stimulation Experiment in Switzerland

    NASA Astrophysics Data System (ADS)

    Amann, F.; Gischig, V.; Doetsch, J.; Jalali, M.; Valley, B.; Evans, K. F.; Krietsch, H.; Dutler, N.; Villiger, L.

    2017-12-01

    A decameter-scale in-situ stimulation and circulation (ISC) experiment is currently being conducted at the Grimsel Test Site in Switzerland with the objective of improving our understanding of key seismo-hydro-mechanical coupled processes associated with high pressure fluid injections in a moderately fractured crystalline rock mass. The ISC experiment activities aim to support the development of EGS technology by 1) advancing the understanding of fundamental processes that occur within the rock mass in response to relatively large-volume fluid injections at high pressures, 2) improving the ability to estimate and model induced seismic hazard and risks, 3) assessing the potential of different injection protocols to keep seismic event magnitudes below an acceptable threshold, 4) developing novel monitoring and imaging techniques for pressure, temperature, stress, strain and displacement as well as geophysical methods such as ground penetrating radar, passive and active seismic, and 5) generating a high-quality benchmark dataset that facilitates the development and validation of numerical modelling tools. The ISC experiment includes six fault slip and five hydraulic fracturing experiments at an intermediate scale (i.e., 20 × 20 × 20 m) at 480 m depth, which allows high resolution monitoring of the evolution of pore pressure in the stimulated fault zone and the surrounding rock matrix, fault dislocations including shear and dilation, and micro-seismicity in an exceptionally well characterized structural setting. In February 2017 we performed the fault-slip experiments on interconnected faults. Subsequently an intense phase of post-stimulation hydraulic characterization was performed. In May 2017 we performed hydraulic fracturing tests within test intervals that were free of natural fractures. In this contribution we give an overview and show first results of the above-mentioned stimulation tests.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burrell, J.; Luheshi, M.; Mackenzie, A.

    Gyda field (operated by BP) is located in Block 2/1 of the Norwegian outer continental shelf. The reservoir comprises a thin, wedge-shaped Upper Jurassic sand, overlain by Lower Cretaceous mudstones. For field development, it is necessary to accurately map a laterally discontinuous high-porosity zone and thus to help site well locations. To this end, it was decided to invert the 3-D seismic data set over the field to the seismic attribute of absolute acoustic impedance (AAI). This was based on the observation that there is a good correlation between porosity and AAI derived from well logs. Comparisons of core porosity, log-derived porosity, and seismic-derived porosity at several well locations showed this technique to be generally satisfactory. An additional problem in Gyda is the detection of the truncation edge of the reservoir along the southeastern part of the field. Deterministic methods based on AAI and on forward seismic modeling were not able to unambiguously define the edge of the reservoir. The truncation of the reservoir is not clear on normal seismic amplitude displays. In order to investigate the zone where the reservoir interval changes from sand to shale, certain special seismic attributes were computed over a gate of seismic data covering the top reservoir reflection. These attributes represented the energy, phase, and frequency content of the gate of seismic data. The area investigated was between wells where the reservoir sand was known to pinch out. These attributes were clustered using the statistical technique of projection pursuit. The cluster map correlates with the observations from the wells in this area of the field and appears to show the edge of the effective reservoir in the field.

  7. Enhancing our View of the Reservoir: New Insights into Deepwater Gulf of Mexico fields using Frequency Decomposition

    NASA Astrophysics Data System (ADS)

    Murat, M.

    2017-12-01

    Color-blended frequency decomposition is a seismic attribute that can be used to educe or draw out and visualize geomorphological features enabling a better understanding of reservoir architecture and connectivity for both exploration and field development planning. Color-blended frequency decomposition was applied to seismic data in several areas of interest in the Deepwater Gulf of Mexico. The objective was stratigraphic characterization to better define reservoir extent, highlight depositional features, identify thicker reservoir zones and examine potential connectivity issues due to stratigraphic variability. Frequency decomposition is a technique to analyze changes in seismic frequency caused by changes in the reservoir thickness, lithology and fluid content. This technique decomposes or separates the seismic frequency spectra into discrete bands of frequency limited seismic data using digital filters. The workflow consists of frequency (spectral) decomposition, RGB color blending of three frequency slices, and horizon or stratal slicing of the color blended frequency data for interpretation. Patterns were visualized and identified in the data that were not obvious on standard stacked seismic sections. These seismic patterns were interpreted and compared to known geomorphological patterns and their environment of deposition. From this we inferred the distribution of potential reservoir sand versus non-reservoir shale and even finer scale details such as the overall direction of the sediment transport and relative thickness. In exploratory areas, stratigraphic characterization from spectral decomposition is used for prospect risking and well planning. Where well control exists, we can validate the seismic observations and our interpretation and use the stratigraphic/geomorphological information to better inform decisions on the need for and placement of development wells.
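
    A minimal sketch of the decomposition-and-blending workflow, assuming a hypothetical 2-D seismic section and arbitrarily chosen frequency bands: each band-limited version of the data is reduced to a crude amplitude envelope and normalized into one of the red, green, or blue channels.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def rgb_frequency_blend(section, fs, bands=((10, 20), (20, 35), (35, 55))):
    """Split a seismic section (n_traces, n_samples) into three frequency
    bands and map a crude envelope of each band to the R, G, B channels."""
    rgb = []
    for fmin, fmax in bands:
        sos = butter(4, [fmin, fmax], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(sos, section, axis=1)
        env = np.abs(band)                         # simple amplitude proxy
        rgb.append(env / (env.max() + 1e-12))      # normalize each channel
    return np.stack(rgb, axis=-1)                  # (n_traces, n_samples, 3)

# Hypothetical section: 100 traces, 4 s at 250 Hz
section = np.random.default_rng(6).standard_normal((100, 1000))
image = rgb_frequency_blend(section, fs=250.0)     # RGB image for display
```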

  8. Time-lapse 3-D seismic imaging of shallow subsurface contaminant flow.

    PubMed

    McKenna, J; Sherlock, D; Evans, B

    2001-12-01

    This paper presents a physical modelling study outlining a technique whereby buoyant contaminant flow within water-saturated unconsolidated sand was remotely monitored utilizing the time-lapse 3-D (TL3-D) seismic response. The controlled temperature and pressure conditions, along with the high level of acquisition repeatability attainable using sandbox physical models, allow the TL3-D seismic response to pore fluid movement to be distinguished from all other effects. TL3-D seismic techniques are currently being developed to monitor hydrocarbon reserves within producing reservoirs in an endeavour to improve overall recovery. However, in many ways, sandbox models under atmospheric conditions more accurately simulate the shallow subsurface than petroleum reservoirs. For this reason, perhaps the greatest application for analogue sandbox modelling is to improve our understanding of shallow groundwater and environmental flow mechanisms. Two fluid flow simulations were conducted whereby air and kerosene were injected into separate water-saturated unconsolidated sand models. In both experiments, a base 3-D seismic volume was recorded and compared with six later monitor surveys recorded while the injection program was conducted. Normal incidence amplitude and P-wave velocity information were extracted from the TL3-D seismic data to provide visualization of contaminant migration. Reflection amplitudes displayed the qualitative areal distribution of fluids when a suitable impedance contrast existed between pore fluids. TL3-D seismic reflection tomography can potentially monitor the change in areal distribution of fluid contaminants over time, indicating flow patterns. However, neither other research nor the current work has established a quantifiable relationship between normal reflection amplitudes or attenuation and fluid saturation. Generally, different pore fluids will have unique seismic velocities due to differences in compressibility and density. The predictable relationships that exist between P-wave velocity and fluid saturation can allow a quantitative assessment of contaminant migration.

  9. Imaging Subsurface Structure of Tehran/Iran region using Ambient Seismic Noise Tomography

    NASA Astrophysics Data System (ADS)

    Shirzad Iraj, Taghi; Shmomali, Z. Hossein

    2013-04-01

    Tehran, the capital of Iran, is surrounded by many active faults (including the Mosha, North Tehran, and North and/or South Rey faults); however, our knowledge of the 3D velocity structure of the study area is limited. Recent developments in seismology have shown that the cross-correlation of long-duration ambient seismic noise recorded by a pair of stations contains information about the Green's function between the stations. Thus ambient seismic noise carries valuable information about the propagation path which can be extracted. We obtained a 2D model of shear wave velocity (Vs) for the Tehran/Iran area using the seismic ambient noise tomography (ANT) method. In this study, we use continuous vertical-component data recorded by the TDMMO (Tehran Disaster Mitigation and Management Organization) and IRSC (Iranian Seismological Center) networks in the Tehran/Iran area. The TDMMO and IRSC networks are equipped with CMG-5TD Guralp sensors and SS-1 Kinemetrics sensors, respectively. We use data from 25 stations for 12 months, from 2009/Oct. to 2010/Oct. Data processing is similar to that explained in detail by Bensen et al. (2007), with the data processed on a daily basis. The mean, trend, and instrument response were removed and the data were decimated to 10 sps. One-bit time-domain normalization was then applied to suppress the influence of instrument irregularities and earthquake signals, followed by spectral normalization between 0.1-1.0 Hz (period 1-10 sec). After cross-correlation processing, we implement a new stacking method to stack many cross-correlation functions based on the highest energy in the time interval in which we expect to receive the Rayleigh wave fundamental mode. We then obtained the group velocity of the Rayleigh wave by using phase-match filtering and frequency-time analysis techniques. Finally, we applied an iterative inversion method to extract a Vs model of the shallow structure in the Tehran/Iran area.
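
    The core of the processing described above (one-bit normalization, daily cross-correlation, stacking) can be sketched as follows; the records are random stand-ins, and real processing would also include the detrending, instrument-response removal, spectral whitening, and energy-based stacking mentioned in the abstract.

```python
import numpy as np

def one_bit(trace):
    """One-bit time-domain normalization to suppress earthquakes and
    instrument irregularities."""
    return np.sign(trace)

def daily_cross_correlation(tr_a, tr_b, max_lag):
    """Frequency-domain cross-correlation of two equal-length records,
    truncated to +/- max_lag samples around zero lag."""
    n = len(tr_a)
    spec = np.fft.rfft(tr_a, 2 * n) * np.conj(np.fft.rfft(tr_b, 2 * n))
    cc = np.fft.irfft(spec)
    return np.concatenate((cc[-max_lag:], cc[:max_lag + 1]))

# Hypothetical: stack 30 "daily" segments (shortened stand-ins) for one station pair
rng = np.random.default_rng(7)
max_lag = 1000
stack = np.zeros(2 * max_lag + 1)
for _ in range(30):
    a, b = rng.standard_normal(36000), rng.standard_normal(36000)
    stack += daily_cross_correlation(one_bit(a), one_bit(b), max_lag)
stack /= 30                                   # stacked noise correlation function
```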

  10. 4-D Visualization of Seismic and Geodetic Data of the Big Island of Hawai'i

    NASA Astrophysics Data System (ADS)

    Burstein, J. A.; Smith-Konter, B. R.; Aryal, A.

    2017-12-01

    For decades Hawai'i has served as a natural laboratory for studying complex interactions between magmatic and seismic processes. Investigating characteristics of these processes, as well as the crustal response to major Hawaiian earthquakes, requires a synthesis of seismic and geodetic data and models. Here, we present a 4-D visualization of the Big Island of Hawai'i that investigates geospatial and temporal relationships of seismicity, seismic velocity structure, and GPS crustal motions to known volcanic and seismically active features. Using the QPS Fledermaus visualization package, we compile 90 m resolution topographic data from NASA's Shuttle Radar Topography Mission (SRTM) and 50 m resolution bathymetric data from the Hawaiian Mapping Research Group (HMRG) with a high-precision earthquake catalog of more than 130,000 events from 1992-2009 [Matoza et al., 2013] and a 3-D seismic velocity model of Hawai'i [Lin et al., 2014] based on seismic data from the Hawaiian Volcano Observatory (HVO). Long-term crustal motion vectors are integrated into the visualization from HVO GPS time-series data. These interactive data sets reveal well-defined seismic structure near the summit areas of Mauna Loa and Kilauea volcanoes, where high Vp and high Vp/Vs anomalies at 5-12 km depth, as well as clusters of low magnitude (M < 3.5) seismicity, are observed. These areas of high Vp and high Vp/Vs are interpreted as mafic dike complexes and the surrounding seismic clusters are associated with shallow magma processes. GPS data are also used to help identify seismic clusters associated with the steady crustal detachment of the south flank of Kilauea's East Rift Zone. We also investigate the fault geometry of the 2006 M6.7 Kiholo Bay earthquake event by analyzing elastic dislocation deformation modeling results [Okada, 1985] and HVO GPS and seismic data of this event. We demonstrate the 3-D fault mechanisms of the Kiholo Bay main shock as a combination of strike-slip and dip-slip components (net slip 0.55 m) delineating a 30 km east-west striking, southward-dipping fault plane, occurring at 39 km depth. This visualization serves as a resource for advancing scientific analyses of Hawaiian seismic processes, as well as an interactive educational tool for demonstrating the geospatial and geophysical structure of the Big Island of Hawai'i.

  11. The utility of petroleum seismic exploration data in delineating structural features within salt anticlines

    USGS Publications Warehouse

    Stockton, S.L.; Balch, Alfred H.

    1978-01-01

    The Salt Valley anticline, in the Paradox Basin of southeastern Utah, is under investigation for use as a location for storage of solid nuclear waste. Delineation of thin, nonsalt interbeds within the upper reaches of the salt body is extremely important because the nature and character of any such fluid- or gas-saturated horizons would be critical to the mode of emplacement of wastes into the structure. Analysis of 50 km of conventional seismic-reflection data, in the vicinity of the anticline, indicates that mapping of thin beds at shallow depths may well be possible using a specially designed adaptation of state-of-the-art seismic oil-exploration procedures. Computer ray-trace modeling of thin beds in salt reveals that the frequency and spatial resolution required to map the details of interbeds at shallow depths (less than 750 m) may be on the order of 500 Hz, with surface-spread lengths of less than 350 m. Consideration should be given to the burial of sources and receivers in order to attenuate surface noise and to record the desired high frequencies. Correlation of the seismic-reflection data with available well data and surface geology reveals the complex, structurally initiated diapir, whose upward flow was maintained by rapid contemporaneous deposition of continental clastic sediments on its flanks. Severe collapse faulting near the crests of these structures has distorted the seismic response. Evidence exists, however, that intrasalt thin beds of anhydrite, dolomite, and black shale are mappable on seismic record sections either as short, discontinuous reflected events or as amplitude anomalies that result from focusing of the reflected seismic energy by the thin beds; computer modeling of the folded interbeds confirms both of these as possible causes of seismic response from within the salt diapir. Prediction of the seismic signatures of the interbeds can be made from computer-model studies. Petroleum seismic-reflection data are unsatisfactory for mapping the thin beds because of the lack of sufficient resolution to provide direct evidence of the presence of the thin beds. However, indirect evidence, present in these data as discontinuous seismic events, suggests that two geophysical techniques designed for this specific problem would allow direct detection of the interbeds in salt. These techniques are vertical seismic profiling and shallow, short-offset, high-frequency, seismic-reflection recording.

  12. Ultrasonic Tomography of Fractured Rocks to Characterize Elastic Weakening Induced by Finite-Amplitude Waves

    NASA Astrophysics Data System (ADS)

    Riviere, J.; Roux, P.

    2017-12-01

    The use of seismic noise in seismology enables one to detect small velocity changes induced by earthquakes, earth tides or volcanic activity. In particular, co-seismic drops in velocity followed by a slow relaxation back (or partially back) to the original velocity have been observed across various tectonic regions. The co-seismic drop is typically attributed to the creation of damage within the fault zone, while the slow recovery is attributed to post-seismic healing processes. At the laboratory scale, a dynamic perturbation of strain amplitude as low as 10⁻⁶ in rocks also results in a transient elastic softening, followed by a log(t)-type relaxation back to the initial state once the perturbation is turned off. This suggests that radiated waves produced during unstable slip are partially responsible for the co-seismic velocity drops. The main objective of this work is to help interpret the elastic changes observed in the field and in particular to disentangle velocity drops that originate from damage creation along the slip surface from the ones produced during radiation of finite-amplitude waves. To do so, we use a technique called Dynamic Acousto-Elastic Testing that provides comprehensive details on the nonlinear elastic response of consolidated granular media (e.g. rocks), including tension/compression asymmetry, hysteretic behaviors as well as conditioning and relaxation effects. Such technique uses a pump-probe scheme where a high frequency, low amplitude wave probes the state of a sample that is dynamically disturbed by a low frequency, large amplitude pump wave. While previous work typically involved a single pair of probing transducers, here we use two dense arrays of ultrasonic transducers to image a sample of Westerly granite with a complex fracture. We apply double beamforming to disentangle complex arrivals and conduct ray-based and finite-frequency tomography using both travel time and amplitude information. By comparing images obtained before, during and after the pump wave disturbance, we are able to locate and characterize elastic changes within the sample. We discuss their locations with regard to low velocity/high attenuation zones and relate our observations to large-scale data.

  13. Quantifying the similarity of seismic polarizations

    NASA Astrophysics Data System (ADS)

    Jones, Joshua P.; Eaton, David W.; Caffagni, Enrico

    2016-02-01

    Assessing the similarities of seismic attributes can help identify tremor, low signal-to-noise (S/N) signals and converted or reflected phases, in addition to diagnosing site noise and sensor misalignment in arrays. Polarization analysis is a widely accepted method for studying the orientation and directional characteristics of seismic phases via computed attributes, but similarity is ordinarily discussed using qualitative comparisons with reference values or known seismic sources. Here we introduce a technique for quantitative polarization similarity that uses weighted histograms computed in short, overlapping time windows, drawing on methods adapted from the image processing and computer vision literature. Our method accounts for ambiguity in azimuth and incidence angle and variations in S/N ratio. Measuring polarization similarity allows easy identification of site noise and sensor misalignment and can help identify coherent noise and emergent or low S/N phase arrivals. Dissimilar azimuths during phase arrivals indicate misaligned horizontal components, dissimilar incidence angles during phase arrivals indicate misaligned vertical components and dissimilar linear polarization may indicate a secondary noise source. Using records of the Mw = 8.3 Sea of Okhotsk earthquake, from Canadian National Seismic Network broad-band sensors in British Columbia and Yukon Territory, Canada, and a vertical borehole array at Hoadley gas field, central Alberta, Canada, we demonstrate that our method is robust to station spacing. Discrete wavelet analysis extends polarization similarity to the time-frequency domain in a straightforward way. Time-frequency polarization similarities of borehole data suggest that a coherent noise source may have persisted above 8 Hz several months after peak resource extraction from a `flowback' type hydraulic fracture.
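
    A minimal sketch of the weighted-histogram comparison underlying the method, assuming hypothetical azimuth estimates (folded into 0-180° to account for ambiguity) and weights such as rectilinearity; histogram intersection is used here as one possible similarity measure.

```python
import numpy as np

def weighted_histogram(angles, weights, n_bins=36):
    """Histogram of polarization azimuths, weighted by e.g. rectilinearity
    or amplitude, normalized to unit sum."""
    hist, _ = np.histogram(angles, bins=n_bins, range=(0.0, 180.0), weights=weights)
    return hist / (hist.sum() + 1e-12)

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]: 1 means identical normalized histograms."""
    return np.minimum(h1, h2).sum()

# Hypothetical azimuths from two sensors over overlapping time windows
rng = np.random.default_rng(8)
az1 = rng.normal(45, 5, 500) % 180
az2 = rng.normal(50, 5, 500) % 180
w = rng.uniform(0.5, 1.0, 500)
sim = histogram_intersection(weighted_histogram(az1, w), weighted_histogram(az2, w))
```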

  14. Spectral-element simulations of carbon dioxide (CO2) sequestration time-lapse monitoring

    NASA Astrophysics Data System (ADS)

    Morency, C.; Luo, Y.; Tromp, J.

    2009-12-01

    Geologic sequestration of CO2, a greenhouse gas, represents an effort to reduce the large amount of CO2 generated as a by-product of fossil fuel combustion and emitted into the atmosphere. This process of sequestration involves CO2 storage deep underground. There are three main storage options: injection into hydrocarbon reservoirs, injection into methane-bearing coal beds, or injection into deep saline aquifers, that is, highly permeable porous media. The key issues involve accurate monitoring of the CO2, from the injection stage to the prediction and verification of CO2 movement over time for environmental considerations. A natural non-intrusive monitoring technique is referred to as "4D seismics", which involves 3D time-lapse seismic surveys. The success of monitoring the CO2 movement is subject to a proper description of the physics of the problem. We propose to realize time-lapse migrations comparing acoustic, elastic, and poroelastic simulations of 4D seismic imaging to characterize the storage zone. This approach highlights the influence of using different physical theories on interpreting seismic data, and, more importantly, on extracting the CO2 signature from the seismic wave field. Our simulations are performed using a spectral-element method, which allows for highly accurate results. Biot's equations are implemented to account for poroelastic effects. Attenuation associated with the anelasticity of the rock frame and frequency-dependent viscous resistance of the pore fluid are accommodated based upon a memory variable approach. The sensitivity of observables to the model parameters is quantified based upon finite-frequency sensitivity kernels calculated using an adjoint method.

  15. GIS modeling of seismic vulnerability of residential fabrics considering geotechnical, structural, social and physical distance indicators in Tehran using multi-criteria decision-making techniques

    NASA Astrophysics Data System (ADS)

    Rezaie, F.; Panahi, M.

    2015-03-01

    The main issue in determining seismic vulnerability is having a comprehensive view of all probable damages related to earthquake occurrence. Therefore, taking into account factors such as peak ground acceleration at the time of earthquake occurrence, the type of structures, population distribution among different age groups, level of education and the physical distance to hospitals (or medical care centers) and categorizing them into four indicators of geotechnical, structural, social and physical distance to needed facilities and from dangerous ones will provide us with a better and more exact outcome. To this end, this paper uses the analytic hierarchy process to study the importance of criteria or alternatives and uses the geographical information system to study the vulnerability of Tehran to an earthquake. This study focuses on the fact that Tehran is surrounded by three active and major faults: Mosha, North Tehran and Rey. In order to comprehensively determine the vulnerability, three scenarios are developed. In each scenario, seismic vulnerability of different areas in Tehran is analyzed and classified into four levels: high, medium, low and safe. The results show that, regarding seismic vulnerability, the faults of Mosha, North Tehran and Rey make, respectively, 6, 16 and 10% of Tehran highly vulnerable, while 34, 14 and 27% is safe.
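
    The analytic hierarchy process step can be sketched as an eigenvector computation on a pairwise comparison matrix; the matrix values below are hypothetical, not the weights used in the study.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for the four indicators
# (geotechnical, structural, social, physical distance), Saaty 1-9 scale.
A = np.array([[1.0, 3.0, 5.0, 4.0],
              [1/3, 1.0, 3.0, 2.0],
              [1/5, 1/3, 1.0, 1/2],
              [1/4, 1/2, 2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                       # AHP priority weights of the indicators

# Consistency check: CI = (lambda_max - n) / (n - 1); random index for n=4 is ~0.90
ci = (eigvals.real[k] - len(A)) / (len(A) - 1)
cr = ci / 0.90                                 # consistency ratio (should be < 0.1)
```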

  16. Results and implications of new regional seismic lines in the Malay Basin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leslie, W.; Ho, W.K.; Ghani, M.A.

    1994-07-01

    Regional seismic data, which was previously acquired between 1968 and 1971 by early operators in the Malay Basin, has limitations because the sophisticated modern-day acquisition and processing techniques were not available. These old data do not permit confident mapping below 3 s (TWT), equivalent to approximately 3000 m subsea, but aeromagnetic data indicate that the total sedimentary thickness exceeds 13,000 m. Hence, existing regional seismic data with a record length of 5 s (TWT) is neither adequate to map deeper play opportunities nor able to aid in understanding the geological history of the basin. New plays at deeper levels may exist. (1) Geochemical modeling results now predict the top of the oil generation window at depths greater than previously thought. (2) Existing gas fields occur in the upper section in areas of thickest sedimentary fill but underlying targets have not been tested. (3) Past exploration has been focused on oil and not gas in deeper structures. Because of Malaysia's rapid development and its dedication to the protection of the environment, gas is becoming an increasingly important energy source. Hence, ample internal markets for additional gas discoveries are being created. A better understanding of the Malay Basin geological history will assist in locating these potential plays. To do this, Petronas acquired approximately 3000 line km of high-quality regional seismic data to further exploration efforts in this prospective region.

  17. A systematic technique for the sequential restoration of salt structures

    NASA Astrophysics Data System (ADS)

    Rowan, Mark G.

    1993-12-01

    A method is described for the sequential restoration of cross sections in areas of salt tectonics where deformation is confined to the salt and higher layers. The subsurface geometry evolves with time through the interaction of various processes: sedimentation, compaction, isostatic adjustment, thermal subsidence (if present), faulting, and salt withdrawal/ diapirism. The technique systematically calculates and removes the effects of each of these processes during specified time intervals defined by the interpreted horizons. It makes no assumptions about salt kinematics and generally results in the area of the salt layer changing through time. The method is described for restoration of extensional terranes, but it is also suitable for areas of contractional salt tectonics with only minor modifications. After converting an interpreted seismic profile to depth, the top layer is stripped off and the underlying section is decompacted according to standard porosity-depth functions. A deep baseline, unaffected by compaction or deformation, is used to restore any isostatic compensation or thermal subsidence. Isostasy is calculated according to the Airy model, and differential sedimentary loading across a section is shown to be approximately balanced by changes in salt thickness so that the load is evenly distributed. After these processes have been reversed, the resulting geometry and the seismic data are used to create the sea-floor template for structural restoration. Fault offsets are removed and the layers down to the top salt are restored to this template, while the base salt remains fixed. The resulting space between the restored top salt and the fixed base salt defines the restored salt geometry. In addition, the difference between the sea-floor template and a fixed sea level provides a measure of the change in water depth (ignoring eustatic changes in sea level). The technique is applied to an interpreted seismic profile from the eastern Green Canyon/Ewing Bank area of offshore Louisiana. The section is characterized by a variety of salt structures, including salt rollers, a diapiric massif, a remnant salt sheet, and a salt weld, which are shown to have derived from an originally continuous salt sheet which has been modified by sedimentary loading. Early loading created vertical basin growth that was accommodated primarily by salt withdrawal and associated diapiric rise through the process of downbuilding. Once the salt weld formed, continued sedimentation was accommodated by a lateral increase in basin size caused by down-dip extension on listric growth faults.
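
    The decompaction step of such a restoration is commonly based on an exponential porosity-depth law; a minimal sketch, with hypothetical porosity parameters and layer depths, solves for the restored layer base that preserves grain (solid) thickness.

```python
import numpy as np

def solid_thickness(y1, y2, phi0, c):
    """Grain (water-free) thickness of a layer between depths y1 and y2 (km),
    assuming an exponential porosity-depth law phi(z) = phi0 * exp(-c z)."""
    return (y2 - y1) - (phi0 / c) * (np.exp(-c * y1) - np.exp(-c * y2))

def decompact(y1, y2, new_top, phi0=0.5, c=0.5):
    """Restore the layer to a shallower position (new_top, km) by finding the
    new base depth that preserves grain thickness (fixed-point iteration)."""
    ts = solid_thickness(y1, y2, phi0, c)
    new_base = new_top + (y2 - y1)                     # starting guess
    for _ in range(50):
        new_base = new_top + ts + (phi0 / c) * (
            np.exp(-c * new_top) - np.exp(-c * new_base))
    return new_base

# Hypothetical layer now buried between 2.0 and 2.5 km, restored to the sea floor
restored_base = decompact(2.0, 2.5, new_top=0.0)       # thicker than 0.5 km
```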

  18. Modernization of the USGS Hawaiian Volcano Observatory Seismic Processing Infrastructure

    NASA Astrophysics Data System (ADS)

    Antolik, L.; Shiro, B.; Friberg, P. A.

    2016-12-01

    The USGS Hawaiian Volcano Observatory (HVO) operates a Tier 1 Advanced National Seismic System (ANSS) seismic network to monitor, characterize, and report on volcanic and earthquake activity in the State of Hawaii. Upgrades at the observatory since 2009 have improved the digital telemetry network, computing resources, and seismic data processing with the adoption of the ANSS Quake Management System (AQMS) system. HVO aims to build on these efforts by further modernizing its seismic processing infrastructure and strengthen its ability to meet ANSS performance standards. Most notably, this will also allow HVO to support redundant systems, both onsite and offsite, in order to provide better continuity of operation during intermittent power and network outages. We are in the process of implementing a number of upgrades and improvements on HVO's seismic processing infrastructure, including: 1) Virtualization of AQMS physical servers; 2) Migration of server operating systems from Solaris to Linux; 3) Consolidation of AQMS real-time and post-processing services to a single server; 4) Upgrading database from Oracle 10 to Oracle 12; and 5) Upgrading to the latest Earthworm and AQMS software. These improvements will make server administration more efficient, minimize hardware resources required by AQMS, simplify the Oracle replication setup, and provide better integration with HVO's existing state of health monitoring tools and backup system. Ultimately, it will provide HVO with the latest and most secure software available while making the software easier to deploy and support.

  19. Analysis of surface deformation during the eruptive process of El Hierro Island (Canary Islands, Spain): Detection, Evolution and Forecasting.

    NASA Astrophysics Data System (ADS)

    Berrocoso, M.; Fernandez-Ros, A.; Prates, G.; Martin, M.; Hurtado, R.; Pereda, J.; Garcia, M. J.; Garcia-Cañada, L.; Ortiz, R.; Garcia, A.

    2012-04-01

    Surface deformation has been an essential parameter for tracking the onset and evolution of the eruptive process of El Hierro Island (October 2011), as well as for forecasting changes in seismic and volcanic activity during the crisis period. From GNSS-GPS observations, the reactivation was detected early by analyzing changes in the deformation relative to the regional geodynamics of El Hierro Island. Surface deformation changes were detected before the occurrence of seismic activity using station FRON (GRAFCAN). The evolution of the process has been studied by analyzing time series of topocentric coordinates and of the distances between stations on El Hierro (GRAFCAN station; IGN network; and UCA-CSIC points) and the LPAL-IGS station on the island of La Palma. In this work the main methodologies and their results are shown: • the location (and its changes) of the lithospheric pressure source, obtained by applying the Mogi model; • a Kalman filtering technique for high-frequency time series, used to produce the forecasts issued for volcanic emergency management; • correlations between the deformation at the different GPS stations and their relationship to the seismovolcanic setting.
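
    The Mogi point-source model mentioned in the first bullet predicts surface displacements from a volume change at depth; one common parameterization (assumed here, with hypothetical source values) is sketched below.

```python
import numpy as np

def mogi_surface_displacement(dV, depth, r, nu=0.25):
    """Surface displacements for a Mogi (point pressure) source with volume
    change dV (m^3) at the given depth (m), at radial distances r (m).
    Returns (radial, vertical) displacements in metres; one common form is
    u = (1 - nu) * dV / pi * {r, depth} / (r^2 + depth^2)^(3/2)."""
    R3 = (r**2 + depth**2) ** 1.5
    ur = (1 - nu) / np.pi * dV * r / R3
    uz = (1 - nu) / np.pi * dV * depth / R3
    return ur, uz

# Hypothetical source: 5e6 m^3 volume increase at 10 km depth
r = np.linspace(0.0, 30e3, 301)
ur, uz = mogi_surface_displacement(5e6, 10e3, r)   # uz peaks directly above the source
```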

  20. Seismological constraints on the crustal structures generated by continental rejuvenation in northeastern China

    PubMed Central

    Zheng, Tian-Yu; He, Yu-Mei; Yang, Jin-Hui; Zhao, Liang

    2015-01-01

    Crustal rejuvenation is a key process that has shaped the characteristics of current continental structures and components in tectonic active continental regions. Geological and geochemical observations have provided insights into crustal rejuvenation, although the crustal structural fabrics have not been well constrained. Here, we present a seismic image across the North China Craton (NCC) and Central Asian Orogenic Belt (CAOB) using a velocity structure imaging technique for receiver functions from a dense array. The crustal evolution of the eastern NCC was delineated during the Mesozoic by a dominant low seismic wave velocity with velocity inversion, a relatively shallow Moho discontinuity, and a Moho offset beneath the Tanlu Fault Zone. The imaged structures and geochemical evidence, including changes in the components and ages of continental crusts and significant continental crustal growth during the Mesozoic, provide insight into the rejuvenation processes of the evolving crust in the eastern NCC caused by structural, magmatic and metamorphic processes in an extensional setting. The fossil structural fabric of the convergent boundary in the eastern CAOB indicates that the back-arc action of the Paleo-Pacific Plate subduction did not reach the hinterland of Asia. PMID:26443323

  1. Looking inside the microseismic cloud using seismic interferometry

    NASA Astrophysics Data System (ADS)

    Matzel, E.; Rhode, A.; Morency, C.; Templeton, D. C.; Pyle, M. L.

    2015-12-01

    Microseismicity provides a direct means of measuring the physical characteristics of active tectonic features such as fault zones. Thousands of microquakes are often associated with an active site. This cloud of microseismicity helps define the tectonically active region. When processed using novel geophysical techniques, we can isolate the energy sensitive to the faulting region itself. The virtual seismometer method (VSM) is a technique of seismic interferometry that provides precise estimates of the Green's function (GF) between earthquakes. In many ways the converse of ambient noise correlation, it is very sensitive to the source parameters (location, mechanism and magnitude) and to the Earth structure in the source region. In a region with 1000 microseisms, we can calculate roughly 500,000 waveforms sampling the active zone. At the same time, VSM collapses the computation domain down to the size of the cloud of microseismicity, often by 2-3 orders of magnitude. In simple terms VSM involves correlating the waveforms from a pair of events recorded at an individual station and then stacking the results over all stations to obtain the final result. In the far-field, when most of the stations in a network fall along a line between the two events, the result is an estimate of the GF between the two, modified by the source terms. In this geometry each earthquake is effectively a virtual seismometer recording all the others. When applied to microquakes, this alignment is often not met, and we also need to address the effects of the geometry between the two microquakes relative to each seismometer. Nonetheless, the technique is quite robust, and highly sensitive to the microseismic cloud. Using data from the Salton Sea geothermal region, we demonstrate the power of the technique, illustrating our ability to scale the technique from the far-field, where sources are well separated, to the near field where their locations fall within each other's uncertainty ellipse. VSM provides better illumination of the complex subsurface by generating precise, high frequency estimates of the GF and resolution of seismic properties between every pair of events. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344
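
    In its simplest form, the correlate-and-stack operation described above looks like the sketch below; the event records are random stand-ins, and a real application would bandpass, normalize, and weight the correlations before stacking.

```python
import numpy as np

def correlate(u, v):
    """Full cross-correlation of two equal-length records (frequency domain)."""
    n = len(u)
    spec = np.fft.rfft(u, 2 * n) * np.conj(np.fft.rfft(v, 2 * n))
    return np.fft.fftshift(np.fft.irfft(spec))

# Hypothetical: records of two microquakes at 40 stations, 20 s at 100 Hz each
rng = np.random.default_rng(9)
n_sta, n_samp = 40, 2000
event_a = rng.standard_normal((n_sta, n_samp))
event_b = rng.standard_normal((n_sta, n_samp))

# Correlate the event pair at every station, then stack over stations to
# approximate the Green's function between the two source locations
# (modified by the source terms)
stack = np.zeros(2 * n_samp)
for k in range(n_sta):
    stack += correlate(event_a[k], event_b[k])
stack /= n_sta
```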

  2. Multi-Sensor Data Fusion Project

    DTIC Science & Technology

    2000-02-28

    seismic network by detecting T phases generated by underground events (generally earthquakes) and associating these phases to seismic events. The...between underwater explosions (H), underground sources, mostly earthquake-generated (7), and noise detections (N). The phases classified as H are the only...processing for infrasound sensors is most similar to seismic array processing with the exception that the detections are based on a more sophisticated

  3. Monitoring earthen dams and levees with ambient seismic noise

    NASA Astrophysics Data System (ADS)

    Planès, T.; Mooney, M.; Rittgers, J. B.; Kanning, W.; Draganov, D.

    2017-12-01

    Internal erosion is a major cause of failure of earthen dams and levees and is difficult to detect at an early stage by traditional visual inspection techniques. The passive and non-invasive ambient-noise correlation technique could help detect and locate internal changes taking place within these structures. First, we apply this passive seismic method to monitor a canal embankment model submitted to piping erosion, in laboratory-controlled conditions. We then present the monitoring of a sea levee in the Netherlands. A 150m-long section of the dike shows sandboils in the drainage ditch located downstream of the levee. These sandboils are the sign of concentrated seepage and potential initiation of internal erosion in the structure. Using the ambient-noise correlation technique, we retrieve surface waves propagating along the crest of the dike. Temporal variations of the seismic wave velocity are then computed during the tide cycle. These velocity variations are correlated with local in-situ pore water pressure measurements and are possibly influenced by the presence of concentrated seepage paths.
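
    Relative velocity changes of the kind tracked through the tide cycle are often measured with the stretching method; a minimal sketch, assuming hypothetical reference and current correlation functions, is:

```python
import numpy as np

def stretching_dvdv(reference, current, dt, epsilons=np.linspace(-0.02, 0.02, 401)):
    """Estimate a relative velocity change dv/v with the stretching method:
    the current correlation is compressed/dilated in time by trial factors and
    compared with the reference; the best-fitting factor approximates dv/v."""
    t = np.arange(len(reference)) * dt
    best_eps, best_cc = 0.0, -np.inf
    for eps in epsilons:
        stretched = np.interp(t, t * (1 + eps), current)   # current(t / (1 + eps))
        cc = np.corrcoef(reference, stretched)[0, 1]
        if cc > best_cc:
            best_eps, best_cc = eps, cc
    return best_eps, best_cc

# Hypothetical correlation functions; simulate a +0.4% velocity increase
rng = np.random.default_rng(10)
t = np.arange(2000) * 0.01
ref = np.convolve(rng.standard_normal(2000), np.hanning(25), mode="same")
cur = np.interp(t * 1.004, t, ref)            # current(t) = reference(t * (1 + dv/v))
dvv, cc = stretching_dvdv(ref, cur, 0.01)     # dvv should be close to 0.004
```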

  4. Geophysical examination of coal deposits

    NASA Astrophysics Data System (ADS)

    Jackson, L. J.

    1981-04-01

    Geophysical techniques for the solution of mining problems and as an aid to mine planning are reviewed. Techniques of geophysical borehole logging are discussed. The responses of the coal seams to logging tools are easily recognized on the logging records. Cores for laboratory analysis are cut from selected sections of the borehole. In addition, information about the density and chemical composition of the coal may be obtained. Surface seismic reflection surveys using two-dimensional arrays of seismic sources and detectors detect faults with throws as small as 3 m at depths of 800 m. In geologically disturbed areas, good results have been obtained from three-dimensional surveys. Smaller faults as far as 500 m in advance of the working face may be detected using in-seam seismic surveying conducted from a roadway or working face. Small disturbances are detected by pulse radar and continuous-wave electromagnetic methods, either from within boreholes or from underground. Other geophysical techniques, which exploit the electrical, magnetic, gravitational, and geothermal properties of rocks, are described.

  5. Geophysical Evidence for the Locations, Shapes and Sizes, and Internal Structures of Magma Chambers beneath Regions of Quaternary Volcanism

    NASA Astrophysics Data System (ADS)

    Iyer, H. M.

    1984-04-01

    This paper is a review of seismic, gravity, magnetic and electromagnetic techniques to detect and delineate magma chambers of a few cubic kilometres to several thousand cubic kilometres volume. A dramatic decrease in density and seismic velocity, and an increase in seismic attenuation and electrical conductivity occurs at the onset of partial melting in rocks. The geophysical techniques are based on detecting these differences in physical properties between solid and partially molten rock. Although seismic refraction techniques, with sophisticated instrumentation and analytical procedures, are routinely used for detailed studies of crustal structure in volcanic regions, their application for magma detection has been quite limited. In one study, in Yellowstone National Park, U.S.A., fan-shooting and time-term techniques have been used to detect an upper-crustal magma chamber. Attenuation and velocity changes in seismic waves from explosions and earthquakes diffracted around magma chambers are observed near some volcanoes in Kamchatka. Strong attenuation of shear waves from regional earthquakes, interpreted as a diffraction effect, has been used to model magma chambers in Alaska, Kamchatka, Iceland, and New Zealand. One of the most powerful techniques in modern seismology, the seismic reflection technique with vibrators, was used to confirm the existence of a strong reflector in the crust near Socorro, New Mexico, in the Rio Grande Rift. This reflector, discovered earlier from data from local earthquakes, is interpreted as a sill-like magma body. In the Kilauea volcano, Hawaii, mapping seismicity patterns in the upper crust has enabled the modelling of the complex magma conduits in the crust and upper mantle. On the other hand, in the Usu volcano, Japan, the magma conduits are delineated by zones of seismic quiescence. Three-dimensional modelling of laterally varying structures using teleseismic residuals is proving to be a very promising technique for detecting and delineating magma chambers with minimum horizontal and vertical dimensions of about 6 km. This technique has been used successfully to detect low-velocity anomalies, interpreted as magma bodies in the volume range 10³-10⁶ km³, in several volcanic centres in the U.S.A. and in Mt Etna, Sicily. Velocity models developed using teleseismic residuals of the Cascades volcanoes of Oregon and California, and Kilauea volcano, Hawaii, do not show appreciable storage of magma in the crust. However, regional models imply that large volumes of parental magma may be present in the upper mantle of these regions. In some volcanic centres, teleseismic delays are accompanied by P-wave attenuation, and linear inversion of spectral data have enabled computation of three-dimensional Q-models for these areas. The use of gravity data for magma chamber studies is illustrated by a study in the Geysers-Clear Lake volcanic field in California, where a strong gravity low has been modelled as a low-density body in the upper crust. This body is approximately in the same location as the low-velocity body delineated with teleseismic delays, and is interpreted as a magma body. In Yellowstone National Park, magnetic field data have been used to map the depth to the Curie isotherm, and the results show that high temperatures may be present at shallow depths beneath the Yellowstone caldera. The main application of electrical techniques in magma-related studies has been to understand the deep structure of continental rifts. 
Electromagnetic studies in several rift zones of the world provide constraints on the thermal structure and magma storage beneath these regions. Geophysical tools commonly used in resource exploration and earth-structure studies are also suited for the detection of magma chambers. Active seismic techniques, with controlled sources, and passive seismic techniques, with local and regional earthquakes and teleseisms, can be used to detect the drastic changes in velocity and attenuation that occur at the onset of melting of rocks and to delineate in three dimensions the shape of the partly melted zone. Similarly, decreases in density and electrical resistivity in rocks during melting can be detected. Seismic refraction and reflection are not yet used extensively in magma chamber studies. In a study in the Yellowstone region, seismic delays occurring in a fan-shooting configuration and time-term modelling show the presence of an intense molten zone in the upper crust. Deep seismic sounding (a combination of seismic refraction and reflection) and modelling of amplitude and velocity changes of diffracted seismic waves from explosions and earthquakes have enabled mapping of small and large magma chambers beneath many volcanoes in Kamchatka, U.S.S.R. Teleseismic P-wave residuals have been used to model low-velocity bodies, interpreted as magma chambers, in several Quaternary volcanic centres in the U.S.A. The results show that magma chambers with volumes of a few hundred to a few thousand cubic kilometres seem to be confined to regions of silicic volcanism. Many of the magma bodies seem to have upper-mantle roots, implying that they are not isolated pockets of partial melt but may be deriving their magma supplies from deeper parental sources. Medium or large crustal magma chambers are absent in the andesitic volcanoes of the western United States and the basaltic Kilauea volcano, Hawaii. However, regional velocity models of the Oregon Cascades and Hawaii show evidence for the presence of magma reservoirs in the upper mantle. The transport of magma to the upper crust in these regions probably occurs rapidly through narrow conduits, with transient storage occurring in small chambers of a few cubic kilometres volume. Very little use has been made of gravity and magnetic maps to model magma chambers. The available case histories, though few, indicate that these data can be very useful in constraining the density and temperature of magma chambers. Seismic, gravity, and electromagnetic techniques have been used to model regional structure in several rift zones of the world. Together the data indicate lithospheric thinning under the rifts, with possible subcrustal storage of magma and diapiric intrusions into the crust. The current status of the use of geophysical techniques in magma chamber studies can be summarized as follows. Though powerful experimental methods for data collection and mathematical and computational techniques for modelling are available, the two dozen or so available case histories seem to represent isolated, technique-oriented studies. Only in a few regions, such as Kamchatka, U.S.S.R., and Yellowstone and Socorro, U.S.A., are data from multiple geophysical techniques becoming available.
Several studies in different tectonic and volcanic environments, which use a suite of geophysical experiments capable of measuring different physical properties of rocks and having a wide range of resolutions, are needed to understand the problems of magma generation, migration, and storage.

  6. CyberShake: Running Seismic Hazard Workflows on Distributed HPC Resources

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Graves, R. W.; Gill, D.; Olsen, K. B.; Milner, K. R.; Yu, J.; Jordan, T. H.

    2013-12-01

    As part of its program of earthquake system science research, the Southern California Earthquake Center (SCEC) has developed a simulation platform, CyberShake, to perform physics-based probabilistic seismic hazard analysis (PSHA) using 3D deterministic wave propagation simulations. CyberShake performs PSHA by simulating a tensor-valued wavefield of Strain Green Tensors (SGTs), and then using seismic reciprocity to calculate synthetic seismograms for about 415,000 events per site of interest. These seismograms are processed to compute ground motion intensity measures, which are then combined with probabilities from an earthquake rupture forecast to produce a site-specific hazard curve. Seismic hazard curves for hundreds of sites can then be combined into a seismic hazard map for a region. We present a recently completed PSHA study in which we calculated four CyberShake seismic hazard maps for the Southern California area to compare how CyberShake hazard results are affected by different SGT computational codes (AWP-ODC and AWP-RWG) and different community velocity models (Community Velocity Model - SCEC (CVM-S4) v11.11 and Community Velocity Model - Harvard (CVM-H) v11.9). We present our approach to running workflow applications on distributed HPC resources, including systems without support for remote job submission. We show how our approach extends the benefits of scientific workflows, such as job and data management, to large-scale applications on Track 1 and Leadership class open-science HPC resources. We used our distributed workflow approach to perform CyberShake Study 13.4 on two new NSF open-science HPC computing resources, Blue Waters and Stampede, executing over 470 million tasks to calculate physics-based hazard curves for 286 locations in the Southern California region. For each location, we calculated seismic hazard curves with two different community velocity models and two different SGT codes, resulting in over 1100 hazard curves. We will report on the performance of this CyberShake study, four times larger than previous studies. Additionally, we will examine the challenges we face applying these workflow techniques to additional open-science HPC systems and discuss whether our workflow solutions continue to provide value to our large-scale PSHA calculations.
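    The hazard-curve step described above combines per-rupture exceedance probabilities with rupture rates from the earthquake rupture forecast. The sketch below illustrates that total-probability summation; note that CyberShake derives exceedance probabilities empirically from its simulated seismograms, whereas this toy version substitutes a lognormal intensity-measure model, and all function and variable names are illustrative rather than CyberShake's API.

        import numpy as np
        from scipy.stats import norm

        def hazard_curve(im_levels, rupture_rates, im_medians, im_sigmas):
            # Annual rate of exceeding each intensity level, summed over ruptures
            # (total-probability theorem); a lognormal IM distribution is assumed.
            im_levels = np.asarray(im_levels, dtype=float)
            rates = np.zeros_like(im_levels)
            for rate, med, sig in zip(rupture_rates, im_medians, im_sigmas):
                p_exceed = 1.0 - norm.cdf((np.log(im_levels) - np.log(med)) / sig)
                rates += rate * p_exceed
            return rates

        # usage: three hypothetical ruptures, spectral-acceleration levels in g
        levels = np.logspace(-2, 0, 20)
        curve = hazard_curve(levels, [1e-3, 5e-4, 2e-4],
                             [0.05, 0.12, 0.30], [0.6, 0.6, 0.6])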

  7. Characteristics of Swarm Seismicity in Northern California

    NASA Astrophysics Data System (ADS)

    Chiorini, S.; Lekic, V.

    2017-12-01

    Seismic swarms are characterized by an anomalously large number of earthquakes, relative to the background rate of seismicity, that are tightly clustered in space (typically, one to tens of kilometers) and time (typically, days to weeks). However, why and how swarms occur is poorly understood, partly because of the difficulty of identifying the range of swarm behaviors within large seismic catalogs. Previous studies have found that swarms, compared to other earthquake sequences, appear to be more common in extensional (Vidale & Shearer, 2006) and volcanic settings (Hayashi & Morita, 2003). In addition, swarms more commonly exhibit migration patterns, consistent with either fluid diffusion (Chen & Shearer, 2011; Chen et al., 2012) or aseismic creep (Lohman & McGuire, 2007), and are preferentially found in areas of enhanced heat flow (Enescu, 2009; Zaliapin & Ben Zion, 2016). While the swarm seismicity of Southern California has been studied extensively, that of Northern California has not been systematically documented and characterized. We employed two complementary methods of swarm identification: the approach of Vidale and Shearer (2006; henceforth VS2006) based on a priori threshold distances and timings of quakes, and the spatio-temporal distance metric proposed by Zaliapin et al. (2008; henceforth Z2008), in order to build a complete catalog of swarm seismicity in Northern California spanning 1984-2016 (Waldhauser & Schaff, 2008). Once filtered for aftershocks, the catalog allows us to describe the main features of swarm seismicity in Northern California, including spatial distribution, association or lack thereof with known faults and volcanic systems, and seismically quiescent regions. We then apply a robust technique to characterize the morphology of swarms, leading to subsets of swarms that are oriented either vertically or horizontally in space. When mapped, vertical swarms show a significant association with volcanic regions, and horizontal swarms with tectonic and volcanic settings. Finally, we demonstrate that some swarms underlie areas that are typically quiescent, such as Redding and the Sutter Buttes. Our catalog presents a more complete snapshot of the diversity of swarm characteristics in Northern California, and motivates further study of their relationship to tectonic and volcanic processes.
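    The Z2008 metric referenced above assigns each event a nearest-neighbor parent using a combined space-time-magnitude distance, eta = t * r^d_f * 10^(-b*m), where t is the inter-event time, r the epicentral distance, d_f a fractal dimension, b the Gutenberg-Richter b-value, and m the candidate parent's magnitude. A minimal sketch follows, assuming times in years, coordinates in degrees, a flat-Earth distance conversion, and illustrative parameter values; it is not the exact implementation used to build the catalog.

        import numpy as np

        def nearest_neighbor_distances(times, lons, lats, mags, b=1.0, d_f=1.6):
            # For each event j, find the parent i < j minimizing
            # eta_ij = t_ij * r_ij**d_f * 10**(-b * m_i)  (Zaliapin et al., 2008).
            times, lons = np.asarray(times, float), np.asarray(lons, float)
            lats, mags = np.asarray(lats, float), np.asarray(mags, float)
            km_per_deg = 111.19
            n = len(times)
            eta = np.full(n, np.inf)
            parent = np.full(n, -1, dtype=int)
            for j in range(1, n):
                dt = times[j] - times[:j]                        # inter-event times
                dx = (lons[j] - lons[:j]) * km_per_deg * np.cos(np.radians(lats[j]))
                dy = (lats[j] - lats[:j]) * km_per_deg
                r = np.hypot(dx, dy)                             # epicentral distance, km
                eta_j = dt * np.maximum(r, 0.1) ** d_f * 10.0 ** (-b * mags[:j])
                k = int(np.argmin(eta_j))
                eta[j], parent[j] = eta_j[k], k
            return eta, parent

    Thresholding log10(eta) then separates clustered events (swarms and aftershock sequences) from background seismicity.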

  8. The shallow elastic structure of the lunar crust: New insights from seismic wavefield gradient analysis

    NASA Astrophysics Data System (ADS)

    Sollberger, David; Schmelzbach, Cedric; Robertsson, Johan O. A.; Greenhalgh, Stewart A.; Nakamura, Yosio; Khan, Amir

    2016-10-01

    Enigmatic lunar seismograms recorded during the Apollo 17 mission in 1972 have so far precluded the identification of shear-wave arrivals and hence the construction of a comprehensive elastic model of the shallow lunar subsurface. Here, for the first time, we extract shear-wave information from the Apollo active seismic data using a novel waveform analysis technique based on spatial seismic wavefield gradients. The star-like recording geometry of the active seismic experiment lends itself surprisingly well to computing spatial wavefield gradients and rotational ground motion as a function of time. These observables, which are new to seismic exploration in general, allowed us to identify shear waves in the complex lunar seismograms and to derive a new model of seismic compressional and shear-wave velocities in the shallow lunar crust, critical to understanding its lithology and constitution, and its impact on other geophysical investigations of the Moon's deep interior.

  9. The Geological Susceptibility of Induced Earthquakes in the Duvernay Play

    NASA Astrophysics Data System (ADS)

    Pawley, Steven; Schultz, Ryan; Playter, Tiffany; Corlett, Hilary; Shipman, Todd; Lyster, Steven; Hauck, Tyler

    2018-02-01

    Presently, consensus on the incorporation of induced earthquakes into seismic hazard has yet to be established. For example, the nonstationary, spatiotemporal nature of induced earthquakes is not well understood. Specific to the Western Canada Sedimentary Basin, geological bias in seismogenic activation potential has been suggested to control the spatial distribution of induced earthquakes regionally. In this paper, we train a machine learning algorithm to systematically evaluate tectonic, geomechanical, and hydrological proxies suspected to control induced seismicity. Feature importance suggests that proximity to basement, in situ stress, proximity to fossil reef margins, lithium concentration, and rate of natural seismicity are among the strongest model predictors. Our derived seismogenic potential map faithfully reproduces the current distribution of induced seismicity and is suggestive of other regions which may be prone to induced earthquakes. The refinement of induced seismicity geological susceptibility may become an important technique to identify significant underlying geological features and address induced seismic hazard forecasting issues.
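    The abstract does not name the specific algorithm, so the sketch below uses a random forest purely because it exposes feature importances of the kind described; the feature names, synthetic data, and labels are hypothetical placeholders for the geological proxies listed above.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        # Hypothetical feature matrix: one row per grid cell, columns are proxies
        # such as distance to basement, in-situ stress, distance to reef margins,
        # formation-water lithium, and natural seismicity rate.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 5))
        y = rng.integers(0, 2, size=500)          # 1 = induced seismicity observed

        feature_names = ["dist_to_basement", "in_situ_stress", "dist_to_reef_margin",
                         "lithium_conc", "natural_seismicity_rate"]

        clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
        for name, imp in sorted(zip(feature_names, clf.feature_importances_),
                                key=lambda p: -p[1]):
            print(f"{name:>25s}  {imp:.3f}")

        # Predicted probabilities over a grid give a seismogenic-potential map.
        potential = clf.predict_proba(X)[:, 1]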

  10. Advances in Predicting Magnetic Fields on the Far Side of the Sun

    NASA Astrophysics Data System (ADS)

    Lindsey, C. A.

    2016-12-01

    Techniques in local solar seismology applied to observations of seismic oscillations in the Sun's near hemisphere allow us to map large magnetic regions in the Sun's far hemisphere. Seismic signatures are not nearly as sensitive to magnetic flux as observations in electromagnetic radiation. However, they clearly identify and locate the 400 or so largest active regions in a typical solar cycle, i.e., those of most concern for space-weather forecasting. By themselves, seismic observations are insensitive to magnetic polarity. However, the Hale polarity law offers tantalizing avenues for guessing polarity distributions from seismic signatures as they evolve. I will review what we presently know about the relationship between seismic signatures of active regions and their magnetic and radiative properties, and offer a preliminary assessment of the potential of far-side seismic maps for space-weather forecasting in the coming decade.

  11. Borehole seismic monitoring of seismic stimulation at Occidental Permian Ltd's South Wasson Clear Fork Unit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daley, Tom; Majer, Ernie

    2007-04-30

    Seismic stimulation is a proposed enhanced oil recovery (EOR) technique which uses seismic energy to increase oil production. As part of an integrated research effort (theory, lab and field studies), LBNL has been measuring the seismic amplitude of various stimulation sources in various oil fields (Majer et al., 2006; Roberts et al., 2001; Daley et al., 1999). The amplitude of the seismic waves generated by a stimulation source is an important parameter for increased oil mobility in both theoretical models and laboratory core studies. The seismic amplitude, typically in units of seismic strain, can be measured in situ by use of a borehole seismometer (geophone). Measuring the distribution of amplitudes within a reservoir could allow improved design of stimulation source deployment. In March 2007, we provided in-field monitoring of two stimulation sources operating in Occidental (Oxy) Permian Ltd's South Wasson Clear Fork Unit (SWCU), located near Denver City, Texas. The stimulation source is a downhole fluid pulsation device developed by Applied Seismic Research Corp. (ASR). Our monitoring used a borehole wall-locking 3-component geophone operating in two nearby wells.

  12. Building configuration and seismic design: The architecture of earthquake resistance

    NASA Astrophysics Data System (ADS)

    Arnold, C.; Reitherman, R.; Whitaker, D.

    1981-05-01

    The architecture of a building in relation to its ability to withstand earthquakes was examined. Aspects of ground motion which are significant to building behavior are discussed. Results of a survey of configuration decisions that affect the performance of buildings are provided, with a focus on the architectural aspects of configuration design. Configuration derivation, building type as it relates to seismic design, and seismic issues in the design process are examined. Case studies of the Veterans' Administration Hospital in Loma Linda, California, and the Imperial Hotel in Tokyo, Japan, are presented. The seismic design process is described, paying special attention to configuration issues. The need is stressed for guidelines, codes, and regulations to ensure design solutions that respect and balance the full range of architectural, engineering, and material influences on seismic hazards.

  13. Deep-towed high resolution seismic imaging II: Determination of P-wave velocity distribution

    NASA Astrophysics Data System (ADS)

    Marsset, B.; Ker, S.; Thomas, Y.; Colin, F.

    2018-02-01

    The acquisition of high resolution seismic data in deep waters requires the development of deep-towed seismic sources and receivers able to deal with the high hydrostatic pressure environment. The low-frequency piezoelectric transducer of the SYSIF (SYstème Sismique Fond) deep-towed seismic device complies with the former requirement by taking advantage of the coupling of a mechanical resonance (Janus driver) and a fluid resonance (Helmholtz cavity) to produce a large frequency bandwidth acoustic signal (220-1050 Hz). The ability to perform deep-towed multichannel seismic imaging with SYSIF was demonstrated in 2014; however, the ability to determine the P-wave velocity distribution was not achieved at that time. P-wave velocity analysis relies on the ratio between the source-receiver offset range and the depth of the seismic reflectors, so towing the seismic source and receivers closer to the sea bed provides a better geometry for P-wave velocity determination. However, technical issues related to the acoustic source directivity arise for this approach in the particular framework of piezoelectric sources. A signal processing sequence is therefore added to the initial processing flow. Data acquisition took place during the GHASS (Gas Hydrates, fluid Activities and Sediment deformations in the western Black Sea) cruise in the Romanian waters of the Black Sea. The results of the imaging processing are presented for two seismic data sets acquired over gas hydrates and gas-bearing sediments. The improvement in the final seismic resolution demonstrates the validity of the velocity model.
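    The velocity analysis referred to above conventionally scans candidate stacking velocities and measures the coherence (semblance) of hyperbolically corrected arrivals across offsets. A minimal sketch of that generic step is shown below; it assumes a single CMP gather stored as a NumPy array and is not SYSIF's actual processing sequence.

        import numpy as np

        def nmo_semblance(cmp_gather, offsets, dt, velocities, window=5):
            # cmp_gather: (n_samples, n_traces); offsets in metres; dt in seconds.
            n_samples, n_traces = cmp_gather.shape
            t0 = np.arange(n_samples) * dt                    # zero-offset times
            semblance = np.zeros((n_samples, len(velocities)))
            kernel = np.ones(window) / window
            for iv, v in enumerate(velocities):
                corrected = np.zeros_like(cmp_gather)
                for itr, x in enumerate(offsets):
                    # hyperbolic moveout t(x) = sqrt(t0^2 + x^2 / v^2)
                    t = np.sqrt(t0 ** 2 + (x / v) ** 2)
                    corrected[:, itr] = np.interp(t, t0, cmp_gather[:, itr])
                num = np.sum(corrected, axis=1) ** 2
                den = n_traces * np.sum(corrected ** 2, axis=1) + 1e-12
                # smooth numerator and denominator over a short time window
                semblance[:, iv] = (np.convolve(num, kernel, 'same') /
                                    np.convolve(den, kernel, 'same'))
            return semblance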

  14. Seismic tomography as a tool for measuring stress in mines

    USGS Publications Warehouse

    Scott, Douglas F.; Williams, T.J.; Denton, D.K.; Friedel, M.J.

    1999-01-01

    Spokane Research Center personnel have been investigating the use of seismic tomography to monitor the behavior of a rock mass, detect hazardous ground conditions and assess the mechanical integrity of a rock mass affected by mining. Seismic tomography can be a valuable tool for determining relative stress in deep, >1,220-m (>4,000-ft), underground pillars. If high-stress areas are detected, they can be destressed prior to development or they can be avoided. High-stress areas can be monitored with successive seismic surveys to determine if stress decreases to a level where development can be initiated safely. There are several benefits to using seismic tomography to identify high stress in deep underground pillars. The technique is reliable, cost-effective, efficient and noninvasive. Also, investigators can monitor large rock masses, as well as monitor pillars during the mining cycle. By identifying areas of high stress, engineers will be able to assure that miners are working in a safer environment.

  15. EMERALD: A Flexible Framework for Managing Seismic Data

    NASA Astrophysics Data System (ADS)

    West, J. D.; Fouch, M. J.; Arrowsmith, R.

    2010-12-01

    The seismological community is challenged by the vast quantity of new broadband seismic data provided by large-scale seismic arrays such as EarthScope’s USArray. While this bonanza of new data enables transformative scientific studies of the Earth’s interior, it also illuminates limitations in the methods used to prepare and preprocess those data. At a recent seismic data processing focus group workshop, many participants expressed the need for better systems to minimize the time and tedium spent on data preparation in order to increase the efficiency of scientific research. Another challenge related to data from all large-scale transportable seismic experiments is that there currently exists no system for discovering and tracking changes in station metadata. This critical information, such as station location, sensor orientation, instrument response, and clock timing data, may change over the life of an experiment and/or be subject to post-experiment correction. Yet nearly all researchers utilize metadata acquired with the downloaded data, even though subsequent metadata updates might alter or invalidate results produced with older metadata. A third long-standing issue for the seismic community is the lack of easily exchangeable seismic processing codes. This problem stems directly from the storage of seismic data as individual time series files, and the history of each researcher developing his or her preferred data file naming convention and directory organization. Because most processing codes rely on the underlying data organization structure, such codes are not easily exchanged between investigators. To address these issues, we are developing EMERALD (Explore, Manage, Edit, Reduce, & Analyze Large Datasets). The goal of the EMERALD project is to provide seismic researchers with a unified, user-friendly, extensible system for managing seismic event data, thereby increasing the efficiency of scientific enquiry. EMERALD stores seismic data and metadata in a state-of-the-art open source relational database (PostgreSQL), and can, on a timed basis or on demand, download the most recent metadata, compare it with previously acquired values, and alert the user to changes. The backend relational database is capable of easily storing and managing many millions of records. The extensible, plug-in architecture of the EMERALD system allows any researcher to contribute new visualization and processing methods written in any of 12 programming languages, and a central Internet-enabled repository for such methods provides users with the opportunity to download, use, and modify new processing methods on demand. EMERALD includes data acquisition tools allowing direct importation of seismic data, and also imports data from a number of existing seismic file formats. Pre-processed clean sets of data can be exported as standard sac files with user-defined file naming and directory organization, for use with existing processing codes. The EMERALD system incorporates existing acquisition and processing tools, including SOD, TauP, GMT, and FISSURES/DHI, making much of the functionality of those tools available in a unified system with a user-friendly web browser interface. EMERALD is now in beta test. See emerald.asu.edu or contact john.d.west@asu.edu for more details.
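    The metadata change-tracking idea described above can be sketched as a comparison of two snapshots of station metadata keyed by network and station code. The dataclass fields and function below are illustrative placeholders, not EMERALD's schema or API; in the real system the snapshots would live in PostgreSQL tables rather than Python dictionaries.

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class StationMetadata:
            # minimal, hypothetical station metadata record
            network: str
            station: str
            latitude: float
            longitude: float
            sensor_azimuth: float
            response_id: str

        def metadata_changes(old, new):
            # Compare two snapshots keyed by (network, station); return changed fields.
            changes = {}
            for key, new_rec in new.items():
                old_rec = old.get(key)
                if old_rec is None:
                    changes[key] = {"added": new_rec}
                    continue
                diffs = {f: (getattr(old_rec, f), getattr(new_rec, f))
                         for f in new_rec.__dataclass_fields__
                         if getattr(old_rec, f) != getattr(new_rec, f)}
                if diffs:
                    changes[key] = diffs
            return changes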

  16. Cavity Detection and Delineation Research. Report 2. Seismic Methodology: Medford Cave Site, Florida.

    DTIC Science & Technology

    1983-06-01

    energy. A distance of 50 ft was maintained between source and detector for one test and 25 ft for the other tests. Since the seismic unit was capable...during the tests. After a recording was made, the seismic source and geophone were each moved 5 ft, thus maintaining the 50- or 25-ft source-to-detector ...produced by cavities; therefore, detection using this technique was not achieved. The sensitivity of the uphole refraction method to the presence of

  17. Correlative weighted stacking for seismic data in the wavelet domain

    USGS Publications Warehouse

    Zhang, S.; Xu, Y.; Xia, J.; ,

    2004-01-01

    Horizontal stacking plays a crucial role in modern seismic data processing, for it not only suppresses random noise and multiple reflections but also provides foundational data for subsequent migration and inversion. However, a number of examples have shown that random noise in adjacent traces exhibits correlation and coherence. Average stacking and weighted stacking based on the conventional correlation function can therefore both produce false events caused by noise. The wavelet transform and high-order statistics are very useful methods in modern signal processing. The multiresolution analysis in wavelet theory can decompose a signal on different scales, and high-order correlation functions can suppress correlated noise, for which the conventional correlation function is of no use. Based on the theory of the wavelet transform and high-order statistics, the high-order correlative weighted stacking (HOCWS) technique is presented in this paper. Its essence is to stack common-midpoint gathers after normal moveout correction using weights calculated through high-order correlative statistics in the wavelet domain. Synthetic examples demonstrate its advantages in improving the signal-to-noise (S/N) ratio and suppressing correlated random noise.
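    A minimal sketch of wavelet-domain weighted stacking is given below, assuming the PyWavelets package and NMO-corrected traces in a NumPy array. The weights here use a fourth-power correlation with a median pilot trace merely to stand in for the higher-order statistics; the exact HOCWS weighting of the paper is not reproduced.

        import numpy as np
        import pywt

        def wavelet_weighted_stack(gather, wavelet="db4", level=4):
            # gather: (n_traces, n_samples) NMO-corrected CMP gather.
            pilot = np.median(gather, axis=0)
            pilot_coeffs = pywt.wavedec(pilot, wavelet, level=level)
            trace_coeffs = [pywt.wavedec(tr, wavelet, level=level) for tr in gather]

            stacked = []
            for s, pc in enumerate(pilot_coeffs):            # loop over scales
                band = np.array([tc[s] for tc in trace_coeffs])
                # fourth-power correlation as a crude higher-order similarity proxy
                w = np.array([max(np.corrcoef(b, pc)[0, 1], 0.0) ** 4 for b in band])
                w = w / (w.sum() + 1e-12)
                stacked.append((w[:, None] * band).sum(axis=0))
            return pywt.waverec(stacked, wavelet)[: gather.shape[1]]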

  18. First seismic shear wave velocity profile of the lunar crust as extracted from the Apollo 17 active seismic data by wavefield gradient analysis

    NASA Astrophysics Data System (ADS)

    Sollberger, David; Schmelzbach, Cedric; Robertsson, Johan O. A.; Greenhalgh, Stewart A.; Nakamura, Yosio; Khan, Amir

    2016-04-01

    We present a new seismic velocity model of the shallow lunar crust, including, for the first time, shear wave velocity information. So far, the shear wave velocity structure of the lunar near-surface was effectively unconstrained due to the complexity of lunar seismograms. Intense scattering and low attenuation in the lunar crust lead to characteristic long-duration reverberations on the seismograms. The reverberations obscure later arriving shear waves and mode conversions, rendering them impossible to identify and analyze. Additionally, only vertical component data were recorded during the Apollo active seismic experiments, which further compromises the identification of shear waves. We applied a novel processing and analysis technique to the data of the Apollo 17 lunar seismic profiling experiment (LSPE), which involved recording seismic energy generated by several explosive packages on a small areal array of four vertical component geophones. Our approach is based on the analysis of the spatial gradients of the seismic wavefield and yields key parameters such as apparent phase velocity and rotational ground motion as a function of time (depth), which cannot be obtained through conventional seismic data analysis. These new observables significantly enhance the data for interpretation of the recorded seismic wavefield and allow, for example, for the identification of S-wave arrivals based on their lower apparent phase velocities and their distinctly higher amount of generated rotational motion relative to compressional (P) waves. Using our methodology, we successfully identified pure-mode and mode-converted refracted shear wave arrivals in the complex LSPE data and derived a P- and S-wave velocity model of the shallow lunar crust at the Apollo 17 landing site. The extracted elastic-parameter model supports the current understanding of the lunar near-surface structure, suggesting a thin layer of low-velocity lunar regolith overlying a heavily fractured crust of basaltic material with high Poisson's ratios (>0.4 down to 60 m depth). Our new model can be used in future studies to better constrain the deep interior of the Moon. Given the rich information derived from the minimalistic recording configuration, our results demonstrate that wavefield gradient analysis should be critically considered for future space missions that aim to explore the interior structure of extraterrestrial objects by seismic methods. Additionally, we anticipate that the proposed shear wave identification methodology can also be applied to the routinely recorded vertical component data from land seismic exploration on Earth.
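    The core of the gradient-based analysis is that, for a locally plane wave, the ratio of spatial to temporal wavefield derivatives yields the horizontal slowness and hence the apparent phase velocity used here to separate P and S arrivals. The sketch below illustrates that relation under a plane-wave assumption for a small array of vertical-component geophones; it is a schematic reconstruction, not the authors' code.

        import numpy as np

        def apparent_velocity_from_gradient(traces, coords, dt, ref=0):
            # traces: (n_receivers, n_samples); coords: (n_receivers, 2) in metres.
            # For a plane wave u(x, t) = f(t - p.x), du/dx = -p_x * du/dt, so the
            # ratio of spatial to temporal gradients gives the slowness components.
            dudt = np.gradient(traces[ref], dt)                   # temporal derivative
            rel = coords - coords[ref]
            G = np.hstack([np.ones((len(coords), 1)), rel])       # [1, dx, dy] plane fit
            coef, *_ = np.linalg.lstsq(G, traces, rcond=None)     # (3, n_samples)
            dudx, dudy = coef[1], coef[2]
            mask = np.abs(dudt) > 0.05 * np.abs(dudt).max()       # avoid tiny du/dt
            px = np.where(mask, -dudx / dudt, np.nan)
            py = np.where(mask, -dudy / dudt, np.nan)
            slowness = np.hypot(px, py)
            return 1.0 / slowness                                 # apparent velocity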

  19. Microseismic monitoring of soft-rock landslide: contribution of a 3D velocity model for the location of seismic sources.

    NASA Astrophysics Data System (ADS)

    Floriane, Provost; Jean-Philippe, Malet; Cécile, Doubre; Julien, Gance; Alessia, Maggi; Agnès, Helmstetter

    2015-04-01

    Characterizing the micro-seismic activity of landslides is important for a better understanding of the physical processes controlling landslide behaviour. However, the location of the seismic sources on landslides is a challenging task, mostly because of (a) the recording system geometry, (b) the lack of clear P-wave arrivals and clear wave differentiation, and (c) the heterogeneous velocities of the ground. The objective of this work is therefore to test whether the integration of a 3D velocity model in probabilistic seismic source location codes improves the quality of the determination, especially in depth. We studied the clay-rich landslide of Super-Sauze (French Alps). Most of the seismic events (rockfalls, slidequakes, tremors...) are generated in the upper part of the landslide near the main scarp. The seismic recording system is composed of two antennas with four vertical seismometers each, located on the east and west sides of the seismically active part of the landslide. A refraction seismic campaign was conducted in August 2014 and a 3D P-wave model was estimated using the Quasi-Newton tomography inversion algorithm. The shots of the seismic campaign are used as calibration shots to test the performance of the different location methods and to further update the 3D velocity model. Natural seismic events are detected with a semi-automatic technique using a frequency threshold. The first arrivals are picked using a kurtosis-based method and compared to manual picking. Several location methods were finally tested. We compared a non-linear probabilistic method coupled with the 3D P-wave model and a beam-forming method inverted for an apparent velocity. We found that the Quasi-Newton tomography inversion algorithm provides results coherent with the original underlying topography. The velocity ranges from 500 m.s-1 at the surface to 3000 m.s-1 in the bedrock. For the majority of the calibration shots, the use of a 3D velocity model significantly improves the results of the location procedure using P-wave arrivals. All the shots were made 50 centimeters below the surface and hence the vertical error could not be determined with the seismic campaign. We further discriminate the rockfalls and the slidequakes occurring on the landslide using the depths computed with the 3D velocity model. This could be an additional criterion to automatically classify the events.
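    The kurtosis-based picking mentioned above exploits the fact that the kurtosis of a sliding window jumps when an impulsive arrival enters the window. A minimal sketch follows, assuming a single trace as a NumPy array and SciPy's kurtosis; the window length and the gradient-maximum pick rule are illustrative choices, not the exact picker used in the study.

        import numpy as np
        from scipy.stats import kurtosis

        def kurtosis_picker(trace, dt, window_s=0.1):
            # Pick the first arrival at the maximum positive gradient of the
            # running kurtosis computed over a trailing window of length window_s.
            n_win = max(int(window_s / dt), 8)
            k = np.zeros(len(trace))
            for i in range(n_win, len(trace)):
                k[i] = kurtosis(trace[i - n_win:i])
            pick_index = int(np.argmax(np.gradient(k)))
            return pick_index * dt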

  20. On the difficulties of detecting PP precursors

    NASA Astrophysics Data System (ADS)

    Lessing, Stephan; Thomas, Christine; Saki, Morvarid; Schmerr, Nicholas; Vanacore, Elizabeth

    2015-06-01

    The PP precursors are seismic waves that form from underside reflections of P waves off discontinuities in the upper mantle transition zone (MTZ). These seismic phases are used to map discontinuity topography, sharpness, and impedance contrasts; the resulting structural variations are then often interpreted as evidence for temperature and/or mineralogy variations within the mantle. The PP precursors as well as other seismic phases have been used to establish the global presence of seismic discontinuities at 410 and 660 km depth. Intriguingly, in more than 80 per cent of PP precursor observations the seismic wave amplitudes are significantly weaker than the amplitudes predicted by seismic reference models. Even more perplexing is the observation that 1-5 per cent of all earthquakes (which are 20-25 per cent of earthquakes with clear PP waveforms) do not show any evidence for the PP precursors from the discontinuities even in the presence of well-developed PP waveforms. Non-detections are found in six different data sets consisting of tens to hundreds of events. We use synthetic modelling to examine a suite of factors that could be responsible for the absence of the PP precursors. The take-off angles for PP and the precursors differ by only 1.2-1.5°; thus source-related complexity would affect PP and the precursors. A PP wave attenuated in the upper mantle would increase the relative amplitude of the PP precursors. Attenuation within the transition zone could reduce precursor amplitudes, but this would be a regional phenomenon restricted to particular source receiver geometries. We also find little evidence for deviations from the theoretical travel path of seismic rays expected for scattered arrivals. Factors that have a strong influence include the stacking procedures used in seismic array techniques in the presence of large, interfering phases, the presence of topography on the discontinuities on the order of tens of kilometres, and 3-D lateral heterogeneity in the velocity and density changes with depth across the transition zone. We also compare the observed precursors' amplitudes with seismic models from calculations of phase equilibria and find that a seismic velocity model derived from a pyrolite composition reproduces the data better than the currently available 1-D earth models. This largely owes to the pyrolite models producing a stronger minimum in the reflection coefficient across the epicentral distances where the reduction in amplitudes of the PP precursors is observed. To suppress the precursors entirely in a small subset of earthquakes, other effects, such as localized discontinuity topography and seismic signal processing effects are required in addition to the changed velocity model.

  1. Seismic gradiometry using ambient seismic noise in an anisotropic Earth

    NASA Astrophysics Data System (ADS)

    de Ridder, S. A. L.; Curtis, A.

    2017-05-01

    We introduce a wavefield gradiometry technique to estimate both isotropic and anisotropic local medium characteristics from short recordings of seismic signals by inverting a wave equation. The method exploits the information in the spatial gradients of a seismic wavefield that are calculated using dense deployments of seismic arrays. The application of the method uses the surface wave energy in the ambient seismic field. To estimate isotropic and anisotropic medium properties we invert an elliptically anisotropic wave equation. The spatial derivatives of the recorded wavefield are evaluated by calculating finite differences over nearby recordings, which introduces a systematic anisotropic error. A two-step approach corrects this error: finite difference stencils are first calibrated, then the output of the wave-equation inversion is corrected using the linearized impulse response to the inverted velocity anomaly. We test the procedure on ambient seismic noise recorded in a large and dense ocean bottom cable array installed over Ekofisk field. The estimated azimuthal anisotropy forms a circular geometry around the production-induced subsidence bowl. This conforms with results from studies employing controlled sources, and with interferometry correlating long records of seismic noise. Yet in this example, the results were obtained using only a few minutes of ambient seismic noise.
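    The wave-equation inversion underlying the method can be illustrated in its simplest isotropic form: fit the second time derivative of the recorded wavefield against its spatial Laplacian to recover a local squared velocity at every array node. The sketch below assumes a wavefield already interpolated onto a regular grid of shape (nt, nx, ny) and ignores the elliptical anisotropy and the finite-difference calibration step described in the abstract.

        import numpy as np

        def local_velocity_from_gradiometry(u, dx, dt):
            # For u_tt = c^2 * (u_xx + u_yy), a least-squares fit of u_tt against
            # the Laplacian at each grid node over time gives c^2.
            u_tt = np.gradient(np.gradient(u, dt, axis=0), dt, axis=0)
            u_xx = np.gradient(np.gradient(u, dx, axis=1), dx, axis=1)
            u_yy = np.gradient(np.gradient(u, dx, axis=2), dx, axis=2)
            lap = u_xx + u_yy
            num = np.sum(u_tt * lap, axis=0)          # least squares over time samples
            den = np.sum(lap * lap, axis=0) + 1e-20
            c2 = num / den
            return np.sqrt(np.clip(c2, 0.0, None))    # velocity map over the array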

  2. NON-INVASIVE DETERMINATION OF THE LOCATION AND DISTRIBUTION OF FREE-PHASE DENSE NONAQUEOUS PHASE LIQUIDS (DNAPL) BY SEISMIC REFLECTION TECHNIQUES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael G. Waddell; William J. Domoracki; Jerome Eyer

    2003-01-01

    The Earth Sciences and Resources Institute, University of South Carolina is conducting a proof of concept study to determine the location and distribution of subsurface DNAPL carbon tetrachloride (CCl{sub 4}) contamination at the 216-Z-9 crib, 200 West area, DOE Hanford Site, Washington by use of two-dimensional high-resolution seismic reflection surveys and borehole geophysical data. The study makes use of recent advances in seismic reflection amplitude versus offset (AVO) technology to directly detect the presence of subsurface DNAPL. The techniques proposed are noninvasive means of site characterization and direct free-phase DNAPL detection. This final report covers the results of Tasks 1, 2, and 3. Task (1) contains site evaluation and seismic modeling studies. The site evaluation consists of identifying and collecting preexisting geological and geophysical information regarding subsurface structure and the presence and quantity of DNAPL. The seismic modeling studies were undertaken to determine the likelihood that an AVO response exists and its probable manifestation. Task (2) is the design and acquisition of 2-D seismic reflection data to image areas of probable high concentration of DNAPL. Task (3) is the processing and interpretation of the 2-D data. During the commission of these tasks four seismic reflection profiles were collected. Subsurface velocity information was obtained by vertical seismic profile surveys in three wells. The interpretation of these data is in two parts. Part one is the construction and interpretation of structural contour maps of the contact between the Hanford Fine unit and the underlying Plio/Pleistocene unit and of the contact between the Plio/Pleistocene unit and the underlying caliche layer. These two contacts were determined to be the most likely surfaces to contain the highest concentration of CCl{sub 4}. Part two of the interpretation uses the results of the AVO modeling to locate any seismic amplitude anomalies that might be associated with the presence of high concentrations of CCl{sub 4}. Based on the modeling results, three different methods of AVO analysis were performed on the seismic data: enhanced amplitude stacks, offset range limited stacks, and gradient stacks. Seismic models indicate that the reflection from the contact between the Hanford Fine and the Plio/Pleistocene should exhibit amplitude variations where there are high concentrations of CCl{sub 4}. A series of different scenarios were modeled. In the first scenario the Hanford Fine pores are 100% saturated with CCl{sub 4} and the underlying Plio/Pleistocene pores are saturated with air. In this scenario the reflection coefficients are slightly negative at small angles of incidence and become increasingly more negative at larger angles of incidence (dim-out). In the second scenario the Hanford Fine pores are saturated with air and the Plio/Pleistocene pores are saturated with CCl{sub 4}. In this scenario the reflection coefficients are slightly positive at small angles of incidence and become negative at large angles of incidence (polarity reversal). In the third scenario both the Hanford Fine and the Plio/Pleistocene pores are saturated with CCl{sub 4}. In this scenario the reflection coefficients at small angles of incidence are slightly positive, but much less than the background response, and with increasing angle of incidence the reflection coefficients become slightly more positive.
On the field data, areas where extraction wells have high concentrations of CCl{sub 4} show a corresponding dim-out and/or polarity reversal.
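    The AVO behaviour described above (dim-out, polarity reversal) can be approximated with the two-term Shuey form of the reflection coefficient, R(theta) ≈ R0 + G sin²(theta). The sketch below is a generic illustration of that approximation with made-up elastic properties; it does not reproduce the report's actual modelling parameters.

        import numpy as np

        def shuey_reflectivity(vp1, vs1, rho1, vp2, vs2, rho2, angles_deg):
            # Two-term Shuey approximation: intercept R0 and gradient G computed
            # from the elastic contrasts across the interface.
            theta = np.radians(np.asarray(angles_deg, dtype=float))
            vp, vs, rho = (vp1 + vp2) / 2, (vs1 + vs2) / 2, (rho1 + rho2) / 2
            dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
            r0 = 0.5 * (dvp / vp + drho / rho)
            g = 0.5 * dvp / vp - 2 * (vs / vp) ** 2 * (drho / rho + 2 * dvs / vs)
            return r0 + g * np.sin(theta) ** 2

        # usage: hypothetical interface with a fluid change below it
        angles = np.arange(0, 41, 5)
        print(shuey_reflectivity(1800, 900, 1900, 2100, 1000, 2000, angles))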

  3. Seismic signature of crustal magma and fluid from deep seismic sounding data across Tengchong volcanic area

    NASA Astrophysics Data System (ADS)

    Bai, Z. M.; Zhang, Z. Z.; Wang, C. Y.; Klemperer, S. L.

    2012-04-01

    The weakened lithosphere around the eastern syntaxis of the Tibetan plateau has been revealed by average Pn and Sn velocities, 3D upper-mantle P-wave and S-wave velocity variations, and the imaging results of magnetotelluric data. The Tengchong volcanic area neighbours the core of the eastern syntaxis and is famous for its hot springs, volcanic-geothermal activity and remarkable seismicity within mainland China. To probe the deep environment of the Tengchong volcanic-geothermal activity, a deep seismic sounding (DSS) project was carried out across this area in 1999. In this paper the seismic signature of crustal magma and fluid is explored from the DSS data with the seismic attribute fusion (SAF) technique; four possible positions of magma generation, together with some locations of porous and fractured fluid-bearing rock beneath the Tengchong volcanic area, were disclosed from the final fusion image of multiple seismic attributes. The adopted attributes include the Vp, Vs and Vp/Vs results derived from a new inversion method based on the No-Ray-Tomography technique, and the migrated instantaneous attributes of central frequency, bandwidth and high-frequency energy of the pressure wave. Moreover, back-projected attributes, mainly consisting of the attenuation factor Qp, the delay time of shear-wave splitting, and the amplitude ratio between the S wave and the P wave plus S wave, were also considered in this fusion process. Our fusion image indicates the following mechanism for the surface springs: a large amount of heat and fluid released by the crystallization of magma is transmitted upward into fluid-filled rock, and the fluid upwells along conduits driven by the high pressure at depth, thus developing the widespread springs of the Tengchong volcanic area. Moreover, the fusion image, the regional volcanic and geothermal activity, and the seismicity suggest that the main risk of volcanic eruption is concentrated to the south of Tengchong city, especially around the shot point (SP) Tuantian. There are typical tectonic and deep-origin mechanisms for the moderate-strong earthquakes near SP Tuantian, and precautions should be taken in this area against potential earthquakes. Our fusion image also clearly reveals two remarkable positions on the Moho discontinuity through which heat from the upper mantle is transmitted upward, which is attributed to the widely distributed hot material within the crust and upper mantle.

  4. Numerical simulation analysis on Wenchuan seismic strong motion in Hanyuan region

    NASA Astrophysics Data System (ADS)

    Chen, X.; Gao, M.; Guo, J.; Li, Z.; Li, T.

    2015-12-01

    The Wenchuan Ms 8.0 earthquake of May 12, 2008 in Sichuan Province caused 69,227 deaths, 374,643 injuries, 17,923 missing persons, direct economic losses of 845.1 billion yuan, and the collapse of a large number of houses; reproducing the characteristics of its strong ground motion and predicting its intensity distribution therefore play an important role in mitigating the disasters of similar giant earthquakes in the future. Taking Yunnan-Sichuan Province, Wenchuan town, Chengdu city, the Chengdu basin and its vicinity as the research area, and on the basis of the available three-dimensional velocity structure model and newly acquired topography data from ChinaArray of the Institute of Geophysics, China Earthquake Administration, two types of complex source rupture process models with global and local source parameters were established. We simulated the seismic wave propagation of the Wenchuan Ms 8.0 earthquake throughout the whole three-dimensional region using the GMS discrete-grid finite-difference technique with Cerjan absorbing boundary conditions, and obtained the seismic intensity distribution in this region by analyzing data from a 50×50 grid of simulated ground-motion output stations. The simulated results indicate that: (1) the simulated Wenchuan earthquake ground motion (PGA) and the main characteristics of the response spectrum are very similar to those of the real Wenchuan earthquake records. (2) The Wenchuan earthquake ground motion (PGA) and the response spectra of the plain are much greater than those of the mountain areas because of the low velocity of the shallow surface media and the basin effect of the Chengdu basin structure. (3) When the source rupture process inverted from far-field P-wave, GPS and InSAR data and the Longmenshan Front Fault are taken into consideration in the GMS numerical simulation, significantly different waveforms and frequency content of the ground motion are obtained; the strong-motion waveform is distinctly asymmetric, which should be closer to reality. This indicates that the Longmenshan Front Fault may also have been involved in seismic activity during the long (several minutes) Wenchuan earthquake rupture process. (4) Simulated earthquake records in the Hanyuan region are indeed very strong, which suggests that the source mechanism is one reason for the Hanyuan intensity anomaly.
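    The propagation scheme named above (a discrete-grid finite-difference solver with Cerjan absorbing boundaries) can be illustrated with a very small 2-D acoustic sketch. The code below is a schematic second-order example with an exponential Cerjan-style sponge, not the GMS software itself; grid sizes, damping constants and the source handling are illustrative, and stability (CFL) is not checked.

        import numpy as np

        def acoustic_fd_2d(vel, dx, dt, nt, src, src_pos, n_sponge=30, damp=0.0053):
            # vel: (nx, nz) velocity model (m/s); src: source wavelet of length nt;
            # src_pos: (ix, iz) grid indices of the source.
            nx, nz = vel.shape
            c2 = (vel * dt / dx) ** 2
            # Cerjan-style exponential damping profile near the edges
            taper = np.ones((nx, nz))
            for i in range(n_sponge):
                w = np.exp(-(damp * (n_sponge - i)) ** 2)
                taper[i, :] *= w; taper[nx - 1 - i, :] *= w
                taper[:, i] *= w; taper[:, nz - 1 - i] *= w
            p_prev = np.zeros((nx, nz)); p = np.zeros((nx, nz))
            for it in range(nt):
                lap = (np.roll(p, 1, 0) + np.roll(p, -1, 0) +
                       np.roll(p, 1, 1) + np.roll(p, -1, 1) - 4 * p)
                p_next = 2 * p - p_prev + c2 * lap      # second-order update
                p_next[src_pos] += src[it] * dt ** 2    # inject source
                p_prev, p = p * taper, p_next * taper   # apply sponge each step
            return p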

  5. Ground Displacement Measurement of the 2013 Balochistan Earthquake with interferometric TerraSAR-X ScanSAR data

    NASA Astrophysics Data System (ADS)

    Yague-Martinez, N.; Fielding, E. J.; Haghshenas-Haghighi, M.; Cong, X.; Motagh, M.

    2014-12-01

    This presentation will address the 24 September 2013 Mw 7.7 Balochistan Earthquake in western Pakistan from the point of view of interferometric processing algorithms of wide-swath TerraSAR-X ScanSAR images. The algorithms are also valid for TOPS acquisition mode, the operational mode of the Sentinel-1A ESA satellite that was successfully launched in April 2014. Spectral properties of burst-mode data and an overview of the interferometric processing steps of burst-mode acquisitions, emphasizing the importance of the co-registration stage, will be provided. A co-registration approach based on incoherent cross-correlation will be presented and applied to seismic scenarios. Moreover geodynamic corrections due to differential atmospheric path delay and differential solid Earth tides are considered to achieve accuracy in the order of several centimeters. We previously derived a 3D displacement map using cross-correlation techniques applied to optical images from Landsat-8 satellite and TerraSAR-X ScanSAR amplitude images. The Landsat-8 cross-correlation measurements cover two horizontal directions, and the TerraSAR-X displacements include both horizontal along-track and slant-range (radar line-of-sight) measurements that are sensitive to vertical and horizontal deformation. It will be justified that the co-seismic displacement map from TerraSAR-X ScanSAR data may be contaminated by postseismic deformation due to the fact that the post-seismic acquisition took place one month after the main shock, confirmed in part by a TerraSAR-X stripmap interferogram (processed with conventional InSAR) covering part of the area starting on 27 September 2013. We have arranged the acquisition of a burst-synchronized stack of TerraSAR-X ScanSAR images over the affected area after the earthquake. It will be possible to apply interferometry to these data to measure the lower magnitude of the expected postseismic displacements. The processing of single interferograms will be discussed. A quicklook of the wrapped differential TerraSAR-X ScanSAR co-seismic interferogram is provided in the attachment (range coverage is 100 km by using 4 subswaths).

  6. A bird's eye view: the cognitive strategies of experts interpreting seismic profiles

    NASA Astrophysics Data System (ADS)

    Bond, C. E.; Butler, R.

    2012-12-01

    Geoscience is perhaps unique in its reliance on incomplete datasets and building knowledge from their interpretation. This interpretation basis for the science is fundamental at all levels, from creation of a geological map to interpretation of remotely sensed data. To teach and understand better the uncertainties in dealing with incomplete data, we need to understand the strategies individual practitioners deploy that make them effective interpreters. The nature of interpretation is such that the interpreter needs to use their cognitive ability in the analysis of the data to propose a sensible solution in their final output that is consistent not only with the original data but also with other knowledge and understanding. In a series of experiments Bond et al. (2007, 2008, 2011, 2012) investigated the strategies and pitfalls of expert and non-expert interpretation of seismic images. These studies focused on large numbers of participants to provide a statistically sound basis for analysis of the results. The outcome of these experiments showed that techniques and strategies are more important than expert knowledge per se in developing successful interpretations. Experts are successful because of their application of these techniques. In a new set of experiments we have focused on a small number of experts to determine how they use their cognitive and reasoning skills in the interpretation of 2D seismic profiles. Live video and practitioner commentary were used to track the evolving interpretation and to gain insight on their decision processes. The outputs of the study allow us to create an educational resource of expert interpretation through online video footage and commentary with associated further interpretation and analysis of the techniques and strategies employed. This resource will be of use to undergraduate, post-graduate, industry and academic professionals seeking to improve their seismic interpretation skills, develop reasoning strategies for dealing with incomplete datasets, and for assessing the uncertainty in these interpretations. Bond, C.E. et al. (2012). 'What makes an expert effective at interpreting seismic images?' Geology, 40, 75-78. Bond, C. E. et al. (2011). 'When there isn't a right answer: interpretation and reasoning, key skills for 21st century geoscience'. International Journal of Science Education, 33, 629-652. Bond, C. E. et al. (2008). 'Structural models: Optimizing risk analysis by understanding conceptual uncertainty'. First Break, 26, 65-71. Bond, C. E. et al., (2007). 'What do you think this is?: "Conceptual uncertainty" In geoscience interpretation'. GSA Today, 17, 4-10.

  7. Metamorphic records of multiple seismic cycles during subduction

    PubMed Central

    Hacker, Bradley R.; Seward, Gareth G. E.; Kelley, Chris S.

    2018-01-01

    Large earthquakes occur in rocks undergoing high-pressure/low-temperature metamorphism during subduction. Rhythmic major-element zoning in garnet is a common product of such metamorphism, and one that must record a fundamental subduction process. We argue that rhythmic major-element zoning in subduction zone garnets from the Franciscan Complex, California, developed in response to growth-dissolution cycles driven by pressure pulses. Using electron probe microanalysis and novel techniques in Raman and synchrotron Fourier transform infrared microspectroscopy, we demonstrate that at least four such pressure pulses, of magnitude 100–350 MPa, occurred over less than 300,000 years. These pressure magnitude and time scale constraints are most consistent with the garnet zoning having resulted from periodic overpressure development-dissipation cycles, related to pore-fluid pressure fluctuations linked to earthquake cycles. This study demonstrates that some metamorphic reactions can track individual earthquake cycles and thereby opens new avenues to the study of seismicity. PMID:29568800

  8. A seismic reflection velocity study of a Mississippian mud-mound in the Illinois basin

    NASA Astrophysics Data System (ADS)

    Ranaweera, Chamila Kumari

    Two mud-mounds have been reported in the Ullin limestone near, but not in, the Aden oil field in Hamilton County, Illinois. One mud-mound is in the Broughton oil field of Hamilton County 25 miles to the south of Aden. The second mud-mound is in the Johnsonville oil field in Wayne County 20 miles to the north of Aden. Seismic reflection profiles were shot in 2012 adjacent to the Aden oil field to evaluate the oil prospects and to investigate the possibility of detecting Mississippian mud-mounds near the Aden field. A feature on one of the seismic profiles was interpreted to be a mud-mound or carbonate buildup. A well drilled at the location of this interpreted structure provided digital geophysical logs and geological logs used to refine the interpretation of the seismic profiles. Geological data from the new well at Aden, in the form of drill cuttings, have been used to essentially confirm the existence of a mud-mound in the Ullin limestone at a depth of 4300 feet. Geophysical well logs from the new well near Aden were used to create 1-D computer models and synthetic seismograms for comparison to the seismic data. The reflection seismic method is widely used to aid in interpreting subsurface geology. Processing seismic data is an important step in the method, as a properly processed seismic section can give a better image of the subsurface geology whereas a poorly processed section could mislead the interpretation. Seismic reflections will be more accurately depicted with careful determination of seismic velocities and by carefully choosing the processing steps and parameters. Various data processing steps have been applied and parameters refined to produce improved stacked seismic records. The resulting seismic records from the Aden field area indicate a seismic response similar to what is expected from a carbonate mud-mound. One-dimensional synthetic seismograms were created using the available sonic and density logs from the well drilled near the Aden seismic lines. The 1-D synthetics were used by Cory Cantrell of Royal Drilling and Producing Company to identify various reflections on the seismic records. Seismic data were compared with the modeled synthetic seismograms to identify what appears to be a carbonate mud-mound within the Aden study area. No mud-mounds have been previously found in the Aden oil field. Average and interval velocities obtained from the geophysical logs of the wells drilled in the Aden area were compared with the same type of well velocities from the Broughton known mud-mound area to observe the significance of velocity variation related to the unknown mud-mound in the Aden study area. The results of the velocity study show similar trends in the wells from both areas, with velocities increasing toward the bottom of the wells. Another approach compared root-mean-square velocities calculated from the sonic log of the Aden well with the stacking velocities obtained from the seismic data adjacent to the well.
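    The 1-D synthetic seismograms mentioned above follow a standard normal-incidence workflow: compute acoustic impedance from the sonic and density logs, convert impedance contrasts to reflection coefficients, and convolve with a wavelet. A generic sketch is given below, assuming logs already resampled to two-way time; it is not the commercial software used in the study.

        import numpy as np

        def ricker(f0, dt, length=0.128):
            # zero-phase Ricker wavelet with peak frequency f0 (Hz)
            t = np.arange(-length / 2, length / 2, dt)
            a = (np.pi * f0 * t) ** 2
            return (1 - 2 * a) * np.exp(-a)

        def synthetic_seismogram(vp, rho, wavelet):
            # vp, rho: velocity (m/s) and density (kg/m3) sampled in two-way time
            z = vp * rho                                   # acoustic impedance
            rc = (z[1:] - z[:-1]) / (z[1:] + z[:-1])       # reflection coefficients
            return np.convolve(rc, wavelet, mode="same")   # convolutional model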

  9. Solfatara volcano subsurface imaging: two different approaches to process and interpret multi-variate data sets

    NASA Astrophysics Data System (ADS)

    Bernardinetti, Stefano; Bruno, Pier Paolo; Lavoué, François; Gresse, Marceau; Vandemeulebrouck, Jean; Revil, André

    2017-04-01

    Reducing model uncertainty and producing more reliable geophysical images and interpretations is a fundamental requirement for geophysical techniques applied in complex environments such as Solfatara Volcano. The use of independent geophysical methods allows much information on the subsurface to be obtained, owing to the different sensitivities of the data towards parameters such as compressional and shear wave velocities, bulk electrical conductivity, or density. The joint processing of these multiple physical properties can lead to a very detailed characterization of the subsurface and therefore enhance our imaging and our interpretation. In this work, we develop two different processing approaches based on reflection seismology and seismic P-wave tomography on one hand, and on electrical data acquired over the same line on the other hand. From these data, we obtain an image-guided electrical resistivity tomography and a post-processing integration of the tomographic results. The image-guided electrical resistivity tomography is obtained by regularizing the inversion of the electrical data with structural constraints extracted from a migrated seismic section using image processing tools. This approach focuses the reconstruction of electrical resistivity anomalies along the features visible in the seismic section, and acts as a guide for interpretation in terms of subsurface structures and processes. To integrate co-registered P-wave velocity and electrical resistivity values, we apply a data mining tool, the k-means algorithm, to identify relationships between the two sets of variables. This algorithm identifies different clusters with the objective of minimizing the sum of squared Euclidean distances within each cluster and maximizing it between clusters for the multivariate data set. We obtain a partitioning of the multivariate data set into a finite number of well-correlated clusters, representative of the optimum clustering of our geophysical variables (P-wave velocities and electrical resistivities). The result is an integrated tomography that shows a finite number of homogeneous geophysical facies and therefore highlights the main geological features of the subsurface.
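    The clustering step described above can be sketched with scikit-learn's k-means applied to standardized pairs of co-registered P-wave velocity and resistivity values; the synthetic data, the choice of four clusters, and the log-resistivity scaling below are illustrative assumptions, not the values used for Solfatara.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.preprocessing import StandardScaler

        # Hypothetical co-registered values at the same tomography cells:
        # P-wave velocity (m/s) and log10 electrical resistivity (ohm.m).
        rng = np.random.default_rng(1)
        vp = rng.normal(1800, 400, 2000)
        log_rho = rng.normal(1.2, 0.5, 2000)
        X = StandardScaler().fit_transform(np.column_stack([vp, log_rho]))

        # k-means minimises the within-cluster sum of squared Euclidean distances,
        # yielding a small number of "geophysical facies" as in the abstract.
        labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)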

  10. Gabor Deconvolution as Preliminary Method to Reduce Pitfall in Deeper Target Seismic Data

    NASA Astrophysics Data System (ADS)

    Oktariena, M.; Triyoso, W.

    2018-03-01

    The anelastic attenuation process during seismic wave propagation is the trigger of the non-stationary character of seismic data. Absorption and scattering of energy cause seismic energy loss as depth increases. A series of thin reservoir layers found in the study area is located within the Talang Akar Fm. level, showing indications of an interpretation pitfall due to the attenuation effect that commonly occurs in deeper-level seismic data. The attenuation effect greatly influences the seismic image of the deeper target level, creating pitfalls in several respects. Seismic amplitude at the deeper target level often cannot represent the real subsurface character because of low amplitude values or chaotic events near the basement. In terms of frequency, the decay can be seen as diminishing frequency content at the deeper target. Meanwhile, seismic amplitude is a simple tool for pointing out a Direct Hydrocarbon Indicator (DHI) in a preliminary geophysical study before more advanced interpretation methods are applied. A quick look at the post-stack seismic data shows the reservoir associated with a bright-spot DHI, while another, larger bright-spot body is detected in the northeast area near the field edge. A horizon slice confirms the possibility that the other bright-spot zone has a smaller delineation; such interpretation pitfalls commonly occur at deeper seismic levels. We evaluate this pitfall by applying Gabor Deconvolution to address the attenuation problem. Gabor Deconvolution forms a partition of unity to factorize the trace into smaller convolution windows that can be processed as stationary packets. Gabor Deconvolution estimates the magnitude of the source signature alongside its attenuation function. The enhanced seismic shows better imaging in the pitfall area that was previously detected as a vast bright-spot zone. When the enhanced seismic is used for further advanced reprocessing, the seismic impedance and Vp/Vs ratio slices show better reservoir delineation, in which the pitfall area is reduced and parts of it revert to background lithology. Gabor Deconvolution removes the attenuation by performing Gabor-domain spectral division, which by extension also reduces interpretation pitfalls in deeper-target seismic data.
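
    For orientation, the sketch below illustrates the basic Gabor-domain workflow the abstract describes: Gaussian windows normalized to a partition of unity factorize the trace into quasi-stationary packets, a smoothed amplitude spectrum of each packet serves as a crude estimate of the local wavelet (source signature times attenuation), and spectral division whitens each packet before the packets are summed back. This is a minimal single-trace sketch under those assumptions; the window length, smoother width and stabilization constant are hypothetical, not the parameters used by the authors.

```python
import numpy as np

def gabor_deconvolution(trace, dt, win_len=0.2, smooth_hz=10.0, stab=0.01):
    """Simplified Gabor deconvolution of a single seismic trace."""
    n = len(trace)
    t = np.arange(n) * dt
    centers = np.arange(0.0, t[-1] + win_len / 2, win_len / 2)   # 50% overlapping windows
    sigma = win_len / 4.0
    gaussians = np.array([np.exp(-0.5 * ((t - c) / sigma) ** 2) for c in centers])
    windows = gaussians / gaussians.sum(axis=0)                  # partition of unity

    freqs = np.fft.rfftfreq(n, dt)
    smoother = max(int(smooth_hz / (freqs[1] - freqs[0])), 1)
    kernel = np.ones(smoother) / smoother

    out = np.zeros(n)
    for w in windows:
        spectrum = np.fft.rfft(trace * w)
        amplitude = np.abs(spectrum)
        # smoothed amplitude ~ local wavelet magnitude (source signature x attenuation)
        wavelet_est = np.convolve(amplitude, kernel, mode="same")
        whitened = spectrum / (wavelet_est + stab * wavelet_est.max())   # spectral division
        out += np.fft.irfft(whitened, n=n)
    return out
```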

  11. Volcanic tremor and local earthquakes at Copahue volcanic complex, Southern Andes, Argentina

    NASA Astrophysics Data System (ADS)

    Ibáñez, J. M.; Del Pezzo, E.; Bengoa, C.; Caselli, A.; Badi, G.; Almendros, J.

    2008-07-01

    In the present paper we describe the results of a seismic field survey carried out at Copahue Volcano, Southern Andes, Argentina, using a small-aperture, dense seismic antenna. Copahue Volcano is an active volcano that exhibited a few phreatic eruptions in the last 20 years. The aim of this experiment was to record and classify the background seismic activity of this volcanic area, and to locate the sources of local earthquakes and volcanic tremor. The data consist of several volcano-tectonic (VT) earthquakes and many samples of background seismic noise. We use both ordinary spectral and multi-spectral techniques to measure the spectral content, and an array technique (the Zero Lag Cross Correlation technique) to measure the back-azimuth and apparent slowness of the signals propagating across the array. We locate VT earthquakes using a procedure based on the estimate of slowness vector components and S-P time. VT events are located mainly along the border of the Caviahue caldera lake, to the southeast of Copahue volcano, in a depth interval of 1-3 km below the surface. The background noise shows the presence of many transients with high correlation among the array stations in the frequency band centered at 2.5 Hz. These transients are superimposed on an uncorrelated background seismic signal. Array solutions for these transients show a predominant slowness vector pointing to the exploited geothermal field of "Las Maquinitas" and "Copahue Village", located about 6 km north of the array site. We interpret this coherent signal as a tremor generated by the activity of the geothermal field.
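
    To indicate how an array technique of this kind works, the sketch below performs a grid search over horizontal slowness, aligning the traces according to each trial plane wave and measuring the zero-lag correlation of each trace with the array beam. It is a simplified stand-in for the Zero Lag Cross Correlation technique cited above; the station geometry, the slowness range, the nearest-sample time shifts and the use of the beam as reference are assumptions.

```python
import numpy as np

def zlcc_grid_search(traces, coords, dt, s_max=1.0, ns=41):
    """Grid search over horizontal slowness for a small-aperture array.

    traces : (n_sta, n_samp) aligned seismograms
    coords : (n_sta, 2) station coordinates in km relative to the array centre
    s_max  : maximum |slowness| searched, in s/km
    Returns the best-fitting (sx, sy) and the full coherence map.
    """
    s_grid = np.linspace(-s_max, s_max, ns)
    coherence = np.zeros((ns, ns))
    for i, sx in enumerate(s_grid):
        for j, sy in enumerate(s_grid):
            delays = coords[:, 0] * sx + coords[:, 1] * sy        # plane-wave delays (s)
            shifts = np.rint(delays / dt).astype(int)
            # circular shifts are used for brevity; edge wrap-around is ignored here
            aligned = np.array([np.roll(tr, -sh) for tr, sh in zip(traces, shifts)])
            beam = aligned.mean(axis=0)
            num = np.sum(aligned * beam, axis=1)
            den = np.sqrt(np.sum(aligned ** 2, axis=1) * np.sum(beam ** 2))
            coherence[i, j] = np.mean(num / den)                  # zero-lag correlation with beam
    best = np.unravel_index(np.argmax(coherence), coherence.shape)
    return s_grid[best[0]], s_grid[best[1]], coherence
```

    The apparent slowness is the magnitude of the best-fitting vector, and the back-azimuth points opposite to that propagation vector, i.e. from the array toward the source.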

  12. Vibration sensors

    NASA Astrophysics Data System (ADS)

    Gupta, Amita; Singh, Ranvir; Ahmad, Amir; Kumar, Mahesh

    2003-10-01

    Today, vibration sensors with low and medium sensitivities are in great demand. Their applications include robotics, navigation, machine vibration monitoring, isolation of precision equipment, and activation of safety systems, e.g. airbags in automobiles. Vibration sensors have been developed at SSPL, using silicon micromachining, to sense vibrations in a system in the 30-200 Hz frequency band. The sensing element in the silicon vibration sensor is a seismic mass suspended by thin silicon hinges mounted on a metallized glass plate forming a parallel plate capacitor. The movement of the seismic mass along the vertical axis is monitored to sense vibrations. This is obtained by measuring the change in capacitance. The movable plate of the parallel plate capacitor is formed by a block connected to a surrounding frame by four cantilever beams located on the sides or corners of the seismic mass. This element is fabricated by silicon micromachining. Several sensors in the chip sizes 1.6 cm x 1.6 cm, 1 cm x 1 cm and 0.7 cm x 0.7 cm have been fabricated. Work done on these sensors, the techniques used in processing, and silicon-to-glass bonding are presented in the paper. Performance evaluation of these sensors is also discussed.

  13. Crustal deformation associated with an M8.1 earthquake in the Solomon Islands, detected by ALOS/PALSAR

    NASA Astrophysics Data System (ADS)

    Miyagi, Yousuke; Ozawa, Taku; Shimada, Masanobu

    2009-10-01

    On April 1, 2007 (UTC), a large Mw 8.1 interplate earthquake struck the Solomon Islands subduction zone, where complicated tectonics result from the subduction of four plates. Extensive ground movements and a large tsunami occurred in the epicentral area, causing severe damage over a wide area. Using ALOS/PALSAR data and the DInSAR technique, we detected crustal deformation exceeding 2 m on islands close to the epicenter. A slip distribution of the inferred seismic fault was estimated using geodetic information derived from DInSAR processing and field investigations. The result indicates large slip areas around the hypocenter and the centroid. It is possible that the largest slip area is related to subduction of the plate boundary between the Woodlark and Australian plates. A small slip area between those large slip areas may indicate weak coupling due to thermal activity related to volcanic activity on Simbo Island. The 2007 earthquake struck an area where large earthquakes had not occurred since 1970. Most of this seismic gap was filled by the 2007 events; however, a small seismic gap still remains to the southeast of the 2007 earthquake.

  14. Full Waveform Modelling for Subsurface Characterization with Converted-Wave Seismic Reflection

    NASA Astrophysics Data System (ADS)

    Triyoso, Wahyu; Oktariena, Madaniya; Sinaga, Edycakra; Syaifuddin, Firman

    2017-04-01

    While a large number of reservoirs have been explored using P-wave seismic data, P-wave surveys cease to provide adequate results in seismically and geologically challenging settings, such as gas clouds, shallow drilling hazards, strong multiples, intense fracturing and anisotropy. Most of these reservoir problems can be addressed by combining P and PS seismic data. A multicomponent seismic survey records both P-waves and S-waves, unlike a conventional survey that records only the compressional P-wave. Under certain conditions, a conventional energy source can be used to record P and PS data, exploiting the fact that compressional wave energy partly converts into shear waves at the reflector. The shear component can be recorded from the downgoing P-wave and upcoming S-wave by placing horizontal-component geophones on the ocean floor. A synthetic model is created based on real data to analyze the effect of a gas cloud on PP and PS wave reflections, a situation with characteristics similar to sub-volcanic imaging. The challenge with multicomponent seismic data is the different travel times of the P-wave and S-wave; therefore, the converted-wave seismic data should be processed with a different approach. This research provides a method to determine an optimum conversion point, known as the Common Conversion Point (CCP), that accounts for the asymmetrical conversion point of PS data. The value of γ (Vp/Vs) is essential to estimate the correct CCP used in converted-wave seismic processing. This research also continues to an advanced processing method for converted-wave seismic data by applying Joint Inversion to PP and PS seismic data. Joint Inversion is a simultaneous model-based inversion that estimates the P- and S-wave impedances consistent with the PP and PS amplitude data. The result reveals a more complex structure mirrored in the PS data below the gas cloud area. Through the γ section estimated from the Joint Inversion, we obtain an imaging improvement below the gas cloud area attributable to the converted-wave seismic data as an additional constraint.
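
    For context, the standard asymptotic common-conversion-point approximation, which may differ from the exact binning used by the authors, places the conversion point of a PS ray at a fraction γ/(1+γ) of the offset away from the source. The short sketch below, with a hypothetical γ of 2.0, illustrates the idea.

```python
import numpy as np

def asymptotic_ccp(source_x, receiver_x, gamma):
    """Asymptotic common conversion point (ACCP) for a PS converted wave.

    For a reflector much deeper than the offset, the P-to-S conversion point lies
    a fraction gamma / (1 + gamma) of the offset away from the source, where
    gamma = Vp / Vs; it is therefore closer to the receiver than the P-wave midpoint.
    """
    offset = receiver_x - source_x
    return source_x + offset * gamma / (1.0 + gamma)

# Hypothetical example: with gamma = 2.0, a 3000 m offset puts the ACCP 2000 m
# from the source rather than at the 1500 m midpoint used for PP data.
x_ccp = asymptotic_ccp(0.0, 3000.0, 2.0)
```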

  15. Opto-mechanical lab-on-fibre seismic sensors detected the Norcia earthquake.

    PubMed

    Pisco, Marco; Bruno, Francesco Antonio; Galluzzo, Danilo; Nardone, Lucia; Gruca, Grzegorz; Rijnveld, Niek; Bianco, Francesca; Cutolo, Antonello; Cusano, Andrea

    2018-04-27

    We have designed and developed lab-on-fibre seismic sensors containing a micro-opto-mechanical cavity on the fibre tip. The mechanical cavity is designed as a double cantilever suspended on the fibre end facet and connected to a proof mass to tune its response. Ground acceleration leads to a change in the cavity length, which in turn can be remotely detected using an interferometric interrogation technique. After characterization of the sensors, an experimental validation was conducted at the Italian National Institute of Geophysics and Volcanology (INGV), which is responsible for seismic surveillance of Italy. The fabricated sensors have been continuously used for long periods to demonstrate their effectiveness as seismic accelerometers. During the tests, the fibre optic seismic accelerometers clearly detected the seismic sequence that culminated in the severe Mw6.5 Norcia earthquake that struck central Italy on October 30, 2016. The seismic data provided by the optical sensors were analysed by specialists at the INGV. The wave traces were compared with those of state-of-the-art traditional sensors typically incorporated into the INGV seismic networks. The comparison verifies the high fidelity of the optical sensors in seismic wave detection, indicating their suitability as a novel class of seismic sensors for deployment in practical scenarios.

  16. Improving fault image by determination of optimum seismic survey parameters using ray-based modeling

    NASA Astrophysics Data System (ADS)

    Saffarzadeh, Sadegh; Javaherian, Abdolrahim; Hasani, Hossein; Talebi, Mohammad Ali

    2018-06-01

    In complex structures such as faults, salt domes and reefs, specifying the survey parameters is more challenging and critical owing to the complicated wave-field behavior involved in such structures. In the petroleum industry, detecting faults has become crucial for assessing reservoir potential, since faults can act as traps for hydrocarbons. In this regard, seismic survey modeling is employed to construct a model close to the real structure and obtain very realistic synthetic seismic data. Seismic modeling software, the velocity model and parameters pre-determined by conventional methods enable a seismic survey designer to run a shot-by-shot virtual survey operation. A reliable velocity model of the structures can be constructed by integrating the 2D seismic data, geological reports and the well information. The effects of various survey designs can be investigated by the analysis of illumination maps and flower plots. Also, seismic processing of the synthetic data output can describe the target image obtained using different survey parameters. Therefore, seismic modeling is one of the most economical ways to establish and test the optimum acquisition parameters to obtain the best image when dealing with complex geological structures. The primary objective of this study is to design a proper 3D seismic survey orientation to image fault zone structures through ray-tracing seismic modeling. The results prove that a seismic survey designer can enhance the image of fault planes in a seismic section by utilizing the proposed modeling and processing approach.

  17. Seismic signal processing on heterogeneous supercomputers

    NASA Astrophysics Data System (ADS)

    Gokhberg, Alexey; Ermert, Laura; Fichtner, Andreas

    2015-04-01

    The processing of seismic signals - including the correlation of massive ambient noise data sets - represents an important part of a wide range of seismological applications. It is characterized by large data volumes as well as high computational input/output intensity. Development of efficient approaches towards seismic signal processing on emerging high performance computing systems is therefore essential. Heterogeneous supercomputing systems introduced in recent years provide numerous computing nodes interconnected via high throughput networks, every node containing a mix of processing elements of different architectures, like several sequential processor cores and one or a few graphical processing units (GPU) serving as accelerators. A typical representative of such computing systems is "Piz Daint", a supercomputer of the Cray XC 30 family operated by the Swiss National Supercomputing Center (CSCS), which we used in this research. Heterogeneous supercomputers provide an opportunity for manifold increases in application performance and are more energy-efficient; however, they have much higher hardware complexity and are therefore much more difficult to program. The programming effort may be substantially reduced by the introduction of modular libraries of software components that can be reused for a wide class of seismology applications. The ultimate goal of this research is the design of a prototype of such a library, suitable for implementing various seismic signal processing applications on heterogeneous systems. As a representative use case we have chosen an ambient noise correlation application. Ambient noise interferometry has developed into one of the most powerful tools to image and monitor the Earth's interior. Future applications will require the extraction of increasingly small details from noise recordings. To meet this demand, more advanced correlation techniques combined with very large data volumes are needed. This poses new computational problems that require dedicated HPC solutions. The chosen application uses a wide range of common signal processing methods, which include various IIR filter designs, amplitude and phase correlation, computing the analytic signal, and discrete Fourier transforms. Furthermore, various processing methods specific to seismology, like rotation of seismic traces, are used. Efficient implementation of all these methods on the GPU-accelerated systems presents several challenges. In particular, it requires a careful distribution of work between the sequential processors and accelerators. Furthermore, since the application is designed to process very large volumes of data, special attention had to be paid to the efficient use of the available memory and networking hardware resources in order to reduce the intensity of data input and output. In our contribution we will explain the software architecture as well as the principal engineering decisions used to address these challenges. We will also describe the programming model based on C++ and CUDA that we used to develop the software. Finally, we will demonstrate performance improvements achieved by using the heterogeneous computing architecture. This work was supported by a grant from the Swiss National Supercomputing Centre (CSCS) under project ID d26.

  18. Seismological mechanism analysis of 2015 Luanxian swarm, Hebei province, China

    NASA Astrophysics Data System (ADS)

    Tan, Yipei; Liao, Xu; Ma, Hongsheng; Zhou, Longquan; Wang, Xingzhou

    2017-04-01

    The seismological mechanism of an earthquake swarm, a type of burst-like seismic activity, refers to the physical and dynamic processes that trigger the earthquakes in the swarm. Here we focus on the seismological mechanism of the 2015 Luanxian swarm in Hebei Province, China. The digital seismic waveform data processing is divided into four steps. (1) Choose the three-component waveforms of earthquakes in the catalog as templates, and detect missing earthquakes by scanning the continuous waveforms with a matched filter technique. (2) Recalibrate the P- and S-wave phase arrival times using a waveform cross-correlation phase detection technique to eliminate the artificial error in phase picking in the observation report made by the Hebei seismic network, thereby obtaining a more complete catalog and a more precise seismic phase report. (3) Relocate the earthquakes in the swarm using hypoDD based on the recalibrated phase arrival times, and analyze the characteristics of swarm epicenter migration based on the relocation result. (4) Detect whether there is repeating-earthquake activity using both a waveform cross-correlation criterion and the overlap of rupture areas. We finally detect 106 missing earthquakes in the swarm; 66 of them have magnitudes greater than ML0.0, including 2 greater than ML1.0. The relocation result shows that the epicenters of earthquakes in the swarm have a strip distribution in the NE-SW direction, which indicates the seismogenic structure may be a NE-SW trending fault. The spatial-temporal distribution of epicenters in the swarm shows a two-stage linear migration, in which the first stage has a higher migration velocity of 1.2 km per day, and the velocity of the second stage is 0.0024 km per day. Considering the three basic models used to explain the seismological mechanism of earthquake swarms (the cascade model, the slow slip model and the fluid diffusion model), repeating-earthquake activity is difficult to explain by stress triggering from previous earthquakes; however, it can be explained by continuing stress loading at the same asperity from fault slow slip. The linear migration also fits the slow slip model better than the migration characteristics of fluid diffusion, which satisfy a diffusion equation. Comparing the observed phenomena with the seismological mechanism models, we find that the Luanxian earthquake swarm may be associated with fault slow slip. Fault slow slip may play a role in triggering and sustaining the Luanxian earthquake swarm activity.
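
    Step (1), matched-filter detection, can be sketched in a few lines: a template event is slid along the continuous record, a normalized cross-correlation is computed at every lag, and detections are declared above a threshold tied to the median absolute deviation of the correlation trace. This is a generic single-channel illustration, not the authors' implementation; in practice the correlations from many channels and stations are stacked before thresholding, and the 8-MAD threshold below is an assumption.

```python
import numpy as np

def matched_filter_detect(continuous, template, threshold_mad=8.0):
    """Detect occurrences of a template event in a continuous single-channel record."""
    nt = len(template)
    tpl = (template - template.mean()) / template.std()
    cc = np.zeros(len(continuous) - nt + 1)
    for k in range(len(cc)):
        win = continuous[k:k + nt]
        win = win - win.mean()
        denom = win.std() * nt
        # normalized cross-correlation coefficient at lag k
        cc[k] = np.dot(win, tpl) / denom if denom > 0 else 0.0
    mad = np.median(np.abs(cc - np.median(cc)))
    detections = np.where(cc > threshold_mad * mad)[0]
    return detections, cc
```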

  19. Development of Vertical Cable Seismic System (3)

    NASA Astrophysics Data System (ADS)

    Asakawa, E.; Murakami, F.; Tsukahara, H.; Mizohata, S.; Ishikawa, K.

    2013-12-01

    The VCS (Vertical Cable Seismic) method is one of the reflection seismic methods. It uses hydrophone arrays vertically moored from the seafloor to record acoustic waves generated by surface, deep-towed or ocean-bottom sources. By analyzing the reflections from below the seabed, we can investigate the subsurface structure. Because VCS is an efficient high-resolution 3D seismic survey method for a spatially bounded area, we proposed the method for the hydrothermal deposit survey tool development program that the Ministry of Education, Culture, Sports, Science and Technology (MEXT) started in 2009. We are now developing a VCS system, including not only data acquisition hardware but also data processing and analysis techniques. We carried out several VCS surveys combined with surface-towed, deep-towed and ocean-bottom sources. The water depths of the surveys range from 100 m to 2100 m. The targets of the surveys include not only hydrothermal deposits but also oil and gas exploration. Through these experiments, our VCS data acquisition system has been completed, but the data processing techniques are still under development. One of the most critical issues is positioning in the water. The uncertainty in the positions of the source and of the hydrophones in the water degrades the quality of the subsurface image. GPS navigation is available at the sea surface, but in the case of a deep-towed or ocean-bottom source, the accuracy of the shot position from SSBL/USBL is not sufficient for very high-resolution imaging. We have developed another approach to determine the positions in the water using the travel-time data from the source to the VCS hydrophones. In the data acquisition stage, we estimate the VCS position with a slant-ranging method from the sea surface. The deep-towed or ocean-bottom source position is estimated by SSBL/USBL. The water velocity profile is measured by XCTD. After the data acquisition, we pick the first-break times of the VCS records. The field estimates of the shot and receiver positions contain errors. Using these estimates as initial guesses, we iteratively invert the shot and receiver positions to match the travel-time data. After several iterations we obtain the most probable positions. Integrating constraints on the VCS hydrophone positions, such as the 10 m spacing, accelerates the convergence of the iterative inversion and improves the results. The accuracy of the positions estimated from the travel-time data is sufficient for the VCS data processing.
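
    The core of the travel-time positioning step can be viewed as a small nonlinear least-squares problem. The snippet below, a simplified illustration rather than the authors' code, inverts a single shot position from first-break times to hydrophones at known positions, assuming a constant water velocity; the 1500 m/s velocity, the initial guess and the use of scipy's least_squares are assumptions.

```python
import numpy as np
from scipy.optimize import least_squares

def locate_source(hydrophones, picks, v_water=1500.0, x0=(0.0, 0.0, 10.0)):
    """Invert a shot position from first-break travel times to a vertical cable.

    hydrophones : (n, 3) receiver coordinates in metres
    picks       : (n,) first-break travel times in seconds
    """
    def residuals(src):
        dist = np.linalg.norm(hydrophones - src, axis=1)
        return dist / v_water - picks          # predicted minus observed travel times

    return least_squares(residuals, x0).x

# In the same spirit, shot and receiver positions can be inverted jointly by stacking
# all residuals into one vector, with the known 10 m hydrophone spacing added as a
# constraint to stabilize and accelerate the iterative inversion.
```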

  20. Shear velocity of the Rotokawa geothermal field using ambient noise

    NASA Astrophysics Data System (ADS)

    Civilini, F.; Savage, M. K.; Townend, J.

    2014-12-01

    Ambient noise correlation is an increasingly popular seismological technique that uses the ambient seismic noise recorded at two stations to construct an empirical Green's function. Applications of this technique include determining shear velocity structure and attenuation. An advantage of ambient noise is that it does not rely on external sources of seismic energy such as local or teleseismic earthquakes. This method has been used in the geothermal industry to determine the depths at which magmatic processes occur, to distinguish between production and non-production areas, and to observe seismic velocity perturbations associated with fluid extraction. We will present a velocity model for the Rotokawa geothermal field near Taupo, New Zealand, produced from ambient noise cross correlations. Production at Rotokawa is based on the "Rotokawa A" combined cycle power station established in 1997 and the "Nga Awa Purua" triple flash power plant established in 2010. Rotokawa Joint Venture, a partnership between Mighty River Power and Tauhara North No. 2 Trust, currently operates 174 MW of generation at Rotokawa. An array of short-period seismometers was installed in 2008 and occupies an area of roughly 5 square kilometers around the site. Although both cultural and natural noise sources are recorded at the stations, the instrument separation distance provides a unique challenge for analyzing cross correlations produced by both signal types. The inter-station spacing is on the order of a few kilometers, so waves from cultural sources generally are not coherent from one station to the other, while the wavelength produced by natural noise is greater than the station separation. Velocity models produced from these two source types will be compared to known geological models of the site. Depending on the amount of data needed to adequately construct cross-correlations, a time-dependent model of velocity will be established and compared with geothermal production processes.
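
    The basic correlation-and-stack step that underlies this kind of study can be sketched as follows. The snippet is a generic illustration, not the processing applied at Rotokawa: it one-bit normalizes hour-long windows of two station records, cross-correlates them, and stacks the correlations, whose symmetric arrivals approximate the empirical Green's function. The window length, one-bit normalization and maximum lag are assumptions.

```python
import numpy as np

def noise_cross_correlation(tr1, tr2, fs, win_sec=3600.0, max_lag_sec=60.0):
    """Stacked ambient-noise cross-correlation between two stations (single component)."""
    nwin = int(win_sec * fs)
    nlag = int(max_lag_sec * fs)
    n_windows = min(len(tr1), len(tr2)) // nwin
    stack = np.zeros(2 * nlag + 1)
    for k in range(n_windows):
        a = np.sign(tr1[k * nwin:(k + 1) * nwin])   # one-bit normalization suppresses transients
        b = np.sign(tr2[k * nwin:(k + 1) * nwin])
        full = np.correlate(a, b, mode="full")       # lags from -(nwin-1) to +(nwin-1)
        centre = nwin - 1
        stack += full[centre - nlag:centre + nlag + 1]
    lags = np.arange(-nlag, nlag + 1) / fs
    return lags, stack / max(n_windows, 1)
```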

  1. 2008 Gordon Research Conference on Rock Deformation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hirth, James G.; Gray, Nancy Ryan

    2009-09-21

    The GRC on Rock Deformation highlights the latest research in brittle and ductile rock mechanics from experimental, field and theoretical perspectives. The conference promotes a multi-disciplinary forum for assessing our understanding of rock strength and related physical properties in the Earth. The theme for the 2008 conference is 'Real-time Rheology'. Using ever-improving geophysical techniques, our ability to constrain the rheological behavior during earthquakes and post-seismic creep has improved significantly. Such data are used to investigate the frictional behavior of faults, processes responsible for strain localization, the viscosity of the lower crust, and viscous coupling between the crust and mantle. Seismological data also provide information on the rheology of the lower crust and mantle through analysis of seismic attenuation and anisotropy. Geologists are improving our understanding of rheology by combining novel analyses of microstructures in naturally deformed rocks with petrologic data. This conference will bring together experts and students in these research areas with experimentalists and theoreticians studying the same processes. We will discuss and assess where agreement exists on rheological constraints derived at different length/time scales using different techniques - and where new insight is required. To encompass the elements of these topics, speakers and discussion leaders with backgrounds in geodesy, experimental rock deformation, structural geology, earthquake seismology, geodynamics, glaciology, materials science, and mineral physics will be invited to the conference. Thematic sessions will be organized on the dynamics of earthquake rupture, the rheology of the lower crust and coupling with the upper mantle, the measurement and interpretation of seismic attenuation and anisotropy, the dynamics of ice sheets and the coupling of reactive porous flow and brittle deformation for understanding geothermal and chemical properties of the shallow crust that are important for developing ideas in CO2 sequestration, geothermal and petrochemical research and the mechanics of shallow faults.

  2. High-resolution seismic data regularization and wavefield separation

    NASA Astrophysics Data System (ADS)

    Cao, Aimin; Stump, Brian; DeShon, Heather

    2018-04-01

    We present a new algorithm, non-equispaced fast antileakage Fourier transform (NFALFT), for irregularly sampled seismic data regularization. Synthetic tests from 1-D to 5-D show that the algorithm may efficiently remove leaked energy in the frequency wavenumber domain, and its corresponding regularization process is accurate and fast. Taking advantage of the NFALFT algorithm, we suggest a new method (wavefield separation) for the detection of the Earth's inner core shear wave with irregularly distributed seismic arrays or networks. All interfering seismic phases that propagate along the minor arc are removed from the time window around the PKJKP arrival. The NFALFT algorithm is developed for seismic data, but may also be used for other irregularly sampled temporal or spatial data processing.
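
    The snippet below is not the NFALFT algorithm of the paper, but a minimal 1-D sketch of the antileakage Fourier transform idea it builds on: estimate Fourier coefficients directly on the irregular grid, pick the strongest component, subtract its contribution from the data, and iterate, so that spectral leakage from strong components does not contaminate weaker ones. The candidate-wavenumber grid, iteration count and least-squares coefficient estimate are assumptions for illustration.

```python
import numpy as np

def antileakage_ft_1d(x, data, wavenumbers, n_iter=50):
    """Simplified 1-D antileakage Fourier transform for irregularly sampled data."""
    residual = data.astype(complex).copy()
    coeffs = np.zeros(len(wavenumbers), dtype=complex)
    basis = np.exp(2j * np.pi * np.outer(wavenumbers, x))        # (n_k, n_x)
    for _ in range(n_iter):
        estimates = basis.conj() @ residual / len(x)              # coefficient estimates
        k = np.argmax(np.abs(estimates))                          # strongest component
        coeffs[k] += estimates[k]
        residual -= estimates[k] * basis[k]                       # remove it, and its leakage
    return coeffs

def resample_regular(coeffs, wavenumbers, x_regular):
    """Regularization step: evaluate the estimated components on a regular grid."""
    basis = np.exp(2j * np.pi * np.outer(wavenumbers, x_regular))
    return np.real(coeffs @ basis)
```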

  3. An automated cross-correlation based event detection technique and its application to surface passive data set

    USGS Publications Warehouse

    Forghani-Arani, Farnoush; Behura, Jyoti; Haines, Seth S.; Batzle, Mike

    2013-01-01

    In studies on heavy oil, shale reservoirs, tight gas and enhanced geothermal systems, the use of surface passive seismic data to monitor induced microseismicity due to fluid flow in the subsurface is becoming more common. However, in most studies passive seismic records contain days and months of data, and manually analysing the data can be expensive and inaccurate. Moreover, in the presence of noise, detecting the arrival of weak microseismic events becomes challenging. Hence, the use of an automated, accurate and computationally fast technique for event detection in passive seismic data is essential. The conventional automatic event identification algorithm computes a running-window energy ratio of the short-term average to the long-term average of the passive seismic data for each trace. We show that for the common case of a low signal-to-noise ratio in surface passive records, the conventional method is not sufficiently effective at event identification. Here, we extend the conventional algorithm by introducing a technique that is based on the cross-correlation of the energy ratios computed by the conventional method. With our technique we can measure the similarities amongst the computed energy ratios at different traces. Our approach is successful at improving the detectability of events with a low signal-to-noise ratio that are not detectable with the conventional algorithm. Also, our algorithm has the advantage of identifying whether an event is common to all stations (a regional event) or to a limited number of stations (a local event). We provide examples of applying our technique to synthetic data and a field surface passive data set recorded at a geothermal site.
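
    The two building blocks described above can be sketched as follows: a moving-window short-term/long-term average energy ratio per trace, followed by cross-correlation of the resulting ratio functions between traces and stacking. This is a generic illustration in the spirit of the extension described, not the authors' code; the window lengths and the choice of the first trace as reference are assumptions.

```python
import numpy as np

def sta_lta(trace, fs, sta_sec=0.5, lta_sec=10.0):
    """Classical moving-window STA/LTA energy ratio of a single trace."""
    energy = trace ** 2
    sta_n, lta_n = int(sta_sec * fs), int(lta_sec * fs)
    sta = np.convolve(energy, np.ones(sta_n) / sta_n, mode="same")
    lta = np.convolve(energy, np.ones(lta_n) / lta_n, mode="same")
    return sta / (lta + 1e-12)

def correlated_ratio(traces, fs, max_lag=50):
    """Cross-correlate the STA/LTA functions of all traces against a reference and stack.

    Energy-ratio peaks produced by events coherent across the array correlate from
    trace to trace, whereas uncorrelated noise bursts do not, so the stack enhances
    weak but laterally consistent arrivals.
    """
    ratios = np.array([sta_lta(tr, fs) for tr in traces])
    reference = ratios[0] - ratios[0].mean()
    stack = np.zeros(2 * max_lag + 1)
    for r in ratios[1:]:
        r = r - r.mean()
        full = np.correlate(r, reference, mode="full")
        centre = len(r) - 1
        stack += full[centre - max_lag:centre + max_lag + 1]
    return stack / (len(traces) - 1)
```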

  4. New relationship between fundamental site frequency and thickness of soft sediments from seismic ambient noise

    NASA Astrophysics Data System (ADS)

    Abd el-aal, Abd el-aziz Khairy

    2018-05-01

    In this contribution, a new relationship between the fundamental site frequency and the thickness of soft sediments is obtained for many sites in Egypt. The Horizontal-to-Vertical Spectral Ratio ("H/V") technique (known as the Nakamura technique) can be used as a robust tool to determine the thickness of soft sediment layers overlying bedrock from observations and measurements of seismic ambient noise data. In Egypt, numerous seismic ambient noise measurements have been conducted in several areas to determine the dynamic properties of soft soil for engineering purposes. At each site in each studied area, the fundamental site frequency was accurately estimated from the main peak in the spectral ratio between the horizontal and vertical components. Consequently, an extensive database of microtremor measurements, well-logging data, and shallow seismic refraction data has been configured and assembled for the studied areas. A new formula relating the fundamental site frequency (f0) and the thickness of soft sediments (h) is established. The new formula has been validated and compared with formulas published by earlier workers, and the results indicate that the depth and geometry of the bedrock surface calculated with the new formula are in good agreement with well-log data and previously published seismic refraction surveys at the investigated sites.
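
    Relationships of this kind are conventionally power laws of the form h = a·f0^b (with b negative), fitted by linear regression in log-log space. The sketch below shows such a fit; the coefficients in the usage comment are placeholders, not the values derived in the paper.

```python
import numpy as np

def fit_f0_thickness(f0, thickness):
    """Fit the power law  h = a * f0**b  by linear regression in log-log space."""
    slope, intercept = np.polyfit(np.log10(f0), np.log10(thickness), 1)
    return 10.0 ** intercept, slope          # a, b

def predict_thickness(f0, a, b):
    """Predict sediment thickness from the fundamental site frequency."""
    return a * np.asarray(f0) ** b

# With hypothetical coefficients (a = 100, b = -1.4), a site with f0 = 2 Hz would
# map to roughly 100 * 2**(-1.4) ~ 38 m of soft sediments.
```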

  5. Crustal structure of the St. Elias Mountains region, southern Alaska, from regional earthquakes and ambient noise tomography

    NASA Astrophysics Data System (ADS)

    Ruppert, N. A.; Stachnik, J. C.; Hansen, R. A.

    2011-12-01

    STEEP (SainT Elias TEctonics and Erosion Project) is a multi-disciplinary research project that took place in southern Alaska between 2005 and 2010. An important component of this undertaking was installation and operation of a dense array of 22 broadband seismometers to augment and improve the existing regional seismic network in the St. Elias Mountains. This allowed for a lower detection threshold and better accuracy for local seismicity and also provided a rich dataset of teleseismic recordings. While the seismic stations were designed to transmit the data in real time, due to harsh weather and difficult terrain conditions some data were recorded only on site and had to be post-processed months and years later. Despite these difficulties, the recorded dataset detected and located regional earthquakes as small as magnitude 0.5 in the network core area. The recorded seismicity shows some clear patterns. A majority of the earthquakes are concentrated along the coast in a distributed area up to 100 km wide. The coastal seismicity can be further subdivided into 3 distinct clusters: Icy Bay, Bering Glacier, and the Copper River delta. This coastal seismicity is abutted by a somewhat aseismic zone that roughly follows the Bagley Ice Field. Farther inland another active region of seismicity is associated with the Denali Fault system. All this seismicity is concentrated in the upper 25 km of the crust. The only region where earthquakes as deep as 100 km occur is beneath the Wrangell volcanoes in the northwestern corner of the study area. The earthquake focal mechanisms are predominately reverse, with some areas of strike-slip faulting also present. The seismicity patterns and faulting mechanisms indicate a high concentration of thrust faulting in the coastal region. The ambient noise cross correlations from the stations in the STEEP region reveal Rayleigh wave packets with good signal-to-noise ratios yielding well-defined interstation phase velocity dispersion curves. These dispersion measurements are inverted for two-dimensional phase velocity maps from 4 to 40 second period. Preliminary analysis indicates slower velocities in a 100-km-wide zone along the southern Alaska coast, with distinctly higher velocities farther inland. We will present results of precise earthquake relocations using waveform cross-correlation and double difference relocation techniques and interpret these within the framework of regional tectonics and subsurface structures as evidenced by the ambient noise tomography.

  6. Full waveform seismic modelling of Chalk Group rocks from the Danish North Sea - implications for velocity analysis

    NASA Astrophysics Data System (ADS)

    Montazeri, Mahboubeh; Moreau, Julien; Uldall, Anette; Nielsen, Lars

    2015-04-01

    This study aims at understanding seismic wave propagation in the fine-layered Chalk Group, which constitutes the main reservoir for oil and gas production in the Danish North Sea. The starting point of our analysis is the Nana-1XP exploration well, which shows strong seismic contrasts inside the Chalk Group. For the purposes of seismic waveform modelling, we here assume a one-dimensional model with homogeneous and isotropic layers designed to capture the main fluctuations in petrophysical properties observed in the well logs. The model is representative of the stratigraphic sequences of the area and it illustrates highly contrasting properties of the Chalk Group. A finite-difference (FD) full-waveform technique, based on both the acoustic and the elastic wave equations, is applied to the model. Velocity analysis of seismic data is a crucial step for stacking, multiple suppression, migration, and depth conversion of the seismic record. Semblance analysis of the synthetic seismic records shows strong amplitude peaks outside the expected range for the time interval representing the Chalk Group, especially at the base. The various synthetic results illustrate the occurrence and the impact of different types of waves including multiples, converted waves and refracted waves. The interference of these different wave types with the primary reflections can explain the strong anomalous amplitudes in the semblance plot. In particular, the effect of strongly contrasting thin beds plays an important role in the generation of the high anomalous amplitude values. If these anomalous amplitudes are used to pick the velocities, they would impede proper stacking of the data and may result in sub-optimal migration and depth conversion. Consequently, this may lead to erroneous or sub-optimal seismic images of the Chalk Group and the underlying layers. Our results highlight the importance of detailed velocity analysis and proper picking of velocity functions in the Chalk Group intervals. We show that application of standard front mutes in the mid- and far-offset ranges does not significantly improve the results of the standard semblance analysis. These synthetic modelling results could be used as starting points for defining optimized processing flows for the seismic data sets acquired in the study area with the aim of improving the imaging of the Chalk Group.
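
    For readers unfamiliar with the velocity-analysis step discussed here, the sketch below computes a conventional semblance panel S(t0, v) for a CMP gather along hyperbolic NMO trajectories t(x) = sqrt(t0^2 + (x/v)^2). It is a generic textbook-style illustration, not the processing code used in the study; the smoothing-window length and the nearest-sample interpolation are simplifying assumptions.

```python
import numpy as np

def semblance_panel(gather, offsets, dt, velocities, win_samples=11):
    """Semblance S(t0, v) of a CMP gather for standard NMO trajectories."""
    n_tr, n_samp = gather.shape
    t0 = np.arange(n_samp) * dt
    panel = np.zeros((n_samp, len(velocities)))
    box = np.ones(win_samples)
    for iv, v in enumerate(velocities):
        # NMO-correct the gather for this trial velocity (nearest-sample interpolation)
        corrected = np.zeros((n_samp, n_tr))
        for j, x in enumerate(offsets):
            t_nmo = np.sqrt(t0 ** 2 + (x / v) ** 2)
            idx = np.rint(t_nmo / dt).astype(int)
            ok = idx < n_samp
            corrected[ok, j] = gather[j, idx[ok]]
        # semblance: stacked energy over total energy, smoothed along t0
        num = np.convolve(corrected.sum(axis=1) ** 2, box, mode="same")
        den = np.convolve((corrected ** 2).sum(axis=1), box, mode="same")
        panel[:, iv] = num / (n_tr * den + 1e-12)
    return panel
```

    Anomalous semblance peaks of the kind described in the abstract appear on such a panel as high values at velocities inconsistent with the primary reflections, which is why careful picking is emphasized.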

  7. Back-Projection Imaging of extended, diffuse seismic sources in volcanic and hydrothermal systems

    NASA Astrophysics Data System (ADS)

    Kelly, C. L.; Lawrence, J. F.; Beroza, G. C.

    2017-12-01

    Volcanic and hydrothermal systems exhibit a wide range of seismicity that is directly linked to fluid and volatile activity in the subsurface and that can be indicative of imminent hazardous activity. Seismograms recorded near volcanic and hydrothermal systems typically contain "noisy" records, but in fact, these complex signals are generated by many overlapping low-magnitude displacements and pressure changes at depth. Unfortunately, excluding times of high-magnitude eruptive activity that typically occur infrequently relative to the length of a system's entire eruption cycle, these signals often have very low signal-to-noise ratios and are difficult to identify and study using established seismic analysis techniques (i.e. phase-picking, template matching). Arrays of short-period and broadband seismic sensors are proven tools for monitoring short- and long-term changes in volcanic and hydrothermal systems. Time-reversal techniques (i.e. back-projection) that are improved by additional seismic observations have been successfully applied to locating volcano-seismic sources recorded by dense sensor arrays. We present results from a new computationally efficient back-projection method that allows us to image the evolution of extended, diffuse sources of volcanic and hydrothermal seismicity. We correlate short time-window seismograms from receiver-pairs to find coherent signals and propagate them back in time to potential source locations in a 3D subsurface model. The strength of coherent seismic signal associated with any potential source-receiver-receiver geometry is equal to the correlation of the short time-windows of seismic records at appropriate time lags as determined by the velocity structure and ray paths. We stack (sum) all short time-window correlations from all receiver-pairs to determine the cumulative coherence of signals at each potential source location. Through stacking, coherent signals from extended and/or repeating sources of short-period energy radiation interfere constructively, while background noise signals interfere destructively, such that the most likely source locations of the observed seismicity are illuminated. We compile results to analyze changes in the distribution and prevalence of these sources throughout a system's entire eruptive cycle.
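
    A heavily simplified version of the stacking logic described above is sketched below: correlate each receiver pair, then, for every candidate source node in a grid, read off the correlation at the lag predicted by the differential travel time and sum over all pairs. The homogeneous 2000 m/s velocity, the straight-ray travel times and the normalization are assumptions made for illustration; the method in the abstract uses short time windows and a 3D velocity model.

```python
import numpy as np
from itertools import combinations

def backproject(traces, stations, grid, fs, v=2000.0):
    """Back-project inter-station correlations onto candidate source locations.

    traces   : (n_sta, n_samp) short time-window seismograms
    stations : (n_sta, 3) station coordinates in metres
    grid     : (n_grid, 3) candidate source locations in metres
    Coherent sources stack constructively; background noise stacks destructively.
    """
    n_sta, n_samp = traces.shape
    image = np.zeros(len(grid))
    for i, j in combinations(range(n_sta), 2):
        cc = np.correlate(traces[i], traces[j], mode="full")
        cc /= (np.linalg.norm(traces[i]) * np.linalg.norm(traces[j]) + 1e-12)
        zero_lag = n_samp - 1
        # predicted differential travel time (straight rays, constant velocity)
        ti = np.linalg.norm(grid - stations[i], axis=1) / v
        tj = np.linalg.norm(grid - stations[j], axis=1) / v
        lag = np.rint((ti - tj) * fs).astype(int)
        image += cc[np.clip(zero_lag + lag, 0, len(cc) - 1)]
    return image / (n_sta * (n_sta - 1) / 2)
```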

  8. Velocity structures of Geothermal sites: A comparative study between different tomography techniques on the EGS-Soultz-sous-Forêts Site (France)

    NASA Astrophysics Data System (ADS)

    Calo', M. C.; Dorbath, C.

    2009-12-01

    One major goal of monitoring seismicity accompanying hydraulic fracturing of a reservoir is to recover the seismic velocity field in and around the geothermal site. In many cases the seismicity induced by the hydraulic stimulations allows us to roughly describe the velocity anomalies close to the hypocentral location, but only during the time period of the stimulation. Several studies have shown that 4D (time-dependent) seismic tomographies are very useful for illustrating and studying the temporal variation of the seismic velocities conditioned by injected fluids. Nevertheless, in geothermal fields local earthquake tomography (LET) is often inadequate for studying the seismic velocities during the inter-injection periods, due to the lack of seismicity. In July 2000, an injection test lasting 15 days performed at the Enhanced Geothermal System (EGS) site of Soultz-sous-Forêts (Alsace, France) produced about 7200 micro-earthquakes with duration magnitudes ranging from -0.9 to 2.5. The earthquakes were located by downhole and surface seismic stations. We present here a comparison between three tomographic studies: 1) the “traditional” seismic tomography of Cuneot et al. (2008), 2) a double-difference tomography using the TomoDD code of Zhang and Thurber (2003), and 3) the models obtained by applying the Weighted Average Model method (WAM; Calo’ et al., 2009). The velocity models were obtained using the same dataset recorded during the stimulation. The WAM technique produces a more reliable reconstruction of the structures around and above the cluster of earthquakes, as demonstrated by the distribution of the velocity standard deviations. Although the velocity distributions obtained by the three tomographic approaches are qualitatively similar, the WAM results correlate better with independent data, such as the fracturing directions measured in the boreholes and the location of the clustered seismicity, than those of the traditional and DD tomographies. To overcome the limits of LET during the inter-injection periods, we plan to perform a seismic noise tomography study. In geothermal sites, the elastic characteristics of the volume at rest, i.e. during the inter-injection periods, are often poorly known.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farfour, Mohammed; Yoon, Wang Jung; Yoon-Geun

    Defining and understanding hydrocarbon expressions in seismic data is a main concern of geoscientists in oil and gas exploration and production. Over the last decades several mathematical approaches have been developed in this regard. Most of these approaches have addressed information contained in the amplitude of seismic data. Recently, more attention has been drawn towards frequency-related information in order to extract the frequency behavior of hydrocarbon-bearing sediments. Spectrally decomposing seismic data into individual frequencies has been found to be an excellent tool for investigating geological formations and their pore fluids. To accomplish this, several mathematical approaches have been invoked. The continuous wavelet transform and the short-time window Fourier transform are widely used techniques for this purpose. This paper gives an overview of some widely used mathematical techniques in hydrocarbon reservoir detection and mapping. This is followed by an application on real data from the Boonsville field.
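
    To indicate what spectral decomposition with the continuous wavelet transform involves in practice, the sketch below computes constant-frequency amplitude slices of a trace with a complex Morlet wavelet. It is a generic, simplified illustration rather than the method applied to the Boonsville data; the wavelet parameter omega0, the crude normalization and the convolution-based implementation are assumptions.

```python
import numpy as np

def morlet_cwt_amplitude(trace, dt, freqs, omega0=6.0):
    """Amplitude of the continuous wavelet transform of a trace at the given frequencies.

    Returns an array of shape (len(freqs), len(trace)); each row is a constant-frequency
    amplitude slice of the kind used to map frequency anomalies of pore fluids.
    """
    n = len(trace)
    t = (np.arange(n) - n // 2) * dt
    out = np.zeros((len(freqs), n))
    for i, f in enumerate(freqs):
        scale = omega0 / (2 * np.pi * f)                     # scale with centre frequency f
        wavelet = np.exp(1j * omega0 * t / scale) * np.exp(-0.5 * (t / scale) ** 2)
        wavelet /= np.sqrt(scale)                            # crude amplitude normalization
        coef = np.convolve(trace, np.conj(wavelet[::-1]), mode="same")
        out[i] = np.abs(coef)
    return out
```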

  10. Advanced geophysical underground coal gasification monitoring

    DOE PAGES

    Mellors, Robert; Yang, X.; White, J. A.; ...

    2014-07-01

    Underground Coal Gasification (UCG) produces less surface impact, atmospheric pollutants and greenhouse gas than traditional surface mining and combustion. Therefore, it may be useful in mitigating global change caused by anthropogenic activities. Careful monitoring of the UCG process is essential in minimizing environmental impact. Here we first summarize monitoring methods that have been used in previous UCG field trials. We then discuss in more detail a number of promising advanced geophysical techniques. These methods – seismic, electromagnetic, and remote sensing techniques – may provide improved and cost-effective ways to image both the subsurface cavity growth and surface subsidence effects. Active and passive seismic data have the promise to monitor the burn front, cavity growth, and observe cavity collapse events. Electrical resistance tomography (ERT) produces near real time tomographic images autonomously, monitors the burn front and images the cavity using low-cost sensors, typically running within boreholes. Interferometric synthetic aperture radar (InSAR) is a remote sensing technique that has the capability to monitor surface subsidence over the wide area of a commercial-scale UCG operation at a low cost. It may be possible to infer cavity geometry from InSAR (or other surface topography) data using geomechanical modeling. The expected signals from these monitoring methods are described along with interpretive modeling for typical UCG cavities. They are illustrated using field results from UCG trials and other relevant subsurface operations.

  11. Energy Partitioning of Seismic Phases: Current Datasets and Techniques Aimed at Improving the Future of Event Identification

    NASA Astrophysics Data System (ADS)

    Bonner, J.

    2006-05-01

    Differences in energy partitioning of seismic phases from earthquakes and explosions provide the opportunity for event identification. In this talk, I will briefly review teleseismic Ms:mb and P/S ratio techniques that help identify events based on differences in compressional, shear, and surface wave energy generation from explosions and earthquakes. With the push to identify smaller-yield explosions, the identification process has become increasingly complex as varied types of explosions, including chemical, mining, and nuclear, must be identified at regional distances. Thus, I will highlight some of the current views and problems associated with the energy partitioning of seismic phases from single- and delay-fired chemical explosions. One problem yet to have a universally accepted answer is whether the explosion and earthquake populations, based on the Ms:mb discriminants, should be separated at smaller magnitudes. I will briefly describe the datasets and theory that support either converging or parallel behavior of these populations. Also, I will discuss improvements to the currently used methods that will better constrain this problem in the future. I will also discuss the role of regional P/S ratios in identifying explosions. In particular, recent datasets from South Africa, Scandinavia, and the Western United States collected from earthquakes, single-fired chemical explosions, and/or delay-fired mining explosions have provided new insight into regional P, S, Lg, and Rg energy partitioning. Data from co-located mining and chemical explosions suggest that some mining explosions may be used for limited calibration of regional discriminants in regions where no historic explosion data are available.

  12. High-resolution imaging of the low velocity layer in Alaskan subduction zone with scattered waves and interferometry

    NASA Astrophysics Data System (ADS)

    Kim, D.; Keranen, K. M.; Abers, G. A.; Kim, Y.; Li, J.; Shillington, D. J.; Brown, L. D.

    2017-12-01

    The physical factors that control the rupture process of great earthquakes at convergent plate boundaries remain incompletely understood. While recent developments in imaging using the teleseismic wavefield have led to marked advances at wavelengths of a couple of kilometers to tens of kilometers, higher-resolution imaging of the rupture zone would provide improved parameter estimation, as the teleseismic wavefield is fundamentally limited by its low frequency content. This study compares and evaluates two seismic imaging techniques using the high-frequency signals from teleseismic coda versus earthquake scattered waves to image the subducting Yakutat oceanic plateau in the Alaska subduction zone. We use earthquakes recorded by the MOOS PASSCAL broadband deployment in southern Alaska. In our first method, we select local earthquakes that lie directly beneath and laterally near the recording array for imaging, and extract body wave information via a simple autocorrelation and stacking. Profiles analogous to seismic reflection profiles are constructed using the near-vertically travelling waves. In our second method, we compute teleseismic receiver functions within the 0.02-1.0 Hz frequency band. Both results image interfaces that we associate with the subducting oceanic plate in the Alaska-Aleutian system, with greater resolution than commonly used methods with teleseismic sources. Structural details from our results can further our understanding of the conditions and materials that characterize the subduction megathrusts, and the techniques can be employed in other regions along the Alaska-Aleutian system and at other convergent margins with suitable seismic arrays.

  13. Seismic risk assessment and application in the central United States

    USGS Publications Warehouse

    Wang, Z.

    2011-01-01

    Seismic risk is a somewhat subjective, but important, concept in earthquake engineering and other related decision-making. Another important concept that is closely related to seismic risk is seismic hazard. Although seismic hazard and seismic risk have often been used interchangeably, they are fundamentally different: seismic hazard describes the natural phenomenon or physical property of an earthquake, whereas seismic risk describes the probability of loss or damage that could be caused by a seismic hazard. The distinction between seismic hazard and seismic risk is of practical significance because measures for seismic hazard mitigation may differ from those for seismic risk reduction. Seismic risk assessment is a complicated process and starts with seismic hazard assessment. Although probabilistic seismic hazard analysis (PSHA) is the most widely used method for seismic hazard assessment, recent studies have found that PSHA is not scientifically valid. Use of PSHA will lead to (1) artifact estimates of seismic risk, (2) misleading use of the annual probability of exceedance (i.e., the probability of exceedance in one year) as a frequency (per year), and (3) numerical creation of extremely high ground motion. An alternative approach, which is similar to those used for flood and wind hazard assessments, has been proposed. © 2011 ASCE.

  14. Blind tests of methods for InSight Mars mission: Open scientific challenge

    NASA Astrophysics Data System (ADS)

    Clinton, John; Ceylan, Savas; Giardini, Domenico; Khan, Amir; van Driel, Martin; Böse, Maren; Euchner, Fabian; Garcia, Raphael F.; Drilleau, Mélanie; Lognonné, Philippe; Panning, Mark; Banerdt, Bruce

    2017-04-01

    The Marsquake Service (MQS) will be the ground segment service within the InSight mission to Mars, which will deploy a single seismic station on Elysium Planitia in November 2018. The main tasks of the MQS are the identification and characterisation of seismicity, and the management of the Martian seismic event catalogue. In advance of the mission, we have developed a series of single-station event location methods that rely on a priori 1D and 3D structural models. In coordination with the Mars Structural Service, we expect to use iterative inversion techniques to revise these structural models and event locations. In order to seek methodological advancements and test our current approaches, we have designed a blind test case using Martian synthetics combined with realistic noise models for the Martian surface. We invite all scientific parties that are interested in single-station approaches and in exploring the Martian time series to participate and contribute to our blind test. We anticipate the test will improve currently developed location and structural inversion techniques, and also allow us to explore new single-station techniques for moment tensor and magnitude determination. The waveforms for our test case are computed employing AxiSEM and Instaseis for a randomly selected 1D background model and an event catalogue that is statistically consistent with our current expectation of Martian seismicity. Realistic seismic surface noise is superimposed to generate a continuous time series spanning 6 months. The event catalogue includes impacts as well as Martian quakes. The temporal distribution of the seismicity in the time series, as well as the true structural model, will not be known to any participating parties, including the MQS, until the end of the competition. We provide our internal tools, such as event location codes, a suite of background models and seismic phase travel times, in order to support researchers who are willing to use or improve our current methods. Following the deadline of our blind test in late 2017, we plan to combine all outcomes in an article with all participants as co-authors.

  15. OCT structure, COB location and magmatic type of the SE Brazilian & S Angolan margins from integrated quantitative analysis of deep seismic reflection and gravity anomaly data

    NASA Astrophysics Data System (ADS)

    Cowie, L.; Kusznir, N. J.; Horn, B.

    2013-12-01

    Knowledge of ocean-continent transition (OCT) structure, continent-ocean boundary (COB) location and magmatic type is of critical importance for understanding rifted continental margin formation processes and in evaluating petroleum systems in deep-water frontier oil and gas exploration. The OCT structure, COB location and magmatic type of the SE Brazilian and S Angolan rifted continental margins are much debated; exhumed and serpentinised mantle have been reported at these margins. Integrated quantitative analysis using deep seismic reflection data and gravity inversion has been used to determine OCT structure, COB location and magmatic type for the SE Brazilian and S Angolan margins. Gravity inversion has been used to determine Moho depth, crustal basement thickness and continental lithosphere thinning. Residual Depth Anomaly (RDA) analysis has been used to investigate OCT bathymetric anomalies with respect to expected oceanic bathymetries, and subsidence analysis has been used to determine the distribution of continental lithosphere thinning. These techniques have been validated on the Iberian margin for profiles IAM9 and ISE-01. In addition, a joint inversion technique using deep seismic reflection and gravity anomaly data has been applied to the ION-GXT BS1-575 SE Brazil and ION-GXT CS1-2400 S Angola profiles. The joint inversion method solves for coincident seismic and gravity Moho in the time domain and calculates the lateral variations in crustal basement densities and velocities along profile. Gravity inversion, RDA and subsidence analysis along the S Angolan ION-GXT CS1-2400 profile have been used to determine OCT structure and COB location. Analysis suggests that exhumed mantle, corresponding to a magma-poor margin, is absent beneath the allochthonous salt. The thickness of the earliest oceanic crust, derived from gravity and deep seismic reflection data, is approximately 7 km. The joint inversion predicts crustal basement densities and seismic velocities which are slightly less than expected for 'normal' oceanic crust. The difference between the sediment-corrected RDA and that predicted from gravity inversion crustal thickness variation implies that this margin is experiencing ~300 m of anomalous uplift attributed to mantle dynamic uplift. Gravity inversion, RDA and subsidence analysis have also been used to determine OCT structure and COB location along the ION-GXT BS1-575 profile, crossing the Sao Paulo Plateau and Florianopolis Ridge of the SE Brazilian margin. Gravity inversion, RDA and subsidence analysis predict the COB to be located SE of the Florianopolis Ridge. Analysis shows no evidence for exhumed mantle on this margin profile. The joint inversion technique predicts normal oceanic basement seismic velocities and densities and, beneath the Sao Paulo Plateau and Florianopolis Ridge, predicts crustal basement thicknesses between 10 and 15 km. The Sao Paulo Plateau and Florianopolis Ridge are separated by a thin region of crustal basement beneath the salt interpreted as a regional transtensional structure. Sediment-corrected RDAs and gravity-derived 'synthetic' RDAs are of a similar magnitude on oceanic crust, implying negligible mantle dynamic topography.

  16. Can repeating glacial seismic events be used to monitor stress changes within the underlying volcano? -Case study from the glacier overlain Katla volcano, Iceland

    NASA Astrophysics Data System (ADS)

    Jonsdottir, K.; Vogfjord, K. S.; Bean, C. J.; Martini, F.

    2013-12-01

    The glacier-overlain Katla volcano in South Iceland is one of the most active and hazardous volcanoes in Europe. Katla eruptions result in hazardous glacial floods and intense tephra fall. On average there are eruptions every 50 years, but the volcano is long overdue and we are now witnessing the longest quiescence period in 1000 years, i.e. since the settlement of Iceland. Because of the hazard the volcano poses, it is under constant surveillance and gets a good share of the seismic stations from the national seismic network. Every year the seismic network records thousands of seismic events at Katla with magnitudes seldom exceeding M3. The bulk of the seismicity is, however, not due to volcano tectonics but seems to be caused mainly by shallow processes involving glacial deformation. Katla's ice-filled caldera forms a glacier plateau with ice several hundred meters thick. The 9x14 km oval caldera is surrounded by higher rims where the glacier, in some places gently and in others abruptly, falls off by tens and up to a hundred meters to the surrounding lowland. The glacier surface is marked by a dozen depressions, or cauldrons, which manifest geothermal activity below, probably coinciding with circular faults around the caldera. Our current understanding is that there are several glacial processes which cause seismicity; these include dry calving, where steep valley glaciers fall off cliffs, and movements of glacier ice as the cauldrons deform due to hydraulic changes and geothermal activity at the glacier/bedrock boundary. These glacial events share the common feature of containing low-frequency content (2-4 Hz) and long codas. Because of their shallow origin, surface waves are prominent. In our analysis we use waveforms from all of Katla's seismic events between 2003 and 2013, with the criteria M>1 and a minimum of 4 P-wave picks. We correlate the waveforms of these events with each other and group them into families of highly similar events. Looking at the occurrence of these families, we find that individual families are usually clustered in time over several months, and sometimes families may reappear even up to several years later. Using families including many events and covering long periods (10-20 months), we compare the coda (the tail) of individual events within a family. This is repeated for all the surrounding stations. The analysis, coda wave interferometry (CWI), is a correlation method that builds on the fact that changes in stress in the edifice lead to changes in seismic velocities. The coda waves are highly sensitive to small stress changes. By using a repeating source, implying we have the same source mechanism and the same path, we can track temporal stress changes in the medium between the source and the receiver. Preliminary results from Katla suggest that by using the repeating glacial events and the coda wave interferometry technique we observe annual seismic velocity changes around the volcano of ca. 0.7%. We find that seismic velocities increase from January through July and decrease from August to December. These changes can be explained by pore-water pressure changes and/or loading and de-loading of the overlying glacier. We do not find immediate precursors for an impending eruption at Katla; however, we now have a better understanding of its background seismicity.
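
    To make the coda-comparison step concrete, the sketch below illustrates one common variant of coda wave interferometry, the stretching technique: the coda of a reference event is evaluated on stretched time axes and correlated with the coda of a later event from the same family, and the stretch factor maximizing the correlation gives the apparent relative change. This is a generic illustration rather than the processing used in the study; the window limits, stretch range and sign convention are assumptions.

```python
import numpy as np

def stretching_dvv(reference, current, dt, t_start, t_end, eps_max=0.02, n_eps=401):
    """Estimate the relative velocity change between the codas of two repeating events.

    The reference coda is evaluated on trial time axes t * (1 + eps) and correlated
    with the current coda over [t_start, t_end].  With this construction, a positive
    optimal eps means the current coda is compressed relative to the reference, i.e.
    a velocity increase of roughly dv/v = eps (since dt/t = -dv/v for a homogeneous change).
    """
    i0, i1 = int(t_start / dt), int(t_end / dt)
    t = np.arange(i0, i1) * dt
    cur = current[i0:i1]
    t_ref = np.arange(len(reference)) * dt
    best_eps, best_cc = 0.0, -1.0
    for eps in np.linspace(-eps_max, eps_max, n_eps):
        ref_stretched = np.interp(t * (1.0 + eps), t_ref, reference)
        cc = np.corrcoef(cur, ref_stretched)[0, 1]
        if cc > best_cc:
            best_eps, best_cc = eps, cc
    return best_eps, best_cc        # approximate dv/v and the correlation at the optimum
```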

  17. Modeling Forced Imbibition Processes and the Associated Seismic Attenuation in Heterogeneous Porous Rocks

    NASA Astrophysics Data System (ADS)

    Solazzi, Santiago G.; Guarracino, Luis; Rubino, J. Germán; Müller, Tobias M.; Holliger, Klaus

    2017-11-01

    Quantifying seismic attenuation during laboratory imbibition experiments can provide useful information toward the use of seismic waves for monitoring injection and extraction of fluids in the Earth's crust. However, a deeper understanding of the physical causes producing the observed attenuation is needed for this purpose. In this work, we analyze seismic attenuation due to mesoscopic wave-induced fluid flow (WIFF) produced by realistic fluid distributions representative of imbibition experiments. To do so, we first perform two-phase flow simulations in a heterogeneous rock sample to emulate a forced imbibition experiment. We then select a subsample of the considered rock containing the resulting time-dependent saturation fields and apply a numerical upscaling procedure to compute the associated seismic attenuation. By exploring both saturation distributions and seismic attenuation, we observe that two manifestations of WIFF arise during imbibition experiments: the first one is produced by the compressibility contrast associated with the saturation front, whereas the second one is due to the presence of patches containing very high amounts of water that are located behind the saturation front. We demonstrate that while the former process is expected to play a significant role in the case of high injection rates, which are associated with viscous-dominated imbibition processes, the latter becomes predominant during capillary-dominated processes, that is, for relatively low injection rates. We conclude that this kind of joint numerical analysis constitutes a useful tool for improving our understanding of the physical mechanisms producing seismic attenuation during laboratory imbibition experiments.
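
    The mesoscopic WIFF attenuation discussed here is bounded by two limiting moduli: the relaxed Gassmann-Wood modulus (fluids mixed at the pore scale) and the unrelaxed Gassmann-Hill modulus (patches equilibrating separately). A common back-of-the-envelope estimate interpolates between the two with a standard-linear-solid relaxation. The sketch below illustrates that estimate only; it is not the numerical upscaling procedure of the paper, and the characteristic frequency, fluid moduli, and quadratic relaxation form are assumptions.

    ```python
    import numpy as np

    def gassmann_K(K_dry, K_min, K_fl, phi):
        """Gassmann fluid substitution: saturated bulk modulus (Pa)."""
        b = 1.0 - K_dry / K_min
        return K_dry + b**2 / (phi / K_fl + (1.0 - phi) / K_min - K_dry / K_min**2)

    def wiff_inverse_Q(freqs, Sw, f_c, K_dry, mu, K_min, phi,
                       K_w=2.25e9, K_g=0.01e9):
        """Zener-type estimate of 1/Q(f) for patchy saturation.

        Interpolates between the relaxed (Gassmann-Wood) and unrelaxed
        (Gassmann-Hill) P-wave moduli.  f_c is the WIFF characteristic
        frequency, which depends on permeability, fluid viscosity, and patch
        size; here it is a user-supplied assumption.  K_w and K_g are assumed
        water and gas bulk moduli (Pa).
        """
        # relaxed limit: Wood mixing of the fluids, then Gassmann
        K_wood = 1.0 / (Sw / K_w + (1.0 - Sw) / K_g)
        M_low = gassmann_K(K_dry, K_min, K_wood, phi) + 4.0 / 3.0 * mu
        # unrelaxed limit: Hill average of the fully saturated end members
        M_w = gassmann_K(K_dry, K_min, K_w, phi) + 4.0 / 3.0 * mu
        M_g = gassmann_K(K_dry, K_min, K_g, phi) + 4.0 / 3.0 * mu
        M_high = 1.0 / (Sw / M_w + (1.0 - Sw) / M_g)
        # standard-linear-solid relaxation between the two limits
        w = np.asarray(freqs, dtype=float) / f_c
        return (M_high - M_low) / np.sqrt(M_high * M_low) * w / (1.0 + w**2)
    ```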

  18. ConvNetQuake: Convolutional Neural Network for Earthquake Detection and Location

    NASA Astrophysics Data System (ADS)

    Denolle, M.; Perol, T.; Gharbi, M.

    2017-12-01

    Over the last decades, the volume of seismic data has increased exponentially, creating a need for efficient algorithms to reliably detect and locate earthquakes. Today's most elaborate methods scan through the plethora of continuous seismic records in search of repeating seismic signals. In this work, we leverage recent advances in artificial intelligence and present ConvNetQuake, a highly scalable convolutional neural network for probabilistic earthquake detection and location from single stations. We apply our technique to two years of induced seismicity in Oklahoma (USA) and detect 20 times more earthquakes than previously cataloged by the Oklahoma Geological Survey. Our algorithm is also at least one order of magnitude faster than other established detection methods.
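
    A minimal sketch of a single-station detector in this spirit is given below: stacked strided 1-D convolutions over a three-component waveform window, followed by a linear classifier over a "noise" class plus a set of geographic event clusters. The window length, layer count, and channel width are illustrative assumptions (written here in PyTorch), not the published ConvNetQuake configuration.

    ```python
    import torch
    import torch.nn as nn

    class SingleStationDetector(nn.Module):
        """Classifies a 3-component waveform window as noise (class 0) or as an
        event belonging to one of n_clusters geographic clusters.  The window
        length (1024 samples, roughly 10 s at 100 Hz) and layer sizes are
        illustrative assumptions."""
        def __init__(self, n_clusters=6, n_samples=1024, channels=32, n_layers=8):
            super().__init__()
            layers, in_ch = [], 3
            for _ in range(n_layers):
                # each strided 1-D convolution halves the time axis
                layers += [nn.Conv1d(in_ch, channels, kernel_size=3, stride=2, padding=1),
                           nn.ReLU()]
                in_ch = channels
            self.features = nn.Sequential(*layers)
            reduced = n_samples // (2 ** n_layers)   # time samples left after striding
            self.classifier = nn.Linear(channels * reduced, n_clusters + 1)

        def forward(self, x):                        # x: (batch, 3, n_samples)
            z = self.features(x)
            return self.classifier(z.flatten(1))     # logits over noise + clusters

    # example: score a batch of three-component windows
    model = SingleStationDetector()
    logits = model(torch.randn(4, 3, 1024))
    probs = logits.softmax(dim=1)   # class 0 = noise, classes 1..n = location clusters
    ```

    In a continuous-monitoring setting the network would be slid over overlapping windows of the seismogram, which is what makes this family of detectors fast compared with exhaustive template matching.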

  19. Gravity Survey of the Rye Patch KGRA, Rye Patch, Nevada

    NASA Astrophysics Data System (ADS)

    Mcdonald, M. R.; Gosnold, W. D.

    2011-12-01

    The Rye Patch Known Geothermal Resource Area (KGRA) is located in Pershing County, Nevada, on the west side of the Humboldt Range and east of the Rye Patch Reservoir, approximately 200 km northeast of Reno, Nevada. Previous studies include an earlier gravity survey, 3-D seismic reflection, vertical seismic profiling (VSP) on a single well, 3-D seismic imaging, and a report on the integrated seismic studies. Recently, Presco Energy conducted an aeromagnetic survey and is currently applying 2-D VSP methods to target exploration and production wells at the site. These studies have indicated that geothermal fluid flow occurs primarily along faults and fractures and that two potential aquifers are a sandstone/siltstone member of the Triassic Natchez Pass Formation and a karst zone at the interface between Mesozoic limestone and Tertiary volcanics. We hypothesized that the addition of a high-resolution gravity survey would better define the locations, trends, lengths, and dip angles of faults and possible solution-cavity features. The gravity survey encompassed an area of approximately 78 km2 (30 mi2) within the boundary of the KGRA, along with portions of 8 sections directly to the west and 8 sections directly to the east. The survey included 203 stations spaced at 400 m intervals. The simple Bouguer anomaly patterns were coincident with elevation, and those patterns remained after terrain corrections were performed. To remove this signal, the data were further processed using wavelength (bandpass) filtering techniques. The results of the filtering, and comparison with the recent aeromagnetic survey, indicate that the locations and trends of major fault systems can be identified using this technique. Dip angles can be inferred from the anomaly contour gradients. By further narrowing the bandpass window, other features such as possible karst solution channels may also be recognizable. Drilling or other geophysical methods, such as a magnetotelluric survey, may assist in confirming the results. However, the lengths of the features were difficult to interpret because the wavelength filtering tends to truncate features in accordance with the bandpass window. Additional gravity measurements would provide higher resolution for the identification and interpretation of features, particularly in the vicinity of the Humboldt House to the north and in an area south of the study area where a large feature was identified in both the aeromagnetic and gravity surveys.
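
    The wavelength (bandpass) filtering step described above is commonly implemented as a 2-D Fourier-domain filter on the gridded Bouguer anomaly. The sketch below shows the idea; the cutoff wavelengths are illustrative, and a hard pass-band mask is used rather than the tapered filters typical of production potential-field software.

    ```python
    import numpy as np

    def wavelength_bandpass(grid, dx, lam_min, lam_max):
        """Bandpass a gridded Bouguer anomaly by spatial wavelength.

        grid             : 2-D array of anomaly values (e.g., mGal) on a regular grid
        dx               : grid spacing in metres (e.g., 400 m as in this survey)
        lam_min, lam_max : wavelengths (m) to keep; shorter and longer wavelengths
                           are suppressed.
        """
        ny, nx = grid.shape
        # wavenumber magnitude in cycles per metre
        kx = np.fft.fftfreq(nx, d=dx)
        ky = np.fft.fftfreq(ny, d=dx)
        k = np.sqrt(kx[None, :]**2 + ky[:, None]**2)
        # pass band in wavenumber: 1/lam_max <= |k| <= 1/lam_min
        mask = (k >= 1.0 / lam_max) & (k <= 1.0 / lam_min)
        spec = np.fft.fft2(grid - grid.mean())   # remove the regional mean first
        return np.real(np.fft.ifft2(spec * mask))

    # example: keep features with wavelengths between roughly 1 km and 10 km
    # filtered = wavelength_bandpass(bouguer_grid, dx=400.0, lam_min=1e3, lam_max=1e4)
    ```

    Narrowing [lam_min, lam_max] isolates progressively shorter-wavelength features (e.g., possible karst channels) at the cost of truncating longer structures, which is the trade-off noted in the abstract.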

  20. Time-lapse joint inversion of geophysical data with automatic joint constraints and dynamic attributes

    NASA Astrophysics Data System (ADS)

    Rittgers, J. B.; Revil, A.; Mooney, M. A.; Karaoulis, M.; Wodajo, L.; Hickey, C. J.

    2016-12-01

    Joint inversion and time-lapse inversion techniques for geophysical data are often implemented in an attempt to improve imaging of complex subsurface structures and dynamic processes by minimizing the negative effects of random and uncorrelated spatial and temporal noise in the data. We focus on the structural cross-gradient (SCG) approach, which enforces structural similarity between the recovered models, in combination with time-lapse inversion constraints applied to surface-based electrical resistivity and seismic refraction traveltime data. The combination of the two techniques is justified by the underlying petrophysical models. We investigate the benefits and trade-offs of SCG and time-lapse constraints. Using a synthetic case study, we show that a combined joint time-lapse inversion approach provides an overall improvement in the final recovered models. Additionally, we introduce a new approach to reweighting the SCG constraints based on an iteratively updated, normalized ratio of the model sensitivity distributions at each time step. We refer to this new technique as the Automatic Joint Constraints (AJC) approach. The relevance of the new joint time-lapse inversion process is demonstrated on the synthetic example. These approaches are then applied to real time-lapse monitoring field data collected during a quarter-scale earthen-embankment induced-piping failure test. The use of time-lapse joint inversion is justified by the fact that a change of porosity drives concomitant changes in seismic velocities (through its effect on the bulk and shear moduli) and resistivities (through its influence on the formation factor). Combined with the definition of attributes (i.e., specific characteristics) of the evolving target associated with piping, our approach allows us to localize the preferential flow path associated with internal erosion, which is not possible with the other approaches considered.
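
    The structural cross-gradient constraint at the core of this approach penalizes the cross product of the spatial gradients of the two models, which vanishes wherever the models vary in parallel (i.e., share structure). A minimal sketch of the cross-gradient function and a quadratic penalty built from it is given below; the grid spacings and the form of the penalty are illustrative assumptions, and neither the full inversion machinery nor the AJC reweighting described in the abstract is reproduced.

    ```python
    import numpy as np

    def cross_gradient(m1, m2, dx=1.0, dz=1.0):
        """Structural cross-gradient of two 2-D model grids (e.g., log-resistivity
        and slowness), with rows along z and columns along x:
        t = dm1/dx * dm2/dz - dm1/dz * dm2/dx.
        t vanishes where the gradients of the two models are parallel."""
        dm1_dz, dm1_dx = np.gradient(m1, dz, dx)
        dm2_dz, dm2_dx = np.gradient(m2, dz, dx)
        return dm1_dx * dm2_dz - dm1_dz * dm2_dx

    def scg_penalty(m1, m2, dx=1.0, dz=1.0):
        """Scalar SCG misfit term that a joint inversion would add, with some
        trade-off weight, to the data misfits of the two methods."""
        t = cross_gradient(m1, m2, dx, dz)
        return float(np.sum(t**2))
    ```

    In a full joint inversion this penalty (and its derivatives with respect to both models) is minimized together with the resistivity and traveltime data misfits, and in a time-lapse setting additional terms tie each model to the model of the previous time step.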
