Mapping Diffuse Seismicity Using Empirical Matched Field Processing Techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, J; Templeton, D C; Harris, D B
The objective of this project is to detect and locate more microearthquakes using the empirical matched field processing (MFP) method than can be detected using only conventional earthquake detection techniques. We propose that empirical MFP can complement existing catalogs and techniques. We test our method on continuous seismic data collected at the Salton Sea Geothermal Field during November 2009 and January 2010. In the Southern California Earthquake Data Center (SCEDC) earthquake catalog, 619 events were identified in our study area during this time frame, and our MFP technique identified 1094 events. Therefore, we believe that the empirical MFP method, combined with conventional methods, significantly improves the network detection ability in an efficient manner.
Fuzzy logic and image processing techniques for the interpretation of seismic data
NASA Astrophysics Data System (ADS)
Orozco-del-Castillo, M. G.; Ortiz-Alemán, C.; Urrutia-Fucugauchi, J.; Rodríguez-Castellanos, A.
2011-06-01
Since interpretation of seismic data is usually a tedious and repetitive task, the ability to do so automatically or semi-automatically has become an important objective of recent research. We believe that the vagueness and uncertainty in the interpretation process make fuzzy logic an appropriate tool to deal with seismic data. In this work we developed a semi-automated fuzzy inference system to detect the internal architecture of a mass transport complex (MTC) in seismic images. We propose that the observed characteristics of an MTC can be expressed as fuzzy if-then rules consisting of linguistic values associated with fuzzy membership functions. The construction of the fuzzy inference system and the various image processing techniques are presented. We conclude that this is a well-suited problem for fuzzy logic, since the application of the proposed methodology yields a semi-automatically interpreted MTC which closely resembles the MTC from expert manual interpretation.
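To make the idea of linguistic values and membership functions concrete, here is a minimal sketch (not the authors' system) of a single fuzzy rule of the kind described, e.g. "IF reflection amplitude is low AND texture is chaotic THEN the pixel belongs to the MTC"; the attribute names and membership-function breakpoints are illustrative assumptions.

```python
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

def mtc_membership(amplitude, chaos):
    """Evaluate one illustrative rule:
    IF amplitude is LOW AND texture is CHAOTIC THEN pixel belongs to the MTC.
    Inputs are assumed to be normalized seismic attributes in [0, 1]."""
    amp_low = trimf(amplitude, -0.1, 0.0, 0.5)   # 'low amplitude' fuzzy set
    chaotic = trimf(chaos, 0.4, 1.0, 1.1)        # 'chaotic texture' fuzzy set
    return np.minimum(amp_low, chaotic)          # AND via the min t-norm

# Example: a small grid of attribute values extracted from a seismic image
amplitude = np.array([[0.1, 0.8], [0.3, 0.2]])
chaos     = np.array([[0.9, 0.2], [0.7, 0.8]])
print(mtc_membership(amplitude, chaos))          # fuzzy degree of MTC membership per pixel
```

A full system of this kind would aggregate several such rules and defuzzify the result into a crisp MTC/non-MTC label.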
Improving Vintage Seismic Data Quality through Implementation of Advance Processing Techniques
NASA Astrophysics Data System (ADS)
Latiff, A. H. Abdul; Boon Hong, P. G.; Jamaludin, S. N. F.
2017-10-01
It is essential in petroleum exploration to have high-resolution subsurface images, both vertically and horizontally, to uncover new geological and geophysical aspects of the subsurface. The lack of success may have stemmed from poor imaging quality, which led to inaccurate analysis and interpretation. In this work, we re-processed the existing seismic dataset with an emphasis on two objectives. Firstly, to produce better 3D seismic data quality with full retention of relative amplitudes and significantly reduced seismic and structural uncertainty. Secondly, to facilitate further prospect delineation through enhanced data resolution, fault definition and event continuity, particularly in the syn-rift section and at basement-cover contacts, and in turn to better understand the geology of the subsurface, especially with regard to the distribution of the fluvial and channel sands. By adding recent, state-of-the-art broadband processing techniques such as source and receiver de-ghosting, high-density velocity analysis and shallow-water de-multiple, the final results produced better overall reflection detail and frequency content in specific target zones, particularly in the deeper section.
Miller, John J.; Agena, W.F.; Lee, M.W.; Zihlman, F.N.; Grow, J.A.; Taylor, D.J.; Killgore, Michele; Oliver, H.L.
2000-01-01
This CD-ROM contains stacked, migrated, 2-dimensional seismic reflection data and associated support information for 22 regional seismic lines (3,470 line-miles) recorded in the National Petroleum Reserve - Alaska (NPRA) from 1974 through 1981. Together, these lines constitute about one-quarter of the seismic data collected as part of the Federal Government's program to evaluate the petroleum potential of the Reserve. The regional lines, which form a grid covering the entire NPRA, were created by combining various individual lines recorded in different years using different recording parameters. These data were reprocessed by the USGS using modern, post-stack processing techniques, to create a data set suitable for interpretation on interactive seismic interpretation computer workstations. Reprocessing was done in support of ongoing petroleum resource studies by the USGS Energy Program. The CD-ROM contains the following files: 1) 22 files containing the digital seismic data in standard SEG-Y format; 2) 1 file containing navigation data for the 22 lines in standard SEG-P1 format; 3) 22 small-scale graphic images of each seismic line in Adobe Acrobat PDF format; 4) a graphic image of the location map, generated from the navigation file, with hyperlinks to the graphic images of the seismic lines; 5) an ASCII text file with cross-reference information for relating the sequential trace numbers on each regional line to the line number and shotpoint number of the original component lines; and 6) an explanation of the processing used to create the final seismic sections (this document). The SEG-Y format seismic files and SEG-P1 format navigation file contain all the information necessary for loading the data onto a seismic interpretation workstation.
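For readers who want to load such SEG-Y sections outside a commercial workstation, the following is a minimal sketch using the open-source segyio package (our assumption; any SEG-Y reader would do) of how traces and a header word might be pulled from one of the regional-line files; the file name is hypothetical.

```python
import numpy as np
import segyio

# Hypothetical file name for one of the 22 regional lines on the CD-ROM
filename = "npra_line_01.sgy"

# ignore_geometry=True treats the file as a flat list of 2-D traces,
# which is appropriate for stacked, migrated 2-D lines
with segyio.open(filename, "r", ignore_geometry=True) as f:
    n_traces = f.tracecount
    samples = f.samples                                       # time axis from the binary header
    data = np.stack([f.trace[i] for i in range(n_traces)])    # (traces, samples)
    # CDP numbers from the trace headers, useful when tying to the SEG-P1 navigation file
    cdps = [f.header[i][segyio.TraceField.CDP] for i in range(n_traces)]

print(data.shape, samples[0], samples[-1], cdps[:5])
```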
1D Seismic reflection technique to increase depth information in surface seismic investigations
NASA Astrophysics Data System (ADS)
Camilletti, Stefano; Fiera, Francesco; Umberto Pacini, Lando; Perini, Massimiliano; Prosperi, Andrea
2017-04-01
1D seismic methods, such as MASW, Re.Mi. and HVSR, have been used extensively in engineering investigations, bedrock research, Vs profiling and, to some extent, hydrologic applications during the past 20 years. Recent advances in equipment, sound sources and computer interpretation techniques make 1D seismic methods highly effective in shallow subsoil modeling. Classical 1D seismic surveys allow economical collection of subsurface data; however, they fail to return accurate information for depths greater than 50 meters. Using a particular acquisition technique it is possible to collect data that can be quickly processed with the reflection technique in order to obtain more accurate velocity information at depth. Furthermore, data processing returns a narrow stratigraphic section, alongside the 1D velocity model, where lithological boundaries are represented. This work will show how to collect a single-CMP gather to determine: (1) depth of bedrock; (2) gravel layers in clayey domains; (3) an accurate Vs profile. Seismic traces were processed by means of new software developed in collaboration with the SARA electronics instruments S.r.l company, Perugia - ITALY. This software has the great advantage that it can be used directly in the field, reducing the time elapsing between acquisition and processing.
Swept Impact Seismic Technique (SIST)
Park, C.B.; Miller, R.D.; Steeples, D.W.; Black, R.A.
1996-01-01
A coded seismic technique is developed that can result in a higher signal-to-noise ratio than a conventional single-pulse method does. The technique is cost-effective and time-efficient and therefore well suited for shallow-reflection surveys where high resolution and cost-effectiveness are critical. A low-power impact source transmits a few to several hundred high-frequency broad-band seismic pulses during several seconds of recording time according to a deterministic coding scheme. The coding scheme consists of a time-encoded impact sequence in which the rate of impact (cycles/s) changes linearly with time, providing a broad range of impact rates. Impact times used during the decoding process are recorded on one channel of the seismograph. The coding concept combines the vibroseis swept-frequency and the Mini-Sosie random impact concepts. The swept-frequency concept greatly improves the suppression of correlation noise with far fewer impacts than normally used in the Mini-Sosie technique. The impact concept makes the technique simple and efficient in generating high-resolution seismic data, especially in the presence of noise. The transfer function of the impact sequence simulates a low-cut filter with the cutoff frequency the same as the lowest impact rate. This property can be used to attenuate low-frequency ground-roll noise without using an analog low-cut filter or a spatial source (or receiver) array as is necessary with a conventional single-pulse method. Because of the discontinuous coding scheme, the decoding process is accomplished by a "shift-and-stacking" method that is much simpler and quicker than cross-correlation. The simplicity of the coding allows the mechanical design of the source to remain simple. Several different types of mechanical systems could be adapted to generate a linear impact sweep. In addition, the simplicity of the coding also allows the technique to be used with conventional acquisition systems, with only minor modifications.
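A minimal sketch of the shift-and-stack decoding idea described above: the continuous record is sliced at each recorded impact time and the slices are summed, so coherent reflections align while noise stacks incoherently. The variable names and the synthetic trace are illustrative assumptions, not the authors' implementation (which uses a linear sweep of the impact rate; random intervals suffice for this illustration).

```python
import numpy as np

def shift_and_stack(record, impact_samples, record_len):
    """Decode a swept-impact recording by aligning the trace on each impact
    time and summing (the 'shift-and-stack' alternative to cross-correlation)."""
    out = np.zeros(record_len)
    for i0 in impact_samples:
        segment = record[i0:i0 + record_len]
        out[:len(segment)] += segment
    return out / len(impact_samples)

# Tiny synthetic example: one 'reflection' arriving 40 samples after every impact
rng = np.random.default_rng(0)
listen = 200                                              # listening time per impact, in samples
impacts = np.cumsum(rng.integers(60, 140, size=300))      # impact times (samples)
record = rng.normal(0, 1.0, impacts[-1] + listen)         # ambient noise
record[impacts + 40] += 5.0                               # weak reflection after each impact
stacked = shift_and_stack(record, impacts, listen)
print(np.argmax(np.abs(stacked)))                         # recovers the reflection at ~sample 40
```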
Application of seismic-refraction techniques to hydrologic studies
Haeni, F.P.
1986-01-01
During the past 30 years, seismic-refraction methods have been used extensively in petroleum, mineral, and engineering investigations, and to some extent for hydrologic applications. Recent advances in equipment, sound sources, and computer interpretation techniques make seismic refraction a highly effective and economical means of obtaining subsurface data in hydrologic studies. Aquifers that can be defined by one or more high seismic-velocity surfaces, such as (1) alluvial or glacial deposits in consolidated rock valleys, (2) limestone or sandstone underlain by metamorphic or igneous rock, or (3) saturated unconsolidated deposits overlain by unsaturated unconsolidated deposits, are ideally suited for applying seismic-refraction methods. These methods allow the economical collection of subsurface data, provide the basis for more efficient collection of data by test drilling or aquifer tests, and result in improved hydrologic studies. This manual briefly reviews the basics of seismic-refraction theory and principles. It emphasizes the use of this technique in hydrologic investigations and describes the planning, equipment, field procedures, and interpretation techniques needed for this type of study. Examples of the use of seismic-refraction techniques in a wide variety of hydrologic studies are presented.
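As a worked illustration of the kind of refraction interpretation the manual covers, here is a sketch of the classic two-layer, flat-interface case: the depth to the refractor follows from the crossover distance of the direct and refracted first arrivals and the two velocities. The numerical values are illustrative assumptions.

```python
import numpy as np

def depth_from_crossover(x_cross, v1, v2):
    """Depth to a flat refractor from the crossover distance (two-layer case):
    z = (x_cross / 2) * sqrt((v2 - v1) / (v2 + v1))."""
    return 0.5 * x_cross * np.sqrt((v2 - v1) / (v2 + v1))

# Illustrative values: unsaturated deposits (450 m/s) over saturated deposits (1500 m/s)
v1, v2 = 450.0, 1500.0         # layer velocities, m/s
x_cross = 60.0                 # crossover distance read from the travel-time plot, m
print(f"Depth to refractor (water table) = {depth_from_crossover(x_cross, v1, v2):.1f} m")
```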
Application of seismic-refraction techniques to hydrologic studies
Haeni, F.P.
1988-01-01
During the past 30 years, seismic-refraction methods have been used extensively in petroleum, mineral, and engineering investigations and to some extent for hydrologic applications. Recent advances in equipment, sound sources, and computer interpretation techniques make seismic refraction a highly effective and economical means of obtaining subsurface data in hydrologic studies. Aquifers that can be defined by one or more high-seismic-velocity surfaces, such as (1) alluvial or glacial deposits in consolidated rock valleys, (2) limestone or sandstone underlain by metamorphic or igneous rock, or (3) saturated unconsolidated deposits overlain by unsaturated unconsolidated deposits, are ideally suited for seismic-refraction methods. These methods allow economical collection of subsurface data, provide the basis for more efficient collection of data by test drilling or aquifer tests, and result in improved hydrologic studies. This manual briefly reviews the basics of seismic-refraction theory and principles. It emphasizes the use of these techniques in hydrologic investigations and describes the planning, equipment, field procedures, and interpretation techniques needed for this type of study. Furthermore, examples of the use of seismic-refraction techniques in a wide variety of hydrologic studies are presented.
NASA Astrophysics Data System (ADS)
Vinogradov, Y.; Baryshnikov, A.
2003-04-01
Since September 2001, three infrasound membrane-type sensors ("K-304 AM") have been installed on the territory of the seismic array "Apatity" near Lake Imandra. The seismic array comprises 11 short-period sensors (type "Geotech S-500") disposed on a small and a large circle (0.4 and 1 km diameter). The infrasound sensors are located on the small circle near the seismographs. All data are digitized at the array site and transmitted in real time to a processing center at the Kola Regional Seismological Centre (KRSC) in Apatity. The combined installation is called the Seismic & Infrasound Integrated Array (SISIA) "Apatity". To support temporary storage of the transmitted data in a disk loop and access to the data, the "NEWNORAC" program was created. This program replaced the "NORAC" system developed by the Norwegian institute NORSAR, which was previously in use at KRSC. The program package EL (event locator) for display and processing of the data has been modified. It now includes the following: quick access to the data stored in the disk loop (last two weeks); data conversion from the disk-loop format to CSS 3.0 format; data filtering using bandpass, highpass, lowpass, adaptive or band-reject filters; calculation of spectra and sonograms (spectral diagrams); location of seismic events with plotting on a map; calculation of the backazimuth and apparent velocity of acoustic waves from similar parts of the wave recordings; and loading and processing of CSS 3.0 seismic and acoustic data from the KRSC archive. To store the acoustic data permanently, the program BARCSS was written; it copies the data from the disk loop to the KRSC archive in CSS 3.0 format. For comparison of the acoustic noise level with wind, we use data from the meteorological station in Kandalaksha city, with a sampling interval of 3 hours. During the period from October 2001 to October 2002, more than 745 seismic events were registered, basically connected with the mining activity of the large mining enterprises on the Kola Peninsula. Most of the events, caused by ground explosions
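The backazimuth and apparent-velocity calculation mentioned for the EL package can be illustrated with a minimal plane-wave least-squares sketch (our illustration, not the KRSC code): given relative arrival times of the same phase at several array elements, solve t_i = t_0 + s · r_i for the horizontal slowness vector s. The station coordinates and picks below are made up.

```python
import numpy as np

def slowness_from_picks(xy, t):
    """Least-squares plane-wave fit: arrival time at station i is
    t_i = t0 + sx*x_i + sy*y_i (x east, y north, km).
    Returns backazimuth (deg) and apparent velocity (km/s)."""
    A = np.column_stack([np.ones(len(t)), xy[:, 0], xy[:, 1]])
    (_, sx, sy), *_ = np.linalg.lstsq(A, t, rcond=None)
    s = np.hypot(sx, sy)                               # slowness magnitude, s/km
    baz = np.degrees(np.arctan2(-sx, -sy)) % 360.0     # direction the wave arrives FROM
    return baz, 1.0 / s

# Hypothetical station coordinates (km, relative to array centre) and arrival-time picks (s)
xy = np.array([[0.0, 0.2], [0.2, 0.0], [0.0, -0.2], [-0.2, 0.0], [0.1, 0.1]])
t  = np.array([0.000, 0.048, 0.096, 0.048, 0.024])
baz, v_app = slowness_from_picks(xy, t)
print(f"backazimuth = {baz:.0f} deg, apparent velocity = {v_app:.2f} km/s")
```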
Broadband seismic : case study modeling and data processing
NASA Astrophysics Data System (ADS)
Cahyaningtyas, M. B.; Bahar, A.
2018-03-01
Seismic data with a wide range of frequencies is needed due to the close relation of bandwidth to resolution and to the depth of the target. Low frequencies provide deeper penetration for the imaging of deep targets. In addition, the wider the frequency bandwidth, the sharper the wavelet. A sharp wavelet is responsible for high-resolution imaging and is very helpful for resolving thin beds. As a result, the demand for broadband seismic data is rising, and it spurs the development of broadband seismic technology in the oil and gas industry. An obstacle frequently found in marine seismic data is the existence of ghosts, which limit the frequency bandwidth contained in the seismic data and render it band-limited. To reduce the ghost effect and to acquire broadband seismic data, many approaches are used, both in the acquisition and in the processing of seismic data. One of the acquisition techniques applied is the multi-level streamer, where streamers are towed at several depth levels. A multi-level streamer yields data with varied ghost notches in the frequency domain. If the ghost notches are not overlapping, the summation of the multi-level streamer data will reduce the ghost effect. The result of the multi-level streamer data processing shows that a reduction of the ghost notches in the frequency domain indeed takes place.
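A minimal numerical sketch of why summing recordings from streamers at two depths fills the ghost notches: the vertical-incidence receiver-ghost response at depth d has notches at f = n·c/(2d), so the notches of the two levels do not coincide and the summed response has no deep zeros. The streamer depths and water velocity below are illustrative assumptions.

```python
import numpy as np

def ghost_response(f, depth, c=1500.0):
    """Vertical-incidence receiver ghost: primary plus free-surface reflection
    (coefficient -1) delayed by the two-way travel time 2*depth/c."""
    return 1.0 - np.exp(-2j * np.pi * f * 2.0 * depth / c)

f = np.linspace(1.0, 120.0, 500)                  # Hz
shallow = ghost_response(f, depth=12.0)           # notches at 0 and 62.5 Hz
deep    = ghost_response(f, depth=20.0)           # notches at 0, 37.5, 75.0, 112.5 Hz
combined = 0.5 * (shallow + deep)                 # simple summation of the two levels

for name, spec in [("12 m", shallow), ("20 m", deep), ("sum", combined)]:
    print(f"{name}: min |response| above 5 Hz = {np.abs(spec[f > 5]).min():.2f}")
```

The summed spectrum keeps a nonzero minimum across the band, which is the effect reported in the processing result above.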
Seismic Hazard Analysis as a Controlling Technique of Induced Seismicity in Geothermal Systems
NASA Astrophysics Data System (ADS)
Convertito, V.; Sharma, N.; Maercklin, N.; Emolo, A.; Zollo, A.
2011-12-01
The effects of induced seismicity in geothermal systems during stimulation and fluid circulation can cover a wide range, from light and unfelt to severe and damaging shaking. If a modern geothermal system is to achieve the highest efficiency while remaining acceptable from the social point of view, it must be managed so that possible impacts are reduced in advance. In this framework, automatic control of the seismic response of the stimulated reservoir is nowadays mandatory, particularly in proximity to densely populated areas. Recently, techniques have been proposed for this purpose, mainly based on the concept of the traffic light. This system provides a tool to decide the level of the stimulation rate based on real-time analysis of the induced seismicity and the ongoing ground-motion values. However, in some cases the induced effect can be delayed with respect to the time when the reservoir is stimulated. Thus, a controlling technique able to estimate ground-motion levels on different time scales can help to better manage the geothermal system. Here we present an adaptation of classical probabilistic seismic hazard analysis to the case where the seismicity rate as well as the propagation-medium properties are not constant in time. For modeling purposes we use a non-homogeneous seismicity model, in which the seismicity rate and the b-value of the recurrence relationship change with time. Additionally, as a further controlling procedure, we propose a moving-time-window analysis of the recorded peak ground-motion values aimed at monitoring changes in the propagation medium. In fact, for the same set of magnitude values recorded at the same stations, we expect that, on average, peak ground-motion values attenuate in the same way. As a consequence, the residual differences can reasonably be ascribed to changes in medium properties. These changes can be modeled and directly introduced into the hazard integral. We applied the proposed
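As a minimal sketch of the moving-time-window residual analysis described above (not the authors' implementation), one can track the mean difference between observed log peak ground motions and the predictions of a fixed ground-motion model in successive windows; a drift of that mean away from zero flags a change in medium properties. The ground-motion model coefficients and the synthetic catalogue below are made-up assumptions.

```python
import numpy as np

def gmpe_log_pga(mag, dist_km, a=-1.5, b=0.9, c=-1.6):
    """Toy ground-motion prediction: log10 PGA = a + b*M + c*log10(R). Coefficients illustrative."""
    return a + b * mag + c * np.log10(dist_km)

def windowed_residual_means(times, mags, dists, log_pga_obs, window=7.0, step=1.0):
    """Mean residual (observed minus predicted log10 PGA) in moving windows of `window` days."""
    starts = np.arange(times.min(), times.max() - window, step)
    means = []
    for t0 in starts:
        sel = (times >= t0) & (times < t0 + window)
        res = log_pga_obs[sel] - gmpe_log_pga(mags[sel], dists[sel])
        means.append(res.mean() if sel.any() else np.nan)
    return starts, np.array(means)

# Synthetic catalogue: after day 30 the medium attenuates more strongly (-0.1 log10 units)
rng = np.random.default_rng(1)
n = 400
times = np.sort(rng.uniform(0, 60, n))            # days
mags = rng.uniform(0.5, 2.5, n)
dists = rng.uniform(1.0, 8.0, n)                  # km
obs = gmpe_log_pga(mags, dists) + rng.normal(0, 0.15, n) - 0.1 * (times > 30)
starts, means = windowed_residual_means(times, mags, dists, obs)
print(means[:3].round(3), means[-3:].round(3))    # mean residual drifts from ~0 to ~-0.1
```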
Seismic Techniques for Subsurface Voids Detection
NASA Astrophysics Data System (ADS)
Gritto, Roland; Korneev, Valeri; Elobaid Elnaiem, Ali; Mohamed, Fathelrahman; Sadooni, Fadhil
2016-04-01
A major hazard in Qatar is the presence of karst, which is ubiquitous throughout the country and includes depressions, sinkholes, and caves. Causes for the development of karst include faulting and fracturing, where fluids find pathways through limestone and dissolve the host rock to form caverns. Of particular concern in rapidly growing metropolitan areas that expand into heretofore unexplored regions is the collapse of such caverns. Because Qatar has seen a recent boom in construction, including the planning and development of complete new sub-sections of metropolitan areas, the development areas need to be investigated for the presence of karst to determine their suitability for the planned project. In this paper, we present the results of a study to demonstrate a variety of seismic techniques to detect the presence of a karst analog in the form of a vertical water-collection shaft located on the campus of Qatar University, Doha, Qatar. Seismic waves are well suited for karst detection and characterization. Voids represent high-contrast seismic objects that exhibit strong responses to incident seismic waves. However, the complex geometry of karst, including shape and size, makes imaging nontrivial. While karst detection can be reduced to the simple problem of detecting an anomaly, karst characterization can be complicated by the 3D nature of the problem at unknown scale, where irregular surfaces can generate diffracted waves of different kinds. In our presentation we employ a variety of seismic techniques to demonstrate the detection and characterization of a vertical water-collection shaft, analyzing the phase, amplitude and spectral information of seismic waves that have been scattered by the object. We used the reduction in seismic wave amplitudes and the delay in phase arrival times in the geometrical shadow of the vertical shaft to independently detect and locate the object in space. Additionally, we use narrow band-pass filtered data combining two
Automated Processing Workflow for Ambient Seismic Recordings
NASA Astrophysics Data System (ADS)
Girard, A. J.; Shragge, J.
2017-12-01
Structural imaging using body-wave energy present in ambient seismic data remains a challenging task, largely because these wave modes are commonly much weaker than surface-wave energy. In a number of situations body-wave energy has been extracted successfully; however, nearly all successful body-wave extraction and imaging approaches have focused on cross-correlation processing. While this is useful for interferometric purposes, it can also lead to the inclusion of unwanted noise events that dominate the resulting stack, leaving body-wave energy overpowered by the coherent noise. Conversely, wave-equation imaging can be applied directly to non-correlated ambient data that has been preprocessed to mitigate unwanted energy (i.e., surface waves, burst-like and electromechanical noise) and enhance body-wave arrivals. Following this approach, though, requires a significant preprocessing effort on what is often terabytes of ambient seismic data, which is expensive and requires automation to be feasible. In this work we outline an automated processing workflow designed to optimize body-wave energy from an ambient seismic data set acquired on a large-N array at a mine site near Lalor Lake, Manitoba, Canada. We show that processing ambient seismic data in the recording domain, rather than the cross-correlation domain, allows us to mitigate energy that is inappropriate for body-wave imaging. We first develop a method for window selection that automatically identifies and removes data contaminated by coherent high-energy bursts. We then apply time- and frequency-domain debursting techniques to mitigate the effects of remaining strong-amplitude and/or monochromatic energy without severely degrading the overall waveforms. After each processing step we implement a QC check to investigate improvements in the convergence rates - and the emergence of reflection events - in the cross-correlation-plus-stack waveforms over hour-long windows. Overall, the QC analyses suggest that
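A minimal sketch of the kind of automated window selection described above: split the continuous record into fixed-length windows, compute a robust amplitude statistic per window, and reject windows whose energy exceeds a multiple of the median. This is an illustration of the idea under stated assumptions, not the authors' workflow.

```python
import numpy as np

def reject_burst_windows(trace, fs, win_s=60.0, threshold=3.0):
    """Return the window start indices kept after discarding windows whose RMS
    amplitude exceeds `threshold` times the median RMS (coherent high-energy bursts)."""
    n = int(win_s * fs)
    starts = np.arange(0, len(trace) - n + 1, n)
    rms = np.array([np.sqrt(np.mean(trace[s:s + n] ** 2)) for s in starts])
    keep = rms <= threshold * np.median(rms)
    return starts[keep], rms

# Synthetic example: one hour of ambient noise with two strong bursts injected
rng = np.random.default_rng(2)
fs = 100.0
trace = rng.normal(0, 1.0, int(3600 * fs))
trace[120_000:121_000] += 40.0 * rng.normal(size=1000)   # burst 1
trace[300_000:301_000] += 40.0 * rng.normal(size=1000)   # burst 2
kept, rms = reject_burst_windows(trace, fs)
print(f"kept {len(kept)} of {len(rms)} windows")          # the two burst windows are rejected
```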
Digital processing of array seismic recordings
Ryall, Alan; Birtill, John
1962-01-01
This technical letter contains a brief review of the operations which are involved in digital processing of array seismic recordings by the methods of velocity filtering, summation, cross-multiplication and integration, and by combinations of these operations (the "UK Method" and multiple correlation). Examples are presented of analyses by the several techniques on array recordings which were obtained by the U.S. Geological Survey during chemical and nuclear explosions in the western United States. Seismograms are synthesized using actual noise and Pn-signal recordings, such that the signal-to-noise ratio, onset time and velocity of the signal are predetermined for the synthetic record. These records are then analyzed by summation, cross-multiplication, multiple correlation and the UK technique, and the results are compared. For all of the examples presented, analysis by the non-linear techniques of multiple correlation and cross-multiplication of the traces on an array recording is preferred to analysis by the linear operations involved in summation and the UK Method.
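To make the contrast between the linear and non-linear array operations concrete, here is a minimal synthetic sketch (our illustration, not the 1962 processing) comparing a delay-and-sum beam with a cross-multiplication of the aligned traces; the multiplicative trace suppresses incoherent noise more aggressively but is non-linear in amplitude. Array geometry, delays and noise levels are made up.

```python
import numpy as np

def align(traces, delays):
    """Shift each trace by its integer-sample delay so the target phase lines up."""
    return np.stack([np.roll(tr, -d) for tr, d in zip(traces, delays)])

rng = np.random.default_rng(3)
n_sta, n_samp, onset = 6, 2000, 1000
pulse = np.zeros(n_samp)
pulse[onset:onset + 50] = np.hanning(50)                       # Pn-like signal pulse
delays = np.array([0, 5, 10, 15, 20, 25])                      # moveout across the array
traces = np.stack([np.roll(pulse, d) + rng.normal(0, 0.5, n_samp) for d in delays])

aligned = align(traces, delays)
beam = aligned.mean(axis=0)                                    # linear: summation (delay-and-sum)
product = np.prod(aligned, axis=0)                             # non-linear: cross-multiplication

snr = lambda x: np.abs(x[onset:onset + 50]).max() / x[:onset - 100].std()
print(f"sum SNR = {snr(beam):.1f}, product SNR = {snr(product):.1f}")
```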
A Comparison of seismic instrument noise coherence analysis techniques
Ringler, A.T.; Hutt, C.R.; Evans, J.R.; Sandoval, L.D.
2011-01-01
The self-noise of a seismic instrument is a fundamental characteristic used to evaluate the quality of the instrument. It is important to be able to measure this self-noise robustly, to understand how differences among test configurations affect the tests, and to understand how different processing techniques and isolation methods (from nonseismic sources) can contribute to differences in results. We compare two popular coherence methods used for calculating incoherent noise, which is widely used as an estimate of instrument self-noise (incoherent noise and self-noise are not strictly identical but in observatory practice are approximately equivalent; Holcomb, 1989; Sleeman et al., 2006). Beyond directly comparing these two coherence methods on similar models of seismometers, we compare how small changes in test conditions can contribute to incoherent-noise estimates. These conditions include timing errors, signal-to-noise ratio changes (ratios between background noise and instrument incoherent noise), relative sensor locations, misalignment errors, processing techniques, and different configurations of sensor types.
Seismic signal processing on heterogeneous supercomputers
NASA Astrophysics Data System (ADS)
Gokhberg, Alexey; Ermert, Laura; Fichtner, Andreas
2015-04-01
The processing of seismic signals - including the correlation of massive ambient noise data sets - represents an important part of a wide range of seismological applications. It is characterized by large data volumes as well as high computational and input/output intensity. Development of efficient approaches towards seismic signal processing on emerging high performance computing systems is therefore essential. Heterogeneous supercomputing systems introduced in recent years provide numerous computing nodes interconnected via high-throughput networks, every node containing a mix of processing elements of different architectures, such as several sequential processor cores and one or a few graphical processing units (GPUs) serving as accelerators. A typical representative of such computing systems is "Piz Daint", a supercomputer of the Cray XC 30 family operated by the Swiss National Supercomputing Center (CSCS), which we used in this research. Heterogeneous supercomputers provide an opportunity for a manifold increase in application performance and are more energy-efficient; however, they have much higher hardware complexity and are therefore much more difficult to program. The programming effort may be substantially reduced by the introduction of modular libraries of software components that can be reused for a wide class of seismology applications. The ultimate goal of this research is the design of a prototype for such a library suitable for implementing various seismic signal processing applications on heterogeneous systems. As a representative use case we have chosen an ambient noise correlation application. Ambient noise interferometry has developed into one of the most powerful tools to image and monitor the Earth's interior. Future applications will require the extraction of increasingly small details from noise recordings. To meet this demand, more advanced correlation techniques combined with very large data volumes are needed. This poses new computational problems that
Alsep data processing: How we processed Apollo Lunar Seismic Data
NASA Technical Reports Server (NTRS)
Latham, G. V.; Nakamura, Y.; Dorman, H. J.
1979-01-01
The Apollo lunar seismic station network gathered data continuously at a rate of 3 × 10^8 bits per day for nearly eight years until its termination in September 1977. The data were processed and analyzed using a PDP-15 minicomputer. On average, 1500 long-period seismic events were detected yearly. Automatic event detection and identification schemes proved unsuccessful because of occasional high noise levels and, above all, the risk of overlooking unusual natural events. The processing procedures finally settled on consist of first plotting all the data on a compressed time scale, visually picking events from the plots, transferring event data to separate sets of tapes and performing detailed analyses using the latter. Many problems remain, especially for automatically processing extraterrestrial seismic signals.
Exploring the interior of Venus with seismic and infrasonic techniques
NASA Astrophysics Data System (ADS)
Jackson, J. M.; Cutts, J. A.; Pauken, M.; Komjathy, A.; Smrekar, S. E.; Kedar, S.; Mimoun, D.; Garcia, R.; Schubert, G.; Lebonnois, S.; Stevenson, D. J.; Lognonne, P. H.; Zhan, Z.; Ko, J. Y. T.; Tsai, V. C.
2016-12-01
The dense atmosphere of Venus, which efficiently couples seismic energy into the atmosphere as infrasonic waves, enables an alternative to conventional seismology: detection of infrasonic waves in the upper atmosphere using either high-altitude balloons or orbiting spacecraft. Infrasonic techniques for probing the interior of Venus can be implemented without exposing sensors to the severe surface environment on Venus. This approach takes advantage of the fact that approximately sixty times as much energy from a seismic event on Venus is coupled into the atmosphere as would occur for a comparable event on Earth. The direct or epicentral wave propagates vertically above the event, and the indirect wave propagates through the planet as a Rayleigh wave and then couples to an infrasonic wave. Although there is abundant evidence of tectonic activity on Venus, questions remain as to whether the planet is still active and whether energy releases are seismic or aseismic. In recent years, seismologists have developed techniques for probing crustal and interior structure in parts of the Earth where there are very few quakes. We have begun an effort to determine if this is also possible for Venus. Just as seismic energy propagates more efficiently upward across the surface-atmosphere interface, acoustic energy originating in the atmosphere will propagate downward more effectively. Measurements from a balloon platform in the atmosphere of Venus could assess the nature and spectral content of such sources, while having the ability to identify and discriminate signatures from volcanic events, storm activity, and meteor impacts. We will discuss our ongoing assessment of the feasibility of a balloon acoustic monitoring system. In particular, we will highlight results of the flight experiment on Earth that focuses on using barometer instruments on a tethered helium-filled balloon in the vicinity of a known seismic source generated by a seismic hammer
NASA Astrophysics Data System (ADS)
Labak, Peter; Lindblom, Pasi; Malich, Gregor
2017-04-01
The Integrated Field Exercise of 2014 (IFE14) was a field event held in the Hashemite Kingdom of Jordan (with concurrent activities in Austria) during which the operational and technical capabilities of a Comprehensive Test Ban Treaty (CTBT) on-site inspection (OSI) were tested in an integrated manner. Many of the inspection techniques permitted by the CTBT were applied during IFE14, including a range of geophysical techniques; however, one of the techniques foreseen by the CTBT but not yet developed is resonance seismometry. During August and September 2016, seismic field measurements were conducted in the region of Kylylahti, Finland, in support of the further development of geophysical seismic techniques for OSIs. 45 seismic stations were used to continuously acquire seismic signals. During that period, data from local, regional and teleseismic natural and man-made events were acquired, including from a devastating earthquake in Italy and the nuclear explosion announced by the Democratic People's Republic of Korea on 9 September 2016. Also, data were acquired following the small-scale use of man-made chemical explosives in the area and the use of vibratory sources. This presentation will show examples from the data set and will discuss its use for the development of resonance seismometry for OSIs.
Processing Approaches for DAS-Enabled Continuous Seismic Monitoring
NASA Astrophysics Data System (ADS)
Dou, S.; Wood, T.; Freifeld, B. M.; Robertson, M.; McDonald, S.; Pevzner, R.; Lindsey, N.; Gelvin, A.; Saari, S.; Morales, A.; Ekblaw, I.; Wagner, A. M.; Ulrich, C.; Daley, T. M.; Ajo Franklin, J. B.
2017-12-01
Distributed Acoustic Sensing (DAS) is creating a "field as laboratory" capability for seismic monitoring of subsurface changes. By providing unprecedented spatial and temporal sampling at a relatively low cost, DAS enables field-scale seismic monitoring to have durations and temporal resolutions that are comparable to those of laboratory experiments. Here we report on seismic processing approaches developed during data analyses of three case studies, all using DAS-enabled seismic monitoring, with applications ranging from shallow permafrost to deep reservoirs: (1) 10-hour downhole monitoring of cement curing at Otway, Australia; (2) 2-month surface monitoring of controlled permafrost thaw at Fairbanks, Alaska; (3) multi-month downhole and surface monitoring of carbon sequestration at Decatur, Illinois. We emphasize the data management and processing components relevant to DAS-based seismic monitoring, which include scalable approaches to data management, pre-processing, denoising, filtering, and wavefield decomposition. DAS has dramatically increased the data volume to the extent that terabyte-per-day data loads are now typical, straining conventional approaches to data storage and processing. To achieve more efficient use of disk space and network bandwidth, we explore improved file structures and data compression schemes. Because the noise floor of DAS measurements is higher than that of conventional sensors, an optimal processing workflow involving advanced denoising, deconvolution (of the source signatures), and stacking approaches is being established to maximize the signal content of DAS data. The resulting workflow of data management and processing could accelerate the broader adoption of DAS for continuous monitoring of critical processes.
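As an illustration of the data-management side, here is a minimal sketch (assuming h5py is available; the array sizes and dataset names are made up) of writing a chunked, compressed DAS data block to HDF5, the kind of file-structure and compression choice discussed above for taming terabyte-per-day loads.

```python
import numpy as np
import h5py

# Hypothetical ten-second DAS block: 1000 channels sampled at 1 kHz
n_ch, fs, seconds = 1000, 1000, 10
block = (np.random.default_rng(4)
         .normal(0, 1, (n_ch, fs * seconds))
         .astype(np.float32))                       # float32 halves the footprint vs float64

with h5py.File("das_block.h5", "w") as f:
    dset = f.create_dataset(
        "strain_rate",
        data=block,
        chunks=(64, fs),               # chunk = 64 channels x 1 s, matches typical access patterns
        compression="gzip",
        compression_opts=4,            # moderate compression: trade CPU for disk/network
        shuffle=True,                  # byte-shuffle filter usually improves gzip ratio on floats
    )
    dset.attrs["sampling_rate_hz"] = fs
    dset.attrs["channel_spacing_m"] = 1.0           # illustrative metadata

print(f"wrote das_block.h5 ({block.nbytes / 1e6:.0f} MB of raw samples)")
```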
The use of vertical seismic profiles in seismic investigations of the earth
Balch, Alfred H.; Lee, M.W.; Miller, J.J.; Ryder, Robert T.
1982-01-01
During the past 8 years, the U.S. Geological Survey has conducted an extensive investigation on the use of vertical seismic profiles (VSP) in a variety of seismic exploration applications. Seismic sources used were surface air guns, vibrators, explosives, marine air guns, and downhole air guns. Source offsets have ranged from 100 to 7800 ft. Well depths have been from 1200 to over 10,000 ft. We have found three specific ways in which VSPs can be applied to seismic exploration. First, seismic events observed at the surface of the ground can be traced, level by level, to their point of origin within the earth. Thus, one can tie a surface profile to a well log with an extraordinarily high degree of confidence. Second, one can establish the detectability of a target horizon, such as a porous zone. One can determine (either before or after surface profiling) whether or not a given horizon or layered sequence returns a detectable reflection to the surface. The amplitude and character of the reflection can also be observed. Third, acoustic properties of a stratigraphic sequence can be measured and sometimes correlated to important exploration parameters. For example, sometimes a relationship between apparent attenuation and sand percentage can be established. The technique shows additional promise of aiding surface exploration indirectly through studies of the evolution of the seismic pulse, studies of ghosts and multiples, and studies of seismic trace inversion techniques. Nearly all current seismic data‐processing techniques are adaptable to the processing of VSP data, such as normal moveout (NMO) corrections, stacking, single‐and multiple‐channel filtering, deconvolution, and wavelet shaping.
Subspace Dimensionality: A Tool for Automated QC in Seismic Array Processing
NASA Astrophysics Data System (ADS)
Rowe, C. A.; Stead, R. J.; Begnaud, M. L.
2013-12-01
Because of the great resolving power of seismic arrays, the application of automated processing to array data is critically important in treaty verification work. A significant problem in array analysis is the inclusion of bad sensor channels in the beamforming process. We are testing an approach to automated, on-the-fly quality control (QC) to aid in the identification of poorly performing sensor channels prior to beam-forming in routine event detection or location processing. The idea stems from methods used for large computer servers, where monitoring traffic at enormous numbers of nodes is impractical on a node-by-node basis, so the dimensionality of the node traffic is instead monitored for anomalies that could represent malware, cyber-attacks or other problems. The technique relies upon the use of subspace dimensionality or principal components of the overall system traffic. The subspace technique is not new to seismology, but its most common application has been limited to comparing waveforms to an a priori collection of templates for detecting highly similar events in a swarm or seismic cluster. In the established template application, a detector functions in a manner analogous to waveform cross-correlation, applying a statistical test to assess the similarity of the incoming data stream to known templates for events of interest. In our approach, we seek not to detect matching signals; instead, we examine the signal subspace dimensionality in much the same way that the method addresses node traffic anomalies in large computer systems. Signal anomalies recorded on seismic arrays affect the dimensional structure of the array-wide time series. We have shown previously that this observation is useful in identifying real seismic events, either by looking at the raw signal or derivatives thereof (entropy, kurtosis), but here we explore the effects of malfunctioning channels on the dimension of the data and its derivatives, and how to leverage this effect for
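A minimal sketch of the dimensionality idea (our generic illustration of the technique, not the authors' code): form a channels-by-samples matrix from a time window of array data, take its singular values, and use the number of components needed to capture most of the energy as an effective dimension. A dead or spiking channel changes that dimension even when no event is present. The array size and the synthetic bad channel below are assumptions.

```python
import numpy as np

def effective_dimension(window, energy=0.95):
    """Number of singular values needed to explain `energy` of the total variance
    of a (channels x samples) array-data window."""
    s = np.linalg.svd(window - window.mean(axis=1, keepdims=True), compute_uv=False)
    frac = np.cumsum(s**2) / np.sum(s**2)
    return int(np.searchsorted(frac, energy) + 1)

rng = np.random.default_rng(5)
n_ch, n_samp = 20, 3000
quiet = rng.normal(0, 1, (n_ch, n_samp))                    # background noise only
bad = quiet.copy()
bad[7] = 0.0
bad[7, ::250] = 200.0                                       # channel 7 dead except for spikes

print("quiet window dimension:", effective_dimension(quiet))
print("bad-channel window dimension:", effective_dimension(bad))
```

The malfunctioning channel collapses most of the window energy onto a few components, so tracking this number over time flags the problem before beamforming.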
On vertical seismic profile processing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tariel, P.; Michon, D.
1984-10-01
From the wealth of information which can be deduced from VSP, the information most directly comparable to well logs is considered: P-wave and S-wave interval velocity, acoustic impedance, and the velocity ratio γ = Vs/Vp. This information not only allows better interpretation of surface seismic sections but also improves processing. For these results to be usable, a number of precautions must be taken during acquisition and processing; the sampling in depth should be chosen in such a way that aliasing phenomena do not unnecessarily limit the spectra during the separation of upward- and downward-travelling waves. True amplitudes should be respected and checked by recording of signatures, and the interference of upward- and downward-travelling waves should be taken into account for the picking of first arrivals. The different steps in processing and the combination of results in the interpretation of surface seismic results are described with actual records.
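The P-wave interval velocity mentioned above is obtained, in its simplest form, from first-arrival times picked at successive geophone depths; a minimal sketch with made-up picks:

```python
import numpy as np

def interval_velocities(depths_m, first_break_s):
    """Interval velocity between successive VSP geophone levels: v = dz / dt."""
    dz = np.diff(depths_m)
    dt = np.diff(first_break_s)
    return dz / dt

# Hypothetical picks every 50 ft (15.24 m) down a well
depths = np.arange(300.0, 600.0, 15.24)                      # m
picks = depths / 2500.0 + 0.002 * np.sin(depths / 80.0)      # synthetic first-break times (s)
v_int = interval_velocities(depths, picks)
print(np.round(v_int, 0))                                    # interval velocities, m/s
```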
Improved Seismic Acquisition System and Data Processing for the Italian National Seismic Network
NASA Astrophysics Data System (ADS)
Badiali, L.; Marcocci, C.; Mele, F.; Piscini, A.
2001-12-01
A new system for acquiring and processing digital signals has been developed over the last few years at the Istituto Nazionale di Geofisica e Vulcanologia (INGV). The system makes extensive use of internet communication protocol standards such as TCP and UDP, which are used as the transport highway inside the Italian network, and possibly in the near future outside it, to share or redirect data among processes. The Italian National Seismic Network has been operating for about 18 years, equipped with vertical short-period seismometers and transmitting through analog lines to the computer center in Rome. We are now concentrating our efforts on speeding the migration towards a fully digital network based on about 150 stations equipped with either broadband or 5-second sensors connected to the data center partly through wired digital communication and partly through satellite digital communication. The overall process is layered through the intranet and/or internet. Every layer gathers data in a simple format and provides data in a processed format, ready to be distributed towards the next layer. The lowest level acquires seismic data (raw waveforms) coming from the remote stations. It handshakes, checks and sends data over the LAN or WAN according to a distribution list naming the other machines whose programs are waiting for the data. At the next level are the picking procedures, or "pickers", on a per-instrument basis, looking for phases. A picker spreads phases, again through the LAN or WAN and according to a distribution list, to one or more waiting locating machines tuned to generate a seismic event. The event-locating procedure itself, the highest level in this stack, can exchange information with other similar procedures. Such a layered and distributed structure with nearby targets allows other seismic networks to join the processing and data collection for the same ongoing event, creating a virtual network larger than the original one. At present we plan to cooperate with other
Visualization of volumetric seismic data
NASA Astrophysics Data System (ADS)
Spickermann, Dela; Böttinger, Michael; Ashfaq Ahmed, Khawar; Gajewski, Dirk
2015-04-01
Mostly driven by demands of high quality subsurface imaging, highly specialized tools and methods have been developed to support the processing, visualization and interpretation of seismic data. 3D seismic data acquisition and 4D time-lapse seismic monitoring are well-established techniques in academia and industry, producing large amounts of data to be processed, visualized and interpreted. In this context, interactive 3D visualization methods proved to be valuable for the analysis of 3D seismic data cubes - especially for sedimentary environments with continuous horizons. In crystalline and hard rock environments, where hydraulic stimulation techniques may be applied to produce geothermal energy, interpretation of the seismic data is a more challenging problem. Instead of continuous reflection horizons, the imaging targets are often steep dipping faults, causing a lot of diffractions. Without further preprocessing these geological structures are often hidden behind the noise in the data. In this PICO presentation we will present a workflow consisting of data processing steps, which enhance the signal-to-noise ratio, followed by a visualization step based on the use the commercially available general purpose 3D visualization system Avizo. Specifically, we have used Avizo Earth, an extension to Avizo, which supports the import of seismic data in SEG-Y format and offers easy access to state-of-the-art 3D visualization methods at interactive frame rates, even for large seismic data cubes. In seismic interpretation using visualization, interactivity is a key requirement for understanding complex 3D structures. In order to enable an easy communication of the insights gained during the interactive visualization process, animations of the visualized data were created which support the spatial understanding of the data.
Determining the metallicity of the solar envelope using seismic inversion techniques
NASA Astrophysics Data System (ADS)
Buldgen, G.; Salmon, S. J. A. J.; Noels, A.; Scuflaire, R.; Dupret, M. A.; Reese, D. R.
2017-11-01
The solar metallicity issue is a long-standing problem of astrophysics, impacting multiple fields and still subject to debate and uncertainties. While spectroscopy has mostly been used to determine the solar heavy-element abundance, helioseismologists have attempted to provide a seismic determination of the metallicity in the solar convective envelope. However, the puzzle remains, since two independent groups provided two radically different values for this crucial astrophysical parameter. We aim at providing an independent seismic measurement of the solar metallicity in the convective envelope. Our main goal is to help provide new information to break the current stalemate amongst seismic determinations of the solar heavy-element abundance. We start by presenting the kernels, the inversion technique and the target function of the inversion we have developed. We then test our approach in multiple hare-and-hounds exercises to assess its reliability and accuracy. We then apply our technique to solar data using calibrated solar models and determine an interval of seismic measurements for the solar metallicity. We show that our inversion can indeed be used to estimate the solar metallicity thanks to our hare-and-hounds exercises. However, we also show that further dependencies on the physical ingredients of solar models lead to a low accuracy. Nevertheless, using various physical ingredients for our solar models, we determine metallicity values between 0.008 and 0.014.
A Technique to Determine the Self-Noise of Seismic Sensors for Performance Screening
NASA Astrophysics Data System (ADS)
Rademacher, H.; Hart, D.; Guralp, C.
2012-04-01
Seismic noise affects the performance of a seismic sensor and is thereby a limiting factor for the detection threshold of monitoring networks. Among the various sources of noise, the intrinsic self-noise of a seismic sensor is the most difficult to determine, because it is mostly masked by natural and anthropogenic ground noise and is also affected by the noise characteristics of the digitizer. Here we present a new technique to determine the self-noise of a seismic system (digitizer + sensors). It is based on a method introduced by Sleeman et al. (2005) to test the noise performance of digitizers. We infer the self-noise of a triplet of identical sensors by comparing coherent waveforms over a wide spectral band across the set-up. We will show first results from a proof-of-concept study done in a vault near Albuquerque, New Mexico, and show how various methods of shielding the sensors affect the results of this technique. This method can also be used as a means of quality control during sensor production, because poorly performing sensors can easily be identified.
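A minimal sketch of the three-channel idea underlying this kind of approach (following the Sleeman et al. formulation; the synthetic data and spectral parameters are our assumptions): with three co-located sensors recording the same ground signal, the noise power of sensor i can be estimated from auto- and cross-spectra as N_ii = P_ii − P_ji·P_ik/P_jk.

```python
import numpy as np
from scipy.signal import csd

def three_channel_self_noise(x, fs, nperseg=4096):
    """Estimate the self-noise PSD of each of three co-located sensors x[0..2]
    using the three-channel coherence method: N_ii = P_ii - P_ji * P_ik / P_jk."""
    P = {}
    for i in range(3):
        for j in range(3):
            f, P[i, j] = csd(x[i], x[j], fs=fs, nperseg=nperseg)
    noise = []
    for i in range(3):
        j, k = [m for m in range(3) if m != i]
        noise.append(np.real(P[i, i] - P[j, i] * P[i, k] / P[j, k]))
    return f, np.array(noise)

# Synthetic test: a common 'ground' signal plus independent instrument noise
rng = np.random.default_rng(6)
fs, n = 100.0, 2**18
ground = rng.normal(0, 5.0, n)
x = np.stack([ground + rng.normal(0, 0.5, n) for _ in range(3)])
f, noise = three_channel_self_noise(x, fs)
print(noise[:, 10:100].mean(axis=1))     # ~ 2 * 0.5**2 / fs = 0.005, the white self-noise PSD
```

With x_i = s + n_i, the cross-spectra between different channels contain only the common signal, so the correction term removes it and leaves the instrument noise of channel i.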
Wang, Lei; Tian, Wei; Shi, Yongmin
2017-08-07
The morphology and structure of plumbing systems can provide key information on the eruption rate and style of basalt lava fields. The most powerful way to study subsurface geo-bodies is to use industrial 3D reflection seismological imaging. However, strategies to image subsurface volcanoes are very different from those for oil and gas reservoirs. In this study, we process seismic data cubes from the Northern Tarim Basin, China, to illustrate how to visualize sills through opacity-rendering techniques and how to image the conduits by time-slicing. In the first case, we isolated probes by the seismic horizons marking the contacts between sills and encasing strata, applying opacity-rendering techniques to extract sills from the seismic cube. The resulting detailed sill morphology shows that the flow direction is from the dome center to the rim. In the second seismic cube, we use time-slices to image the conduits, which correspond to marked discontinuities within the encasing rocks. A set of time-slices obtained at different depths shows that the Tarim flood basalts erupted from central volcanoes, fed by separate pipe-like conduits.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Levander, Alan Richard; Zelt, Colin A.
2015-03-17
The work plan for this project was to develop and apply advanced seismic reflection and wide-angle processing and inversion techniques to high resolution seismic data for the shallow subsurface to seismically characterize the shallow subsurface at hazardous waste sites as an aid to containment and cleanup activities. We proposed to continue work on seismic data that we had already acquired under a previous DoE grant, as well as to acquire additional new datasets for analysis. The project successfully developed and/or implemented the use of 3D reflection seismology algorithms, waveform tomography and finite-frequency tomography using compressional and shear waves for high resolution characterization of the shallow subsurface at two waste sites. These two sites have markedly different near-surface structures, groundwater flow patterns, and hazardous waste problems. This is documented in the list of refereed documents, conference proceedings, and Rice graduate theses, listed below.
Using Seismic Signals to Forecast Volcanic Processes
NASA Astrophysics Data System (ADS)
Salvage, R.; Neuberg, J. W.
2012-04-01
Understanding the seismic signals generated during volcanic unrest allows scientists to more accurately predict and understand active volcanoes, since these signals are intrinsically linked to rock failure at depth (Voight, 1988). In particular, low-frequency, long-period signals (LP events) have been related to the movement of fluid and the brittle failure of magma at depth due to high strain rates (Hammer and Neuberg, 2009). This fundamentally relates to surface processes. However, there is currently no physical quantitative model for determining the likelihood of an eruption following precursory seismic signals, or the timing or type of eruption that will ensue (Benson et al., 2010). Since the beginning of its current eruptive phase, accelerating LP swarms (< 10 events per hour) have been a common feature at Soufriere Hills volcano, Montserrat, prior to surface expressions such as dome collapse or eruptions (Miller et al., 1998). The dynamical behaviour of such swarms can be related to accelerated magma ascent rates, since the seismicity is thought to be a consequence of magma deformation as it rises to the surface. In particular, acceleration rates can be used successfully, in conjunction with the inverse material failure law, a linear relationship against time (Voight, 1988), in the accurate prediction of volcanic eruption timings. Currently, this has only been investigated for retrospective events (Hammer and Neuberg, 2009). The identification of LP swarms on Montserrat and the analysis of their dynamical characteristics allow a better understanding of the nature of the seismic signals themselves, as well as their relationship to surface processes such as magma extrusion rates. Acceleration and deceleration rates of seismic swarms provide insights into the plumbing system of the volcano at depth. The application of the material failure law to multiple LP swarms of data allows a critical evaluation of the accuracy of the method, which further refines current
Modernization of the USGS Hawaiian Volcano Observatory Seismic Processing Infrastructure
NASA Astrophysics Data System (ADS)
Antolik, L.; Shiro, B.; Friberg, P. A.
2016-12-01
The USGS Hawaiian Volcano Observatory (HVO) operates a Tier 1 Advanced National Seismic System (ANSS) seismic network to monitor, characterize, and report on volcanic and earthquake activity in the State of Hawaii. Upgrades at the observatory since 2009 have improved the digital telemetry network, computing resources, and seismic data processing with the adoption of the ANSS Quake Management System (AQMS). HVO aims to build on these efforts by further modernizing its seismic processing infrastructure and strengthening its ability to meet ANSS performance standards. Most notably, this will also allow HVO to support redundant systems, both onsite and offsite, in order to provide better continuity of operation during intermittent power and network outages. We are in the process of implementing a number of upgrades and improvements to HVO's seismic processing infrastructure, including: 1) virtualization of the AQMS physical servers; 2) migration of server operating systems from Solaris to Linux; 3) consolidation of AQMS real-time and post-processing services onto a single server; 4) upgrading the database from Oracle 10 to Oracle 12; and 5) upgrading to the latest Earthworm and AQMS software. These improvements will make server administration more efficient, minimize the hardware resources required by AQMS, simplify the Oracle replication setup, and provide better integration with HVO's existing state-of-health monitoring tools and backup system. Ultimately, it will provide HVO with the latest and most secure software available while making the software easier to deploy and support.
Post-processing of seismic parameter data based on valid seismic event determination
McEvilly, Thomas V.
1985-01-01
An automated seismic processing system and method are disclosed, including an array of CMOS microprocessors for unattended battery-powered processing of a multi-station network. According to a characterizing feature of the invention, each channel of the network is independently operable to automatically detect, measure times and amplitudes, and compute and fit Fast Fourier transforms (FFT's) for both P- and S- waves on analog seismic data after it has been sampled at a given rate. The measured parameter data from each channel are then reviewed for event validity by a central controlling microprocessor and if determined by preset criteria to constitute a valid event, the parameter data are passed to an analysis computer for calculation of hypocenter location, running b-values, source parameters, event count, P- wave polarities, moment-tensor inversion, and Vp/Vs ratios. The in-field real-time analysis of data maximizes the efficiency of microearthquake surveys allowing flexibility in experimental procedures, with a minimum of traditional labor-intensive postprocessing. A unique consequence of the system is that none of the original data (i.e., the sensor analog output signals) are necessarily saved after computation, but rather, the numerical parameters generated by the automatic analysis are the sole output of the automated seismic processor.
NASA Astrophysics Data System (ADS)
Schenini, L.; Beslier, M. O.; Sage, F.; Badji, R.; Galibert, P. Y.; Lepretre, A.; Dessa, J. X.; Aidi, C.; Watremez, L.
2014-12-01
Recent studies on the Algerian and the North-Ligurian margins in the Western Mediterranean have evidenced inversion-related superficial structures, such as folds and asymmetric sedimentary perched basins whose geometry hints at deep compressive structures dipping towards the continent. Deep seismic imaging of these margins is difficult due to steep slope and superficial multiples, and, in the Mediterranean context, to the highly diffractive Messinian evaporitic series in the basin. During the Algerian-French SPIRAL survey (2009, R/V Atalante), 2D marine multi-channel seismic (MCS) reflection data were collected along the Algerian Margin using a 4.5 km, 360 channel digital streamer and a 3040 cu. in. air-gun array. An advanced processing workflow has been laid out using Geocluster CGG software, which includes noise attenuation, 2D SRME multiple attenuation, surface consistent deconvolution, Kirchhoff pre-stack time migration. This processing produces satisfactory seismic images of the whole sedimentary cover, and of southward dipping reflectors in the acoustic basement along the central part of the margin offshore Great Kabylia, that are interpreted as inversion-related blind thrusts as part of flat-ramp systems. We applied this successful processing workflow to old 2D marine MCS data acquired on the North-Ligurian Margin (Malis survey, 1995, R/V Le Nadir), using a 2.5 km, 96 channel streamer and a 1140 cu. in. air-gun array. Particular attention was paid to multiple attenuation in adapting our workflow. The resulting reprocessed seismic images, interpreted with a coincident velocity model obtained by wide-angle data tomography, provide (1) enhanced imaging of the sedimentary cover down to the top of the acoustic basement, including the base of the Messinian evaporites and the sub-salt Miocene series, which appear to be tectonized as far as in the mid-basin, and (2) new evidence of deep crustal structures in the margin which the initial processing had failed to
Processing grounded-wire TEM signal in time-frequency-pseudo-seismic domain: A new paradigm
NASA Astrophysics Data System (ADS)
Khan, M. Y.; Xue, G. Q.; Chen, W.; Huasen, Z.
2017-12-01
Grounded-wire TEM has received great attention in mineral, hydrocarbon and hydrogeological investigations over the last several years. Conventionally, TEM soundings have been presented as apparent-resistivity curves as a function of time. With the development of sophisticated computational algorithms, it became possible to extract more realistic geoelectric information by applying inversion programs to 1-D and 3-D problems. Here, we analyze grounded-wire TEM data by carrying out analysis in the time, frequency and pseudo-seismic domains, supported by borehole information. First, H, K, A and Q type geoelectric models are processed using a proven inversion program (1-D Occam inversion). Second, a time-to-frequency transformation is conducted from the TEM ρa(t) curves to magnetotelluric (MT) ρa(f) curves for the same models, based on all-time apparent-resistivity curves. Third, the 1-D Bostick algorithm is applied to the transformed resistivity. Finally, the EM diffusion field is transformed into a propagating wave field obeying the standard wave equation using a wavelet transformation technique, and a pseudo-seismic section is constructed. The transformed seismic-like wave field indicates that reflection and refraction phenomena appear when the EM wave field interacts with geoelectric interfaces at different depth intervals due to contrasts in resistivity. The resolution of the transformed TEM data is significantly improved in comparison to apparent-resistivity plots. A case study illustrates the successful hydrogeophysical application of the proposed approach in recovering a water-filled mined-out area in a coal field located in Ye county, Henan province, China. The results support the introduction of pseudo-seismic imaging technology into the short-offset version of TEM, which can also be a useful aid if integrated with the seismic reflection technique to explore possibilities for high-resolution EM imaging in the future.
Paleobathymetric Reconstruction of Ross Sea: seismic data processing and regional reflectors mapping
NASA Astrophysics Data System (ADS)
Olivo, Elisabetta; De Santis, Laura; Wardell, Nigel; Geletti, Riccardo; Busetti, Martina; Sauli, Chiara; Bergamasco, Andrea; Colleoni, Florence; Vanzella, Walter; Sorlien, Christopher; Wilson, Doug; De Conto, Robert; Powell, Ross; Bart, Phil; Luyendyk, Bruce
2017-04-01
PURPOSE: New maps of some major unconformities of the Ross Sea have been reconstructed by using seismic data grids from new and reprocessed seismic profiles, combined with acoustic velocities from previous works. This work is carried out with the support of PNRA and in the frame of the bilateral Italy-USA project GLAISS (Global Sea Level Rise & Antarctic Ice Sheet Stability predictions), funded by the Ministry of Foreign Affairs. Paleobathymetric maps are produced for 30, 14 and 4 million years ago, three 'key moments' for the glacial history of the Antarctic Ice Sheet, coinciding with global climatic changes. The paleobathymetric maps will then be used for numeric simulations focused on the width and thickness of the Ross Sea Ice Sheet. PRELIMINARY RESULTS: The first step was to create TWT maps of three main unconformities (RSU6, RSU4, and RSU2) of the Ross Sea, revisiting and updating the ANTOSTRAT maps; through the interpretation of sedimentary bodies and erosional features, used to infer active or old processes along the slope, we identified the main seismic unconformities. We used the IHS Kingdom academic license. The different groups contributed to the analysis of the Eastern Ross Sea continental slope and rise (OGS), of the Central Basin (KOPRI), and of the western and central Ross Sea (Univ. of Santa Barbara and OGS), where new drill sites and seismic profiles were collected after the publication of the ANTOSTRAT maps. Then we joined our interpretation with previous interpretations. We examined previous processing of several seismic lines and all the old acoustic velocity analyses. In addition, we reprocessed some lines in order to obtain a higher data coverage. Then, combining the TWT maps of the unconformities with the old and new velocity data, we created new depth maps of the study area. The new depth maps will then be used for reconstructing the paleobathymetry of the Ross Sea by applying the backstripping technique.
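The final depth-conversion step described above (combining TWT grids with velocity information) reduces to a simple operation per grid node. The sketch below is a minimal illustration with made-up values, not the project's actual gridding workflow.

```python
import numpy as np

def twt_to_depth(twt_s, v_avg_ms):
    """Convert a two-way-time grid (s) to depth (m) using an average
    velocity grid (m/s) defined on the same nodes."""
    return 0.5 * np.asarray(v_avg_ms) * np.asarray(twt_s)

# toy 2x2 grid: 1.2-1.5 s TWT with 2000-2200 m/s average velocities
twt = np.array([[1.2, 1.5], [1.3, 1.4]])
v = np.array([[2000.0, 2100.0], [2050.0, 2200.0]])
depth = twt_to_depth(twt, v)   # metres below the datum
```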
Studies of short and long memory in mining-induced seismic processes
NASA Astrophysics Data System (ADS)
Węglarczyk, Stanisław; Lasocki, Stanisław
2009-09-01
Memory of a stochastic process implies its predictability, understood as a possibility to gain information on the future above the random-guess level. Here we search for memory in the mining-induced seismic process (MIS), that is, a process induced or triggered by mining operations. Long memory is investigated by means of the Hurst rescaled range analysis, and the autocorrelation function estimate is used to test for short memory. Both methods are complemented with result uncertainty analyses based on different resampling techniques. The analyzed data comprise event series from the Rudna copper mine in Poland. The studies show that the interevent time and interevent distance processes have both long and short memory. MIS occurrences and locations are internally interrelated. Internal relations among the sizes of MIS events are apparently weaker than those of the other two studied parameterizations and are limited to long-term interactions.
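For readers unfamiliar with the rescaled range method mentioned above, the sketch below estimates a Hurst exponent from an interevent-time series; H > 0.5 suggests long memory. It is a minimal illustration on synthetic data, not the authors' implementation, and the window choices are assumptions.

```python
import numpy as np

def hurst_rs(x, min_window=8):
    """Estimate the Hurst exponent of series x with rescaled-range (R/S) analysis."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    windows = np.unique(np.logspace(np.log10(min_window), np.log10(n // 2), 12).astype(int))
    log_w, log_rs = [], []
    for w in windows:
        rs_vals = []
        for start in range(0, n - w + 1, w):
            seg = x[start:start + w]
            dev = np.cumsum(seg - seg.mean())
            r = dev.max() - dev.min()          # range of the cumulative deviations
            s = seg.std(ddof=1)                # standard deviation of the segment
            if s > 0:
                rs_vals.append(r / s)
        if rs_vals:
            log_w.append(np.log(w))
            log_rs.append(np.log(np.mean(rs_vals)))
    # Hurst exponent = slope of log(R/S) vs log(window)
    return np.polyfit(log_w, log_rs, 1)[0]

# example: interevent times of a synthetic catalogue
rng = np.random.default_rng(0)
dt = rng.exponential(scale=600.0, size=2000)   # seconds between events
print("H =", round(hurst_rs(dt), 2))
```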
NASA Astrophysics Data System (ADS)
Dietrich, Carola; Wölbern, Ingo; Faria, Bruno; Rümpker, Georg
2017-04-01
Fogo is the only island of the Cape Verde archipelago with regularly occurring volcanic eruptions since its discovery in the 15th century. The volcanism of the archipelago originates from a mantle plume beneath an almost stationary tectonic plate. With an eruption interval of approximately 20 years, Fogo belongs to the most active oceanic volcanoes. The latest eruption started in November 2014 and ceased in February 2015. This study aims to characterize and investigate the seismic activity and the magmatic plumbing system of Fogo, which is believed to be related to a magmatic source close to the neighboring island of Brava. According to previous studies, using conventional seismic network configurations, most of the seismic activity occurs offshore. Therefore, seismological array techniques represent powerful tools for investigating earthquakes and other volcano-related events located outside of the networks. Another advantage of seismic arrays is their ability to detect events of relatively small magnitude and to locate seismic signals without a clear onset of phases, such as volcanic tremors. Since October 2015 we have been operating a test array on Fogo as part of a pilot study. This array consists of 10 seismic stations, distributed in a circular shape with an aperture of 700 m. The stations are equipped with Omnirecs CUBE dataloggers, and either 4.5 Hz geophones (7 stations) or Trillium-Compact broad-band seismometers (3 stations). In January 2016 we installed three additional broad-band stations distributed across the island of Fogo to improve the capabilities for event localization. The data of the pilot study are dominated by seismic activity around Brava, but also exhibit tremors and hybrid events of unknown origin within the caldera of Fogo volcano. The preliminary analysis of these events includes the characterization and localization of the different event types using seismic array processing in combination with conventional localization methods.
New Crustal Thickness for Djibouti, Afar, Using Seismic Techniques
NASA Astrophysics Data System (ADS)
Dugda, Mulugeta; Bililign, Solomon
2008-10-01
Crustal thickness and Poisson's ratio for the seismic station ATD in Djibouti, Afar, have been investigated using two seismic techniques (H-κ stacking of receiver functions and a joint inversion of receiver functions and surface wave group velocities). Both techniques give consistent results: a crustal thickness of 23±1.5 km and a Poisson's ratio of 0.31±0.02. We also determined a mean P-wave velocity (Vp) of ˜6.2 km/s, but ˜6.9-7.0 km/s below a 2-5 km thick low-velocity layer at the surface. Previous studies of crustal structure for Djibouti reported that the crust is 6 to 11 km thick, while our study shows that the crust beneath Djibouti is between 20 and 25 km. This study argues that the crustal thickness values reported for Djibouti over the last 3 decades were not consistent with the reports for other neighboring regions in central and eastern Afar. Our results for ATD in Djibouti, however, are consistent with the reports of crustal thickness in many other parts of central and eastern Afar. We attribute this difference to how the Moho (the crust-mantle discontinuity) is defined (an increase of Vp to 7.4 km/s in this study vs. 6.9 km/s in previous studies).
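The H-κ stacking technique referred to above sums receiver-function amplitudes at the predicted arrival times of the Moho Ps conversion and its crustal multiples over a grid of thickness H and Vp/Vs (κ) values. The sketch below is a minimal grid-search version following the standard Zhu & Kanamori (2000) travel-time relations; the default grids, weights and mean Vp are illustrative assumptions.

```python
import numpy as np

def hk_stack(rf_traces, slowness, dt, vp=6.2,
             h_grid=np.arange(15.0, 35.0, 0.25),
             k_grid=np.arange(1.6, 2.1, 0.01),
             weights=(0.6, 0.3, 0.1)):
    """Grid search over crustal thickness H (km) and Vp/Vs ratio kappa.
    rf_traces: list of radial receiver functions starting at the P arrival,
    slowness: ray parameter (s/km) for each trace, dt: sample interval (s)."""
    w1, w2, w3 = weights

    def amp(rf, t):
        idx = int(round(t / dt))
        return rf[idx] if 0 <= idx < len(rf) else 0.0

    stack = np.zeros((len(h_grid), len(k_grid)))
    for rf, p in zip(rf_traces, slowness):
        for i, h in enumerate(h_grid):
            for j, k in enumerate(k_grid):
                vs = vp / k
                qs = np.sqrt(1.0 / vs**2 - p**2)
                qp = np.sqrt(1.0 / vp**2 - p**2)
                stack[i, j] += (w1 * amp(rf, h * (qs - qp))      # Ps conversion
                                + w2 * amp(rf, h * (qs + qp))    # PpPs multiple
                                - w3 * amp(rf, 2.0 * h * qs))    # PpSs + PsPs multiple
    i_best, j_best = np.unravel_index(np.argmax(stack), stack.shape)
    return h_grid[i_best], k_grid[j_best], stack
```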
pySeismicDQA: open source post experiment data quality assessment and processing
NASA Astrophysics Data System (ADS)
Polkowski, Marcin
2017-04-01
pySeismicDQA (Seismic Data Quality Assessment) is a Python-based, open-source set of tools dedicated to data processing after passive seismic experiments. The primary goal of this toolset is the unification of data types and formats from different dataloggers necessary for further processing. This process requires additional checks for data errors, equipment malfunction, data format errors, abnormal noise levels, etc. In all such cases the user needs to decide (manually or by automatic threshold) whether the data are removed from the output dataset. Additionally, the output dataset can be visualized in the form of a website with data availability charts and waveform visualization with an (external) earthquake catalog. Data processing can be extended with simple STA/LTA event detection. pySeismicDQA is designed and tested for two passive seismic experiments in central Europe: PASSEQ 2006-2008 and "13 BB Star" (2013-2016). The National Science Centre Poland provided financial support for this work via NCN grant DEC-2011/02/A/ST10/00284.
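The STA/LTA detection step mentioned above can be reproduced with standard ObsPy triggers. The sketch below is a minimal example run on ObsPy's built-in demo waveform; the filter band and trigger thresholds are illustrative assumptions, not settings taken from pySeismicDQA.

```python
from obspy import read
from obspy.signal.trigger import classic_sta_lta, trigger_onset

st = read()                       # ObsPy's built-in example waveform (BW.RJOB)
tr = st[0].copy()
tr.detrend("demean")
tr.filter("bandpass", freqmin=1.0, freqmax=10.0)

fs = tr.stats.sampling_rate
cft = classic_sta_lta(tr.data, int(0.5 * fs), int(10.0 * fs))   # 0.5 s STA, 10 s LTA
triggers = trigger_onset(cft, 3.5, 1.5)                          # on/off thresholds

for on, off in triggers:
    print("candidate event at", tr.stats.starttime + on / fs)
```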
Using Seismic Interferometry to Investigate Seismic Swarms
NASA Astrophysics Data System (ADS)
Matzel, E.; Morency, C.; Templeton, D. C.
2017-12-01
Seismicity provides a direct means of measuring the physical characteristics of active tectonic features such as fault zones. Hundreds of small earthquakes often occur along a fault during a seismic swarm. This seismicity helps define the tectonically active region. When processed using novel geophysical techniques, we can isolate the energy sensitive to the fault itself. Here we focus on two methods of seismic interferometry, ambient noise correlation (ANC) and the virtual seismometer method (VSM). ANC is based on the observation that the Earth's background noise includes coherent energy, which can be recovered by observing over long time periods and allowing the incoherent energy to cancel out. The cross correlation of ambient noise between a pair of stations results in a waveform that is identical to the seismogram that would result if an impulsive source located at one of the stations was recorded at the other: the Green function (GF). The calculation of the GF is often stable after a few weeks of continuous data correlation; any perturbations to the GF after that point are directly related to changes in the subsurface and can be used for 4D monitoring. VSM is a style of seismic interferometry that provides fast, precise, high-frequency estimates of the GF between earthquakes. VSM illuminates the subsurface precisely where the pressures are changing and has the potential to image the evolution of seismicity over time, including changes in the style of faulting. With hundreds of earthquakes, we can calculate thousands of waveforms. At the same time, VSM collapses the computational domain, often by 2-3 orders of magnitude. This allows us to do high-frequency 3D modeling in the fault region. Using data from a swarm of earthquakes near the Salton Sea, we demonstrate the power of these techniques, illustrating our ability to scale from the far field, where sources are well separated, to the near field where their locations fall within each other
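The core of the ANC workflow described above is the cross-correlation and stacking of long noise records between station pairs. The sketch below is a minimal, self-contained illustration on synthetic noise (with crude one-bit amplitude normalization and no spectral whitening); it is not the authors' processing chain.

```python
import numpy as np
from scipy.signal import fftconvolve

def one_bit(x):
    """Crude amplitude normalization often applied before noise correlation."""
    return np.sign(x - np.mean(x))

def daily_ncf(a, b, max_lag):
    """Cross-correlation of two equal-length records, trimmed to +/- max_lag samples."""
    cc = fftconvolve(a, b[::-1], mode="full")
    mid = len(b) - 1                      # zero-lag index
    return cc[mid - max_lag: mid + max_lag + 1]

fs = 20.0                                 # Hz
max_lag = int(120 * fs)                   # keep +/- 120 s of lag time
rng = np.random.default_rng(1)
n = int(fs * 3600)                        # one "day" shortened to 1 h for the example
day_pairs = [(rng.standard_normal(n), rng.standard_normal(n)) for _ in range(5)]

# stack many correlation windows so incoherent energy cancels out
stack = np.zeros(2 * max_lag + 1)
for rec_a, rec_b in day_pairs:
    stack += daily_ncf(one_bit(rec_a), one_bit(rec_b), max_lag)
stack /= len(day_pairs)                   # estimate of the empirical Green function
```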
Seismic rupture process of the 2010 Haiti Earthquake (Mw7.0) inferred from seismic and SAR data
NASA Astrophysics Data System (ADS)
Santos, Rúben; Caldeira, Bento; Borges, José; Bezzeghoud, Mourad
2013-04-01
On January 12th 2010 at 21:53, the Port-au-Prince (Haiti) region was struck by an Mw7 earthquake, the second deadliest in history. The last significant seismic events in the region occurred in November 1751 and June 1770 [1]. Geodetic and geological studies previous to the 2010 earthquake [2] had warned of the potential for destructive seismic events in that region, and this event confirmed those warnings. Some aspects of the source of this earthquake remain nonconsensual: there is no agreement on the mechanism of rupture or on the fault that generated it [3]. In order to better understand the complexity of this rupture, we combined several techniques and data of different natures. We used teleseismic body-wave and Synthetic Aperture Radar (SAR) data based on the following methodology: 1) analysis of the rupture process directivity [4] to determine the velocity and direction of rupture; 2) teleseismic body-wave inversion to obtain the spatiotemporal fault slip distribution and a detailed rupture model; 3) near-field surface deformation modeling using the calculated seismic rupture model, compared with the deformation field measured using SAR data from the Advanced Land Observing Satellite - Phased Array L-band SAR (ALOS-PALSAR) sensor. The combined application of seismic and geodetic data reveals a complex rupture that spread over approximately 12 s, mainly from WNW to ESE, with an average velocity of 2.5 km/s on a north-dipping fault plane. Two main asperities are obtained: the first (and largest) occurs within the first ~5 s and extends for approximately 6 km around the hypocenter; the second one, which happens in the remaining 6 s, covers a near-surface rectangular strip about 12 km long by 3 km wide. The first asperity is compatible with left-lateral strike-slip motion with a small reverse component; the mechanism of the second asperity is predominantly reverse. The obtained rupture process allows modeling of the coseismic deformation
NASA Astrophysics Data System (ADS)
Poggi, V.; Burjanek, J.; Michel, C.; Fäh, D.
2017-08-01
The Swiss Seismological Service (SED) has recently finalised the installation of ten new seismological broadband stations in northern Switzerland. The project was led in cooperation with the National Cooperative for the Disposal of Radioactive Waste (Nagra) and Swissnuclear to monitor microseismicity at potential locations of nuclear-waste repositories. To further improve the quality and usability of the seismic recordings, an extensive characterization of the sites surrounding the installation area was performed following a standardised investigation protocol. State-of-the-art geophysical techniques have been used, including advanced active and passive seismic methods. The results of all analyses converged to the definition of a set of best-representative 1-D velocity profiles for each site, which are the input for the computation of engineering soil proxies (travel-time-averaged velocity and quarter-wavelength parameters) and numerical amplification models. Computed site response is then validated through comparison with empirical site amplification, which is currently available for any station connected to the Swiss seismic networks. With the goal of a high-sensitivity network, most of the Nagra stations have been installed on stiff-soil sites of rather high seismic velocity. Seismic characterization of such sites has always been considered challenging, due to the lack of a relevant velocity contrast and the large wavelengths required to investigate the frequency range of engineering interest. We describe how ambient vibration techniques can successfully be applied in these particular conditions, providing practical recommendations for best practice in seismic site characterization of high-velocity sites.
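One of the engineering proxies mentioned above, the travel-time-averaged shear velocity over the top 30 m (Vs30), follows directly from a 1-D profile. The sketch below is a minimal illustration with an invented stiff-soil profile; it is not part of the SED processing.

```python
import numpy as np

def travel_time_avg_vs(thickness_m, vs_ms, z_max=30.0):
    """Travel-time-averaged shear velocity over the top z_max metres of a
    layered profile (thicknesses in m, velocities in m/s)."""
    tt, z = 0.0, 0.0
    for h, v in zip(thickness_m, vs_ms):
        use = min(h, z_max - z)
        if use <= 0:
            break
        tt += use / v                       # travel time through the used part of the layer
        z += use
    if z < z_max:                           # extend with the last velocity if profile is shallow
        tt += (z_max - z) / vs_ms[-1]
    return z_max / tt

# example stiff-soil profile: 4 m at 400 m/s, 10 m at 900 m/s over 1800 m/s rock
vs30 = travel_time_avg_vs([4.0, 10.0, 50.0], [400.0, 900.0, 1800.0])
print(round(vs30))   # ~1000 m/s, i.e. a high-velocity (rock-like) site
```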
Fallon, Nevada FORGE Seismic Reflection Profiles
Blankenship, Doug; Faulds, James; Queen, John; Fortuna, Mark
2018-02-01
Newly reprocessed Naval Air Station Fallon (1994) seismic lines: pre-stack depth migrations, with interpretations to support the Fallon FORGE (Phase 2B) 3D Geologic model. Data along seven profiles (>100 km of total profile length) through and adjacent to the Fallon site were re-processed. The most up-to-date, industry-tested seismic processing techniques were utilized to improve the signal strength and coherency in the sedimentary, volcanic, and Mesozoic crystalline basement sections, in conjunction with fault diffractions in order to improve the identification and definition of faults within the study area.
NASA Astrophysics Data System (ADS)
Waldhauser, F.; Schaff, D. P.
2012-12-01
Archives of digital seismic data recorded by seismometer networks around the world have grown tremendously over the last several decades helped by the deployment of seismic stations and their continued operation within the framework of monitoring earthquake activity and verification of the Nuclear Test-Ban Treaty. We show results from our continuing effort in developing efficient waveform cross-correlation and double-difference analysis methods for the large-scale processing of regional and global seismic archives to improve existing earthquake parameter estimates, detect seismic events with magnitudes below current detection thresholds, and improve real-time monitoring procedures. We demonstrate the performance of these algorithms as applied to the 28-year long seismic archive of the Northern California Seismic Network. The tools enable the computation of periodic updates of a high-resolution earthquake catalog of currently over 500,000 earthquakes using simultaneous double-difference inversions, achieving up to three orders of magnitude resolution improvement over existing hypocenter locations. This catalog, together with associated metadata, form the underlying relational database for a real-time double-difference scheme, DDRT, which rapidly computes high-precision correlation times and hypocenter locations of new events with respect to the background archive (http://ddrt.ldeo.columbia.edu). The DDRT system facilitates near-real-time seismicity analysis, including the ability to search at an unprecedented resolution for spatio-temporal changes in seismogenic properties. In areas with continuously recording stations, we show that a detector built around a scaled cross-correlation function can lower the detection threshold by one magnitude unit compared to the STA/LTA based detector employed at the network. This leads to increased event density, which in turn pushes the resolution capability of our location algorithms. On a global scale, we are currently building
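A building block of the correlation-based double-difference workflow described above is the measurement of a differential arrival time between two events recorded at a common station. The sketch below measures the lag of the normalized cross-correlation maximum with parabolic subsample refinement; it is a minimal stand-in, not the authors' production code, and the waveform windows are assumed to be pre-cut around catalogue picks.

```python
import numpy as np

def cc_differential_time(w1, w2, dt):
    """Differential time (s) between two equal-length waveform windows from
    the peak of their normalized cross-correlation, with parabolic
    subsample refinement. Positive lag: w1 is delayed relative to w2."""
    a = (w1 - w1.mean()) / (np.std(w1) * len(w1))
    b = (w2 - w2.mean()) / np.std(w2)
    cc = np.correlate(a, b, mode="full")
    k = np.argmax(cc)
    if 0 < k < len(cc) - 1:               # parabolic refinement around the integer-lag maximum
        y0, y1, y2 = cc[k - 1], cc[k], cc[k + 1]
        k = k + 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)
    lag = k - (len(w2) - 1)               # shift relative to zero lag
    return lag * dt, cc.max()             # (delay in seconds, peak correlation coefficient)

# toy check: identical wavelets offset by 3 samples at 100 Hz
t = np.arange(200)
w2 = np.exp(-((t - 100) / 10.0) ** 2)
w1 = np.roll(w2, 3)
print(cc_differential_time(w1, w2, dt=0.01))   # ~ (0.03 s, ~1.0)
```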
NASA Astrophysics Data System (ADS)
Dimech, J. L.; Weber, R. C.; Knapmeyer-Endrun, B.; Arnold, R.; Savage, M. K.
2016-12-01
The field of planetary science is poised for a major advance with the upcoming InSight mission to Mars, due to launch in May 2018. Seismic analysis techniques adapted for use on planetary data are therefore highly relevant to the field. The heart of this project is the application of new seismic analysis techniques to the lunar seismic dataset to learn more about the Moon's crust and mantle structure, with particular emphasis on 'deep' moonquakes, which are situated half-way between the lunar surface and its core with no surface expression. Techniques proven to work on the Moon might also be beneficial for InSight and future planetary seismology missions, which face similar technical challenges. The techniques include: (1) an event-detection and classification algorithm based on 'Hidden Markov Models' to reclassify known moonquakes and look for new ones. Apollo 17 gravimeter and geophone data will also be included in this effort. (2) Measurements of anisotropy in the lunar mantle and crust using 'shear-wave splitting'. Preliminary measurements on deep moonquakes using the MFAST program are encouraging, and continued evaluation may reveal new structural information on the Moon's mantle. (3) Probabilistic moonquake locations using NonLinLoc, a non-linear hypocenter location technique, using a modified version of the codes designed to work with the Moon's radius. Successful application may provide a new catalog of moonquake locations with rigorous uncertainty information, which would be a valuable input into: (4) new fault plane constraints from focal mechanisms using a novel approach to Bayes' theorem which factors in uncertainties in hypocenter coordinates and S-P amplitude ratios. Preliminary results, such as shear-wave splitting measurements, will be presented and discussed.
Seismic Characterization of EGS Reservoirs
NASA Astrophysics Data System (ADS)
Templeton, D. C.; Pyle, M. L.; Matzel, E.; Myers, S.; Johannesson, G.
2014-12-01
To aid in the seismic characterization of Engineered Geothermal Systems (EGS), we enhance the traditional microearthquake detection and location methodologies at two EGS systems. We apply the Matched Field Processing (MFP) seismic imaging technique to detect new seismic events using known discrete microearthquake sources. Events identified using MFP are typically smaller magnitude events or events that occur within the coda of a larger event. Additionally, we apply a Bayesian multiple-event seismic location algorithm, called MicroBayesLoc, to estimate the 95% probability ellipsoids for events with high signal-to-noise ratios (SNR). Such probability ellipsoid information can provide evidence for determining if a seismic lineation could be real or simply within the anticipated error range. We apply this methodology to the Basel EGS data set and compare it to another EGS dataset. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
NASA Astrophysics Data System (ADS)
Ahmed, Ali; Hasan, Rafiq; Pekau, Oscar A.
2016-12-01
Two recent developments have come to the forefront with reference to updating the seismic design provisions of codes: (1) the publication of new seismic hazard maps for Canada by the Geological Survey of Canada, and (2) the emergence of the new spectral format, outdating the conventional standardized spectral format. The fourth-generation seismic hazard maps are based on enriched seismic data, enhanced knowledge of regional seismicity and improved seismic hazard modeling techniques. Therefore, the new maps are more accurate and need to be incorporated into the Canadian Highway Bridge Design Code (CHBDC) for its next edition, similar to its building counterpart, the National Building Code of Canada (NBCC). In fact, the code writers expressed similar intentions in the commentary of CHBDC 2006. During the process of updating the codes, the NBCC and the AASHTO Guide Specifications for LRFD Seismic Bridge Design (American Association of State Highway and Transportation Officials, Washington, 2009) lowered the probability level from 10 to 2% and from 10 to 5%, respectively. This study has brought five sets of hazard maps, corresponding to 2%, 5% and 10% probability of exceedance in 50 years and developed by the GSC, under investigation. To obtain a sound statistical inference, 389 Canadian cities are selected. This study shows the implications of the changes in the new hazard maps on the design process (i.e., the extent of magnification or reduction of the design forces).
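For reference, the probability-of-exceedance levels quoted above map to Poisson return periods: 10%, 5% and 2% in 50 years correspond to roughly 475-, 975- and 2475-year return periods. The short sketch below works out the conversion.

```python
import math

def return_period(p_exceed, t_years):
    """Poisson-model return period for a probability of exceedance p in t years."""
    return -t_years / math.log(1.0 - p_exceed)

for p in (0.10, 0.05, 0.02):
    print(f"{p:.0%} in 50 yr -> return period ~ {return_period(p, 50.0):.0f} yr")
# prints ~475, ~975 and ~2475 yr
```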
MSNoise: a Python Package for Monitoring Seismic Velocity Changes using Ambient Seismic Noise
NASA Astrophysics Data System (ADS)
Lecocq, T.; Caudron, C.; Brenguier, F.
2013-12-01
Earthquakes occur every day all around the world and are recorded by thousands of seismic stations. In between earthquakes, the stations record "noise". In the last 10 years, the understanding of this noise and its potential usage have increased rapidly. The method, called "seismic interferometry", uses the principle that seismic waves travel between two recorders and are multiply scattered in the medium. By cross-correlating the two records, one gets information on the medium below/between the stations. The cross-correlation function (CCF) is a proxy for the Green function of the medium. Recent developments of the technique have shown that these CCFs can be used to image the earth at depth (3D seismic tomography) or to study changes of the medium with time. We present MSNoise, a complete software suite to compute relative seismic velocity changes under a seismic network using ambient seismic noise. The whole suite is written in Python, from the monitoring of data archives to the production of high-quality figures. All steps have been optimized to only compute the necessary steps and to use 'job'-based processing. We present a validation of the software on a dataset acquired during the UnderVolc [1] project on the Piton de la Fournaise volcano, La Réunion Island, France, for which precursory relative changes of seismic velocity are visible for three eruptions between 2009 and 2011.
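To illustrate how a relative velocity change is measured from CCFs, the sketch below implements the simple 'stretching' method: the current correlation function is stretched in time until it best matches a reference, and the optimal stretch maps to dv/v. MSNoise itself uses a moving-window cross-spectral approach by default, so this is only an illustrative stand-in; the grid of trial stretches and the sign convention in the comments are assumptions best verified against a synthetic test, as done here.

```python
import numpy as np

def stretching_dvv(ref, cur, t, eps_max=0.01, n_eps=201):
    """Estimate dv/v by stretching the current CCF 'cur' until it best
    matches the reference CCF 'ref' (both sampled at lapse times t, in s).
    Assumes a homogeneous velocity change, so dt/t = -dv/v."""
    best_cc, best_eps = -1.0, 0.0
    for eps in np.linspace(-eps_max, eps_max, n_eps):
        stretched = np.interp(t * (1.0 + eps), t, cur)   # cur evaluated at stretched times
        cc = np.corrcoef(ref, stretched)[0, 1]
        if cc > best_cc:
            best_cc, best_eps = cc, eps
    return -best_eps, best_cc       # dv/v and the correlation at the optimum

# synthetic check: impose a 0.5% velocity increase (coda arrivals 0.5% earlier)
t = np.linspace(5.0, 60.0, 2000)                    # coda lapse times (s)
ref = np.cos(2 * np.pi * 0.8 * t) * np.exp(-t / 40.0)
cur = np.interp(t / (1.0 - 0.005), t, ref)          # compressed coda
dvv, cc = stretching_dvv(ref, cur, t)
print(f"dv/v ~ {dvv:.4f}, cc = {cc:.3f}")           # should recover ~ +0.005
```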
Linking Surface and Subsurface Processes: Implications for Seismic Hazards in Southern California
NASA Astrophysics Data System (ADS)
Lin, J. C.; Moon, S.; Yong, A.; Meng, L.; Martin, A. J.; Davis, P. M.
2017-12-01
Earth's surface and subsurface processes such as bedrock weathering, soil production, and river incision can influence and be influenced by spatial variations in the mechanical strength of surface material. Mechanically weakened rocks tend to have reduced seismic velocity, which can result in larger ground-motion amplification and greater potential for earthquake-induced damages. However, the influence and extent of surface and subsurface processes on the mechanical strength of surface material and seismic site conditions in southern California remain unclear. In this study, we examine whether physics-based models of surface and subsurface processes can explain the spatial variability and non-linearity of near-surface seismic velocity in southern California. We use geophysical measurements (Yong et al., 2013; Ancheta et al., 2014), consisting of shear-wave velocity (Vs) tomography data, Vs profiles, and the time-averaged Vs in the upper 30 m of the crust (Vs30) to infer lateral and vertical variations of surface material properties. Then, we compare Vs30 values with geologic and topographic attributes such as rock type, slope, elevation, and local relief, as well as metrics for surface processes such as soil production and bedrock weathering from topographic stress, frost cracking, chemical reactions, and vegetation presence. Results from this study will improve our understanding of physical processes that control subsurface material properties and their influences on local variability in seismic site conditions.
Development of seismic tomography software for hybrid supercomputers
NASA Astrophysics Data System (ADS)
Nikitin, Alexandr; Serdyukov, Alexandr; Duchkov, Anton
2015-04-01
Seismic tomography is a technique for computing a velocity model of a geologic structure from the first-arrival travel times of seismic waves. The technique is used in the processing of regional and global seismic data, in seismic exploration for prospecting and delineation of mineral and hydrocarbon deposits, and in seismic engineering for monitoring the condition of engineering structures and the surrounding host medium. As a consequence of the development of seismic monitoring systems and the increasing volume of seismic data, there is a growing need for new, more effective computational algorithms for seismic tomography applications with improved performance, accuracy and resolution. To achieve this goal, it is necessary to use modern high-performance computing systems, such as supercomputers with hybrid architecture that use not only CPUs, but also accelerators and co-processors for computation. The goal of this research is the development of parallel seismic tomography algorithms and a software package for such systems, to be used in the processing of large volumes of seismic data (hundreds of gigabytes and more). These algorithms and the software package will be optimized for the most common computing devices used in modern hybrid supercomputers, such as Intel Xeon CPUs, NVIDIA Tesla accelerators and Intel Xeon Phi co-processors. In this work, the following general scheme of seismic tomography is utilized. Using an eikonal equation solver, arrival times of seismic waves are computed based on an assumed velocity model of the geologic structure being analyzed. In order to solve the linearized inverse problem, a tomographic matrix is computed that connects model adjustments with travel time residuals, and the resulting system of linear equations is regularized and solved to adjust the model. The effectiveness of parallel implementations of existing algorithms on the target architectures is considered. During the first stage of this work, algorithms were developed for execution on
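The linearized inversion step described above (a regularized solve of the tomographic system) can be sketched with a sparse damped least-squares solver. The example below uses a randomly generated matrix purely as a stand-in for a real tomographic matrix; the problem sizes and damping value are illustrative assumptions.

```python
import numpy as np
from scipy.sparse import random as sparse_random
from scipy.sparse.linalg import lsqr

# hypothetical tomographic matrix: 5000 rays x 2000 model cells,
# entries play the role of ray lengths relating slowness perturbations to times
rng = np.random.default_rng(42)
G = sparse_random(5000, 2000, density=0.01, random_state=42) * 100.0
residuals = rng.normal(scale=0.05, size=5000)          # travel-time residuals (s)

# regularized (damped) least squares:  min ||G dm - r||^2 + damp^2 ||dm||^2
solution = lsqr(G, residuals, damp=1.0, iter_lim=200)
dm = solution[0]                                        # slowness update per cell
print("update norm:", np.linalg.norm(dm))
```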
NASA Astrophysics Data System (ADS)
Solazzi, Santiago G.; Guarracino, Luis; Rubino, J. Germán.; Müller, Tobias M.; Holliger, Klaus
2017-11-01
Quantifying seismic attenuation during laboratory imbibition experiments can provide useful information toward the use of seismic waves for monitoring injection and extraction of fluids in the Earth's crust. However, a deeper understanding of the physical causes producing the observed attenuation is needed for this purpose. In this work, we analyze seismic attenuation due to mesoscopic wave-induced fluid flow (WIFF) produced by realistic fluid distributions representative of imbibition experiments. To do so, we first perform two-phase flow simulations in a heterogeneous rock sample to emulate a forced imbibition experiment. We then select a subsample of the considered rock containing the resulting time-dependent saturation fields and apply a numerical upscaling procedure to compute the associated seismic attenuation. By exploring both saturation distributions and seismic attenuation, we observe that two manifestations of WIFF arise during imbibition experiments: the first one is produced by the compressibility contrast associated with the saturation front, whereas the second one is due to the presence of patches containing very high amounts of water that are located behind the saturation front. We demonstrate that while the former process is expected to play a significant role in the case of high injection rates, which are associated with viscous-dominated imbibition processes, the latter becomes predominant during capillary-dominated processes, that is, for relatively low injection rates. We conclude that this kind of joint numerical analysis constitutes a useful tool for improving our understanding of the physical mechanisms producing seismic attenuation during laboratory imbibition experiments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strack, K.M.; Vozoff, K.
The applications of electromagnetics have increased in the past two decades because of an improved understanding of the methods, improved service availability, and an increased focus of exploration on more complex reservoir characterization issues. For electromagnetic methods, surface applications for hydrocarbon exploration and production are still a special case, while applications in borehole and airborne research and for engineering and environmental objectives are routine. In the past, electromagnetic techniques, in particular deep transient electromagnetics, made up a completely different discipline in geophysics, although many of the principles are similar to those of seismics. With an understanding of the specific problems related first to data processing and then to acquisition, the inclusion of principles learned from seismics happened almost naturally. Initially, the data processing was very similar to seismic full-waveform processing. The hardware was also changed to include multichannel acquisition systems, and the field procedures became very similar to seismic surveying. As a consequence, the integration and synergism of the interpretation process is becoming almost automatic. The long-offset transient electromagnetic (LOTEM) technique will be summarized from the viewpoint of its similarity to seismics. The complete concept of the method will also be reviewed. An interpretation case history that integrates seismic and LOTEM data from a hydrocarbon area in China clearly demonstrates the limitations and benefits of the method.
Seismicity in Bohai Bay: New Features Revealed by Matched Filter Technique
NASA Astrophysics Data System (ADS)
Wu, M.; Mao, S.; Li, J.; Tang, C. C.; Ning, J.
2014-12-01
The Bohai Bay Basin (BBB) is a subsiding trough located in northern China and bounded by outcropping Precambrian crystalline basement: to the north by the Yan Mountains, to the west by the Taihang Mountains, to the southeast by the Luxi Uplift, and to the east by the Jiaodong Uplift and the Liaodong Uplift. It is not only cut through by the famous right-lateral strike-slip Tancheng-Lujiang Fault (TLF), but is also crossed by the Zhangjiakou-Bohai Seismic Zone (ZBSZ). Its formation and evolution are closely related to continental dynamics and are of great concern to geoscientists. Although seismicity might shed light on this issue, there is no clear image of the earthquake distribution in this region as a result of the difficulty of seismic observation in the bay area. In this paper, we employ the Matched Filter Technique (MFT) to better understand the local seismicity. MFT was originally used to detect duplicated events and is thus not capable of finding new events with different locations, so we make some improvements to the method. Firstly, we adopt the idea proposed by David Shelly et al. (Nature, 2007) to conduct a strong detection and a weak detection simultaneously, which enables us to find more micro-events. Then, we relocate the detected events, which provides us with a more accurate spatial distribution of the new events as well as the geometry of the related faults, compared with traditional MFT. Results show that the sites of some famous historical strong events are clearly locations concentrated with microearthquakes. Accordingly, we detect, determine and discuss the accurate positions of the historical strong events in the BBB employing the results of the modified MFT. Moreover, the earthquakes in the BBB form many seismic zones, whose strikes are mostly close to that of the TLF, although together they form the eastern end of the ZBSZ. At the 2014 AGU Fall Meeting, we will introduce the details of our results and their geodynamical significance. Reference: Shelly, D. R., G. C. Beroza, and S. Ide, 2007
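The core operation behind MFT is a sliding normalized cross-correlation of a template waveform against continuous data, with detections declared where the correlation exceeds a threshold defined from its median absolute deviation. The sketch below is a minimal single-channel, single-template illustration on synthetic data; real implementations stack correlations over many network channels and use FFT-based correlation for speed.

```python
import numpy as np

def sliding_norm_cc(data, template):
    """Normalized cross-correlation of a template against continuous data
    (simple loop version; production codes use FFT-based implementations)."""
    nt = len(template)
    t = (template - template.mean()) / (np.std(template) * nt)
    out = np.empty(len(data) - nt + 1)
    for i in range(len(out)):
        win = data[i:i + nt]
        s = np.std(win)
        out[i] = np.dot(t, win - win.mean()) / s if s > 0 else 0.0
    return out

def detect(cc, n_mad=8.0):
    """Indices where the correlation exceeds n_mad median absolute deviations."""
    mad = np.median(np.abs(cc - np.median(cc)))
    return np.where(cc > np.median(cc) + n_mad * mad)[0]

# toy example: a template hidden in noise
rng = np.random.default_rng(3)
template = np.sin(2 * np.pi * np.arange(200) / 40.0) * np.hanning(200)
data = rng.normal(scale=0.5, size=20000)
data[5000:5200] += template
cc = sliding_norm_cc(data, template)
print("detections near sample:", detect(cc))
```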
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ross, J.J.
Some of the same space-age signal technology being used to track events 200 miles above the earth is helping petroleum explorationists track down oil and natural gas two miles and more down into the earth. The breakthroughs, which have come in a technique called three-dimensional seismic work, could change the complexion of exploration for oil and natural gas. Thanks to this 3-D seismic approach, explorationists can make dynamic maps of sites miles beneath the surface. Then explorationists can throw these maps on space-age computer systems and manipulate them every which way - homing in sharply on salt domes, faults, sands and traps associated with oil and natural gas. ''The 3-D seismic scene has exploded within the last two years,'' says Peiter Tackenberg, Marathon technical consultant who deals with both domestic and international exploration. The 3-D technique has been around for more than a decade, he notes, but recent achievements in space-age computer hardware and software have unlocked its full potential.
NASA Astrophysics Data System (ADS)
Lindquist, Kent Gordon
We constructed a near-real-time system, called Iceworm, to automate seismic data collection, processing, storage, and distribution at the Alaska Earthquake Information Center (AEIC). Phase-picking, phase association, and interprocess communication components come from Earthworm (U.S. Geological Survey). A new generic, internal format for digital data supports unified handling of data from diverse sources. A new infrastructure for applying processing algorithms to near-real-time data streams supports automated information extraction from seismic wavefields. Integration of Datascope (U. of Colorado) provides relational database management of all automated measurements, parametric information for located hypocenters, and waveform data from Iceworm. Data from 1997 yield 329 earthquakes located by both Iceworm and the AEIC. Of these, 203 have location residuals under 22 km, sufficient for hazard response. Regionalized inversions for local magnitude in Alaska yield ML calibration curves (-logA0) that differ from the Californian Richter magnitude. The new curve is 0.2 ML units more attenuative than the Californian curve at 400 km for earthquakes north of the Denali fault. South of the fault, and for a region north of Cook Inlet, the difference is 0.4 ML. A curve for deep events differs by 0.6 ML at 650 km. We expand geographic coverage of Alaskan regional seismic monitoring to the Aleutians, the Bering Sea, and the entire Arctic by initiating the processing of four short-period, Alaskan seismic arrays. To show the array stations' sensitivity, we detect and locate two microearthquakes that were missed by the AEIC. An empirical study of the location sensitivity of the arrays predicts improvements over the Alaskan regional network that are shown as map-view contour plots. We verify these predictions by detecting an ML 3.2 event near Unimak Island with one array. The detection and location of four representative earthquakes illustrates the expansion
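Local magnitude is computed from a Wood-Anderson amplitude and a distance correction -logA0, which is exactly what the regional calibration above adjusts. The sketch below uses the Hutton & Boore (1987) southern-California curve as an example calibration and then applies the 0.2-unit offset quoted at 400 km for events north of the Denali fault; the amplitude value is invented for illustration and this is not the Iceworm code.

```python
import math

def ml_hutton_boore(amp_mm, r_km):
    """Local magnitude from a Wood-Anderson amplitude (mm) at hypocentral
    distance r (km), using the Hutton & Boore (1987) -logA0 curve as an
    example Californian calibration."""
    log_a0 = 1.110 * math.log10(r_km / 100.0) + 0.00189 * (r_km - 100.0) + 3.0
    return math.log10(amp_mm) + log_a0

# a 0.5 mm amplitude at 400 km with the Californian curve ...
ml_cal = ml_hutton_boore(0.5, 400.0)
# ... would be ~0.2 units larger with the more attenuative regional curve
# reported for stations north of the Denali fault
ml_alaska_north = ml_cal + 0.2
print(round(ml_cal, 2), round(ml_alaska_north, 2))
```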
Liu, Yang; Xu, Caijun; Wen, Yangmao; Li, Zhicai
2016-01-01
On 28 August 2009, a thrust-faulting Mw 6.3 earthquake struck the northern Qaidam basin, China. Due to the lack of ground observations in this remote region, this study presents high-precision and high spatio-temporal resolution post-seismic deformation series obtained with a small baseline subset InSAR technique. At the temporal scale, the deformation changes from fast to slow with time, with a maximum uplift of up to 7.4 cm along the line of sight 334 days after the event. At the spatial scale, the deformation is more pronounced at the hanging wall than at the footwall, and decreases from the middle to both sides of the hanging wall. We then propose a method to calculate the correlation coefficient between co-seismic and post-seismic deformation by normalizing them. The correlation coefficient is found to be 0.73, indicating a similar subsurface process occurring during both phases. The results indicate that afterslip may dominate the post-seismic deformation during 19–334 days after the event, which mainly occurs with a fault geometry and depth similar to those of the co-seismic rupture, and partly extends to shallower and deeper depths. PMID:26861330
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mazzolani, Federico M.
2008-07-08
The seismic protection of historical and monumental buildings, dating from the ancient age up to the 20th century, is being looked at with greater and greater interest, above all in the Euro-Mediterranean area, whose cultural heritage is strongly susceptible to severe damage or even collapse due to earthquakes. The cultural importance of historical and monumental constructions limits, in many cases, the possibility to upgrade them from the seismic point of view, due to the fear of using intervention techniques which could have detrimental effects on their cultural value. Consequently, great interest is growing in the development of sustainable methodologies for the use of Reversible Mixed Technologies (RMTs) in the seismic protection of existing constructions. RMTs, in fact, are conceived to exploit the peculiarities of innovative materials and special devices, and they allow ease of removal when necessary. This paper deals with the experimental and numerical studies, framed within the EC PROHITECH research project, on the application of RMTs to the historical and monumental constructions mainly belonging to the cultural heritage of the Euro-Mediterranean area. The experimental tests and the numerical analyses are carried out at five different levels, namely full-scale models, large-scale models, sub-systems, devices, and materials and elements.
Tutorial review of seismic surface waves' phenomenology
NASA Astrophysics Data System (ADS)
Levshin, A. L.; Barmin, M. P.; Ritzwoller, M. H.
2018-03-01
In recent years, surface wave seismology has become one of the leading directions in seismological investigations of the Earth's structure and seismic sources. Various applications cover a wide spectrum of goals, dealing with differences in sources of seismic excitation, penetration depths, frequency ranges, and interpretation techniques. Observed seismic data demonstrate a great variability of phenomenology, which can produce difficulties in interpretation for beginners. This tutorial review is based on the authors' many years of experience in the processing and interpretation of seismic surface wave observations and on the lectures of one of the authors (ALL) at the Workshops on Seismic Wave Excitation, Propagation and Interpretation held at the Abdus Salam International Center for Theoretical Physics (Trieste, Italy) in 1990-2012. We present some typical examples of wave patterns which could be encountered in different applications and which can serve as a guide to the analysis of observed seismograms.
A review of seismoelectric data processing techniques
NASA Astrophysics Data System (ADS)
Warden, S. D.; Garambois, S.; Jouniaux, L.; Sailhac, P.
2011-12-01
Seismoelectric tomography is expected to combine the sensitivity of electromagnetic methods to hydrological properties such as water content and permeability with the high resolution of conventional seismic surveys. This innovative exploration technique seems very promising, as it could characterize the fluids contained in reservoir rocks and detect thin layers invisible to other methods. However, it still needs to be improved before it can be successfully applied to real case problems. One of the main issues that needs to be addressed is the development of wave separation techniques enabling recovery of the signal of interest. Seismic waves passing through a fluid-saturated porous layered medium convert into at least two types of electromagnetic waves: the coseismic field (type I), accompanying seismic body and surface waves, and the independently propagating interface response (type II). The latter occurs when compressional waves encounter a contrast in electrical, chemical or mechanical properties in the subsurface, thus acting as a secondary source that can generally be approximated by a sum of electrical dipoles oscillating at the first Fresnel zone. Although properties of the medium in the vicinity of the receivers can be extracted from the coseismic waves, only the interface response provides subsurface information at depth, which makes it critical to separate both types of energy. This is a delicate problem, as the interface response may be several orders of magnitude weaker than the coseismic field. However, as reviewed by Haines et al. (2007), several properties of the interface response can be used to identify it: its dipolar amplitude pattern, its opposite polarity on opposite sides of the shot point, and the electromagnetic velocity at which it travels, several orders of magnitude greater than seismic velocities. This latter attribute can be exploited to implement filtering techniques in the frequency-wavenumber (f-k) and Radon (tau-p) domains, which we review in this work.
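The velocity contrast mentioned last lends itself to a simple frequency-wavenumber separation: after a 2-D FFT of a shot gather, energy with seismic apparent velocities is muted and only the quasi-instantaneous (very high apparent velocity) part is kept. The sketch below is a minimal illustration on random data; the cutoff velocity and gather geometry are assumptions, and practical implementations taper the mask to avoid ringing.

```python
import numpy as np

def fk_keep_high_velocity(gather, dt, dx, v_cut=10000.0):
    """Keep only energy with apparent velocity above v_cut (m/s) in an
    nt x nx shot gather (time x offset). The seismoelectric interface
    response arrives almost simultaneously across the array (quasi-infinite
    apparent velocity), while coseismic fields travel at seismic speeds."""
    nt, nx = gather.shape
    spec = np.fft.fft2(gather)
    f = np.fft.fftfreq(nt, dt)[:, None]      # temporal frequencies (Hz)
    k = np.fft.fftfreq(nx, dx)[None, :]      # spatial wavenumbers (1/m)
    with np.errstate(divide="ignore", invalid="ignore"):
        v_app = np.abs(f) / np.abs(k)        # apparent velocity of each (f, k) cell
    mask = (np.abs(k) < 1e-12) | (v_app >= v_cut)
    return np.real(np.fft.ifft2(spec * mask))

# toy gather: 0.5 s at 1 kHz sampling, 48 traces, 5 m spacing
rng = np.random.default_rng(7)
gather = rng.normal(size=(500, 48))
interface_like = fk_keep_high_velocity(gather, dt=1e-3, dx=5.0)
```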
Wang, Jingbo; Templeton, Dennise C.; Harris, David B.
2015-07-30
Using empirical matched field processing (MFP), we compare 4 yr of continuous seismic data to a set of 195 master templates from within an active geothermal field and identify over 140 per cent more events than were identified using traditional detection and location techniques alone. In managed underground reservoirs, a substantial fraction of seismic events can be excluded from the official catalogue due to an inability to clearly identify seismic-phase onsets. Empirical MFP can improve the effectiveness of current seismic detection and location methodologies by using conventionally located events with higher signal-to-noise ratios as master events to define wavefield templates that can then be used to map normally discarded indistinct seismicity. Since MFP does not require picking, it can be carried out automatically and rapidly once suitable templates are defined. In this application, we extend MFP by constructing local-distance empirical master templates using Southern California Earthquake Data Center archived waveform data of events originating within the Salton Sea Geothermal Field. We compare the empirical templates to continuous seismic data collected between 1 January 2008 and 31 December 2011. The empirical MFP method successfully identifies 6249 additional events, while the original catalogue reported 4352 events. The majority of these new events are lower-magnitude events with magnitudes between M0.2 and M0.8. Here, the increased spatial-temporal resolution of the microseismicity map within the geothermal field illustrates how empirical MFP, when combined with conventional methods, can significantly improve seismic network detection capabilities, which can aid in the long-term sustainability and monitoring of managed underground reservoirs.
NASA Astrophysics Data System (ADS)
Nagai, S.; Wu, Y.; Suppe, J.; Hirata, N.
2009-12-01
The island of Taiwan is located in an ongoing arc-continent collision zone between the Philippine Sea Plate and the Eurasian Plate. Numerous geophysical and geological studies have been carried out in and around Taiwan to develop various models explaining the tectonic processes in the Taiwan region. The active and young tectonics and the associated high seismicity in Taiwan provide us with a unique opportunity to explore and understand the processes in the region related to the arc-continent collision. Nagai et al. [2009] imaged eastward-dipping alternating high- and low-velocity bodies at depths of 5 to 25 km from the western side of the Central Mountain Range to the eastern part of Taiwan, by double-difference tomography [Zhang and Thurber, 2003] using three temporary seismic networks together with the Central Weather Bureau Seismic Network (CWBSN). These three temporary networks are the aftershock observation after the 1999 Chi-Chi Taiwan earthquake and two dense linear array observations, one across central Taiwan in 2001 and another across southern Taiwan in 2005. We proposed a new orogenic model, the 'Upper Crustal Stacking Model', inferred from our tomographic images. To further resolve the detailed seismic structure, we continue relocating earthquakes more precisely in central and southern Taiwan, using the three-dimensional velocity model [Nagai et al., 2009] and P- and S-wave arrival times from both the CWBSN and the three temporary networks. We use the double-difference tomography method to improve relative and absolute location accuracy simultaneously. The relocated seismicity is concentrated and limited along parts of the boundaries between low- and high-velocity bodies. In particular, earthquakes that occurred beneath the Eastern Central Range, triggered by the 1999 Chi-Chi earthquake, delineate subsurface structural boundaries when compared with profiles of the estimated seismic velocity. The relocated catalog and 3-D seismic velocity model give us some constraints to reconstruct
Single station monitoring of volcanoes using seismic ambient noise
NASA Astrophysics Data System (ADS)
De Plaen, R. S.; Lecocq, T.; Caudron, C.; Ferrazzini, V.; Francis, O.
2016-12-01
During volcanic eruptions, magma transport causes gas release, pressure perturbations and fracturing in the plumbing system. The potential subsequent surface deformation can be detected using geodetic techniques, and the deep mechanical processes associated with magma pressurization and/or migration and their spatio-temporal evolution can be monitored with volcanic seismicity. However, these techniques suffer, respectively, from limited sensitivity to deep changes and from a temporal distribution that is too short-term to expose early aseismic processes such as magma pressurization. Seismic ambient noise cross-correlation uses the multiple scattering of seismic vibrations by heterogeneities in the crust to retrieve the Green's function for surface waves between two stations by cross-correlating these diffuse wavefields. Seismic velocity changes are then typically measured from the cross-correlation functions, with applications to volcanoes, large-magnitude earthquakes in the far field and smaller-magnitude earthquakes at smaller distances. This technique is increasingly used as a non-destructive way to continuously monitor small seismic velocity changes (~0.1%) associated with volcanic activity, although it is usually limited to volcanoes equipped with large and dense networks of broadband stations. The single-station approach may provide a powerful and reliable alternative to the classical "cross-stations" approach when measuring variations of seismic velocities. We implemented it on the Piton de la Fournaise in Reunion Island, a very active volcano with remarkable multi-disciplinary continuous monitoring. Over the past decade, this volcano was increasingly studied using the traditional cross-station approach and therefore represents a unique laboratory to validate our approach. Our results, tested on stations located up to 3.5 km from the eruptive site, performed as well as the classical approach in detecting the volcanic eruption in the 1-2 Hz frequency band. This opens new perspectives for monitoring sparsely instrumented volcanoes.
NASA Astrophysics Data System (ADS)
Dündar, Süleyman; Dias, Nuno A.; Silveira, Graça; Vinnik, Lev; Haberland, Christian
2013-04-01
An accurate knowledge of the structure of the earth's interior is of great importance to our understanding of tectonic processes. The WILAS project (REF: PTDC/CTE-GIX/097946/2008) is a three-year collaborative project developed to study the subsurface structure of the western Iberian Peninsula, putting the main emphasis on the lithosphere-asthenosphere system beneath mainland Portugal. The tectonic evolution of the target area has been driven by major plate-tectonic processes such as the opening of the Central Atlantic and the subsequent African-Eurasian convergence. Still, very little is known about the spatial structure of the continental collision. Within the framework of this research, a temporary network of 30 broadband three-component digital stations was operated between 2010 and 2012 in the target area. To carry out a large-scale structural analysis and facilitate dense station coverage of the area under investigation, the permanent Global Seismic Network stations and temporary broadband stations deployed within the scope of several seismic experiments (e.g. the Doctar network, the Portuguese National Seismic Network) were included in the analysis. In doing so, an unprecedented volume of high-quality data with a ca. 60x60 km station density, from a combined network of 65 temporary and permanent broadband seismic stations, is currently available for research purposes. One of the tasks of the WILAS research project has been a study of seismic velocity discontinuities beneath the western Iberian Peninsula, down to a depth range of 700 km, utilizing the P- and S-receiver function techniques (PRF, SRF). Both techniques are based mainly on mode conversion of elastic body waves at an interface separating layers with different elastic properties. In the first phase of the project, PRF analysis was conducted in order to image the crust-mantle interface (Moho) and the mantle transition zone discontinuities at depths of 410 and 660 km.
SIG-VISA: Signal-based Vertically Integrated Seismic Monitoring
NASA Astrophysics Data System (ADS)
Moore, D.; Mayeda, K. M.; Myers, S. C.; Russell, S.
2013-12-01
Traditional seismic monitoring systems rely on discrete detections produced by station processing software; however, while such detections may constitute a useful summary of station activity, they discard large amounts of information present in the original recorded signal. We present SIG-VISA (Signal-based Vertically Integrated Seismic Analysis), a system for seismic monitoring through Bayesian inference on seismic signals. By directly modeling the recorded signal, our approach incorporates additional information unavailable to detection-based methods, enabling higher sensitivity and more accurate localization using techniques such as waveform matching. SIG-VISA's Bayesian forward model of seismic signal envelopes includes physically derived models of travel times and source characteristics as well as Gaussian process (kriging) statistical models of signal properties that combine interpolation of historical data with extrapolation of learned physical trends. Applying Bayesian inference, we evaluate the model on earthquakes as well as the 2009 DPRK test event, demonstrating a waveform matching effect as part of the probabilistic inference, along with results on event localization and sensitivity. In particular, we demonstrate increased sensitivity from signal-based modeling, in which the SIG-VISA signal model finds statistical evidence for arrivals even at stations for which the IMS station processing failed to register any detection.
NASA Astrophysics Data System (ADS)
Patlan, E.; Wamalwa, A. M.; Kaip, G.; Velasco, A. A.
2015-12-01
The Geothermal Development Company (GDC) in Kenya actively seeks to produce geothermal energy from within the East African Rift System (EARS). The EARS, an active continental rift zone, appears to be a developing tectonic plate boundary and thus has a number of active as well as dormant volcanoes throughout its extent. These volcanic centers can be used as potential sources for geothermal energy. The University of Texas at El Paso (UTEP) and the GDC deployed seismic sensors to monitor several volcanic centers: Menengai, Silali, Paka, and Korosi. We identify microseismic events, local events, and tilt-like events using automatic detection algorithms and manual review to identify potential local earthquakes within our seismic network. We then apply the double-difference location method to events of local magnitude less than two to image the boundary of the magma chamber and the conduit feeding the volcanoes. In the process of locating local seismicity, we also identify long-period, explosion, and tremor signals that we interpret as magma passing through conduits of the magma chamber and/or fluid being transported as a function of magma movement or hydrothermal activity. We use waveform inversion and S-wave shear-wave splitting to approximate the orientation of the local stresses from the vent or fissure-like conduit of the volcano. The microseismic events and long-period events will help us interpret the activity of the volcanoes. Our goal is to investigate basement structures beneath the volcanoes and identify the extent of magmatic modifications of the crust. Overall, these seismic techniques will help us understand magma movement and volcanic processes in the region.
Automatic Classification of volcano-seismic events based on Deep Neural Networks.
NASA Astrophysics Data System (ADS)
Titos Luzón, M.; Bueno Rodriguez, A.; Garcia Martinez, L.; Benitez, C.; Ibáñez, J. M.
2017-12-01
Seismic monitoring of active volcanoes is a popular remote sensing technique to detect seismic activity, often associated with energy exchanges between the volcano and the environment. As a result, seismographs register a wide range of volcano-seismic signals that reflect the nature and underlying physics of volcanic processes. Machine learning and signal processing techniques provide an appropriate framework to analyze such data. In this research, we propose a new classification framework for seismic events based on deep neural networks. Deep neural networks are composed of multiple processing layers and can discover intrinsic patterns from the data itself. Internal parameters can be initialized using a greedy unsupervised pre-training stage, leading to an efficient training of fully connected architectures. We aim to determine the robustness of these architectures as classifiers of seven different types of seismic events recorded at "Volcán de Fuego" (Colima, Mexico). Two deep neural networks with different pre-training strategies are studied: stacked denoising autoencoders and deep belief networks. Results are compared to existing machine learning algorithms (SVM, Random Forest, Multilayer Perceptron). We used 5 LPC coefficients over three non-overlapping segments as training features in order to characterize temporal evolution, avoid redundancy and encode the signal regardless of its duration. Experimental results show that deep architectures can classify seismic events with higher accuracy than classical algorithms, attaining up to 92% recognition accuracy. Pre-training initialization helps these models to detect events that occur simultaneously in time (such as explosions and rockfalls), increases robustness against noisy inputs, and provides better generalization. These results demonstrate that deep neural networks are robust classifiers and can be deployed in real environments to monitor the seismicity of restless volcanoes.
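To make the feature-extraction step concrete, the sketch below computes LPC coefficients over three non-overlapping segments of each signal (via the autocorrelation method) and feeds them to a simple multilayer-perceptron classifier. The deep pre-trained architectures studied in the paper are beyond a short sketch, so a plain MLP and synthetic two-class data stand in; all sizes and parameters are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import solve_toeplitz
from sklearn.neural_network import MLPClassifier

def lpc(segment, order=5):
    """LPC coefficients of one segment via the autocorrelation method."""
    x = segment - np.mean(segment)
    r = np.correlate(x, x, mode="full")[len(x) - 1:len(x) + order]
    return solve_toeplitz(r[:order], r[1:order + 1])

def features(signal, order=5, n_segments=3):
    """Concatenate LPC coefficients from non-overlapping thirds of the signal."""
    segs = np.array_split(np.asarray(signal, dtype=float), n_segments)
    return np.concatenate([lpc(s, order) for s in segs])

# synthetic two-class "events": low- vs. high-frequency wavelets in noise
rng = np.random.default_rng(0)

def synthetic_event(freq_hz, n=3000, fs=100.0):
    t = np.arange(n) / fs
    return np.sin(2 * np.pi * freq_hz * t) + 0.5 * rng.normal(size=n)

X = np.array([features(synthetic_event(f)) for f in ([1.0] * 100 + [5.0] * 100)])
y = np.array([0] * 100 + [1] * 100)

clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```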
NASA Astrophysics Data System (ADS)
DeGrandpre, K.; Pesicek, J. D.; Lu, Z.
2016-12-01
During the summer of 2014 and the early spring of 2015, two notable increases in seismic activity at Semisopochnoi volcano in the western Aleutian Islands were recorded on AVO seismometers on Semisopochnoi and neighboring islands. These seismic swarms did not lead to an eruption. This study employs differential SAR techniques using TerraSAR-X images in conjunction with more accurate relocation of the recorded seismic events through simultaneous inversion of event travel times and a three-dimensional velocity model using tomoDD. The interferograms created from the SAR images exhibit surprising coherence and an island-wide spatial distribution of inflation, which is then used in a Mogi model to define the three-dimensional location and volume change required for a source at Semisopochnoi to produce the observed surface deformation. The tomoDD relocations provide a more accurate and realistic three-dimensional velocity model as well as a tighter clustering of events for both swarms that clearly outlines a linear seismic void within the larger group of shallow (<10 km) seismicity. While no direct conclusions as to the relationship between these seismic events and the observed surface deformation can be made at this time, these techniques are both complementary and efficient forms of remotely monitoring volcanic activity that provide much deeper insight into the processes involved without having to risk hazardous or costly field work.
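The Mogi source mentioned above is the classical point-pressure model relating a volume change at depth to surface deformation. The following sketch is a minimal forward model of that relation; source depth, volume change, and Poisson's ratio are illustrative placeholders, not values from the Semisopochnoi study, and a real InSAR inversion would search over these parameters to fit the observed line-of-sight deformation.

```python
import numpy as np

def mogi_surface_displacement(x, y, x0, y0, depth, dV, nu=0.25):
    """
    Surface displacements (east, north, up) in metres for a Mogi point source.
    x, y           : observation coordinates (m)
    x0, y0, depth  : source position (m), depth > 0 below the free surface
    dV             : source volume change (m^3); nu : Poisson's ratio
    """
    dx, dy = x - x0, y - y0
    R3 = (dx**2 + dy**2 + depth**2) ** 1.5
    c = (1.0 - nu) * dV / np.pi
    return c * dx / R3, c * dy / R3, c * depth / R3

# Example: hypothetical 3 km deep source with 1e6 m^3 inflation on a 10 km x 10 km grid
# (values are illustrative only).
xg, yg = np.meshgrid(np.linspace(-5e3, 5e3, 101), np.linspace(-5e3, 5e3, 101))
_, _, uz = mogi_surface_displacement(xg, yg, 0.0, 0.0, 3e3, 1e6)
print("max uplift (m): %.3f" % uz.max())
```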
Seismic hazard assessment of Syria using seismicity, DEM, slope, active tectonic and GIS
NASA Astrophysics Data System (ADS)
Ahmad, Raed; Adris, Ahmad; Singh, Ramesh
2016-07-01
In the present work, we discuss the use of integrated remote sensing and Geographical Information System (GIS) techniques for the evaluation of seismic hazard areas in Syria. The present study is the first effort to create a seismic hazard map of Syria with the help of GIS. In the proposed approach, we have used Aster satellite data, digital elevation data (30 m resolution), earthquake data, and active tectonic maps. Several important factors for the evaluation of seismic hazard were identified and the corresponding thematic data layers (past earthquake epicenters, active faults, digital elevation model, and slope) were generated. A numerical rating scheme was developed for spatial data analysis using GIS to rank the parameters included in the evaluation of seismic hazard. The resulting earthquake potential map delineates the area into different relative susceptibility classes: high, moderate, low, and very low. The potential earthquake map was validated by correlating the obtained classes with the local probabilities produced by conventional analysis of observed earthquakes. Earthquake data for Syria and peak ground acceleration (PGA) data were then introduced into the model to develop the final seismic hazard map, based on Gutenberg-Richter (a and b value) parameters and on the concepts of local probability and recurrence time. The application of the proposed technique in the Syrian region indicates that this method provides a good estimate of seismic hazard compared to maps developed with traditional techniques (deterministic (DSHA) and probabilistic (PSHA) seismic hazard analysis). For the first time we have used numerous remote sensing and GIS parameters in the preparation of a seismic hazard map, which is found to be very realistic.
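For readers unfamiliar with the Gutenberg-Richter (a, b) parameters used in the final hazard map, the sketch below shows one standard way to estimate them from a catalogue: the Aki maximum-likelihood b-value (with Utsu's binning correction) and the corresponding annual a-value. The completeness magnitude, binning width, and the synthetic catalogue are assumptions for illustration, not the Syrian data or the authors' procedure.

```python
import numpy as np

def gutenberg_richter_ab(magnitudes, mc, dm=0.1, period_years=1.0):
    """
    Maximum-likelihood b-value (Aki, 1965, with Utsu's binning correction) and the
    corresponding annual a-value for log10 N(>=M) = a - b*M.
    """
    m = np.asarray(magnitudes, dtype=float)
    m = m[m >= mc]                           # use only events above completeness
    b = np.log10(np.e) / (m.mean() - (mc - dm / 2.0))
    n_per_year = len(m) / period_years
    a = np.log10(n_per_year) + b * mc        # so that 10**(a - b*mc) = annual rate above mc
    return a, b

# Illustrative synthetic catalogue (not Syrian data): b ~ 1 exponential magnitudes,
# continuous values, so no binning correction is applied (dm=0).
rng = np.random.default_rng(0)
mags = 3.0 + rng.exponential(scale=1.0 / (1.0 * np.log(10)), size=500)
print(gutenberg_richter_ab(mags, mc=3.0, dm=0.0, period_years=50.0))
```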
Development of Vertical Cable Seismic System
NASA Astrophysics Data System (ADS)
Asakawa, E.; Murakami, F.; Sekino, Y.; Okamoto, T.; Ishikawa, K.; Tsukahara, H.; Shimura, T.
2011-12-01
In 2009, the Ministry of Education, Culture, Sports, Science and Technology (MEXT) started the development of a survey system for hydrothermal deposits. We proposed the Vertical Cable Seismic (VCS) method, a reflection seismic survey with vertical cables above the seabottom. VCS has the following advantages for hydrothermal deposit surveys: (1) VCS is an efficient high-resolution 3D seismic survey in a limited area. (2) It achieves high-resolution images because the sensors are located close to the target. (3) It avoids the coupling problems between sensor and seabottom that seriously degrade seismic data quality. (4) Because of the autonomous recording system on the sea floor, various types of marine sources can be used with VCS, such as a sea-surface source (GI gun etc.), a deep-towed source, or an ocean bottom source. Our first experiment with 2D/3D VCS surveys was carried out in Lake Biwa, Japan, in November 2009. The 2D VCS data processing follows that of a walk-away VSP, including wave field separation and depth migration; a seismic interferometry technique is also applied. The results give a much clearer image than conventional surface seismic data. Prestack depth migration is applied to the 3D data to obtain a good quality 3D depth volume, and seismic interferometry is applied to obtain a high-resolution image of the very shallow zone. Based on this feasibility study, we developed the autonomous recording VCS system and carried out a trial experiment in the open ocean at a water depth of about 400 m to establish the procedures for deployment/recovery and to examine the VC position and its fluctuation at the seabottom. The result shows that the VC position is estimated with sufficient accuracy and very little fluctuation is observed. The Institute of Industrial Science, the University of Tokyo took the research cruise NT11-02 on the JAMSTEC R/V Natsushima in February 2011. In the cruise NT11-02, JGI carried out the second VCS survey using the autonomous VCS recording system with the deep-towed source provided by
Seismic Analysis Capability in NASTRAN
NASA Technical Reports Server (NTRS)
Butler, T. G.; Strang, R. F.
1984-01-01
Seismic analysis is a technique which pertains to loading described in terms of boundary accelerations. Earthquake shocks to buildings are the type of excitation which usually comes to mind when one hears the word seismic, but the technique also applies to a broad class of acceleration excitations applied at the base of a structure, such as vibration shaker testing or shocks to machinery foundations. Four different solution paths are available in NASTRAN for seismic analysis: Direct Seismic Frequency Response, Direct Seismic Transient Response, Modal Seismic Frequency Response, and Modal Seismic Transient Response. This capability, at present, is invoked not as separate rigid formats, but as pre-packaged ALTER packets to existing Rigid Formats 8, 9, 11, and 12. These ALTER packets are included with the delivery of the NASTRAN program and are stored on the computer as a library of callable utilities. The user calls one of these utilities and merges it into the Executive Control Section of the data deck; any of the four options are then invoked by setting parameter values in the Bulk Data.
NASA Astrophysics Data System (ADS)
Johann, Lisa; Dinske, Carsten; Shapiro, Serge
2017-04-01
Fluid injections into unconventional reservoirs have become a standard tool for the enhancement of fluid-mobility parameters. Microseismic activity during and after an injection can frequently be associated directly with the subsurface fluid injection. Previous studies demonstrate that post-injection-induced seismicity has two important characteristics: on the one hand, the triggering front, which corresponds to early and distant events and envelops the farthest induced events; on the other hand, the back front, which describes the lower boundary of the seismic cloud and envelops the aseismic domain evolving around the source after the injection stops. A lot of research has been conducted in recent years to understand seismicity-related processes. For this work, we follow the assumption that the diffusion of pore-fluid pressure is the dominant triggering mechanism. Based on Terzaghi's concept of effective normal stress, the injection of fluids leads to increasing pressures, which in turn reduce the effective normal stress and lead to sliding along pre-existing, critically stressed and favourably oriented fractures and cracks. However, in many situations the spatio-temporal signatures of induced events are captured by a rather non-linear process of pore-fluid pressure diffusion, where the hydraulic diffusivity becomes pressure-dependent. This is for example the case during hydraulic fracturing, where hydraulic transport properties are significantly enhanced. For a better understanding of processes related to post-injection-induced seismicity, we analytically describe the temporal behaviour of triggering and back fronts. We introduce a scaling law which shows that post-injection-induced events are sensitive to the degree of non-linearity and to the Euclidean dimension of the seismic cloud (see Johann et al., 2016, JGR). To validate the theory, we implement comprehensive modelling of non-linear pore-fluid pressure diffusion in 3D. We solve numerically for the non-linear equation of
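A minimal illustration of the triggering-front concept, under the simplest assumption of linear pore-pressure diffusion, is sketched below using the classical envelope r(t) = sqrt(4*pi*D*t) of Shapiro-type analysis. The non-linear scaling law and the back-front expressions derived by the authors are not reproduced here, and the diffusivity and injection duration are placeholder values.

```python
import numpy as np

def triggering_front(t, D):
    """Triggering-front distance r(t) = sqrt(4*pi*D*t) for linear pore-pressure diffusion."""
    return np.sqrt(4.0 * np.pi * D * np.asarray(t, dtype=float))

# Illustrative envelope for an assumed diffusivity of 0.1 m^2/s over a 10-day injection;
# induced events in an r-t plot are expected to lie below this curve in the linear case.
t = np.linspace(0.0, 10 * 86400.0, 200)      # seconds
r = triggering_front(t, D=0.1)               # metres
print("front after 10 days: %.0f m" % r[-1])
```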
Updated Tomographic Seismic Imaging at Kilauea Volcano, Hawaii
NASA Astrophysics Data System (ADS)
Okubo, P.; Johnson, J.; Felts, E. S.; Flores, N.
2013-12-01
Improved and more detailed geophysical, geological, and geochemical observations and measurements at Kilauea, along with prolonged eruptions at its summit caldera and east rift zone, are encouraging more ambitious interpretation and modeling of volcanic processes over a range of temporal and spatial scales. We are updating three-dimensional models of seismic wave-speed distributions within Kilauea using local earthquake arrival time tomography to support waveform-based modeling of seismic source mechanisms. We start from a tomographic model derived from a combination of permanent seismic stations comprising the Hawaiian Volcano Observatory (HVO) seismographic network and a dense deployment of temporary stations in the Kilauea caldera region in 1996. Using P- and S-wave arrival times measured from the HVO network for local earthquakes from 1997 through 2012, we compute velocity models with the finite difference tomographic seismic imaging technique implemented by Benz and others (1996), and applied to numerous volcanoes including Kilauea. Particular impetus to our current modeling was derived from a focused effort to review seismicity occurring in Kilauea's summit caldera and adjoining regions in 2012. Our results reveal clear P-wave low-velocity features at and slightly below sea level beneath Kilauea's summit caldera, lying between Halemaumau Crater and the north-facing scarps that mark the southern caldera boundary. The results are also suggestive of changes in seismic velocity distributions between 1996 and 2012. One example of such a change is an apparent decrease in the size and southeastward extent, compared to the earlier model, of the low VP feature imaged with the more recent data. However, we recognize the distinct possibility that these changes are reflective of differences in earthquake and seismic station distributions in the respective datasets, and we need to further populate the more recent HVO seismicity catalogs to possibly address this concern
NASA Astrophysics Data System (ADS)
Huerta, F. V.; Granados, I.; Aguirre, J.; Carrera, R. Á.
2017-12-01
Nowadays, in the hydrocarbon industry there is a need to optimize and reduce exploration costs for the different types of reservoirs, which motivates the community specialized in the search for and development of alternative geophysical exploration methods. This study shows the reflection response obtained from a shale gas/oil deposit through the method of seismic interferometry of ambient vibrations in combination with wavelet analysis and conventional seismic reflection techniques (CMP and NMO). The method generates seismic responses from virtual sources through cross-correlation of records of Ambient Seismic Vibrations (ASV) collected at different receivers. The seismic response obtained is interpreted as the response that would be measured at one of the receivers if a virtual source were placed at the other. The ASV records were acquired in northern Mexico using semi-rectangular arrays of multi-component geophones with an instrumental response of 10 Hz. The in-line distance between geophones was 40 m and the cross-line distance was 280 m; the sampling interval was 2 ms and the total duration of the records was 6 hours. The results show the reflection response of two lines in the in-line direction and two in the cross-line direction, for which the continuity of coherent events has been identified and interpreted as reflectors. We are confident that the identified events correspond to reflections because time-frequency analysis with the Wavelet Transform has allowed us to identify the frequency band in which body waves are present. On the other hand, the CMP and NMO techniques have allowed us to emphasize and correct the reflection response obtained during the correlation process in the frequency band of interest. The processing and analysis of ASV records through the seismic interferometry method have shown interesting results in light of the cross-correlation process in combination with
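The core of the interferometric step described above is the cross-correlation and stacking of simultaneous ambient-vibration records. The sketch below is a minimal version of that step for a single receiver pair, assuming the 2 ms sampling and 6 h record length quoted in the abstract; the window length and normalisation are placeholder choices, and the wavelet analysis, CMP, and NMO stages are not reproduced.

```python
import numpy as np
from scipy.signal import correlate

def virtual_source_response(rec_a, rec_b, fs, win_s=600.0):
    """
    Stacked cross-correlation of two simultaneous ambient-noise records.
    Receiver A acts as the virtual source; the stack approximates the inter-receiver
    response (positive lags: energy travelling from A to B).
    """
    rec_a = np.asarray(rec_a, dtype=float)
    rec_b = np.asarray(rec_b, dtype=float)
    n_win = int(win_s * fs)
    n_seg = min(len(rec_a), len(rec_b)) // n_win
    stack = np.zeros(2 * n_win - 1)
    for k in range(n_seg):
        a = rec_a[k * n_win:(k + 1) * n_win]
        b = rec_b[k * n_win:(k + 1) * n_win]
        a = (a - a.mean()) / (a.std() + 1e-12)     # simple per-window normalisation
        b = (b - b.mean()) / (b.std() + 1e-12)
        stack += correlate(b, a, mode="full", method="fft")
    lags = np.arange(-(n_win - 1), n_win) / fs
    return lags, stack / max(n_seg, 1)

# Synthetic check: 6 h of noise sampled at 500 Hz (2 ms); receiver B is a 25-sample-delayed copy.
fs = 500.0
rng = np.random.default_rng(1)
noise = rng.standard_normal(int(6 * 3600 * fs))
lags, ccf = virtual_source_response(noise, np.roll(noise, 25), fs)
print("peak lag (s):", lags[np.argmax(ccf)])       # expected ~ +0.05 s
```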
Miller, John J.; Agena, Warren F.; Haines, Seth S.; Hart, Patrick E.
2016-04-13
As part of a cooperative effort among the U.S. Geological Survey (USGS), the U.S. Department of Energy, and the U.S. Department of the Interior Bureau of Ocean Energy Management, two grids of two-dimensional multichannel seismic reflection data were acquired in the Gulf of Mexico over lease blocks Green Canyon 955 and Walker Ridge 313 between April 18 and May 3, 2013. The purpose of the data acquisition was to fill knowledge gaps in an ongoing study of known gas hydrate accumulations in the area. These data were initially processed onboard the recording ship R/V Pelican for quality control during the recording. The data were subsequently processed in detail by the U.S. Geological Survey in Denver, Colorado, in two phases. The first phase created a “kinematic” dataset in which the extensive noise present in the data was removed but relative amplitudes were not preserved. The second phase created a true relative-amplitude dataset that included noise removal and “wavelet” deconvolution while preserving the amplitude information. This report describes the processing techniques used to create both datasets.
Enhancement of seismic monitoring in hydrocarbon reservoirs
NASA Astrophysics Data System (ADS)
Caffagni, Enrico; Bokelmann, Götz
2017-04-01
Hydraulic Fracturing (HF) is widely considered one of the most significant enablers of the successful exploitation of hydrocarbons in North America. HF is currently used on a massive scale to increase permeability in shale and tight-sand deep reservoirs, despite the economic downturn. This exploitation success owes less to the subsurface geology than to technology that improves exploration, production, and decision-making. This includes monitoring of the reservoir, which is vital. Indeed, the general mindset in the industry is to keep enhancing seismic monitoring, since it allows understanding and tracking of processes in hydrocarbon reservoirs, which serves two purposes: (a) to optimize recovery, and (b) to help minimize environmental impact. This raises the question of how monitoring, and especially seismic techniques, could be made more efficient. There is a pressing demand on the seismic service industry to evolve quickly and to meet the oil and gas industry's changing needs. Nonetheless, to achieve this purpose, innovative monitoring techniques must enhance the characterization of the reservoir or provide superior-quality images of it. We discuss recent applications of seismic monitoring in hydrocarbon reservoirs, detailing potential enhancements and eventual limitations. The aim is to test the validity of these seismic monitoring techniques and to qualitatively discuss their potential application to energy fields not limited to HF. Outcomes from our investigation may also benefit operators and regulators in case of future massive HF applications in Europe. This work is part of the FracRisk consortium (www.fracrisk.eu), funded by the Horizon2020 research programme, whose aim is to help minimize the environmental footprint of shale-gas exploration and exploitation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michael G. Waddell; William J. Domoracki; Tom J. Temples
2001-12-01
This annual technical progress report covers part of Task 4 (site evaluation), Task 5 (2D seismic design, acquisition, and processing), and Task 6 (2D seismic reflection, interpretation, and AVO analysis) of DOE contract number DE-AR26-98FT40369. The project had planned one additional deployment to a site other than the Savannah River Site (SRS) or the DOE Hanford Site. After the SUBCON midyear review in Albuquerque, NM, it was decided that two additional deployments would be performed. The first deployment is to test the feasibility of using non-invasive seismic reflection and AVO analysis as a monitoring tool to assist in determining the effectiveness of Dynamic Underground Stripping (DUS) in the removal of DNAPL. The second deployment is to the Department of Defense (DOD) Charleston Naval Weapons Station Solid Waste Management Unit 12 (SWMU-12), Charleston, SC, to further test the technique to detect high concentrations of DNAPL. The Charleston Naval Weapons Station SWMU-12 site was selected in consultation with National Energy Technology Laboratory (NETL) and DOD Naval Facilities Engineering Command Southern Division (NAVFAC) personnel. Based upon the review of existing data and the shallow target depth, the project team collected three Vertical Seismic Profiles (VSP) and an experimental P-wave seismic reflection line. After preliminary analysis of the VSP data and the experimental reflection line data, it was decided to proceed with Task 5 and Task 6. Three high-resolution P-wave reflection profiles were collected with two objectives: (1) design the reflection survey to image a target depth of 20 feet below land surface to assist in determining the geologic controls on the DNAPL plume geometry, and (2) apply AVO analysis to the seismic data to locate the zone of high concentration of DNAPL. Based upon the results of the data processing and interpretation of the seismic data, the project team was able to map the channel that is controlling the DNAPL
NASA Astrophysics Data System (ADS)
Weber, R. C.; Dimech, J. L.; Phillips, D.; Molaro, J.; Schmerr, N. C.
2017-12-01
The primary objective of Apollo 17's Lunar Seismic Profiling Experiment (LSPE) was to constrain the near-surface velocity structure at the landing site using active sources detected by a 100 m-wide triangular geophone array. The experiment was later operated in "listening mode," and early studies of these data revealed the presence of thermal moonquakes - short-duration seismic events associated with terminator crossings. However, the full data set has never been systematically analyzed for natural seismic signal content. In this study, we analyze 8 months of continuous LSPE data using an automated event detection technique that has previously been applied successfully to the Apollo 16 Passive Seismic Experiment data. We detected 50,000 thermal moonquakes belonging to three distinct event templates, representing impulsive, intermediate, and emergent onsets of seismic energy, which we interpret as reflecting their relative distance from the array. Impulsive events occur largely at sunrise, possibly representing thermal "pinging" of the nearby lunar lander, while emergent events occur at sunset, possibly representing cracking or slumping in more distant surface rocks and regolith. Preliminary application of an iterative event location algorithm to a subset of the impulsive waveforms supports this interpretation. We also perform 3D modeling of the lunar surface to explore the relative contributions of the lander, known rocks, and surrounding topography to the thermal state of the regolith in the vicinity of the Apollo 17 landing site over the course of the lunar diurnal cycle. Further development of both this model and the event location algorithm may permit definitive discrimination between different types of local diurnal events, e.g. lander noise, thermally induced rock breakdown, or fault creep on the nearby Lee-Lincoln scarp. These results could place important constraints on both the contribution of seismicity to regolith production and the age of young lobate scarps.
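The abstract does not spell out the detection algorithm; as a stand-in, the sketch below shows a classic energy-ratio (STA/LTA) trigger of the kind routinely used for such continuous scans. Window lengths and the trigger threshold are placeholder values, not the parameters actually used on the LSPE data.

```python
import numpy as np

def sta_lta(trace, fs, sta_s=1.0, lta_s=30.0):
    """Short-term / long-term average ratio (characteristic function) of a single trace."""
    x = np.asarray(trace, dtype=float) ** 2                 # signal energy
    n_sta, n_lta = max(int(sta_s * fs), 1), max(int(lta_s * fs), 1)
    c = np.cumsum(np.insert(x, 0, 0.0))
    ends = np.arange(n_lta, len(x) + 1)                     # samples where both windows fit
    sta = (c[ends] - c[ends - n_sta]) / n_sta
    lta = (c[ends] - c[ends - n_lta]) / n_lta
    ratio = np.zeros(len(x))
    ratio[n_lta - 1:] = sta / (lta + 1e-20)
    return ratio

def trigger_onsets(ratio, threshold=4.0):
    """Sample indices where the characteristic function first rises above the threshold."""
    above = (ratio >= threshold).astype(int)
    return np.flatnonzero(np.diff(above) == 1) + 1
```

Detected onsets would then be grouped by waveform similarity (e.g., correlation against the impulsive, intermediate, and emergent templates) before any location attempt.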
A Cloud-Computing Service for Environmental Geophysics and Seismic Data Processing
NASA Astrophysics Data System (ADS)
Heilmann, B. Z.; Maggi, P.; Piras, A.; Satta, G.; Deidda, G. P.; Bonomi, E.
2012-04-01
Cloud computing is becoming established worldwide as a new high-performance computing paradigm that offers formidable possibilities to industry and science. The presented cloud-computing portal, part of the Grida3 project, provides an innovative approach to seismic data processing by combining open-source, state-of-the-art processing software with cloud-computing technology, making possible the effective use of distributed computation and data management with administratively distant resources. We replaced demanding user-side hardware and software requirements with remote access to high-performance grid-computing facilities. As a result, data processing can be done quasi in real time and controlled ubiquitously over the Internet via a user-friendly web-browser interface. Besides the obvious advantages over locally installed seismic-processing packages, the presented cloud-computing solution creates completely new possibilities for scientific education, collaboration, and presentation of reproducible results. The web-browser interface of our portal is based on the commercially supported grid portal EnginFrame, an open framework based on Java, XML, and Web Services. We selected the hosted applications with the objective of allowing the construction of typical 2D time-domain seismic-imaging workflows as used for environmental studies and, originally, for hydrocarbon exploration. For data visualization and pre-processing, we chose the free software package Seismic Un*x. We ported tools for trace balancing, amplitude gaining, muting, frequency filtering, dip filtering, deconvolution, and rendering, with a customized choice of options, as services onto the cloud-computing portal. For structural imaging and velocity-model building, we developed a grid version of the Common-Reflection-Surface stack, a data-driven imaging method that requires no user interaction at run time, such as manual picking in prestack volumes or velocity spectra. Due to its high level of automation, CRS stacking
NASA Astrophysics Data System (ADS)
Novelo-Casanova, D. A.; Valdés-González, C.
2008-10-01
Using pattern recognition techniques, we formulate a simple prediction rule for a retrospective prediction of the three last largest eruptions of Popocatépetl volcano, Mexico, which occurred on 23 April-30 June 1997 (Eruption 1; VEI ~ 2-3), 11 December 2000-23 January 2001 (Eruption 2; VEI ~ 3-4), and 7 June-4 September 2002 (Eruption 3; explosive dome extrusion and destruction phase). Times of Increased Probability (TIPs) were estimated from the seismicity recorded by the local seismic network from 1 January 1995 to 31 December 2005. A TIP is issued when a cluster of seismic events occurs, under our algorithm considerations, in a temporal window several days (or weeks) prior to large volcanic activity, providing sufficient time to organize an effective alert strategy. The best predictions of the three analyzed eruptions were obtained when averaging the seismicity rate over a 5-day window with a threshold value of 12 events and declaring an alarm for 45 days. A TIP was issued about six weeks before Eruption 1. TIPs were detected about one and four weeks before Eruptions 2 and 3, respectively. According to our objectives, in all cases the observed TIPs would have allowed the development of an effective civil protection strategy. Although the three eruptive events were successfully predicted under our model considerations, one false alarm was also issued by our algorithm. An analysis of the epicentral and depth distribution of the local seismicity used by our prediction rule reveals that successful TIPs were issued from microearthquakes that took place below and towards the SE of the crater. On the contrary, the seismicity that issued the observed false alarm was concentrated below the summit of the volcano. We conclude that recording of precursory seismicity below and SE of the crater, together with detection of TIPs as described here, could become an important tool to predict future large eruptions at Popocatépetl. Although our model worked well for events that occurred
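Below is a minimal sketch of the alarm rule as described in the abstract: count events in a trailing 5-day window and, when the count reaches the 12-event threshold, declare a TIP for the following 45 days. The exact windowing and averaging details of the authors' algorithm may differ; this is one plausible reading, with a day-indexed catalogue as an assumed input format.

```python
import numpy as np

def tip_alarms(event_days, window=5, threshold=12, alarm_days=45, catalogue_days=None):
    """
    Times of Increased Probability from a daily event count.
    event_days : integer day index of each detected event (0 = catalogue start)
    Returns a boolean array, one element per day, True while a TIP is declared.
    """
    event_days = np.asarray(event_days, dtype=int)
    n_days = catalogue_days or (event_days.max() + 1)
    daily = np.bincount(event_days, minlength=n_days)
    c = np.cumsum(np.insert(daily, 0, 0))
    run = c[window:] - c[:-window]               # events in the trailing `window` days
    tip = np.zeros(n_days, dtype=bool)
    for d in np.flatnonzero(run >= threshold) + window - 1:   # day on which the rule fires
        tip[d:d + alarm_days] = True
    return tip
```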
NASA Astrophysics Data System (ADS)
Nowack, Robert L.; Li, Cuiping
The inversion of seismic travel-time data for radially varying media was initially investigated by Herglotz, Wiechert, and Bateman (the HWB method) in the early part of the 20th century [1]. Tomographic inversions for laterally varying media began in seismology in the 1970s. This included early work by Aki, Christoffersson, and Husebye, who developed an inversion technique for estimating lithospheric structure beneath a seismic array from distant earthquakes (the ACH method) [2]. Also, Alekseev and others in Russia performed early inversions of refraction data for laterally varying upper mantle structure [3]. Aki and Lee [4] developed an inversion technique using travel-time data from local earthquakes.
Spatial wavefield gradient-based seismic wavefield separation
NASA Astrophysics Data System (ADS)
Van Renterghem, C.; Schmelzbach, C.; Sollberger, D.; Robertsson, J. OA
2018-03-01
Measurements of the horizontal and vertical components of particle motion combined with estimates of the spatial gradients of the seismic wavefield enable seismic data to be acquired and processed using single dedicated multicomponent stations (e.g. rotational sensors) and/or small receiver groups instead of large receiver arrays. Here, we present seismic wavefield decomposition techniques that use spatial wavefield gradient data to separate land and ocean bottom data into their upgoing/downgoing and P/S constituents. Our method is based on the elastodynamic representation theorem with the derived filters requiring local measurements of the wavefield and its spatial gradients only. We demonstrate with synthetic data and a land seismic field data example that combining translational measurements with spatial wavefield gradient estimates allows separating seismic data recorded either at the Earth's free-surface or at the sea bottom into upgoing/downgoing and P/S wavefield constituents for typical incidence angle ranges of body waves. A key finding is that the filter application only requires knowledge of the elastic properties exactly at the recording locations and is valid for a wide elastic property range.
NASA Astrophysics Data System (ADS)
Gogoladze, Z.; Moscatelli, M.; Giallini, S.; Avalle, A.; Gventsadze, A.; Kvavadze, N.; Tsereteli, N.
2016-12-01
Seismic risk is a crucial issue for the South Caucasus, which is the main gateway between Asia and Europe. The goal of this work is to propose new methods and criteria for defining an overall approach aimed at assessing and mitigating seismic risk in Georgia. In this regard, seismic microzonation represents a highly useful tool for seismic risk assessment in land management, for the design of buildings or structures, and for emergency planning. Seismic microzonation is the assessment of the local seismic hazard, which is the component of seismicity resulting from specific local characteristics that cause local amplification and soil instability, through the identification of zones with seismically homogeneous behavior. This paper presents the results of a preliminary seismic microzonation study of Gori, Georgia. Gori is located in the Shida Kartli region, on both sides of the Liachvi and Mtkvari rivers, with an area of about 135 km2 around the Gori fortress. Gori lies in the Achara-Trialeti fold-thrust belt, which is tectonically unstable. Half of all earthquakes in the Gori area with magnitude M≥3.5 have happened along this fault zone, and on the basis of damage caused by previous earthquakes, this territory shows the highest level of risk (the maximum value of direct losses) in the central part of the town. The level 1 seismic microzonation map of Gori was produced using: 1) already available data (i.e., topographic maps and borehole data), 2) results of new geological surveys, and 3) geophysical measurements (i.e., MASW and noise measurements processed with the HVSR technique). Our preliminary results highlight the presence of both stable zones susceptible to local amplification and unstable zones susceptible to geological instability. Our results are directed at establishing a set of actions aimed at risk mitigation before the initial onset of an emergency, and at management of the emergency once a seismic event has occurred. The products obtained will contain the basic elements of an integrated system
Detection of buried mines with seismic sonar
NASA Astrophysics Data System (ADS)
Muir, Thomas G.; Baker, Steven R.; Gaghan, Frederick E.; Fitzpatrick, Sean M.; Hall, Patrick W.; Sheetz, Kraig E.; Guy, Jeremie
2003-10-01
Prior research on seismo-acoustic sonar for detection of buried targets [J. Acoust. Soc. Am. 103, 2333-2343 (1998)] has continued with examination of the target strengths of buried test targets as well as targets of interest, and with detection and confirmatory classification of these, all using arrays of seismic sources and receivers together with signal processing techniques to enhance target recognition. The target strengths of two test targets (one a steel gas bottle, the other an aluminum powder keg), buried in a sand beach, were examined as a function of internal mass load to evaluate theory developed for seismic sonar target strength [J. Acoust. Soc. Am. 103, 2344-2353 (1998)]. The detection of buried naval and military targets of interest was achieved with an array of 7 shaker sources and 5 three-axis seismometers at a range of 5 m. Vector polarization filtering was the main signal processing technique for detection. It capitalizes on the fact that the vertical and horizontal components of Rayleigh wave echoes are 90 deg out of phase, enabling complex-variable processing to obtain the imaginary component of the signal power versus time, which is unique to Rayleigh waves. Gabor matrix processing of this signal component was the main technique used to determine whether the target was man-made or just a natural object in the environment. [Work sponsored by ONR.]
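The polarization idea described above can be sketched compactly: for Rayleigh-wave motion the vertical and radial components are in quadrature, so the imaginary part of the analytic cross-power between them is large for Rayleigh arrivals and near zero for in-phase noise. The snippet below is a minimal illustration using scipy's Hilbert transform, not the processing chain of the cited work; the synthetic quadrature signal is only a sanity check.

```python
import numpy as np
from scipy.signal import hilbert

def rayleigh_polarization_power(vertical, radial):
    """
    Imaginary part of the analytic cross-power between vertical and radial components;
    large magnitudes flag quadrature (Rayleigh-type) particle motion, with the sign
    indicating the sense of elliptical rotation.
    """
    v = hilbert(np.asarray(vertical, dtype=float))
    r = hilbert(np.asarray(radial, dtype=float))
    return np.imag(v * np.conj(r))

# Example: synthetic elliptical motion (radial and vertical 90 degrees out of phase).
t = np.linspace(0.0, 1.0, 1000)
vz = np.sin(2 * np.pi * 20 * t)
vr = np.cos(2 * np.pi * 20 * t)
p = rayleigh_polarization_power(vz, vr)
print("mean polarization power:", p.mean())   # clearly non-zero for quadrature motion
```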
Continuous Seismic Threshold Monitoring
1992-05-31
Continuous threshold monitoring is a technique for using a seismic network to monitor a geographical area continuously in time. The method provides...area. Two approaches are presented. Site-specific monitoring: By focusing a seismic network on a specific target site, continuous threshold monitoring...recorded events at the site. We define the threshold trace for the network as the continuous time trace of computed upper magnitude limits of seismic
Seismic and infrasonic source processes in volcanic fluid systems
NASA Astrophysics Data System (ADS)
Matoza, Robin S.
Volcanoes exhibit a spectacular diversity in fluid oscillation processes, which lead to distinct seismic and acoustic signals in the solid earth and atmosphere. Volcano seismic waveforms contain rich information on the geometry of fluid migration, resonance effects, and transient and sustained pressure oscillations resulting from unsteady flow through subsurface cracks, fissures and conduits. Volcanic sounds contain information on shallow fluid flow, resonance in near-surface cavities, and degassing dynamics into the atmosphere. Since volcanoes have large spatial scales, the vast majority of their radiated atmospheric acoustic energy is infrasonic (<20 Hz). This dissertation presents observations from joint broadband seismic and infrasound array deployments at Mount St. Helens (MSH, Washington State, USA), Tungurahua (Ecuador), and Kilauea Volcano (Hawaii, USA), each providing data for several years. These volcanoes represent a broad spectrum of eruption styles ranging from hawaiian to plinian in nature. The catalogue of recorded infrasonic signals includes continuous broadband and harmonic tremor from persistent degassing at basaltic lava vents and tubes at Pu'u O'o (Kilauea), thousands of repetitive impulsive signals associated with seismic longperiod (0.5-5 Hz) events and the dynamics of the shallow hydrothermal system at MSH, rockfall signals from the unstable dacite dome at MSH, energetic explosion blast waves and gliding infrasonic harmonic tremor at Tungurahua volcano, and large-amplitude and long-duration broadband signals associated with jetting during vulcanian, subplinian and plinian eruptions at MSH and Tungurahua. We develop models for a selection of these infrasonic signals. For infrasonic long-period (LP) events at MSH, we investigate seismic-acoustic coupling from various buried source configurations as a means to excite infrasound waves in the atmosphere. We find that linear elastic seismic-acoustic transmission from the ground to atmosphere is
NASA Astrophysics Data System (ADS)
Picozzi, Matteo; Oth, Adrien; Parolai, Stefano; Bindi, Dino; De Landro, Grazia; Amoroso, Ortensia
2017-04-01
The accurate determination of stress drop and seismic efficiency, and of how source parameters scale with earthquake size, is important for seismic hazard assessment of induced seismicity. We propose an improved non-parametric, data-driven strategy suitable for monitoring induced seismicity, which combines the generalized inversion technique with genetic algorithms. In the first step of the analysis, the generalized inversion technique allows for an effective correction of waveforms for the attenuation and site contributions. Then, the retrieved source spectra are inverted by a non-linear, sensitivity-driven inversion scheme that allows accurate estimation of source parameters. We investigate the earthquake source characteristics of 633 induced earthquakes (ML 2-4.5) recorded at The Geysers geothermal field (California) by a dense seismic network (i.e., 32 stations of the Lawrence Berkeley National Laboratory Geysers/Calpine surface seismic network, more than 17,000 velocity records). We find for most of the events a non-self-similar behavior, empirical source spectra that require an ωγ source model with γ > 2 to be well fitted, and small radiation efficiency ηSW. All these findings suggest different dynamic rupture processes for smaller and larger earthquakes, and that the proportion of high-frequency energy radiation and the amount of energy required to overcome friction or to create new fracture surface change with earthquake size. Furthermore, we observe two distinct families of events with peculiar source parameters that, in one case, suggest the reactivation of deep structures linked to the regional tectonics, while in the other support the idea of an important role of steeply dipping faults in the fluid pressure diffusion.
Seismic and acoustic signal identification algorithms
DOE Office of Scientific and Technical Information (OSTI.GOV)
LADD,MARK D.; ALAM,M. KATHLEEN; SLEEFE,GERARD E.
2000-04-03
This paper describes an algorithm for detecting and classifying seismic and acoustic signals for unattended ground sensors. The algorithm must be computationally efficient and continuously process a data stream in order to establish whether or not a desired signal has changed state (turned on or off). The paper focuses on describing a Fourier-based technique that compares the running power spectral density estimate of the data to a predetermined signature in order to determine if the desired signal has changed state. How to establish the signature and the detection thresholds is discussed, as well as the theoretical statistics of the algorithm for the Gaussian noise case, with results from simulated data. Results from actual seismic data are also discussed, along with techniques used to reduce false alarms due to the inherent nonstationary noise environments found with actual data.
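A minimal sketch of the Fourier-based comparison described above: estimate the running power spectral density of each data block with Welch's method and compare it to a stored signature PSD, declaring the signal "on" when the match statistic exceeds a threshold. The cosine-similarity metric, block length, and threshold here are illustrative assumptions, not the authors' choices.

```python
import numpy as np
from scipy.signal import welch

def psd_match(block, signature_psd, fs, nperseg=256):
    """Cosine similarity between the running log-PSD of a block and a stored signature log-PSD.
    The signature must be computed with the same fs and nperseg so the frequency bins match."""
    _, pxx = welch(np.asarray(block, dtype=float), fs=fs, nperseg=nperseg)
    a = np.log10(pxx + 1e-20)
    b = np.log10(np.asarray(signature_psd, dtype=float) + 1e-20)
    a -= a.mean()
    b -= b.mean()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-20))

def signal_state(stream, signature_psd, fs, block_s=4.0, on=0.8, nperseg=256):
    """Boolean 'signal present' decision per block of the continuous stream."""
    n = int(block_s * fs)
    blocks = [stream[i:i + n] for i in range(0, len(stream) - n + 1, n)]
    return [psd_match(b, signature_psd, fs, nperseg) >= on for b in blocks]
```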
NASA Astrophysics Data System (ADS)
Lasocki, Stanislaw; Urban, Pawel; Kwiatek, Grzegorz; Martinez-Garzón, Patricia
2017-04-01
Injection-induced seismicity (IIS) is an undesired dynamic rock-mass response to massive fluid injections. This includes reactions, among others, to hydro-fracturing for shale gas exploitation. The complexity and changeability of the technological factors that induce IIS may result in significant deviations of the observed distributions of seismic process parameters from the models which perform well for natural, tectonic seismic processes. Classic formulations of probabilistic seismic hazard analysis in natural seismicity assume the seismic marked point process to be a stationary Poisson process, whose marks - magnitudes - are governed by a Gutenberg-Richter-born exponential distribution. It is well known that the use of an inappropriate earthquake occurrence model and/or an inappropriate magnitude distribution model leads to significant systematic errors in hazard estimates. It is therefore of paramount importance to check whether these assumptions on the seismic process, commonly used for natural seismicity, can be safely applied to IIS hazard problems. Seismicity accompanying shale gas operations is widely studied in the framework of the project "Shale Gas Exploration and Exploitation Induced Risks" (SHEER). Here we present results of SHEER project investigations of such seismicity from Oklahoma and of a proxy of such seismicity - IIS data from The Geysers geothermal field. We attempt to answer the following questions: • Do IIS earthquakes follow the Gutenberg-Richter distribution law, so that the magnitude distribution can be modelled by an exponential distribution? • Is the occurrence process of IIS earthquakes Poissonian? Is it segmentally Poissonian? If yes, how are these segments linked to cycles of technological operations? Statistical tests indicate that the Gutenberg-Richter-born exponential distribution model for magnitude is, in general, inappropriate. The magnitude distribution can be complex, multimodal, with no ready
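The two questions above can be probed with standard statistical checks, sketched below: a Kolmogorov-Smirnov test of the exponential (Gutenberg-Richter) magnitude model above a completeness threshold, and an index-of-dispersion check of binned event counts against the Poisson expectation. These are generic tests, not necessarily those used in the SHEER analyses; note that the KS p-value is only approximate because the exponential rate is estimated from the same data.

```python
import numpy as np
from scipy import stats

def test_gr_exponential(magnitudes, mc, dm=0.1):
    """KS test of the Gutenberg-Richter (exponential) magnitude model above completeness mc."""
    m = np.asarray(magnitudes, dtype=float)
    excess = m[m >= mc] - (mc - dm / 2.0)
    beta = 1.0 / excess.mean()                      # ML estimate of the exponential rate
    # p-value is approximate: the scale parameter is fitted to the same sample.
    return stats.kstest(excess, "expon", args=(0.0, 1.0 / beta))

def test_poisson_occurrence(event_times, bin_s=86400.0):
    """Index of dispersion of binned counts; close to 1 for a stationary Poisson process."""
    t = np.asarray(event_times, dtype=float)
    edges = np.arange(t.min(), t.max() + bin_s, bin_s)
    counts, _ = np.histogram(t, bins=edges)
    return counts.var(ddof=1) / counts.mean()
```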
NASA Astrophysics Data System (ADS)
Anthony, Robert Ernest
During the past decade, there has been rapidly growing interest in using the naturally occurring seismic noise field to study oceanic, atmospheric, and surface processes. As many seismic noise sources are non-impulsive and vary over a broad range of time scales (e.g., minutes to decades), they are commonly analyzed using spectral analysis or other hybrid time-frequency domain methods. The PQLX community data analysis program, and the recently released Noise Tool Kit that I co-developed with the Incorporated Research Institutions for Seismology's Data Management Center, are used here to characterize seismic noise for a variety of environmental targets across a broad range of frequencies. The first two chapters of the dissertation place a strong emphasis on the analysis of environmental microseism signals, which occur between 1-25 s period and are dominated by seismic surface waves excited by multiple ocean-solid Earth energy transfer processes. I move away from microseisms in Chapter 3 to investigate the generally higher-frequency seismic signals (> 0.33 Hz) generated by fluvial systems. In Chapter 1, I analyze recently collected broadband data from temporary and permanent Antarctic stations to quantitatively assess background seismic noise levels across the continent between 2007-2012, including substantial previously unsampled sections of the Antarctic continental interior. I characterize three-component noise levels between 0.15-150 s using moving-window probability density function-derived metrics and analyze seismic noise levels in multiple frequency bands to examine different noise sources. These metrics reveal and quantify patterns of significant seasonal and geographic noise variations across the continent, including the strong effects of seasonal sea ice variation on the microseism, at a new level of resolution. Thorough analysis of the seismic noise environment and its relation to instrumentation and siting techniques in the Polar Regions facilitates new science
Seismic imaging: From classical to adjoint tomography
NASA Astrophysics Data System (ADS)
Liu, Q.; Gu, Y. J.
2012-09-01
Seismic tomography has been a vital tool in probing the Earth's internal structure and enhancing our knowledge of dynamical processes in the Earth's crust and mantle. While various tomographic techniques differ in data types utilized (e.g., body vs. surface waves), data sensitivity (ray vs. finite-frequency approximations), and choices of model parameterization and regularization, most global mantle tomographic models agree well at long wavelengths, owing to the presence and typical dimensions of cold subducted oceanic lithospheres and hot, ascending mantle plumes (e.g., in central Pacific and Africa). Structures at relatively small length scales remain controversial, though, as will be discussed in this paper, they are becoming increasingly resolvable with the fast expanding global and regional seismic networks and improved forward modeling and inversion techniques. This review paper aims to provide an overview of classical tomography methods, key debates pertaining to the resolution of mantle tomographic models, as well as to highlight recent theoretical and computational advances in forward-modeling methods that spearheaded the developments in accurate computation of sensitivity kernels and adjoint tomography. The first part of the paper is devoted to traditional traveltime and waveform tomography. While these approaches established a firm foundation for global and regional seismic tomography, data coverage and the use of approximate sensitivity kernels remained as key limiting factors in the resolution of the targeted structures. In comparison to classical tomography, adjoint tomography takes advantage of full 3D numerical simulations in forward modeling and, in many ways, revolutionizes the seismic imaging of heterogeneous structures with strong velocity contrasts. For this reason, this review provides details of the implementation, resolution and potential challenges of adjoint tomography. Further discussions of techniques that are presently popular in
NASA Astrophysics Data System (ADS)
Jurado, Maria Jose; Teixido, Teresa; Martin, Elena; Segarra, Miguel; Segura, Carlos
2013-04-01
In the frame of research conducted to develop efficient strategies for the investigation of rock properties and fluids ahead of tunnel excavations, the seismic interferometry method was applied to analyze data acquired in boreholes instrumented with geophone strings. The results obtained confirmed that seismic interferometry provides improved resolution of petrophysical properties for identifying heterogeneities and geological structures ahead of the excavation. These features are beyond the resolution of other conventional geophysical methods but can cause severe problems in the excavation of tunnels. Geophone strings were used to record different types of seismic noise generated at the tunnel head during excavation with a tunnelling machine and also during the placement of the rings covering the tunnel excavation. In this study we show how tunnel construction activities have been characterized as a source of seismic signal and used in our research as the seismic source for generating a 3D reflection seismic survey. The data were recorded in a vertical, water-filled borehole with a borehole seismic string at a distance of 60 m from the tunnel trace. A reference pilot signal was obtained from seismograms acquired close to the tunnel face excavation in order to obtain the best signal-to-noise ratio for use in the interferometry processing (Poletto et al., 2010). The seismic interferometry method (Claerbout, 1968) was successfully applied to image the subsurface geological structure using the seismic wave field generated by tunneling (tunnelling machine and construction activities) recorded with geophone strings. The technique was applied by simulating virtual shot records, one per receiver in the borehole, from the transmitted seismic events, and processing the data as a reflection seismic survey. The pseudo-reflective wave field was obtained by cross-correlation of the transmitted wave data. We applied the relationship between the transmission
NASA Astrophysics Data System (ADS)
Aliotta, M. A.; Cassisi, C.; Prestifilippo, M.; Cannata, A.; Montalto, P.; Patanè, D.
2014-12-01
During the last years, volcanic activity at Mt. Etna has often been characterized by cyclic occurrences of lava fountains. In the period between January 2011 and June 2013, 38 episodes of lava fountains were observed. Automatic recognition of the volcano's states related to lava fountain episodes (Quiet, Pre-Fountaining, Fountaining, Post-Fountaining) is very useful for monitoring purposes. We discovered that such states are strongly related to the trend of the RMS (Root Mean Square) of the seismic signal recorded in the summit area. In the framework of the project PON SIGMA (Integrated Cloud-Sensor System for Advanced Multirisk Management), we tried to model the system generating the sampled RMS values (assuming it to be a Markov process and the RMS time series to be a stochastic process) by using Hidden Markov Models (HMMs), which are a powerful tool for modeling any time-varying series. HMM analysis seeks to discover the sequence of hidden states from the observed emissions. In our framework, the observed emissions are characters generated by the SAX (Symbolic Aggregate approXimation) technique, which maps RMS time series values to discrete literal emissions. Our experiments showed how to predict volcano states by means of SAX and HMMs.
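The SAX step referred to above can be sketched in a few lines: z-normalise the RMS series, reduce it by piecewise aggregate approximation, and map the frame means to letters using Gaussian breakpoints; the resulting strings are the discrete emissions fed to the HMMs. The frame count, alphabet size, and the synthetic RMS series below are placeholders, not the project's settings.

```python
import numpy as np
from scipy.stats import norm

def sax_symbols(series, n_frames=100, alphabet_size=5):
    """SAX: z-normalisation -> piecewise aggregate approximation -> Gaussian-breakpoint letters."""
    x = np.asarray(series, dtype=float)
    x = (x - x.mean()) / (x.std() + 1e-12)
    paa = np.array([frame.mean() for frame in np.array_split(x, n_frames)])
    breakpoints = norm.ppf(np.linspace(0.0, 1.0, alphabet_size + 1)[1:-1])  # e.g. 4 cuts for 5 letters
    letters = np.array(list("abcdefghij"))[:alphabet_size]
    return "".join(letters[np.searchsorted(breakpoints, paa)])

# Example on a synthetic RMS-like series with a "fountain" burst in the middle.
rng = np.random.default_rng(2)
rms = np.concatenate([rng.lognormal(0.0, 0.2, 400),
                      rng.lognormal(1.5, 0.3, 100),
                      rng.lognormal(0.0, 0.2, 400)])
print(sax_symbols(rms, n_frames=60, alphabet_size=4))
```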
Tunnel Detection Using Seismic Methods
NASA Astrophysics Data System (ADS)
Miller, R.; Park, C. B.; Xia, J.; Ivanov, J.; Steeples, D. W.; Ryden, N.; Ballard, R. F.; Llopis, J. L.; Anderson, T. S.; Moran, M. L.; Ketcham, S. A.
2006-05-01
Surface seismic methods have shown great promise for detecting clandestine tunnels in areas where unauthorized movement beneath secure boundaries has been or is a matter of concern for authorities. Unauthorized infiltration beneath national borders and into or out of secure facilities is possible at many sites by tunneling. Developments in acquisition, processing, and analysis techniques using multi-channel seismic imaging have opened the door to a vast number of near-surface applications, including anomaly detection and delineation, specifically of tunnels. Body waves have great potential based on modeling and very preliminary empirical studies trying to capitalize on diffracted energy. A primary limitation of all seismic energy is the natural attenuation of high-frequency energy by earth materials and the difficulty in transmitting a high-amplitude source pulse with a broad spectrum above 500 Hz into the earth. Surface waves have shown great potential since the development of multi-channel analysis methods (e.g., MASW). Both shear-wave velocity and backscattered energy from surface waves have been shown through modeling and empirical studies to have great promise in detecting the presence of anomalies such as tunnels. Success in developing and evaluating various seismic approaches for detecting tunnels relies on investigations at known tunnel locations, in a variety of geologic settings, employing a wide range of seismic methods, and targeting a range of uniquely different tunnel geometries, characteristics, and host lithologies. Body-wave research at the Moffat tunnels in Winter Park, Colorado, provided well-defined diffraction-looking events that correlated with the subsurface location of the tunnel complex. Natural voids related to karst have been studied in Kansas, Oklahoma, Alabama, and Florida using shear-wave velocity imaging techniques based on the MASW approach. Manmade tunnels, culverts, and crawl spaces have been the target of multi-modal analysis
Seismic reflection imaging of shallow oceanographic structures
NASA Astrophysics Data System (ADS)
Piété, Helen; Marié, Louis; Marsset, Bruno; Thomas, Yannick; Gutscher, Marc-André
2013-05-01
Multichannel seismic (MCS) reflection profiling can provide high lateral resolution images of deep ocean thermohaline fine structure. However, the shallowest layers of the water column (z < 150 m) have remained unexplored by this technique until recently. In order to explore the feasibility of shallow seismic oceanography (SO), we reprocessed and analyzed four multichannel seismic reflection sections featuring reflectors at depths between 10 and 150 m. The influence of the acquisition parameters was quantified. Seismic data processing dedicated to SO was also investigated. Conventional seismic acquisition systems were found to be ill-suited to the imaging of shallow oceanographic structures, because of a high antenna filter effect induced by large offsets and seismic trace lengths, and sources that typically cannot provide both a high level of emission and fine vertical resolution. We considered a test case, the imagery of the seasonal thermocline on the western Brittany continental shelf. New oceanographic data acquired in this area allowed simulation of the seismic acquisition. Sea trials of a specifically designed system were performed during the ASPEX survey, conducted in early summer 2012. The seismic device featured: (i) four seismic streamers, each consisting of six traces of 1.80 m; (ii) a 1000 J SIG sparker source, providing a 400 Hz signal with a level of emission of 205 dB re 1 μPa @ 1 m. This survey captured the 15 m thick, 30 m deep seasonal thermocline in unprecedented detail, showing images of vertical displacements most probably induced by internal waves.
NASA Astrophysics Data System (ADS)
Gibbons, S. J.; Harris, D. B.; Dahl-Jensen, T.; Kværna, T.; Larsen, T. B.; Paulsen, B.; Voss, P. H.
2017-12-01
The oceanic boundary separating the Eurasian and North American plates between 70° and 84° north hosts large earthquakes which are well recorded teleseismically, and many more seismic events at far lower magnitudes that are well recorded only at regional distances. Existing seismic bulletins have considerable spread and bias resulting from limited station coverage and deficiencies in the velocity models applied. This is particularly acute for the lower magnitude events which may only be constrained by a small number of Pn and Sn arrivals. Over the past two decades there has been a significant improvement in the seismic network in the Arctic: a difficult region to instrument due to the harsh climate, a sparsity of accessible sites (particularly at significant distances from the sea), and the expense and difficult logistics of deploying and maintaining stations. New deployments and upgrades to stations on Greenland, Svalbard, Jan Mayen, Hopen, and Bjørnøya have resulted in a sparse but stable regional seismic network which results in events down to magnitudes below 3 generating high-quality Pn and Sn signals on multiple stations. A catalogue of several hundred events in the region since 1998 has been generated using many new phase readings on stations on both sides of the spreading ridge in addition to teleseismic P phases. A Bayesian multiple event relocation has resulted in a significant reduction in the spread of hypocentre estimates for both large and small events. Whereas single event location algorithms minimize vectors of time residuals on an event-by-event basis, the Bayesloc program finds a joint probability distribution of origins, hypocentres, and corrections to traveltime predictions for large numbers of events. The solutions obtained favour those event hypotheses resulting in time residuals which are most consistent over a given source region. The relocations have been performed with different 1-D velocity models applicable to the Arctic region and
NASA Astrophysics Data System (ADS)
Benjumea, Beatriz; Macau, Albert; Gabàs, Anna; Figueras, Sara
2016-04-01
We combine geophysical well logging and passive seismic measurements to characterize the near-surface geology of an area located in Hontomin, Burgos (Spain). This area presents some near-surface challenges for a geophysical study. The irregular topography is characterized by limestone outcrops and areas of unconsolidated sediments. Additionally, the near-surface geology includes an upper layer of pure limestones overlying marly limestones and marls (Upper Cretaceous). These materials lie on top of Lower Cretaceous siliciclastic sediments (sandstones, clays, gravels). In any case, a layer with reduced velocity is expected. The geophysical data sets used in this study include sonic and gamma-ray logs at two boreholes and passive seismic measurements: three arrays and 224 seismic stations for applying the horizontal-to-vertical amplitude spectral ratio method (H/V). Well-logging data define two significant changes in the P-wave-velocity log within the Upper Cretaceous layer and one more at the Upper to Lower Cretaceous contact; this technique has also been used for refining the geological interpretation. The passive seismic measurements provide a map of sediment thickness, with a maximum of around 40 m, and shear-wave velocity profiles from the array technique. A comparison between the seismic velocities from well logging and from the array measurements defines the resolution limits of the passive seismic techniques and helps in their interpretation. This study shows how these low-cost techniques can provide useful information about near-surface complexity that could be used for designing a geophysical field survey or for seismic processing steps such as statics or imaging.
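A minimal sketch of the H/V computation used for the single-station noise measurements is given below: Welch spectra of the three components and the ratio of the quadratic-mean horizontal to the vertical spectrum. Smoothing (e.g., Konno-Ohmachi), window selection, and the conversion of the H/V peak frequency to sediment thickness are simplified or omitted; the sampling rate and segment length are placeholder choices.

```python
import numpy as np
from scipy.signal import welch

def hv_ratio(north, east, vertical, fs, nperseg=4096):
    """Horizontal-to-vertical spectral ratio from a three-component ambient noise record."""
    f, pn = welch(np.asarray(north, dtype=float), fs=fs, nperseg=nperseg)
    _, pe = welch(np.asarray(east, dtype=float), fs=fs, nperseg=nperseg)
    _, pv = welch(np.asarray(vertical, dtype=float), fs=fs, nperseg=nperseg)
    h = np.sqrt((pn + pe) / 2.0)          # quadratic mean of the horizontal amplitude spectra
    v = np.sqrt(pv)
    return f, h / (v + 1e-20)

# The frequency f0 of the main H/V peak is commonly converted to sediment thickness with
# an empirical quarter-wavelength relation such as thickness ~ Vs / (4 * f0), assuming an
# average shear-wave velocity Vs for the soft cover.
```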
Cluster Computing For Real Time Seismic Array Analysis.
NASA Astrophysics Data System (ADS)
Martini, M.; Giudicepietro, F.
A seismic array is an instrument composed of a dense distribution of seismic sensors that allows measurement of the directional properties of the wavefield (slowness or wavenumber vector) radiated by a seismic source. Over the last years, arrays have been widely used in different fields of seismological research. In particular, they are applied in the investigation of seismic sources on volcanoes, where they can be successfully used for studying volcanic microtremor and long-period events, which are critical for obtaining information on the evolution of volcanic systems. For this reason arrays could be usefully employed for volcano monitoring; however, the huge amount of data produced by this type of instrument and the processing techniques, which are quite time consuming, have limited their potential for this application. In order to favor a direct application of array techniques to continuous volcano monitoring, we designed and built a small PC cluster able to compute in near real time the kinematic properties of the wavefield (slowness or wavenumber vector) produced by local seismic sources. The cluster is composed of 8 Intel Pentium-III bi-processor PCs working at 550 MHz and has 4 Gigabytes of RAM memory. It runs under the Linux operating system. The developed analysis software package is based on the Multiple SIgnal Classification (MUSIC) algorithm and is written in Fortran. The message-passing part is based upon the LAM programming environment package, an open-source implementation of the Message Passing Interface (MPI). The developed software system includes modules devoted to receiving data over the Internet and graphical applications for continuous display of the processing results. The system has been tested with a data set collected during a seismic experiment conducted on Etna in 1999, when two dense seismic arrays were deployed on the northeast and southeast flanks of the volcano. A real-time continuous acquisition system has been simulated by
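For readers unfamiliar with the MUSIC algorithm mentioned above, the sketch below is a compact, narrowband numpy version: average the cross-spectral matrix over a few sub-windows, take the noise subspace from its eigendecomposition, and scan a horizontal slowness grid with plane-wave steering vectors. Array geometry, analysis frequency, and the number of sources are placeholders; this is not the Fortran/MPI cluster implementation described in the abstract.

```python
import numpy as np

def music_slowness(records, coords, fs, freq, n_sources=1, smax=1.0e-3, ns=81, n_win=4):
    """
    Narrowband MUSIC spectrum over a horizontal slowness grid.
    records : (n_stations, n_samples) simultaneous traces
    coords  : (n_stations, 2) station coordinates in metres (east, north)
    freq    : analysis frequency (Hz); smax : maximum slowness (s/m)
    """
    records = np.asarray(records, dtype=float)
    n_sta, n_samp = records.shape
    seg = n_samp // n_win
    k = int(round(freq * seg / fs))                        # frequency bin within each sub-window
    R = np.zeros((n_sta, n_sta), dtype=complex)
    for w in range(n_win):                                 # average the cross-spectral matrix
        spec = np.fft.rfft(records[:, w * seg:(w + 1) * seg], axis=1)
        d = spec[:, k]
        R += np.outer(d, d.conj())
    R /= n_win
    eigvals, eigvecs = np.linalg.eigh(R)
    En = eigvecs[:, :n_sta - n_sources]                    # noise subspace (smallest eigenvalues)
    s = np.linspace(-smax, smax, ns)
    sx, sy = np.meshgrid(s, s)
    spectrum = np.zeros_like(sx)
    for i in range(ns):
        for j in range(ns):
            delay = coords[:, 0] * sx[i, j] + coords[:, 1] * sy[i, j]
            a = np.exp(-2j * np.pi * freq * delay)         # plane-wave steering vector
            spectrum[i, j] = 1.0 / (np.linalg.norm(En.conj().T @ a) ** 2 + 1e-20)
    return sx, sy, spectrum                                # peak marks the estimated slowness vector
```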
Discriminating Induced-Microearthquakes Using New Seismic Features
NASA Astrophysics Data System (ADS)
Mousavi, S. M.; Horton, S.
2016-12-01
We studied the characteristics of induced microearthquakes on the basis of waveforms recorded on a limited number of surface receivers using machine-learning techniques. Forty features in the time, frequency, and time-frequency domains were measured on each waveform, and several techniques, such as correlation-based feature selection, Artificial Neural Networks (ANNs), Logistic Regression (LR), and X-means clustering, were used as research tools to explore the relationship between these seismic features and source parameters. The results show that spectral features have the highest correlation to source depth. Two new measurements developed as seismic features for this study, spectral centroids and 2D cross-correlations in the time-frequency domain, performed better than the common seismic measurements. These features can be used by machine learning techniques for efficient automatic classification of low-energy signals recorded at one or more seismic stations. We applied the technique to 440 microearthquakes (-1.7…). Reference: Mousavi, S. M., S. P. Horton, C. A. Langston, and B. Samei (2016), Seismic features and automatic discrimination of deep and shallow induced-microearthquakes using neural network and logistic regression, Geophys. J. Int., doi: 10.1093/gji/ggw258.
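One of the two new features named above, the spectral centroid, can be sketched directly from a spectrogram, as below; the per-column centroid series is then summarised into a few statistics usable as classifier inputs. The spectrogram parameters are placeholder choices, and the 2D time-frequency cross-correlation feature is not reproduced here.

```python
import numpy as np
from scipy.signal import spectrogram

def spectral_centroid_series(trace, fs, nperseg=256, noverlap=192):
    """Spectral centroid (Hz) in each spectrogram column of a single waveform."""
    f, t, sxx = spectrogram(np.asarray(trace, dtype=float), fs=fs,
                            nperseg=nperseg, noverlap=noverlap)
    centroid = (f[:, None] * sxx).sum(axis=0) / (sxx.sum(axis=0) + 1e-20)
    return t, centroid

def centroid_features(trace, fs):
    """Summary statistics of the centroid series, usable as inputs to ANN/LR classifiers."""
    _, c = spectral_centroid_series(trace, fs)
    return np.array([c.mean(), c.std(), c.min(), c.max()])
```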
High resolution seismic reflection profiling at Aberdeen Proving Grounds, Maryland
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, R.D.; Xia, Jianghai; Swartzel, S.
1996-11-01
The effectiveness of shallow high resolution seismic reflection (i.e., resolution potential) to image geologic interfaces between about 70 and 750 ft at the Aberdeen Proving Grounds, Maryland (APG), appears to vary locally with the geometric complexity of the unconsolidated sediments that overlie crystalline bedrock. The bedrock surface (which represents the primary geologic target of this study) was imaged at each of three test areas on walkaway noise tests and CDP (common depth point) stacked data. Proven high resolution techniques were used to design and acquire data on this survey. Feasibility of the technique and minimum acquisition requirements were determined through evaluation and correlation of walkaway noise tests, CDP survey lines, and a downhole velocity check shot survey. Data processing and analysis revealed several critical attributes of shallow seismic data from APG that need careful consideration and compensation on reflection data sets. This survey determined: (1) the feasibility of the technique, (2) the resolution potential (both horizontal and vertical) of the technique, (3) the optimum source for this site, (4) the optimum acquisition geometries, (5) general processing flow, and (6) a basic idea of the acoustic variability across this site. Source testing involved an accelerated weight drop, land air gun, downhole black powder charge, sledge hammer/plate, and high frequency vibrator. Shallow seismic reflection profiles provided a more detailed picture of the geometric complexity and variability of the distinct clay sequences (aquitards), previously inferred from drilling to be present, based on sparse drill holes and basewide conceptual models. The seismic data also reveal a clear explanation for the difficulties previously noted in correlating individual, borehole-identified sand or clay units over even short distances.
Coherent Waves in Seismic Researches
NASA Astrophysics Data System (ADS)
Emanov, A.; Seleznev, V. S.
2013-05-01
Development of digital processing algorithms for seismic wave fields, aimed at picking useful events to study the environment and other objects, is the basis for establishing new seismic techniques. The submitted paper makes use of a fundamental property of seismic wave fields: coherence. The authors extend the conception of the types of coherence of observed wave fields and devise a technique for selecting coherent components from an observed wave field. Time coherence and space coherence are widely known; here the conception of "parameter coherence" is added. The parameter with respect to which a wave field is coherent can be very varied, because the wave field is a multivariate process described by a set of parameters. Coherence primarily means a linear relationship within the wave field as a function of the parameter. In seismic wave fields recorded in confined spaces, and in block and stratified media, time-coherent standing waves are formed. In prospecting seismology with multifold observation systems, head waves are coherent along the parallel correlation direction or, in other words, by one measurement on the generalized plane of the observation system. For detailed prospecting seismology with multifold observation systems, algorithms based on this coherence property, using one measurement over the area, have been developed that permit seismic records to be converted into head-wave time sections containing neither reflected nor other types of waves. Conversion into a time section is executed for any specified observation base. Accumulation of head-wave energy relative to noise, based on the multiplicity of the observation system, is realized within the area of head-wave recording. Conversion on a base smaller than the area of wave tracking is performed with a loss of signal-to-noise ratio relative to the maximum of this ratio attainable with the observation system. Construction of head-wave time sections and dynamic plots as a basis of automatic processing has been developed, similar to the CDP procedure in the method of
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dräbenstedt, A., E-mail: a.draebenstedt@polytec.de, E-mail: rembe@iei.tu-clausthal.de, E-mail: ulrich.polom@liag-hannover.de; Seyfried, V.; Cao, X.
2016-06-28
Laser-Doppler-Vibrometry (LDV) is an established technique to measure vibrations in technical systems with picometer vibration-amplitude resolution. Especially good sensitivity and resolution can be achieved at an infrared wavelength of 1550 nm. High-resolution vibration measurements are possible over more than 100 m distance. This advancement of the LDV technique enables new applications. The detection of seismic waves is an application which has not been investigated so far, because seismic waves outside laboratory scales are usually analyzed at low frequencies between approximately 1 Hz and 250 Hz and require velocity resolutions in the range below 1 nm/s/√Hz. Thermal displacements and air turbulence have a critical influence on LDV measurements in this low-frequency range, leading to noise levels of several 100 nm/√Hz. Commonly, seismic waves are measured with highly sensitive inertial sensors (geophones or Micro Electro-Mechanical Sensors (MEMS)). Approaching a laser geophone based on the LDV technique is the topic of this paper. We have assembled an actively vibration-isolated optical table in a minivan which provides a hole in its underbody. The laser beam of an infrared LDV assembled on the optical table strikes the ground below the car through the hole. A reference geophone detected the residual vibrations of the table. We present the results from the first successful experimental demonstration of contactless detection of seismic waves from a movable vehicle with an LDV as a laser geophone.
Support Vector Machine Model for Automatic Detection and Classification of Seismic Events
NASA Astrophysics Data System (ADS)
Barros, Vesna; Barros, Lucas
2016-04-01
The automated processing of multiple seismic signals to detect, localize and classify seismic events is a central tool in both natural hazards monitoring and nuclear treaty verification. However, false detections and missed detections caused by station noise and incorrect classification of arrivals are still an issue, and events are often unclassified or poorly classified. Thus, machine learning techniques can be used in automatic processing to classify the huge database of seismic recordings and provide more confidence in the final output. Applied in the context of the International Monitoring System (IMS) - a global sensor network developed for the Comprehensive Nuclear-Test-Ban Treaty (CTBT) - we propose a fully automatic method for seismic event detection and classification based on a supervised pattern recognition technique called the Support Vector Machine (SVM). According to Kortström et al., 2015, the advantages of using SVM are its ability to handle a large number of features and its effectiveness in high-dimensional spaces. Our objective is to detect seismic events from one IMS seismic station located in an area of high seismicity and mining activity and to classify them as earthquakes or quarry blasts. The aim is to create a flexible and easily adjustable SVM method that can be applied to different regions and datasets. Taking this a step further, accurate results for seismic stations could lead to a modification of the model and its parameters to make it applicable to other waveform technologies used to monitor nuclear explosions, such as infrasound and hydroacoustic waveforms. As an authorized user, we have direct access to all IMS data and bulletins through a secure signatory account. A set of significant seismic waveforms containing different types of events (e.g. earthquakes, quarry blasts) and noise is being analysed to train the model and learn the typical pattern of the signal from these events. Moreover, comparing the performance of the support
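A hedged sketch of SVM training on pre-computed waveform features, in the spirit of the approach described above; the features, labels and synthetic stand-in data below are assumptions, not the IMS processing chain:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 6))                    # e.g. spectral ratios, envelope shape, time of day
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)    # 1 = quarry blast, 0 = earthquake (synthetic labels)

# Standardize features, then fit an RBF-kernel SVM and cross-validate.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
scores = cross_val_score(model, X, y, cv=5)
print("cross-validated accuracy:", scores.mean())
```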
Seismic gradiometry using ambient seismic noise in an anisotropic Earth
NASA Astrophysics Data System (ADS)
de Ridder, S. A. L.; Curtis, A.
2017-05-01
We introduce a wavefield gradiometry technique to estimate both isotropic and anisotropic local medium characteristics from short recordings of seismic signals by inverting a wave equation. The method exploits the information in the spatial gradients of a seismic wavefield that are calculated using dense deployments of seismic arrays. The application of the method uses the surface wave energy in the ambient seismic field. To estimate isotropic and anisotropic medium properties we invert an elliptically anisotropic wave equation. The spatial derivatives of the recorded wavefield are evaluated by calculating finite differences over nearby recordings, which introduces a systematic anisotropic error. A two-step approach corrects this error: finite difference stencils are first calibrated, then the output of the wave-equation inversion is corrected using the linearized impulse response to the inverted velocity anomaly. We test the procedure on ambient seismic noise recorded in a large and dense ocean bottom cable array installed over Ekofisk field. The estimated azimuthal anisotropy forms a circular geometry around the production-induced subsidence bowl. This conforms with results from studies employing controlled sources, and with interferometry correlating long records of seismic noise. Yet in this example, the results were obtained using only a few minutes of ambient seismic noise.
Sand dune effects on seismic data
NASA Astrophysics Data System (ADS)
Arran, M.; Vriend, N. M.; Muyzert, E. J.
2017-12-01
Ground roll is a significant source of noise in land seismic data, with cross-line scattered ground roll particularly difficult to suppress. This noise arises from surface heterogeneities lateral to the receiver spread, and in desert regions sand dunes are a major contributor. However, the nature of this noise is poorly understood, preventing the design of more effective data acquisition or processing techniques. Here, we present numerical simulations demonstrating that sand dunes can act as resonators, scattering a seismic signal over an extensive period of time. We introduce a mathematical framework that quantitatively describes the properties of noise scattered by a barchan dune, and we discuss the relevance of heterogeneities within the dune. Having identified regions in time, space, and frequency space at which noise will be more significant, we propose the possibility of reducing dune-scattered noise through careful survey design and data processing.
Investigation of the detection of shallow tunnels using electromagnetic and seismic waves
NASA Astrophysics Data System (ADS)
Counts, Tegan; Larson, Gregg; Gürbüz, Ali Cafer; McClellan, James H.; Scott, Waymond R., Jr.
2007-04-01
Multimodal detection of subsurface targets such as tunnels, pipes, reinforcement bars, and structures has been investigated using both ground-penetrating radar (GPR) and seismic sensors with signal processing techniques to enhance localization capabilities. Both systems have been tested in bi-static configurations but the GPR has been expanded to a multi-static configuration for improved performance. The use of two compatible sensors that sense different phenomena (GPR detects changes in electrical properties while the seismic system measures mechanical properties) increases the overall system's effectiveness in a wider range of soils and conditions. Two experimental scenarios have been investigated in a laboratory model with nearly homogeneous sand. Images formed from the raw data have been enhanced using beamforming inversion techniques and Hough Transform techniques to specifically address the detection of linear targets. The processed data clearly indicate the locations of the buried targets of various sizes at a range of depths.
Very-long-period seismic signals - filling the gap between deformation and seismicity
NASA Astrophysics Data System (ADS)
Neuberg, Jurgen; Smith, Paddy
2013-04-01
Good broadband seismic sensors are capable of recording seismic transients with dominant wavelengths of several tens or even hundreds of seconds. This allows us to generate a multi-component record of volcanic seismic events that lie between the conventional high- to low-frequency seismic spectrum and deformation signals. With a much higher temporal resolution and accuracy than, e.g., GPS records, these signals fill the gap between seismicity and deformation studies. In this contribution we will review the non-trivial processing steps necessary to retrieve ground deformation from the original velocity seismogram and explore what role the resulting displacement signals play in the analysis of volcanic events. We use examples from Soufriere Hills volcano in Montserrat, West Indies, to discuss the benefits and shortcomings of such methods regarding new insights into volcanic processes.
Statistical methods for investigating quiescence and other temporal seismicity patterns
Matthews, M.V.; Reasenberg, P.A.
1988-01-01
We propose a statistical model and a technique for objective recognition of one of the most commonly cited seismicity patterns: microearthquake quiescence. We use a Poisson process model for seismicity and define a process with quiescence as one with a particular type of piecewise constant intensity function. From this model, we derive a statistic for testing stationarity against a 'quiescence' alternative. The large-sample null distribution of this statistic is approximated from simulated distributions of appropriate functionals applied to Brownian bridge processes. We point out the restrictiveness of the particular model we propose and of the quiescence idea in general. The fact that there are many point processes which have neither constant nor quiescent rate functions underscores the need to test for and describe nonuniformity thoroughly. We advocate the use of the quiescence test in conjunction with various other tests for nonuniformity and with graphical methods such as density estimation. Ideally, these methods may promote accurate description of temporal seismicity distributions and useful characterizations of interesting patterns. © 1988 Birkhäuser Verlag.
A seismic refraction technique used for subsurface investigations at Meteor Crater, Arizona
NASA Technical Reports Server (NTRS)
Ackermann, H. D.; Godson, R. H.; Watkins, J. S.
1975-01-01
A seismic refraction technique for interpreting the subsurface shape and velocity distribution of an anomalous surface feature such as an impact crater is described. The method requires the existence of a relatively deep refracting horizon and combines data obtained from both standard shallow refraction spreads and distant offset shots by using the deep refractor as a source of initial arrivals. Results obtained from applying the technique to Meteor Crater generally agree with the known structure of the crater deduced by other investigators and provide new data on an extensive fractured zone surrounding the crater. The breccia lens is computed to extend roughly 190 m below the crater floor, about 30 m less than the value deduced from early drilling data. Rocks around the crater are fractured as far as 900 m from the rim crest and to a depth of at least 800 m beneath the crater floor.
Monitoring southwest Greenland's ice sheet melt with ambient seismic noise.
Mordret, Aurélien; Mikesell, T Dylan; Harig, Christopher; Lipovsky, Bradley P; Prieto, Germán A
2016-05-01
The Greenland ice sheet presently accounts for ~70% of global ice sheet mass loss. Because this mass loss is associated with sea-level rise at a rate of 0.7 mm/year, the development of improved monitoring techniques to observe ongoing changes in ice sheet mass balance is of paramount concern. Spaceborne mass balance techniques are commonly used; however, they are inadequate for many purposes because of their low spatial and/or temporal resolution. We demonstrate that small variations in seismic wave speed in Earth's crust, as measured with the correlation of seismic noise, may be used to infer seasonal ice sheet mass balance. Seasonal loading and unloading of glacial mass induces strain in the crust, and these strains then result in seismic velocity changes due to poroelastic processes. Our method provides a new and independent way of monitoring (in near real time) ice sheet mass balance, yielding new constraints on ice sheet evolution and its contribution to global sea-level changes. An increased number of seismic stations in the vicinity of ice sheets will enhance our ability to create detailed space-time records of ice mass variations.
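One common way to measure the small relative velocity changes described above is a trace-stretching grid search that compares the coda of a current noise correlation function against a long-term reference. The sketch below illustrates that idea only; the stretch range, grid and sign convention are assumptions, not the authors' processing.

```python
import numpy as np

def stretching_dvv(reference, current, dt, max_eps=0.02, steps=401):
    """Grid-search the stretch factor that best maps `current` onto `reference`.
    Under the usual convention dv/v is approximately the negative of the
    best-fitting stretch (check the sign against the chosen definition)."""
    t = np.arange(reference.size) * dt
    best_cc, best_eps = -1.0, 0.0
    for eps in np.linspace(-max_eps, max_eps, steps):
        stretched = np.interp(t, t * (1.0 + eps), current)   # resample on a stretched axis
        cc = np.corrcoef(reference, stretched)[0, 1]
        if cc > best_cc:
            best_cc, best_eps = cc, eps
    return best_eps, best_cc
```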
Time-frequency domain SNR estimation and its application in seismic data processing
NASA Astrophysics Data System (ADS)
Zhao, Yan; Liu, Yang; Li, Xuxuan; Jiang, Nansen
2014-08-01
Based on an approach estimating the frequency-domain signal-to-noise ratio (FSNR), we propose a method to evaluate the time-frequency domain signal-to-noise ratio (TFSNR). This method adopts the short-time Fourier transform (STFT) to estimate the instantaneous power spectra of signal and noise, and then uses their ratio to compute the TFSNR. Unlike the FSNR, which describes the variation of SNR with frequency only, the TFSNR depicts the variation of SNR with both time and frequency, and thus better handles non-stationary seismic data. Using the TFSNR, we develop methods to improve inverse Q filtering and high-frequency noise attenuation in seismic data processing. Inverse Q filtering that considers the TFSNR better controls the amplification of noise. The high-frequency noise attenuation method that considers the TFSNR, unlike other de-noising methods, distinguishes and suppresses noise using an explicit criterion. Examples with synthetic and real seismic data illustrate the correctness and effectiveness of the proposed methods.
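A minimal sketch of the idea (not the authors' implementation): estimate a time-frequency SNR by comparing the STFT power of the data against a noise power spectrum taken from a signal-free window. Window lengths and the dB scaling are assumptions.

```python
import numpy as np
from scipy.signal import stft

def tfsnr(trace, noise_window, fs, nperseg=128):
    """Return frequencies, times and a time-frequency SNR estimate in dB."""
    f, t, Z = stft(trace, fs=fs, nperseg=nperseg)
    _, _, N = stft(noise_window, fs=fs, nperseg=nperseg)
    noise_power = np.mean(np.abs(N) ** 2, axis=1, keepdims=True)   # average noise spectrum
    signal_power = np.abs(Z) ** 2
    return f, t, 10.0 * np.log10(signal_power / (noise_power + 1e-20))
```

The resulting TFSNR map could then be used, for example, to taper an inverse Q filter gain wherever the local SNR falls below a chosen threshold.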
Seismic data compression speeds exploration projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galibert, P.Y.
As part of an ongoing commitment to ensure industry-wide distribution of its revolutionary seismic data compression technology, Chevron Petroleum Technology Co. (CPTC) has entered into licensing agreements with Compagnie Generale de Geophysique (CGG) and other seismic contractors for use of its software in oil and gas exploration programs. CPTC expects use of the technology to reach all of its industry partners involved in seismic data collection, processing, analysis and storage. Here, CGG - one of the world's leading seismic acquisition and processing companies - talks about its success in applying the new methodology to replace full on-board seismic processing. Chevron's technology is already being applied on large offshore 3-D seismic surveys. Worldwide, CGG has acquired more than 80,000 km of seismic data using the data compression technology.
Towards Quantification of Glacier Dynamic Ice Loss through Passive Seismic Monitoring
NASA Astrophysics Data System (ADS)
Köhler, A.; Nuth, C.; Weidle, C.; Schweitzer, J.; Kohler, J.; Buscaino, G.
2015-12-01
Global glaciers and ice caps lose mass through calving, while existing models are currently not equipped to realistically predict dynamic ice loss. This is mainly because long-term continuous calving records, which would help to better understand fine-scale processes and key climatic-dynamic feedbacks between calving, climate, terminus evolution and marine conditions, do not exist. Combined passive seismic/acoustic strategies are the only technique able to capture rapid calving events continuously, independent of daylight or meteorological conditions. We have produced such a continuous calving record for Kronebreen, a tidewater glacier in Svalbard, using data from permanent seismic stations between 2001 and 2014. However, no method has yet been established in cryo-seismology to quantify calving ice loss directly from seismic data. Independent calibration data are required to derive 1) a realistic estimation of the dynamic ice loss unobserved due to seismic noise and 2) a robust scaling of seismic calving signals to ice volumes. Here, we analyze the seismic calving record at Kronebreen and independent calving data in a first attempt to quantify ice loss directly from seismic records. We make use of a) calving flux data with weekly to monthly resolution obtained from satellite remote sensing and GPS data between 2007 and 2013, and b) direct, visual calving observations during two weeks in 2009 and 2010. Furthermore, the magnitude-scaling property of seismic calving events is analyzed. We derive and discuss an empirical relation between seismic calving events and calving flux which for the first time allows a time series of calving volumes to be estimated more than one decade back in time. Improving our model requires incorporating more precise, high-resolution calibration data. A new field campaign will combine innovative, multi-disciplinary monitoring techniques to measure calving ice volumes and dynamic ice-ocean interactions simultaneously with terrestrial laser
Seismic Anisotropy from Surface Refraction Measurements
NASA Astrophysics Data System (ADS)
Vilhelm, J.; Hrdá, J.; Klíma, K.; Lokajícek, T.; Pros, Z.
2003-04-01
The contribution deals with methods of determining P- and S-wave velocities in shallow refraction seismics. A comparison of P-wave anisotropy from samples and from field surface measurements is performed. The laboratory measurement of the P-wave velocity is realized as an omnidirectional ultrasound measurement on oriented spherical samples (diameter 5 cm) under hydrostatic pressure up to 400 MPa. The field measurement is based on the processing of at least one pair of reversed time-distance curves of refracted waves. Different velocity calculation techniques are involved, including a tomographic approach from the surface. It is shown that field seismic measurement can reflect internal rock fabric (lineation, mineral anisotropy) as well as effects connected with fracturing and weathering. The elastic constants derived from laboratory measurements exhibit transverse isotropy. To estimate the influence of anisotropy we perform ray-tracing with the software package ANRAY (Consortium Seismic Waves in Complex 3-D Structures). The use of P- and S-wave anisotropy measurements to characterize a hard-rock hydrogeological collector (water resource) is presented. In a relatively homogeneous lutaceous sedimentary medium we identified a transversely isotropic layer which exhibits an increased value of permeability (transmissivity). The seismic measurement is realized with three-component geophones and both vertical and shear seismic sources. VLF and resistivity profiling accompany the field survey.
Semi-automatic mapping for identifying complex geobodies in seismic images
NASA Astrophysics Data System (ADS)
Domínguez-C, Raymundo; Romero-Salcedo, Manuel; Velasquillo-Martínez, Luis G.; Shemeretov, Leonid
2017-03-01
Seismic images are composed of positive and negative seismic wave traces with different amplitudes (Robein 2010 Seismic Imaging: A Review of the Techniques, their Principles, Merits and Limitations (Houten: EAGE)). The association of these amplitudes together with a color palette forms complex visual patterns. The color intensity of such patterns is directly related to impedance contrasts: the higher the contrast, the higher the color intensity. Generally speaking, low impedance contrasts are depicted with low tone colors, creating zones with different patterns whose features are not evident for a 3D automated mapping option available on commercial software. In this work, a workflow for a semi-automatic mapping of seismic images focused on those areas with low-intensity colored zones that may be associated with geobodies of petroleum interest is proposed. The CIE L*A*B* color space was used to perform the seismic image processing, which helped find small but significant differences between pixel tones. This process generated binary masks that bound color regions to low-intensity colors. The three-dimensional-mask projection allowed the construction of 3D structures for such zones (geobodies). The proposed method was applied to a set of digital images from a seismic cube and tested on four representative study cases. The obtained results are encouraging because interesting geobodies are obtained with a minimum of information.
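A hedged sketch of the masking idea only: convert a seismic-amplitude image to the CIE L*a*b* colour space and threshold the lightness channel to isolate low-intensity zones. The library choice (scikit-image) and the threshold value are assumptions, not the authors' implementation.

```python
import numpy as np
from skimage import color

def low_intensity_mask(rgb_image, lightness_threshold=40.0):
    """Binary mask of pixels whose L* value falls below the threshold.
    rgb_image: float array in [0, 1] with shape (height, width, 3)."""
    lab = color.rgb2lab(rgb_image)
    return lab[..., 0] < lightness_threshold

# Stacking the masks obtained on consecutive inline sections of the seismic cube
# would then give a 3D volume from which candidate geobodies can be extracted.
```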
Seismic Experiment Ross Ice Shelf 1990/91: Characteristics of the seismic reflection data
1993-01-01
The Transantarctic Mountains, with a length of 3000-3500 km and elevations of up to 4500 m, are one of the major Cenozoic mountain ranges in the world and are by far the most striking example of rift-shoulder mountains. Over the 1990-1991 austral summer the Seismic Experiment Ross Ice Shelf (SERIS) was carried out across the Transantarctic Mountain front, between latitudes 82° and 83°S, in order to investigate the transition zone between the rifted area of the Ross Embayment and the uplifted Transantarctic Mountains. This experiment involved a 140 km long seismic reflection profile together with a 96 km long coincident wide-angle reflection/refraction profile. Gravity and relative elevation (using barometric pressure) were also measured along the profile. The primary purpose was to examine the boundary between the rift system and the uplifted rift margin (represented by the Transantarctic Mountains) using modern multi-channel crustal reflection/refraction techniques. The results provide insight into crustal structure across the plate boundary. SERIS also represented one of the first large-scale and modern multi-channel seismic experiments in the remote interior of Antarctica. As such, the project was designed to test different seismic acquisition techniques which will be involved in future seismic exploration of the continent. This report describes the results from the analysis of the acquisition tests as well as detailing some of the characteristics of the reflection seismic data.
NASA Astrophysics Data System (ADS)
Keranen, Katie M.; Weingarten, Matthew
2018-05-01
The ability of fluid-generated subsurface stress changes to trigger earthquakes has long been recognized. However, the dramatic rise in the rate of human-induced earthquakes in the past decade has created abundant opportunities to study induced earthquakes and triggering processes. This review briefly summarizes early studies but focuses on results from induced earthquakes during the past 10 years related to fluid injection in petroleum fields. Study of these earthquakes has resulted in insights into physical processes and has identified knowledge gaps and future research directions. Induced earthquakes are challenging to identify using seismological methods, and faults and reefs strongly modulate spatial and temporal patterns of induced seismicity. However, the similarity of induced and natural seismicity provides an effective tool for studying earthquake processes. With continuing development of energy resources, increased interest in carbon sequestration, and construction of large dams, induced seismicity will continue to pose a hazard in coming years.
NASA Astrophysics Data System (ADS)
Zhan, Yan; Hou, Guiting; Kusky, Timothy; Gregg, Patricia M.
2016-03-01
The New Madrid Seismic Zone (NMSZ) in the Midwestern United States was the site of several major M 6.8-8 earthquakes in 1811-1812, and remains seismically active. Although this region has been investigated extensively, the ultimate controls on earthquake initiation and the duration of the seismicity remain unclear. In this study, we develop a finite element model for the Central United States to conduct a series of numerical experiments with the goal of determining the impact of heterogeneity in the upper crust, the lower crust, and the mantle on earthquake nucleation and rupture processes. Regional seismic tomography data (CITE) are utilized to infer the viscosity structure of the lithosphere which provide an important input to the numerical models. Results indicate that when differential stresses build in the Central United States, the stresses accumulating beneath the Reelfoot Rift in the NMSZ are highly concentrated, whereas the stresses below the geologically similar Midcontinent Rift System are comparatively low. The numerical observations coincide with the observed distribution of seismicity throughout the region. By comparing the numerical results with three reference models, we argue that an extensive mantle low velocity zone beneath the NMSZ produces differential stress localization in the layers above. Furthermore, the relatively strong crust in this region, exhibited by high seismic velocities, enables the elevated stress to extend to the base of the ancient rift system, reactivating fossil rifting faults and therefore triggering earthquakes. These results show that, if boundary displacements are significant, the NMSZ is able to localize tectonic stresses, which may be released when faults close to failure are triggered by external processes such as melting of the Laurentide ice sheet or rapid river incision.
Precise Relative Earthquake Depth Determination Using Array Processing Techniques
NASA Astrophysics Data System (ADS)
Florez, M. A.; Prieto, G. A.
2014-12-01
The mechanism for intermediate-depth and deep earthquakes is still under debate. The temperatures and pressures are above the point where ordinary fractures ought to occur. Key to constraining this mechanism is the precise determination of hypocentral depth. It is well known that using depth phases allows significant improvement in event depth determination; however, routinely and systematically picking such phases for teleseismic or regional arrivals is problematic due to poor signal-to-noise ratios around the pP and sP phases. To overcome this limitation we have taken advantage of the additional information carried by seismic arrays. We have used beamforming and velocity spectral analysis techniques to precisely measure pP-P and sP-P differential travel times. These techniques are further extended to achieve subsample accuracy and to allow for events where the signal-to-noise ratio is close to or even less than 1.0. The individual estimates obtained at different subarrays for a pair of earthquakes can be combined using a double-difference technique in order to precisely map seismicity in regions where it is tightly clustered. We illustrate these methods using data from the recent M 7.9 Alaska earthquake and its aftershocks, as well as data from the Bucaramanga nest in northern South America, arguably the densest and most active intermediate-depth earthquake nest in the world.
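For illustration, the sketch below is a minimal frequency-domain delay-and-sum beam that could be used to enhance depth phases before measuring differential times on the beam trace. Plane-wave geometry, station coordinates in km, a fixed horizontal slowness and the shift sign convention are all assumptions.

```python
import numpy as np

def delay_and_sum(traces, dt, station_xy, slowness, back_azimuth_deg):
    """traces: (n_stations, n_samples); station_xy: (n_stations, 2) array in km;
    slowness in s/km. Returns the beam trace (sum of time-shifted traces)."""
    baz = np.deg2rad(back_azimuth_deg)
    direction = np.array([np.sin(baz), np.cos(baz)])     # horizontal unit vector
    delays = station_xy @ direction * slowness           # per-station delays in seconds
    n = traces.shape[1]
    freqs = np.fft.rfftfreq(n, d=dt)
    spectra = np.fft.rfft(traces, axis=1)
    # apply the alignment shifts in the frequency domain (sign depends on geometry convention)
    shifted = spectra * np.exp(2j * np.pi * freqs[None, :] * delays[:, None])
    return np.fft.irfft(shifted.mean(axis=0), n=n)
```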
NASA Astrophysics Data System (ADS)
Besutiu, Lucian; Zlagnean, Luminita
2015-04-01
Background: Located in the bending zone of the East Carpathians, the so-called Vrancea zone is one of the most active seismic regions in Europe. Despite many years of international research, its intermediate-depth seismicity within a fully intra-continental environment still represents a challenge of the 21st century. Infrastructure: In the attempt to join the above-mentioned efforts, the Solid Earth Dynamics Department (SEDD) in the Institute of Geodynamics of the Romanian Academy has developed a special research infrastructure, mainly devoted to gravity and space geodesy observations. A geodetic network covering the epicentre area of the intermediate-depth earthquakes has been designed and implemented for monitoring deep geodynamic processes and their surface echoes. Within each base station of the above-mentioned network, a steel-reinforced concrete pillar allows for high-accuracy repeated gravity and GPS determinations. Results: Starting from some results of the previously run CERGOP and UNIGRACE European programmes, to which additional SEDD repeated field campaigns were added, an unusual geodynamic behaviour has been revealed in the area. 1) Crust deformation: unlike the overall uplift of the East Carpathians as a result of denudation followed by erosion, their SE bending zone, with the Vrancea epicentre area, exhibits a slight subsidence. 2) Gravity change: a non-tidal gravity decrease of more than 200 microgal has been observed within the subsiding area over a 20-year time span. Extended observations show that the gravity decrease is an ongoing process. Interpretation: This unusual combination of topographic subsidence and gravity decrease has been interpreted in terms of crustal stretching in the Vrancea epicentre zone due to the gravity pull created by densification of the lower crust as a result of phase-transformation processes taking place in the lithospheric compartment sunk into the upper mantle. The occurrence of crustal earthquakes with vertical-extension focal
NASA Astrophysics Data System (ADS)
Hazreek, Z. A. M.; Kamarudin, A. F.; Rosli, S.; Fauziah, A.; Akmal, M. A. K.; Aziman, M.; Azhar, A. T. S.; Ashraf, M. I. M.; Shaylinda, M. Z. N.; Rais, Y.; Ishak, M. F.; Alel, M. N. A.
2018-04-01
Geotechnical site investigation, also known as subsurface profile evaluation, is the process of determining the characteristics of subsurface layers, which are ultimately used in the design and construction phases. Traditionally, site investigation has been performed using drilling techniques and thus suffers from several limitations related to cost, time, data coverage and sustainability. In order to overcome those problems, this study adopted surface techniques using the seismic refraction and ambient vibration methods for subsurface profile depth evaluation. Seismic refraction data acquisition and processing were performed using ABEM Terraloc and OPTIM software, respectively. Ambient vibration data acquisition and processing were performed using CityShark II, Lennartz and GEOPSY equipment and software, respectively. It was found that the studied area consists of two layers, representing overburden and bedrock geomaterials, based on the p-wave velocity values (vp = 300 – 2500 m/s and vp > 2500 m/s) and natural frequency values (Fo = 3.37 – 3.90 Hz) analyzed. Further analysis found that both methods show good agreement in terms of depth and thickness, with percentage accuracy at 60 – 97%. Consequently, this study has demonstrated that the seismic refraction and ambient vibration methods are applicable to estimating subsurface profile depth and thickness. Moreover, the surface techniques, which are considered non-destructive methods, are able to complement the conventional drilling method in terms of cost, time, data coverage and environmental sustainability.
NASA Astrophysics Data System (ADS)
Patlan, E.; Velasco, A.; Konter, J. G.
2010-12-01
The San Miguel volcano lies near the city of San Miguel, El Salvador (13.43°N, 88.26°W). San Miguel volcano, an active stratovolcano, presents a significant natural hazard for the city of San Miguel. In general, the internal state and activity of a volcano remain an important component in understanding volcanic hazard. The main approach for addressing volcanic hazards and processes is the analysis of data collected from deployed seismic sensors that record ground motion. Six UTEP seismic stations were deployed around San Miguel volcano from 2007-2008 to define the magma chamber and assess the seismic and volcanic hazard. We utilize these data to develop images of the earth structure beneath the volcano, to study the volcanic processes by identifying different sources, and to investigate the role of earthquakes and faults in controlling the volcanic processes. We initially locate events using automated routines and focus on analyzing local events. We then relocate each seismic event by hand-picking P-wave arrivals, and later refine these picks using waveform cross-correlation. Using a double-difference earthquake location algorithm (HypoDD), we identify a set of earthquakes that align vertically beneath the edifice of the volcano, suggesting that we have identified a magma conduit feeding the volcano. We also apply a double-difference earthquake tomography approach (tomoDD) to investigate the volcano's plumbing system. Our preliminary results show the extent of the magma chamber, which also aligns with some horizontal seismicity. Overall, this volcano is very active and presents a significant hazard to the region.
NASA Astrophysics Data System (ADS)
Zha, Yang
This dissertation focuses on imaging the crustal and upper mantle seismic velocity structure beneath oceanic spreading centers. The goals are to provide a better understanding of the crustal magmatic system and the relationship between mantle melting processes, crustal architecture and ridge characteristics. To address these questions I have analyzed ocean bottom geophysical data collected from the fast-spreading East Pacific Rise and the back-arc Eastern Lau Spreading Center using a combination of ambient noise tomography and seafloor compliance analysis. To characterize the crustal melt distribution at fast spreading ridges, I analyze seafloor compliance - the deformation under long period ocean wave forcing - measured during multiple expeditions between 1994 and 2007 at the East Pacific Rise 9º - 10ºN segment. A 3D numerical modeling technique is developed and used to estimate the effects of low shear velocity zones on compliance measurements. The forward modeling suggests strong variations of lower crustal shear velocity along the ridge axis, with zones of possible high melt fractions beneath certain segments. Analysis of repeated compliance measurements at 9º48'N indicates a decrease of crustal melt fraction following the 2005 - 2006 eruption. This temporal variability provides direct evidence for short-term variations of the magmatic system at a fast spreading ridge. To understand the relationship between mantle melting processes and crustal properties, I apply ambient noise tomography of ocean bottom seismograph (OBS) data to image the upper mantle seismic structure beneath the Eastern Lau Spreading Center (ELSC). The seismic images reveal an asymmetric upper mantle low velocity zone (LVZ) beneath the ELSC, representing a zone of partial melt. As the ridge migrates away from the volcanic arc, the LVZ becomes increasingly offset and separated from the sub-arc low velocity zone. The separation of the ridge and arc low velocity zones is spatially coincident
NASA Astrophysics Data System (ADS)
He, Y.-X.; Angus, D. A.; Blanchard, T. D.; Wang, G.-L.; Yuan, S.-Y.; Garcia, A.
2016-04-01
Extraction of fluids from subsurface reservoirs induces changes in pore pressure, leading not only to geomechanical changes, but also perturbations in seismic velocities and hence observable seismic attributes. Time-lapse seismic analysis can be used to estimate changes in subsurface hydromechanical properties and thus act as a monitoring tool for geological reservoirs. The ability to observe and quantify changes in fluid, stress and strain using seismic techniques has important implications for monitoring risk not only for petroleum applications but also for geological storage of CO2 and nuclear waste scenarios. In this paper, we integrate hydromechanical simulation results with rock physics models and full-waveform seismic modelling to assess time-lapse seismic attribute resolution for dynamic reservoir characterization and hydromechanical model calibration. The time-lapse seismic simulations use a dynamic elastic reservoir model based on a North Sea deep reservoir undergoing large pressure changes. The time-lapse seismic traveltime shifts and time strains calculated from the modelled and processed synthetic data sets (i.e. pre-stack and post-stack data) are in a reasonable agreement with the true earth models, indicating the feasibility of using 1-D strain rock physics transform and time-lapse seismic processing methodology. Estimated vertical traveltime shifts for the overburden and the majority of the reservoir are within ±1 ms of the true earth model values, indicating that the time-lapse technique is sufficiently accurate for predicting overburden velocity changes and hence geomechanical effects. Characterization of deeper structure below the overburden becomes less accurate, where more advanced time-lapse seismic processing and migration is needed to handle the complex geometry and strong lateral induced velocity changes. Nevertheless, both migrated full-offset pre-stack and near-offset post-stack data image the general features of both the overburden and
Seismic imaging of post-glacial sediments - test study before Spitsbergen expedition
NASA Astrophysics Data System (ADS)
Szalas, Joanna; Grzyb, Jaroslaw; Majdanski, Mariusz
2017-04-01
This work presents results of the analysis of reflection seismic data acquired in a testing area in central Poland. For this experiment we used a total of 147 vertical-component seismic stations (DATA-CUBE and Reftek "Texan") with an accelerated weight drop (PEG-40) as the source. The profile was 350 metres long. It is part of a pilot study for a future research project on Spitsbergen. The purpose of the study is to recognise the characteristics of the seismic response of post-glacial sediments in order to design the most adequate survey acquisition parameters and processing sequence for the data from Spitsbergen. Multiple tests and comparisons have been performed to obtain the best possible quality of the seismic image. In this research we examine the influence of receiver interval size, front mute application and surface wave attenuation attempts. Although seismic imaging is the main technique, we plan to support this analysis with additional data from traveltime tomography, MASW and other a priori information.
NASA Astrophysics Data System (ADS)
Chtouki, Toufik; Vergne, Jerome; Provost, Floriane; Malet, Jean-Philippe; Burtin, Arnaud; Hibert, Clément
2017-04-01
The Super-Sauze landslide is located on the southern part of the Barcelonnette Basin (French Alps) and has developed in a soft clay-shale environment. It is one of the four sites continuously monitored through a wide variety of geophysical and hydro-geological techniques in the framework of the OMIV French national landslide observatory. From early June to mid-July 2016, a temporary dense seismic array was installed in the most active part of the landslide and at its surroundings. Fifty different sites with an average inter-station distance of 50 m were instrumented with 150 miniaturized and autonomous seismic stations (Zland nodes), allowing a continuous record of the seismic signal at frequencies higher than 0.2 Hz over an almost regular grid. Concurrently, a Ground-Based InSAR device allowed a precise and continuous monitoring of the surface deformation. Overall, this experiment is intended to better characterize the spatio-temporal evolution of the deformation processes related to various types of forcing. We analyze the continuous records of ambient seismic noise recorded by the dense array. Using power spectral densities, we characterize the various types of natural and anthropogenic seismic sources, including the effect of water turbulence and bedload transport in the small nearby torrents. We also compute the correlation of the ambient diffuse seismic noise in various frequency bands for the 2448 station pairs to recover the empirical Green's functions between them. The temporal evolution of the coda part of these noise correlation functions allows shear-wave velocity variations in the sliding mass to be monitored and localized. Here we present some preliminary results of this analysis and compare the seismic variations to meteorological data and surface deformation.
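A sketch of the core operation behind the station-pair correlations mentioned above: band-pass, crudely normalise and cross-correlate the noise recorded at two stations; stacking such correlations over many windows approximates the empirical Green's function. The band, normalisation and windowing are assumptions.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, correlate

def noise_correlation(tr_a, tr_b, fs, fmin=1.0, fmax=10.0):
    """Single-window cross-correlation of band-passed ambient noise from two stations."""
    sos = butter(4, [fmin, fmax], btype="bandpass", fs=fs, output="sos")
    a = sosfiltfilt(sos, tr_a)
    b = sosfiltfilt(sos, tr_b)
    a /= np.max(np.abs(a)) + 1e-20      # crude amplitude normalisation
    b /= np.max(np.abs(b)) + 1e-20
    return correlate(a, b, mode="full") / a.size

# Daily (or hourly) correlations would then be stacked, and changes in the coda of
# the stack tracked over time to monitor shear-wave velocity variations.
```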
Multimodal approach to seismic pavement testing
Ryden, N.; Park, C.B.; Ulriksen, P.; Miller, R.D.
2004-01-01
A multimodal approach to nondestructive seismic pavement testing is described. The presented approach is based on multichannel analysis of all types of seismic waves propagating along the surface of the pavement. The multichannel data acquisition method is replaced by multichannel simulation with one receiver. This method uses only one accelerometer receiver and a light hammer source to generate a synthetic receiver array. The data acquisition technique is made possible through careful triggering of the source and simplifies the procedure to the point of making it generally accessible. Multiple dispersion curves are automatically and objectively extracted using the multichannel analysis of surface waves processing scheme, which is described. The resulting dispersion curves in the high-frequency range match theoretical Lamb waves in a free plate. At lower frequencies there are several branches of dispersion curves corresponding to the lower layers of different stiffness in the pavement system. The observed behavior of the multimodal dispersion curves is in agreement with theory, which has been validated through both numerical modeling and the transfer matrix method, by solving for complex wave numbers. © ASCE / JUNE 2004.
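A hedged sketch of one common way to build a multichannel dispersion image (a phase-shift style transform); the offsets, trial velocities and normalisation are assumptions, not the exact scheme described above.

```python
import numpy as np

def dispersion_image(traces, dt, offsets, velocities):
    """traces: (n_receivers, n_samples); offsets: 1-D array of source-receiver
    distances in m; velocities: 1-D array of trial phase velocities in m/s.
    Returns frequencies and a |summation| image over (frequency, velocity)."""
    n = traces.shape[1]
    freqs = np.fft.rfftfreq(n, d=dt)
    spectra = np.fft.rfft(traces, axis=1)
    spectra /= np.abs(spectra) + 1e-12                       # keep phase only
    image = np.zeros((freqs.size, len(velocities)))
    for j, v in enumerate(velocities):
        # compensate the propagation phase for each trial velocity and stack
        phase = np.exp(2j * np.pi * freqs[None, :] * offsets[:, None] / v)
        image[:, j] = np.abs(np.sum(spectra * phase, axis=0))
    return freqs, image

# Ridges in the image trace the dispersion curves of the different propagating modes.
```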
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paulsson, Bjorn N.P.; Thornburg, Jon A.; He, Ruiqing
2015-04-21
Seismic techniques are the dominant geophysical techniques for the characterization of subsurface structures and stratigraphy. Seismic techniques also dominate the monitoring and mapping of reservoir injection and production processes. Borehole seismology, of all the seismic techniques, despite its current shortcomings, has been shown to provide the highest resolution characterization and most precise monitoring results, because it generates higher signal-to-noise ratio and higher frequency data than surface seismic techniques. The operational environments for borehole seismic instruments are, however, much more demanding than for surface seismic instruments, making both the instruments and the installation much more expensive. The current state-of-the-art borehole seismic instruments have not been robust enough for long-term monitoring, compounding the problems of expensive instruments and installations. Furthermore, they have not been able to record the large-bandwidth data available in boreholes, nor have they had the sensitivity to record small, high-frequency microseismic events with high vector fidelity. To reliably achieve high resolution characterization and long-term monitoring of Enhanced Geothermal Systems (EGS) sites, a new generation of borehole seismic instruments must therefore be developed and deployed. To address the critical site characterization and monitoring needs for EGS programs, the US Department of Energy (DOE) funded Paulsson, Inc. in 2010 to develop a fiber optic based ultra-large bandwidth clamped borehole seismic vector array capable of deploying up to one thousand 3C sensor pods suitable for deployment into ultra-high temperature and high pressure boreholes. Tests of the fiber optic seismic vector sensors developed with the DOE funding have shown that the new borehole seismic sensor technology is capable of generating outstanding high vector fidelity data with extremely large bandwidth: 0.01 – 6,000 Hz. Field tests have
Detecting and Locating Seismic Events Without Phase Picks or Velocity Models
NASA Astrophysics Data System (ADS)
Arrowsmith, S.; Young, C. J.; Ballard, S.; Slinkard, M.
2015-12-01
The standard paradigm for seismic event monitoring is to scan waveforms from a network of stations and identify the arrival time of various seismic phases. A signal association algorithm then groups the picks to form events, which are subsequently located by minimizing residuals between measured travel times and travel times predicted by an Earth model. Many of these steps are prone to significant errors which can lead to erroneous arrival associations and event locations. Here, we revisit a concept for event detection that does not require phase picks or travel time curves and fuses detection, association and location into a single algorithm. Our pickless event detector exploits existing catalog and waveform data to build an empirical stack of the full regional seismic wavefield, which is subsequently used to detect and locate events at a network level using correlation techniques. Because the technique uses more of the information content of the original waveforms, the concept is particularly powerful for detecting weak events that would be missed by conventional methods. We apply our detector to seismic data from the University of Utah Seismograph Stations network and compare our results with the earthquake catalog published by the University of Utah. We demonstrate that the pickless detector can detect and locate significant numbers of events previously missed by standard data processing techniques.
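For illustration, a minimal correlation-based (pickless) detection step: slide a waveform template over continuous data and declare a detection where the normalised cross-correlation exceeds a threshold. The template, threshold and windowing are assumptions, and this simple template matcher stands in for, rather than reproduces, the empirical wavefield stack described above.

```python
import numpy as np

def correlation_detections(data, template, threshold=0.7):
    """Return sample indices where the normalised cross-correlation exceeds `threshold`."""
    n = template.size
    t = (template - template.mean()) / (template.std() * n)
    cc = np.empty(data.size - n + 1)
    for i in range(cc.size):
        w = data[i:i + n]
        cc[i] = np.sum(t * (w - w.mean())) / (w.std() + 1e-12)
    return np.where(cc > threshold)[0], cc
```

In a network-level detector, such correlation traces from many stations would be combined so that weak events emerge coherently even when no single-station trigger fires.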
Updated Colombian Seismic Hazard Map
NASA Astrophysics Data System (ADS)
Eraso, J.; Arcila, M.; Romero, J.; Dimate, C.; Bermúdez, M. L.; Alvarado, C.
2013-05-01
The Colombian seismic hazard map used by the National Building Code (NSR-98), in effect until 2009, was developed in 1996. Since then, the National Seismological Network of Colombia has improved in both coverage and technology, providing fifteen years of additional seismic records. These improvements have allowed a better understanding of the regional geology and tectonics, which, together with seismic activity with destructive effects in Colombia, has motivated the interest in and the need for a new seismic hazard assessment in this country. Taking advantage of new instrumental information sources such as new broadband stations of the National Seismological Network, new historical seismicity data, the availability of standardized global databases and, in general, advances in models and techniques, a new Colombian seismic hazard map was developed. A PSHA model was applied because it incorporates the effects of all seismic sources that may affect a particular site and handles the uncertainties caused by the parameters and assumptions defined in this kind of study. First, the geometry of the seismic sources and a complete, homogeneous seismic catalog were defined; the occurrence-rate parameters of each seismic source were then calculated, establishing a national seismotectonic model. Several attenuation-distance relationships were selected depending on the type of seismicity considered. The seismic hazard was estimated using the CRISIS2007 software created by the Engineering Institute of the Universidad Nacional Autónoma de México - UNAM (National Autonomous University of Mexico). A uniform grid with 0.1° spacing was used to calculate the peak ground acceleration (PGA) and response spectral values at 0.1, 0.2, 0.3, 0.5, 0.75, 1, 1.5, 2, 2.5 and 3.0 seconds with return periods of 75, 225, 475, 975 and 2475 years. For each site, a uniform hazard spectrum and exceedance rate curves were calculated. With the results, it is
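A toy sketch of the probabilistic ingredients only (all numbers are illustrative, not the CRISIS2007 model): an untruncated Gutenberg-Richter recurrence for one source is combined with a simple log-normal ground-motion model to produce an annual PGA exceedance rate.

```python
import numpy as np
from scipy.stats import norm

def annual_exceedance_rate(pga_g, a=4.0, b=1.0, mmin=4.0, mmax=8.0, dist_km=50.0):
    """Annual rate of exceeding pga_g (in g) from a single toy seismic source."""
    mags = np.linspace(mmin, mmax, 200)
    cum_rates = 10 ** (a - b * mags)                 # cumulative Gutenberg-Richter rates
    inc_rates = -np.gradient(cum_rates, mags)        # incremental rate density per magnitude
    # assumed attenuation model: ln(PGA[g]) = -3.5 + 0.9*M - 1.1*ln(R), sigma = 0.6
    ln_median = -3.5 + 0.9 * mags - 1.1 * np.log(dist_km)
    p_exceed = 1.0 - norm.cdf(np.log(pga_g), loc=ln_median, scale=0.6)
    dm = mags[1] - mags[0]
    return np.sum(inc_rates * p_exceed) * dm

# e.g. annual_exceedance_rate(0.1) gives the toy annual rate of exceeding 0.1 g;
# summing such rates over all sources and inverting for a target rate yields the hazard map value.
```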
Passive Seismic Monitoring for Rockfall at Yucca Mountain: Concept Tests
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheng, J; Twilley, K; Murvosh, H
2003-03-03
For the purpose of proof-testing a system intended to remotely monitor rockfall inside a potential radioactive waste repository at Yucca Mountain, a system of seismic sub-arrays will be deployed and tested on the surface of the mountain. The goal is to identify and locate rockfall events remotely using automated data collecting and processing techniques. We install seismometers on the ground surface, generate seismic energy to simulate rockfall in the underground space beneath the array, and interpret the surface response to discriminate and locate the event. Data will be analyzed using matched-field processing, a generalized beamforming method for localizing discrete signals. Software is being developed to facilitate the processing. To date, a three-component sub-array has been installed and successfully tested.
An extended stochastic method for seismic hazard estimation
NASA Astrophysics Data System (ADS)
Abd el-aal, A. K.; El-Eraki, M. A.; Mostafa, S. I.
2015-12-01
In this contribution, we developed an extended stochastic technique for seismic hazard assessment purposes. This technique builds on the stochastic method of Boore (2003), "Simulation of ground motion using the stochastic method", Pure Appl. Geophys. 160:635-676. The essential aim of the extended stochastic technique is to simulate ground motion in order to minimize the consequences of future earthquakes. The first step of this technique is defining the seismic sources which most affect the study area. Then, the maximum expected magnitude is defined for each of these seismic sources. This is followed by estimating the ground motion using an empirical attenuation relationship. Finally, the site amplification is implemented in calculating the peak ground acceleration (PGA) at each site of interest. We tested and applied this technique at Cairo, Suez, Port Said, Ismailia, Zagazig and Damietta cities to predict the ground motion. It was also applied at Cairo, Zagazig and Damietta cities to estimate the maximum peak ground acceleration under actual soil conditions. In addition, 0.5, 1, 5, 10 and 20% damping median response spectra are estimated using the extended stochastic simulation technique. The highest calculated acceleration values at bedrock conditions are found at Suez city, with a value of 44 cm/s². These acceleration values decrease towards the north of the study area, reaching 14.1 cm/s² at Damietta city. This agrees with, and is comparable to, the results of previous seismic hazard studies in northern Egypt. This work can be used for seismic risk mitigation and earthquake engineering purposes.
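A simplified sketch of the core of a stochastic (Boore-type) ground-motion simulation: windowed white noise is transformed, shaped to a target acceleration spectrum, and transformed back. The omega-squared spectral shape below is a bare-bones stand-in for the full source, path and site model used in the study.

```python
import numpy as np

def stochastic_acceleration(duration, dt, corner_freq=1.0, seed=0):
    """Generate one synthetic acceleration trace with an assumed target spectrum."""
    rng = np.random.default_rng(seed)
    n = int(duration / dt)
    noise = rng.standard_normal(n) * np.hanning(n)          # windowed white noise
    spec = np.fft.rfft(noise)
    f = np.fft.rfftfreq(n, d=dt)
    target = f ** 2 / (1.0 + (f / corner_freq) ** 2)        # omega-squared shape (illustrative)
    shaped = spec / (np.abs(spec) + 1e-12) * target          # impose target amplitude, keep noise phase
    return np.fft.irfft(shaped, n=n)

# PGA for one realisation is then simply np.max(np.abs(trace)), and the median over
# many realisations (with attenuation and site terms added) feeds the hazard estimate.
```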
NASA Astrophysics Data System (ADS)
Jurado, Maria Jose; Ripepe, Maurizio; Lopez, Carmen; Blanco, Maria Jose; Crespo, Jose
2015-04-01
A submarine volcanic eruption took place near the southernmost emerged land of El Hierro Island (Canary Islands, Spain) from October 2011 to February 2012. The seismic station network of the Instituto Geografico Nacional (IGN) had evidenced seismic unrest since July 2011 and was also a reference for following the evolution of the seismic activity associated with the volcanic eruption. Right after the eruption onset, in October 2011, a geophone string was deployed by the CSIC-IGN to monitor seismic activity, and monitoring with this array continued until May 2012. The array was installed less than 2 km away from the new volcano, next to the La Restinga village shore, in the harbor, 6 to 12 m deep in the water. Our purpose was to record seismic activity related to the volcanic activity continuously, with special interest in high-frequency events. The seismic array consisted of a cable string of 8 high-frequency, three-component, 250 Hz geophones with a separation of 6 m between them. Each geophone consists of a three-component module based on three orthogonal independent sensors that measure ground velocity. Some of the geophones were placed directly on the seabed and some were buried, depending on factors such as the irregular characteristics of the seafloor. The data were recorded at the surface with a seismometer and stored on a laptop computer. We show that acoustic data collected underwater correlate strongly with the seismic data recorded on land. Finally, we compare our data analysis results with the observed sea surface activity (ash and lava emission and degassing). This evidence discloses new and innovative techniques for monitoring submarine volcanic activity. Reference: Instituto Geográfico Nacional (IGN), "Serie El Hierro." Internet: http://www.ign.es/ign/resources/volcanologia/HIERRO.html [May 17, 2013]
Seismic activity prediction using computational intelligence techniques in northern Pakistan
NASA Astrophysics Data System (ADS)
Asim, Khawaja M.; Awais, Muhammad; Martínez-Álvarez, F.; Iqbal, Talat
2017-10-01
An earthquake prediction study is carried out for the region of northern Pakistan. The prediction methodology includes interdisciplinary interaction between seismology and computational intelligence. Eight seismic parameters are computed based upon past earthquakes. The predictive ability of these eight seismic parameters is evaluated in terms of information gain, which leads to the selection of six parameters to be used in prediction. Multiple computationally intelligent models have been developed for earthquake prediction using the selected seismic parameters. These models include a feed-forward neural network, recurrent neural network, random forest, multilayer perceptron, radial basis neural network, and support vector machine. The performance of every prediction model is evaluated, and McNemar's statistical test is applied to assess the statistical significance of the computational methodologies. The feed-forward neural network shows statistically significant predictions, with an accuracy of 75% and a positive predictive value of 78%, in the context of northern Pakistan.
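A hedged sketch of the classification setup only: a small feed-forward network trained on precomputed seismic parameters to predict whether an event above a magnitude threshold occurs in the next time window. The six features, the labels and the synthetic stand-in data are assumptions, not the study's dataset.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
X = rng.normal(size=(500, 6))                      # six selected seismic parameters (assumed)
y = (X[:, 0] + X[:, 2] > 0.5).astype(int)          # 1 = event expected in next window (synthetic labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(12,), max_iter=2000, random_state=0).fit(X_tr, y_tr)
print("test accuracy:", net.score(X_te, y_te))
```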
Detecting Seismic Infrasound Signals on Balloon Platforms
NASA Astrophysics Data System (ADS)
Krishnamoorthy, S.; Komjathy, A.; Cutts, J. A.; Pauken, M.; Garcia, R.; Mimoun, D.; Jackson, J. M.; Kedar, S.; Smrekar, S. E.; Hall, J. L.
2017-12-01
The determination of the interior structure of a planet requires detailed seismic investigations - a process that entails the detection and characterization of seismic waves due to geological activity (e.g., earthquakes, volcanic eruptions). For decades, this task has primarily been performed on Earth by an ever-expanding network of terrestrial seismic stations. However, on planets such as Venus, where the surface pressure and temperature can reach as high as 90 atmospheres and 450 degrees Celsius, respectively, placing seismometers on the planet's surface poses a vexing technological challenge. The upper layers of the Venusian atmosphere, in contrast, are more benign and capable of hosting geophysical payloads for longer mission lifetimes. In order to perform geophysical experiments from an atmospheric platform, JPL and its partners (ISAE-SUPAERO and the California Institute of Technology) are developing technologies for the detection of infrasonic waves generated by earthquakes from a balloon. The coupling of seismic energy into the atmosphere depends critically on the density differential between the surface of the planet and the atmosphere; the successful demonstration of this technique on Earth would therefore provide ample reason to expect success on Venus, where the atmospheric impedance is approximately 60 times that of Earth. In this presentation, we will share results from the first set of Earth-based balloon experiments, performed in Pahrump, Nevada in June 2017. These tests involved the generation of artificial sources of known intensity using a seismic hammer and their detection using a complex network of sensors, including highly sensitive micro-barometers suspended from balloons, GPS receivers, geophones, microphones, and seismometers. This experiment was the first of its kind and succeeded in detecting infrasonic waves from the earthquakes generated by the seismic hammer. We will present the first comprehensive analysis
Patterns in Seismicity at Mt St Helens and Mt Unzen
NASA Astrophysics Data System (ADS)
Lamb, Oliver; De Angelis, Silvio; Lavallee, Yan
2014-05-01
Cyclic behaviour on a range of timescales is a well-documented feature of many dome-forming volcanoes. Previous work on Soufrière Hills volcano (Montserrat) and Volcán de Colima (Mexico) revealed broad-scale similarities in behaviour, implying the potential to develop general physical models of sub-surface processes [1]. Using volcano-seismic data from Mt St Helens (USA) and Mt Unzen (Japan), this study explores parallels in the long-term behaviour of seismicity at two dome-forming systems. Within the last twenty years both systems underwent extended dome-forming episodes accompanied by large Vulcanian explosions or dome collapses. This study uses a suite of quantitative and analytical techniques that can highlight differences or similarities in volcano-seismic behaviour, and compares that behaviour with changes in activity during the eruptive episodes. Seismic events were automatically detected and characterized on a single short-period seismometer station located 1.5 km from the 2004-2008 vent at Mt St Helens. A total of 714,826 individual events were identified from continuous seismic recordings from 22 October 2004 to 28 February 2006 (an average of 60.2 events per hour) using a short-term/long-term average algorithm. An equivalent count will be produced from seismometer recordings over the later stages of the 1991-1995 eruption at Mt Unzen. The event-count time series from Mt St Helens is then analysed using the multi-taper method and the short-time Fourier transform to explore temporal variations in activity. Preliminary analysis of seismicity from Mt St Helens suggests cyclic behaviour on a subannual timescale, similar to that described at Volcán de Colima and Soufrière Hills volcano [1]. Frequency Index and waveform correlation tools will be implemented to analyse changes in the frequency content of the seismicity and to explore their relations to different phases of activity at the volcano. A single-station approach is used to gain a fine-scale view of variations in
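The event counting described above relies on a short-term/long-term average trigger; a minimal STA/LTA sketch on a single trace is shown below, with window lengths and a threshold that are purely illustrative rather than those used at Mt St Helens.

```python
import numpy as np

def sta_lta(trace, fs, sta_win=1.0, lta_win=30.0):
    """Classic STA/LTA characteristic function computed on the squared trace."""
    nsta, nlta = int(sta_win * fs), int(lta_win * fs)
    energy = trace.astype(float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(energy)))
    sta = (csum[nsta:] - csum[:-nsta]) / nsta          # short-term average
    lta = (csum[nlta:] - csum[:-nlta]) / nlta          # long-term average
    n = min(len(sta), len(lta))
    # align both averages so each window ends on the same sample, then divide
    return sta[-n:] / np.maximum(lta[-n:], 1e-12)

# Hypothetical 100 Hz trace with one transient burst
fs = 100.0
trace = np.random.randn(60 * int(fs))
trace[3000:3200] += 8 * np.random.randn(200)
triggers = np.where(sta_lta(trace, fs) > 5.0)[0]
print(len(triggers), "samples above the trigger threshold")
```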
NASA Astrophysics Data System (ADS)
Geza, N.; Yushin, V.
2007-12-01
Instant variations of the velocities and attenuation of seismic waves in a friable medium subjected to dynamic loading have been studied by new experimental techniques using a powerful seismic vibrator. The half-space below the operating vibrator baseplate was scanned by high-frequency elastic waves, and the recorded fluctuations were subjected to stroboscopic analysis. It was found that the variations of seismic velocities and attenuation are synchronous with the external vibrational load but are phase-shifted relative to it. The instantaneous variations of the seismic wave parameters depend on the magnitude and absolute value of the deformation, which generally results in a decrease of the elastic-wave velocities. The new experimental techniques have a high sensitivity to dynamic disturbance of the medium and allow one to detect weak seismic boundaries. The relaxation process after dynamic vibrational loading was also investigated, and the results of this research are presented.
Man-caused seismicity of Kuzbass
NASA Astrophysics Data System (ADS)
Emanov, Alexandr; Emanov, Alexey; Leskova, Ekaterina; Fateyev, Alexandr
2010-05-01
The natural seismicity of the Kuznetsk Basin is confined mainly to the mountain frame of the Kuznetsk depression. In this paper, materials from experimental work with local station networks within the sedimentary basin are presented. Two types of seismicity within the Kuznetsk depression have been recognized: first, man-made (mining-induced) seismic processes, confined to mine workings and concentrated at depths of up to one and a half kilometres; secondly, seismic activations at depths of 2-56 km that are not spatially coincident with the coal mines. Each of the studied activations consists of a large number of small earthquakes (Ms = 1-3), with from one to several tens of earthquakes recorded per day. The earthquakes near mine workings shift in space along with the workings; the seismic process intensifies when a coal-plough machine is operating and slackens when preventive works are carried out. The seismic processes near three longwall faces in the Kuznetsk Basin have been studied in detail. Reverse (uplift) faulting is the most typical focal mechanism. The activated zone near a mine working reaches 1-1.5 km in diameter. The activations not linked to mine workings testify that the subsoil of the Kuznetsk depression remains in a stressed state as a whole. The most probable man-made influences on the depression are processes associated with changes in the physical state of the rocks following the loss of methane from a large volume, or changes in rock watering over a large volume caused by mining. In this case the compacted rocks, having lost gas and water, can be pressed upwards, producing the reverse-fault mechanism of the earthquakes. A combination of the stressed state of the depression with man-made action from deep mining may account for the incipient activations in the Kuznetsk Basin. At present, earthquakes occur mainly beneath mine workings, and although damage to the workings themselves has not occurred, the intense shaking at the surface calls for careful study of such dangerous phenomena. In 2009, the experiment on seismic activations in the area of the previously investigated longwalls was repeated
Multi-azimuth 3D Seismic Exploration and Processing in the Jeju Basin, the Northern East China Sea
NASA Astrophysics Data System (ADS)
Yoon, Youngho; Kang, Moohee; Kim, Jin-Ho; Kim, Kyong-O.
2015-04-01
Multi-azimuth (MAZ) 3D seismic exploration is one of the most advanced seismic survey methods for improving illumination and multiple attenuation, yielding better images of subsurface structures. 3D multi-channel seismic data were collected in two phases during 2012, 2013, and 2014 in the Jeju Basin, the northern part of the East China Sea Basin, where several oil and gas fields have been discovered. The phase 1 data, acquired at 135° and 315° azimuths in 2012 and 2013, comprise full 3D marine seismic coverage of 160 km². In 2014, the phase 2 data were acquired at azimuths of 45° and 225°, perpendicular to those of phase 1. The two datasets were processed through the same workflow prior to velocity analysis and merged into one MAZ dataset. We performed velocity analysis on the MAZ dataset as well as on the two phase datasets individually, and then stacked the three datasets separately. We were able to pick more accurate velocities in the MAZ dataset than in the phase 1 and 2 data. Consequently, the MAZ seismic volume provides better resolution and improved images, since different shooting directions illuminate different parts of the structures and stratigraphic features.
Optimized suppression of coherent noise from seismic data using the Karhunen-Loève transform
NASA Astrophysics Data System (ADS)
Montagne, Raúl; Vasconcelos, Giovani L.
2006-07-01
Signals obtained in land seismic surveys are usually contaminated with coherent noise, among which the ground roll (Rayleigh surface waves) is of major concern, for it can severely degrade the quality of the information obtained from the seismic record. This paper presents an optimized filter based on the Karhunen-Loève transform for processing seismic images contaminated with ground roll. In this method, the contaminated region of the seismic record, to be processed by the filter, is selected in such a way as to correspond to the maximum of a properly defined coherence index. The main advantages of the method are that the ground roll is suppressed with negligible distortion of the remnant reflection signals and that the filtering procedure can be automated. The image processing technique described in this study should also be relevant for other applications where coherent structures embedded in a complex spatiotemporal pattern need to be identified in a more refined way. In particular, it is argued that the method is appropriate for processing optical coherence tomography images whose quality is often degraded by coherent noise (speckle).
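A schematic eigenimage (Karhunen-Loève) filter of the kind the paper builds on: once the ground roll has been flattened, it dominates the leading singular vectors of the trace matrix and can be subtracted. The number of rejected eigenimages, the flattening step and the coherence-index selection of the processing window are not reproduced here.

```python
import numpy as np

def kl_filter(gather, n_reject=2):
    """Remove the n_reject most coherent eigenimages from a (traces x samples) gather.

    Assumes the coherent noise (e.g. flattened ground roll) maps onto the
    leading singular vectors; the remainder approximates the reflection signal."""
    u, s, vt = np.linalg.svd(gather, full_matrices=False)
    s_keep = s.copy()
    s_keep[:n_reject] = 0.0                      # zero the dominant eigenimages
    return u @ np.diag(s_keep) @ vt

# Hypothetical gather: coherent low-frequency noise plus incoherent background
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 500)
noise = np.outer(np.ones(24), np.sin(2 * np.pi * 8 * t))   # perfectly coherent after flattening
gather = noise + 0.3 * rng.normal(size=(24, 500))
filtered = kl_filter(gather, n_reject=1)
print(np.linalg.norm(filtered) < np.linalg.norm(gather))   # coherent energy removed
```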
Gas and seismicity within the Istanbul seismic gap.
Géli, L; Henry, P; Grall, C; Tary, J-B; Lomax, A; Batsi, E; Riboulot, V; Cros, E; Gürbüz, C; Işık, S E; Sengör, A M C; Le Pichon, X; Ruffine, L; Dupré, S; Thomas, Y; Kalafat, D; Bayrakci, G; Coutellier, Q; Regnier, T; Westbrook, G; Saritas, H; Çifçi, G; Çağatay, M N; Özeren, M S; Görür, N; Tryon, M; Bohnhoff, M; Gasperini, L; Klingelhoefer, F; Scalabrin, C; Augustin, J-M; Embriaco, D; Marinaro, G; Frugoni, F; Monna, S; Etiope, G; Favali, P; Bécel, A
2018-05-01
Understanding micro-seismicity is a critical question for earthquake hazard assessment. Since the devastating earthquakes of Izmit and Duzce in 1999, the seismicity along the submerged section of the North Anatolian Fault within the Sea of Marmara (comprising the "Istanbul seismic gap") has been extensively studied in order to infer its mechanical behaviour (creeping vs locked). So far, the seismicity has been interpreted only as tectonically driven, although the Main Marmara Fault (MMF) is known to strike across multiple hydrocarbon gas sources. Here, we show that a large number of the aftershocks that followed the M 5.1 earthquake of 25 July 2011 in the western Sea of Marmara occurred within a zone of gas overpressuring in the 1.5-5 km depth range, from where pressurized gas is expected to migrate along the MMF up to the surface sediment layers. Hence, gas-related processes should also be considered for a complete interpretation of the micro-seismicity (~M < 3) within the Istanbul offshore domain.
On-line Data Transmission, as Part of the Seismic Evaluation Process in the Buildings Field
NASA Astrophysics Data System (ADS)
Sorin Dragomir, Claudiu; Dobre, Daniela; Craifaleanu, Iolanda; Georgescu, Emil-Sever
2017-12-01
The thorough analytical modelling of seismic actions, of the structural system and of the foundation soil is essential for a proper dynamic analysis of a building. However, the models used should be validated, whenever possible, against results obtained from experimental investigations, building instrumentation and the monitoring of vibrations generated by various seismic or non-seismic sources. In Romania, the permanent seismic instrumentation/monitoring of buildings is part of a special follow-up activity, performed in accordance with the P130/1999 code for the time monitoring of building behaviour and with the seismic design code P100-2013. Using state-of-the-art equipment (GeoSIG and Kinemetrics digital accelerographs) in the seismic network of the National Institute for Research and Development URBAN-INCERC, the instrumented buildings can be monitored remotely, with recorded data sent to authorities or to research institutes in the field by a real-time data transmission system. The obtained records are processed, the Fourier amplitude spectra and the response spectra are computed, and the modal parameters of the buildings are determined. The paper presents some of the most important results of the institute in the field of building monitoring, focusing on several significant instrumented buildings located in different parts of the country. In addition, maps with data received from seismic stations after the occurrence of two recent Vrancea (Romania) earthquakes, showing the spatial distribution of ground accelerations, are presented, together with a comparative analysis performed with reference to previous studies in the literature.
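A compact sketch of the two processing products mentioned above for a recorded acceleration trace: the Fourier amplitude spectrum and a pseudo-acceleration response spectrum obtained by solving the single-degree-of-freedom oscillator in the frequency domain. Damping, periods and the synthetic record are illustrative, not the institute's processing chain.

```python
import numpy as np

def fourier_amplitude_spectrum(acc, dt):
    """One-sided Fourier amplitude spectrum of an acceleration record."""
    freqs = np.fft.rfftfreq(len(acc), dt)
    return freqs, np.abs(np.fft.rfft(acc)) * dt

def response_spectrum(acc, dt, periods, damping=0.05):
    """Pseudo-acceleration response spectrum via a frequency-domain SDOF solution."""
    n = len(acc)
    nfft = 4 * n                                   # padding to limit wrap-around
    ag = np.fft.rfft(acc, nfft)
    w = 2 * np.pi * np.fft.rfftfreq(nfft, dt)
    sa = []
    for T in periods:
        wn = 2 * np.pi / T
        h = -1.0 / (wn**2 - w**2 + 2j * damping * wn * w)   # relative-displacement transfer
        u = np.fft.irfft(h * ag, nfft)[:n]
        sa.append(wn**2 * np.max(np.abs(u)))       # pseudo-spectral acceleration
    return np.array(sa)

dt = 0.01
acc = np.random.randn(2000) * np.exp(-np.linspace(0, 5, 2000))   # toy accelerogram
f, fas = fourier_amplitude_spectrum(acc, dt)
sa = response_spectrum(acc, dt, periods=np.logspace(-1, 1, 20))
print(sa.max())
```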
Iterative Strategies for Aftershock Classification in Automatic Seismic Processing Pipelines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gibbons, Steven J.; Kvaerna, Tormod; Harris, David B.
We report that aftershock sequences following very large earthquakes present enormous challenges to the near-real-time generation of seismic bulletins. The increase in analyst resources needed to relocate an inflated number of events is compounded by failures of phase-association algorithms and a significant deterioration in the quality of the underlying, fully automatic event bulletins. Current processing pipelines were designed a generation ago and, due to computational limitations of the time, are usually limited to single passes over the raw data. With current processing capability, multiple passes over the data are feasible. Processing the raw data at each station currently generates parametric data streams that are then scanned by a phase-association algorithm to form event hypotheses. We consider the scenario in which a large earthquake has occurred and propose to define a region of likely aftershock activity in which events are detected and accurately located using a separate, specially targeted semiautomatic process. This effort may focus on so-called pattern detectors, but here we demonstrate a more general grid-search algorithm that may cover wider source regions without requiring waveform similarity. Given many well-located aftershocks within our source region, we may remove all associated phases from the original detection lists prior to a new iteration of the phase-association algorithm. We provide a proof-of-concept example for the 2015 Gorkha sequence, Nepal, recorded on seismic arrays of the International Monitoring System. Even with very conservative conditions for defining event hypotheses within the aftershock source region, we can automatically remove about half of the original detections that could have been generated by Nepal earthquakes and reduce the likelihood of false associations and spurious event hypotheses. Lastly, further reductions in the number of detections in the parametric data streams are likely, using correlation and subspace
Iterative Strategies for Aftershock Classification in Automatic Seismic Processing Pipelines
NASA Astrophysics Data System (ADS)
Gibbons, Steven J.; Kværna, Tormod; Harris, David B.; Dodge, Douglas A.
2016-04-01
Aftershock sequences following very large earthquakes present enormous challenges to near-realtime generation of seismic bulletins. The increase in analyst resources needed to relocate an inflated number of events is compounded by failures of phase association algorithms and a significant deterioration in the quality of underlying fully automatic event bulletins. Current processing pipelines were designed a generation ago and, due to computational limitations of the time, are usually limited to single passes over the raw data. With current processing capability, multiple passes over the data are feasible. Processing the raw data at each station currently generates parametric data streams which are then scanned by a phase association algorithm to form event hypotheses. We consider the scenario where a large earthquake has occurred and propose to define a region of likely aftershock activity in which events are detected and accurately located using a separate specially targeted semi-automatic process. This effort may focus on so-called pattern detectors, but here we demonstrate a more general grid search algorithm which may cover wider source regions without requiring waveform similarity. Given many well-located aftershocks within our source region, we may remove all associated phases from the original detection lists prior to a new iteration of the phase association algorithm. We provide a proof-of-concept example for the 2015 Gorkha sequence, Nepal, recorded on seismic arrays of the International Monitoring System. Even with very conservative conditions for defining event hypotheses within the aftershock source region, we can automatically remove over half of the original detections which could have been generated by Nepal earthquakes and reduce the likelihood of false associations and spurious event hypotheses. Further reductions in the number of detections in the parametric data streams are likely using correlation and subspace detectors and/or empirical matched
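A schematic version of the grid-search step described in the abstracts above: station characteristic functions are back-projected onto trial source nodes with a constant-velocity travel-time model and stacked, and the stack maximum serves as a detection statistic. Grid spacing, velocity and the characteristic functions are placeholders, not the authors' configuration.

```python
import numpy as np

def grid_search_stack(char_funcs, station_xy, grid_xy, fs, velocity_km_s=6.0):
    """Stack station characteristic functions after shifting by predicted travel times.

    char_funcs : (n_stations, n_samples) array, e.g. STA/LTA traces.
    Returns the maximum stack value per grid node (a detection statistic)."""
    n_sta, n_samp = char_funcs.shape
    best = np.zeros(len(grid_xy))
    for g, node in enumerate(grid_xy):
        shifts = (np.linalg.norm(station_xy - node, axis=1) / velocity_km_s * fs).astype(int)
        span = n_samp - shifts.max()
        stack = np.zeros(span)
        for s in range(n_sta):
            stack += char_funcs[s, shifts[s]:shifts[s] + span]   # align to a common origin time
        best[g] = stack.max() / n_sta
    return best

# Hypothetical: 4 stations around a 20 x 20 km grid, 100 Hz characteristic functions
rng = np.random.default_rng(2)
stations = np.array([[0, 0], [20, 0], [0, 20], [20, 20]], dtype=float)
grid = np.array([[x, y] for x in range(0, 21, 5) for y in range(0, 21, 5)], dtype=float)
cf = np.abs(rng.normal(size=(4, 6000)))
print(grid[np.argmax(grid_search_stack(cf, stations, grid, fs=100.0))])
```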
Huang, Wenzhu; Zhang, Wentao; Luo, Yingbo; Li, Li; Liu, Wenyi; Li, Fang
2018-04-16
A broadband optical fiber seismometer based on an FBG resonator is proposed for earthquake monitoring. The principle and the key technique, high-resolution ultralow-frequency wavelength interrogation by a dual-laser swept-frequency and beat-frequency method, are discussed and analyzed. From the simulation and test results, the seismometer works over a broadband range from 0.01 Hz to 10 Hz with a sensitivity better than 330 pm/g, and the wavelength resolution of the interrogation system is better than 0.001 pm/√Hz from 0.1 Hz to 10 Hz. A three-channel correlation method is used to measure the self-noise of the seismometer, which reaches a noise level of 2.7 × 10⁻⁷ m s⁻²/√Hz at 0.1 Hz, lower than the Earth's background noise (the new high noise model, NHNM). An earthquake monitoring experiment was conducted at a low-noise seismic station. The recorded seismic waves are analyzed, and the results suggest that the proposed seismometer can record both close microearthquakes and distant great earthquakes with a high signal-to-noise ratio (SNR). This is the first time that an FBG-based middle-to-long-period seismometer with a self-noise lower than the NHNM and a large dynamic range (100 dB) has been reported.
Real-time Microseismic Processing for Induced Seismicity Hazard Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matzel, Eric M.
Induced seismicity is inherently associated with underground fluid injections. If fluids are injected in proximity to a pre-existing fault or fracture system, the resulting elevated pressures can trigger dynamic earthquake slip, which could both damage surface structures and create new migration pathways. The goal of this research is to develop a fundamentally better approach to geological site characterization and early hazard detection. We combine innovative techniques for analyzing microseismic data with a physics-based inversion model to forecast microseismic cloud evolution. The key challenge is that faults at risk of slipping are often too small to detect during the site characterization phase. Our objective is to devise fast-running methodologies that will allow field operators to respond quickly to changing subsurface conditions.
NASA Astrophysics Data System (ADS)
Van Avendonk, H. J.; Magnani, M. B.; Shillington, D. J.; Gaherty, J. B.; Hornbach, M. J.; Dugan, B.; Long, M. D.; Lizarralde, D.; Becel, A.; Benoit, M. H.; Harder, S. H.; Wagner, L. S.; Christeson, G. L.
2014-12-01
The continental margins of the eastern United States formed in the Early Jurassic after the breakup of the supercontinent Pangea. The relationship between the timing of this rift episode and the occurrence of offshore magmatism, which is expressed in the East Coast Magnetic Anomaly, is still unknown. The possible influence of magmatism and existing lithospheric structure on the rifting processes along the margin of the eastern U.S. was one of the motivations to conduct a large-scale community seismic experiment at the Eastern North America (ENAM) GeoPRISMS focus site. In addition, there is a clear need for better high-resolution seismic data with shallow penetration on this margin to better understand the geological setting of submarine landslides. The ENAM community seismic experiment is a project in which a team of scientists will gather both active-source and earthquake seismic data in the vicinity of Cape Hatteras, on a 500 km wide section of the margin offshore North Carolina and Virginia. The timing of data acquisition in 2014 and 2015 facilitates leveraging of other geophysical data acquisition programs, such as EarthScope's Transportable Array and the USGS marine seismic investigation of the continental shelf. In April of 2014, 30 broadband ocean-bottom seismometers were deployed on the shelf, slope and abyssal plain of the study site. These instruments will record earthquakes for one year, which will help future seismic imaging of the deeper lithosphere beneath the margin. In September and October of 2014, regional marine seismic reflection and refraction data will be gathered with the seismic vessel R/V Marcus Langseth, and airgun shots will also be recorded on land to provide data coverage across the shoreline. Lastly, in the summer of 2015, a land explosion seismic refraction study will provide constraints on the crustal structure in the adjacent coastal plain of North Carolina and Virginia. All seismic data will be distributed to the community through IRIS
NASA Astrophysics Data System (ADS)
Bauer, K.; Muñoz, G.; Moeck, I.
2012-12-01
The combined interpretation of different models as derived from seismic tomography and magnetotelluric (MT) inversion represents a more efficient approach to determine the lithology of the subsurface compared with the separate treatment of each discipline. Such models can be developed independently or by application of joint inversion strategies. After the step of model generation using different geophysical methodologies, a joint interpretation work flow includes the following steps: (1) adjustment of a joint earth model based on the adapted, identical model geometry for the different methods, (2) classification of the model components (e.g. model blocks described by a set of geophysical parameters), and (3) re-mapping of the classified rock types to visualise their distribution within the earth model, and petrophysical characterization and interpretation. One possible approach for the classification of multi-parameter models is based on statistical pattern recognition, where different models are combined and translated into probability density functions. Classes of rock types are identified in these methods as isolated clusters with high probability density function values. Such techniques are well-established for the analysis of two-parameter models. Alternatively we apply self-organizing map (SOM) techniques, which have no limitations in the number of parameters to be analysed in the joint interpretation. Our SOM work flow includes (1) generation of a joint earth model described by so-called data vectors, (2) unsupervised learning or training, (3) analysis of the feature map by adopting image processing techniques, and (4) application of the knowledge to derive a lithological model which is based on the different geophysical parameters. We show the usage of the SOM work flow for a synthetic and a real data case study. Both tests rely on three geophysical properties: P velocity and vertical velocity gradient from seismic tomography, and electrical resistivity
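A minimal self-organizing-map training loop over joint data vectors (here three hypothetical standardized properties, e.g. P velocity, vertical velocity gradient and log resistivity); the map size, decay schedule and synthetic data are illustrative and not those of the case studies.

```python
import numpy as np

def train_som(data, map_shape=(8, 8), n_iter=3000, lr0=0.5, sigma0=3.0, seed=0):
    """Unsupervised SOM training: find the best-matching unit and pull its
    neighbourhood towards each randomly drawn data vector."""
    rng = np.random.default_rng(seed)
    rows, cols = map_shape
    weights = rng.normal(size=(rows, cols, data.shape[1]))
    grid_r, grid_c = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
    for it in range(n_iter):
        x = data[rng.integers(len(data))]
        dist = np.linalg.norm(weights - x, axis=2)
        br, bc = np.unravel_index(np.argmin(dist), dist.shape)     # best-matching unit
        lr = lr0 * np.exp(-it / n_iter)                            # decaying learning rate
        sigma = sigma0 * np.exp(-it / n_iter)                      # shrinking neighbourhood
        neigh = np.exp(-((grid_r - br) ** 2 + (grid_c - bc) ** 2) / (2 * sigma**2))
        weights += lr * neigh[..., None] * (x - weights)
    return weights

# Hypothetical standardized data vectors: [Vp, dVp/dz, log10(resistivity)]
data = np.random.default_rng(3).normal(size=(1000, 3))
som = train_som(data)
print(som.shape)            # the trained feature map, to be clustered into rock classes
```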
NASA Astrophysics Data System (ADS)
Walton, M. A. L.; Gulick, S. P. S.; Haeussler, P. J.; Rohr, K.; Roland, E. C.; Trehu, A. M.
2014-12-01
The Queen Charlotte Fault (QCF) is an obliquely convergent strike-slip system that accommodates offset between the Pacific and North America plates in southeast Alaska and western Canada. Two recent earthquakes, including a M7.8 thrust event near Haida Gwaii on 28 October 2012, have sparked renewed interest in the margin and led to further study of how convergent stress is accommodated along the fault. Recent studies have looked in detail at offshore structure, concluding that a change in strike of the QCF at ~53.2 degrees north has led to significant differences in stress and the style of strain accommodation along-strike. We provide updated fault mapping and seismic images to supplement and support these results. One of the highest-quality seismic reflection surveys along the Queen Charlotte system to date, EW9412, was shot aboard the R/V Maurice Ewing in 1994. The survey was last processed to post-stack time migration for a 1999 publication. Due to heightened interest in high-quality imaging along the fault, we have completed updated processing of the EW9412 seismic reflection data and provide prestack migrations with water-bottom multiple reduction. Our new imaging better resolves fault and basement surfaces at depth, as well as the highly deformed sediments within the Queen Charlotte Terrace. In addition to re-processing the EW9412 seismic reflection data, we have compiled and re-analyzed a series of publicly available USGS seismic reflection data that obliquely cross the QCF. Using these data, we are able to provide updated maps of the Queen Charlotte fault system, adding considerable detail along the northernmost QCF where it links up with the Chatham Strait and Transition fault systems. Our results support conclusions that the changing geometry of the QCF leads to fundamentally different convergent stress accommodation north and south of ~53.2 degrees; namely, reactivated splay faults to the north vs. thickening of sediments and the upper crust to the south
Marked point process for modelling seismic activity (case study in Sumatra and Java)
NASA Astrophysics Data System (ADS)
Pratiwi, Hasih; Sulistya Rini, Lia; Wayan Mangku, I.
2018-05-01
An earthquake is a natural phenomenon that is random and irregular in space and time. The occurrence of earthquakes at a given location is still difficult to forecast, so the development of earthquake forecasting methodology continues, from both the seismological and the stochastic point of view. To describe such random phenomena, in both space and time, a point process approach can be used. There are two types of point processes: temporal point processes and spatial point processes. A temporal point process relates to events observed over time as a sequence of times, whereas a spatial point process describes the locations of objects in two- or three-dimensional space. The points of a point process can be labelled with additional information called marks; a marked point process can be considered as a pair (x, m), where x is the location of the point and m is the mark attached to it. This study aims to model a marked point process indexed by time using earthquake data from Sumatra Island and Java Island. The model can be used to analyse seismic activity through its intensity function, conditioned on the history of the process up to time t. Based on data obtained from the U.S. Geological Survey from 1973 to 2017 with a magnitude threshold of 5, we obtained maximum likelihood estimates for the parameters of the intensity function. The estimated model parameters show that the seismic activity in Sumatra Island is greater than in Java Island.
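To illustrate maximum likelihood estimation of an intensity function that conditions on the history of the process, the sketch below fits a self-exciting (Hawkes-type) temporal intensity λ(t) = μ + α Σ exp(−β(t − t_i)); this is a simplified, unmarked stand-in for the model of the study, fitted to a synthetic catalogue.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_likelihood(params, times, T):
    """Negative log-likelihood of a Hawkes process with exponential kernel."""
    mu, alpha, beta = np.exp(params)                     # keep parameters positive
    lam = np.empty(len(times))
    for i, t in enumerate(times):
        lam[i] = mu + alpha * np.sum(np.exp(-beta * (t - times[:i])))
    integral = mu * T + (alpha / beta) * np.sum(1.0 - np.exp(-beta * (T - times)))
    return -(np.sum(np.log(lam)) - integral)

# Synthetic catalogue: sorted event times (days) over a 1000-day window
rng = np.random.default_rng(4)
times = np.sort(rng.uniform(0, 1000, size=300))
T = 1000.0
res = minimize(neg_log_likelihood, x0=np.log([0.2, 0.1, 0.5]), args=(times, T),
               method="Nelder-Mead")
mu, alpha, beta = np.exp(res.x)
print(mu, alpha, beta)       # fitted background rate and triggering parameters
```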
Seismic lateral prediction in chalky limestone reservoirs offshore Qatar
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rubbens, I.B.H.M.; Murat, R.C.; Vankeulen, J.
Following the discovery of non-structurally trapped oil accumulations in Cretaceous chalky reservoirs on the northern flank of the North Dome offshore Qatar, a seismic lateral prediction study was carried out for Qatar General Petroleum Corporation (Offshore Operations). The objectives of this study were to assist in the appraisal of these oil accumulations by predicting their possible lateral extent and to investigate whether the technique applied could be used as a basis for further exploration of similar oil prospects in the area. Wireline logs from eight wells and some 1000 km of high-quality seismic data were processed into acoustic impedance (A.I.) logs and seismic A.I. sections. Having obtained a satisfactory match between the A.I. well logs and the A.I. of the seismic traces at the well locations, relationships were established from the well log data which allowed the interpretation of the seismic A.I. in terms of reservoir quality. Measurements of the relevant A.I. characteristics were then carried out by computer along all seismic lines, and porosity distribution maps were prepared for some of the reservoirs. These maps, combined with detailed seismic depth contour maps at reservoir tops, led to the definition of good reservoir development areas downdip from poor reservoir quality zones, i.e. of the stratigraphic trap areas, and drilling locations could thus be proposed. The system remains to be adequately calibrated when core material becomes available in the area of study.
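The core log transform behind such studies is simple: acoustic impedance is the product of density and velocity, and normal-incidence reflectivity follows from impedance contrasts. The sketch below uses hypothetical log values; it is not the processing flow of this study.

```python
import numpy as np

def acoustic_impedance(density_g_cc, velocity_m_s):
    """A.I. = rho * Vp (here g/cc times m/s; consistent units are what matter)."""
    return density_g_cc * velocity_m_s

def reflectivity(ai):
    """Normal-incidence reflection coefficients between successive layers."""
    return (ai[1:] - ai[:-1]) / (ai[1:] + ai[:-1])

# Hypothetical wireline-log samples: chalky interval with a more porous zone in the middle
rho = np.array([2.45, 2.40, 2.20, 2.22, 2.50])      # g/cc
vp = np.array([3800, 3700, 3100, 3150, 4000])       # m/s
ai = acoustic_impedance(rho, vp)
print(reflectivity(ai))   # the series that, convolved with a wavelet, gives the synthetic trace
```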
NASA Astrophysics Data System (ADS)
Cochran, E. S.; Ellsworth, W. L.; Llenos, A. L.; Rubinstein, J. L.
2014-12-01
In this presentation, we outline the USGS response to dramatically increased earthquake activity in the central and eastern US, with a focus on Oklahoma. Using the November 2011 Prague, OK earthquake sequence as an example, we describe the tensions between the need to conduct thorough scientific investigations while providing timely information to local, state, and federal government agencies, and the public. In the early morning hours of November 5, 2011 a M4.8 earthquake struck near the town of Prague, Oklahoma and was followed by a M5.6 earthquake just over 20 hours later. The mainshock was widely felt across the central US, causing damage to homes close to the epicenter and injuring at least 2 people. Within hours of the initial event several portable instruments were installed and following the mainshock a larger seismic deployment was mounted (Keranen et al., 2013). A sizeable earthquake in the central or eastern US is always of scientific interest due to the dearth of seismic data available for assessing seismic hazard. The Prague sequence garnered especially strong scientific and public interest when a link between the sequence and injection of wastewater at several local deep wells was postulated. Therefore, there was a need to provide immediate information as it became available. However, in the first few days to months it was impossible to confidently confirm or refute whether the seismicity was linked to injection, but it was known that the foreshock occurred close to several deep injection wells and many of the events were shallow; thus, the sequence warranted further study. Over the course of the next few years, several studies built the case that the Prague sequence was likely induced by wastewater injection (Keranen et al., 2013; Sumy et al., 2014; McGarr, 2014) and additional studies suggested that the changes in seismicity throughout Oklahoma were not due to natural variations in seismicity rates (Llenos and Michael, 2013; Ellsworth, 2013). These
Demonstration of improved seismic source inversion method of tele-seismic body wave
NASA Astrophysics Data System (ADS)
Yagi, Y.; Okuwaki, R.
2017-12-01
Seismic rupture inversion of tele-seismic body waves has been widely applied to studies of large earthquakes. In general, tele-seismic body waves contain information on the overall rupture process of a large earthquake, but have been considered inappropriate for analyzing the detailed rupture process of an M 6-7 class earthquake. Recently, the quality and quantity of tele-seismic data and the inversion method have been greatly improved. The improved data and method enable us to study the detailed rupture process of an M 6-7 class earthquake even if we use only tele-seismic body waves. In this study, we demonstrate the ability of the improved data and method through analyses of the 2016 Rieti, Italy earthquake (Mw 6.2) and the 2016 Kumamoto, Japan earthquake (Mw 7.0), which have been well investigated using InSAR data sets and field observations. We assumed that the rupture occurred on a single fault plane inferred from the moment tensor solutions and the aftershock distribution, and constructed spatiotemporally discretized slip-rate functions with patches arranged as closely as possible. We performed inversions using several fault models and found that the spatiotemporal location of the large slip-rate area was robust. For the 2016 Kumamoto, Japan earthquake, the slip-rate distribution shows that the rupture propagated to the southwest during the first 5 s; at 5 s after the origin time, the main rupture started to propagate toward the northeast. The first and second episodes correspond to rupture propagation along the Hinagu fault and the Futagawa fault, respectively. For the 2016 Rieti, Italy earthquake, the slip-rate distribution shows that the rupture propagated in the up-dip direction during the first 2 s and then propagated toward the northwest. From both analyses, we propose that the spatiotemporal slip-rate distribution estimated by the improved inversion method for tele-seismic body waves contains enough information to study the detailed rupture process of an M 6-7 class earthquake.
Wave equation datuming applied to S-wave reflection seismic data
NASA Astrophysics Data System (ADS)
Tinivella, U.; Giustiniani, M.; Nicolich, R.
2018-05-01
S-wave high-resolution reflection seismic data were processed using the Wave Equation Datuming technique in order to improve the signal-to-noise ratio, attenuate coherent noise, improve seismic resolution, and solve static correction problems. The application of this algorithm yielded a good image of the shallow subsurface geological features. Wave Equation Datuming moves shots and receivers from one surface to another datum (the datum plane), removing time shifts caused by elevation variations and/or velocity changes in the shallow subsoil. The algorithm was developed for, and is currently applied to, P waves, but it proves capable of enhancing S-wave images when used to resolve thin layers in high-resolution prospecting. A good S-wave image facilitates correlation with well stratigraphies, optimizing the cost/benefit ratio of any drilling. The application of Wave Equation Datuming requires a reliable velocity field, so refraction tomography was adopted. The new seismic image highlights the details of the subsoil reflectors and allows easier integration with borehole information and geological surveys than the seismic section obtained by conventional CMP reflection processing. In conclusion, the analysis of S-waves made it possible to characterize the shallow subsurface, recognizing levels of limited thickness once ground roll, wind and environmental noise had been clearly attenuated.
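Wave Equation Datuming can be implemented in several ways; the sketch below shows the simplest variant, a constant-velocity phase-shift extrapolation of a (time x offset) record to a new datum in the frequency-wavenumber domain, with evanescent energy muted. Velocity, datum shift and sign conventions are schematic, and this is not the algorithm actually used in the paper.

```python
import numpy as np

def phase_shift_datum(record, dt, dx, dz, velocity):
    """Continue a (time x offset) record by dz with a constant velocity.

    Works in the (frequency, wavenumber) domain: U(z+dz) = U(z) * exp(i*kz*dz),
    with kz = sign(w) * sqrt(w^2/v^2 - kx^2); evanescent components are zeroed.
    Whether this redatums up or down depends on the chosen FFT/time conventions."""
    nt, nx = record.shape
    spec = np.fft.fft2(record)
    w = 2 * np.pi * np.fft.fftfreq(nt, dt)[:, None]
    kx = 2 * np.pi * np.fft.fftfreq(nx, dx)[None, :]
    kz2 = (w / velocity) ** 2 - kx**2
    propagating = kz2 > 0
    kz = np.sign(w) * np.sqrt(np.where(propagating, kz2, 0.0))   # keep conjugate symmetry
    spec = np.where(propagating, spec * np.exp(1j * kz * dz), 0.0)
    return np.real(np.fft.ifft2(spec))

# Hypothetical shallow S-wave record: 0.5 s x 48 channels, datum shifted by 10 m
rec = np.random.randn(500, 48)
datumed = phase_shift_datum(rec, dt=0.001, dx=2.0, dz=10.0, velocity=400.0)
print(datumed.shape)
```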
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michael G. Waddell; William J. Domoracki; Jerome Eyer
2003-01-01
The Earth Sciences and Resources Institute, University of South Carolina is conducting a proof-of-concept study to determine the location and distribution of subsurface DNAPL carbon tetrachloride (CCl4) contamination at the 216-Z-9 crib, 200 West Area, DOE Hanford Site, Washington, by use of two-dimensional high-resolution seismic reflection surveys and borehole geophysical data. The study makes use of recent advances in seismic reflection amplitude-versus-offset (AVO) technology to directly detect the presence of subsurface DNAPL. The techniques proposed are noninvasive means of site characterization and direct free-phase DNAPL detection. This final report covers the results of Tasks 1, 2, and 3. Task 1 comprises the site evaluation and seismic modeling studies. The site evaluation consisted of identifying and collecting preexisting geological and geophysical information regarding subsurface structure and the presence and quantity of DNAPL. The seismic modeling studies were undertaken to determine the likelihood that an AVO response exists and its probable manifestation. Task 2 is the design and acquisition of 2-D seismic reflection data to image areas of probable high concentration of DNAPL. Task 3 is the processing and interpretation of the 2-D data. During the commission of these tasks, four seismic reflection profiles were collected. Subsurface velocity information was obtained by vertical seismic profile surveys in three wells. The interpretation of these data is in two parts. Part one is the construction and interpretation of structural contour maps of the contact between the Hanford Fine unit and the underlying Plio/Pleistocene unit, and of the contact between the Plio/Pleistocene unit and the underlying caliche layer. These two contacts were determined to be the surfaces most likely to contain the highest concentration of CCl4. Part two of the interpretation uses the results of the AVO modeling to locate any seismic amplitude anomalies that
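The AVO modelling referred to above typically starts from a linearized reflection-coefficient approximation; a minimal two-term Shuey sketch is given below with hypothetical elastic properties for a brine-filled versus a DNAPL-bearing layer (the values are illustrative only, not the Hanford site model).

```python
import numpy as np

def shuey_two_term(vp1, vs1, rho1, vp2, vs2, rho2, angles_deg):
    """Two-term Shuey approximation R(theta) = R0 + G * sin^2(theta)."""
    dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
    vp, vs, rho = (vp1 + vp2) / 2, (vs1 + vs2) / 2, (rho1 + rho2) / 2
    r0 = 0.5 * (dvp / vp + drho / rho)                                        # intercept
    g = 0.5 * dvp / vp - 2 * (vs / vp) ** 2 * (drho / rho + 2 * dvs / vs)     # gradient
    theta = np.radians(angles_deg)
    return r0 + g * np.sin(theta) ** 2

# Hypothetical overburden over a sediment layer, brine-filled vs DNAPL-bearing
angles = np.arange(0, 41, 5)
brine = shuey_two_term(2000, 900, 2.00, 2200, 1000, 2.05, angles)
dnapl = shuey_two_term(2000, 900, 2.00, 2050, 980, 2.10, angles)
print(np.round(brine - dnapl, 4))     # the angle-dependent difference an AVO survey targets
```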
Seismicity and magmatic processes in the Rwenzori region of the Albertine Rift.
NASA Astrophysics Data System (ADS)
Lindenfeld, Michael; Rümpker, Georg; Kasereka, Celestin M.; Batte, Arthur; Schumann, Andreas
2013-04-01
In this presentation we summarize results from two extensive seismic field studies with temporary station networks in the Rwenzori region of the Albertine Rift, located at the border between Uganda and the Democratic Republic of Congo. The first network operated from February 2006 to September 2007 and consisted of 27 seismic stations deployed in the Ugandan part of the area. A second network of 33 stations was operated between October 2009 and October 2011; it traversed the whole rift segment from the eastern rift shoulder in Uganda to the western shoulder in the D.R. Congo, covering the whole Rwenzori region. The data analysis revealed pronounced local earthquake activity in this area, with an average rate of more than 800 events per month, and shows that this segment of the Albertine Rift belongs to the seismically most active regions of the whole East African Rift System. The earthquake distribution is highly heterogeneous. The highest activity is observed in the northeastern part of the Rwenzori area, where the mountains are connected to the eastern rift shoulder, whereas they are surrounded by rift segments elsewhere. We were able to locate seismicity bursts with more than 300 events per day. The depth extent of seismicity ranges from 20 to 39 km and correlates well with Moho depths derived from teleseismic receiver functions. The majority of the derived fault plane solutions exhibit normal faulting with WNW-ESE oriented T-axes, perpendicular to the rift axis and in good agreement with kinematic rift models. The area of highest seismic activity is also characterized by several vertically elongated earthquake clusters in the crust. From petrological considerations we presume that these events are triggered by fluids and gases originating from a magmatic source below the crust. The existence of a magmatic source within the lithosphere is supported by the detection of mantle earthquakes at about 40 - 60 km
NASA Astrophysics Data System (ADS)
Genzano, Nicola; Filizzola, Carolina; Hattori, Katsumi; Lisi, Mariano; Paciello, Rossana; Pergola, Nicola; Tramutoli, Valerio
2016-04-01
Since the 1980s, fluctuations of the Earth's thermally emitted radiation, measured by satellite sensors operating in the thermal infrared (TIR) spectral range, have been associated with the complex process of preparation for major earthquakes. But, like other claimed earthquake precursors (seismological, physical, chemical, biological, etc.), they have long been considered with some caution by the scientific community. The lack of a rigorous definition of anomalous TIR signal fluctuations, and the scarce attention paid to the possibility that causes other than seismic activity (e.g. meteorological) could be responsible for the observed TIR variations, were the main reasons for such skepticism. Compared with previously proposed approaches, the general change-detection approach named Robust Satellite Techniques (RST) showed a good ability to discriminate anomalous TIR signals possibly associated with seismic activity from the normal variability of the TIR signal due to other causes. Thanks to its full exportability to different satellite packages, since 2001 RST has been implemented on TIR images acquired by polar (e.g. NOAA-AVHRR, EOS-MODIS) and geostationary (e.g. MSG-SEVIRI, NOAA-GOES/W, GMS-5/VISSR) satellite sensors, in order to verify the presence (or absence) of TIR anomalies in the presence (absence) of earthquakes (with M>4) in different seismogenic areas around the world (e.g. Italy, Greece, Turkey, India, Taiwan). In this paper, the RST data analysis approach has been implemented on TIR satellite records collected over Japan by the geostationary satellite sensor MTSAT (Multifunctional Transport SATellites), and the RETIRA (Robust Estimator of TIR Anomalies) index was used to identify Significant Sequences of TIR Anomalies (SSTAs) in possible space-time relation with seismic events. The achieved results are discussed in the perspective of a multi-parametric approach for a time-Dependent Assessment of Seismic Hazard (t-DASH).
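The RST approach judges each pixel against its own historical behaviour; the sketch below computes a simplified, RETIRA-like normalized excess in which the scene-average signal at each acquisition time is removed and the residual is scaled by that pixel's multi-year mean and standard deviation. The actual RST/RETIRA definition involves additional steps (homogeneous observation conditions, cloud screening), so this is schematic only.

```python
import numpy as np

def tir_anomaly_index(tir_stack):
    """tir_stack: (n_times, ny, nx) brightness temperatures for comparable acquisitions
    over many years. Returns the normalized excess for the last image."""
    # remove the scene-wide mean at each time (suppresses season/acquisition effects)
    dT = tir_stack - tir_stack.mean(axis=(1, 2), keepdims=True)
    mu = dT[:-1].mean(axis=0)            # pixel-wise historical mean of the excess
    sigma = dT[:-1].std(axis=0) + 1e-6   # pixel-wise historical variability
    return (dT[-1] - mu) / sigma         # values of roughly > 2-3 flag a TIR anomaly

stack = np.random.default_rng(5).normal(290, 3, size=(40, 60, 60))   # synthetic TIR scenes
index = tir_anomaly_index(stack)
print((index > 3).sum(), "anomalous pixels in the latest scene")
```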
Pick- and waveform-based techniques for real-time detection of induced seismicity
NASA Astrophysics Data System (ADS)
Grigoli, Francesco; Scarabello, Luca; Böse, Maren; Weber, Bernd; Wiemer, Stefan; Clinton, John F.
2018-05-01
The monitoring of induced seismicity is a common operation in many industrial activities, such as conventional and non-conventional hydrocarbon production or mining and geothermal energy exploitation, to cite a few. During such operations, we generally collect very large and strongly noise-contaminated data sets that require robust and automated analysis procedures. Induced seismicity data sets are often characterized by sequences of multiple events with short interevent times or overlapping events; in these cases, pick-based location methods may struggle to correctly assign picks to phases and events, and errors can lead to missed detections and/or reduced location resolution and incorrect magnitudes, which can have significant consequences if real-time seismicity information is used in risk assessment frameworks. To overcome these issues, different waveform-based methods for the detection and location of microseismicity have been proposed. The main advantage of waveform-based methods is that they appear to perform better and can simultaneously detect and locate seismic events, providing high-quality locations in a single step, while the main disadvantage is that they are computationally expensive. Although these methods have been applied to different induced seismicity data sets, an extensive comparison with sophisticated pick-based detection methods is still missing. In this work, we introduce our improved waveform-based detector and compare its performance with two pick-based detectors implemented within the SeisComP3 software suite. We test the performance of these three approaches on both synthetic and real data sets related to the induced seismicity sequence at the deep geothermal project in the vicinity of the city of St. Gallen, Switzerland.
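For contrast with the pick-based detectors, a minimal waveform-based building block is sketched below: a normalized cross-correlation (matched-filter) scan of continuous data against a single template event. It illustrates only the detection step, not the simultaneous detection-and-location performed by the full waveform-based methods discussed in the abstract.

```python
import numpy as np

def matched_filter(continuous, template):
    """Normalized cross-correlation of a template against continuous data."""
    n = len(template)
    tpl = (template - template.mean()) / (template.std() * n)
    cc = np.empty(len(continuous) - n + 1)
    for i in range(len(cc)):
        win = continuous[i:i + n]
        denom = win.std()
        cc[i] = 0.0 if denom == 0 else np.sum(tpl * (win - win.mean())) / denom
    return cc

rng = np.random.default_rng(6)
template = rng.normal(size=200)
data = rng.normal(size=20000) * 0.5
data[8000:8200] += template                      # hide one repeat of the template
detections = np.where(matched_filter(data, template) > 0.8)[0]
print(detections)                                # should point near sample 8000
```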
Response in thermal neutrons intensity on the activation of seismic processes
NASA Astrophysics Data System (ADS)
Antonova, Valentina; Chubenko, Alexandr; Kryukov, Sergey; Lutsenko, Vadim
2017-04-01
Results of a study of the intensity of thermal and high-energy neutrons during periods of enhanced seismic activity are presented. The installations are located close to a fault in the Earth's crust at the high-altitude cosmic ray station (3340 m above sea level, 20 km from Almaty) in the mountains of the Northern Tien Shan. In the absence of seismic activity, a high correlation and similarity of responses to changes in space-weather and geophysical conditions are observed between the data from the thermal neutron detectors and the data from the standard neutron monitor, which records the intensity of high-energy particles. These results confirm the genetic connection of thermal neutrons at the Earth's surface with high-energy neutrons of galactic origin and suggest common sources of the disturbances of their flux. However, observations and analysis of the experimental data during periods of enhanced seismic activity show a frequent breakdown of the correlation between the intensities of thermal and high-energy neutrons, and an absence of similarity between their variations during these periods. We suppose that the cause of this phenomenon is an additional thermal neutron flux of lithospheric origin which appears under these conditions. A method of separating the thermal neutron intensity variations of lithospheric origin from the variations of neutrons generated in the atmosphere is proposed. We used this method to analyse the variations of thermal neutron intensity during earthquakes (with intensity ≥ 3b) in the vicinity of Almaty in 2006-2015. An increase of the lithospheric thermal neutron flux during the activation of seismic processes was observed for 60% of the events; however, before the earthquake the increase of the thermal neutron flux is observed for only 25-30% of the events. It is shown that the amplitude of the additional thermal neutron flux from the Earth's crust is equal to 5-7% of the background level.
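One way to read the separation step described above is as a regression of the thermal-neutron rate on the high-energy (neutron-monitor) rate, with the residual taken as the candidate lithospheric component; the sketch below illustrates that idea on synthetic counts and does not claim to reproduce the authors' method.

```python
import numpy as np

def lithospheric_residual(thermal_counts, monitor_counts):
    """Remove the atmospheric/galactic part of the thermal-neutron rate by
    least-squares regression on the neutron-monitor rate; the residual is the
    candidate lithospheric contribution."""
    A = np.column_stack([monitor_counts, np.ones_like(monitor_counts)])
    coeffs, *_ = np.linalg.lstsq(A, thermal_counts, rcond=None)
    return thermal_counts - A @ coeffs

rng = np.random.default_rng(7)
monitor = 100 + rng.normal(0, 3, size=1000)                 # hourly neutron-monitor rate
thermal = 0.8 * monitor + rng.normal(0, 1, size=1000)
thermal[600:650] += 4.0                                     # injected "lithospheric" excess
residual = lithospheric_residual(thermal, monitor)
print(residual[600:650].mean(), residual[:600].mean())      # excess stands out in the residual
```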
Nonlinear dynamic failure process of tunnel-fault system in response to strong seismic event
NASA Astrophysics Data System (ADS)
Yang, Zhihua; Lan, Hengxing; Zhang, Yongshuang; Gao, Xing; Li, Langping
2013-03-01
Strong earthquakes and faults have a significant effect on the stability of underground tunnel structures. This study used a 3-dimensional discrete element model and real ground motion records from the Wenchuan earthquake to investigate the dynamic response of a tunnel-fault system. The typical tunnel-fault system was composed of one planned railway tunnel and one seismically active fault. The discrete numerical model was carefully calibrated by comparing the field survey and numerical results for ground motion. It was then used to examine detailed quantitative information on the dynamic response characteristics of the tunnel-fault system, including the stress distribution, strain, vibration velocity and tunnel failure process. The intensive tunnel-fault interaction during seismic loading induces dramatic stress redistribution and stress concentration at the intersection of the tunnel and the fault. The tunnel-fault system behavior is characterized by a complicated nonlinear dynamic failure process in response to a real strong seismic event. It can be qualitatively divided into 5 main stages in terms of its stress, strain and rupturing behaviors: (1) strain localization, (2) rupture initiation, (3) rupture acceleration, (4) spontaneous rupture growth and (5) stabilization. This study provides insight for the further stability assessment of underground tunnel structures under the combined effect of strong earthquakes and faults.
Signal Quality and the Reliability of Seismic Observations
NASA Astrophysics Data System (ADS)
Zeiler, C. P.; Velasco, A. A.; Pingitore, N. E.
2009-12-01
The ability to detect, time and measure seismic phases depends on the location, size, and quality of the recorded signals. Additional constraints are an analyst's familiarity with a seismogenic zone and with the seismic stations that record the energy. The quantification and qualification of an analyst's ability to detect, time and measure seismic signals have not been calculated or fully assessed. The fundamental measurement for computing the accuracy of a seismic measurement is the signal quality. Several methods have been proposed to measure signal quality; however, the signal-to-noise ratio (SNR), computed as a short-term average over a long-term average, has been widely adopted. While the standard SNR is an easy and computationally inexpensive measure, its overall statistical significance has not been computed for seismic measurement analysis. The prospect of canonizing the process of cataloging seismic arrivals hinges on the ability to repeat measurements made by different methods and analysts. The first step in canonizing phase measurements was taken by the IASPEI, which established a reference for accepted practices in naming seismic phases. The New Manual of Seismological Observatory Practice (NMSOP, 2002) outlines key observations for seismic phases recorded at different distances and proposes to quantify timing uncertainty with a user-specified windowing technique. However, this added measurement would not completely remove the bias introduced by the different techniques used by analysts to time seismic arrivals. The general guideline for timing a seismic arrival is to record the time at which a noted change in frequency and/or amplitude begins. This is generally achieved by enhancing the arrivals through filtering or beam forming. However, these enhancements can alter the characteristics of the arrival and how the arrival will be measured. Furthermore, each enhancement has user-specified parameters that can vary between analysts, and this results in a reduced ability to repeat
Fast principal component analysis for stacking seismic data
NASA Astrophysics Data System (ADS)
Wu, Juan; Bai, Min
2018-04-01
Stacking seismic data plays an indispensable role in many steps of the seismic data processing and imaging workflow. Optimal stacking of seismic data can help mitigate seismic noise and enhance the principal components to a great extent. Traditional average-based seismic stacking methods cannot achieve optimal performance when the ambient noise is extremely strong. We propose a principal component analysis (PCA) algorithm for stacking seismic data that is not sensitive to the noise level. Considering the computational bottleneck of the classic PCA algorithm when processing massive seismic data, we propose an efficient PCA algorithm to make the method readily applicable for industrial applications. Two numerically designed examples and one real seismic data set are used to demonstrate the performance of the presented method.
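The idea of a PCA stack, and a cheap way to compute it, can be illustrated with power iteration on the small inter-trace covariance matrix, which yields the leading principal component without a full eigendecomposition; the gather, iteration count and weighting below are illustrative and not the algorithm of the paper.

```python
import numpy as np

def pca_stack(gather, n_iter=50, seed=0):
    """Approximate the PCA stack of a (traces x samples) gather with power iteration.

    The stacked trace is the projection of the data onto the first principal
    component of the inter-trace covariance (an amplitude-weighted stack)."""
    rng = np.random.default_rng(seed)
    d = gather - gather.mean(axis=1, keepdims=True)
    cov = d @ d.T                                  # small (n_traces x n_traces) matrix
    w = rng.normal(size=cov.shape[0])
    for _ in range(n_iter):                        # power iteration for the top eigenvector
        w = cov @ w
        w /= np.linalg.norm(w)
    if w.sum() < 0:                                # fix the arbitrary sign of the eigenvector
        w = -w
    return w @ d                                   # weighted stack along the traces

rng = np.random.default_rng(8)
signal = np.sin(2 * np.pi * 25 * np.linspace(0, 1, 1000))
gather = signal + 2.0 * rng.normal(size=(40, 1000))          # very noisy copies of one event
print(np.corrcoef(pca_stack(gather), signal)[0, 1])          # correlation with the clean signal
```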
Seismic databases of The Caucasus
NASA Astrophysics Data System (ADS)
Gunia, I.; Sokhadze, G.; Mikava, D.; Tvaradze, N.; Godoladze, T.
2012-12-01
The Caucasus is one of the active segments of the Alpine-Himalayan collision belt. The region needs continuous seismic monitoring systems for a better understanding of the tectonic processes going on in the region. The Seismic Monitoring Center of Georgia (Ilia State University) operates the digital seismic network of the country and also collects and exchanges data with neighboring countries. The main focus of our study was to create a seismic database that is well organized, easily accessible and convenient for scientists to use. The seismological database includes information about more than 100,000 earthquakes from the whole Caucasus, including data from both analog and digital seismic networks. The first analog seismic station in Georgia and the Caucasus was installed in 1899 in Tbilisi. The number of analog seismic stations increased during the following decades, and in the 1980s about 100 analog stations were operating all over the region. From 1992, due to the political and economic situation, the number of stations decreased, and by 2002 just two analog instruments were operating. A new digital seismic network has been developed in Georgia since 2003; the number of digital seismic stations has been increasing, and today more than 25 digital stations operate in the country. The database includes detailed information about all equipment installed at the seismic stations. The database is available online, which provides a convenient interface for seismic data exchange between the neighboring Caucasus countries. It also makes both the processing of seismic data and their transfer to the database easier, and decreases operator mistakes during routine work. The database was created using the following: PHP, MySQL, JavaScript, Ajax, GMT, Gmap, Hypoinverse.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daley, Tom; Majer, Ernie
2007-04-30
Seismic stimulation is a proposed enhanced oil recovery (EOR) technique which uses seismic energy to increase oil production. As part of an integrated research effort (theory, lab and field studies), LBNL has been measuring the seismic amplitude of various stimulation sources in various oil fields (Majer et al., 2006; Roberts et al., 2001; Daley et al., 1999). The amplitude of the seismic waves generated by a stimulation source is an important parameter for increased oil mobility in both theoretical models and laboratory core studies. The seismic amplitude, typically in units of seismic strain, can be measured in situ by use of a borehole seismometer (geophone). Measuring the distribution of amplitudes within a reservoir could allow improved design of stimulation source deployment. In March 2007, we provided in-field monitoring of two stimulation sources operating in Occidental (Oxy) Permian Ltd's South Wasson Clear Fork (SWCU) unit, located near Denver City, TX. The stimulation source is a downhole fluid pulsation device developed by Applied Seismic Research Corp. (ASR). Our monitoring used a borehole wall-locking 3-component geophone operating in two nearby wells.
NASA Astrophysics Data System (ADS)
Kell, Anna Marie
The plate margin in the western United States is an active tectonic region that contains the integrated deformation between the North American and Pacific plates. Nearly focused plate motion between the North American and Pacific plates within the northern Gulf of California gives way north of the Salton Trough to more diffuse deformation. In particular, a large fraction of the slip along the southernmost San Andreas fault ultimately bleeds eastward, including about 20% of the total plate motion budget that finds its way through the transtensional Walker Lane Deformation Belt just east of the Sierra Nevada mountain range. Fault-bounded ranges combined with intervening low-lying basins characterize this region; the down-dropped features are often filled with water, which presents opportunities for seismic imaging at unprecedented scales. Here I present active-source seismic imaging from the Salton Sea and Walker Lane Deformation Belt, including both marine applications in lakes and shallow seas and more conventional land-based techniques along the Carson range front. The complex fault network beneath the Salton Trough in eastern California is the on-land continuation of the Gulf of California rift system, where North American-Pacific plate motion is accommodated by a series of long transform faults separated by small transtensional pull-apart basins; the right-lateral San Andreas fault bounds this system to the north, where it carries, on average, about 50% of total plate motion. The Salton Sea resides within the most youthful and northerly "spreading center" in this several-thousand-kilometer-long rift system. The Sea provides an ideal environment for the use of high-data-density marine seismic techniques. Two active-source seismic campaigns in 2010 and 2011 show the progression of the development of the Salton pull-apart sub-basin and the northerly propagation of the Imperial-San Andreas system through time at varying resolutions. High fidelity seismic imagery
NASA Astrophysics Data System (ADS)
Bonner, J.
2006-05-01
Differences in energy partitioning of seismic phases from earthquakes and explosions provide the opportunity for event identification. In this talk, I will briefly review teleseismic Ms:mb and P/S ratio techniques that help identify events based on differences in compressional, shear, and surface wave energy generation from explosions and earthquakes. With the push to identify smaller-yield explosions, the identification process has become increasingly complex, as varied types of explosions, including chemical, mining, and nuclear, must be identified at regional distances. Thus, I will highlight some of the current views and problems associated with the energy partitioning of seismic phases from single- and delay-fired chemical explosions. One problem yet to have a universally accepted answer is whether the explosion and earthquake populations, based on the Ms:mb discriminants, should be separated at smaller magnitudes. I will briefly describe the datasets and theory that support either converging or parallel behavior of these populations. Also, I will discuss improvements to the currently used methods that will better constrain this problem in the future. I will also discuss the role of regional P/S ratios in identifying explosions. In particular, recent datasets from South Africa, Scandinavia, and the Western United States, collected from earthquakes, single-fired chemical explosions, and/or delay-fired mining explosions, have provided new insight into regional P, S, Lg, and Rg energy partitioning. Data from co-located mining and chemical explosions suggest that some mining explosions may be used for limited calibration of regional discriminants in regions where no historic explosion data are available.
NASA Astrophysics Data System (ADS)
Grab, Melchior; Scott, Samuel; Quintal, Beatriz; Caspari, Eva; Maurer, Hansruedi; Greenhalgh, Stewart
2016-04-01
Seismic methods are amongst the most common techniques to explore the earth's subsurface. Seismic properties such as velocities, impedance contrasts and attenuation enable the characterization of the rocks in a geothermal system. The most important goal of geothermal exploration, however, is to describe the enthalpy state of the pore fluids, which act as the main transport medium for the geothermal heat, and to detect permeable structures such as fracture networks, which control the movement of these pore fluids in the subsurface. Since the quantities measured with seismic methods are only indirectly related to the fluid state and the rock permeability, the interpretation of seismic datasets is difficult and usually delivers ambiguous results. To help overcome this problem, we use a numerical modeling tool that quantifies the seismic properties of fractured rock formations that are typically found in magmatic geothermal systems. We incorporate the physics of the pore fluids, ranging from the liquid to the boiling and ultimately vapor state. Furthermore, we consider the hydromechanics of permeable structures at different scales, from small cooling joints to large caldera faults, as are known to be present in volcanic systems. Our modeling techniques simulate oscillatory compressibility and shear tests and yield the P- and S-wave velocities and attenuation factors of fluid-saturated fractured rock volumes. To apply this modeling technique to realistic scenarios, numerous input parameters need to be identified. The properties of the rock matrix and individual fractures were derived from extensive literature research including a large number of laboratory-based studies. The geometries of fracture networks were provided by structural geologists from their published studies of outcrops. Finally, the physical properties of the pore fluid, ranging from those at ambient pressures and temperatures up to the supercritical conditions, were taken from the fluid physics
Limitations of quantitative analysis of deep crustal seismic reflection data: Examples from GLIMPCE
Lee, Myung W.; Hutchinson, Deborah R.
1992-01-01
Amplitude preservation in seismic reflection data can be obtained by a relative true amplitude (RTA) processing technique in which the relative strength of reflection amplitudes is preserved vertically as well as horizontally, after compensating for amplitude distortion by near-surface effects and propagation effects. Quantitative analysis of relative true amplitudes of the Great Lakes International Multidisciplinary Program on Crustal Evolution seismic data is hampered by large uncertainties in estimates of the water bottom reflection coefficient and the vertical amplitude correction and by inadequate noise suppression. Processing techniques such as deconvolution, F-K filtering, and migration significantly change the overall shape of amplitude curves and hence calculation of reflection coefficients and average reflectance. Thus lithological interpretation of deep crustal seismic data based on the absolute value of estimated reflection strength alone is meaningless. The relative strength of individual events, however, is preserved on curves generated at different stages in the processing. We suggest that qualitative comparisons of relative strength, if used carefully, provide a meaningful measure of variations in reflectivity. Simple theoretical models indicate that peg-leg multiples rather than water bottom multiples are the most severe source of noise contamination. These multiples are extremely difficult to remove when the water bottom reflection coefficient is large (>0.6), a condition that exists beneath parts of Lake Superior and most of Lake Huron.
Ray Tracing Methods in Seismic Emission Tomography
NASA Astrophysics Data System (ADS)
Chebotareva, I. Ya.
2018-03-01
Highly efficient approximate ray tracing techniques which can be used in seismic emission tomography and in other methods requiring a large number of raypaths are described. The techniques are applicable for the gradient and plane-layered velocity sections of the medium and for the models with a complicated geometry of contrasting boundaries. The empirical results obtained with the use of the discussed ray tracing technologies and seismic emission tomography results, as well as the results of numerical modeling, are presented.
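As a point of reference for what even the simplest exact ray computation involves, the sketch below shoots rays through a plane-layered model using Snell's law; the layer velocities and thicknesses are assumed for illustration, and the approximate techniques of the paper are not reproduced here.

```python
import numpy as np

def shoot_ray(p, v, h):
    """Trace a downgoing ray with horizontal slowness p (s/km) through plane layers.

    v : layer velocities (km/s), h : layer thicknesses (km).
    Returns (horizontal offset, travel time) accumulated over all layers,
    or None if the ray turns (post-critical) before reaching the bottom.
    """
    x = t = 0.0
    for vi, hi in zip(v, h):
        sin_theta = p * vi                      # Snell's law: p = sin(theta)/v is constant
        if sin_theta >= 1.0:
            return None                         # ray is post-critical in this layer
        cos_theta = np.sqrt(1.0 - sin_theta ** 2)
        x += hi * sin_theta / cos_theta         # horizontal advance in the layer
        t += hi / (vi * cos_theta)              # time spent in the layer
    return x, t

# Illustrative 3-layer model (velocities and thicknesses are assumed values)
v = np.array([1.8, 2.5, 3.2])   # km/s
h = np.array([0.5, 1.0, 1.5])   # km
for p in np.linspace(0.05, 0.35, 7):            # fan of take-off slownesses
    result = shoot_ray(p, v, h)
    if result is not None:
        print(f"p={p:.2f} s/km  offset={result[0]:.2f} km  time={result[1]:.2f} s")
```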
NASA Astrophysics Data System (ADS)
Maity, Debotyam
This study is aimed at an improved understanding of unconventional reservoirs, which include tight reservoirs (such as shale oil and gas plays) and geothermal developments. We provide a framework for improved fracture zone identification and mapping of the subsurface for a geothermal system by integrating data from different sources. The proposed ideas and methods were tested primarily on data obtained from the North Brawley geothermal field and the Geysers geothermal field, in addition to synthetic datasets that were used to test new algorithms before application to the real datasets. The study has resulted in novel or improved algorithms for use at specific stages of data acquisition and analysis, including an improved phase detection technique for passive seismic (and teleseismic) data as well as optimization of passive seismic surveys for the best possible processing results. The proposed workflow makes use of novel integration methods as a means of making the best use of the available geophysical data for fracture characterization. The methodology incorporates soft computing tools such as hybrid neural networks (neuro-evolutionary algorithms) as well as geostatistical simulation techniques to improve the property estimates and the overall characterization efficacy. The basic elements of the proposed characterization workflow involve using seismic and microseismic data to characterize structural and geomechanical features within the subsurface. We use passive seismic data to model geomechanical properties, which are combined with other properties evaluated from seismic and well logs to derive both qualitative and quantitative fracture zone identifiers. The study has resulted in a broad framework highlighting a new technique for utilizing geophysical data (seismic and microseismic) for unconventional reservoir characterization. It provides an opportunity to optimally develop the resources in question by incorporating data from different sources and using their temporal
Unsupervised seismic facies analysis with spatial constraints using regularized fuzzy c-means
NASA Astrophysics Data System (ADS)
Song, Chengyun; Liu, Zhining; Cai, Hanpeng; Wang, Yaojun; Li, Xingming; Hu, Guangmin
2017-12-01
Seismic facies analysis techniques combine classification algorithms and seismic attributes to generate a map that describes the main reservoir heterogeneities. However, most current classification algorithms view the seismic attributes as isolated data regardless of their spatial locations, and the resulting map is generally sensitive to noise. In this paper, a regularized fuzzy c-means (RegFCM) algorithm is used for unsupervised seismic facies analysis. Due to the regularization term of the RegFCM algorithm, data whose neighbouring locations belong to the same class play a more important role in the iterative process than other data. Therefore, this method can reduce the effect of seismic data noise present in discontinuous regions. Synthetic data with different signal-to-noise ratios are used to demonstrate the noise tolerance of the RegFCM algorithm. Meanwhile, the fuzzy factor, the neighbour window size and the regularization weight are tested using various values, to provide a reference for how to set these parameters. The new approach is also applied to a real seismic data set from the F3 block of the Netherlands. The results show improved spatial continuity, with clear facies boundaries and channel morphology, which reveals that the method is an effective seismic facies analysis tool.
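For readers unfamiliar with the baseline algorithm, a minimal (unregularized) fuzzy c-means on attribute vectors looks roughly like the sketch below; the spatial regularization term that distinguishes RegFCM is not included, and the two-attribute toy data are invented.

```python
import numpy as np

def fuzzy_c_means(X, n_clusters, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Plain fuzzy c-means on attribute vectors X (n_samples, n_attributes).

    Returns (U, centers) where U[i, k] is the membership of sample i in cluster k.
    This is the unregularized baseline; RegFCM adds a spatial term that couples
    memberships of neighbouring traces.
    """
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], n_clusters))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        W = U ** m                                            # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]          # weighted cluster centers
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U_new = 1.0 / (d ** (2.0 / (m - 1.0)))                # standard FCM membership update
        U_new /= U_new.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            return U_new, centers
        U = U_new
    return U, centers

# Toy example: two seismic attributes (e.g. amplitude, coherence) for 300 traces
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([0, 0], 0.3, (150, 2)), rng.normal([2, 1], 0.3, (150, 2))])
U, centers = fuzzy_c_means(X, n_clusters=2)
print(centers)
```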
NASA Astrophysics Data System (ADS)
Picozzi, M.; Oth, A.; Parolai, S.; Bindi, D.; De Landro, G.; Amoroso, O.
2017-05-01
The accurate determination of stress drop, seismic efficiency, and how source parameters scale with earthquake size is an important issue for seismic hazard assessment of induced seismicity. We propose an improved nonparametric, data-driven strategy suitable for monitoring induced seismicity, which combines the generalized inversion technique with genetic algorithms. In the first step of the analysis, the generalized inversion technique allows for an effective correction of waveforms for attenuation and site contributions. Then, the retrieved source spectra are inverted by a nonlinear sensitivity-driven inversion scheme that allows accurate estimation of source parameters. We investigate the earthquake source characteristics of 633 induced earthquakes (Mw 2-3.8) recorded at The Geysers geothermal field (California) by a dense seismic network (i.e., 32 stations, more than 17,000 velocity records). We find non-self-similar behavior: the empirical source spectra require an ω^-γ source model with γ > 2 to be well fit, and the radiation efficiency ηSW is small. All these findings suggest different dynamic rupture processes for smaller and larger earthquakes, and that the proportion of high-frequency energy radiation and the amount of energy required to overcome friction or to create new fracture surfaces change with earthquake size. Furthermore, we also observe two distinct families of events with peculiar source parameters, which in one case suggest the reactivation of deep structures linked to the regional tectonics, and in the other support the idea of an important role of steeply dipping faults in the fluid pressure diffusion.
A High-Sensitivity Broad-Band Seismic Sensor for Shallow Seismic Sounding of the Lunar Regolith
NASA Technical Reports Server (NTRS)
Pike, W. Thomas; Standley, Ian M.; Banerdt, W. Bruce
2005-01-01
The recently undertaken Space Exploration Initiative has prompted a renewed interest in techniques for characterizing the surface and shallow subsurface (0-10s of meters depth) of the Moon. There are several reasons for this: First, there is an intrinsic scientific interest in the subsurface structure. For example the stratigraphy, depth to bedrock, density/porosity, and block size distribution all have implications for the formation of, and geological processes affecting the surface, such as sequential crater ejecta deposition, impact gardening, and seismic settling. In some permanently shadowed craters there may be ice deposits just below the surface. Second, the geotechnical properties of the lunar surface layers are of keen interest to future mission planners. Regolith thickness, strength, density, grain size and compaction will affect construction of exploration infrastructure in terms of foundation strength and stability, ease of excavation, radiation shielding effectiveness, as well as raw material handling and processing techniques for resource extraction.
A seismic data compression system using subband coding
NASA Technical Reports Server (NTRS)
Kiely, A. B.; Pollara, F.
1995-01-01
This article presents a study of seismic data compression techniques and a compression algorithm based on subband coding. The algorithm includes three stages: a decorrelation stage, a quantization stage that introduces a controlled amount of distortion to allow for high compression ratios, and a lossless entropy coding stage based on a simple but efficient arithmetic coding method. Subband coding methods are particularly suited to the decorrelation of nonstationary processes such as seismic events. Adaptivity to the nonstationary behavior of the waveform is achieved by dividing the data into separate blocks that are encoded separately with an adaptive arithmetic encoder. This is done with high efficiency due to the low overhead introduced by the arithmetic encoder in specifying its parameters. The technique could be used as a progressive transmission system, where successive refinements of the data can be requested by the user. This allows seismologists to first examine a coarse version of waveforms with minimal usage of the channel and then decide where refinements are required. Rate-distortion performance results are presented and comparisons are made with two block transform methods.
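A toy sketch of the three stages, using a single-level Haar split in place of the article's filter bank and a first-order entropy estimate in place of the adaptive arithmetic coder; the test trace and quantization steps are invented for illustration.

```python
import numpy as np

def haar_analysis(x):
    """One-level Haar subband split: returns (lowpass, highpass) half-rate bands."""
    x = x[: len(x) // 2 * 2]                         # force even length
    low = (x[0::2] + x[1::2]) / np.sqrt(2)
    high = (x[0::2] - x[1::2]) / np.sqrt(2)
    return low, high

def haar_synthesis(low, high):
    """Inverse of haar_analysis."""
    x = np.empty(2 * len(low))
    x[0::2] = (low + high) / np.sqrt(2)
    x[1::2] = (low - high) / np.sqrt(2)
    return x

def quantize(band, step):
    """Uniform scalar quantization; coarser steps give higher compression, more distortion."""
    return np.round(band / step).astype(int)

def entropy_bits(symbols):
    """First-order entropy (bits/sample), a lower bound for the entropy coder."""
    _, counts = np.unique(symbols, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

# Toy "seismic" trace: decaying oscillation plus noise (illustrative only)
rng = np.random.default_rng(0)
t = np.arange(1024) / 500.0
trace = np.exp(-3 * t) * np.sin(2 * np.pi * 30 * t) + 0.01 * rng.standard_normal(t.size)

low, high = haar_analysis(trace)
step_low, step_high = 0.005, 0.02                    # spend more bits on the low band
q_low, q_high = quantize(low, step_low), quantize(high, step_high)
rate = 0.5 * (entropy_bits(q_low) + entropy_bits(q_high))
recon = haar_synthesis(q_low * step_low, q_high * step_high)
mse = np.mean((trace[: recon.size] - recon) ** 2)
print(f"~{rate:.2f} bits/sample, MSE={mse:.2e}")
```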
NASA Astrophysics Data System (ADS)
Tsakalos, E.; Lin, A.; Bassiakos, Y.; Kazantzaki, M.; Filippaki, E.
2017-12-01
During a seismic-geodynamic process, frictional heating and pressure are generated on sediment fragments, resulting in deformation and alteration of the minerals they contain. The luminescence signal stored in a mineral's crystal lattice can be affected and even zeroed during such an event. This has been a breakthrough in geochronological studies, as it can be utilized as a chronometer for the past seismic activity of a tectonically active area. Although the employment of luminescence dating has in some cases been successfully described, a comprehensive study outlining and defining protocols for routine luminescence dating applied to neotectonic studies has not been forthcoming. This study is the experimental investigation, recording and parameterization of the effects of tectonic phenomena on the luminescence signal of minerals, and the development of detailed protocols for the standardization of the luminescence methodology for directly dating deformed geological formations, so that the long-term temporal behaviour of seismically active faults can be reasonably understood and modeled. This will be achieved by: a) identifying and proposing brittle fault zone materials suitable for luminescence dating using petrological, mineralogical and chemical analyses and b) investigating the "zeroing" potential of the luminescence signal of minerals contained in fault zone materials by employing experimental simulations of tectonic processes in the laboratory, combined with luminescence measurements on samples collected from real fault zones. For this purpose, a number of samples collected from four faults in four different geographical regions will be used. This preliminary first step of the study presents the microstructural and mineralogical analyses for the characterization of brittle fault zone materials that contain minerals suitable for luminescence dating (e.g., quartz and feldspar). The results showed that the collected samples are seismically deformed fault
Bexfield, C.E.; McBride, J.H.; Pugin, Andre J.M.; Ravat, D.; Biswas, S.; Nelson, W.J.; Larson, T.H.; Sargent, S.L.; Fillerup, M.A.; Tingey, B.E.; Wald, L.; Northcott, M.L.; South, J.V.; Okure, M.S.; Chandler, M.R.
2006-01-01
Shallow high-resolution seismic reflection surveys have traditionally been restricted to either compressional (P) or horizontally polarized shear (SH) waves in order to produce 2-D images of subsurface structure. The northernmost Mississippi embayment and coincident New Madrid seismic zone (NMSZ) provide an ideal laboratory for the experimental integration of P- and SH-wave seismic profiles, combined, where practicable, with micro-gravity data. In this area, the relation between "deeper" deformation of Paleozoic bedrock associated with the formation of the Reelfoot rift and NMSZ seismicity and "shallower" deformation of overlying sediments has remained elusive, but could be revealed using integrated P- and SH-wave reflection. Surface expressions of deformation are almost non-existent in this region, which makes seismic reflection surveying the only means of detecting structures that are possibly pertinent to seismic hazard assessment. Since P- and SH-waves respond differently to rock and fluid properties and travel at dissimilar speeds, the resulting seismic profiles provide complementary views of the subsurface based on different levels of resolution and imaging capability. P-wave profiles acquired in southwestern Illinois and western Kentucky (USA) detect faulting of deep Paleozoic bedrock and Cretaceous reflectors, while coincident SH-wave surveys show that this deformation propagates higher into overlying Tertiary and Quaternary strata. Forward modeling of micro-gravity data acquired along one of the seismic profiles further supports an interpretation of faulting of bedrock and Cretaceous strata. The integration of the two seismic methods and the micro-gravity method therefore increases the scope for investigating the relation between the older and younger deformation in an area of critical seismic hazard. © 2006 Elsevier B.V. All rights reserved.
Seismic Characterization of the Newberry and Cooper Basin EGS Sites
NASA Astrophysics Data System (ADS)
Templeton, D. C.; Wang, J.; Goebel, M.; Johannesson, G.; Myers, S. C.; Harris, D.; Cladouhos, T. T.
2015-12-01
To aid in the seismic characterization of Engineered Geothermal Systems (EGS), we enhance traditional microearthquake detection and location methodologies at two EGS systems: the Newberry EGS site and the Habanero EGS site in the Cooper Basin of South Australia. We apply the Matched Field Processing (MFP) seismic imaging technique to detect new seismic events using known discrete microearthquake sources. Events identified using MFP typically have smaller magnitudes or occur within the coda of a larger event. Additionally, we apply a Bayesian multiple-event location algorithm, called MicroBayesLoc, to estimate the 95% probability ellipsoids for events with high signal-to-noise ratios (SNR). Such probability ellipsoid information can provide evidence for determining whether a seismic lineation is real or simply within the anticipated error range. At the Newberry EGS site, 235 events were reported in the original catalog. MFP identified 164 additional events (an increase of over 70%). For the relocated events in the Newberry catalog, we can distinguish two distinct seismic swarms that fall outside of one another's 95% probability error ellipsoids. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
NASA Astrophysics Data System (ADS)
Phillips, D.; Dimech, J. L.; Weber, R. C.
2017-12-01
data processing and verified the presence of deep moonquake signals in the recovered data. This positions us well for the application of automated event-detection techniques that have been successfully applied to the Apollo 16 Passive Seismic Experiment data as well as the Apollo 17 Lunar Seismic Profiling Experiment data.
NASA Astrophysics Data System (ADS)
Maurya, S. P.; Singh, K. H.; Singh, N. P.
2018-05-01
In the present study, three recently developed geostatistical methods, single attribute analysis, multi-attribute analysis and the probabilistic neural network algorithm, have been used to predict porosity in the inter-well region for the Blackfoot field, Alberta, Canada, an onshore oil field. These techniques make use of seismic attributes generated by model-based inversion and colored inversion techniques. The principal objective of the study is to find the suitable combination of seismic inversion and geostatistical techniques to predict porosity and identify prospective zones in the 3D seismic volume. The porosity estimated from these geostatistical approaches is corroborated with the well log porosity. The results suggest that all three implemented geostatistical methods are efficient and reliable for predicting porosity, but the multi-attribute and probabilistic neural network analyses provide more accurate and higher resolution porosity sections. A low impedance (6000-8000 m/s g/cc) and high porosity (> 15%) zone is interpreted from the inverted impedance and porosity sections, respectively, in the 1060-1075 ms time interval and is characterized as the reservoir. The qualitative and quantitative results demonstrate that, of all the employed geostatistical methods, the probabilistic neural network along with model-based inversion is the most efficient method for predicting porosity in the inter-well region.
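The single/multi-attribute step can be illustrated with an ordinary least-squares mapping from seismic attributes to well-log porosity; the attributes, their relation to porosity and the numbers below are synthetic, and the probabilistic-neural-network stage is not shown.

```python
import numpy as np

def fit_multi_attribute(attrs, porosity):
    """Least-squares weights mapping seismic attributes to porosity at well locations.

    attrs : (n_samples, n_attributes) attribute values extracted along the well path.
    porosity : (n_samples,) log-derived porosity.
    """
    A = np.column_stack([np.ones(len(attrs)), attrs])   # include a constant term
    w, *_ = np.linalg.lstsq(A, porosity, rcond=None)
    return w

def predict(attrs, w):
    return np.column_stack([np.ones(len(attrs)), attrs]) @ w

# Synthetic example: porosity loosely tied to inverted impedance and envelope (assumed relation)
rng = np.random.default_rng(2)
impedance = rng.normal(7000, 500, 200)          # m/s * g/cc
envelope = rng.normal(1.0, 0.2, 200)
phi = 0.35 - 2.5e-5 * impedance + 0.02 * envelope + 0.01 * rng.standard_normal(200)

attrs = np.column_stack([impedance, envelope])
w = fit_multi_attribute(attrs[:150], phi[:150])         # "training" wells
resid = predict(attrs[150:], w) - phi[150:]             # blind-well check
print(f"weights={w}, blind RMS error={np.sqrt(np.mean(resid**2)):.4f}")
```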
Automatic Seismic Signal Processing Research.
1981-09-01
[Abstract garbled in extraction. Recoverable fragments include an equation derivation citing Gnanadesikan (1977, p. 83) and Young and Calvert (1974); the quotation from Gnanadesikan (1977, p. 196) that "the main function of statistical data analysis is to extricate and explicate the informational content of a body of data"; and the reference Goff, R. C. (1980), "Evaluation of the MARS Seismic Event Detector," Systems, Science and Software Report SSS-R-81-4656, August.]
Detection capability of the IMS seismic network based on ambient seismic noise measurements
NASA Astrophysics Data System (ADS)
Gaebler, Peter J.; Ceranna, Lars
2016-04-01
All nuclear explosions - on the Earth's surface, underground, underwater or in the atmosphere - are banned by the Comprehensive Nuclear-Test-Ban Treaty (CTBT). As part of this treaty, a verification regime was put into place to detect, locate and characterize nuclear explosion tests at any time, by anyone and everywhere on the Earth. The International Monitoring System (IMS) plays a key role in the verification regime of the CTBT. Of the different monitoring techniques used in the IMS, the seismic waveform approach is the most effective technology for monitoring underground nuclear testing and for identifying and characterizing potential nuclear events. This study introduces a method of seismic threshold monitoring to assess an upper magnitude limit of a potential seismic event in a given geographical region. The method is based on ambient seismic background noise measurements at the individual IMS seismic stations as well as on global distance correction terms for body wave magnitudes, which are calculated using the seismic reflectivity method. From our investigations we conclude that a global detection threshold of around mb 4.0 can be achieved using only stations from the primary seismic network; a clear latitudinal dependence of the detection threshold is observed between the northern and southern hemispheres. Including the stations of the auxiliary IMS seismic network results in a slight improvement of global detection capability. However, including wave arrivals from distances greater than 120 degrees, mainly PKP-wave arrivals, leads to a significant improvement in average global detection capability; in particular, it improves the detection threshold in the southern hemisphere. We further investigate the dependence of the detection capability on spatial (latitude and longitude) and temporal (time) parameters, as well as on parameters such as source type and the percentage of operational IMS stations.
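A highly simplified sketch of threshold monitoring from station noise levels: each station's smallest detectable mb follows from its noise amplitude, a required signal-to-noise ratio, and the standard body-wave magnitude distance correction, and the network threshold is set by requiring detections at a minimum number of stations. The noise amplitudes, Q corrections and SNR factor below are placeholders, not IMS values.

```python
import numpy as np

def station_threshold_mb(noise_amp_nm, period_s, q_correction, snr=3.0):
    """Smallest body-wave magnitude detectable at one station.

    mb = log10(A/T) + Q(delta, h); an event is declared detectable if its amplitude
    exceeds snr times the ambient noise amplitude, so the threshold uses A = snr * noise.
    q_correction is the distance/depth correction term (placeholder values below).
    """
    return np.log10(snr * noise_amp_nm / period_s) + q_correction

def network_threshold(station_thresholds, n_required=3):
    """Network threshold: magnitude at which at least n_required stations detect."""
    return np.sort(station_thresholds)[n_required - 1]

# Illustrative station set (noise levels and Q corrections are assumed, not IMS values)
noise = np.array([2.0, 5.0, 1.0, 8.0, 3.0])      # ambient noise amplitude, nm
q = np.array([3.4, 3.1, 3.6, 3.0, 3.3])          # Q(delta, h) correction per station
thresholds = station_threshold_mb(noise, period_s=1.0, q_correction=q)
print(thresholds, network_threshold(thresholds, n_required=3))
```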
Classifying elephant behaviour through seismic vibrations.
Mortimer, Beth; Rees, William Lake; Koelemeijer, Paula; Nissen-Meyer, Tarje
2018-05-07
Seismic waves - vibrations within and along the Earth's surface - are ubiquitous sources of information. During propagation, physical factors can obscure information transfer via vibrations and influence propagation range [1]. Here, we explore how terrain type and background seismic noise influence the propagation of seismic vibrations generated by African elephants. In Kenya, we recorded the ground-based vibrations of different wild elephant behaviours, such as locomotion and infrasonic vocalisations [2], as well as natural and anthropogenic seismic noise. We employed techniques from seismology to transform the geophone recordings into source functions - the time-varying seismic signature generated at the source. We used computer modelling to constrain the propagation ranges of elephant seismic vibrations for different terrains and noise levels. Behaviours that generate a high force on a sandy terrain with low noise propagate the furthest, over the kilometre scale. Our modelling also predicts that specific elephant behaviours can be distinguished and monitored over a range of propagation distances and noise levels. We conclude that seismic cues have considerable potential for both behavioural classification and remote monitoring of wildlife. In particular, classifying the seismic signatures of specific behaviours of large mammals remotely in real time, such as elephant running, could inform on poaching threats. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
Characterizing a Brazilian sanitary landfill using geophysical seismic techniques.
Abreu, A E S; Gandolfo, O C B; Vilar, O M
2016-07-01
Two different geophysical techniques, namely crosshole and multichannel analysis of surface waves (MASW), were applied to investigate the mechanical response of Municipal Solid Waste buried under a humid, subtropical climate. Direct investigations revealed that the buried waste was composed mainly of soil-like material (51%) and plastics (31%), with moisture content averaging 43% near the surface and 53% below around 11 m depth. Unit weight varied between 9 kN/m³ and 15 kN/m³. Seismic investigation of the landfill yielded shear wave velocities (VS) estimated from the crosshole tests ranging from 92 to 214 m/s, while compression wave velocities (VP) ranged from 197 to 451 m/s. Both velocities were influenced by vertical confining stress and thus tended to increase with depth. VS calculated from MASW tests were lower than those calculated from the crosshole tests, probably due to the different frequencies used in the tests. The results of both methods tended to define a lower bound on the values reported in the technical literature, as expected for low-compaction waste with small amounts of cover soil. Although VS did not show abrupt changes with depth, the VP profile distribution combined with direct investigation results, such as temperature, in-place unit weight and moisture content, suggests that the waste body can be divided into two strata. The lower one is poorly drained and shows higher moisture content, as a consequence of the operational techniques used in the first years, while the upper stratum is probably related to a better drained waste stratum, resulting from the improvement of operational standards and the increase in drainage facilities over the years. Copyright © 2016 Elsevier Ltd. All rights reserved.
The Utility of the Extended Images in Ambient Seismic Wavefield Migration
NASA Astrophysics Data System (ADS)
Girard, A. J.; Shragge, J. C.
2015-12-01
Active-source 3D seismic migration and migration velocity analysis (MVA) are robust and highly used methods for imaging Earth structure. One class of migration methods uses extended images constructed by incorporating spatial and/or temporal wavefield correlation lags to the imaging conditions. These extended images allow users to directly assess whether images focus better with different parameters, which leads to MVA techniques that are based on the tenets of adjoint-state theory. Under certain conditions (e.g., geographical, cultural or financial), however, active-source methods can prove impractical. Utilizing ambient seismic energy that naturally propagates through the Earth is an alternate method currently used in the scientific community. Thus, an open question is whether extended images are similarly useful for ambient seismic migration processing and verifying subsurface velocity models, and whether one can similarly apply adjoint-state methods to perform ambient migration velocity analysis (AMVA). Herein, we conduct a number of numerical experiments that construct extended images from ambient seismic recordings. We demonstrate that, similar to active-source methods, there is a sensitivity to velocity in ambient seismic recordings in the migrated extended image domain. In synthetic ambient imaging tests with varying degrees of error introduced to the velocity model, the extended images are sensitive to velocity model errors. To determine the extent of this sensitivity, we utilize acoustic wave-equation propagation and cross-correlation-based migration methods to image weak body-wave signals present in the recordings. Importantly, we have also observed scenarios where non-zero correlation lags show signal while zero-lags show none. This may be a valuable missing piece for ambient migration techniques that have yielded largely inconclusive results, and might be an important piece of information for performing AMVA from ambient seismic recordings.
2D Seismic Reflection Data across Central Illinois
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Valerie; Leetaru, Hannes
interpretation of the Mt. Simon and Knox sections difficult. The data quality also gradually decreased moving westward across the state. To meet evolving project objectives, in 2012 the seismic data was re-processed using different techniques to enhance the signal quality, thereby rendering a more coherent seismic profile for interpreters. It is believed that the seismic degradation could be caused by shallow natural gas deposits and Quaternary sediments (which include abandoned river and stream channels, former ponds, and swamps with peat deposits) that may have complicated or changed the seismic wavelet. Where previously limited by seismic coverage, the seismic profiles have provided valuable subsurface information across central Illinois. Some of the interpretations based on this survey include, but are not limited to:
- Stratigraphy generally dips gently to the east from Morgan to Douglas County.
- The Knox Supergroup roughly maintains its thickness. There is little evidence for faulting in the Knox; however, at least one resolvable fault penetrates the entire Knox section.
- The Eau Claire Formation, the primary seal for the Mt. Simon Sandstone, appears to be continuous across the entire seismic profile.
- The Mt. Simon Sandstone thins towards the western edge of the basin. As a result, the highly porous lowermost Mt. Simon section is absent in the western part of the state.
- Overall basement dip is from west to east.
- Basement topography shows evidence of basement highs with on-lapping patterns by Mt. Simon sediments.
- There is evidence of faults within the lower Mt. Simon Sandstone and basement rock that are contemporaneous with Mt. Simon Sandstone deposition. These faults are not active and do not penetrate the Eau Claire Shale. It is believed that these faults are associated with a possible failed rifting event 750 to 560 million years ago during the breakup of the supercontinent Rodinia.
NASA Astrophysics Data System (ADS)
Bai, Z. M.; Zhang, Z. Z.; Wang, C. Y.; Klemperer, S. L.
2012-04-01
The weakened lithosphere around the eastern syntaxis of the Tibetan plateau has been revealed by average Pn and Sn velocities, 3D upper-mantle P- and S-wave velocity variations, and the imaging results of magnetotelluric data. The Tengchong volcanic area neighbors the core of the eastern syntaxis and is famous for its springs, volcanic-geothermal activity and remarkable seismicity within mainland China. To probe the deep environment of the Tengchong volcanic-geothermal activity, a deep seismic sounding (DSS) project was carried out across this area in 1999. In this paper the seismic signature of crustal magma and fluid is explored from the DSS data with the seismic attribute fusion (SAF) technique; four possible positions of magma generation, together with several locations of porous and fractured fluid-bearing rock beneath the Tengchong volcanic area, are disclosed from the final fusion image of multiple seismic attributes. The adopted attributes include the Vp, Vs and Vp/Vs results derived from a new inversion method based on the No-Ray-Tomography technique, and the migrated instantaneous attributes of central frequency, bandwidth and high-frequency energy of the pressure wave. Moreover, back-projected attributes, consisting mainly of the attenuation factor Qp, the delay time of shear-wave splitting, and the amplitude ratio between the S wave and the P wave + S wave, were also considered in the fusion process. Our fusion image indicates the following mechanism for the surface springs: a large amount of heat and fluid released by the crystallization of magma is transmitted upward into fluid-filled rock, and the fluid upwells along conduits driven by high pressure at depth, developing the widespread springs of the Tengchong volcanic area. Moreover, the fusion image, regional volcanic and geothermal activity, and the seismicity suggest that the main risk of volcanic eruption is concentrated to the south of Tengchong city, especially around the shot point (SP) Tuantian
NASA Astrophysics Data System (ADS)
Ji, Zhan-Huai; Yan, Sheng-Gang
2017-12-01
This paper presents an analytical study of the complete transform of improved Gabor wavelets (IGWs), and discusses its application to the processing and interpretation of seismic signals. The complete Gabor wavelet transform has the following properties. First, unlike the conventional transform, the improved Gabor wavelet transform (IGWT) maps time domain signals to the time-frequency domain instead of the time-scale domain. Second, the IGW's dominant frequency is fixed, so the transform can perform signal frequency division, where the dominant frequency components of the extracted sub-band signal carry essentially the same information as the corresponding components of the original signal, and the subband signal bandwidth can be regulated effectively by the transform's resolution factor. Third, a time-frequency filter consisting of an IGWT and its inverse transform can accurately locate target areas in the time-frequency field and perform filtering in a given time-frequency range. The complete IGW transform's properties are investigated using simulation experiments and test cases, showing positive results for seismic signal processing and interpretation, such as enhancing seismic signal resolution, permitting signal frequency division, and allowing small faults to be identified.
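A generic Gabor-style time-frequency decomposition (not the exact IGWT of the paper) can be sketched with Gaussian band-pass filters applied in the frequency domain; the sampling interval, centre frequencies and test signal below are assumptions made for the example.

```python
import numpy as np

def gabor_tf_map(signal, dt, freqs, sigma=5.0):
    """Time-frequency decomposition via Gaussian band-pass filters in the frequency domain.

    Each row is the band-limited component of the signal around one centre frequency;
    sigma (Hz) is the filter half-width. This is a generic Gabor-style decomposition,
    not the exact improved Gabor wavelet transform of the paper.
    """
    n = signal.size
    spec = np.fft.rfft(signal)
    f = np.fft.rfftfreq(n, dt)
    tf = np.empty((len(freqs), n))
    for i, f0 in enumerate(freqs):
        window = np.exp(-0.5 * ((f - f0) / sigma) ** 2)   # Gaussian band around f0
        tf[i] = np.fft.irfft(spec * window, n)
    return tf

# Example: isolate the 20-40 Hz band of a two-tone synthetic trace (crude frequency division)
dt = 0.002
t = np.arange(0, 2.0, dt)
trace = np.sin(2 * np.pi * 30 * t) + np.sin(2 * np.pi * 70 * t)
freqs = np.arange(5, 100, 5.0)
tf = gabor_tf_map(trace, dt, freqs)
band = tf[(freqs >= 20) & (freqs <= 40)].sum(axis=0)
peak_hz = np.abs(np.fft.rfft(band)).argmax() / (len(t) * dt)
print(peak_hz)   # ~30 Hz: the 70 Hz component has been filtered out
```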
Automatic classification of seismic events within a regional seismograph network
NASA Astrophysics Data System (ADS)
Tiira, Timo; Kortström, Jari; Uski, Marja
2015-04-01
A fully automatic method for seismic event classification within a sparse regional seismograph network is presented. The tool is based on a supervised pattern recognition technique, the Support Vector Machine (SVM), trained here to distinguish weak local earthquakes from a bulk of human-made or spurious seismic events. The classification rules rely on differences in signal energy distribution between natural and artificial seismic sources. Seismic records are divided into four windows: P, P coda, S, and S coda. For each signal window, the short-term average (STA) is computed in 20 narrow frequency bands between 1 and 41 Hz. The 80 discrimination parameters are used as training data for the SVM. The SVM models are calculated for 19 on-line seismic stations in Finland. The event data are compiled mainly from fully automatic event solutions that are manually classified after the automatic location process. The station-specific SVM training events include 11-302 positive (earthquake) and 227-1048 negative (non-earthquake) examples. The best voting rules for combining results from different stations are determined during an independent testing period. Finally, the network processing rules are applied to an independent evaluation period comprising 4681 fully automatic event determinations, of which 98% have been manually identified as explosions or noise and 2% as earthquakes. The SVM method correctly identifies 94% of the non-earthquakes and all of the earthquakes. The results imply that the SVM tool can identify and filter out blasts and spurious events from fully automatic event solutions with a high level of confidence. The tool helps to reduce the workload in manual seismic analysis by leaving only ~5% of the automatic event determinations, i.e. the probable earthquakes, for more detailed seismological analysis. The approach presented is easy to adjust to the requirements of a denser or wider high-frequency network, once enough training examples for building a station-specific data set are available.
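A minimal sketch of one station-specific classifier along these lines, using scikit-learn's SVC on band-energy features; the feature dimensions and class statistics below are synthetic stand-ins for the 80 STA parameters described above.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Each event is described by short-term-average (STA) energies in narrow frequency
# bands measured in the P, P-coda, S and S-coda windows (80 features in the paper;
# 8 are used here purely for illustration, and the values are synthetic).
rng = np.random.default_rng(0)
n_bands = 8
earthquakes = rng.normal(loc=[1.0, 1.2, 1.1, 0.9, 1.3, 1.2, 0.8, 0.7],
                         scale=0.2, size=(120, n_bands))
explosions = rng.normal(loc=[1.5, 1.4, 0.9, 0.6, 0.8, 0.6, 0.5, 0.4],
                        scale=0.2, size=(400, n_bands))
X = np.vstack([earthquakes, explosions])
y = np.concatenate([np.ones(len(earthquakes)), np.zeros(len(explosions))])

# RBF-kernel SVM with feature scaling, as a stand-in for one station-specific model
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```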
Testing seismic amplitude source location for fast debris-flow detection at Illgraben, Switzerland
NASA Astrophysics Data System (ADS)
Walter, Fabian; Burtin, Arnaud; McArdell, Brian W.; Hovius, Niels; Weder, Bianca; Turowski, Jens M.
2017-06-01
Heavy precipitation can mobilize tens to hundreds of thousands of cubic meters of sediment in steep Alpine torrents in a short time. The resulting debris flows (mixtures of water, sediment and boulders) move downstream with velocities of several meters per second and have a high destruction potential. Warning protocols for affected communities rely on raising awareness about the debris-flow threat, precipitation monitoring and rapid detection methods. The latter, in particular, is a challenge because debris-flow-prone torrents have their catchments in steep and inaccessible terrain, where instrumentation is difficult to install and maintain. Here we test amplitude source location (ASL) as a processing scheme for seismic network data for early warning purposes. We use debris-flow and noise seismograms from the Illgraben catchment, Switzerland, a torrent system which produces several debris-flow events per year. Automatic in situ detection is currently based on geophones mounted on concrete check dams and radar stage sensors suspended above the channel. The ASL approach has the advantage that it uses seismometers, which can be installed at more accessible locations where a stable connection to mobile phone networks is available for data communication. Our ASL processing uses time-averaged ground vibration amplitudes to estimate the location of the debris-flow front. Applied to continuous data streams, inversion of the seismic amplitude decay throughout the network is robust and efficient, requires no manual identification of seismic phase arrivals and eliminates the need for a local seismic velocity model. We apply the ASL technique to a small debris-flow event on 19 July 2011, which was captured with a temporary seismic monitoring network. The processing rapidly detects the debris-flow event half an hour before arrival at the outlet of the torrent and several minutes before detection by the in situ alarm system. An analysis of continuous seismic records furthermore
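The core of the ASL step can be sketched as a grid search over trial source positions, fitting time-averaged amplitudes to an assumed decay law; the decay exponent, attenuation coefficient and station geometry below are illustrative, not the values used at Illgraben.

```python
import numpy as np

def asl_grid_search(station_xy, amplitudes, grid_x, grid_y, alpha=0.1):
    """Locate a seismic source from time-averaged amplitudes by grid search.

    Assumes surface-wave-like decay A(r) = A0 * exp(-alpha*r) / sqrt(r); for each
    trial source the best-fitting A0 is solved in a least-squares sense and the
    residual defines the misfit. alpha (1/km) and the decay law are assumptions.
    """
    best = (np.inf, None)
    for x in grid_x:
        for y in grid_y:
            r = np.hypot(station_xy[:, 0] - x, station_xy[:, 1] - y) + 0.05
            g = np.exp(-alpha * r) / np.sqrt(r)          # predicted decay shape
            a0 = np.dot(amplitudes, g) / np.dot(g, g)    # optimal source amplitude
            misfit = np.sum((amplitudes - a0 * g) ** 2)
            if misfit < best[0]:
                best = (misfit, (x, y))
    return best[1]

# Synthetic test: 5 stations around a channel, source at (2.0, 1.0) km
stations = np.array([[0, 0], [4, 0], [0, 3], [4, 3], [2, -1]], dtype=float)
true_src = np.array([2.0, 1.0])
r = np.hypot(stations[:, 0] - true_src[0], stations[:, 1] - true_src[1]) + 0.05
obs = 5.0 * np.exp(-0.1 * r) / np.sqrt(r)
grid = np.linspace(-1, 5, 61)
print(asl_grid_search(stations, obs, grid, grid))   # should recover ~(2.0, 1.0)
```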
Seismic migration for SAR focusing: Interferometrical applications
NASA Astrophysics Data System (ADS)
Prati, C.; Montiguarnieri, A.; Damonti, E.; Rocca, F.
SAR (Synthetic Aperture Radar) data focusing is analyzed from a theoretical point of view. Two applications of a SAR data processing algorithm are presented, where the phases of the returns are used for the recovery of interesting parameters of the observed scenes. Migration techniques, similar to those used in seismic signal processing for oil prospecting, were implemented for the determination of the terrain altitude map from a satellite and the evaluation of the sensor attitude for an airplane. A satisfactory precision was achieved: the interferometric system was shown to detect variations in the airplane roll angle of a small fraction of a degree.
Introducing Seismic Tomography with Computational Modeling
NASA Astrophysics Data System (ADS)
Neves, R.; Neves, M. L.; Teodoro, V.
2011-12-01
Learning seismic tomography principles and techniques involves advanced physical and computational knowledge. In-depth learning of such computational skills is a difficult cognitive process that requires a strong background in physics, mathematics and computer programming. The corresponding learning environments and pedagogic methodologies should then involve sets of computational modelling activities with computer software systems that allow students to improve their mathematical or programming knowledge while focusing on the learning of seismic wave propagation and inverse theory. To reduce the level of cognitive opacity associated with mathematical or programming knowledge, several computer modelling systems have already been developed (Neves & Teodoro, 2010). Among such systems, Modellus is particularly well suited to achieve this goal because it is a domain-general environment for explorative and expressive modelling with the following main advantages: 1) easy and intuitive creation of mathematical models using just standard mathematical notation; 2) the simultaneous exploration of images, tables, graphs and object animations; 3) the attribution of mathematical properties expressed in the models to animated objects; and finally 4) the computation and display of mathematical quantities obtained from the analysis of images and graphs. Here we describe virtual simulations and educational exercises which give students an easy grasp of the fundamentals of seismic tomography. The simulations make the lecture more interactive and allow students to overcome their lack of advanced mathematical or programming knowledge and focus on the learning of seismological concepts and processes, taking advantage of basic scientific computation methods and tools.
NASA Astrophysics Data System (ADS)
Garcia-Yeguas, A.; Ibañez, J. M.; Rietbrock, A.; Tom-Teidevs, G.
2008-12-01
An active seismic experiment to study the internal structure of Teide Volcano was carried out on Tenerife, a volcanic island in Spain's Canary Islands. The main objective of the TOM-TEIDEVS experiment is to obtain a 3-dimensional structural image of Teide Volcano using seismic tomography and seismic reflection/refraction imaging techniques. At present, knowledge of the deeper structure of Teide and Tenerife is very limited, with proposed structural models mainly based on sparse geophysical and geological data. This multinational experiment which involves institutes from Spain, Italy, the United Kingdom, Ireland, and Mexico will generate a unique high resolution structural image of the active volcano edifice and will further our understanding of volcanic processes.
The application of refraction seismics in alpine permafrost studies
NASA Astrophysics Data System (ADS)
Draebing, Daniel
2017-04-01
Permafrost studies in alpine environments focus on landslides from permafrost-affected rockwalls, landslide deposits or periglacial sediment dynamics. Mechanical properties of soils or rocks are influenced by permafrost, and changed strength properties affect these periglacial processes. To assess the effects of permafrost thaw and degradation, monitoring techniques for permafrost distribution and active-layer thaw are required. Seismic wave velocities are sensitive to freezing and, therefore, refraction seismics presents a valuable tool to investigate permafrost in alpine environments. In this study, (1) laboratory and field applications of refraction seismics in alpine environments are reviewed and (2) data are used to quantify the effects of rock properties (e.g. lithology, porosity, anisotropy, saturation) on p-wave velocities. In the next step, (3) the influence of environmental factors is evaluated and conclusions are drawn on permafrost differentiation within alpine periglacial landforms. This study shows that the p-wave velocity increase is sensitive to porosity and is most pronounced in high-porosity rocks. In low-porosity rocks, the p-wave velocity increase is controlled by anisotropy decrease due to ice pressure (Draebing and Krautblatter, 2012), which enables active-layer and permafrost differentiation at rockwall scale (Krautblatter and Draebing, 2014; Draebing et al., 2016). However, discontinuity distribution can result in strong anisotropy effects on seismic velocities, which can impede permafrost differentiation (Phillips et al., 2016). Due to production or deposition history, porosity can show large spatial differences in deposited landforms. Landforms with large boulders, such as rock glaciers and moraines, show the highest p-wave velocity differences between active layer and permafrost, which facilitates differentiation (Draebing, 2016). Saturation with water is essential for the successful application of refraction seismics for permafrost detection and can be controlled at
Hydrologically-driven crustal stresses and seismicity in the New Madrid Seismic Zone.
Craig, Timothy J; Chanard, Kristel; Calais, Eric
2017-12-15
The degree to which short-term non-tectonic processes, either natural or anthropogenic, influence the occurrence of earthquakes in active tectonic settings or 'stable' plate interiors remains a subject of debate. Recent work in plate-boundary regions demonstrates the capacity for long-wavelength changes in continental water storage to produce observable surface deformation, induce crustal stresses and modulate seismicity rates. Here we show that a significant variation in the rate of microearthquakes in the intraplate New Madrid Seismic Zone at annual and multi-annual timescales coincides with hydrological loading in the upper Mississippi embayment. We demonstrate that this loading, which results in geodetically observed surface deformation, induces stresses within the lithosphere that, although of small amplitude, modulate the ongoing seismicity of the New Madrid region. Correspondence between surface deformation, hydrological loading and seismicity rates at both annual and multi-annual timescales indicates that seismicity variations are the direct result of elastic stresses induced by the water load.
Illuminating Asset Value through New Seismic Technology
NASA Astrophysics Data System (ADS)
Brandsberg-Dahl, S.
2007-05-01
The ability to reduce risk and uncertainty across the full life cycle of an asset is directly correlated with creating an accurate subsurface image that enhances our understanding of the geology. This presentation focuses on this objective in areas of complex overburden in deepwater. Marine 3D seismic surveys have been acquired in essentially the same way for the past decade. This configuration of towed streamer acquisition, where the boat acquires data in one azimuth, has been very effective in imaging areas in fairly benign geologic settings. As the industry has moved into more complicated geologic settings, these surveys no longer meet the imaging objectives for risk reduction in exploration through production. In shallow water, we have seen increasing use of ocean bottom cables to meet this challenge. For deepwater, new breakthroughs in technology were required. This will be highlighted through examples of imaging below large salt bodies in the deep water Gulf of Mexico.
GoM - Mad Dog: The Mad Dog field is located approximately 140 miles south of the Louisiana coastline in the southern Green Canyon area in water depths between 4100 feet and 6000 feet. The complex salt canopy overlying a large portion of the field results in generally poor seismic data quality. Advanced processing techniques improved the image, but gaps still remained even after several years of effort. We concluded that wide azimuth acquisition was required to illuminate the field in a new way. Results from the Wide Azimuth Towed Streamer (WATS) survey deployed at Mad Dog demonstrated the anticipated improvement in the subsalt image.
GoM - Atlantis Field: An alternative approach to wide azimuth acquisition, ocean bottom seismic (OBS) node technology, was developed and tested. In 2001 deepwater practical experience was limited to a few nodes owned by academic institutions and there were no commercial solutions either available or in development. BP embarked on a program of sea trials designed to both
Seismic reflection constraints on the glacial dynamics of Johnsons Glacier, Antarctica
NASA Astrophysics Data System (ADS)
Benjumea, Beatriz; Teixidó, Teresa
2001-01-01
During two Antarctic summers (1996-1997 and 1997-1998), five seismic refraction and two reflection profiles were acquired on the Johnsons Glacier (Livingston Island, Antarctica) in order to obtain information about the structure of the ice, characteristics of the ice-bed contact and basement topography. An innovative technique has been used for the acquisition of reflection data to optimise the field survey schedule. Different shallow seismic sources were used during each field season: Seismic Impulse Source System (SISSY) for the first field survey and low-energy explosives (pyrotechnic noisemakers) during the second one. A comparison between these two shallow seismic sources has been performed, showing that the use of the explosives is a better seismic source in this ice environment. This is one of the first studies where this type of source has been used. The analysis of seismic data corresponding to one of the reflection profiles (L3) allows us to delineate sectors with different glacier structure (accumulation and ablation zones) without using glaciological data. Moreover, vertical discontinuities were detected by the presence of back-scattered energy and the abrupt change in frequency content of first arrivals shown in shot records. After the raw data analysis, standard processing led us to a clear seismic image of the underlying bed topography, which can be correlated with the ice flow velocity anomalies. The information obtained from seismic data on the internal structure of the glacier, location of fracture zones and the topography of the ice-bed interface constrains the glacial dynamics of Johnsons Glacier.
Imaging Fracture Networks Using Angled Crosshole Seismic Logging and Change Detection Techniques
NASA Astrophysics Data System (ADS)
Knox, H. A.; Grubelich, M. C.; Preston, L. A.; Knox, J. M.; King, D. K.
2015-12-01
We present results from a SubTER funded series of cross borehole geophysical imaging efforts designed to characterize fracture zones generated with an alternative stimulation method, which is being developed for Enhanced Geothermal Systems (EGS). One important characteristic of this stimulation method is that each detonation will produce multiple fractures without damaging the wellbore. To date, we have collected six full data sets with ~30k source-receiver pairs each for the purposes of high-resolution cross borehole seismic tomographic imaging. The first set of data serves as the baseline measurement (i.e. un-stimulated), three sets evaluate material changes after fracture emplacement and/or enhancement, and two sets are used for evaluation of pick error and seismic velocity changes attributable to changing environmental factors (i.e. saturation due to rain/snowfall in the shallow subsurface). Each of the six datasets has been evaluated for data quality and first arrivals have been picked on nearly 200k waveforms in the target area. Each set of data is then inverted using a Vidale-Hole finite-difference 3-D eikonal solver in two ways: 1) allowing for iterative ray tracing and 2) with fixed ray paths determined from the test performed before the fracture stimulation of interest. Utilizing these two methods allows us to compare and contrast the results from two commonly used change detection techniques. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
Characteristics of Swarm Seismicity in Northern California
NASA Astrophysics Data System (ADS)
Chiorini, S.; Lekic, V.
2017-12-01
Seismic swarms are characterized by an anomalously large number of earthquakes compared to the background rate of seismicity that are tightly clustered in space (typically, one to tens of kilometers) and time (typically, days to weeks). However, why and how swarms occur is poorly understood, partly because of the difficulty of identifying the range of swarm behaviors within large seismic catalogs. Previous studies have found that swarms, compared to other earthquake sequences, appear to be more common in extensional (Vidale & Shearer, 2006) and volcanic settings (Hayashi & Morita, 2003). In addition, swarms more commonly exhibit migration patterns, consistent with either fluid diffusion (Chen & Shearer, 2011; Chen et al., 2012) or aseismic creep (Lohman & McGuire, 2007), and are preferentially found in areas of enhanced heat flow (Enescu, 2009; Zaliapin & Ben Zion, 2016). While the swarm seismicity of Southern California has been studied extensively, that of Northern California has not been systematically documented and characterized. We employed two complementary methods of swarm identification: the approach of Vidale and Shearer (2006; henceforth VS2006) based on a priori threshold distances and timings of quakes, and the spatio-temporal distance metric proposed by Zaliapin et al. (2008; henceforth Z2008) in order to build a complete catalog of swarm seismicity in Northern California spanning 1984-2016 (Waldhauser & Schaff, 2008). Once filtered for aftershocks, the catalog allows us to describe the main features of swarm seismicity in Northern California, including spatial distribution, association or lack thereof with known faults and volcanic systems, and seismically quiescent regions. We then apply a robust technique to characterize the morphology of swarms, leading to subsets of swarms that are oriented either vertically or horizontally in space. When mapped, vertical swarms show a significant association with volcanic regions, and horizontal swarms with
NASA Astrophysics Data System (ADS)
Mendoza, M.; Ghosh, A.; Rai, S. S.
2017-12-01
The devastation brought on by the Mw 7.8 Gorkha earthquake in Nepal on 25 April 2015 re-awakened people to the high earthquake risk along the Himalayan arc. It is therefore imperative to learn from the Gorkha earthquake and gain a better understanding of the state of stress in this fault regime, in order to identify areas that could produce the next devastating earthquake. Here, we focus on what is known as the "central Himalaya seismic gap". It is located in Uttarakhand, India, west of Nepal, where a large (> Mw 7.0) earthquake has not occurred for over 200 years [Rajendran, C.P., & Rajendran, K., 2005]. This 500-800 km long along-strike seismic gap has been poorly studied, mainly due to the lack of modern and dense instrumentation. It is especially concerning since it lies close to densely populated cities, such as New Delhi. In this study, we analyze a rich seismic dataset from a dense network of 50 broadband stations that operated between 2005 and 2012. We use the STA/LTA filter technique to detect earthquake phases, and the latest tools contributed to the Antelope software environment, to develop a large and robust earthquake catalog containing thousands of precise hypocentral locations, magnitudes, and focal mechanisms. By refining those locations with relative relocation in HypoDD [Waldhauser & Ellsworth, 2000] to form tighter clusters of events, we can potentially illustrate fault structures in this region at high resolution. Additionally, using ZMAP [Wiemer, S., 2001], we perform a variety of statistical analyses to understand the variability and nature of seismicity occurring in the region. Generating a large and consistent earthquake catalog not only brings to light the physical processes controlling the earthquake cycle in a Himalayan seismogenic zone, it also illustrates how stresses are building up along the décollement and the faults that stem from it. With this new catalog, we aim to reveal fault structure, study
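As a point of reference for the detection step mentioned above, here is a minimal STA/LTA characteristic-function sketch; the window lengths and trigger threshold are illustrative, not those used in the study (which relies on the Antelope environment).

```python
import numpy as np

def sta_lta(trace, fs, sta_win=1.0, lta_win=30.0):
    """Classic STA/LTA characteristic function on a squared-amplitude trace.
    sta_win / lta_win are window lengths in seconds (illustrative values)."""
    nsta, nlta = int(sta_win * fs), int(lta_win * fs)
    energy = trace.astype(float) ** 2
    # trailing moving averages (each window ends at the current sample)
    sta = np.convolve(energy, np.ones(nsta) / nsta, mode="full")[: len(energy)]
    lta = np.convolve(energy, np.ones(nlta) / nlta, mode="full")[: len(energy)]
    ratio = np.zeros_like(energy)
    ratio[nlta:] = sta[nlta:] / (lta[nlta:] + 1e-12)   # defined once the LTA window is full
    return ratio

# synthetic example: background noise with a transient "phase arrival" at t = 60 s
fs = 100.0
sig = np.random.randn(int(120 * fs))
sig[6000:6200] += 8 * np.random.randn(200)
cf = sta_lta(sig, fs)
triggers = np.where(cf > 4.0)[0]                        # on-threshold of 4 (assumed)
print("first trigger at t = %.2f s" % (triggers[0] / fs) if len(triggers) else "no trigger")
```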
Seismic While Drilling Case Study in Shengli Oilfield, Eastern China
NASA Astrophysics Data System (ADS)
Wang, L.; Liu, H.; Tong, S.; Zou, Z.
2015-12-01
Seismic while drilling (SWD) is a promising borehole seismic technique offering reduced drilling risk, cost savings and increased efficiency. To evaluate the technical and economic benefits of this technique, we carried out an SWD survey at well G130 in the Shengli Oilfield of Eastern China. Well G130 is an evaluation well, located in the Dongying depression, with a depth of more than 3500 m. We used an array of portable seismometers to record the surface SWD data throughout the drilling process. The pilot signal was recorded continuously by an accelerometer mounted on the top of the drill string. Two additional seismometers were buried in the drill yard, one near the diesel engine and another near the derrick; all of these data were also recorded continuously. Guided by the mud logging data, we processed and analyzed the complete dataset. The analysis shows that drill-yard noise is the dominant component of the surface wavefield, with a dominant frequency of about 20 Hz. Cross-correlation of the surface records with the pilot signal yields a very low signal-to-noise ratio, with no obvious drill-bit events. Fortunately, the autocorrelation of the pilot signal shows clear BHA and drill-string multiples, and the period of the drill-string multiple can be used to establish the reference time (the so-called zero time). We identified and removed different noise types from the surface SWD data by taking advantage of wavefield analysis. The drill-bit signal was then retrieved from the surface SWD data using seismic interferometry, and a reverse vertical seismic profile (RVSP) dataset was built for the continuous drilling depths. The subsurface images derived from these data compare well with the corresponding images of the 3D surface seismic survey across the well.
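The pilot cross-correlation step described above can be illustrated with a minimal sketch: correlating a surface record with the pilot signal compresses the extended drill-bit sweep into an impulsive arrival (in this synthetic example a coherent arrival is recoverable; in the field case above it was not). The sampling rate, delay and amplitudes are assumptions.

```python
import numpy as np
from scipy.signal import fftconvolve

def pilot_correlate(surface, pilot, fs, max_lag_s=5.0):
    """Cross-correlate a surface SWD record with the pilot (top-drive) signal.
    Returns lags (s) and the correlogram; a coherent drill-bit arrival should
    appear as a peak near the bit-to-geophone travel time."""
    surface = (surface - surface.mean()) / (surface.std() + 1e-12)
    pilot = (pilot - pilot.mean()) / (pilot.std() + 1e-12)
    cc = fftconvolve(surface, pilot[::-1], mode="full") / len(pilot)
    lags = (np.arange(len(cc)) - (len(pilot) - 1)) / fs
    keep = np.abs(lags) <= max_lag_s
    return lags[keep], cc[keep]

# synthetic example: the "surface" trace contains a delayed, attenuated copy of
# the pilot buried in noise (np.roll wraps the trace, acceptable for this toy case)
fs = 500.0
pilot = np.random.randn(int(60 * fs))
delay = int(0.8 * fs)                                  # assumed 0.8 s bit-to-surface delay
surface = 0.05 * np.roll(pilot, delay) + np.random.randn(len(pilot))
lags, cc = pilot_correlate(surface, pilot, fs)
print("peak correlation at lag %.2f s" % lags[np.argmax(cc)])
```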
NASA Astrophysics Data System (ADS)
Derode, B.; Riquelme, S.; Ruiz, J. A.; Leyton, F.; Campos, J. A.; Delouis, B.
2014-12-01
Intermediate-depth earthquakes of high moment magnitude (Mw ≥ 8) in Chile have had a relatively greater impact, in terms of damage, injuries and deaths, than thrust-type events of similar magnitude (e.g. 1939, 1950, 1965, 1997, 2003, and 2005). Some of them have been studied in detail, showing a paucity of aftershocks, down-dip tensional focal mechanisms, high stress drop and subhorizontal rupture. At present, their physical mechanism remains unclear because ambient temperatures and pressures are expected to lead to ductile, rather than brittle, deformation. We examine the source characteristics of more than 100 intraslab intermediate-depth earthquakes using local and regional waveform data obtained from broadband and accelerometer stations of the IPOC network in northern Chile. With this high-quality database, we estimated the total radiated energy from the energy flux carried by P and S waves, integrating this flux in time and space, and evaluated the seismic moment directly from both spectral amplitude and near-field waveform inversion methods. We estimated the three parameters Ea, τa and M0 because their estimates entail no model dependence. Interestingly, the seismic nest studied using near-field relocation and only data from stations close to the source (D < 250 km) appears not to be homogeneous in terms of depth, displaying unusual seismic gaps along the Wadati-Benioff zone. Moreover, as confirmed by other studies of intermediate-depth earthquakes in subduction zones, very high stress drops (>> 10 MPa) and low radiation efficiency were found in this seismic nest. These unusual seismic parameter values can be interpreted as the expression of the loss of a large fraction of the emitted energy through heating processes during rupture. Although it remains difficult to draw conclusions about the processes of seismic nucleation, we present here results that seem to support a thermal weakening behavior of the fault zones and the existence of thermal stress processes like thermal
Seismicity of the Adriatic microplate
Console, R.; Di Giovambattista, R.; Favali, P.; Presgrave, B.W.; Smriglio, G.
1993-01-01
The Adriatic microplate was previously considered to be a single block, tectonically active only along its margins. The seismic sequences that took place in the basin from 1986 to 1990 give new information about the geodynamics of this area. Three subsets of well recorded events were relocated by the joint hypocentre determination technique. On the whole, this seismic activity was concentrated in a belt crossing the southern Adriatic Sea around latitude 42°, in connection with regional E-W fault systems. Some features of this seismicity, similar to those observed in other well known active margins of the Adriatic plate, support a model of a southern Adriatic lithospheric block, detached from the northern one. Other geophysical information provides evidence of a transitional zone at the same latitude.
NASA Astrophysics Data System (ADS)
Trugman, Daniel Taylor
The complexity of the earthquake rupture process makes earthquakes inherently unpredictable. Seismic hazard forecasts often presume that the rate of earthquake occurrence can be adequately modeled as a space-time homogeneous or stationary Poisson process and that the relation between the dynamical source properties of small and large earthquakes obeys self-similar scaling relations. While these simplified models provide useful approximations and encapsulate the first-order statistical features of the historical seismic record, they are inconsistent with the complexity underlying earthquake occurrence and can lead to misleading assessments of seismic hazard when applied in practice. The six principal chapters of this thesis explore the extent to which the behavior of real earthquakes deviates from these simplified models, and the implications that the observed deviations have for our understanding of earthquake rupture processes and seismic hazard. Chapter 1 provides a brief thematic overview and introduction to the scope of this thesis. Chapter 2 examines the complexity of the 2010 M7.2 El Mayor-Cucapah earthquake, focusing on the relation between its unexpected and unprecedented occurrence and anthropogenic stresses from the nearby Cerro Prieto Geothermal Field. Chapter 3 compares long-term changes in seismicity within California's three largest geothermal fields in an effort to characterize the relative influence of natural and anthropogenic stress transients on local seismic hazard. Chapter 4 describes a hybrid, hierarchical clustering algorithm that can be used to relocate earthquakes using waveform cross-correlation, and applies the new algorithm to study the spatiotemporal evolution of two recent seismic swarms in western Nevada. Chapter 5 describes a new spectral decomposition technique that can be used to analyze the dynamic source properties of large datasets of earthquakes, and applies this approach to revisit the question of self-similar scaling of
NASA Astrophysics Data System (ADS)
Garcia, Alicia; Fernandez-Ros, Alberto; Berrocoso, Manuel; Marrero, Jose Manuel; Prates, Gonçalo; De la Cruz-Reyna, Servando; Ortiz, Ramon
2014-05-01
In July 2011 at El Hierro (Canary Islands, Spain), volcanic unrest was detected, with significant deformations followed by increased seismicity. A submarine eruption started on 10 October 2011 and ceased on 5 March 2012, after the volcanic tremor signals persistently weakened through February 2012. However, the seismic activity did not end with the eruption, as several other seismic crises have followed since. The seismic episodes presented a characteristic pattern: over a few days the number and magnitude of seismic events increased persistently, culminating in events severe enough to be felt all over the island. In all cases the seismic activity was preceded by significant deformations measured on the island's surface that continued during the whole episode. Analysis of the available GNSS-GPS and seismic data suggests that several magma injection processes occurred at depth from the beginning of the unrest. A model combining the geometry of the magma injection process and the variations in released seismic energy has allowed successful forecasting of the new-vent opening. The model presented here places special emphasis on phenomena associated with moderate eruptions, as well as on volcano-tectonic earthquakes and landslides, which in some cases, as at El Hierro, may be more destructive than an eruption itself.
[Advance in interferogram data processing technique].
Jing, Juan-Juan; Xiangli, Bin; Lü, Qun-Bo; Huang, Min; Zhou, Jin-Song
2011-04-01
Fourier transform spectrometry is a novel information-acquisition technology that integrates imaging and spectroscopy. However, the instrument acquires interferometric data of the target, an intermediate product that cannot be used directly, so data processing is required before the interferometric data can be applied successfully. In the present paper, data processing techniques are divided into two classes: general-purpose and special-type. First, advances in general-purpose interferometric data processing techniques are reviewed; then the special-type interferometric data extraction and processing techniques are described according to the classification of Fourier transform spectroscopy. Finally, trends in interferogram data processing techniques are discussed.
Application of Seismic Array Processing to Tsunami Early Warning
NASA Astrophysics Data System (ADS)
An, C.; Meng, L.
2015-12-01
Tsunami wave predictions by current tsunami warning systems rely on accurate earthquake source inversions of wave height data. They are of limited effectiveness for near-field areas since the tsunami waves arrive before data are collected. Recent seismic and tsunami disasters have revealed the need for early warning to protect near-source coastal populations. In this work we developed the basis for a tsunami warning system based on rapid earthquake source characterisation through regional seismic array back-projections. We explored rapid earthquake source imaging using onshore dense seismic arrays located at regional distances on the order of 1000 km, which provides faster source images than conventional teleseismic back-projections. We implemented this method in a simulated real-time environment, and analysed the 2011 Tohoku earthquake rupture with two clusters of Hi-net stations in Kyushu and Northern Hokkaido, and the 2014 Iquique event with the Earthscope USArray Transportable Array. The results yield reasonable estimates of rupture area, which is approximated by an ellipse and leads to the construction of simple slip models based on empirical scaling of the rupture area, seismic moment and average slip. The slip model is then used as the input of the tsunami simulation package COMCOT to predict the tsunami waves. In the example of the Tohoku event, the earthquake source model can be acquired within 6 minutes from the start of rupture and the simulation of tsunami waves takes less than 2 min, which could facilitate a timely tsunami warning. The predicted arrival time and wave amplitude reasonably fit observations. Based on this method, we propose to develop an automatic warning mechanism that provides rapid near-field warning for areas of high tsunami risk. The initial focus will be Japan, Pacific Northwest and Alaska, where dense seismic networks with the capability of real-time data telemetry and open data accessibility, such as the Japanese HiNet (>800
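For illustration, a toy shift-and-stack back-projection in a homogeneous medium is sketched below; an operational implementation would use 1-D Earth travel-time tables, station corrections and higher-order stacking, so every parameter here is an assumption.

```python
import numpy as np

def back_project(waveforms, st_xy, grid_x, grid_y, fs, v=6.0, t0=0.0, win=5.0):
    """Toy shift-and-stack back-projection: for each grid node, delay each
    station's envelope by the node-to-station travel time (homogeneous
    velocity v, km/s) and stack over a short window after the assumed origin
    time t0. Returns the stacked-energy map."""
    env = np.abs(waveforms)                           # crude envelope (|amplitude|)
    nwin = int(win * fs)
    power = np.zeros((len(grid_y), len(grid_x)))
    for iy, gy in enumerate(grid_y):
        for ix, gx in enumerate(grid_x):
            stack = np.zeros(nwin)
            for (sx, sy), tr in zip(st_xy, env):
                tt = np.hypot(gx - sx, gy - sy) / v   # travel time (s)
                seg = tr[int((t0 + tt) * fs):int((t0 + tt) * fs) + nwin]
                stack[: len(seg)] += seg
            power[iy, ix] = np.sum(stack ** 2)
    return power

# synthetic test: one source at (30, 40) km recorded by 4 stations
fs, v = 20.0, 6.0
stations = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
src = np.array([30.0, 40.0])
data = 0.1 * np.random.randn(len(stations), int(60 * fs))
for k, s in enumerate(stations):
    i_arr = int(np.hypot(*(src - s)) / v * fs)
    data[k, i_arr:i_arr + 20] += 1.0                  # impulsive arrival
gx = gy = np.arange(0.0, 101.0, 5.0)
img = back_project(data, stations, gx, gy, fs, v=v)
iy, ix = np.unravel_index(np.argmax(img), img.shape)
print("peak at x=%.0f km, y=%.0f km" % (gx[ix], gy[iy]))
```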
Seismic instrumentation plan for the Hawaiian Volcano Observatory
Thelen, Weston A.
2014-01-01
The installation of new seismic stations is only the first part of building a volcanic early warning capability for seismicity in the State of Hawaii. Additional personnel will likely be required to study the volcanic processes at work under each volcano, analyze the current seismic activity at a level sufficient for early warning, build new tools for monitoring, maintain seismic computing resources, and maintain the new seismic stations.
NASA Astrophysics Data System (ADS)
Panea, I.; Stephenson, R.; Knapp, C.; Mocanu, V.; Drijkoningen, G.; Matenco, L.; Knapp, J.; Prodehl, K.
2005-12-01
The DACIA PLAN (Danube and Carpathian Integrated Action on Processes in the Lithosphere and Neotectonics) deep seismic sounding survey was performed in August-September 2001 in south-eastern Romania, at the same time as the regional deep refraction seismic survey VRANCEA 2001. The main goal of the experiment was to obtain new information on the deep structure of the external Carpathians nappes and the architecture of Tertiary/Quaternary basins developed within and adjacent to the seismically-active Vrancea zone, including the Focsani Basin. The seismic reflection line had a WNW-ESE orientation, running from internal East Carpathians units, across the mountainous south-eastern Carpathians, and the foreland Focsani Basin towards the Danube Delta. There were 131 shot points along the profile, with about 1 km spacing, and data were recorded with stand-alone RefTek-125s (also known as "Texans"), supplied by the University of Texas at El Paso and the PASSCAL Institute. The entire line was recorded in three deployments, using about 340 receivers in the first deployment and 640 receivers in each of the other two deployments. The resulting deep seismic reflection stacks, processed to 20 s along the entire profile and to 10 s in the eastern Focsani Basin, are presented here. The regional architecture of the latter, interpreted in the context of abundant independent constraint from exploration seismic and subsurface data, is well imaged. The image within and beneath the thrust belt is of much poorer quality. Nevertheless, there is good evidence to suggest that a thick (~10 km) sedimentary basin having the structure of a graben and of indeterminate age underlies the westernmost part of the Focsani Basin, in the depth range 10-25 km. Most of the crustal depth seismicity observed in the Vrancea zone (as opposed to the more intense upper mantle seismicity) appears to be associated with this sedimentary basin. The sedimentary successions within this basin and other horizons
A Comparative of business process modelling techniques
NASA Astrophysics Data System (ADS)
Tangkawarow, I. R. H. T.; Waworuntu, J.
2016-04-01
There are now many business process modelling techniques, and this article investigates their differences. This paper presents a comparative analysis of some popular business process modelling techniques; for each technique, the definition and structure are explained. The comparative framework is based on two criteria: notation and how the technique works when implemented in Somerleyton Animal Park. Each technique is presented together with its advantages and disadvantages. The final conclusion recommends business process modelling techniques that are easy to use and serves as the basis for evaluating further modelling techniques.
ON-SITE CAVITY LOCATION-SEISMIC PROFILING AT NEVADA TEST SITE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Forbes, C.B.; Peterson, R.A.; Heald, C.L.
1961-10-25
Experimental seismic studies were conducted at the Nevada Test Site for the purpose of designing and evaluating the most promising seismic techniques for on-site inspection. Post-explosion seismic profiling was done in volcanic tuff in the vicinity of the Rainier and Blanca underground explosions. Pre-explosion seismic profiling was done over granitic rock outcrops in the Climax Stock area, and over tuff at the proposed locations for Linen and Orchid. Near-surface velocity profiling techniques based on measurements of seismic time-distance curves gave evidence of disturbances in near-surface rock velocities over the Rainier and Blanca sites (refer also to abstract 30187). These disturbances appear to be related to near-surface fracturing and spallation effects resulting from the reflection of the original intense compression wave pulse at the near surface as a tension pulse. Large tuned seismometer arrays were used for horizontal seismic ranging in an attempt to record "back-scattered" or reflected seismic waves from subsurface cavities or zones of rock fracturing around the underground explosions. Some possible seismic events were recorded from the near vicinities of the Rainier and Blanca sites. However, many more similar events were recorded from numerous other locations, presumably originating from naturally occurring underground geological features. No means was found for discriminating between artificial and natural events recorded by horizontal seismic ranging, and the results were therefore not immediately useful for inspection purposes. It is concluded that in some instances near-surface velocity profiling methods may provide a useful tool in verifying the presence of spalled zones above underground nuclear explosion sites. In the case of horizontal seismic ranging, it appears that successful application would require development of satisfactory means for recognition of and discrimination against seismic responses to naturally occurring geological
Seismic Methods of Identifying Explosions and Estimating Their Yield
NASA Astrophysics Data System (ADS)
Walter, W. R.; Ford, S. R.; Pasyanos, M.; Pyle, M. L.; Myers, S. C.; Mellors, R. J.; Pitarka, A.; Rodgers, A. J.; Hauk, T. F.
2014-12-01
Seismology plays a key national security role in detecting, locating, identifying and determining the yield of explosions from a variety of causes, including accidents, terrorist attacks and nuclear testing treaty violations (e.g. Koper et al., 2003, 1999; Walter et al. 1995). A collection of mainly empirical forensic techniques has been successfully developed over many years to obtain source information on explosions from their seismic signatures (e.g. Bowers and Selby, 2009). However, a lesson from the three declared DPRK nuclear explosions since 2006 is that our historic collection of data may not be representative of future nuclear test signatures (e.g. Selby et al., 2012). To have confidence in identifying future explosions amongst the background of other seismic signals, and accurately estimate their yield, we need to put our empirical methods on a firmer physical footing. Goals of current research are to improve our physical understanding of the mechanisms of explosion generation of S- and surface-waves, and to advance our ability to numerically model and predict them. As part of that process we are re-examining regional seismic data from a variety of nuclear test sites including the DPRK and the former Nevada Test Site (now the Nevada National Security Site (NNSS)). Newer relative location and amplitude techniques can be employed to better quantify differences between explosions and used to understand those differences in terms of depth, media and other properties. We are also making use of the Source Physics Experiments (SPE) at NNSS. The SPE chemical explosions are explicitly designed to improve our understanding of emplacement and source material effects on the generation of shear and surface waves (e.g. Snelson et al., 2013). Finally we are also exploring the value of combining seismic information with other technologies including acoustic and InSAR techniques to better understand the source characteristics. Our goal is to improve our explosion models
NASA Astrophysics Data System (ADS)
Zhao, Qi
Rock failure process is a complex phenomenon that involves elastic and plastic deformation, microscopic cracking, macroscopic fracturing, and frictional slipping of fractures. Understanding this complex behaviour has been the focus of a significant amount of research. In this work, the combined finite-discrete element method (FDEM) was first employed to study (1) the influence of rock discontinuities on hydraulic fracturing and associated seismicity and (2) the influence of in-situ stress on seismic behaviour. Simulated seismic events were analyzed using post-processing tools including frequency-magnitude distribution (b-value), spatial fractal dimension (D-value), seismic rate, and fracture clustering. These simulations demonstrated that at the local scale, fractures tended to propagate following the rock mass discontinuities; while at reservoir scale, they developed in the direction parallel to the maximum in-situ stress. Moreover, seismic signature (i.e., b-value, D-value, and seismic rate) can help to distinguish different phases of the failure process. The FDEM modelling technique and developed analysis tools were then coupled with laboratory experiments to further investigate the different phases of the progressive rock failure process. Firstly, a uniaxial compression experiment, monitored using a time-lapse ultrasonic tomography method, was carried out and reproduced by the numerical model. Using this combination of technologies, the entire deformation and failure processes were studied at macroscopic and microscopic scales. The results not only illustrated the rock failure and seismic behaviours at different stress levels, but also suggested several precursory behaviours indicating the catastrophic failure of the rock. Secondly, rotary shear experiments were conducted using a newly developed rock physics experimental apparatus (ERDμ-T) that was paired with X-ray micro-computed tomography (μCT). This combination of technologies has significant advantages
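As an example of the frequency-magnitude analysis mentioned above, a minimal maximum-likelihood b-value estimator (Aki, 1965, with the usual half-bin correction) is sketched below; the completeness magnitude and toy catalog are illustrative assumptions.

```python
import numpy as np

def b_value_mle(mags, mc, dm=0.1):
    """Aki (1965) maximum-likelihood b-value for magnitudes >= completeness mc,
    with the standard half-bin correction for magnitudes binned at dm."""
    m = np.asarray(mags)
    m = m[m >= mc]
    b = np.log10(np.e) / (m.mean() - (mc - dm / 2.0))
    b_err = b / np.sqrt(len(m))                 # first-order uncertainty estimate
    return b, b_err, len(m)

# toy catalog drawn from a Gutenberg-Richter distribution with b = 1.1 and Mc ~ -0.5
rng = np.random.default_rng(1)
mags = np.round(-np.log10(rng.uniform(size=5000)) / 1.1 - 0.5, 1)
b, err, n = b_value_mle(mags, mc=-0.5)
print("b = %.2f +/- %.2f from %d events" % (b, err, n))
```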
A Comparative Case Study of Reflection Seismic Imaging Method
NASA Astrophysics Data System (ADS)
Alamooti, M.; Aydin, A.
2017-12-01
Seismic imaging is the most common means of gathering information about subsurface structural features. The accuracy of seismic images may be highly variable depending on the complexity of the subsurface and on how seismic data is processed. One of the crucial steps in this process, especially in layered sequences with complicated structure, is the time and/or depth migration of seismic data. The primary purpose of the migration is to increase the spatial resolution of seismic images by repositioning the recorded seismic signal back to its original point of reflection in time/space, which enhances information about complex structure. In this study, our objective is to process a seismic data set (courtesy of the University of South Carolina) to generate an image on which the Magruder fault near Allendale, SC can be clearly distinguished and its attitude can be accurately depicted. The data was gathered by the common mid-point method with 60 geophones equally spaced along an approximately 550 m long traverse over nearly flat ground. The results obtained from the application of different migration algorithms (including finite-difference and Kirchhoff) are compared in time and depth domains to investigate the efficiency of each algorithm in reducing the processing time and improving the accuracy of seismic images in reflecting the correct position of the Magruder fault.
The Seismic Tool-Kit (STK): an open source software for seismology and signal processing.
NASA Astrophysics Data System (ADS)
Reymond, Dominique
2016-04-01
We present an open source software project (GNU public license), named STK: Seismic ToolKit, dedicated mainly to seismology and signal processing. The STK project, started in 2007, is hosted by SourceForge.net and counts more than 19,500 downloads at the time of writing. The STK project is composed of two main branches. The first is a graphical interface dedicated to signal processing in the SAC format (SAC_ASCII and SAC_BIN), where the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is provided). The spectral density of the signal is estimated via the Fourier transform, with visualization of the Power Spectral Density (PSD) in linear or log scale, and also the evolutive time-frequency representation (or sonagram). Three-component signals can also be processed to estimate their polarization properties, either for a given window or for evolutive windows along the time axis. This polarization analysis is useful for extracting polarized noise and for differentiating P waves, Rayleigh waves, Love waves, etc. The second branch is a panel of utility programs for working in terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter/auto-correlation, spectral density, time-frequency analysis for an entire directory of signals, focal planes and main component axes, radiation patterns of P waves, polarization analysis of different waves (including noise), under/over-sampling of signals, cubic-spline smoothing, and linear/non-linear regression analysis of data sets. A MINimum library of Linear AlGebra (MIN-LINAG) is also provided for computing the main matrix operations, such as QR/QL decomposition, Cholesky solution of linear systems, finding eigenvalues/eigenvectors, and QR-solve/Eigen-solve of linear equation systems. STK is developed in C/C++, mainly under Linux OS, and it has also been partially implemented under MS-Windows. Useful link: http://sourceforge.net/projects/seismic
100 years of seismic research on the Moho
NASA Astrophysics Data System (ADS)
Prodehl, Claus; Kennett, Brian; Artemieva, Irina M.; Thybo, Hans
2013-12-01
The detection of a seismic boundary, the “Moho”, between the outermost shell of the Earth, the Earth's crust, and the Earth's mantle by A. Mohorovičić was the consequence of increased insight into the propagation of seismic waves caused by earthquakes. This short history of seismic research on the Moho is primarily based on the comprehensive overview of the worldwide history of seismological studies of the Earth's crust using controlled sources from 1850 to 2005, by Prodehl and Mooney (2012). Though the art of applying explosions, so-called “artificial events”, as energy sources for studies of the uppermost crustal layers began in the early 1900s, its effective use for studying the entire crust only began at the end of World War II. From 1945 onwards, controlled-source seismology has been the major approach to study details of the crust and underlying crust-mantle boundary, the Moho. The subsequent description of history of controlled-source crustal seismology and its seminal results is subdivided into separate chapters for each decade, highlighting the major advances achieved during that decade in terms of data acquisition, processing technology, and interpretation methods. Since the late 1980s, passive seismology using distant earthquakes has played an increasingly important role in studies of crustal structure. The receiver function technique exploiting conversions between P and SV waves at discontinuities in seismic wavespeed below a seismic station has been extensively applied to the increasing numbers of permanent and portable broad-band seismic stations across the globe. Receiver function studies supplement controlled source work with improved geographic coverage and now make a significant contribution to knowledge of the nature of the crust and the depth to Moho.
Reconstructing the Seismic Wavefield using Curvelets and Distributed Acoustic Sensing
NASA Astrophysics Data System (ADS)
Muir, J. B.; Zhan, Z.
2017-12-01
Distributed Acoustic Sensing (DAS) offers an opportunity to produce cost effective and uniquely dense images of the surface seismic wavefield - DAS also produces extremely large data volumes that require innovative methods of data reduction and seismic parameter inversion to handle efficiently. We leverage DAS and the super-Nyquist sampling enabled by compressed sensing of the wavefield in the curvelet domain to produce accurate images of the horizontal velocity within a target region, using only short ( 1-10 minutes) records of either active seismic sources or ambient seismic signals. Once the wavefield has been fully described, modern "tomographic" techniques, such as Helmholtz tomography or Wavefield Gradiometry, can be employed to determine seismic parameters of interest such as phase velocity. An additional practical benefit of employing a wavefield reconstruction step is that multiple heterogeneous forms of instrumentation can be naturally combined - therefore in this study we also explore the addition of three component nodal seismic data into the reconstructed wavefield. We illustrate these techniques using both synthetic examples and data taken from the Brady Geothermal Field in Nevada during the PoroTomo (U. Wisconsin Madison) experiment of 2016.
Seismic and Biological Sources of Ambient Ocean Sound
NASA Astrophysics Data System (ADS)
Freeman, Simon Eric
Sound is the most efficient radiation in the ocean. Sounds of seismic and biological origin contain information regarding the underlying processes that created them. A single hydrophone records summary time-frequency information from the volume within acoustic range. Beamforming using a hydrophone array additionally produces azimuthal estimates of sound sources. A two-dimensional array and acoustic focusing produce an unambiguous two-dimensional `image' of sources. This dissertation describes the application of these techniques in three cases. The first utilizes hydrophone arrays to investigate T-phases (water-borne seismic waves) in the Philippine Sea. Ninety T-phases were recorded over a 12-day period, implying a greater number of seismic events occur than are detected by terrestrial seismic monitoring in the region. Observation of an azimuthally migrating T-phase suggests that reverberation of such sounds from bathymetric features can occur over megameter scales. In the second case, single hydrophone recordings from coral reefs in the Line Islands archipelago reveal that local ambient reef sound is spectrally similar to sounds produced by small, hard-shelled benthic invertebrates in captivity. Time-lapse photography of the reef reveals an increase in benthic invertebrate activity at sundown, consistent with an increase in sound level. The dominant acoustic phenomenon on these reefs may thus originate from the interaction between a large number of small invertebrates and the substrate. Such sounds could be used to take census of hard-shelled benthic invertebrates that are otherwise extremely difficult to survey. A two-dimensional `map' of sound production over a coral reef in the Hawaiian Islands was obtained using two-dimensional hydrophone array in the third case. Heterogeneously distributed bio-acoustic sources were generally co-located with rocky reef areas. Acoustically dominant snapping shrimp were largely restricted to one location within the area surveyed
Properties of the seismic nucleation phase
Beroza, G.C.; Ellsworth, W.L.
1996-01-01
Near-source observations show that earthquakes begin abruptly at the P-wave arrival, but that this beginning is weak, with a low moment rate relative to the rest of the main shock. We term this initial phase of low moment rate the seismic nucleation phase. We have observed the seismic nucleation phase for a set of 48 earthquakes ranging in magnitude from 1.1 to 8.1. The size and duration of the seismic nucleation phase scale with the total seismic moment of the earthquake, suggesting that the process responsible for the seismic nucleation phase carries information about the eventual size of the earthquake. The seismic nucleation phase is characteristically followed by quadratic growth in the moment rate, consistent with self-similar rupture at constant stress drop. In this paper we quantify the properties of the seismic nucleation phase and offer several possible explanations for it.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zharkov, S.; Matthews, S. A.; Zharkova, V. V.
2011-10-01
The first observations of seismic responses to solar flares were carried out using time-distance (TD) and holography techniques applied to SOHO/Michelson Doppler Imager (MDI) Dopplergrams obtained from space and unaffected by terrestrial atmospheric disturbances. However, the ground-based network GONG is potentially a very valuable source of sunquake observations, especially in cases where space observations are unavailable. In this paper, we present an updated technique for pre-processing of GONG observations for the application of subjacent vantage holography. Using this method and TD diagrams, we investigate several sunquakes observed in association with M- and X-class solar flares and compare the outcomes with those reported earlier using MDI data. In both GONG and MDI data sets, for the first time, we also detect the TD ridge associated with the 2001 September 9 flare. Our results show reassuringly positive identification of sunquakes from GONG data that can provide further information about the physics of seismic processes associated with solar flares.
NASA Astrophysics Data System (ADS)
Tibuleac, I. M.; Iovenitti, J. L.; Pullammanappallil, S. K.; von Seggern, D. H.; Ibser, H.; Shaw, D.; McLachlan, H.
2015-12-01
A new, cost effective and non-invasive exploration method using ambient seismic noise has been tested at Soda Lake, NV, with promising results. Seismic interferometry was used to extract Green's Functions (P and surface waves) from 21 days of continuous ambient seismic noise. With the advantage of S-velocity models estimated from surface waves, an ambient noise seismic reflection survey along a line (named Line 2), although with lower resolution, reproduced the results of the active survey, when the ambient seismic noise was not contaminated by strong cultural noise. Ambient noise resolution was less at depth (below 1000m) compared to the active survey. Useful information could be recovered from ambient seismic noise, including dipping features and fault locations. Processing method tests were developed, with potential to improve the virtual reflection survey results. Through innovative signal processing techniques, periods not typically analyzed with high frequency sensors were used in this study to obtain seismic velocity model information to a depth of 1.4km. New seismic parameters such as Green's Function reflection component lateral variations, waveform entropy, stochastic parameters (Correlation Length and Hurst number) and spectral frequency content extracted from active and passive surveys showed potential to indicate geothermal favorability through their correlation with high temperature anomalies, and showed potential as fault indicators, thus reducing the uncertainty in fault identification. Geothermal favorability maps along ambient seismic Line 2 were generated considering temperature, lithology and the seismic parameters investigated in this study and compared to the active Line 2 results. Pseudo-favorability maps were also generated using only the seismic parameters analyzed in this study.
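The Green's function extraction underlying this kind of survey can be sketched in a few lines: cross-correlate whitened noise windows between two stations and stack. The whitening choice, window length and synthetic traces below are assumptions, not the processing actually used at Soda Lake.

```python
import numpy as np

def noise_crosscorrelation(tr1, tr2, fs, win_s=3600.0, max_lag_s=120.0):
    """Stacked cross-correlation of ambient noise between two stations.
    Each window is mean-removed and spectrally whitened (unit amplitude
    spectrum) before a circular FFT correlation; the stack approximates the
    inter-station Green's function (causal and acausal branches)."""
    nwin, nlag = int(win_s * fs), int(max_lag_s * fs)
    nseg = min(len(tr1), len(tr2)) // nwin
    stack = np.zeros(2 * nlag + 1)
    for k in range(nseg):
        a = tr1[k * nwin:(k + 1) * nwin] - tr1[k * nwin:(k + 1) * nwin].mean()
        b = tr2[k * nwin:(k + 1) * nwin] - tr2[k * nwin:(k + 1) * nwin].mean()
        fa, fb = np.fft.rfft(a), np.fft.rfft(b)
        fa /= (np.abs(fa) + 1e-12)                 # spectral whitening
        fb /= (np.abs(fb) + 1e-12)
        cc = np.fft.irfft(fa * np.conj(fb), n=nwin)
        stack += np.concatenate((cc[-nlag:], cc[:nlag + 1]))   # lags -nlag..+nlag
    return np.arange(-nlag, nlag + 1) / fs, stack / max(nseg, 1)

# usage sketch (tr1, tr2 would be long continuous traces from two stations)
fs = 50.0
tr1, tr2 = np.random.randn(int(8 * 3600 * fs)), np.random.randn(int(8 * 3600 * fs))
lags, ccf = noise_crosscorrelation(tr1, tr2, fs)
print("stacked correlation has", len(ccf), "lag samples")
```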
NASA Astrophysics Data System (ADS)
Tsuji, T.; Ikeda, T.; Nimiya, H.
2017-12-01
We report spatio-temporal variations of seismic velocity around the seismogenic faults in western Japan. We mainly focus on the seismic velocity variation during (1) the 2016 Off-Mie earthquake in the Nankai subduction zone (Mw5.8) and (2) the 2016 Kumamoto earthquake in Kyushu Island (Mw7.0). We applied seismic interferometry and surface wave analysis to the ambient noise data recorded by Hi-net and DONET seismometers of National Research Institute for Earth Science and Disaster Resilience (NIED). Seismic velocity near the rupture faults and volcano decreased during the earthquake. For example, we observed velocity reduction around the seismogenic Futagawa-Hinagu fault system and Mt Aso in the 2016 Kumamoto earthquake. We also identified velocity increase after the eruptions of Mt Aso. During the 2016 Off-Mie earthquake, we observed seismic velocity variation in the Nankai accretionary prism. After the earthquakes, the seismic velocity gradually returned to the pre-earthquake value. The velocity recovering process (healing process) is caused by several mechanisms, such as pore pressure reduction, strain change, and crack sealing. By showing the velocity variations obtained at different geologic settings (volcano, seismogenic fault, unconsolidated sediment), we discuss the mechanism of seismic velocity variation as well as the post-seismic fault healing process.
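One common way to quantify such velocity variations is the stretching technique applied to the coda of noise correlation functions; a minimal sketch follows. The sign convention, stretch-factor grid and synthetic waveforms are assumptions, and the study's actual processing may differ.

```python
import numpy as np

def stretching_dvv(ref, cur, fs, eps_max=0.01, n_eps=201):
    """Estimate a relative velocity change dv/v with the stretching technique:
    resample the current trace on time axes t*(1+eps) and keep the eps that
    maximizes correlation with the reference. Under the convention that a
    velocity increase shortens lapse times, dv/v = -eps_best."""
    t = np.arange(len(ref)) / fs
    best_eps, best_cc = 0.0, -np.inf
    for eps in np.linspace(-eps_max, eps_max, n_eps):
        stretched = np.interp(t * (1.0 + eps), t, cur)
        cc = np.corrcoef(ref, stretched)[0, 1]
        if cc > best_cc:
            best_eps, best_cc = eps, cc
    return -best_eps, best_cc

# synthetic check: the "current" coda corresponds to a 0.5% velocity increase,
# i.e. the reference waveform compressed along the lapse-time axis
fs = 50.0
t = np.arange(0, 60, 1 / fs)
ref = np.sin(2 * np.pi * 1.0 * t) * np.exp(-t / 30.0)
true_dvv = 0.005
cur = np.interp(t * (1.0 + true_dvv), t, ref)
dvv, cc = stretching_dvv(ref, cur, fs)
print("estimated dv/v = %.4f (cc = %.3f)" % (dvv, cc))
```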
NASA Astrophysics Data System (ADS)
El Fellah, Younes; El-Aal, Abd El-Aziz Khairy Abd; Harnafi, Mimoun; Villaseñor, Antonio
2017-05-01
In the current work, we constructed new comprehensive standard seismic noise models and 3D temporal-spatial seismic noise level cubes for Morocco in north-west Africa to be used for seismological and engineering purposes. Indeed, the original global standard seismic noise models published by Peterson (1993) and their following updates by Astiz and Creager (1995), Ekström (2001) and Berger et al. (2003) had no contributing seismic stations deployed in North Africa. Consequently, this preliminary study was conducted to shed light on seismic noise levels specific to north-west Africa. For this purpose, 23 broadband seismic stations recently installed in different structural domains throughout Morocco are used to study the nature and characteristics of seismic noise and to create seismic noise models for Morocco. Continuous data recorded during 2009, 2010 and 2011 were processed and analysed to construct these new noise models and 3D noise levels from all stations. We compared the Peterson new high-noise model (NHNM) and low-noise model (NLNM) with the Moroccan high-noise model (MHNM) and low-noise model (MLNM). These new noise models are comparable to the United States Geological Survey (USGS) models in the short period band; however, the period ranges 1.2-1000 s for MLNM and 10-1000 s for MHNM display significant variations. This variation is attributed to differences in the nature of the seismic noise sources that dominate Morocco in these period bands. The results of this study provide a new perspective on permanent seismic noise models for this region and can be considered a significant contribution because they supplement the Peterson models and can also be used to site future permanent seismic stations in Morocco.
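In the spirit of Peterson-style noise models, station noise curves can be built by computing PSDs of many data segments and taking low/high percentiles across time. The sketch below is a simplified stand-in (no instrument correction or octave averaging, unlike full PPSD processing) with illustrative parameters.

```python
import numpy as np
from scipy.signal import welch

def noise_percentile_curves(trace, fs, seg_s=3600.0, low_pct=5, high_pct=95):
    """Per-segment power spectral densities of an (already instrument-corrected)
    ground-acceleration trace, reduced to low/high percentile curves in dB.
    A simplified stand-in for PPSD-based noise-model construction."""
    nseg = int(seg_s * fs)
    psds = []
    for k in range(len(trace) // nseg):
        f, pxx = welch(trace[k * nseg:(k + 1) * nseg], fs=fs,
                       nperseg=nseg // 8, detrend="linear")
        psds.append(10.0 * np.log10(pxx + 1e-30))    # dB rel. 1 (m/s^2)^2/Hz
    psds = np.array(psds)
    return f, np.percentile(psds, low_pct, axis=0), np.percentile(psds, high_pct, axis=0)

# usage sketch on one day of synthetic "acceleration" data sampled at 20 Hz
fs = 20.0
trace = 1e-7 * np.random.randn(int(24 * 3600 * fs))
f, low_curve, high_curve = noise_percentile_curves(trace, fs)
print("frequencies span %.4f-%.1f Hz" % (f[1], f[-1]))
```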
Vertical Cable Seismic Survey for SMS exploration
NASA Astrophysics Data System (ADS)
Asakawa, Eiichi; Murakami, Fumitoshi; Tsukahara, Hotoshi; Mizohata, Shigeharu
2014-05-01
The Vertical Cable Seismic (VCS) survey is one of the reflection seismic methods. It uses hydrophone arrays vertically moored from the seafloor to record acoustic waves generated by sea-surface, deep-towed or ocean bottom sources. By analyzing the reflections from the sub-seabed, we can look into the subsurface structure. Because the VCS is an efficient high-resolution 3D seismic survey method for a spatially-bounded area, we proposed it for the SMS survey tool development program that the Ministry of Education, Culture, Sports, Science and Technology (MEXT) started in 2009. We have been developing the VCS survey system, including not only data acquisition hardware but also data processing and analysis techniques. We carried out several VCS surveys combining surface-towed, deep-towed and ocean bottom sources, in water depths from 100 m up to 2100 m. Through these experiments, our VCS data acquisition system has been completed, but the data processing techniques are still under development. One of the most critical issues is positioning in the water: uncertainty in the positions of the source and of the hydrophones degrades the quality of the subsurface image. A GPS navigation system is available at the sea surface, but in the case of a deep-towed or ocean bottom source, the accuracy of the shot position from SSBL/USBL is not sufficient for very high-resolution imaging. We have therefore developed a new approach to determine the positions in water using the travel time data from the source to the VCS hydrophones. In 2013, we carried out the second VCS survey using a surface-towed high-voltage sparker and an ocean bottom source in the Izena Cauldron, one of the most promising SMS areas around Japan. The positions of the ocean bottom source estimated by this method are consistent with the VCS field records. The VCS data with the sparker have been processed with 3D PSTM. It gives a very high resolution 3D volume deeper than two
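The in-water positioning problem described above can be illustrated with a toy grid search that finds the source position minimizing travel-time residuals under a constant water-velocity assumption; the geometry, velocity and grid are assumptions, and the actual processing would use a velocity profile and joint timing corrections.

```python
import numpy as np

def locate_source(hyd_xyz, t_obs, v=1500.0, extent=300.0, step=20.0,
                  z_range=(1000.0, 2000.0), z_step=50.0):
    """Grid search for the source position (m) minimizing the RMS misfit between
    observed travel times t_obs (s) at hydrophone positions hyd_xyz (m) and
    straight-ray predictions through water of constant velocity v (m/s)."""
    best, best_rms = None, np.inf
    for x in np.arange(-extent, extent + step, step):
        for y in np.arange(-extent, extent + step, step):
            for z in np.arange(z_range[0], z_range[1] + z_step, z_step):
                d = np.linalg.norm(hyd_xyz - np.array([x, y, z]), axis=1)
                rms = np.sqrt(np.mean((d / v - t_obs) ** 2))
                if rms < best_rms:
                    best, best_rms = (x, y, z), rms
    return best, best_rms

# synthetic test: three vertical hydrophone arrays and an ocean bottom source
arrays = [(0.0, 0.0), (400.0, 0.0), (0.0, 400.0)]
depths = np.arange(200.0, 1000.0, 100.0)
hyd = np.array([[ax, ay, z] for ax, ay in arrays for z in depths])
true_src = np.array([120.0, -80.0, 1500.0])
t_obs = np.linalg.norm(hyd - true_src, axis=1) / 1500.0
est, rms = locate_source(hyd, t_obs)
print("estimated source position:", est, " rms misfit: %.4f s" % rms)
```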
Quantifying the similarity of seismic polarizations
NASA Astrophysics Data System (ADS)
Jones, Joshua P.; Eaton, David W.; Caffagni, Enrico
2016-02-01
Assessing the similarities of seismic attributes can help identify tremor, low signal-to-noise (S/N) signals and converted or reflected phases, in addition to diagnosing site noise and sensor misalignment in arrays. Polarization analysis is a widely accepted method for studying the orientation and directional characteristics of seismic phases via computed attributes, but similarity is ordinarily discussed using qualitative comparisons with reference values or known seismic sources. Here we introduce a technique for quantitative polarization similarity that uses weighted histograms computed in short, overlapping time windows, drawing on methods adapted from the image processing and computer vision literature. Our method accounts for ambiguity in azimuth and incidence angle and variations in S/N ratio. Measuring polarization similarity allows easy identification of site noise and sensor misalignment and can help identify coherent noise and emergent or low S/N phase arrivals. Dissimilar azimuths during phase arrivals indicate misaligned horizontal components, dissimilar incidence angles during phase arrivals indicate misaligned vertical components and dissimilar linear polarization may indicate a secondary noise source. Using records of the Mw = 8.3 Sea of Okhotsk earthquake, from Canadian National Seismic Network broad-band sensors in British Columbia and Yukon Territory, Canada, and a vertical borehole array at Hoadley gas field, central Alberta, Canada, we demonstrate that our method is robust to station spacing. Discrete wavelet analysis extends polarization similarity to the time-frequency domain in a straightforward way. Time-frequency polarization similarities of borehole data suggest that a coherent noise source may have persisted above 8 Hz several months after peak resource extraction from a `flowback' type hydraulic fracture.
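A minimal sketch of the ingredients discussed above: sliding-window eigen-decomposition of the three-component covariance matrix to obtain azimuth, incidence and rectilinearity, followed by a weighted-histogram intersection as a similarity score. The window lengths, weighting and intersection measure are simplified assumptions relative to the paper's formulation.

```python
import numpy as np

def polarization_attributes(z, n, e, fs, win_s=2.0, step_s=0.5):
    """Sliding-window polarization analysis of a 3-component record: azimuth (deg),
    incidence from vertical (deg) and rectilinearity per window, from the
    eigen-decomposition of the 3x3 covariance matrix."""
    nwin, nstep = int(win_s * fs), int(step_s * fs)
    az, inc, rect = [], [], []
    for i0 in range(0, len(z) - nwin, nstep):
        X = np.vstack([e[i0:i0 + nwin], n[i0:i0 + nwin], z[i0:i0 + nwin]])
        w, v = np.linalg.eigh(np.cov(X))            # ascending eigenvalues
        p = v[:, -1]                                # principal polarization vector (E, N, Z)
        az.append(np.degrees(np.arctan2(p[0], p[1])) % 180.0)   # 0-180 due to sign ambiguity
        inc.append(np.degrees(np.arccos(abs(p[2]))))
        rect.append(1.0 - (w[0] + w[1]) / (2.0 * w[2] + 1e-12))
    return np.array(az), np.array(inc), np.array(rect)

def histogram_similarity(a, b, bins, weights_a=None, weights_b=None):
    """Histogram intersection (0..1) between two attribute distributions,
    optionally weighted (e.g., by rectilinearity or S/N)."""
    ha, _ = np.histogram(a, bins=bins, weights=weights_a)
    hb, _ = np.histogram(b, bins=bins, weights=weights_b)
    ha, hb = ha / (ha.sum() + 1e-12), hb / (hb.sum() + 1e-12)
    return np.minimum(ha, hb).sum()

# usage sketch with two synthetic 3-C records (Z, N, E) sampled at 100 Hz
fs = 100.0
rec1 = [np.random.randn(int(60 * fs)) for _ in range(3)]
rec2 = [np.random.randn(int(60 * fs)) for _ in range(3)]
az1, inc1, r1 = polarization_attributes(*rec1, fs)
az2, inc2, r2 = polarization_attributes(*rec2, fs)
sim = histogram_similarity(az1, az2, bins=np.arange(0, 181, 10), weights_a=r1, weights_b=r2)
print("azimuth-histogram similarity: %.2f" % sim)
```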
Learnings from the Monitoring of Induced Seismicity in Western Canada over the Past Three Years
NASA Astrophysics Data System (ADS)
Yenier, E.; Moores, A. O.; Baturan, D.; Spriggs, N.
2017-12-01
In response to induced seismicity observed in western Canada, existing public networks have been densified and a number of private networks have been deployed to closely monitor the earthquakes induced by hydraulic fracturing operations in the region. These networks have produced an unprecedented volume of seismic data, which can be used to map pre-existing geological structures and understand their activation mechanisms. Here, we present insights gained over the past three years from induced seismicity monitoring (ISM) for some of the most active operators in Canada. First, we discuss the benefits of high-quality ISM data sets for making operational decisions and how their value largely depends on choice of instrumentation, seismic network design and data processing techniques. Using examples from recent research studies, we illustrate the key role of robust modeling of regional source, attenuation and site attributes on the accuracy of event magnitudes, ground motion estimates and induced seismicity hazard assessment. Finally, acknowledging that the ultimate goal of ISM networks is assisting operators to manage induced seismic risk, we share some examples of how ISM data products can be integrated into existing protocols for developing effective risk management strategies.
NASA Astrophysics Data System (ADS)
Provost, F.; Malet, J. P.; Hibert, C.; Doubre, C.
2017-12-01
The Super-Sauze landslide is a clay-rich landslide located in the Southern French Alps. The landslide exhibits a complex pattern of deformation: a large number of rockfalls are observed in the 100 m high main scarp, while the deformation of the upper part of the accumulated material is mainly controlled by shearing along stable in-situ crests. Several fissures are locally observed. The shallowest layer of the accumulated material tends to behave in a brittle manner but may undergo fluidization and/or rapid acceleration. Previous studies have demonstrated the presence of a rich endogenous micro-seismicity associated with the deformation of the landslide. However, the lack of long-term seismic records and suitable processing chains prevented a full interpretation of the links between the external forcings, the deformation and the recorded seismic signals. Since 2013, two permanent seismic arrays have been installed in the upper part of the landslide. We here present the methodology adopted to process this dataset. The processing chain consists of a set of automated methods for robust detection, classification and location of the recorded seismicity. Thousands of events are detected and then automatically classified. The classification method is based on the description of the signal through attributes (e.g. waveform and spectral content properties). These attributes are used as inputs to classify the signal with a Random Forest machine-learning algorithm into four classes: endogenous micro-quakes, rockfalls, regional earthquakes and natural/anthropogenic noise. The endogenous landslide sources (i.e. micro-quakes and rockfalls) are then located, with the location method adapted to the type of event. The micro-quakes are located with a 3D velocity model derived from a seismic tomography campaign and an optimization of the first-arrival picking using the inter-trace correlation of the P-wave arrivals. The rockfalls are located by optimizing the inter
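The attribute-based Random Forest classification can be sketched with scikit-learn as below; the attribute set, class labels and synthetic signals are illustrative stand-ins for the study's richer feature set.

```python
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def waveform_attributes(trace, fs):
    """A few simple attributes (impulsiveness, kurtosis, dominant frequency,
    spectral centroid) of a detected event; the study uses a richer set."""
    env = np.abs(trace)
    f, pxx = welch(trace, fs=fs, nperseg=min(256, len(trace)))
    centroid = np.sum(f * pxx) / (np.sum(pxx) + 1e-12)
    kurt = np.mean((trace - trace.mean()) ** 4) / (trace.std() ** 4 + 1e-12)
    return [env.max() / (env.mean() + 1e-12), kurt, f[np.argmax(pxx)], centroid]

# toy dataset: impulsive "micro-quake"-like vs longer, emergent "rockfall"-like signals
fs = 100.0
X, labels = [], []
rng = np.random.default_rng(2)
for label in (0, 1):                                   # 0 = micro-quake, 1 = rockfall
    for _ in range(200):
        tr = 0.1 * rng.standard_normal(int(10 * fs))
        if label == 0:
            tr[300:350] += rng.standard_normal(50) * np.hanning(50) * 5.0
        else:
            tr[200:800] += rng.standard_normal(600) * np.hanning(600) * 1.5
        X.append(waveform_attributes(tr, fs))
        labels.append(label)
Xtr, Xte, ytr, yte = train_test_split(np.array(X), np.array(labels),
                                      test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(Xtr, ytr)
print("hold-out accuracy: %.2f" % clf.score(Xte, yte))
```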
Seismic Search Engine: A distributed database for mining large scale seismic data
NASA Astrophysics Data System (ADS)
Liu, Y.; Vaidya, S.; Kuzma, H. A.
2009-12-01
The International Monitoring System (IMS) of the CTBTO collects terabytes worth of seismic measurements from many receiver stations situated around the earth with the goal of detecting underground nuclear testing events and distinguishing them from other benign, but more common events such as earthquakes and mine blasts. The International Data Center (IDC) processes and analyzes these measurements, as they are collected by the IMS, to summarize event detections in daily bulletins. Thereafter, the data measurements are archived into a large format database. Our proposed Seismic Search Engine (SSE) will facilitate a framework for data exploration of the seismic database as well as the development of seismic data mining algorithms. Analogous to GenBank, the annotated genetic sequence database maintained by NIH, through SSE, we intend to provide public access to seismic data and a set of processing and analysis tools, along with community-generated annotations and statistical models to help interpret the data. SSE will implement queries as user-defined functions composed from standard tools and models. Each query is compiled and executed over the database internally before reporting results back to the user. Since queries are expressed with standard tools and models, users can easily reproduce published results within this framework for peer-review and making metric comparisons. As an illustration, an example query is “what are the best receiver stations in East Asia for detecting events in the Middle East?” Evaluating this query involves listing all receiver stations in East Asia, characterizing known seismic events in that region, and constructing a profile for each receiver station to determine how effective its measurements are at predicting each event. The results of this query can be used to help prioritize how data is collected, identify defective instruments, and guide future sensor placements.
Innovative Approaches for Seismic Studies of Mars (Invited)
NASA Astrophysics Data System (ADS)
Banerdt, B.
2010-12-01
In addition to its intrinsic interest, Mars is particularly well-suited for studying the full range of processes and phenomena related to early terrestrial planet evolution, from initial differentiation to the start of plate tectonics. It is large and complex enough to have undergone most of the processes that affected early Earth but, unlike the Earth, has apparently not undergone extensive plate tectonics or other major reworking that erased the imprint of early events (as evidenced by the presence of cratered surfaces older than 4 Ga). The martian mantle should have Earth-like polymorphic phase transitions and may even support a perovskite layer near the core (depending on the actual core radius), a characteristic that would have major implications for core cooling and mantle convection. Thus even the most basic measurements of planetary structure, such as crustal thickness, core radius and state (solid/liquid), and gross mantle velocity structure would provide invaluable constraints on models of early planetary evolution. Despite this strong scientific motivation (and several failed attempts), Mars remains terra incognita from a seismic standpoint. This is due to an unfortunate convergence of circumstances, prominent among which are our uncertainty in the level of seismic activity and the relatively high cost of landing multiple long-lived spacecraft on Mars to comprise a seismic network for body-wave travel-time analysis; typically four to ten stations are considered necessary for this type of experiment. In this presentation I will address both of these issues. In order to overcome the concern about a possible lack of marsquakes with which to work, it is useful to identify alternative methods for using seismic techniques to probe the interior. Seismology without quakes can be accomplished in a number of ways. “Unconventional” sources of seismic energy include meteorites (which strike the surface of Mars at a relatively high rate), artificial projectiles
Systematic detection of seismic events at Mount St. Helens with an ultra-dense array
NASA Astrophysics Data System (ADS)
Meng, X.; Hartog, J. R.; Schmandt, B.; Hotovec-Ellis, A. J.; Hansen, S. M.; Vidale, J. E.; Vanderplas, J.
2016-12-01
During the summer of 2014, an ultra-dense array of 900 geophones was deployed around the crater of Mount St. Helens and continuously operated for 15 days. This dataset provides us an unprecedented opportunity to systematically detect seismic events around an active volcano and study their underlying mechanisms. We use a waveform-based matched filter technique to detect seismic events in this dataset. Due to the large volume of continuous data (~1 TB), we performed the detection on the GPU cluster Stampede (https://www.tacc.utexas.edu/systems/stampede). We build a suite of template events from three catalogs: 1) the standard Pacific Northwest Seismic Network (PNSN) catalog (45 events); 2) the catalog from Hansen & Schmandt (2015) obtained with a reverse-time imaging method (212 events); and 3) the catalog identified with a matched filter technique using the PNSN permanent stations (190 events). By searching for template matches in the ultra-dense array, we find 2237 events. We then calibrate precise relative magnitudes for template and detected events, using a principal component fit to measure waveform amplitude ratios. The magnitude of completeness and b-value of the detected catalog are -0.5 and 1.1, respectively. Our detected catalog shows several intensive swarms, which are likely driven by fluid pressure transients in conduits or slip transients on faults underneath the volcano. We are currently relocating the detected catalog with HypoDD and measuring the seismic velocity changes at Mount St. Helens using the coda wave interferometry of detected repeating earthquakes. The accurate temporal-spatial migration pattern of seismicity and seismic property changes should shed light on the physical processes beneath Mount St. Helens.
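The core of the matched-filter detection can be sketched as a normalized cross-correlation of each template against the continuous data, averaged across channels and thresholded on a median-absolute-deviation criterion; the threshold, sampling rate and synthetic data are assumptions, and the GPU implementation details are omitted.

```python
import numpy as np

def normalized_xcorr(template, data):
    """Normalized cross-correlation of a short template against a longer trace;
    returns one correlation coefficient per candidate start sample."""
    nt = len(template)
    t = (template - template.mean()) / (template.std() + 1e-12)
    csum = np.concatenate(([0.0], np.cumsum(data)))          # sliding means/variances
    csum2 = np.concatenate(([0.0], np.cumsum(data ** 2)))    # via cumulative sums
    out = np.empty(len(data) - nt + 1)
    for i in range(len(out)):
        seg = data[i:i + nt]
        mean = (csum[i + nt] - csum[i]) / nt
        var = (csum2[i + nt] - csum2[i]) / nt - mean ** 2
        out[i] = np.dot(t, seg - mean) / (nt * np.sqrt(var) + 1e-12)
    return out

def detect(template_set, data_set, threshold_mads=9.0):
    """Average the per-channel correlation functions and pick samples above a
    median-absolute-deviation based threshold (adjacent samples would be
    grouped into single detections in practice)."""
    ccs = np.mean([normalized_xcorr(t, d) for t, d in zip(template_set, data_set)], axis=0)
    mad = np.median(np.abs(ccs - np.median(ccs)))
    return np.where(ccs > threshold_mads * mad)[0], ccs

# synthetic single-channel example: the template recurs twice within the noise
fs = 50.0
template = np.hanning(int(2 * fs)) * np.sin(2 * np.pi * 5 * np.arange(int(2 * fs)) / fs)
data = 0.2 * np.random.randn(int(600 * fs))
for t0 in (120.0, 431.5):
    i = int(t0 * fs)
    data[i:i + len(template)] += 0.5 * template
picks, ccs = detect([template], [data])
print("detections near t =", np.round(picks / fs, 1), "s")
```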
Imaging near surface mineral targets with ambient seismic noise
NASA Astrophysics Data System (ADS)
Dales, P.; Audet, P.; Olivier, G.
2017-12-01
To keep up with global metal and mineral demand, new ore deposits have to be discovered on a regular basis. This task is becoming increasingly difficult, since easily accessible deposits have largely been exhausted. The typical procedure for mineral exploration begins with geophysical surveys followed by a drilling program to investigate potential targets. Since the retrieved drill core samples are one-dimensional observations, the many holes needed to interpolate and interpret potential deposits can lead to very high costs. To reduce the amount of drilling, active seismic imaging is sometimes used as an intermediary; however, the active sources (e.g. large vibrating trucks or explosive shots) are expensive and unsuitable for operation in remote or environmentally sensitive areas. In recent years, passive seismic imaging using ambient noise has emerged as a novel, low-cost and environmentally benign approach for exploring the subsurface. This technique dispenses with active seismic sources and instead uses ambient seismic noise such as ocean waves, traffic or minor earthquakes. Unfortunately, at this point, passive surveys are not capable of reaching the resolution required to image the vast majority of the ore bodies being explored. In this presentation, we will show the results of an experiment in which ambient seismic noise recorded on 60 seismic stations was used to image a near-mine target. The target consists of a known ore body that was partially exhausted by mining efforts roughly 100 years ago. The experiment examined whether ambient seismic noise interferometry can be used to image the intact and exhausted ore deposit. A drilling campaign was also conducted near the target, which offers the opportunity to compare the two methods. If the accuracy and resolution of passive seismic imaging can be improved to that of active surveys (and beyond), this method could become an inexpensive intermediary step in the exploration process and result
Taylor, Michael H.; Dillon, William P.; Anton, Christopher H.; Danforth, William W.
1999-01-01
As part of an ongoing study, seismic-reflection profiles were collected over the Blake Ridge in 1992 and 1995, in order to map the volume and distribution of methane hydrate. Faulting and seafloor instabilities appear to be related to methane hydrate processes at the Blake Ridge. Seismic profiles display a prominent collapse structure at the crest, which is inferred to have resulted from the mobilization of sediment that was associated with methane hydrate dissociation.
NASA Astrophysics Data System (ADS)
Matcharashvili, Teimuraz N.; Chelidze, Tamaz L.; Zhukova, Natalia N.
2015-07-01
Investigation of dynamical features of the seismic process as well as the possible influence of different natural and man-made impacts on it remains one of the main interdisciplinary research challenges. The question of external influences (forcings) acquires new importance in the light of known facts on possible essential changes, which occur in the behavior of complex systems due to different relatively weak external impacts. Seismic processes in a complicated tectonic system are not an exception to this general rule. In the present research we continued the investigation of dynamical features of seismic activity in Central Asia around the Bishkek (Kyrgyzstan) test area, where strong electromagnetic (EM) soundings were performed in the 1980s. The unexpected result of these experiments was that they revealed the impact of strong electromagnetic discharges on the microseismic activity of the investigated area. We used an earthquake catalogue of this area to investigate dynamical features of seismic activity in periods before, during, and after the mentioned man-made EM forcings. Different methods of modern time series analysis have been used, such as wavelet transformation, Hilbert-Huang transformation, detrended fluctuation analysis, and recurrence quantification analysis. Namely, inter-event (waiting) time intervals, inter-earthquake distances and magnitude sequences, as well as time series of the number of daily occurring earthquakes have been analyzed. We concluded that man-made high-energy EM irradiation essentially affects the dynamics of the seismic process in the investigated area in its temporal and spatial domains; namely, the extent of order in the earthquake time and space distributions increases. At the same time, EM influence on the energetic distribution is not clear from the present analysis. It was also shown that the influence of EM impulses on dynamical features of seismicity differs in different areas of the examined territory around the test site. Clear
NASA Astrophysics Data System (ADS)
Vargas-Meleza, Liliana; Healy, David; Alsop, G. Ian; Timms, Nicholas E.
2015-01-01
We examine the influence of mineralogy and microstructure on the seismic velocity anisotropy of evaporites. Bulk elastic properties and seismic velocities are calculated for a suite of 20 natural evaporite samples, which consist mainly of halite, anhydrite, and gypsum. They exhibit strong fabrics as a result of tectonic and diagenetic processes. Sample mineralogy and crystallographic preferred orientation (CPO) were obtained with the electron backscatter diffraction (EBSD) technique, and the data were used for seismic velocity calculations. Bulk seismic properties for polymineralic evaporites were evaluated with a rock recipe approach. Ultrasonic velocity measurements were also taken on cube-shaped samples to assess the contribution of grain-scale shape preferred orientation (SPO) to the total seismic anisotropy. The sample results suggest that CPO is responsible for a significant fraction of the bulk seismic properties, in agreement with observations from previous studies. Results from the rock recipe indicate that increasing the modal proportion of anhydrite grains can lead to a greater seismic anisotropy of a halite-dominated rock. Conversely, it can reduce the degree of seismic anisotropy of a gypsum-dominated rock up to an estimated threshold proportion, after which anisotropy increases again. The difference between the predicted anisotropy due to CPO and the anisotropy measured with ultrasonic velocities is attributed to the SPO and grain boundary effects in these evaporites.
Ambient Seismic Noise Interferometry on the Island of Hawai`i
NASA Astrophysics Data System (ADS)
Ballmer, Silke
Ambient seismic noise interferometry has been successfully applied in a variety of tectonic settings to gain information about the subsurface. As a passive seismic technique, it extracts the coherent part of ambient seismic noise between pairs of seismic receivers. Measurements of subtle temporal changes in seismic velocities and high-resolution tomographic imaging are then possible - two applications of particular interest for volcano monitoring. Promising results from other volcanic settings motivate its application in Hawai'i, with this work being the first to explore its potential. The dataset used for this purpose was recorded by the Hawaiian Volcano Observatory's permanent seismic network on the Island of Hawai'i. It spans 2.5 years from 5/2007 to 12/2009 and covers two distinct sources of volcanic tremor. After applying standard processing for ambient seismic noise interferometry, we find that volcanic tremor strongly affects the extracted noise information not only close to the tremor source, but unexpectedly, throughout the island-wide network. Besides demonstrating how this long-range observability of volcanic tremor can be used to monitor volcanic activity in the absence of a dense seismic array, our results suggest that care must be taken when applying ambient seismic noise interferometry in volcanic settings. In a second step, we thus exclude days that show signs of volcanic tremor, reducing the dataset to three months, and perform ambient seismic noise tomography. The resulting two-dimensional Rayleigh wave group velocity maps for 0.1 - 0.9 Hz compare very well with images from previous travel-time tomography, both for the main volcanic structures at low frequencies and for smaller features at mid-to-high frequencies - a remarkable observation for the temporally truncated dataset. These robust results suggest that ambient seismic noise tomography in Hawai'i is suitable 1) to provide a three-dimensional S-wave model for the volcanoes and 2
Lunar seismic profiling experiment natural activity study
NASA Technical Reports Server (NTRS)
Duennebier, F. K.
1976-01-01
The Lunar Seismic Profiling Experiment (LSPE) Natural Activity Study has provided a unique opportunity to study the high frequency (4-20 Hz) portion of the seismic spectrum on the moon. The data obtained from the LSPE were studied to evaluate the origin and importance of the process that generates thermal moonquakes and the characteristics of the seismic scattering zone at the lunar surface. The detection of thermal moonquakes by the LSPE array made it possible to locate the sources of many events and determine that they are definitely not generated by astronaut activities but are the result of a natural process on the moon. The propagation of seismic waves in the near-surface layers was studied in a qualitative manner. In the absence of an adequate theoretical model for the propagation of seismic waves in the moon, it is not possible to assign a depth for the scattering layer. The LSPE data do define several parameters which must be satisfied by any model developed in the future.
NASA Astrophysics Data System (ADS)
Bohm, Mirjam; Haberland, Christian; Asch, Günter
2013-04-01
We use local earthquake data observed by the amphibious, temporary seismic MERAMEX array to derive spatial variations of seismic attenuation (Qp) in the crust and upper mantle beneath Central Java. The path-averaged attenuation values (t∗) of a high-quality subset of 84 local earthquakes were calculated by a spectral inversion technique. These 1929 t∗ values, inverted by a least-squares tomographic inversion, yield the 3D distribution of the specific attenuation (Qp). Analysis of the model resolution matrix and synthetic recovery tests were used to investigate the confidence of the Qp model. We notice a prominent zone of increased attenuation beneath and north of the modern volcanic arc at depths down to 15 km. Most of this anomaly seems to be related to the Eocene-Miocene Kendeng Basin (mainly in the eastern part of the study area). Enhanced attenuation is also found in the upper crust in the direct vicinity of recent volcanoes, pointing towards zones of partial melt, the presence of fluids, and increased temperatures in the middle to upper crust. The middle and lower crust seem not to be associated with strong heating or the presence of melts throughout the arc. Enhanced attenuation above the subducting slab beneath the marine forearc seems to be due to the presence of fluids.
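As a simplified illustration of how a path-averaged t* can be measured, the sketch below fits ln A(f) = ln A0 - pi t* f to an amplitude spectrum by least squares; this ignores the source-spectrum correction of a full spectral inversion, and the function names and band limits are placeholders.

    # Minimal sketch: t* from the high-frequency decay of an amplitude spectrum.
    import numpy as np

    def estimate_tstar(freqs, amp_spectrum, fmin, fmax):
        """Fit ln A(f) = ln A0 - pi * t* * f over [fmin, fmax] by least squares."""
        sel = (freqs >= fmin) & (freqs <= fmax)
        slope, intercept = np.polyfit(freqs[sel], np.log(amp_spectrum[sel]), 1)
        return -slope / np.pi          # t* in seconds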
NASA Astrophysics Data System (ADS)
Caudron, Corentin; Taisne, Benoit; Kugaenko, Yulia; Saltykov, Vadim
2015-12-01
In contrast to the 1975-76 Tolbachik eruption, the 2012-13 Tolbachik eruption was not preceded by any striking change in seismic activity. By processing the Klyuchevskoy volcano group seismic data with the Seismic Amplitude Ratio Analysis (SARA) method, we gain insights into the dynamics of magma movement prior to this important eruption. A clear seismic migration within the seismic swarm started 20 hours before the reported eruption onset (05:15 UTC, 26 November 2012). This migration proceeded in different phases and ended when eruptive tremor, corresponding to lava flows, was recorded (at 11:00 UTC, 27 November 2012). In order to get a first-order approximation of the magma location, we compare the calculated seismic intensity ratios with the theoretical ones. As expected, the observations suggest that the seismicity migrated toward the eruption location. However, we explain the pre-eruptive observed ratios by a vertical migration under the northern slope of Plosky Tolbachik volcano followed by a lateral migration toward the eruptive vents. Another migration is also captured by this technique and coincides with a seismic swarm that started 16-20 km to the south of Plosky Tolbachik at 20:31 UTC on November 28 and lasted for more than 2 days. This seismic swarm is very similar to the seismicity preceding the 1975-76 Tolbachik eruption and can be considered as a possible aborted eruption.
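A minimal sketch of the amplitude-ratio measurement underlying a SARA-type analysis is given below: smoothed RMS envelopes at two stations and their ratio through time. The window length and function names are placeholders, not the published implementation; source migration changes the ratio as the source moves relative to the two stations.

    # Minimal sketch: station-pair amplitude ratio versus time (SARA-style measurement).
    import numpy as np

    def rms_envelope(trace, win_samples):
        """RMS of the trace in a sliding boxcar window."""
        sq = np.asarray(trace, dtype=float) ** 2
        kernel = np.ones(win_samples) / win_samples
        return np.sqrt(np.convolve(sq, kernel, mode="same"))

    def amplitude_ratio(trace_a, trace_b, win_samples=3000):
        env_a = rms_envelope(trace_a, win_samples)
        env_b = rms_envelope(trace_b, win_samples)
        return env_a / (env_b + 1e-20)   # ratio evolves as the source migrates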
Induced Seismicity Monitoring System
NASA Astrophysics Data System (ADS)
Taylor, S. R.; Jarpe, S.; Harben, P.
2014-12-01
There are many seismological aspects associated with monitoring of permanent storage of carbon dioxide (CO2) in geologic formations. Many of these include monitoring underground gas migration through detailed tomographic studies of rock properties, integrity of the cap rock and microseismicity with time. These types of studies require expensive deployments of surface and borehole sensors in the vicinity of the CO2 injection wells. Another problem that may exist in CO2 sequestration fields is the potential for damaging induced seismicity associated with fluid injection into the geologic reservoir. Seismic hazard monitoring in CO2 sequestration fields requires a seismic network over a spatially larger region, possibly having stations in remote settings. Expensive observatory-grade seismic systems are not necessary for seismic hazard deployments or small-scale tomographic studies. Hazard monitoring requires accurate location of induced seismicity to magnitude levels only slightly less than that which can be felt at the surface (e.g. magnitude 1), and the frequencies of interest for tomographic analysis are ~1 Hz and greater. We have developed a seismo/acoustic smart sensor system that can achieve the goals necessary for induced seismicity monitoring in CO2 sequestration fields. The unit is inexpensive, lightweight, easy to deploy, can operate remotely under harsh conditions and features 9 channels of recording (currently 3C 4.5 Hz geophone, MEMS accelerometer and microphone). An on-board processor allows for satellite transmission of parameter data to a processing center. Continuous or event-detected data are kept on two removable flash SD cards of up to 64+ Gbytes each. If available, data can be transmitted via cell phone modem or picked up via site visits. Low-power consumption allows for autonomous operation using only a 10 watt solar panel and a gel-cell battery. The system has been successfully tested for long-term (> 6 months) remote operations over a wide range
Seismicity of Cascade Volcanoes: Characterization and Comparison
NASA Astrophysics Data System (ADS)
Thelen, W. A.
2016-12-01
Here we summarize and compare the seismicity around each of the Very High Threat Volcanoes of the Cascade Range of Washington, Oregon and California as defined by the National Volcanic Early Warning System (NVEWS) threat assessment (Ewert et al., 2005). Understanding the background seismic activity and the processes controlling it is critical for assessing changes in seismicity and their implications for volcanic hazards. Comparing seismicity at different volcanic centers can help determine what critical factors or processes affect the observed seismic behavior. Of the ten Very High Threat Volcanoes in the Cascade Range, five volcanoes are consistently seismogenic when considering earthquakes within 10 km of the volcanic center or caldera edge (Mount Rainier, Mount St. Helens, Mount Hood, Newberry Caldera, Lassen Volcanic Center). Other Very High Threat volcanoes (South Sister, Mount Baker, Glacier Peak, Crater Lake and Mount Shasta) have comparatively low rates of seismicity and not enough recorded earthquakes to calculate catalog statistics. Using a swarm definition of 3 or more earthquakes occurring in a day with magnitudes above the largest of the network's magnitudes of completeness (M 0.9), we find that Lassen Volcanic Center is the "swarmiest" in terms of the percentage of seismicity occurring in swarms, followed by Mount Hood, Mount St. Helens and Mount Rainier. The predominance of swarms at Mount Hood may be overstated, as much of the seismicity is occurring on surrounding crustal faults (Jones and Malone, 2005). Newberry Caldera has a relatively short record of seismicity since the permanent network was installed in 2011; however, no swarms as defined here have been detected. Future work will include developing discriminants for volcanic versus tectonic seismicity to better filter the seismic catalog and more precise binning of depths at some volcanoes so that we may better consider different processes. Ewert J. W., Guffanti, M. and Murray, T. L. (2005). An
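For reference, the swarm definition used above can be expressed compactly as in the sketch below, assuming a catalog of (origin time, magnitude) pairs with datetime origin times; the data structure is an assumption.

    # Minimal sketch: days qualifying as swarms (3+ events/day above the completeness magnitude).
    from collections import Counter

    def swarm_days(catalog, mc=0.9, min_events=3):
        """catalog: iterable of (datetime, magnitude) tuples; returns sorted swarm dates."""
        counts = Counter(t.date() for t, m in catalog if m >= mc)
        return sorted(day for day, n in counts.items() if n >= min_events)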
NASA Astrophysics Data System (ADS)
Wei, Jia; Liu, Huaishan; Xing, Lei; Du, Dong
2018-02-01
The stability of submarine geological structures has a crucial influence on the construction of offshore engineering projects and the exploitation of seabed resources. Marine geologists should therefore possess a detailed understanding of common submarine geological hazards. Marine seismic exploration is among the most effective detection technologies currently available; consequently, current research focuses on improving the resolution and precision of shallow stratum structure detection methods. In this article, the feasibility of shallow seismic structure imaging is assessed by building a complex model, and differences between the seismic interferometry imaging method and the traditional imaging method are discussed. The imaging effect of the model is better for shallow layers than for deep layers, because coherent noise produced by this method can result in an unsatisfactory imaging effect for deep layers. The seismic interferometry method has certain advantages for structural imaging of shallow submarine strata, yielding continuous horizontal events, high resolution, clear faults, and distinct structural boundaries. Results from the actual data in the Shenhu area fully illustrate the advantages of the method. Thus, this method has the potential to provide new insights for shallow submarine strata imaging in the area.
Probabilistic Simulation of Territorial Seismic Scenarios
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baratta, Alessandro; Corbi, Ileana
2008-07-08
The paper focuses on a stochastic process for the prediction of seismic scenarios over a territory, developed by means of some basic assumptions in the procedure and by elaborating the fundamental parameters recorded during ground motions that occurred in a seismic area.
Exploring Large-Scale Cross-Correlation for Teleseismic and Regional Seismic Event Characterization
NASA Astrophysics Data System (ADS)
Dodge, Doug; Walter, William; Myers, Steve; Ford, Sean; Harris, Dave; Ruppert, Stan; Buttler, Dave; Hauk, Terri
2013-04-01
The decrease in costs of both digital storage space and computation power invites new methods of seismic data processing. At Lawrence Livermore National Laboratory (LLNL) we operate a growing research database of seismic events and waveforms for nuclear explosion monitoring and other applications. Currently the LLNL database contains several million events associated with tens of millions of waveforms at thousands of stations. We are making use of this database to explore the power of seismic waveform correlation to quantify signal similarities, to discover new events not in catalogs, and to more accurately locate events and identify source types. Building on the very efficient correlation methodologies of Harris and Dodge (2011), we computed the waveform correlation for event pairs in the LLNL database in two ways. First we performed entire-waveform cross-correlation over seven distinct frequency bands. The correlation coefficient exceeds 0.6 for more than 40 million waveform pairs for several hundred thousand events at more than a thousand stations. These correlations reveal clusters of mining events and aftershock sequences, which can be used to readily identify and locate events. Second we determine relative pick times by correlating signals in time windows for distinct seismic phases. These correlated picks are then used to perform very high accuracy event relocations. We are examining the percentage of events that correlate as a function of magnitude and observing-station distance in selected high-seismicity regions. Combining these empirical results and those using synthetic data, we are working to quantify relationships between correlation and event pair separation (in epicenter and depth) as well as mechanism differences. Our exploration of these techniques on a large seismic database is in progress, and we will report on our findings in more detail at the meeting.
Analysis of Hospital Processes with Process Mining Techniques.
Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises
2015-01-01
Process mining allows for discovering, monitoring, and improving processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed functions in the system to be redefined and a proper flow of information to be proposed. The study showed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application for making clinical and administrative decisions for the management of hospital activities.
LANL seismic screening method for existing buildings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dickson, S.L.; Feller, K.C.; Fritz de la Orta, G.O.
1997-01-01
The purpose of the Los Alamos National Laboratory (LANL) Seismic Screening Method is to provide a comprehensive, rational, and inexpensive method for evaluating the relative seismic integrity of a large building inventory using substantial life-safety as the minimum goal. The substantial life-safety goal is deemed to be satisfied if the extent of structural damage or nonstructural component damage does not pose a significant risk to human life. The screening is limited to Performance Category (PC) -0, -1, and -2 buildings and structures. Because of their higher performance objectives, PC-3 and PC-4 buildings automatically fail the LANL Seismic Screening Method and will be subject to a more detailed seismic analysis. The Laboratory has also designated that PC-0, PC-1, and PC-2 unreinforced masonry bearing wall and masonry infill shear wall buildings fail the LANL Seismic Screening Method because of their historically poor seismic performance or complex behavior. These building types are also recommended for a more detailed seismic analysis. The results of the LANL Seismic Screening Method are expressed in terms of separate scores for potential configuration or physical hazards (Phase One) and calculated capacity/demand ratios (Phase Two). This two-phase method allows the user to quickly identify buildings that have adequate seismic characteristics and structural capacity and screen them out from further evaluation. The resulting scores also provide a ranking of those buildings found to be inadequate. Thus, buildings not passing the screening can be rationally prioritized for further evaluation. For the purpose of complying with Executive Order 12941, the buildings failing the LANL Seismic Screening Method are deemed to have seismic deficiencies, and cost estimates for mitigation must be prepared. Mitigation techniques and cost-estimate guidelines are not included in the LANL Seismic Screening Method.
Bohnhoff, Marco; Dresen, Georg; Ellsworth, William L.; Ito, Hisao; Cloetingh, Sierd; Negendank, Jörg
2010-01-01
An important discovery in crustal mechanics has been that the Earth’s crust is commonly stressed close to failure, even in tectonically quiet areas. As a result, small natural or man-made perturbations to the local stress field may trigger earthquakes. To understand these processes, Passive Seismic Monitoring (PSM) with seismometer arrays is a widely used technique that has been successfully applied to study seismicity at different magnitude levels ranging from acoustic emissions generated in the laboratory under controlled conditions, to seismicity induced by hydraulic stimulations in geological reservoirs, and up to great earthquakes occurring along plate boundaries. In all these environments the appropriate deployment of seismic sensors, i.e., directly on the rock sample, at the earth’s surface or in boreholes close to the seismic sources allows for the detection and location of brittle failure processes at sufficiently low magnitude-detection threshold and with adequate spatial resolution for further analysis. One principal aim is to develop an improved understanding of the physical processes occurring at the seismic source and their relationship to the host geologic environment. In this paper we review selected case studies and future directions of PSM efforts across a wide range of scales and environments. These include induced failure within small rock samples, hydrocarbon reservoirs, and natural seismicity at convergent and transform plate boundaries. Each example represents a milestone with regard to bridging the gap between laboratory-scale experiments under controlled boundary conditions and large-scale field studies. The common motivation for all studies is to refine the understanding of how earthquakes nucleate, how they proceed and how they interact in space and time. This is of special relevance at the larger end of the magnitude scale, i.e., for large devastating earthquakes due to their severe socio-economic impact.
Weighted stacking of seismic AVO data using hybrid AB semblance and local similarity
NASA Astrophysics Data System (ADS)
Deng, Pan; Chen, Yangkang; Zhang, Yu; Zhou, Hua-Wei
2016-04-01
The common-midpoint (CMP) stacking technique plays an important role in enhancing the signal-to-noise ratio (SNR) in seismic data processing and imaging. Weighted stacking is often used to improve the performance of conventional equal-weight stacking in further attenuating random noise and handling the amplitude variations in real seismic data. In this study, we propose a hybrid framework combining AB semblance and a local-similarity-weighted stacking scheme. The objective is to achieve an optimal stacking of the CMP gathers with class II amplitude-variation-with-offset (AVO) polarity-reversal anomaly. The selection of a high-quality near-offset reference trace is another innovation of this work because of its better preservation of useful energy. Applications to synthetic and field seismic data demonstrate a great improvement using our method in capturing the true locations of weak reflections, distinguishing thin-bed tuning artifacts, and effectively attenuating random noise.
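A minimal sketch of similarity-weighted stacking is given below; as a simplification, the local similarity of the proposed method is replaced here by a global correlation of each trace with the near-offset reference trace, so this is an illustration of the general idea rather than the authors' algorithm.

    # Minimal sketch: similarity-weighted stack of a CMP gather (traces as rows).
    import numpy as np

    def weighted_stack(gather, ref_index=0, power=2.0):
        """Weight each trace by its (non-negative) correlation with the reference trace."""
        ref = gather[ref_index]
        ref_n = (ref - ref.mean()) / (ref.std() + 1e-12)
        weights = np.empty(gather.shape[0])
        for i, tr in enumerate(gather):
            tr_n = (tr - tr.mean()) / (tr.std() + 1e-12)
            weights[i] = max(np.mean(ref_n * tr_n), 0.0) ** power
        weights /= weights.sum() + 1e-12
        return weights @ gather            # weighted sum over traces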
What defines an Expert? - Uncertainty in the interpretation of seismic data
NASA Astrophysics Data System (ADS)
Bond, C. E.
2008-12-01
Studies focusing on the elicitation of information from experts are concentrated primarily in economics and world markets, medical practice and expert witness testimonies. Expert elicitation theory has been applied in the natural sciences, most notably in the prediction of fluid flow in hydrological studies. In the geological sciences expert elicitation has been limited to theoretical analysis, with studies focusing on the elicitation element, gaining expert opinion rather than necessarily understanding the basis behind the expert view. In these cases experts are defined in a traditional sense, based for example on standing in the field, number of years of experience, number of peer-reviewed publications, or the expert's position in a company hierarchy or academia. Here traditional indicators of expertise have been compared for their significance in effective seismic interpretation. Polytomous regression analysis has been used to assess the relative significance of length and type of experience on the outcome of a seismic interpretation exercise. Following the initial analysis, the techniques used by participants to interpret the seismic image were added as additional variables to the analysis. Specific technical skills and techniques were found to be more important for the effective geological interpretation of seismic data than the traditional indicators of expertise. The results of a seismic interpretation exercise, the techniques used to interpret the seismic data and the participants' prior experience have been combined and analysed to answer the question - who is and what defines an expert?
Understanding Seismic Anisotropy in Hunt Well of Fort McMurray, Canada
NASA Astrophysics Data System (ADS)
Malehmir, R.; Schmitt, D. R.; Chan, J.
2014-12-01
Seismic imaging plays a vital role in developing geothermal systems as a sustainable energy resource. In this paper, we acquired and processed zero-offset and walk-away VSP and logging data as well as surface seismic data in the Athabasca oil sands area, Alberta. The seismic data were carefully processed to better image the geothermal system. Through data processing, properties of natural fractures such as orientation and width were studied, and highly probable permeable zones were mapped along the well, which was drilled to a depth of 2363 m into crystalline basement rocks. In addition to logging data, seismic data were processed to build a reliable image of the subsurface. Velocity analysis of the high-resolution multi-component walk-away VSP informed us about the in situ elastic anisotropy. Study of natural and induced fractures, as well as elastic anisotropy in the seismic data, led us to better map the stress regime around the borehole. The seismic image and fracture map help optimize the stages of enhanced geothermal development through hydraulic stimulation. Keywords: geothermal, anisotropy, VSP, logging, Hunt well, seismic
Evaluation of seismic testing for quality assurance of lime-stabilized soil.
DOT National Transportation Integrated Search
2013-08-01
This study sought to determine the technical feasibility of using seismic techniques to measure the : laboratory and field seismic modulus of lime-stabilized soils (LSS), and to compare/correlate test results : from bench-top (free-free resonance) se...
Parsons, T.; Blakely, R.J.; Brocher, T.M.
2001-01-01
The geologic structure of the Earth's upper crust can be revealed by modeling variation in seismic arrival times and in potential field measurements. We demonstrate a simple method for sequentially satisfying seismic traveltime and observed gravity residuals in an iterative 3-D inversion. The algorithm is portable to any seismic analysis method that uses a gridded representation of velocity structure. Our technique calculates the gravity anomaly resulting from a velocity model by converting to density with Gardner's rule. The residual between calculated and observed gravity is minimized by weighted adjustments to the model velocity-depth gradient where the gradient is steepest and where seismic coverage is least. The adjustments are scaled by the sign and magnitude of the gravity residuals, and a smoothing step is performed to minimize vertical streaking. The adjusted model is then used as a starting model in the next seismic traveltime iteration. The process is repeated until one velocity model can simultaneously satisfy both the gravity anomaly and seismic traveltime observations within acceptable misfits. We test our algorithm with data gathered in the Puget Lowland of Washington state, USA (Seismic Hazards Investigation in Puget Sound [SHIPS] experiment). We perform resolution tests with synthetic traveltime and gravity observations calculated with a checkerboard velocity model using the SHIPS experiment geometry, and show that the addition of gravity significantly enhances resolution. We calculate a new velocity model for the region using SHIPS traveltimes and observed gravity, and show examples where correlation between surface geology and modeled subsurface velocity structure is enhanced.
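The velocity-to-density step of the joint scheme can be illustrated with Gardner's rule; the sketch below uses the commonly quoted coefficient 0.31 for Vp in m/s, which is an assumption of this illustration rather than a value stated in the abstract.

    # Minimal sketch: Gardner's rule converts a velocity model to density for the gravity step.
    import numpy as np

    def gardner_density(vp_mps, a=0.31, exponent=0.25):
        """rho = a * Vp**0.25, Vp in m/s, density returned in g/cm^3 (coefficients assumed)."""
        return a * np.asarray(vp_mps, dtype=float) ** exponent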
Comprehensive Seismological Monitoring of Geomorphic Processes in Taiwan
NASA Astrophysics Data System (ADS)
Chao, W. A.; Chen, C. H.
2016-12-01
Geomorphic processes such as hillslope mass wasting and river sediment transport are important for studying landscape dynamics. Mass movements induced from geomorphic events can generate seismic waves and be recorded by seismometers. Recent studies demonstrate that seismic monitoring techniques not only fully map the spatiotemporal patterns of geomorphic activity but also allow for exploration of the dynamic links between hillslope failures and channel processes, which may not be resolved by conventional techniques (e.g., optical remote sensing). We have recently developed a real-time landquake monitoring system (RLMS, here we use the term `landquake' to represent all hillslope failures such as rockfall, rock avalanche and landslide), which has been continuously monitoring landquake activities in Taiwan since June 2015 based on broadband seismic records, yielding source information (e.g., location, occurrence time, magnitude and mechanism) for large-sized events (http://140.112.57.117/main.html). Several seismic arrays have also been deployed over the past few years around the catchments and along the river channels in Taiwan for monitoring erosion processes at catchment scale, improving the spatiotemporal resolution in exploring the interaction between geomorphic events and specific meteorological conditions. Based on a forward model accounting for the impulsive impacts of saltating particles, we can further invert for the sediment load flux, a critical parameter in landscape evolution studies, by fitting the seismic observations only. To test the validity of the seismologically determined sediment load flux, we conduct a series of controlled dam-break experiments, which are advantageous in tightly constraining the spatiotemporal variations of the sediment transport. Incorporating the seismological constraints on geomorphic processes with the effects of tectonic and/or climate perturbations can provide valuable and quantitative information for more fully
NASA Technical Reports Server (NTRS)
Dalins, I.; Mccarty, V. M.; Kaschak, G.; Donn, W. L.
1974-01-01
A reasonably comprehensive technical effort is described dealing with the investigations of acoustically generated seismic waves of Apollo 16 and Apollo 17 origin along the eastern seaboard of the United States. This expanded effort is a continuation of earlier, rather successful detections of rocket-generated seismic disturbances on Skidaway Island, Georgia. The more recent effort has yielded few positive results other than a recording of an early-arriving seismic wave from Apollo 16 that was detected in Jacksonville. Evaluation of the negative results obtained in the Fort Monmouth area, together with earlier studies of infrasound, local weather conditions, and geology, could be advantageous in the process of trying to gain a better insight into the acoustic-seismic resonance mechanism requiring phase-velocity matching at the atmosphere-ground interface.
EMERALD: A Flexible Framework for Managing Seismic Data
NASA Astrophysics Data System (ADS)
West, J. D.; Fouch, M. J.; Arrowsmith, R.
2010-12-01
The seismological community is challenged by the vast quantity of new broadband seismic data provided by large-scale seismic arrays such as EarthScope’s USArray. While this bonanza of new data enables transformative scientific studies of the Earth’s interior, it also illuminates limitations in the methods used to prepare and preprocess those data. At a recent seismic data processing focus group workshop, many participants expressed the need for better systems to minimize the time and tedium spent on data preparation in order to increase the efficiency of scientific research. Another challenge related to data from all large-scale transportable seismic experiments is that there currently exists no system for discovering and tracking changes in station metadata. This critical information, such as station location, sensor orientation, instrument response, and clock timing data, may change over the life of an experiment and/or be subject to post-experiment correction. Yet nearly all researchers utilize metadata acquired with the downloaded data, even though subsequent metadata updates might alter or invalidate results produced with older metadata. A third long-standing issue for the seismic community is the lack of easily exchangeable seismic processing codes. This problem stems directly from the storage of seismic data as individual time series files, and the history of each researcher developing his or her preferred data file naming convention and directory organization. Because most processing codes rely on the underlying data organization structure, such codes are not easily exchanged between investigators. To address these issues, we are developing EMERALD (Explore, Manage, Edit, Reduce, & Analyze Large Datasets). The goal of the EMERALD project is to provide seismic researchers with a unified, user-friendly, extensible system for managing seismic event data, thereby increasing the efficiency of scientific enquiry. EMERALD stores seismic data and metadata in a
CPT site characterization for seismic hazards in the New Madrid seismic zone
Liao, T.; Mayne, P.W.; Tuttle, M.P.; Schweig, E.S.; Van Arsdale, R.B.
2002-01-01
A series of cone penetration tests (CPTs) were conducted in the vicinity of the New Madrid seismic zone in central USA for quantifying seismic hazards, obtaining geotechnical soil properties, and conducting studies at liquefaction sites related to the 1811-1812 and prehistoric New Madrid earthquakes. The seismic piezocone provides four independent measurements for delineating the stratigraphy, liquefaction potential, and site amplification parameters. At the same location, two independent assessments of soil liquefaction susceptibility can be made using both the normalized tip resistance (qc1N) and shear wave velocity (Vs1). In lieu of traditional deterministic approaches, the CPT data can be processed using probability curves to assess the level and likelihood of future liquefaction occurrence. ?? 2002 Elsevier Science Ltd. All rights reserved.
Seismic and potential field studies over the East Midlands
NASA Astrophysics Data System (ADS)
Kirk, Wayne John
A seismic refraction profile was undertaken to investigate the source of an aeromagnetic anomaly located above the Widmerpool Gulf, East Midlands. Ten shots were fired into 51 stations at c. 1.5 km spacing in a 70 km profile during 41 days of recording. The refraction data were processed using standard techniques to improve the data quality. A new filtering technique, known as Correlated Adaptive Noise Cancellation, was tested on synthetic data and successfully applied to controlled source and quarry blast data. Study of strong motion data reveals that the previous method of site calibration is invalid. A new calibration technique, known as the Scaled Amplitude method, is presented to provide safer charge size estimation. Raytrace modelling of the refraction data and two-dimensional gravity interpretation confirm the presence of the Widmerpool Gulf, but no support is found for the postulated intrusion. Two-dimensional magnetic interpretation revealed that the aeromagnetic anomaly could be modelled with a Carboniferous igneous source. A Lower Palaeozoic refractor with a velocity of 6.0 km/s is identified at a maximum depth of c. 2.85 km beneath the Widmerpool Gulf. Carboniferous and post-Carboniferous sediments within the gulf have velocities between 2.6 and 5.5 km/s with a strong vertical gradient. At the gulf margins, a refractor with a constant velocity of 5.2 km/s is identified as Dinantian limestone. A low velocity layer of proposed unaltered Lower Palaeozoics is identified beneath the limestone at the eastern edge of the Derbyshire Dome. The existence and areal extent of this layer are also determined from seismic reflection data. Image analysis of potential field data presents a model identifying three structural provinces: the Midlands Microcraton, the Welsh and English Caledonides and a central region of complex linears. This model is used to explain the distribution of basement rocks determined from seismic and gravity profiles.
New Geophysical Techniques for Offshore Exploration.
ERIC Educational Resources Information Center
Talwani, Manik
1983-01-01
New seismic techniques have been developed recently that borrow theory from academic institutions and technology from industry, allowing scientists to explore deeper into the earth with much greater precision than possible with older seismic methods. Several of these methods are discussed, including the seismic reflection common-depth-point…
Towards a Comprehensive Catalog of Volcanic Seismicity
NASA Astrophysics Data System (ADS)
Thompson, G.
2014-12-01
Catalogs of earthquakes located using differential travel-time techniques are a core product of volcano observatories, and while vital, they represent an incomplete perspective of volcanic seismicity. Many (often most) earthquakes are too small to locate accurately, and are omitted from available catalogs. Low frequency events, tremor and signals related to rockfalls, pyroclastic flows and lahars are not systematically catalogued, and yet from a hazard management perspective are exceedingly important. Because STA/LTA detection schemes break down in the presence of high amplitude tremor, swarms or dome collapses, catalogs may suggest low seismicity when seismicity peaks. We propose to develop a workflow and underlying software toolbox that can be applied to near-real-time and offline waveform data to produce comprehensive catalogs of volcanic seismicity. Existing tools to detect and locate phaseless signals will be adapted to fit within this framework. For this proof of concept the toolbox will be developed in MATLAB, extending the existing GISMO toolbox (an object-oriented MATLAB toolbox for seismic data analysis). Existing database schemas such as the CSS 3.0 will need to be extended to describe this wider range of volcano-seismic signals. WOVOdat may already incorporate many of the additional tables needed. Thus our framework may act as an interface between volcano observatories (or campaign-style research projects) and WOVOdat. We aim to take the further step of reducing volcano-seismic catalogs to sets of continuous metrics that are useful for recognizing data trends, and for feeding alarm systems and forecasting techniques. Previous experience has shown that frequency index, peak frequency, mean frequency, mean event rate, median event rate, and cumulative magnitude (or energy) are potentially useful metrics to generate for all catalogs at a 1-minute sample rate (directly comparable with RSAM and similar metrics derived from continuous data). Our framework
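Two of the continuous metrics named above can be sketched as follows for a one-minute waveform segment; the band limits used for the frequency index and the Welch parameters are placeholders, not values from the abstract.

    # Minimal sketch: peak frequency and a frequency index from a 1-minute segment.
    import numpy as np
    from scipy.signal import welch

    def spectral_metrics(segment, fs, low=(1.0, 5.0), high=(5.0, 15.0)):
        """Frequency index = log10(mean high-band amplitude / mean low-band amplitude)."""
        freqs, psd = welch(segment, fs=fs, nperseg=min(len(segment), 1024))
        amp = np.sqrt(psd)
        peak_frequency = freqs[np.argmax(amp)]
        lo = amp[(freqs >= low[0]) & (freqs < low[1])].mean()
        hi = amp[(freqs >= high[0]) & (freqs < high[1])].mean()
        return peak_frequency, np.log10(hi / lo)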
Pole tide triggering of seismicity
NASA Astrophysics Data System (ADS)
Gorshkov, V.
2015-08-01
The influence of the pole tide (PT) on the intensity of the seismic process is investigated on the basis of the Harvard Centroid Moment Tensor (CMT) catalogue. The normal and shear stresses excited by the PT were calculated for each earthquake (EQ) in the CMT catalogue (32.3 thousand EQ events after foreshock and aftershock declustering). We find two maxima of PT influence on weak (magnitude less than 5.5) thrust-slip EQs near both extrema (minimum and maximum) of the shear stress. This influence is statistically significant at the 95% level according to the Schuster and χ^2 criteria and could explain the 0.6-year periodicity in the seismic intensity spectrum. The PT influence on seismicity becomes negligible when PT variations decrease below 100 mas. This could explain the 6-7 year periodicity in the seismic intensity spectrum. It was also shown that the influence of EM impulses on dynamical features of seismicity differs in different areas of the examined territory around the test site.
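The Schuster criterion mentioned above can be sketched as follows, assuming each earthquake has already been assigned a tidal phase angle; a small p-value indicates statistically significant phase clustering.

    # Minimal sketch: Schuster test for tidal phase clustering of earthquake times.
    import numpy as np

    def schuster_p(phases_rad):
        """p = exp(-R^2 / N), where R is the length of the summed unit phasors."""
        c, s = np.sum(np.cos(phases_rad)), np.sum(np.sin(phases_rad))
        r_squared = c**2 + s**2
        return np.exp(-r_squared / len(phases_rad))   # small p -> significant triggering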
Seismic wave interaction with underground cavities
NASA Astrophysics Data System (ADS)
Schneider, Felix M.; Esterhazy, Sofi; Perugia, Ilaria; Bokelmann, Götz
2016-04-01
Realization of the future Comprehensive Nuclear Test Ban Treaty (CTBT) will require ensuring its compliance, making the CTBT a prime example of forensic seismology. Following indications of a nuclear explosion obtained on the basis of the International Monitoring System (IMS) network, further evidence needs to be sought at the location of the suspicious event. For such an On-Site Inspection (OSI) at a possible nuclear test site, the treaty lists several techniques that can be carried out by the inspection team, including aftershock monitoring and the conduct of active seismic surveys. While those techniques are already well established, a third group of methods labeled as "resonance seismometry" is less well defined and needs further elaboration. A prime structural target that is expected to be present as a remnant of an underground nuclear explosion is a cavity at the location and depth where the device was detonated. Originally "resonance seismometry" referred to resonant seismic emission of the cavity within the medium that could be stimulated by an incident seismic wave of the right frequency and observed as peaks in the spectrum of seismic stations in the vicinity of the cavity. However, it is not yet clear under which conditions resonant emissions of the cavity could be observed. In order to define the distance, frequency and amplitude ranges at which resonant emissions could be observed, we study the interaction of seismic waves with underground cavities. As a generic model for possible resonances we use a spherical acoustic cavity in an elastic full-space. To solve the forward problem for the full elastic wave field around acoustic spherical inclusions, we implemented an analytical solution (Korneev, 1993). This yields the possibility of generating scattering cross-sections, amplitude spectra and synthetic seismograms for plane incident waves. Here, we focus on the question of whether we can expect resonant responses in the wave field scattered from the cavity. We show
NASA Astrophysics Data System (ADS)
Bell, Andrew; Hernandez, Stephen; Gaunt, Elizabeth; Mothes, Patricia; Hidalgo, Silvana; Ruiz, Mario
2016-04-01
Tungurahua is a large andesitic stratovolcano located in the Andes of Ecuador. The current eruptive phase at Tungurahua began in 1999, and has been characterised by episodes of vulcanian and strombolian activity, interspersed by periods of relative quiescence. Although the volcano showed only modest eruptive activity in 2015, seismic data revealed a pronounced change in the behaviour of the magma-conduit system compared to the preceding 15 years of activity. The change is most notable in the periodicity of interevent times of volcanic earthquakes. Previous seismicity at Tungurahua is characterised by interevent-time periodicities typical of a Poisson process, or modestly clustered, with slightly elevated (anti-clustered) periodicities observed only rarely during vulcanian episodes. However, activity in 2015 saw a series of unrest episodes characterised by highly periodic interevent times, including several notable episodes of 'drumbeat' earthquakes. Here we report seismic and associated geophysical signals recorded at Tungurahua in 2015 by the monitoring network of the Instituto Geofisico of Ecuador, their relation to conduit processes, and implications for the origins of unrest and likely future activity. Although the nature of the low-frequency seismic signals changes both within and between unrest episodes, the underlying periodicity is more consistent and gradually evolving. Waveform similarity is high within phases, resulting from the repeated activation of persistent sources, but low between different episodes, suggesting the emergence of new sources and locations. The strength of periodicity is correlated with the average waveform similarity for all unrest episodes, with the relatively low waveform similarities observed for the highly periodic drumbeat earthquakes in April being due to contamination from coexisting continuous tremor. Eruptive activity consisted of a few minor explosions and ash emission events. Notably, a short-lived episode of Strombolian activity in
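A minimal sketch of the interevent-time statistic behind the clustered/anti-clustered characterization is the coefficient of variation of waiting times (about 1 for a Poisson process, above 1 for clustered activity, well below 1 for periodic, drumbeat-like seismicity); the implementation below is illustrative only and is not the authors' analysis.

    # Minimal sketch: coefficient of variation of earthquake interevent times.
    import numpy as np

    def interevent_cov(event_times):
        """event_times: array of origin times in seconds; returns std(dt)/mean(dt)."""
        dt = np.diff(np.sort(np.asarray(event_times, dtype=float)))
        return dt.std() / dt.mean()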
Accelerated design of bioconversion processes using automated microscale processing techniques.
Lye, Gary J; Ayazi-Shamlou, Parviz; Baganz, Frank; Dalby, Paul A; Woodley, John M
2003-01-01
Microscale processing techniques are rapidly emerging as a means to increase the speed of bioprocess design and reduce material requirements. Automation of these techniques can reduce labour intensity and enable a wider range of process variables to be examined. This article examines recent research on various individual microscale unit operations including microbial fermentation, bioconversion and product recovery techniques. It also explores the potential of automated whole process sequences operated in microwell formats. The power of the whole process approach is illustrated by reference to a particular bioconversion, namely the Baeyer-Villiger oxidation of bicyclo[3.2.0]hept-2-en-6-one for the production of optically pure lactones.
NASA Astrophysics Data System (ADS)
Reymond, D.
2016-12-01
We present an open source software project (GNU public license), named STK: Seismic Tool-Kit, that is dedicated mainly to learning signal processing and seismology. The STK project, which started in 2007, is hosted by SourceForge.net and counts more than 20,000 downloads at the time of writing. The STK project is composed of two main branches. First, a graphical interface dedicated to signal processing of data in the SAC format (SAC_ASCII and SAC_BIN), where the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is provided). The passage to the frequency domain via the Fourier transform is used to introduce the estimation of the spectral density of the signal, with visualization of the Power Spectral Density (PSD) on a linear or log scale, and also the evolutive time-frequency representation (or sonogram). Three-component signals can also be processed to estimate their polarization properties, either for a given window or for evolving windows along the record. This polarization analysis is useful for extracting polarized noise and for differentiating P waves, Rayleigh waves, Love waves, etc. Second, a panel of utility programs is provided for working in a terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter/auto-correlation, spectral density, time-frequency representations for an entire directory of signals, focal planes and principal axes, the radiation pattern of P waves, polarization analysis of different waves (including noise), under/over-sampling of signals, cubic-spline smoothing, and linear/non-linear regression analysis of data sets. STK is developed in C/C++, mainly under Linux, and has also been partially implemented under MS Windows. STK has been used in some schools for viewing and plotting seismic records provided by IRIS, and it has been used as practical support for teaching the basics of signal processing. Useful links: http://sourceforge.net/projects/seismic-toolkit/ and http://sourceforge.net/p/seismic-toolkit/wiki/browse_pages/
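As a language-neutral illustration (in Python rather than STK's C/C++), the kind of PSD estimate the toolkit exposes can be sketched with Welch's method; the function name and segment length are placeholders.

    # Minimal sketch: Welch power spectral density estimate of a seismic trace.
    import numpy as np
    from scipy.signal import welch

    def power_spectral_density(trace, fs, nperseg=2048):
        """Return frequencies and PSD; typically plotted on a log amplitude scale."""
        freqs, psd = welch(np.asarray(trace, dtype=float), fs=fs, nperseg=nperseg)
        return freqs, psd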
Seismic tomography; theory and practice
Iver, H.M.; Hirahara, Kazuro
1993-01-01
Although highly theoretical and computer-orientated, seismic tomography has created spectacular images of anomalies within the Earth with dimensions of thousands of kilometers to a few tens of meters. These images have enabled Earth scientists working in diverse areas to attack fundamental problems relating to the deep dynamical processes within our planet. Additionally, this technique is being used extensively to study the Earth's hazardous regions, such as earthquake fault zones and volcanoes, as well as features beneficial to man such as oil- or mineral-bearing structures. This book has been written by world experts and describes the theories, experimental and analytical procedures and results of applying seismic tomography from the global to the purely local scale. It represents the collective global perspective on the state of the art and focusses not only on the theoretical and practical aspects, but also on the uses for hydrocarbon, mineral and geothermal exploitation. Students and researchers in the Earth sciences, and research and exploration geophysicists, should find this a useful, practical reference book for all aspects of their work.
NASA Astrophysics Data System (ADS)
Yu, H.; Gu, H.
2017-12-01
A novel multivariate seismic formation pressure prediction methodology is presented, which incorporates high-resolution seismic velocity data from prestack AVO inversion and petrophysical data (porosity and shale volume) derived from poststack seismic motion inversion. In contrast to traditional seismic formation pressure prediction methods, the proposed methodology is based on a multivariate pressure prediction model and utilizes a trace-by-trace multivariate regression analysis on seismic-derived petrophysical properties to calibrate model parameters, in order to make accurate predictions with higher resolution in both vertical and lateral directions. With the prestack time migration velocity as the initial velocity model, an AVO inversion was first applied to the prestack dataset to obtain a high-resolution seismic velocity with higher frequency content, to be used as the velocity input for seismic pressure prediction, and a density dataset to calculate an accurate Overburden Pressure (OBP). Seismic Motion Inversion (SMI) is an inversion technique based on Markov Chain Monte Carlo simulation. Both the structural variability and the similarity of seismic waveforms are used to incorporate well log data and characterize the variability of the property to be obtained. In this research, porosity and shale volume are first interpreted on well logs, and then combined with poststack seismic data using SMI to build porosity and shale volume datasets for seismic pressure prediction. A multivariate effective stress model is used to convert the velocity, porosity and shale volume datasets to effective stress. After a thorough study of the regional stratigraphic and sedimentary characteristics, a regional normally compacted interval model is built, and then the coefficients in the multivariate prediction model are determined in a trace-by-trace multivariate regression analysis on the petrophysical data. The coefficients are used to convert velocity, porosity and shale volume datasets to effective stress and then
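A heavily simplified sketch of the final conversion step is given below; the log-linear multivariate form and its coefficients are placeholders chosen for illustration only, not the authors' published model, and pore pressure follows as overburden minus effective stress.

    # Minimal sketch: generic multivariate effective-stress model and pore-pressure step.
    import numpy as np

    def effective_stress(v, phi, vsh, coeffs):
        """Assumed illustrative form: sigma_eff = a0 * v**a1 * exp(a2*phi + a3*vsh)."""
        a0, a1, a2, a3 = coeffs
        return a0 * np.asarray(v, dtype=float) ** a1 * np.exp(a2 * np.asarray(phi) + a3 * np.asarray(vsh))

    def pore_pressure(obp, v, phi, vsh, coeffs):
        """Pore pressure = overburden pressure minus effective stress (same units assumed)."""
        return np.asarray(obp, dtype=float) - effective_stress(v, phi, vsh, coeffs)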
Noise-based seismic monitoring of the Campi Flegrei caldera
NASA Astrophysics Data System (ADS)
Zaccarelli, Lucia; Bianco, Francesca
2017-03-01
The Campi Flegrei caldera is one of the highest-risk volcanic fields worldwide, because of its eruptive history and the large population hosted within the caldera. It experiences bradyseismic crises: sudden uplift accompanied by low-energy seismic swarms. No seismicity is recorded outside these periods of deformation-rate change. Therefore, continuous seismic monitoring of the caldera is possible only by means of ambient seismic noise. We apply a noise-based seismic monitoring technique to the cross correlations of 5 years of recordings from the mobile seismic network. The resulting relative velocity variations are compared to the temporal behavior of the geophysical and geochemical observations routinely sampled at Campi Flegrei. We discriminate between two kinds of crustal stress field variations acting at different timescales. They are related to a possible magmatic intrusion and to the gradual heating of the hydrothermal system, respectively. This study lays the basis for future volcano monitoring strategies.
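Relative velocity variations of this kind are commonly measured with the stretching technique; the sketch below is a generic illustration of that measurement (trial stretch grid and names are assumptions) and is not necessarily the processing used in this study.

    # Minimal sketch: stretching measurement of dv/v between a reference and a current correlation.
    import numpy as np

    def stretching_dvv(ref, cur, t, eps_grid=np.linspace(-0.02, 0.02, 401)):
        """Find the stretch factor maximizing correlation; dv/v = -epsilon for that factor."""
        best_eps, best_cc = 0.0, -np.inf
        for eps in eps_grid:
            stretched = np.interp(t * (1.0 + eps), t, cur)   # resample current trace
            cc = np.corrcoef(ref, stretched)[0, 1]
            if cc > best_cc:
                best_eps, best_cc = eps, cc
        return -best_eps, best_cc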
NASA Astrophysics Data System (ADS)
Gaebler, P. J.; Ceranna, L.
2016-12-01
All nuclear explosions - on the Earth's surface, underground, underwater or in the atmosphere - are banned by the Comprehensive Nuclear-Test-Ban Treaty (CTBT). As part of this treaty, a verification regime was put into place to detect, locate and characterize nuclear explosion tests at any time, by anyone and everywhere on the Earth. The International Monitoring System (IMS) plays a key role in the verification regime of the CTBT. Of the different monitoring techniques used in the IMS, the seismic waveform approach is the most effective technology for monitoring underground nuclear testing and for identifying and characterizing potential nuclear events. This study introduces a method of seismic threshold monitoring to assess an upper magnitude limit of a potential seismic event in a given geographical region. The method is based on ambient seismic background noise measurements at the individual IMS seismic stations as well as on global distance correction terms for body wave magnitudes, which are calculated using the seismic reflectivity method. From our investigations we conclude that a global detection threshold of around mb 4.0 can be achieved using only stations from the primary seismic network; a clear latitudinal dependence of the detection threshold can be observed between the northern and southern hemispheres. Including the seismic stations that are part of the auxiliary IMS seismic network results in a slight improvement of global detection capability. However, including wave arrivals from distances greater than 120 degrees, mainly PKP-wave arrivals, leads to a significant improvement in average global detection capability; in particular, this improves the detection threshold in the southern hemisphere. We further investigate the dependence of the detection capability on spatial (latitude and longitude) and temporal (time) parameters, as well as on parameters such as source type and percentage of operational IMS stations.
NASA Astrophysics Data System (ADS)
Montoya-Noguera, Silvana; Wang, Yu
2017-04-01
The Central and Eastern United States (CEUS) has experienced an abnormal increase in seismic activity, which is believed to be related to anthropogenic activities. The U.S. Geological Survey has acknowledged this situation and developed the CEUS 2016 one-year seismic hazard model using the 2015 catalog, assuming stationary seismicity in that period. However, due to the nonstationary nature of induced seismicity, it is essential to identify change points for accurate probabilistic seismic hazard analysis (PSHA). We present a Bayesian procedure to identify the most probable change points in seismicity and define their respective seismic rates. It uses prior distributions in agreement with conventional PSHA and updates them with recent data to identify seismicity changes. It can determine change points at a regional scale and may incorporate different types of information in an objective manner. It is first successfully tested with simulated data, and then it is used to evaluate Oklahoma's regional seismicity.
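An illustrative sketch (not the authors' model) of Bayesian change-point identification in an earthquake-count series: a single change point in a Poisson rate is assumed, with conjugate Gamma priors on the rates before and after the change; all prior parameters are assumptions.

```python
import numpy as np
from scipy.special import gammaln

def change_point_posterior(counts, a0=1.0, b0=1.0):
    """Posterior over a single change-point index tau, uniform prior on tau."""
    def log_marginal(seg):
        # Gamma(a0, b0)-Poisson marginal likelihood of one segment (up to data-only constants)
        k, m = seg.sum(), len(seg)
        return a0 * np.log(b0) - gammaln(a0) + gammaln(a0 + k) - (a0 + k) * np.log(b0 + m)

    n = len(counts)
    logpost = np.array([log_marginal(counts[:tau]) + log_marginal(counts[tau:])
                        for tau in range(1, n)])
    p = np.exp(logpost - logpost.max())
    return p / p.sum()                           # p[i] is the probability that tau = i + 1

# synthetic monthly counts with a rate increase midway
counts = np.concatenate([np.random.poisson(2, 60), np.random.poisson(8, 40)])
posterior = change_point_posterior(counts)
print("most probable change month:", np.argmax(posterior) + 1)
```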
Impact-induced seismic activity on asteroid 433 Eros: a surface modification process.
Richardson, James E; Melosh, H Jay; Greenberg, Richard
2004-11-26
High-resolution images of the surface of asteroid 433 Eros revealed evidence of downslope movement of a loose regolith layer, as well as the degradation and erasure of small impact craters (less than approximately 100 meters in diameter). One hypothesis to explain these observations is seismic reverberation after impact events. We used a combination of seismic and geomorphic modeling to analyze the response of regolith-covered topography, particularly craters, to impact-induced seismic shaking. Applying these results to a stochastic cratering model for the surface of Eros produced good agreement with the observed size-frequency distribution of craters, including the paucity of small craters.
NASA Astrophysics Data System (ADS)
Lamarque, Gaëlle; Bascou, Jérôme; Maurice, Claire; Cottin, Jean-Yves; Ménot, René-Pierre
2015-04-01
The Mertz Shear Zone (MSZ; 146°E 67°S; East Antarctica) is a major lithospheric-scale structure that outcrops on the eastern edge of the Terre Adélie Craton (Ménot et al., 2007) and that could be connected with shear zones of South Australia (e.g., the Kalinjala or Coorong shear zones (Kleinschmidt and Talarico, 2000; Gibson et al., 2013)) before the Cretaceous opening of the Southern Ocean. Geochronological and metamorphic studies indicate MSZ activity at 1.7 and 1.5 Ga, respectively, in amphibolite and greenschist facies conditions. The deformation affects both the intermediate and lower crustal levels, without associated voluminous magma injection. Granulites crop out in the area of the MSZ. They were dated at 2.4 Ga (Ménot et al., 2005) and could represent preserved Neoarchean tectonites. These rocks show various degrees of deformation, including penetrative structures that may display features comparable with those observed in the amphibolite and greenschist facies rocks, i.e. NS-striking and steeply dipping foliation with weakly plunging lineation. In the field, kinematic indicators for the MSZ argue for a dominant dextral shear sense. We performed optical analysis and crystallographic preferred orientation (CPO) measurements using the EBSD technique in order to better constrain the deformation processes. Our results highlight (1) a microstructural gradient from highly deformed rocks (mylonites), forming plurimetric shear bands and showing evidence of plastic deformation, to slightly deformed rocks in preserved cores with no evidence of plastic deformation or with clear, strong static recrystallization; (2) CPO of minerals related to variations in deformation conditions. Feldspar and quartz CPO argue for plastic deformation at high temperature in the most deformed domains and for the absence of deformation, or an important stage of static recrystallization, in preserved cores; (3) uncommon CPO in orthopyroxene which are characterized by [010]-axes
High-resolution seismic data regularization and wavefield separation
NASA Astrophysics Data System (ADS)
Cao, Aimin; Stump, Brian; DeShon, Heather
2018-04-01
We present a new algorithm, non-equispaced fast antileakage Fourier transform (NFALFT), for irregularly sampled seismic data regularization. Synthetic tests from 1-D to 5-D show that the algorithm may efficiently remove leaked energy in the frequency wavenumber domain, and its corresponding regularization process is accurate and fast. Taking advantage of the NFALFT algorithm, we suggest a new method (wavefield separation) for the detection of the Earth's inner core shear wave with irregularly distributed seismic arrays or networks. All interfering seismic phases that propagate along the minor arc are removed from the time window around the PKJKP arrival. The NFALFT algorithm is developed for seismic data, but may also be used for other irregularly sampled temporal or spatial data processing.
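A simplified 1-D antileakage Fourier transform sketch, to illustrate the principle only (the paper's NFALFT is a fast, non-equispaced, multidimensional version): the strongest Fourier component is estimated directly on the irregular samples, subtracted from the residual, and the process is iterated before synthesizing the trace on a regular grid.

```python
import numpy as np

def alft_regularize(t_irregular, data, t_regular, freqs, n_iter=50):
    residual = data.astype(complex).copy()
    model = np.zeros(len(freqs), dtype=complex)
    for _ in range(n_iter):
        # estimate Fourier coefficients directly on the irregular samples
        coeffs = np.array([np.sum(residual * np.exp(-2j * np.pi * f * t_irregular))
                           for f in freqs]) / len(t_irregular)
        k = np.argmax(np.abs(coeffs))            # strongest (least leaked) component
        model[k] += coeffs[k]
        residual -= coeffs[k] * np.exp(2j * np.pi * freqs[k] * t_irregular)
    # synthesize the regularized trace on the desired regular grid
    basis = np.exp(2j * np.pi * np.outer(t_regular, freqs))
    return (basis @ model).real

# irregularly sampled sum of two sinusoids, reconstructed on a regular grid
rng = np.random.default_rng(0)
t_irr = np.sort(rng.uniform(0.0, 1.0, 80))
d = np.sin(2 * np.pi * 5 * t_irr) + 0.5 * np.sin(2 * np.pi * 12 * t_irr)
t_reg = np.linspace(0.0, 1.0, 101)
freqs = np.arange(-25, 26, 1.0)
d_reg = alft_regularize(t_irr, d, t_reg, freqs)
```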
Driving Processes of Earthquake Swarms: Evidence from High Resolution Seismicity
NASA Astrophysics Data System (ADS)
Ellsworth, W. L.; Shelly, D. R.; Hill, D. P.; Hardebeck, J.; Hsieh, P. A.
2017-12-01
Earthquake swarms are transient increases in seismicity deviating from a typical mainshock-aftershock pattern. Swarms are most prevalent in volcanic and hydrothermal areas, yet also occur in other environments, such as extensional fault stepovers. Swarms provide a valuable opportunity to investigate source zone physics, including the causes of their swarm-like behavior. To gain insight into this behavior, we have used waveform-based methods to greatly enhance standard seismic catalogs. Depending on the application, we detect and precisely relocate 2-10x as many events as included in the initial catalog. Recently, we have added characterization of focal mechanisms (applied to a 2014 swarm in Long Valley Caldera, California), addressing a common shortcoming in microseismicity analyses (Shelly et al., JGR, 2016). In analysis of multiple swarms (both within and outside volcanic areas), several features stand out, including: (1) dramatic expansion of the active source region with time, (2) tendency for events to occur on the immediate fringe of prior activity, (3) overall upward migration, and (4) complex faulting structure. Some swarms also show an apparent mismatch between seismicity orientations (as defined by patterns in hypocentral locations) and slip orientations (as inferred from focal mechanisms). These features are largely distinct from those observed in mainshock-aftershock sequences. In combination, these swarm behaviors point to an important role for fluid pressure diffusion. Swarms may in fact be generated by a cascade of fluid pressure diffusion and stress transfer: in cases where faults are critically stressed, an increase in fluid pressure will trigger faulting. Faulting will in turn dramatically increase permeability in the faulted area, allowing rapid equilibration of fluid pressure to the fringe of the rupture zone. This process may perpetuate until fluid pressure perturbations drop and/or stresses become further from failure, such that any
South-Central Tibetan Seismicity from HiCLIMB Seismic Array Data
NASA Astrophysics Data System (ADS)
Carpenter, S.; Nabelek, J.; Braunmiller, J.
2010-12-01
The HiCLIMB broadband passive seismic experiment (2002-2005) operated 233 sites along an 800-km-long north-south array extending from the Himalayan foreland into the Central Tibetan Plateau and a flanking 350x350 km lateral array in southern Tibet and eastern Nepal. We use data from the experiment's second phase (June 2004 to August 2005), when stations operated in Tibet, to locate earthquakes in south-central Tibet, a region with no permanent seismic network and about whose seismicity little is known. We used the Antelope software for automatic detection and arrival time picking, event-arrival association and event location. Requiring a low detection and event association threshold initially resulted in ~110,000 declared events. The large database size rendered manual inspection unfeasible, and we developed automated post-processing modules to weed out spurious detections and erroneous phase and event associations, which stemmed, e.g., from multiple coincident earthquakes within the array or misplaced seismicity from the great 2004 Sumatra earthquake. The resulting database contains ~32,000 events within 5° distance from the closest station. We consider ~7,600 events defined by more than 30 P and S arrivals well located and discuss them here. Seismicity in the subset correlates well with mapped faults and structures seen on satellite imagery, attesting to high location quality. This is confirmed by non-systematic, kilometer-scale differences between automatic and manual locations for selected events. Seismicity in south-central Tibet is intense north of the Yarlung-Tsangpo Suture. Almost 90% of events occurred in the Lhasa Terrane, mainly along north-south trending rifts. Vigorous activity (>4,800 events) accompanied two M>6 earthquakes in the Payang Basin (84°E), ~100 km west of the linear array. The Tangra-Yum Co (86.5°E) and Pumqu-Xianza (88°E) rifts were very active (~1,000 events) without dominant main shocks, indicating swarm-like behavior possibly related
Ultra high speed image processing techniques. [electronic packaging techniques
NASA Technical Reports Server (NTRS)
Anthony, T.; Hoeschele, D. F.; Connery, R.; Ehland, J.; Billings, J.
1981-01-01
Packaging techniques for ultra high speed image processing were developed. These techniques involve the development of a signal feedthrough technique through LSI/VLSI sapphire substrates. This allows the stacking of LSI/VLSI circuit substrates in a 3-dimensional package with greatly reduced length of interconnecting lines between the LSI/VLSI circuits. The reduced parasitic capacitances result in higher LSI/VLSI computational speeds at significantly reduced power consumption levels.
Medium effect on the characteristics of the coupled seismic and electromagnetic signals.
Huang, Qinghua; Ren, Hengxin; Zhang, Dan; Chen, Y John
2015-01-01
A recently developed numerical simulation technique can simulate the coupled seismic and electromagnetic signals for a double couple point source or a finite planar fault source. Besides the source effect, the simulation results showed that both medium structure and medium properties can affect the coupled seismic and electromagnetic signals. The waveform of the coupled signals for a layered structure is more complicated than that for a simple uniform structure. Unlike the seismic signals, the electromagnetic signals are sensitive to medium properties such as fluid salinity and fluid viscosity. Therefore, the co-seismic electromagnetic signals may be more informative than seismic signals.
Tectonics and seismicity of the southern Washington Cascade range
Stanley, W.D.; Johnson, S.Y.; Qamar, A.I.; Weaver, C.S.; Williams, J.M.
1996-01-01
Geophysical, geological, and seismicity data are combined to develop a transpressional strain model for the southern Washington Cascades region. We use this model to explain oblique fold and fault systems, transverse faults, and a linear seismic zone just west of Mt. Rainier known as the western Rainier zone. We also attempt to explain a concentration of earthquakes that connects the northwest-trending Mount St. Helens seismic zone to the north-trending western Rainier zone. Our tectonic model illustrates the pervasive effects of accretionary processes, combined with subsequent transpressive forces generated by oblique subduction, on Eocene to present crustal processes, such as seismicity and volcanism.
NASA Astrophysics Data System (ADS)
Talukdar, Karabi; Behera, Laxmidhar
2018-03-01
Imaging below basalt for hydrocarbon exploration is a global problem because of poor penetration and significant loss of seismic energy due to scattering, attenuation, absorption and mode-conversion when the seismic waves encounter a highly heterogeneous and rugose basalt layer. The conventional (short offset) seismic data acquisition, processing and modeling techniques adopted by the oil industry generally fail to image hydrocarbon-bearing sub-trappean Mesozoic sediments hidden below the basalt, which is considered a serious problem for hydrocarbon exploration worldwide. To overcome this difficulty of sub-basalt imaging, we have generated dense synthetic seismic data with the help of elastic finite-difference full-wave modeling using a staggered-grid scheme for the model derived from ray-trace inversion of sparse wide-angle seismic data acquired along the Sinor-Valod profile in the Deccan Volcanic Province of India. The full-wave synthetic seismic data were processed and imaged using a conventional seismic data processing sequence with Kirchhoff pre-stack time and depth migrations. The seismic image obtained correlates with all the structural features of the model obtained through ray-trace inversion of the wide-angle seismic data, validating the effectiveness of the robust elastic finite-difference full-wave modeling approach for imaging below thick basalts. Using full-wave modeling also allows us to decipher small-scale heterogeneities imposed in the model as a measure of the rugose basalt interfaces, which could not be dealt with by ray-trace inversion. Furthermore, we were able to accurately image thin low-velocity hydrocarbon-bearing Mesozoic sediments sandwiched between and hidden below two thick sequences of high-velocity basalt layers lying above the basement.
Passive monitoring for near surface void detection using traffic as a seismic source
NASA Astrophysics Data System (ADS)
Zhao, Y.; Kuzma, H. A.; Rector, J.; Nazari, S.
2009-12-01
In this poster we present preliminary results based on several field experiments in which we studied seismic detection of voids using a passive array of surface geophones. The source of seismic excitation is vehicle traffic on nearby roads, which we model as a continuous line source of seismic energy. Our passive seismic technique is based on cross-correlation of surface wave fields and study of the resulting power spectra, looking for "shadows" caused by the scattering effect of a void. High-frequency noise masks this effect in the time domain, so it is difficult to see on conventional traces. Our technique does not rely on phase distortions caused by small voids because they are generally too small to measure. Unlike traditional impulsive seismic sources, which generate highly coherent broadband signals, perfect for resolving phase but too weak for resolving amplitude, vehicle traffic affords a high-power signal in a frequency range that is optimal for finding shallow structures. Our technique results in clear detections of an abandoned railroad tunnel and a septic tank. The ultimate goal of this project is to develop a technology for the simultaneous imaging of shallow underground structures and traffic monitoring near these structures.
Prospective testing of neo-deterministic seismic hazard scenarios for the Italian territory
NASA Astrophysics Data System (ADS)
Peresan, Antonella; Magrin, Andrea; Vaccari, Franco; Kossobokov, Vladimir; Panza, Giuliano F.
2013-04-01
A reliable and comprehensive characterization of expected seismic ground shaking, eventually including the related time information, is essential in order to develop effective mitigation strategies and increase earthquake preparedness. Moreover, any effective tool for SHA must demonstrate its capability in anticipating the ground shaking associated with large earthquake occurrences, a result that can be attained only through a rigorous verification and validation process. So far, the major problems in classical probabilistic methods for seismic hazard assessment, PSHA, have consisted in the adequate description of earthquake recurrence, particularly for the largest and sporadic events, and of the attenuation models, which may be unable to account for the complexity of the medium and of the seismic sources and are often weakly constrained by the available observations. Current computational resources and physical knowledge of the seismic wave generation and propagation processes nowadays allow for viable numerical and analytical alternatives to the use of attenuation relations. Accordingly, a scenario-based neo-deterministic approach to seismic hazard assessment, NDSHA, has been proposed, which allows considering a wide range of possible seismic sources as the starting point for deriving scenarios by means of full waveform modeling. The method does not make use of attenuation relations and naturally supplies realistic time series of ground shaking, including reliable estimates of ground displacement readily applicable to seismic isolation techniques. Based on NDSHA, an operational integrated procedure for seismic hazard assessment has been developed that allows for the definition of time-dependent scenarios of ground shaking, through the routine updating of formally defined earthquake predictions. The integrated NDSHA procedure for seismic input definition, which is currently applied to the Italian territory, combines different pattern recognition techniques, designed
Research on Seismic Wave Attenuation in Gas Hydrates Layer Using Vertical Cable Seismic Data
NASA Astrophysics Data System (ADS)
Wang, Xiangchun; Liang, Lunhang; Wu, Zhongliang
2018-06-01
Vertical cable seismic (VCS) data are currently the most suitable seismic data for estimating the quality factor Q of layers beneath the sea bottom. Here the quality factor Q values are estimated using a high-precision logarithmic spectral ratio method applied to the VCS data. The estimated Q values are used to identify layers containing gas hydrates and free gas. The results show that the Q value in the layer with gas hydrates is larger, and the Q value in the layer with free gas is smaller, than in layers without gas hydrates or free gas. Additionally, the estimated Q values are used in inverse Q filtering to compensate for the attenuated high-frequency component of the seismic signal. The results show that the dominant frequency of the seismic signal is raised and the frequency band is broadened, so the resolution of the VCS data is effectively improved.
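A sketch of the logarithmic spectral-ratio idea behind such Q estimates (the details of the authors' high-precision variant are not reproduced here): for two receivers separated by travel time dt, ln[A2(f)/A1(f)] = const - pi*f*dt/Q, so Q follows from the slope of a straight-line fit over the usable frequency band.

```python
import numpy as np

def estimate_q(freqs, spec_shallow, spec_deep, delta_t, fmin, fmax):
    """Fit ln(A2/A1) vs frequency and convert the slope to Q."""
    band = (freqs >= fmin) & (freqs <= fmax)
    log_ratio = np.log(spec_deep[band] / spec_shallow[band])
    slope, _ = np.polyfit(freqs[band], log_ratio, 1)
    return -np.pi * delta_t / slope

# synthetic check: a layer with Q = 80 between two receivers 0.2 s apart
f = np.linspace(1.0, 100.0, 200)
a1 = np.exp(-0.01 * f)                                # arbitrary common source/receiver spectrum
a2 = a1 * np.exp(-np.pi * f * 0.2 / 80.0)             # attenuated spectrum at the deeper receiver
print(estimate_q(f, a1, a2, 0.2, 5.0, 60.0))          # recovers ~80
```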
NASA Astrophysics Data System (ADS)
Capar, Laure
2013-04-01
Within the framework of the transnational project GeoMol, geophysical and geological information on the entire Molasse Basin and on the Po Basin is gathered to build consistent cross-border 3D geological models based on borehole evidence and seismic data. Benefiting from important progress in seismic processing, these new models will provide answers to various questions regarding the usage of subsurface resources, such as geothermal energy, CO2 and gas storage, and oil and gas production, and will support decision-making by national and local administrations as well as by industry. More than 28,000 km of 2D seismic lines are compiled, reprocessed and harmonized. This work faces various problems, such as the vertical drop of more than 700 meters between the west and east of the Molasse Basin and, to a lesser extent, in the Po Plain, the heterogeneities of the substratum, the large disparities between the periods and parameters of seismic acquisition, and, depending on their availability, the use of two types of seismic data, raw and processed. The main challenge is to harmonize all lines to the same reference level, amplitude and stage of signal processing from France to Austria, spanning more than 1000 km, to avoid misfits at crossing points between seismic lines and artifacts at the country borders, facilitating the interpretation of the various geological layers in the Molasse Basin and Po Basin. A generalized stratigraphic column for the two basins is set up, representing all geological layers relevant to subsurface usage. This stratigraphy constitutes the harmonized framework for seismic reprocessing. In general, processed seismic data are available on paper at the stack stage, and the information required to take these seismic lines to the final stage of processing, the migration step, is the datum plane and the replacement velocity. However, several datum planes and replacement velocities were used during previous processing projects. Our processing sequence is to
Seismic Regionalization of Michoacan, Mexico and Recurrence Periods for Earthquakes
NASA Astrophysics Data System (ADS)
Magaña García, N.; Figueroa-Soto, Á.; Garduño-Monroy, V. H.; Zúñiga, R.
2017-12-01
Michoacán is one of the states with the highest occurrence of earthquakes in Mexico. It lies along the convergent margin produced by the subduction of the Cocos plate beneath the North American plate in the Pacific Ocean zone of the country, and it also hosts active faults within the state, such as the Morelia-Acambay Fault System (MAFS). It is important to combine seismic, paleoseismological and geological studies in order to achieve good planning and development of urban complexes and to mitigate disasters should destructive earthquakes occur. With statistical seismology it is possible to characterize the degree of seismic activity as well as to estimate recurrence periods for earthquakes. For this work, a seismicity catalog of Michoacán was compiled and homogenized in time and magnitude. This information was obtained from world and national agencies (SSN, CMT, etc.), some data published by Mendoza and Martínez-López (2016), and the seismic catalog homogenized by F. R. Zúñiga (personal communication). From the analysis of the different focal mechanisms reported in the literature and geological studies, the seismic regionalization of the state of Michoacán complements the one presented by Vázquez-Rosas (2012), and recurrence periods were obtained for earthquakes within the four seismotectonic regions. In addition, stable intervals were determined for the b value of the Gutenberg-Richter (1944) relation using the Maximum Curvature (MAXC) and EMR (Entire Magnitude Range, 2005) techniques, which allowed us to determine recurrence periods, with both the EMR and MAXC techniques, for earthquakes of magnitude above 7.5 in the subduction zone (zone A), above 5 in zone B1, above 7.0 in zone B2, and for the Morelia-Acambay Fault System zone (zone C)
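A sketch of the two b-value estimates mentioned above, under simple assumptions: the MAXC completeness magnitude Mc is taken at the most frequent magnitude bin (with the common +0.2 correction), b is obtained from the Aki/Utsu maximum-likelihood formula, and the recurrence period follows from the fitted Gutenberg-Richter relation. The full EMR method (which also models the incomplete part of the catalog) is not reproduced here.

```python
import numpy as np

def maxc_completeness(mags, bin_width=0.1):
    bins = np.arange(mags.min(), mags.max() + bin_width, bin_width)
    counts, edges = np.histogram(mags, bins=bins)
    return edges[np.argmax(counts)] + 0.2          # MAXC estimate plus common +0.2 correction

def b_value_ml(mags, mc, bin_width=0.1):
    # Aki/Utsu maximum-likelihood estimate with binning correction
    m = mags[mags >= mc]
    return 1.0 / (np.log(10) * (m.mean() - (mc - bin_width / 2.0)))

def recurrence_period_years(mags, years, m_target, mc, b):
    """Mean return period of M >= m_target from the Gutenberg-Richter fit."""
    n_mc = np.sum(mags >= mc)
    a_annual = np.log10(n_mc / years) + b * mc      # annual a-value
    return 1.0 / 10 ** (a_annual - b * m_target)

# synthetic catalog with b roughly 1, 50 years of data (illustration only)
mags = np.round(np.random.exponential(1.0 / np.log(10), 5000) + 3.0, 1)
mc = maxc_completeness(mags)
b = b_value_ml(mags, mc)
print(mc, b, recurrence_period_years(mags, 50.0, 7.5, mc, b))
```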
Patterns of significant seismic quiescence in the Pacific Mexican coast
NASA Astrophysics Data System (ADS)
Muñoz-Diosdado, Alejandro; Rudolf-Navarro, Adolfo; Barrera-Ferrer, Amilcar; Angulo-Brown, Fernando
2014-05-01
Mexico is one of the countries with the highest seismicity. During the 20th century, 8% of all earthquakes in the world of magnitude greater than or equal to 7.0 took place in Mexico. On average, an earthquake of magnitude greater than or equal to 7.0 occurs in Mexico every two and a half years. Great earthquakes in Mexico have their epicenters on the Pacific coast, where some seismic gaps have been identified; for example, there is a mature gap along the coast of Guerrero State, which can potentially produce an earthquake of magnitude 8.2. With the purpose of making some prognosis, some researchers study the statistical behavior of certain physical parameters that could be related to the process of stress accumulation in the Earth's crust. Other researchers study seismic catalogs, trying to find seismicity patterns that are manifested before the occurrence of great earthquakes. Many authors have proposed that the study of seismicity rates is an appropriate technique for evaluating how close a seismic gap may be to rupture. We designed an algorithm for the identification of patterns of significant seismic quiescence using the definition of seismic quiescence proposed by Schreider (1990). This algorithm shows the area of quiescence where an earthquake of great magnitude will probably occur. We applied our algorithm to the earthquake catalog of the Mexican Pacific coast located between 14 and 21 degrees North latitude and 94 and 106 degrees West longitude, with depths less than or equal to 60 km and magnitudes greater than or equal to 4.2, covering the period from September 1965 until December 2014. We have found significant patterns of seismic quiescence before the earthquakes of Oaxaca (November 1978, Mw = 7.8), Petatlán (March 1979, Mw = 7.6), Michoacán (September 1985, Mw = 8.0 and Mw = 7.6) and Colima (October 1995, Mw = 8.0). Fortunately, no earthquakes of great magnitude have occurred in Mexico so far this century; however, we have identified well-defined seismic
NASA Astrophysics Data System (ADS)
Orlecka-Sikora, Beata; Lasocki, Stanislaw; Leptokaropoulos, Konstantinos
2014-05-01
The community focused on seismic processes induced by human operations has been organized within the EPOS Integration Program as Working Group 10, Infrastructure for Georesources. This group has brought together representatives from the scientific community and industry from 13 European countries. WG10 aims to integrate the research infrastructure (RI) in the area of seismicity induced (IS) by human activity: tremors and rockbursts in underground mines, seismicity associated with conventional and unconventional oil and gas production, seismicity induced by geothermal energy extraction and by underground disposal and storage of liquids (e.g. water disposal associated with energy extraction) and gases (CO2 sequestration, inter alia), and seismicity triggered by the filling of surface water reservoirs, etc. WG10's priority is to create new research opportunities in the field, responding to global challenges connected with the exploitation of georesources. WG10 has prepared a model of integration fulfilling the scientific mission and raising the visibility of stakeholders. The end-state Induced Seismicity Thematic Core Service (IS TCS) has been designed together with key metrics for TCS benefits in four areas: scientific, societal, economic and capacity building. The IS-EPOS project, funded by the National Centre for Research and Development, Poland, within the program "Innovative Economy Operational Program Priority Axis 2 - R&D Infrastructure", aims at building a prototype of the IS TCS. The prototype will fully implement the designed logic of the IS TCS. The research infrastructure integrated within the prototype will comprise altogether seven comprehensive data cases of seismicity linked to deep mining, geothermal production, and reservoir impoundment. The implemented thematic services will enable studies within the use case "Clustering of induced earthquakes". The IS TCS prototype is expected to reach full functionality by the end of 2014.
NASA Astrophysics Data System (ADS)
Wang, L.; Toshioka, T.; Nakajima, T.; Narita, A.; Xue, Z.
2017-12-01
In recent years, more and more Carbon Capture and Storage (CCS) studies have focused on seismicity monitoring. For the safety management of geological CO2 storage at Tomakomai, Hokkaido, Japan, an Advanced Traffic Light System (ATLS) combining different seismic measures (magnitudes, phases, distributions, etc.) is proposed for injection control. The primary task for the ATLS is seismic event detection in a long-term, continuously recorded time series. Because the time-varying signal-to-noise ratio (SNR) of a long-term record and the uneven energy distribution of seismic event waveforms increase the difficulty of automatic seismic detection, an improved probabilistic autoregressive (AR) method for automatic seismic event detection is applied in this work. This algorithm, called sequentially discounting AR learning (SDAR), can identify effective seismic events in the time series through change point detection (CPD) applied to the seismic record. In this method, an anomalous signal (a seismic event) is treated as a change point in the time series (the seismic record): the statistical model of the signal in the neighborhood of the event point changes because of the seismic event occurrence. This means the SDAR aims to find the statistical irregularities of the record through CPD. SDAR has three advantages. 1. Noise robustness: SDAR does not use waveform attributes (such as amplitude, energy or polarization) for detection, making it an appropriate technique for low-SNR data. 2. Real-time estimation: when new data appear in the record, the probability distribution models are automatically updated by SDAR for on-line processing. 3. Discounting property: SDAR introduces a discounting parameter to reduce the influence of past data on the current statistics, which makes it a robust algorithm for non-stationary signal processing. With these three advantages, the SDAR method can handle the non-stationary, time-varying long
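A simplified sketch of an SDAR-style detector (not the Tomakomai implementation): AR coefficients and the prediction-error variance are updated recursively with a discounting factor, and each new sample is scored by its negative log predictive likelihood; a sustained rise in the score marks a candidate change point. The model order and discounting factor below are illustrative assumptions.

```python
import numpy as np

def sdar_scores(x, order=4, r=0.005):
    """Return an anomaly score per sample from a discounted AR model of the record x."""
    n = len(x)
    mu = 0.0
    C = np.zeros(order + 1)                        # discounted autocovariances, lags 0..order
    sigma2 = np.var(x[:order]) + 1e-6
    scores = np.zeros(n)
    for t in range(order, n):
        mu = (1 - r) * mu + r * x[t]
        xc = x[t - order:t + 1][::-1] - mu          # centred window, lag 0 first
        C = (1 - r) * C + r * xc[0] * xc
        # Yule-Walker solve for AR coefficients from the discounted autocovariances
        R = np.array([[C[abs(i - j)] for j in range(order)] for i in range(order)])
        a = np.linalg.solve(R + 1e-9 * np.eye(order), C[1:order + 1])
        pred = mu + a @ (x[t - order:t][::-1] - mu)
        err = x[t] - pred
        sigma2 = (1 - r) * sigma2 + r * err ** 2
        scores[t] = 0.5 * (np.log(2 * np.pi * sigma2) + err ** 2 / sigma2)
    return scores

# synthetic record: background noise with a transient "event"
x = np.random.randn(3000)
x[1500:1600] += 3.0 * np.sin(np.linspace(0, 30 * np.pi, 100))
s = sdar_scores(x)
```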
Using seismic derived lithology parameters for hydrocarbon indication
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Riel, P.; Sisk, M.
1996-08-01
The last two decades have shown a strong increase in the use of seismic amplitude information for direct hydrocarbon indication. However, working with seismic amplitudes (and seismic attributes) has several drawbacks: tuning effects must be handled; quantitative analysis is difficult because seismic amplitudes are not directly related to lithology; and seismic amplitudes are reflection events, making it unclear whether amplitude changes relate to lithology variations above or below the interface. These drawbacks are overcome by working directly on seismic-derived lithology data, lithology being a layer property rather than an interface property. Technology to extract lithology from seismic data has made great strides, and a large range of methods is now available to users, including: (1) bandlimited acoustic impedance (AI) inversion; (2) reconstruction of the low AI frequencies from seismic velocities, from spatial well log interpolation, and using constrained sparse spike inversion techniques; (3) full bandwidth reconstruction of multiple lithology properties (porosity, sand fraction, density, etc.) in time and depth using inverse modeling. For these technologies to be fully leveraged, accessibility by end users is critical. All these technologies are available as interactive 2D and 3D workstation applications, integrated with seismic interpretation functionality. Using field data examples, we demonstrate the impact of these different approaches on deriving lithology, and in particular show how accuracy and resolution are increased as more geologic and well information is added.
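A minimal illustration of the bandlimited impedance-inversion idea listed above: given a reflectivity series r_i estimated from the seismic trace, impedance is recovered recursively as AI_{i+1} = AI_i * (1 + r_i) / (1 - r_i), with the absolute level and the low frequencies supplied from well logs or velocities (the constrained sparse spike machinery itself is not shown).

```python
import numpy as np

def reflectivity_to_impedance(reflectivity, ai0):
    """Recursive impedance reconstruction from a reflectivity series."""
    ai = np.empty(len(reflectivity) + 1)
    ai[0] = ai0                                    # low-frequency / absolute level from wells
    for i, r in enumerate(reflectivity):
        ai[i + 1] = ai[i] * (1.0 + r) / (1.0 - r)
    return ai

# round-trip check on a blocky impedance model
ai_true = np.array([4000.0, 4000.0, 5500.0, 5500.0, 5000.0, 6500.0])
refl = (ai_true[1:] - ai_true[:-1]) / (ai_true[1:] + ai_true[:-1])
print(reflectivity_to_impedance(refl, ai_true[0]))   # reproduces ai_true
```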
New Version of SeismicHandler (SHX) based on ObsPy
NASA Astrophysics Data System (ADS)
Stammler, Klaus; Walther, Marcus
2016-04-01
The command line version of SeismicHandler (SH), a scientific analysis tool for seismic waveform data developed around 1990, has been redesigned in recent years, based on a project funded by the Deutsche Forschungsgemeinschaft (DFG). The aim was to address new data access techniques, simplified metadata handling and a modularized software design. As a result the program was rewritten in Python in its main parts, taking advantage of the simplicity of this scripting language and its variety of well-developed software libraries, including ObsPy. SHX provides easy access to waveforms and metadata via arclink and FDSN webservice protocols; access to event catalogs is also implemented. With single commands, whole networks or stations within a certain area may be read in; the metadata are retrieved from the servers and stored in a local database. For data processing the large set of SH commands is available, as well as the SH scripting language. Via SH-language scripts or additional Python modules, the command set of SHX is easily extendable. The program is open source and tested on Linux operating systems; documentation and downloads are found at the URL "https://www.seismic-handler.org/".
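An example of the kind of FDSN data and metadata access that ObsPy-based tools such as SHX build on, using plain ObsPy (this is generic ObsPy usage, not SHX command syntax; the station and time window are arbitrary examples).

```python
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("IRIS")                         # any FDSN web service endpoint
t0 = UTCDateTime("2015-01-01T00:00:00")

# ten minutes of waveforms for one station, plus the matching response metadata
st = client.get_waveforms("IU", "ANMO", "00", "BHZ", t0, t0 + 600)
inv = client.get_stations(network="IU", station="ANMO", level="response")

st.remove_response(inventory=inv, output="VEL")  # convert to ground velocity
print(st)
```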
Methods for Improving Seismic Event Location Processing
2004-10-22
Only fragments of this report are legible, including the following references: Rubin et al. (1998), A re-examination of seismicity associated with the January 1983 dike intrusion at Kilauea volcano, Hawaii, J. Geophys. Res., 103, 10,003-...; recent relocation studies of earthquakes in Hawaii (Rubin et al., 1998), California (Waldhauser et al., 1999), and at the Soultz geothermal area (Rowe et al., ...); multiplet relative relocation beneath the south flank of Kilauea, J. Geophys. Res., 99, 15,375-15,386; Hattingh, M. (1988), A new data adaptive ...
The Use of Signal Dimensionality for Automatic QC of Seismic Array Data
NASA Astrophysics Data System (ADS)
Rowe, C. A.; Stead, R. J.; Begnaud, M. L.; Draganov, D.; Maceira, M.; Gomez, M.
2014-12-01
A significant problem in seismic array analysis is the inclusion of bad sensor channels in the beam-forming process. We are testing an approach to automated, on-the-fly quality control (QC) to aid in the identification of poorly performing sensor channels prior to beam-forming in routine event detection or location processing. The idea stems from methods used for large computer servers, where monitoring traffic at enormous numbers of nodes is impractical on a node-by-node basis, so the dimensionality of the node traffic is instead monitored for anomalies that could represent malware, cyber-attacks or other problems. The technique relies upon the use of subspace dimensionality or principal components of the overall system traffic. The subspace technique is not new to seismology, but its most common application has been limited to comparing waveforms to an a priori collection of templates for detecting highly similar events in a swarm or seismic cluster. We examine the signal dimension in a similar way to the method addressing node traffic anomalies in large computer systems. We explore the effects of malfunctioning channels on the dimension of the data and its derivatives, and how to leverage this effect for identifying bad array elements. We show preliminary results applied to arrays in Kazakhstan (Makanchi) and Argentina (Malargue).
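A sketch of the dimensionality idea in a generic PCA formulation (not the authors' implementation): windowed channel data are projected onto the leading principal components of the array covariance, and channels whose energy is poorly explained by that low-dimensional subspace are flagged. The component count and threshold are illustrative assumptions.

```python
import numpy as np

def channel_qc(data, n_components=3, threshold=3.0):
    """data: (n_channels, n_samples) array of a common time window; returns flagged channel indices."""
    X = data - data.mean(axis=1, keepdims=True)
    U, s, _ = np.linalg.svd(X, full_matrices=False)
    P = U[:, :n_components]                         # leading spatial patterns of the array
    residual = X - P @ (P.T @ X)                    # part of each channel not in the subspace
    misfit = residual.std(axis=1) / X.std(axis=1)   # fraction of unexplained energy per channel
    mad = np.median(np.abs(misfit - np.median(misfit))) + 1e-12
    z = (misfit - np.median(misfit)) / (1.4826 * mad)
    return np.where(z > threshold)[0], misfit

# a coherent wavefront on 9 channels, one channel replaced by pure noise
t = np.linspace(0, 10, 2000)
arr = np.array([np.sin(2 * np.pi * 1.5 * (t - 0.02 * k)) for k in range(9)])
arr += 0.1 * np.random.randn(*arr.shape)
arr[4] = np.random.randn(len(t))                    # "dead"/noisy element
bad, _ = channel_qc(arr)
print("flagged channels:", bad)
```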
Improved 3D seismic images of dynamic deformation in the Nankai Trough off Kumano
NASA Astrophysics Data System (ADS)
Shiraishi, K.; Moore, G. F.; Yamada, Y.; Kinoshita, M.; Sanada, Y.; Kimura, G.
2016-12-01
In order to improve the seismic reflection image of dynamic deformation and seismogenic faults in the Nankai Trough, the 2006 Kumano 3D seismic dataset was reprocessed from the original field records by applying advanced technologies a decade after the data acquisition and initial processing. The 3D seismic survey revealed the geometry of the megasplay fault system. However, there were still unclear regions in the accretionary prism extending from the Kumano Basin to the outer ridge, because of seafloor multiple reflections and noise caused by the Kuroshio Current. For the next stage of deep scientific drilling into the Nankai Trough seismogenic zone, it is essential to know exactly the shape and depth of the megasplay, and the fine structures around the drilling site. Three important improvements were achieved in data processing before imaging. First, full deghosting and optimized zero-phasing techniques recovered broadband signals, especially at low frequencies, by compensating for ghost effects at both source and receiver and removing source bubbles. Second, the multiple reflections were better attenuated by applying advanced techniques in combination, and the strong noise caused by the Kuroshio was carefully suppressed. Third, data regularization by means of optimized 4D trace interpolation was effective both in mitigating the non-uniform fold distribution and in improving data quality. Further imaging steps led to an obvious improvement over previous results by applying PSTM with higher-order correction of VTI anisotropy, and PSDM based on a velocity model built by reflection tomography with TTI anisotropy. The final reflection images show new geological aspects, such as clear steeply dipping faults around the "notch" and fine-scale faults related to the main thrusts in the frontal thrust zone. The improved images will contribute greatly to understanding the deformation process in the old accretionary prism and the seismogenic features related to the megasplay faults.
NASA Astrophysics Data System (ADS)
Kim, W.; Kim, Y.; Min, D.; Oh, J.; Huh, C.; Kang, S.
2012-12-01
During the last two decades, CO2 sequestration in the subsurface has been extensively studied and has progressed as a direct tool to reduce CO2 emissions. Commercial projects such as Sleipner, In Salah and Weyburn, which inject more than one million tons of CO2 per year, are operated actively, as are test projects such as Ketzin, to study the behavior of CO2 and the monitoring techniques. Korea has also begun a CCS (CO2 capture and storage) project. One of the prospects for CO2 sequestration in Korea is the southwestern continental margin of the Ulleung Basin. To monitor the behavior of CO2 underground for the evaluation of stability and safety, several geophysical monitoring techniques should be applied. Among the various geophysical monitoring techniques, the seismic survey is considered the most effective tool. To verify CO2 migration in the subsurface more effectively, seismic numerical simulation is an essential process. Furthermore, the efficiency of seismic migration techniques should be investigated for various cases, because numerical seismic simulation and migration tests help us accurately interpret CO2 migration. In this study, we apply reverse-time migration and Kirchhoff migration to synthetic seismic monitoring data generated for a simplified model based on the geological structures of the Ulleung Basin in Korea. Synthetic seismic monitoring data are generated for various cases of CO2 migration in the subsurface. From the seismic migration images, we can investigate CO2 diffusion patterns indirectly. From the seismic monitoring simulation, it is noted that while reverse-time migration generates clear subsurface images when subsurface structures are steeply dipping, Kirchhoff migration has an advantage in imaging horizontally layered structures such as the depositional sediments appearing on the continental shelf. Reverse-time migration and Kirchhoff migration present reliable subsurface images for the potential site characterized by stratigraphic traps. In case of
NASA Astrophysics Data System (ADS)
Frassetto, A.; Busby, R. W.; Hafner, K.; Woodward, R.; Sauter, A.
2013-12-01
In preparation for the upcoming deployment of EarthScope's USArray Transportable Array (TA) in Alaska, the National Science Foundation (NSF) has supported exploratory work on seismic station design, sensor emplacement, and communication concepts appropriate for this challenging high-latitude environment. IRIS has installed several experimental stations to evaluate different sensor emplacement schemes, both in Alaska and in the lower 48 states of the U.S. The goal of these tests is to maintain or enhance a station's noise performance while minimizing its footprint and the weight of the equipment, materials, and overall expense required for its construction. Motivating this approach are recent developments in posthole broadband seismometer design and the unique conditions for operating in Alaska, where there are few roads, cellular communications are scarce, most areas are only accessible by small plane or helicopter, and permafrost underlies much of the state. We will review the methods used for directly emplacing broadband seismometers in comparison to the current methods used for the lower-48 TA. These new methods primarily focus on using a portable drill to bore a hole three to five meters deep, beneath the active layer of the permafrost, or on coring 1-2 meters into surface bedrock. Both methods have been logistically effective in preliminary trials. Subsequent station performance has been assessed quantitatively using probability density functions summed from power spectral density estimates. These are calculated for the continuous time series of seismic data recorded for each channel of the seismometer. There are five test stations currently operating in Alaska. One was deployed in August 2011 and the remaining four in October 2012. Our results show that the performance of seismometers in Alaska with auger-hole or core-hole installations can sometimes exceed that of the quietest TA stations in the lower 48, particularly on horizontal components at long periods. A
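A sketch of the PSD/probability-density noise assessment described above, using ObsPy's PPSD class (which follows the McNamara and Buland approach); the file names and station codes are placeholders, not the actual test-station data.

```python
from obspy import read, read_inventory
from obspy.signal import PPSD

# one day of continuous data and the matching response metadata (placeholder file names)
st = read("XX.TEST1..BHZ.mseed")
inv = read_inventory("XX.TEST1.xml")
tr = st.select(channel="BHZ")[0]

ppsd = PPSD(tr.stats, metadata=inv)    # set up the probability density of PSD estimates
ppsd.add(st)                           # accumulate overlapping PSD segments into the PDF
ppsd.plot(show_noise_models=True)      # compare against the NLNM/NHNM reference curves
```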
National Seismic Network of Georgia
NASA Astrophysics Data System (ADS)
Tumanova, N.; Kakhoberashvili, S.; Omarashvili, V.; Tserodze, M.; Akubardia, D.
2016-12-01
Georgia, as part of the Southern Caucasus, is a tectonically active and structurally complex region. It is one of the most active segments of the Alpine-Himalayan collision belt. The deformation and the associated seismicity are due to the continent-continent collision between the Arabian and Eurasian plates. Seismic monitoring of the country and the quality of seismic data are the major tools for rapid response policy, population safety, basic scientific research and, ultimately, the sustainable development of the country. The National Seismic Network of Georgia has been developing since the end of the 19th century. The digital era of the network started in 2003. Currently, continuous data streams from 25 stations are acquired and analyzed in real time. The data are combined to calculate rapid locations and magnitudes for earthquakes. Information on the larger events (Ml>=3.5) is simultaneously transferred to the website of the monitoring center and to the relevant governmental agencies. To improve rapid earthquake location and magnitude estimation, the seismic network was enhanced by installing 7 additional stations. Each new station is equipped with collocated broadband and strong-motion seismometers as well as a permanent GPS system. To select the sites for the 7 new base stations, we used standard network optimization techniques, taking into account the geometry of the existing seismic network and the topographic conditions of each site. For each site we studied the local geology (Vs30 was mandatory for each site), the local noise level and the seismic vault construction parameters. Because of the country's relief, some stations were installed in high mountains that are not accessible in winter due to heavy snow conditions. To secure online data transmission we used satellite data transmission as well as cellular data network coverage from different local companies. As a result we already have improved earthquake locations and event magnitudes. We
The New Italian Seismic Hazard Model
NASA Astrophysics Data System (ADS)
Marzocchi, W.; Meletti, C.; Albarello, D.; D'Amico, V.; Luzi, L.; Martinelli, F.; Pace, B.; Pignone, M.; Rovida, A.; Visini, F.
2017-12-01
of the different components of the PSHA model that has been built through three different independent steps: a formal experts' elicitation, the outcomes of the testing phase, and the correlation between the outcomes. Finally, we explore through different techniques the influence on seismic hazard of the declustering procedure.
The 2017 Maple Creek Seismic Swarm in Yellowstone National Park
NASA Astrophysics Data System (ADS)
Pang, G.; Hale, J. M.; Farrell, J.; Burlacu, R.; Koper, K. D.; Smith, R. B.
2017-12-01
The University of Utah Seismograph Stations (UUSS) performs near-real-time monitoring of seismicity in the region around Yellowstone National Park in partnership with the United States Geological Survey and the National Park Service. UUSS operates and maintains 29 seismic stations with network code WY (short-period, strong-motion, and broadband) and records data from five other seismic networks (IW, MB, PB, TA, and US) to enhance the location capabilities in the Yellowstone region. A seismic catalog is produced using a conventional STA/LTA detector and single-event location techniques (Hypoinverse). On June 12, 2017, a seismic swarm began in Yellowstone National Park about 5 km east of Hebgen Lake. The swarm is adjacent to the source region of the 1959 MW 7.3 Hebgen Lake earthquake, in an area corresponding to positive Coulomb stress change from that event. As of Aug. 1, 2017, the swarm consists of 1481 earthquakes: 1 earthquake above magnitude 4, 8 earthquakes in the magnitude 3 range, 115 in the magnitude 2 range, 469 in the magnitude 1 range, 856 in the magnitude 0 range, 22 with negative magnitudes, and 10 with no magnitude. Earthquake depths are mostly between 3 and 10 km, and depth increases toward the northwest. Moment tensors for the 2 largest events (3.6 MW and 4.4 MW) show strike-slip faulting with T axes oriented NE-SW, consistent with the regional stress field. We are currently using waveform cross-correlation methods to measure differential travel times that are being used with the GrowClust program to generate high-accuracy relative relocations. Those locations will be used to identify structures in the seismicity and make inferences about the tectonic and magmatic processes causing the swarm.
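A sketch of measuring the kind of cross-correlation differential travel times that feed relocation codes such as GrowClust, using ObsPy's cross-correlation utilities; the window lengths, maximum shift, and the surrounding pick/station bookkeeping are simplified placeholders.

```python
from obspy.signal.cross_correlation import correlate, xcorr_max

def differential_time(tr1, tr2, pick1, pick2, pre=0.2, post=1.0, max_shift=0.5):
    """P travel-time difference between two events at one station, refined by cross-correlation.

    tr1, tr2: ObsPy Traces for the two events; pick1, pick2: UTCDateTime catalog picks.
    """
    w1 = tr1.slice(pick1 - pre, pick1 + post).data.astype(float)
    w2 = tr2.slice(pick2 - pre, pick2 + post).data.astype(float)
    n = min(len(w1), len(w2))
    shift_samples = int(max_shift * tr1.stats.sampling_rate)
    cc = correlate(w1[:n], w2[:n], shift_samples)
    shift, value = xcorr_max(cc)                     # lag (samples) and correlation coefficient
    dt_cc = shift / tr1.stats.sampling_rate
    # catalog pick difference refined by the measured lag; verify the lag sign convention
    # of `correlate` for the ObsPy version in use before feeding a relocation code.
    return (pick1 - pick2) + dt_cc, value
```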
Seismic Risk Mitigation of Historical Minarets Using SMA Wire Dampers
NASA Astrophysics Data System (ADS)
El-Attar, Adel G.; Saleh, Ahmed M.; El-Habbal, Islam R.
2008-07-01
This paper presents the results of a research program sponsored by the European Commission through project WIND-CHIME (Wide Range Non-INtrusive Devices toward Conservation of HIstorical Monuments in the MEditerranean Area), in which the possibility of using advanced seismic protection technologies to preserve historical monuments in the Mediterranean area is investigated. In the current research, two outstanding Egyptian Mamluk-style minarets are investigated. The first is the southern minaret of Al-Sultaniya (1340 A.D., 739 Hijri Date (H.D.)); the second is the minaret of Qusun (1337 A.D., 736 H.D.); both are located within the city of Cairo. Based on previous studies of the minarets by the authors, a seismic retrofit technique is proposed. The technique utilizes shape memory alloy (SMA) wires as dampers for the upper, more flexible parts of the minarets, in addition to vertical pre-stressing of the lower parts found to be prone to tensile cracking under ground excitation. The effectiveness of the proposed technique is numerically evaluated via nonlinear transient dynamic analyses. The results indicate the effectiveness of the technique in mitigating the seismic hazard, demonstrated by the effective reduction in stresses and in dynamic response.
Seismic Interface Waves in Coastal Waters: A Review
1980-11-15
Being at the low-frequency end of classical sonar activity and at the high-frequency end of seismic research, the propagation of infrasonic energy ... water areas. Certainly this and other seismic detection methods will never replace the highly developed sonar techniques, but in coastal waters they ... for many sonar purposes [5, 85 to 90] shows that very simple bottom models may already be sufficient to make allowance for the influence of the sea
The Budget Guide to Seismic Network Management
NASA Astrophysics Data System (ADS)
Hagerty, M. T.; Ebel, J. E.
2007-05-01
Regardless of their size, there are certain tasks that all seismic networks must perform, including data collection and processing, earthquake location, information dissemination, and quality control. Small seismic networks are unlikely to possess the resources -- manpower and money -- required to do much in-house development. Fortunately, there are a lot of free or inexpensive software solutions available that are able to perform many of the required tasks. Often the available solutions are all-in-one turnkey packages designed and developed for much larger seismic networks, and the cost of adapting them to a smaller network must be weighed against the ease with which other, non-seismic software can be adapted to the same task. We describe here the software and hardware choices we have made for the New England Seismic Network (NESN), a sparse regional seismic network responsible for monitoring and reporting all seismicity within the New England region in the northeastern U.S. We have chosen to use a cost-effective approach to monitoring using free, off-the-shelf solutions where available (e.g., Earthworm, HYP2000) and modifying freeware solutions when it is easier than trying to adapt a large, complicated package. We have selected for use software that is: free, likely to receive continued support from the seismic or, preferably, larger internet community, and modular. Modularity is key to our design because it ensures that if one component of our processing system becomes obsolete, we can insert a suitable replacement with few modifications to the other modules. Our automated event detection, identification and location system is based on a wavelet transform analysis of station data that arrive continuously via TCP/IP transmission over the internet. Our system for interactive analyst review of seismic events and remote system monitoring utilizes a combination of Earthworm modules, Perl cgi-bin scripts, Java, and native Unix commands and can now be carried out via
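The NESN detector described above is based on a wavelet transform analysis; as an example of the kind of free, off-the-shelf building block the abstract advocates, here is a standard STA/LTA trigger using ObsPy and its bundled example waveform (the filter band, window lengths and thresholds are illustrative only).

```python
from obspy import read
from obspy.signal.trigger import classic_sta_lta, trigger_onset

st = read()                                    # ObsPy's bundled example waveform
tr = st[0].copy()
tr.filter("bandpass", freqmin=1.0, freqmax=10.0)

df = tr.stats.sampling_rate
cft = classic_sta_lta(tr.data, int(1.0 * df), int(10.0 * df))   # 1 s STA, 10 s LTA
triggers = trigger_onset(cft, 2.5, 1.0)        # on/off thresholds on the characteristic function
for on, off in triggers:
    print("trigger:", tr.stats.starttime + on / df, "to", tr.stats.starttime + off / df)
```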
Bayesian Inference for Signal-Based Seismic Monitoring
NASA Astrophysics Data System (ADS)
Moore, D.
2015-12-01
Traditional seismic monitoring systems rely on discrete detections produced by station processing software, discarding significant information present in the original recorded signal. SIG-VISA (Signal-based Vertically Integrated Seismic Analysis) is a system for global seismic monitoring through Bayesian inference on seismic signals. By modeling signals directly, our forward model is able to incorporate a rich representation of the physics underlying the signal generation process, including source mechanisms, wave propagation, and station response. This allows inference in the model to recover the qualitative behavior of recent geophysical methods including waveform matching and double-differencing, all as part of a unified Bayesian monitoring system that simultaneously detects and locates events from a global network of stations. We demonstrate recent progress in scaling up SIG-VISA to efficiently process the data stream of global signals recorded by the International Monitoring System (IMS), including comparisons against existing processing methods that show increased sensitivity from our signal-based model and in particular the ability to locate events (including aftershock sequences that can tax analyst processing) precisely from waveform correlation effects. We also provide a Bayesian analysis of an alleged low-magnitude event near the DPRK test site in May 2010 [1] [2], investigating whether such an event could plausibly be detected through automated processing in a signal-based monitoring system. [1] Zhang, Miao and Wen, Lianxing. "Seismological Evidence for a Low-Yield Nuclear Test on 12 May 2010 in North Korea". Seismological Research Letters, January/February 2015. [2] Richards, Paul. "A Seismic Event in North Korea on 12 May 2010". CTBTO SnT 2015 oral presentation, video at https://video-archive.ctbto.org/index.php/kmc/preview/partner_id/103/uiconf_id/4421629/entry_id/0_ymmtpps0/delivery/http
Improving Seismic Event Characterisation
1996-07-22
... classification and further phase identification. 6.4.3 Seismic event interpretation: the system of event processing is based on an assumption tree ... and is enhanced with use by a network. Subject terms: seismic models, travel times, phase identification. ...
Seismic Oceanography's Failure to Flourish: A Possible Solution
NASA Astrophysics Data System (ADS)
Ruddick, B. R.
2018-01-01
A recent paper in Journal of Geophysical Research: Oceans used multichannel seismic observations to map estimates of internal wave mixing in the Gulf of Mexico, finding greatly enhanced mixing over the slope region. These results suggest that the ocean margins may supply the mixing required to close the global thermohaline circulation, and the techniques demonstrated here might be used to map mixing over much of the world's continental shelves. The use of multichannel seismics to image ocean phenomena is nearly 15 years old, and despite the initial promise, the techniques have not become as broadly used as initially expected. We discuss possible reasons for this, and suggest an alternative approach that might gain broader success.
NASA Astrophysics Data System (ADS)
Fedotov, S. A.; Slavina, L. B.; Senyukov, S. L.; Kuchay, M. S.
2015-12-01
Seismic and volcanic processes in the area of the northern group of volcanoes (NGV) on the Kamchatka Peninsula that accompanied the Great Tolbachik Fissure Eruption (GTFE) of 1975-1976 and the Tolbachik Fissure Eruption (TFE, named "50 let IViS" for the anniversary of the Institute of Volcanology and Seismology, Far East Branch, Russian Academy of Sciences) of 2012-2013, together with the seismic activity between these events, are considered. The features of the evolution of seismic processes at the major NGV volcanoes (Ploskii Tolbachik, Klyuchevskoy, Bezymannyi, and Shiveluch) are revealed. The distribution of earthquakes with depth, their spatial and temporal migration, and the relation between seismic and volcanic activity are discussed. The major features of seismic activity during the GTFE preparation and evolution, and the development of the earthquake series preceding the origin of the northern and southern breaks, are described. The character of seismic activity between the GTFE and TFE is shown, as are the major peculiarities of the evolution of seismic activity preceding and accompanying the TFE. The major magma sources and conduits of the NGV volcanoes are identified, as is the existence of a main conduit in the mantle and a common intermediate source for the entire NGV, the depth of which is 25-35 km according to seismic data. The depth of a neutral buoyancy layer below the NGV is 15-20 km, and the source of areal volcanism of magnesian basalts northeast of the Klyuchevskoy volcano is located at a depth of ~20 km. These data support the major properties of the 2010 geophysical model of the magmatic feeding system of the Klyuchevskoy group of volcanoes. The present paper covers a wider NGV area and is based on real experimental observations.
NASA Astrophysics Data System (ADS)
Sens-Schönfelder, Christoph; Pomponi, Eraldo
2014-05-01
The velocity of seismic waves propagating in the edifice of Piton de la Fournaise volcano (La Réunion) is known to change in response to volcanic eruptions. Here we present a detailed investigation of the period from the end of 2009 until the end of 2011, which contains eruptions, non-eruptive intrusions and periods of relaxation, and perform a detailed comparison of the associated velocity signals. We use data from 21 seismograph stations of the IPGP/OVPF seismic network installed on Piton de la Fournaise volcano within the UnderVolc project. Seismic noise from the vertical and horizontal components of all possible station pairs is cross-correlated in chunks of 24 hours to obtain daily approximations of the Green's functions, in order to monitor tiny changes therein that are related to changes of the elastic properties in the volcano. Velocity changes are measured as an apparent stretching of the coda. For some station pairs the apparent velocity changes exceed 1%, and a decorrelation of waveforms is observed at the time of volcanic activity. This distorts monitoring results if changes are measured with respect to a global reference. To overcome this, we present a method to estimate changes using multiple references that stabilizes the quality of the estimated velocity changes. We observe abrupt changes that occur coincident with volcanic events as well as long-term transient signals. Using a simple assumption about the spatial sensitivity of our measurements, we can map the spatial distribution of velocity changes for selected periods. Comparing these signals with volcanic activity and GPS-derived surface displacement, we can identify patterns of the velocity changes that appear characteristic of the different types of volcanic activity. We can differentiate intrusive processes associated with inflation and increased seismic activity, periods of relaxation without seismicity, and eruptions solely on the basis of the velocity signal. This information can help to assess the processes acting in
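A sketch of the coda-stretching measurement behind such velocity-change estimates, in a generic single-reference formulation (the multi-reference scheme described above is not reproduced here); the trial dv/v grid and coda window are illustrative assumptions.

```python
import numpy as np

def stretching_dvv(reference, current, t, trial_dvv=np.linspace(-0.02, 0.02, 401)):
    """Return the dv/v value that best maps the current coda onto the reference, and its correlation."""
    best_cc, best = -np.inf, 0.0
    for eps in trial_dvv:
        # a homogeneous relative velocity change dv/v = eps stretches lapse time by (1 - eps)
        stretched = np.interp(t * (1.0 - eps), t, current)
        cc = np.corrcoef(reference, stretched)[0, 1]
        if cc > best_cc:
            best_cc, best = cc, eps
    return best, best_cc

# synthetic check: the "current" coda corresponds to a 0.5 % velocity increase
t = np.linspace(20.0, 80.0, 3000)                      # coda window (s)
ref = np.sin(2 * np.pi * 0.8 * t) * np.exp(-t / 60.0)
cur = np.sin(2 * np.pi * 0.8 * t * 1.005) * np.exp(-t / 60.0)
print(stretching_dvv(ref, cur, t))                      # ~ +0.005
```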
NASA Astrophysics Data System (ADS)
Sollberger, David; Schmelzbach, Cedric; Robertsson, Johan O. A.; Greenhalgh, Stewart A.; Nakamura, Yosio; Khan, Amir
2016-04-01
We present a new seismic velocity model of the shallow lunar crust, including, for the first time, shear wave velocity information. So far, the shear wave velocity structure of the lunar near-surface was effectively unconstrained due to the complexity of lunar seismograms. Intense scattering and low attenuation in the lunar crust lead to characteristic long-duration reverberations on the seismograms. The reverberations obscure later arriving shear waves and mode conversions, rendering them impossible to identify and analyze. Additionally, only vertical component data were recorded during the Apollo active seismic experiments, which further compromises the identification of shear waves. We applied a novel processing and analysis technique to the data of the Apollo 17 lunar seismic profiling experiment (LSPE), which involved recording seismic energy generated by several explosive packages on a small areal array of four vertical component geophones. Our approach is based on the analysis of the spatial gradients of the seismic wavefield and yields key parameters such as apparent phase velocity and rotational ground motion as a function of time (depth), which cannot be obtained through conventional seismic data analysis. These new observables significantly enhance the interpretability of the recorded seismic wavefield and allow, for example, for the identification of S-wave arrivals based on their lower apparent phase velocities and the distinctly higher rotational motion they generate relative to compressional (P) waves. Using our methodology, we successfully identified pure-mode and mode-converted refracted shear wave arrivals in the complex LSPE data and derived a P- and S-wave velocity model of the shallow lunar crust at the Apollo 17 landing site. The extracted elastic-parameter model supports the current understanding of the lunar near-surface structure, suggesting a thin layer of low-velocity lunar regolith overlying a heavily fractured crust of basaltic
NASA Astrophysics Data System (ADS)
WANG, Q.
2017-12-01
We used the finite element analysis software GeoStudio to establish a vibration analysis model of the Qianjiangping landslide, located in the Three Gorges Reservoir area. In the QUAKE/W module, we chose appropriate dynamic elasticity moduli and Poisson's ratios for the soil layer and rock stratum. For loading, we selected waveform records from the Three Gorges Telemetric Seismic Network as the input ground motion, comprising five rupture events recorded at the Lujiashan seismic station. In the dynamic simulation, we focused mainly on the sliding process while the earthquake records were applied. The simulation results show that the Qianjiangping landslide was not only affected by its own static forces but also experienced a dynamic process of micro-fracture, creep-slip rupture, and creep-slip. This provides a new approach for assessing the feasibility of early warning of rock landslides in future research.
Bedload transport from spectral analysis of seismic noise near rivers
NASA Astrophysics Data System (ADS)
Hsu, L.; Finnegan, N. J.; Brodsky, E. E.
2010-12-01
Channel change in rivers is driven by bedload sediment transport. However, the nonlinear nature of sediment transport, combined with the difficulty of making direct observations in rivers during floods, hinders prediction of the timing and magnitude of bedload movement. Recent studies have shown that spectral analysis of seismic noise from seismometers near rivers reveals a correlation between the relative amplitude of high-frequency (>1 Hz) seismic noise and conditions for bedload transport, presumably from the energy transferred by clast collisions with the channel. However, a previous study in the Himalayas did not contain extensive bedload transport or discharge measurements, and the correspondence of seismic noise with proxy variables such as regional hydrologic and meteorologic data was not exact. A more complete understanding of the relationship between bedload transport and seismic noise would be valuable for extending the spatial and temporal extent of bedload data. To explore the direct relationship between bedload transport and seismic noise, we examine data from several seismic stations near the Trinity River in California, where the fluvial morphodynamics and bedload rating curves have been studied extensively. We compare the relative amplitude of the ambient seismic noise with records of water discharge and sediment transport. We also examine the noise at hourly, daily, and seasonal timescales to determine other possible sources of noise. We report the influence of variables such as local river slope, adjacent geology, anthropogenic noise, and distance from the river. The results illustrate the feasibility of using existing seismic arrays to sense radiated energy from processes of bedload transport. In addition, the results can be used to design future seismic array campaigns to optimize information about bedload transport. This technique provides great spatial and temporal coverage, and can be performed where direct bedload measurements are difficult or
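The quantity typically compared against discharge in such studies, the power of >1 Hz seismic noise, can be computed along the following lines; the station data, band limits, and window lengths here are illustrative placeholders, not the configuration used by the authors.

```python
# Hedged sketch: median power of >1 Hz seismic noise in hourly windows.
# `trace` (ground velocity) and `fs` stand in for real station data.
import numpy as np
from scipy.signal import welch

def hourly_band_power(trace, fs, fmin=1.0, fmax=20.0):
    """Median PSD power in [fmin, fmax] Hz for each hour-long chunk."""
    n_hour = int(3600 * fs)
    powers = []
    for start in range(0, len(trace) - n_hour + 1, n_hour):
        f, psd = welch(trace[start:start + n_hour], fs=fs, nperseg=int(60 * fs))
        band = (f >= fmin) & (f <= fmax)
        powers.append(np.median(psd[band]))
    return np.array(powers)

# Synthetic one-day record: the high-frequency noise level rises mid-day,
# mimicking increased bedload transport during higher discharge.
fs = 50.0
t = np.arange(0, 24 * 3600, 1 / fs)
amplitude = 1.0 + 2.0 * np.exp(-((t - 12 * 3600) / (4 * 3600)) ** 2)
trace = amplitude * np.random.randn(t.size)
print(hourly_band_power(trace, fs))
```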
Joint seismic-infrasonic processing of recordings from a repeating source of atmospheric explosions.
Gibbons, Steven J; Ringdal, Frode; Kvaerna, Tormod
2007-11-01
A database has been established of seismic and infrasonic recordings from more than 100 well-constrained surface explosions, conducted by the Finnish military to destroy old ammunition. The recorded seismic signals are essentially identical and indicate that the variation in source location and magnitude is negligible. In contrast, the infrasonic arrivals on both seismic and infrasound sensors exhibit significant variation with regard to the number of detected phases, phase travel times, and phase amplitudes, which is presumably attributable to atmospheric factors. This data set provides an excellent basis for studies in sound propagation, infrasound array detection, and direction estimation.
A synthetic seismicity model for the Middle America Trench
NASA Technical Reports Server (NTRS)
Ward, Steven N.
1991-01-01
A novel iterative technique, based on the concept of fault segmentation and computed using 2D static dislocation theory, for building models of seismicity and fault interaction which are physically acceptable and geometrically and kinematically correct, is presented. The technique is applied in two steps to seismicity observed at the Middle America Trench. The first constructs generic models which randomly draw segment strengths and lengths from a 2D probability distribution. The second constructs predictive models in which segment lengths and strengths are adjusted to mimic the actual geography and timing of large historical earthquakes. Both types of models reproduce the statistics of seismicity over five units of magnitude and duplicate other aspects including foreshock and aftershock sequences, migration of foci, and the capacity to produce both characteristic and noncharacteristic earthquakes. Over a period of about 150 yr the complex interaction of fault segments and the nonlinear failure conditions conspire to transform an apparently deterministic model into a chaotic one.
High-Resolution Seismic Imaging of Near-Surface Voids
NASA Astrophysics Data System (ADS)
Gritto, R.; Korneev, V. A.; Elobaid, E. A.; Mohamed, F.; Sadooni, F.
2017-12-01
A major hazard in Qatar is the presence of karst, which is ubiquitous throughout the country and includes depressions, sinkholes, and caves. Causes for the development of karst include faulting and fracturing, where fluids find pathways through limestone and dissolve the host rock to form caverns. Of particular concern in rapidly growing metropolitan areas that expand into heretofore unexplored regions is the collapse of such caverns. Because Qatar has seen a recent boom in construction, including the planning and development of complete new sub-sections of metropolitan areas, the development areas need to be investigated for the presence of karst to determine their suitability for the planned projects. We present a suite of seismic techniques applied to a controlled experiment to detect, locate, and estimate the size of a karst analog in the form of a man-made water shaft on the campus of Qatar University, Doha, Qatar. Seismic waves are well suited for karst detection and characterization. Voids represent high-contrast seismic objects that exhibit strong responses to incident seismic waves. However, the complex geometry of karst, including its shape and size, makes imaging nontrivial. While karst detection can be reduced to the simple problem of detecting an anomaly, karst characterization can be complicated by the 3D nature of a problem of unknown scale, where irregular surfaces can generate diffracted waves of different kinds. In our presentation, we employ a variety of seismic techniques to demonstrate the detection and characterization of a vertical water collection shaft by analyzing the phase, amplitude, and spectral information of seismic waves that have been scattered by the object. We use the reduction in seismic wave amplitudes and the delay in phase arrival times in the geometrical shadow of the vertical shaft to independently detect and locate the object in space. Additionally, we use narrow band-pass filtered data combining two orthogonal transmission surveys
NASA Technical Reports Server (NTRS)
Phillips, Roger J.; Grimm, Robert E.
1991-01-01
The design and ultimate success of network seismology experiments on Mars depend on the present level of Martian seismicity. Volcanic and tectonic landforms observed from imaging experiments show that Mars must have been a seismically active planet in the past, and there is no reason to discount the notion that Mars is seismically active today, albeit at a lower level of activity. Models are explored for present-day Mars seismicity. Depending on the sensitivity and geometry of a seismic network and the attenuation and scattering properties of the interior, it appears that a reasonable number of Martian seismic events would be detected over the period of a decade. The thermoelastic cooling mechanism as estimated is surely a lower bound, and a more refined estimate would take into account specifically the regional cooling of Tharsis and lead to a higher frequency of seismic events.
Classifying seismic noise and sources from OBS data using unsupervised machine learning
NASA Astrophysics Data System (ADS)
Mosher, S. G.; Audet, P.
2017-12-01
The paradigm of plate tectonics was established mainly by recognizing the central role of oceanic plates in the production and destruction of tectonic plates at their boundaries. Since that realization, however, seismic studies of tectonic plates and their associated deformation have slowly shifted their attention toward continental plates due to the ease of installation and maintenance of high-quality seismic networks on land. The result has been a much more detailed understanding of the seismicity patterns associated with continental plate deformation in comparison with the low-magnitude deformation patterns within oceanic plates and at their boundaries. While the number of high-quality ocean-bottom seismometer (OBS) deployments within the past decade has demonstrated the potential to significantly increase our understanding of tectonic systems in oceanic settings, OBS data poses significant challenges to many of the traditional data processing techniques in seismology. In particular, problems involving the detection, location, and classification of seismic sources occurring within oceanic settings are much more difficult due to the extremely noisy seafloor environment in which data are recorded. However, classifying data without a priori constraints is a problem that is routinely pursued via unsupervised machine learning algorithms, which remain robust even in cases involving complicated datasets. In this research, we apply simple unsupervised machine learning algorithms (e.g., clustering) to OBS data from the Cascadia Initiative in an attempt to classify and detect a broad range of seismic sources, including various noise sources and tremor signals occurring within ocean settings.
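A minimal sketch of the kind of unsupervised clustering described above is given below; the feature vectors, cluster count, and synthetic "source populations" are purely illustrative and not derived from Cascadia Initiative data.

```python
# Hedged sketch: cluster time windows of OBS data after reducing each window
# to a spectral feature vector (here, fake log band powers). Illustrative only.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Fake feature matrix: rows = windows, columns = log band powers in four
# frequency bands; three synthetic source populations with distinct shapes.
microseism = rng.normal([3.0, 1.0, -1.0, -2.0], 0.3, size=(200, 4))
tremor     = rng.normal([0.0, 2.0,  1.0, -1.0], 0.3, size=(100, 4))
ship_noise = rng.normal([-1.0, 0.0, 1.5,  2.0], 0.3, size=(150, 4))
X = np.vstack([microseism, tremor, ship_noise])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(X))
for k in range(3):
    print(f"cluster {k}: {np.sum(labels == k)} windows")
```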
CALIBRATION OF SEISMIC ATTRIBUTES FOR RESERVOIR CHARACTERIZATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wayne D. Pennington; Horacio Acevedo; Aaron Green
2002-10-01
The project, "Calibration of Seismic Attributes for Reservoir Characterization," is now complete. Our original proposed scope of work included detailed analysis of seismic and other data from two to three hydrocarbon fields; we have analyzed data from four fields at this level of detail, two additional fields with less detail, and one other 2D seismic line used for experimentation. We also included time-lapse seismic data with ocean-bottom cable recordings in addition to the originally proposed static field data. A large number of publications and presentations have resulted from this work, including several that are in final stages of preparation or printing; one of these is a chapter on "Reservoir Geophysics" for the new Petroleum Engineering Handbook from the Society of Petroleum Engineers. Major results from this project include a new approach to evaluating seismic attributes in time-lapse monitoring studies, evaluation of pitfalls in the use of point-based measurements and facies classifications, novel applications of inversion results, improved methods of tying seismic data to the wellbore, and a comparison of methods used to detect pressure compartments. Some of the data sets used are in the public domain, allowing other investigators to test our techniques or to improve upon them using the same data. From the public-domain Stratton data set we have demonstrated that an apparent correlation between attributes derived along "phantom" horizons is an artifact of isopach changes; only if the interpreter understands that the interpretation is based on this correlation with bed thickening or thinning can reliable interpretations of channel horizons and facies be made. From the public-domain Boonsville data set we developed techniques to use conventional seismic attributes, including seismic facies generated under various neural network procedures, to subdivide regional facies determined from logs into productive and non-productive subfacies, and we
Calibration of Seismic Attributes for Reservoir Characterization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wayne D. Pennington
2002-09-29
The project, "Calibration of Seismic Attributes for Reservoir Characterization," is now complete. Our original proposed scope of work included detailed analysis of seismic and other data from two to three hydrocarbon fields; we have analyzed data from four fields at this level of detail, two additional fields with less detail, and one other 2D seismic line used for experimentation. We also included time-lapse seismic data with ocean-bottom cable recordings in addition to the originally proposed static field data. A large number of publications and presentations have resulted from this work, including several that are in final stages of preparation or printing; one of these is a chapter on "Reservoir Geophysics" for the new Petroleum Engineering Handbook from the Society of Petroleum Engineers. Major results from this project include a new approach to evaluating seismic attributes in time-lapse monitoring studies, evaluation of pitfalls in the use of point-based measurements and facies classifications, novel applications of inversion results, improved methods of tying seismic data to the wellbore, and a comparison of methods used to detect pressure compartments. Some of the data sets used are in the public domain, allowing other investigators to test our techniques or to improve upon them using the same data. From the public-domain Stratton data set we have demonstrated that an apparent correlation between attributes derived along "phantom" horizons is an artifact of isopach changes; only if the interpreter understands that the interpretation is based on this correlation with bed thickening or thinning can reliable interpretations of channel horizons and facies be made. From the public-domain Boonsville data set we developed techniques to use conventional seismic attributes, including seismic facies generated under various neural network procedures, to subdivide regional facies determined from logs into productive and non-productive subfacies, and we
NASA Astrophysics Data System (ADS)
Seleznev, V. S.; Soloviev, V. M.; Emanov, A. F.
The paper is devoted to research on the influence of seismic actions on industrial and civil buildings and on people. Seismic actions affect people either directly (vibrations, strong shocks during earthquakes) or indirectly through various buildings and structures, and they can be strong (felt by people) or weak (registered only by instruments). A great deal of work has addressed the influence of violent seismic actions (above all, earthquakes) on people and structures. This work studies weak but long-lasting seismic actions on buildings and people. Seismic oscillations acting on a territory need to be taken into account when constructing buildings in urbanized areas. Besides violent earthquakes, man-made seismic actions can exert an essential influence: explosions, seismic noise emitted by plant facilities and moving transport, and radiation from high-rise buildings and structures under wind loading. The paper presents material on the increase of man-made seismicity in a number of Russian regions that were previously aseismic. Along with seismic microzoning maps, maps should be built indicating the variation of the amplitude spectra of seismic noise over days, months, and years. Information about the amplitudes and frequencies of oscillations from possible earthquakes and from man-made sources in specific regions allows sound design and construction of industrial and civil housing projects. Constructing buildings, even in regions that are not seismically dangerous, whose resonance frequencies coincide with the frequencies of oscillations emitted locally by man-made objects can end in the failure of these buildings and the gravest consequences for people. Practical examples are given of detailed engineering-seismological investigations of large industrial and civil projects in Siberia (hydro power
Seismic velocity estimation from time migration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cameron, Maria Kourkina
2007-01-01
This work is concerned with imaging and wave propagation in nonhomogeneous media, and includes a collection of computational techniques, such as level set methods with material transport, Dijkstra-like Hamilton-Jacobi solvers for first-arrival Eikonal equations, and techniques for data smoothing. The theoretical components include aspects of seismic ray theory, and the results rely on careful comparison with experiment and incorporation as input into large production-style geophysical processing codes. Producing an accurate image of the Earth's interior is a challenging aspect of oil recovery and earthquake analysis. The ultimate computational goal, which is to accurately produce a detailed interior map of the Earth's makeup on the basis of external soundings and measurements, is currently out of reach for several reasons. First, although vast amounts of data have been obtained in some regions, this has not been done uniformly, and the data contain noise and artifacts. Simply sifting through the data is a massive computational job. Second, the fundamental inverse problem, namely to deduce the local sound speeds of the Earth that give rise to measured reflected signals, is exceedingly difficult: shadow zones and complex structures can make for ill-posed problems, and require vast computational resources. Nonetheless, seismic imaging is a crucial part of the oil and gas industry. Typically, one makes assumptions about the Earth's substructure (such as laterally homogeneous layering), and then uses this model as input to an iterative procedure to build perturbations that more closely satisfy the measured data. Such models often break down when the material substructure is significantly complex: not surprisingly, this is often where the most interesting geological features lie. Data often come in a particular, somewhat non-physical coordinate system, known as time migration coordinates. The construction of substructure models from these data is less and less reliable
The Collaborative Seismic Earth Model: Generation 1
NASA Astrophysics Data System (ADS)
Fichtner, Andreas; van Herwaarden, Dirk-Philip; Afanasiev, Michael; Simutė, Saulė; Krischer, Lion; Çubuk-Sabuncu, Yeşim; Taymaz, Tuncay; Colli, Lorenzo; Saygin, Erdinc; Villaseñor, Antonio; Trampert, Jeannot; Cupillard, Paul; Bunge, Hans-Peter; Igel, Heiner
2018-05-01
We present a general concept for evolutionary, collaborative, multiscale inversion of geophysical data, specifically applied to the construction of a first-generation Collaborative Seismic Earth Model. This is intended to address the limited resources of individual researchers and the often limited use of previously accumulated knowledge. Model evolution rests on a Bayesian updating scheme, simplified into a deterministic method that honors today's computational restrictions. The scheme is able to harness distributed human and computing power. It furthermore handles conflicting updates, as well as variable parameterizations of different model refinements or different inversion techniques. The first-generation Collaborative Seismic Earth Model comprises 12 refinements from full seismic waveform inversion, ranging from regional crustal- to continental-scale models. A global full-waveform inversion ensures that regional refinements translate into whole-Earth structure.
NASA Astrophysics Data System (ADS)
Cassisi, Carmelo; Prestifilippo, Michele; Cannata, Andrea; Montalto, Placido; Patanè, Domenico; Privitera, Eugenio
2016-07-01
From January 2011 to December 2015, Mt. Etna was mainly characterized by a cyclic eruptive behavior with more than 40 lava fountains from the New South-East Crater. Using the RMS (root mean square) of the seismic signal recorded by stations close to the summit area, an automatic recognition of the different states of volcanic activity (QUIET, PRE-FOUNTAIN, FOUNTAIN, POST-FOUNTAIN) has been applied for monitoring purposes. Since the values of the RMS time series calculated on the seismic signal are generated from a stochastic process, we can try to model the system generating its sampled values, assumed to be a Markov process, using Hidden Markov Models (HMMs). HMM analysis seeks to recover the sequence of hidden states from the observations. In our framework, observations are characters generated by the Symbolic Aggregate approXimation (SAX) technique, which maps RMS time series values to symbols of a pre-defined alphabet. The main advantages of the proposed framework, based on HMMs and SAX, with respect to other automatic systems applied to seismic signals at Mt. Etna, are the use of multiple stations and static thresholds to characterize the volcano states well. Its application to a large seismic dataset from Mt. Etna shows that the volcano states can be inferred. The experimental results show that, in most cases, we detected lava fountains in advance.
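The SAX step described above (z-normalization, piecewise aggregate approximation, and symbol mapping with Gaussian breakpoints) can be sketched as follows; the alphabet size, segment length, and synthetic RMS series are illustrative, not the values used at Mt. Etna.

```python
# Hedged sketch of SAX symbolization of an RMS time series. Illustrative only.
import numpy as np
from scipy.stats import norm

def sax(series, n_segments, alphabet="abcd"):
    x = (series - series.mean()) / series.std()
    # PAA: average the z-normalized series over equal-length segments.
    paa = x[: n_segments * (len(x) // n_segments)]
    paa = paa.reshape(n_segments, -1).mean(axis=1)
    # Breakpoints that split the standard normal into equiprobable bins.
    breakpoints = norm.ppf(np.linspace(0, 1, len(alphabet) + 1)[1:-1])
    symbols = np.digitize(paa, breakpoints)
    return "".join(alphabet[s] for s in symbols)

# Synthetic RMS: quiet background, a ramp-up, a fountain-like burst, quiet.
rms = np.concatenate([np.full(300, 1.0), np.linspace(1, 5, 100),
                      np.full(100, 8.0), np.full(300, 1.2)])
rms += 0.1 * np.random.randn(rms.size)
print(sax(rms, n_segments=40))   # one symbol per segment
```

A hidden Markov model would then be trained on such symbol sequences to recover the QUIET/PRE-FOUNTAIN/FOUNTAIN/POST-FOUNTAIN state sequence; that step is omitted here.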
Seismic depth imaging of sequence boundaries beneath the New Jersey shelf
NASA Astrophysics Data System (ADS)
Riedel, M.; Reiche, S.; Aßhoff, K.; Buske, S.
2018-06-01
Numerical modelling of fluid flow and transport processes relies on a well-constrained geological model, which is usually provided by seismic reflection surveys. In the New Jersey shelf area a large number of 2D seismic profiles provide an extensive database for constructing a reliable geological model. However, for the purpose of modelling groundwater flow, the seismic data need to be depth-converted, which is usually accomplished using complementary data from borehole logs. Due to the limited availability of such data in the New Jersey shelf, we propose a two-stage processing strategy with particular emphasis on reflection tomography and pre-stack depth imaging. We apply this workflow to a seismic section crossing the entire New Jersey shelf. Due to the tomography-based velocity modelling, the processing flow does not depend on the availability of borehole logging data. Nonetheless, we validate our results by comparing the migrated depths of selected geological horizons to borehole core data from the IODP Expedition 313 drill sites, located at three positions along our seismic line. The comparison shows that in the top 450 m of the migrated section, most of the selected reflectors were positioned with an accuracy close to the seismic resolution limit (≈4 m) of the data. For deeper layers the accuracy still remains within one seismic wavelength for the majority of the tested horizons. These results demonstrate that the processed seismic data provide a reliable basis for constructing a hydrogeological model. Furthermore, the proposed workflow can be applied to other seismic profiles in the New Jersey shelf, which will lead to an even better constrained model.
Earthquake Monitoring with the MyShake Global Smartphone Seismic Network
NASA Astrophysics Data System (ADS)
Inbal, A.; Kong, Q.; Allen, R. M.; Savran, W. H.
2017-12-01
Smartphone arrays have the potential for significantly improving seismic monitoring in sparsely instrumented urban areas. This approach benefits from the dense spatial coverage of users, as well as from communication and computational capabilities built into smartphones, which facilitate big seismic data transfer and analysis. Advantages in data acquisition with smartphones trade off against factors such as the low-quality sensors installed in phones, high noise levels, and strong network heterogeneity, all of which limit effective seismic monitoring. Here we utilize network and array-processing schemes to assess event detectability with the MyShake global smartphone network. We examine the benefits of using this network in either triggered or continuous modes of operation. A global database of ground motions measured on stationary phones triggered by M2-6 events is used to establish detection probabilities. We find that the probability of detecting an M=3 event with a single phone located <10 km from the epicenter exceeds 70%. Due to the sensor's self-noise, smaller magnitude events at short epicentral distances are very difficult to detect. To increase the signal-to-noise ratio, we employ array back-projection techniques on continuous data recorded by thousands of phones. In this class of methods, the array is used as a spatial filter that suppresses signals emitted from shallow noise sources. Filtered traces are stacked to further enhance seismic signals from deep sources. We benchmark our technique against traditional location algorithms using recordings from California, a region with a large MyShake user database. We find that locations derived from back-projection images of M 3 events recorded by >20 nearby phones closely match the regional catalog locations. We use simulated broadband seismic data to examine how location uncertainties vary with user distribution and noise levels. To this end, we have developed an empirical noise model for the metropolitan Los
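A simple delay-and-stack back-projection of the kind referred to above can be sketched as follows, assuming a homogeneous apparent velocity; the station geometry, velocity, and envelopes are synthetic stand-ins rather than MyShake data.

```python
# Illustrative delay-and-stack back-projection over a grid of candidate
# epicenters. All parameters and data are synthetic assumptions.
import numpy as np

def back_project(envelopes, fs, stations, grid, velocity=3.5):
    """Stack station envelopes along the predicted move-out for each node.
    envelopes: (n_sta, n_samples); stations/grid: (n, 2) coordinates in km."""
    n_samples = envelopes.shape[1]
    stack = np.zeros(len(grid))
    for g, (gx, gy) in enumerate(grid):
        dist = np.hypot(stations[:, 0] - gx, stations[:, 1] - gy)
        shifts = np.round(dist / velocity * fs).astype(int)
        aligned = [np.roll(env, -s)[: n_samples - shifts.max()]
                   for env, s in zip(envelopes, shifts)]
        stack[g] = np.max(np.mean(aligned, axis=0))
    return stack

# Synthetic test: a source at (5, 5) km recorded by four stations.
fs, vel = 50.0, 3.5
stations = np.array([[0, 0], [10, 0], [0, 10], [12, 8]], float)
t = np.arange(0, 20, 1 / fs)
true_src = np.array([5.0, 5.0])
envelopes = np.zeros((len(stations), t.size))
for i, st in enumerate(stations):
    arrival = np.hypot(*(st - true_src)) / vel + 2.0   # origin time 2 s
    envelopes[i] = np.exp(-((t - arrival) ** 2) / 0.1)

gx, gy = np.meshgrid(np.linspace(0, 12, 25), np.linspace(0, 12, 25))
grid = np.column_stack([gx.ravel(), gy.ravel()])
best = grid[np.argmax(back_project(envelopes, fs, stations, grid, vel))]
print("best grid node:", best)   # should land near (5, 5)
```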
NASA Astrophysics Data System (ADS)
Taisne, B.; Caudron, C.; Kugaenko, Y.; Saltykov, V.
2015-12-01
In contrast to the 1975-76 Tolbachik eruption, the 2012-2013 Tolbachik eruption was not preceded by any striking change in seismic activity. By processing the Klyuchevskoy volcano group seismic data with the Seismic Amplitude Ratio Analysis (SARA) method, we gain insights into the dynamics of magma transfer prior to this important eruption. We highlight a clear migration of the source of the microseismicity within the seismic swarm, starting 20 hours before the reported eruption onset (05:15 UTC, 26 November 2012). This migration proceeded in different phases and ended when eruptive tremor, corresponding to lava extrusion, was recorded (at ~11:00 UTC, 27 November 2012). In order to get a first-order approximation of the location of the magma, we compare the calculated seismic intensity ratios with the theoretical ones. As expected, the observations suggest a migration toward the eruptive vent. However, we explain the pre-eruptive observed ratios by a vertical migration under the northern slope of Plosky Tolbachik volcano that would interact at shallower depth with an intermediate storage region and initiate the lateral migration toward the eruptive vents. Another migration is also captured by this technique and coincides with a seismic swarm that started 16-20 km to the south of Plosky Tolbachik at 20:31 UTC on November 28 and lasted for more than 2 days. This seismic swarm is very similar to the seismicity preceding the 1975-76 Tolbachik eruption and can be considered a possible aborted eruption.
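The amplitude-ratio idea underlying SARA can be sketched as below: smoothed RMS amplitudes at two stations are divided, and the observed ratio is compared with the ratio predicted for candidate source positions. The attenuation and spreading constants, window length, and traces are illustrative assumptions, not the values used in the study.

```python
# Hedged sketch of a seismic amplitude-ratio calculation. Illustrative only.
import numpy as np

def smoothed_rms(trace, fs, window_s=600.0):
    """RMS amplitude in consecutive windows (hop = window length)."""
    n = int(window_s * fs)
    n_win = len(trace) // n
    return np.sqrt(np.mean(trace[: n_win * n].reshape(n_win, n) ** 2, axis=1))

def predicted_ratio(r1, r2, freq=5.0, q=100.0, vel=3.0, spreading=1.0):
    """Body-wave amplitude ratio A1/A2 for hypocentral distances r1, r2 (km)."""
    b = np.pi * freq / (q * vel)
    return (r2 / r1) ** spreading * np.exp(-b * (r1 - r2))

# Observed ratio from two (synthetic) traces; a migrating source would make
# this ratio drift with time, which is what the amplitude-ratio method tracks.
fs = 100.0
rng = np.random.default_rng(1)
sta1 = 2.0 * rng.standard_normal(int(6 * 3600 * fs))   # closer station: larger amplitude
sta2 = 1.0 * rng.standard_normal(int(6 * 3600 * fs))
ratio = smoothed_rms(sta1, fs) / smoothed_rms(sta2, fs)
print("observed ratio per 10-min window:", np.round(ratio, 2))
print("predicted ratio for r1=3 km, r2=6 km:", round(predicted_ratio(3, 6), 2))
```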
Seismic risk assessment and application in the central United States
Wang, Z.
2011-01-01
Seismic risk is a somewhat subjective, but important, concept in earthquake engineering and other related decision-making. Another important concept that is closely related to seismic risk is seismic hazard. Although seismic hazard and seismic risk have often been used interchangeably, they are fundamentally different: seismic hazard describes the natural phenomenon or physical property of an earthquake, whereas seismic risk describes the probability of loss or damage that could be caused by a seismic hazard. The distinction between seismic hazard and seismic risk is of practical significance because measures for seismic hazard mitigation may differ from those for seismic risk reduction. Seismic risk assessment is a complicated process and starts with seismic hazard assessment. Although probabilistic seismic hazard analysis (PSHA) is the most widely used method for seismic hazard assessment, recent studies have found that PSHA is not scientifically valid. Use of PSHA will lead to (1) artifact estimates of seismic risk, (2) misleading use of the annual probability of exceedance (i.e., the probability of exceedance in one year) as a frequency (per year), and (3) numerical creation of extremely high ground motion. An alternative approach, which is similar to those used for flood and wind hazard assessments, has been proposed. © 2011 ASCE.
Seismic Fracture Characterization Methodologies for Enhanced Geothermal Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Queen, John H.
2016-05-09
Executive Summary: The overall objective of this work was the development of surface and borehole seismic methodologies using both compressional and shear waves for characterizing faults and fractures in Enhanced Geothermal Systems. We used both surface seismic and vertical seismic profile (VSP) methods. We adapted these methods to the unique conditions encountered in Enhanced Geothermal System (EGS) creation. These conditions include geological environments with volcanic cover, highly altered rocks, severe structure, extreme near-surface velocity contrasts, and a lack of distinct velocity contrasts at depth. One of the objectives was the development of methods for identifying more appropriate seismic acquisition parameters for overcoming problems associated with these geological factors. Because temperatures up to 300 °C are often encountered in these systems, another objective was the testing of VSP borehole tools capable of operating at depths in excess of 1,000 m and at temperatures in excess of 200 °C. A final objective was the development of new processing and interpretation techniques based on scattering and time-frequency analysis, as well as the application of modern seismic migration imaging algorithms to seismic data acquired over geothermal areas. The use of surface seismic reflection data at Brady's Hot Springs was found to be useful in building a geological model, but only when combined with other extensive geological and geophysical data. The use of fine source and geophone spacing was critical in producing useful images. The surface seismic reflection data gave no information about the internal structure (extent, thickness, and filling) of faults and fractures, and modeling suggests that they are unlikely to do so. Time-frequency analysis was applied to these data, but was not found to be significantly useful in their interpretation. Modeling does indicate that VSP and other seismic methods with sensors located at depth in wells will be the most
Design and development of digital seismic amplifier recorder
DOE Office of Scientific and Technical Information (OSTI.GOV)
Samsidar, Siti Alaa; Afuar, Waldy; Handayani, Gunawan, E-mail: gunawanhandayani@gmail.com
2015-04-16
Digital seismic recording is a technique for recording seismic data in digital systems. This method is more convenient because it is more accurate than other seismic recording methods. To improve the quality of the results of seismic measurements, the signal needs to be amplified to obtain better subsurface images. The purpose of this study is to improve the accuracy of measurement by amplifying the input signal. We use seismic sensors/geophones with a natural frequency of 4.5 Hz. The signal is amplified by means of 12 non-inverting amplifier units. The non-inverting amplifiers use the IC 741 op-amp with resistor values of 1 kΩ and 1 MΩ, giving an amplification of about 1,000 times. The amplified signal is converted to digital form using an analog-to-digital converter (ADC). Quantitative analysis in this study was performed using the software LabVIEW 8.6, which was also used to control the ADC. The results of the qualitative analysis showed that the seismic signal conditioning can produce a large output, so that the data obtained are better than conventional data. This system can be used for geophysical methods with low input voltages, such as microtremor measurements.
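For reference, the stated gain of roughly 1,000 follows from the standard non-inverting op-amp gain formula, assuming the 1 kΩ resistor connects the inverting input to ground and the 1 MΩ resistor provides the feedback path (the usual non-inverting topology; the paper's exact wiring is not specified here):

```latex
% Gain of a non-inverting op-amp stage with the quoted resistor values.
\[
  G \;=\; 1 + \frac{R_f}{R_1}
    \;=\; 1 + \frac{1\,\mathrm{M\Omega}}{1\,\mathrm{k\Omega}}
    \;=\; 1001 \;\approx\; 10^{3}
\]
```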
Applications of seismic spatial wavefield gradient and rotation data in exploration seismology
NASA Astrophysics Data System (ADS)
Schmelzbach, C.; Van Renterghem, C.; Sollberger, D.; Häusler, M.; Robertsson, J. O. A.
2017-12-01
Seismic spatial wavefield gradient and rotation data have the potential to open up new ways to address long-standing problems in land-seismic exploration, such as identifying and separating P-, S-, and surface waves. Gradient-based acquisition and processing techniques could enable replacing large arrays of densely spaced receivers with sparse, spatially compact receiver layouts or even a single multicomponent station with dedicated instruments (e.g., rotational seismometers). Such approaches to maximizing the information content of single-station recordings are also of significant interest for seismic measurements at sites with limited access, such as boreholes, the sea bottom, and extraterrestrial seismology. Arrays of conventional three-component (3C) geophones enable measuring not only the particle velocity in three dimensions but also estimating its spatial gradients. Because the free-surface condition allows vertical derivatives to be expressed in terms of horizontal derivatives, the full gradient tensor and, hence, the curl and divergence of the wavefield can be computed. In total, three particle-velocity components, three rotational components, and the divergence yield seven-component (7C) seismic data. Combined particle-velocity and gradient data can be used to isolate the incident P- or S-waves at the land surface or the sea bottom using filtering techniques based on the elastodynamic representation theorem. Alternatively, as only S-waves exhibit rotational motion, rotational measurements can directly be used to identify S-waves. We discuss the derivations of the gradient-based filters as well as their application to synthetic and field data, demonstrating that rotational data can be of particular interest for S-wave reflection and P-to-S-wave conversion imaging. The concept of array-derived gradient estimation can be extended to source arrays as well. Therefore, source arrays allow us to emulate rotational (curl) and dilatational (divergence) sources. Combined with 7C
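One common way to obtain such array-derived gradients is a least-squares plane fit across a compact receiver layout, from which the vertical rotation rate and the horizontal divergence follow; the sketch below uses this approach with a synthetic plane wave and is not the authors' implementation.

```python
# Hedged sketch of array-derived horizontal wavefield gradients. Geometry and
# wavefield are synthetic placeholders.
import numpy as np

def horizontal_gradients(positions, values):
    """Least-squares fit v(x, y) ~ a + b*x + c*y; returns (dv/dx, dv/dy).
    positions: (n_rec, 2) in metres, values: (n_rec,) one time sample."""
    G = np.column_stack([np.ones(len(positions)), positions])
    coef, *_ = np.linalg.lstsq(G, values, rcond=None)
    return coef[1], coef[2]

# Small square array and a transverse plane wave propagating in +x: vy varies
# with x, so a non-zero vertical rotation rate and near-zero divergence result.
pos = np.array([[0, 0], [10, 0], [0, 10], [10, 10], [5, 5]], float)
k, omega, t = 2 * np.pi / 100.0, 2 * np.pi * 5.0, 0.0   # 100 m wavelength, 5 Hz
vx = np.zeros(len(pos))
vy = np.cos(omega * t - k * pos[:, 0])                   # transverse motion

dvx_dx, dvx_dy = horizontal_gradients(pos, vx)
dvy_dx, dvy_dy = horizontal_gradients(pos, vy)
rotation_z = 0.5 * (dvy_dx - dvx_dy)        # vertical-axis rotation rate
divergence_h = dvx_dx + dvy_dy              # horizontal divergence
print(f"rot_z = {rotation_z:.4f}, div_h = {divergence_h:.4f}")
```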
The Time-Frequency Signatures of Advanced Seismic Signals Generated by Debris Flows
NASA Astrophysics Data System (ADS)
Chu, C. R.; Huang, C. J.; Lin, C. R.; Wang, C. C.; Kuo, B. Y.; Yin, H. Y.
2014-12-01
Seismic monitoring is expected to reveal the process of a debris flow from the initiation area to the alluvial fan, because other field monitoring techniques, such as video cameras and ultrasonic sensors, are limited by detection range. For this reason, seismic approaches have been used as detection systems for debris flows over the past few decades. The analysis of the signatures of the seismic signals in the time and frequency domains can be used to identify the different phases of a debris flow. This study investigates the different stages of seismic signals due to debris flows, including the advanced signal, the main front, and the decaying tail. Moreover, the characteristics of the advanced signals preceding the arrival of the main front are discussed for warning purposes. This study presents a permanent system, composed of two seismometers, deployed along the bank of Ai-Yu-Zi Creek in Nantou County, one of the active debris-flow streams in Taiwan. The three-axis seismometer, with a frequency response of 7 s to 200 Hz, was developed by the Institute of Earth Sciences (IES), Academia Sinica, for the purpose of detecting debris flows. The original idea behind replacing the geophone system with the seismometer technique was to catch, owing to the higher sensitivity, the advanced signals propagating from the upper reach of the stream before the debris flow arrives. In addition, low-frequency seismic waves can be detected early because of their low attenuation. However, to avoid recording unnecessary ambient vibrations, the sensitivity of the seismometer should be lower than that of a general seismometer used for detecting teleseisms. Three debris flows with different mean velocities were detected in 2013 and 2014. The typical triangular shape is clearly demonstrated in the time series data and the spectrograms of the seismic signals from the three events. The frequency analysis showed that an enormous debris flow bearing huge boulders would induce low frequency seismic
Seismicity and seismic hazard in Sabah, East Malaysia from earthquake and geodetic data
NASA Astrophysics Data System (ADS)
Gilligan, A.; Rawlinson, N.; Tongkul, F.; Stephenson, R.
2017-12-01
While the levels of seismicity are low in most of Malaysia, the state of Sabah in northern Borneo has moderate levels of seismicity. Notable earthquakes in the region include the 1976 M6.2 Lahad Datu earthquake and the 2015 M6 Ranau earthquake. The recent Ranau earthquake resulted in the deaths of 18 people on Mt Kinabalu, an estimated 100 million RM (about US$23 million) in damage to buildings, roads, and infrastructure from shaking, and flooding, reduced water quality, and damage to farms from landslides. Over the last 40 years the population of Sabah has increased to over four times what it was in 1976, yet seismic hazard in Sabah remains poorly understood. Using seismic and geodetic data we hope to better quantify the hazards posed by earthquakes in Sabah, and thus help to minimize risk. In order to do this we need to know the locations of earthquakes, the types of earthquakes that occur, and the faults that are generating them. We use data from 15 MetMalaysia seismic stations currently operating in Sabah to develop a region-specific velocity model from receiver functions and a pre-existing surface wave model. We use this new velocity model to (re)locate earthquakes that occurred in Sabah from 2005-2016, including a large number of aftershocks from the 2015 Ranau earthquake. We use a probabilistic nonlinear earthquake location program to locate the earthquakes and then refine their relative locations using a double-difference method. The recorded waveforms are further used to obtain moment tensor solutions for these earthquakes. Earthquake locations and moment tensor solutions are then compared with the locations of faults throughout Sabah. Faults are identified from high-resolution IFSAR images and subsequent fieldwork, with a particular focus on the Lahad Datu and Ranau areas. Used together, these seismic and geodetic data can help us to develop a new seismic hazard model for Sabah, as well as aiding in the delivery of outreach activities regarding seismic hazard
Quantitative Seismic Interpretation: Applying Rock Physics Tools to Reduce Interpretation Risk
NASA Astrophysics Data System (ADS)
Sondergeld, Carl H.
This book is divided into seven chapters that cover rock physics, statistical rock physics, seismic inversion techniques, case studies, and work flows. On balance, the emphasis is on rock physics. Included are 56 color figures that greatly help in the interpretation of more complicated plots and displays.The domain of rock physics falls between petrophysics and seismics. It is the basis for interpreting seismic observations and therefore is pivotal to the understanding of this book. The first two chapters are dedicated to this topic (109 pages).
Building configuration and seismic design: The architecture of earthquake resistance
NASA Astrophysics Data System (ADS)
Arnold, C.; Reitherman, R.; Whitaker, D.
1981-05-01
The relationship between the architecture of a building and its ability to withstand earthquakes is examined. Aspects of ground motion that are significant to building behavior are discussed. Results of a survey of configuration decisions that affect the performance of buildings are provided, with a focus on the architectural aspects of configuration design. Configuration derivation, building type as it relates to seismic design, and seismic issues in the design process are examined. Case studies of the Veterans' Administration Hospital in Loma Linda, California, and the Imperial Hotel in Tokyo, Japan, are presented. The seismic design process is described, paying special attention to configuration issues. The need is stressed for guidelines, codes, and regulations to ensure design solutions that respect and balance the full range of architectural, engineering, and material influences on seismic hazards.
Using Network Theory to Understand Seismic Noise in Dense Arrays
NASA Astrophysics Data System (ADS)
Riahi, N.; Gerstoft, P.
2015-12-01
Dense seismic arrays offer an opportunity to study anthropogenic seismic noise sources with unprecedented detail. Man-made sources typically have high frequencies and low intensities and propagate as surface waves. As a result, attenuation restricts their measurable footprint to a small subset of sensors. Medium heterogeneities can further introduce wavefront perturbations that limit processing based on travel time. We demonstrate a non-parametric technique that can reliably identify very local events within the array as a function of frequency and time without using travel times. The approach estimates the non-zero support of the array covariance matrix and then uses network analysis tools to identify clusters of sensors that are sensing a common source. We verify the method on simulated data and then apply it to the Long Beach (CA) geophone array. The method exposes a helicopter traversing the array, oil production facilities with different characteristics, and the fact that noise sources near roads tend to be around 10-20 Hz.
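The covariance-support plus network-analysis idea can be sketched as follows, with a simple correlation threshold standing in for the paper's support estimate and synthetic sensors replacing the Long Beach array.

```python
# Hedged sketch: build a station graph whose edges mark coherent sensor pairs
# and read off clusters of sensors sensing a common local source. Illustrative.
import numpy as np
import networkx as nx

def coherent_clusters(data, threshold=0.5):
    """data: (n_sensors, n_samples). Returns connected groups of sensors
    whose pairwise correlation exceeds `threshold`."""
    corr = np.corrcoef(data)
    adjacency = (np.abs(corr) > threshold) & ~np.eye(len(corr), dtype=bool)
    graph = nx.from_numpy_array(adjacency.astype(int))
    return [sorted(c) for c in nx.connected_components(graph) if len(c) > 1]

# Synthetic array: sensors 0-3 share a weak local source, the rest are noise.
rng = np.random.default_rng(2)
n_sensors, n_samples = 12, 5000
data = rng.standard_normal((n_sensors, n_samples))
source = rng.standard_normal(n_samples)
data[:4] += 1.5 * source
print(coherent_clusters(data))   # expect one cluster containing sensors 0-3
```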
Infrasound and Seismic Recordings of Rocket Launches from Kennedy Space Center, 2016-2017
NASA Astrophysics Data System (ADS)
McNutt, S. R.; Thompson, G.; Brown, R. G.; Braunmiller, J.; Farrell, A. K.; Mehta, C.
2017-12-01
We installed a temporary 3-station seismic-infrasound network at Kennedy Space Center (KSC) in February 2016 to test sensor calibrations and train students in field deployment and data acquisition techniques. Each station featured a single broadband 3-component seismometer and a 3-element infrasound array. In May 2016 the network was scaled back to a single station due to other projects competing for equipment. To date, 8 rocket launches have been recorded by the infrasound array, as well as 2 static tests, 1 aborted launch, and 1 rocket explosion (see next abstract). Of the rocket launches recorded, 4 were SpaceX Falcon-9, 2 were ULA Atlas-5, and 2 were ULA Delta-IV. A question we attempt to answer is whether the rocket engine type and launch trajectory can be estimated with appropriate travel-time, amplitude-ratio, and spectral techniques. For example, there is a clear Doppler shift in the seismic and infrasound spectrograms from all launches, with lower frequencies occurring later in the recorded signal as the rocket accelerates away from the array. Another question of interest is whether there are relationships between jet noise frequency, thrust, and/or nozzle velocity. Infrasound data may help answer these questions. We are now in the process of deploying a permanent seismic and infrasound array at the Astronaut Beach House. 10 more rocket launches are scheduled before AGU. NASA is also conducting a series of 33 sonic booms over KSC beginning on Aug 21st. Launches and other events at KSC have provided rich sources of signals that are useful for characterizing and gaining insight into physical processes and wave generation from man-made sources.
NASA Astrophysics Data System (ADS)
Heckels, R. EG; Savage, M. K.; Townend, J.
2018-05-01
Quantifying seismic velocity changes following large earthquakes can provide insights into fault healing and reloading processes. This study presents temporal velocity changes detected following the 2010 September Mw 7.1 Darfield event in Canterbury, New Zealand. We use continuous waveform data from several temporary seismic networks lying on and surrounding the Greendale Fault, with a maximum interstation distance of 156 km. Nine-component, day-long Green's functions were computed for frequencies between 0.1 and 1.0 Hz for continuous seismic records from immediately after the 2010 September 04 earthquake until 2011 January 10. Using the moving-window cross-spectral method, seismic velocity changes were calculated. Over the study period, an increase in seismic velocity of 0.14 ± 0.04 per cent was determined near the Greendale Fault, providing a new constraint on post-seismic relaxation rates in the region. A depth analysis further showed that velocity changes were confined to the uppermost 5 km of the subsurface. We attribute the observed changes to post-seismic relaxation via crack healing of the Greendale Fault and throughout the surrounding region.
Observing Drought-Induced Groundwater Depletion in California with Seismic Noise
NASA Astrophysics Data System (ADS)
Clements, T.; Denolle, M.
2017-12-01
While heavy rainfall replenished reservoirs and snowpack recovered in winter 2016/2017, groundwater levels across much of California are still at or near all-time lows following one of the worst droughts in the state's history. Groundwater depletion in California has been studied extensively using GPS, InSAR, and GRACE. Here, we propose to monitor groundwater levels across California through measuring the temporal variation in seismic velocity (dv/v) at a regional scale. In the last decade, dv/v has emerged as a technique to investigate near surface and surficial processes such as landslides, volcanic eruptions, and earthquakes. Toward predicting groundwater levels through real-time monitoring with seismic noise, we investigate the relations between the dv/v time series and observed groundwater levels. 12 years (Jan 2006 - July 2017) of noise cross-correlation functions (CCF) are computed from continuous vertical component seismic data recorded at 100+ sites across California. Velocity changes (dv/v) are obtained by inverting all daily CCFs to produce a dv/v time series for each station pair. Our preliminary results show a seasonal variation in dv/v along with a gradual increase in dv/v throughout the drought. We interpret the increase in dv/v as a response to declining groundwater levels.
Waveform Retrieval and Phase Identification for Seismic Data from the CASS Experiment
NASA Astrophysics Data System (ADS)
Li, Zhiwei; You, Qingyu; Ni, Sidao; Hao, Tianyao; Wang, Hongti; Zhuang, Cantao
2013-05-01
The minimal destruction to the deployment site and the high repeatability of the Controlled Accurate Seismic Source (CASS) show its potential for investigating seismic wave velocities in the Earth's crust. However, the difficulty in retrieving impulsive seismic waveforms from the CASS data and identifying the seismic phases substantially prevents its wide application. For example, identification of the seismic phases and accurate measurement of travel times are essential for resolving the spatial distribution of seismic velocities in the crust. Until now, it has remained a challenging task to estimate accurate travel times of different seismic phases from the CASS data, which feature extended wave trains, unlike the waveforms from impulsive events such as earthquakes or explosive sources. In this study, we introduce a time-frequency analysis method to process the CASS data, and try to retrieve the seismic waveforms and identify the major seismic phases traveling through the crust. We adopt the Wigner-Ville distribution (WVD) approach, which has been used in signal detection and parameter estimation for linear frequency modulation (LFM) signals and provides the best time-frequency concentration. The Wigner-Hough transform (WHT) is applied to retrieve the impulsive waveforms from multi-component LFM signals, which comprise seismic phases with different arrival times. We processed the seismic data of the 40-ton CASS from the field experiment around the Xinfengjiang reservoir with the WVD and WHT methods. The results demonstrate that these methods are effective in waveform retrieval and phase identification, especially for high-frequency seismic phases such as PmP and SmS with strong amplitudes at large epicentral distances of 80-120 km. Further studies are still needed to improve the accuracy of travel time estimation, so as to further promote the applicability of the CASS for imaging the seismic velocity structure.
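A discrete (pseudo-)Wigner-Ville distribution of the analytic signal, the kind of time-frequency map suited to chirp-like sweeps, can be sketched as follows; the FFT length, sampling rate, and test chirp are illustrative, and this is not the authors' implementation of the WVD or WHT.

```python
# Hedged sketch of a pseudo-Wigner-Ville distribution (finite lag window).
import numpy as np
from scipy.signal import hilbert

def wigner_ville(x, fs, nfft=512):
    """Pseudo-WVD (time x frequency) of a real signal x sampled at fs."""
    z = hilbert(x)                          # analytic signal suppresses cross-terms
    n = len(z)
    wvd = np.zeros((n, nfft))
    for t_idx in range(n):
        lag_max = min(t_idx, n - 1 - t_idx, nfft // 2 - 1)
        lags = np.arange(-lag_max, lag_max + 1)
        # instantaneous autocorrelation r(t, m) = z[t+m] * conj(z[t-m])
        r = z[t_idx + lags] * np.conj(z[t_idx - lags])
        kernel = np.zeros(nfft, dtype=complex)
        kernel[: lag_max + 1] = r[lag_max:]          # lags m >= 0
        if lag_max > 0:
            kernel[-lag_max:] = r[:lag_max]          # lags m < 0, wrapped for FFT
        wvd[t_idx] = np.real(np.fft.fft(kernel))
    freqs = np.fft.fftfreq(nfft, d=2.0 / fs)         # lag step spans 2 samples
    return wvd, freqs

# A 60-s linear chirp, roughly mimicking an LFM sweep segment.
fs = 50.0
t = np.arange(0, 60, 1 / fs)
x = np.cos(2 * np.pi * (2.0 + 0.1 * t) * t)          # instantaneous freq 2 + 0.2*t Hz
wvd, freqs = wigner_ville(x, fs)
peak_freq = freqs[np.argmax(wvd[len(t) // 2])]       # ridge frequency at mid-record
print(f"WVD ridge at mid-time: {peak_freq:.2f} Hz")  # expected near 8 Hz
```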
Kinematic Seismic Rupture Parameters from a Doppler Analysis
NASA Astrophysics Data System (ADS)
Caldeira, Bento; Bezzeghoud, Mourad; Borges, José F.
2010-05-01
The radiation emitted from extended seismic sources, mainly when the rupture spreads in preferred directions, presents spectral deviations as a function of the observation location. This effect, which is not observed for point sources and is known as directivity, is manifested by an increase in the frequency and amplitude of seismic waves when the rupture propagates toward the seismic station and a decrease in frequency and amplitude when it propagates in the opposite direction. The model of directivity that supports the method is a Doppler analysis based on a kinematic source model of rupture and wave propagation through a structural medium with spherical symmetry [1]. A unilateral rupture can be viewed as a sequence of shocks produced along certain paths on the fault. According to this model, the seismic record at any point on the Earth's surface contains a signature of the rupture process that originated the recorded waveform. Calculating the rupture direction and velocity by a general Doppler equation (the goal of this work), using a dataset of common time-delays read from waveforms recorded at different distances around the epicenter, requires normalizing the measurements to a standard value of slowness. This normalization involves a non-linear inversion that we solve numerically using an iterative least-squares approach. The performance of this technique was evaluated through a set of synthetic and real applications. We present the application of the method to four real case studies, the following earthquakes: Arequipa, Peru (Mw = 8.4, June 23, 2001); Denali, AK, USA (Mw = 7.8, November 3, 2002); Zemmouri-Boumerdes, Algeria (Mw = 6.8, May 21, 2003); and Sumatra, Indonesia (Mw = 9.3, December 26, 2004). The results obtained from the dataset of the four earthquakes agreed, in general, with the values presented by other authors using different methods and data. [1] Caldeira B., Bezzeghoud M, Borges JF, 2009; DIRDOP: a directivity approach to determining
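Under the simplifying assumption of a common ray parameter for all stations (i.e., skipping the slowness normalization described above), the directivity fit becomes linear and can be sketched as follows; all numbers are synthetic and the parameterization is illustrative, not the authors' inversion.

```python
# Hedged sketch of a directivity fit for a unilateral rupture:
#   tau(phi) = L/v_r - L * p * cos(phi - phi_r),
# which is linear in (L/v_r, -L*p*cos(phi_r), -L*p*sin(phi_r)).
import numpy as np

def fit_directivity(azimuths_deg, durations, p):
    """Least-squares estimate of rupture azimuth, length, and velocity,
    assuming a common ray parameter p (s/km) for all stations."""
    phi = np.radians(azimuths_deg)
    G = np.column_stack([np.ones_like(phi), np.cos(phi), np.sin(phi)])
    (a, c1, c2), *_ = np.linalg.lstsq(G, durations, rcond=None)
    phi_r = np.degrees(np.arctan2(-c2, -c1)) % 360.0   # rupture azimuth (deg)
    length = np.hypot(c1, c2) / p                      # fault length (km)
    v_r = length / a                                   # rupture velocity (km/s)
    return phi_r, length, v_r

# Synthetic observations: 80 km rupture toward azimuth 60 deg at 2.5 km/s,
# teleseismic P slowness ~0.06 s/km, plus measurement noise.
rng = np.random.default_rng(3)
az = np.linspace(0, 350, 36)
p, L_true, vr_true, phi_true = 0.06, 80.0, 2.5, 60.0
tau = L_true / vr_true - L_true * p * np.cos(np.radians(az - phi_true))
tau += 0.5 * rng.standard_normal(az.size)
print(fit_directivity(az, tau, p))   # expect roughly (60, 80, 2.5)
```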
A permanent seismic station beneath the Ocean Bottom
NASA Astrophysics Data System (ADS)
Harris, David; Cessaro, Robert K.; Duennebier, Fred K.; Byrne, David A.
1987-03-01
The Hawaii Institute of Geophysics began development of the Ocean Subbottom Seismometer (OSS) system in 1978, and OSS systems were installed in four locations between 1979 and 1982. The OSS system is a permanent, deep-ocean borehole seismic recording system composed of a borehole sensor package (tool), an electromechanical cable, a recorder package, and a recovery system. Installed near the bottom of a borehole (drilled by the D/V Glomar Challenger), the tool contains three orthogonal 4.5-Hz geophones, two orthogonal tiltmeters, and a temperature sensor. Signals from these sensors are multiplexed, digitized (with a floating-point technique), and telemetered through approximately 10 km of electromechanical cable to a recorder package located near the ocean bottom. Electrical power for the tool is supplied from the recorder package. The digital seismic signals are demultiplexed, converted back to analog form, processed through an automatic gain control (AGC) circuit, and recorded along with a time code on magnetic tape cassettes in the recorder package. Data may be recorded continuously for up to two months in the self-contained recorder package. Data may also be recorded in real time (digital format) during the installation and subsequent recorder package servicing. The recorder package is connected to a submerged recovery buoy by a length of buoyant polypropylene rope. The anchor on the recovery buoy is released by activating either of the acoustical command releases. The polypropylene rope may also be seized with a grappling hook to effect recovery. The recorder package may be repeatedly serviced as long as the tool remains functional. A wide range of data has been recovered from the OSS system. Recovered analog records include signals from natural seismic sources such as earthquakes (teleseismic and local), man-made seismic sources such as refraction seismic shooting (explosives and air cannons), and nuclear tests. Lengthy continuous recording has permitted analysis
Hierarchical Bayesian Modeling of Fluid-Induced Seismicity
NASA Astrophysics Data System (ADS)
Broccardo, M.; Mignan, A.; Wiemer, S.; Stojadinovic, B.; Giardini, D.
2017-11-01
In this study, we present a Bayesian hierarchical framework to model fluid-induced seismicity. The framework is based on a nonhomogeneous Poisson process with a fluid-induced seismicity rate proportional to the rate of injected fluid. The fluid-induced seismicity rate model depends upon a set of physically meaningful parameters and has been validated for six fluid-induced case studies. In line with the vision of hierarchical Bayesian modeling, the rate parameters are considered as random variables. We develop both the Bayesian inference and updating rules, which are used to develop a probabilistic forecasting model. We tested the framework on the Basel 2006 fluid-induced seismicity case study to show that the hierarchical Bayesian model offers a suitable framework to coherently encode both epistemic uncertainty and aleatory variability. Moreover, it provides a robust and consistent short-term seismic forecasting model suitable for online risk quantification and mitigation.
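A minimal sketch of a nonhomogeneous Poisson model with a seismicity rate proportional to the injection rate, together with a grid-based Bayesian update of the productivity parameter, is given below; it is illustrative only and considerably simpler than the authors' hierarchical model.

```python
# Hedged sketch: NHPP with rate lambda(t) = mu + a * Vdot(t) and a grid-based
# posterior for the productivity parameter `a`. All data are synthetic.
import numpy as np

rng = np.random.default_rng(4)
dt = 0.1                                         # days
t = np.arange(0, 30, dt)
vdot = np.where((t > 5) & (t < 15), 50.0, 0.0)   # injected flow rate (L/s)

mu, a_true = 0.2, 0.05                           # background rate, true productivity
events_per_bin = rng.poisson((mu + a_true * vdot) * dt)
event_times = np.repeat(t, events_per_bin)

def log_likelihood(a):
    """NHPP log-likelihood: sum of log-rates at events minus integrated rate."""
    lam = mu + a * vdot
    return np.sum(np.log(np.interp(event_times, t, lam))) - np.sum(lam * dt)

# Posterior over `a` on a grid, with a broad uniform prior.
a_grid = np.linspace(0.001, 0.2, 400)
log_post = np.array([log_likelihood(a) for a in a_grid])
post = np.exp(log_post - log_post.max())
post /= post.sum() * (a_grid[1] - a_grid[0])
print(f"true a = {a_true}, MAP estimate = {a_grid[np.argmax(post)]:.3f}")
```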
NASA Astrophysics Data System (ADS)
Dorbath, C.; Calo, M.; Cornet, F.; Frogneux, M.
2011-12-01
One major goal of monitoring the seismicity accompanying hydraulic fracturing of a reservoir is to recover the seismic velocity field in and around the geothermal site. Several studies have shown that 4D (time-dependent) seismic tomographies are very useful for illustrating and studying the temporal variation of the seismic velocities conditioned by injected fluids. However, only an appropriate separation of the data into subsets and a reliable tomographic method allow studying representative variations of the seismic velocities during and after the injection periods. We present here new 4D seismic tomographies performed using datasets from several stimulation tests performed at the Enhanced Geothermal System (EGS) site of Soultz-sous-Forêts (Alsace, France). The data used were recorded during the stimulation tests carried out in 2000, 2003, and 2004, which involved the wells GPK2, GPK3, and GPK4. For each set of events, the subsetting of the data was performed by taking into account the injection parameters of the stimulation tests (namely the injected flow rate and the wellhead pressure). The velocity models have been obtained using the double-difference tomographic method (Zhang and Thurber, 2003) and further improved with the post-processing WAM technique (Calo' et al., 2009, 2011). This technique proved very powerful because it combines high resolution and reliability of the calculated seismic velocity fields, even with small datasets. In this work we show the complete sequence of the time-lapse tomographies and their variations in time and between different stimulation tests.
Quantitative modeling of reservoir-triggered seismicity
NASA Astrophysics Data System (ADS)
Hainzl, S.; Catalli, F.; Dahm, T.; Heinicke, J.; Woith, H.
2017-12-01
Reservoir-triggered seismicity might occur as the response to crustal stress changes caused by the weight of the impounded water volume, its poroelastic response, and fluid diffusion. Several cases of high correlations have been found in the past decades. However, crustal stresses might be altered by many other processes such as continuous tectonic stressing and coseismic stress changes. Because reservoir-triggered stresses decay quickly with distance, even tidal or rainfall-triggered stresses might be of similar size at depth. To account for simultaneous stress sources in a physically meaningful way, we apply a seismicity model based on calculated stress changes in the crust and laboratory-derived friction laws. Based on the observed seismicity, the model parameters can be determined by the maximum likelihood method. The model leads to quantitative predictions of the variations of seismicity rate in space and time, which can be used for hypothesis testing and forecasting. For case studies in Talala (India), Val d'Agri (Italy) and Novy Kostel (Czech Republic), we show the comparison of predicted and observed seismicity, demonstrating the potential and limitations of the approach.
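The abstract does not spell out the friction-law formulation; as an illustrative stand-in, the sketch below evaluates the widely used Dieterich (1994) rate-and-state response of background seismicity to a single Coulomb stress step. All parameter values are assumptions chosen for demonstration, not the study's calibrated values.

```python
# Illustrative sketch only: the Dieterich (1994) rate-and-state response of
# background seismicity to a sudden Coulomb stress step, one common way to turn
# "calculated stress changes + laboratory friction laws" into a seismicity rate.
import numpy as np

def dieterich_rate(t, r_background, d_tau, a_sigma, stressing_rate):
    """Seismicity rate R(t) after a stress step d_tau applied at t = 0.

    r_background  : reference (background) rate [events/yr]
    d_tau         : Coulomb stress step [MPa]
    a_sigma       : constitutive parameter A * sigma [MPa]
    stressing_rate: background stressing rate [MPa/yr]
    """
    t_a = a_sigma / stressing_rate           # characteristic relaxation time
    gamma = (np.exp(-d_tau / a_sigma) - 1.0) * np.exp(-t / t_a) + 1.0
    return r_background / gamma

t = np.linspace(0.0, 20.0, 200)              # years after the stress step
rate = dieterich_rate(t, r_background=2.0, d_tau=0.1,
                      a_sigma=0.05, stressing_rate=0.01)
# rate starts elevated by exp(d_tau / a_sigma) and decays back to 2 events/yr
```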
Probabilistic Seismic Hazard Assessment for Iraq
DOE Office of Scientific and Technical Information (OSTI.GOV)
Onur, Tuna; Gok, Rengin; Abdulnaby, Wathiq
Probabilistic Seismic Hazard Assessments (PSHA) form the basis for most contemporary seismic provisions in building codes around the world. The current building code of Iraq was published in 1997. An update to this edition is in the process of being released. However, there are no national PSHA studies in Iraq for the new building code to refer to for seismic loading in terms of spectral accelerations. As an interim solution, the new draft building code was considering referring to PSHA results produced in the late 1990s as part of the Global Seismic Hazard Assessment Program (GSHAP; Giardini et al., 1999). However, these results are: a) more than 15 years outdated, b) PGA-based only, necessitating rough conversion factors to calculate spectral accelerations at 0.3 s and 1.0 s for seismic design, and c) at a probability level of 10% chance of exceedance in 50 years, not the 2% that the building code requires. Hence there is a pressing need for a new, updated PSHA for Iraq.
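For readers comparing the two probability levels mentioned above, a quick Poisson-model conversion between exceedance probability in 50 years and return period looks like this (a back-of-the-envelope sketch, not part of the report):

```python
# Conversion between "x% chance of exceedance in 50 years" and a return period,
# assuming a Poisson occurrence model.
import math

def return_period(prob_exceedance, exposure_years=50.0):
    """Return period T such that P(at least one exceedance in exposure) = prob."""
    return -exposure_years / math.log(1.0 - prob_exceedance)

print(return_period(0.10))   # ~475 years  (GSHAP-style 10% in 50 yr)
print(return_period(0.02))   # ~2475 years (2% in 50 yr, as the code requires)
```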
3D Modelling of Seismically Active Parts of Underground Faults via Seismic Data Mining
NASA Astrophysics Data System (ADS)
Frantzeskakis, Theofanis; Konstantaras, Anthony
2015-04-01
During the last few years rapid steps have been taken towards drilling for oil in the western Mediterranean Sea. Since most of the countries in the region benefit mainly from tourism, and considering that the Mediterranean is a closed sea that only replenishes its water once every ninety years, careful measures are being taken to ensure safe drilling. With this in mind, this research work attempts to derive a three-dimensional model of the seismically active parts of the underlying underground faults in areas of petroleum interest. For that purpose, seismic spatio-temporal clustering has been applied to seismic data to identify potential distinct seismic regions in the area of interest. The results have been combined with two-dimensional maps of underground faults from past surveys, and seismic epicentres, after careful relocation processing, have been used to provide information regarding the vertical extent of multiple underground faults in the region of interest. The end product is a three-dimensional map of the possible underground location and extent of the seismically active parts of underground faults. Indexing terms: underground faults modelling, seismic data mining, 3D visualisation, active seismic source mapping, seismic hazard evaluation, dangerous phenomena modelling. Acknowledgment: This research work is supported by the ESPA Operational Programme, Education and Life Long Learning, Students Practical Placement Initiative.
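A minimal sketch of the kind of spatio-temporal clustering described above, here using scikit-learn's DBSCAN on scaled (x, y, t) coordinates; the toy catalogue, the scaling factors and the epsilon value are illustrative assumptions rather than the authors' choices.

```python
# Spatio-temporal clustering sketch: epicentres plus origin times are scaled to
# comparable units and clustered with DBSCAN; noise events get label -1.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
# toy catalogue: easting [km], northing [km], origin time [days]
events = np.column_stack([rng.normal(0, 5, 300),
                          rng.normal(0, 5, 300),
                          rng.uniform(0, 365, 300)])

# bring space and time to comparable units before clustering (5 km ~ 30 days, assumed)
scaled = events / np.array([5.0, 5.0, 30.0])

labels = DBSCAN(eps=0.7, min_samples=10).fit_predict(scaled)
print("clusters found:", len(set(labels)) - (1 if -1 in labels else 0))
```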
Physical modeling of the formation and evolution of seismically active fault zones
Ponomarev, A.V.; Zavyalov, A.D.; Smirnov, V.B.; Lockner, D.A.
1997-01-01
Acoustic emission (AE) in rocks is studied as a model of natural seismicity. A special technique for rock loading has been used to help study the processes that control the development of AE during brittle deformation. This technique allows fault growth, which would normally occur very rapidly, to be extended over hours. In this way, the period of most intense interaction of acoustic events can be studied in detail. Characteristics of the acoustic regime (AR) include the Gutenberg-Richter b-value, the spatial distribution of hypocenters with characteristic fractal (correlation) dimension d, the Hurst exponent H, and the crack concentration parameter Pc. The fractal structure of the AR changes with the onset of the drop in differential stress during sample deformation. The change results from the active interaction of microcracks. This transition of the spatial distribution of AE hypocenters is accompanied by a corresponding change in the temporal correlation of events and in the distribution of event amplitudes, as signified by a decrease of the b-value. The characteristic structure that develops in the low-energy background AE is similar to the sequence of the strongest microfracture events. When the AR fractal structure develops, the variations of d and b are synchronous and d = 3b. This relation, which occurs once the fractal structure is formed, only holds for average values of d and b. Time variations of d and b are anticorrelated. The degree of temporal correlation of the AR has time variations that are similar to the d and b variations. The observed variations in laboratory AE experiments are compared with natural seismicity parameters. The close correspondence between laboratory-scale observations and naturally occurring seismicity suggests a possible new approach for understanding the evolution of complex seismicity patterns in nature. © 1997 Elsevier Science B.V. All rights reserved.
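Two of the quantities discussed above, the Gutenberg-Richter b-value and the correlation (fractal) dimension d, can be estimated with standard formulas; the sketch below uses the Aki maximum-likelihood b-value estimator and a two-point correlation integral on synthetic data, purely as an illustration of the definitions and not the authors' processing.

```python
# b-value (Aki 1965 maximum-likelihood) and correlation-dimension sketch.
import numpy as np
from scipy.spatial.distance import pdist

def b_value_mle(magnitudes, m_min):
    """Maximum-likelihood b-value for magnitudes above completeness m_min."""
    m = magnitudes[magnitudes >= m_min]
    return np.log10(np.e) / (m.mean() - m_min)

def correlation_integral(xyz, r):
    """Correlation integral C(r); d is the slope of log C versus log r."""
    dists = pdist(xyz)
    return np.array([(dists < ri).mean() for ri in r])

rng = np.random.default_rng(1)
mags = rng.exponential(1.0 / np.log(10), 500) + 1.0     # synthetic catalogue, b ~ 1
print("b =", b_value_mle(mags, m_min=1.0))

xyz = rng.normal(size=(300, 3))                          # synthetic hypocentres
r = np.logspace(-1, 0.5, 10)
C = correlation_integral(xyz, r)
d_est = np.polyfit(np.log(r), np.log(C), 1)[0]           # slope ~ correlation dimension
print("d =", d_est)
```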
Full-waveform seismic tomography of the Vrancea, Romania, subduction region
NASA Astrophysics Data System (ADS)
Baron, Julie; Morelli, Andrea
2017-12-01
The Vrancea region is one of the few locations of deep seismicity in Europe. Seismic tomography has been able to map lithospheric downwelling, but has not yet been able to clearly discriminate between competing geodynamic interpretations of the geological and geophysical evidence available. We study the seismic structure of the Vrancea subduction zone, using adjoint-based, full-waveform tomography to map the 3D vP and vS structure in detail. We use the database that was built during the CALIXTO (Carpathian Arc Lithosphere X-Tomography) temporary experiment, restricted to the broadband sensors and local intermediate-depth events. We fit waveforms with a cross-correlation misfit criterion in separate time windows around the expected P and S arrivals, and perform 17 iterations of vP and vS model updates (altogether requiring about 16 million CPU hours) before reaching stable convergence. Among other features, our resulting model shows a nearly vertical, high-velocity body that overlaps with the distribution of seismicity in its northeastern part. In its southwestern part, the slab appears to dip less steeply to the NW, and is suggestive of ongoing - or recently concluded - subduction geodynamic processes. Joint inversion for vP and vS allows us to address the vP/vS ratio distribution, which marks high vP/vS in the crust beneath the Focsani sedimentary basin - possibly due to high fluid pressure - and a low vP/vS edge along the lower plane of the subducting lithosphere, which in other similar environments has been attributed to dehydration of serpentine in the slab. In spite of the restricted amount of data available, and limitations on the usable frequency pass-band, full-waveform inversion reveals its potential to improve the general quality of imaging with respect to other tomographic techniques - although at a considerable cost in terms of computing resources. Our study also shows that re-analysis of legacy data sets with up-to-date techniques may bring new, useful results.
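The cross-correlation misfit mentioned above can be illustrated with a toy measurement: find the time shift that best aligns an observed and a synthetic window and use half its square as the misfit. The sketch below is a generic implementation of that idea on synthetic pulses, not the study's adjoint workflow.

```python
# Cross-correlation traveltime misfit between observed and synthetic windows.
import numpy as np

def cc_time_shift(obs, syn, dt):
    """Time shift (s) that maximises the cross-correlation of two windows."""
    cc = np.correlate(obs, syn, mode="full")
    lag = np.argmax(cc) - (len(syn) - 1)
    return lag * dt

dt = 0.05
t = np.arange(0, 40, dt)
syn = np.exp(-((t - 20.0) / 1.5) ** 2)          # synthetic P pulse
obs = np.exp(-((t - 20.6) / 1.5) ** 2)          # observed pulse, 0.6 s late

delta_t = cc_time_shift(obs, syn, dt)
misfit = 0.5 * delta_t ** 2                     # classic traveltime misfit
print(delta_t, misfit)                          # delta_t ~ 0.6 s
```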
Multicomponent ensemble models to forecast induced seismicity
NASA Astrophysics Data System (ADS)
Király-Proag, E.; Gischig, V.; Zechar, J. D.; Wiemer, S.
2018-01-01
In recent years, human-induced seismicity has become a more and more relevant topic due to its economic and social implications. Several models and approaches have been developed to explain the underlying physical processes or forecast induced seismicity. They range from simple statistical models to coupled numerical models incorporating complex physics. We advocate the need for forecast testing as currently the best method for ascertaining whether models are capable of reasonably accounting for the key governing physical processes. Moreover, operational forecast models are of great interest to help on-site decision-making in projects entailing induced earthquakes. We previously introduced a standardized framework following the guidelines of the Collaboratory for the Study of Earthquake Predictability, the Induced Seismicity Test Bench, to test, validate, and rank induced seismicity models. In this study, we describe how to construct multicomponent ensemble models based on Bayesian weightings that deliver more accurate forecasts than individual models in the case of the Basel 2006 and Soultz-sous-Forêts 2004 enhanced geothermal stimulation projects. For this, we examine five calibrated variants of two significantly different model groups: (1) Shapiro and Smoothed Seismicity, based on the seismogenic index, a simple modified Omori-law-type seismicity decay, and temporally weighted smoothed seismicity; (2) Hydraulics and Seismicity, based on numerically modelled pore pressure evolution that triggers seismicity using the Mohr-Coulomb failure criterion. We also demonstrate how the individual and ensemble models would perform as part of an operational Adaptive Traffic Light System. Investigating seismicity forecasts based on a range of potential injection scenarios, we use forecast periods of different durations to compute the occurrence probabilities of seismic events M ≥ 3. We show that in the case of the Basel 2006 geothermal stimulation the models forecast hazardous levels
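As a schematic of the two ingredients described above, the snippet below combines individual rate forecasts with normalized weights and converts the ensemble rate into a Poisson probability of at least one M ≥ 3 event in the forecast period; the rates and weights are invented numbers, not the study's calibrated values.

```python
# Weighted-ensemble rate forecast and Poisson exceedance probability sketch.
import numpy as np

model_rates = np.array([0.8, 1.4])        # forecast rates of M>=3 events per period
weights = np.array([0.35, 0.65])          # e.g. Bayesian weights from past skill
weights = weights / weights.sum()

ensemble_rate = np.dot(weights, model_rates)
p_geq_one = 1.0 - np.exp(-ensemble_rate)  # Poisson P(N >= 1) in the forecast period
print(ensemble_rate, p_geq_one)
```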
NASA Astrophysics Data System (ADS)
Nixon, C.; Kofman, R.; Schmitt, D. R.; Lofi, J.; Gulick, S. P. S.; Christeson, G. L.; Saustrup, S., Sr.; Morgan, J. V.
2017-12-01
We acquired a closely spaced vertical seismic profile (VSP) in the Chicxulub K-Pg Impact Crater drilling program borehole to calibrate the existing surface seismic profiles and provide complementary measurements of in situ seismic wave speeds. Downhole seismic records were obtained at spacings ranging from 1.25 m to 5 m along the borehole from 47.5 m to 1325 mwsf (meters wireline below sea floor) (Fig. 1a) using a Sercel Slimwave™ geophone chain (University of Alberta). The seismic source was a 30/30 ci Sercel Mini GI airgun (University of Texas), fired a minimum of 5 times per station. Seismic data processing used a combination of a commercial processing package (Schlumberger's VISTA) and Matlab™ codes. The VSP displays detailed reflectivity (Fig. 1a), with the strongest reflection seen at 600 mwsf (280 ms one-way time), geologically corresponding to the sharp contact between the post-impact sediments and the target peak-ring rock, thus confirming the pre-drilling interpretations of the seismic profiles. A two-way time trace extracted from the separated up-going wavefield matches the major reflection both in travel time and character. In the granitic rocks that form the peak ring of the Chicxulub impact crater, we observe P-wave velocities of 4000-4500 m/s, which are significantly less than the expected values for granitoids (~6000 m/s) (Fig. 1b). The VSP-measured wave speeds are confirmed by downhole sonic logging and laboratory velocimetry measurements; these data provide additional evidence that the crustal material displaced by the impact experienced a significant amount of damage. Samples and data provided by IODP. Samples can be requested at http://web.iodp.tamu.edu/sdrm after 19 October 2017. Expedition 364 was jointly funded by ECORD, ICDP, and IODP with contributions and logistical support from the Yucatan State Government and UNAM. The downhole seismic chain and wireline system is funded by grants to DRS from the Canada Foundation for Innovation and
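A simple way to see how VSP first breaks constrain in situ wave speeds is to difference travel times between successive receiver depths; the sketch below uses invented times roughly consistent with the velocities quoted above and is not the expedition's processing flow.

```python
# Interval P velocities from vertical-incidence first-break times at successive depths.
import numpy as np

depth_m = np.array([600.0, 700.0, 800.0, 900.0, 1000.0])       # receiver depths [m]
first_break_s = np.array([0.280, 0.303, 0.326, 0.348, 0.371])  # one-way times [s]

interval_velocity = np.diff(depth_m) / np.diff(first_break_s)
print(interval_velocity)   # ~4000-4500 m/s, illustrative of the peak-ring granites
```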
NASA Astrophysics Data System (ADS)
De Freitas, J. M.
2011-05-01
This review looks at recent developments in seismic seabed oil reservoir monitoring techniques using fibre-optic sensing networks. After a brief introduction covering the background and scope of the review, the following section focuses on state-of-the-art fibre-optic hydrophones and accelerometers used for seismic applications. Related metrology aspects of the sensor such as measurement of sensitivity, noise and cross-axis performance are addressed. The third section focuses on interrogation systems. Two main phase-based competing systems have emerged over the past two decades for seismic applications, with a third technique showing much promise; these have been compared in terms of general performance.
Reassessment of the Seismicity and seismic hazards of Libya
NASA Astrophysics Data System (ADS)
Ben Suleman, A.; Elmeladi, A.
2009-04-01
The tectonic evolution of Libya, located at the northern extreme of the African continent, has yielded a complex crustal structure that is composed of a series of basins and uplifts. The present-day deformation of Libya is the result of the Eurasia-Africa continental collision. At the end of 2005, the Libyan National Seismological Network was established to monitor local, regional and teleseismic activity, as well as to provide high quality data for research projects both locally and on the regional and global scale. This study aims to discuss the seismicity of Libya by using the new data from the Libyan national seismological network and to focus on the seismic hazards. At first glance the seismic activity map shows dominant trends of seismicity, with most of the seismic activity concentrated along the northern coastal areas. Four major seismic trends are quite noticeable. The first trend has a NW-SE direction coinciding with the eastern border of the Hun Graben. A second trend, also in a NW-SE direction, lies in the offshore area and might be a continuation of the first. The other two trends are located in the western Gulf of Sirt and the Cyrenaica platform. The rest of the seismicity is diffuse, either offshore or inland, with no good correlation with well-mapped faults. Detailed investigation of the Libyan seismicity indicates that Libya has experienced earthquakes of varying magnitudes and that there is definitely a certain amount of seismic risk involved in engineering projects, particularly in the northern regions. Detailed investigation of the distribution of the Libyan earthquakes in space and time, along with all other geological considerations, suggests the classification of the country into four seismic zones, with the Hun graben zone being the most seismically active.
NASA Astrophysics Data System (ADS)
Baziw, Erick; Verbeek, Gerald
2012-12-01
Among engineers there is considerable interest in the real-time identification of "events" within time series data with a low signal-to-noise ratio. This is especially true for acoustic emission analysis, which is utilized to assess the integrity and safety of many structures and is also applied in the field of passive seismic monitoring (PSM). Here an array of seismic receivers is used to acquire acoustic signals to monitor locations where seismic activity is expected: underground excavations, deep open pits and quarries, reservoirs into which fluids are injected or from which fluids are produced, permeable subsurface formations, or sites of large underground explosions. The most important element of PSM is event detection: the monitoring of seismic acoustic emissions is a continuous, real-time process which typically runs 24 h a day, 7 days a week, and therefore a PSM system with poor event detection can easily acquire terabytes of useless data as it does not identify crucial acoustic events. This paper outlines a new algorithm developed for this application, the so-called SEED™ (Signal Enhancement and Event Detection) algorithm. The SEED™ algorithm uses real-time Bayesian recursive estimation digital filtering techniques for PSM signal enhancement and event detection.
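The internals of SEED are not given in the abstract; as a generic illustration of Bayesian recursive estimation filtering for signal enhancement, the sketch below runs a minimal scalar Kalman filter over a noisy synthetic trace. It is a stand-in for the class of technique, not the SEED algorithm itself.

```python
# Minimal scalar Kalman filter: recursive Bayesian estimate of a slowly varying
# signal buried in measurement noise. Process/measurement variances are assumed.
import numpy as np

def kalman_smooth(z, process_var=1e-4, meas_var=1e-1):
    """Recursively estimate the underlying signal from noisy measurements z."""
    x_est, p_est = 0.0, 1.0                 # initial state and variance
    out = np.empty_like(z)
    for i, zi in enumerate(z):
        p_pred = p_est + process_var        # predict
        k = p_pred / (p_pred + meas_var)    # Kalman gain
        x_est = x_est + k * (zi - x_est)    # update with the new measurement
        p_est = (1.0 - k) * p_pred
        out[i] = x_est
    return out

rng = np.random.default_rng(0)
noisy = np.sin(np.linspace(0, 6, 500)) + rng.normal(0, 0.3, 500)
enhanced = kalman_smooth(noisy)             # smoothed trace; events stand out more clearly
```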
Nature of the seismic crust at the Aegir Ridge: A downward continuation approach
NASA Astrophysics Data System (ADS)
Rai, Abhishek; Breivik, Asbjørn; Mjelde, Rolf; Hanan, Barry; Ito, Garrett; Sayit, Kaan; Howell, Sam; Vogt, Peter; Pedersen, Rolf-Birger
2013-04-01
Marine seismic data are influenced by variations in the thickness and velocity of the water column, which cause fluctuations in the arrival times of seismic phases. Downward continuation of ocean-bottom seismometer data is used to remove the contributions of the water column by bringing the shots and receivers to a common datum such as the seafloor. Additionally, the downward continuation focuses the seismic energy and hence improves the resolution. We apply the downward continuation technique to analyze the OBS data collected along the eastern shoulder of the Aegir Ridge. The Aegir Ridge is an extinct spreading ridge in the North-East Atlantic Ocean. Its proximity to the active Iceland hotspot makes it important for understanding the process of hotspot-ridge interaction during the Oligocene. We present results of an OBS experiment, supported by single-channel streamer, gravity and magnetic observations. Usable seismic data from 20 OBSs distributed along the ~550 km length of the profile reveal the variations in crustal thickness and seismic velocities. Regional magnetic anomalies show a faster spreading rate towards the north and a slower spreading rate towards the southern end near the Iceland hotspot during the active period of the ridge. However, the observed and the predicted crustal thickness show an opposite trend. We interpret this anti-correlation between the seafloor spreading rate and the crustal thickness as a result of the interaction between the Iceland hotspot and the Aegir Ridge.
Vertical Cable Seismic Survey for Hydrothermal Deposit
NASA Astrophysics Data System (ADS)
Asakawa, E.; Murakami, F.; Sekino, Y.; Okamoto, T.; Ishikawa, K.; Tsukahara, H.; Shimura, T.
2012-04-01
The vertical cable seismic is one of the reflection seismic methods. It uses hydrophone arrays vertically moored from the seafloor to record acoustic waves generated by surface, deep-towed or ocean bottom sources. Analyzing the reflections from the sub-seabed, we can look into the subsurface structure. This type of survey is generally called VCS (Vertical Cable Seismic). Because VCS is an efficient high-resolution 3D seismic survey method for a spatially bounded area, we proposed the method for the hydrothermal deposit survey tool development program that the Ministry of Education, Culture, Sports, Science and Technology (MEXT) started in 2009. We are now developing a VCS system, including not only data acquisition hardware but also data processing and analysis techniques. Our first VCS survey was carried out in Lake Biwa, Japan, in November 2009 as a feasibility study. Prestack depth migration was applied to the 3D VCS data to obtain a high quality 3D depth volume. Based on the results from the feasibility study, we have developed two autonomous recording VCS systems. We then carried out a trial experiment in the open ocean at a water depth of about 400 m, followed by a second VCS survey at Iheya Knoll with a deep-towed source. In this survey, we established the procedures for the deployment and recovery of the system and examined the locations and the fluctuations of the vertical cables at a water depth of around 1000 m. The acquired VCS data clearly show reflections from the sub-seafloor. Through the experiment, we confirmed that our VCS system works well even in the severe conditions around the locations of seafloor hydrothermal deposits. We have, however, also confirmed that uncertainty in the locations of the source and of the hydrophones can lower the quality of the subsurface image. It is therefore strongly necessary to develop a total survey system that assures accurate positioning and deployment techniques
Intelligent seismic risk mitigation system on structure building
NASA Astrophysics Data System (ADS)
Suryanita, R.; Maizir, H.; Yuniorto, E.; Jingga, H.
2018-01-01
Indonesia, located on the Pacific Ring of Fire, is one of the highest-risk seismic zones in the world. Strong ground motion might cause catastrophic collapse of buildings, which leads to casualties and property damage. Therefore, it is imperative to properly design the structural response of buildings against seismic hazard. The seismic-resistant building design process requires structural analysis to be performed to obtain the necessary building responses. However, the structural analysis can be very difficult and time consuming. This study aims to predict the structural responses, including displacement, velocity, and acceleration, of a multi-storey building with a fixed floor plan using the Artificial Neural Network (ANN) method, based on the 2010 Indonesian seismic hazard map. By varying the building height, soil condition, and seismic location over 47 cities in Indonesia, 6345 data sets were obtained and fed into the ANN model for the learning process. The trained ANN can predict the displacement, velocity, and acceleration responses with a prediction rate of up to 96%. The trained ANN architecture and weight factors were later used to build a simple tool in Visual Basic which provides the structural response prediction features mentioned previously.
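A small scikit-learn multilayer perceptron can stand in for the ANN described above; the features (height, soil class, spectral demand), the toy target, and the synthetic training set below are illustrative assumptions, not the study's 6345-record data set or its network architecture.

```python
# Sketch: train a small MLP to map simple building/site descriptors to a response.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([rng.uniform(10, 60, n),      # building height [m]
                     rng.integers(1, 4, n),       # soil class (1 = hard .. 3 = soft)
                     rng.uniform(0.2, 1.2, n)])   # design spectral acceleration [g]
# toy target: roof displacement grows with height, soil softness and demand
y = 0.002 * X[:, 0] * X[:, 1] * X[:, 2] + rng.normal(0, 0.005, n)

model = MLPRegressor(hidden_layer_sizes=(20, 20), solver="lbfgs",
                     max_iter=2000, random_state=0)
model.fit(X[:800], y[:800])
print("R^2 on held-out data:", model.score(X[800:], y[800:]))
```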
Landslide maps and seismic noise: Rockmass weakening caused by shallow earthquakes
NASA Astrophysics Data System (ADS)
Uchida, Tara; Marc, Odin; Sens-Schönfelder, Christoph; Sawazaki, Kaoru; Hobiger, Manuel; Hovius, Niels
2015-04-01
Some studies have suggested that the shaking and deformation associated with earthquakes result in temporarily increased hillslope erodibility. However, very few data have been available to clarify this effect. We present integrated geomorphic data constraining an elevated landslide rate following four continental shallow earthquakes: the Mw 6.9 Finisterre (1993), the Mw 7.6 ChiChi (1999), the Mw 6.6 Niigata (2004) and the Mw 6.8 Iwate-Miyagi (2008) earthquakes. We constrain the magnitude, the recovery time and, to some extent, the mechanism at the source of this elevated landslide hazard. We provide evidence excluding aftershocks and rainfall intensity as possible mechanisms, leaving subsurface weakening as the most likely. The landslide data suggest that this ground strength weakening is not limited to the soil cover but also affects the shallow bedrock. Additionally, we used ambient noise autocorrelation techniques to monitor shallow subsurface seismic velocity within the epicentral areas of three of those earthquakes. For most stations we observe a velocity drop followed by a recovery process of several years, in fair agreement with the recovery time estimated from the landslide observations. Thus a common process could alter the strength of the first 10 m of soil/rock and simultaneously drive the landslide rate increase and the seismic velocity drop. The ability to firmly demonstrate this link requires additional constraints on the seismic signal interpretation, but would provide a very useful tool for post-earthquake risk management.
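The abstract does not state which estimator was used for the velocity changes; the stretching method is one common choice for measuring a relative velocity change dv/v from noise autocorrelations, and the sketch below applies it to synthetic waveforms with an imposed 1% velocity drop.

```python
# Stretching-method sketch: a velocity drop stretches the coda, so search for the
# stretch factor that best matches the current autocorrelation to the reference.
import numpy as np

def stretch_dvv(reference, current, t, trials=np.linspace(-0.05, 0.05, 201)):
    """Return dv/v maximising correlation between stretched reference and current."""
    best_cc, best_eps = -np.inf, 0.0
    for eps in trials:                       # eps = dt/t = -dv/v
        stretched = np.interp(t, t * (1.0 + eps), reference)
        cc = np.corrcoef(stretched, current)[0, 1]
        if cc > best_cc:
            best_cc, best_eps = cc, eps
    return -best_eps                         # dv/v

t = np.linspace(0.0, 20.0, 2000)
ref = np.sin(2 * np.pi * 1.0 * t) * np.exp(-t / 8.0)
cur = np.sin(2 * np.pi * 1.0 * t * (1.0 - 0.01)) * np.exp(-t / 8.0)  # 1% slower medium
print(stretch_dvv(ref, cur, t))              # ~ -0.01, i.e. a 1% velocity drop
```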
Development of Vertical Cable Seismic System (3)
NASA Astrophysics Data System (ADS)
Asakawa, E.; Murakami, F.; Tsukahara, H.; Mizohata, S.; Ishikawa, K.
2013-12-01
The VCS (Vertical Cable Seismic) is one of the reflection seismic methods. It uses hydrophone arrays vertically moored from the seafloor to record acoustic waves generated by surface, deep-towed or ocean bottom sources. Analyzing the reflections from the sub-seabed, we can look into the subsurface structure. Because VCS is an efficient high-resolution 3D seismic survey method for a spatially bounded area, we proposed the method for the hydrothermal deposit survey tool development program that the Ministry of Education, Culture, Sports, Science and Technology (MEXT) started in 2009. We are now developing a VCS system, including not only data acquisition hardware but also data processing and analysis techniques. We have carried out several VCS surveys combining surface-towed, deep-towed and ocean bottom sources. The water depths of the surveys range from 100 m up to 2100 m. The targets of the surveys include not only hydrothermal deposits but also oil and gas exploration. Through these experiments, our VCS data acquisition system has been completed, but the data processing techniques are still under development. One of the most critical issues is positioning in the water. Uncertainty in the positions of the source and of the hydrophones in the water degrades the quality of the subsurface image. GPS navigation is available at the sea surface, but in the case of a deep-towed source or ocean bottom source, the accuracy of the shot position from SSBL/USBL is not sufficient for very high-resolution imaging. We have developed another approach to determine the positions in the water using the travel time data from the source to the VCS hydrophones. In the data acquisition stage, we estimate the VCS position with a slant-ranging method from the sea surface. The position of the deep-towed or ocean bottom source is estimated by SSBL/USBL. The water velocity profile is measured by XCTD. After the data acquisition, we pick the first-break times of the VCS recorded data. The estimated positions of
Seismic tomography as a tool for measuring stress in mines
Scott, Douglas F.; Williams, T.J.; Denton, D.K.; Friedel, M.J.
1999-01-01
Spokane Research Center personnel have been investigating the use of seismic tomography to monitor the behavior of a rock mass, detect hazardous ground conditions and assess the mechanical integrity of a rock mass affected by mining. Seismic tomography can be a valuable tool for determining relative stress in deep, >1,220-m (>4,000-ft), underground pillars. If high-stress areas are detected, they can be destressed prior to development or they can be avoided. High-stress areas can be monitored with successive seismic surveys to determine if stress decreases to a level where development can be initiated safely. There are several benefits to using seismic tomography to identify high stress in deep underground pillars. The technique is reliable, cost-effective, efficient and noninvasive. Also, investigators can monitor large rock masses, as well as monitor pillars during the mining cycle. By identifying areas of high stress, engineers will be able to assure that miners are working in a safer environment.
Mammoth Mountain, California broadband seismic experiment
NASA Astrophysics Data System (ADS)
Dawson, P. B.; Pitt, A. M.; Wilkinson, S. K.; Chouet, B. A.; Hill, D. P.; Mangan, M.; Prejean, S. G.; Read, C.; Shelly, D. R.
2013-12-01
Mammoth Mountain is a young cumulo-volcano located on the southwest rim of Long Valley caldera, California. Current volcanic processes beneath Mammoth Mountain are manifested in a wide range of seismic signals, including swarms of shallow volcano-tectonic earthquakes, upper and mid-crustal long-period earthquakes, swarms of brittle-failure earthquakes in the lower crust, and shallow (3-km depth) very-long-period earthquakes. Diffuse emissions of CO2 began after a magmatic dike injection beneath the volcano in 1989, and continue to the present time. These indications of volcanic unrest drive an extensive monitoring effort of the volcano by the USGS Volcano Hazards Program. As part of this effort, eleven broadband seismometers were deployed on Mammoth Mountain in November 2011. This temporary deployment is expected to run through the fall of 2013. These stations supplement the local short-period and broadband seismic stations of the Northern California Seismic Network (NCSN) and provide a combined network of eighteen broadband stations operating within 4 km of the summit of Mammoth Mountain. Data from the temporary stations are not available in real time, requiring the merging of the data from the temporary and permanent networks, timing of phases, and relocation of seismic events to be accomplished outside of the standard NCSN processing scheme. The timing of phases is accomplished through an interactive Java-based phase-picking routine, and the relocation of seismicity is achieved using the probabilistic non-linear software package NonLinLoc, distributed under the GNU General Public License by Alomax Scientific. Several swarms of shallow volcano-tectonic earthquakes, spasmodic bursts of high-frequency earthquakes, a few long-period events located within or below the edifice of Mammoth Mountain and numerous mid-crustal long-period events have been recorded by the network. To date, about 900 of the ~2400 events occurring beneath Mammoth Mountain since November 2011 have
50 years of Global Seismic Observations
NASA Astrophysics Data System (ADS)
Anderson, K. R.; Butler, R.; Berger, J.; Davis, P.; Derr, J.; Gee, L.; Hutt, C. R.; Leith, W. S.; Park, J. J.
2007-12-01
Seismological recordings have been made on Earth for hundreds of years in some form or another, however, global monitoring of earthquakes only began in the 1890's when John Milne created 40 seismic observatories to measure the waves from these events. Shortly after the International Geophysical Year (IGY), a concerted effort was made to establish and maintain a more modern standardized seismic network on the global scale. In the early 1960's, the World-Wide Standardized Seismograph Network (WWSSN) was established through funding from the Advanced Research Projects Agency (ARPA) and was installed and maintained by the USGS's Albuquerque Seismological Laboratory (then a part of the US Coast and Geodetic Survey). This network of identical seismic instruments consisted of 120 stations in 60 countries. Although the network was motivated by nuclear test monitoring, the WWSSN facilitated numerous advances in observational seismology. From the IGY to the present, the network has been upgraded (High-Gain Long-Period Seismograph Network, Seismic Research Observatories, Digital WWSSN, Global Telemetered Seismograph Network, etc.) and expanded (International Deployment of Accelerometers, US National Seismic Network, China Digital Seismograph Network, Joint Seismic Project, etc.), bringing the modern day Global Seismographic Network (GSN) to a current state of approximately 150 stations. The GSN consists of state-of-the-art very broadband seismic transducers, continuous power and communications, and ancillary sensors including geodetic, geomagnetic, microbarographic, meteorological and other related instrumentation. Beyond the GSN, the system of global network observatories includes contributions from other international partners (e.g., GEOSCOPE, GEOFON, MEDNET, F-Net, CTBTO), forming an even larger backbone of permanent seismological observatories as a part of the International Federation of Digital Seismograph Networks. 50 years of seismic network operations have provided
Opto-mechanical lab-on-fibre seismic sensors detected the Norcia earthquake.
Pisco, Marco; Bruno, Francesco Antonio; Galluzzo, Danilo; Nardone, Lucia; Gruca, Grzegorz; Rijnveld, Niek; Bianco, Francesca; Cutolo, Antonello; Cusano, Andrea
2018-04-27
We have designed and developed lab-on-fibre seismic sensors containing a micro-opto-mechanical cavity on the fibre tip. The mechanical cavity is designed as a double cantilever suspended on the fibre end facet and connected to a proof mass to tune its response. Ground acceleration leads to a displacement of the cavity length, which in turn can be remotely detected using an interferometric interrogation technique. After the sensor characterization, an experimental validation was conducted at the Italian National Institute of Geophysics and Volcanology (INGV), which is responsible for seismic surveillance of Italy. The fabricated sensors have been continuously used for long periods to demonstrate their effectiveness as seismic accelerometers. During the tests, the fibre optic seismic accelerometers clearly detected the seismic sequence that culminated in the severe Mw 6.5 Norcia earthquake that struck central Italy on October 30, 2016. The seismic data provided by the optical sensors were analysed by specialists at the INGV. The wave traces were compared with those of state-of-the-art traditional sensors typically incorporated into the INGV seismic networks. The comparison verifies the high fidelity of the optical sensors in seismic wave detection, indicating their suitability as a novel class of seismic sensors to be employed in practical scenarios.
Newberry Seismic Deployment Fieldwork Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, J; Templeton, D C
2012-03-21
This report summarizes the seismic deployment of Lawrence Livermore National Laboratory (LLNL) Geotech GS-13 short-period seismometers at the Newberry Enhanced Geothermal System (EGS) Demonstration site located in Central Oregon. This Department of Energy (DOE) demonstration project is managed by AltaRock Energy Inc. AltaRock Energy had previously deployed Geospace GS-11D geophones at the Newberry EGS Demonstration site; however, the quality of the seismic data was somewhat low. The purpose of the LLNL deployment was to install more sensitive sensors which would record higher quality seismic data for use in future seismic studies, such as ambient noise correlation, matched field processing earthquake detection studies, and general EGS microearthquake studies. For the LLNL deployment, seven three-component seismic stations were installed around the proposed AltaRock Energy stimulation well. The LLNL seismic sensors were connected to AltaRock Energy Güralp CMG-DM24 digitizers, which are powered by AltaRock Energy solar panels and batteries. The deployment took four days in two phases. In phase I, the sites were identified, a cavity approximately 3 feet deep was dug, and a flat concrete pad oriented to true North was made for each site. In phase II, we installed three single-component GS-13 seismometers at each site, quality controlled the data to ensure that each station was recording data properly, and filled in each cavity with native soil.
Ultrasonic laboratory measurements of the seismic velocity changes due to CO2 injection
NASA Astrophysics Data System (ADS)
Park, K. G.; Choi, H.; Park, Y. C.; Hwang, S.
2009-04-01
Monitoring the behavior and movement of carbon dioxide (CO2) in the subsurface is quite important in the sequestration of CO2 in geological formations, because such information provides a basis for demonstrating the safety of CO2 sequestration. Several recent applications in commercial and pilot-scale projects show that 4D surface or borehole seismic methods are among the most promising techniques for this purpose. However, information interpreted from seismic velocity changes can be quite subjective and qualitative without a petrophysical characterization of the effect of CO2 saturation on the seismic response, since seismic wave velocity depends on various factors and parameters such as mineralogical composition, hydrogeological factors and in-situ conditions. In this respect, we have developed an ultrasonic laboratory measurement system and have carried out measurements on a porous sandstone sample to characterize the effects of CO2 injection on seismic velocity and amplitude. Measurements are made with ultrasonic piezoelectric transducers mounted on both ends of a cylindrical core sample under various pressure, temperature, and saturation conditions. According to our experiments, injected CO2 decreases seismic velocity and amplitude. We find that the velocity decreases by about 6% or more until the sample is fully saturated with CO2, and that the seismic amplitude is attenuated even more drastically than the velocity. We also find that Vs/Vp, or the elastic modulus, is more sensitive to CO2 saturation. This means that changes in seismic amplitude and elastic modulus can be alternative target anomalies for seismic techniques in CO2 sequestration monitoring. We therefore expect that further research can establish more quantitative petrophysical relationships between the changes in seismic attributes and CO2 concentration, which can provide a basic relation for the quantitative assessment of CO2 sequestration.
EMERALD: Coping with the Explosion of Seismic Data
NASA Astrophysics Data System (ADS)
West, J. D.; Fouch, M. J.; Arrowsmith, R.
2009-12-01
The geosciences are currently generating an unparalleled quantity of new public broadband seismic data with the establishment of large-scale seismic arrays such as the EarthScope USArray, which are enabling new and transformative scientific discoveries of the structure and dynamics of the Earth’s interior. Much of this explosion of data is a direct result of the formation of the IRIS consortium, which has enabled an unparalleled level of open exchange of seismic instrumentation, data, and methods. The production of these massive volumes of data has generated new and serious data management challenges for the seismological community. A significant challenge is the maintenance and updating of seismic metadata, which includes information such as station location, sensor orientation, instrument response, and clock timing data. This key information changes at unknown intervals, and the changes are not generally communicated to data users who have already downloaded and processed data. Another basic challenge is the ability to handle massive seismic datasets when waveform file volumes exceed the fundamental limitations of a computer’s operating system. A third, long-standing challenge is the difficulty of exchanging seismic processing codes between researchers; each scientist typically develops his or her own unique directory structure and file naming convention, requiring that codes developed by another researcher be rewritten before they can be used. To address these challenges, we are developing EMERALD (Explore, Manage, Edit, Reduce, & Analyze Large Datasets). The overarching goal of the EMERALD project is to enable more efficient and effective use of seismic datasets ranging from just a few hundred to millions of waveforms with a complete database-driven system, leading to higher quality seismic datasets for scientific analysis and enabling faster, more efficient scientific research. We will present a preliminary (beta) version of EMERALD, an integrated
A new seismic probe for coal seam hazard detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peters, W.R.; Owen, T.E.; Thill, R.E.
1985-01-01
An experimental hole-to-hole seismic probe system has been developed for use in coal measure geology as a means of determining the structural conditions of coal seams. The source probe produces a 500-joule electric arc discharge whose seismic wavelet has a spectrum in the 200 to 2,000 Hz frequency range. Low compliance hydrophones contained in the source probe as well as in a separate seismic detector probe are matched to the frequency range of the source. Both probes are constructed with 5.72 cm diameter housings. The transducers in the probes are equipped with fluid-inflatable boots to permit operation in either wet or dry boreholes. Preliminary tests in vertical boreholes drilled 213 m apart in sedimentary rock formations show reliable operation and useful seismic propagation measurements along horizontal and oblique paths up to 232 m in length. Because the seismic wavelet has an accurately repeatable waveshape, multiple shots and signal averaging techniques can be used to enhance the signal-to-noise ratio and extend the transmission distances.
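The value of an accurately repeatable wavelet is easy to demonstrate: stacking N aligned shots suppresses incoherent noise by roughly sqrt(N). The snippet below shows this on a synthetic wavelet with invented noise levels; it is an illustration of the principle, not the probe's processing.

```python
# Stacking sketch: averaging 25 aligned shots improves SNR by about a factor of 5.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 0.05, 500)                       # 50 ms record
wavelet = np.sin(2 * np.pi * 800 * t) * np.exp(-t / 0.01)

def snr(trace, signal):
    noise = trace - signal
    return np.sqrt((signal ** 2).mean() / (noise ** 2).mean())

single = wavelet + rng.normal(0, 0.5, t.size)
stack = np.mean([wavelet + rng.normal(0, 0.5, t.size) for _ in range(25)], axis=0)
print(snr(single, wavelet), snr(stack, wavelet))    # ~5x improvement for N = 25
```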
Ischia Island: Historical Seismicity and Dynamics
NASA Astrophysics Data System (ADS)
Carlino, S.; Cubellis, E.; Iannuzzi, R.; Luongo, G.; Obrizzo, F.
2003-04-01
The seismic energy release in volcanic areas is a complex process, and the island of Ischia provides a significant example of historical seismicity. This is characterized by the occurrence of earthquakes with low energy and high intensity. Information on the seismicity of the island spans about eight centuries, starting from 1228. With regard to effects, the most recent earthquake of 1883 is extensively documented both in the literature and in unpublished sources. The earthquake caused 2333 deaths and the destruction of the historical and environmental heritage of some areas of the island. The most severe damage occurred in Casamicciola. This event, which was the first great catastrophe after the unification of Italy in the 1860s (Imax = XI degree MCS), represents an important date in the prevention of natural disasters, in that it was after this earthquake that the first Seismic Safety Act in Italy was passed, by which lower-risk zones were identified for new settlements. Thanks to such detailed analysis, reliable modelling of the seismic source was also obtained. The historical data make it possible to identify the epicentral area of all known earthquakes as the northern slope of Monte Epomeo, while analysis of the effects of earthquakes and the geological structures allows us to evaluate the stress fields that generate the earthquakes. In a volcanic area, interpretation of the mechanisms of release and propagation of seismic energy is made even more complex, as the stress field that acts at a regional level is compounded by that generated by the migration of magmatic masses towards the surface, as well as by the rheological properties of the rocks, which depend on the high geothermal gradient. Such structural and dynamic conditions make the island of Ischia a seismic area of considerable interest. It would appear necessary to evaluate the expected damage caused by a new event linked to the renewed dynamics of the island, where high population density and the
NASA Astrophysics Data System (ADS)
Ku, C. S.; You, S. H.; Kuo, Y. T.; Huang, B. S.; Wu, Y. M.; Chen, Y. G.; Taylor, F. W.
2015-12-01
A MW 8.1 earthquake occurred on 1 April 2007 in the western Solomon Islands. Following this event, a damaging tsunami was induced and hit Gizo Island, where the capital city of the Western Province of the Solomon Islands is located. Several buildings of this city were destroyed and several people lost their lives during this earthquake. However, no near-source seismic instruments were installed in this region at the time of the earthquake, so seismic evaluation of the aftershock sequence and possible earthquake early warning and tsunami warning were unavailable. For the purpose of learning more about seismic activity in this region, we have installed 9 seismic stations (with Trillium 120PA broadband seismometers and Q330S 24-bit digitizers) around the rupture zone of the 2007 earthquake since September 2009. Within the past decade, it has been demonstrated both theoretically and experimentally that the Green's function, or impulse response, between two seismic stations can be retrieved from the cross-correlation of ambient noise. In this study, observations from 6 stations, which are more complete during the period 2011/10 ~ 2012/12, were selected for cross-correlation analysis of ambient seismic noise. The group velocities at periods of 2-20 seconds for 15 station pairs were extracted using the multiple filter technique (MFT). The results of this study present significant group-velocity measurements at higher frequencies than in other studies (which usually use periods of 20-60 seconds) and open new opportunities to study the shallow crustal structure of the western Solomon Islands.
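The retrieval step described above can be sketched with synthetic data: cross-correlate simultaneous noise records at two stations and stack over segments, so the correlation peak appears at the inter-station travel time. The sampling rate, delay and segment length below are illustrative assumptions, and the MFT group-velocity step is not included.

```python
# Ambient-noise cross-correlation sketch: a common wavefield recorded at station A
# and, 4 s later, at station B produces a stacked correlation peak at +4 s lag.
import numpy as np
from scipy.signal import correlate

fs = 20.0                                      # Hz
seg_len = int(600 * fs)                        # 10-minute noise segments
delay = int(4.0 * fs)                          # 4 s inter-station travel time
rng = np.random.default_rng(0)

stack = np.zeros(2 * seg_len - 1)
for _ in range(50):                            # stack 50 segments
    noise = rng.normal(0, 1, seg_len + delay)  # common ambient wavefield
    sta_a = noise[delay:]                      # station A records it first
    sta_b = noise[:seg_len]                    # station B records it 4 s later
    stack += correlate(sta_b, sta_a, mode="full", method="fft")

lags = (np.arange(stack.size) - (seg_len - 1)) / fs
print("recovered travel time:", lags[np.argmax(stack)], "s")   # ~ +4 s
```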
Validating induced seismicity forecast models—Induced Seismicity Test Bench
NASA Astrophysics Data System (ADS)
Király-Proag, Eszter; Zechar, J. Douglas; Gischig, Valentin; Wiemer, Stefan; Karvounis, Dimitrios; Doetsch, Joseph
2016-08-01
Induced earthquakes often accompany fluid injection, and the seismic hazard they pose threatens various underground engineering projects. Models to monitor and control induced seismic hazard with traffic light systems should be probabilistic, forward-looking, and updated as new data arrive. In this study, we propose an Induced Seismicity Test Bench to test and rank such models; this test bench can be used for model development, model selection, and ensemble model building. We apply the test bench to data from the Basel 2006 and Soultz-sous-Forêts 2004 geothermal stimulation projects, and we assess forecasts from two models: Shapiro and Smoothed Seismicity (SaSS) and Hydraulics and Seismics (HySei). These models incorporate a different mix of physics-based elements and stochastic representation of the induced sequences. Our results show that neither model is fully superior to the other. Generally, HySei forecasts the seismicity rate better after shut-in but is only mediocre at forecasting the spatial distribution. On the other hand, SaSS forecasts the spatial distribution better and gives better seismicity rate estimates before shut-in. The shut-in phase is a difficult moment for both models in both reservoirs: the models tend to underpredict the seismicity rate around, and shortly after, shut-in.
Pre-processing ambient noise cross-correlations with equalizing the covariance matrix eigenspectrum
NASA Astrophysics Data System (ADS)
Seydoux, Léonard; de Rosny, Julien; Shapiro, Nikolai M.
2017-09-01
Passive imaging techniques from ambient seismic noise require a nearly isotropic distribution of the noise sources in order to ensure reliable traveltime measurements between seismic stations. However, real ambient seismic noise often only partially fulfils this condition. It is generated in preferential areas (in the deep ocean or near continental shores), and some highly coherent pulse-like signals may be present in the data, such as those generated by earthquakes. Several pre-processing techniques have been developed in order to attenuate the directional and deterministic behaviour of this real ambient noise. Most of them are applied to individual seismograms before cross-correlation computation. The most widely used techniques are spectral whitening and temporal smoothing of the individual seismic traces. We here propose an additional pre-processing step to be used together with the classical ones, which is based on the spatial analysis of the seismic wavefield. We compute the cross-spectra between all available station pairs in the spectral domain, leading to the data covariance matrix. We apply a one-bit normalization to the covariance matrix eigenspectrum before extracting the cross-correlations in the time domain. The efficiency of the method is shown with several numerical tests. We apply the method to the data collected by the USArray when the M8.8 Maule earthquake occurred on 2010 February 27. The method shows a clear improvement compared with the classical equalization in attenuating the highly energetic and coherent waves incoming from the earthquake, and allows reliable traveltime measurements to be performed even in the presence of the earthquake.
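A simplified reading of the proposed pre-processing is sketched below: estimate the array covariance matrix from station spectra at one frequency, flatten its non-zero eigenvalue spectrum while keeping the eigenvectors, and use the rebuilt matrix in place of the raw cross-spectra. This is an illustration of the idea with assumed array shapes and thresholds, not the authors' exact implementation.

```python
# Covariance-matrix eigenspectrum equalization sketch (one frequency bin).
import numpy as np

def equalize_covariance(spectra, tol=1e-6):
    """spectra: (n_windows, n_stations) complex Fourier coefficients at one frequency."""
    n_win = spectra.shape[0]
    cov = spectra.conj().T @ spectra / n_win                 # (n_sta, n_sta), Hermitian
    eigval, eigvec = np.linalg.eigh(cov)
    flat = np.where(eigval > tol * eigval.max(), 1.0, 0.0)   # equalised (one-bit) spectrum
    return (eigvec * flat) @ eigvec.conj().T                 # dominant arrivals de-emphasised

rng = np.random.default_rng(0)
# 8 sub-windows, 12 stations: one strong coherent (directional) source plus noise
steering = np.exp(2j * np.pi * rng.random(12))
spec = 10.0 * rng.normal(size=(8, 1)) * steering + \
       0.5 * (rng.normal(size=(8, 12)) + 1j * rng.normal(size=(8, 12)))
cov_eq = equalize_covariance(spec)
```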
Travel time seismic tomography on Reykjanes, SW Iceland
NASA Astrophysics Data System (ADS)
Jousset, Philippe; Ágústsson, Kristjan; Blanck, Hanna; Metz, Malte; Franke, Steven; Pàll Hersir, Gylfi; Bruhn, David; Flovenz, Ólafur; Friðleifsson, Guðmundur
2017-04-01
We present updated tomographic results obtained using seismic data recorded around geothermal reservoirs located both on land at Reykjanes, SW Iceland, and offshore along the Reykjanes Ridge. We gathered records from a network of 234 seismic stations (including 24 Ocean Bottom Seismometers) deployed between April 2014 and August 2015. In order to determine the orientation of the OBS stations, we used the planar particle motions of Rayleigh waves from large-magnitude earthquakes. This method proved suitable when tested with the on-land stations: orientations determined using this method agreed with the orientations measured using a gyro-compass. We focus on the 3D velocity images obtained from local earthquakes using travel time tomography. The processing includes first-arrival picking of P- and S-phases using an automatic detection and picking technique based on the Akaike Information Criterion. We locate earthquakes using a non-linear localization technique, as a priori information for deriving a 1D velocity model. We then compute a 3D velocity model by joint inversion of each earthquake's location and the lateral velocity anomalies with respect to the 1D model. Our model confirms previous models obtained in the area, with enhanced detail. In a second step, we perform an inversion for the Vp/Vs ratio. Results indicate a low Vp/Vs ratio anomaly at depth, suggesting the absence of a large magmatic body under Reykjanes, unlike results obtained at other geothermal fields such as Krafla and Hengill. We discuss implications of those results in the light of the recent IDDP drilling in Reykjanes.
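The automatic picking step mentioned above is commonly implemented with a two-segment AIC function; the sketch below is a minimal version of such a picker applied to a synthetic trace, not the authors' code.

```python
# AIC first-break picker sketch: the pick is the sample that minimises the
# two-segment Akaike Information Criterion of the trace.
import numpy as np

def aic_pick(trace):
    """Return the sample index minimising the two-segment AIC function."""
    n = trace.size
    aic = np.full(n, np.inf)
    for k in range(2, n - 2):
        v1 = np.var(trace[:k])
        v2 = np.var(trace[k:])
        if v1 > 0 and v2 > 0:
            aic[k] = k * np.log(v1) + (n - k - 1) * np.log(v2)
    return int(np.argmin(aic))

rng = np.random.default_rng(0)
trace = rng.normal(0, 0.1, 1000)
trace[600:] += np.sin(2 * np.pi * 0.05 * np.arange(400))    # arrival at sample 600
print(aic_pick(trace))                                       # ~600
```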
Wood Technology: Techniques, Processes, and Products
ERIC Educational Resources Information Center
Oatman, Olan
1975-01-01
Seven areas of wood technology illustrates applicable techniques, processes, and products for an industrial arts woodworking curriculum. They are: wood lamination; PEG (polyethylene glycol) diffusion processes; wood flour and/or particle molding; production product of industry; WPC (wood-plastic-composition) process; residential construction; and…
Parsons, T.; McCarthy, J.; Kohler, W.M.; Ammon, C.J.; Benz, H.M.; Hole, J.A.; Criley, E.E.
1996-01-01
The Colorado Plateau is a large crustal block in the southwestern United States that has been raised intact nearly 2 km above sea level since Cretaceous marine sediments were deposited on its surface. Controversy exists concerning the thickness of the plateau crust and the source of its buoyancy. Interpretations of seismic data collected on the plateau vary as to whether the crust is closer to 40 or 50 km thick. A thick crust could support the observed topography of the Colorado Plateau isostatically, while a thinner crust would indicate the presence of an underlying low-density mantle. This paper reports results on long-offset seismic data collected during the 1989 segment of the U.S. Geological Survey Pacific to Arizona Crustal Experiment that extended from the Transition Zone into the Colorado Plateau in northwest Arizona. We apply two new methods to analyze long-offset data that employ finite difference travel time calculations: (1) a first-arrival time inverter to find upper crustal velocity structure and (2) a forward-modeling technique that allows the direct use of the inverted upper crustal solution in modeling secondary reflected arrivals. We find that the crustal thickness increases from 30 km beneath the metamorphic core complexes in the southern Basin and Range province to about 42 km beneath the northern Transition Zone and southern Colorado Plateau margin. We observe some crustal thinning (to ~37 km thick) and slightly higher lower crustal velocities farther inboard; beneath the Kaibab uplift on the north rim of the Grand Canyon the crust thickens to a maximum of 48 km. We observe a nonuniform crustal thickness beneath the Colorado Plateau that varies by ~15% and corresponds approximately to variations in topography with the thickest crust underlying the highest elevations. Crustal compositions (as inferred from seismic velocities) appear to be the same beneath the Colorado Plateau as those in the Basin and Range province to the southwest
Optical seismic sensor systems and methods
Beal, A. Craig; Cummings, Malcolm E.; Zavriyev, Anton; Christensen, Caleb A.; Lee, Keun
2015-12-08
Disclosed is an optical seismic sensor system for measuring seismic events in a geological formation, including a surface unit for generating and processing an optical signal, and a sensor device optically connected to the surface unit for receiving the optical signal over an optical conduit. The sensor device includes at least one sensor head for sensing a seismic disturbance from at least one direction during a deployment of the sensor device within a borehole of the geological formation. The sensor head includes a frame and a reference mass attached to the frame via at least one flexure, such that movement of the reference mass relative to the frame is constrained to a single predetermined path.
Machine Learning Method for Pattern Recognition in Volcano Seismic Spectra
NASA Astrophysics Data System (ADS)
Radic, V.; Unglert, K.; Jellinek, M.
2016-12-01
Variations in the spectral content of volcano seismicity related to changes in volcanic activity are commonly identified manually in spectrograms. However, long time series of monitoring data at volcano observatories require tools to facilitate automated and rapid processing. Techniques such as Self-Organizing Maps (SOM), Principal Component Analysis (PCA) and clustering methods can help to quickly and automatically identify important patterns related to impending eruptions. In this study we develop and evaluate an algorithm applied on a set of synthetic volcano seismic spectra as well as observed spectra from Kılauea Volcano, Hawai`i. Our goal is to retrieve a set of known spectral patterns that are associated with dominant phases of volcanic tremor before, during, and after periods of volcanic unrest. The algorithm is based on training a SOM on the spectra and then identifying local maxima and minima on the SOM 'topography'. The topography is derived from the first two PCA modes so that the maxima represent the SOM patterns that carry most of the variance in the spectra. Patterns identified in this way reproduce the known set of spectra. Our results show that, regardless of the level of white noise in the spectra, the algorithm can accurately reproduce the characteristic spectral patterns and their occurrence in time. The ability to rapidly classify spectra of volcano seismic data without prior knowledge of the character of the seismicity at a given volcanic system holds great potential for real time or near-real time applications, and thus ultimately for eruption forecasting.
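Setting the SOM aside, the PCA step can be illustrated directly: project a set of spectra onto their first two principal components, the plane in which the SOM 'topography' is then analysed. The synthetic two-state spectra below are illustrative, not the Kilauea data.

```python
# PCA sketch: spectra with two dominant tremor "states" separate cleanly in the
# plane of the first two principal components.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
freqs = np.linspace(0.5, 10.0, 200)
state_a = np.exp(-((freqs - 1.0) / 0.3) ** 2)        # synthetic 1 Hz tremor peak
state_b = np.exp(-((freqs - 3.0) / 0.5) ** 2)        # synthetic 3 Hz tremor peak
spectra = np.vstack([state_a + 0.05 * rng.normal(size=(150, 200)),
                     state_b + 0.05 * rng.normal(size=(150, 200))])

scores = PCA(n_components=2).fit_transform(spectra)
# the two clusters in `scores` correspond to the two dominant spectral patterns
print(scores.shape)
```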
Development of the Multi-Level Seismic Receiver (MLSR)
NASA Astrophysics Data System (ADS)
Sleefe, G. E.; Engler, B. P.; Drozda, P. M.; Franco, R. J.; Morgan, Jeff
1995-02-01
The Advanced Geophysical Technology Department (6114) and the Telemetry Technology Development Department (2664) have, in conjunction with the Oil Recovery Technology Partnership, developed a Multi-Level Seismic Receiver (MLSR) for use in crosswell seismic surveys. The MLSR was designed and evaluated with the significant support of many industry partners in the oil exploration industry. The unit was designed to record and process superior quality seismic data operating in severe borehole environments, including high temperature (up to 200 C) and static pressure (10,000 psi). This development has utilized state-of-the-art technology in transducers, data acquisition, and real-time data communication and data processing. The mechanical design of the receiver has been carefully modeled and evaluated to insure excellent signal coupling into the receiver.
A frequency-domain seismic blind deconvolution based on Gini correlations
NASA Astrophysics Data System (ADS)
Wang, Zhiguo; Zhang, Bing; Gao, Jinghuai; Huo Liu, Qing
2018-02-01
In reflection seismic processing, seismic blind deconvolution is a challenging problem, especially when the signal-to-noise ratio (SNR) of the seismic record is low and the length of the seismic record is short. As a solution to this ill-posed inverse problem, we assume that the reflectivity sequence is independent and identically distributed (i.i.d.). To infer the i.i.d. relationships from seismic data, we first introduce the Gini correlations (GCs) to construct a new criterion for seismic blind deconvolution in the frequency domain. Owing to their unique properties, the GCs are robust, with a higher tolerance of low-SNR data and less dependence on record length. Applications of the seismic blind deconvolution based on the GCs demonstrate their capacity to estimate the unknown seismic wavelet and the reflectivity sequence, for both synthetic traces and field data, even with low SNR and short records.
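For readers unfamiliar with the statistic, a hedged sketch of the (asymmetric) Gini correlation between two sequences follows; this is the standard rank-based definition, written in plain numpy/scipy, and is purely illustrative rather than the authors' frequency-domain deconvolution criterion.

```python
# Sketch of the asymmetric Gini correlation Gamma(X, Y) = cov(X, rank(Y)) / cov(X, rank(X)).
# Illustrative only; the paper builds its deconvolution criterion on top of this
# statistic in the frequency domain, which is not reproduced here.
import numpy as np
from scipy.stats import rankdata

def gini_correlation(x, y):
    rx, ry = rankdata(x), rankdata(y)
    return np.cov(x, ry)[0, 1] / np.cov(x, rx)[0, 1]

# Note: in general gini_correlation(x, y) != gini_correlation(y, x);
# the rank-based form is what gives the statistic its tolerance of outliers
# and low-SNR samples compared with the ordinary Pearson correlation.
```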
CyberShake: Running Seismic Hazard Workflows on Distributed HPC Resources
NASA Astrophysics Data System (ADS)
Callaghan, S.; Maechling, P. J.; Graves, R. W.; Gill, D.; Olsen, K. B.; Milner, K. R.; Yu, J.; Jordan, T. H.
2013-12-01
As part of its program of earthquake system science research, the Southern California Earthquake Center (SCEC) has developed a simulation platform, CyberShake, to perform physics-based probabilistic seismic hazard analysis (PSHA) using 3D deterministic wave propagation simulations. CyberShake performs PSHA by simulating a tensor-valued wavefield of Strain Green Tensors, and then using seismic reciprocity to calculate synthetic seismograms for about 415,000 events per site of interest. These seismograms are processed to compute ground motion intensity measures, which are then combined with probabilities from an earthquake rupture forecast to produce a site-specific hazard curve. Seismic hazard curves for hundreds of sites in a region can be used to calculate a seismic hazard map, representing the seismic hazard for a region. We present a recently completed PSHA study in which we calculated four CyberShake seismic hazard maps for the Southern California area to compare how CyberShake hazard results are affected by different SGT computational codes (AWP-ODC and AWP-RWG) and different community velocity models (Community Velocity Model - SCEC (CVM-S4) v11.11 and Community Velocity Model - Harvard (CVM-H) v11.9). We present our approach to running workflow applications on distributed HPC resources, including systems without support for remote job submission. We show how our approach extends the benefits of scientific workflows, such as job and data management, to large-scale applications on Track 1 and Leadership class open-science HPC resources. We used our distributed workflow approach to perform CyberShake Study 13.4 on two new NSF open-science HPC computing resources, Blue Waters and Stampede, executing over 470 million tasks to calculate physics-based hazard curves for 286 locations in the Southern California region. For each location, we calculated seismic hazard curves with two different community velocity models and two different SGT codes, resulting in over
A vector scanning processing technique for pulsed laser velocimetry
NASA Technical Reports Server (NTRS)
Wernet, Mark P.; Edwards, Robert V.
1989-01-01
Pulsed laser sheet velocimetry yields nonintrusive measurements of two-dimensional velocity vectors across an extended planar region of a flow. Current processing techniques offer high precision (1 pct) velocity estimates, but can require several hours of processing time on specialized array processors. Under some circumstances, a simple, fast, less accurate (approx. 5 pct), data reduction technique which also gives unambiguous velocity vector information is acceptable. A direct space domain processing technique was examined. The direct space domain processing technique was found to be far superior to any other techniques known, in achieving the objectives listed above. It employs a new data coding and reduction technique, where the particle time history information is used directly. Further, it has no 180 deg directional ambiguity. A complex convection vortex flow was recorded and completely processed in under 2 minutes on an 80386 based PC, producing a 2-D velocity vector map of the flow field. Hence, using this new space domain vector scanning (VS) technique, pulsed laser velocimetry data can be reduced quickly and reasonably accurately, without specialized array processing hardware.
NASA Astrophysics Data System (ADS)
Busby, Robert; Frassetto, Andy; Hafner, Katrin; Woodward, Robert; Sauter, Allan
2013-04-01
In preparation for deployment of EarthScope's USArray Transportable Array (TA) in Alaska beginning in 2014, the National Science Foundation (NSF) is supporting exploratory work on seismic station design, sensor emplacement and communication concepts appropriate for the challenging high-latitude environment that is proposed for deployment. IRIS has installed several experimental stations to evaluate different sensor emplacement schemes both in Alaska and the lower-48 U.S. The goal of these tests is to maintain or enhance a station's noise performance while minimizing its footprint and the equipment, materials, and overall expense required for its construction. Motivating this approach are recent developments in posthole broadband seismometer design and the unique conditions for operating in Alaska, where there are few roads, cellular communications are scarce, most areas are only accessible by small plane or helicopter, and permafrost underlies much of the northern tundra. In this study we review our methods for directly emplacing broadband seismometers in comparison with the current methods used to deploy TA stations. These primarily focus on using an auger to drill three to five meters, beneath the active layer of the permafrost, or coring directly into surface bedrock to one meter depth using a portable drill. Both methods have proven logistically effective in trials. Subsequent station performance can be quantitatively assessed using probability density functions summed from power spectral density estimates. These are calculated for the continuous time series of seismic data recorded for each channel of the seismometer. There are five test stations currently operating in Alaska. One was deployed in August 2011 and the remaining four in October 2012. Our results show that the performance of seismometers in Alaska with auger-hole or core-hole installations equals or exceeds that of the quietest TA stations in the lower-48, particularly at long periods, and
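Station-noise assessments of the kind described (probability density functions summed from power spectral density estimates) are commonly produced with ObsPy's PPSD class; a minimal hedged sketch follows, with file names, metadata and channel purely illustrative placeholders rather than the actual TA processing.

```python
# Sketch: build a PSD probability density function for one continuous channel,
# the standard McNamara-and-Buland-style summary of station noise performance.
# File names and metadata are placeholders, not the project's real data.
from obspy import read, read_inventory
from obspy.signal import PPSD

st = read("station_day.mseed")                 # one day of continuous broadband data
inv = read_inventory("station_response.xml")   # instrument response metadata

ppsd = PPSD(st[0].stats, metadata=inv)         # one PPSD object per channel
ppsd.add(st)                                   # accumulate PSD estimates from the stream
ppsd.plot()                                    # PDF of PSDs vs. period, with noise models
```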
Seismic hazard estimation of northern Iran using smoothed seismicity
NASA Astrophysics Data System (ADS)
Khoshnevis, Naeem; Taborda, Ricardo; Azizzadeh-Roodpish, Shima; Cramer, Chris H.
2017-07-01
This article presents a seismic hazard assessment for northern Iran, where a smoothed seismicity approach has been used in combination with an updated seismic catalog and a ground motion prediction equation recently found to yield a good fit to the data. We evaluate the hazard over a geographical area including the seismic zones of Azerbaijan, the Alborz Mountain Range, and Kopeh-Dagh, as well as parts of other neighboring seismic zones that fall within our region of interest. In the chosen approach, seismic events are not assigned to specific faults but assumed to be potential seismogenic sources distributed within regular grid cells. After performing the corresponding magnitude conversions, we decluster both historical and instrumental seismicity catalogs to obtain earthquake rates based on the number of events within each cell, and smooth the results to account for the uncertainty in the spatial distribution of future earthquakes. Seismicity parameters are computed for each seismic zone separately, and for the entire region of interest as a single uniform seismotectonic region. In the analysis, we consider uncertainties in the ground motion prediction equation and the seismicity parameters, and combine the resulting models using a logic tree. The results are presented in terms of expected peak ground acceleration (PGA) maps and hazard curves at selected locations, considering exceedance probabilities of 2 and 10% in 50 years for rock site conditions. According to our results, the highest levels of hazard are observed west of the North Tabriz and east of the North Alborz faults, where expected PGA values are between about 0.5 and 1 g for 10 and 2% probability of exceedance in 50 years, respectively. We analyze our results in light of similar estimates available in the literature and offer our perspective on the differences observed. We find our results to be helpful in understanding seismic hazard for northern Iran, but recognize that additional efforts are necessary to
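The smoothed-seismicity step (counting declustered events in grid cells and smoothing the counts to represent spatial uncertainty) can be sketched as a Gaussian kernel smoother in the style of Frankel (1995); grid spacing, correlation distance and catalog duration below are illustrative placeholders, not the values used in this study.

```python
# Sketch of Frankel-style smoothed seismicity: count declustered epicenters in
# regular cells, then smooth the counts with a Gaussian kernel whose correlation
# distance c (km) expresses the spatial uncertainty of future earthquakes.
# Grid spacing, c and the catalog duration are illustrative, not the study's values.
import numpy as np

def smoothed_rates(lons, lats, lon_edges, lat_edges, c_km=50.0, years=50.0):
    counts, _, _ = np.histogram2d(lons, lats, bins=[lon_edges, lat_edges])
    lon_c = 0.5 * (lon_edges[:-1] + lon_edges[1:])
    lat_c = 0.5 * (lat_edges[:-1] + lat_edges[1:])
    glon, glat = np.meshgrid(lon_c, lat_c, indexing="ij")
    km_per_deg = 111.0
    smoothed = np.zeros_like(counts)
    for i in range(counts.shape[0]):
        for j in range(counts.shape[1]):
            dx = (glon - glon[i, j]) * km_per_deg * np.cos(np.radians(glat[i, j]))
            dy = (glat - glat[i, j]) * km_per_deg
            w = np.exp(-(dx**2 + dy**2) / c_km**2)
            smoothed[i, j] = np.sum(counts * w) / np.sum(w)
    return smoothed / years   # annual rate of events above the completeness magnitude
```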
NASA Technical Reports Server (NTRS)
Kovach, R. L.; Watkins, J. S.; Talwani, P.
1972-01-01
The Apollo 16 active seismic experiment (ASE) was designed to generate and monitor seismic waves for the study of the lunar near-surface structure. Several seismic energy sources are used: an astronaut-activated thumper device, a mortar package that contains rocket-launched grenades, and the impulse produced by the lunar module ascent. Analysis of some seismic signals recorded by the ASE has provided data concerning the near-surface structure at the Descartes landing site. Two compressional seismic velocities have so far been recognized in the seismic data. The deployment of the ASE is described, and the significant results obtained are discussed.
Geomechanics-Based Stochastic Analysis of Injection- Induced Seismicity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghassemi, Ahmad
relying on our a priori knowledge of criticality distribution, we combine an initial probabilistic description of criticality with the information contained in microseismic measurements to arrive at criticality solutions that are conditioned on both field data and our prior knowledge. Previous SBRC work has relied upon a deterministic inversion approach to estimate the permeability and the extent of the stimulated zone, whereas a stochastic inversion algorithm that recognizes and quantifies the uncertainties in the prior model, the time evolution of pore pressure distributions (modeling errors), and the observed seismic events is developed and used herein to realistically assess the quality of the solution. Finally, we developed a technique for processing discrete MEQ data to estimate fracture network properties such as dip and dip directions. The approach was successfully applied to the Fenton Hill HDR experiment and the Newberry EGS, with results in good agreement with field observations.
NASA Astrophysics Data System (ADS)
Hole, J. A.; Stock, J. M.; Fuis, G. S.; Rymer, M. J.; Murphy, J. M.; Sickler, R. R.; Criley, C. J.; Goldman, M.; Catchings, R. D.; Ricketts, J. W.; Gonzalez-Fernandez, A.; Driscoll, N.; Kent, G.; Harding, A. J.; Klemperer, S. L.
2009-12-01
The Salton Seismic Imaging Project (SSIP) and coordinated projects will acquire seismic data in and across the Salton Trough in southern California and northern Mexico, including the Coachella, Imperial, and Mexicali Valleys. These projects address both rifting processes at the northern end of the Gulf of California extensional province and earthquake hazards at the southern end of the San Andreas Fault system. In the central Salton Trough, North American lithosphere appears to have been rifted completely apart. Based primarily on a 1979 seismic refraction project, the 20-22 km thick crust is apparently composed entirely of new crust added by magmatism from below and sedimentation from above. The new data will constrain the style of continental breakup, the role and mode of magmatism, the effects of rapid Colorado River sedimentation upon extension and magmatism, and the partitioning of oblique extension. The southernmost San Andreas Fault is considered at high risk of producing a large damaging earthquake, yet structures of the fault and adjacent basins are poorly constrained. To improve hazard models, SSIP will image the geometry of the San Andreas and Imperial Faults, structure of sedimentary basins in the Salton Trough, and three-dimensional seismic velocity of the crust and uppermost mantle. SSIP and collaborating projects have been funded by several different programs at NSF and the USGS. These projects include seven lines of land refraction and low-fold reflection data, airguns and OBS data in the Salton Sea, coordinated fieldwork for onshore-offshore and 3-D data, and a densely sampled line of broadband stations across the trough. Fieldwork is tentatively scheduled for 2010. Preliminary work in 2009 included calibration shots in the Imperial Valley that quantified strong ground motion and proved lack of harm to agricultural irrigation tile drains from explosive shots. Piggyback and complementary studies are encouraged.
Polarization Analysis of Ambient Seismic Noise Green's Functions for Monitoring Glacial State
NASA Astrophysics Data System (ADS)
Fry, B.; Horgan, H. J.; Levy, R. H.; Bertler, N. A. N.
2017-12-01
Analysis of continuously recorded background seismic noise has emerged as a powerful technique to monitor changes within the Earth. In a process analogous to Einstein's 'Brownian motion', seismic energy enters the Earth through a variety of mechanisms and then is dissipated through scattering processes or through a semi-random distribution of sources. Eventually, in stratified media, some of this energy assembles itself in coherent packets and propagates as seismic surface waves. Through careful analysis of these waves as recorded by two seismic stations over a short period of time, we can reconstruct Empirical Green's Functions (EGF). EGF are sensitive to the material through which the waves are travelling between the two stations. They can thus provide 4D estimates of material properties such as seismic velocity and anisotropy. We specifically analyze both the bulk velocity and the complex phase of these EGF to look for subtle changes in velocity with direction of propagation as well as the nature of particle polarization and ellipticity. These characteristics can then be used as a proxy for contemporaneous stress and strain or 'inherited' strain. Similar approaches have proven successful in mapping stresses and strain in the crust, on plate interface faults, volcanoes, and on glaciers and the Greenland ice sheet. We will present results from applying this approach to continuous broadband data recorded on the West Antarctic Ice Sheet through the Polenet project. Our results suggest that we can reconstruct EGF at least between frequencies of 300 mHz and 50 mHz for time periods, providing information about the contemporary state of ice and underlying lithosphere on a seasonal or annual basis. Our primary goals are determining glacial state by linking wave propagation to material fabric on micro (crystal orientation) and macro (strain marker) scales, as well as rebound processes in the lithosphere during glacial loading and unloading. We will present our current
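The core of the EGF reconstruction is a cross-correlation of simultaneous noise records at two stations, stacked over many windows; a minimal numpy/scipy sketch follows, assuming traces that are already synchronized, detrended and spectrally whitened. Window length and names are illustrative, not the Polenet processing parameters.

```python
# Sketch: stack of windowed noise cross-correlations between two stations, whose
# causal/acausal lobes approximate the inter-station Empirical Green's Function.
# Assumes pre-processed (detrended, whitened) traces; window length is illustrative.
import numpy as np
from scipy.signal import fftconvolve

def noise_egf(trace_a, trace_b, fs, window_s=3600.0):
    n = int(window_s * fs)
    n_win = min(len(trace_a), len(trace_b)) // n
    stack = np.zeros(2 * n - 1)
    for k in range(n_win):
        a = trace_a[k * n:(k + 1) * n]
        b = trace_b[k * n:(k + 1) * n]
        stack += fftconvolve(a, b[::-1], mode="full")   # cross-correlation via FFT
    lags = np.arange(-(n - 1), n) / fs                  # lag axis in seconds
    return lags, stack / n_win
```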
Mini-Sosie high-resolution seismic method aids hazards studies
Stephenson, W.J.; Odum, J.; Shedlock, K.M.; Pratt, T.L.; Williams, R.A.
1992-01-01
The Mini-Sosie high-resolution seismic method has been effective in imaging shallow-structure and stratigraphic features that aid in seismic-hazard and neotectonic studies. The method is not an alternative to Vibroseis acquisition for large-scale studies. However, it has two major advantages over Vibroseis as it is being used by the USGS in its seismic-hazards program. First, the sources are extremely portable and can be used in both rural and urban environments. Second, the shifting-and-summation process during acquisition improves the signal-to-noise ratio and cancels out seismic noise sources such as cars and pedestrians. -from Authors
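The shifting-and-summation that gives Mini-Sosie its noise-cancelling property amounts to aligning the continuous record on each recorded source-impact time and stacking; a schematic numpy sketch follows, with sample indexing and names purely illustrative rather than the USGS acquisition software.

```python
# Sketch of the Mini-Sosie shift-and-sum: the continuous record is cut at each
# detected impact time of the earth tamper and the segments are summed, so the
# repeatable source signal builds up while random traffic and pedestrian noise
# tends to cancel. Sample rate, record length and impact times are illustrative.
import numpy as np

def shift_and_sum(record, impact_samples, trace_len):
    trace = np.zeros(trace_len)
    n_used = 0
    for s in impact_samples:
        if s + trace_len <= len(record):
            trace += record[s:s + trace_len]
            n_used += 1
    return trace / max(n_used, 1)   # stacked trace; SNR grows roughly with sqrt(n_used)
```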
New data of the Gakkel Ridge seismicity
NASA Astrophysics Data System (ADS)
Antonovskaya, Galina; Basakina, Irina; Kremenetskaya, Elena
2016-04-01
A total of 250 earthquakes were recorded on the Gakkel Ridge during the period 2012-2014 by the Arkhangelsk seismic network. The magnitudes Ml of these earthquakes range from 1.5 to 5.7, and 70% of them have Ml up to 3.0. Seismic events are aligned along a narrow central line of the Mid-Arctic Ridge, and most of the earthquakes are confined to the southern flank of the Ridge, presumably reflecting spreading processes. Zones of high seismic activity, which we associate with volcano-tectonic processes, have been identified. Up to 13 events per day have been recorded in the Western Volcanic Zone. The largest number of events (75%) is confined to the Sparsely Magmatic Zone. About 30% of all recorded earthquakes with magnitudes above 2.9 have a T-phase. We divided the Gakkel Ridge earthquakes into two groups using spectral-time analysis. In the first group, the maximum energy of the earthquake is observed from 1.5 to 10 Hz, with magnitudes Ml 2.50-5.29; these earthquakes are distributed along the Gakkel Ridge. In the second group, the maximum energy is observed from 1.5 to 20 Hz with a clearly expressed high-frequency component, and magnitudes are Ml 2.3-3.4; earthquakes of the second group are concentrated only in the Sparsely Magmatic Zone. The new seismic data provide unique information about the geodynamic processes of the Gakkel Ridge.
Heeszel, David S.; Fricker, Helen A.; Bassis, Jeremy N.; O'Neel, Shad; Walter, Fabian
2014-01-01
Iceberg calving is a dominant mass loss mechanism for Antarctic ice shelves, second only to basal melting. An important known process involved in calving is the initiation and propagation of through-penetrating fractures called rifts; however, the mechanisms controlling rift propagation remain poorly understood. To investigate the mechanics of ice-shelf rifting, we analyzed seismicity associated with a propagating rift tip on the Amery Ice Shelf, using data collected during the Austral summers of 2004-2007. We investigated seismicity associated with fracture propagation using a suite of passive seismological techniques including icequake locations, back projection, and moment tensor inversion. We confirm previous results that show that seismicity is characterized by periods of relative quiescence punctuated by swarms of intense seismicity of one to three hours. However, even during periods of quiescence, we find significant seismic deformation around the rift tip. Moment tensors, calculated for a subset of the largest icequakes (MW > -2.0) located near the rift tip, show steeply dipping fault planes, horizontal or shallowly plunging stress orientations, and often have a significant volumetric component. They also reveal that much of the observed seismicity is limited to the upper 50 m of the ice shelf. This suggests a complex system of deformation that involves the propagating rift, the region behind the rift tip, and a system of rift-transverse crevasses. Small-scale variations in the mechanical structure of the ice shelf, especially rift-transverse crevasses and accreted marine ice, play an important role in modulating the rate and location of seismicity associated with propagating ice shelf rifts.
Automated detection and characterization of harmonic tremor in continuous seismic data
NASA Astrophysics Data System (ADS)
Roman, Diana C.
2017-06-01
Harmonic tremor is a common feature of volcanic, hydrothermal, and ice sheet seismicity and is thus an important proxy for monitoring changes in these systems. However, no automated methods for detecting harmonic tremor currently exist. Because harmonic tremor shares characteristics with speech and music, digital signal processing techniques for analyzing these signals can be adapted. I develop a novel pitch-detection-based algorithm to automatically identify occurrences of harmonic tremor and characterize their frequency content. The algorithm is applied to seismic data from Popocatepetl Volcano, Mexico, and benchmarked against a monthlong manually detected catalog of harmonic tremor events. During a period of heightened eruptive activity from December 2014 to May 2015, the algorithm detects 1465 min of harmonic tremor, which generally precede periods of heightened explosive activity. These results demonstrate the algorithm's ability to accurately characterize harmonic tremor while highlighting the need for additional work to understand its causes and implications at restless volcanoes.
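A hedged sketch of the kind of pitch detection described above: each window of the trace is reduced to a fundamental-frequency estimate via autocorrelation, and windows with a strong, stable periodicity are flagged as candidate harmonic tremor. The band limits, window length and peak threshold below are illustrative placeholders, not the published algorithm or its parameters.

```python
# Sketch: autocorrelation-based pitch detection on sliding windows of a seismic
# trace; windows with a clear periodicity (strong autocorrelation peak) are
# flagged as candidate harmonic tremor. Band limits, window length and the
# 0.6 peak threshold are illustrative, not the published parameters.
import numpy as np
from scipy.signal import butter, sosfiltfilt

def detect_harmonic_tremor(x, fs, win_s=10.0, fmin=0.5, fmax=10.0, thresh=0.6):
    sos = butter(4, [fmin, fmax], btype="bandpass", fs=fs, output="sos")
    x = sosfiltfilt(sos, x)
    n = int(win_s * fs)
    detections = []
    for start in range(0, len(x) - n, n):
        w = x[start:start + n] * np.hanning(n)
        ac = np.correlate(w, w, mode="full")[n - 1:]    # non-negative lags
        ac /= ac[0] + 1e-12
        lo = int(fs / fmax)                             # shortest allowed pitch period
        hi = min(int(fs / fmin), n - 1)                 # longest allowed pitch period
        k = lo + np.argmax(ac[lo:hi])
        if ac[k] > thresh:                              # clear periodicity -> harmonic
            detections.append((start / fs, fs / k))     # (time in s, fundamental in Hz)
    return detections
```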
NASA Astrophysics Data System (ADS)
Stallone, A.; Marzocchi, W.
2017-12-01
Earthquake occurrence may be approximated by a multidimensional Poisson clustering process, where each point of the Poisson process is replaced by a cluster of points, the latter corresponding to the well-known aftershock sequence (triggered events). Earthquake clusters and their parents are assumed to occur according to a Poisson process at a constant temporal rate proportional to the tectonic strain rate, while events within a cluster are modeled as generations of dependent events reproduced by a branching process. Although the occurrence of such space-time clusters is a general feature in different tectonic settings, seismic sequences seem to have marked differences from region to region: one example, among many others, is that seismic sequences of moderate magnitude in the Italian Apennines seem to last longer than similar seismic sequences in California. In this work we investigate the existence of possible differences in the earthquake clustering process in these two areas. First, we separate the triggered and background components of seismicity in the Italian and Southern California seismic catalogs. Then we study the space-time domain of the triggered earthquakes with the aim of identifying possible variations in the triggering properties across the two regions. In the second part of the work we focus our attention on the characteristics of the background seismicity in both seismic catalogs. The assumption of time stationarity of the background seismicity (which includes both cluster parents and isolated events) is still under debate. Some authors suggest that the independent component of seismicity could undergo transient perturbations at various time scales due to different physical mechanisms, such as, for example, viscoelastic relaxation, presence of fluids, non-stationary plate motion, etc., whose impact may depend on the tectonic setting. Here we test if the background seismicity in the two regions can be satisfactorily described by the time
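A minimal sketch of the Poisson clustering picture described above, reduced to one (time) dimension: background parents occur at a constant Poisson rate and each spawns triggered events with an Omori-like delay distribution. All rates and parameters are illustrative, and only a single offspring generation is shown; the full model described in the abstract is a multi-generation branching process in space and time.

```python
# Sketch: 1-D (time-only) Poisson cluster process. Cluster parents follow a
# homogeneous Poisson process (rate mu per year); each parent triggers a
# Poisson-distributed number of offspring whose delays follow an Omori-like
# power-law decay. mu, branching ratio and Omori parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def simulate_cluster_process(mu=20.0, years=10.0, n_trig=3.0, c=0.01, p=1.2):
    parents = rng.uniform(0.0, years, rng.poisson(mu * years))
    events = list(parents)
    for t0 in parents:
        for _ in range(rng.poisson(n_trig)):            # offspring of this parent
            u = rng.random()
            delay = c * ((1.0 - u) ** (1.0 / (1.0 - p)) - 1.0)  # inverse-CDF Omori sampling
            events.append(t0 + delay)
    return np.sort(np.array(events))
```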
Patterns of significant seismic quiescence on the Mexican Pacific coast
NASA Astrophysics Data System (ADS)
Muñoz-Diosdado, A.; Rudolf-Navarro, A. H.; Angulo-Brown, F.; Barrera-Ferrer, A. G.
Many authors have proposed that the study of seismicity rates is an appropriate technique for evaluating how close a seismic gap may be to rupture. We designed an algorithm for the identification of patterns of significant seismic quiescence by using the definition of seismic quiescence proposed by Schreider (1990). This algorithm delineates the area of quiescence where an earthquake of great magnitude is likely to occur. We applied our algorithm to the earthquake catalog of the Mexican Pacific coast, located between 14 and 21 degrees North latitude and 94 and 106 degrees West longitude, for events with depths less than or equal to 60 km and magnitudes greater than or equal to 4.3 that occurred from January 1965 until December 2014. We have found significant patterns of seismic quietude before the earthquakes of Oaxaca (November 1978, Mw = 7.8), Petatlán (March 1979, Mw = 7.6), Michoacán (September 1985, Mw = 8.0 and Mw = 7.6) and Colima (October 1995, Mw = 8.0). Fortunately, in this century earthquakes of great magnitude have not occurred in Mexico. However, we have identified well-defined seismic quiescences in the Guerrero seismic gap, which are apparently correlated with the occurrence of silent earthquakes in 2002, 2006 and 2010, recently discovered by GPS technology.
Seismicity in Pennsylvania: Evidence for Anthropogenic Events?
NASA Astrophysics Data System (ADS)
Homman, K.; Nyblade, A.
2015-12-01
The deployment and operation of the USArray Transportable Array (TA) and the PASEIS (XY) seismic networks in Pennsylvania during 2013 and 2014 provide a unique opportunity for investigating the seismicity of Pennsylvania. These networks, along with several permanent stations in Pennsylvania, resulted in a total of 104 seismometers in and around Pennsylvania that have been used in this study. Event locations were first obtained with Antelope Environmental Monitoring Software using P-wave arrival times. Arrival times were hand picked using a 1-5 Hz bandpass filter to within 0.1 seconds. Events were then relocated using a velocity model developed for Pennsylvania and the HYPOELLIPSE location code. In this study, 1593 seismic events occurred between February 2013 and December 2014 in Pennsylvania. These events ranged in magnitude (ML) from 1.04 to 2.89, with an average ML of 1.90. Locations of the events occur across the state in many areas where no seismicity has been previously reported. Preliminary results indicate that most of these events are related to mining activity. Additional work using cross-correlation techniques is underway to examine a number of event clusters for evidence of hydraulic fracturing or wastewater injection sources.
Effect of strong elastic contrasts on the propagation of seismic wave in hard-rock environments
NASA Astrophysics Data System (ADS)
Saleh, R.; Zheng, L.; Liu, Q.; Milkereit, B.
2013-12-01
Understanding the propagation of seismic waves in the presence of strong elastic contrasts, such as topography, tunnels, and ore bodies, is still a challenge. Safety in mining is a major concern, and seismic monitoring is the main tool here. For engineering purposes, amplitudes (peak particle velocity/acceleration) and travel times of seismic events (mostly blasts or microseismic events) are critical parameters that have to be determined at various locations in a mine. These parameters are useful in preparing risk maps or in better understanding the process of spatial and temporal stress distribution in a mine. Simple constant-velocity models used for monitoring studies in mining cannot explain the observed complexities in scattered seismic waves. In hard-rock environments, modeling of the elastic seismic wavefield requires detailed 3D petrophysical, infrastructure, and topographic data to simulate the propagation of seismic waves with frequencies up to a few kilohertz. With the development of efficient numerical techniques and parallel computation facilities, a solution to such a problem is achievable. In this study, the effects of strong elastic contrasts such as ore bodies, rough topography, and tunnels are illustrated using 3D modeling methods. The main tools here are the finite-difference code SOFI3D [1], which has been benchmarked for engineering studies, and the spectral-element code SPECFEM [2], which was developed for global seismology problems. The modeling results show locally enhanced peak particle velocity due to the presence of strong elastic contrasts and topography in the models. [1] Bohlen, T. Parallel 3-D viscoelastic finite difference seismic modeling. Computers & Geosciences 28 (2002) 887-899. [2] Komatitsch, D., and J. Tromp, Introduction to the spectral-element method for 3-D seismic wave propagation, Geophys. J. Int., 139, 806-822, 1999.
Full Waveform Modelling for Subsurface Characterization with Converted-Wave Seismic Reflection
NASA Astrophysics Data System (ADS)
Triyoso, Wahyu; Oktariena, Madaniya; Sinaga, Edycakra; Syaifuddin, Firman
2017-04-01
While a large number of reservoirs have been explored using P-wave seismic data, P-wave seismic surveys cease to provide adequate results in seismically and geologically challenging areas involving gas clouds, shallow drilling hazards, strong multiples, intense fracturing, or anisotropy. Most of these reservoir problems can be addressed using a combination of P and PS seismic data. A multicomponent seismic survey records both P- and S-waves, unlike a conventional survey that records only the compressional P-wave. Under certain conditions, a conventional energy source can be used to record P and PS data, exploiting the fact that compressional wave energy partly converts into shear waves at the reflector. The shear component can be recorded from the downgoing P-wave and upcoming S-wave by placing horizontal-component geophones on the ocean floor. A synthetic model is created based on real data to analyze the effect of a gas cloud on PP and PS reflections, a problem with characteristics similar to sub-volcanic imaging. The challenge in multicomponent seismic is the different travel times of the P- and S-waves; therefore, the converted-wave seismic data must be processed with a different approach. This research provides a method to determine an optimum conversion point, known as the Common Conversion Point (CCP), that accounts for the asymmetric conversion point of PS data. The value of γ (Vp/Vs) is essential for estimating the correct CCP to be used in converted-wave seismic processing. This research also proceeds to advanced processing of the converted-wave data by applying joint inversion to the PP and PS seismic data. Joint inversion is a simultaneous model-based inversion that estimates P- and S-wave impedances consistent with the PP and PS amplitude data. The result reveals a more complex structure mirrored in the PS data below the gas cloud area. Through the γ section estimated from the joint inversion, we obtain improved imaging below the gas cloud area tribute to
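The dependence of the conversion point on γ = Vp/Vs can be illustrated with the standard asymptotic common-conversion-point approximation, in which the conversion point sits at a fraction γ/(1+γ) of the source-receiver offset from the source. This is a minimal sketch of that textbook relation, not the depth-dependent CCP binning scheme developed in the study.

```python
# Sketch: asymptotic common-conversion-point (ACP) approximation for PS data.
# For a source-receiver offset x and gamma = Vp/Vs, the conversion point sits
# roughly at x * gamma / (1 + gamma) from the source (closer to the receiver,
# which is the asymmetry mentioned in the abstract). Depth-dependent CCP binning
# used in practice is more involved than this.
def asymptotic_conversion_point(src_x, rec_x, gamma=2.0):
    frac = gamma / (1.0 + gamma)
    return src_x + frac * (rec_x - src_x)

# Example: offset 2000 m and gamma = 2 place the conversion point ~1333 m from the source.
print(asymptotic_conversion_point(0.0, 2000.0, gamma=2.0))
```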
NASA Astrophysics Data System (ADS)
Silva, Sónia; Terrinha, Pedro; Matias, Luis; Duarte, João C.; Roque, Cristina; Ranero, César R.; Geissler, Wolfram H.; Zitellini, Nevio
2017-10-01
The Gulf of Cadiz seismicity is characterized by persistent low to intermediate magnitude earthquakes, occasionally punctuated by high magnitude events such as the M 8.7 1755 Great Lisbon earthquake and the M = 7.9 event of February 28th, 1969. Micro-seismicity was recorded during 11 months by a temporary network of 25 ocean bottom seismometers (OBSs) in an area of high seismic activity, encompassing the potential source areas of the mentioned large magnitude earthquakes. We combined micro-seismicity analysis with processing and interpretation of deep crustal seismic reflection profiles and available refraction data to investigate the possible tectonic control of the seismicity in the Gulf of Cadiz area. Three controlling mechanisms are explored: i) active tectonic structures, ii) transitions between different lithospheric domains and inherited Mesozoic structures, and iii) fault weakening mechanisms. Our results show that micro-seismicity is mostly located in the upper mantle and is associated with tectonic inversion of extensional rift structures and to the transition between different lithospheric/rheological domains. Even though the crustal structure is well imaged in the seismic profiles and in the bathymetry, crustal faults show low to negligible seismic activity. A possible explanation for this is that the crustal thrusts are thin-skinned structures rooting in relatively shallow sub-horizontal décollements associated with (aseismic) serpentinization levels at the top of the lithospheric mantle. Therefore, co-seismic slip along crustal thrusts may only occur during large magnitude events, while for most of the inter-seismic cycle these thrusts remain locked, or slip aseismically. We further speculate that high magnitude earthquake's ruptures may only nucleate in the lithospheric mantle and then propagate into the crust across the serpentinized layers.
Petersen, Mark D.; Mueller, Charles S.; Moschetti, Morgan P.; Hoover, Susan M.; Rubinstein, Justin L.; Llenos, Andrea L.; Michael, Andrew J.; Ellsworth, William L.; McGarr, Arthur F.; Holland, Austin A.; Anderson, John G.
2015-01-01
The U.S. Geological Survey National Seismic Hazard Model for the conterminous United States was updated in 2014 to account for new methods, input models, and data necessary for assessing the seismic ground shaking hazard from natural (tectonic) earthquakes. The U.S. Geological Survey National Seismic Hazard Model project uses probabilistic seismic hazard analysis to quantify the rate of exceedance for earthquake ground shaking (ground motion). For the 2014 National Seismic Hazard Model assessment, the seismic hazard from potentially induced earthquakes was intentionally not considered because we had not determined how to properly treat these earthquakes for the seismic hazard analysis. The phrases “potentially induced” and “induced” are used interchangeably in this report, however it is acknowledged that this classification is based on circumstantial evidence and scientific judgment. For the 2014 National Seismic Hazard Model update, the potentially induced earthquakes were removed from the NSHM’s earthquake catalog, and the documentation states that we would consider alternative models for including induced seismicity in a future version of the National Seismic Hazard Model. As part of the process of incorporating induced seismicity into the seismic hazard model, we evaluate the sensitivity of the seismic hazard from induced seismicity to five parts of the hazard model: (1) the earthquake catalog, (2) earthquake rates, (3) earthquake locations, (4) earthquake Mmax (maximum magnitude), and (5) earthquake ground motions. We describe alternative input models for each of the five parts that represent differences in scientific opinions on induced seismicity characteristics. In this report, however, we do not weight these input models to come up with a preferred final model. Instead, we present a sensitivity study showing uniform seismic hazard maps obtained by applying the alternative input models for induced seismicity. The final model will be released after
Hawaiian Volcano Observatory Seismic Data, January to December 2007
Nakata, Jennifer S.; Okubo, Paul G.
2008-01-01
The U.S. Geological Survey (USGS), Hawaiian Volcano Observatory (HVO) summary presents seismic data gathered during the year. The seismic summary is offered without interpretation as a source of preliminary data and is complete in that most data for events of M≥1.5 are included. All latitude and longitude references in this report are stated in Old Hawaiian Datum. The HVO summaries have been published in various forms since 1956. Summaries prior to 1974 were issued quarterly, but cost, convenience of preparation and distribution, and the large quantities of data necessitated an annual publication, beginning with Summary 74 for the year 1974. Beginning in 2004, summaries are simply identified by the year, rather than by summary number. Summaries originally issued as administrative reports were republished in 2007 as Open-File Reports. All the summaries since 1956 are listed at http://geopubs.wr.usgs.gov/ (last accessed September 30, 2008). In January 1986, HVO adopted CUSP (California Institute of Technology USGS Seismic Processing). Summary 86 includes a description of the seismic instrumentation, calibration, and processing used in recent years. The present summary includes background information about the seismic network to provide the end user an understanding of the processing parameters and how the data were gathered. A report by Klein and Koyanagi (1980) tabulates instrumentation, calibration, and recording history of each seismic station in the network. It is designed as a reference for users of seismograms and phase data and includes and augments the information in the station table in this summary.
Seismic Sources for the Territory of Georgia
NASA Astrophysics Data System (ADS)
Tsereteli, N. S.; Varazanashvili, O.
2011-12-01
The southern Caucasus is an earthquake-prone region where devastating earthquakes have repeatedly caused significant loss of lives, infrastructure, and buildings. The high geodynamic activity of the region, expressed in both seismic and aseismic deformation, is conditioned by the still-ongoing convergence of lithospheric plates and the northward propagation of the Afro-Arabian continental block at a rate of several cm/year. The geometry of tectonic deformation in the region is largely determined by the wedge-shaped rigid Arabian block intensively indented into the relatively mobile Middle East-Caucasian region. Georgia is a partner in the ongoing regional project EMME, whose main objective is the calculation of earthquake hazard uniformly and to high standards. One approach used in the project is probabilistic seismic hazard assessment, for which the first required input is the definition of seismic source zones. Seismic sources can be either faults or area sources. Seismoactive structures of Georgia are identified mainly on the basis of the correlation between neotectonic structures of the region and earthquakes. The requirements of modern PSHA software regarding fault geometry are very demanding, and as our knowledge of active fault geometry is not sufficient, area sources were used. Seismic sources are defined as zones characterized by more or less uniform seismicity. Poor knowledge of the processes occurring deep within the Earth stems from the difficulty of direct measurement. From this point of view, reliable data obtained from earthquake fault-plane solutions are invaluable for understanding the current tectonic activity of the investigated area. There are two methods for the identification of seismic sources. The first is the seismotectonic approach, based on the identification of extensive homogeneous seismic sources (SS) with the definition of the probability of occurrence of the maximum earthquake Mmax. In the second method the identification of seismic sources
The improved broadband Real-Time Seismic Network in Romania
NASA Astrophysics Data System (ADS)
Neagoe, C.; Ionescu, C.
2009-04-01
Starting in 2002, the National Institute for Earth Physics (NIEP) has developed its real-time digital seismic network. This network consists of 96 seismic stations, of which 48 broadband and short-period stations and two seismic arrays transmit data in real time. The real-time seismic stations are equipped with Quanterra Q330 and K2 digitizers, broadband seismometers (STS2, CMG40T, CMG 3ESP, CMG3T), and strong-motion sensors (Kinemetrics EpiSensors, +/- 2g). The SeedLink and AntelopeTM (installed on MARMOT) program packages are used for real-time (RT) data acquisition and exchange. Communication from the digital seismic stations to the National Data Center in Bucharest is provided by five providers (GPRS, VPN, satellite communication, leased radio line, and internet), which also serve as back-up communication lines. The processing centre runs BRTT's AntelopeTM 4.10 data acquisition and processing software on two workstations for real-time processing and post-processing. The Antelope Real-Time System also provides automatic event detection, arrival picking, event location, and magnitude calculation, with graphical display and reporting in near-real time after a local or regional event occurs. A system to collect macroseismic information via the internet was also implemented at the data center, from which macroseismic intensity maps are generated. In the near future, SeisComP3 data acquisition and processing software will be installed on a workstation at the data center and will run in parallel with the Antelope software as a back-up. The present network will be expanded in the near future: in the first half of 2009, NIEP will install eight additional broadband stations on Romanian territory, which will also transmit to the data center in real time. The Romanian Seismic Network permanently exchanges real-time waveform data with IRIS, ORFEUS, and various European countries over the internet. In Romania, magnitude and location of an earthquake are now
NASA Astrophysics Data System (ADS)
Yin, A.; Yu, X.; Shen, Z.
2014-12-01
Although the seismically active North China basin has the most complete written records of pre-instrumentation earthquakes in the world, this information has not been fully utilized for assessing potential earthquake hazards of this densely populated region that hosts ~200 million people. In this study, we use the historical records to document the earthquake migration pattern and the existence of a 180-km seismic gap along the 600-km long right-slip Tangshan-Hejian-Cixian (THC) fault zone that cuts across the North China basin. The newly recognized seismic gap, which is centered at Tianjin with a population of 11 million people and ~120 km from Beijing (22 million people) and Tangshan (7 million people), has not been ruptured in the past 1000 years by M≥6 earthquakes. The seismic migration pattern in the past millennium suggests that the epicenters of major earthquakes have shifted towards this seismic gap along the THC fault, which implies that the 180-km gap could be the site of the next great earthquake with M≈7.6 if it is ruptured by a single event. Alternatively, the seismic gap may be explained by aseismic creeping or seismic strain transfer between active faults.
Time-lapse 3-D seismic imaging of shallow subsurface contaminant flow.
McKenna, J; Sherlock, D; Evans, B
2001-12-01
This paper presents a physical modelling study outlining a technique whereby buoyant contaminant flow within water-saturated unconsolidated sand was remotely monitored utilizing the time-lapse 3-D (TL3-D) seismic response. The controlled temperature and pressure conditions, along with the high level of acquisition repeatability attainable using sandbox physical models, allow the TL3-D seismic response to pore fluid movement to be distinguished from all other effects. TL3-D seismic techniques are currently being developed to monitor hydrocarbon reserves within producing reservoirs in an endeavour to improve overall recovery. However, in many ways, sandbox models under atmospheric conditions more accurately simulate the shallow subsurface than petroleum reservoirs. For this reason, perhaps the greatest application for analogue sandbox modelling is to improve our understanding of shallow groundwater and environmental flow mechanisms. Two fluid flow simulations were conducted whereby air and kerosene were injected into separate water-saturated unconsolidated sand models. In both experiments, a base 3-D seismic volume was recorded and compared with six later monitor surveys recorded while the injection program was conducted. Normal incidence amplitude and P-wave velocity information were extracted from the TL3-D seismic data to provide visualization of contaminant migration. Reflection amplitudes displayed the qualitative areal distribution of fluids when a suitable impedance contrast existed between pore fluids. TL3-D seismic reflection tomography can potentially monitor the change in areal distribution of fluid contaminants over time, indicating flow patterns. However, neither other research nor the current work has established a quantifiable relationship between fluid saturation and either normal reflection amplitudes or attenuation. Generally, different pore fluids will have unique seismic velocities due to differences in compressibility and density. The predictable
Seismic imaging of gas hydrate reservoir heterogeneities
NASA Astrophysics Data System (ADS)
Huang, Jun-Wei
Natural gas hydrate, a type of inclusion compound or clathrate, is composed of gas molecules trapped within a cage of water molecules. The presence of gas hydrate has been confirmed by core samples recovered from boreholes. Interest in the distribution of natural gas hydrate stems from its potential as a future energy source, its geohazard to drilling activities, and its possible impact on climate change. However, current geophysical investigations of gas hydrate reservoirs are still too limited to fully resolve the location and total amount of gas hydrate, owing to the complex nature of its distribution. The goal of this thesis is twofold: to model (1) heterogeneous gas hydrate reservoirs and (2) seismic wave propagation in the presence of heterogeneities, in order to address the fundamental questions of where gas hydrate occurs and how much is stored in the sediments. Seismic scattering studies predict that certain heterogeneity scales and velocity contrasts will generate strong scattering and wave-mode conversion. Vertical Seismic Profile (VSP) techniques can be used to calibrate the seismic characterization of gas hydrate expressions on surface seismograms. To further explore the potential of VSP in detecting the heterogeneities, a wave-equation-based approach for P- and S-wave separation is developed. Tests on synthetic data as well as applications to field data suggest alternative acquisition geometries for VSP to enable wave-mode separation. A new reservoir modeling technique based on random medium theory is developed to construct heterogeneous multi-variable models that mimic the heterogeneities of hydrate-bearing sediments at the level of detail provided by borehole logging data. Using this new technique, I modeled the density and P- and S-wave velocities in combination with a modified Biot-Gassmann theory and provided a first-order estimate of the in situ volume of gas hydrate near the Mallik 5L-38 borehole. Our results suggest a
NASA Astrophysics Data System (ADS)
Murru, M.; Falcone, G.; Taroni, M.; Console, R.
2017-12-01
In 2015 the Italian Department of Civil Protection started a project to upgrade the official Italian seismic hazard map (MPS04), inviting the Italian scientific community to participate in a joint effort for its realization. We participated by providing spatially variable, time-independent (Poisson) long-term annual occurrence rates of seismic events over the entire Italian territory, considering cells of 0.1°x0.1° from M4.5 up to M8.1 in magnitude bins of 0.1 units. Our final model was composed of two different models merged into one ensemble model, each with the same weight: the first was obtained by a smoothed seismicity approach, the second using the seismogenic faults. The spatially smoothed seismicity was obtained using the smoothing method introduced by Frankel (1995) applied to the historical and instrumental seismicity. In this approach we adopted a tapered Gutenberg-Richter relation with a b-value fixed to 1 and a corner magnitude estimated from the largest events in the catalogs. For each seismogenic fault provided by the Database of Individual Seismogenic Sources (DISS), we computed the annual rate (for each 0.1°x0.1° cell) in magnitude bins of 0.1 units, assuming that the seismic moments of the earthquakes generated by each fault are distributed according to the same tapered Gutenberg-Richter relation as in the smoothed seismicity model. The annual rate for the final model was determined in the following way: if a cell falls within one of the seismic sources, we merge, with equal weight, the rate determined from the seismic moments of the earthquakes generated by each fault and the rate from the smoothed seismicity model; if instead the cell falls outside any seismic source, we use the rate obtained from the spatially smoothed seismicity. Here we present the final results of our study to be used for the new Italian seismic hazard map.
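A sketch of how annual rates per 0.1-magnitude bin can be drawn from a tapered Gutenberg-Richter relation with b = 1 and a corner magnitude, the form used for both model components; the total rate above M4.5 and the corner magnitude below are placeholders, not the values calibrated in the study.

```python
# Sketch: annual occurrence rates in 0.1-magnitude bins from a tapered
# Gutenberg-Richter relation (Kagan-style taper in seismic moment), with
# b = 1 (beta = 2/3) and a corner magnitude m_corner. The total rate above
# M4.5 and m_corner are placeholders, not the study's calibrated values.
import numpy as np

def moment(m):                        # seismic moment in N*m
    return 10.0 ** (1.5 * m + 9.1)

def tapered_gr_bin_rates(rate_m45=1.0, m_min=4.5, m_max=8.1, m_corner=7.5, b=1.0):
    beta = 2.0 / 3.0 * b
    edges = np.round(np.arange(m_min, m_max + 0.05, 0.1), 1)
    M_t, M_c = moment(m_min), moment(m_corner)
    surv = (M_t / moment(edges)) ** beta * np.exp((M_t - moment(edges)) / M_c)
    return edges[:-1], rate_m45 * (surv[:-1] - surv[1:])   # rate per 0.1-magnitude bin
```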
Infrasonic component of volcano-seismic eruption tremor
NASA Astrophysics Data System (ADS)
Matoza, Robin S.; Fee, David
2014-03-01
Air-ground and ground-air elastic wave coupling are key processes in the rapidly developing field of seismoacoustics and are particularly relevant for volcanoes. During a sustained explosive volcanic eruption, it is typical to record a sustained broadband signal on seismometers, termed eruption tremor. Eruption tremor is usually attributed to a subsurface seismic source process, such as the upward migration of magma and gases through the shallow conduit and vent. However, it is now known that sustained explosive volcanic eruptions also generate powerful tremor signals in the atmosphere, termed infrasonic tremor. We investigate infrasonic tremor coupling down into the ground and its contribution to the observed seismic tremor. Our methodology builds on that proposed by Ichihara et al. (2012) and involves cross-correlation, coherence, and cross-phase spectra between waveforms from nearly collocated seismic and infrasonic sensors; we apply it to datasets from Mount St. Helens, Tungurahua, and Redoubt Volcanoes.
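The cross-correlation and coherence part of such an analysis can be sketched with scipy; segment length, sample rate and names are illustrative, and this is not the authors' full methodology, which also uses cross-phase spectra following Ichihara et al. (2012).

```python
# Sketch: magnitude-squared coherence and lag cross-correlation between a
# seismometer channel and a nearly collocated infrasound channel, used to test
# whether part of the recorded seismic eruption tremor is air-to-ground coupled
# infrasonic tremor. Sample rate and segment length are illustrative.
import numpy as np
from scipy.signal import coherence, correlate

def seismoacoustic_coupling(seis, infra, fs, nperseg=1024):
    f, coh = coherence(seis, infra, fs=fs, nperseg=nperseg)      # 0..1 per frequency
    xc = correlate(seis - seis.mean(), infra - infra.mean(), mode="full")
    lag = (np.argmax(np.abs(xc)) - (len(infra) - 1)) / fs        # lag of seis relative to infra, s
    return f, coh, lag
```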
NASA Astrophysics Data System (ADS)
Pilecki, Zenon; Isakow, Zbigniew; Czarny, Rafał; Pilecka, Elżbieta; Harba, Paulina; Barnaś, Maciej
2017-08-01
In this work, the capabilities of the Seismobile system for shallow subsurface imaging of transport routes, such as roads, railways, and airport runways, in different geological conditions are presented. The Seismobile system combines the advantages of seismic profiling using a landstreamer with georadar (GPR) profiling. It consists of up to four seismic measuring lines and a carriage with a suspended GPR antenna. Shallow subsurface recognition may be achieved over a maximum width of 10.5 m for a spacing of 3.5 m between the measurement lines, with the GPR measurement performed along the axis of the structure. Seismobile reduces measurement time, labour, and costs owing to its easy installation, remote data transmission from the geophones to the accompanying measuring modules, automated GPS-based positioning of the system, and a highly automated method of seismic wave excitation. In this paper, the results of field tests carried out in different geological conditions are presented, and the methodologies of acquisition, processing, and interpretation of the seismic and GPR measurements are described in detail. Seismograms and their spectra recorded by the Seismobile system were compared with those recorded by a Geometrics Geode seismograph. The seismic data processing and interpretation software allows 2D/3D models of P- and S-wave velocities to be obtained. The combined seismic and GPR results achieved sufficient imaging of the shallow subsurface to depths of more than a dozen metres, and the geophysical information correlated well with geological information from the boreholes. The results of the tests demonstrate the efficiency of the Seismobile system for seismic and GPR imaging of the shallow subsurface of transport routes under complex conditions.
Is the co-seismic slip distribution fractal?
NASA Astrophysics Data System (ADS)
Milliner, Christopher; Sammis, Charles; Allam, Amir; Dolan, James
2015-04-01
Co-seismic along-strike slip heterogeneity is widely observed for many surface-rupturing earthquakes, as revealed by field and high-resolution geodetic methods. However, this co-seismic slip variability is currently a poorly understood phenomenon. Key unanswered questions include: What are the characteristics and underlying causes of along-strike slip variability? Do the properties of slip variability change from fault to fault, along strike, or at different scales? We cross-correlate optical pre- and post-event air photos using the program COSI-Corr to measure the near-field surface deformation pattern of the 1992 Mw 7.3 Landers and 1999 Mw 7.1 Hector Mine earthquakes in high resolution. We produce the co-seismic slip profiles of both events from over 1,000 displacement measurements and observe consistent along-strike slip variability. Although the observed slip heterogeneity seems apparently complex and disordered, a spectral analysis reveals that the slip distributions are indeed self-affine fractal, i.e., slip exhibits a consistent degree of irregularity at all observable length scales, with a 'short memory', and is not random. We find a fractal dimension of 1.58 and 1.75 for the Landers and Hector Mine earthquakes, respectively, indicating that slip is more heterogeneous for the Hector Mine event. Fractal slip is consistent with both dynamic and quasi-static numerical simulations that use non-planar faults, which in turn cause heterogeneous along-strike stress, and we attribute the observed fractal slip to fault surfaces of fractal roughness. As fault surfaces are known to smooth over geologic time due to abrasional wear and fracturing, we also test whether the fractal properties of slip distributions alter between earthquakes from immature to mature fault systems. We will present results that test this hypothesis by using the optical image correlation technique to measure historic, co-seismic slip distributions of earthquakes from structurally mature, large
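The spectral test for self-affine slip can be sketched as follows: compute the power spectrum of the along-strike slip profile, fit a power-law decay P(k) ∝ k^(-β), and convert the exponent to a fractal dimension via D = (5 - β)/2, the standard relation for a 1-D self-affine trace. The detrending and the fitted wavenumber band below are illustrative choices, not necessarily those used in the study.

```python
# Sketch: estimate the fractal dimension of an along-strike slip profile from
# the slope of its power spectrum, P(k) ~ k**(-beta), using D = (5 - beta) / 2
# for a 1-D self-affine profile. Detrending and the fitted wavenumber band are
# illustrative choices, not necessarily those used in the study.
import numpy as np

def fractal_dimension(slip, dx):
    x = np.arange(len(slip))
    slip = slip - np.polyval(np.polyfit(x, slip, 1), x)   # remove linear trend
    spec = np.abs(np.fft.rfft(slip)) ** 2
    k = np.fft.rfftfreq(len(slip), d=dx)
    good = k > 0
    slope, _ = np.polyfit(np.log(k[good]), np.log(spec[good]), 1)
    beta = -slope
    return (5.0 - beta) / 2.0                              # ~1.5-1.8 for rough slip profiles
```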
Full waveform inversion of combined towed streamer and limited OBS seismic data: a theoretical study
NASA Astrophysics Data System (ADS)
Yang, Huachen; Zhang, Jianzhong
2018-06-01
In marine seismic oil exploration, full waveform inversion (FWI) of towed-streamer data is used to reconstruct velocity models. However, the FWI of towed-streamer data easily converges to a local minimum solution due to the lack of low-frequency content. In this paper, we propose a new FWI technique using towed-streamer data, its integrated data sets, and limited OBS data. Both the integrated towed-streamer seismic data and the OBS data have low-frequency components. Therefore, at early iterations in the new FWI technique, the OBS data combined with the integrated towed-streamer data sets reconstruct an appropriate background model, and the towed-streamer seismic data play a major role in later iterations to improve the resolution of the model. The new FWI technique is tested on numerical examples. The results show that when starting models are not accurate enough, the models inverted using the new FWI technique are superior to those inverted using conventional FWI.
The Pollino Seismic Sequence: Activated Graben Structures in a Seismic Gap
NASA Astrophysics Data System (ADS)
Rößler, Dirk; Passarelli, Luigi; Govoni, Aladino; Bindi, Dino; Cesca, Simone; Hainzl, Sebastian; Maccaferri, Francesco; Rivalta, Eleonora; Woith, Heiko; Dahm, Torsten
2015-04-01
The Mercure Basin (MB) and the Castrovillari Fault (CF) in the Pollino range (Southern Apennines, Italy) represent one of the most prominent seismic gaps in the Italian seismic catalogue, with no M>5.5 earthquakes during the last centuries. In historical times several swarm-like seismic sequences occurred in the area, including two intense swarms within the past two decades. The most energetic one started in 2010 and was still active in 2014. The seismicity culminated in autumn 2012 with a M=5 event on 25 October. The range hosts a number of opposing normal faults forming a graben-like structure. Their rheology and their interactions are unclear. Current debates include the potential of the MB and the CF to host large earthquakes and the style of deformation. Understanding the seismicity and the behaviour of the faults is necessary to assess the tectonics and the seismic hazard. The GFZ German Research Centre for Geosciences and INGV, Italy, have jointly monitored the ongoing seismicity using a small-aperture seismic array, integrated in a temporary seismic network. Based on this installation, we located more than 16,000 local earthquakes that occurred between November 2012 and September 2014. Here we investigate quantitatively all the phases of the seismic sequence starting from January 2010. Event locations, along with moment tensor inversions, spatially constrain the structures activated by the swarm and the migration pattern of the seismicity. The seismicity forms clusters concentrated within the southern part of the MB and along the Pollino Fault linking MB and CF. Most earthquakes are confined to the upper 10 km of the crust in an area of ~15x15 km2. However, sparse seismicity at depths between 15 and 20 km and moderate seismicity further north with deepening hypocenters also exist. In contrast, the CF appears aseismic; only the northern part has experienced micro-seismicity. The spatial distribution is however more complex than the major tectonic structures
First approximations in avalanche model validations using seismic information
NASA Astrophysics Data System (ADS)
Roig Lafon, Pere; Suriñach, Emma; Bartelt, Perry; Pérez-Guillén, Cristina; Tapia, Mar; Sovilla, Betty
2017-04-01
Avalanche dynamics modelling is an essential tool for snow hazard management. Scenario-based numerical modelling provides quantitative arguments for decision-making. The software tool RAMMS (WSL Institute for Snow and Avalanche Research SLF) is one such tool, often used by government authorities and geotechnical offices. As avalanche models improve, the quality of the numerical results will depend increasingly on user experience in specifying the input (e.g. release and entrainment volumes, secondary releases, snow temperature and quality). New model developments must continue to be validated using data from real events to improve performance and reliability. The avalanche group of the University of Barcelona (RISKNAT - UB) has studied the seismic signals generated by avalanches since 1994. Presently, the group manages the seismic installation at SLF's Vallée de la Sionne experimental site (VDLS). At VDLS the recorded seismic signals can be correlated with other avalanche measurement techniques, including both advanced remote sensing methods (radars, videogrammetry) and obstacle-based sensors (pressure, capacitance, optical sender-reflector barriers). This comparison between different measurement techniques allows the group to address the question of whether seismic analysis can be used alone, on additional avalanche tracks, to gain insight into and validate numerical avalanche dynamics models in different terrain conditions. In this study, we aim to add the seismic data as an external record of the phenomena, able to validate RAMMS models. Seismic sensors are considerably easier and cheaper to install than other physical measuring tools, and are able to record data from the phenomena in all atmospheric conditions (e.g. bad weather, low light, or freezing conditions that make photography and other kinds of sensors unusable). With seismic signals, we record the temporal evolution of the inner and denser parts of the avalanche. We are able to recognize the approximate position
NASA Astrophysics Data System (ADS)
Neto, F. A. P.; Franca, G.
2014-12-01
The purpose of this work was to study and document Angola's natural seismicity and to establish the first seismic database, to facilitate consultation and searching for information on seismic activity in the country. The study was based on reports produced by the National Institute of Meteorology and Geophysics (INAMET) from 1968 to 2014, with emphasis on the work of Moreira (1968), which defined six seismogenic zones from macroseismic data, the most notable being the Sá da Bandeira (Lubango)-Chibemba-Oncócua-Iona zone. This is the most important seismic zone of Angola, covering the epicentral Quihita and Iona regions. It is geologically characterized by a transcontinental tectono-magmatic structure activated during the Mesozoic, with the emplacement of a wide variety of intrusive rocks of ultrabasic-alkaline, basic and alkaline composition, kimberlites and carbonatites, strongly marked by intense tectonism and cut by several faults and fractures (locally called the corredor de Lucapa). The earthquake of May 9, 1948 reached intensity VI on the Mercalli-Sieberg scale (MCS) in the locality of Quihita, and in the Iona seismic activity of January 15, 1964 the main shock reached intensity VI-VII. Although they do not have significant seismicity rates, the other five zones cannot be neglected; they are: Cassongue-Ganda-Massano de Amorim; Lola-Quilengues-Caluquembe; the Gago Coutinho zone; Cuima-Cachingues-Cambândua; and the Upper Zambezi zone. We also analyzed technical reports on the seismicity of the middle Kwanza region produced by Hidroproekt (GAMEK), as well as international seismic bulletins of the International Seismological Centre (ISC) and the United States Geological Survey (USGS), and these data served for instrumental location of the epicenters. All compiled information made possible the creation of the first seismic database for Angola, the preparation of the seismicity map, the reconfirmation of the main seismic zones defined by Moreira (1968) and the identification of a new seismic
Seismic isolation of buildings using composite foundations based on metamaterials
NASA Astrophysics Data System (ADS)
Casablanca, O.; Ventura, G.; Garescı, F.; Azzerboni, B.; Chiaia, B.; Chiappini, M.; Finocchio, G.
2018-05-01
Metamaterials can be engineered to interact with waves in entirely new ways, finding application on the nanoscale in various fields such as optics and acoustics. In addition, acoustic metamaterials can be used in large-scale experiments for filtering and manipulating seismic waves (seismic metamaterials). Here, we propose seismic isolation based on a device that combines some properties of seismic metamaterials (e.g., periodic mass-in-mass systems) with those of a standard foundation positioned right below the building for isolation purposes. The concepts on which this solution is based are local resonance and a dual-stiffness structure that preserves large (small) rigidity for compression (shear) effects. In other words, this paper introduces a different approach to seismic isolation by using certain principles of seismic metamaterials. The experimental demonstrator tested on the laboratory scale exhibits a spectral bandgap that begins at 4.5 Hz. Within the bandgap, it filters more than 50% of the seismic energy via an internal dissipation process. Our results open a path toward the seismic resilience of buildings and critical infrastructure to shear seismic waves, achieving higher efficiency compared to traditional seismic insulators and passive energy-dissipation systems.
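To illustrate the local-resonance principle invoked above (a minimal sketch, not the laboratory demonstrator described in the abstract), the snippet below computes the frequency-dependent effective mass of a one-dimensional mass-in-mass unit cell; frequencies where the effective mass turns negative cannot propagate, which is how a local-resonance band gap opens. All parameter values are illustrative assumptions.

```python
import numpy as np

# Illustrative 1-D mass-in-mass unit cell: outer mass m1 carries an internal
# resonator (mass m2, spring k2).  Values are assumptions, not the parameters
# of the laboratory demonstrator described in the abstract.
m1, m2 = 100.0, 50.0           # kg
k2 = 50.0e3                    # N/m, internal spring
w0 = np.sqrt(k2 / m2)          # internal resonance (rad/s)
f_res = w0 / (2 * np.pi)

f = np.linspace(0.5, 15.0, 2000)          # frequency axis (Hz)
w = 2 * np.pi * f
# Effective dynamic mass of the unit cell (standard mass-in-mass result):
# m_eff(w) = m1 + m2 * w0^2 / (w0^2 - w^2)
m_eff = m1 + m2 * w0**2 / (w0**2 - w**2)

# Frequencies where m_eff < 0 cannot propagate in the periodic chain,
# i.e. they fall inside the local-resonance band gap.
gap = f[m_eff < 0]
print(f"internal resonance: {f_res:.2f} Hz")
print(f"band gap (approx.): {gap.min():.2f} - {gap.max():.2f} Hz")
```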
Patterned basal seismicity shows sub-ice stream bedforms
NASA Astrophysics Data System (ADS)
Barcheck, C. G.; Tulaczyk, S. M.; Schwartz, S. Y.
2017-12-01
Patterns in seismicity emanating from the bottom of fast-moving ice streams and glaciers may indicate localized patches of higher basal resistance, sometimes called 'sticky spots', or otherwise varying basal properties. These seismogenic basal areas resist an unknown portion of the total driving stress of the Whillans Ice Plain (WIP) in West Antarctica, but may play an important role in the WIP stick-slip cycle and ice stream slowdown. To better understand the mechanism and importance of basal seismicity beneath the WIP, we analyze seismic data collected by a small-aperture (<3 km) network of 8 surface and 5 borehole seismometers installed in the main central sticky spot of the WIP. We use a network beamforming technique to detect and roughly locate thousands of small (magnitude < 0) local basal micro-earthquakes in Dec 2014, and we compare the resulting map of seismicity to ice bottom depth measured by airborne radar. The number of basal earthquakes per area within the network is spatially heterogeneous, but a pattern of two 400-m-wide streaks of high seismicity rates is evident, with >50-500 earthquakes detected per 50 x 50 m grid cell in 2 weeks. These seismically active streaks are elongated approximately in the ice flow direction with a spacing of 750 m. Independent airborne radar measurements of ice bottom depth from Jan 2013 show a low-amplitude (~5 m) undulation in the basal topography superposed on a regional gradient in ice bottom depth. The flow-perpendicular wavelength of these low-amplitude undulations is comparable to the spacing of the high-seismicity bands, and the streaks of high seismicity intersect local lows in the undulating basal topography. We interpret these seismic and radar observations as showing seismically active sub-ice stream bedforms that are low amplitude and elongated in the direction of ice flow, comparable to the morphology of mega-scale glacial lineations (MSGLs), with high basal seismicity rates observed in the MSGL troughs
Integration between well logging and seismic reflection techniques for structural a
NASA Astrophysics Data System (ADS)
Mohamed, Adel K.; Ghazala, Hosni H.; Mohamed, Lamees
2016-12-01
Abu El Gharadig basin is located in the northern part of the Western Desert, Egypt. Geophysical data in the form of thirty 3D seismic lines and well logging data from five wells have been analyzed in the BED-1 oil field, located in the northwestern part of Abu El Gharadig basin in the Western Desert of Egypt. The reflection sections have been used to shed more light on the tectonic setting of the Late Jurassic-Early Cretaceous rocks, while the well logging data have been analyzed to delineate the petrophysical characteristics of the two main reservoirs, the Bahariya and Kharita Formations. The constructed subsurface geologic cross sections, seismic sections, and isochronous reflection maps indicate that the area is structurally controlled by tectonic trends affecting the current shape of Abu El Gharadig basin. Different types of faults are well represented in the area, particularly normal faults. The analysis of the average and interval velocities versus depth shows that they are affected by facies changes and/or fluid content. On the other hand, the derived petrophysical parameters of the Bahariya and Kharita Formations vary from one well to another, and they have been affected by the gas effect and/or the presence of organic matter, complex lithology, clay content of dispersed habit, and the pore volume.
NASA Astrophysics Data System (ADS)
Speece, M. A.; Link, C. A.; Stickney, M.
2011-12-01
techniques including linear noise suppression of the air wave and ground roll, refraction statics, and prestack migration. Reprocessing of these data using state-of-the-art seismic reflection processing techniques will provide a detailed picture of the stratigraphy and tectonic framework for this region. Moreover, extended correlations of the Vibroseis records to Moho depths might reveal new insights on crustal thickness and provide a framework for understanding crustal thickening during the Laramide Orogeny as well as later Cenozoic extension.
Double hashing technique in closed hashing search process
NASA Astrophysics Data System (ADS)
Rahim, Robbi; Zulkarnain, Iskandar; Jaya, Hendra
2017-09-01
The search process is used in many activities, both online and offline, and many algorithms can be used to perform it; one of them is the hash search algorithm. In this study the hash search uses a double hashing technique, in which the data are placed into a fixed-length table and then searched. The results of this study indicate that searching with the double hashing technique is faster than the usual search techniques. The approach divides the values between a main table and an overflow table, so that the search process is expected to be faster than when the data are stacked in a single table, and data collisions can be avoided.
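As a concrete sketch of the double hashing idea described above (an illustration, not the authors' implementation), the snippet below inserts and searches integer keys in a fixed-length table, where a second hash function gives each key its own probe step so that colliding keys follow different probe sequences.

```python
class DoubleHashTable:
    """Minimal closed-hashing (open addressing) table with double hashing."""

    def __init__(self, size=11):          # size is best chosen prime
        self.size = size
        self.slots = [None] * size

    def _h1(self, key):                   # primary hash: home slot
        return key % self.size

    def _h2(self, key):                   # secondary hash: probe step, never 0
        return 1 + (key % (self.size - 1))

    def insert(self, key):
        for i in range(self.size):
            idx = (self._h1(key) + i * self._h2(key)) % self.size
            if self.slots[idx] is None or self.slots[idx] == key:
                self.slots[idx] = key
                return idx
        raise RuntimeError("table full")

    def search(self, key):
        for i in range(self.size):
            idx = (self._h1(key) + i * self._h2(key)) % self.size
            if self.slots[idx] is None:   # empty slot: key cannot be present
                return -1
            if self.slots[idx] == key:
                return idx
        return -1


table = DoubleHashTable()
for k in (26, 54, 94, 17, 31, 77, 44):
    table.insert(k)
print(table.search(31), table.search(99))   # slot index, then -1 (not found)
```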
Ball, M.M.; Soderberg, N.K.
1989-01-01
In August 1979, the U.S. Geological Survey (USGS), aboard the M/V SEISMIC EXPLORER of Seismic Explorations International (SEI), ran 17 lines (1,270 km) of multichannel seismic-reflection profiles on the western Florida Shelf. The main features of the SEI system were (1) a digital recorder with an instantaneous-floating-point-gain constant of 24 dB, (2) a 64-channel hydrophone streamer, 3,200 m long, and (3) a 21-airgun array that had a total volume of 2,000 in³ and a pressure of 2,000 psi. Sampling interval was …; the distance from the array to the center of the farthest phone group was 3,338 m and to the nearest phone group, 188 m. Shot points were 50 m apart to obtain a 32-fold stack. Navigation was by an integrated satellite/Loran/doppler-sonar system. The SEI data were processed by Geophysical Data Processing Center, Inc. of Houston, Texas. Processing procedures were standard with the following exceptions: (1) a deringing deconvolution with a 128-ms operator length was done prior to stacking; (2) a time-variant predictive deconvolution with a filter operator length of 100 ms and automatic picking of the second zero-crossing was applied after stacking to further suppress multiple energy; (3) velocity analyses were performed every 3 km, using a technique that included the determination and consideration of both the amount and direction of apparent dip; (4) automatic gain ranging using a 750-ms window was applied pre- and post-stack; (5) lines affected by the sea floor's angle of slope were deconvolved again before stacking, and time-variant filter parameters were adjusted to follow the sea-floor geometry. The data taken with the 3,200-m streamer and 2,000-in³ airgun array aboard M/V SEISMIC EXPLORER (Arabic numerals) are vastly superior to those obtained by R/V GYRE using a much smaller streamer and source (Roman numerals). The former consistently show coherent primary events from within the units underlying the Mesozoic section on the western Florida Shelf, while the latter tend to do
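Step (4) above, automatic gain ranging over a 750-ms window, amounts to dividing each sample by a local RMS amplitude estimate. The sketch below is a generic sliding-window AGC of that kind; the 4-ms sample interval and the synthetic trace are assumptions for illustration only, not the contractor's actual parameters.

```python
import numpy as np

def agc(trace, window_s=0.75, dt=0.004, eps=1e-12):
    """Automatic gain control: scale each sample by the RMS amplitude
    in a centred sliding window (window_s seconds long)."""
    n_win = max(1, int(round(window_s / dt)))
    power = np.convolve(trace**2, np.ones(n_win) / n_win, mode="same")
    rms = np.sqrt(power)
    return trace / (rms + eps)

# Synthetic trace: a decaying wavelet train, then AGC to balance amplitudes.
t = np.arange(0, 4.0, 0.004)
trace = np.exp(-1.5 * t) * np.sin(2 * np.pi * 30 * t)
balanced = agc(trace)
print(trace[:5].round(4), balanced[:5].round(4))
```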
MyShake: Building a smartphone seismic network
NASA Astrophysics Data System (ADS)
Kong, Q.; Allen, R. M.; Schreier, L.
2014-12-01
We are in the process of building up a smartphone seismic network. In order to build this network, we did shake table tests to evaluate the performance of smartphones as seismic recording instruments. We also conducted noise floor tests to find the minimum earthquake signal we can record using smartphones. We added phone noise to strong motion data from past earthquakes and used these as an analogue dataset to test algorithms and to understand the differences between using the smartphone network and the traditional seismic network. We also built a prototype system to trigger the smartphones from our server to record signals, which can be sent back to the server in near real time. The phones can also be triggered by our algorithm running locally on the phone; if an earthquake occurs and triggers the phones, the signals recorded by the phones are sent back to the server. We expect to turn the prototype system into a real smartphone seismic network that works as a supplementary network to the existing traditional seismic networks.
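The abstract does not specify the on-phone trigger algorithm; as a generic illustration of how such a local trigger can work, the sketch below applies a classic STA/LTA ratio to a synthetic acceleration record. The sampling rate, window lengths and threshold are all assumptions.

```python
import numpy as np

def sta_lta(x, fs, sta_s=1.0, lta_s=10.0):
    """Classic STA/LTA characteristic function on squared amplitudes."""
    n_sta = int(sta_s * fs)
    n_lta = int(lta_s * fs)
    energy = x.astype(float) ** 2
    sta = np.convolve(energy, np.ones(n_sta) / n_sta, mode="same")
    lta = np.convolve(energy, np.ones(n_lta) / n_lta, mode="same")
    return sta / (lta + 1e-12)

# Synthetic record: background noise plus a burst of stronger shaking.
fs = 25.0                                # Hz, an assumed phone sampling rate
rng = np.random.default_rng(0)
x = 0.01 * rng.standard_normal(int(120 * fs))
x[1500:1700] += 0.2 * rng.standard_normal(200)   # "earthquake" at t = 60 s

ratio = sta_lta(x, fs)
trigger_on = np.nonzero(ratio > 4.0)[0]  # assumed trigger threshold
if trigger_on.size:
    print(f"trigger at t = {trigger_on[0] / fs:.1f} s")
```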
Induced Seismicity Potential of Energy Technologies
NASA Astrophysics Data System (ADS)
Hitzman, Murray
2013-03-01
Earthquakes attributable to human activities, "induced seismic events", have received heightened public attention in the United States over the past several years. At the request of the U.S. Congress and the Department of Energy, the National Research Council assembled a committee of experts to examine the scale, scope, and consequences of seismicity induced during fluid injection and withdrawal associated with geothermal energy development, oil and gas development, and carbon capture and storage (CCS). The committee's report, publicly released in June 2012, indicates that induced seismicity associated with fluid injection or withdrawal is caused in most cases by changes in pore fluid pressure and/or changes in stress in the subsurface in the presence of faults with specific properties and orientations and a critical state of stress in the rocks. The factor that appears to have the most direct consequence for induced seismicity is the net fluid balance (total balance of fluid introduced into or removed from the subsurface). Energy technology projects that are designed to maintain a balance between the amount of fluid being injected and withdrawn, such as most oil and gas development projects, appear to produce fewer seismic events than projects that do not maintain fluid balance. Major findings from the study include: (1) as presently implemented, the process of hydraulic fracturing for shale gas recovery does not pose a high risk for inducing felt seismic events; (2) injection for disposal of waste water derived from energy technologies does pose some risk for induced seismicity, but very few events have been documented over the past several decades relative to the large number of disposal wells in operation; and (3) CCS, due to the large net volumes of injected fluids suggested for future large-scale carbon storage projects, may have potential for inducing larger seismic events.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richard D. Miller; Abdelmoneam E. Raef; Alan P. Byrnes
2007-06-30
The objective of this research project was to acquire, process, and interpret multiple high-resolution 3-D compressional wave and 2-D, 2-C shear wave seismic data in the hope of observing changes in fluid characteristics in an oil field before, during, and after the miscible carbon dioxide (CO₂) flood that began around December 1, 2003, as part of the DOE-sponsored Class Revisit Project (DOE No. DE-AC26-00BC15124). Unique and key to this imaging activity are the high-resolution nature of the seismic data, the minimal deployment design, and the temporal sampling throughout the flood. The 900-m-deep test reservoir is located in central Kansas oomoldic limestones of the Lansing-Kansas City Group, deposited on a shallow marine shelf in Pennsylvanian time. After 30 months of seismic monitoring, one baseline and eight monitor surveys clearly detected changes that appear consistent with movement of CO₂ as modeled with fluid simulators and observed in production data. Attribute analysis was a very useful tool for enhancing changes in seismic character that were present, but difficult to interpret, on time-amplitude slices. Lessons learned from and tools/techniques developed during this project will allow high-resolution seismic imaging to be routinely applied to many CO₂ injection programs in a large percentage of shallow carbonate oil fields in the midcontinent.
Self-induced seismicity due to fluid circulation along faults
NASA Astrophysics Data System (ADS)
Aochi, Hideo; Poisson, Blanche; Toussaint, Renaud; Rachez, Xavier; Schmittbuhl, Jean
2014-03-01
In this paper, we develop a system of equations describing fluid migration, fault rheology, fault thickness evolution and shear rupture during a seismic cycle, triggered either by tectonic loading or by fluid injection. Assuming that the phenomena predominantly take place on a single fault described as a finite permeable zone of variable width, we are able to project the equations within the volumetric fault core onto the 2-D fault interface. On the basis of this 'fault lubrication approximation', we simulate the evolution of seismicity when fluid is injected at one point along the fault, to model induced seismicity during an injection test in a borehole that intercepts the fault. We perform several parametric studies to understand the basic behaviour of the system. Fluid transmissivity and fault rheology are key elements. The simulated seismicity generally tends to evolve rapidly after triggering, independently of the injection history, and ends when the stationary path of fluid flow is established at the outer boundary of the model. This self-induced seismicity takes place in the case where shear rupturing on a planar fault becomes dominant over the fluid migration process. On the contrary, if healing processes take place, so that the fluid mass is trapped along the fault, rupturing occurs continuously during the injection period. Seismicity and fluid migration are strongly influenced by the injection rate and the heterogeneity.
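As a highly simplified sketch of the pressure-diffusion ingredient of such models (not the authors' coupled fault-lubrication formulation), the snippet below diffuses a pore-pressure perturbation on a 2-D fault plane from a point injection and flags the patches where it exceeds an assumed triggering threshold; diffusivity, injection rate and threshold are invented for illustration.

```python
import numpy as np

# Explicit finite-difference pore-pressure diffusion on a 2-D fault plane.
# All numbers (diffusivity, injection rate, threshold) are illustrative.
nx = nz = 101
dx = 10.0                     # m
D = 0.1                       # m^2/s hydraulic diffusivity
dt = 0.2 * dx**2 / (4 * D)    # comfortably below the explicit stability limit
p = np.zeros((nx, nz))        # pressure perturbation (MPa)
src = (nx // 2, nz // 2)      # injection point on the fault

for step in range(10000):
    lap = (np.roll(p, 1, 0) + np.roll(p, -1, 0) +
           np.roll(p, 1, 1) + np.roll(p, -1, 1) - 4 * p) / dx**2
    p += dt * D * lap
    p[src] += 1e-3            # constant-rate injection (MPa per step)

critical = p > 0.05           # assumed triggering threshold (MPa)
print(f"fault area above threshold: {critical.sum() * dx * dx / 1e6:.3f} km^2")
```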
NASA Astrophysics Data System (ADS)
Farrell, M. E.; Russo, R. M.
2013-12-01
The installation of Earthscope Transportable Array-style geophysical observatories in Chile expands open-data seismic recording capabilities in the southern hemisphere by nearly 30%, and has nearly tripled the number of seismic stations providing freely available data in southern South America. Through the use of collocated seismic and atmospheric sensors at these stations we are able to analyze how local atmospheric conditions generate seismic noise, which can degrade data in seismic frequency bands at stations in the 'roaring forties' (S latitudes). Seismic vaults that are climate-controlled and insulated from the local environment are now employed throughout the world in an attempt to isolate seismometers from as many noise sources as possible. However, this is an expensive solution that is neither practical nor possible for all seismic deployments; moreover, the increasing number and scope of temporary seismic deployments has resulted in the collection and archiving of terabytes of seismic data that are affected to some degree by natural seismic noise sources such as wind and atmospheric pressure changes. Changing air pressure can result in a depression and subsequent rebound of Earth's surface - which generates low-frequency noise in seismic frequency bands - and even moderate winds can apply enough force to ground-coupled structures or to the surface above the seismometers themselves, resulting in significant noise. The 10 stations of the permanent Geophysical Reporting Observatories (GRO Chile), jointly installed during 2011-12 by IRIS and the Chilean Servicio Sismológico, include instrumentation in addition to the standard three seismic components. These stations, spaced approximately 300 km apart along the length of the country, continuously record a variety of atmospheric data including infrasound, air pressure, wind speed, and wind direction. The collocated seismic and atmospheric sensors at each station allow us to analyze both datasets together, to
NASA Astrophysics Data System (ADS)
Asakawa, Eiichi; Murakami, Fumitoshi; Tsukahara, Hitoshi; Saito, Shutaro; Lee, Sangkyun; Tara, Kenji; Kato, Masafumi; Jamali Hondori, Ehsan; Sumi, Tomonori; Kadoshima, Kazuyuki; Kose, Masami
2017-04-01
Within the EEZ of Japan, numerous surveys exploring ocean floor resources have been conducted. The exploration targets are gas hydrates, mineral resources (manganese, cobalt or rare earth) and especially seafloor massive sulphide (SMS) deposits. These resources exist in shallow subsurface areas in deep waters (>1500 m). For seismic exploration, very high resolution images are required. These cannot be effectively obtained with conventional marine seismic techniques. Therefore we have been developing autonomous seismic survey systems which record the data close to the seafloor to preserve high-frequency seismic energy. A very high sampling rate (10 kHz) and highly accurate synchronization between the recording systems and the shot times are necessary. We adopted a Cs-based atomic clock, considering its power consumption. At first, we developed a Vertical Cable Seismic (VCS) system that uses hydrophone arrays moored vertically from the ocean bottom to record close to the target area. This system has been successfully applied to SMS exploration; specifically, it was fixed over known sites to assess the amount of reserves with the resultant 3D volume. Based on the success of VCS, we modified the VCS system for use as a more efficient deep-tow seismic survey system. Although there are other examples of deep-tow seismic systems, signal transmission cables present challenges in deep waters. We use our autonomous recording system to avoid these problems. Combining it with a high-frequency piezoelectric source (Sub-Bottom Profiler, SBP) that shoots automatically at a constant interval, we achieve high-resolution deep-tow seismic acquisition without a data transmission/power cable to the vessel. Although the data cannot be monitored in real time, the towing system becomes very simple. We have carried out a survey trial, which showed the system's utility as a high-resolution deep-tow seismic survey system. Furthermore, the frequency ranges of the deep-towed source (SBP) and the surface-towed sparker are 700-2300 Hz and 10-200 Hz, respectively.
New Possibilities In Assessing Time-dependent Seismic Risk
NASA Astrophysics Data System (ADS)
Kossobokov, V.
A novel understanding of the seismic occurrence process in terms of the dynamics of a hierarchical system of blocks-and-faults implies the necessity of new approaches to seismic risk assessment, which would allow for the evident heterogeneity of the seismic distribution in space and time. Spatial, apparently fractal, patterns of the seismic distribution should be treated appropriately in the estimation of seismic hazard; otherwise the result could be significantly over- or underestimated. The patterns are clearly associated with tectonic movement, whose traces, accumulated over time-scales of tens of thousands of years or longer, provide geographic, geologic, gravity, and magnetic evidence of the intensity of the driving forces, their directivity and their dating. This evidence, term-less in the sense of a human lifetime, both clear and masked, requires analysis involving pattern recognition and interpretation before it is used in favor of a conclusion about present-day seismic activity. Moreover, the existing reproducible intermediate-term medium-range earthquake prediction algorithms that have passed statistical significance testing in forward application complement a knowledgeable estimation of the temporal deviation of seismic hazard in a given area from a constant. Bringing together the two estimations and convolving them with a given distribution of valuables of different kinds, e.g. population, industry, economy, etc., finalizes an estimation of the seismic risk distribution.
Assessing the seismic risk potential of South America
Jaiswal, Kishor; Petersen, Mark D.; Harmsen, Stephen; Smoczyk, Gregory M.
2016-01-01
We present here a simplified approach to quantifying regional seismic risk. The seismic risk for a given region can be inferred in terms of the average annual loss (AAL), which represents the long-term expected value of earthquake losses in any one year caused by the long-term seismic hazard. AAL is commonly measured in terms of earthquake shaking-induced deaths, direct economic impacts, or indirect losses due to loss of functionality. In the context of the South American subcontinent, the analysis makes use of readily available public data on seismicity, population exposure, and the hazard and vulnerability models for the region. The seismic hazard model was derived using available seismic catalogs, fault databases, and hazard methodologies that are analogous to the U.S. Geological Survey's national seismic hazard mapping process. The Prompt Assessment of Global Earthquakes for Response (PAGER) system's direct empirical vulnerability functions in terms of fatality and economic impact were used for performing exposure and risk analyses. The broad findings presented and the risk maps produced herein are preliminary, yet they do offer important insights into the underlying zones of high and low seismic risk in the South American subcontinent. A more detailed analysis of risk may be warranted by engaging local experts, especially in some of the high-risk zones identified through the present investigation.
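A minimal numerical illustration of the AAL concept described above (not the USGS/PAGER implementation): given annual rates of shaking scenarios at a site and a vulnerability function converting shaking into expected loss, AAL is the rate-weighted sum of the per-scenario losses. The rates, exposure and vulnerability parameters below are invented for illustration.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical hazard: annual rates of occurrence for a set of shaking levels
# (PGA in g) at one site, and an exposed value (e.g. USD) at that site.
pga = np.array([0.05, 0.1, 0.2, 0.3, 0.5])              # shaking scenarios
annual_rate = np.array([0.5, 0.2, 0.05, 0.01, 0.002])   # events/yr per level
exposed_value = 1.0e8                                    # USD

def mean_damage_ratio(pga, theta=0.35, beta=0.6):
    """Illustrative lognormal vulnerability curve: fraction of value lost."""
    return norm.cdf(np.log(pga / theta) / beta)

# Average annual loss: rate-weighted expected loss over all scenarios.
aal = np.sum(annual_rate * mean_damage_ratio(pga) * exposed_value)
print(f"AAL = {aal:,.0f} USD/yr")
```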
Expected Seismicity and the Seismic Noise Environment of Europa
NASA Astrophysics Data System (ADS)
Panning, Mark P.; Stähler, Simon C.; Huang, Hsin-Hua; Vance, Steven D.; Kedar, Sharon; Tsai, Victor C.; Pike, William T.; Lorenz, Ralph D.
2018-01-01
Seismic data will be a vital geophysical constraint on the internal structure of Europa if we land instruments on the surface. Quantifying expected seismic activity on Europa, both in terms of large, recognizable signals and ambient background noise, is important for understanding the dynamics of the moon, as well as for the interpretation of potential future data. Seismic energy sources will likely include cracking in the ice shell and turbulent motion in the oceans. We define a range of models of seismic activity in Europa's ice shell by assuming each model follows a Gutenberg-Richter relationship with varying parameters. A range of cumulative seismic moment release between 10^16 and 10^18 Nm/yr is defined by scaling tidal dissipation energy to tectonic events on the Earth's moon. Random catalogs are generated and used to create synthetic continuous noise records through numerical wave propagation in thermodynamically self-consistent models of the interior structure of Europa. Spectral characteristics of the noise are calculated by determining probabilistic power spectral densities of the synthetic records. While the range of seismicity models predicts noise levels that vary by 80 dB, we show that most noise estimates are below the self-noise floor of high-frequency geophones but may be recorded by more sensitive instruments. The largest expected signals exceed background noise by ~50 dB. Noise records may allow for constraints on interior structure through autocorrelation. Models of seismic noise generated by pressure variations at the base of the ice shell due to turbulent motions in the subsurface ocean may also generate observable seismic noise.
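A sketch of the catalog-generation step described above, under simplifying assumptions: magnitudes are drawn from a truncated Gutenberg-Richter distribution and converted to moment with the standard relation log10 M0 = 1.5 Mw + 9.1, and the catalog is grown until its cumulative annual moment matches a target budget. The b-value, magnitude bounds and budget are illustrative choices, not the paper's values.

```python
import numpy as np

rng = np.random.default_rng(42)
b = 1.0                      # assumed Gutenberg-Richter b-value
m_min, m_max = 0.0, 5.0      # catalog magnitude bounds
target_moment = 1.0e17       # Nm/yr, within the range quoted above

def sample_magnitudes(n):
    """Draw magnitudes from a truncated Gutenberg-Richter distribution."""
    u = rng.random(n)
    return m_min - np.log10(1 - u * (1 - 10 ** (-b * (m_max - m_min)))) / b

def moment(mw):
    """Seismic moment in Nm from moment magnitude."""
    return 10 ** (1.5 * np.asarray(mw) + 9.1)

# Grow the annual catalog until its cumulative moment matches the budget.
mags = np.empty(0)
while moment(mags).sum() < target_moment:
    mags = np.append(mags, sample_magnitudes(1000))
print(f"{mags.size} events/yr, cumulative moment {moment(mags).sum():.2e} Nm")
```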
RSEIS and RFOC: Seismic Analysis in R
NASA Astrophysics Data System (ADS)
Lees, J. M.
2015-12-01
Open software is essential for reproducible scientific exchange. R packages provide a platform for the development of seismological investigation software that can be properly documented and traced for data processing. A suite of R packages designed for a wide range of seismic analyses is currently available in the free software platform called R. R is a software platform based on the S language developed at Bell Labs decades ago. Routines in R can be run as standalone function calls or developed in object-oriented mode. R comes with a base set of routines and thousands of user-developed packages. The packages developed at UNC include subroutines and interactive codes for processing seismic data, analyzing geographic information (GIS) and inverting data involved in a variety of geophysical applications. Packages related to seismic analysis currently available on CRAN (the Comprehensive R Archive Network, http://www.r-project.org/) are RSEIS, Rquake, GEOmap, RFOC, zoeppritz, RTOMO, geophys, Rwave, PEIP, hht, and rFDSN. These include signal processing, data management, mapping, earthquake location, deconvolution, focal mechanisms, wavelet transforms, Hilbert-Huang transforms, tomographic inversion, and Mogi deformation, among other useful functionality. All software in R packages is required to have detailed documentation, making the exchange and modification of existing software easy. In this presentation, I will focus on the packages RSEIS and RFOC, showing examples from a variety of seismic analyses. The R approach has similarities to the popular (and expensive) MATLAB platform, although R is open source and free to download.
Romanian Data Center: A modern way for seismic monitoring
NASA Astrophysics Data System (ADS)
Neagoe, Cristian; Marius Manea, Liviu; Ionescu, Constantin
2014-05-01
The seismic monitoring of Romania is mainly performed by the National Institute for Earth Physics (NIEP), which operates a real-time digital seismic network. The NIEP real-time network currently consists of 102 stations and two seismic arrays equipped with different high-quality digitizers (Kinemetrics K2, Quanterra Q330, Quanterra Q330HR, PS6-26, Basalt), broadband and short-period seismometers (CMG3ESP, CMG40T, KS2000, KS54000, CMG3T, STS2, SH-1, S13, Mark L4C, Ranger, GS21, Mark L22) and acceleration sensors (Kinemetrics Episensor). The data are transmitted to the National Data Center (NDC) and the Eforie Nord (EFOR) Seismic Observatory. EFOR is the back-up for the NDC and also a monitoring center for Black Sea tsunami events. NIEP is a data acquisition node for the seismic network of Moldova (FDSN code MD), composed of five seismic stations. NIEP has installed eight seismic stations equipped with broadband sensors and Episensors in the northern part of Bulgaria, and nine accelerometers (Episensors) in nine districts along the Danube River. All the data are acquired at NIEP for the Early Warning System and for primary estimation of earthquake parameters. The real-time acquisition (RT) and data exchange are done with Antelope software and Seedlink (from SeisComP3). The real-time data communication is ensured by different types of transmission: GPRS, satellite, radio, Internet and a dedicated line provided by a governmental network. For data processing and analysis at the two data centers, Antelope 5.2 TM is being used, running on 3 workstations: one on a CentOS platform and two on MacOS. Also, a SeisComP3 server stands as back-up for Antelope 5.2. Both acquisition and analysis of seismic data systems produce information about local and global parameters of earthquakes. In addition, Antelope is used for manual processing (event association, calculation of magnitude, creating a database, sending seismic bulletins, calculation of PGA and PGV, etc.), generating
Gabor Deconvolution as Preliminary Method to Reduce Pitfall in Deeper Target Seismic Data
NASA Astrophysics Data System (ADS)
Oktariena, M.; Triyoso, W.
2018-03-01
Anelastic attenuation during seismic wave propagation is the trigger of the seismic non-stationary characteristic. Absorption and scattering of energy cause seismic energy loss as depth increases. A series of thin reservoir layers found in the study area is located within the Talang Akar Fm. level, showing an indication of an interpretation pitfall due to the attenuation effect commonly occurring in deeper-level seismic data. The attenuation effect greatly influences the seismic images of the deeper target level, creating pitfalls in several aspects. Seismic amplitude in the deeper target level often cannot represent the real subsurface character due to low amplitude values or chaotic events near the Basement. Frequency-wise, the decay can be seen as diminishing frequency content in the deeper target. Meanwhile, seismic amplitude is the simple tool to point out a Direct Hydrocarbon Indicator (DHI) in a preliminary geophysical study before a further advanced interpretation method is applied. A quick look at the post-stack seismic data shows the reservoir associated with a bright-spot DHI, while another, bigger bright-spot body is detected in the north-east area near the field edge. A horizon slice confirms the possibility that the other bright-spot zone has a smaller delineation; an interpretation pitfall that commonly occurs in deeper levels of seismic data. We evaluate this pitfall by applying Gabor Deconvolution to address the attenuation problem. Gabor Deconvolution forms a partition of unity to factorize the trace into smaller convolution windows that can be processed as stationary packets. Gabor Deconvolution estimates the magnitude of the source signature alongside its attenuation function. The enhanced seismic shows better imaging in the pitfall area that was previously detected as a vast bright-spot zone. When the enhanced seismic is used for further advanced reprocessing, the Seismic Impedance and Vp/Vs Ratio slices show a better reservoir delineation, in which the
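A minimal sketch of the partition-of-unity windowing idea behind Gabor deconvolution, assuming Gaussian windows and using a simple per-window spectral whitening in place of the joint source-signature and attenuation estimation; it illustrates how a non-stationary trace is factorized into locally stationary packets, and is not the algorithm applied in the study.

```python
import numpy as np

def gabor_windows(n, n_win, width):
    """Gaussian windows normalized so they sum to one (partition of unity)."""
    centers = np.linspace(0, n - 1, n_win)
    t = np.arange(n)
    w = np.exp(-0.5 * ((t[None, :] - centers[:, None]) / width) ** 2)
    return w / w.sum(axis=0)               # every sample now sums to 1

def windowed_whitening(trace, n_win=16, width=60, eps=1e-3):
    """Process each windowed packet as locally stationary: here a simple
    spectral whitening stands in for the per-window deconvolution step."""
    n = trace.size
    out = np.zeros(n)
    for w in gabor_windows(n, n_win, width):
        seg = np.fft.rfft(trace * w)
        smooth = np.convolve(np.abs(seg), np.ones(11) / 11, mode="same")
        out += np.fft.irfft(seg / (smooth + eps), n)
    return out

# Synthetic non-stationary trace whose amplitude decays with time (depth).
rng = np.random.default_rng(1)
trace = rng.standard_normal(2048) * np.exp(-np.arange(2048) / 800.0)
print(windowed_whitening(trace)[:5])
```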
Seismic migration in generalized coordinates
NASA Astrophysics Data System (ADS)
Arias, C.; Duque, L. F.
2017-06-01
Reverse time migration (RTM) is a technique widely used nowadays to obtain images of the Earth's subsurface, using artificially produced seismic waves. This technique was developed for zones with a flat surface, and when it is applied to zones with rugged topography some corrections must be introduced in order to adapt it. This can produce defects in the final image called artifacts. We introduce a simple mathematical map that transforms a scenario with rugged topography into a flat one. The three steps of RTM can be applied in a way similar to the conventional ones just by changing the Laplacian in the acoustic wave equation for a generalized one. We present a test of this technique using the Canadian Foothills SEG velocity model.
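A minimal sketch of the kind of coordinate map the paper describes, under our own illustrative choice of mapping (stretching the vertical coordinate between the rugged free surface and a flat datum); in the transformed domain the surface is flat, and the generalized Laplacian of the wave equation would carry the metric terms of this map.

```python
import numpy as np

def to_flat(x, z, topo, z_max):
    """Map physical coordinates (x, z), with free surface z = topo(x),
    to computational coordinates (xi, eta) where the surface is eta = 0."""
    xi = x
    eta = (z - topo(x)) / (z_max - topo(x)) * z_max
    return xi, eta

def to_physical(xi, eta, topo, z_max):
    """Inverse map, back from the flat computational domain."""
    x = xi
    z = topo(xi) + eta / z_max * (z_max - topo(xi))
    return x, z

# Illustrative rugged free surface (m) and a flat datum at z_max.
topo = lambda x: 200.0 * np.sin(2 * np.pi * x / 4000.0)
z_max = 3000.0
x, z = 1500.0, 500.0
xi, eta = to_flat(x, z, topo, z_max)
print(np.allclose(to_physical(xi, eta, topo, z_max), (x, z)))   # True
```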
Detecting Seismic Activity with a Covariance Matrix Analysis of Data Recorded on Seismic Arrays
NASA Astrophysics Data System (ADS)
Seydoux, L.; Shapiro, N.; de Rosny, J.; Brenguier, F.
2014-12-01
Modern seismic networks are recording the ground motion continuously all around the world, with very broadband and high-sensitivity sensors. The aim of our study is to apply statistical array-based approaches to the processing of these records. We use methods mainly brought from random matrix theory in order to give a statistical description of seismic wavefields recorded at the Earth's surface. We estimate the array covariance matrix and explore the distribution of its eigenvalues, which contains information about the coherency of the sources that generated the studied wavefields. With this approach, we can make distinctions between the signals generated by isolated deterministic sources and the "random" ambient noise. We design an algorithm that uses the distribution of the array covariance matrix eigenvalues to detect signals corresponding to coherent seismic events. We investigate the detection capacity of our method at different scales and in different frequency ranges by applying it to the records of two networks: (1) the seismic monitoring network operating on the Piton de la Fournaise volcano at La Réunion island, composed of 21 receivers and with an aperture of ~15 km, and (2) the transportable component of the USArray, composed of ~400 receivers with ~70 km inter-station spacing.
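A compact sketch of the covariance-matrix approach described above (a simplified version, not the authors' code): for a window of multichannel records, the array covariance matrix is estimated by averaging cross-spectra over sub-windows, and the spread of its eigenvalue distribution indicates whether a coherent source dominates over incoherent noise.

```python
import numpy as np

def covariance_eigenvalues(records, nfft=256):
    """records: (n_stations, n_samples). Returns normalized, descending
    eigenvalues of the array covariance matrix averaged over sub-windows."""
    n_sta, n_samp = records.shape
    n_win = n_samp // nfft
    cov = np.zeros((n_sta, n_sta), dtype=complex)
    for k in range(n_win):
        seg = records[:, k * nfft:(k + 1) * nfft]
        spec = np.fft.rfft(seg, axis=1)
        cov += spec @ spec.conj().T          # accumulate cross-spectra
    eig = np.linalg.eigvalsh(cov / n_win)    # real, ascending
    return eig[::-1] / eig.sum()

rng = np.random.default_rng(0)
noise = rng.standard_normal((21, 4096))                  # incoherent noise
coherent = noise * 0.2 + np.sin(0.3 * np.arange(4096))   # same signal on all
print("noise   :", covariance_eigenvalues(noise)[:3].round(3))
print("coherent:", covariance_eigenvalues(coherent)[:3].round(3))
```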
NASA Astrophysics Data System (ADS)
Longobardi, M.; Bustin, A. M. M.; Johansen, K.; Bustin, R. M.
2017-12-01
One of our goals is to investigate the variables and processes controlling anomalous induced seismicity (AIS) and its associated ground motions, to better understand AIS due to hydraulic fracturing in Northeast British Columbia (BC). Our other main objective is to optimize completions and well design. Although the vast majority of earthquakes that occur in the world each year have natural causes, some of these earthquakes and a number of lesser-magnitude seismic events are induced by human activities. The recorded induced seismicity resulting from fluid injection during hydraulic fracturing is generally small in magnitude (< M 1). Shale gas operations in Northeast BC have induced the largest recorded occurrence and magnitudes of AIS due to hydraulic fracturing. Anomalous induced seismicity has been recorded in seven clusters within the Montney area, with magnitudes up to ML 4.6. Five of these clusters have been linked to hydraulic fracturing. To analyse our AIS data, we first calculated the earthquake hypocenters. The data were recorded on an array of real-time accelerometers. We built the array based on a modified design of the early earthquake detectors installed in BC schools for the Earthquake Early Warning System for British Columbia. We have developed a new technique for locating hypocenters and applied it to our dataset. The technique will enable near real-time event location, aiding in both mitigating induced events and adjusting completions to optimize the stimulation. Our hypocenter program assumes an S-wave speed, fits the arrival times to the hypocenter, and uses a multivariate 'amoeba' (downhill simplex) method. We use this method because it is well suited to minimizing the chi-squared function of the arrival-time deviations. We show some preliminary results on the Montney dataset.
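A sketch of the hypocenter search described above, under simplifying assumptions (homogeneous S-wave speed, S arrivals only, made-up station geometry): SciPy's Nelder-Mead ('amoeba') simplex minimizes the chi-squared arrival-time misfit, which is the general approach the abstract describes rather than the group's actual program.

```python
import numpy as np
from scipy.optimize import minimize

vs = 3.4                     # assumed homogeneous S-wave speed (km/s)
stations = np.array([[0, 0, 0], [5, 1, 0], [2, 7, 0], [8, 6, 0], [4, 3, 0.2]])
true_hypo = np.array([4.0, 4.0, 3.0])    # km
true_t0 = 1.0                            # origin time (s)

def travel_time(hypo, sta):
    """S-wave travel time in a homogeneous half-space."""
    return np.linalg.norm(sta - hypo, axis=1) / vs

rng = np.random.default_rng(3)
t_obs = true_t0 + travel_time(true_hypo, stations) + 0.02 * rng.standard_normal(5)
sigma = 0.02                             # assumed arrival-time uncertainty (s)

def chi2(params):
    """Chi-squared misfit of observed vs predicted S arrival times."""
    x, y, z, t0 = params
    t_pred = t0 + travel_time(np.array([x, y, z]), stations)
    return np.sum(((t_obs - t_pred) / sigma) ** 2)

result = minimize(chi2, x0=[1.0, 1.0, 1.0, 0.0], method="Nelder-Mead")
print("hypocenter (km) and origin time (s):", result.x.round(2))
```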
Evaluation of induced seismicity forecast models in the Induced Seismicity Test Bench
NASA Astrophysics Data System (ADS)
Király, Eszter; Gischig, Valentin; Zechar, Jeremy; Doetsch, Joseph; Karvounis, Dimitrios; Wiemer, Stefan
2016-04-01
Induced earthquakes often accompany fluid injection, and the seismic hazard they pose threatens various underground engineering projects. Models to monitor and control induced seismic hazard with traffic light systems should be probabilistic, forward-looking, and updated as new data arrive. Here, we propose an Induced Seismicity Test Bench to test and rank such models. We apply the test bench to data from the Basel 2006 and Soultz-sous-Forêts 2004 geothermal stimulation projects, and we assess forecasts from two models that incorporate a different mix of physical understanding and stochastic representation of the induced sequences: Shapiro in Space (SiS) and Hydraulics and Seismics (HySei). SiS is based on three pillars: the seismicity rate is computed with the help of the seismogenic index and a simple exponential decay of the seismicity; the magnitude distribution follows the Gutenberg-Richter relation; and seismicity is distributed in space by smoothing the seismicity of the learning period with 3D Gaussian kernels. The HySei model describes seismicity triggered by pressure diffusion with irreversible permeability enhancement. Our results show that neither model is fully superior to the other. HySei forecasts the seismicity rate well, but is only mediocre at forecasting the spatial distribution. On the other hand, SiS forecasts the spatial distribution well but not the seismicity rate. The shut-in phase is a difficult moment for both models in both reservoirs: the models tend to underpredict the seismicity rate around, and shortly after, shut-in. Ensemble models that combine HySei's rate forecast with SiS's spatial forecast outperform each individual model.
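The SiS rate pillar mentioned above is commonly formulated with the seismogenic index, in which the expected number of events above a completeness magnitude scales with the cumulative injected volume, N(>=Mc) = V * 10^(Sigma - b*Mc). The sketch below evaluates that relation with invented parameter values; it illustrates the concept only and is not the code ranked in the test bench.

```python
import numpy as np

def expected_events(injected_volume_m3, sigma_index, b_value, m_c):
    """Seismogenic-index forecast: expected number of events with M >= m_c
    after injecting a cumulative volume V, N = V * 10**(Sigma - b * m_c)."""
    return injected_volume_m3 * 10 ** (sigma_index - b_value * m_c)

# Invented parameters for illustration (site-specific in reality).
sigma_index = -1.5        # seismogenic index
b_value = 1.1             # Gutenberg-Richter b-value
m_c = 0.8                 # completeness magnitude
volume = np.linspace(0, 12000, 5)    # cumulative injected volume (m^3)

for v, n in zip(volume, expected_events(volume, sigma_index, b_value, m_c)):
    print(f"V = {v:7.0f} m^3 -> expected N(M>={m_c}) = {n:6.1f}")
```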
NASA Astrophysics Data System (ADS)
Hibert, C.; Michéa, D.; Provost, F.; Malet, J. P.; Geertsema, M.
2017-12-01
Detection of landslide occurrences and measurement of their dynamic properties during run-out are a high research priority, but also a logistical and technical challenge. Seismology has started to help in several important ways. Taking advantage of the densification of global, regional and local networks of broadband seismic stations, recent advances now permit the seismic detection and location of landslides in near real time. This seismic detection could potentially greatly increase the spatio-temporal resolution at which we study landslide triggering, which is critical to better understand the influence of external forcings such as rainfall and earthquakes. However, automatically detecting the seismic signals generated by landslides still represents a challenge, especially for events with small mass. The low signal-to-noise ratio classically observed for landslide-generated seismic signals and the difficulty of discriminating these signals from those generated by regional earthquakes or anthropogenic and natural noise are some of the obstacles that have to be circumvented. We present a new method for automatically constructing instrumental landslide catalogues from continuous seismic data. We developed a robust and versatile solution, which can be implemented in any context where seismic detection of landslides or other mass movements is relevant. The method is based on a spectral detection of the seismic signals and the identification of the sources with a Random Forest machine learning algorithm. The spectral detection allows detecting signals with a low signal-to-noise ratio, while the Random Forest algorithm achieves a high rate of positive identification of the seismic signals generated by landslides and other seismic sources. The processing chain is implemented to run in a High Performance Computing centre, which makes it possible to explore years of continuous seismic data rapidly. We present here the preliminary results of the application of this processing chain for years
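A compact sketch of the two-stage workflow described above (spectral features feeding a Random Forest classifier), using scikit-learn; the features, the synthetic "landslide" and "earthquake" signals and all parameters are placeholders, so this illustrates the workflow rather than reproducing the authors' detector.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)

def spectral_features(signal, n_bands=12):
    """Mean log power in evenly spaced frequency bands: a simple feature
    vector for a detected seismic signal."""
    spec = np.abs(np.fft.rfft(signal)) ** 2
    bands = np.array_split(spec, n_bands)
    return np.log10([b.mean() + 1e-12 for b in bands])

def synth(kind, n=300):
    """Synthetic stand-ins: 'landslide' is low-frequency-rich, 'eq' broadband."""
    t = np.arange(n) / 50.0
    if kind == "landslide":
        return np.sin(2 * np.pi * 2.0 * t) + 0.3 * rng.standard_normal(n)
    return rng.standard_normal(n)

labels = ["landslide"] * 200 + ["eq"] * 200
X = np.array([spectral_features(synth(k)) for k in labels])
y = np.array([1] * 200 + [0] * 200)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("hold-out accuracy:", clf.score(X_te, y_te))
```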
NASA Astrophysics Data System (ADS)
Filippidou, N.; Drijkoningen, G.; Braaksma, H.; Verwer, K.; Kenter, J.
2005-05-01
Interest in high-resolution 3D seismic experiments for imaging shallow targets has increased over the past years. Many published case studies show that producing clear seismic images with this non-invasive method is still a challenge. We use two test sites where nearby outcrops are present, so that an accurate geological model can be built and the seismic result validated. The first so-called natural field laboratory is located in Boulonnais (N. France). It is an Upper Jurassic siliciclastic sequence, the age equivalent of the North Sea source rock. The second is located at Cap Blanc, to the southwest of Mallorca island (Spain); it depicts an excellent example of a Miocene prograding reef platform (the Llucmajor Platform) and is a textbook analog for carbonate reservoirs. In both cases, the multidisciplinary experiment included the use of multicomponent and quasi-3D or 3D seismic recordings. The target depth does not exceed 120 m. Vertical and shear portable vibrators were used as sources. In the center of the setups, boreholes were drilled and Vertical Seismic Profiles were shot, along with core and borehole measurements both in situ and in the laboratory. These two geologically different sites, with different seismic stratigraphy, have provided us with exceptionally high resolution seismic images. In general the seismic data were processed more or less following standard procedures; a few innovative techniques applied to the Mallorca data, such as rotation of horizontal components, 3D F-K filtering and the addition of parallel profiles, have improved the seismic image. In this paper we discuss the basic differences as seen on the seismic sections. The Boulonnais data present highly continuous reflection patterns of extremely high resolution. This facilitated a high-resolution stratigraphic description. Results from the VSP showed substantial wave energy attenuation. However, the high-fold (330 traces) Mallorca seismic experiment returned a rather discontinuous pattern of possible reflectors
NASA Astrophysics Data System (ADS)
Brenguier, F.; Rivemale, E.; Clarke, D. S.; Schmid, A.; Got, J.; Battaglia, J.; Taisne, B.; Staudacher, T.; Peltier, A.; Shapiro, N. M.; Tait, S.; Ferrazzini, V.; Di Muro, A.
2011-12-01
Piton de la Fournaise volcano (PdF) is among the most active basaltic volcanoes worldwide, with more than one eruption per year on average. PdF is also densely instrumented with short-period and broadband seismometers as well as with GPS receivers. Continuous seismic waveforms are available since 1999. Piton de la Fournaise has a moderate inter-eruptive seismic activity, with an average of five detected Volcano-Tectonic (VT) earthquakes per day with magnitudes ranging from 0.5 to 3.5. These earthquakes are shallow and located about 2.5 kilometers beneath the edifice surface. Volcanic unrest is captured on average a few weeks before eruptions by measurements of increased VT seismicity rate, inflation of the edifice summit, and decreased seismic velocities from correlations of seismic noise. Eruptions are usually preceded by seismic swarms of VT earthquakes. Recently, almost 50% of seismic swarms were not followed by eruptions. Within this work, we aim to gather results from different groups of the UnderVolc research project in order to better understand the processes of deep magma transfer, volcanic unrest, and pre-eruptive magma transport initiation. Among our results, we show that the period 1999-2003 was characterized by a long-term increase of the VT seismicity rate coupled with a long-term decrease of seismic velocities. These observations could indicate a long-term replenishment of the magma storage area. The relocation of ten years of inter-eruptive micro-seismicity shows a narrow (~300 m long) sub-vertical fault zone, thus indicating a conduit rather than an extended magma reservoir as the shallow magma feeder system. We also focus on the processes of short-term volcanic unrest and prove that magma intrusions within the edifice leading to eruptions activate specific VT earthquakes that are distinct from those activated by intrusions that do not lead to eruptions. We thus propose that, among the different pathways of magma transport within the edifice, only one will
Seismic Catalogue and Seismic Network in Haiti
NASA Astrophysics Data System (ADS)
Belizaire, D.; Benito, B.; Carreño, E.; Meneses, C.; Huerfano, V.; Polanco, E.; McCormack, D.
2013-05-01
The destructive earthquake that occurred on January 12, 2010 in Haiti highlighted the lack of preparedness of the country to address seismic phenomena. At the moment of the earthquake, there was no seismic network operating in the country, and only partial knowledge of the past seismicity was possible, due to the absence of a national catalogue. After the 2010 earthquake, some advances began towards the installation of a national network and the elaboration of a seismic catalogue providing the necessary input for seismic hazard studies. This paper presents the state of the work carried out covering both aspects. First, a seismic catalogue has been built, compiling data on historical and instrumental events that occurred in the Hispaniola Island and surroundings, in the frame of the SISMO-HAITI project, supported by the Technical University of Madrid (UPM) and developed in cooperation with the Observatoire National de l'Environnement et de la Vulnérabilité of Haiti (ONEV). Data from different agencies all over the world were gathered; the role of the Dominican Republic and Puerto Rico seismological services, which provided local data from their national networks, was especially relevant. Almost 30,000 events recorded in the area from 1551 to 2011 were compiled in a first catalogue, among them 7,700 events with Mw ranging between 4.0 and 8.3. Since different magnitude scales were used by the different agencies (Ms, mb, MD, ML), this first catalogue was affected by significant heterogeneity in the size parameter. It was then homogenized to moment magnitude Mw using the empirical equations developed by Bonzoni et al. (2011) for the eastern Caribbean. At present, this is the most exhaustive catalogue of the country, although it is difficult to assess its degree of completeness. Regarding the seismic network, 3 stations were installed just after the 2010 earthquake by the Canadian Government. The data were sent by telemetry through the Canadian system CARINA. In 2012, the Spanish IGN together
Temporal Delineation and Quantification of Short Term Clustered Mining Seismicity
NASA Astrophysics Data System (ADS)
Woodward, Kyle; Wesseloo, Johan; Potvin, Yves
2017-07-01
The assessment of the temporal characteristics of seismicity is fundamental to understanding and quantifying the seismic hazard associated with mining, the effectiveness of strategies and tactics used to manage seismic hazard, and the relationship between seismicity and changes to the mining environment. This article aims to improve the accuracy and precision with which the temporal dimension of seismic responses can be quantified and delineated. We present a review and discussion of the occurrence of time-dependent mining seismicity, with a specific focus on temporal modelling and the modified Omori law (MOL). This forms the basis for the development of a simple weighted metric that allows for the consistent temporal delineation and quantification of a seismic response. The optimisation of this metric allows for the selection of the most appropriate modelling interval given the temporal attributes of time-dependent mining seismicity. We evaluate the performance of the weighted metric for the modelling of a synthetic seismic dataset. This assessment shows that seismic responses can be quantified and delineated by the MOL, with reasonable accuracy and precision, when the modelling is optimised by evaluating the weighted MLE metric. Furthermore, this assessment highlights that decreased weighted MLE metric performance can be expected if there is a lack of contrast between the temporal characteristics of events associated with different processes.
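As an illustration of the MOL fitting that underlies such a metric (a generic maximum-likelihood fit, not the article's specific weighting scheme), the sketch below fits the modified Omori rate lambda(t) = K/(t + c)^p to a synthetic sequence of event times by minimizing the negative log-likelihood of an inhomogeneous Poisson process.

```python
import numpy as np
from scipy.optimize import minimize

def mol_rate(t, K, c, p):
    """Modified Omori law rate: lambda(t) = K / (t + c)**p."""
    return K / (t + c) ** p

def neg_log_likelihood(params, times, T):
    K, c, p = params
    if K <= 0 or c <= 0 or p <= 0:
        return np.inf
    # Integral of the rate over [0, T] (expected number of events).
    integral = K * ((T + c) ** (1 - p) - c ** (1 - p)) / (1 - p) if p != 1 \
        else K * (np.log(T + c) - np.log(c))
    return integral - np.sum(np.log(mol_rate(times, K, c, p)))

# Synthetic aftershock-like sequence drawn by thinning the decaying rate.
rng = np.random.default_rng(11)
T, K_true, c_true, p_true = 30.0, 80.0, 0.3, 1.1
t_grid = np.linspace(0, T, 200000)
dt = T / t_grid.size
keep = rng.random(t_grid.size) < mol_rate(t_grid, K_true, c_true, p_true) * dt
times = t_grid[keep]

fit = minimize(neg_log_likelihood, x0=[50.0, 0.1, 1.0], args=(times, T),
               method="Nelder-Mead")
print("K, c, p =", fit.x.round(2), "| events:", times.size)
```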
Detecting aseismic strain transients from seismicity data
Llenos, A.L.; McGuire, J.J.
2011-01-01
Aseismic deformation transients such as fluid flow, magma migration, and slow slip can trigger changes in seismicity rate. We present a method that can detect these seismicity rate variations and utilize these anomalies to constrain the underlying variations in stressing rate. Because ordinary aftershock sequences often obscure changes in the background seismicity caused by aseismic processes, we combine the stochastic Epidemic Type Aftershock Sequence (ETAS) model, which describes aftershock sequences well, and the physically based rate- and state-dependent friction seismicity model into a single seismicity rate model that captures both aftershock activity and changes in background seismicity rate. We implement this model in a data assimilation algorithm that inverts seismicity catalogs to estimate space-time variations in stressing rate. We evaluate the method using a synthetic catalog, and then apply it to a catalog of M ≥ 1.5 events that occurred in the Salton Trough from 1990 to 2009. We validate our stressing rate estimates by comparing them to estimates from a geodetically derived slip model for a large creep event on the Obsidian Buttes fault. The results demonstrate that our approach can identify large aseismic deformation transients in a multidecade-long earthquake catalog and roughly constrain the absolute magnitude of the stressing rate transients. Our method can therefore provide a way to detect aseismic transients in regions where geodetic resolution in space or time is poor. Copyright 2011 by the American Geophysical Union.
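The combined ETAS plus rate-and-state model is not reproduced here, but the ETAS ingredient it builds on has a standard conditional intensity; the sketch below evaluates one common temporal parameterization for a small synthetic catalog, with the background rate mu appearing explicitly as the term an aseismic transient would modulate. All parameter values are illustrative.

```python
import numpy as np

def etas_intensity(t, catalog, mu, K, alpha, c, p, m_ref):
    """Temporal ETAS conditional intensity:
    lambda(t) = mu + sum over past events of
                K * exp(alpha * (M_i - m_ref)) / (t - t_i + c)**p
    The background rate mu is what an aseismic transient would modulate."""
    past = catalog[catalog[:, 0] < t]
    if past.size == 0:
        return mu
    dt = t - past[:, 0]
    return mu + np.sum(K * np.exp(alpha * (past[:, 1] - m_ref)) / (dt + c) ** p)

# Small synthetic catalog: columns are (time in days, magnitude).
catalog = np.array([[1.0, 3.2], [1.5, 2.1], [2.0, 4.0], [5.0, 2.5]])
params = dict(mu=0.2, K=0.05, alpha=1.5, c=0.01, p=1.1, m_ref=1.5)  # illustrative

for t in (0.5, 2.1, 6.0):
    rate = etas_intensity(t, catalog, **params)
    print(f"t = {t:4.1f} d   lambda = {rate:.3f} events/day")
```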
Structural interpretation of seismic data and inherent uncertainties
NASA Astrophysics Data System (ADS)
Bond, Clare
2013-04-01
Geoscience is perhaps unique in its reliance on incomplete datasets and on building knowledge from their interpretation. This interpretative basis for the science is fundamental at all levels, from the creation of a geological map to the interpretation of remotely sensed data. To teach and to better understand the uncertainties in dealing with incomplete data, we need to understand the strategies individual practitioners deploy that make them effective interpreters. The nature of interpretation is such that the interpreter needs to use their cognitive ability in the analysis of the data to propose a sensible solution in their final output that is consistent not only with the original data but also with other knowledge and understanding. In a series of experiments, Bond et al. (2007, 2008, 2011, 2012) investigated the strategies and pitfalls of expert and non-expert interpretation of seismic images. These studies used large numbers of participants to provide a statistically sound basis for analysis of the results. The outcome of these experiments showed that a wide variety of conceptual models were applied to single seismic datasets, highlighting not only spatial variations in fault placement, but also whether interpreters thought the faults existed at all, or agreed on their sense of movement. Further, statistical analysis suggests that the strategies an interpreter employs are more important than expert knowledge per se in developing successful interpretations. Experts are successful because of their application of these techniques. In a new set of experiments we focus on a small number of experts to determine how they use their cognitive and reasoning skills in the interpretation of 2D seismic profiles. Live video and practitioner commentary were used to track the evolving interpretation and to gain insight into their decision processes. The outputs of the study allow us to create an educational resource of expert interpretation through online video footage and commentary with
NASA Astrophysics Data System (ADS)
Konstantaras, Anthony; Katsifarakis, Emmanouil; Artzouxaltzis, Xristos; Makris, John; Vallianatos, Filippos; Varley, Martin
2010-05-01
This paper is a preliminary investigation of the possible correlation between temporal and energy release patterns of seismic activity involved in the preparation processes of consecutive sizeable seismic events [1,2]. The background idea is that during periods of low-level seismic activity, stress processes in the crust accumulate energy at the seismogenic area, whilst larger seismic events act as a decongesting mechanism releasing considerable energy [3,4]. A dynamic algorithm is being developed that aims to identify and cluster pre- and post-seismic events with the main earthquake, following on from research carried out by Zubkov [5] and Dobrovolsky [6,7]. This clustering technique, along with energy release equations dependent on the Richter scale [8,9], allows an estimate to be drawn of the amount of energy released by the seismic sequence. The above approach is being implemented as a monitoring tool to investigate the behaviour of the underlying energy management system by introducing this information to various neural [10,11] and soft computing models [1,12,13,14]. The incorporation of intelligent systems aims towards the detection and simulation of the possible relationship between energy release patterns and time intervals among consecutive sizeable earthquakes [1,15]. Anticipated successful training of the imported intelligent systems may result in a real-time, on-line processing methodology [1,16] capable of dynamically approximating the time interval between the latest and the next forthcoming sizeable seismic event by monitoring the energy release process in a specific seismogenic area. Indexing terms: pattern recognition, long-term earthquake precursors, neural networks, soft computing, earthquake occurrence intervals. References: [1] Konstantaras A., Vallianatos F., Varley M.R. and Makris J. P.: 'Soft computing modelling of seismicity in the southern Hellenic arc', IEEE Geoscience and Remote Sensing Letters, vol. 5 (3), pp. 323-327, 2008. [2] Eneva M. and
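The energy-release bookkeeping described above can be illustrated with the classical Gutenberg-Richter energy-magnitude relation, log10 E = 1.5 M + 4.8 (E in joules). The catalog below is invented purely to show the calculation of the energy released by the small events between two sizeable shocks; it is not data from the study.

```python
import numpy as np

def seismic_energy_joules(magnitude):
    """Gutenberg-Richter energy-magnitude relation: log10 E = 1.5 M + 4.8."""
    return 10 ** (1.5 * np.asarray(magnitude) + 4.8)

# Invented sequence: small events between two sizeable shocks (M >= 5.5).
mags = np.array([3.1, 2.8, 3.4, 2.9, 5.7, 3.0, 3.3, 2.7, 3.6, 5.9])
energy = seismic_energy_joules(mags)

sizeable = mags >= 5.5
cum_between = energy[~sizeable].sum()      # energy released by the small events
print(f"energy of the two sizeable events : {energy[sizeable].sum():.2e} J")
print(f"energy of intervening small events: {cum_between:.2e} J")
```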
Seismic Data from Roosevelt Hot Springs, Utah FORGE Study Area
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, John
This set of data contains raw and processed 2D and 3D seismic data from the Utah FORGE study area near Roosevelt Hot Springs. The zipped archives numbered from 1-100 to 1001-1122 contain 3D seismic uncorrelated shot gather SEG-Y files. The zipped archives numbered from 1-100C to 1001-1122C contain 3D seismic correlated shot gather SEG-Y files. Other data have intuitive names.
The SISIFO project: Seismic Safety at High Schools
NASA Astrophysics Data System (ADS)
Peruzza, Laura; Barnaba, Carla; Bragato, Pier Luigi; Dusi, Alberto; Grimaz, Stefano; Malisan, Petra; Saraò, Angela; Mucciarelli, Marco
2014-05-01
For many years, the Italian scientific community has faced the problem of reducing earthquake risk using innovative educational techniques. Recent earthquakes in Italy and around the world have clearly demonstrated that seismic codes alone are not able to guarantee an effective mitigation of risk. After the tragic events of San Giuliano di Puglia (2002), where an earthquake killed 26 school children, special attention was paid in Italy to the seismic safety of schools, but mainly with respect to structural aspects. Little attention has been devoted to the possible, and even significant, damage to non-structural elements (collapse of ceilings, tipping of cabinets and shelving, obstruction of escape routes, etc.). Students and teachers trained on these aspects can provide very effective preventive vigilance. Since 2002, the EDURISK project (www.edurisk.it) has proposed educational tools and training programs for schools at primary and middle levels. More recently, a nationwide campaign aimed at adults (www.iononrischio.it) was launched with the extensive support of civil protection volunteers. There was a gap for high schools, and the SISIFO project was designed to fill this void, in particular for schools with technical/scientific curricula. SISIFO (https://sites.google.com/site/ogssisifo/) is a multidisciplinary initiative aimed at the diffusion of scientific culture for achieving seismic safety in schools; it is replicable and can be structured into training over the next several years. The students, helped by their teachers and by experts from scientific institutions, followed a specialized training course on earthquake safety. The trial began in North-East Italy, with a combination of hands-on activities for the measurement of earthquakes with low-cost instruments and lectures by experts in various disciplines, accompanied by specifically designed teaching materials, in both paper and digital format. We intend to raise teachers' and students' knowledge of the
Analysis of induced seismicity in geothermal reservoirs – An overview
Zang, Arno; Oye, Volker; Jousset, Philippe; Deichmann, Nicholas; Gritto, Roland; McGarr, Arthur F.; Majer, Ernest; Bruhn, David
2014-01-01
In this overview we report results of analysing induced seismicity in geothermal reservoirs in various tectonic settings within the framework of the European Geothermal Engineering Integrating Mitigation of Induced Seismicity in Reservoirs (GEISER) project. In the reconnaissance phase of a field, the subsurface fault mapping, in situ stress and the seismic network are of primary interest in order to help assess the geothermal resource. The hypocentres of the observed seismic events (seismic cloud) depend on the design of the installed network, the velocity model used and the location technique applied. During the stimulation phase, the attention is turned to reservoir hydraulics (e.g., fluid pressure, injection volume) and its relation to larger magnitude seismic events, their source characteristics and occurrence in space and time. A change in isotropic components of the full waveform moment tensor is observed for events close to the injection well (tensile character) as compared to events further away from the injection well (shear character). Tensile events coincide with high Gutenberg-Richter b-values and low Brune stress drop values. The stress regime in the reservoir controls the direction of the fracture growth at depth, as indicated by the extent of the seismic cloud detected. Stress magnitudes are important in multiple stimulation of wells, where little or no seismicity is observed until the previous maximum stress level is exceeded (Kaiser Effect). Prior to drilling, obtaining a 3D P-wave (Vp) and S-wave velocity (Vs) model down to reservoir depth is recommended. In the stimulation phase, we recommend monitoring and locating seismicity with high (decametre) precision in real time and performing local 4D tomography for the velocity ratio (Vp/Vs). During exploitation, one should use observed and modelled induced seismicity to forward-estimate seismic hazard so that field operators are in a position to adjust well hydraulics (rate and volume of the
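Since the abstract relates tensile events to high Gutenberg-Richter b-values, the following is a minimal, hedged sketch of the classical Aki (1965) maximum-likelihood b-value estimate (with Utsu's binning correction); the completeness magnitude and the catalog are illustrative and not part of the GEISER workflow.

```python
import numpy as np

def b_value_mle(magnitudes, completeness_mag, bin_width=0.1):
    """Aki (1965) maximum-likelihood b-value with Utsu's binning correction:
    b = log10(e) / (mean(M) - (Mc - bin_width/2)), for events with M >= Mc."""
    m = np.asarray(magnitudes, dtype=float)
    m = m[m >= completeness_mag]
    if m.size < 2:
        raise ValueError("not enough events above the completeness magnitude")
    return np.log10(np.e) / (m.mean() - (completeness_mag - bin_width / 2.0))

# Illustrative catalog of injection-induced event magnitudes.
catalog = np.array([1.2, 0.9, 1.5, 2.1, 1.1, 0.8, 1.7, 1.3, 2.4, 1.0])
print(b_value_mle(catalog, completeness_mag=0.8))
```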
Open Source Seismic Software in NOAA's Next Generation Tsunami Warning System
NASA Astrophysics Data System (ADS)
Hellman, S. B.; Baker, B. I.; Hagerty, M. T.; Leifer, J. M.; Lisowski, S.; Thies, D. A.; Donnelly, B. K.; Griffith, F. P.
2014-12-01
The Tsunami Information Technology Modernization (TIM) is a project spearheaded by the National Oceanic and Atmospheric Administration to update the United States' Tsunami Warning System software currently employed at the Pacific Tsunami Warning Center (Ewa Beach, Hawaii) and the National Tsunami Warning Center (Palmer, Alaska). This entirely open source software project will integrate various seismic processing utilities with the National Weather Service Weather Forecast Office's core software, AWIPS2. For the real-time and near real-time seismic processing aspect of this project, NOAA has elected to integrate the open source portions of GFZ's SeisComP 3 (SC3) processing system into AWIPS2. To provide for better tsunami threat assessments we are developing open source tools for magnitude estimations (e.g., moment magnitude, energy magnitude, surface wave magnitude), detection of slow earthquakes with the Theta discriminant, moment tensor inversions (e.g., W-phase and teleseismic body waves), finite fault inversions, and array processing. With our reliance on common data formats such as QuakeML and seismic community standard messaging systems, all new facilities introduced into AWIPS2 and SC3 will be available as stand-alone tools or could be easily integrated into other real-time seismic monitoring systems such as Earthworm, Antelope, etc. Additionally, we have developed a template-based design paradigm so that the developer or scientist can efficiently create upgrades, replacements, and/or new metrics for the seismic data processing with only a cursory knowledge of the underlying SC3.
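One of the magnitude tools mentioned above is moment magnitude; as a hedged aside, the standard Hanks-Kanamori/IASPEI conversion between scalar moment and Mw can be written as below (this is a generic formula, not NOAA's or SeisComP's implementation).

```python
import math

def moment_magnitude(m0_newton_meters):
    """Standard moment magnitude: Mw = (2/3) * (log10(M0[N*m]) - 9.1)."""
    return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.1)

def scalar_moment(mw):
    """Inverse relation: M0[N*m] = 10**(1.5*Mw + 9.1)."""
    return 10.0 ** (1.5 * mw + 9.1)

# Example: a scalar moment of 4.0e21 N*m corresponds to roughly Mw 8.3.
print(moment_magnitude(4.0e21))
```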
Toward predicting clay landslide with ambient seismic noise
NASA Astrophysics Data System (ADS)
Larose, E. F.; Mainsant, G.; Carriere, S.; Chambon, G.; Michoud, C.; Jongmans, D.; Jaboyedoff, M.
2013-12-01
Clay-rich landslides pose critical problems in risk management worldwide. The most widely proposed mechanism leading to such flow-like movements is the increase in water pore pressure in the sliding mass, generating partial or complete liquefaction. This solid-to-liquid transition results in a dramatic reduction of mechanical rigidity, which could be detected by monitoring shear wave velocity variations. The ambient seismic noise correlation technique has been applied to measure the variation in the seismic surface wave velocity in the Pont Bourquin landslide (Swiss Alps). This small but active composite earthslide-earthflow was equipped with continuously recording seismic sensors during spring and summer 2010, and then again from fall 2011 on. An earthslide of a few thousand cubic meters was triggered in mid-August 2010, after a rainy period. This article shows that the seismic velocity of the sliding material, measured from daily noise correlograms, decreased continuously and rapidly for several days prior to the catastrophic event. From a spectral analysis of the velocity decrease, it was possible to determine the location of the change at the base of the sliding layer. These results are confirmed by analogous small-scale experiments in the laboratory. These results demonstrate that ambient seismic noise can be used to detect rigidity variations before failure and could potentially be used to predict landslides.
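A hedged sketch of one common way to measure the velocity variation described above, the stretching method applied to daily noise correlograms: a trial relative velocity change alpha maps the reference u_ref(t) onto u_ref(t*(1+alpha)), and the alpha maximizing the correlation with the current correlogram is taken as dv/v. The synthetic signal and trial range are illustrative, and this is not the authors' code.

```python
import numpy as np

def dv_over_v_stretching(reference, current, dt, trials=np.linspace(-0.05, 0.05, 201)):
    """Estimate the relative velocity change dv/v by the stretching method:
    a homogeneous change alpha = dv/v maps the reference u_ref(t) onto
    u_cur(t) = u_ref(t*(1+alpha)); the best alpha maximizes the correlation."""
    n = len(reference)
    t = np.arange(n) * dt
    best_alpha, best_cc = 0.0, -np.inf
    for alpha in trials:
        stretched = np.interp(t * (1.0 + alpha), t, reference)
        cc = np.corrcoef(stretched, current)[0, 1]
        if cc > best_cc:
            best_alpha, best_cc = alpha, cc
    return best_alpha, best_cc

# Synthetic check: impose a 1% velocity decrease (alpha = -0.01) on a coda-like signal.
dt = 0.01
t = np.arange(0, 20, dt)
ref = np.sin(2 * np.pi * 2.0 * t) * np.exp(-t / 8.0)
cur = np.interp(t * 0.99, t, ref)        # delayed arrivals, i.e. slower medium
print(dv_over_v_stretching(ref, cur, dt))  # recovers alpha close to -0.01
```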
Evaluation of stabilization techniques for ion implant processing
NASA Astrophysics Data System (ADS)
Ross, Matthew F.; Wong, Selmer S.; Minter, Jason P.; Marlowe, Trey; Narcy, Mark E.; Livesay, William R.
1999-06-01
With the integration of high current ion implant processing into volume CMOS manufacturing, the need for photoresist stabilization to achieve a stable ion implant process is critical. This study compares electron beam stabilization, a non-thermal process, with more traditional thermal stabilization techniques such as hot plate baking and vacuum oven processing. The electron beam processing is carried out in a flood exposure system with no active heating of the wafer. These stabilization techniques are applied to typical ion implant processes that might be found in a CMOS production process flow. The stabilization processes are applied to a 1.1-micrometer-thick PFI-38A i-line photoresist film prior to ion implant processing. Post-stabilization CD variation is detailed with respect to wall slope and feature integrity. SEM photographs detail the effects of the stabilization technique on photoresist features. The thermal stability of the photoresist is shown for different levels of stabilization and post-stabilization thermal cycling. Thermal flow stability of the photoresist is detailed via SEM photographs. A significant improvement in thermal stability is achieved with the electron beam process, such that photoresist features are stable to temperatures in excess of 200 degrees C. Ion implant processing parameters are evaluated and compared for the different stabilization methods. Ion implant system end-station chamber pressure is detailed as a function of ion implant process and stabilization condition. The ion implant process conditions are detailed for varying factors such as ion current, energy, and total dose. A reduction in the ion implant system's end-station chamber pressure is achieved with the electron beam stabilization process over the other techniques considered. This reduction in end-station chamber pressure is shown to provide a reduction in total process time for a given ion implant dose. Improvements in the ion implant process are detailed across
A new database on subduction seismicity at the global scale
NASA Astrophysics Data System (ADS)
Presti, D.; Heuret, A.; Funiciello, F.; Piromallo, C.
2012-04-01
In the framework of the EURYI Project 'Convergent margins and seismogenesis: defining the risk of great earthquakes by using statistical data and modelling', a global collection of recent intraslab seismicity has been performed. Based on the EHB hypocenter and CMT Harvard catalogues, the hypocenters, nodal planes and seismic moments of worldwide subduction-related earthquakes were extracted for the period 1976-2007. Data were collected for centroid depths between sea level and 700 km and for magnitude Mw ≥ 5.5. For each subduction zone, a set of trench-normal transects was constructed, choosing a 120 km width of the cross-section on each side of a vertical plane and a spacing of 1 degree along the trench. For each of the 505 resulting transects, the whole subduction seismogenic zone was mapped as focal mechanisms projected onto a vertical plane after their faulting-type classification according to the Aki-Richards convention. Transect by transect, first the seismicity that can be considered unrelated to the subduction process under investigation was removed, then the upper-plate seismicity (i.e. earthquakes generated within the upper plate as a result of the subduction process) was selected. After deletion, from the event subset thus obtained, of the interplate seismicity identified in the framework of this project by Heuret et al. (2011), we can be reasonably confident that the remaining seismicity can be related to the subducting plate. Among these earthquakes we then selected the intermediate-depth and deep seismicity. The upper limit of the intermediate-depth seismicity is generally fixed at 70 km depth in order to avoid possible mixing with interplate seismicity. Intermediate-depth and deep seismicity were in most cases defined as earthquakes with focal depths between 70 and 300 km and exceeding 300 km, respectively. Outer-rise seismicity was also selected. Following Heuret et al. (2011), the 505 transects were merged into 62 larger
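A minimal sketch of the depth-based ranking described above (intermediate-depth between 70 and 300 km, deep below 300 km); the thresholds follow the abstract, while the helper name and example depths are illustrative.

```python
def classify_slab_event(depth_km):
    """Depth classes used for the subducting-plate seismicity described above:
    intermediate-depth (70-300 km) and deep (>300 km); shallower events are
    left aside here because they may mix with interplate seismicity."""
    if depth_km > 300.0:
        return "deep"
    if depth_km >= 70.0:
        return "intermediate"
    return "shallow/unclassified"

# Illustrative hypocentral depths (km) along one trench-normal transect.
for depth in (45.0, 110.0, 280.0, 455.0, 620.0):
    print(depth, classify_slab_event(depth))
```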
Effects of volcano topography on seismic broad-band waveforms
NASA Astrophysics Data System (ADS)
Neuberg, Jürgen; Pointer, Tim
2000-10-01
Volcano seismology often deals with rather shallow seismic sources and seismic stations deployed in their near field. The complex stratigraphy on volcanoes and near-field source effects have a strong impact on the seismic wavefield, complicating the interpretation techniques that are usually employed in earthquake seismology. In addition, as most volcanoes have a pronounced topography, the interference of the seismic wavefield with the stress-free surface results in severe waveform perturbations that affect seismic interpretation methods. In this study we deal predominantly with the surface effects, but take into account the impact of a typical volcano stratigraphy as well as near-field source effects. We derive a correction term for plane seismic waves and a plane free surface such that for smooth topographies the effect of the free surface can be totally removed. Seismo-volcanic sources radiate energy in a broad frequency range with a correspondingly wide range of different Fresnel zones. A 2-D boundary element method is employed to study how the size of the Fresnel zone is dependent on source depth, dominant wavelength and topography in order to estimate the limits of the plane wave approximation. This approximation remains valid if the dominant wavelength does not exceed twice the source depth. Further aspects of this study concern particle motion analysis to locate point sources and the influence of the stratigraphy on particle motions. Furthermore, the deployment strategy of seismic instruments on volcanoes, as well as the direct interpretation of the broad-band waveforms in terms of pressure fluctuations in the volcanic plumbing system, are discussed.
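The validity criterion quoted above (dominant wavelength not exceeding twice the source depth) can be encoded in a tiny helper such as the following; the velocity, frequency and depth values are illustrative only.

```python
def plane_wave_correction_valid(velocity_m_s, dominant_freq_hz, source_depth_m):
    """Criterion quoted above: the plane-wave free-surface correction holds
    if the dominant wavelength (v/f) does not exceed twice the source depth."""
    wavelength = velocity_m_s / dominant_freq_hz
    return wavelength <= 2.0 * source_depth_m

# Illustrative values: 1500 m/s shallow volcanic layer, 2 Hz tremor, 500 m deep source.
print(plane_wave_correction_valid(1500.0, 2.0, 500.0))  # 750 m <= 1000 m -> True
```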
Spots of Seismic Danger Extracted by Properties of Low-Frequency Seismic Noise
NASA Astrophysics Data System (ADS)
Lyubushin, Alexey
2013-04-01
region of Nankai Trough. The analysis of seismic noise after March 2011 indicates an increasing probability of a second mega-earthquake, starting from the middle of 2013, within the region of the Nankai Trough, which remains a spot of seismic danger (SSD). References 1. Lyubushin A. Multifractal Parameters of Low-Frequency Microseisms // V. de Rubeis et al. (eds.), Synchronization and Triggering: from Fracture to Earthquake Processes, GeoPlanet: Earth and Planetary Sciences 1, DOI 10.1007/978-3-642-12300-9_15, Springer-Verlag Berlin Heidelberg, 2010, 388p., Chapter 15, pp. 253-272. http://www.springerlink.com/content/hj2l211577533261/ 2. Lyubushin A.A. Synchronization of multifractal parameters of regional and global low-frequency microseisms - European Geosciences Union General Assembly 2010, Vienna, 02-07 May 2010, Geophysical Research Abstracts, Vol. 12, EGU2010-696, 2010. http://meetingorganizer.copernicus.org/EGU2010/EGU2010-696.pdf 3. Lyubushin A.A. Synchronization phenomena of low-frequency microseisms. European Seismological Commission, 32nd General Assembly, September 06-10, 2010, Montpellier, France. Book of abstracts, p. 124, session ES6. http://alexeylyubushin.narod.ru/ESC-2010_Book_of_abstracts.pdf 4. Lyubushin A.A. Seismic Catastrophe in Japan on March 11, 2011: Long-Term Prediction on the Basis of Low-Frequency Microseisms - Izvestiya, Atmospheric and Oceanic Physics, 2011, Vol. 46, No. 8, pp. 904-921. http://www.springerlink.com/content/kq53j2667024w715/ 5. Lyubushin, A. Prognostic properties of low-frequency seismic noise. Natural Science, 4, 659-666. doi: 10.4236/ns.2012.428087. http://www.scirp.org/journal/PaperInformation.aspx?paperID=21656
New Technology Changing The Face of Mobile Seismic Networks
NASA Astrophysics Data System (ADS)
Brisbourne, A.; Denton, P.; Seis-Uk
SEIS-UK, a seismic equipment pool and data management facility run by a consortium of four UK universities (Leicester, Leeds, Cambridge and Royal Holloway, London), completed its second phase in 2001. To complement the existing broadband equipment pool, which has been deployed to full capacity to date, the consortium undertook a tender evaluation process for low-power, lightweight sensors and recorders, for use on both controlled-source and passive seismic experiments. The preferred option, selected by the consortium, was the Guralp CMG-6TD system, with 150 systems ordered. The CMG-6TD system is a new concept in temporary seismic equipment. A 30 s-100 Hz force-feedback sensor, integral 24-bit digitiser and 3-4 Gbyte of solid-state memory are all housed in a single unit. Use of the most recent technologies has kept the power consumption to below 1 W and the weight to 3.5 kg per unit. The concept of the disk-swap procedure for obtaining data from the field has been superseded by a fast data download technique using firewire technology. This allows for rapid station servicing, essential when 150 stations are in use, and also ensures the environmental integrity of the system by removing the requirement for a disk access port and environmentally exposed data disk. The system therefore meets the criteria for controlled-source and passive seismic experiments: (1) the single-unit concept and low weight are designed for rapid deployment on short-term projects; (2) the low power consumption reduces the power-supply requirements, facilitating deployment; (3) the low self-noise and bandwidth of the sensor make it applicable to passive experiments involving natural sources. Further to this acquisition process, in collaboration with external groups, the SEIS-UK data management procedures have been streamlined with the integration of the Guralp GCF format data into the PASSCAL PDB software. This allows for rapid dissemination of field data and the production of archive-ready datasets
NASA Astrophysics Data System (ADS)
Viens, L.; Miyake, H.; Koketsu, K.
2016-12-01
Large subduction earthquakes have the potential to generate strong long-period ground motions. The ambient seismic field, also called seismic noise, contains information about the elastic response of the Earth between two seismic stations that can be retrieved using seismic interferometry. The DONET1 network, which is composed of 20 offshore stations, has been deployed atop the Nankai subduction zone, Japan, to continuously monitor the seismotectonic activity in this highly seismically active region. The surrounding onshore area is covered by hundreds of seismic stations, which are operated by the National Research Institute for Earth Science and Disaster Prevention (NIED) and the Japan Meteorological Agency (JMA), with a spacing of 15-20 km. We retrieve offshore-onshore Green's functions from the ambient seismic field using the deconvolution technique and use them to simulate the long-period ground motions of moderate subduction earthquakes that occurred at shallow depth. We extend the point source method, which is appropriate for moderate events, to finite source modeling to simulate the long-period ground motions of large Mw 7 class earthquake scenarios. The source models are constructed using scaling relations between moderate and large earthquakes to discretize the fault plane of the large hypothetical events into subfaults. Offshore-onshore Green's functions are spatially interpolated over the fault plane to obtain one Green's function for each subfault. The interpolated Green's functions are finally summed up considering different rupture velocities. Results show that this technique can provide additional information about earthquake ground motions that can be used with the existing physics-based simulations to improve seismic hazard assessment.
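A hedged sketch of the deconvolution-based interferometry step described above: the spectrum of one station's noise record is deconvolved by the other's, with a water level to stabilize the division. The synthetic traces and the water-level value are illustrative; this is not the authors' processing code.

```python
import numpy as np

def deconvolution_interferometry(u_receiver, u_source, water_level=0.01):
    """Estimate an inter-station Green's function by spectral deconvolution:
    D(w) = U_r(w) * conj(U_s(w)) / (|U_s(w)|^2 + eps), where eps is a
    water level preventing division by near-zero spectral amplitudes."""
    U_r = np.fft.rfft(u_receiver)
    U_s = np.fft.rfft(u_source)
    power = np.abs(U_s) ** 2
    eps = water_level * power.max()
    D = U_r * np.conj(U_s) / (power + eps)
    return np.fft.irfft(D, n=len(u_receiver))

# Illustrative use: one day of 1-Hz noise recorded onshore deconvolved by the
# simultaneous record at an offshore DONET-like station (synthetic here).
rng = np.random.default_rng(0)
offshore = rng.standard_normal(86400)
onshore = np.convolve(offshore, np.exp(-np.arange(200) / 30.0), mode="same")
gf = deconvolution_interferometry(onshore, offshore)
```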
Seismic signature of active intrusions in mountain chains.
Di Luccio, Francesca; Chiodini, Giovanni; Caliro, Stefano; Cardellini, Carlo; Convertito, Vincenzo; Pino, Nicola Alessandro; Tolomei, Cristiano; Ventura, Guido
2018-01-01
Intrusions are a ubiquitous component of mountain chains and testify to the emplacement of magma at depth. Understanding the emplacement and growth mechanisms of intrusions, such as diapiric or dike-like ascent, is critical to constrain the evolution and structure of the crust. Petrological and geological data allow us to reconstruct magma pathways and long-term magma differentiation and assembly processes. However, our ability to detect and reconstruct the short-term dynamics related to active intrusive episodes in mountain chains is embryonic, lacking recognized geophysical signals. We analyze an anomalously deep seismic sequence (maximum magnitude 5) characterized by low-frequency bursts of earthquakes that occurred in 2013 in the Apennine chain in Italy. We provide seismic evidence of fluid involvement in the earthquake nucleation process and identify a thermal anomaly in aquifers where CO2 of magmatic origin dissolves. We show that the intrusion of dike-like bodies in mountain chains may trigger earthquakes with magnitudes that may be relevant to seismic hazard assessment. These findings provide a new perspective on the emplacement mechanisms of intrusive bodies and the interpretation of the seismicity in mountain chains.
Subband Coding Methods for Seismic Data Compression
NASA Technical Reports Server (NTRS)
Kiely, A.; Pollara, F.
1995-01-01
This paper presents a study of seismic data compression techniques and a compression algorithm based on subband coding. The compression technique described could be used as a progressive transmission system, where successive refinements of the data can be requested by the user. This allows seismologists to first examine a coarse version of waveforms with minimal usage of the channel and then decide where refinements are required. Rate-distortion performance results are presented and comparisons are made with two block transform methods.
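As a hedged illustration of the subband-coding idea, the sketch below performs a single-level two-band (Haar) analysis and synthesis of a trace; in a progressive-transmission setting the low band would be sent first and the detail band later. The actual paper's filter bank and coder are not reproduced here.

```python
import numpy as np

def haar_analysis(x):
    """One-level two-band split: low-pass (coarse) and high-pass (detail) subbands."""
    x = np.asarray(x, dtype=float)
    if len(x) % 2:                      # pad to even length
        x = np.append(x, x[-1])
    low = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    high = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return low, high

def haar_synthesis(low, high):
    """Perfect reconstruction from the two subbands."""
    x = np.empty(2 * len(low))
    x[0::2] = (low + high) / np.sqrt(2.0)
    x[1::2] = (low - high) / np.sqrt(2.0)
    return x

# A coarse preview (progressive transmission) keeps only the low band;
# refinements later add the detail band.
trace = np.sin(np.linspace(0, 20 * np.pi, 1024)) + 0.1 * np.random.randn(1024)
low, high = haar_analysis(trace)
print(np.allclose(haar_synthesis(low, high), trace))
```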
Connecting crustal seismicity and earthquake-driven stress evolution in Southern California
Pollitz, Fred; Cattania, Camilla
2017-01-01
Tectonic stress in the crust evolves during a seismic cycle, with slow stress accumulation over interseismic periods, episodic stress steps at the time of earthquakes, and transient stress readjustment during a postseismic period that may last months to years. Static stress transfer to surrounding faults has been well documented to alter regional seismicity rates over both short and long time scales. While static stress transfer is instantaneous and long lived, postseismic stress transfer driven by viscoelastic relaxation of the ductile lower crust and mantle leads to additional, slowly varying stress perturbations. Both processes may be tested by comparing a decade-long record of regional seismicity to predicted time-dependent seismicity rates based on a stress evolution model that includes viscoelastic stress transfer. Here we explore crustal stress evolution arising from the seismic cycle in Southern California from 1981 to 2014 using five M≥6.5 source earthquakes: the M7.3 1992 Landers, M6.5 1992 Big Bear, M6.7 1994 Northridge, M7.1 1999 Hector Mine, and M7.2 2010 El Mayor-Cucapah earthquakes. We relate the stress readjustment in the surrounding crust generated by each quake to regional seismicity using rate-and-state friction theory. Using a log likelihood approach, we quantify the potential to trigger seismicity of both static and viscoelastic stress transfer, finding that both processes have systematically shaped the spatial pattern of Southern California seismicity since 1992.
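A hedged sketch of the rate-and-state ingredient mentioned above, using the Dieterich (1994) expression for the seismicity-rate response to a single static Coulomb stress step; the parameter values (A*sigma, background stressing rate) are illustrative placeholders, not those of the study.

```python
import numpy as np

def dieterich_rate(t_years, stress_step_mpa, a_sigma_mpa=0.05,
                   stressing_rate_mpa_yr=0.01, r_background=1.0):
    """Dieterich (1994) seismicity-rate response to a static Coulomb stress step:
    R(t)/r = 1 / ((exp(-dCFS/(A*sigma)) - 1) * exp(-t/ta) + 1), ta = A*sigma/stressing_rate."""
    ta = a_sigma_mpa / stressing_rate_mpa_yr        # aftershock relaxation time (years)
    gamma = np.exp(-stress_step_mpa / a_sigma_mpa) - 1.0
    return r_background / (gamma * np.exp(-np.asarray(t_years) / ta) + 1.0)

# Illustrative: rate evolution over 20 years after a +0.1 MPa stress step.
t = np.linspace(0.01, 20.0, 200)
print(dieterich_rate(t, stress_step_mpa=0.1)[:5])
```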
Deep-towed high resolution seismic imaging II: Determination of P-wave velocity distribution
NASA Astrophysics Data System (ADS)
Marsset, B.; Ker, S.; Thomas, Y.; Colin, F.
2018-02-01
The acquisition of high resolution seismic data in deep waters requires the development of deep-towed seismic sources and receivers able to deal with the high hydrostatic pressure environment. The low-frequency piezoelectric transducer of the SYSIF (SYstème Sismique Fond) deep-towed seismic device complies with the former requirement, taking advantage of the coupling of a mechanical resonance (Janus driver) and a fluid resonance (Helmholtz cavity) to produce a large frequency bandwidth acoustic signal (220-1050 Hz). The ability to perform deep-towed multichannel seismic imaging with SYSIF was demonstrated in 2014; however, the ability to determine the P-wave velocity distribution was not achieved. P-wave velocity analysis relies on the ratio between the source-receiver offset range and the depth of the seismic reflectors, thus towing the seismic source and receivers closer to the sea bed will provide a better geometry for P-wave velocity determination. Yet technical issues, related to the acoustic source directivity, arise for this approach in the particular framework of piezoelectric sources. A signal processing sequence is therefore added to the initial processing flow. Data acquisition took place during the GHASS (Gas Hydrates, fluid Activities and Sediment deformations in the western Black Sea) cruise in the Romanian waters of the Black Sea. The results of the imaging processing are presented for two seismic data sets acquired over gas hydrates and gas-bearing sediments. The improvement in the final seismic resolution demonstrates the validity of the velocity model.
Hawaiian Volcano Observatory seismic data, January to December 2005
Nakata, Jennifer S.
2006-01-01
The Hawaiian Volcano Observatory (HVO) summary presents seismic data gathered during the year. The seismic summary is offered without interpretation as a source of preliminary data. It is complete in the sense that most data for events of M≥1.5 routinely gathered by the Observatory are included. The HVO summaries have been published in various forms since 1956. Summaries prior to 1974 were issued quarterly, but cost, convenience of preparation and distribution, and the large quantities of data dictated an annual publication beginning with Summary 74 for the year 1974. Summary 86 (the introduction of CUSP at HVO) includes a description of the seismic instrumentation, calibration, and processing used in recent years. Beginning with 2004, summaries will simply be identified by the year, rather than Summary number. The present summary includes background information on the seismic network and processing to allow use of the data and to provide an understanding of how they were gathered. A report by Klein and Koyanagi (1980) tabulates instrumentation, calibration, and recording history of each seismic station in the network. It is designed as a reference for users of seismograms and phase data and includes and augments the information in the station table in this summary.
Hawaiian Volcano Observatory Seismic Data, January to December 2006
Nakata, Jennifer
2007-01-01
Introduction The Hawaiian Volcano Observatory (HVO) summary presents seismic data gathered during the year. The seismic summary is offered without interpretation as a source of preliminary data. It is complete in the sense that most data for events of M>1.5 routinely gathered by the Observatory are included. The HVO summaries have been published in various forms since 1956. Summaries prior to 1974 were issued quarterly, but cost, convenience of preparation and distribution, and the large quantities of data dictated an annual publication beginning with Summary 74 for the year 1974. Summary 86 (the introduction of CUSP at HVO) includes a description of the seismic instrumentation, calibration, and processing used in recent years. Beginning with 2004, summaries are simply identified by the year, rather than Summary number. The present summary includes background information on the seismic network and processing to allow use of the data and to provide an understanding of how they were gathered. A report by Klein and Koyanagi (1980) tabulates instrumentation, calibration, and recording history of each seismic station in the network. It is designed as a reference for users of seismograms and phase data and includes and augments the information in the station table in this summary.
Diffraction Seismic Imaging of the Chalk Group Reservoir Rocks
NASA Astrophysics Data System (ADS)
Montazeri, M.; Fomel, S.; Nielsen, L.
2016-12-01
In this study we investigate seismic diffracted waves instead of seismic reflected waves, which are usually much stronger and carry most of the information regarding subsurface structures. The goal of this study is to improve imaging of small subsurface features such as faults and fractures. Moreover, we focus on the Chalk Group, which contains important groundwater resources onshore and oil and gas reservoirs in the Danish sector of the North Sea. Finding optimum seismic velocity models for the Chalk Group and estimating high-quality stacked sections with conventional processing methods are challenging tasks. Here, we try to filter out as many undesired arrivals as possible before stacking the seismic data. Further, a plane-wave destruction method is applied to the seismic stack in order to dampen the reflection events and thereby enhance the visibility of the diffraction events. After this initial processing, we estimate the optimum migration velocity using diffraction events in order to obtain a better-resolution stack. The results from this study demonstrate how diffraction imaging can be used as an additional tool for improving the images of small-scale features in the Chalk Group reservoir, in particular faults and fractures. Moreover, we discuss the potential of applying this approach in future studies focused on such reservoirs.
Development of Laboratory Seismic Exploration Experiment for Education and Demonstration
NASA Astrophysics Data System (ADS)
Kuwano, O.; Nakanishi, A.
2016-12-01
We developed a laboratory experiment to simulate a seismic refraction survey for educational purposes. The experiment is a tabletop-scale experiment using soft hydrogel as an analogue material for a layered crust, so we can conduct the seismic exploration experiment in a laboratory or a classroom. The softness and the transparency of the gel material enable us to observe the wave propagation with our naked eyes, using the photoelastic technique. By analyzing the waveforms obtained by image analysis of the movie of the experiment, one can estimate the velocities and the structure of the gel specimen in the same way as in an actual seismic survey. We report details of the practical course and the public outreach activities using the experiment.
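Since the gel data are interpreted in the same way as an actual refraction survey, a hedged sketch of the classic two-layer intercept-time interpretation is given below; the synthetic picks, velocities and interface depth are illustrative.

```python
import numpy as np

def two_layer_refraction(offsets_m, picks_s, crossover_m):
    """Classic intercept-time interpretation of a two-layer refraction survey:
    fit the direct wave (x < crossover) and head wave (x > crossover) branches,
    then depth to the interface h = ti*v1*v2 / (2*sqrt(v2**2 - v1**2))."""
    offsets = np.asarray(offsets_m, dtype=float)
    picks = np.asarray(picks_s, dtype=float)
    direct = offsets < crossover_m
    s1, _ = np.polyfit(offsets[direct], picks[direct], 1)       # slope = 1/v1
    s2, ti = np.polyfit(offsets[~direct], picks[~direct], 1)    # slope = 1/v2, intercept ti
    v1, v2 = 1.0 / s1, 1.0 / s2
    h = ti * v1 * v2 / (2.0 * np.sqrt(v2 ** 2 - v1 ** 2))
    return v1, v2, h

# Synthetic first-arrival picks for v1=1000 m/s, v2=2500 m/s, interface at 20 m depth.
x = np.arange(5.0, 120.0, 5.0)
v1_true, v2_true, h_true = 1000.0, 2500.0, 20.0
ti_true = 2.0 * h_true * np.sqrt(v2_true**2 - v1_true**2) / (v1_true * v2_true)
picks = np.minimum(x / v1_true, x / v2_true + ti_true)
xcross = 2.0 * h_true * np.sqrt((v2_true + v1_true) / (v2_true - v1_true))
print(two_layer_refraction(x, picks, xcross))   # recovers ~ (1000, 2500, 20)
```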
Multivariate Analysis of Seismic Field Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alam, M. Kathleen
1999-06-01
This report includes the details of the model building procedure and prediction of seismic field data. Principal Components Regression, a multivariate analysis technique, was used to model seismic data collected as two pieces of equipment were cycled on and off. Models built that included only the two pieces of equipment of interest had trouble predicting data containing signals not included in the model. Evidence for poor predictions came from the prediction curves as well as spectral F-ratio plots. Once the extraneous signals were included in the model, predictions improved dramatically. While Principal Components Regression performed well for the present data sets, the present data analysis suggests further work will be needed to develop more robust modeling methods as the data become more complex.
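A hedged numpy sketch of Principal Components Regression in the spirit of the report (center, project onto leading principal components, regress on the scores); the synthetic "two machines cycling on and off" data are illustrative, and this is not the report's code.

```python
import numpy as np

def pcr_fit(X, y, n_components):
    """Principal Components Regression: project centered predictors onto the
    leading principal components, then solve an ordinary least-squares fit
    in that reduced score space."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc = X - x_mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_components].T                 # loadings (features x components)
    scores = Xc @ V
    coefs, *_ = np.linalg.lstsq(scores, y - y_mean, rcond=None)
    return {"mean_x": x_mean, "mean_y": y_mean, "loadings": V, "coefs": coefs}

def pcr_predict(model, X_new):
    scores = (X_new - model["mean_x"]) @ model["loadings"]
    return scores @ model["coefs"] + model["mean_y"]

# Synthetic "seismic channels" driven by two pieces of equipment cycling on and off.
rng = np.random.default_rng(1)
t = np.arange(200)
machine_a = (np.sin(2 * np.pi * t / 40) > 0).astype(float)
machine_b = (np.sin(2 * np.pi * t / 25) > 0).astype(float)
X = (np.outer(machine_a, rng.standard_normal(50))
     + np.outer(machine_b, rng.standard_normal(50))
     + 0.1 * rng.standard_normal((200, 50)))
y = 3.0 * machine_a - 2.0 * machine_b + 0.05 * rng.standard_normal(200)
model = pcr_fit(X, y, n_components=2)
print(np.corrcoef(pcr_predict(model, X), y)[0, 1])   # close to 1
```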
McLaskey, Gregory C.; Lockner, David A.; Kilgore, Brian D.; Beeler, Nicholas M.
2015-01-01
We describe a technique to estimate the seismic moment of acoustic emissions and other extremely small seismic events. Unlike previous calibration techniques, it does not require modeling of the wave propagation, sensor response, or signal conditioning. Rather, this technique calibrates the recording system as a whole and uses a ball impact as a reference source or empirical Green's function. To correctly apply this technique, we develop mathematical expressions that link the seismic moment $M_0$ of internal seismic sources (i.e., earthquakes and acoustic emissions) to the impulse, or change in momentum $\Delta p$, of externally applied seismic sources (i.e., meteor impacts or, in this case, ball impact). We find that, at low frequencies, moment and impulse are linked by a constant, which we call the force-moment-rate scale factor $C_{F\dot{M}} = M_0/\Delta p$. This constant is equal to twice the speed of sound in the material from which the seismic sources were generated. Next, we demonstrate the calibration technique on two different experimental rock mechanics facilities. The first example is a saw-cut cylindrical granite sample that is loaded in a triaxial apparatus at 40 MPa confining pressure. The second example is a 2 m long fault cut in a granite sample and deformed in a large biaxial apparatus at lower stress levels. Using the empirical calibration technique, we are able to determine absolute source parameters including the seismic moment, corner frequency, stress drop, and radiated energy of these magnitude −2.5 to −7 seismic events.
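A hedged numeric sketch of the calibration relation stated above, $M_0 = C_{F\dot{M}} \Delta p$ with $C_{F\dot{M}}$ equal to twice the speed of sound: the ball mass, impact velocities and the sound speed used below are illustrative placeholders, not the paper's measured values.

```python
import math

def ball_impact_impulse(mass_kg, v_incoming_m_s, v_rebound_m_s):
    """Change in momentum of the ball during impact (incoming and rebound
    speeds taken as positive magnitudes)."""
    return mass_kg * (v_incoming_m_s + v_rebound_m_s)

def reference_moment(impulse_n_s, sound_speed_m_s):
    """Force-moment-rate scale factor quoted above: M0 = C * dp with C = 2*c."""
    return 2.0 * sound_speed_m_s * impulse_n_s

def moment_magnitude(m0_n_m):
    return (2.0 / 3.0) * (math.log10(m0_n_m) - 9.1)

# Illustrative: 1 g steel ball dropped on granite (assumed sound speed ~5500 m/s).
dp = ball_impact_impulse(0.001, 1.5, 0.9)
m0 = reference_moment(dp, 5500.0)
print(m0, moment_magnitude(m0))   # moment in N*m and a magnitude near -5
```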
Back-Projection Imaging of extended, diffuse seismic sources in volcanic and hydrothermal systems
NASA Astrophysics Data System (ADS)
Kelly, C. L.; Lawrence, J. F.; Beroza, G. C.
2017-12-01
Volcanic and hydrothermal systems exhibit a wide range of seismicity that is directly linked to fluid and volatile activity in the subsurface and that can be indicative of imminent hazardous activity. Seismograms recorded near volcanic and hydrothermal systems are typically "noisy", but in fact these complex signals are generated by many overlapping low-magnitude displacements and pressure changes at depth. Unfortunately, excluding times of high-magnitude eruptive activity, which typically occur infrequently relative to the length of a system's entire eruption cycle, these signals often have very low signal-to-noise ratios and are difficult to identify and study using established seismic analysis techniques (i.e. phase-picking, template matching). Arrays of short-period and broadband seismic sensors are proven tools for monitoring short- and long-term changes in volcanic and hydrothermal systems. Time-reversal techniques (i.e. back-projection) that are improved by additional seismic observations have been successfully applied to locating volcano-seismic sources recorded by dense sensor arrays. We present results from a new computationally efficient back-projection method that allows us to image the evolution of extended, diffuse sources of volcanic and hydrothermal seismicity. We correlate short time-window seismograms from receiver pairs to find coherent signals and propagate them back in time to potential source locations in a 3D subsurface model. The strength of coherent seismic signal associated with any potential source-receiver-receiver geometry is equal to the correlation of the short time windows of seismic records at appropriate time lags as determined by the velocity structure and ray paths. We stack (sum) all short time-window correlations from all receiver pairs to determine the cumulative coherence of signals at each potential source location. Through stacking, coherent signals from extended and/or repeating sources of short-period energy
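A hedged, heavily simplified time-domain sketch of the correlation-and-stack back-projection described above, assuming a homogeneous velocity model and a small synthetic network; it is meant only to illustrate the stacking of receiver-pair correlations at predicted lags, not to reproduce the authors' method.

```python
import numpy as np

def backproject(records, station_xy, grid_xy, velocity, dt):
    """Simplified back-projection: for each candidate source, sum over station
    pairs the zero-mean cross-correlation of their records evaluated at the
    lag predicted by the (homogeneous) velocity model."""
    recs = [r - r.mean() for r in records]
    nsta = len(recs)
    image = np.zeros(len(grid_xy))
    for k, src in enumerate(grid_xy):
        tt = np.linalg.norm(station_xy - src, axis=1) / velocity   # travel times
        coh = 0.0
        for i in range(nsta):
            for j in range(i + 1, nsta):
                lag = int(round((tt[i] - tt[j]) / dt))
                a, b = recs[i], recs[j]
                if lag >= 0:
                    coh += np.dot(a[lag:], b[:len(b) - lag])
                else:
                    coh += np.dot(a[:lag], b[-lag:])
        image[k] = coh
    return image

# Synthetic check: the stack should peak at the true source location.
dt, v = 0.005, 2.0
stations = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0], [2.0, 2.0]])
true_src = np.array([1.0, 0.5])
t = np.arange(0, 4.0, dt)
tt_true = np.linalg.norm(stations - true_src, axis=1) / v
records = [np.exp(-0.5 * ((t - 1.0 - d) / 0.02) ** 2) for d in tt_true]
grid = np.array([[1.0, 0.5], [0.2, 1.8]])
print(backproject(records, stations, grid, v, dt))
```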
Hawaiian Volcano Observatory Seismic Data, January to December 2008
Nakata, Jennifer S.; Okubo, Paul G.
2009-01-01
The U.S. Geological Survey (USGS), Hawaiian Volcano Observatory (HVO) summary presents seismic data gathered during the year. The seismic summary is offered without interpretation as a source of preliminary data and is complete in that most data for events of M greater than 1.5 are included. All latitude and longitude references in this report are stated in Old Hawaiian Datum. The HVO summaries have been published in various forms since 1956. Summaries prior to 1974 were issued quarterly, but cost, convenience of preparation and distribution, and the large quantities of data necessitated an annual publication, beginning with Summary 74 for the year 1974. Beginning in 2004, summaries are simply identified by the year, rather than by summary number. Summaries originally issued as administrative reports were republished in 2007 as Open-File Reports. All the summaries since 1956 are listed at http://geopubs.wr.usgs.gov/ (last accessed 09/21/2009). In January 1986, HVO adopted CUSP (California Institute of Technology USGS Seismic Processing). Summary 86 includes a description of the seismic instrumentation, calibration, and processing used in recent years. The present summary includes background information about the seismic network to provide the end user an understanding of the processing parameters and how the data were gathered. A report by Klein and Koyanagi (1980) tabulates instrumentation, calibration, and recording history of each seismic station in the network. It is designed as a reference for users of seismograms and phase data and includes and augments the information in the station table in this summary. Figures 11-14 are maps showing computer-located hypocenters. The maps were generated using the Generic Mapping Tools (GMT http://gmt.soest.hawaii.edu/, last accessed 09/21/2009) in place of traditional Qplot maps.
Adding seismic broadband analysis to characterize Andean backarc seismicity in Argentina
NASA Astrophysics Data System (ADS)
Alvarado, P.; Giuliano, A.; Beck, S.; Zandt, G.
2007-05-01
Characterization of the highly seismically active Andean backarc is crucial for assessment of earthquake hazards in western Argentina. Moderate-to-large crustal earthquakes have caused several deaths, damage and drastic economic consequences in Argentinean history. We have studied the Andean backarc crust between 30°S and 36°S using seismic broadband data available from a previous IRIS-PASSCAL experiment ("CHARGE"). We collected more than 12 terabytes of continuous seismic data from 22 broadband instruments deployed across Chile and Argentina during 1.5 years. Using free software we modeled full regional broadband waveforms and obtained seismic moment tensor inversions of crustal earthquakes, testing for the best focal depth for each event. We also mapped differences in the Andean backarc crustal structure and found a clear correlation with different types of crustal seismicity (i.e. focal depths, focal mechanisms, magnitudes and frequencies of occurrence) and previously mapped terrane boundaries. We now plan to use the same methodology to study other regions in Argentina using near-real-time broadband data available from the national seismic (INPRES) network and global seismic networks operating in the region. We will re-design the national seismic network to optimize short-period and broadband seismic station coverage for different network purposes. This work is an international effort that involves researchers and students from universities and national government agencies with the goal of providing more information about earthquake hazards in western Argentina.
NASA Astrophysics Data System (ADS)
Harjes, H.-P.; Bram, K.; Dürbaum, H.-J.; Gebrande, H.; Hirschmann, G.; Janik, M.; Klöckner, M.; Lüschen, E.; Rabbel, W.; Simon, M.; Thomas, R.; Tormann, J.; Wenzel, F.
1997-08-01
For almost 10 years the KTB superdeep drilling project has offered an excellent field laboratory for adapting seismic techniques to crystalline environments and for testing new ideas for interpreting seismic reflections in terms of lithological or textural properties of metamorphic rock units. The seismic investigations culminated in a three-dimensional (3-D) reflection survey on a 19×19 km area with the drill site at its center. Interpretation of these data resulted in a detailed, structural model of the German Continental Deep Drilling Program (KTB) location with dominant, steep faults in the upper crust. The 3-D reflection survey was part of a suite of seismic experiments, ranging from wide-angle reflection and refraction profiles to standard vertical seismic profiles (VSP) and more sophisticated surface-to-borehole observations. It was predicted that the drill bit would meet the most prominent, steeply dipping, crustal reflector at a depth of about 6500-7000 m, and indeed, the borehole penetrated a major fault zone in the depth interval between 6850 and 7300 m. This reflector offered the rare opportunity to relate logging results, reflective properties, and geology to observed and modeled data. Post-Variscan thrusting caused cataclastic deformation, with partial, strong alterations within a steeply dipping reverse fault zone. This process generated impedance contrasts within the fault zone on a lateral scale large enough to cause seismic reflections. This was confirmed by borehole measurements along the whole 9.1 km deep KTB profile. The strongest, reflected signals originated from fluid-filled fractures and cataclastic fracture zones rather than from lithological boundaries (i.e., first-order discontinuities between different rock types) or from texture- and/or foliation-induced anisotropy. During the interpretation of seismic data at KTB several lessons were learned: Conventional processing of two-dimensional (2-D) reflection data from a presite survey
NASA Astrophysics Data System (ADS)
De Luca, Claudio; Zinno, Ivana; Manunta, Michele; Lanari, Riccardo; Casu, Francesco
2016-04-01
The microwave remote sensing scenario is rapidly evolving through development of new sensor technology for Earth Observation (EO). In particular, Sentinel-1A (S1A) is the first of a sensors' constellation designed to provide a satellite data stream for the Copernicus European program. Sentinel-1A has been specifically designed to provide, over land, Differential Interferometric Synthetic Aperture Radar (DInSAR) products to analyze and investigate Earth's surface displacements. S1A peculiarities include wide ground coverage (250 km of swath), C-band operational frequency and short revisit time (which will reduce from 12 to 6 days when the twin system Sentinel-1B is placed in orbit during 2016). Such characteristics, together with the global coverage acquisition policy, make the Sentinel-1 constellation extremely suitable for studying and monitoring volcanic and seismic areas worldwide, thus allowing the generation of both ground displacement information with increasing rapidity and new geological understanding. The main acquisition mode over land is the so-called Interferometric Wide Swath (IWS), which is based on the Terrain Observation by Progressive Scans (TOPS) technique and which guarantees the mentioned S1A large coverage characteristics at the expense of a non-trivial interferometric processing. Moreover, the satellite spatial coverage and the reduced revisit time will lead to an exponential increase of the data archives that, after the launch of Sentinel-1B, will reach about 3 TB per day. Therefore, the EO scientific community needs, on the one hand, automated and effective DInSAR tools able to address the S1A processing complexity and, on the other hand, the computing and storage capacities to handle the expected large amount of data. It is therefore becoming crucial to move processors and tools close to the satellite archives, since the approach of downloading and processing data with in-house computing facilities is no longer efficient. To address
A vector scanning processing technique for pulsed laser velocimetry
NASA Technical Reports Server (NTRS)
Wernet, Mark P.; Edwards, Robert V.
1989-01-01
Pulsed-laser-sheet velocimetry yields two-dimensional velocity vectors across an extended planar region of a flow. Current processing techniques offer high-precision (1-percent) velocity estimates, but can require hours of processing time on specialized array processors. Sometimes, however, a less accurate (about 5 percent) data-reduction technique which also gives unambiguous velocity vector information is acceptable. Here, a direct space-domain processing technique is described and shown to be far superior to previous methods in achieving these objectives. It uses a novel data coding and reduction technique and has no 180-deg directional ambiguity. A complex convection vortex flow was recorded and completely processed in under 2 min on an 80386-based PC, producing a two-dimensional velocity-vector map of the flowfield. Pulsed-laser velocimetry data can thus be reduced quickly and reasonably accurately, without specialized array processing hardware.
High vertical resolution crosswell seismic imaging
Lazaratos, Spyridon K.
1999-12-07
A method for producing high vertical resolution seismic images from crosswell data is disclosed. In accordance with one aspect of the disclosure, a set of vertically spaced, generally horizontally extending continuous layers and associated nodes are defined within a region between two boreholes. The specific number of nodes is selected such that the value of a particular characteristic of the subterranean region at each of the nodes is one which can be determined from the seismic data. Once values are established at the nodes, values of the particular characteristic are assigned to positions between the node points of each layer based on the values at the nodes within that layer and without regard to the values at node points within any other layer. A seismic map is produced using the node values and the assigned values therebetween. In accordance with another aspect of the disclosure, an approximate model of the region is established using direct arrival traveltime data. Thereafter, the approximate model is adjusted using reflected arrival data. In accordance with still another aspect of the disclosure, correction is provided for well deviation. An associated technique which provides improvements in ray tracing is also disclosed.
Advanced Communication Processing Techniques
NASA Astrophysics Data System (ADS)
Scholtz, Robert A.
This document contains the proceedings of the workshop Advanced Communication Processing Techniques, held May 14 to 17, 1989, near Ruidoso, New Mexico. Sponsored by the Army Research Office (under Contract DAAL03-89-G-0016) and organized by the Communication Sciences Institute of the University of Southern California, the workshop had as its objective to determine those applications of intelligent/adaptive communication signal processing that have been realized and to define areas of future research. We at the Communication Sciences Institute believe that there are two emerging areas which deserve considerably more study in the near future: (1) Modulation characterization, i.e., the automation of modulation format recognition so that a receiver can reliably demodulate a signal without using a priori information concerning the signal's structure, and (2) the incorporation of adaptive coding into communication links and networks. (Encoders and decoders which can operate with a wide variety of codes exist, but the way to utilize and control them in links and networks is an issue). To support these two new interest areas, one must have both a knowledge of (3) the kinds of channels and environments in which the systems must operate, and of (4) the latest adaptive equalization techniques which might be employed in these efforts.
Seismic methods are the most commonly conducted geophysical surveys for engineering investigations. Seismic refraction provides engineers and geologists with the most basic of geologic data via simple procedures with common equipment.
On the use of a laser ablation as a laboratory seismic source
NASA Astrophysics Data System (ADS)
Shen, Chengyi; Brito, Daniel; Diaz, Julien; Zhang, Deyuan; Poydenot, Valier; Bordes, Clarisse; Garambois, Stéphane
2017-04-01
Near-surface seismic imaging mimicked in well-controlled laboratory conditions is potentially a powerful tool to study large-scale wave propagation in geological media by means of upscaling. Laboratory measurements are indeed particularly suited for tests of theoretical modellings and comparisons with numerical approaches. We have developed an automated Laser Doppler Vibrometer (LDV) platform, which is able to detect and register broadband nano-scale displacements on the surface of various materials. This laboratory equipment has already been validated in experiments where piezoelectric transducers were used as seismic sources. We are currently exploring a new seismic source in our experiments, a laser ablation, in order to compensate for some drawbacks encountered with piezoelectric sources. The laser ablation source has been considered an interesting ultrasound wave generator since the 1960s. It was believed to have numerous potential applications such as Non-Destructive Testing (NDT) and the measurement of velocities and attenuations in solid samples. We aim to adapt and develop this technique for geophysical experimental investigations in order to produce and explore complete micro-seismic data sets in the laboratory. We will first present the laser characteristics including its mechanism, stability and reproducibility, and will evaluate in particular the directivity patterns of such a seismic source. We have started by applying the laser ablation source to the surfaces of multi-scale homogeneous aluminum samples and are now testing it on heterogeneous and fractured limestone cores. Some other results of data processing will also be shown, especially the 2D-slice VP and VS tomographic images obtained in limestone samples. Apart from the experimental records, numerical simulations will be carried out for both the laser source modelling and the wave propagation in different media. First attempts will be done to compare quantitatively the
Looking inside the microseismic cloud using seismic interferometry
NASA Astrophysics Data System (ADS)
Matzel, E.; Rhode, A.; Morency, C.; Templeton, D. C.; Pyle, M. L.
2015-12-01
Microseismicity provides a direct means of measuring the physical characteristics of active tectonic features such as fault zones. Thousands of microquakes are often associated with an active site. This cloud of microseismicity helps define the tectonically active region. When processed using novel geophysical techniques, we can isolate the energy sensitive to the faulting region itself. The virtual seismometer method (VSM) is a technique of seismic interferometry that provides precise estimates of the GF between earthquakes. In many ways the converse of ambient noise correlation, it is very sensitive to the source parameters (location, mechanism and magnitude) and to the Earth structure in the source region. In a region with 1000 microseisms, we can calculate roughly 500,000 waveforms sampling the active zone. At the same time, VSM collapses the computation domain down to the size of the cloud of microseismicity, often by 2-3 orders of magnitude. In simple terms, VSM involves correlating the waveforms from a pair of events recorded at an individual station and then stacking the results over all stations to obtain the final result. In the far field, when most of the stations in a network fall along a line between the two events, the result is an estimate of the GF between the two, modified by the source terms. In this geometry each earthquake is effectively a virtual seismometer recording all the others. When applied to microquakes, this alignment is often not met, and we also need to address the effects of the geometry between the two microquakes relative to each seismometer. Nonetheless, the technique is quite robust and highly sensitive to the microseismic cloud. Using data from the Salton Sea geothermal region, we demonstrate the power of the technique, illustrating our ability to scale it from the far field, where sources are well separated, to the near field, where the source locations fall within each other's uncertainty ellipse. VSM provides better
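A hedged sketch of the correlate-and-stack operation that VSM performs in its simplest form; the synthetic event records are illustrative, and the geometry corrections and source-term handling discussed above are omitted.

```python
import numpy as np

def virtual_seismometer(event_a_traces, event_b_traces):
    """Virtual seismometer method, in its simplest form: cross-correlate the
    records of two events at each common station and stack the correlations
    over stations; the stack approximates the Green's function between the
    two sources, modulated by their source terms."""
    stack = None
    for tr_a, tr_b in zip(event_a_traces, event_b_traces):
        a = tr_a - tr_a.mean()
        b = tr_b - tr_b.mean()
        cc = np.correlate(a, b, mode="full")
        stack = cc if stack is None else stack + cc
    return stack / len(event_a_traces)

# Illustrative use with synthetic microearthquake records at 10 stations.
rng = np.random.default_rng(3)
event_a = [rng.standard_normal(1000) for _ in range(10)]
event_b = [rng.standard_normal(1000) for _ in range(10)]
vsm_trace = virtual_seismometer(event_a, event_b)
```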
SiSeRHMap v1.0: a simulator for mapped seismic response using a hybrid model
NASA Astrophysics Data System (ADS)
Grelle, G.; Bonito, L.; Lampasi, A.; Revellino, P.; Guerriero, L.; Sappa, G.; Guadagno, F. M.
2015-06-01
SiSeRHMap is a computerized methodology capable of drawing up prediction maps of seismic response. It was realized on the basis of a hybrid model which combines different approaches and models in a new and non-conventional way. These approaches and models are organized in a code architecture composed of five interdependent modules. A GIS (Geographic Information System) Cubic Model (GCM), which is a layered computational structure based on the concept of lithodynamic units and zones, aims at reproducing a parameterized layered subsoil model. A metamodeling process confers a hybrid nature on the methodology. In this process, the one-dimensional linear equivalent analysis produces acceleration response spectra of shear wave velocity-thickness profiles, defined as trainers, which are randomly selected in each zone. Subsequently, a numerical adaptive simulation model (Spectra) is optimized on the above trainer acceleration response spectra by means of a dedicated Evolutionary Algorithm (EA) and the Levenberg-Marquardt Algorithm (LMA) as the final optimizer. In the final step, the GCM Maps Executor module produces a serial map-set of stratigraphic seismic response at different periods, grid-solving the calibrated Spectra model. In addition, the spectral topographic amplification is also computed by means of a numerical prediction model. The latter is built to match the results of the numerical simulations related to isolated reliefs, using GIS topographic attributes. In this way, different sets of seismic response maps are developed, on which maps of seismic design response spectra are also defined by means of an enveloping technique.
A comparison of Q-factor estimation methods for marine seismic data
NASA Astrophysics Data System (ADS)
Kwon, J.; Ha, J.; Shin, S.; Chung, W.; Lim, C.; Lee, D.
2016-12-01
The seismic imaging technique draws information from inside the earth using seismic reflection and transmission data. This technique is an important method in geophysical exploration. Also, it has been employed widely as a means of locating oil and gas reservoirs because it offers information on geological media. There is much recent and active research into seismic attenuation and how it determines the quality of seismic imaging. Seismic attenuation is determined by various geological characteristics, through the absorption or scattering that occurs when the seismic wave passes through a geological medium. The seismic attenuation can be defined using an attenuation coefficient and represented as a non-dimensional variable known as the Q-factor. The Q-factor is a unique characteristic of a geological medium. It is a very important material property for oil and gas resource development. The Q-factor can be used to infer other characteristics of a medium, such as porosity, permeability and viscosity, and can directly indicate the presence of hydrocarbons to identify oil- and gas-bearing areas from the seismic data. There are various ways to estimate the Q-factor in three different domains. In the time domain, pulse amplitude decay, pulse rise time, and pulse broadening are representative. Logarithm spectral ratio (LSR), centroid frequency shift (CFS), and peak frequency shift (PFS) are used in the frequency domain. In the time-frequency domain, the Wavelet's Envelope Peak Instantaneous Frequency (WEPIF) is most frequently employed. In this study, we estimated and analyzed the Q-factor through numerical model tests using four methods: the LSR, CFS, PFS, and WEPIF. Before applying these four methods to observed data, we tested them on a numerical model. The numerical model test data are derived from NORSAR-2D, which is based on a ray-tracing algorithm, and we used reflection and normal-incidence surveys to calculate the Q-factor according to the array of sources and
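A hedged sketch of the logarithm spectral ratio (LSR) estimate, one of the four methods compared: the log ratio of amplitude spectra from two windows is fit linearly against frequency, and Q follows from the slope. The synthetic spectra below are illustrative.

```python
import numpy as np

def q_from_spectral_ratio(freqs, amp_early, amp_late, travel_time_diff):
    """Logarithm spectral ratio (LSR) Q estimate: for attenuation Q,
    ln(A_late/A_early) = -pi * f * dt / Q + const, so a linear fit of the
    log ratio against frequency gives Q = -pi * dt / slope."""
    log_ratio = np.log(amp_late / amp_early)
    slope, _ = np.polyfit(freqs, log_ratio, 1)
    return -np.pi * travel_time_diff / slope

# Synthetic check: Q = 80, 0.5 s of extra propagation between the two windows.
f = np.linspace(5.0, 60.0, 100)
q_true, dt_prop = 80.0, 0.5
a_early = np.ones_like(f)
a_late = a_early * np.exp(-np.pi * f * dt_prop / q_true)
print(q_from_spectral_ratio(f, a_early, a_late, dt_prop))   # ~80
```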
NASA Astrophysics Data System (ADS)
mouloud, Hamidatou
2016-04-01
The objective of this paper is to analyze the seismic activity and the statistical treatment of the seismicity catalog of the Constantine region between 1357 and 2014, comprising 7007 seismic events. Our research is a contribution to improving seismic risk management by evaluating the seismic hazard in north-east Algeria. In the present study, earthquake hazard maps for the Constantine region are calculated. Probabilistic seismic hazard analysis (PSHA) is classically performed through the Cornell approach by using a uniform earthquake distribution over the source area and a given magnitude range. This study aims at extending the PSHA approach to the case of a characteristic earthquake scenario associated with an active fault. The approach integrates PSHA with a high-frequency deterministic technique for the prediction of peak and spectral ground motion parameters in a characteristic earthquake. The method is based on the site-dependent evaluation of the probability of exceedance for the chosen strong-motion parameter. We propose five seismotectonic zones. Five steps are necessary: (i) identification of potential sources of future earthquakes, (ii) assessment of their geological, geophysical and geometric characteristics, (iii) identification of the attenuation pattern of seismic motion, (iv) calculation of the hazard at a site and finally (v) hazard mapping for a region. In this study, the procedure of earthquake hazard evaluation developed by Kijko and Sellevoll (1992) is used to estimate seismic hazard parameters in the northern part of Algeria.
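A hedged, heavily simplified single-source sketch of the Cornell-type hazard integral mentioned above, combining a truncated Gutenberg-Richter recurrence with a placeholder lognormal ground-motion model; none of the coefficients are regional values or those of Kijko and Sellevoll (1992).

```python
import math

def truncated_gr_pdf(m, b, m_min, m_max):
    """Probability density of magnitudes under a truncated Gutenberg-Richter law."""
    beta = b * math.log(10.0)
    norm = 1.0 - math.exp(-beta * (m_max - m_min))
    return beta * math.exp(-beta * (m - m_min)) / norm

def gmpe_ln_pga(m, r_km, c0=-3.5, c1=1.0, c2=1.2, sigma=0.6):
    """Illustrative ground-motion model: ln(PGA[g]) = c0 + c1*m - c2*ln(r+10)."""
    return c0 + c1 * m - c2 * math.log(r_km + 10.0), sigma

def annual_exceedance_rate(pga_g, nu, b, m_min, m_max, r_km, dm=0.05):
    """Cornell-type hazard integral for one source at distance r_km:
    lambda(PGA > a) = nu * sum_m P(PGA > a | m, r) * f(m) * dm."""
    rate = 0.0
    m = m_min + dm / 2.0
    while m < m_max:
        mu, sigma = gmpe_ln_pga(m, r_km)
        z = (math.log(pga_g) - mu) / sigma
        p_exceed = 0.5 * math.erfc(z / math.sqrt(2.0))   # P(ln PGA > ln a)
        rate += nu * p_exceed * truncated_gr_pdf(m, b, m_min, m_max) * dm
        m += dm
    return rate

# Illustrative hazard-curve point: one source zone, 0.2 events/yr with M >= 4.5.
print(annual_exceedance_rate(pga_g=0.2, nu=0.2, b=1.0, m_min=4.5, m_max=7.0, r_km=30.0))
```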
Seismicity and seismic structure at Okmok Volcano, Alaska
Ohlendorf, Summer J.; Thurber, Clifford H.; Pesicek, Jeremy D.; Prejean, Stephanie G.
2014-01-01
Okmok volcano is an active volcanic caldera located on the northeastern portion of Umnak Island in the Aleutian arc, with recent eruptions in 1997 and 2008. The Okmok area had ~900 locatable earthquakes between 2003 and June 2008, and an additional ~600 earthquakes from the beginning of the 2008 eruption to mid-2009, providing an adequate dataset for seismic tomography. To image the seismic velocity structure of Okmok, we apply waveform cross-correlation using bispectrum verification and double-difference tomography to a subset of these earthquakes. We also perform P-wave attenuation tomography using a spectral decay technique. We examine the spatio-temporal characteristics of seismicity in the opening sequence of the 2008 eruption to investigate the path of magma migration during the establishment of a new eruptive vent. We also incorporate the new earthquake relocations and three-dimensional (3D) velocity model with first-motion polarities to compute focal mechanisms for selected events in the 2008 pre-eruptive and eruptive periods. Through these techniques we obtain precise relocations, a well-constrained 3D P-wave velocity model, and a marginally resolved S-wave velocity model. We image a main low Vp and Vs anomaly directly under the caldera consisting of a shallow zone at 0–2 km depth connected to a larger deeper zone that extends to about 6 km depth. We find that areas of low Qp are concentrated in the central to southwestern portion of the caldera and correspond fairly well with areas of low Vp. We interpret the deeper part of the low velocity anomaly (4–6 km depth) beneath the caldera as a magma body. This is consistent with results from ambient noise tomography and suggests that previous estimates of depth to Okmok's magma chamber based only on geodetic data may be too shallow. The distribution of events preceding the 2008 eruption suggests that a combination of overpressure in the zone surrounding the magma chamber and the introduction of new material
Monitoring earthen dams and levees with ambient seismic noise
NASA Astrophysics Data System (ADS)
Planès, T.; Mooney, M.; Rittgers, J. B.; Kanning, W.; Draganov, D.
2017-12-01
Internal erosion is a major cause of failure of earthen dams and levees and is difficult to detect at an early stage by traditional visual inspection techniques. The passive and non-invasive ambient-noise correlation technique could help detect and locate internal changes taking place within these structures. First, we apply this passive seismic method to monitor a canal embankment model subjected to piping erosion under laboratory-controlled conditions. We then present the monitoring of a sea levee in the Netherlands. A 150-m-long section of the dike shows sand boils in the drainage ditch located downstream of the levee. These sand boils are the sign of concentrated seepage and potential initiation of internal erosion in the structure. Using the ambient-noise correlation technique, we retrieve surface waves propagating along the crest of the dike. Temporal variations of the seismic wave velocity are then computed during the tide cycle. These velocity variations are correlated with local in-situ pore water pressure measurements and are possibly influenced by the presence of concentrated seepage paths.
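The velocity variations mentioned above are typically measured by comparing a current noise correlation with a long-term reference; the sketch below implements one common way of doing this, the stretching technique, in Python/NumPy. It is a generic illustration under simple assumptions (a single reference trace, a grid search over stretch factors), not the specific processing used for the canal embankment or the sea levee.

```python
import numpy as np

def dvv_stretching(reference, current, t, eps_max=0.03, n_eps=201):
    """Estimate the relative velocity change dv/v with the stretching technique.

    A velocity change dv/v compresses or dilates the current correlation in
    time; we stretch it by factors (1 + eps), with eps = -dv/v at the optimum,
    and keep the eps that maximizes the correlation with the reference trace.
    """
    eps_grid = np.linspace(-eps_max, eps_max, n_eps)
    best_eps, best_cc = 0.0, -np.inf
    for eps in eps_grid:
        stretched = np.interp(t * (1.0 + eps), t, current)
        cc = np.corrcoef(reference, stretched)[0, 1]
        if cc > best_cc:
            best_eps, best_cc = eps, cc
    return -best_eps, best_cc  # dv/v and its correlation coefficient

# Synthetic check with a known 0.5% velocity increase (coda arrives earlier).
t = np.linspace(0.0, 10.0, 2001)
ref = np.sin(2 * np.pi * 2.0 * t) * np.exp(-0.2 * t)
cur = np.interp(t * (1.0 + 0.005), t, ref)   # same waveform, compressed in time
print(dvv_stretching(ref, cur, t))           # ~ (0.005, ~1.0)
```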
NASA Astrophysics Data System (ADS)
Peres Rocha, M.; Azevedo, P. A. D.; Assumpcao, M.; Franca, G. S.; Marotta, G. S.
2016-12-01
Results of the P-wave travel-time seismic tomography method allowed us to observe differences in the seismic behavior of the lithosphere along the Brazilian continental margin in the South Atlantic. High-velocity anomalies predominate in the northern portion, which extends from Rio de Janeiro to Alagoas States (between latitudes -22.5 and -8.5), and low-velocity anomalies in the southern portion, which extends from Rio de Janeiro to Rio Grande do Sul States (between latitudes -30 and -22.5). Low velocities coincide spatially with the offshore high-seismicity areas indicated by Assumpção (1998), while high velocities coincide with low-seismicity regions. The high-velocity anomalies in the northern portion are related to the cratonic and low-stretched lithosphere of the São Francisco block, which was connected to the Congo block before the opening of the Atlantic Ocean. Low velocities can be assigned to a weaker lithosphere, where the South Atlantic Ocean opening process started. The age of the oldest oceanic lithosphere in the South Atlantic, indicated by the magnetic anomalies of the ocean floor, is greater in the southern part than in the northern part, suggesting that the continents in this region were already separating while the northern region was still connected to Africa, which could explain the lithospheric stretching process.
Tandon, K.; Tuncay, K.; Hubbard, K.; Comer, J.; Ortoleva, P.
2004-01-01
A data assimilation approach is demonstrated whereby seismic inversion is both automated and enhanced using a comprehensive numerical sedimentary basin simulator to study the physics and chemistry of sedimentary basin processes in response to the geothermal gradient in much greater detail than previously attempted. The approach not only reduces costs by integrating the basin analysis and seismic inversion activities to understand sedimentary basin evolution with respect to geodynamic parameters, but the technique also has the potential to serve as a geoinformatics platform for understanding various physical and chemical processes operating at different scales within a sedimentary basin. Tectonic history has a first-order effect on the physical and chemical processes that govern the evolution of sedimentary basins. We demonstrate how such tectonic parameters may be estimated by minimizing the difference between observed seismic reflection data and synthetic ones constructed from the output of a reaction, transport, mechanical (RTM) basin model. We demonstrate the method by reconstructing the geothermal gradient. As thermal history strongly affects the rate of RTM processes operating in a sedimentary basin, variations in geothermal gradient history alter the present-day fluid pressure, effective stress, porosity, fracture statistics and hydrocarbon distribution. All these properties, in turn, affect the mechanical wave velocity and sediment density profiles for a sedimentary basin. The present-day state of the sedimentary basin is imaged by reflection seismology data to a high degree of resolution, but it does not give any indication of the processes that contributed to the evolution of the basin or causes for heterogeneities within the basin that are being imaged. Using texture and fluid properties predicted by our Basin RTM simulator, we generate synthetic seismograms. Linear correlation using power spectra as an error measure and an efficient quadratic
Monitoring Instrument Performance in Regional Broadband Seismic Network Using Ambient Seismic Noise
NASA Astrophysics Data System (ADS)
Ye, F.; Lyu, S.; Lin, J.
2017-12-01
In the past ten years, the number of seismic stations has increased significantly, and regional seismic networks with advanced technology have been gradually developed all over the world. The resulting broadband data help to improve seismological research. It is important to monitor the performance of broadband instruments in a new network over a long period of time to ensure the accuracy of seismic records. Here, we propose a method that uses ambient noise data in the period range 5-25 s to monitor instrument performance and check data quality in situ. The method is based on an analysis of amplitude and phase index parameters calculated from pairwise cross-correlations of three stations, which provides multiple references for reliable error estimates. Index parameters calculated daily during a two-year observation period are evaluated to identify stations with instrument response errors in near real time. During data processing, initial instrument responses are used in place of the available instrument responses to simulate instrument response errors, which are then used to verify our results. We also examine the feasibility of using the tailing noise with data from USArray stations at different locations, and analyze possible instrumental errors resulting in time shifts, which are used to verify the method. Additionally, we show in an application that instrument response errors arising from pole-zero variations introduce apparently statistically significant velocity perturbations, larger than the standard deviation, when monitoring temporal variations in crustal properties. The results indicate that monitoring seismic instrument performance helps eliminate data pollution before analysis begins.
A seismic hazard uncertainty analysis for the New Madrid seismic zone
Cramer, C.H.
2001-01-01
A review of the scientific issues relevant to characterizing earthquake sources in the New Madrid seismic zone has led to the development of a logic tree of possible alternative parameters. A variability analysis, using Monte Carlo sampling of this consensus logic tree, is presented and discussed. The analysis shows that for 2%-exceedance-in-50-year hazard, the best-estimate seismic hazard map is similar to previously published seismic hazard maps for the area. For peak ground acceleration (PGA) and spectral acceleration at 0.2 and 1.0 s (0.2 and 1.0 s Sa), the coefficient of variation (COV) representing the knowledge-based uncertainty in seismic hazard can exceed 0.6 over the New Madrid seismic zone and diminishes to about 0.1 away from areas of seismic activity. Sensitivity analyses show that the largest contributor to PGA, 0.2 and 1.0 s Sa seismic hazard variability is the uncertainty in the location of future 1811-1812 New Madrid-sized earthquakes. This is followed by the variability due to the choice of ground motion attenuation relation, the magnitude of the 1811-1812 New Madrid earthquakes, and the recurrence interval for M>6.5 events. Seismic hazard is not very sensitive to the variability in seismogenic width and length. Published by Elsevier Science B.V.
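As a toy illustration of the Monte Carlo logic-tree sampling described above, the sketch below draws weighted branch values for a few uncertain inputs, combines them into hazard realizations, and reports the coefficient of variation; the branch values, weights and baseline hazard are invented for the example and do not come from the New Madrid consensus logic tree.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical logic-tree branches: (values, weights) for each uncertain input.
# Here each "value" is a multiplicative factor applied to a baseline hazard level.
branches = {
    "source_location": ([0.7, 1.0, 1.4], [0.3, 0.4, 0.3]),
    "gm_attenuation":  ([0.8, 1.0, 1.3], [0.25, 0.5, 0.25]),
    "max_magnitude":   ([0.9, 1.0, 1.1], [0.2, 0.6, 0.2]),
}

def sample_hazard(baseline_pga=0.25, n_samples=10000):
    """Monte Carlo sample the logic tree and return hazard realizations (in g)."""
    samples = np.full(n_samples, baseline_pga)
    for values, weights in branches.values():
        picks = rng.choice(values, size=n_samples, p=weights)
        samples *= picks
    return samples

hazard = sample_hazard()
mean, std = hazard.mean(), hazard.std()
print(f"best estimate = {mean:.3f} g, COV = {std / mean:.2f}")
```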
Field test investigation of high sensitivity fiber optic seismic geophone
NASA Astrophysics Data System (ADS)
Wang, Meng; Min, Li; Zhang, Xiaolei; Zhang, Faxiang; Sun, Zhihui; Li, Shujuan; Wang, Chang; Zhao, Zhong; Hao, Guanghu
2017-10-01
Seismic reflection, which measures artificial seismic waves, is one of the most effective and most widely used methods in geophysical prospecting, and it can be used for the exploration of oil, gas and coal. When a seismic wave travelling through the Earth encounters an interface between two materials with different acoustic impedances, some of the wave energy will reflect off the interface and some will refract through the interface. At its most basic, the seismic reflection technique consists of generating seismic waves and measuring the time taken for the waves to travel from the source, reflect off an interface and be detected by an array of geophones at the surface. Compared to traditional geophones such as electric, magnetic, mechanical and gas geophones, optical fiber geophones have many advantages. Optical fiber geophones can achieve sensing and signal transmission simultaneously. With the development of fiber grating sensor technology, the fiber Bragg grating (FBG) is being applied in seismic exploration and is drawing more and more attention owing to its immunity to electromagnetic interference, high sensitivity and insensitivity to meteorological conditions. In this paper, based on the theory of FBG sensing, we designed a high-sensitivity geophone and tested its sensitivity. The frequency response range is from 10 Hz to 100 Hz and the acceleration sensitivity of the fiber optic seismic geophone is over 1000 pm/g. A sixteen-element fiber optic seismic geophone array system is presented and a field test was performed in the Shengli oilfield of China. The field test shows that: (1) the fiber optic seismic geophone has a higher sensitivity than the traditional geophone between 1 and 100 Hz; (2) the low-frequency reflection wave continuity of the fiber Bragg grating geophone is better.
Imaging Subsurface Structure of Tehran/Iran region using Ambient Seismic Noise Tomography
NASA Astrophysics Data System (ADS)
Shirzad Iraj, Taghi; Shmomali, Z. Hossein
2013-04-01
Tehran, the capital of Iran, is surrounded by many active faults (including the Mosha, North Tehran and North and/or South Rey faults); however, our knowledge about the 3D velocity structure of the study area is limited. Recent developments in seismology have shown that the cross-correlation of long records of ambient seismic noise recorded by a pair of stations contains information about the Green's function between the stations. Thus ambient seismic noise carries valuable information about the propagation path which can be extracted. We obtained a 2D model of shear wave velocity (Vs) for the Tehran/Iran area using the seismic ambient noise tomography (ANT) method. In this study, we use continuous vertical-component data recorded by the TDMMO (Tehran Disaster Mitigation and Management Organization) and IRSC (Iranian Seismological Center) networks in the Tehran/Iran area. The TDMMO and IRSC networks are equipped with Guralp CMG-5TD and Kinemetrics SS-1 sensors, respectively. We use data from 25 stations for 12 months from 2009/Oct. to 2010/Oct. Data processing is similar to that explained in detail by Bensen et al. (2007) and is carried out on daily segments. The mean, trend, and instrument response were removed and the data were decimated to 10 sps. One-bit time-domain normalization was then applied to suppress the influence of instrument irregularities and earthquake signals, followed by spectral normalization between 0.1 and 1.0 Hz (periods of 1-10 s). After cross-correlation processing, we implement a new stacking method that stacks many cross-correlation functions based on the highest energy in the time interval in which we expect to receive the Rayleigh-wave fundamental mode. We then obtained Rayleigh-wave group velocities using phase-matched filtering and frequency-time analysis techniques. Finally, we applied an iterative inversion method to extract the Vs model of the shallow structure in the Tehran/Iran area.
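The daily preprocessing and cross-correlation steps described above (one-bit normalization, spectral whitening in the 0.1-1.0 Hz band, correlation of station pairs) can be sketched in a few lines of Python/NumPy; the station records, sampling rate and lag window below are placeholders, and the real Bensen et al. (2007) workflow includes further steps (instrument-response removal, alternative temporal normalizations, long-term stacking) omitted here.

```python
import numpy as np

def preprocess(trace, dt, fmin=0.1, fmax=1.0):
    """One-bit time-domain normalization followed by spectral whitening."""
    trace = trace - np.mean(trace)          # remove mean (trend removal omitted)
    trace = np.sign(trace)                  # one-bit normalization
    spec = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(len(trace), d=dt)
    band = (freqs >= fmin) & (freqs <= fmax)
    whitened = np.zeros_like(spec)
    whitened[band] = spec[band] / (np.abs(spec[band]) + 1e-12)   # flatten amplitudes
    return np.fft.irfft(whitened, len(trace))

def cross_correlate(tr_a, tr_b, max_lag_samples):
    """Frequency-domain cross-correlation, returned for lags up to max_lag."""
    n = len(tr_a)
    spec = np.fft.rfft(tr_a) * np.conj(np.fft.rfft(tr_b))
    cc = np.fft.irfft(spec, n)
    return np.concatenate((cc[-max_lag_samples:], cc[:max_lag_samples + 1]))

# Hypothetical one-day records from two stations, 10 samples per second.
dt = 0.1
rng = np.random.default_rng(0)
sta1 = rng.standard_normal(86400 * 10)
sta2 = np.roll(sta1, 50) + 0.5 * rng.standard_normal(86400 * 10)  # 5 s delay

cc_daily = cross_correlate(preprocess(sta1, dt), preprocess(sta2, dt), 600)
# In practice, daily correlations are stacked over months before dispersion analysis.
```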
Seismic Imaging of a Prospective Geothermal Play, Using a Dense Geophone Array
NASA Astrophysics Data System (ADS)
Trow, A.; Pankow, K. L.; Wannamaker, P. E.; Lin, F. C.; Ward, K. M.
2017-12-01
In the summer of 2016 a dense array of 48 Nodal Seismic geophones was deployed near Beaver, Utah on the eastern flank of the Mineral Mountains. The array aperture was approximately 20 kilometers and recorded continuous seismic data for 30 days. Geophones were centered on a previously known shallow (5 km depth) magnetotelluric (MT) low-resistivity body. This region of low resistivity was interpreted to possibly contain hydrothermal/geothermal fluids and was targeted for further seismic investigation. The seismic array geometry was designed to optimize seismic event detection for small (magnitude of completeness zero) earthquakes and to facilitate seismic imaging at depths of 5 km and deeper. For the duration of the experiment, one ML 1 earthquake was detected underneath the array, with 15 other earthquakes detected to the east and south in the more seismically active Pavant Range. Different passive imaging techniques, including ambient noise and earthquake tomography, are being explored in order to produce a seismic velocity image. Understanding the subsurface, specifically the fracture network and fluid content of the bedrock, is important for characterization of a geothermal prospect. If it is rich in fluids, it can be assumed that some fracture network is in place to accommodate such fluids. Both fractures and fluid content of the prospect will have an effect on the seismic velocities in the basement structure. These properties can help determine the viability of a geothermal system for power production.
High frequency seismic monitoring of debris flows at Chalk Cliffs (CO), USA
NASA Astrophysics Data System (ADS)
Coviello, Velio; Kean, Jason; Smith, Joel; Coe, Jeffrey; Arattano, Massimo; McCoy, Scott
2015-04-01
A growing number of studies adopt passive seismic monitoring techniques to investigate slope instabilities and landslide processes. These techniques are attractive and convenient because large areas can be monitored from a safe distance. This is particularly true when the phenomena under investigation are rapid and infrequent mass movements like debris flows. Different types of devices are used to monitor debris flow processes, but among them ground vibration detectors (GVDs) present several specific advantages that encourage their use. These advantages include: (i) the possibility of being installed outside the channel bed, (ii) high adaptability to different and harsh field conditions, and (iii) the capability to detect the debris flow front arrival tens of seconds earlier than contact and stage sensors. Ground vibration data can provide relevant information on the dynamics of debris flows such as the timing and velocity of the main surges. However, processing of the raw seismic signal is usually needed, both to obtain a more effective representation of waveforms and to decrease the amount of data that need to be recorded and analyzed. With this objective, the Amplitude and Impulses methods are commonly adopted to transform the raw signal into a 1-Hz signal that allows for a more useful representation of the phenomenon. In that way, peaks and other features become more visible and comparable with data obtained from other monitoring devices. In this work, we present the first debris-flow seismic recordings gathered in the Chalk Cliffs instrumented basin, central Colorado, USA. In May 2014, two 4.5-Hz triaxial geophones were installed in the upper part of the catchment. Seismic data are sampled at 333 Hz and then recorded by a standalone recording unit. One geophone is installed directly on bedrock; the other is mounted on a 1-m boulder partially buried in colluvium. This latter sensor integrates a heavily instrumented cross-section consisting of a 225 cm2
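The Amplitude and Impulses reductions mentioned above can each be written as a one-value-per-second summary of the raw geophone record; the sketch below is a generic Python/NumPy illustration assuming the 333 Hz sampling rate quoted in the abstract, with a synthetic surge standing in for a real debris-flow record, and is not the authors' exact implementation.

```python
import numpy as np

def amplitude_1hz(raw, fs=333):
    """Reduce a raw ground-vibration record to one amplitude value per second.

    "Amplitude" reduction: mean absolute amplitude of the detrended signal in
    consecutive 1-s windows (fs samples each).
    """
    raw = raw - np.mean(raw)
    n_sec = len(raw) // fs
    windows = raw[: n_sec * fs].reshape(n_sec, fs)
    return np.mean(np.abs(windows), axis=1)

def impulses_1hz(raw, fs=333, threshold=None):
    """Count, per second, the samples exceeding a ground-vibration threshold."""
    raw = raw - np.mean(raw)
    if threshold is None:
        threshold = 3.0 * np.std(raw[: 60 * fs])   # noise level from the first minute
    n_sec = len(raw) // fs
    windows = np.abs(raw[: n_sec * fs]).reshape(n_sec, fs)
    return np.sum(windows > threshold, axis=1)

# Example: a 10-minute record with a synthetic surge between minutes 4 and 6.
fs = 333
t = np.arange(0, 600 * fs) / fs
noise = np.random.default_rng(1).standard_normal(t.size)
surge = np.where((t > 240) & (t < 360), 8.0 * np.sin(2 * np.pi * 30 * t), 0.0)
amp = amplitude_1hz(noise + surge, fs)      # 600 values, peaking during the surge
imp = impulses_1hz(noise + surge, fs)       # impulse counts per second
```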
The Collaborative Seismic Earth Model Project
NASA Astrophysics Data System (ADS)
Fichtner, A.; van Herwaarden, D. P.; Afanasiev, M.
2017-12-01
We present the first generation of the Collaborative Seismic Earth Model (CSEM). This effort is intended to address grand challenges in tomography that currently inhibit imaging the Earth's interior across the seismically accessible scales: [1] For decades to come, computational resources will remain insufficient for the exploitation of the full observable seismic bandwidth. [2] With the manpower of individual research groups, only small fractions of available waveform data can be incorporated into seismic tomographies. [3] The limited incorporation of prior knowledge on 3D structure leads to slow progress and inefficient use of resources. The CSEM is a multi-scale model of global 3D Earth structure that evolves continuously through successive regional refinements. Taking the current state of the CSEM as the initial model, these refinements are contributed by external collaborators and used to advance the CSEM to the next state. This mode of operation allows the CSEM [1] to harness the distributed human and computing power of the community, [2] to make consistent use of prior knowledge, and [3] to combine the different tomographic techniques needed to cover the seismic data bandwidth. Furthermore, the CSEM has the potential to serve as a unified and accessible representation of tomographic Earth models. Generation 1 comprises around 15 regional tomographic refinements, computed with full-waveform inversion. These include continental-scale mantle models of North America, Australasia, Europe and the South Atlantic, as well as detailed regional models of the crust beneath the Iberian Peninsula and western Turkey. A global-scale full-waveform inversion ensures that regional refinements are consistent with whole-Earth structure. This first generation will serve as the basis for further automation and methodological improvements concerning validation and uncertainty quantification.
NASA Astrophysics Data System (ADS)
Montecino, Henry D.; de Freitas, Silvio R. C.; Báez, Juan C.; Ferreira, Vagner G.
2017-12-01
The Maule earthquake (Mw = 8.8) of February 27, 2010 is among the strongest earthquakes that occurred in recent years throughout the world. The crustal deformation caused by this earthquake has been widely studied using GNSS, InSAR and gravity observations. However, there is currently no estimation of the possible vertical deformations produced by co-seismic and post-seismic effects in segments of the Chilean Vertical Reference Frame (CHVRF). In this paper, we present an estimation of co-seismic and post-seismic deformations on the CHVRF using an indirect approach based on GNSS and Gravity Recovery and Climate Experiment (GRACE) data, as well as by applying a trajectory model. We used GNSS time series from 10 continuous GNSS stations in the period from 2007 to 2015, as well as 28 temporary GNSS stations observed before and after the earthquake, and 34 vertical deformation vectors in the region most affected by the earthquake. We considered a set of 147 monthly spherical harmonic gravity field solutions of the GRACE mission, expanded up to degree and order 96, provided by the Center for Space Research, University of Texas at Austin (UT-CSR) processing center. The magnitude of vertical deformation due to the co-seismic and post-seismic effects was estimated in part of the Chilean vertical network. Once we evaluated the hydrological effect, natural and artificial jumps, and the effect of glacial isostatic adjustment in the GNSS and GRACE time series, the maximum values associated with co- and post-seismic deformations of orthometric height were found to be ∼-34 cm and 5 cm, respectively. Overall, the deformation caused by the Maule earthquake in orthometric heights is almost entirely explained by the variation in the ellipsoidal heights (over 85% of the co-seismic jump); however, the co-seismic jump in the geoid reached -3.3 mm and could influence the maintenance of a modern vertical reference network in the medium to long term. We evaluated the consistency for a
NASA Astrophysics Data System (ADS)
Morelli, Andrea; Danecek, Peter; Molinari, Irene; Postpischl, Luca; Schivardi, Renata; Serretti, Paola; Tondi, Maria Rosaria
2010-05-01
Together with the building and maintenance of observational and data banking infrastructures - i.e. an integrated organization of coordinated sensor networks, in conjunction with connected data banks and efficient data retrieval tools - a strategic vision for bolstering the future development of geophysics in Europe should also address the essential issue of improving our current ability to model coherently the propagation of seismic waves across the European plate. This impacts fundamental matters, such as correctly locating earthquakes, imaging detailed earthquake source properties, modeling ground shaking, and inferring geodynamic processes. To this end, we need both detailed imaging of shallow and deep earth structure and accurate modeling of seismic waves by numerical methods. Our current abilities appear somewhat limited, but emerging technologies may soon enable a significant leap towards better accuracy and reliability. To contribute to this debate, we present here the state of the art of knowledge of earth structure and numerical wave modeling in the European plate, as the result of a comprehensive study towards the definition of a continental-scale reference model. Our model includes a description of crustal structure (EPcrust) merging information deriving from previous studies - large-scale compilations, seismic prospecting, receiver functions, inversion of surface wave dispersion measurements and Green's functions from noise correlation. We use a simple description of crustal structure, with laterally varying sediment and crystalline layer thicknesses, densities, and seismic parameters. This a priori crustal model improves the overall fit to observed Bouguer anomaly maps over CRUST2.0. The new crustal model is then used as a constraint in the inversion for mantle shear wave speed, based on fitting Love and Rayleigh surface wave dispersion. The new mantle model improves appreciably over global S models in the imaging of shallow asthenospheric (slow) anomalies
NASA Astrophysics Data System (ADS)
Kieffer, Susan Werner
1984-09-01
Old Faithful Geyser in Yellowstone National Park, U.S.A., is a relatively isolated source of seismic noise and exhibits seismic behavior similar to that observed at many volcanoes, including "bubblequakes" that resemble B-type "earthquakes", harmonic tremor before and during eruptions, and periods of seismic quiet prior to eruptions. Although Old Faithful differs from volcanoes in that the conduit is continuously open, that rock-fracturing is not a process responsible for seismicity, and that the erupting fluid is inviscid H2O rather than viscous magma, there are also remarkable similarities in the problems of heat and mass recharge to the system, in the eruption dynamics, and in the seismicity. Water rises irregularly into the immediate reservoir of Old Faithful as recharge occurs, a fact that suggests that there are two enlarged storage regions: one between 18 and 22 m (the base of the immediate reservoir) and one between about 10 and 12 m depth. Transport of heat from hot water or steam entering at the base of the recharging water column into cooler overlying water occurs by migration of steam bubbles upward and their collapse in the cooler water, and by episodes of convective overturn. An eruption occurs when the temperature of the near-surface water exceeds the boiling point if the entire water column is sufficiently close to the boiling curve that the propagation of pressure-release waves (rarefactions) down the column can bring the liquid water onto the boiling curve. The process of conversion of the liquid water in the conduit at the onset of an eruption into a two-phase liquid-vapor mixture takes on the order of 30 s. The seismicity is directly related to the sequence of filling and heating during the recharge cycle, and to the fluid mechanics of the eruption. Short (0.2-0.3 s), monochromatic, high-frequency events (20-60 Hz) resembling unsustained harmonic tremor and, in some instances, B-type volcanic earthquakes, occur when exploding or imploding
3-D Characterization of Seismic Properties at the Smart Weapons Test Range, YPG
NASA Astrophysics Data System (ADS)
Miller, Richard D.; Anderson, Thomas S.; Davis, John C.; Steeples, Don W.; Moran, Mark L.
2001-10-01
The Smart Weapons Test Range (SWTR) lies within the Yuma Proving Ground (YPG), Arizona. SWTR is a new facility constructed specifically for the development and testing of futuristic intelligent battlefield sensor networks. In this paper, results are presented for an extensive high-resolution geophysical characterization study at the SWTR site along with validation using 3-D modeling. In this study, several shallow seismic methods and novel processing techniques were used to generate a 3-D grid of earth seismic properties, including compressional (P) and shear (S) body-wave speeds (Vp and Vs), and their associated body-wave attenuation parameters (Qp, and Qs). These experiments covered a volume of earth measuring 1500 m by 300 m by 25 m deep (11 million cubic meters), centered on the vehicle test track at the SWTR site. The study has resulted in detailed characterizations of key geophysical properties. To our knowledge, results of this kind have not been previously achieved, nor have the innovative methods developed for this effort been reported elsewhere. In addition to supporting materiel developers with important geophysical information at this test range, the data from this study will be used to validate sophisticated 3-D seismic signature models for moving vehicles.
High temporal resolution mapping of seismic noise sources using heterogeneous supercomputers
NASA Astrophysics Data System (ADS)
Gokhberg, Alexey; Ermert, Laura; Paitz, Patrick; Fichtner, Andreas
2017-04-01
Time- and space-dependent distribution of seismic noise sources is becoming a key ingredient of modern real-time monitoring of various geo-systems. Significant interest in seismic noise source maps with high temporal resolution (days) is expected to come from a number of domains, including natural resources exploration, analysis of active earthquake fault zones and volcanoes, as well as geothermal and hydrocarbon reservoir monitoring. Currently, knowledge of noise sources is insufficient for high-resolution subsurface monitoring applications. Near-real-time seismic data, as well as advanced imaging methods to constrain seismic noise sources, have recently become available. These methods are based on the massive cross-correlation of seismic noise records from all available seismic stations in the region of interest and are therefore very computationally intensive. Heterogeneous massively parallel supercomputing systems introduced in recent years combine conventional multi-core CPUs with GPU accelerators and provide an opportunity for a manifold increase in computing performance. Therefore, these systems represent an efficient platform for implementation of a noise source mapping solution. We present the first results of an ongoing research project conducted in collaboration with the Swiss National Supercomputing Centre (CSCS). The project aims at building a service that provides seismic noise source maps for Central Europe with high temporal resolution (days to a few weeks depending on frequency and data availability). The service is hosted on the CSCS computing infrastructure; all computationally intensive processing is performed on the massively parallel heterogeneous supercomputer "Piz Daint". The solution architecture is based on the Application-as-a-Service concept in order to provide interested external researchers with regular access to the noise source maps. The solution architecture includes the following sub-systems: (1) data acquisition responsible for
A seismic reflection velocity study of a Mississippian mud-mound in the Illinois basin
NASA Astrophysics Data System (ADS)
Ranaweera, Chamila Kumari
Two mud-mounds have been reported in the Ullin limestone near, but not in, the Aden oil field in Hamilton County, Illinois. One mud-mound is in the Broughton oil field of Hamilton County 25 miles to the south of Aden. The second mud-mound is in the Johnsonville oil field in Wayne County 20 miles to the north of Aden. Seismic reflection profiles were shot in 2012 adjacent to the Aden oil field to evaluate the oil prospects and to investigate the possibility of detecting Mississippian mud-mounds near the Aden field. A feature on one of the seismic profiles was interpreted to be a mud-mound or carbonate buildup. A well drilled at the location of this interpreted structure provided digital geophysical logs and geological logs used to refine the interpretation of the seismic profiles. Geological data from the new well at Aden, in the form of drill cuttings, have been used to essentially confirm the existence of a mud-mound in the Ullin limestone at a depth of 4300 feet. Geophysical well logs from the new well near Aden were used to create 1-D computer models and synthetic seismograms for comparison to the seismic data. The reflection seismic method is widely used to aid interpreting subsurface geology. Processing seismic data is an important step in the method as a properly processed seismic section can give a better image of the subsurface geology whereas a poorly processed section could mislead the interpretation. Seismic reflections will be more accurately depicted with careful determination of seismic velocities and by carefully choosing the processing steps and parameters. Various data processing steps have been applied and parameters refined to produce improved stacked seismic records. The resulting seismic records from the Aden field area indicate a seismic response similar to what is expected from a carbonate mud-mound. One-dimensional synthetic seismograms were created using the available sonic and density logs from the well drilled near the Aden seismic lines
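The 1-D synthetic seismogram step described above (sonic and density logs to impedance, reflection coefficients, and convolution with a wavelet) can be sketched as follows; the logs, wavelet frequency, and simple depth-to-time conversion are all simplifying assumptions for illustration rather than the processing actually applied to the Aden well logs.

```python
import numpy as np

def synthetic_seismogram(vp, rho, dz, dt=0.001, f0=30.0):
    """Normal-incidence 1-D synthetic from velocity and density logs.

    vp  : P-wave velocity log (m/s), regularly sampled in depth
    rho : density log (kg/m^3), same sampling
    dz  : depth sample interval (m)
    """
    # Acoustic impedance and reflection coefficients at layer interfaces
    z = vp * rho
    rc = (z[1:] - z[:-1]) / (z[1:] + z[:-1])

    # Depth-to-time conversion: two-way time to each interface
    twt = 2.0 * np.cumsum(dz / vp)[:-1]
    n_t = int(np.ceil(twt[-1] / dt)) + 1
    reflectivity = np.zeros(n_t)
    np.add.at(reflectivity, np.round(twt / dt).astype(int), rc)

    # Ricker wavelet and convolution
    t = np.arange(-100, 101) * dt
    wavelet = (1 - 2 * (np.pi * f0 * t) ** 2) * np.exp(-((np.pi * f0 * t) ** 2))
    return np.convolve(reflectivity, wavelet, mode="same")

# Hypothetical three-layer log: shale over a faster carbonate "mound" over shale.
vp = np.concatenate([np.full(400, 3000.0), np.full(100, 5500.0), np.full(300, 3300.0)])
rho = np.concatenate([np.full(400, 2400.0), np.full(100, 2700.0), np.full(300, 2450.0)])
trace = synthetic_seismogram(vp, rho, dz=1.0)   # synthetic to compare with field data
```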
Processing of a nine-component near-offset VSP for seismic anisotropy
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacBeth, C.; Li, X.Y.; Zeng, X.
1997-03-01
A convolutional sequence of matrix operators is offered as a convenient deterministic scheme for processing a multicomponent vertical seismic profile (VSP). This sequence is applied to a nine-component near-offset VSP recorded at the Conoco borehole test facility, Kay County, Oklahoma. These data are corrected for tool spin and near-surface anisotropy together with source coupling or imbalance. After wave-field separation using a standard f-k filter, each source and receiver pair for the upgoing waves is adjusted to a common reference depth using a matrix operator based on the downgoing wavefield. The up- and downgoing waves are then processed for anisotropy by a similarity transformation, to separate the qS1 and qS2 waves, from which the anisotropic properties are estimated. These estimates reveal a strong (apparent) vertical birefringence in the near-surface, but weak or moderate values for the majority of the subsurface. The target zone consists of a thin sandstone and deeper shale layer, both of which possess a strong vertical birefringence. The sandstone corresponds to a zone of known fluid flow. An observed qS2 attenuation and polarization change in the shale suggest it contains large fractures.
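The shear-wave separation step is often implemented as an Alford-type rotation that searches for the natural polarization angle minimizing cross-component energy; the sketch below illustrates that idea on a synthetic four-component gather and is a generic stand-in, not the matrix-operator sequence of the paper.

```python
import numpy as np

def alford_rotation_angle(d_xx, d_xy, d_yx, d_yy, angles_deg=np.arange(0, 90, 0.5)):
    """Grid-search the rotation angle that minimizes off-diagonal S-wave energy.

    d_ij : trace recorded on receiver component i from source orientation j.
    Returns the natural polarization azimuth (deg) and the rotated data matrix.
    """
    data = np.array([[d_xx, d_xy], [d_yx, d_yy]])        # shape (2, 2, nsamples)
    best = (None, np.inf, None)
    for ang in angles_deg:
        r = np.radians(ang)
        rot = np.array([[np.cos(r), np.sin(r)], [-np.sin(r), np.cos(r)]])
        rotated = np.einsum("ip,pqt,qj->ijt", rot, data, rot.T)
        off_diag_energy = np.sum(rotated[0, 1] ** 2) + np.sum(rotated[1, 0] ** 2)
        if off_diag_energy < best[1]:
            best = (ang, off_diag_energy, rotated)
    return best[0], best[2]

# Synthetic check: fast (qS1) and delayed slow (qS2) wavelets polarized at 30 degrees.
t = np.arange(0, 1, 0.002)
s1 = np.exp(-((t - 0.30) / 0.02) ** 2)     # fast shear arrival
s2 = np.exp(-((t - 0.36) / 0.02) ** 2)     # delayed slow shear arrival
theta = np.radians(30.0)
c, s = np.cos(theta), np.sin(theta)
natural = np.array([[s1, np.zeros_like(t)], [np.zeros_like(t), s2]])
rot = np.array([[c, -s], [s, c]])
observed = np.einsum("ip,pqt,qj->ijt", rot, natural, rot.T)   # rotate into field frame
angle, separated = alford_rotation_angle(*observed.reshape(4, -1))
print(angle)   # ~30
```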
NASA Astrophysics Data System (ADS)
Zhou, Fulin; Tan, Ping
2018-01-01
China is a country where 100% of the territory is located in a seismic zone. Most strong earthquakes are beyond prediction. Most fatalities are caused by structural collapse. Earthquakes not only cause severe damage to structures, but can also damage non-structural elements on and inside of facilities. This can halt city life and disrupt hospitals, airports, bridges, power plants, and other infrastructure. Designers need to use new techniques to protect structures and the facilities inside them. Isolation, energy dissipation, and control systems have been more and more widely used in recent years in China. Currently, there are nearly 6,500 structures with isolation and about 3,000 structures with passive energy dissipation or hybrid control in China. The mitigation techniques are applied to structures such as residential buildings, large or complex structures, bridges, underwater tunnels, historical or cultural relic sites, and industrial facilities, and are used for retrofitting of existing structures. This paper introduces design rules and some new and innovative devices for seismic isolation, energy dissipation and hybrid control for civil and industrial structures. This paper also discusses the development trends for seismic resistance, seismic isolation, and passive and active control techniques for the future in China and in the world.
NASA Astrophysics Data System (ADS)
Mukuhira, Yusuke; Asanuma, Hiroshi; Ito, Takatoshi; Häring, Markus
2016-04-01
The occurrence of induced seismicity with large magnitude is a critical environmental issue associated with fluid injection for shale gas/oil extraction, waste water disposal, carbon capture and storage, and engineered geothermal systems (EGS). Studies on the prediction of hazardous seismicity and the risk assessment of induced seismicity have intensified recently. Many of these studies are based on seismological statistics, and these models use information on occurrence time and event magnitude. We have developed a physics-based model, named the "possible seismic moment model", to evaluate seismic activity and assess the seismic moment that is ready to be released. This model is based entirely on microseismic information: occurrence time, hypocenter location and magnitude (seismic moment). The model assumes the existence of a representative parameter with physical meaning, the releasable seismic moment per unit rock volume (seismic moment density), at a given field. The seismic moment density is estimated from the microseismic distribution and the seismic moments of the events. In addition, the stimulated rock volume is inferred from the progress of the microseismic cloud at a given time; this quantity can be interpreted as the rock volume that can release seismic energy owing to the weakening of effective normal stress by the injected fluid. The product of these two parameters (equation (1)) provides, as a model output, the possible seismic moment that can be released from the currently stimulated zone. The difference between the model output and the observed cumulative seismic moment corresponds to the seismic moment that may still be released in the future under the current stimulation conditions. This value can be translated into the possible maximum magnitude of future induced seismicity. In this way, the possible seismic moment can be used as an easily and intuitively interpretable index providing real-time feedback to the hydraulic stimulation operation. The possible seismic moment is defined by equation (1), where D
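A minimal numerical sketch of this bookkeeping is given below, assuming the seismic moment density is taken as the cumulative observed moment divided by the volume spanned by the microseismic cloud and that the remaining moment maps to a magnitude through the Hanks-Kanamori relation; every number in the example is hypothetical, and the model's actual definitions (equation (1)) may differ in detail.

```python
import numpy as np

def possible_max_magnitude(event_moments, stimulated_volume, observed_volume):
    """Sketch of the 'possible seismic moment' bookkeeping described above.

    event_moments     : seismic moments (N m) of located microseismic events
    observed_volume   : rock volume (m^3) spanned by the microseismic cloud so far
    stimulated_volume : rock volume (m^3) assumed weakened by the injected fluid
    """
    cumulative_moment = np.sum(event_moments)
    moment_density = cumulative_moment / observed_volume      # N m per m^3
    possible_moment = moment_density * stimulated_volume      # analogue of equation (1)
    remaining_moment = max(possible_moment - cumulative_moment, 0.0)
    # Hanks & Kanamori (1979) moment magnitude of the remaining moment
    mw_max = (2.0 / 3.0) * (np.log10(remaining_moment) - 9.1) if remaining_moment > 0 else None
    return possible_moment, remaining_moment, mw_max

# Hypothetical numbers: 500 microearthquakes of Mw ~ 0.5 within 1e7 m^3 of rock,
# with a pressurized (stimulated) volume twice as large.
moments = np.full(500, 10 ** (1.5 * 0.5 + 9.1))
print(possible_max_magnitude(moments, stimulated_volume=2e7, observed_volume=1e7))
```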
Signal-to-noise ratio application to seismic marker analysis and fracture detection
NASA Astrophysics Data System (ADS)
Xu, Hui-Qun; Gui, Zhi-Xian
2014-03-01
Seismic data with high signal-to-noise ratios (SNRs) are useful in reservoir exploration. To obtain high SNR seismic data, significant effort is required to achieve noise attenuation in seismic data processing, which is costly in material, human, and financial resources. We introduce a method for improving the SNR of seismic data. The SNR is calculated using the frequency-domain method. Furthermore, we optimize and discuss the critical parameters and the calculation procedure. We applied the proposed method to real data and found that the SNR is high at the seismic marker and low in the fracture zone. Consequently, this can be used to extract detailed information about fracture zones that are inferred by structural analysis but not observed in conventional seismic data.
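The abstract does not spell out the SNR formula, so the sketch below shows one common frequency-domain estimate: the ratio of band-limited power in a signal window to that in a noise window, expressed in decibels. The window positions, band limits and synthetic trace are assumptions for illustration only.

```python
import numpy as np

def snr_frequency_domain(trace, dt, signal_window, noise_window, fmin=5.0, fmax=80.0):
    """Estimate SNR (in dB) from power spectra of a signal and a noise window.

    signal_window, noise_window : (start_sample, end_sample) index pairs
    """
    def band_power(samples):
        n = len(samples)
        spec = np.abs(np.fft.rfft(samples * np.hanning(n))) ** 2
        freqs = np.fft.rfftfreq(n, d=dt)
        band = (freqs >= fmin) & (freqs <= fmax)
        return np.mean(spec[band])

    p_signal = band_power(trace[signal_window[0]:signal_window[1]])
    p_noise = band_power(trace[noise_window[0]:noise_window[1]])
    return 10.0 * np.log10(p_signal / p_noise)

# Hypothetical 2-s trace sampled at 2 ms: noise first, a 30-Hz reflection later.
dt = 0.002
t = np.arange(0, 2, dt)
rng = np.random.default_rng(3)
trace = 0.2 * rng.standard_normal(t.size)
trace += np.where((t > 1.0) & (t < 1.2), np.sin(2 * np.pi * 30 * (t - 1.0)), 0.0)
print(snr_frequency_domain(trace, dt, signal_window=(500, 600), noise_window=(0, 100)))
```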
Characterizing Geological Facies using Seismic Waveform Classification in Sarawak Basin
NASA Astrophysics Data System (ADS)
Zahraa, Afiqah; Zailani, Ahmad; Prasad Ghosh, Deva
2017-10-01
Numerous efforts have been made to build relationships between geology and geophysics using different techniques throughout the years. The integration of these two most important data types in the oil and gas industry can be used to reduce uncertainty in exploration and production, especially for reservoir productivity enhancement and stratigraphic identification. This paper focuses on classifying seismic waveforms into different classes using a neural network and linking them to geological facies, which are established using knowledge of the lithology and log motifs of well data. Seismic inversion is used as the input to the neural network to act as a direct lithology indicator, reducing dependency on well calibration. The interpretation of the seismic facies classification map provides a better understanding of the lithology distribution and depositional environment and helps to identify significant reservoir rock
Detection of sinkholes or anomalies using full seismic wave fields.
DOT National Transportation Integrated Search
2013-04-01
This research presents an application of two-dimensional (2-D) time-domain waveform tomography for detection of embedded sinkholes and anomalies. The measured seismic surface wave fields were inverted using a full waveform inversion (FWI) technique, ...
Pidlisecky, Adam; Haines, S.S.
2011-01-01
Conventional processing methods for seismic cone penetrometer data present several shortcomings, most notably the absence of a robust velocity model uncertainty estimate. We propose a new seismic cone penetrometer testing (SCPT) data-processing approach that employs Bayesian methods to map measured data errors into quantitative estimates of model uncertainty. We first calculate travel-time differences for all permutations of seismic trace pairs. That is, we cross-correlate each trace at each measurement location with every trace at every other measurement location to determine travel-time differences that are not biased by the choice of any particular reference trace and to thoroughly characterize data error. We calculate a forward operator that accounts for the different ray paths for each measurement location, including refraction at layer boundaries. We then use a Bayesian inversion scheme to obtain the most likely slowness (the reciprocal of velocity) and a distribution of probable slowness values for each model layer. The result is a velocity model that is based on correct ray paths, with uncertainty bounds that are based on the data error. © NRC Research Press 2011.
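A stripped-down, linear-Gaussian analogue of this approach is sketched below: a forward operator of ray-path lengths maps layer slownesses to travel times, and a Gaussian prior and data covariance give a closed-form posterior mean and standard deviation. The straight vertical rays, two-layer geometry and all numbers are assumptions for illustration; the published method builds the operator with refracted ray paths and characterizes errors from the full set of trace-pair cross-correlations.

```python
import numpy as np

def bayesian_slowness(G, d_obs, sigma_d, m_prior, sigma_m):
    """Linear-Gaussian Bayesian inversion for layer slownesses.

    G       : forward operator (n_data x n_layers) of ray-path lengths (m)
    d_obs   : observed travel times or travel-time differences (s)
    sigma_d : data standard deviation (s)
    m_prior : prior slowness model (s/m)
    sigma_m : prior standard deviation (s/m)
    """
    Cd_inv = np.eye(len(d_obs)) / sigma_d ** 2
    Cm_inv = np.eye(len(m_prior)) / sigma_m ** 2
    post_cov = np.linalg.inv(G.T @ Cd_inv @ G + Cm_inv)
    post_mean = post_cov @ (G.T @ Cd_inv @ d_obs + Cm_inv @ m_prior)
    return post_mean, np.sqrt(np.diag(post_cov))   # MAP slowness and 1-sigma bounds

# Hypothetical two-layer SCPT geometry: vertical path lengths (m) in each layer
# for receiver depths of 2, 4, 6 and 8 m; straight rays, no refraction.
G = np.array([[2.0, 0.0],
              [4.0, 0.0],
              [4.0, 2.0],
              [4.0, 4.0]])
true_slowness = np.array([1 / 150.0, 1 / 300.0])          # 150 and 300 m/s layers
d_obs = G @ true_slowness + 1e-4 * np.random.default_rng(7).standard_normal(4)
mean, std = bayesian_slowness(G, d_obs, sigma_d=5e-4,
                              m_prior=np.array([1 / 200.0, 1 / 200.0]), sigma_m=2e-3)
print(1.0 / mean)   # posterior velocities, close to 150 and 300 m/s
```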
NASA Astrophysics Data System (ADS)
Othman, Adel A. A.; Fathy, M.; Negm, Adel
2018-06-01
The Temsah field is located offshore in the eastern part of the Nile delta. The main reservoirs of the area are Middle Pliocene and consist mainly of siliciclastics associated with a confined deep-marine environment. The distribution pattern of the reservoir facies is of limited scale, indicating rapid lateral and vertical changes that are not easy to resolve by applying conventional seismic attributes. The target of the present study is to create geophysical workflows that better image the channel sand distribution in the study area. We apply both the average absolute amplitude and energy attributes, which indicate the distribution of the sand bodies in the study area but fail to fully describe the channel geometry. Another tool, which offers a more detailed description of the geometry, is therefore needed. The spectral decomposition analysis method, based on processing with the discrete Fourier transform, is an alternative technique that can provide better results. Spectral decomposition performed over the upper channel shows that the frequency in the eastern part of the channel is the same as the frequency at the locations where the wells are drilled, which confirms the connection between the eastern and western parts of the upper channel. The results suggest that application of the spectral decomposition method leads to reliable inferences. Hence, using the spectral decomposition method alone or together with other attributes has a positive impact on reserves growth and increased production; the reserves in the study area increase to 75 bcf.
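As a generic illustration of DFT-based spectral decomposition (not the specific workflow applied to the Temsah data), the sketch below slides a short tapered window along a trace and returns time-frequency amplitudes from which iso-frequency maps can be extracted; the window length, step and test signal are arbitrary choices.

```python
import numpy as np

def spectral_decomposition(trace, dt, window_len=64, step=8):
    """Short-window DFT spectral decomposition of a single seismic trace.

    Returns (times, freqs, amplitude) where amplitude[i, j] is the spectral
    amplitude at time times[i] and frequency freqs[j].  Iso-frequency maps are
    obtained by evaluating many traces at one column of `freqs`.
    """
    taper = np.hanning(window_len)
    freqs = np.fft.rfftfreq(window_len, d=dt)
    starts = np.arange(0, len(trace) - window_len + 1, step)
    amp = np.empty((len(starts), len(freqs)))
    for i, s in enumerate(starts):
        amp[i] = np.abs(np.fft.rfft(trace[s:s + window_len] * taper))
    times = (starts + window_len // 2) * dt
    return times, freqs, amp

# Hypothetical trace: a wavelet train whose dominant frequency drops with time.
dt = 0.004
t = np.arange(0, 2, dt)
trace = np.sin(2 * np.pi * (25 - 5 * t) * t) * np.exp(-1.5 * t)
times, freqs, amp = spectral_decomposition(trace, dt)
# e.g. amp[:, np.argmin(np.abs(freqs - 20.0))] is the 20-Hz component through time
```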
Reevaluation of the Seismicity and seismic hazards of Northeastern Libya
NASA Astrophysics Data System (ADS)
Ben Suleman, abdunnur; Aousetta, Fawzi
2014-05-01
Libya, located at the northern margin of the African continent, underwent many episodes of orogenic activity. These episodes of orogenic activity affected and shaped the geological setting of the country. This study represents a detailed investigation that focuses on the seismicity and its implications for the earthquake hazards of northeastern Libya. At the end of 2005 the Libyan National Seismological Network started operating with 15 stations. The seismicity of the area under investigation was reevaluated using data recorded by the recently established network. The Al-Maraj earthquake that occurred on May 22nd, 2005 was analyzed. This earthquake was located in a known seismically active area. This area was the site of the well-known 1963 earthquake that killed over 200 people. Earthquakes were plotted and the resulting maps were interpreted and discussed. The level of seismic activity is higher in some areas, such as the city of Al-Maraj. The offshore areas north of Al-Maraj seem to have higher seismic activity. It is highly recommended that the recent earthquake activity be considered in seismic hazard assessments for the northeastern part of Libya.
On the physics-based processes behind production-induced seismicity in natural gas fields
NASA Astrophysics Data System (ADS)
Zbinden, Dominik; Rinaldi, Antonio Pio; Urpi, Luca; Wiemer, Stefan
2017-04-01
Induced seismicity due to natural gas production is observed at different sites around the world. The common understanding is that the pressure drop caused by gas production leads to compaction, which affects the stress field in the reservoir and the surrounding rock formations, hence reactivating pre-existing faults and inducing earthquakes. Previous studies have often assumed that pressure changes in the reservoir compartments and intersecting fault zones are equal, while neglecting multi-phase fluid flow. In this study, we show that disregarding the fluid flow involved in natural gas extraction activities is often inappropriate. We use a fully coupled multiphase fluid flow and geomechanics simulator, which accounts for stress-dependent permeability and linear poroelasticity, to better determine the conditions leading to fault reactivation. In our model setup, gas is produced from a porous reservoir, cut into two compartments that are offset by a normal fault, and overlain by an impermeable caprock. Results show that fluid flow plays a major role in the pore pressure and stress evolution within the fault. Hydro-mechanical processes include rotation of the principal stresses due to reservoir compaction, as well as poroelastic effects caused by the pressure drop in the adjacent reservoir. Fault strength is significantly reduced due to fluid flow into the fault zone from the neighbouring reservoir compartment and other formations. We also analyze the case of production in both compartments, and the results show that simultaneous production does not prevent the fault from being reactivated, but the magnitude of the induced event is smaller. Finally, we analyze scenarios for minimizing seismicity after a period of production, such as (i) well shut-in and (ii) gas re-injection. Results show that, in the case of well shut-in, a highly stressed fault zone can still be reactivated several decades after production stops, although on average the shut-in results in a reduction of seismicity
NASA Astrophysics Data System (ADS)
Czaplinska, Daria; Piazolo, Sandra; Almqvist, Bjarne
2015-04-01
techniques, including the Voigt, Reuss, Hill, geometric mean, self-consistent, and asymptotic expansion homogenization (AEH) methods. To test the advantages and disadvantages of the method, results are compared to measured geophysical properties of equivalent rocks. Such a comparison allows refinement of seismic data interpretation for mid- to lower-crustal rocks. References: Cook, A., Vel, S., Johnson, S.E., Gerbi, C., Song, W.J., 2013. Elastic and Seismic Properties (ESP) Toolbox (beta version); http://umaine.edu/mecheng/faculty-and-staff/senthil-vel/software/ESP_Toolbox/
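The simplest of the averaging schemes named above can be written in a few lines; the sketch below computes Voigt, Reuss and Hill averages for an isotropic two-phase aggregate and converts the Hill moduli to P- and S-wave velocities. The mineral moduli, fractions and density are hypothetical round numbers, and this scalar version ignores the crystallographic preferred orientation handled by the full tensor-based ESP Toolbox and AEH approaches.

```python
import numpy as np

def voigt_reuss_hill(fractions, bulk_moduli, shear_moduli):
    """Voigt, Reuss and Hill averages of bulk and shear moduli (GPa).

    fractions    : volume fractions of the mineral phases (must sum to 1)
    bulk_moduli  : bulk moduli K of the phases (GPa)
    shear_moduli : shear moduli G of the phases (GPa)
    """
    f = np.asarray(fractions, dtype=float)
    k = np.asarray(bulk_moduli, dtype=float)
    g = np.asarray(shear_moduli, dtype=float)
    k_voigt, g_voigt = np.sum(f * k), np.sum(f * g)              # iso-strain bound
    k_reuss, g_reuss = 1.0 / np.sum(f / k), 1.0 / np.sum(f / g)  # iso-stress bound
    return {"K": (k_voigt, k_reuss, 0.5 * (k_voigt + k_reuss)),
            "G": (g_voigt, g_reuss, 0.5 * (g_voigt + g_reuss))}

def isotropic_velocities(k_hill, g_hill, density):
    """P- and S-wave velocities (km/s) from Hill moduli (GPa) and density (g/cm^3)."""
    vp = np.sqrt((k_hill + 4.0 * g_hill / 3.0) / density)
    vs = np.sqrt(g_hill / density)
    return vp, vs

# Hypothetical two-phase rock: 60% quartz, 40% plagioclase (moduli in GPa).
avg = voigt_reuss_hill([0.6, 0.4], [37.0, 75.0], [44.0, 26.0])
vp, vs = isotropic_velocities(avg["K"][2], avg["G"][2], density=2.65)
print(avg, vp, vs)
```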