Sample records for statistical seismicity analysis

  1. CORSSA: The Community Online Resource for Statistical Seismicity Analysis

    USGS Publications Warehouse

    Michael, Andrew J.; Wiemer, Stefan

    2010-01-01

Statistical seismology is the application of rigorous statistical methods to earthquake science with the goal of improving our knowledge of how the earth works. Within statistical seismology there is a strong emphasis on the analysis of seismicity data in order to improve our scientific understanding of earthquakes and to improve the evaluation and testing of earthquake forecasts, earthquake early warning, and seismic hazards assessments. Given the societal importance of these applications, statistical seismology must be done well. Unfortunately, a lack of educational resources and available software tools makes it difficult for students and new practitioners to learn about this discipline. The goal of the Community Online Resource for Statistical Seismicity Analysis (CORSSA) is to promote excellence in statistical seismology by providing the knowledge and resources necessary to understand and implement the best practices, so that the reader can apply these methods to their own research. This introduction describes the motivation for and vision of CORSSA, as well as its structure and contents.

  2. CORSSA: Community Online Resource for Statistical Seismicity Analysis

    NASA Astrophysics Data System (ADS)

    Zechar, J. D.; Hardebeck, J. L.; Michael, A. J.; Naylor, M.; Steacy, S.; Wiemer, S.; Zhuang, J.

    2011-12-01

Statistical seismology is critical to the understanding of seismicity, the evaluation of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology - especially to those aspects with great impact on public policy - statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal of enhancing the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA, www.corssa.org). We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each of which will contain between four and eight articles. CORSSA now includes seven articles, with an additional six in draft form, along with forums for discussion, a glossary, and news about upcoming meetings, special issues, and recent papers. Each article is peer-reviewed and presents a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles include: introductions to both CORSSA and statistical seismology; basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. We have also begun curating a collection of statistical seismology software packages.

  3. Object Classification Based on Analysis of Spectral Characteristics of Seismic Signal Envelopes

    NASA Astrophysics Data System (ADS)

    Morozov, Yu. V.; Spektor, A. A.

    2017-11-01

A method for classifying moving objects having a seismic effect on the ground surface is proposed, based on statistical analysis of the envelopes of received signals. The values of the components of the amplitude spectrum of the envelopes, obtained by applying the Hilbert and Fourier transforms, are used as classification criteria. Examples illustrating the statistical properties of the spectra and the operation of the seismic classifier are given for an ensemble of objects of four classes (person, group of people, large animal, vehicle). It is shown that the computational procedures for processing the seismic signals are quite simple and can therefore be used in real-time systems with modest requirements for computational resources.
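
    The envelope-spectrum feature described in this abstract can be sketched in a few lines; the signal below is a hypothetical amplitude-modulated test tone, not data from the paper, and the feature pipeline (Hilbert envelope, then Fourier amplitude spectrum) is only a minimal reading of the method.

```python
import numpy as np
from scipy.signal import hilbert

def envelope_spectrum(signal, fs):
    """Return frequencies and amplitude spectrum of the signal envelope."""
    env = np.abs(hilbert(signal))            # Hilbert-transform envelope
    env = env - env.mean()                   # remove DC before the FFT
    spec = np.abs(np.fft.rfft(env)) / len(env)
    freqs = np.fft.rfftfreq(len(env), d=1.0 / fs)
    return freqs, spec

# A 10 Hz carrier amplitude-modulated at 2 Hz: the envelope spectrum
# should peak at the 2 Hz modulation frequency.
fs = 200.0
t = np.arange(0, 5, 1 / fs)
x = (1 + 0.5 * np.cos(2 * np.pi * 2 * t)) * np.sin(2 * np.pi * 10 * t)
freqs, spec = envelope_spectrum(x, fs)
peak_hz = freqs[1:][np.argmax(spec[1:])]     # ignore the DC bin
print(round(peak_hz, 1))                     # prints 2.0
```

    In the paper's setting, the components of `spec` (rather than just the peak) would feed a statistical classifier.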

  4. Open Source Tools for Seismicity Analysis

    NASA Astrophysics Data System (ADS)

    Powers, P.

    2010-12-01

The spatio-temporal analysis of seismicity plays an important role in earthquake forecasting and is integral to research on earthquake interactions and triggering. For instance, the third version of the Uniform California Earthquake Rupture Forecast (UCERF), currently under development, will use Epidemic Type Aftershock Sequences (ETAS) as a model for earthquake triggering. UCERF will be a "living" model and therefore requires robust, tested, and well-documented ETAS algorithms to ensure transparency and reproducibility. Likewise, as earthquake aftershock sequences unfold, real-time access to high-quality hypocenter data makes it possible to monitor the temporal variability of statistical properties such as the parameters of the Omori law and the Gutenberg-Richter b-value. Such statistical properties are valuable as they provide a measure of how much a particular sequence deviates from expected behavior and can be used when assigning probabilities of aftershock occurrence. To address these demands and provide public access to standard methods employed in statistical seismology, we present well-documented, open-source JavaScript and Java software libraries for the on- and off-line analysis of seismicity. The JavaScript classes facilitate web-based asynchronous access to earthquake catalog data and provide a framework for in-browser display, analysis, and manipulation of catalog statistics; implementations of this framework will be made available on the USGS Earthquake Hazards website. The Java classes, in addition to providing tools for seismicity analysis, provide tools for modeling seismicity and generating synthetic catalogs. These tools are extensible and will be released as part of the open-source OpenSHA Commons library.
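
    The Gutenberg-Richter b-value monitoring mentioned above is a standard computation; a minimal sketch using the classical Aki maximum-likelihood estimator (not the libraries described in the abstract) on a synthetic catalog:

```python
import numpy as np

def b_value_mle(mags, m_min):
    """Aki (1965) maximum-likelihood b-value for a complete catalog of
    continuous magnitudes at or above the completeness magnitude m_min."""
    m = np.asarray(mags, float)
    m = m[m >= m_min]
    return np.log10(np.e) / (m.mean() - m_min)

# Synthetic Gutenberg-Richter catalog with true b = 1.0 above Mc = 3.0:
# G-R implies magnitudes above Mc are exponential with rate b * ln(10).
rng = np.random.default_rng(0)
m_min = 3.0
mags = m_min + rng.exponential(scale=np.log10(np.e), size=100_000)
b_hat = b_value_mle(mags, m_min)
print(round(b_hat, 2))
```

    Tracking `b_hat` in a moving window over an unfolding aftershock sequence is one way to quantify deviation from expected behavior.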

  5. Building the Community Online Resource for Statistical Seismicity Analysis (CORSSA)

    NASA Astrophysics Data System (ADS)

    Michael, A. J.; Wiemer, S.; Zechar, J. D.; Hardebeck, J. L.; Naylor, M.; Zhuang, J.; Steacy, S.; Corssa Executive Committee

    2010-12-01

Statistical seismology is critical to the understanding of seismicity, the testing of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology - especially to those aspects with great impact on public policy - statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal of enhancing the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA). CORSSA is a web-based educational platform that is authoritative, up-to-date, prominent, and user-friendly. We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each containing between four and eight articles. The CORSSA web page, www.corssa.org, was officially unveiled on September 6, 2010, and debuts with an initial set of approximately 10 to 15 articles available online for viewing and commenting, with additional articles to be added over the coming months. Each article will be peer-reviewed and will present a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles will include: introductions to both CORSSA and statistical seismology; basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. A special article will compare and review available statistical seismology software packages.

  6. Digital recovery, modification, and analysis of Tetra Tech seismic horizon mapping, National Petroleum Reserve Alaska (NPRA), northern Alaska

    USGS Publications Warehouse

    Saltus, R.W.; Kulander, Christopher S.; Potter, Christopher J.

    2002-01-01

    We have digitized, modified, and analyzed seismic interpretation maps of 12 subsurface stratigraphic horizons spanning portions of the National Petroleum Reserve in Alaska (NPRA). These original maps were prepared by Tetra Tech, Inc., based on about 15,000 miles of seismic data collected from 1974 to 1981. We have also digitized interpreted faults and seismic velocities from Tetra Tech maps. The seismic surfaces were digitized as two-way travel time horizons and converted to depth using Tetra Tech seismic velocities. The depth surfaces were then modified by long-wavelength corrections based on recent USGS seismic re-interpretation along regional seismic lines. We have developed and executed an algorithm to identify and calculate statistics on the area, volume, height, and depth of closed structures based on these seismic horizons. These closure statistics are tabulated and have been used as input to oil and gas assessment calculations for the region. Directories accompanying this report contain basic digitized data, processed data, maps, tabulations of closure statistics, and software relating to this project.

  7. Sources of Error and the Statistical Formulation of Ms:mb Seismic Event Screening Analysis

    NASA Astrophysics Data System (ADS)

    Anderson, D. N.; Patton, H. J.; Taylor, S. R.; Bonner, J. L.; Selby, N. D.

    2014-03-01

The Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global ban on nuclear explosions, is currently in a ratification phase. Under the CTBT, an International Monitoring System (IMS) of seismic, hydroacoustic, infrasonic and radionuclide sensors is operational, and the data from the IMS are analysed by the International Data Centre (IDC). The IDC provides CTBT signatories basic seismic event parameters and a screening analysis indicating whether an event exhibits explosion characteristics (for example, shallow depth). An important component of the screening analysis is a statistical test of the null hypothesis H0: explosion characteristics, using empirical measurements of seismic energy (magnitudes). The established magnitude used for event size is the body-wave magnitude (denoted mb), computed from the initial segment of a seismic waveform. IDC screening analysis is applied to events with mb greater than 3.5. The Rayleigh-wave magnitude (denoted Ms) is a measure of later-arriving surface-wave energy. Magnitudes are measurements of seismic energy that include adjustments (a physical correction model) for path and distance effects between event and station. Relative to mb, earthquakes generally have a larger Ms magnitude than explosions. This article proposes a hypothesis test (screening analysis) using Ms and mb that expressly accounts for physical correction model inadequacy in the standard error of the test statistic. With this hypothesis test formulation, the announced 2009 nuclear weapon test of the Democratic People's Republic of Korea fails to reject the null hypothesis H0: explosion characteristics.
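
    The screening logic can be illustrated with a toy one-sided test on the Ms - mb difference; the threshold `mu0` and standard error `se` below are illustrative placeholders, not the IDC's operational values, and the paper's contribution is precisely a more careful treatment of that standard error.

```python
from math import erf, sqrt

def ms_mb_screen(ms, mb, mu0=-1.25, se=0.3):
    """One-sided test of H0: explosion characteristics (Ms - mb <= mu0).
    Returns (z, p); a small p rejects H0, screening the event out as an
    earthquake. mu0 and se are hypothetical, not IDC values."""
    z = (ms - mb - mu0) / se
    p = 1.0 - 0.5 * (1.0 + erf(z / sqrt(2.0)))  # upper-tail normal p-value
    return z, p

# An earthquake-like event (large Ms relative to mb) is screened out...
z_eq, p_eq = ms_mb_screen(ms=4.5, mb=4.4)
# ...while an explosion-like event (small Ms) fails to reject H0.
z_ex, p_ex = ms_mb_screen(ms=3.0, mb=4.4)
print(p_eq < 0.05, p_ex < 0.05)  # prints: True False
```

    Inflating `se` to account for correction-model inadequacy, as the article proposes, makes the test more conservative about rejecting H0.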

  8. A comparative study of two statistical approaches for the analysis of real seismicity sequences and synthetic seismicity generated by a stick-slip experimental model

    NASA Astrophysics Data System (ADS)

    Flores-Marquez, Leticia Elsa; Ramirez Rojaz, Alejandro; Telesca, Luciano

    2015-04-01

Two statistical approaches are analyzed for two different types of data sets: one is the seismicity generated by the subduction processes that occurred along the south Pacific coast of Mexico between 2005 and 2012, and the other corresponds to synthetic seismic data generated by a stick-slip experimental model. The statistical methods used in the present study are the visibility graph, to investigate the time dynamics of the series, and the scaled probability density function in the natural time domain, to investigate the critical order of the system. The purpose of this comparison is to show the similarities between the dynamical behaviors of both types of data sets, from the point of view of critical systems. The observed behaviors allow us to conclude that the experimental setup globally reproduces the behavior observed when the same statistical approaches are used to analyze the seismicity of the subduction zone. The present study was supported by the Bilateral Project Italy-Mexico "Experimental stick-slip models of tectonic faults: innovative statistical approaches applied to synthetic seismic sequences", jointly funded by MAECI (Italy) and AMEXCID (Mexico) in the framework of the Bilateral Agreement for Scientific and Technological Cooperation PE 2014-2016.
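
    The natural visibility graph used in this study has a simple geometric definition; a brute-force sketch on tiny toy series (the real analysis would run on earthquake time series and then examine the degree distribution):

```python
import numpy as np

def visibility_edges(y):
    """Natural visibility graph (Lacasa et al., 2008): samples i and j are
    linked if every intermediate sample lies strictly below the straight
    line joining (i, y[i]) and (j, y[j])."""
    y = np.asarray(y, float)
    n = len(y)
    edges = set()
    for i in range(n - 1):
        for j in range(i + 1, n):
            if all(y[k] < y[i] + (y[j] - y[i]) * (k - i) / (j - i)
                   for k in range(i + 1, j)):
                edges.add((i, j))
    return edges

# A dip between two peaks: the peaks "see" each other over the valley...
print(sorted(visibility_edges([1.0, 0.0, 1.0])))  # [(0, 1), (0, 2), (1, 2)]
# ...but on a monotone ramp only consecutive samples are linked.
print(sorted(visibility_edges([1.0, 2.0, 3.0])))  # [(0, 1), (1, 2)]
```

    The degree distribution of this graph is what distinguishes correlated, clustered seismicity from uncorrelated noise.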

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marekova, Elisaveta

Series of relatively large earthquakes in different regions of the Earth are studied. The regions chosen are of high seismic activity and have good contemporary networks for recording the seismic events along them. The main purpose of this investigation is an attempt to describe the seismic process analytically in space and time. We consider the statistical distributions of the distances and the times between consecutive earthquakes (so-called pair analysis). Studies conducted on approximating the statistical distribution of the parameters of consecutive seismic events indicate the existence of characteristic functions that describe them best. Such a mathematical description allows the distributions of the examined parameters to be compared to other model distributions.

  10. A non extensive statistical physics analysis of the Hellenic subduction zone seismicity

    NASA Astrophysics Data System (ADS)

    Vallianatos, F.; Papadakis, G.; Michas, G.; Sammonds, P.

    2012-04-01

The Hellenic subduction zone is the most seismically active region in Europe [Becker & Meier, 2010]. The spatial and temporal distribution of seismicity, as well as the magnitude distribution of earthquakes in the Hellenic subduction zone, has been studied using the concept of Non-Extensive Statistical Physics (NESP) [Tsallis, 1988; Tsallis, 2009]. Non-Extensive Statistical Physics, which is a generalization of Boltzmann-Gibbs statistical physics, is a suitable framework for studying complex systems (Vallianatos, 2011). Using this concept, Abe & Suzuki (2003; 2005) investigated the spatial and temporal properties of seismicity in California and Japan, and more recently Darooneh & Dadashinia (2008) did so in Iran. Furthermore, Telesca (2011) calculated the thermodynamic parameter q of the magnitude distribution of earthquakes in the southern California earthquake catalogue. Using the external seismic zones of 36 seismic sources of shallow earthquakes in the Aegean and the surrounding area [Papazachos, 1990], we formed a dataset of shallow earthquakes (focal depth ≤ 60 km) of the subduction zone, based on the instrumental data of the Geodynamic Institute of the National Observatory of Athens (http://www.gein.noa.gr/, period 1990-2011). The catalogue consists of 12800 seismic events corresponding to 15 polygons of the aforementioned external seismic zones. These polygons define the subduction zone, as they are associated with the compressional stress field that characterizes a subducting regime. For each event, the moment magnitude was calculated from ML according to the suggestions of Papazachos et al. (1997). The cumulative distribution functions of the inter-event times and the inter-event distances, as well as the magnitude distribution for each seismic zone, have been estimated, revealing a variation in the q-triplet along the Hellenic subduction zone. The models used fit the observed distributions rather well, implying the complexity of the spatiotemporal properties of seismicity and the usefulness of NESP in investigating such phenomena, which exhibit scale-free nature and long-range memory effects. Acknowledgments. This work was supported in part by the THALES Program of the Ministry of Education of Greece and the European Union in the framework of the project entitled "Integrated understanding of Seismicity, using innovative Methodologies of Fracture mechanics along with Earthquake and non extensive statistical physics - Application to the geodynamic system of the Hellenic Arc. SEISMO FEAR HELLARC". GM and GP wish to acknowledge the partial support of the Greek State Scholarships Foundation (ΙΚΥ).
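
    The q-exponential function central to the NESP fits above has a compact closed form; a minimal sketch of it (the parameter values are illustrative, not fitted q-triplets from the catalogue):

```python
import numpy as np

def q_exp(x, q):
    """Tsallis q-exponential: (1 + (1-q)x)^(1/(1-q)) where the base is
    positive, 0 otherwise; reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * np.asarray(x, float)
    return np.where(base > 0.0, base ** (1.0 / (1.0 - q)), 0.0)

# At the origin the q-exponential equals 1 for any q, like exp(0):
print(float(q_exp(0.0, 1.3)))
# For q > 1 it decays as a power law in the tail: with q = 1.5,
# q_exp(-x, 1.5) = (1 + x/2)^-2, so q_exp(-198, 1.5) = 100^-2 = 1e-4:
print(float(q_exp(-198.0, 1.5)))
```

    Fitting `q_exp(-t / tau, q)` to the survival function of inter-event times is the usual way the q-value in such studies is obtained.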

  11. Waveform classification and statistical analysis of seismic precursors to the July 2008 Vulcanian Eruption of Soufrière Hills Volcano, Montserrat

    NASA Astrophysics Data System (ADS)

    Rodgers, Mel; Smith, Patrick; Pyle, David; Mather, Tamsin

    2016-04-01

    Understanding the transition between quiescence and eruption at dome-forming volcanoes, such as Soufrière Hills Volcano (SHV), Montserrat, is important for monitoring volcanic activity during long-lived eruptions. Statistical analysis of seismic events (e.g. spectral analysis and identification of multiplets via cross-correlation) can be useful for characterising seismicity patterns and can be a powerful tool for analysing temporal changes in behaviour. Waveform classification is crucial for volcano monitoring, but consistent classification, both during real-time analysis and for retrospective analysis of previous volcanic activity, remains a challenge. Automated classification allows consistent re-classification of events. We present a machine learning (random forest) approach to rapidly classify waveforms that requires minimal training data. We analyse the seismic precursors to the July 2008 Vulcanian explosion at SHV and show systematic changes in frequency content and multiplet behaviour that had not previously been recognised. These precursory patterns of seismicity may be interpreted as changes in pressure conditions within the conduit during magma ascent and could be linked to magma flow rates. Frequency analysis of the different waveform classes supports the growing consensus that LP and Hybrid events should be considered end members of a continuum of low-frequency source processes. By using both supervised and unsupervised machine-learning methods we investigate the nature of waveform classification and assess current classification schemes.
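
    The supervised classification step described above can be sketched with a random forest on spectral features; the two synthetic feature clusters below are hypothetical stand-ins for low-frequency (LP) and Hybrid event features, not the SHV data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(4)
n = 200
# Hypothetical 2-D spectral features: "LP"-like events concentrate energy
# in low-frequency bins; "Hybrid"-like events are broader-band.
lp = np.column_stack([rng.normal(5, 1, n), rng.normal(1, 1, n)])
hy = np.column_stack([rng.normal(3, 1, n), rng.normal(4, 1, n)])
X = np.vstack([lp, hy])
y = np.array([0] * n + [1] * n)         # 0 = LP, 1 = Hybrid

# Train on even-indexed events, evaluate on odd-indexed ones.
clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X[::2], y[::2])
acc = clf.score(X[1::2], y[1::2])
print(acc > 0.9)
```

    A strength of this approach, as the abstract notes, is that retraining on a small labelled set allows consistent re-classification of an entire archive.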

  12. Detailed Analysis of the Interoccurrence Time Statistics in Seismic Activity

    NASA Astrophysics Data System (ADS)

    Tanaka, Hiroki; Aizawa, Yoji

    2017-02-01

The interoccurrence time statistics of seismicity is studied theoretically as well as numerically by taking into account the conditional probability and the correlations among many earthquakes at different magnitude levels. It is known that the interoccurrence time statistics is well approximated by the Weibull distribution, but more detailed information about the interoccurrence times can be obtained from the analysis of the conditional probability. Firstly, we propose the Embedding Equation Theory (EET), in which the conditional probability is described by two kinds of correlation coefficients: one is the magnitude correlation and the other is the inter-event time correlation. Furthermore, the scaling law of each correlation coefficient is clearly determined from numerical data analysis carried out with the Preliminary Determination of Epicenters (PDE) catalog and the Japan Meteorological Agency (JMA) catalog. Secondly, the EET is examined to derive the magnitude dependence of the interoccurrence time statistics, and the multi-fractal relation is successfully formulated. Theoretically we cannot prove the universality of the multi-fractal relation in seismic activity; nevertheless, the theoretical results reproduce all the numerical data in our analysis well, and several common features and invariant aspects are clearly observed. In particular, in the case of stationary ensembles the multi-fractal relation seems to obey an invariant curve, and in the case of non-stationary (moving-time) ensembles for the aftershock regime the multi-fractal relation seems to satisfy a certain invariant curve at all moving times. It is emphasized that the multi-fractal relation plays an important role in unifying the statistical laws of seismicity: the Gutenberg-Richter law and the Weibull distribution are unified in the multi-fractal relation. Some universality conjectures regarding seismicity are briefly discussed.
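
    The Weibull approximation for interoccurrence times can be checked empirically in a few lines; the shape and scale below are illustrative (shape < 1 corresponds to clustered occurrence), not parameters fitted to the PDE or JMA catalogs.

```python
import numpy as np

rng = np.random.default_rng(1)
k, lam = 0.8, 1.0                     # hypothetical shape and scale
samples = lam * rng.weibull(k, size=200_000)

# Compare the empirical survival function with the Weibull form
# P(T > t) = exp(-(t / lam)^k) at one test point.
t = 1.5
emp = np.mean(samples > t)
model = np.exp(-(t / lam) ** k)
print(round(abs(emp - model), 3))
```

    The same comparison over a grid of `t` values, per magnitude threshold, is the kind of check underlying the magnitude-dependent statistics discussed in the abstract.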

  13. Seismic sample areas defined from incomplete catalogues: an application to the Italian territory

    NASA Astrophysics Data System (ADS)

    Mulargia, F.; Tinti, S.

    1985-11-01

The comprehensive understanding of earthquake source physics under real conditions requires the study not of single faults as separate entities but rather of a seismically active region as a whole, accounting for the interaction among different structures. We define as a "seismic sample area" the most convenient region to be used as a natural laboratory for the study of seismic source physics. This coincides with the region where the average large-magnitude seismicity is highest. To this end, the future time and space distributions of large earthquakes must be estimated. Using catalog seismicity as input, the rate of occurrence is not constant but appears generally biased by incompleteness in some parts of the catalog and by possible nonstationarities in seismic activity. We present a statistical procedure which is capable, under a few mild assumptions, of both detecting nonstationarities in seismicity and finding the incomplete parts of a seismic catalog. The procedure is based on Kolmogorov-Smirnov nonparametric statistics and can be applied without assuming a priori the parent distribution of the events. The efficiency of this procedure allows the analysis of small data sets. An application to the Italian territory is presented, using the most recent version of the ENEL seismic catalog. Seismic activity takes place in six well-defined areas, but only five of them have a number of events sufficient for analysis. Barring a few exceptions, seismicity is found to be stationary throughout the whole catalog span 1000-1980. The eastern Alps region stands out as the best "sample area", with the highest average probability of event occurrence per unit time and area. The final objective of this characterization is to stimulate a program of intensified research.
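
    One common Kolmogorov-Smirnov formulation of such a stationarity check (a sketch of the general idea, not the authors' exact procedure): under a constant-rate model, event times within the catalog span are uniformly distributed, so a KS test against the uniform distribution flags nonstationarity or incompleteness.

```python
import numpy as np
from scipy.stats import kstest

rng = np.random.default_rng(2)
t0, t1 = 1000.0, 1980.0                       # catalog span, in years
times = rng.uniform(t0, t1, size=500)         # stationary synthetic catalog

# KS distance between rescaled event times and the uniform distribution;
# a large statistic (small p) would indicate nonstationary rate.
stat, p = kstest((times - t0) / (t1 - t0), "uniform")
print(round(stat, 3))
```

    Running the same test on the early, presumably incomplete, portion of a real catalog is one way to locate where completeness breaks down.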

  14. Role of reservoirs in sustained seismicity of Koyna-Warna region—a statistical analysis

    NASA Astrophysics Data System (ADS)

    Yadav, Amrita; Gahalaut, Kalpna; Purnachandra Rao, N.

    2018-03-01

The Koyna-Warna region in western India is a globally recognized site of reservoir-triggered seismicity near the Koyna and Warna reservoirs. The region has experienced several M > 5 earthquakes in the last five decades, including the M6.3 Koyna earthquake, which is considered the largest triggered earthquake worldwide. In the present study, a detailed statistical analysis has been performed on long-period earthquake catalogues (1968-2004 from MERI and 2005-2012 from CSIR-NGRI) to find out the spatio-temporal influence of the impoundment of the Koyna and Warna reservoirs on the seismicity of the region. Based on the earthquake clusters, we divided the region into three different zones and performed power spectrum and singular spectrum analysis (SSA) on them. For the period 1983-1995, the earthquake zone near the Warna reservoir; for 1996-2004, the earthquake zone near the Koyna reservoir; and for 2005-2012, the earthquake zone near the Warna reservoir were found to be influenced by the annual water-level variations in the reservoirs, confirming the continuous role of both reservoirs in the seismicity of the Koyna-Warna region.
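
    The power-spectrum part of such an analysis amounts to looking for a peak at one cycle per year in the event-rate series; a sketch on a synthetic monthly count series with an artificial annual modulation (a stand-in for the Koyna-Warna catalogues):

```python
import numpy as np

n_months = 240
t = np.arange(n_months)
rng = np.random.default_rng(3)
# Hypothetical monthly counts: mean rate 10, annual modulation, noise.
counts = 10 + 4 * np.cos(2 * np.pi * t / 12) + rng.normal(0, 1, n_months)

# Periodogram of the demeaned series; frequencies in cycles per month.
spec = np.abs(np.fft.rfft(counts - counts.mean())) ** 2
freqs = np.fft.rfftfreq(n_months, d=1.0)
peak = freqs[np.argmax(spec)]
print(round(12 * peak, 2))          # peak in cycles per year; prints 1.0
```

    A matching annual peak in the reservoir water-level series, with a consistent phase lag, is the kind of evidence the study reports.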

  15. Aftershock identification problem via the nearest-neighbor analysis for marked point processes

    NASA Astrophysics Data System (ADS)

    Gabrielov, A.; Zaliapin, I.; Wong, H.; Keilis-Borok, V.

    2007-12-01

A century of observations of world seismicity has revealed a wide variety of clustering phenomena that unfold in the space-time-energy domain and provide the most reliable information about earthquake dynamics. However, there is neither a unifying theory nor a convenient statistical apparatus that would naturally account for the different types of seismic clustering. In this talk we present a theoretical framework for nearest-neighbor analysis of marked processes and obtain new results on the hierarchical approach to studying seismic clustering introduced by Baiesi and Paczuski (2004). Recall that under this approach one defines an asymmetric distance D in the space-time-energy domain such that the nearest-neighbor spanning graph with respect to D becomes a time-oriented tree. We demonstrate how this approach can be used to detect earthquake clustering. We apply our analysis to the observed seismicity of California and to synthetic catalogs from the ETAS model, and show that the earthquake clustering part is statistically different from the homogeneous part. This finding may serve as a basis for an objective aftershock identification procedure.
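
    The Baiesi-Paczuski distance behind this nearest-neighbor construction is easy to sketch; the fractal dimension `d_f`, b-value, and toy catalog below are illustrative choices, not values from the talk.

```python
import numpy as np

def bp_distance(parent, child, d_f=1.6, b=1.0):
    """Baiesi-Paczuski (2004) space-time-magnitude distance from an earlier
    event (parent) to a later one (child):
        eta = dt * r**d_f * 10**(-b * m_parent).
    Smaller eta means a stronger parent-child link."""
    (t_i, x_i, y_i, m_i), (t_j, x_j, y_j, _) = parent, child
    dt = t_j - t_i
    if dt <= 0:
        return np.inf                 # links only point forward in time
    r = max(np.hypot(x_j - x_i, y_j - y_i), 1e-6)
    return dt * r ** d_f * 10.0 ** (-b * m_i)

# Toy catalog: (time in days, x, y in km, magnitude)
events = [(0.0, 0.0, 0.0, 6.0),       # mainshock
          (1.0, 0.1, 0.0, 3.0),       # nearby small event
          (50.0, 5.0, 5.0, 3.5)]      # later event

# Nearest neighbour (candidate parent) of the last event:
etas = [bp_distance(e, events[2]) for e in events[:2]]
parent_index = int(np.argmin(etas))
print(parent_index)                   # the mainshock, despite being older
```

    Thresholding `eta` over a whole catalog separates clustered (aftershock-like) pairs from the homogeneous background, which is the basis of the identification procedure discussed above.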

  16. A stochastic approach to uncertainty quantification in residual moveout analysis

    NASA Astrophysics Data System (ADS)

    Johng-Ay, T.; Landa, E.; Dossou-Gbété, S.; Bordes, L.

    2015-06-01

Oil and gas exploration and production usually rely on the interpretation of a single seismic image, which is obtained from observed data. However, the statistical nature of seismic data and the various approximations and assumptions made are sources of uncertainties which may corrupt the evaluation of parameters. The quantification of these uncertainties is a major issue, intended to support decisions that have important social and commercial implications. Residual moveout analysis, an important step in seismic data processing, is usually performed by a deterministic approach. In this paper we discuss a Bayesian approach to the uncertainty analysis.

  17. Worldwide seismicity in view of non-extensive statistical physics

    NASA Astrophysics Data System (ADS)

    Chochlaki, Kaliopi; Vallianatos, Filippos; Michas, George

    2014-05-01

In the present work we study the distribution of worldwide shallow seismic events that occurred from 1981 to 2011, extracted from the CMT catalog, with magnitude equal to or greater than Mw 5.0. Our analysis is based on the subdivision of the Earth's surface into seismic zones that are homogeneous with regard to seismic activity and the orientation of the predominant stress field. To this end we use the Flinn-Engdahl regionalization (Flinn and Engdahl, 1965), which consists of 50 seismic zones, as modified by Lombardi and Marzocchi (2007), who grouped the 50 FE zones into larger, tectonically homogeneous ones utilizing the cumulative moment tensor method. As a result, Lombardi and Marzocchi (2007) reduce the initial 50 regions to 39, to which we apply the non-extensive statistical physics approach. Non-extensive statistical physics seems to be the most adequate and promising methodological tool for analyzing complex systems, such as the Earth's interior. In this frame, we introduce the q-exponential formulation as the expression of the probability distribution function that maximizes the Sq entropy as defined by Tsallis (1988). In the present work we analyze the inter-event time distribution between successive earthquakes with a q-exponential function in each of the seismic zones defined by Lombardi and Marzocchi (2007), confirming the importance of long-range interactions and the existence of a power-law approximation in the distribution of the inter-event times. Our findings support the idea of universality within the Tsallis approach to describing the Earth's seismicity and present strong evidence of temporal clustering of seismic activity in each of the tectonic zones analyzed. Our analysis as applied to worldwide seismicity with magnitude equal to or greater than Mw 5.5 and 6.0 is also presented, and the dependence of our results on the cut-off magnitude is discussed.
This research has been funded by the European Union (European Social Fund) and Greek national resources under the framework of the "THALES Program: SEISMO FEAR HELLARC" project of the "Education & Lifelong Learning" Operational Programme.

  18. Analysis of Magnitude Correlations in a Self-Similar model of Seismicity

    NASA Astrophysics Data System (ADS)

    Zambrano, A.; Joern, D.

    2017-12-01

A recent model of seismicity that incorporates a self-similar Omori-Utsu relation, used to describe the temporal evolution of earthquake triggering, has been shown to provide a more accurate description of seismicity in Southern California than epidemic-type aftershock sequence models. Forecasting of earthquakes is an active research area, where one of the debated points is whether magnitude correlations exist within real-world seismic data. Prior to this work, the analysis of magnitude correlations in the aforementioned self-similar model had not been addressed. Here we present statistical properties of the magnitude correlations for the self-similar model, along with an analytical study of the branching ratio and criticality parameters.

  19. Seismic hazard assessment: Issues and alternatives

    USGS Publications Warehouse

    Wang, Z.

    2011-01-01

Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used interchangeably, they are fundamentally different. Furthermore, seismic risk is more important in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is more a scientific issue, it deserves special attention because of its significant implications for society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences for society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications. © 2010 Springer Basel AG.
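
    For readers unfamiliar with the PSHA machinery being debated here, a minimal textbook-style sketch of a hazard-curve point: the annual exceedance rate sums, over sources, the source rate times the probability that the ground motion exceeds a level, with all source rates and ground-motion parameters below purely hypothetical.

```python
from math import erf, exp, log, sqrt

def p_exceed(a, median, beta):
    """P(A > a) for lognormally distributed ground motion with the given
    median and logarithmic standard deviation."""
    z = (log(a) - log(median)) / beta
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

# Hypothetical sources: (annual rate nu, median PGA in g at the site, log-std)
sources = [(0.01, 0.3, 0.6),    # rare, strong shaking
           (0.10, 0.1, 0.6)]    # frequent, weak shaking
a = 0.2                         # PGA level of interest, in g

# Total annual exceedance rate, then Poisson probability over 50 years.
rate = sum(nu * p_exceed(a, med, beta) for nu, med, beta in sources)
prob_50yr = 1.0 - exp(-rate * 50.0)
print(round(rate, 3), round(prob_50yr, 2))
```

    The article's critique concerns how such rates and uncertainties are combined, not the arithmetic itself; DSHA would instead report the motion from a single controlling scenario.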

  20. Using 3D visualization and seismic attributes to improve structural and stratigraphic resolution of reservoirs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kerr, J.; Jones, G.L.

    1996-01-01

    Recent advances in hardware and software have given the interpreter and engineer new ways to view 3D seismic data and well bore information. Recent papers have also highlighted the use of various statistics and seismic attributes. By combining new 3D rendering technologies with recent trends in seismic analysis, the interpreter can improve the structural and stratigraphic resolution of hydrocarbon reservoirs. This paper gives several examples using 3D visualization to better define both the structural and stratigraphic aspects of several different structural types from around the world. Statistics, 3D visualization techniques and rapid animation are used to show complex faulting and detailed channel systems. These systems would be difficult to map using either 2D or 3D data with conventional interpretation techniques.

  2. Compendium of Abstracts on Statistical Applications in Geotechnical Engineering.

    DTIC Science & Technology

    1983-09-01

    Research in the application of probabilistic and statistical methods to soil mechanics, rock mechanics, and engineering geology problems has grown markedly... probability, statistics, soil mechanics, rock mechanics, and engineering geology. 2. The purpose of this report is to make available to the U. S... Deformation, Dynamic Response Analysis, Seepage, Soil Permeability and Piping, Earthquake Engineering, Seismology, Settlement and Heave, Seismic Risk Analysis

  3. Catalog of earthquake hypocenters at Alaskan volcanoes: January 1 through December 31, 2006

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.; Searcy, Cheryl

    2008-01-01

    Between January 1 and December 31, 2006, AVO located 8,666 earthquakes of which 7,783 occurred on or near the 33 volcanoes monitored within Alaska. Monitoring highlights in 2006 include: an eruption of Augustine Volcano, a volcanic-tectonic earthquake swarm at Mount Martin, elevated seismicity and volcanic unrest at Fourpeaked Mountain, and elevated seismicity and low-level tremor at Mount Veniaminof and Korovin Volcano. A new seismic subnetwork was installed on Fourpeaked Mountain. This catalog includes: (1) descriptions and locations of seismic instrumentation deployed in the field during 2006, (2) a description of earthquake detection, recording, analysis, and data archival systems, (3) a description of seismic velocity models used for earthquake locations, (4) a summary of earthquakes located in 2006, and (5) an accompanying UNIX tar-file with a summary of earthquake origin times, hypocenters, magnitudes, phase arrival times, location quality statistics, daily station usage statistics, and all files used to determine the earthquake locations in 2006.

  4. Statistical Analysis of Time-Series from Monitoring of Active Volcanic Vents

    NASA Astrophysics Data System (ADS)

    Lachowycz, S.; Cosma, I.; Pyle, D. M.; Mather, T. A.; Rodgers, M.; Varley, N. R.

    2016-12-01

    Despite recent advances in the collection and analysis of time-series from volcano monitoring, and the resulting insights into volcanic processes, challenges remain in forecasting and interpreting activity from near real-time analysis of monitoring data. Statistical methods have the potential to characterise the underlying structure and facilitate intercomparison of these time-series, and so inform interpretation of volcanic activity. We explore the utility of multiple statistical techniques that could be widely applicable to monitoring data, including Shannon entropy and detrended fluctuation analysis, by applying them to various data streams from volcanic vents during periods of temporally variable activity. Each technique reveals changes through time in the structure of some of the data that were not apparent from conventional analysis. For example, we calculate the Shannon entropy (a measure of the randomness of a signal) of time-series from the recent dome-forming eruptions of Volcán de Colima (Mexico) and Soufrière Hills (Montserrat). The entropy of real-time seismic measurements and the count rate of certain volcano-seismic event types from both volcanoes are found to be temporally variable, with these data generally having higher entropy during periods of lava effusion and/or larger explosions. In some instances, the entropy shifts prior to or coincident with changes in seismic or eruptive activity, some of which were not clearly recognised by real-time monitoring. Comparison with other statistics demonstrates the sensitivity of the entropy to the data distribution, but also that it is distinct from conventional statistical measures such as the coefficient of variation. We conclude that each analysis technique examined could provide valuable insights for interpretation of diverse monitoring time-series.
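    The Shannon entropy calculation described above can be sketched in a few lines. This is a minimal illustration on synthetic signals, with the histogram binning chosen arbitrarily; the paper's actual data streams and windowing choices are not reproduced here.

```python
# Sketch: Shannon entropy of a monitoring time-series, computed over
# histogram bins of the amplitude distribution. Higher entropy means the
# signal explores its amplitude range more uniformly (more "random").
import numpy as np

def shannon_entropy(x, bins=16):
    """Entropy (in bits) of the empirical amplitude distribution of x."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                      # drop empty bins (0*log0 -> 0)
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
quiet = np.sin(np.linspace(0, 20, 2048))   # structured, low-entropy signal
noisy = rng.uniform(-1, 1, 2048)           # uniform noise, near-maximal entropy
```

With 16 bins the maximum attainable entropy is log2(16) = 4 bits; a structured signal sits below a uniform-noise signal of the same length.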

  5. Identification and characterization of earthquake clusters: a comparative analysis for selected sequences in Italy

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Gentili, Stefania

    2017-04-01

    Identification and statistical characterization of seismic clusters may provide useful insights into the features of seismic energy release and their relation to the physical properties of the crust within a given region. Moreover, a number of studies based on the spatio-temporal analysis of main-shock occurrence require preliminary declustering of earthquake catalogs. Since the various methods, relying on different physical/statistical assumptions, may lead to diverse classifications of earthquakes into main events and related events, we investigate the classification differences among declustering techniques. Accordingly, a formal selection and comparative analysis of earthquake clusters is carried out for the most relevant earthquakes in North-Eastern Italy, as reported in the local OGS-CRS bulletins, compiled at the National Institute of Oceanography and Experimental Geophysics since 1977. The comparison is then extended to selected earthquake sequences associated with a different seismotectonic setting, namely events that occurred in the region struck by the recent destructive Central Italy earthquakes, making use of INGV data. Various techniques, ranging from classical space-time window methods to ad hoc manual identification of aftershocks, are applied for the detection of earthquake clusters. In particular, a statistical method based on nearest-neighbor distances of events in the space-time-energy domain is considered. Results from cluster identification by the nearest-neighbor method turn out to be quite robust with respect to the time span of the input catalogue, as well as to the minimum magnitude cutoff. The identified clusters for the largest events reported in North-Eastern Italy since 1977 are consistent with those reported in earlier studies aimed at detailed manual aftershock identification. The study shows that the data-driven approach, based on nearest-neighbor distances, can be satisfactorily applied to decompose the seismic catalog into background seismicity and individual sequences of earthquake clusters, including in areas characterized by moderate seismic activity, where standard declustering techniques may turn out to be rather gross approximations. With these results in hand, the main statistical features of seismic clusters are explored, including the complex interdependence of related events, with the aim of characterizing the space-time patterns of earthquake occurrence in North-Eastern Italy and capturing their basic differences from the Central Italy sequences.
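    The nearest-neighbor space-time-energy distance underlying this kind of declustering can be sketched as follows. This is a hedged illustration of a Zaliapin-style metric, eta = dt * dr^df * 10^(-b*m_parent); the b-value, fractal dimension, and the three-event toy catalog are assumptions, not the OGS-CRS or INGV data.

```python
# Minimal sketch of a nearest-neighbor space-time-magnitude distance.
# Small eta -> the event is likely triggered (clustered); large eta ->
# likely background. Parameter values are illustrative assumptions.
import math

b, df = 1.0, 1.6   # Gutenberg-Richter b-value and fractal dimension (assumed)

# Toy catalog: (time in days, x km, y km, magnitude)
catalog = [
    (0.0,   0.0,  0.0,  5.0),   # mainshock
    (0.5,   1.0,  0.5,  3.0),   # nearby, soon after -> small eta (clustered)
    (200.0, 80.0, 60.0, 3.1),   # far away, much later -> large eta (background)
]

def nn_distance(child_idx, catalog):
    """Smallest eta from any earlier event to the child; None for the first event."""
    tj, xj, yj, _ = catalog[child_idx]
    best = None
    for ti, xi, yi, mi in catalog[:child_idx]:
        if ti >= tj:
            continue
        dt = tj - ti
        dr = max(math.hypot(xj - xi, yj - yi), 1e-3)  # avoid zero distance
        eta = dt * dr**df * 10.0**(-b * mi)
        best = eta if best is None else min(best, eta)
    return best
```

In practice a threshold on eta (often found from the bimodal distribution of log eta over the whole catalog) separates triggered events from background seismicity.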

  6. Evidence of non-extensivity and complexity in the seismicity observed during 2011-2012 at the Santorini volcanic complex, Greece

    NASA Astrophysics Data System (ADS)

    Vallianatos, F.; Tzanis, A.; Michas, G.; Papadakis, G.

    2012-04-01

    Since the middle of summer 2011, an increase in the seismicity rates of the volcanic complex of Santorini Island, Greece, has been observed. In the present work, the temporal distribution of seismicity, as well as the magnitude distribution of earthquakes, has been studied using the concept of Non-Extensive Statistical Physics (NESP; Tsallis, 2009), along with the evolution of the Shannon entropy H (also called information entropy). The analysis is based on the earthquake catalogue of the Geodynamic Institute of the National Observatory of Athens for the period July 2011-January 2012 (http://www.gein.noa.gr/). Non-Extensive Statistical Physics, which is a generalization of Boltzmann-Gibbs statistical physics, seems to be a suitable framework for studying complex systems. The observed distributions of seismicity rates at Santorini can be fitted exceptionally well with NESP models. This implies the inherent complexity of the Santorini volcanic seismicity, the applicability of NESP concepts to volcanic earthquake activity, and the usefulness of NESP in investigating phenomena exhibiting multifractality and long-range coupling effects. Acknowledgments. This work was supported in part by the THALES Program of the Ministry of Education of Greece and the European Union in the framework of the project entitled "Integrated understanding of Seismicity, using innovative Methodologies of Fracture mechanics along with Earthquake and non extensive statistical physics - Application to the geodynamic system of the Hellenic Arc. SEISMO FEAR HELLARC". GM and GP wish to acknowledge the partial support of the Greek State Scholarships Foundation (ΙΚΥ).

  7. Global regionalized seismicity in view of Non-Extensive Statistical Physics

    NASA Astrophysics Data System (ADS)

    Chochlaki, Kalliopi; Vallianatos, Filippos; Michas, Georgios

    2018-03-01

    In the present work we study the distribution of the Earth's shallow seismicity in different seismic zones, as it occurred from 1981 to 2011, extracted from the Centroid Moment Tensor (CMT) catalog. Our analysis is based on the subdivision of the Earth's surface into seismic zones that are homogeneous with regard to seismic activity and the orientation of the predominant stress field. For this, we use the Flinn-Engdahl (FE) regionalization (Flinn and Engdahl, 1965), which consists of fifty seismic zones, as modified by Lombardi and Marzocchi (2007). The latter authors grouped the 50 FE zones into larger, tectonically homogeneous ones, utilizing the cumulative moment tensor method, resulting in thirty-nine seismic zones. In each of these seismic zones we study the distribution of seismicity in terms of the frequency-magnitude distribution and the inter-event time distribution between successive earthquakes, a task that is essential for hazard assessment and for better understanding global and regional geodynamics. In our analysis we use non-extensive statistical physics (NESP), which seems to be one of the most adequate and promising methodological tools for analyzing complex systems such as the Earth's seismicity, introducing the q-exponential formulation as the expression of the probability distribution function that maximizes the Sq entropy as defined by Tsallis (1988). The qE parameter is significantly greater than one for all the seismic regions analyzed, with values ranging from 1.294 to 1.504, indicating that magnitude correlations are particularly strong. Furthermore, the qT parameter shows some temporal correlations, and variations with cut-off magnitude show greater temporal correlations when the smaller-magnitude earthquakes are included. The qT for earthquakes with magnitude greater than 5 takes values from 1.043 to 1.353, and as we increase the cut-off magnitude to 5.5 and 6 the qT value ranges from 1.001 to 1.242 and from 1.001 to 1.181 respectively, presenting a significant decrease. Our findings support the idea of universality within the Tsallis approach to describing the Earth's seismicity and present strong evidence of temporal clustering and long-range correlations of seismicity in each of the tectonic zones analyzed.
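    The q-exponential function at the heart of these NESP fits is easy to state. Below is a minimal sketch; the tau0 scale and q value are illustrative assumptions, not the fitted qE/qT values reported above.

```python
# Sketch of the Tsallis q-exponential used in NESP distribution fits.
# It reduces to the ordinary exponential as q -> 1 and develops a
# power-law tail for q > 1 (long-range correlations).
import math

def q_exp(x, q):
    """Tsallis q-exponential: (1 + (1-q)x)^(1/(1-q)), exp(x) in the q->1 limit."""
    if abs(q - 1.0) < 1e-9:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0

def survival(tau, tau0=1.0, q=1.2):
    """Illustrative NESP survival function of inter-event times: P(>tau)."""
    return q_exp(-tau / tau0, q)
```

For q = 1.2 the survival function decays as a power law for large tau, which is how NESP captures temporal clustering that a Poissonian (pure exponential) model cannot.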

  8. Urban seismic risk assessment: statistical repair cost data and probable structural losses based on damage scenario—correlation analysis

    NASA Astrophysics Data System (ADS)

    Eleftheriadou, Anastasia K.; Baltzopoulou, Aikaterini D.; Karabinis, Athanasios I.

    2016-06-01

    The current seismic risk assessment is based on two discrete approaches, actual and probable, with the produced results validated against each other. In the first part of this research, the seismic risk is evaluated from the available data on the mean statistical repair/strengthening or replacement cost for the total number of damaged structures (180,427 buildings) after the 7/9/1999 Parnitha (Athens) earthquake. The actual evaluated seismic risk is then compared to the estimated probable structural losses, presented in the second part of the paper, based on a damage scenario for the same earthquake. The applied damage scenario is based on recently developed damage probability matrices (DPMs) from the Athens (Greece) damage database. The seismic risk estimation refers to 750,085 buildings situated in the extended urban region of Athens. The building exposure is categorized into five typical structural types and represents 18.80 % of the entire building stock in Greece; this information is provided by the National Statistics Service of Greece (NSSG) according to the 2000-2001 census. The seismic input is characterized by the ratio a_g/a_o, where a_g is the regional peak ground acceleration (PGA), evaluated from the earlier estimated research macroseismic intensities, and a_o is the PGA according to the hazard map of the 2003 Greek Seismic Code. Finally, the collected financial data derived from the different National Services responsible for post-earthquake crisis management, concerning the repair/strengthening or replacement costs and other categories of costs for the rehabilitation of earthquake victims (construction and operation of settlements for the earthquake homeless, rent support, demolitions, shorings), are used to determine the final total seismic risk factor.
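    A damage-probability-matrix loss estimate of the kind described above reduces to an expected-value computation. The sketch below uses entirely hypothetical damage states, probabilities, and cost ratios; it is not the Athens DPM.

```python
# Sketch: expected repair cost fraction from one row of a damage
# probability matrix (DPM). All numbers are hypothetical placeholders.

# P(damage state | intensity) for one structural type at one intensity:
dpm_row = {"none": 0.60, "slight": 0.25, "moderate": 0.10,
           "heavy": 0.04, "collapse": 0.01}

# Repair cost as a fraction of replacement cost, per damage state:
cost_ratio = {"none": 0.0, "slight": 0.05, "moderate": 0.2,
              "heavy": 0.6, "collapse": 1.0}

def mean_damage_ratio(dpm_row, cost_ratio):
    """Expected repair-cost fraction: sum over states of P(state) * cost(state)."""
    return sum(p * cost_ratio[k] for k, p in dpm_row.items())

loss = mean_damage_ratio(dpm_row, cost_ratio)  # fraction of replacement cost
```

Multiplying this fraction by the replacement cost and the number of exposed buildings of each structural type gives a scenario loss, which is the quantity compared against the observed repair costs.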

  9. SIG-VISA: Signal-based Vertically Integrated Seismic Monitoring

    NASA Astrophysics Data System (ADS)

    Moore, D.; Mayeda, K. M.; Myers, S. C.; Russell, S.

    2013-12-01

    Traditional seismic monitoring systems rely on discrete detections produced by station processing software; however, while such detections may constitute a useful summary of station activity, they discard large amounts of information present in the original recorded signal. We present SIG-VISA (Signal-based Vertically Integrated Seismic Analysis), a system for seismic monitoring through Bayesian inference on seismic signals. By directly modeling the recorded signal, our approach incorporates additional information unavailable to detection-based methods, enabling higher sensitivity and more accurate localization using techniques such as waveform matching. SIG-VISA's Bayesian forward model of seismic signal envelopes includes physically derived models of travel times and source characteristics as well as Gaussian process (kriging) statistical models of signal properties that combine interpolation of historical data with extrapolation of learned physical trends. Applying Bayesian inference, we evaluate the model on earthquakes as well as the 2009 DPRK test event, demonstrating a waveform matching effect as part of the probabilistic inference, along with results on event localization and sensitivity. In particular, we demonstrate increased sensitivity from signal-based modeling, in which the SIG-VISA signal model finds statistical evidence for arrivals even at stations for which the IMS station processing failed to register any detection.

  10. Iceberg calving as a primary source of regional‐scale glacier‐generated seismicity in the St. Elias Mountains, Alaska

    USGS Publications Warehouse

    O'Neel, Shad; Larsen, Christopher F.; Rupert, Natalia; Hansen, Roger

    2010-01-01

    Since the installation of the Alaska Regional Seismic Network in the 1970s, data analysts have noted nontectonic seismic events thought to be related to glacier dynamics. While loose associations with the glaciers of the St. Elias Mountains have been made, no detailed study of the source locations has been undertaken. We performed a two-step investigation surrounding these events, beginning with manual locations that guided an automated detection and event sifting routine. Results from the manual investigation highlight characteristics of the seismic waveforms including single-peaked (narrowband) spectra, emergent onsets, lack of distinct phase arrivals, and a predominant cluster of locations near the calving termini of several neighboring tidewater glaciers. Through these locations, comparison with previous work, analyses of waveform characteristics, frequency-magnitude statistics and temporal patterns in seismicity, we suggest calving as a source for the seismicity. Statistical properties and time series analysis of the event catalog suggest a scale-invariant process that has no single or simple forcing. These results support the idea that calving is often a response to short-lived or localized stress perturbations. Our results demonstrate the utility of passive seismic instrumentation to monitor relative changes in the rate and magnitude of iceberg calving at tidewater glaciers that may be volatile or susceptible to ensuing rapid retreat, especially when existing seismic infrastructure can be used.
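    Frequency-magnitude statistics of the kind used above are typically summarized by a Gutenberg-Richter b-value. Below is a minimal maximum-likelihood sketch (the Aki/Utsu form), applied to synthetic magnitudes rather than the glacier event catalog; the completeness magnitude and bin width are assumptions.

```python
# Sketch: maximum-likelihood b-value estimate (Aki 1965 / Utsu correction)
# for a frequency-magnitude distribution. Synthetic catalog with b = 1.
import numpy as np

def b_value(mags, m_min, dm=0.1):
    """Aki/Utsu ML estimate: b = log10(e) / (mean(M) - (m_min - dm/2))."""
    mags = np.asarray(mags, dtype=float)
    mags = mags[mags >= m_min]               # enforce catalog completeness
    return float(np.log10(np.e) / (mags.mean() - (m_min - dm / 2.0)))

rng = np.random.default_rng(4)
m_min = 2.0
# Gutenberg-Richter with b = 1 is an exponential with rate b*ln(10) above m_min:
mags = m_min + rng.exponential(1.0 / np.log(10.0), 20000)
```

A scale-invariant (power-law) size distribution like the one inferred for calving events is exactly what a stable b-value over a wide magnitude range expresses.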

  11. Analysis of regional deformation and strain accumulation data adjacent to the San Andreas fault

    NASA Technical Reports Server (NTRS)

    Turcotte, Donald L.

    1991-01-01

    A new approach to the understanding of crustal deformation was developed under this grant. This approach combined aspects of fractals, chaos, and self-organized criticality to provide a comprehensive theory for deformation on distributed faults. It is hypothesized that crustal deformation is an example of comminution: Deformation takes place on a fractal distribution of faults resulting in a fractal distribution of seismicity. Our primary effort under this grant was devoted to developing an understanding of distributed deformation in the continental crust. An initial effort was carried out on the fractal clustering of earthquakes in time. It was shown that earthquakes do not obey random Poisson statistics, but can be approximated in many cases by coupled, scale-invariant fractal statistics. We applied our approach to the statistics of earthquakes in the New Hebrides region of the southwest Pacific because of the very high level of seismicity there. This work was written up and published in the Bulletin of the Seismological Society of America. This approach was also applied to the statistics of the seismicity on the San Andreas fault system.
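    The claim above, that earthquakes deviate from random Poisson statistics, can be checked with a simple diagnostic. This sketch uses the coefficient of variation (CV) of inter-event times on synthetic catalogs; it is an illustration of the idea, not the New Hebrides analysis.

```python
# Sketch: testing for non-Poissonian clustering via the coefficient of
# variation of inter-event times. A Poisson process has CV ~ 1;
# temporally clustered seismicity has CV > 1. Synthetic data only.
import numpy as np

def cv_interevent(times):
    """Coefficient of variation (std/mean) of inter-event times."""
    dt = np.diff(np.sort(times))
    return float(dt.std() / dt.mean())

rng = np.random.default_rng(1)
# Poisson catalog: exponential waiting times
poisson_times = np.cumsum(rng.exponential(1.0, 5000))
# Clustered catalog: bursts of events around well-separated cluster centers
centers = np.cumsum(rng.exponential(50.0, 100))
clustered_times = np.concatenate([c + rng.exponential(0.5, 50) for c in centers])
```

CV well above 1 signals clustering; fractal analyses go further by quantifying how that clustering behaves across time scales.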

  12. Hydromechanical Earthquake Nucleation Model Forecasts Onset, Peak, and Falling Rates of Induced Seismicity in Oklahoma and Kansas

    NASA Astrophysics Data System (ADS)

    Norbeck, J. H.; Rubinstein, J. L.

    2018-04-01

    The earthquake activity in Oklahoma and Kansas that began in 2008 reflects the most widespread instance of induced seismicity observed to date. We develop a reservoir model to calculate the hydrologic conditions associated with the activity of 902 saltwater disposal wells injecting into the Arbuckle aquifer. Estimates of basement fault stressing conditions inform a rate-and-state friction earthquake nucleation model to forecast the seismic response to injection. Our model replicates many salient features of the induced earthquake sequence, including the onset of seismicity, the timing of the peak seismicity rate, and the reduction in seismicity following decreased disposal activity. We present evidence for variable time lags between changes in injection and seismicity rates, consistent with the prediction from rate-and-state theory that seismicity rate transients occur over timescales inversely proportional to stressing rate. Given the efficacy of the hydromechanical model, as confirmed through a likelihood statistical test, the results of this study support broader integration of earthquake physics within seismic hazard analysis.
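    A rate-and-state seismicity-rate model of the Dieterich (1994) type, as invoked above, can be sketched with a one-state-variable ODE. The parameters below (a*sigma, background and injection-era stressing rates) are illustrative assumptions, not the calibrated Oklahoma values.

```python
# Sketch of a Dieterich-style seismicity-rate model: the state variable
# gamma evolves as d(gamma)/dt = (1 - gamma*Sdot) / (a*sigma), and the
# seismicity rate relative to background is R = r / (gamma * Sdot_bg).
# Forward Euler integration; all parameter values are assumptions.
import numpy as np

def seismicity_rate(stress_rate, t, a_sigma=0.1, s_rate_bg=0.001, r_bg=1.0):
    """Integrate the state ODE along t for a given stressing-rate history."""
    gamma = 1.0 / s_rate_bg          # steady state under background stressing
    rates = []
    for i in range(len(t)):
        dt = t[i] - t[i - 1] if i else 0.0
        gamma += dt * (1.0 - gamma * stress_rate[i]) / a_sigma
        rates.append(r_bg / (gamma * s_rate_bg))
    return np.array(rates)

t = np.linspace(0.0, 50.0, 5001)
# Injection raises the stressing rate tenfold between t = 5 and t = 25:
sdot = np.where((t > 5) & (t < 25), 0.01, 0.001)
R = seismicity_rate(sdot, t)
```

The model reproduces the qualitative behavior described in the abstract: the rate rises after stressing increases, lags it by a timescale inversely proportional to the stressing rate, and decays after injection declines.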

  13. Aftershock Sequences and Seismic-Like Organization of Acoustic Events Produced by a Single Propagating Crack

    NASA Astrophysics Data System (ADS)

    Alizee, D.; Bonamy, D.

    2017-12-01

    In inhomogeneous brittle solids like rocks, concrete or ceramics, one usually distinguishes nominally brittle fracture, driven by the propagation of a single crack, from quasi-brittle fracture, resulting from the accumulation of many microcracks. The latter is accompanied by intermittent sharp noise, as revealed, e.g., by the acoustic emission observed in lab-scale compressive fracture experiments or, at geophysical scale, in seismic activity. In both cases, statistical analyses have revealed a complex time-energy organization into aftershock sequences obeying a range of robust empirical scaling laws (the Omori-Utsu, productivity and Båth's laws) that help carry out seismic hazard analysis and damage mitigation. These laws are usually conjectured to emerge from the collective dynamics of microcrack nucleation. In the experiments presented at AGU, we will show that such a statistical organization is not specific to quasi-brittle multicracking situations, but also rules the acoustic events produced by a single crack slowly driven through an artificial rock made of sintered polymer beads. This simpler situation has advantageous properties (statistical stationarity in particular) permitting us to uncover the origins of these seismic laws: both the productivity law and Båth's law result from the scale-free statistics of event energy, and the Omori-Utsu law results from the scale-free statistics of inter-event times. This yields predictions, derived analytically, on how the associated parameters are related. Surprisingly, the resulting relations are also compatible with observations from lab-scale compressive fracture experiments, suggesting that, in these complex multicracking situations as well, the organization into aftershock sequences and the associated seismic laws are ruled by the propagation of individual microcrack fronts, and not by collective, stress-mediated microcrack nucleation. Conversely, the relations are not fulfilled in seismological signals, suggesting that an additional ingredient must be taken into account.
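    The Omori-Utsu law referenced throughout this abstract has a compact form, n(t) = K / (c + t)^p. The sketch below states it with illustrative parameter values (K, c, p are assumptions, not fitted values from the experiments).

```python
# Sketch of the Omori-Utsu aftershock-rate law: the rate of aftershocks
# decays as a power law of time since the mainshock. Parameters are
# illustrative; p is typically close to 1 for tectonic sequences.
def omori_utsu_rate(t, K=100.0, c=0.1, p=1.1):
    """Aftershock rate n(t) = K / (c + t)^p at time t after the mainshock."""
    return K / (c + t) ** p
```

The c constant regularizes the rate at t = 0; for p = 1 the cumulative number of aftershocks grows logarithmically, and for p > 1 it saturates.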

  14. Precursory enhancement of EIA in the morning sector: Contribution from mid-latitude large earthquakes in the north-east Asian region

    NASA Astrophysics Data System (ADS)

    Ryu, Kwangsun; Oyama, Koh-Ichiro; Bankov, Ludmil; Chen, Chia-Hung; Devi, Minakshi; Liu, Huixin; Liu, Jann-Yenq

    2016-01-01

    To investigate whether the link between seismic activity and EIA (equatorial ionization anomaly) enhancement is valid for mid-latitude seismic activity, DEMETER observations around seven large earthquakes (M ⩾ 6.8) in the north-east Asian region were fully analyzed. In addition, statistical analysis was performed for 35 large earthquakes (M ⩾ 6.0) that occurred during the DEMETER observation period. The results suggest that mid-latitude earthquakes do contribute to EIA enhancement, represented as normalized equatorial Ne, and that ionospheric change precedes seismic events, as has been reported in previous studies. According to the statistical studies, the normalized equatorial density enhancement is sensitive and proportional to both the magnitude and the hypocenter depth of an earthquake. The mechanisms that can explain the contribution of mid-latitude seismic activity to EIA variation are briefly discussed based on current explanations of the geochemical and ionospheric processes involved in lithosphere-ionosphere interaction.

  15. The Statistical Meaning of Kurtosis and Its New Application to Identification of Persons Based on Seismic Signals

    PubMed Central

    Liang, Zhiqiang; Wei, Jianming; Zhao, Junyu; Liu, Haitao; Li, Baoqing; Shen, Jie; Zheng, Chunlei

    2008-01-01

    This paper presents a new algorithm that makes use of kurtosis, a statistical parameter, to distinguish the seismic signal generated by a person's footsteps from other signals. It is adaptive to any environment and needs no machine learning or training. As persons or other targets moving on the ground generate continuous signals in the form of seismic waves, we can separate different targets based on the seismic waves they generate. Kurtosis is sensitive to impulsive signals, so it is much more sensitive to the signal generated by footsteps than to the signals generated by vehicles, wind, noise, etc. Kurtosis is commonly employed in financial analysis, but rarely used in other fields. In this paper, we make use of kurtosis to distinguish persons from other targets based on its different sensitivity to different signals. Simulation and application results show that this algorithm is very effective in distinguishing persons from other targets. PMID:27873804
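    The kurtosis discriminator described above can be sketched in a few lines. The synthetic "footstep" and "background" signals below are assumptions made for illustration; the paper's sensor data and decision threshold are not reproduced.

```python
# Sketch: sample kurtosis as an impulsiveness detector. Impulsive
# footstep-like signals give high kurtosis; Gaussian-like background
# noise gives a kurtosis near 3.
import numpy as np

def kurtosis(x):
    """Sample (non-excess) kurtosis: E[(x-mu)^4] / sigma^4; ~3 for Gaussian data."""
    x = np.asarray(x, dtype=float)
    m = x.mean()
    s2 = ((x - m) ** 2).mean()
    return float(((x - m) ** 4).mean() / s2 ** 2)

rng = np.random.default_rng(0)
noise = rng.normal(0.0, 1.0, 4096)      # wind/vehicle-like background (assumed)
footsteps = noise.copy()
footsteps[::512] += 15.0                # sparse impulsive spikes (assumed)
```

A sliding-window version of this statistic, compared against a threshold above 3, is the natural way to turn it into an online detector.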

  16. New Madrid Seismic Zone: a test case for naturally induced seismicity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nava, S.J.

    1983-09-01

    Induced seismicity caused by man-made events, such as the filling of reservoirs, has been well documented. In contrast, naturally induced seismicity has received little attention. It has been shown that a fluctuation of as little as several bars can trigger reservoir-induced earthquakes. Naturally occurring phenomena generate similar fluctuations and could trigger earthquakes where faults in the ambient stress field are suitably oriented and close to failure. The New Madrid Seismic Zone (NMSZ) presents an ideal test case for the study of naturally induced seismicity. The ideal data set for a study of triggering effects must contain a statistically significant number of events, a constant accumulated strain, and a limited focal region. New Madrid earthquakes are well documented from 1974 to the present, down to a magnitude of approximately 1.8. They lie in a distinct fault pattern and occur as a reaction to the regional stress regime. A statistical correlation was made between the earthquakes and a variety of different types of loads, to see whether New Madrid seismicity could be triggered by natural fluctuations. The types of triggers investigated ranged from solid earth tides to variations in barometric pressure, rainfall, and stages of the Mississippi River. This analysis becomes complex because each factor investigated creates individual stresses, as well as embedding reactions to the other factors.

  17. Evolution of seismicity in relation to fluid injection in the North-Western part of The Geysers geothermal field

    NASA Astrophysics Data System (ADS)

    Leptokaropoulos, Konstantinos; Staszek, Monika; Lasocki, Stanisław; Martínez-Garzón, Patricia; Kwiatek, Grzegorz

    2018-02-01

    The Geysers geothermal field located in California, USA, is the largest geothermal site in the world, operating since the 1960s. We investigate and quantify the correlation between the temporal evolution of seismicity and variations in the injection data by examining the time-series with several statistical tools (a binomial test to identify significant rate changes, cross-correlation between seismic and injection data, and b-value variation analysis). To do so, we utilize seismicity and operational data associated with two injection wells (Prati-9 and Prati-29), covering a time period of approximately 7 yr (from November 2007 to August 2014). The seismicity is found to be significantly positively correlated with the injection rate. The maximum correlation occurs with a seismic response delay of ˜2 weeks following injection operations. These results remain very stable even after accounting for hypocentral uncertainties by applying a vertical shift of the event foci of up to 300 m. Our analysis also indicates time variations of the b-value, which exhibits a significant positive correlation with injection rates.
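    Finding the delay at which two rate time-series correlate best, as done above, amounts to a lagged cross-correlation scan. This sketch uses synthetic series with a built-in two-sample delay; it is not the Prati-9/Prati-29 data, and the binning choice is an assumption.

```python
# Sketch: lag (in samples, e.g. weekly bins) at which a seismicity-rate
# series best correlates with an injection-rate series. Synthetic data
# with a known 2-step delay.
import numpy as np

def best_lag(injection, seismicity, max_lag=10):
    """Lag of seismicity behind injection that maximizes Pearson correlation."""
    best = (-2.0, 0)                              # (correlation, lag)
    for lag in range(max_lag + 1):
        a = injection[: len(injection) - lag]     # earlier injection values
        b = seismicity[lag:]                      # later seismicity values
        r = np.corrcoef(a, b)[0, 1]
        best = max(best, (r, lag))
    return best[1]

rng = np.random.default_rng(2)
inj = rng.random(200)
seis = np.roll(inj, 2) + 0.05 * rng.random(200)   # seismicity trails by 2 steps
```

With real catalogs the significance of the peak correlation also has to be assessed, e.g. against surrogates, since rate series are autocorrelated.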

  18. Detecting Seismic Activity with a Covariance Matrix Analysis of Data Recorded on Seismic Arrays

    NASA Astrophysics Data System (ADS)

    Seydoux, L.; Shapiro, N.; de Rosny, J.; Brenguier, F.

    2014-12-01

    Modern seismic networks are recording ground motion continuously all around the world, with very broadband and high-sensitivity sensors. The aim of our study is to apply statistical array-based approaches to the processing of these records. We use methods mainly drawn from random matrix theory to give a statistical description of seismic wavefields recorded at the Earth's surface. We estimate the array covariance matrix and explore the distribution of its eigenvalues, which contains information about the coherency of the sources that generated the studied wavefields. With this approach, we can distinguish between signals generated by isolated deterministic sources and the "random" ambient noise. We design an algorithm that uses the distribution of the array covariance matrix eigenvalues to detect signals corresponding to coherent seismic events. We investigate the detection capacity of our method at different scales and in different frequency ranges by applying it to the records of two networks: (1) the seismic monitoring network operating on the Piton de la Fournaise volcano at La Réunion island, composed of 21 receivers and with an aperture of ~15 km, and (2) the transportable component of the USArray, composed of ~400 receivers with ~70 km inter-station spacing.
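    The covariance-eigenvalue idea described above can be sketched with a synthetic array. This is a hedged illustration (an 8-receiver toy array, not the Piton de la Fournaise or USArray records): a coherent source concentrates variance into one dominant eigenvalue, while incoherent noise spreads it across all of them.

```python
# Sketch: eigenvalue spread of an array covariance matrix as a coherence
# detector. Ratio of the largest eigenvalue to the eigenvalue sum is
# ~1/N for pure noise and approaches 1 for a single coherent source.
import numpy as np

def eigenvalue_ratio(records):
    """Largest covariance eigenvalue divided by the total (records: receivers x samples)."""
    cov = np.cov(records)                # receivers x receivers covariance
    w = np.linalg.eigvalsh(cov)          # ascending eigenvalues
    return float(w[-1] / w.sum())

rng = np.random.default_rng(3)
n_rec, n_samp = 8, 2048
noise = rng.normal(size=(n_rec, n_samp))                 # incoherent wavefield
source = np.sin(np.linspace(0, 60, n_samp))              # common coherent signal
coherent = 0.2 * rng.normal(size=(n_rec, n_samp)) + source
```

Thresholding this ratio (or, more generally, the width of the eigenvalue distribution) over sliding time windows yields an event detector of the kind the abstract describes.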

  19. The persistent signature of tropical cyclones in ambient seismic noise

    NASA Astrophysics Data System (ADS)

    Gualtieri, Lucia; Camargo, Suzana J.; Pascale, Salvatore; Pons, Flavio M. E.; Ekström, Göran

    2018-02-01

    The spectrum of ambient seismic noise shows strong signals associated with tropical cyclones, yet a detailed understanding of these signals and the relationship between them and the storms is currently lacking. Through the analysis of more than a decade of seismic data recorded at several stations located in and adjacent to the northwest Pacific Ocean, here we show that there is a persistent and frequency-dependent signature of tropical cyclones in ambient seismic noise that depends on characteristics of the storm and on the detailed location of the station relative to the storm. An adaptive statistical model shows that the spectral amplitude of ambient seismic noise, and notably of the short-period secondary microseisms, has a strong relationship with tropical cyclone intensity and can be employed to extract information on the tropical cyclones.

  20. Non-extensivity and complexity in the earthquake activity at the West Corinth rift (Greece)

    NASA Astrophysics Data System (ADS)

    Michas, Georgios; Vallianatos, Filippos; Sammonds, Peter

    2013-04-01

    Earthquakes exhibit complex phenomenology that is revealed by the fractal structure in space, time and magnitude. For that reason, tools other than simple Poissonian statistics seem more appropriate to describe the statistical properties of the phenomenon. Here we use Non-Extensive Statistical Physics (NESP) to investigate the inter-event time distribution of the earthquake activity at the west Corinth rift (central Greece). This area is one of the most seismotectonically active in Europe, with significant continental N-S extension and high seismicity rates. NESP is built on the non-additive Tsallis entropy Sq, which includes the Boltzmann-Gibbs entropy as a particular case. This concept has been successfully used for the analysis of a variety of complex dynamic systems, including earthquakes, where fractality and long-range interactions are important. The analysis indicates that the cumulative inter-event time distribution can be successfully described with NESP, reflecting the complexity that characterizes the temporal occurrence of earthquakes. Further, we use the Tsallis entropy (Sq) and the Fisher Information Measure (FIM) to investigate the complexity of the inter-event time distribution through different time windows along the evolution of the seismic activity at the West Corinth rift. The results of this analysis reveal different levels of organization and clustering of the seismic activity in time. Acknowledgments: GM wishes to acknowledge the partial support of the Greek State Scholarships Foundation (IKY).
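    As a rough illustration of the NESP quantities mentioned above, the sketch below implements the Tsallis q-exponential (the form typically fitted to cumulative inter-event time distributions) and the non-additive entropy Sq; both reduce to their Boltzmann-Gibbs counterparts as q → 1. The parameter values are hypothetical, not fits to the Corinth rift data.

```python
import numpy as np

def q_exponential(x, q, beta):
    """Tsallis q-exponential exp_q(-beta*x); reduces to exp(-beta*x) as q -> 1."""
    if np.isclose(q, 1.0):
        return np.exp(-beta * x)
    base = 1.0 + (q - 1.0) * beta * x  # the distribution's support requires base > 0
    return np.where(base > 0.0, base ** (1.0 / (1.0 - q)), 0.0)

def tsallis_entropy(p, q):
    """Non-additive Tsallis entropy S_q; recovers -sum(p ln p) as q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Survival function of inter-event times for a hypothetical q > 1 (fat tail):
tau = np.linspace(0.0, 10.0, 6)
print(q_exponential(tau, q=1.4, beta=0.5))
```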

  1. Comparison between deterministic and statistical wavelet estimation methods through predictive deconvolution: Seismic to well tie example from the North Sea

    NASA Astrophysics Data System (ADS)

    de Macedo, Isadora A. S.; da Silva, Carolina B.; de Figueiredo, J. J. S.; Omoboya, Bode

    2017-01-01

    Wavelet estimation and seismic-to-well tie procedures are at the core of every seismic interpretation workflow. In this paper we perform a comparative study of wavelet estimation methods for seismic-to-well tie. Two approaches are discussed: a deterministic estimation, based on both seismic and well log data, and a statistical estimation, based on predictive deconvolution and the classical assumptions of the convolutional model, which provides a minimum-phase wavelet. Our algorithms for both wavelet estimation methods introduce a semi-automatic approach to determine the optimum parameters of the deterministic and statistical estimations and, further, to estimate the optimum seismic wavelet by searching for the highest correlation coefficient between the recorded trace and the synthetic trace when the time-depth relationship is accurate. Tests with numerical data yield some qualitative conclusions, likely useful for seismic inversion and interpretation of field data, by comparing deterministic and statistical wavelet estimation in detail, especially for the field-data example. The feasibility of this approach is verified on real seismic and well data from the Viking Graben field, North Sea, Norway. Our results also show the influence of washout zones in the well log data on the quality of the well-to-seismic tie.
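    The correlation-based wavelet search described above can be sketched as follows: build synthetic traces by convolving a reflectivity series with candidate wavelets and keep the wavelet that maximizes correlation with the recorded trace. This toy version uses a Ricker wavelet parameterized only by peak frequency, an illustrative simplification of the deterministic and statistical estimators in the paper; the reflectivity series is synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sparse reflectivity series standing in for well-log reflectivity.
reflectivity = np.zeros(300)
reflectivity[rng.choice(300, 15, replace=False)] = rng.standard_normal(15)

def ricker(f, dt=0.002, n=81):
    """Ricker wavelet with peak frequency f (Hz) and sample interval dt (s)."""
    t = (np.arange(n) - n // 2) * dt
    a = (np.pi * f * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

# "Recorded" trace built with a 30 Hz wavelet (the unknown to be recovered).
true_wavelet = ricker(30.0)
recorded = np.convolve(reflectivity, true_wavelet, mode="same")

def tie_correlation(candidate):
    """Correlation coefficient between recorded and synthetic traces."""
    synthetic = np.convolve(reflectivity, candidate, mode="same")
    return np.corrcoef(recorded, synthetic)[0, 1]

# Grid search over candidate peak frequencies; keep the best-correlating wavelet.
freqs = np.arange(10.0, 61.0, 5.0)
best_f = max(freqs, key=lambda f: tie_correlation(ricker(f)))
print(best_f)  # expect 30.0, the frequency used to build the "recorded" trace
```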

  2. Estimating Fault Friction From Seismic Signals in the Laboratory

    NASA Astrophysics Data System (ADS)

    Rouet-Leduc, Bertrand; Hulbert, Claudia; Bolton, David C.; Ren, Christopher X.; Riviere, Jacques; Marone, Chris; Guyer, Robert A.; Johnson, Paul A.

    2018-02-01

    Nearly all aspects of earthquake rupture are controlled by the friction along the fault, which progressively increases with tectonic forcing but in general cannot be directly measured. We show that fault friction can be determined at any time from the continuous seismic signal. In a classic laboratory experiment of repeating earthquakes, we find that the seismic signal follows a specific pattern with respect to fault friction, allowing us to determine the fault's position within its failure cycle. Using machine learning, we show that instantaneous statistical characteristics of the seismic signal are a fingerprint of the fault zone shear stress and frictional state. Further analysis of this fingerprint leads to a simple equation of state quantitatively relating the seismic signal power and the friction on the fault. These results show that fault zone frictional characteristics and the state of stress in the surroundings of the fault can be inferred from seismic waves, at least in the laboratory.
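    A minimal sketch of the idea that windowed statistical features of the continuous signal track friction: here both the "friction" curve and the signal are synthetic, with signal variance tied to friction by construction, and the "equation of state" is a plain linear fit of friction against windowed signal power. All names and numbers are hypothetical, not the authors' laboratory setup or their machine-learning model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical lab-style cycle: friction rises with forcing while the amplitude
# of the continuous acoustic signal grows with it (illustrative construction).
n_windows, window_len = 200, 500
friction = np.linspace(0.3, 0.7, n_windows)
signal = [np.sqrt(f) * rng.standard_normal(window_len) for f in friction]

# Instantaneous statistical characteristic of each signal window: its power.
power = np.array([np.mean(w ** 2) for w in signal])

# A simple "equation of state": linear fit of friction against signal power.
slope, intercept = np.polyfit(power, friction, 1)
predicted = slope * power + intercept

rms_error = np.sqrt(np.mean((predicted - friction) ** 2))
print(rms_error)  # small: signal power alone recovers the friction curve
```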

  3. Assessment of Uncertainties Related to Seismic Hazard Using Fuzzy Analysis

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, N.; Yokoi, T.; Javakhishvili, Z.

    2013-05-01

    Seismic hazard analysis has become a very important issue in the last few decades. New technologies and improved data availability have recently helped many scientists to understand where and why earthquakes happen, the physics of earthquakes, etc., and to appreciate the role of uncertainty in seismic hazard analysis. However, there is still a significant problem of how to handle the existing uncertainty; the same lack of information makes it difficult to quantify uncertainty accurately. Usually attenuation curves are obtained statistically, by regression analysis. Statistical and probabilistic analyses show overlapping results for the site coefficients. This overlap occurs not only at the border between two neighboring classes but also among more than three classes. Although the analysis starts by classifying sites in geological terms, these site coefficients are not classified at all. In the present study, this problem is addressed using fuzzy set theory. With membership functions, the ambiguities at the border between neighboring classes can be avoided. Fuzzy set theory is applied to southern California alongside the conventional approach. In this study, the standard deviations showing the variation within each site class obtained by fuzzy set theory and by the classical approach are compared. The results show that when data are insufficient for hazard assessment, site classification based on fuzzy set theory yields smaller standard deviations than the classical approach, which is direct evidence of reduced uncertainty.
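    The fuzzy-membership idea can be sketched with triangular membership functions, which let a site near a class boundary belong partially to two neighboring site classes instead of being forced into one. The class names and Vs30 breakpoints below are hypothetical, chosen only to illustrate the overlap at class borders.

```python
def triangular_membership(x, a, b, c):
    """Triangular fuzzy membership: 0 at a, rising to 1 at b, back to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical overlapping site classes keyed on Vs30 (m/s); breakpoints illustrative.
site_classes = {
    "soft soil": (0.0, 180.0, 360.0),
    "stiff soil": (180.0, 360.0, 760.0),
    "rock": (360.0, 760.0, 1500.0),
}

def classify(vs30):
    """Degree of membership of a site in each class (need not sum to 1)."""
    return {name: triangular_membership(vs30, *abc) for name, abc in site_classes.items()}

# A site near the soft/stiff boundary belongs partially to both classes:
print(classify(300.0))
```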

  4. Kernel Smoothing Methods for Non-Poissonian Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Woo, Gordon

    2017-04-01

    For almost fifty years, the mainstay of probabilistic seismic hazard analysis has been the methodology developed by Cornell, which assumes that earthquake occurrence is a Poisson process and that the spatial distribution of epicentres can be represented by a set of polygonal source zones, within which seismicity is uniform. Building on Vere-Jones' use of kernel smoothing methods for earthquake forecasting, these methods were adapted in 1994 by the author for application to probabilistic seismic hazard analysis. There is no need for ambiguous boundaries of polygonal source zones, nor for the hypothesis of time independence of earthquake sequences. In Europe, there are many regions where seismotectonic zones are not well delineated, and where there is a dynamic stress interaction between events, so that they cannot be described as independent. Starting with the Amatrice earthquake of 24 August 2016, the damaging earthquakes that struck Central Italy over the following months were not independent events. Removing foreshocks and aftershocks is not only an ill-defined task; it has a material effect on the seismic hazard computation. Because of the spatial dispersion of epicentres, and the clustering of magnitudes of the largest events in a sequence, which might all be around magnitude 6, the specific event causing the highest ground motion can vary from one site location to another. Where significant active faults have been clearly identified geologically, they should be modelled as individual seismic sources. The remaining background seismicity should be modelled as non-Poissonian using statistical kernel smoothing methods. This approach was first applied for seismic hazard analysis at a UK nuclear power plant two decades ago, and should be included within logic-trees for future probabilistic seismic hazard analyses at critical installations within Europe. In this paper, various salient European applications are given.
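    A minimal sketch of kernel-smoothed seismicity, assuming a Gaussian kernel with a fixed bandwidth (real implementations often use adaptive or magnitude-dependent kernels): each epicenter contributes a smooth bump of density, so no polygonal source-zone boundaries are needed. The toy catalog is synthetic.

```python
import numpy as np

def kernel_rate(epicenters, grid_points, bandwidth_km):
    """Gaussian-kernel-smoothed event density at grid_points (km coordinates)."""
    diffs = grid_points[:, None, :] - epicenters[None, :, :]
    sq = np.sum(diffs ** 2, axis=-1)
    k = np.exp(-sq / (2.0 * bandwidth_km ** 2)) / (2.0 * np.pi * bandwidth_km ** 2)
    return k.sum(axis=1)  # events per km^2, before any rate normalization

rng = np.random.default_rng(2)
# Hypothetical catalog: a tight cluster near (0, 0) plus sparse background events.
cluster = rng.normal(0.0, 5.0, size=(80, 2))
background = rng.uniform(-100.0, 100.0, size=(20, 2))
epicenters = np.vstack([cluster, background])

# Smoothed density at the cluster centre versus a far-field point.
grid = np.array([[0.0, 0.0], [80.0, 80.0]])
density = kernel_rate(epicenters, grid, bandwidth_km=10.0)
print(density)  # density at the cluster centre far exceeds the far-field value
```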

  5. Generalized statistical mechanics approaches to earthquakes and tectonics.

    PubMed

    Vallianatos, Filippos; Papadakis, Giorgos; Michas, Georgios

    2016-12-01

    Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract a wide scientific interest and are incorporated in the probabilistic forecasting of seismicity in local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale to the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes.

  6. Generalized statistical mechanics approaches to earthquakes and tectonics

    PubMed Central

    Papadakis, Giorgos; Michas, Georgios

    2016-01-01

    Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract a wide scientific interest and are incorporated in the probabilistic forecasting of seismicity in local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale to the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes. PMID:28119548

  7. Effects of long-term fluid injection on induced seismicity parameters and maximum magnitude in northwestern part of The Geysers geothermal field

    NASA Astrophysics Data System (ADS)

    Kwiatek, Grzegorz; Martínez-Garzón, Patricia; Dresen, Georg; Bohnhoff, Marco; Sone, Hiroki; Hartline, Craig

    2015-10-01

    The long-term temporal and spatial changes in the statistical, source, and stress characteristics of one cluster of induced seismicity recorded at The Geysers geothermal field (U.S.) are analyzed in relation to the field operations, fluid migration, and constraints on the maximum likely magnitude. Two injection wells, Prati-9 and Prati-29, located in the northwestern part of the field, and their associated seismicity, composed of 1776 events recorded throughout a 7 year period, were analyzed. The seismicity catalog was relocated, and the source characteristics, including focal mechanisms and static source parameters, were refined using first-motion polarity, spectral fitting, and mesh spectral ratio analysis techniques. The source characteristics, together with statistical parameters (b value) and cluster dynamics, were used to investigate the details of the fluid migration scheme in the vicinity of the injection wells. The observed temporal, spatial, and source characteristics were clearly attributable to fluid injection and fluid migration toward greater depths, involving increasing pore pressure in the reservoir. The seasonal changes of injection rates were found to directly impact the shape and spatial extent of the seismic cloud. A tendency of larger seismic events to occur closer to the injection wells, and a correlation between the spatial extent of the seismic cloud and the source sizes of the largest events, were observed, suggesting geometrical constraints on the maximum likely magnitude and its correlation with the average injection rate and the volume of fluids present in the reservoir.
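    A b-value computation of the kind referred to above can be sketched with Aki's maximum-likelihood estimator; the catalog below is synthetic, generated with a known b-value of 1.0 above an assumed completeness magnitude.

```python
import numpy as np

def b_value_aki(magnitudes, m_c, delta_m=0.0):
    """Maximum-likelihood b-value (Aki, 1965). Set delta_m to the bin width
    (Utsu's correction) for binned magnitudes; 0 for continuous values."""
    m = np.asarray(magnitudes, dtype=float)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - (m_c - delta_m / 2.0))

# Synthetic Gutenberg-Richter catalog: continuous magnitudes, true b = 1.0 above Mc.
rng = np.random.default_rng(3)
m_c = 1.5
beta = 1.0 * np.log(10.0)  # GR exponent in natural-log units: b * ln(10)
mags = m_c + rng.exponential(1.0 / beta, size=20000)

print(b_value_aki(mags, m_c))  # close to 1.0
```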

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harben, P E; Harris, D; Myers, S

    Seismic imaging and tracking methods have intelligence and monitoring applications. Current systems, however, do not adequately calibrate or model the unknown geological heterogeneity, nor are they designed for rapid data acquisition and analysis in the field. This project seeks to build the core technological capabilities, coupled with innovative deployment, processing, and analysis methodologies, to allow seismic methods to be effectively utilized for seismic imaging and vehicle tracking where rapid (minutes to hours) and real-time analysis is required. The goal of this project is to build capabilities in acquisition system design and utilization, in full 3D finite-difference modeling, and in statistical characterization of geological heterogeneity. These capabilities, coupled with a rapid field analysis methodology based on matched field processing, are applied to problems associated with surveillance, battlefield management, finding hard and deeply buried targets, and portal monitoring. This project benefits the U.S. military and intelligence community in support of LLNL's national-security mission. FY03 was the final year of this project. In the 2.5 years this project was active, numerous and varied developments and milestones were accomplished. A wireless communication module for seismic data was developed to facilitate rapid seismic data acquisition and analysis. The E3D code was enhanced to include topographic effects. Codes were developed to implement the Karhunen-Loeve (K-L) statistical methodology for generating geological heterogeneity that can be utilized in E3D modeling. The matched field processing methodology applied to vehicle tracking, based on a field calibration to characterize geological heterogeneity, was tested and successfully demonstrated in a tank tracking experiment at the Nevada Test Site.
A 3-seismic-array vehicle tracking testbed was installed on-site at LLNL for testing real-time seismic tracking methods. A field experiment was conducted over a tunnel at the Nevada Test Site that quantified the tunnel reflection signal and, coupled with modeling, identified key needs and requirements in the experimental layout of sensors. A large field experiment was conducted at the Lake Lynn Laboratory, a mine safety research facility in Pennsylvania, over a tunnel complex in realistic, difficult conditions. This experiment gathered the data necessary for a full 3D attempt to apply the methodology. The experiment also collected data for analyzing the capability to detect and locate in-tunnel explosions for mine safety and other applications.

  9. Studies of Fault Interactions and Regional Seismicity Using Numerical Simulations

    NASA Astrophysics Data System (ADS)

    Yikilmaz, Mehmet Burak

    Numerical simulations are routinely used for weather and climate forecasting, and it is desirable to simulate regional seismicity for seismic hazard analysis in the same way. One such simulation tool is the Virtual California earthquake simulator. We have used Virtual California (VC) to study various aspects of fault interaction and analyzed the statistics of synthetically generated earthquake recurrence times and magnitudes. The first chapter of this dissertation investigates the behavior of seismicity simulations using three relatively simple models involving a straight strike-slip fault. We show that a series of historical earthquakes observed along the Nankai Trough in Japan exhibits patterns similar to those obtained in our model II. In the second chapter we use Virtual California to study regional seismicity in northern California. We generate synthetic catalogs of seismicity using a composite simulation and use these catalogs to analyze frequency-magnitude and recurrence-interval statistics at both regional and fault-specific levels, comparing our modeled rates of seismicity and spatial variability with observations. The final chapter explores the jump distance for a rupture propagating across a stepping strike-slip fault. Our study indicates that as the separation distance increases from 2.5 to 5.5 km, the percentage of events that jump from one fault to the next decreases significantly. We find that these step-over distances are in good agreement with geologically observed values.

  10. Periodicity of Strong Seismicity in Italy: Schuster Spectrum Analysis Extended to the Destructive Earthquakes of 2016

    NASA Astrophysics Data System (ADS)

    Bragato, P. L.

    2017-10-01

    The strong earthquakes that occurred in Italy between 2009 and 2016 represent an abrupt acceleration of seismicity relative to the previous 30 years. Such behavior seems to agree with the periodic rate change I observed in a previous paper. The present work improves on that study by extending the data set to the end of 2016, adopting the latest version of the historical seismic catalog of Italy, and introducing Schuster spectrum analysis for the detection of the oscillatory period and the assessment of its statistical significance. Applied to the declustered catalog of Mw ≥ 6 earthquakes that occurred between 1600 and 2016, the analysis identifies a marked periodicity of 46 years, recognized above the 95% confidence level. Monte Carlo simulation shows that the oscillatory behavior is stable with respect to random errors in magnitude estimation. A parametric oscillatory model for the annual rate of seismicity is estimated by likelihood maximization under the hypothesis of an inhomogeneous Poisson point process. According to the Akaike Information Criterion, this model outperforms the simpler homogeneous one with constant annual rate. A further element emerges from the analysis: so far, despite recent earthquakes, Italian seismicity remains within a long-term decreasing trend established since the first half of the twentieth century.
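    The Schuster test underlying Schuster spectrum analysis can be sketched directly: map event times to phases for a trial period, sum the unit phasors, and convert the resultant length into the probability that the observed alignment would arise by chance. The toy catalog below is hypothetical, built with a 46-year period purely for illustration.

```python
import numpy as np

def schuster_p_value(event_times, period):
    """Schuster test: probability that the phase alignment of event times at
    the trial period could be produced by a uniform random walk."""
    phases = 2.0 * np.pi * np.asarray(event_times, dtype=float) / period
    d2 = np.sum(np.cos(phases)) ** 2 + np.sum(np.sin(phases)) ** 2
    return np.exp(-d2 / len(phases))

rng = np.random.default_rng(4)
# Hypothetical catalog: events clustered every 46 years with +/- 4-year jitter.
periodic_times = np.concatenate([46.0 * k + rng.normal(0.0, 4.0, 5) for k in range(9)])
random_times = rng.uniform(0.0, 414.0, periodic_times.size)

print(schuster_p_value(periodic_times, 46.0))  # very small: periodicity detected
print(schuster_p_value(random_times, 46.0))    # typically not significant
```

Scanning the trial period over a range of values and plotting the p-value gives the Schuster spectrum referred to in the abstract.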

  11. Analysis of the Seismicity Preceding Large Earthquakes

    NASA Astrophysics Data System (ADS)

    Stallone, A.; Marzocchi, W.

    2016-12-01

    The most common earthquake forecasting models assume that the magnitude of the next earthquake is independent of the past. This assumption is probably one of the most severe limitations on the capability to forecast large earthquakes. In this work, we investigate this specific aspect empirically, exploring whether spatial-temporal variations in seismicity encode some information on the magnitude of future earthquakes. For this purpose, and to verify the universality of the findings, we consider seismic catalogs covering quite different space-time-magnitude windows: the Alto Tiberina Near Fault Observatory (TABOO) catalog and the California and Japanese seismic catalogs. Our method is inspired by the statistical methodology proposed by Zaliapin (2013) to distinguish triggered from background earthquakes, using nearest-neighbor clustering analysis in a two-dimensional plane defined by rescaled time and space. In particular, we generalize the nearest-neighbor metric to one based on the k-nearest neighbors, which allows us to consider the overall space-time-magnitude distribution of the k earthquakes (k-foreshocks) that anticipate one target event (the mainshock); we then analyze the statistical properties of the clusters identified in this rescaled space. In essence, the main goal of this study is to verify whether different classes of mainshock magnitude are characterized by distinctive k-foreshock distributions. The final step is to show how the findings of this work may (or may not) improve the skill of existing earthquake forecasting models.
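    The nearest-neighbor distance of Zaliapin's methodology (before the k-nearest-neighbor generalization described above) can be sketched as follows; the b-value, fractal dimension, and toy catalog are hypothetical, and real applications work with geographic coordinates and full catalogs.

```python
import numpy as np

def nearest_neighbor_distances(times, xs, ys, mags, b=1.0, d_f=1.6):
    """Zaliapin-style nearest-neighbor distance: for each event j, the minimum
    over earlier events i of eta = t_ij * r_ij**d_f * 10**(-b * m_i), with
    t in years, r in km, and d_f the fractal dimension of epicenters."""
    n = len(times)
    eta = np.full(n, np.inf)  # the first event has no parent
    for j in range(1, n):
        dt = times[j] - times[:j]                     # > 0 for earlier events
        r = np.hypot(xs[j] - xs[:j], ys[j] - ys[:j])
        eta_ij = dt * np.maximum(r, 1e-6) ** d_f * 10.0 ** (-b * mags[:j])
        eta[j] = eta_ij.min()
    return eta

# Hypothetical toy catalog: a mainshock at t=10 followed by tight aftershocks.
times = np.array([1.0, 5.0, 10.0, 10.01, 10.02, 12.0])
xs = np.array([0.0, 40.0, 80.0, 80.1, 80.2, 30.0])
ys = np.zeros(6)
mags = np.array([3.0, 3.2, 6.0, 3.0, 3.1, 3.0])

eta = nearest_neighbor_distances(times, xs, ys, mags)
print(eta)  # aftershocks (indices 3, 4) have far smaller eta than background
```

Small eta marks triggered (clustered) events; the bimodal distribution of eta over a full catalog separates triggered from background seismicity.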

  12. An automated multi-scale network-based scheme for detection and location of seismic sources

    NASA Astrophysics Data System (ADS)

    Poiata, N.; Aden-Antoniow, F.; Satriano, C.; Bernard, P.; Vilotte, J. P.; Obara, K.

    2017-12-01

    We present a recently developed method - BackTrackBB (Poiata et al. 2016) - that allows imaging energy radiation from different seismic sources (e.g., earthquakes, LFEs, tremors) in different tectonic environments using continuous seismic records. The method exploits multi-scale, frequency-selective coherence in the wavefield recorded by regional seismic networks or local arrays. The detection and location scheme is based on space-time reconstruction of the seismic sources through an imaging function built from the sum of station-pair time-delay likelihood functions projected onto theoretical 3D time-delay grids. This imaging function is interpreted as the location likelihood of the seismic source. A signal pre-processing step constructs a multi-band statistical representation of the non-stationary signal (time series) by means of higher-order statistics or energy-envelope characteristic functions. This signal processing is designed to detect signal transients in time - of different scales and a priori unknown predominant frequency - potentially associated with a variety of sources (e.g., earthquakes, LFEs, tremors), and to improve the performance and robustness of the detection-and-location step. The initial detection-location, based on a single-phase analysis with the P- or S-phase only, can then be improved recursively in a station selection scheme. This scheme - exploiting the 3-component records - makes use of P- and S-phase characteristic functions, extracted after a polarization analysis of the event waveforms, and combines the single-phase imaging functions with the S-P differential imaging functions. The performance of the method is demonstrated here in different tectonic environments: (1) analysis of the one-year-long precursory phase of the 2014 Iquique earthquake in Chile; (2) detection and location of tectonic tremor sources and low-frequency earthquakes during multiple episodes of tectonic tremor activity in southwestern Japan.

  13. Induced seismicity and implications for CO2 storage risk

    NASA Astrophysics Data System (ADS)

    Gerstenberger, M. C.; Nicol, A.; Bromley, C.; Carne, R.; Chardot, L.; Ellis, S. M.; Jenkins, C.; Siggins, T.; Viskovic, P.

    2012-12-01

    We provide an overview of a recently completed report for the IEA GHG that represents a comprehensive review of current research and observations in induced seismicity, its risk to successful completion of Carbon Capture and Storage (CCS) projects and potential mitigation measures. We focus on two topics: a meta-analysis of related data from multiple injection projects around the globe and the implications of these data for CCS induced seismicity risk management. Published data have been compiled from injection and extraction projects around the globe to examine statistical relationships between possible controlling factors and induced seismicity. Quality control of such observational earthquake data sets is crucial to ensure robust results and issues with bias and completeness of the data set will be discussed. Analyses of the available data support previous suggestions that the locations, numbers and magnitudes of induced earthquakes are dependent on a range of factors, including the injection rate, total injected fluid volume, the reservoir permeability and the proximity of pre-existing faults. Increases in the injection rates and total volume of fluid injected, for example, typically raise reservoir pressures and increase the likelihood of elevated seismicity rates and maximum magnitudes of induced earthquakes. The risks associated with induced seismicity at CCS sites can be reduced and mitigated using a systematic and structured risk management programme. While precise forecasts of the expected induced seismicity may never be possible, a thorough risk management procedure should include some level of knowledge of the possible behaviour of induced seismicity. Risk management requires estimates of the expected magnitude, number, location and timing of potential induced earthquakes. Such forecasts should utilise site specific observations together with physical and statistical models that are optimised for the site. 
Statistical models presently show the most promise for forecasting induced seismicity after injection has commenced; however, with further development, physical models could become key predictive tools. Combining forecasts with real-time monitoring of induced seismicity will be necessary to maintain an accurate picture of the seismicity and to allow mitigation of the associated risks as they evolve. To optimise the utility of monitoring and mitigation programmes, site performance and management guidelines for the acceptable levels and impacts of induced seismicity, together with key control measures, should be established prior to injection. Such guidelines have been developed for Enhanced Geothermal Systems and should provide the starting point for a strategy to manage induced seismicity at CCS sites.

  14. Statistical analysis of the El Niño-Southern Oscillation and sea-floor seismicity in the eastern tropical Pacific.

    PubMed

    Guillas, Serge; Day, Simon J; McGuire, B

    2010-05-28

    We present statistical evidence for a temporal link between variations in the El Niño-Southern Oscillation (ENSO) and the occurrence of earthquakes on the East Pacific Rise (EPR). We adopt a zero-inflated Poisson regression model to represent the relationship between the number of earthquakes in the Easter microplate on the EPR and ENSO (expressed using the southern oscillation index (SOI) for east Pacific sea-level pressure anomalies) from February 1973 to February 2009. We also examine the relationship between the numbers of earthquakes and sea levels, as retrieved by Topex/Poseidon from October 1992 to July 2002. We observe a significant (95% confidence level) positive influence of SOI on seismicity: positive SOI values trigger more earthquakes over the following 2 to 6 months than negative SOI values. There is a significant negative influence of absolute sea levels on seismicity (at 6 months lag). We propose that increased seismicity is associated with ENSO-driven sea-surface gradients (rising from east to west) in the equatorial Pacific, leading to a reduction in ocean-bottom pressure over the EPR by a few kilopascals. This relationship is opposite to reservoir-triggered seismicity and suggests that EPR fault activity may be triggered by plate flexure associated with the reduced pressure.
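    A zero-inflated Poisson model of the kind used above mixes a point mass at zero with an ordinary Poisson count distribution, accommodating catalogs with more zero-count periods than a plain Poisson model allows. A minimal sketch of its probability mass function (the parameter values are hypothetical, not the fitted regression):

```python
import math

def zip_pmf(k, lam, pi_zero):
    """Zero-inflated Poisson: with probability pi_zero the count is a
    structural zero; otherwise it is drawn from Poisson(lam)."""
    poisson = math.exp(-lam) * lam ** k / math.factorial(k)
    if k == 0:
        return pi_zero + (1.0 - pi_zero) * poisson
    return (1.0 - pi_zero) * poisson

# Hypothetical monthly earthquake counts with excess zeros:
lam, pi_zero = 2.0, 0.3
print(zip_pmf(0, lam, pi_zero))  # zero probability inflated above exp(-lam)
print(sum(zip_pmf(k, lam, pi_zero) for k in range(60)))  # sums to ~1.0
```

In the regression version, lam (and optionally pi_zero) is linked to covariates such as the SOI, typically through a log link.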

  15. Temporal evolution of a seismic sequence induced by a gas injection in the Eastern coast of Spain.

    PubMed

    Ruiz-Barajas, S; Sharma, N; Convertito, V; Zollo, A; Benito, B

    2017-06-06

    Induced seismicity associated with energy production is becoming an increasingly important issue worldwide for the hazard it poses to the exposed population and structures. We analyze one of the rare cases of induced seismicity associated with underwater gas storage operations, observed at the Castor platform, located in the Gulf of Valencia, eastern Spain, near a complex and important geological structure. In September 2013, gas injection operations started at Castor, producing a series of seismic events around the reservoir area. The larger-magnitude events (up to 4.2) took place some days after the end of the injection, with EMS intensities in coastal towns up to degree III. In this work, the seismic sequence is analyzed with the aim of detecting changes in the statistical parameters describing earthquake occurrence before and after the injection, and of identifying possible proxies to be used for monitoring the sequence evolution. Moreover, we explore the potential predictability of these statistical parameters, which can be used to control the field operations in injection/storage fluid reservoirs. We first perform a retrospective analysis and then a prospective one. We use different techniques for estimating the maximum expected magnitude that can occur due to anthropogenic activities at Castor.

  16. Seismic precursory patterns before a cliff collapse and critical point phenomena

    USGS Publications Warehouse

    Amitrano, D.; Grasso, J.-R.; Senfaute, G.

    2005-01-01

    We analyse the statistical pattern of seismicity before a 1-2 × 10³ m³ chalk cliff collapse on the Normandie coast, western France. We show that a power-law acceleration of seismicity rate and energy, in both the 40 Hz-1.5 kHz and 2 Hz-10 kHz frequency ranges, is defined over 3 orders of magnitude within 2 hours of the collapse time. Simultaneously, the average size of the seismic events increases toward the time to failure. These in situ results are derived from the only station located within one rupture length of the rock-fall rupture plane. They mimic the "critical point"-like behavior recovered from physical and numerical experiments before brittle failures and tertiary creep failures. Our analysis of this first seismic monitoring data set of a cliff collapse suggests that thermodynamic phase-transition models for failure may apply to cliff collapse. Copyright 2005 by the American Geophysical Union.
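    The power-law acceleration of precursory rate toward the failure time can be sketched by fitting the exponent in rate ∝ (t_f − t)^(−p) via log-log regression on synthetic data; the exponent, noise level, and time units here are hypothetical, and real applications must also estimate t_f itself.

```python
import numpy as np

rng = np.random.default_rng(5)
t_f = 100.0                       # known failure (collapse) time, for illustration
t = np.linspace(0.0, 99.0, 200)   # observation times before failure

# Synthetic precursory rate accelerating as (t_f - t)^(-p), with p = 1.0,
# perturbed by mild multiplicative noise.
p_true = 1.0
rate = (t_f - t) ** (-p_true) * np.exp(0.05 * rng.standard_normal(t.size))

# Fit the exponent by linear regression in log-log coordinates:
#   log(rate) = -p * log(t_f - t) + const
slope, _ = np.polyfit(np.log(t_f - t), np.log(rate), 1)
p_est = -slope
print(p_est)  # close to 1.0
```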

  17. Statistical Analysis of the Correlation between Microwave Emission Anomalies and Seismic Activity Based on AMSR-E Satellite Data

    NASA Astrophysics Data System (ADS)

    qin, kai; Wu, Lixin; De Santis, Angelo; Zhang, Bin

    2016-04-01

    Pre-seismic thermal IR anomalies and ionospheric disturbances have been widely reported using Earth observation systems (EOS). To investigate the possible physical mechanisms, a series of detection experiments on rock loaded to fracturing was conducted. Some experimental studies have demonstrated that microwave radiation energy from loaded rock increases in specific frequency bands and that the radiation properties can reflect the deformation process of rock fracture. This experimental result indicates the possibility that microwaves are emitted before earthquakes. Such microwave signals have recently been found to be detectable before some earthquakes in the brightness temperature data obtained by the Advanced Microwave Scanning Radiometer for EOS (AMSR-E) aboard the satellite Aqua. This suggests that AMSR-E, with vertical- and horizontal-polarization capability in six frequency bands (6.925, 10.65, 18.7, 23.8, 36.5, and 89.0 GHz), would be able to detect an earthquake associated with rock fracture or plate slip. However, a statistical analysis of the correlation between satellite-observed microwave emission anomalies and seismic activity is first required. Here, we focus on the Kamchatka peninsula for a statistical study, considering its high seismic activity and the dense orbit coverage of AMSR-E at high latitudes. Eight years (2003-2010) of AMSR-E microwave brightness temperature data were used to reveal the spatio-temporal association between microwave emission anomalies and 17 earthquake events (M>5). First, an obvious spatial difference in microwave brightness temperatures is found between the seismic zone on the eastern side and the non-seismic zone on the western side of the Kamchatka peninsula.
Secondly, using both vertical and horizontal polarization to extract the temporal association, it is found that abnormal changes of microwave brightness temperature generally appear 2 months before the M>6 earthquakes. Since the microwave emissions observed by AMSR-E are affected by various factors (e.g., emission of the Earth's surface and emission, absorption, and scattering by the atmosphere), further study incorporating surface temperature, soil moisture, and atmospheric water vapor is needed to remove weather and climate influences.
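
    A minimal sketch of this kind of anomaly screening, using a robust running-background test on a brightness-temperature time series (all data, the 30-sample window, and the 5-sigma threshold here are synthetic illustrations, not the authors' AMSR-E processing):

```python
import numpy as np

def brightness_anomalies(tb, window=30, k=5.0):
    """Flag samples deviating more than k robust standard deviations
    from the median of a trailing background window."""
    tb = np.asarray(tb, dtype=float)
    flags = np.zeros(tb.size, dtype=bool)
    for i in range(window, tb.size):
        bg = tb[i - window:i]
        med = np.median(bg)
        mad = 1.4826 * np.median(np.abs(bg - med))  # robust spread estimate
        if mad > 0 and abs(tb[i] - med) > k * mad:
            flags[i] = True
    return flags

# toy daily brightness-temperature series with an injected excursion
rng = np.random.default_rng(0)
series = 250.0 + rng.normal(0.0, 0.5, 120)
series[100:103] += 5.0  # simulated pre-seismic emission anomaly
print(np.nonzero(brightness_anomalies(series))[0])
```

    In a real application the background would also need to account for the seasonal cycle and surface conditions, as the abstract notes.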

  18. Estimating Fault Friction From Seismic Signals in the Laboratory

    DOE PAGES

    Rouet-Leduc, Bertrand; Hulbert, Claudia; Bolton, David C.; ...

    2018-01-29

    Nearly all aspects of earthquake rupture are controlled by the friction along the fault, which progressively increases with tectonic forcing but in general cannot be directly measured. We show that fault friction can be determined at any time from the continuous seismic signal. In a classic laboratory experiment of repeating earthquakes, we find that the seismic signal follows a specific pattern with respect to fault friction, allowing us to determine the fault's position within its failure cycle. Using machine learning, we show that instantaneous statistical characteristics of the seismic signal are a fingerprint of the fault zone shear stress and frictional state. Further analysis of this fingerprint leads to a simple equation of state quantitatively relating the seismic signal power and the friction on the fault. Finally, these results show that fault zone frictional characteristics and the state of stress in the surroundings of the fault can be inferred from seismic waves, at least in the laboratory.
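
    The "statistical fingerprint" idea can be illustrated with a toy regression: windowed signal power is computed from a synthetic continuous signal whose amplitude is assumed (for illustration only) to grow with friction, and a simple power-to-friction relation is fitted. This is a sketch of the concept, not the authors' machine-learning pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for a stick-slip experiment: friction ramps up over
# each cycle, and signal amplitude grows with friction (toy assumption).
samples = 256
friction = np.tile(np.linspace(0.3, 0.7, 100), 4)  # four failure cycles
windows = [rng.normal(0.0, 0.01 + 0.2 * f, samples) for f in friction]

# Feature: acoustic power (variance) of each window
power = np.array([np.var(w) for w in windows])

# Fit a simple "equation of state": friction ~ a * log(power) + b
A = np.column_stack([np.log(power), np.ones_like(power)])
coef, *_ = np.linalg.lstsq(A, friction, rcond=None)
pred = A @ coef
r2 = 1.0 - np.sum((friction - pred) ** 2) / np.sum((friction - friction.mean()) ** 2)
print(f"R^2 of power-based friction estimate: {r2:.2f}")
```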

  20. Seismic detection and analysis of icequakes at Columbia Glacier, Alaska

    USGS Publications Warehouse

    O'Neel, Shad; Marshall, Hans P.; McNamara, Daniel E.; Pfeffer, William Tad

    2007-01-01

    Contributions to sea level rise from rapidly retreating marine-terminating glaciers are large and increasing. Strong increases in iceberg calving occur during retreat, which allows mass transfer to the ocean at a much higher rate than possible through surface melt alone. To study this process, we deployed an 11-sensor passive seismic network at Columbia Glacier, Alaska, during 2004–2005. We show that calving events generate narrow-band seismic signals, allowing frequency domain detections. Detection parameters were determined using direct observations of calving and validated using three statistical methods and hypocenter locations. The 1–3 Hz detections provide a good measure of the temporal distribution and size of calving events. Possible source mechanisms for the unique waveforms are discussed, and we analyze potential forcings for the observed seismicity.
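
    The frequency-domain detection idea can be sketched as a band-power threshold detector: each window's power in the 1-3 Hz band is compared with the median over all windows. The trace, window length, and threshold factor below are illustrative, not the study's calibrated detector:

```python
import numpy as np

def band_power(window, fs, f_lo, f_hi):
    """Power of one signal window within a frequency band, via the FFT."""
    spec = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(window.size, d=1.0 / fs)
    return spec[(freqs >= f_lo) & (freqs <= f_hi)].sum()

def detect_events(trace, fs, win=200, f_lo=1.0, f_hi=3.0, k=10.0):
    """Flag windows whose band power exceeds k times the median window power."""
    n = trace.size // win
    powers = np.array([band_power(trace[i * win:(i + 1) * win], fs, f_lo, f_hi)
                       for i in range(n)])
    return np.nonzero(powers > k * np.median(powers))[0]

# toy 100 Hz trace: noise plus one narrow-band 2 Hz "calving" burst
fs = 100.0
rng = np.random.default_rng(2)
t = np.arange(6000) / fs
trace = 0.1 * rng.normal(size=t.size)
trace[3000:3400] += np.sin(2 * np.pi * 2.0 * t[3000:3400])
print(detect_events(trace, fs))  # indices of the windows covering the burst
```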

  1. Sliding episodes during the 2002-2003 Stromboli lava effusion: Insights from seismic, volcanic, and statistical data analysis

    NASA Astrophysics Data System (ADS)

    Falsaperla, S.; Maiolino, V.; Spampinato, S.; Jaquet, O.; Neri, M.

    2008-04-01

    Repeated phenomena of flank instability accompanied the 28 December 2002 to 21 July 2003 eruption of Stromboli volcano. The major episodes were two tsunamigenic landslides on 30 December 2002, 2 d after the onset of the volcanic unrest. After 30 December, sliding processes remodeled the area affected by slope instability. We present analyses of 565 sliding episodes taking place from December 2002 to February 2003, and shed light on their main seismic features and their links with the ongoing seismic and volcanic activity using variogram analysis. A characterization of the seismic signals in the time and frequency domains is presented for 185 sliding episodes. Their frequency content is between 1 Hz and 7 Hz. On the basis of the dominant peaks and the shape of the spectrum, we identify three subclasses of signals, one of which has significant energy below 2 Hz. Low-frequency signatures were also found in the seismic records of the landslides of 30 December, which affected the aerial and submarine northwestern flank of the volcano. Accordingly, we surmise that spectral analysis might provide evidence of sliding phenomena with submarine runouts. We find no evidence of sliding processes induced by earthquakes. Additionally, a negative statistical correlation between sliding episodes and explosion quakes is highlighted by variogram analysis. Variograms indicate a persistent behavior (memory) of the flank instability over 5 to 10 d. We interpret the climax in the occurrence rate of the sliding processes between 24 and 29 January 2003 as the result of conditions favorable to slope instability due to the emplacement of NW-SE aligned, dike-fed vents located near the scarp of the landslide area. Afterward, the stabilizing effect of the lava flows over the northwestern flank of the volcano limited erosive phenomena to the unstable, loose slope not covered by lava.
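
    The variogram analysis mentioned above can be illustrated with a minimal empirical (temporal) variogram of a daily event-count series; the synthetic series below is given an artificial ~7-day memory purely for demonstration:

```python
import numpy as np

def empirical_variogram(series, max_lag):
    """Empirical variogram: gamma(h) = 0.5 * mean[(z(t+h) - z(t))^2]."""
    z = np.asarray(series, dtype=float)
    return np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2)
                     for h in range(1, max_lag + 1)])

# toy daily counts of sliding episodes with ~7-day persistence
rng = np.random.default_rng(3)
counts = np.convolve(rng.normal(size=90), np.ones(7) / 7, mode="same")
gamma = empirical_variogram(counts, max_lag=14)
print(gamma.round(3))  # climbs toward the sill over roughly the memory length
```

    The lag at which the variogram levels off at its sill is one way to read a persistence time, of the kind the authors report as 5 to 10 d.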

  2. An analysis of seismic risk from a tourism point of view.

    PubMed

    Mäntyniemi, Päivi

    2012-07-01

    Global awareness of natural calamities increased after the destructive Indian Ocean tsunami of December 2004, largely because many foreigners lost their lives, especially in Thailand. This paper explores how best to communicate the seismic risk posed by different travel destinations to crisis management personnel in tourists' home countries. The analysis of seismic risk should be straightforward enough for non-specialists, yet powerful enough to identify the travel destinations that are most at risk. The output for each location is a point in 3D space composed of the natural and built-up environment and local tourism. The tourism-specific factors can be tailored according to the tourists' nationality. The necessary information can be collected from various directories and statistics, much of it available over the Internet. The output helps to illustrate the overall seismic risk conditions of different travel destinations, allows for comparison across destinations, and identifies the places that are most at risk.

  3. "Geo-statistics methods and neural networks in geophysical applications: A case study"

    NASA Astrophysics Data System (ADS)

    Rodriguez Sandoval, R.; Urrutia Fucugauchi, J.; Ramirez Cruz, L. C.

    2008-12-01

    The study focuses on the Ebano-Panuco basin of northeastern Mexico, which is being explored for hydrocarbon reservoirs. These reservoirs are in limestones, and there is interest in determining porosity and permeability in the carbonate sequences. The porosity maps presented in this study are estimated by applying multiattribute and neural network techniques, which combine geophysical logs and 3-D seismic data by means of statistical relationships. The multiattribute analysis is a process to predict a volume of any underground petrophysical measurement from well-log and seismic data. The data consist of a series of target logs from wells which tie a 3-D seismic volume; the target logs are neutron porosity logs. From the 3-D seismic volume a series of sample attributes is calculated. The objective is to derive a relationship between a set of attributes and the target log values; the selected set is determined by a process of forward stepwise regression. The analysis can be linear or nonlinear. In the linear mode the method consists of a series of weights derived by least-squares minimization. In the nonlinear mode, a neural network is trained using the selected attributes as inputs; in this case we used a probabilistic neural network (PNN). The method is applied to a real data set from PEMEX. For better reservoir characterization the porosity distribution was estimated using both techniques. The case showed a continuous improvement in the prediction of the porosity from the multiattribute to the neural network analysis, both in the training and in the validation, which are important indicators of the reliability of the results. The neural network showed an improvement in resolution over the multiattribute analysis. The final maps provide more realistic results of the porosity distribution.
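
    Forward stepwise attribute selection, the core of the linear multiattribute mode described above, can be sketched as a greedy least-squares search (the attributes and porosity values below are synthetic):

```python
import numpy as np

def forward_stepwise(X, y, n_select):
    """Greedily add the attribute that most reduces the least-squares
    error of the linear predictor for the target log."""
    chosen, remaining = [], list(range(X.shape[1]))
    for _ in range(n_select):
        errs = []
        for j in remaining:
            A = np.column_stack([X[:, chosen + [j]], np.ones(len(y))])
            w, *_ = np.linalg.lstsq(A, y, rcond=None)
            errs.append((np.sum((A @ w - y) ** 2), j))
        best = min(errs)[1]
        chosen.append(best)
        remaining.remove(best)
    return chosen

# toy problem: 6 candidate seismic attributes; porosity depends on 0 and 3
rng = np.random.default_rng(4)
X = rng.normal(size=(200, 6))
porosity = 0.8 * X[:, 0] - 0.5 * X[:, 3] + 0.05 * rng.normal(size=200)
print(forward_stepwise(X, porosity, n_select=2))
```

    In the nonlinear mode, the same selected attributes would instead feed a PNN.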

  4. Automatic Seismic Signal Processing Research.

    DTIC Science & Technology

    1981-09-01

    [Garbled equation excerpt omitted; the passage cites Gnanadesikan (1977, p. 83) and Young and Calvert (1974).] Quoting Gnanadesikan (1977, p. 196): "The main function of statistical data analysis is to extricate and explicate the informational content of a body of data." Cited reports include Goff, R. C. (1980), "Evaluation of the MARS Seismic Event Detector," Systems, Science and Software Report SSS-R-81-4656, August.

  5. Assessing criticality in seismicity by entropy

    NASA Astrophysics Data System (ADS)

    Goltz, C.

    2003-04-01

    There is an ongoing discussion whether the Earth's crust is in a critical state and whether this state is permanent or intermittent. Intermittent criticality would, in principle, allow the specification of time-dependent hazard. Analysis of a spatio-temporally evolving synthetic critical point phenomenon and of real seismicity using configurational entropy shows that the method is a suitable approach for the characterisation of critical point dynamics. The results obtained support the notion of intermittent criticality in earthquakes. Statistical significance of the findings is assessed by the method of surrogate data.
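
    Configurational entropy of a seismicity snapshot can be sketched as the normalised Shannon entropy of epicentre counts on a spatial grid (synthetic epicentres; the grid size is an arbitrary choice, not the author's setup):

```python
import numpy as np

def configurational_entropy(x, y, bins=10):
    """Shannon entropy of the epicentre distribution over a spatial grid,
    normalised by the maximum possible entropy log(bins**2)."""
    hist, _, _ = np.histogram2d(x, y, bins=bins, range=[[0, 1], [0, 1]])
    p = hist.ravel() / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / np.log(bins * bins)

rng = np.random.default_rng(5)
diffuse = rng.uniform(0, 1, size=(2, 2000))          # near-random seismicity
clustered = 0.5 + 0.02 * rng.normal(size=(2, 2000))  # tightly clustered
print(configurational_entropy(*diffuse), configurational_entropy(*clustered))
```

    Low configurational entropy indicates spatial ordering (clustering); tracking it through time is one way to look for the approach to, and retreat from, a critical state.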

  6. Catalog of earthquake hypocenters at Alaskan volcanoes: January 1 through December 31, 2005

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.; Tytgat, Guy; Estes, Steve; McNutt, Stephen R.

    2006-01-01

    The Alaska Volcano Observatory (AVO), a cooperative program of the U.S. Geological Survey, the Geophysical Institute of the University of Alaska Fairbanks, and the Alaska Division of Geological and Geophysical Surveys, has maintained seismic monitoring networks at historically active volcanoes in Alaska since 1988 (Figure 1). The primary objectives of the seismic program are the real-time seismic monitoring of active, potentially hazardous, Alaskan volcanoes and the investigation of seismic processes associated with active volcanism. This catalog presents calculated earthquake hypocenters and seismic phase arrival data, and details changes in the seismic monitoring program for the period January 1 through December 31, 2005. The AVO seismograph network was used to monitor the seismic activity at thirty-two volcanoes within Alaska in 2005 (Figure 1). The network was augmented by two new subnetworks to monitor the Semisopochnoi Island volcanoes and Little Sitkin Volcano. Seismicity at these volcanoes was still being studied at the end of 2005 and had not yet been added to the list of permanently monitored volcanoes in the AVO weekly update. Following an extended period of monitoring to determine the background seismicity at Mount Peulik, Ukinrek Maars, and Korovin Volcano, formal monitoring of these volcanoes began in 2005.
AVO located 9,012 earthquakes in 2005. Monitoring highlights in 2005 include: (1) seismicity at Mount Spurr remaining above background, starting in February 2004, through the end of the year and into 2006; (2) an increase in seismicity at Augustine Volcano starting in May 2005 and continuing through the end of the year into 2006; (3) volcanic tremor and seismicity related to low-level strombolian activity at Mount Veniaminof in January to March and September; and (4) a seismic swarm at Tanaga Volcano in October and November. This catalog includes: (1) descriptions and locations of seismic instrumentation deployed in the field in 2005; (2) a description of earthquake detection, recording, analysis, and data archival systems; (3) a description of seismic velocity models used for earthquake locations; (4) a summary of earthquakes located in 2005; and (5) an accompanying UNIX tar-file with a summary of earthquake origin times, hypocenters, magnitudes, phase arrival times, and location quality statistics; daily station usage statistics; and all HYPOELLIPSE files used to determine the earthquake locations in 2005.

  7. Comparative statistical and spectral studies of seismic and non-seismic sub-ionospheric VLF anomalies

    NASA Astrophysics Data System (ADS)

    Wolbang, Daniel; Biernat, Helfried; Schwingenschuh, Konrad; Eichelberger, Hans; Prattes, Gustav; Besser, Bruno; Boudjada, Mohammed Y.; Rozhnoi, Alexander; Solovieva, Maria; Biagi, Pier Francesco; Friedrich, Martin

    2013-04-01

    We present a comparative study of seismic and non-seismic sub-ionospheric VLF anomalies. Our method is based on parameter variations of the sub-ionospheric VLF waveguide formed by the surface and the lower ionosphere. The radio links used operate in the frequency range between 10 and 50 kHz; the receivers are part of the European and Russian networks. Various authors have investigated lithospheric-atmospheric-ionospheric coupling and predicted a lowering of the ionosphere over earthquake preparation zones [1]. The received nighttime signal of a sub-ionospheric waveguide depends strongly on the height of the ionospheric E-layer, typically 80 to 85 km. This height is characterized by a typical gradient of the electron density near the atmospheric-ionospheric boundary [2]. In recent years it has turned out that one of the major issues in sub-ionospheric seismo-electromagnetic VLF studies is the non-seismic influences on the links, which have to be carefully characterized. Among others, these could be traveling ionospheric disturbances, geomagnetic storms, and electron precipitation. Our emphasis is on the analysis of daily, monthly, and annual variations of the VLF amplitude. To improve the statistics we investigate the behavior and typical variations of the VLF amplitude and phase over a period of more than 2 years. One important parameter considered is the rate at which the fluctuations fall below a significance level derived from a mean value. The temporal variations and the amplitudes of these depressions are studied over several years for sub-ionospheric VLF radio links with receivers in Graz and Kamchatka. In order to study the difference between seismic and non-seismic disturbances in the lower ionosphere, a power spectrum analysis of the received signal is performed as well. We are especially interested in variations with periods T > 6 min, which are typical for atmospheric gravity waves causing the lithospheric-atmospheric-ionospheric coupling [3].
All measured and derived VLF parameters are compared with VLF observations several weeks before an earthquake (e.g. L'Aquila, Italy, April 6, 2009) and with co- and post-seismic phenomena. It is shown that this comparative study improves the one-parameter seismo-electromagnetic VLF methods. References: [1] A. Molchanov, M. Hayakawa: Seismo-Electromagnetics and Related Phenomena: History and Latest Results, Terrapub, 2008. [2] S. Pulinets, K. Boyarchuk: Ionospheric Precursors of Earthquakes, Springer, 2004. [3] A. Rozhnoi et al.: Observation evidences of atmospheric Gravity Waves induced by seismic activity from analysis of subionospheric LF signal spectra, Natural Hazards and Earth System Sciences, 7, 625-628, 2007.
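
    The depression-rate parameter described above (how often the amplitude falls below a significance level derived from a mean value) can be sketched as follows; the amplitudes, the quiet reference period, and the 2-sigma level are illustrative assumptions, not the study's calibration:

```python
import numpy as np

def depression_rate(amplitude, ref_mean, ref_std, n_sigma=2.0):
    """Fraction of samples falling below a significance level derived
    from the mean and spread of a quiet reference period."""
    level = ref_mean - n_sigma * ref_std
    return float(np.mean(np.asarray(amplitude) < level))

rng = np.random.default_rng(6)
quiet = rng.normal(50.0, 1.0, 5000)   # undisturbed nighttime VLF amplitude
m, s = quiet.mean(), quiet.std()

disturbed = rng.normal(50.0, 1.0, 5000)
disturbed[::50] -= 6.0                # sporadic deep amplitude depressions
print(depression_rate(quiet, m, s), depression_rate(disturbed, m, s))
```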

  8. Statistical Study on Variations of the Ionospheric Ion Density Observed by DEMETER and Related to Seismic Activities

    NASA Astrophysics Data System (ADS)

    Yan, Rui; Parrot, Michel; Pinçon, Jean-Louis

    2017-12-01

    In this paper, we present the results of a statistical study performed on ionospheric ion density variations above areas of seismic activity. The ion density was observed by the low-altitude satellite DEMETER between 2004 and 2010. In the statistical analysis a superposed epoch method is used, where the observed ionospheric ion density close to the epicenters, both in space and in time, is compared to background values recorded at the same location and in the same conditions. Data associated with aftershocks have been carefully removed from the database to prevent spurious effects on the statistics. It is shown that, during nighttime, anomalous ionospheric perturbations related to earthquakes with magnitudes larger than 5 are detected. At the time of these perturbations the background ion fluctuation departs from a normal distribution. They occur up to 200 km from the epicenters and mainly 5 days before the earthquakes. As expected, an ion density perturbation occurring just after the earthquakes and close to the epicenters is also observed.
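
    The superposed epoch method can be sketched by stacking fixed-length windows of a series around each event epoch and averaging: a perturbation that recurs at a fixed lead time survives the stack, while uncorrelated noise averages out. All data below are synthetic, and the window sizes are arbitrary:

```python
import numpy as np

def superposed_epoch(series, event_indices, half_width):
    """Average fixed-length windows of a series centred on each epoch."""
    stacks = [series[i - half_width:i + half_width + 1]
              for i in event_indices
              if i - half_width >= 0 and i + half_width < len(series)]
    return np.mean(stacks, axis=0)

rng = np.random.default_rng(7)
density = rng.normal(100.0, 5.0, 3000)   # toy ion-density series
quakes = np.arange(200, 2800, 300)       # "earthquake" epochs
for q in quakes:
    density[q - 5] += 15.0               # perturbation 5 steps before each event
stack = superposed_epoch(density, quakes, half_width=10)
print(stack.round(1))  # the stacked mean rises at relative time -5
```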

  9. DARHT Multi-intelligence Seismic and Acoustic Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, Garrison Nicole; Van Buren, Kendra Lu; Hemez, Francois M.

    The purpose of this report is to document the analysis of seismic and acoustic data collected at the Dual-Axis Radiographic Hydrodynamic Test (DARHT) facility at Los Alamos National Laboratory for robust, multi-intelligence decision making. The data utilized herein are obtained from two tri-axial seismic sensors and three acoustic sensors, resulting in a total of nine data channels. The goal of this analysis is to develop a generalized, automated framework to determine internal operations at DARHT using informative features extracted from measurements collected outside the facility. Our framework involves four components: (1) feature extraction, (2) data fusion, (3) classification, and finally (4) robustness analysis. Two approaches are taken for extracting features from the data. The first of these, generic feature extraction, involves extraction of statistical features from the nine data channels. The second approach, event detection, identifies specific events relevant to traffic entering and leaving the facility as well as explosive activities at DARHT and nearby explosive testing sites. Event detection is completed using a two-stage method: first utilizing signatures in the frequency domain to identify outliers, and second extracting short-duration events of interest among these outliers by evaluating residuals of an autoregressive exogenous time series model. Features extracted from each data set are then fused to perform analysis with a multi-intelligence paradigm, where information from multiple data sets is combined to generate more information than is available through analysis of each independently. The fused feature set is used to train a statistical classifier and predict the state of operations to inform a decision maker. We demonstrate this classification using both generic statistical features and event detection and provide a comparison of the two methods. Finally, the concept of decision robustness is presented through a preliminary analysis where uncertainty is added to the system through noise in the measurements.

  10. Spatio-temporal distribution of Oklahoma earthquakes: Exploring relationships using a nearest-neighbor approach: Nearest-neighbor analysis of Oklahoma

    DOE PAGES

    Vasylkivska, Veronika S.; Huerta, Nicolas J.

    2017-06-24

    Determining the spatiotemporal characteristics of natural and induced seismic events holds the opportunity to gain new insights into why these events occur. Linking the seismicity characteristics with other geologic, geographic, natural, or anthropogenic factors could help to identify the causes and suggest mitigation strategies that reduce the risk associated with such events. The nearest-neighbor approach utilized in this work represents a practical first step toward identifying statistically correlated clusters of recorded earthquake events. A detailed study of the Oklahoma earthquake catalog's inherent errors, empirical model parameters, and model assumptions is presented. We found that the cluster analysis results are stable with respect to empirical parameters (e.g., fractal dimension) but were sensitive to epicenter location errors and seismicity rates. Most critically, we show that the patterns in the distribution of earthquake clusters in Oklahoma are primarily defined by spatial relationships between events. This observation stands in stark contrast to California (also known for induced seismicity), where a comparable cluster distribution is defined by both spatial and temporal interactions between events. These results highlight the difficulty in understanding the mechanisms and behavior of induced seismicity but provide insights for future work.
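
    The nearest-neighbor approach can be sketched with a Zaliapin-style space-time-magnitude proximity, eta = dt * r^df * 10^(-b*m_parent); the toy catalogue, b-value, and fractal dimension below are illustrative assumptions, not the study's calibrated parameters:

```python
import numpy as np

def nearest_neighbor_proximity(times, xs, ys, mags, b=1.0, d_f=1.6):
    """For each event, the minimum proximity to any earlier event:
    eta = dt * r**d_f * 10**(-b * m_parent)."""
    n = len(times)
    eta = np.full(n, np.inf)
    parent = np.full(n, -1)
    for j in range(1, n):
        for i in range(j):
            dt = times[j] - times[i]
            r = np.hypot(xs[j] - xs[i], ys[j] - ys[i])
            if dt <= 0 or r == 0:
                continue
            val = dt * r ** d_f * 10.0 ** (-b * mags[i])
            if val < eta[j]:
                eta[j], parent[j] = val, i
    return eta, parent

# toy catalogue: event 1 follows event 0 closely in space and time
times = np.array([0.0, 1.0, 2.0, 50.0])
xs = np.array([0.0, 0.1, 10.0, 40.0])
ys = np.array([0.0, 0.1, 10.0, 40.0])
mags = np.array([4.0, 2.0, 2.5, 3.0])
eta, parent = nearest_neighbor_proximity(times, xs, ys, mags)
print(eta, parent)  # the close pair has a much smaller eta
```

    Thresholding eta separates clustered (aftershock-like) pairs from background pairs, and the spatial and temporal factors of eta can be examined separately, as in the Oklahoma-versus-California comparison above.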

  12. Forecasting volcanic unrest using seismicity: The good, the bad and the time consuming

    NASA Astrophysics Data System (ADS)

    Salvage, Rebecca; Neuberg, Jurgen W.

    2013-04-01

    Volcanic eruptions are inherently unpredictable in nature, with scientists struggling to forecast the type and timing of events, in particular in real-time scenarios. Current understanding suggests that statistical patterns within precursory datasets of seismicity prior to eruptive events could be used as real-time forecasting tools. They allow us to determine times of clear deviation in data, which might be indicative of volcanic unrest. The identification of low-frequency seismic swarms and the acceleration of this seismicity prior to observed volcanic unrest may be key in developing forecasting tools. The development of real-time forecasting models which can be implemented at volcano observatories is of particular importance, since the identification of early warning signals allows danger to the proximal population to be minimized. We concentrate on understanding the significance and development of these seismic swarms as unrest develops at the volcano. In particular, analysis of accelerations in event rate, amplitude, and energy rates released by seismicity prior to eruption suggests that these are important indicators of developing unrest. Real-time analysis of these parameters simultaneously allows possible improvements to forecasting models. Although more time-consuming and computationally intensive, cross-correlation techniques applied to continuous seismicity prior to volcanic unrest scenarios allow all significant seismic events to be analysed, rather than only those which can be detected by an automated identification system. This may allow a more accurate forecast, since all precursory seismicity can be taken into account. In addition, the classification of seismic events based on spectral characteristics may allow us to isolate individual types of signals which are responsible for certain types of unrest.
In this way, we may be able to better forecast the type of eruption that may ensue, or at least some of its prevailing characteristics.
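
    The cross-correlation detection mentioned above can be sketched as normalised template matching against the continuous trace; the template, noise level, and 0.8 threshold are illustrative, not operational observatory settings:

```python
import numpy as np

def match_template(trace, template, threshold=0.8):
    """Sliding normalised cross-correlation of a waveform template
    against a continuous trace; returns offsets exceeding the threshold."""
    m = template.size
    tpl = (template - template.mean()) / template.std()
    cc = np.empty(trace.size - m + 1)
    for i in range(cc.size):
        w = trace[i:i + m]
        s = w.std()
        cc[i] = 0.0 if s == 0 else np.mean((w - w.mean()) / s * tpl)
    return np.nonzero(cc > threshold)[0]

rng = np.random.default_rng(9)
template = np.sin(2 * np.pi * np.arange(50) / 10) * np.hanning(50)
trace = 0.2 * rng.normal(size=2000)
for onset in (400, 1300):
    trace[onset:onset + 50] += template   # two buried low-amplitude "events"
print(match_template(trace, template))
```

    Because the correlation is normalised, the detector picks up repeats of the template well below the amplitude threshold of an energy-based trigger, which is why it recovers events an automated identification system would miss.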

  13. New Possibilities In Assessing Time-dependent Seismic Risk

    NASA Astrophysics Data System (ADS)

    Kossobokov, V.

    A novel understanding of the seismic occurrence process in terms of the dynamics of a hierarchical system of blocks-and-faults implies the necessity of new approaches to seismic risk assessment, which would allow for the evident heterogeneity of seismic distribution in space and time. Spatial, apparently fractal, patterns of seismic distribution should be treated appropriately in the estimation of seismic hazard; otherwise the result could be significantly over- or underestimated. The patterns are clearly associated with tectonic movement, whose traces, accumulated on a time-scale of tens of thousands of years or longer, provide geographic, geologic, gravity, and magnetic evidence of the intensity of driving forces, their directivity, and dating. This evidence, term-less in the sense of a human lifetime, both clear and masked, requires analysis that involves pattern recognition and interpretation before it is used in favor of a conclusion about present-day seismic activity. Moreover, the existing reproducible intermediate-term medium-range earthquake prediction algorithms that have passed statistical significance testing in forward application complement a knowledgeable estimation of the temporal deviation of seismic hazard in a given area from a constant. Bringing together the two estimations and convolving them with a given distribution of valuables of different kinds, e.g. population, industry, economy, etc., finalizes an estimation of seismic risk distribution.

  14. Relationships between Induced Seismicity and Fluid Injection: Development of Strategies to Manage Injection

    NASA Astrophysics Data System (ADS)

    Eichhubl, Peter; Frohlich, Cliff; Gale, Julia; Olson, Jon; Fan, Zhiqiang; Gono, Valerie

    2014-05-01

    Induced seismicity during or following the subsurface injection of waste fluids such as well stimulation flow-back and production fluids has recently received heightened public and industry attention. It is understood that induced seismicity occurs by reactivation of existing faults that are generally present in the injection intervals. We seek to address the question why fluid injection triggers earthquakes in some areas and not in others, with the aim toward improved injection methods that optimize injection volume and cost while avoiding induced seismicity. A GIS database has been built of natural and induced earthquakes in four hydrocarbon-producing basins: the Fort Worth Basin, South Texas, East Texas/Louisiana, and the Williston Basin. These areas are associated with disposal from the Barnett, Eagle Ford, Bakken, and Haynesville Shales, respectively. In each region we analyzed data that were collected using temporary seismographs of the National Science Foundation's USArray Transportable Array. Injection well locations, formations, histories, and volumes are also mapped using public and licensed datasets. Faults are mapped at a range of scales for selected areas that show different levels of seismic activity, and scaling relationships are used to extrapolate between the seismic and wellbore scales. Reactivation potential of these faults is assessed using fault occurrence and in-situ stress conditions, identifying areas of high and low fault reactivation potential. A correlation analysis between fault reactivation potential, induced seismicity, and fluid injection will use spatial statistics to quantify the probability of seismic fault reactivation for a given injection pressure in the studied reservoirs. The limiting conditions inducing fault reactivation will be compared to actual injection parameters (volume, rate, injection duration, and frequency) where available.
The objective of this project is a statistical reservoir- to basin-scale assessment of fault reactivation and seismicity induced by fluid injection. By assessing the occurrence of earthquakes (M>2) evenly across large geographic regions, this project differs from previous studies of injection-induced seismicity that focused on earthquakes large enough to cause public concern in well-populated areas. The understanding of triggered seismicity gained through this project is expected to allow for improved design strategies for waste fluid injection to industry and public decision makers.

  15. Pre-seismic geomagnetic and ionosphere signatures related to the Mw5.7 earthquake occurred in Vrancea zone on September 24, 2016

    NASA Astrophysics Data System (ADS)

    Stanica, Dragos Armand; Stanica, Dumitru; Błęcki, Jan; Ernst, Tomasz; Jóźwiak, Waldemar; Słomiński, Jan

    2018-02-01

    To emphasize the relationship between pre-seismic geomagnetic signals and Vrancea seismicity, in this work it is hypothesized that before earthquake initiation, the high stress reached in the seismogenic volume generates dehydration of the rocks and fracturing processes, followed by the release of electric charges along the faulting systems, which leads to resistivity changes. These changes were explored in September 2016 by the normalized function Bzn obtained from geomagnetic data recorded in the ULF range (0.001-0.0083 Hz). A statistical analysis was also performed to discriminate, on the new Bzn* time series, a pre-seismic signature related to the Mw5.7 earthquake. Significant anomalous behavior of Bzn* was identified on September 21, 3 days prior to the onset of the seismic event. Similar information is provided by observations of the magnetic and electron concentration variations in the ionosphere over the Vrancea zone by the Swarm satellites, 4 days and 1 day before the earthquake.

  16. Quantifying the seismicity on Taiwan

    NASA Astrophysics Data System (ADS)

    Wu, Yi-Hsuan; Chen, Chien-Chih; Turcotte, Donald L.; Rundle, John B.

    2013-07-01

    We quantify the seismicity on the island of Taiwan using the frequency-magnitude statistics of earthquakes since 1900. A break in Gutenberg-Richter scaling for large earthquakes has been observed in global seismicity; this break is also observed in our Taiwan study. The seismic data from the Central Weather Bureau Seismic Network are in good agreement with the Gutenberg-Richter relation taking b ≈ 1 when M < 7. For large earthquakes, M ≥ 7, the seismic data fit Gutenberg-Richter scaling with b ≈ 1.5. If the Gutenberg-Richter scaling for M < 7 earthquakes is extrapolated to larger earthquakes, we would expect an M > 8 earthquake in the study region about every 25 yr. However, our analysis shows a lower frequency of occurrence of large earthquakes, so that the expected recurrence interval of M > 8 earthquakes is about 200 yr. The level of seismicity for smaller earthquakes on Taiwan is about 12 times greater than in Southern California, and the possibility of an M ≈ 9 earthquake north or south of Taiwan cannot be ruled out. In light of the Fukushima, Japan nuclear disaster, we also discuss the implications of our study for the three operating nuclear power plants on the coast of Taiwan.
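
    The frequency-magnitude statistics above rest on the Gutenberg-Richter relation log10 N(≥M) = a − bM. A minimal sketch of the maximum-likelihood b-value estimate (Aki, 1965) and of the recurrence extrapolation, applied to a synthetic catalogue (not the Taiwan data):

```python
import numpy as np

def aki_b_value(mags, m_min):
    """Maximum-likelihood b-value: b = log10(e) / (mean(M) - Mmin)."""
    m = np.asarray(mags)
    m = m[m >= m_min]
    return np.log10(np.e) / (m.mean() - m_min)

def extrapolated_rate(rate_ref, m_ref, b, m_target):
    """Gutenberg-Richter extrapolation of the rate above m_ref to m_target."""
    return rate_ref * 10.0 ** (-b * (m_target - m_ref))

# synthetic catalogue drawn from G-R scaling with b = 1 above magnitude 4
rng = np.random.default_rng(8)
mags = 4.0 + rng.exponential(scale=np.log10(np.e), size=20000)
b = aki_b_value(mags, 4.0)
print(f"estimated b ~ {b:.2f}")
# e.g. one M>=6 event per year extrapolated to M>=8, expressed per century
print(f"expected M>=8 per century: {100 * extrapolated_rate(1.0, 6.0, b, 8.0):.1f}")
```

    A break in scaling near M ≈ 7, as found in the study, means such an extrapolation from small magnitudes overstates the frequency of the largest events.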

  17. Geomagnetic signal induced by the M5.7 earthquake that occurred on September 24, 2016, in the seismically active Vrancea zone, Romania

    NASA Astrophysics Data System (ADS)

    Stanica, Dumitru; Armand Stanica, Dragos

    2017-04-01

    In this paper, we use the geomagnetic time series collected in real time by the electromagnetic monitoring system at the Geomagnetic Observatory Provita de Sus to emphasize possible relationships between the pre-seismic anomalous behavior of the normalized function Bzn and the occurrence of the M5.7 earthquake in the Vrancea seismic active zone on September 24, 2016. It has already been demonstrated (Stanica and Stanica, 2012; Stanica et al., 2015) that for a 2D geoelectric structure, under pre-seismic conditions, the normalized function Bzn shows significant changes in magnitude due to electrical conductivity changes, possibly associated with earthquake-induced rupture processes and high-pressure fluid flow through the faulting systems developed inside the Vrancea seismogenic volume and along the Carpathian electrical conductivity anomaly. In these circumstances, the daily mean distributions of Bzn = Bz/Bperp (where Bz is the vertical component of the geomagnetic field and Bperp is the geomagnetic component perpendicular to the geoelectric strike) and its standard deviation (SD) are computed in the ULF frequency range 0.001 Hz to 0.0083 Hz using both FFT band-pass filter analysis and a statistical analysis based on a standardized random variable equation. After analyzing the pre-seismic anomalous intervals, a pre-seismic geomagnetic signal greater than 5 SD was identified on September 22, 2016, corresponding to a lead time of 2 days before the M5.7 earthquake of September 24, and was emphasized in real time on the web site (www.geodin.ro). The final conclusion is that the proposed geomagnetic methodology might be used to provide suitable information for extreme seismic hazard assessment and risk mitigation. References: Stanica, D., and Stanica, D. A., Earthquakes precursors, in "Earthquake Research and Analysis - Statistical Studies, Observations and Planning", Book 5, edited by S. D'Amico, ISBN 978-953-51-0134-5, InTech, Chapter 4, 71-100, 2012. Stanica, D. A., Stanica, D., and Vladimirescu, N., Long-range anomalous electromagnetic effect related to the M9 Great Tohoku earthquake, Earth Sciences, 4(1), 31-38, 2015, doi: 10.11648/j.earth.20150401.13.
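    The "greater than 5 SD" criterion can be illustrated with a toy standardized-variable check on a daily-mean Bzn series; the background-window length and the synthetic numbers are assumptions for illustration, not the authors' processing parameters:

```python
import numpy as np

def flag_anomalies(daily_series, n_background=20, k=5.0):
    """Standardize a daily-mean series against the mean/SD of an assumed
    quiet background window and flag days exceeding k standard deviations
    (a sketch of the standardized-random-variable criterion)."""
    x = np.asarray(daily_series, float)
    bg = x[:n_background]
    z = (x - bg.mean()) / bg.std(ddof=1)
    return np.flatnonzero(np.abs(z) > k)

# synthetic month: quiet background plus one strong excursion on day 21
rng = np.random.default_rng(0)
series = rng.normal(1.0, 0.01, 30)
series[21] += 0.2                        # injected pre-seismic-like anomaly
print(flag_anomalies(series))            # -> [21]
```

    In the real processing chain the input would be the daily mean of the band-pass-filtered Bzn rather than a synthetic series.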

  18. Interactions and triggering in a 3D rate and state asperity model

    NASA Astrophysics Data System (ADS)

    Dublanchet, P.; Bernard, P.

    2012-12-01

    Precise relocation of micro-seismicity and careful analysis of seismic source parameters have progressively imposed the concept of seismic asperities embedded in a creeping fault segment as one of the most important aspects of a realistic representation of micro-seismic sources. Another important issue concerning micro-seismic activity is the existence of robust empirical laws describing the temporal and magnitude distributions of earthquakes, such as the Omori law, the distribution of inter-event times and the Gutenberg-Richter law. In this framework, this study aims at understanding the statistical properties of earthquakes by generating synthetic catalogs with a 3D, quasi-dynamic, continuous rate-and-state asperity model that takes into account a realistic geometry of asperities. Our approach contrasts with the ETAS models (Kagan and Knopoff, 1981) usually implemented to produce earthquake catalogs, in the sense that the nonlinearity observed in rock friction experiments (Dieterich, 1979) is fully taken into account through the use of a rate-and-state friction law. Furthermore, our model differs from discrete fault models (Ziv and Cochard, 2006) because continuity allows us to define realistic geometries and distributions of asperities by assembling sub-critical computational cells that always fail in a single event. Moreover, the model allows us to address the influence of barriers and of the distribution of asperities on event statistics. After recalling the main observations of asperities in the specific case of the Parkfield segment of the San Andreas Fault, we analyse the earthquake statistical properties computed for this area. We then present synthetic statistics obtained with our model that allow us to discuss the role of barriers in clustering and triggering phenomena among a population of sources. It appears that an effective barrier size, which depends on its frictional strength, controls the presence or absence, in the synthetic catalog, of statistical laws similar to those observed for real earthquakes. As an application, we attempt to compare the synthetic statistics with the observed statistics at Parkfield in order to characterize a realistic frictional model of the Parkfield area. More generally, we obtain synthetic statistical properties in agreement with power-law decays whose exponents match observations at a global scale, showing that our mechanical model can provide new insights into earthquake interaction processes in general.
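    The rate-and-state friction law at the core of such models (Dieterich, 1979) can be written mu = mu0 + a*ln(V/V0) + b*ln(V0*theta/Dc); at steady state theta = Dc/V, so mu_ss = mu0 + (a-b)*ln(V/V0), and b > a gives the velocity-weakening behavior needed for asperity instability. A minimal sketch with generic laboratory-scale parameter values (assumptions, not the study's):

```python
import math

def rs_friction(v, theta, mu0=0.6, a=0.010, b=0.015, v0=1e-6, dc=1e-4):
    """Rate-and-state friction coefficient (Dieterich form).
    v: slip rate [m/s]; theta: state variable [s]; generic lab values."""
    return mu0 + a * math.log(v / v0) + b * math.log(v0 * theta / dc)

v = 1e-5                        # slip rate, 10x the reference velocity v0
theta_ss = 1e-4 / v             # steady-state value of the state variable
mu = rs_friction(v, theta_ss)
# b > a: steady-state friction drops as slip accelerates (velocity weakening)
assert mu < 0.6
```

    With b > a the asperity weakens as it accelerates, which is what lets a sub-critical cell fail in a single event.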

  19. Unraveling earthquake stresses: Insights from dynamically triggered and induced earthquakes

    NASA Astrophysics Data System (ADS)

    Velasco, A. A.; Alfaro-Diaz, R. A.

    2017-12-01

    Induced seismicity, earthquakes caused by anthropogenic activity, has more than doubled in the last several years as a result of practices related to oil and gas production. Furthermore, large earthquakes have been shown to promote the triggering of other events within two fault lengths (static triggering), due to static stresses caused by physical movement along the fault, and also remotely through the passage of seismic waves (dynamic triggering). Thus, in order to understand the mechanisms of earthquake failure, we investigate regions where natural, induced, and dynamically triggered events occur, specifically targeting Oklahoma. We first analyze data from EarthScope's USArray Transportable Array (TA) and local seismic networks, implementing an optimized short-term average/long-term average (STA/LTA) detector in order to develop local detection and earthquake catalogs. We then identify triggered events through statistical analysis and perform a stress analysis to gain insight into the stress states leading to triggered earthquake failure. We use our observations to determine the role of different transient stresses in contributing to natural and induced seismicity by comparing these stresses to regional stress orientations. We also delineate critically stressed regions of triggered seismicity that may indicate areas susceptible to earthquake hazards associated with sustained fluid injection in provinces of induced seismicity. Anthropogenic injection and extraction activity can alter the stress state and fluid flow within production basins. By analyzing the stress release on these ancient faults caused by dynamic stresses, we may be able to determine whether fluids are solely responsible for increased seismic activity in induced regions.
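    A basic STA/LTA detector of the kind referred to can be sketched as the ratio of short- and long-term averages of the squared trace; the window lengths and trigger threshold below are illustrative choices, not the study's tuned values:

```python
import numpy as np

def sta_lta(trace, nsta, nlta):
    """STA/LTA ratio on the squared trace; both running averages are
    aligned so each ratio sample corresponds to windows ending at the
    same sample of the input."""
    sq = np.asarray(trace, float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(sq)))
    sta = (csum[nsta:] - csum[:-nsta]) / nsta      # short-term average
    lta = (csum[nlta:] - csum[:-nlta]) / nlta      # long-term average
    n = min(len(sta), len(lta))                    # align trailing ends
    return sta[-n:] / np.maximum(lta[-n:], 1e-20)

# synthetic trace: Gaussian noise with a burst of energy at sample 1200
rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 2000)
x[1200:1260] += rng.normal(0.0, 8.0, 60)
ratio = sta_lta(x, nsta=20, nlta=400)
trigger = int(np.argmax(ratio > 4.0)) + 400 - 1    # back to a sample index
```

    A production detector would add a de-trigger threshold and per-station tuning, but the ratio above is the core of the method.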

  20. Seismic Hazard Analysis as a Controlling Technique of Induced Seismicity in Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Convertito, V.; Sharma, N.; Maercklin, N.; Emolo, A.; Zollo, A.

    2011-12-01

    The induced seismicity of geothermal systems during stimulation and fluid circulation can span a wide range of effects, from light and unfelt to severe and damaging. If a modern geothermal system is to achieve the greatest efficiency while remaining acceptable from the social point of view, it must be managed so as to reduce possible impact in advance. In this framework, automatic control of the seismic response of the stimulated reservoir is nowadays mandatory, particularly in proximity to densely populated areas. Recently, techniques have been proposed for this purpose, mainly based on the concept of the traffic-light system. This system provides a tool for deciding the level of stimulation rate based on real-time analysis of the induced seismicity and the ongoing ground-motion values. However, in some cases the induced effect can be delayed with respect to the time when the reservoir is stimulated. Thus, a controlling technique able to estimate ground-motion levels over different time scales can help to better control the geothermal system. Here we present an adaptation of classical probabilistic seismic hazard analysis to the case where the seismicity rate and the propagation-medium properties are not constant in time. For modeling purposes we use a non-homogeneous seismicity model, in which the seismicity rate and the b-value of the recurrence relationship change with time. Additionally, as a further controlling procedure, we propose a moving-time-window analysis of the recorded peak ground-motion values aimed at monitoring changes in the propagation medium. In fact, for the same set of magnitude values recorded at the same stations, we expect that on average peak ground-motion values attenuate in the same way; the residual differences can therefore reasonably be ascribed to changes in medium properties. These changes can be modeled and introduced directly into the hazard integral. We applied the proposed technique to a training dataset of induced earthquakes recorded by the Berkeley-Geysers network, installed in The Geysers geothermal area in Northern California. The reliability of the technique is then tested on a different dataset by performing seismic hazard analysis in a time-evolving approach, which provides ground-motion values having fixed probabilities of exceedance. Those values can finally be compared with the observations using appropriate statistical tests.
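    The backbone of the hazard integral in such a time-evolving analysis is still the Poisson exceedance probability, with the rate term now evaluated from the current (non-stationary) seismicity model. A toy sketch with illustrative numbers, not Geysers values:

```python
import math

def prob_exceedance(rate_per_yr, p_exceed_given_event, window_yr):
    """Poisson probability of at least one ground-motion exceedance at a
    site within the forecast window; rate_per_yr would come from the
    current, time-varying seismicity model."""
    lam = rate_per_yr * p_exceed_given_event * window_yr
    return 1.0 - math.exp(-lam)

# e.g. 150 induced events/yr, each with a 1% chance of exceeding the PGA
# threshold at the site, over a 6-month control window:
print(round(prob_exceedance(150.0, 0.01, 0.5), 3))   # -> 0.528
```

    Re-evaluating the rate and the conditional exceedance term in each control window is what makes the analysis time-evolving.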

  1. Joint inversion of marine seismic AVA and CSEM data using statistical rock-physics models and Markov random fields: Stochastic inversion of AVA and CSEM data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, J.; Hoversten, G.M.

    2011-09-15

    Joint inversion of seismic AVA and CSEM data requires rock-physics relationships to link seismic attributes to electrical properties. Ideally, we can connect them through reservoir parameters (e.g., porosity and water saturation) by developing physics-based models, such as Gassmann's equations and Archie's law, using nearby borehole logs. This can be difficult in the exploration stage because the information available is typically insufficient for choosing suitable rock-physics models and for subsequently obtaining reliable estimates of the associated parameters. The use of improper rock-physics models and inaccurate estimates of model parameters may produce misleading inversion results. Conversely, it is easy to derive statistical relationships among seismic and electrical attributes and reservoir parameters from distant borehole logs. In this study, we develop a Bayesian model to jointly invert seismic AVA and CSEM data for reservoir-parameter estimation using statistical rock-physics models; the spatial dependence of geophysical and reservoir parameters is carried by lithotypes through Markov random fields. We apply the developed model to a synthetic case simulating a CO2 monitoring application. We derive statistical rock-physics relations from borehole logs at one location and estimate the seismic P- and S-wave velocity ratio, acoustic impedance, density, electrical resistivity, lithotypes, porosity, and water saturation at three different locations by conditioning to the seismic AVA and CSEM data. Comparison of the inversion results with their corresponding true values shows that the correlation-based statistical rock-physics models provide significant information for improving the joint inversion results.

  2. Data User's Note: Apollo seismological investigations

    NASA Technical Reports Server (NTRS)

    Vostreys, R. W.

    1980-01-01

    Seismological objectives and equipment used in the passive seismic, active seismic, lunar seismic profiling, and lunar gravimeter experiments conducted during the Apollo 11, 12, 14, 15, 16, and 17 missions are described. The various formats in which the data from these investigations can be obtained are listed, and an index showing the NSSDC identification numbers is provided. Tables show manned lunar landing missions, lunar seismic network statistics, lunar impact coordinate statistics, detonation masses and times of EPs, the ALSEP (Apollo 14) operational history, compressed-scale playout tape availability, LSPE coverage for one lunation, and experimenter-interpreted event types.

  3. Quantitative estimation of time-variable earthquake hazard by using fuzzy set theory

    NASA Astrophysics Data System (ADS)

    Deyi, Feng; Ichikawa, M.

    1989-11-01

    In this paper, various methods of fuzzy set theory, known as fuzzy mathematics, have been applied to the quantitative estimation of time-variable earthquake hazard. The results obtained consist of the following. (1) Quantitative estimation of earthquake hazard on the basis of seismicity data. By using methods of fuzzy mathematics, seismicity patterns before large earthquakes can be studied more clearly and more quantitatively; highly active periods in a given region and quiet periods of seismic activity before large earthquakes can be recognized; similarities in the temporal variation of seismic activity and in seismic gaps can be examined; and the time-variable earthquake hazard can be assessed directly on the basis of a series of statistical indices of seismicity. Two methods of fuzzy clustering analysis, the method of fuzzy similarity, and the direct method of fuzzy pattern recognition have been studied in particular. One method of fuzzy clustering analysis is based on fuzzy netting, and the other is based on the fuzzy equivalence relation. (2) Quantitative estimation of earthquake hazard on the basis of observational data for different precursors. The direct method of fuzzy pattern recognition has been applied to research on earthquake precursors of different kinds. On the basis of the temporal and spatial characteristics of recognized precursors, earthquake hazards on different time scales can be estimated. This paper mainly deals with medium- and short-term precursors observed in Japan and China.
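    As a concrete, if generic, example of the fuzzy clustering tools alluded to, a minimal fuzzy c-means assigns each seismicity-index vector a graded membership in every cluster rather than a hard label; this standard algorithm is an illustration, not the authors' specific method:

```python
import numpy as np

def fuzzy_cmeans(X, c=2, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means. Returns (centers, U), where U[i, j] is the
    membership of sample i in cluster j (each row of U sums to 1)."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        W = U ** m                                   # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-12
        p = 2.0 / (m - 1.0)
        U = d ** -p / np.sum(d ** -p, axis=1, keepdims=True)
    return centers, U

# two synthetic "activity regimes": memberships separate them cleanly
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(5, 0.1, (20, 2))])
centers, U = fuzzy_cmeans(X)
labels = U.argmax(axis=1)
```

    The graded memberships, rather than the hard labels, are what make such methods attractive for ambiguous seismicity patterns.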

  4. Study of pre-seismic kHz EM emissions by means of complex systems

    NASA Astrophysics Data System (ADS)

    Balasis, Georgios; Papadimitriou, Constantinos; Eftaxias, Konstantinos

    2010-05-01

    The field of complex systems holds that the dynamics of complex systems are founded on universal principles that may be used to describe disparate problems ranging from particle physics to the economies of societies. A corollary is that transferring ideas and results between investigators in hitherto disparate areas will cross-fertilize and lead to important new results. It is well known that Boltzmann-Gibbs statistical mechanics works best for systems composed of subsystems that are either independent or interacting via short-range forces, and that can access all of the available phase space. For systems exhibiting long-range correlations, memory, or fractal properties, non-extensive Tsallis statistical mechanics becomes the most appropriate mathematical framework. A central property of the magnetic storm, solar flare, and earthquake preparation process is the possible occurrence of coherent large-scale collective behavior with a very rich structure, resulting from repeated nonlinear interactions among the constituents. Consequently, non-extensive statistical mechanics is an appropriate framework in which to investigate universality, if any, in magnetic storm, solar flare, earthquake, and pre-failure EM emission occurrence. A model for earthquake dynamics derived from a non-extensive Tsallis formulation, starting from first principles, has recently been introduced. This approach leads to a Gutenberg-Richter-type law for the magnitude distribution of earthquakes which provides an excellent fit to seismicities generated in various large geographic areas usually identified as "seismic regions". We examine whether the Gutenberg-Richter law corresponding to non-extensive Tsallis statistics is able to describe the amplitude distributions of earthquakes, pre-seismic kHz EM emissions (electromagnetic earthquakes), solar flares, and magnetic storms. The analysis shows that the introduced non-extensive model provides an excellent fit to the experimental data, incorporating the characteristics of universality by means of non-extensive statistics into the extreme events under study.
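    The mathematical core of the Tsallis approach is the q-exponential, which replaces the ordinary exponential in distribution functions, recovers it as q -> 1, and for q > 1 produces the fat tails that let the generalized Gutenberg-Richter law fit extreme-event statistics. A minimal sketch:

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential: [1 + (1-q)x]^(1/(1-q)), -> exp(x) as q -> 1
    (set to 0 where the bracket is non-positive, the usual cutoff)."""
    if abs(q - 1.0) < 1e-9:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

# q -> 1 recovers the ordinary exponential...
assert abs(q_exp(-1.0, 1.000001) - math.exp(-1.0)) < 1e-3
# ...while q > 1 decays more slowly, i.e. a fatter tail for extreme events
assert q_exp(-5.0, 1.5) > math.exp(-5.0)
```

    Fitting the q parameter to magnitude (or amplitude) distributions is what the universality comparison across earthquakes, EM emissions, flares, and storms amounts to.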

  5. Seismic Hazard Assessment for a Characteristic Earthquake Scenario: Probabilistic-Deterministic Method

    NASA Astrophysics Data System (ADS)

    mouloud, Hamidatou

    2016-04-01

    The objective of this paper is to analyze the seismic activity and the statistical treatment of the seismicity catalog of the Constantine region between 1357 and 2014, comprising 7007 seismic events. Our research is a contribution to improving seismic risk management by evaluating the seismic hazard in north-east Algeria. In the present study, earthquake hazard maps for the Constantine region are calculated. Probabilistic seismic hazard analysis (PSHA) is classically performed through the Cornell approach, using a uniform earthquake distribution over the source area and a given magnitude range. This study aims at extending the PSHA approach to the case of a characteristic earthquake scenario associated with an active fault. The approach integrates PSHA with a high-frequency deterministic technique for the prediction of peak and spectral ground-motion parameters in a characteristic earthquake. The method is based on the site-dependent evaluation of the probability of exceedance for the chosen strong-motion parameter. We propose five seismotectonic zones. Five steps are necessary: (i) identification of potential sources of future earthquakes, (ii) assessment of their geological, geophysical and geometric characteristics, (iii) identification of the attenuation pattern of seismic motion, (iv) calculation of the hazard at a site, and finally (v) hazard mapping for a region. In this study, the earthquake hazard evaluation procedure developed by Kijko and Sellevoll (1992) is used to estimate seismic hazard parameters in the northern part of Algeria.

  6. Statistical Analysis and ETAS Modeling of Seismicity Induced by Production of Geothermal Energy from Hydrothermal Systems

    NASA Astrophysics Data System (ADS)

    Dinske, C.; Langenbruch, C.; Shapiro, S. A.

    2017-12-01

    We investigate seismicity related to hydrothermal systems in Germany and Italy, focusing on temporal changes of seismicity rates. Our analysis was motivated by numerical simulations: the modeling of stress changes caused by the injection and production of fluid revealed that seismicity rates should decrease in the long term, which is not observed in the considered case studies. We analyze the waiting-time distributions of the seismic events in both the time domain (inter-event times) and the fluid-volume domain (inter-event volumes). We find clear indications that the observed seismicity comprises two components: (1) seismicity directly triggered by the production and re-injection of fluid, i.e. induced events, and (2) seismicity triggered by earthquake interactions, i.e. aftershock triggering. In order to better constrain our numerical simulations using the observed induced seismicity, we apply catalog declustering to separate the two components. We use the magnitude-dependent space-time windowing approach introduced by Gardner and Knopoff (1974) and test several published algorithms for calculating the space-time windows. After declustering, we conclude that the different hydrothermal reservoirs show a comparable seismic response to the circulation of fluid and to additional triggering by earthquake interactions. The declustered catalogs contain approximately 50 per cent of the events in the original catalogs. We then perform ETAS (Epidemic Type Aftershock Sequence; Ogata, 1986, 1988) modeling for two reasons. First, we want to know whether the different reservoirs are also comparable in their earthquake interaction patterns. Second, if we identify systematic patterns, ETAS modeling can contribute to forecasting seismicity during the production of geothermal energy. We find that stationary ETAS models cannot accurately capture the real seismicity-rate changes; one reason is that the rate of observed induced events is not constant over time. Hence we apply non-stationary ETAS modeling (Kumazawa and Ogata, 2013, 2014), which results in good agreement with the observations. However, the required non-stationarity of the seismicity-triggering process complicates the implementation of ETAS modeling in induced-seismicity forecast models.
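    The magnitude-dependent Gardner-Knopoff windows are commonly evaluated with a fitted approximation to the original 1974 table; the coefficients below are the widely quoted fit (treat them as an approximation, not the exact values used in this study):

```python
def gk_window(mag):
    """Gardner-Knopoff space-time window for a mainshock of magnitude
    `mag`; returns (distance_km, time_days). Fitted approximation to the
    1974 table, piecewise in time above/below M6.5."""
    d_km = 10 ** (0.1238 * mag + 0.983)
    if mag >= 6.5:
        t_days = 10 ** (0.032 * mag + 2.7389)
    else:
        t_days = 10 ** (0.5409 * mag - 0.547)
    return d_km, t_days

d, t = gk_window(6.0)
# an M6 mainshock masks roughly a 50 km / 500 day window; any event
# inside both windows is flagged as a dependent (triggered) event
```

    Declustering then simply removes events falling inside the window of a larger earlier event.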

  7. Detailed investigation of Long-Period activity at Campi Flegrei by Convolutive Independent Component Analysis

    NASA Astrophysics Data System (ADS)

    Capuano, P.; De Lauro, E.; De Martino, S.; Falanga, M.

    2016-04-01

    This work is devoted to the analysis of seismic signals continuously recorded at the Campi Flegrei caldera (Italy) during the entire year 2006. The radiation pattern associated with the Long-Period (LP) energy release is investigated. We adopt an innovative Independent Component Analysis algorithm for convolutive seismic series, adapted and improved to give automatic procedures for detecting seismic events often buried in high-level ambient noise. The extracted waveforms, characterized by an improved signal-to-noise ratio, allow the recognition of Long-Period precursors, evidencing that the seismic activity accompanying the 2006 mini-uplift crisis, which climaxed in the three days of 26-28 October, had already started at the beginning of October and lasted until mid-November. A more complete seismic catalog is thus provided, which can be used to properly quantify the seismic energy release. To better ground our results, we first check the robustness of the method by comparing it with other blind source separation methods based on higher-order statistics; second, we reconstruct the radiation patterns of the extracted Long-Period events in order to link the individuated signals directly to their sources. We take advantage of Convolutive Independent Component Analysis, which provides basic signals along the three directions of motion so that a direct polarization analysis can be performed without further filtering. We show that the extracted signals are mainly composed of P waves with radial polarization pointing to the seismic source of the main LP swarm, i.e. a small area in the Solfatara, also in the case of the small events that both precede and follow the main activity. From a dynamical point of view, the signals can be described by two degrees of freedom, indicating a low level of complexity associated with the vibrations of a superficial hydrothermal system. Our results allow us to move towards a full description of the complexity of the source, which can be used, by means of the small-intensity precursors, for hazard-model development and forecast-model testing, and they provide an illustrative example of the applicability of the CICA method to regions of low seismicity and high ambient noise.

  8. Statistical methods for investigating quiescence and other temporal seismicity patterns

    USGS Publications Warehouse

    Matthews, M.V.; Reasenberg, P.A.

    1988-01-01

    We propose a statistical model and a technique for objective recognition of one of the most commonly cited seismicity patterns: microearthquake quiescence. We use a Poisson process model for seismicity and define a process with quiescence as one with a particular type of piecewise-constant intensity function. From this model, we derive a statistic for testing stationarity against a 'quiescence' alternative. The large-sample null distribution of this statistic is approximated from simulated distributions of appropriate functionals applied to Brownian bridge processes. We point out the restrictiveness of the particular model we propose and of the quiescence idea in general. The fact that there are many point processes which have neither constant nor quiescent rate functions underscores the need to test for and describe nonuniformity thoroughly. We advocate the use of the quiescence test in conjunction with various other tests for nonuniformity and with graphical methods such as density estimation. Ideally, these methods may promote accurate description of temporal seismicity distributions and useful characterizations of interesting patterns. © 1988 Birkhäuser Verlag.
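    The connection to the Brownian bridge can be illustrated with a simple statistic in the same spirit (not the authors' exact functional): under a stationary Poisson process the normalized event times are i.i.d. uniform, and the scaled maximum excess of their empirical CDF over the diagonal converges to a Brownian-bridge supremum, growing large when the rate shuts off late in the window:

```python
import numpy as np

def quiescence_stat(event_times, t0, t1):
    """sqrt(n) * max(ECDF - u) for normalized event times u on [t0, t1];
    late quiescence piles events early, pushing the ECDF above the
    diagonal (a sketch in the spirit of the paper's test)."""
    u = np.sort((np.asarray(event_times, float) - t0) / (t1 - t0))
    ecdf = np.arange(1, len(u) + 1) / len(u)
    return np.sqrt(len(u)) * np.max(ecdf - u)

rng = np.random.default_rng(2)
stationary = rng.uniform(0.0, 1.0, 200)      # constant-rate catalog
quiescent = rng.uniform(0.0, 0.7, 200)       # activity stops at t = 0.7
s0 = quiescence_stat(stationary, 0.0, 1.0)
s1 = quiescence_stat(quiescent, 0.0, 1.0)
# s1 >> s0: the deficit of late events is what the test detects
```

    The paper's actual statistic is tailored to the piecewise-constant quiescence alternative, but the null calibration via Brownian-bridge functionals works the same way.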

  9. Seismic risk analysis for the Babcock and Wilcox facility, Leechburg, Pennsylvania

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1977-10-21

    The results of a detailed seismic risk analysis of the Babcock and Wilcox Plutonium Fuel Fabrication facility at Leechburg, Pennsylvania are presented. This report focuses on earthquakes; the other natural hazards, addressed in separate reports, are severe weather (strong winds and tornados) and floods. The calculational method used is based on Cornell's work (1968); it has previously been applied to safety evaluations of major projects. The historical seismic record was established after a review of the available literature, consultation with operators of local seismic arrays, and examination of appropriate seismic data bases. Because of the aseismicity of the region around the site, an analysis different from the conventional closest-approach-in-a-tectonic-province method was adopted. Earthquakes as far from the site as 1,000 km were included, as was the possibility of earthquakes at the site itself. In addition, various uncertainties in the input were explicitly considered in the analysis. The results of the risk analysis, which include a Bayesian estimate of the uncertainties, are presented as return-period accelerations. The best-estimate curve indicates that the Babcock and Wilcox facility will experience 0.05 g every 220 years and 0.10 g every 1,400 years. The bounding curves roughly represent the one-standard-deviation confidence limits about the best estimate, reflecting the uncertainty in certain of the inputs. Detailed examination of the results shows that the accelerations are very insensitive to the details of the source-region geometries and of the historical earthquake statistics in each region, and that each of the source regions contributes almost equally to the cumulative risk at the site. If required for structural analysis, acceleration response spectra for the site can be constructed by scaling the mean response spectrum for alluvium in WASH 1255 by these peak accelerations.

  10. Strong Motion Network of Medellín and Aburrá Valley: technical advances, seismicity records and micro-earthquake monitoring

    NASA Astrophysics Data System (ADS)

    Posada, G.; Trujillo, J. C., Sr.; Hoyos, C.; Monsalve, G.

    2017-12-01

    The tectonic setting of Colombia is determined by the interaction of the Nazca, Caribbean and South American plates, together with the collision of the Panama-Choco block, which makes it a seismically active region. Regional seismic monitoring is carried out by the National Seismological Network of Colombia and the Accelerometer National Network of Colombia. Both networks calculate locations, magnitudes, depths, accelerations, and other seismic parameters. The Medellín - Aburrá Valley is located in the northern segment of the Central Cordillera of Colombia and, according to the Colombian technical seismic norm (NSR-10), is a region of intermediate hazard, because of its proximity to the seismic sources of the Valley. Seismic monitoring in the Aburrá Valley began in 1996 with an accelerometer network of 38 instruments. Currently, the network consists of 26 stations and is run by the Early Warning System of Medellín and the Aburrá Valley (SIATA). Technical advances have enabled real-time communication over the past year, currently at 10 stations; post-earthquake data are processed in near real time, quickly yielding locations, accelerations, response spectra and Fourier analyses; this information is displayed on the SIATA web site. The strong-motion database comprises 280 earthquakes; this information is the basis for the estimation of seismic hazard and risk for the region. A basic statistical analysis of the main information was carried out, including the total recorded events per station, natural frequency, maximum accelerations, depths and magnitudes, which allowed us to identify the main seismic sources and some seismic site parameters. With the aim of more complete seismic monitoring, and in order to identify seismic sources beneath the Valley, we are in the process of installing 10 low-cost shake seismometers for micro-earthquake monitoring. There is no historical record of earthquakes with a magnitude greater than 3.5 beneath the Aburrá Valley, and the neotectonic evidence is limited, so it is expected that this network will help to characterize the seismic hazard.

  11. Semiautomatic and Automatic Cooperative Inversion of Seismic and Magnetotelluric Data

    NASA Astrophysics Data System (ADS)

    Le, Cuong V. A.; Harris, Brett D.; Pethick, Andrew M.; Takam Takougang, Eric M.; Howe, Brendan

    2016-09-01

    Natural-source electromagnetic methods have the potential to recover rock-property distributions from the surface to great depths. Unfortunately, results in complex 3D geo-electrical settings can be disappointing, especially where significant near-surface conductivity variations exist. In such settings, unconstrained inversion of magnetotelluric data is inexorably non-unique. We believe that: (1) correctly introduced information from seismic reflection can substantially improve MT inversion, (2) a cooperative inversion approach can be automated, and (3) massively parallel computing can make such a process viable. Nine inversion strategies, including baseline unconstrained inversion and new automated/semiautomated cooperative inversion approaches, are applied to industry-scale co-located 3D seismic and magnetotelluric data sets. These data sets were acquired in one of the Carlin gold deposit districts in north-central Nevada, USA. In our approach, seismic information feeds directly into the creation of sets of prior conductivity models and covariance coefficient distributions. We demonstrate how statistical analysis of the distribution of selected seismic attributes can be used to automatically extract subvolumes that form the framework for the prior 3D conductivity model. Our cooperative inversion strategies result in detailed subsurface conductivity distributions that are consistent with seismic data, electrical logs and geochemical analyses of cores. Such 3D conductivity distributions would be expected to provide clues to 3D velocity structures that could feed back into full seismic inversion, for an iterative, practical and truly cooperative inversion process. We anticipate that, with the aid of parallel computing, cooperative inversion of seismic and magnetotelluric data can be fully automated, and we hold confidence that significant and practical advances in this direction have been accomplished.

  12. Map of Pseudo-F-statistics of seismic noise parameters as an indicator of current seismic danger in Japan

    NASA Astrophysics Data System (ADS)

    Lyubushin, Alexey

    2016-04-01

    The problem of estimating current seismic danger from monitoring of seismic noise properties recorded by the broadband seismic network F-net in Japan (84 stations) is considered. Variations of the following seismic noise parameters are analyzed: multifractal singularity spectrum support width, generalized Hurst exponent, minimum Hölder-Lipschitz exponent, and minimum normalized entropy of squared orthogonal wavelet coefficients. These parameters are estimated within adjacent time windows of length 1 day for the seismic noise waveforms from each station. Calculating daily median values of these parameters over all stations provides a 4-dimensional time series that describes integral properties of the seismic noise in the region covered by the network. Cluster analysis is applied to the sequence of clouds of 4-dimensional vectors within a moving time window of length 365 days with a mutual shift of 3 days, starting from the beginning of 1997 up to the current time. The purpose of the cluster analysis is to find the best number of clusters (BNC) among probe numbers varying from 1 up to a maximum of 40. The BNC is found from the maximum of the pseudo-F-statistic (PFS). A 2D map can then be created that presents the dependence of the PFS on the probed number of clusters and on the right-hand end of the moving time window, rather similar to the usual spectral time-frequency diagrams. In [1] it was shown that the BNC before the Tohoku mega-earthquake of March 11, 2011, exhibited a strongly chaotic regime, with jumps from minimum to maximum values in the year preceding the event, and that this time interval was characterized by high PFS values. The PFS map is proposed as a method for extracting time intervals of high current seismic danger. The next dangerous time interval after the Tohoku mega-earthquake began at the end of 2012 and finished in the middle of 2013. Starting from the middle of 2015, high PFS values and the chaotic regime of BNC variations returned.
    This could be interpreted as an increase in the danger of the next mega-earthquake in Japan in the region of the Nankai Trough [1] in the first half of 2016. References: 1. Lyubushin, A., 2013. How soon would the next mega-earthquake occur in Japan? Natural Science, 5 (8A1), 1-7. http://dx.doi.org/10.4236/ns.2013.58A1001
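    The best number of clusters is selected by maximising a pseudo-F statistic over probe cluster counts. A minimal sketch of that selection criterion, assuming the pseudo-F is the Calinski-Harabasz ratio of between- to within-cluster variance (the function name and toy data are illustrative, not from the study):

```python
import numpy as np

def pseudo_f(X, labels):
    """Calinski-Harabasz pseudo-F: ratio of between-cluster to within-cluster
    variance, normalised by degrees of freedom; larger means better separation."""
    n = len(X)
    clusters = np.unique(labels)
    k = len(clusters)
    if k < 2:
        return 0.0
    grand_mean = X.mean(axis=0)
    between = sum(
        (labels == c).sum() * np.sum((X[labels == c].mean(axis=0) - grand_mean) ** 2)
        for c in clusters
    )
    within = sum(np.sum((X[labels == c] - X[labels == c].mean(axis=0)) ** 2)
                 for c in clusters)
    return (between / (k - 1)) / (within / (n - k))

# Toy stand-in for daily 4-D noise-parameter vectors: two tight clouds.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (50, 4)), rng.normal(5.0, 0.1, (50, 4))])
good = pseudo_f(X, np.repeat([0, 1], 50))      # the true 2-cluster split
poor = pseudo_f(X, rng.integers(0, 4, 100))    # an arbitrary 4-way split
```

    In the study, this score would be evaluated for partitions with probe numbers k = 1...40 inside each 365-day window, and the k maximising the score taken as the BNC.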

  13. Tectonic Divisions Based on Gravity Data and Earthquake Distribution Characteristics in the North South Seismic Belt, China

    NASA Astrophysics Data System (ADS)

    Tian, T.; Zhang, J.; Jiang, W.

    2017-12-01

    The North South Seismic Belt is located in the middle of China and can be divided into 12 tectonic zones: the South West Yunnan (I), Sichuan Yunnan (II), Qiang Tang (III), Bayan Har (IV), East Kunlun Qaidam (V), Qi Lian Mountain (VI), Tarim (VII), East Alashan (VIII), East Sichuan (IX), Ordos (X), Middle Yangtze River (XI), and Edge of Qinghai Tibet Block (XII) zones. Based on Bouguer gravity data calculated from the EGM2008 model, Euler deconvolution was used to obtain the edges of the tectonic zones and amend the traditional tectonic divisions. For every tectonic zone, and for the whole research area, the logarithm of the total seismic energy was calculated. Time series analysis (TSA) for all tectonic zones and the whole area was performed in R, with 12 equal divisions by latitude and longitude (A1-3, B1-3, C1-3, D1-3) serving as a control group. A simple linear trend in time was fitted, and Q-Q plots were used to show the residual distribution features. Among the zones defined by gravity anomalies, zones I, II, and XII show similar statistical characteristics, with no earthquake-free year (a year in which no earthquake occurred in the zone); this suggests that the more seismically active a zone is, the more its statistical characteristics resemble those of the whole area, regardless of the size of the zone or the number of earthquakes in it. Zones IV, V, and IX show one or several earthquake-free years during the 1970s, and zones III, VII, and VIII during the 1980s, which may indicate that earthquake activity was low decades ago, that the earthquake catalogue was incomplete in these zones, or both. Zones VI, X, and XI show many earthquake-free years even in the current decade, which means that in these zones earthquake activity is very low even if the catalogue is incomplete.
    In the control group, zones with earthquake-free years appeared at random, independent of the seismic density, and in all equally divided zones with earthquake-free years those years fell in the 1970s, which relates only to the incompleteness of the earthquake catalogue in western China. In conclusion, tectonic divisions based on gravity anomalies provide a more efficient way to add a spatial factor, with specific tectonic implications, to the time series analysis.
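    The per-zone statistics above rest on two simple computations: the logarithm of the total seismic energy released per year, and the detection of earthquake-free years. A minimal sketch, assuming the standard Gutenberg-Richter energy-magnitude relation log10 E[J] = 1.5 M + 4.8 (the mini-catalogue below is invented for illustration):

```python
import numpy as np

def log_total_energy(magnitudes):
    """log10 of the summed radiated energy in joules,
    using log10 E = 1.5 * M + 4.8 (Gutenberg-Richter energy relation)."""
    if len(magnitudes) == 0:
        return float("-inf")
    return float(np.log10(np.sum(10.0 ** (1.5 * np.asarray(magnitudes) + 4.8))))

def earthquake_free_years(event_years, start, end):
    """Years in [start, end] with no catalogued event in the zone."""
    observed = set(event_years)
    return [y for y in range(start, end + 1) if y not in observed]

# Hypothetical mini-catalogue for one zone: (year, magnitude) pairs.
catalogue = [(1970, 5.0), (1971, 6.0), (1971, 4.5), (1973, 5.5)]
free = earthquake_free_years([y for y, _ in catalogue], 1970, 1973)
log_e_1971 = log_total_energy([m for y, m in catalogue if y == 1971])
```

    The single M 6.0 dominates the 1971 energy sum, which is why the energy logarithm, rather than the raw event count, is the quantity trended in time.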

  14. Universality in the dynamical properties of seismic vibrations

    NASA Astrophysics Data System (ADS)

    Chatterjee, Soumya; Barat, P.; Mukherjee, Indranil

    2018-02-01

    We have studied the statistical properties of observed magnitudes of seismic vibration data in discrete time, in an attempt to understand the underlying complex dynamical processes. The observed magnitude data are taken from six different geographical locations. All magnitudes are considered in the analysis, including catastrophic vibrations, foreshocks, aftershocks and commonplace daily vibrations. The probability distribution functions of these data sets obey a scaling law and display a universal character. To investigate the universality features in the observed data generated by a complex process, we applied Random Matrix Theory (RMT) in the framework of the Gaussian Orthogonal Ensemble (GOE). For all six locations the observed data show a close fit to the predictions of RMT, which reinforces the idea of universality in the dynamical processes generating seismic vibrations.
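    The GOE benchmark that observed spacing statistics are usually compared against is the Wigner surmise, P(s) = (π/2) s exp(−πs²/4) for unfolded nearest-neighbour spacings. A minimal numerical check of its two defining properties, unit normalisation and unit mean spacing (the integration helper is kept explicit for portability):

```python
import numpy as np

def wigner_surmise(s):
    """GOE nearest-neighbour spacing density: P(s) = (pi/2) s exp(-pi s^2 / 4)."""
    return (np.pi / 2.0) * s * np.exp(-np.pi * s ** 2 / 4.0)

def trapezoid(y, x):
    """Simple trapezoidal integration."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

s = np.linspace(0.0, 10.0, 100001)
norm = trapezoid(wigner_surmise(s), s)       # integrates to 1
mean = trapezoid(s * wigner_surmise(s), s)   # mean spacing is 1 by construction
```

    Agreement of an empirical spacing histogram with this curve, after unfolding, is the kind of "close fit with the predictions of RMT" reported above.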

  15. Proceedings of the 11th Annual DARPA/AFGL Seismic Research Symposium

    NASA Astrophysics Data System (ADS)

    Lewkowicz, James F.; McPhetres, Jeanne M.

    1990-11-01

    The following subjects are covered: near source observations of quarry explosions; small explosion discrimination and yield estimation; Rg as a depth discriminant for earthquakes and explosions: a case study in New England; a comparative study of high frequency seismic noise at selected sites in the USSR and USA; chemical explosions and the discrimination problem; application of simulated annealing to joint hypocenter determination; frequency dependence of Q_Lg and Q in the continental crust; statistical approaches to testing for compliance with a threshold test ban treaty; broad-band studies of seismic sources at regional and teleseismic distances using advanced time series analysis methods; effects of depth of burial and tectonic release on regional and teleseismic explosion waveforms; finite difference simulations of seismic wave excitation at Soviet test sites with deterministic structures; stochastic geologic effects on near-field ground motions; the damage mechanics of porous rock; nonlinear attenuation mechanism in salt at moderate strain; compressional- and shear-wave polarizations at the Anza seismic array; and a generalized beamforming approach to real time network detection and phase association.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vasylkivska, Veronika S.; Huerta, Nicolas J.

    Determining the spatiotemporal characteristics of natural and induced seismic events holds the opportunity to gain new insights into why these events occur. Linking the seismicity characteristics with other geologic, geographic, natural, or anthropogenic factors could help to identify the causes and suggest mitigation strategies that reduce the risk associated with such events. The nearest-neighbor approach utilized in this work represents a practical first step toward identifying statistically correlated clusters of recorded earthquake events. A detailed study of the Oklahoma earthquake catalog's inherent errors, empirical model parameters, and model assumptions is presented. We found that the cluster analysis results are stable with respect to empirical parameters (e.g., fractal dimension) but were sensitive to epicenter location errors and seismicity rates. Most critically, we show that the patterns in the distribution of earthquake clusters in Oklahoma are primarily defined by spatial relationships between events. This observation is a stark contrast to California (also known for induced seismicity) where a comparable cluster distribution is defined by both spatial and temporal interactions between events. These results highlight the difficulty in understanding the mechanisms and behavior of induced seismicity but provide insights for future work.
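    A minimal sketch of the nearest-neighbour proximity at the core of such cluster identification, in the spirit of the Zaliapin-style space-time-magnitude metric η = Δt · r^df · 10^(−b·m); the parameter values and toy catalogue are illustrative, not those of the study:

```python
import numpy as np

def nearest_neighbour(t, x, y, m, b=1.0, df=1.6):
    """For each event j, find the earlier event i minimising
    eta = dt * r**df * 10**(-b * m_i),
    with dt the time gap, r the epicentral distance, and m_i the candidate
    parent magnitude. b (Gutenberg-Richter) and df (fractal dimension of
    epicentres) are empirical parameters. Returns parent indices and proximities."""
    n = len(t)
    parent = np.full(n, -1)
    eta = np.full(n, np.inf)
    for j in range(1, n):
        dt = t[j] - t[:j]
        r = np.maximum(np.hypot(x[j] - x[:j], y[j] - y[:j]), 1e-6)
        e = dt * r ** df * 10.0 ** (-b * m[:j])
        parent[j] = int(np.argmin(e))
        eta[j] = float(e[parent[j]])
    return parent, eta

# Toy catalogue: an M6 at the origin, a nearby small event a day later,
# and a distant small event two days later.
t = np.array([0.0, 1.0, 2.0])
x = np.array([0.0, 0.1, 50.0])
y = np.array([0.0, 0.0, 50.0])
m = np.array([6.0, 3.0, 3.0])
parent, eta = nearest_neighbour(t, x, y, m)
```

    Small η marks statistically correlated (clustered) pairs; thresholding the η distribution is what separates clustered events from the background.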

  17. Seismic Hazard and Risk Assessments for Beijing-Tianjin-Tangshan, China, Area

    USGS Publications Warehouse

    Xie, F.; Wang, Z.; Liu, J.

    2011-01-01

    Seismic hazard and risk in the Beijing-Tianjin-Tangshan, China, area were estimated from 500 years of intensity observations. First, we digitized the intensity observations (maps) using ArcGIS with a cell size of 0.1° × 0.1°. Second, we performed a statistical analysis on the digitized intensity data, determined an average b value (0.39), and derived the intensity-frequency relationship (hazard curve) for each cell. Finally, based on a Poisson model for earthquake occurrence, we calculated seismic risk in terms of the probability of I ≥ 7, 8, or 9 in 50 years. We also calculated the intensities with a corresponding 10 percent probability of exceedance in 50 years. The advantages of assessing seismic hazard and risk from intensity records are that (1) fewer assumptions (i.e., about earthquake sources and ground motion attenuation) are made, and (2) site effects are included. Our study shows that the area has high seismic hazard and risk. Our study also suggests that the current design peak ground acceleration or intensity for the area may not be adequate. © 2010 Birkhäuser / Springer Basel AG.
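    The final step above combines a per-cell intensity-frequency (hazard) relation with the Poisson occurrence model. A minimal sketch; the a value is illustrative, and only b = 0.39 is the average quoted in the abstract:

```python
import math

def annual_rate(a, b, intensity):
    """Annual rate of events with I >= intensity from the hazard curve
    log10 N = a - b * I (a is illustrative; b = 0.39 is the quoted average)."""
    return 10.0 ** (a - b * intensity)

def poisson_exceedance(rate, t_years=50.0):
    """Probability of at least one occurrence in t_years under a Poisson model."""
    return 1.0 - math.exp(-rate * t_years)

lam = annual_rate(a=1.5, b=0.39, intensity=7.0)
p_50yr = poisson_exceedance(lam)   # P(I >= 7 at least once in 50 years)
```

    Inverting poisson_exceedance recovers the familiar design convention: a 10% chance of exceedance in 50 years corresponds to an annual rate of −ln(0.9)/50, roughly 1/475 per year.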

  18. Seismic signal and noise on Europa

    NASA Astrophysics Data System (ADS)

    Panning, Mark; Stähler, Simon; Bills, Bruce; Castillo Castellanos, Jorge; Huang, Hsin-Hua; Husker, Allen; Kedar, Sharon; Lorenz, Ralph; Pike, William T.; Schmerr, Nicholas; Tsai, Victor; Vance, Steven

    2017-10-01

    Seismology is one of our best tools for detailing the interior structure of planetary bodies, and a seismometer is included in the baseline and threshold mission design for the upcoming Europa Lander mission. Guiding mission design and planning for adequate science return, though, requires modeling of both the anticipated signal and noise. Assuming ice seismicity on Europa behaves according to statistical properties observed in Earth catalogs, and scaling cumulative seismic moment release to the Moon, we can simulate long seismic records and estimate background noise and peak signal amplitudes (Panning et al., 2017). This suggests that a sensitive instrument, comparable to many broadband terrestrial instruments or the SP instrument from the InSight mission to Mars, will be able to record signals, while high-frequency geophones are likely inadequate. We extend this analysis to begin incorporating spatial and temporal variation due to the tidal cycle, which can help inform landing site selection. We also begin exploring how chaotic terrain at the bottom of the ice shell and intra-ice heterogeneities (i.e., internal melt structures) may affect anticipated seismic observations, using 2D numerical seismic simulations. Reference: M. P. Panning, S. C. Stähler, H.-H. Huang, S. D. Vance, S. Kedar, V. C. Tsai, W. T. Pike, R. D. Lorenz, "Expected seismicity and the seismic noise environment of Europa," J. Geophys. Res., in revision, 2017.

  19. An analysis of Greek seismicity based on Non Extensive Statistical Physics: The interdependence of magnitude, interevent time and interevent distance.

    NASA Astrophysics Data System (ADS)

    Efstathiou, Angeliki; Tzanis, Andreas; Vallianatos, Filippos

    2014-05-01

    The context of Non Extensive Statistical Physics (NESP) has recently been suggested to comprise an appropriate tool for the analysis of complex dynamic systems with scale invariance, long-range interactions, long-range memory, and evolution in a fractal-like space-time. This is because the active tectonic grain is thought to comprise a (self-organizing) complex system; therefore, its expression (seismicity) should be manifested in the temporal and spatial statistics of energy release rates. In addition to energy release rates, expressed by the magnitude M, measures of the temporal and spatial interactions are the time (Δt) and hypocentral distance (Δd) between consecutive events. Recent work indicated that if the distributions of M, Δt and Δd are independent, so that the joint probability p(M,Δt,Δd) factorizes into the probabilities of M, Δt and Δd, i.e. p(M,Δt,Δd) = p(M)p(Δt)p(Δd), then the frequency of earthquake occurrence is multiply related, not only to magnitude as the celebrated Gutenberg-Richter law predicts, but also to interevent time and distance, by means of well-defined power-laws consistent with NESP. The present work applies these concepts to investigate the self-organization and temporal/spatial dynamics of seismicity in Greece and western Turkey for the period 1964-2011. The analysis was based on the ISC earthquake catalogue, which is homogeneous by construction, with consistently determined hypocenters and magnitudes. The presentation focuses on the analysis of bivariate Frequency-Magnitude-Time distributions, while using the interevent distances as spatial constraints (or spatial filters) for studying the spatial dependence of the energy and time dynamics of the seismicity.
    It is demonstrated that the frequency of earthquake occurrence is multiply related to the magnitude and the interevent time by means of well-defined multi-dimensional power-laws consistent with NESP, and that it has attributes of universality, as it holds for a broad range of spatial, temporal and magnitude scales. Provided that the multivariate empirical frequency distributions are based on a sufficient number of observations as an empirical lower limit, the results are stable and consistent with established knowledge, irrespective of the magnitude and spatio-temporal range of the earthquake catalogue, or of operations pertaining to re-sampling, bootstrapping or re-arrangement of the catalogue. It is also demonstrated that the expression of the regional active tectonic grain may comprise a mixture of processes significantly dependent on Δd. The analysis of the size (energy) distribution of earthquakes yielded results consistent with a correlated sub-extensive system; the results are also consistent with conventional determinations of Frequency-Magnitude distributions. The analysis of interevent times determined the existence of sub-extensivity and near-field interaction (correlation) both in the complete catalogue of Greek and western Turkish seismicity (mixed background earthquake activity and aftershock processes) and in the pure background process (declustered catalogue). This could be attributed to the joint effect of near-field interaction between neighbouring earthquakes or seismic areas and interaction within aftershock sequences. The background process appears to be moderately to weakly correlated in the far field. Formal random temporal processes have not been detected. A general conclusion afforded by the above observations is that aftershock sequences may be an integral part of the seismogenetic process, as they appear to partake in long-range interaction.
    A formal explanation of such an effect is pending, but it may involve delayed remote triggering of seismic activity by (transient or static) stress transfer from the main shocks and large aftershocks, and/or the cascading effects already discussed by Marsan and Lengliné (2008). In this view, the effect weakens when aftershocks are removed because aftershocks are the link between the main shocks and their remote offspring. Overall, the above results compare well with those for Northern California, which have shown that the expression of seismicity there is generally consistent with non-extensive (sub-extensive) thermodynamics. Acknowledgments: This work was supported by the THALES Program of the Ministry of Education of Greece and the European Union in the framework of the project "Integrated understanding of Seismicity, using innovative methodologies of Fracture Mechanics along with Earthquake and Non-Extensive Statistical Physics - Application to the geodynamic system of the Hellenic Arc - SEISMO FEAR HELLARC". References: Tzanis, A., Vallianatos, F. and Efstathiou, A., 2013. Multidimensional earthquake frequency distributions consistent with Non-Extensive Statistical Physics: the interdependence of magnitude, interevent time and interevent distance in North California. Bulletin of the Geological Society of Greece, vol. XLVII; Proceedings of the 13th International Congress, Chania, Sept. 2013. Tzanis, A., Vallianatos, F. and Efstathiou, A., 2013. Generalized multidimensional earthquake frequency distributions consistent with Non-Extensive Statistical Physics: An appraisal of the universality in the interdependence of magnitude, interevent time and interevent distance. Geophysical Research Abstracts, Vol. 15, EGU2013-628, EGU General Assembly 2013. Marsan, D. and Lengliné, O., 2008. Extending earthquakes' reach through cascading. Science, 319, 1076; doi: 10.1126/science.1148783. On-line Bulletin, http://www.isc.ac.uk, Internatl. Seis. Cent., Thatcham, United Kingdom, 2011.
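    The NESP power laws referred to throughout are built on the Tsallis q-exponential, which reduces to the ordinary exponential as q → 1 and develops a power-law tail for q > 1. A minimal sketch of the interevent-time survival function in this form (parameter values illustrative, not fitted values from the study):

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential: [1 + (1-q) x]^(1/(1-q)), with exp(x) at q = 1."""
    if q == 1.0:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

def interevent_survival(tau, tau0, q):
    """P(T > tau) for interevent times: a q-exponential in -tau/tau0.
    q > 1 signals the sub-extensive, correlated regime discussed above."""
    return q_exp(-tau / tau0, q)
```

    For q = 1.5 this gives P(T > τ) = (1 + τ/(2τ0))^(−2), an asymptotic power-law decay rather than the exponential decay of a memoryless (Poissonian) process.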

  20. Seismic activity prediction using computational intelligence techniques in northern Pakistan

    NASA Astrophysics Data System (ADS)

    Asim, Khawaja M.; Awais, Muhammad; Martínez-Álvarez, F.; Iqbal, Talat

    2017-10-01

    An earthquake prediction study is carried out for the region of northern Pakistan. The prediction methodology involves an interdisciplinary interaction of seismology and computational intelligence. Eight seismic parameters are computed from past earthquakes. The predictive ability of these eight parameters is evaluated in terms of information gain, which leads to the selection of six parameters for prediction. Multiple computationally intelligent models have been developed for earthquake prediction using the selected seismic parameters: a feed-forward neural network, a recurrent neural network, random forest, a multilayer perceptron, a radial basis function neural network, and a support vector machine. The performance of every prediction model is evaluated, and McNemar's statistical test is applied to assess the statistical significance of the computational methodologies. The feed-forward neural network shows statistically significant predictions, with an accuracy of 75% and a positive predictive value of 78%, in the context of northern Pakistan.
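    McNemar's test compares two classifiers evaluated on the same test set using only their discordant predictions. A minimal sketch with the continuity-corrected chi-squared form; the counts below are invented, not the study's:

```python
def mcnemar_chi2(b, c):
    """McNemar's statistic with continuity correction.
    b = cases only classifier A got right, c = cases only classifier B got right.
    Under H0 (equal performance) it is approximately chi-squared with 1 degree
    of freedom, so values above 3.841 are significant at the 5% level."""
    if b + c == 0:
        return 0.0
    return (abs(b - c) - 1.0) ** 2 / (b + c)

# Hypothetical disagreement counts between two of the prediction models.
stat = mcnemar_chi2(b=25, c=8)
significant = stat > 3.841
```

    Because concordant predictions cancel out, the test isolates whether the two models genuinely disagree in a systematic direction, which is why it suits pairwise model comparisons like those above.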

  1. The Seismic risk perception in Italy deduced by a statistical sample

    NASA Astrophysics Data System (ADS)

    Crescimbene, Massimo; La Longa, Federica; Camassi, Romano; Pino, Nicola Alessandro; Pessina, Vera; Peruzza, Laura; Cerbara, Loredana; Crescimbene, Cristiana

    2015-04-01

    At the 2014 EGU Assembly we presented the results of a web survey on the perception of seismic risk in Italy. The data were derived from over 8,500 questionnaires coming from all Italian regions. Our questionnaire was built using the semantic differential method (Osgood et al., 1957) with a seven-point Likert scale, and was inspired by the main theoretical approaches to risk perception (psychometric paradigm, cultural theory, etc.). The results were promising and seem to clearly indicate an underestimation of seismic risk by the Italian population. Based on these promising results, the DPC has funded our research for a second year. At the 2015 EGU Assembly we present the results of a new survey based on an Italian statistical sample. The importance of statistical significance at the national scale was also underlined by ISTAT (the Italian National Institute of Statistics), which, considering the study to be of national interest, accepted the project on the perception of seismic risk as a pilot study inside the National Statistical System (SISTAN) and encouraged our research unit to proceed in this direction. The survey was conducted by a company specialised in population surveys using the CATI method (computer-assisted telephone interview). Preliminary results will be discussed. Statistical support was provided by the research partner CNR-IRPPS. This research is funded by the Italian Civil Protection Department (DPC).

  2. Application of Visual Attention in Seismic Attribute Analysis

    NASA Astrophysics Data System (ADS)

    He, M.; Gu, H.; Wang, F.

    2016-12-01

    It has been shown that seismic attributes can be used to predict reservoir properties. The combination of multi-attribute analysis with geological statistics, data mining, and artificial intelligence has further promoted the development of seismic attribute analysis. However, the existing methods tend to suffer from multiple solutions and insufficient generalization ability, which is mainly due to the complex relationship between seismic data and geological information, and partly to the methods applied. Visual attention is a mechanistic model of the human visual system, which can concentrate rapidly on a few significant visual objects, even in a cluttered scene; the model therefore has good ability for target detection and recognition. In our study, the targets to be predicted are treated as visual objects, and an object representation based on well data is built in the attribute dimensions. Then, in the same attribute space, this representation serves as a criterion to search for potential targets away from the wells. The method does not need to predict properties by building a complicated relation between attributes and reservoir properties; it works by reference to the previously determined standard. It therefore has good generalization ability, and the problem of multiple solutions can be weakened by defining a similarity threshold.
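    A minimal sketch of the search step described above: well-calibrated targets define a reference vector in attribute space, and samples away from the wells are flagged by their similarity to it. Cosine similarity and the threshold value are illustrative choices, not necessarily the authors':

```python
import numpy as np

def find_candidates(well_targets, samples, threshold=0.95):
    """Flag samples whose attribute vectors are similar to the mean
    well-target representation; threshold is the similarity cut-off
    that weakens the multiple-solution problem."""
    ref = np.mean(well_targets, axis=0)
    ref = ref / np.linalg.norm(ref)
    unit = samples / np.linalg.norm(samples, axis=1, keepdims=True)
    similarity = unit @ ref
    return similarity >= threshold, similarity

# Two target attribute vectors from wells; two candidate samples off the wells.
wells = np.array([[1.0, 0.0], [0.9, 0.1]])
candidates = np.array([[1.0, 0.05], [0.0, 1.0]])
mask, sim = find_candidates(wells, candidates)
```

    Only the first candidate, which lies close to the well-derived representation in attribute space, is retained as a potential target.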

  3. Seismic Modeling Of Reservoir Heterogeneity Scales: An Application To Gas Hydrate Reservoirs

    NASA Astrophysics Data System (ADS)

    Huang, J.; Bellefleur, G.; Milkereit, B.

    2008-12-01

    Natural gas hydrates, a type of inclusion compound or clathrate, are composed of gas molecules trapped within a cage of water molecules. The occurrence of gas hydrates in permafrost regions has been confirmed by core samples recovered from the Mallik gas hydrate research wells located within the Mackenzie Delta in the Northwest Territories of Canada. Strong vertical variations of compressional and shear sonic velocities, together with weak surface seismic expressions of gas hydrates, indicate that lithological heterogeneities control the distribution of hydrates. Seismic scattering studies predict that the typical scales and strong physical contrasts due to gas hydrate concentration will generate strong forward scattering, leaving only weak energy captured by surface receivers. In order to understand the distribution of hydrates and the seismic scattering effects, an algorithm was developed to construct heterogeneous petrophysical reservoir models. The algorithm was based on well logs showing power-law features and Gaussian or non-Gaussian probability density distributions, and was designed to honor the full statistical character of the well logs, such as characteristic scales and correlations among rock parameters. Multi-dimensional, multi-variable heterogeneous models representing the same statistical properties were constructed and applied to the heterogeneity analysis of gas hydrate reservoirs. The petrophysical models provide the platform to estimate rock physics properties as well as to study the impact of seismic scattering, wave-mode conversion, and their integration on wave behavior in heterogeneous reservoirs.
    Using the Biot-Gassmann theory, the statistical parameters obtained from well Mallik 5L-38, and the correlation length estimated from acoustic impedance inversion, the gas hydrate volume fraction in the Mallik area was estimated to be 1.8%, corresponding to approximately 2×10⁸ m³ of natural gas stored in a hydrate-bearing interval of 0.25 km² lateral extent between 889 m and 1115 m depth. With parallel 3-D viscoelastic finite-difference (FD) software, we conducted a 3D numerical experiment of a near-offset vertical seismic profile. The synthetic results imply that the strong attenuation observed in the field data might be caused by scattering.
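    A minimal 1-D sketch of generating a heterogeneous model that honours a power-law spectrum, by colouring Gaussian white noise in the Fourier domain. The spectral exponent is illustrative, and the actual algorithm described above also honours non-Gaussian statistics and inter-parameter correlations:

```python
import numpy as np

def powerlaw_field(n, beta, seed=0):
    """1-D Gaussian random field with power spectral density ~ k**(-beta),
    synthesised by shaping the spectrum of white noise; output standardised
    to zero mean and unit variance."""
    rng = np.random.default_rng(seed)
    spectrum = np.fft.rfft(rng.normal(size=n))
    k = np.fft.rfftfreq(n)
    k[0] = k[1]                       # avoid division by zero at the DC term
    spectrum *= k ** (-beta / 2.0)    # amplitude spectrum scales as k^(-beta/2)
    field = np.fft.irfft(spectrum, n)
    return (field - field.mean()) / field.std()

log_like = powerlaw_field(1024, beta=1.0)   # a synthetic "well log"
```

    Larger beta concentrates power at long wavelengths, producing the smoother, longer-correlation-length media against which scattering behaviour can be tested.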

  4. Eruption Forecasting in Alaska: A Retrospective and Test of the Distal VT Model

    NASA Astrophysics Data System (ADS)

    Prejean, S. G.; Pesicek, J. D.; Wellik, J.; Cameron, C.; White, R. A.; McCausland, W. A.; Buurman, H.

    2015-12-01

    United States volcano observatories have successfully forecast most significant US eruptions in the past decade. However, eruptions of some volcanoes remain stubbornly difficult to forecast effectively using seismic data alone. The Alaska Volcano Observatory (AVO) has responded to 28 eruptions from 10 volcanoes since 2005. Eruptions that were not forecast include those of frequently active volcanoes with basaltic-andesite magmas, like Pavlof, Veniaminof, and Okmok volcanoes. In this study we quantify the success rate of eruption forecasting in Alaska and explore common characteristics of eruptions not forecast. In an effort to improve future forecasts, we re-examine seismic data from eruptions and known intrusive episodes in Alaska to test the effectiveness of the distal VT model commonly employed by the USGS-USAID Volcano Disaster Assistance Program (VDAP). In the distal VT model, anomalous brittle failure or volcano-tectonic (VT) earthquake swarms in the shallow crust surrounding the volcano occur as a secondary response to crustal strain induced by magma intrusion. Because the Aleutian volcanic arc is among the most seismically active regions on Earth, distinguishing distal VT earthquake swarms for eruption forecasting purposes from tectonic seismicity unrelated to volcanic processes poses a distinct challenge. In this study, we use a modified beta-statistic to identify pre-eruptive distal VT swarms and establish their statistical significance with respect to long-term background seismicity. This analysis allows us to explore the general applicability of the distal VT model and quantify the likelihood of encountering false positives in eruption forecasting using this model alone.
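    The beta statistic used to establish significance compares the event count in a candidate swarm window with the expectation from the long-term background rate. A minimal sketch of the classic (Matthews & Reasenberg style) form, of which the study uses a modified version; the counts below are invented:

```python
import math

def beta_statistic(n_window, n_total, t_window, t_total):
    """Standardised excess of events in a window of length t_window,
    relative to a binomial expectation from n_total events over t_total.
    Values of roughly beta > 2 are commonly read as a significant rate increase."""
    p = t_window / t_total
    expected = n_total * p
    variance = n_total * p * (1.0 - p)
    return (n_window - expected) / math.sqrt(variance)

# 20 events in a 30-day window against 100 background events over 1000 days.
beta = beta_statistic(n_window=20, n_total=100, t_window=30.0, t_total=1000.0)
```

    In a seismically active arc like the Aleutians, the long-term background term is what guards against reading ordinary tectonic swarms as pre-eruptive distal VT swarms.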

  5. Discriminating Characteristics of Tectonic and Human-Induced Seismicity

    NASA Astrophysics Data System (ADS)

    Zaliapin, I. V.; Ben-Zion, Y.

    2015-12-01

    We analyze statistical features of background and clustered subpopulations of earthquakes in different regions in an effort to distinguish between human-induced and natural seismicity. Analysis of "end-member" areas known to be dominated by human-induced earthquakes (The Geysers geothermal field in northern California and the TauTona gold mine in South Africa) and by regular tectonic activity (the San Jacinto fault zone in southern California and the Coso region, excluding the Coso geothermal field, in eastern central California) reveals several distinguishing characteristics. Induced seismicity is shown to have (i) a higher rate of background events (both absolute and relative to the total rate), (ii) faster temporal offspring decay, (iii) a higher intensity of repeating events, (iv) a larger proportion of small clusters, and (v) larger spatial separation between parent and offspring, compared to regular tectonic activity. These differences also successfully discriminate seismicity within the Coso and Salton Sea geothermal fields in California before and after the expansion of geothermal production during the 1980s.

  6. Fragility Analysis Methodology for Degraded Structures and Passive Components in Nuclear Power Plants - Illustrated using a Condensate Storage Tank

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nie, J.; Braverman, J.; Hofmayer, C.

    2010-06-30

    The Korea Atomic Energy Research Institute (KAERI) is conducting a five-year research project to develop a realistic seismic risk evaluation system which includes the consideration of aging of structures and components in nuclear power plants (NPPs). The KAERI research project includes three specific areas that are essential to seismic probabilistic risk assessment (PRA): (1) probabilistic seismic hazard analysis, (2) seismic fragility analysis including the effects of aging, and (3) plant seismic risk analysis. Since 2007, Brookhaven National Laboratory (BNL) has entered into a collaboration agreement with KAERI to support its development of seismic capability evaluation technology for degraded structures and components. The collaborative research effort is intended to continue over a five-year period. The goal of this endeavor is to assist KAERI in developing seismic fragility analysis methods that consider the potential effects of age-related degradation of structures, systems, and components (SSCs). The research results of this multi-year collaboration will be utilized as input to seismic PRAs. In the Year 1 scope of work, BNL collected and reviewed degradation occurrences in US NPPs and identified important aging characteristics needed for the seismic capability evaluations. This information is presented in the Annual Report for the Year 1 Task, identified as BNL Report-81741-2008 and also designated as KAERI/RR-2931/2008. That report presents results of the statistical and trending analysis of this data and compares the results to prior aging studies. In addition, the report provides a description of current U.S. regulatory requirements, regulatory guidance documents, generic communications, industry standards and guidance, and past research related to aging degradation of SSCs.
    In the Year 2 scope of work, BNL carried out a research effort to identify and assess degradation models for the long-term behavior of dominant materials that are determined to be risk-significant to NPPs. Multiple models have been identified for concrete, carbon and low-alloy steel, and stainless steel. These models are documented in the Annual Report for the Year 2 Task, identified as BNL Report-82249-2009 and also designated as KAERI/TR-3757/2009. This report describes the research effort performed by BNL for the Year 3 scope of work. The objective is for BNL to develop the seismic fragility capacity for a condensate storage tank under various degradation scenarios. The conservative deterministic failure margin method has been utilized for the undegraded case and has been modified to accommodate the degraded cases. A total of five seismic fragility analysis cases are described: (1) the undegraded case, (2) degraded stainless tank shell, (3) degraded anchor bolts, (4) anchorage concrete cracking, and (5) a combination of the three degradation scenarios. Insights from these fragility analyses are also presented.
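    Seismic fragility in such analyses is conventionally expressed as a lognormal curve, and degradation can then be represented, for example, as a reduced median capacity. A minimal sketch under that assumption (all numbers are illustrative, not values from the report):

```python
import math

def fragility(a, a_median, beta_c):
    """Lognormal fragility: probability of failure at ground-motion level a,
    given median capacity a_median and composite log-standard-deviation beta_c.
    The standard normal CDF is evaluated via the error function."""
    z = math.log(a / a_median) / beta_c
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

p_undegraded = fragility(0.5, a_median=1.0, beta_c=0.4)
p_degraded = fragility(0.5, a_median=0.7, beta_c=0.4)  # e.g. degraded anchor bolts
```

    At a = a_median the failure probability is 0.5 by construction; lowering the median capacity shifts the entire curve toward higher failure probabilities at every ground-motion level.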

  7. Landslides triggered by the 12 January 2010 Port-au-Prince, Haiti, Mw = 7.0 earthquake: visual interpretation, inventory compiling, and spatial distribution statistical analysis

    NASA Astrophysics Data System (ADS)

    Xu, C.; Shyu, J. B. H.; Xu, X.

    2014-07-01

    The 12 January 2010 Port-au-Prince, Haiti, earthquake (Mw = 7.0) triggered tens of thousands of landslides. The purpose of this study is to investigate the correlations of the occurrence of landslides and the thicknesses of their erosion with topographic, geologic, and seismic parameters. A total of 30 828 landslides triggered by the earthquake covered a total area of 15.736 km², distributed in an area of more than 3000 km², and the volume of landslide accumulation materials is estimated to be about 29 700 000 m³. These landslides are of various types, mostly shallow disrupted landslides and rock falls, but they also include coherent deep-seated landslides and rock slides. The landslides were delineated using pre- and post-earthquake high-resolution satellite images. Spatial distribution maps and contour maps of landslide number density, landslide area percentage, and landslide erosion thickness were constructed in order to analyze the spatial distribution patterns of co-seismic landslides. Statistics of the size distribution and morphometric parameters of co-seismic landslides were compiled and compared with other earthquake events in the world. Four proxies of co-seismic landslide abundance, including landslide centroid number density (LCND), landslide top number density (LTND), landslide area percentage (LAP), and landslide erosion thickness (LET), were used to correlate co-seismic landslides with various environmental parameters. These parameters include elevation, slope angle, slope aspect, slope curvature, topographic position, distance from drainages, lithology, distance from the epicenter, distance from the Enriquillo-Plantain Garden fault, distance along the fault, and peak ground acceleration (PGA). A comparison of these parameters shows that slope angle has the strongest impact on co-seismic landslide occurrence.
Our co-seismic landslide inventory is much more detailed than other inventories in several previous publications. Therefore, we carried out comparisons of inventories of landslides triggered by the Haiti earthquake with other published results and proposed possible reasons for any differences. We suggest that the empirical functions between earthquake magnitude and co-seismic landslides need to be updated on the basis of the abundant and more complete co-seismic landslide inventories recently available.

  8. Landslides triggered by the 12 January 2010 Mw 7.0 Port-au-Prince, Haiti, earthquake: visual interpretation, inventory compiling and spatial distribution statistical analysis

    NASA Astrophysics Data System (ADS)

    Xu, C.; Shyu, J. B. H.; Xu, X.-W.

    2014-02-01

The 12 January 2010 Port-au-Prince, Haiti, earthquake (Mw 7.0) triggered tens of thousands of landslides. The purpose of this study is to investigate the correlations of landslide occurrence and erosion thickness with topographic factors, seismic parameters, and distance from roads. A total of 30,828 landslides triggered by the earthquake covered a total area of 15.736 km2, distributed over an area of more than 3000 km2, and the volume of landslide accumulation materials is estimated to be about 29,700,000 m3. These landslides are of various types, mostly shallow disrupted landslides and rock falls, but they also include coherent deep-seated landslides and rock slides. The landslides were delineated using pre- and post-earthquake high-resolution satellite images. Spatial distribution maps and contour maps of landslide number density, landslide area percentage, and landslide erosion thickness were constructed in order to analyze the spatial distribution patterns of co-seismic landslides. Statistics of the size distribution and morphometric parameters of co-seismic landslides were compiled and compared with other earthquake events around the world. Four proxies of co-seismic landslide abundance, namely landslide centroid number density (LCND), landslide top number density (LTND), landslide area percentage (LAP), and landslide erosion thickness (LET), were used to correlate co-seismic landslides with various landslide-controlling parameters. These controlling parameters include elevation, slope angle, slope aspect, slope curvature, topographic position, distance from drainages, lithology, distance from the epicenter, distance from the Enriquillo-Plantain Garden fault, distance along the fault, and peak ground acceleration (PGA). A comparison of these parameters shows that slope angle has the strongest influence on co-seismic landslide occurrence. 
Our co-seismic landslide inventory is much more detailed than the inventories in several previous publications. We therefore compared inventories of landslides triggered by the Haiti earthquake with other published results and proposed possible reasons for any differences. We suggest that the empirical functions between earthquake magnitude and co-seismic landslides need to be updated on the basis of the abundant and more complete co-seismic landslide inventories recently available.
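As a minimal illustration of one of the abundance proxies above, the landslide area percentage (LAP) can be computed on a regular grid. The coordinate system, cell size, and function name below are illustrative assumptions, not the authors' implementation:

```python
import math

def landslide_area_percentage(xs, ys, areas, cell=1000.0):
    """Grid-based landslide area percentage (LAP): the summed landslide
    area in each square cell divided by the cell area, in percent.
    xs, ys: landslide centroid coordinates in metres (projected);
    areas: landslide areas in square metres."""
    totals = {}
    for x, y, a in zip(xs, ys, areas):
        # assign each landslide to the grid cell containing its centroid
        key = (math.floor(x / cell), math.floor(y / cell))
        totals[key] = totals.get(key, 0.0) + a
    # convert summed areas to a percentage of the cell area
    return {k: 100.0 * v / (cell * cell) for k, v in totals.items()}
```

The other density proxies (LCND, LTND) follow the same gridding pattern but count points instead of summing areas.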

  9. Clusters of Earthquakes In The Southern of Iberian Peninsula

    NASA Astrophysics Data System (ADS)

    Posadas, A. M.; Luzón, F.; Vidal, F.

The southern part of the Iberian Peninsula forms part of the western border of the Eurasia-Africa plate boundary. This area is characterized by the occurrence of earthquakes of moderate magnitude (maximum magnitudes ranging from 4.5 to 5.5). In terms of seismic activity, this region is the most active in the Iberian Peninsula. Until the early 1980s, only the National Seismic Network, belonging to the National Geographic Institute, monitored the activity in the south of the Iberian Peninsula. Since 1983, the Andalusian Seismic Network, belonging to the Andalusian Institute of Geophysics and Seismic Disaster Prevention, has recorded the microseismicity of the area. The earthquake catalogue used here belongs to the Andalusian Institute of Geophysics and Seismic Disaster Prevention and contains more than 20,000 events recorded from 1985 to 2001. Today, after 20 years of recorded seismic activity, a statistical analysis of the catalogue is meaningful. In this paper we present a first approach to the clustering properties of the seismicity in the south of the Iberian Peninsula. The analysis starts with the study of the temporal and spatial clustering properties of southern Iberian Peninsula seismicity to demonstrate, using the fractal dimension of the temporal earthquake distribution and the Morisita index of the spatial distribution of earthquakes, that this seismicity is characterized by a tendency to form earthquake clusters, both spatial and temporal. As an example, five seismogenic areas of the zone are analyzed (Adra-Berja, Agron, Alboran, Antequera and Loja). 
This study of the series determines the b parameter of the Gutenberg-Richter law (which characterizes the energetic relaxation of events), the p parameter of Omori's law (which characterizes the temporal relaxation of aftershocks), and the fractal dimension of the spatial distribution of earthquakes (to characterize the geometry of the seismogenic zone).
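The b parameter of the Gutenberg-Richter law mentioned above is commonly estimated with the Aki/Utsu maximum-likelihood formula. The sketch below assumes a catalogue binned at width dm with completeness magnitude mc; it is a generic estimator, not the authors' code:

```python
import math

def b_value_mle(mags, mc, dm=0.1):
    """Aki/Utsu maximum-likelihood estimate of the Gutenberg-Richter
    b-value: b = log10(e) / (mean(M) - (mc - dm/2)).
    mags: event magnitudes; mc: completeness magnitude;
    dm: magnitude bin width (the -dm/2 term corrects for binning)."""
    m = [x for x in mags if x >= mc]  # use only complete part of catalogue
    mean_m = sum(m) / len(m)
    return math.log10(math.e) / (mean_m - (mc - dm / 2.0))
```

For a catalogue with b = 1, the mean magnitude above completeness sits about 0.43 units above mc, which the assertion below reproduces.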

  10. Spots of Seismic Danger Extracted by Properties of Low-Frequency Seismic Noise

    NASA Astrophysics Data System (ADS)

    Lyubushin, Alexey

    2013-04-01

A new method of seismic danger estimation is presented, based on properties of low-frequency seismic noise from broadband networks. Two statistics of the noise waveforms are considered: the multifractal singularity spectrum support width D and the minimum normalized entropy En of squared orthogonal wavelet coefficients. Maps of D and En are plotted in a moving time window. We refer to the regions extracted by low values of D and high values of En as "spots of seismic danger" (SSD). Mean values of D and En are strongly anti-correlated, which is why the statistics D and En extract the same SSD; nevertheless, considering them jointly is worthwhile because the two parameters are based on different approaches. The physical mechanism underlying the method is the consolidation of small blocks of the Earth's crust into a large one before a strong earthquake. As a consequence, the seismic noise no longer includes the spikes connected with mutual movements of small blocks, and the absence of irregular spikes leads to a decrease of D and an increase of the entropy En. The stability in space and size of an SSD provides estimates of the place and energy of a probable future earthquake, and the growth or shrinkage of an SSD, together with the minimum or maximum values of D and En within it, allows the trend of seismic danger to be estimated. The method is illustrated by the analysis of seismic noise from the broadband seismic network F-net in Japan [1-5]. A statistically significant decrease of D allowed a hypothesis to be formulated in mid-2008 that Japan was approaching a seismic catastrophe. The peculiarities of the correlation coefficient, estimated within a 1-year time window between median values of D and the generalized Hurst exponent, indicated that, starting from July 2010, Japan had entered a state of awaiting a strong earthquake [3]. 
The method extracted a huge SSD near Japan which includes the region of the future Tohoku mega-earthquake and the region of the Nankai Trough. The analysis of seismic noise after March 2011 indicates an increasing probability, starting from the middle of 2013, of a second mega-earthquake within the region of the Nankai Trough, which remains an SSD. References 1. Lyubushin A. Multifractal Parameters of Low-Frequency Microseisms // V. de Rubeis et al. (eds.), Synchronization and Triggering: from Fracture to Earthquake Processes, GeoPlanet: Earth and Planetary Sciences 1, DOI 10.1007/978-3-642-12300-9_15, Springer-Verlag Berlin Heidelberg, 2010, 388p., Chapter 15, pp. 253-272. http://www.springerlink.com/content/hj2l211577533261/ 2. Lyubushin A.A. Synchronization of multifractal parameters of regional and global low-frequency microseisms. European Geosciences Union General Assembly 2010, Vienna, 02-07 May 2010, Geophysical Research Abstracts, Vol. 12, EGU2010-696, 2010. http://meetingorganizer.copernicus.org/EGU2010/EGU2010-696.pdf 3. Lyubushin A.A. Synchronization phenomena of low-frequency microseisms. European Seismological Commission, 32nd General Assembly, September 06-10, 2010, Montpellier, France. Book of abstracts, p. 124, session ES6. http://alexeylyubushin.narod.ru/ESC-2010_Book_of_abstracts.pdf 4. Lyubushin A.A. Seismic Catastrophe in Japan on March 11, 2011: Long-Term Prediction on the Basis of Low-Frequency Microseisms. Izvestiya, Atmospheric and Oceanic Physics, 2011, Vol. 46, No. 8, pp. 904-921. http://www.springerlink.com/content/kq53j2667024w715/ 5. Lyubushin, A. Prognostic properties of low-frequency seismic noise. Natural Science, 4, 659-666. doi: 10.4236/ns.2012.428087. http://www.scirp.org/journal/PaperInformation.aspx?paperID=21656
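The normalized entropy En of squared orthogonal wavelet coefficients used above has a simple closed form once the wavelet transform has been applied. The sketch below takes the coefficients as given and is an illustrative assumption about the normalization, not the author's implementation:

```python
import math

def normalized_entropy(coeffs):
    """Normalized entropy of squared orthogonal wavelet coefficients:
    En = -sum(p_k * log(p_k)) / log(N), where p_k = c_k^2 / sum(c^2).
    En is 0 for a single spike and 1 for a perfectly flat distribution,
    so smooth spike-free noise yields a high En."""
    sq = [c * c for c in coeffs]
    total = sum(sq)
    p = [s / total for s in sq]  # normalized energy distribution
    n = len(p)
    return -sum(pk * math.log(pk) for pk in p if pk > 0) / math.log(n)
```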

  11. SOME APPLICATIONS OF SEISMIC SOURCE MECHANISM STUDIES TO ASSESSING UNDERGROUND HAZARD.

    USGS Publications Warehouse

McGarr, A.

    1984-01-01

Various measures of the seismic source mechanism of mine tremors, such as magnitude, moment, stress drop, apparent stress, and seismic efficiency, can be related directly to several aspects of the problem of determining the underground hazard arising from strong ground motion of large seismic events. First, the relation between the sum of seismic moments of tremors and the volume of stope closure caused by mining during a given period can be used, in conjunction with magnitude-frequency statistics and an empirical relation between moment and magnitude, to estimate the maximum possible size of tremor for a given mining situation. Second, it is shown that the 'energy release rate,' a commonly used parameter for predicting underground seismic hazard, may be misleading in that the importance of overburden stress, or depth, is overstated. Third, results involving the relation between peak velocity and magnitude, magnitude-frequency statistics, and the maximum possible magnitude are applied to the problem of estimating the frequency at which the design limits of certain underground support equipment are likely to be exceeded.

  12. Investigation of Pre-Earthquake Ionospheric Disturbances by 3D Tomographic Analysis

    NASA Astrophysics Data System (ADS)

    Yagmur, M.

    2016-12-01

Ionospheric variations before earthquakes are widely discussed phenomena in ionospheric studies, and clarifying their source and mechanism is highly important for earthquake forecasting. A good understanding of the mechanical and physical processes behind pre-seismic ionospheric anomalies, which might even be related to Lithosphere-Atmosphere-Ionosphere-Magnetosphere coupling, requires both statistical and 3D modeling analyses. For this purpose, we first investigated the relation between ionospheric TEC anomalies and potential source mechanisms such as space weather activity and lithospheric phenomena like positive surface electric charges. To distinguish their effects on ionospheric TEC, we focused on pre-seismically active days. We then analyzed statistical data for 54 earthquakes with M ≥ 6 between 2000 and 2013, as well as the 2011 Tohoku and the 2016 Kumamoto earthquakes in Japan. By comparing TEC anomalies with solar activity via the Dst index, we found 28 events that might be related to earthquake activity. Following the statistical analysis, we also investigated the lithospheric effect on TEC change on selected days. Among those days, we chose the 2011 Tohoku and the 2016 Kumamoto earthquakes as two case studies for which to produce 3D reconstructed images using a 3D tomography technique with neural networks. The results will be presented. Keywords: Earthquake, 3D Ionospheric Tomography, Positive and Negative Anomaly, Geomagnetic Storm, Lithosphere

  13. Estimation of background noise level on seismic station using statistical analysis for improved analysis accuracy

    NASA Astrophysics Data System (ADS)

    Han, S. M.; Hahm, I.

    2015-12-01

We evaluated the background noise level of seismic stations in order to collect high-quality observation data and produce accurate seismic information. In this study, the background noise level was determined using the PSD (Power Spectral Density) method of McNamara and Buland (2004). This method, which uses long-term data, is influenced not only by the innate electronic noise of the sensor and pulse waves produced during stabilization, but also by missing data, and it is affected at specific frequencies by irregular signals unrelated to site characteristics. It is difficult and inefficient to filter out such abnormal signals within an automated system. To solve these problems, we devised a method that extracts, at each period, the data that are normally distributed within 90 to 99% confidence intervals. The availability of the method was verified using 62 seismic stations with broadband and short-period sensors operated by the KMA (Korea Meteorological Administration). The evaluation standards were the NHNM (New High Noise Model) and NLNM (New Low Noise Model) published by the USGS (United States Geological Survey). These models were designed based on data from the western United States; however, the Korean Peninsula, surrounded by the ocean on three sides, has a complicated geological structure and a high population density. We therefore re-designed an appropriate model for the Korean Peninsula from the statistically combined results. An important feature is that the secondary-microseism peak appears at a higher frequency band. Acknowledgements: This research was carried out as a part of "Research for the Meteorological and Earthquake Observation Technology and Its Application" supported by the 2015 National Institute of Meteorological Research (NIMR) in the Korea Meteorological Administration.
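The extraction of data within a central confidence band at each period, as described above, can be sketched with a simple nearest-rank percentile filter. This is an illustrative approximation (the actual processing fits a normal distribution per period), not the KMA code:

```python
import math

def central_band(values, confidence=0.95):
    """Keep only values inside the central `confidence` band, e.g. between
    the 2.5th and 97.5th nearest-rank percentiles for a 95% band.
    In PSD screening, this discards outlier power levels caused by
    glitches, gaps, or instrument transients at a given period."""
    s = sorted(values)
    n = len(s)
    lo_q = (1.0 - confidence) / 2.0
    hi_q = 1.0 - lo_q
    # nearest-rank percentile bounds
    lo = s[max(0, math.ceil(lo_q * n) - 1)]
    hi = s[min(n - 1, math.ceil(hi_q * n) - 1)]
    return [v for v in values if lo <= v <= hi]
```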

  14. Mechanical and Statistical Evidence of Human-Caused Earthquakes - A Global Data Analysis

    NASA Astrophysics Data System (ADS)

    Klose, C. D.

    2012-12-01

The causality of large-scale geoengineering activities and the occurrence of earthquakes with magnitudes of up to M = 8 is discussed, and mechanical and statistical evidence is provided. The earthquakes were caused by artificial water reservoir impoundments, underground and open-pit mining, coastal management, hydrocarbon production, and fluid injections/extractions. The presented global earthquake catalog has recently been published in the Journal of Seismology and is available to the public at www.cdklose.com. The data show evidence that geomechanical relationships exist, with statistical significance, between a) seismic moment magnitudes of observed earthquakes, b) anthropogenic mass shifts on the Earth's crust, and c) lateral distances of the earthquake hypocenters to the locations of the mass shifts. Research findings depend on uncertainties, in particular in source parameter estimations of seismic events before instrumental recording. First analyses, however, indicate that small- to medium-size earthquakes (M6) tend to be triggered. The rupture propagation of triggered events might be dominated by pre-existing tectonic stress conditions. Beyond event-specific evidence, large earthquakes such as China's 2008 M7.9 Wenchuan earthquake fall into a global pattern and cannot be considered outliers or simply seen as an act of God. Observations also indicate that every second seismic event tends to occur after a decade, while pore pressure diffusion seems to play a role only when fluids are injected deep underground. The chance of an earthquake nucleating after two or 20 years near an area with a significant mass shift is 25% or 75%, respectively. Moreover, the causative effects of seismic activities highly depend on the tectonic stress regime in the Earth's crust in which the geoengineering takes place.

  15. Catalog of earthquake hypocenters at Alaskan volcanoes: January 1 through December 31, 2002

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.; Tytgat, Guy; Moran, Seth C.; Sánchez, John; Estes, Steve; McNutt, Stephen R.; Paskievitch, John

    2003-01-01

    The Alaska Volcano Observatory (AVO), a cooperative program of the U.S. Geological Survey, the Geophysical Institute of the University of Alaska Fairbanks, and the Alaska Division of Geological and Geophysical Surveys, has maintained seismic monitoring networks at historically active volcanoes in Alaska since 1988 (Power and others, 1993; Jolly and others, 1996; Jolly and others, 2001; Dixon and others, 2002). The primary objectives of this program are the seismic monitoring of active, potentially hazardous, Alaskan volcanoes and the investigation of seismic processes associated with active volcanism. This catalog presents the basic seismic data and changes in the seismic monitoring program for the period January 1, 2002 through December 31, 2002. Appendix G contains a list of publications pertaining to seismicity of Alaskan volcanoes based on these and previously recorded data. The AVO seismic network was used to monitor twenty-four volcanoes in real time in 2002. These include Mount Wrangell, Mount Spurr, Redoubt Volcano, Iliamna Volcano, Augustine Volcano, Katmai Volcanic Group (Snowy Mountain, Mount Griggs, Mount Katmai, Novarupta, Trident Volcano, Mount Mageik, Mount Martin), Aniakchak Crater, Mount Veniaminof, Pavlof Volcano, Mount Dutton, Isanotski Peaks, Shishaldin Volcano, Fisher Caldera, Westdahl Peak, Akutan Peak, Makushin Volcano, Great Sitkin Volcano, and Kanaga Volcano (Figure 1). Monitoring highlights in 2002 include an earthquake swarm at Great Sitkin Volcano in May-June; an earthquake swarm near Snowy Mountain in July-September; low frequency (1-3 Hz) tremor and long-period events at Mount Veniaminof in September-October and in December; and continuing volcanogenic seismic swarms at Shishaldin Volcano throughout the year. 
Instrumentation and data acquisition highlights in 2002 were the installation of a subnetwork on Okmok Volcano, the establishment of telemetry for the Mount Veniaminof subnetwork, and the change in the data acquisition system to an EARTHWORM detection system. AVO located 7430 earthquakes during 2002 in the vicinity of the monitored volcanoes. This catalog includes: (1) a description of instruments deployed in the field and their locations; (2) a description of earthquake detection, recording, analysis, and data archival systems; (3) a description of velocity models used for earthquake locations; (4) a summary of earthquakes located in 2002; and (5) an accompanying UNIX tar-file with a summary of earthquake origin times, hypocenters, magnitudes, and location quality statistics; daily station usage statistics; and all HYPOELLIPSE files used to determine the earthquake locations in 2002.

  16. Evaluating the Relationship Between Seismicity and Subsurface Well Activity in Utah

    NASA Astrophysics Data System (ADS)

    Lajoie, L. J.; Bennett, S. E. K.

    2014-12-01

Understanding the relationship between seismicity and subsurface well activity is crucial to evaluating the seismic hazard of transient, non-tectonic seismicity. Several studies have demonstrated correlations between increased frequency of earthquake occurrence and the injection/production of fluids (e.g. oil, water) in nearby subsurface wells in intracontinental settings (e.g. Arkansas, Colorado, Ohio, Oklahoma, Texas). Here, we evaluate all earthquake magnitudes for the past 20-30 years across the diverse seismotectonic settings of Utah. We explore earthquakes within 5 km of oil and gas wells and subsequent to their completion dates. We compare seismicity rates prior to well establishment with rates after well establishment in an attempt to discriminate between natural and anthropogenic earthquakes in areas of naturally high background seismicity. In a few central Utah locations, we find that the frequency of shallow (0-10 km) earthquakes increased subsequent to completion of gas wells within 5 km, and at depths broadly similar to bottom-hole depths. However, these regions typically correspond to mining regions of the Wasatch Plateau, complicating our ability to distinguish between earthquakes related to either well activity or mining. We calculate earthquake density and well density and compare their ratio (earthquakes per area/wells per area) with several published metrics of seismotectonic setting. Areas with a higher earthquake-well ratio are located in relatively high-strain regions (determined from GPS) associated with the Intermountain Seismic Belt, but cannot be attributed to any specific Quaternary-active fault. Additionally, higher-ratio areas do not appear to coincide with anomalously high heat flow values, where rocks are typically thermally weakened. Incorporation of timing and volume data for well injection/production would allow for more robust temporal statistical analysis and hazard analysis.
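The selection of earthquakes within 5 km of a well and after its completion date can be sketched as a distance-and-time filter. The record layout and field names below are assumptions for illustration, not the authors' data model:

```python
import math
from datetime import datetime

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def events_near_well(events, well, radius_km=5.0):
    """Select earthquakes within radius_km of a well and after its completion.
    events: dicts with 'lat', 'lon', 'time'; well: dict with 'lat', 'lon',
    'completed' (all times as datetime objects)."""
    return [e for e in events
            if e["time"] >= well["completed"]
            and haversine_km(e["lat"], e["lon"], well["lat"], well["lon"]) <= radius_km]
```

Comparing event counts from this filter before and after the completion date gives the rate comparison described in the abstract.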

  17. Tempo-spatial analysis of Fennoscandian intraplate seismicity

    NASA Astrophysics Data System (ADS)

    Roberts, Roland; Lund, Björn

    2017-04-01

Coupled spatial-temporal patterns of the occurrence of earthquakes in Fennoscandia are analysed using non-parametric methods. The occurrence of larger events is unambiguously and very strongly temporally clustered, with major implications for the assessment of seismic hazard in areas such as Fennoscandia. In addition, there is a clear pattern of geographical migration of activity. Data from the Swedish National Seismic Network and a collated international catalogue are analysed. Results show consistent patterns on different spatial and temporal scales. We are currently investigating these patterns in order to assess the statistical significance of the tempo-spatial patterns, and to what extent they may be consistent with stress transfer mechanisms such as Coulomb stress and pore fluid migration. Indications are that some further mechanism is necessary to explain the data, perhaps related to post-glacial uplift, which is up to 1 cm/year.

  18. Assessment of the Seismic Risk in the City of Yerevan and its Mitigation by Application of Innovative Seismic Isolation Technologies

    NASA Astrophysics Data System (ADS)

    Melkumyan, Mikayel G.

    2011-03-01

    It is obvious that the problem of precise assessment and/or analysis of seismic hazard (SHA) is quite a serious issue, and seismic risk reduction considerably depends on it. It is well known that there are two approaches in seismic hazard analysis, namely, deterministic (DSHA) and probabilistic (PSHA). The latter utilizes statistical estimates of earthquake parameters. However, they may not exist in a specific region, and using PSHA it is difficult to take into account local aspects, such as specific regional geology and site effects, with sufficient precision. For this reason, DSHA is preferable in many cases. After the destructive 1988 Spitak earthquake, the SHA of the territory of Armenia has been revised and increased. The distribution pattern for seismic risk in Armenia is given. Maximum seismic risk is concentrated in the region of the capital, the city of Yerevan, where 40% of the republic's population resides. We describe the method used for conducting seismic resistance assessment of the existing reinforced concrete (R/C) buildings. Using this assessment, as well as GIS technology, the coefficients characterizing the seismic risk of destruction were calculated for almost all buildings of Yerevan City. The results of the assessment are presented. It is concluded that, presently, there is a particularly pressing need for strengthening existing buildings. We then describe non-conventional approaches to upgrading the earthquake resistance of existing multistory R/C frame buildings by means of Additional Isolated Upper Floor (AIUF) and of existing stone and frame buildings by means of base isolation. In addition, innovative seismic isolation technologies were developed and implemented in Armenia for construction of new multistory multifunctional buildings. The advantages of these technologies are listed in the paper. 
It is worth noting that the aforementioned technologies were successfully applied for retrofitting an existing 100-year-old bank building in Irkutsk (Russia), for retrofit design of an existing 177-year-old municipality building in Iasi (Romania) and for construction of a new clinic building in Stepanakert (Nagorno Karabakh). Short descriptions of these projects are presented. Since 1994 the total number of base and roof isolated buildings constructed, retrofitted or under construction in Armenia, has reached 32. Statistics of seismically isolated buildings are given in the paper. The number of base isolated buildings per capita in Armenia is one of the highest in the world. In Armenia, for the first time in history, retrofitting of existing buildings by base isolation was carried out without interruption in the use of the buildings. The description of different base isolated buildings erected in Armenia, as well as the description of the method of retrofitting of existing buildings which is patented in Armenia (M. G. Melkumyan, patent of the Republic of Armenia No. 579), are also given in the paper.

  19. Some Probabilistic and Statistical Properties of the Seismic Regime of Zemmouri (Algeria) Seismoactive Zone

    NASA Astrophysics Data System (ADS)

    Baddari, Kamel; Bellalem, Fouzi; Baddari, Ibtihel; Makdeche, Said

    2016-10-01

Statistical tests have been used to fit a distribution function to the Zemmouri seismic data. The Pareto law has been used and the probabilities of various expected earthquakes were computed. A mathematical expression giving the quantiles was established, and the limiting law of extreme values confirmed the accuracy of the adjustment method. Using the moment magnitude scale, a probabilistic model was built to predict the occurrence of strong earthquakes. The seismic structure has been characterized by the slope of the recurrence plot γ, the fractal dimension D, the concentration parameter Ksr, and the Hurst exponents Hr and Ht. The values of D, γ, Ksr, Hr, and Ht diminished many months before the principal seismic shock (M = 6.9) of the studied seismoactive zone occurred. Three stages of deformation of the geophysical medium are manifested in the variation of the coefficient G% of the clustering of minor seismic events.
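The quantile expression for a Pareto-distributed quantity has a standard closed form, obtained by inverting the Pareto CDF F(x) = 1 - (xm/x)^alpha. The one-liner below is a generic sketch of that inversion, not the authors' derived expression:

```python
def pareto_quantile(p, xm, alpha):
    """Quantile function of the Pareto distribution with scale xm > 0 and
    shape alpha > 0: Q(p) = xm * (1 - p) ** (-1 / alpha), for 0 <= p < 1.
    Inverts the CDF F(x) = 1 - (xm / x) ** alpha."""
    return xm * (1.0 - p) ** (-1.0 / alpha)
```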

  20. A Statistical Reappraisal in the Relationship between Global and Greek Seismic Activity

    NASA Astrophysics Data System (ADS)

    Liritzis, I.; Diagourtas, D.; Makropoulos, C.

    1995-01-01

For the period 1917-1987, Greek seismic activity exhibits a very significant positive correlation with the preceding global activity, with a time lag of 15 years. All of Greece, and the two characteristic areas into which we have separated it (Greece without the Arc, and the area of the Greek seismic Arc), appear to follow the global seismic activity with a time shift of 15 years. Moreover, an intrinsic interaction mechanism seems to exist between the Greek seismic arc and the rest of Greece, which may be deduced from their different behavior when correlated with the global activity, as well as from the correlation between themselves, where a very significant positive correlation has been found with a time lag of 3 years, with Greece without the Arc preceding. A quasi-periodic term of 30 years is also observed in these four seismic time series. Cross-correlation analysis of seismic time series, as shown, serves as a powerful tool to clarify the complicated space-time pattern of the worldwide mosaic of tectonic plate motions. The implications of the spring-block model of tectonic plate interaction are invoked, considering changes in the Earth's rotation rate as the triggering agent. Particular emphasis is given to the potential of such studies in earthquake prediction efforts, from local or regional scales to a global scale and vice versa.
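The lagged cross-correlation underlying the 15-year result above amounts to a Pearson correlation computed at a fixed offset between two series. A minimal stdlib sketch, with the sign convention stated in the docstring (series and sampling are illustrative, not the authors' data):

```python
import math

def lagged_corr(x, y, lag):
    """Pearson correlation between x[t] and y[t + lag]; a positive lag
    means x leads y by `lag` samples (e.g. 15 annual samples for a
    15-year lead of global over Greek activity)."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    elif lag < 0:
        x, y = x[-lag:], y[:lag]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

Scanning `lag` over a range and picking the maximum correlation reproduces the kind of lag estimate quoted in the abstract.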

  1. The sequentially discounting autoregressive (SDAR) method for on-line automatic seismic event detecting on long term observation

    NASA Astrophysics Data System (ADS)

    Wang, L.; Toshioka, T.; Nakajima, T.; Narita, A.; Xue, Z.

    2017-12-01

In recent years, more and more Carbon Capture and Storage (CCS) studies focus on seismicity monitoring. For the safety management of geological CO2 storage at Tomakomai, Hokkaido, Japan, an Advanced Traffic Light System (ATLS) combining different seismic information (magnitudes, phases, distributions, etc.) is proposed for injection control. The primary task for ATLS is seismic event detection in a long-term, sustained time series record. Because the time-varying signal-to-noise ratio (SNR) of a long-term record and the uneven energy distribution of seismic event waveforms increase the difficulty of automatic detection, an improved probabilistic autoregressive (AR) method for automatic seismic event detection is applied in this work. This algorithm, called sequentially discounting AR learning (SDAR), can identify effective seismic events in the time series through change point detection (CPD) on the seismic record. In this method, an anomalous signal (seismic event) is modeled as a change point in the time series (seismic record): the statistical model of the signal in the neighborhood of the event changes because of the seismic event occurrence. This means that SDAR aims to find the statistical irregularities of the record through CPD. SDAR has three advantages. 1. Noise robustness: SDAR does not use waveform attributes (such as amplitude, energy, or polarization) for signal detection, so it is an appropriate technique for low-SNR data. 2. Real-time estimation: when new data appear in the record, the probability distribution models are automatically updated by SDAR for on-line processing. 3. Discounting property: SDAR introduces a discounting parameter that down-weights older observations, making it a robust algorithm for non-stationary signal processing. 
With these three advantages, the SDAR method can handle non-stationary, time-varying long-term series and achieve real-time monitoring. Finally, we apply SDAR to a synthetic model and to Tomakomai Ocean Bottom Cable (OBC) baseline data to demonstrate the feasibility and advantages of our method.

  2. Seasonal variability in Tibetan seismicity 1991-2013

    NASA Astrophysics Data System (ADS)

    Randolph-Flagg, N. G.; Day, J.; Burgmann, R.; Manga, M.

    2013-12-01

    Seismicity in the High Himalaya in Nepal (Bollinger et al., GRL, 2007, Bettinelli et al., EPSL, 2008), the San Andreas fault near Parkfield, California (Christiansen et al., 2007), Mt. Hochstaufen in Germany (Hainzl et al., 2006), and some Cascade Range volcanoes (Christiansen et al., GRL, 2005; Saar and Manga, EPSL, 2003) shows seasonal modulation. From 1991 to 2013, seismicity throughout the ~500 km by ~1000 km Tibetan Plateau also appears to be modulated, with 66% more shallow (depth < 20 km) earthquakes in spring and fall than in summer and winter. This variation cannot be explained by seasonal changes in seismic network coverage or by triggering by (or occurrence of) large-magnitude earthquakes. Significant foreshocks and aftershocks of the 2008 M7.9 Wenchuan earthquake in Sichuan dominate the seismic record from 2008 to 2009, so those years are not considered in the statistical analysis. The Tibetan seismicity, although weaker, is very similar to the modulation observed in Nepal and in the locked section of the San Andreas fault at Parkfield. To explain this biannual signal, we assess the possible effects of hydrologic loading (and unloading), pore pressure diffusion, fault plane orientation, evapotranspiration, earth tides, and atmospheric pressure. The similarity in seasonal signals throughout the area suggests that many faults on the Tibetan Plateau are critically stressed and sensitive to small transient stresses.
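    A standard significance test for this kind of periodicity is the Schuster test (not necessarily the test used in this study): each event contributes a unit phasor at its phase within the candidate period, and the p-value is the probability that a random walk of N phasors attains the observed resultant length by chance. A sketch, assuming event times given as day-of-year:

```python
import math

def schuster_p(days, period=365.25):
    """Schuster test p-value for periodicity at a given period (in days).

    Each event time is mapped to a phase; under the null hypothesis of no
    modulation the summed phasor performs a random walk, and the p-value
    is exp(-R^2 / N). Small p indicates significant periodic modulation.
    """
    n = len(days)
    phases = [2 * math.pi * (d % period) / period for d in days]
    c = sum(math.cos(p) for p in phases)
    s = sum(math.sin(p) for p in phases)
    return math.exp(-(c * c + s * s) / n)
```

    A biannual signal like the one described here would be tested with period ≈ 182.6 days rather than a full year.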

  3. The smart cluster method. Adaptive earthquake cluster identification and analysis in strong seismic regions

    NASA Astrophysics Data System (ADS)

    Schaefer, Andreas M.; Daniell, James E.; Wenzel, Friedemann

    2017-07-01

    Earthquake clustering is an essential part of almost any statistical analysis of spatial and temporal properties of seismic activity. The nature of earthquake clusters and subsequent declustering of earthquake catalogues plays a crucial role in determining the magnitude-dependent earthquake return period and its respective spatial variation for probabilistic seismic hazard assessment. This study introduces the Smart Cluster Method (SCM), a new methodology to identify earthquake clusters, which uses an adaptive point process for spatio-temporal cluster identification. It utilises the magnitude-dependent spatio-temporal earthquake density to adjust the search properties, subsequently analyses the identified clusters to determine directional variation and adjusts its search space with respect to directional properties. In the case of rapid subsequent ruptures like the 1992 Landers sequence or the 2010-2011 Darfield-Christchurch sequence, a reclassification procedure is applied to disassemble subsequent ruptures using near-field searches, nearest neighbour classification and temporal splitting. The method is capable of identifying and classifying earthquake clusters in space and time. It has been tested and validated using earthquake data from California and New Zealand. A total of more than 1500 clusters have been found in both regions since 1980 with Mmin = 2.0. Utilising the knowledge of cluster classification, the method has been adjusted to provide an earthquake declustering algorithm, which has been compared to existing methods. Its performance is comparable to established methodologies. The analysis of earthquake clustering statistics leads to various new and updated correlation functions, e.g. for ratios between mainshock and strongest aftershock and general aftershock activity metrics.
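    The SCM itself adapts its search windows to the local, magnitude-dependent event density; the skeleton of any such cluster identifier, though, is a spatio-temporal linking pass. A deliberately simplified sketch with fixed (hypothetical) windows:

```python
import math

def link_clusters(events, t_win=10.0, r_win=20.0):
    """Greedy spatio-temporal cluster linking (simplified sketch).

    events: time-sorted list of (time_days, x_km, y_km); an event joins
    the first existing cluster that has a member within t_win days and
    r_win km, otherwise it starts a new cluster. Returns one cluster id
    per event. SCM instead adapts these windows to the magnitude-dependent
    local density and reshapes the search region along the cluster's
    dominant orientation.
    """
    clusters = []   # each cluster: list of (t, x, y)
    labels = []
    for t, x, y in events:
        hit = None
        for cid, members in enumerate(clusters):
            if any(t - t0 <= t_win and math.hypot(x - x0, y - y0) <= r_win
                   for t0, x0, y0 in members):
                hit = cid
                break
        if hit is None:
            clusters.append([(t, x, y)])
            labels.append(len(clusters) - 1)
        else:
            clusters[hit].append((t, x, y))
            labels.append(hit)
    return labels
```

    A full implementation would also merge clusters bridged by a single event; this sketch ignores that case.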

  4. Seismo-Ionospheric Coupling as Intensified EIA Observed by Satellite Electron Density and GPS-TEC Data

    NASA Astrophysics Data System (ADS)

    Ryu, K.; Jangsoo, C.; Kim, S. G.; Jeong, K. S.; Parrot, M.; Pulinets, S. A.; Oyama, K. I.

    2014-12-01

    Examples of intensified EIA features temporally and spatially related to large earthquakes observed by satellites and GPS-TEC are introduced. The precursory, concurrent, and ex-post enhancements of EIA represented by the equatorial electron density, which are thought to be related to the M8.7 Northern Sumatra earthquake of March 2005, the M8.0 Pisco earthquake of August 2007, and the M7.9 Wenchuan earthquake of 12 May 2008, are presented together with space weather conditions. Based on the case studies, a statistical analysis of the ionospheric electron density data measured by the Detection of Electro-Magnetic Emissions Transmitted from Earthquake Regions satellite (DEMETER) over the period 2005-2010 was carried out in order to investigate the correlation between seismic activity and equatorial plasma density variations. To simplify the analysis, three equatorial regions with frequent earthquakes were selected, and one-dimensional time series analyses between the daily seismic activity indices and the EIA intensity indices were performed for each region, excluding possible effects of geomagnetic and solar activity. The statistically significant values of the lagged cross-correlation function, particularly in the region with minimal effects of longitudinal asymmetry, indicate that some of the very large earthquakes with M > 7.0 in the low-latitude region can be accompanied by observable seismo-ionospheric coupling phenomena in the form of EIA enhancements, even though seismic activity is not the most significant driver of equatorial ionospheric evolution. The physical mechanisms of seismo-ionospheric coupling that could explain the observations and the possibility of earthquake prediction using EIA intensity variations are discussed.
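    The core statistical tool here, a lagged cross-correlation between a daily seismicity index and a daily EIA index, can be sketched as follows (illustrative only; the variable names are hypothetical):

```python
import math

def pearson(u, v):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    su = math.sqrt(sum((a - mu) ** 2 for a in u))
    sv = math.sqrt(sum((b - mv) ** 2 for b in v))
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / (su * sv)

def lagged_xcorr(driver, response, max_lag):
    """Pearson correlation of two equal-length daily series at integer lags.

    Positive lag k correlates driver[t] with response[t + k], i.e. the
    response following the driver by k days; the peak lag indicates
    whether the coupling is precursory, concurrent, or ex-post.
    """
    out = {}
    for k in range(-max_lag, max_lag + 1):
        if k >= 0:
            u, v = driver[:len(driver) - k], response[k:]
        else:
            u, v = driver[-k:], response[:len(response) + k]
        out[k] = pearson(u, v)
    return out
```
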

  5. Monitoring Instrument Performance in Regional Broadband Seismic Network Using Ambient Seismic Noise

    NASA Astrophysics Data System (ADS)

    Ye, F.; Lyu, S.; Lin, J.

    2017-12-01

    In the past ten years, the number of seismic stations has increased significantly, and regional seismic networks with advanced technology have gradually been developed all over the world. The resulting broadband data help to improve seismological research. It is important to monitor the performance of the broadband instruments in a new network over a long period of time to ensure the accuracy of seismic records. Here, we propose a method that uses ambient noise data in the period range 5-25 s to monitor instrument performance and check data quality in situ. The method is based on an analysis of amplitude and phase index parameters calculated from pairwise cross-correlations among three stations, which provides multiple references for reliable error estimates. Index parameters calculated daily during a two-year observation period are evaluated to identify stations with instrument response errors in near real time. During data processing, initial instrument responses are used in place of the available instrument responses to simulate instrument response errors, which are then used to verify our results. We also examine the feasibility of the method using data from USArray stations at different locations and analyze possible instrumental errors that result in time shifts, which are used to verify the method. Additionally, we show an application in which instrument response errors caused by pole-zero variations, when monitoring temporal variations in crustal properties, produce apparent velocity perturbations larger than the standard deviation. The results indicate that monitoring seismic instrument performance helps eliminate data pollution before analysis begins.
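    The three-station logic can be illustrated with a toy consistency check (the function and tolerance below are hypothetical, not the authors' index parameters): an instrument error at one station perturbs every cross-correlation pair containing that station, so with three stations the faulty one is the station common to the anomalous pairs.

```python
from collections import Counter

def flag_station(pair_shifts, tol=0.1):
    """Identify the likely faulty station from pairwise anomalies.

    pair_shifts: dict mapping a station pair, e.g. ("A", "B"), to the
    measured time-shift (s) of its noise cross-correlation relative to a
    reference. A response/clock error at one station shows up in every
    pair containing it, so the station appearing in >= 2 anomalous pairs
    is flagged; returns None if no station stands out.
    """
    anomalous = [pair for pair, dt in pair_shifts.items() if abs(dt) > tol]
    if not anomalous:
        return None
    counts = Counter(st for pair in anomalous for st in pair)
    station, hits = counts.most_common(1)[0]
    return station if hits >= 2 else None
```
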

  6. Structural interpretation of seismic data and inherent uncertainties

    NASA Astrophysics Data System (ADS)

    Bond, Clare

    2013-04-01

    Geoscience is perhaps unique in its reliance on incomplete datasets and in building knowledge from their interpretation. This interpretive basis is fundamental at all levels of the science, from the creation of a geological map to the interpretation of remotely sensed data. To teach and better understand the uncertainties in dealing with incomplete data, we need to understand the strategies that make individual practitioners effective interpreters. The nature of interpretation is such that the interpreter must use their cognitive abilities in analysing the data to propose a sensible solution that is consistent not only with the original data but also with other knowledge and understanding. In a series of experiments, Bond et al. (2007, 2008, 2011, 2012) investigated the strategies and pitfalls of expert and non-expert interpretation of seismic images. These studies used large numbers of participants to provide a statistically sound basis for analysis of the results. The outcome of these experiments showed that a wide variety of conceptual models were applied to single seismic datasets, highlighting not only spatial variations in fault placement but also whether interpreters thought the faults existed at all or agreed on their sense of movement. Further, statistical analysis suggests that the strategies an interpreter employs are more important than expert knowledge per se in developing successful interpretations: experts are successful because of their application of these techniques. In a new set of experiments we focus on a small number of experts to determine how they use their cognitive and reasoning skills in the interpretation of 2D seismic profiles. Live video and practitioner commentary were used to track the evolving interpretation and to gain insight into their decision processes.
The outputs of the study allow us to create an educational resource of expert interpretation through online video footage and commentary with associated further interpretation and analysis of the techniques and strategies employed. This resource will be of use to undergraduate, post-graduate, industry and academic professionals seeking to improve their seismic interpretation skills, develop reasoning strategies for dealing with incomplete datasets, and for assessing the uncertainty in these interpretations. Bond, C.E. et al. (2012). 'What makes an expert effective at interpreting seismic images?' Geology, 40, 75-78. Bond, C. E. et al. (2011). 'When there isn't a right answer: interpretation and reasoning, key skills for 21st century geoscience'. International Journal of Science Education, 33, 629-652. Bond, C. E. et al. (2008). 'Structural models: Optimizing risk analysis by understanding conceptual uncertainty'. First Break, 26, 65-71. Bond, C. E. et al., (2007). 'What do you think this is?: "Conceptual uncertainty" In geoscience interpretation'. GSA Today, 17, 4-10.

  7. From intuition to statistics in building subsurface structural models

    USGS Publications Warehouse

    Brandenburg, J.P.; Alpak, F.O.; Naruk, S.; Solum, J.

    2011-01-01

    Experts associated with the oil and gas exploration industry suggest that combining forward trishear models with stochastic global optimization algorithms allows a quantitative assessment of the uncertainty associated with a given structural model. The methodology is applied to incompletely imaged structures related to deepwater hydrocarbon reservoirs, and the results are compared with prior manual palinspastic restorations and borehole data. The methodology is also useful for extending structural interpretations into other areas of limited resolution, such as subsalt regions, and for extrapolating existing data into seismic data gaps. The technique can be used for rapid reservoir appraisal and potentially has other applications in seismic processing, well planning, and borehole stability analysis.

  8. Correlative weighted stacking for seismic data in the wavelet domain

    USGS Publications Warehouse

    Zhang, S.; Xu, Y.; Xia, J.; ,

    2004-01-01

    Horizontal stacking plays a crucial role in modern seismic data processing: it not only suppresses random noise and multiple reflections but also provides the foundational data for subsequent migration and inversion. However, a number of examples have shown that random noise in adjacent traces exhibits correlation and coherence. Both average stacking and weighted stacking based on the conventional correlation function can therefore produce false events caused by noise. Wavelet transforms and high-order statistics are very useful tools in modern signal processing: the multiresolution analysis of wavelet theory can decompose a signal on different scales, and high-order correlation functions can suppress correlated noise, against which the conventional correlation function is of no use. Based on the theory of wavelet transforms and high-order statistics, the high-order correlative weighted stacking (HOCWS) technique is presented in this paper. Its essence is to stack common-midpoint gathers after normal moveout correction using weights calculated from high-order correlation statistics in the wavelet domain. Synthetic examples demonstrate its advantages in improving the signal-to-noise (S/N) ratio and suppressing correlated random noise.
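    The idea behind weighted stacking can be sketched with a plain time-domain, second-order correlation weight; the paper's HOCWS additionally moves the weight computation into the wavelet domain and uses high-order correlation statistics to handle correlated noise:

```python
import math

def weighted_stack(traces):
    """Correlation-weighted stack of NMO-corrected traces (sketch).

    Each trace is weighted by its non-negative correlation with the plain
    mean stack (the pilot), so noisy or misaligned traces contribute less
    to the final stack.
    """
    n = len(traces[0])
    pilot = [sum(tr[i] for tr in traces) / len(traces) for i in range(n)]

    def corr(u, v):
        num = sum(a * b for a, b in zip(u, v))
        den = math.sqrt(sum(a * a for a in u) * sum(b * b for b in v))
        return num / den if den else 0.0

    w = [max(corr(tr, pilot), 0.0) for tr in traces]
    total = sum(w) or 1.0
    return [sum(wi * tr[i] for wi, tr in zip(w, traces)) / total
            for i in range(n)]
```
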

  9. Passive (Micro-) Seismic Event Detection by Identifying Embedded "Event" Anomalies Within Statistically Describable Background Noise

    NASA Astrophysics Data System (ADS)

    Baziw, Erick; Verbeek, Gerald

    2012-12-01

    Among engineers there is considerable interest in the real-time identification of "events" within time series data with a low signal-to-noise ratio. This is especially true for acoustic emission analysis, which is utilized to assess the integrity and safety of many structures and is also applied in the field of passive seismic monitoring (PSM). Here, an array of seismic receivers is used to acquire acoustic signals to monitor locations where seismic activity is expected: underground excavations, deep open pits and quarries, reservoirs into which fluids are injected or from which fluids are produced, permeable subsurface formations, or sites of large underground explosions. The most important element of PSM is event detection: the monitoring of seismic acoustic emissions is a continuous, real-time process which typically runs 24 h a day, 7 days a week, and therefore a PSM system with poor event detection can easily acquire terabytes of useless data as it does not identify crucial acoustic events. This paper outlines a new algorithm developed for this application, the so-called SEED™ (Signal Enhancement and Event Detection) algorithm. The SEED™ algorithm uses real-time Bayesian recursive estimation digital filtering techniques for PSM signal enhancement and event detection.
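    The SEED™ algorithm itself is built on Bayesian recursive filtering and is not reproduced here; for contrast, the classical STA/LTA detector that such methods aim to improve upon can be sketched in a few lines:

```python
def sta_lta(x, ns=20, nl=200):
    """Classical STA/LTA detector (the kind of baseline SEED improves on).

    For each sample, the ratio of short-term (ns samples) to long-term
    (nl samples) average energy; a trigger is declared where the ratio
    exceeds a chosen threshold. Window lengths here are illustrative.
    """
    energy = [v * v for v in x]
    out = [0.0] * len(x)
    for i in range(ns + nl, len(x)):
        sta = sum(energy[i - ns:i]) / ns
        lta = sum(energy[i - ns - nl:i - ns]) / nl
        out[i] = sta / lta if lta > 0 else 0.0
    return out
```

    On low signal-to-noise data the ratio degrades quickly, which is the regime where the recursive Bayesian approach is claimed to help.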

  10. Spatiotemporal distribution of Oklahoma earthquakes: Exploring relationships using a nearest-neighbor approach

    NASA Astrophysics Data System (ADS)

    Vasylkivska, Veronika S.; Huerta, Nicolas J.

    2017-07-01

    Determining the spatiotemporal characteristics of natural and induced seismic events offers the opportunity to gain new insights into why these events occur. Linking the seismicity characteristics with other geologic, geographic, natural, or anthropogenic factors could help to identify the causes and suggest mitigation strategies that reduce the risk associated with such events. The nearest-neighbor approach utilized in this work represents a practical first step toward identifying statistically correlated clusters of recorded earthquake events. A detailed study of the Oklahoma earthquake catalog's inherent errors, empirical model parameters, and model assumptions is presented. We found that the cluster analysis results are stable with respect to empirical parameters (e.g., fractal dimension) but sensitive to epicenter location errors and seismicity rates. Most critically, we show that the patterns in the distribution of earthquake clusters in Oklahoma are primarily defined by spatial relationships between events. This observation stands in stark contrast to California (also known for induced seismicity), where a comparable cluster distribution is defined by both spatial and temporal interactions between events. These results highlight the difficulty in understanding the mechanisms and behavior of induced seismicity but provide insights for future work.
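    The nearest-neighbor approach referred to here (in the spirit of Zaliapin-style cluster analysis) assigns each event a parent by minimizing a rescaled space-time-magnitude distance. A minimal sketch, with illustrative values for the b-value and fractal dimension:

```python
import math

def nearest_neighbor_parents(events, b=1.0, d=1.6):
    """Parent assignment via the rescaled distance
    eta = dt * r**d * 10**(-b * m_parent)   (Zaliapin-style sketch).

    events: time-sorted list of (time_yr, x_km, y_km, mag). Returns, for
    each event after the first, (eta, parent_index). Small eta marks a
    clustered (triggered) event; large eta marks background seismicity.
    b and d are catalog-dependent; the defaults here are illustrative.
    """
    out = []
    for j in range(1, len(events)):
        tj, xj, yj, _ = events[j]
        best = None
        for i in range(j):
            ti, xi, yi, mi = events[i]
            dt = tj - ti
            if dt <= 0:
                continue
            r = math.hypot(xj - xi, yj - yi) or 1e-3   # guard zero distance
            eta = dt * r ** d * 10.0 ** (-b * mi)
            if best is None or eta < best[0]:
                best = (eta, i)
        out.append(best)
    return out
```

    The bimodal distribution of eta (and of its separate time and space factors) is what separates clustered from background events in such studies.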

  11. Seismic signal and noise on Europa and how to use it

    NASA Astrophysics Data System (ADS)

    Panning, M. P.; Stähler, S. C.; Bills, B. G.; Castillo, J.; Huang, H. H.; Husker, A. L.; Kedar, S.; Lorenz, R. D.; Pike, W. T.; Schmerr, N. C.; Tsai, V. C.; Vance, S.

    2017-12-01

    Seismology is one of our best tools for detailing the interior structure of planetary bodies, and a seismometer is included in the baseline and threshold mission design for a potential Europa lander mission. Guiding mission design and planning for adequate science return, though, requires modeling of both the anticipated signal and noise. Assuming that ice seismicity on Europa follows the statistical properties observed in Earth catalogs, and scaling cumulative seismic moment release to lunar values, we simulate long seismic records and estimate background noise and peak signal amplitudes (Panning et al., 2017). This suggests that a sensitive instrument comparable to many broadband terrestrial instruments, or to the SP instrument from the InSight mission to Mars, will be able to record signals, while high-frequency geophones are likely inadequate. We extend this analysis to begin incorporating spatial and temporal variation due to the tidal cycle, which can help inform landing site selection. We also begin exploring how chaotic terrain at the bottom of the ice shell and inter-ice heterogeneities (i.e. internal melt structures) may affect predicted seismic observations, using 2D numerical seismic simulations. We also show some of the key seismic observations for determining the interior properties of Europa (Stähler et al., 2017). M. P. Panning, S. C. Stähler, H.-H. Huang, S. D. Vance, S. Kedar, V. C. Tsai, W. T. Pike, R. D. Lorenz, "Expected seismicity and the seismic noise environment of Europa," J. Geophys. Res., in revision, 2017. S. C. Stähler, M. P. Panning, S. D. Vance, R. D. Lorenz, M. van Driel, T. Nissen-Meyer, S. Kedar, "Seismic wave propagation in icy ocean worlds," J. Geophys. Res., in revision, 2017.
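    The moment-scaling step described here can be sketched as follows: magnitudes are drawn from a truncated Gutenberg-Richter distribution and accumulated until a prescribed moment budget is met (all parameter values below are placeholders, not the mission-study numbers):

```python
import math
import random

def gr_catalog(total_moment, b=1.0, m_min=0.0, m_max=5.0, rng=random):
    """Draw magnitudes from a truncated Gutenberg-Richter law until their
    cumulative seismic moment (M0 = 10**(1.5*m + 9.1) N*m) reaches a
    prescribed budget, e.g. an annual moment-release rate scaled from
    lunar seismicity. Sketch only; parameters are placeholders.
    """
    lo, hi = 10.0 ** (-b * m_min), 10.0 ** (-b * m_max)
    mags, released = [], 0.0
    while released < total_moment:
        u = rng.random()
        m = -math.log10(lo - u * (lo - hi)) / b   # inverse-CDF sampling
        mags.append(m)
        released += 10.0 ** (1.5 * m + 9.1)
    return mags
```

    Repeating such draws over simulated mission durations yields the synthetic long records from which background noise and peak amplitudes can be estimated.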

  12. Insight into subdecimeter fracturing processes during hydraulic fracture experiment in Äspö hard rock laboratory, Sweden

    NASA Astrophysics Data System (ADS)

    Kwiatek, Grzegorz; Martínez-Garzón, Patricia; Plenkers, Katrin; Leonhardt, Maria; Zang, Arno; Dresen, Georg; Bohnhoff, Marco

    2017-04-01

    We analyze the nano- and picoseismicity recorded during a hydraulic fracturing in-situ experiment performed in the Äspö Hard Rock Laboratory, Sweden. The fracturing experiment included six fracture stages driven by three different water injection schemes (continuous, progressive and pulse pressurization) and was performed inside a 28 m long, horizontal borehole located at 410 m depth. The fracturing process was monitored with two different seismic networks covering a wide frequency band between 0.01 Hz and 100,000 Hz, including broadband seismometers, geophones, high-frequency accelerometers and acoustic emission sensors. The combined seismic network allowed for detection and detailed analysis of seismicity with moment magnitudes MW < -4 (source sizes of approximately centimeter scale) that occurred solely during the hydraulic fracturing and refracturing stages. We relocated the seismicity catalog using the double-difference technique and calculated the source parameters (seismic moment, source size, stress drop, focal mechanism and seismic moment tensors). The physical characteristics of the induced seismicity are compared to the stimulation parameters and to the formation parameters of the site. The seismic activity varies significantly depending on stimulation strategy, with conventional, continuous stimulation being the most seismogenic. We find a systematic spatio-temporal migration of microseismic events (propagation away from and towards the wellbore injection interval) and temporal transitions in source mechanisms (opening - shearing - collapse), both controlled by changes in fluid injection pressure. The derived focal mechanism parameters are in accordance with the local stress field orientation and signify the reactivation of pre-existing rock flaws. The seismicity follows statistical and source scaling relations observed at different scales elsewhere, albeit at an extremely low level of seismic efficiency.

  13. Fractal analysis of earthquake swarms of Vogtland/NW-Bohemia intraplate seismicity

    NASA Astrophysics Data System (ADS)

    Mittag, Reinhard J.

    2003-03-01

    The special type of intraplate microseismicity with swarm-like occurrence of earthquakes within the Vogtland/NW-Bohemian region is analysed to reveal the nature and origin of the seismogenic regime. The long-term data set of continuous seismic monitoring since 1962, including more than 26000 events spanning about 5 units of local magnitude, provides a unique database for statistical investigations. Most earthquakes occur in narrow hypocentral volumes (clusters) within the lower part of the upper crust, but single events outside these spatial clusters are also observed. The temporal distribution of events is concentrated in clusters (swarms), which last from a few days to a few months, depending on their intensity. Since 1962, three strong swarms have occurred (1962, 1985/86, 2000), spanning two seismic cycles. The spatial clusters are distributed along a fault system of regional extent (Leipzig-Regensburger Störung), which is thought to act as the joint tectonic fracture zone for the whole seismogenic region. The seismicity is analysed by fractal analysis, suggesting unifractal behaviour and a uniform seismotectonic regime for the whole region. A tendency toward decreasing fractal dimension values is observed for the temporal distribution of earthquakes, indicating an increasing degree of temporal clustering from swarm to swarm. Following the idea of earthquake triggering by magma intrusions and related fluid and gas release into the tectonically pre-stressed parts of the crust, a steadily increased intensity of intrusion and/or fluid and gas release might account for this observation. Additionally, seismic parameters for the Vogtland/NW-Bohemia intraplate seismicity are compared with an analogous data set of mining-induced seismicity from a nearby mine at Lubin, Poland, and with synthetic data sets to evaluate the parameter estimation. Due to the different seismogenic regimes of tectonic and induced seismicity, significant differences between b-values and temporal dimension values are observed. Most significant for the intraplate seismicity are the relatively low fractal dimension values of the temporal distribution. This observation reflects the strong degree of temporal earthquake clustering, which might explain the episodic character of the earthquake swarms and supports the idea of push-like triggering of earthquake avalanches by intruding magma.
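    The temporal fractal dimension used in such analyses is typically the correlation dimension in the sense of Grassberger and Procaccia: the scaling exponent of the fraction of event pairs separated by less than a time r. A minimal two-scale estimate (real analyses fit the slope over many scales):

```python
import math
from itertools import combinations

def correlation_dimension(times, r1, r2):
    """Two-scale Grassberger-Procaccia estimate of the temporal
    correlation dimension: the slope of log C(r) vs log r between scales
    r1 < r2, where C(r) is the fraction of event pairs with |ti - tj| < r.
    Clustered (swarm-like) occurrence gives dimensions well below 1;
    uniform occurrence gives a dimension near 1.
    """
    n = len(times)

    def C(r):
        close = sum(1 for a, b in combinations(times, 2) if abs(a - b) < r)
        return 2.0 * close / (n * (n - 1))

    return (math.log(C(r2)) - math.log(C(r1))) / (math.log(r2) - math.log(r1))
```
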

  14. A pilot study of the Earthquake Precursors in the Southwest Peloponnes, Greece

    NASA Astrophysics Data System (ADS)

    Velez, A. P.; Tsinganos, K.; Karastathis, V. K.; Kafatos, M.; Ouzounov, D.; Papadopoulos, G. A.; Tselentis, A.; Eleftheriou, G.; Mouzakiotis, E.; Gika, F.; Aspiotis, T.; Liakopoulos, S.; Voulgaris, N.

    2016-12-01

    A seismic array based on the most up-to-date technology has recently been installed in the area of the southwestern Peloponnese, Greece, an area well known for its high seismic activity. The tectonic regime of the Hellenic arc has been responsible for many lethal earthquakes that caused considerable damage across the broader eastern Mediterranean area. The seismic array comprises nine 32-bit stations with broadband borehole seismometers. The seismogenic region monitored by the array is offshore, where earthquake locations suffer from poor azimuthal coverage and the stations of the national seismic network are very distant; the existing network therefore cannot effectively monitor the microseismicity. The new array has achieved detailed monitoring of small events, considerably lowering the magnitude of completeness. The detectability of microearthquakes has been drastically improved, permitting statistical assessment of earthquake sequences in the area. In parallel, the monitored seismicity is directly compared with radon measurements in the soil taken at three stations in the area. Radon measurements are performed indirectly by means of γ-ray spectrometry of its radioactive progenies 214Pb and 214Bi (emitting at 351 keV and 609 keV, respectively). NaI(Tl) detectors have been installed at 1 m depth at sites in the vicinity of faults, providing continuous real-time data. Local meteorological records for atmospheric corrections are also continuously collected. According to the Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) model, atmospheric thermal anomalies observed before strong events can be attributed to increased radon concentration. This is also supported by the statistical analysis of AVHRR/NOAA-18 satellite thermal infrared (TIR) daily records. A combined study of precursor signals is expected to provide a reliable assessment of their utility for short-term forecasting.

  15. Rigorous Approach in Investigation of Seismic Structure and Source Characteristicsin Northeast Asia: Hierarchical and Trans-dimensional Bayesian Inversion

    NASA Astrophysics Data System (ADS)

    Mustac, M.; Kim, S.; Tkalcic, H.; Rhie, J.; Chen, Y.; Ford, S. R.; Sebastian, N.

    2015-12-01

    Conventional approaches to inverse problems suffer from non-linearity and non-uniqueness in estimations of seismic structures and source properties. Estimated results and associated uncertainties are often biased by applied regularizations and additional constraints, which are commonly introduced to solve such problems. Bayesian methods, however, provide statistically meaningful estimations of models and their uncertainties constrained by data information. In addition, hierarchical and trans-dimensional (trans-D) techniques are inherently implemented in the Bayesian framework to account for the involved error statistics and model parameterizations, and, in turn, allow more rigorous estimation of both. Here, we apply Bayesian methods throughout the entire inference process to estimate seismic structures and source properties in Northeast Asia, including eastern China, the Korean peninsula, and the Japanese islands. Ambient noise analysis is first performed to obtain a base three-dimensional (3-D) heterogeneity model using continuous broadband waveforms from more than 300 stations. For the tomography of surface wave group and phase velocities in the 5-70 s band, we adopt a hierarchical and trans-D Bayesian inversion method using a Voronoi partition. The 3-D heterogeneity model is further improved by joint inversions of teleseismic receiver functions and dispersion data using a newly developed high-efficiency Bayesian technique. The obtained model is subsequently used to prepare 3-D structural Green's functions for the source characterization. A hierarchical Bayesian method for point source inversion using regional complete waveform data is applied to selected events from the region. The seismic structure and source characteristics, with rigorously estimated uncertainties from these novel Bayesian methods, provide enhanced monitoring and discrimination of seismic events in Northeast Asia.

  16. Probabilistic Seismic Hazard Assessment for Iraq Using Complete Earthquake Catalogue Files

    NASA Astrophysics Data System (ADS)

    Ameer, A. S.; Sharma, M. L.; Wason, H. R.; Alsinawi, S. A.

    2005-05-01

    Probabilistic seismic hazard analysis (PSHA) has been carried out for Iraq. The earthquake catalogue used in the present study covers the area between latitudes 29° and 38.5° N and longitudes 39° and 50° E and contains more than a thousand events for the period 1905-2000. The entire Iraq region has been divided into thirteen seismogenic sources based on their seismic characteristics, geological setting and tectonic framework. The completeness of the seismicity catalogue has been checked using the method proposed by Stepp (1972). The completeness analysis shows that the earthquake catalogue is not complete below Ms = 4.8 for all of Iraq and for seismic source zones S1, S4, S5, and S8, while completeness varies for the other seismic zones. A statistical treatment of completeness of the data file was carried out in each of the magnitude classes. The Frequency-Magnitude Distributions (FMD) for the study area, including all seismic source zones, were established, and the minimum magnitude of complete reporting (Mc) was then estimated. For the entire Iraq region, Mc was estimated at about Ms = 4.0, while S11 shows the lowest Mc of about Ms = 3.5 and the highest Mc, about Ms = 4.2, was observed for S4. The earthquake activity parameters (activity rate λ, b value, maximum regional magnitude mmax) as well as the mean return period (R) of earthquakes with magnitude m ≥ mmin, along with their probability of occurrence, have been determined for all thirteen seismic source zones of Iraq. The maximum regional magnitude mmax was estimated as 7.87 ± 0.86 for the whole of Iraq. The return period for magnitude 6.0 is largest for source zone S3, estimated at 705 years, while the smallest value, 9.9 years, is obtained for Iraq as a whole.
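    Return-period estimates of this kind follow directly from the Gutenberg-Richter relation log10 N(≥m) = a − b·m, where N is the annual rate of events at or above magnitude m; the mean return period is its reciprocal. A one-line sketch with illustrative parameters:

```python
def return_period(a, b, m):
    """Mean return period (years) for magnitude >= m, given annual
    Gutenberg-Richter parameters: log10 N(>=m) = a - b*m, so T = 1/N."""
    return 10.0 ** (b * m - a)
```

    With each zone's own (a, b) pair, the same expression reproduces the contrast between long return periods in quiet zones and short ones for the region as a whole, since the regional rate is the sum of the zonal rates.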

  17. Time Series Analysis of Soil Radon Data Using Multiple Linear Regression and Artificial Neural Network in Seismic Precursory Studies

    NASA Astrophysics Data System (ADS)

    Singh, S.; Jaishi, H. P.; Tiwari, R. P.; Tiwari, R. C.

    2017-07-01

    This paper reports the analysis of soil radon data recorded in seismic zone V, located in the northeastern part of India (latitude 23.73N, longitude 92.73E). Continuous measurements of soil-gas emission along the Chite fault in Mizoram (India) were carried out, with solid-state nuclear track detectors replaced at weekly intervals. The present study was carried out for the period from March 2013 to May 2015 using LR-115 Type II detectors, manufactured by Kodak Pathe, France. In order to reduce the influence of meteorological parameters, statistical analysis tools such as multiple linear regression and artificial neural networks have been used. A decrease in radon concentration was recorded prior to some earthquakes that occurred during the observation period. Some false anomalies were also recorded, which may be attributed to ongoing crustal deformation that was not large enough to produce an earthquake.
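    The multiple-linear-regression step, which removes meteorological influence so that residual radon anomalies stand out, can be sketched with ordinary least squares via the normal equations (the variable layout is illustrative, not the authors' exact model):

```python
def ols_residuals(y, X):
    """Fit y ~ X @ beta by ordinary least squares (normal equations with
    Gaussian elimination) and return the residuals. For radon data, X
    rows would hold an intercept plus meteorological predictors such as
    temperature and pressure; anomalies are then flagged where residuals
    exceed, say, two standard deviations.
    """
    p = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(p)] for i in range(p)]
    c = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(p)]
    for k in range(p):                      # forward elimination w/ pivoting
        piv = max(range(k, p), key=lambda q: abs(A[q][k]))
        A[k], A[piv] = A[piv], A[k]
        c[k], c[piv] = c[piv], c[k]
        for q in range(k + 1, p):
            f = A[q][k] / A[k][k]
            for j in range(k, p):
                A[q][j] -= f * A[k][j]
            c[q] -= f * c[k]
    beta = [0.0] * p
    for k in range(p - 1, -1, -1):          # back substitution
        beta[k] = (c[k] - sum(A[k][j] * beta[j]
                              for j in range(k + 1, p))) / A[k][k]
    return [yi - sum(bk * xk for bk, xk in zip(beta, row))
            for yi, row in zip(y, X)]
```
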

  18. Analysis of the seismicity preceding large earthquakes

    NASA Astrophysics Data System (ADS)

    Stallone, Angela; Marzocchi, Warner

    2017-04-01

    The most common earthquake forecasting models assume that the magnitude of the next earthquake is independent of the past. This feature is probably one of the most severe limitations on the capability to forecast large earthquakes. In this work, we investigate this aspect empirically, exploring whether variations in seismicity in the space-time-magnitude domain encode some information on the size of future earthquakes. For this purpose, and to verify the stability of the findings, we consider seismic catalogs covering quite different space-time-magnitude windows, such as the Alto Tiberina Near Fault Observatory (TABOO) catalogue and the California and Japanese seismic catalogs. Our method is inspired by the statistical methodology proposed by Baiesi & Paczuski (2004) and elaborated by Zaliapin et al. (2008) to distinguish between triggered and background earthquakes, based on a pairwise nearest-neighbor metric defined by properly rescaled temporal and spatial distances. We generalize the method to a metric based on the k-nearest-neighbors that allows us to consider the overall space-time-magnitude distribution of the k earthquakes that are the strongly correlated ancestors of a target event. Finally, we analyze the statistical properties of the clusters composed of the target event and its k-nearest-neighbors. In essence, the main goal of this study is to verify whether different classes of target event magnitudes are characterized by distinctive "k-foreshock" distributions. The final step is to show how the findings of this work may (or may not) improve the skill of existing earthquake forecasting models.

  19. Probabilistic Models For Earthquakes With Large Return Periods In Himalaya Region

    NASA Astrophysics Data System (ADS)

    Chaudhary, Chhavi; Sharma, Mukat Lal

    2017-12-01

Determination of the frequency of large earthquakes is of paramount importance for seismic risk assessment, since large events contribute a significant fraction of the total deformation, and these long-return-period, low-probability events are not easily captured by classical distributions. Generally, with a small catalogue these larger events follow a different distribution function from the smaller and intermediate events. It is thus especially important to use statistical methods that analyse as closely as possible the range of extreme values, i.e., the tail of the distribution, in addition to the main distribution. The generalised Pareto distribution family is widely used for modelling events that cross a specified threshold value; the Pareto, truncated Pareto, and tapered Pareto distributions are special cases of this family. In this work, the probability of earthquake occurrence has been estimated using the Pareto, truncated Pareto, and tapered Pareto distributions. As a case study we consider the Himalaya, one of the most active zones of the world, whose orogeny gives rise to large earthquakes. The whole Himalayan region has been divided into five seismic source zones according to seismotectonics and the clustering of events. The estimated probabilities of earthquake occurrence have also been compared with the modified Gutenberg-Richter distribution and the characteristic recurrence distribution. The statistical analysis reveals that the tapered Pareto distribution describes seismicity in the seismic source zones better than the other distributions considered in the present study.
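For reference, the survival (tail) functions of the three distributions compared in the study can be written down directly; the sketch below works in seismic-moment units, and the threshold, corner moment and β used in the usage note are illustrative, not the fitted Himalayan values:

```python
import numpy as np

def pareto_sf(m, mt, beta):
    """Pure Pareto tail: survival function for seismic moment m >= threshold mt."""
    return (mt/m)**beta

def trunc_pareto_sf(m, mt, mmax, beta):
    """Pareto tail truncated sharply at a maximum moment mmax."""
    tail = (mt/mmax)**beta
    return ((mt/m)**beta - tail) / (1.0 - tail)

def tapered_pareto_sf(m, mt, beta, corner):
    """Pareto tail tapered by an exponential roll-off at the corner moment."""
    return (mt/m)**beta * np.exp((mt - m)/corner)
```

With illustrative values (mt = 1e17 N·m, beta = 0.67, corner = 1e19 N·m, mmax = 1e20 N·m), the tapered tail falls smoothly below the pure Pareto tail for m > mt, while the truncated tail reaches exactly zero at mmax; this soft roll-off is what lets the tapered Pareto accommodate rare large events without a hard cutoff.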

  20. Evaluation of seismic performance of reinforced concrete (RC) buildings under near-field earthquakes

    NASA Astrophysics Data System (ADS)

    Moniri, Hassan

    2017-03-01

Near-field ground motions affect the seismic response of structures far more severely than far-field ground motions, because near-source forward-directivity ground motions contain long-period pulses; the cumulative effects of far-fault records are comparatively minor. The damage and collapse of engineering structures observed in the earthquakes of recent decades show the damage potential of existing structures under near-field ground motions. One important subject studied by earthquake engineers as part of a performance-based approach is the determination of demand and collapse capacity under near-field earthquakes. Different methods for evaluating seismic structural performance have been suggested along with, and as part of, the development of performance-based earthquake engineering. This study investigated the effects of the distinctive characteristics of near-fault ground motions on the seismic response of reinforced concrete (RC) structures, using the Incremental Nonlinear Dynamic Analysis (IDA) method. Because various ground motions result in different intensity-versus-response plots, the analysis is repeated under various ground motions in order to obtain meaningful statistical averages. The OpenSees software was used to conduct the nonlinear structural evaluations. Numerical modelling showed that near-source effects cause much of the seismic energy from the rupture to arrive in a single coherent long-period pulse of motion, accompanied by permanent ground displacements. Finally, the vulnerability of RC buildings can be evaluated against the effects of pulse-like near-fault ground motions.
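The IDA procedure described above, rescaling a record to increasing intensity levels and recording the peak nonlinear response at each level, can be sketched for a single-degree-of-freedom elastoplastic oscillator; this is a toy stand-in for the RC frame models analyzed in OpenSees, and all parameters and the synthetic pulse record are illustrative:

```python
import numpy as np

def peak_drift(ag, dt, period=0.5, zeta=0.05, fy=1.0):
    """Peak displacement of a mass-normalized elastoplastic SDOF oscillator
    under ground acceleration ag, via semi-implicit Euler integration."""
    w = 2.0*np.pi/period
    k, c = w*w, 2.0*zeta*w
    u = v = fs = peak = 0.0
    for a in ag:
        acc = -a - c*v - fs                        # equation of motion, unit mass
        v += acc*dt
        du = v*dt
        u += du
        fs = float(np.clip(fs + k*du, -fy, fy))    # elastoplastic restoring force
        peak = max(peak, abs(u))
    return peak

dt = 0.005
t = np.arange(0.0, 8.0, dt)
pulse = np.sin(2.0*np.pi*t) * np.exp(-0.5*t)       # synthetic pulse-like record
scales = [0.5, 1.0, 2.0, 4.0]                      # increasing intensity levels
ida_curve = [peak_drift(s*pulse, dt) for s in scales]
```

Plotting peak response against the scale factor gives one IDA curve; as the abstract notes, the curve must be recomputed for many records and summarized statistically.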

  1. Non-stationary background intensity and Caribbean seismic events

    NASA Astrophysics Data System (ADS)

    Valmy, Larissa; Vaillant, Jean

    2014-05-01

We consider seismic risk calculation based on models with non-stationary background intensity. The aim is to improve predictive strategies within the framework of seismic risk assessment, using models that best describe the seismic activity in the Caribbean arc. Appropriate statistical methods are required for analyzing the volumes of data collected. The focus is on calculating earthquake occurrence probabilities and analyzing the spatiotemporal evolution of these probabilities. The main modeling tool is point process theory, which makes it possible to take into account the past history prior to a given date. Thus, the conditional intensity of seismic events is expressed by means of the background intensity and the self-exciting component. This intensity can be interpreted as the expected event rate per unit time and/or surface area. The most popular intensity model in seismology is the ETAS (Epidemic Type Aftershock Sequence) model, introduced and then generalized by Ogata [2, 3]. We extended this model and performed a comparison of different probability density functions for the triggered event times [4]. We illustrate our model using the CDSA (Centre de Données Sismiques des Antilles) catalog [1], which contains more than 7000 seismic events that occurred in the Lesser Antilles arc. Statistical tools for testing the stationarity of the background intensity and for dynamical segmentation are presented. [1] Bengoubou-Valérius M., Bazin S., Bertil D., Beauducel F. and Bosson A. (2008). CDSA: a new seismological data center for the French Lesser Antilles, Seismol. Res. Lett., 79 (1), 90-102. [2] Ogata Y. (1998). Space-time point-process models for earthquake occurrences, Annals of the Institute of Statistical Mathematics, 50 (2), 379-402. [3] Ogata, Y. (2011). Significant improvements of the space-time ETAS model for forecasting of accurate baseline seismicity, Earth, Planets and Space, 63 (3), 217-229. [4] Valmy L. and Vaillant J. (2013). Statistical models in seismology: Lesser Antilles arc case, Bull. Soc. géol. France, 184 (1), 61-67.
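The conditional intensity described above, a background rate plus a self-exciting component, takes the following form in the standard temporal ETAS model; the parameter values below are illustrative defaults, not those fitted to the CDSA catalog:

```python
import numpy as np

def etas_intensity(t, times, mags, mu=0.2, K=0.05, alpha=1.0, c=0.01, p=1.1, m0=2.0):
    """Conditional intensity lambda(t) = mu + sum over past events of
    K * 10**(alpha*(m_i - m0)) / (t - t_i + c)**p :
    a stationary background rate mu plus Omori-type self-excitation."""
    past = times < t
    return mu + np.sum(K * 10.0**(alpha*(mags[past] - m0)) / (t - times[past] + c)**p)

# One M5 event at t = 1.0: intensity before vs just after
times, mags = np.array([1.0]), np.array([5.0])
before = etas_intensity(0.5, times, mags)
after = etas_intensity(1.001, times, mags)
```

Before the event the intensity equals the background mu; just after, the Omori kernel dominates. The non-stationary extension discussed in the abstract replaces the constant mu with a time-varying background.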

  2. Natural time analysis of critical phenomena: The case of pre-fracture electromagnetic emissions

    NASA Astrophysics Data System (ADS)

    Potirakis, S. M.; Karadimitrakis, A.; Eftaxias, K.

    2013-06-01

Criticality of complex systems reveals itself in various ways. One way to monitor a system in a critical state is to analyze its observable manifestations using the recently introduced method of natural time. Pre-fracture electromagnetic (EM) emissions, in agreement with laboratory experiments, have been consistently detected in the MHz band prior to significant earthquakes. It has been proposed that these emissions stem from the fracture of the heterogeneous materials surrounding the strong entities (asperities) distributed along the fault, which prevent relative slipping. It has also been proposed that the fracture of heterogeneous material can be described in analogy to critical phase transitions in statistical physics. In this work, natural time analysis is applied for the first time to pre-fracture MHz EM signals, revealing their critical nature. Seismicity and pre-fracture EM emissions should be two sides of the same coin in the earthquake generation process. Therefore, we also examine the corresponding foreshock seismic activity, as another manifestation of the same complex system in a critical state. We conclude that the foreshock seismicity data present criticality features as well.
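In natural time analysis, the k-th of N events is assigned the "natural time" χ_k = k/N, and criticality is diagnosed through the variance κ1 of natural time weighted by the normalized released energies (in this framework κ1 approaching 0.070 is taken as the signature of the critical state). A minimal sketch of the κ1 computation:

```python
import numpy as np

def kappa1(energies):
    """Variance kappa_1 of natural time chi_k = k/N, weighted by the
    normalized released energies p_k of the N events."""
    E = np.asarray(energies, float)
    N = E.size
    chi = np.arange(1, N + 1) / N
    p = E / E.sum()
    return np.sum(p * chi**2) - np.sum(p * chi)**2
```

For equal event energies, κ1 tends to the uniform-weight variance 1/12 ≈ 0.083; deviations from that value reflect how the released energy is distributed along the ordering of the events.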

  3. Natural time analysis of critical phenomena: the case of pre-fracture electromagnetic emissions.

    PubMed

    Potirakis, S M; Karadimitrakis, A; Eftaxias, K

    2013-06-01

Criticality of complex systems reveals itself in various ways. One way to monitor a system in a critical state is to analyze its observable manifestations using the recently introduced method of natural time. Pre-fracture electromagnetic (EM) emissions, in agreement with laboratory experiments, have been consistently detected in the MHz band prior to significant earthquakes. It has been proposed that these emissions stem from the fracture of the heterogeneous materials surrounding the strong entities (asperities) distributed along the fault, which prevent relative slipping. It has also been proposed that the fracture of heterogeneous material can be described in analogy to critical phase transitions in statistical physics. In this work, natural time analysis is applied for the first time to pre-fracture MHz EM signals, revealing their critical nature. Seismicity and pre-fracture EM emissions should be two sides of the same coin in the earthquake generation process. Therefore, we also examine the corresponding foreshock seismic activity, as another manifestation of the same complex system in a critical state. We conclude that the foreshock seismicity data present criticality features as well.

  4. Statistical Seismology and Induced Seismicity

    NASA Astrophysics Data System (ADS)

    Tiampo, K. F.; González, P. J.; Kazemian, J.

    2014-12-01

While seismicity triggered or induced by activities such as mining or water impoundment in large dams has long been recognized, the recent increase in the unconventional production of oil and gas has been linked to a rapid rise in seismicity in many places, including central North America (Ellsworth et al., 2012; Ellsworth, 2013). Worldwide, induced events of M~5 have occurred and, although rare, have resulted in both damage and public concern (Horton, 2012; Keranen et al., 2013). In addition, over the past twenty years, the increase in both the number and coverage of seismic stations has resulted in an unprecedented ability to precisely record the magnitude and location of large numbers of small-magnitude events. The increase in the number and type of seismic sequences available for detailed study has revealed differences in their statistics that were previously difficult to quantify. For example, seismic swarms that produce significant numbers of foreshocks as well as aftershocks have been observed in different tectonic settings, including California, Iceland, and the East Pacific Rise (McGuire et al., 2005; Shearer, 2012; Kazemian et al., 2014). Similarly, smaller events have been observed prior to larger induced events in several energy-production settings. The field of statistical seismology has long focused on the question of triggering and the mechanisms responsible (Stein et al., 1992; Hill et al., 1993; Steacy et al., 2005; Parsons, 2005; Main et al., 2006). For example, in most cases the associated stress perturbations are much smaller than the earthquake stress drop, suggesting an inherent sensitivity to relatively small stress changes (Nalbant et al., 2005). Induced seismicity provides the opportunity to investigate triggering and, in particular, the differences between long- and short-range triggering.
Here we investigate the statistics of induced seismicity sequences from around the world, including central North America and Spain, together with natural earthquake swarms and non-swarm tectonic events from California, Nevada and Iceland. We compare the foreshock and aftershock Omori decay parameters and the Gutenberg-Richter frequency-magnitude scaling relationships of these different sequences in order to better understand the relationship between triggering and cascade sequences.
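The Omori decay parameters compared in this kind of analysis can be estimated, in the simplest case, by binning the aftershock rate in log-spaced time windows and fitting the log-log slope; the synthetic sequence below (modified Omori law with assumed c and p) illustrates the idea:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic aftershock times from a modified Omori law n(t) ~ (t + c)**(-p)
c_true, p_true, n_ev, T = 0.05, 1.1, 5000, 100.0
u = rng.random(n_ev)
A = c_true**(1 - p_true) - (T + c_true)**(1 - p_true)    # normalization on [0, T]
t = np.sort((c_true**(1 - p_true) - u*A)**(1/(1 - p_true)) - c_true)  # inverse-CDF draw

# Bin the occurrence rate in log-spaced windows and fit the log-log slope -> p
edges = np.logspace(0, 2, 17)
counts, _ = np.histogram(t, edges)
rate = counts / np.diff(edges)
mids = np.sqrt(edges[:-1] * edges[1:])
ok = counts > 0
slope, _ = np.polyfit(np.log10(mids[ok]), np.log10(rate[ok]), 1)
p_est = -slope
```

Maximum-likelihood fitting is preferred in practice (the binned fit is slightly biased near t ~ c), but the recovered slope is close to the generating p; the same sequences would also be fit for the Gutenberg-Richter b-value.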

  5. Statistical analyses and characteristics of volcanic tremor on Stromboli Volcano (Italy)

    NASA Astrophysics Data System (ADS)

    Falsaperla, S.; Langer, H.; Spampinato, S.

A study of volcanic tremor on Stromboli was carried out on the basis of data recorded daily between 1993 and 1995 by a permanent seismic station (STR) located 1.8 km away from the active craters. We also consider the signal of a second station (TF1), which operated for a shorter time span. Changes in the spectral characteristics of the tremor can be related to modifications in volcanic activity, particularly to lava effusions and explosive sequences. Statistical analyses were carried out on a set of spectra calculated daily from seismic signals in which explosion quakes were either present or excluded. Principal component analysis and cluster analysis were applied to identify different classes of spectra. Three clusters of spectra are associated with two different states of volcanic activity: one cluster corresponds to a state of low to moderate activity, whereas the two other clusters are present during phases with a high magma column, as inferred from the occurrence of lava fountains or effusions. We therefore conclude that variations in volcanic activity at Stromboli are usually linked to changes in the spectral characteristics of volcanic tremor. Site effects are evident when comparing the spectra calculated from signals synchronously recorded at STR and TF1; however, some major spectral peaks at both stations may reflect source properties. Statistical considerations and polarization analysis favor a prevailing presence of P-waves in the tremor signal and a source located northwest of the craters at shallow depth.
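The spectral classification step, principal component analysis followed by clustering, can be sketched on synthetic daily spectra; the two spectral regimes below are hypothetical stand-ins for the tremor classes, and a plain 2-means loop replaces whatever clustering variant the authors used:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical daily tremor spectra (rows) over 50 frequency bins:
# two activity regimes with bumps at different frequencies, plus noise
f = np.linspace(0.0, 10.0, 50)
quiet  = np.exp(-(f - 2.0)**2) + 0.05*rng.random((60, 50))
active = np.exp(-(f - 5.0)**2) + 0.05*rng.random((40, 50))
S = np.vstack([quiet, active])

# PCA via SVD on the mean-centered spectra
Sc = S - S.mean(axis=0)
_, _, Vt = np.linalg.svd(Sc, full_matrices=False)
scores = Sc @ Vt[:2].T                  # projection onto the first two PCs

# Minimal 2-means clustering of the PC scores
centers = scores[[0, -1]].copy()
for _ in range(20):
    labels = np.argmin(((scores[:, None, :] - centers)**2).sum(-1), axis=1)
    centers = np.array([scores[labels == k].mean(axis=0) for k in range(2)])
```

Because the two regimes differ strongly along the first principal component, the clustering recovers the regime membership of each daily spectrum.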

  6. The characteristics of seismological data from offshore observatory in the northeastern South Korea

    NASA Astrophysics Data System (ADS)

    Cho, H. M.; Kim, G.; Che, I. Y.; Lim, I. S.; Kim, Y.; Shin, I. C.

    2017-12-01

Real-time seismic observation in the ocean is challenging but provides unprecedented data for seismological research at local to global scales. The offshore seismic observatory in the northeastern South Korea region, operated by the Korea Institute of Geoscience and Mineral Resources (KIGAM), integrates seismic, hydro-acoustic, and infrasound data and transmits them, together with oceanographic sensing and SOH (State of Health) data, to KIGAM in real time. The observatory is equipped with an ocean-bottom broadband seismometer (120 s - 50 Hz) laid on the sea floor approximately 80 meters below sea level. This study focuses on the properties of the data from the sea floor, the evaluation of the noise level of the observatory in shallow water, and the assessment of the event detection threshold of the offshore site. We compute the power spectral density (PSD) to describe the background seismic noise and its variation with seasonal change and meteorological conditions. The seismic-noise probability density functions derived from the PSDs show that the broadband seismic noise is generally high compared with Peterson's NLNM and NHNM models. A statistical analysis of the seismic noise is given, and the noise level is compared with that of a nearby onshore broadband seismometer. The quality of waveform data from local, regional, and teleseismic earthquakes is evaluated and compared with corresponding onshore data. S-wave amplification is prominent in the sea-floor observations of local earthquakes, and the detection threshold for local earthquakes is estimated.
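The PSD computation underlying such noise evaluations is typically a Welch-style average of windowed periodograms, reported in dB for comparison with the Peterson models. A minimal numpy sketch (white-noise input and sampling rate are illustrative; real pipelines correct for the instrument response first):

```python
import numpy as np

def psd_welch(x, fs, nseg=256):
    """One-sided PSD via averaged Hann-windowed periodograms (50% overlap), in dB."""
    win = np.hanning(nseg)
    norm = fs * np.sum(win**2)
    step = nseg // 2
    segs = [x[i:i + nseg] for i in range(0, len(x) - nseg + 1, step)]
    P = np.mean([np.abs(np.fft.rfft(win*s))**2 for s in segs], axis=0) / norm
    P[1:-1] *= 2.0                               # fold negative frequencies
    return np.fft.rfftfreq(nseg, 1.0/fs), 10.0*np.log10(P)
```

For unit-variance white noise sampled at fs the one-sided level is 2/fs (about -17 dB at fs = 100 Hz); stacking many such PSDs into histograms per frequency bin yields the probability density functions mentioned in the abstract.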

  7. Far-field triggering of foreshocks near the nucleation zone of the 5 September 2012 (MW 7.6) Nicoya Peninsula, Costa Rica earthquake

    NASA Astrophysics Data System (ADS)

    Walter, Jacob I.; Meng, Xiaofeng; Peng, Zhigang; Schwartz, Susan Y.; Newman, Andrew V.; Protti, Marino

    2015-12-01

On 5 September 2012, a moment magnitude (MW) 7.6 earthquake occurred directly beneath the Nicoya Peninsula, an area with dense seismic and geodetic network coverage. The mainshock ruptured a portion of a previously identified locked patch that was recognized thanks to a decade-long effort to delineate the megathrust seismic and aseismic processes in this area. Here we conduct a comprehensive study of the seismicity prior to this event utilizing a matched-filter analysis that allows us to decrease the magnitude of catalog completeness by 1 unit. We observe a statistically significant increase in seismicity rate below the Nicoya Peninsula following the 27 August 2012 (MW 7.3) El Salvador earthquake (about 450 km to the northwest and 9 days prior to the Nicoya earthquake). Additionally, we identify a cluster of small-magnitude (<2.2) earthquakes preceding the mainshock by about 35 min and within 15 km of its hypocenter. The immediate foreshock sequence occurred in the same area as those earthquakes triggered shortly after the El Salvador event, though it is not clear whether the effect of triggering from the El Salvador event persisted until the foreshock sequence, given the uncertainties in seismicity rates from a relatively small number of earthquakes. If megathrust earthquakes at such distances can induce significant increases in seismicity during the days before another larger event, this sequence strengthens the need for real-time seismicity monitoring for large earthquake forecasting.
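Matched-filter detection of the kind used here slides template waveforms through the continuous data and flags windows whose normalized cross-correlation exceeds a threshold, which is how the completeness magnitude can be lowered. A minimal single-channel sketch (synthetic template and noise; real implementations stack correlations over many stations and components):

```python
import numpy as np

def matched_filter(data, template):
    """Sliding normalized cross-correlation of a template against continuous data."""
    nt = len(template)
    tpl = template - template.mean()
    tpl = tpl / np.linalg.norm(tpl)
    cc = np.zeros(len(data) - nt + 1)
    for i in range(len(cc)):
        w = data[i:i + nt] - data[i:i + nt].mean()
        nrm = np.linalg.norm(w)
        if nrm > 0.0:
            cc[i] = (w @ tpl) / nrm
    return cc

# Embed a scaled copy of the template in noise and recover its position
rng = np.random.default_rng(6)
tpl = np.sin(np.linspace(0.0, 20.0*np.pi, 100))
data = 0.1*rng.normal(size=2000)
data[500:600] += tpl
cc = matched_filter(data, tpl)
```

Detections are declared where cc exceeds a threshold (often a multiple of the median absolute deviation of the correlation trace); the peak here lands at the known insertion index.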

  8. Catalog of earthquake hypocenters at Alaskan volcanoes: January 1 through December 31, 2003

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.; Tytgat, Guy; Moran, Seth C.; Sanchez, John J.; McNutt, Stephen R.; Estes, Steve; Paskievitch, John

    2004-01-01

The Alaska Volcano Observatory (AVO), a cooperative program of the U.S. Geological Survey, the Geophysical Institute of the University of Alaska Fairbanks, and the Alaska Division of Geological and Geophysical Surveys, has maintained seismic monitoring networks at historically active volcanoes in Alaska since 1988. The primary objectives of this program are the near-real-time seismic monitoring of active, potentially hazardous Alaskan volcanoes and the investigation of seismic processes associated with active volcanism. This catalog presents the calculated earthquake hypocenter and phase arrival data, and changes in the seismic monitoring program, for the period January 1 through December 31, 2003. The AVO seismograph network was used to monitor the seismic activity at twenty-seven volcanoes within Alaska in 2003. These include Mount Wrangell, Mount Spurr, Redoubt Volcano, Iliamna Volcano, Augustine Volcano, the Katmai volcanic cluster (Snowy Mountain, Mount Griggs, Mount Katmai, Novarupta, Trident Volcano, Mount Mageik, Mount Martin), Aniakchak Crater, Mount Veniaminof, Pavlof Volcano, Mount Dutton, Isanotski Peaks, Shishaldin Volcano, Fisher Caldera, Westdahl Peak, Akutan Peak, Makushin Volcano, Okmok Caldera, Great Sitkin Volcano, Kanaga Volcano, Tanaga Volcano, and Mount Gareloi. Monitoring highlights in 2003 include: continuing elevated seismicity at Mount Veniaminof in January-April (volcanic unrest began in August 2002), volcanogenic seismic swarms at Shishaldin Volcano throughout the year, and low-level tremor at Okmok Caldera throughout the year. Instrumentation and data acquisition highlights in 2003 were the installation of subnetworks on Tanaga and Gareloi Islands, the installation of broadband stations on Akutan Volcano and Okmok Caldera, and the establishment of telemetry for the Okmok Caldera subnetwork.
AVO located 3911 earthquakes in 2003. This catalog includes: (1) a description of instruments deployed in the field and their locations; (2) a description of earthquake detection, recording, analysis, and data archival systems; (3) a description of velocity models used for earthquake locations; (4) a summary of earthquakes located in 2003; and (5) an accompanying UNIX tar-file with a summary of earthquake origin times, hypocenters, magnitudes, phase arrival times, and location quality statistics; daily station usage statistics; and all HYPOELLIPSE files used to determine the earthquake locations in 2003.

  9. 78 FR 59732 - Revisions to Design of Structures, Components, Equipment, and Systems

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-27

    ...,'' Section 3.7.2, ``Seismic System Analysis,'' Section 3.7.3, ``Seismic Subsystem Analysis,'' Section 3.8.1... Analysis,'' (Accession No. ML13198A223); Section 3.7.3, ``Seismic Subsystem Analysis,'' (Accession No..., ``Seismic System Analysis,'' Section 3.7.3, ``Seismic Subsystem Analysis,'' Section 3.8.1, ``Concrete...

  10. Evidences of a lithospheric fault zone in the Sicily Channel continental rift (southern Italy) from instrumental seismicity data

    NASA Astrophysics Data System (ADS)

    Calò, M.; Parisi, L.

    2014-10-01

The Sicily Channel is a portion of the Mediterranean Sea between Sicily (southern Italy) and Tunisia, representing part of the foreland of the Apennine-Maghrebian thrust belt. The seismicity of the region is commonly associated with the normal faulting related to the rifting process and with the volcanic activity of the region. However, certain seismic patterns suggest the existence of some mechanism coexisting with the rifting process. In this work, we present the results of a statistical analysis of the instrumental seismicity and a reliable relocation of the events recorded over the last 30 yr in the Sicily Channel and western Sicily, using the double-difference method and 3-D Vp and Vs tomographic models. Our procedure allows us to distinguish the seismic regime of the Sicily Channel from the Tyrrhenian one and to describe the main features of an active fault zone in the study area that cannot be related to the rifting process. We find that most of the events are highly clustered in the region between 12.5°-13.5°E and 35.5°-37°N, with hypocentral depths of 5-40 km, reaching 70 km depth in the southernmost sector. The alignment of the seismic clusters, the distribution of volcanic and geothermal regions, and the location of some large events that occurred in the last century suggest the existence of a subvertical shear zone extending for at least 250 km and oriented approximately NNE-SSW. The spatial distribution of the seismic moment suggests that this transfer fault zone is seismically discontinuous, showing large seismic gaps in the proximity of Ferdinandea Island and the Graham and Nameless Banks.

  11. Can we see the distal dyke communicate with the caldera? Examples of temporal correlation analysis using seismicity from the Bárðarbunga volcano

    NASA Astrophysics Data System (ADS)

    Jónsdóttir, Kristín; Jónasson, Kristján; Tumi Guðmundsson, Magnús; Hensch, Martin; Hooper, Andrew; Holohan, Eoghan; Sigmundsson, Freysteinn; Halldórsson, Sæmundur Ari; Vogfjörð, Kristín; Roberts, Matthew; Barsotti, Sara; Ófeigsson, Benedikt; Hjörleifsdóttir, Vala; Magnússon, Eyjólfur; Pálsson, Finnur; Parks, Michelle; Dumont, Stephanie; Einarsson, Páll; Guðmundsson, Gunnar

    2016-04-01

The Bárðarbunga volcano is composed of a large oval caldera (7x11 km) and fissures extending tens of kilometers away from the caldera along the rift zone, which marks the divergent plate boundary across Iceland. On August 16th, 2014, an intense seismic swarm started below the Bárðarbunga caldera, and in the two weeks that followed a dyke migrated some 47 km laterally in the uppermost 6-10 km of the crust along the rift. The dyke propagation terminated in lava fields just north of the Vatnajökull glacier, where a major (1.5 km3), six-month-long eruption took place. Intense earthquake activity in the caldera started in the period August 21-24, with over 70 M5 earthquakes accompanying slow caldera collapse, as verified by various geodetic measurements. The subsidence is likely due to magma withdrawal from a reservoir at depth beneath the caldera. During a five-month period, October-February, the seismic activity was separated by over 30 km into two clusters: one along the caldera rims (due to piecewise caldera subsidence) and the other at the far end of the dyke (a result of small shear movements). Here we present a statistical analysis comparing the temporal behaviour of the seismicity recorded in the two clusters. By comparing the earthquake rate in the dyke in temporal bins before and after caldera subsidence earthquakes to the rate away from these bins (the background rate), and applying a statistical p-value test, we show that the number of dyke earthquakes was significantly higher (p < 0.05) in the period 0-3 hours before a large (>M4.6) earthquake in the caldera. Increased dyke seismicity was also observed 0-3 hours following a large caldera earthquake. Elevated seismicity in the dyke before a large caldera earthquake may have occurred when a constriction in the dyke was reduced, which was followed by a pressure drop in the chamber.
Assuming that the large caldera earthquakes occurred when chamber pressure was lowest, the subsiding caldera piston may have caused temporarily higher pressure in the dyke and thereby increased the likelihood of an earthquake. Our results thus suggest mechanical coupling over long distances between the distal end of the dyke and the magma chamber, and support a simple plumbing system.
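A rate comparison of this kind can be posed as a simple Poisson test: given the background rate per window, how unlikely is the number of dyke earthquakes counted in the 0-3 h windows before large caldera events? The counts below are hypothetical, for illustration only:

```python
import math

def poisson_p_at_least(k, lam):
    """P(X >= k) for X ~ Poisson(lam): 1 minus the CDF up to k-1."""
    return 1.0 - sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k))

# Hypothetical counts: 9 dyke events observed in the 0-3 h windows before large
# caldera earthquakes, against a background expectation of 1.0 event per window
p_value = poisson_p_at_least(9, 1.0)
```

A p-value below the chosen significance level (0.05 in the abstract) would indicate that the pre-event elevation in dyke seismicity is unlikely to be a background fluctuation.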

  12. Using Groundwater physiochemical properties for assessing potential earthquake precursor

    NASA Astrophysics Data System (ADS)

    Inbar, Nimrod; Reuveni, Yuval; Anker, Yaakov; Guttman, Joseph

    2017-04-01

Worldwide, studies report pre-seismic, co-seismic and post-seismic reactions of groundwater to earthquakes. The unique hydrological and geological situation in Israel has resulted in relatively deep water wells located close to a seismically active tectonic plate boundary. Moreover, the Israeli experience shows that anomalies may occur 60-90 minutes prior to a seismic event (Guttman et al., 2005; Anker et al., 2016). Here, we try to assess the possible connection between changes in the physiochemical parameters of groundwater and earthquakes along the Dead Sea Transform (DST) region. A designated network of monitoring stations was installed in abandoned MEKOROT deep water wells, continuously measuring the water table, conductivity and temperature at a sampling rate of 1 minute. A preliminary analysis compares changes in the measured parameters with rain events, tidal effects and earthquake occurrences of all measured magnitudes (>2.5 Md) in the surroundings of the monitoring area. The data set acquired over one year recorded simultaneous abrupt changes in several wells that seem disconnected from standard hydrological drivers such as precipitation, abstraction or tidal effects. At this stage, our research aims to determine and rationalize a baseline for the "normal response" of the measured parameters to external events, while isolating those cases in which "deviations" from that baseline are recorded. We apply several analysis techniques to the measured signal, both in the time and frequency domains, as well as statistical analysis of several measured earthquake parameters, which indicate potential correlations between earthquake occurrences and the measured signal. We show that in at least one seismic event (5.1 Md) a potential precursor may have been recorded. Reference: Anker, Y., N. Inbar, A. Y. Dror, Y. Reuveni, J. Guttman, A. Flexer, (2016). Groundwater response to ground movements, as a tool for earthquakes monitoring and a possible precursor.
8th International Conference on Urban Planning and Transportation. Guttman, J., Flexer, A. & Yellin-Dror, A. (2005). Water level changes in wells - a predictor for earthquakes? IAHS Publ. Vol. 303, pp. 1-5.

  13. Integrating Social impacts on Health and Health-Care Systems in Systemic Seismic Vulnerability Analysis

    NASA Astrophysics Data System (ADS)

    Kunz-Plapp, T.; Khazai, B.; Daniell, J. E.

    2012-04-01

This paper presents a new method for modeling the health impacts caused by earthquake damage which allows for integrating key social impacts on individual health and health-care systems, and for implementing these impacts in quantitative systemic seismic vulnerability analysis. In current earthquake casualty estimation models, demand on health-care systems is estimated by quantifying the number of fatalities and the severity of injuries based on empirical data correlating building damage with casualties. The expected number of injured people (sorted by priority of emergency treatment) is combined with the post-earthquake reduction in the functionality of health-care facilities such as hospitals to estimate the impact on health-care systems. The aim here is to extend these models by developing a combined engineering and social science approach. Although social vulnerability is recognized as a key component of the consequences of disasters, social vulnerability as such is seldom linked to the formal, quantitative seismic loss estimates of injured people that directly determine the demand on emergency health-care services. Yet there is a consensus that the factors affecting vulnerability and the post-earthquake health of at-risk populations include demographic characteristics such as age, education, occupation and employment, and that these factors can further aggravate health impacts. Similarly, there are different social influences on the performance of health-care systems after an earthquake, at both the individual and the institutional level. To link the social impacts on health and health-care services to a systemic seismic vulnerability analysis, a conceptual model of the social impacts of earthquakes on health and health-care systems has been developed. We identified and tested appropriate social indicators for individual health impacts and for health-care impacts based on a literature review, using available European statistical data.
The results will be used to develop a socio-physical model of systemic seismic vulnerability that enhances the further understanding of societal seismic risk by taking into account social vulnerability impacts for health and health-care system, shelter, and transportation.

  14. Background noise spectra of global seismic stations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wada, M.M.; Claassen, J.P.

    1996-08-01

Over an extended period of time, station noise spectra were collected from various sources for use in estimating the detection and location performance of global networks of seismic stations. As the database of noise spectra grew and duplicate entries became available, an effort was mounted to select station noise spectra more carefully while discarding others. This report discusses the methodology and criteria by which the noise spectra were selected. It also identifies and illustrates the station noise spectra which survived the selection process and which currently contribute to the modeling efforts. The resulting catalog of noise statistics benefits not only those who model network performance but also those who wish to select stations on the basis of their noise level, as may occur in designing networks or in selecting seismological data for analysis. In view of the various ways in which station noise was estimated by the different contributors, it is advisable that future efforts to predict network performance have available station noise data and spectral estimation methods which are compatible with the statistics underlying seismic noise. This appropriately requires (1) averaging noise over seasonal and/or diurnal cycles, (2) averaging noise over time intervals comparable to those employed by actual detectors, and (3) using logarithmic measures of the noise.

  15. Monte Carlo Analysis of Reservoir Models Using Seismic Data and Geostatistical Models

    NASA Astrophysics Data System (ADS)

    Zunino, A.; Mosegaard, K.; Lange, K.; Melnikova, Y.; Hansen, T. M.

    2013-12-01

We present a study on the analysis of petroleum reservoir models consistent with seismic data and geostatistical constraints, performed on a synthetic reservoir model. Our aim is to invert directly for the structure and bulk rock properties of the target reservoir zone. To infer rock facies, porosity and oil saturation, seismology alone is not sufficient; a rock physics model, which links the unknown properties to the elastic parameters, must also be taken into account. We then combine a rock physics model with a simple convolutional approach for seismic waves to invert the "measured" seismograms. To solve this inverse problem, we employ a Markov chain Monte Carlo (MCMC) method, because it can handle non-linear, complex and multi-step forward models and provides realistic estimates of uncertainties. However, for large data sets the MCMC method may be impractical because of its very high computational demand. One strategy to face this challenge is to feed the algorithm with realistic models, hence relying on proper prior information. To this end, we utilize an algorithm drawn from geostatistics to generate geologically plausible models which represent samples of the prior distribution. The geostatistical algorithm learns the multiple-point statistics from prototype models (in the form of training images), then generates thousands of different models which are accepted or rejected by a Metropolis sampler. To further reduce the computation time we parallelize the software and run it on multi-core machines. The solution of the inverse problem is then represented by a collection of reservoir models in terms of facies, porosity and oil saturation, which constitute samples of the posterior distribution. We are finally able to produce probability maps of the properties of interest by performing statistical analysis on the collection of solutions.
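The sampling loop described above can be sketched in miniature: a toy forward model and a prior-informed proposal feed a Metropolis accept/reject step. Everything below is a simplification for illustration; in particular, the blend proposal does not reproduce the paper's geostatistical training-image sampler and does not preserve detailed balance exactly:

```python
import numpy as np

rng = np.random.default_rng(3)

def prior_sample(n=20):
    """Stand-in for a geostatistical prior: smoothed (correlated) random series."""
    return np.convolve(rng.normal(size=n + 4), np.ones(5)/5, mode="valid")

def forward(model):
    """Stand-in for the convolutional seismic forward model."""
    return np.convolve(model, np.ones(3)/3, mode="same")

def log_likelihood(model, data, sigma=0.1):
    return -0.5*np.sum((forward(model) - data)**2)/sigma**2

truth = prior_sample()
data = forward(truth) + rng.normal(0.0, 0.1, truth.size)

current = prior_sample()
ll = log_likelihood(current, data)
accepted = 0
for _ in range(2000):
    proposal = 0.9*current + 0.1*prior_sample()     # simplified prior-informed proposal
    ll_new = log_likelihood(proposal, data)
    if np.log(rng.random()) < ll_new - ll:          # Metropolis accept/reject
        current, ll, accepted = proposal, ll_new, accepted + 1
```

Collecting the accepted states (after burn-in) gives samples of the posterior; summarizing them bin by bin yields the probability maps mentioned in the abstract.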

  16. Analysis of the 2012 Ahar-Varzeghan (Iran) seismic sequence: Insights from statistical and stress transfer modeling

    NASA Astrophysics Data System (ADS)

    Yazdi, Pouye; Santoyo, Miguel Angel; Gaspar-Escribano, Jorge M.

    2018-02-01

    The 2012 Ahar-Varzeghan (Northwestern Iran) earthquake doublet and its ensuing seismic sequence are analyzed in this paper. First, we examine the time-varying statistical characteristics of the seismic activity since the occurrence of the doublet (two large events with Mw = 6.4 and 6.2) that initiated the sequence on 11 August 2012. A power-law magnitude-frequency distribution (1.9 ≤ M ≤ 6.4) is obtained, with relatively low b-values for the complete series, indicating the presence of relatively large magnitudes and a high stress level in the area. The Omori-Utsu model of aftershock decay with time shows a moderate decrease in activity rate. An epidemic-type aftershock sequence model that separates background seismicity from triggered aftershocks is then used to describe the temporal evolution of the seismicity during the period following the doublet. Results for the entire series (above cutoff magnitude Mc = 1.9) indicate relatively low productivity related to earthquake-earthquake triggering. Indeed, the majority of these events seem to be generated by underlying transient or aseismic processes, which may add to the tectonic loading stress. The proportion of aftershocks increases significantly when the analysis is limited to larger events (M ≥ 3.0), suggesting that the triggered large aftershocks account for a substantial portion of the energy released. To analyze the spatial distribution of the sequence, new source models are proposed for the two main shocks. For the first shock, the coseismic slip distribution is constrained by the available data on surface ruptures. A Coulomb failure stress transfer model for the first event, resolved on optimally oriented planes, identifies the areas with positive stress loads where the rupture of subsequent aftershocks may have occurred.
    The positive ΔCFS areas are compared for two depth intervals, 3-10 km and 15-22 km, and overlaid on 350 relocated hypocenters, supporting the interpretation of ΔCFS as a main mechanism for aftershock triggering in the deeper zones of the upper crust.
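
    The Omori-Utsu aftershock-decay model mentioned above has a simple closed form. The sketch below is a generic illustration, not the authors' code; the parameter values K, c and p are hypothetical placeholders, not the values estimated for this sequence.

```python
def omori_utsu_rate(t, K=100.0, c=0.1, p=1.1):
    """Modified Omori (Omori-Utsu) law: aftershock rate n(t) = K / (c + t)**p."""
    return K / (c + t) ** p

def omori_utsu_count(t, K=100.0, c=0.1, p=1.1):
    """Expected number of aftershocks in [0, t] (closed form, valid for p != 1)."""
    return K * (c ** (1.0 - p) - (c + t) ** (1.0 - p)) / (p - 1.0)

# Rate decays with time after the mainshock; for p > 1 the total count is finite
rate_day1 = omori_utsu_rate(1.0)
count_day30 = omori_utsu_count(30.0)
```

    Fitting K, c and p to an observed sequence (e.g., by maximum likelihood) is what quantifies the "moderate decrease in activity rate" reported in the abstract.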

  17. Location error uncertainties - an advanced use of probabilistic inverse theory

    NASA Astrophysics Data System (ADS)

    Debski, Wojciech

    2016-04-01

    The spatial location of sources of seismic waves is one of the first tasks when transient waves from natural (uncontrolled) sources are analyzed in many branches of physics, including seismology and oceanology, to name a few. Source activity and its spatial variability in time, the geometry of the recording network, and the complexity and heterogeneity of the wave velocity distribution are all factors influencing the performance of location algorithms and the accuracy of the achieved results. While estimating the location of earthquake foci is relatively simple, a quantitative estimation of the location accuracy is a truly challenging task, even if the probabilistic inverse method is used, because it requires knowledge of the statistics of observational, modelling, and a priori uncertainties. In this presentation we address this task for the case when the statistics of observational and/or modelling errors are unknown. This common situation requires the introduction of a priori constraints on the likelihood (misfit) function, which significantly influence the estimated errors. Based on the results of an analysis of 120 seismic events from the Rudna copper mine operating in southwestern Poland, we illustrate an approach based on an analysis of Shannon's entropy calculated for the a posteriori distribution. We show that this meta-characteristic of the a posteriori distribution carries some information on the uncertainties of the solution found.
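
    The entropy meta-characteristic discussed above can be sketched as follows (an illustration, not the authors' implementation): the Shannon entropy of a discretized a posteriori location pdf is low when the solution is well constrained and approaches the logarithm of the number of cells when it is not.

```python
import math

def shannon_entropy(pdf):
    """Shannon entropy H = -sum p_i ln p_i of a discrete a posteriori pdf.

    The input is normalized first; zero-probability cells contribute nothing.
    """
    total = sum(pdf)
    return -sum((p / total) * math.log(p / total) for p in pdf if p > 0)

# A flat posterior over 8 cells is maximally uncertain: H = ln(8)
h_flat = shannon_entropy([1.0] * 8)
# A delta-like posterior pins the location: H = 0
h_peaked = shannon_entropy([1.0, 0.0, 0.0, 0.0])
```

    Comparing such entropies across events gives a relative measure of location uncertainty even when the error statistics themselves are unknown.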

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Myers, S; Larsen, S; Wagoner, J

    Seismic imaging and tracking methods have intelligence and monitoring applications. Current systems, however, do not adequately calibrate or model the unknown geological heterogeneity, nor are they designed for rapid data acquisition and analysis in the field. This project seeks to build the core technological capabilities, coupled with innovative deployment, processing, and analysis methodologies, to allow seismic methods to be effectively utilized for seismic imaging and vehicle tracking where rapid (minutes to hours) and real-time analysis is required. The goal of this project is to build capabilities in acquisition system design, utilization of full three-dimensional (3D) finite difference modeling, and statistical characterization of geological heterogeneity. These capabilities, coupled with a rapid field analysis methodology based on matched field processing, are applied to problems associated with surveillance, battlefield management, finding hard and deeply buried targets, and portal monitoring. This project, in support of LLNL's national-security mission, benefits the U.S. military and intelligence community. Fiscal year (FY) 2003 was the final year of this project. In the 2.5 years this project was active, numerous and varied developments and milestones were accomplished. A wireless communication module for seismic data was developed to facilitate rapid seismic data acquisition and analysis. The E3D code was enhanced to include topographic effects. Codes were developed to implement the Karhunen-Loeve (K-L) statistical methodology for generating geological heterogeneity that can be utilized in E3D modeling. The matched field processing methodology applied to vehicle tracking, based on a field calibration to characterize geological heterogeneity, was tested and successfully demonstrated in a tank tracking experiment at the Nevada Test Site.
A three-seismic-array vehicle tracking testbed was installed on site at LLNL for testing real-time seismic tracking methods. A field experiment conducted over a tunnel at the Nevada Test Site quantified the tunnel reflection signal and, coupled with modeling, identified key needs and requirements in the experimental layout of sensors. A large field experiment was conducted at the Lake Lynn Laboratory, a mine safety research facility in Pennsylvania, over a tunnel complex in realistic, difficult conditions. This experiment gathered the data necessary for a full 3D application of the methodology, and also collected data to analyze the capability to detect and locate in-tunnel explosions for mine safety and other applications. In FY03 specifically, a large and complex simulation experiment tested the full modeling-based approach to geological characterization, using E2D, the K-L statistical methodology, and matched field processing applied to tunnel detection with surface seismic sensors. The simulation validated the full methodology and confirmed the need to account for geological heterogeneity in the overall approach. The Lake Lynn site area was geologically modeled using the code Earthvision to produce a 32-million-node 3D model grid for E3D. Model linking issues were resolved and a number of full 3D model runs were accomplished using shot locations that matched the data. E3D-generated wavefield movies showed that the reflection signal would be too small to be observed in the data due to trapped and attenuated energy in the weathered layer. An analysis of the few sensors coupled to bedrock did not improve the reflection signal strength sufficiently, because the shots, though buried, were within the surface layer and hence attenuated.
The ability to model a complex 3D geological structure and calculate synthetic seismograms in good agreement with actual data (especially for surface waves and below the complex weathered layer) was demonstrated. We conclude that E3D is a powerful tool for assessing the conditions under which a tunnel could be detected in a specific geological setting. Finally, the Lake Lynn tunnel explosion data were analyzed using standard array processing techniques. The results showed that single detonations could be detected and located, but simultaneous detonations would require strategic placement of the arrays.
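
    The Karhunen-Loeve (K-L) methodology for generating geological heterogeneity mentioned above can be sketched as a truncated eigen-expansion of a covariance model. The snippet below is a generic 1-D illustration under an assumed exponential covariance; the grid, correlation length, and mode count are hypothetical, not the project's actual code or parameters.

```python
import numpy as np

def kl_random_field(x, corr_len=10.0, sigma=1.0, n_modes=20, seed=0):
    """Draw a heterogeneity realization via a truncated Karhunen-Loeve expansion.

    Covariance C(xi, xj) = sigma^2 * exp(-|xi - xj| / corr_len) on grid x;
    the field is sum_k sqrt(lam_k) * xi_k * phi_k with xi_k ~ N(0, 1).
    """
    rng = np.random.default_rng(seed)
    cov = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
    lam, phi = np.linalg.eigh(cov)              # eigenvalues ascending
    lam = lam[::-1][:n_modes]                   # keep the n_modes largest
    phi = phi[:, ::-1][:, :n_modes]
    xi = rng.standard_normal(n_modes)
    return phi @ (np.sqrt(np.clip(lam, 0.0, None)) * xi)

x = np.linspace(0.0, 100.0, 101)
field = kl_random_field(x)   # one correlated heterogeneity realization
```

    Realizations like `field` would then perturb the velocity model fed to a finite-difference code such as E3D.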

  19. Fractal analysis of the spatial distribution of earthquakes along the Hellenic Subduction Zone

    NASA Astrophysics Data System (ADS)

    Papadakis, Giorgos; Vallianatos, Filippos; Sammonds, Peter

    2014-05-01

    The Hellenic Subduction Zone (HSZ) is the most seismically active region in Europe, and many destructive earthquakes have taken place along it in the past. The evolution of such active regions is expressed through seismicity and is characterized by complex phenomenology. Understanding the tectonic evolution process and the physical state of subducting regimes is crucial for earthquake prediction. In recent years there has been growing interest in an approach to seismicity based on the science of complex systems (Papadakis et al., 2013; Vallianatos et al., 2012). In this study we calculate the fractal dimension of the spatial distribution of earthquakes along the HSZ and aim to understand the significance of the obtained values for the tectonic and geodynamic evolution of this area. We use the external seismic sources provided by Papaioannou and Papazachos (2000) to create a dataset for the subduction zone, defining five seismic zones following those authors. We then construct an earthquake dataset based on the updated and extended earthquake catalogue for Greece and adjacent areas by Makropoulos et al. (2012), covering the period 1976-2009. The fractal dimension of the spatial distribution of earthquakes is calculated for each seismic zone, and for the HSZ as a unified system, using the box-counting method (Turcotte, 1997; Robertson et al., 1995; Caneva and Smirnov, 2004). Moreover, the variation of the fractal dimension is examined in different time windows. These spatiotemporal variations could serve as an additional index of the physical state of each seismic zone, and the use of the fractal dimension as a precursor in earthquake forecasting appears to be a promising direction for future work.
    Acknowledgements: Giorgos Papadakis wishes to acknowledge the Greek State Scholarships Foundation (IKY).
    References:
    Caneva, A., Smirnov, V., 2004. Using the fractal dimension of earthquake distributions and the slope of the recurrence curve to forecast earthquakes in Colombia. Earth Sci. Res. J., 8, 3-9.
    Makropoulos, K., Kaviris, G., Kouskouna, V., 2012. An updated and extended earthquake catalogue for Greece and adjacent areas since 1900. Nat. Hazards Earth Syst. Sci., 12, 1425-1430.
    Papadakis, G., Vallianatos, F., Sammonds, P., 2013. Evidence of non-extensive statistical physics behavior of the Hellenic Subduction Zone seismicity. Tectonophysics, 608, 1037-1048.
    Papaioannou, C.A., Papazachos, B.C., 2000. Time-independent and time-dependent seismic hazard in Greece based on seismogenic sources. Bull. Seismol. Soc. Am., 90, 22-33.
    Robertson, M.C., Sammis, C.G., Sahimi, M., Martin, A.J., 1995. Fractal analysis of three-dimensional spatial distributions of earthquakes with a percolation interpretation. J. Geophys. Res., 100, 609-620.
    Turcotte, D.L., 1997. Fractals and Chaos in Geology and Geophysics, Second Edition. Cambridge University Press.
    Vallianatos, F., Michas, G., Papadakis, G., Sammonds, P., 2012. A non-extensive statistical physics view to the spatiotemporal properties of the June 1995, Aigion earthquake (M6.2) aftershock sequence (West Corinth rift, Greece). Acta Geophys., 60, 758-768.
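
    The box-counting estimate used above can be sketched as follows (a generic 2-D illustration, not the authors' code): cover the epicenters with boxes of side s, count the occupied boxes N(s), and take the slope of log N(s) against log(1/s).

```python
import math

def box_counting_dimension(points, box_sizes):
    """Fractal dimension from the slope of log N(s) versus log(1/s)."""
    logs, log_counts = [], []
    for s in box_sizes:
        occupied = {(int(x // s), int(y // s)) for x, y in points}
        logs.append(math.log(1.0 / s))
        log_counts.append(math.log(len(occupied)))
    # least-squares slope of log N(s) against log(1/s)
    n = len(logs)
    mx, my = sum(logs) / n, sum(log_counts) / n
    num = sum((a - mx) * (b - my) for a, b in zip(logs, log_counts))
    den = sum((a - mx) ** 2 for a in logs)
    return num / den

# Epicenters filling a plane give D near 2; epicenters on a line give D near 1
plane = [(i / 32.0, j / 32.0) for i in range(32) for j in range(32)]
line = [(i / 32.0, 0.0) for i in range(32)]
d_plane = box_counting_dimension(plane, [0.5, 0.25, 0.125])
d_line = box_counting_dimension(line, [0.5, 0.25, 0.125])
```

    For real catalogs the range of box sizes must be chosen within the scaling regime of the data, and the slope fit checked for linearity.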

  20. On the physical links between the dynamics of the Izu Islands 2000 dike intrusions and the statistics of the induced seismicity

    NASA Astrophysics Data System (ADS)

    Passarelli, Luigi; Rivalta, Eleonora; Simone, Cesca; Aoki, Yosuke

    2014-05-01

    The emplacement of magma-filled dikes often induces abundant seismicity in the surrounding host rock. Most of the earthquakes are thought to occur close to the propagating tip (or edges, in 3D) of the dike, where stresses are concentrated. The resulting seismicity often appears as a swarm, controlled mainly by dike-induced stresses and stressing rates, as well as by other factors such as the background stressing rate, tectonic setting, regional stresses and tectonic history. The spatial distribution and focal mechanisms of the seismicity carry information on the interaction between the dike stress field and the tectonic setting of the area. The seismicity accompanying a dike intrusion is usually characterized by weak events, for which it is difficult to calculate focal mechanisms; a catalog of focal mechanisms large enough for robust statistical analysis is therefore available for only a few well-recorded intrusions. The 2000 dike intrusion at Miyakejima is in this sense an outstanding case, as about 18,000 seismic events were recorded in a time span of three months. This seismic swarm was one of the most energetic ever recorded, with five M > 6 earthquakes, and a catalog of 1500 focal mechanisms is available for it (NIED, Japan). We perform a clustering analysis of the focal mechanism solutions to infer the most frequent focal mechanism features prior to the intrusion (pre-diking period) and during the co-diking period. As previously suggested, we find that the dike stress field substantially modified the pre-existing seismicity pattern, shadowing some non-optimally oriented strike-slip structures and increasing the seismic rate on optimally oriented strike-slip tectonic structures. In addition, a large number of normal and oblique-normal faulting events were observed during the co-diking period. These events cannot be explained by the tectonics of the intrusion area.
We suggest they are directly generated by the intense stress field induced at the dike edges. We further investigate the distribution of the two main clusters we identify, i.e., strike-slip and oblique-normal mechanisms. We find that the strike-slip family obeys a Gutenberg-Richter law with a b-value close to one. The oblique-normal family deviates from the Gutenberg-Richter distribution and is slightly bimodal, with a marked roll-off in its right-hand tail suggesting a lack of large-magnitude events (M > 5.5). This set of events appears to collect earthquakes rupturing above the dike, similar to the graben-faulting events widely observed in volcanic areas during diking. A possible explanation for the anomalous frequency-magnitude distribution is that these earthquakes are limited in size by the thickness of the layer in which they nucleate, being spatially constrained between the upper edge of the dike and the Earth's surface.
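
    Checking whether a family of events obeys a Gutenberg-Richter law, as done above, usually starts from a maximum-likelihood b-value. The sketch below uses Aki's (1965) estimator with Utsu's binning correction as a generic illustration; the cutoff magnitude and synthetic catalog are assumptions, not the study's data.

```python
import math
import random

def b_value_aki(magnitudes, m_c, dm=0.1):
    """Maximum-likelihood b-value: b = log10(e) / (mean(M) - (Mc - dm/2)).

    `dm` is the magnitude binning width (Utsu's correction); pass dm=0
    for continuous magnitudes.
    """
    mags = [m for m in magnitudes if m >= m_c]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - (m_c - dm / 2.0))

# Synthetic Gutenberg-Richter catalog with b = 1 above Mc = 4.0
random.seed(1)
catalog = [4.0 + random.expovariate(math.log(10.0)) for _ in range(20000)]
b_est = b_value_aki(catalog, 4.0, dm=0.0)
```

    Departures from the fitted exponential, such as the roll-off reported for the oblique-normal family, show up as systematic misfit in the frequency-magnitude tail rather than in the b-value alone.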

  1. Hydra—The National Earthquake Information Center’s 24/7 seismic monitoring, analysis, catalog production, quality analysis, and special studies tool suite

    USGS Publications Warehouse

    Patton, John M.; Guy, Michelle R.; Benz, Harley M.; Buland, Raymond P.; Erickson, Brian K.; Kragness, David S.

    2016-08-18

    This report provides an overview of the capabilities and design of Hydra, the global seismic monitoring and analysis system used for earthquake response and catalog production at the U.S. Geological Survey National Earthquake Information Center (NEIC). Hydra supports the NEIC’s worldwide earthquake monitoring mission in areas such as seismic event detection, seismic data insertion and storage, seismic data processing and analysis, and seismic data output. The Hydra system automatically identifies seismic phase arrival times and detects the occurrence of earthquakes in near-real time. The system integrates and inserts parametric and waveform seismic data into discrete events in a database for analysis. Hydra computes seismic event parameters, including locations, multiple magnitudes, moment tensors, and depth estimates. Hydra supports the NEIC’s 24/7 analyst staff with a suite of seismic analysis graphical user interfaces. In addition to the NEIC’s monitoring needs, the system supports the processing of aftershock and temporary deployment data, and supports the NEIC’s quality assurance procedures. The Hydra system continues to be developed to expand its seismic analysis and monitoring capabilities.

  2. The Search for Fluid Injection-induced Seismicity in California Oilfields

    NASA Astrophysics Data System (ADS)

    Layland-Bachmann, C. E.; Brodsky, E. E.; Foxall, W.; Goebel, T.; Jordan, P. D.

    2017-12-01

    During recent years, earthquakes associated with human activity have become a matter of heightened public concern. Wastewater injection is a major concern, as seismic events with magnitudes larger than M5.5 have been linked to this practice. Much of the research in the United States has focused on the mid-continental regions, where low rates of naturally occurring seismicity and high-volume injection activities make it easier to identify potentially induced seismic events by statistical correlation. However, available industry data are often limited in these regions, which limits our ability to connect specific human activities to earthquakes. In particular, many previous studies have focused primarily on injection activity in single wells, ignoring the interconnectivity of production and injection in a reservoir. The situation in California differs from the central U.S. in two ways: (1) a rich dataset of oilfield activity is publicly available from state agencies, which enables a more in-depth investigation of the human forcing; and (2) the identification of potentially anthropogenically induced earthquakes is complex as a result of high tectonic activity. Here we address both differences. We utilize a public database of hydrologically connected reservoirs to assess whether there are any statistically significant correlations between the net injected volumes, reservoir pressures and injection depths on the one hand, and the earthquake locations and frequencies of occurrence on the other. We introduce a framework of physical and empirical models and statistical techniques to identify potentially induced seismic events. While the aim is to apply the methods statewide, we first apply them in the Southern San Joaquin Valley. Although we find an anomalously high earthquake rate in Southern Kern County oilfields, consistent with previous studies, we do not find a simple, straightforward correlation.
    To successfully study induced seismicity, we need a seismic catalog that is complete and consistent down to small magnitudes. During this study we found important gaps in the seismic coverage of critical oilfields in the Central Valley that need to be addressed in order to provide societally relevant assessments.

  3. The Cross-Correlation and Reshuffling Tests in Discerning Induced Seismicity

    NASA Astrophysics Data System (ADS)

    Schultz, Ryan; Telesca, Luciano

    2018-05-01

    In recent years, newly emergent induced earthquake clusters have increased seismic hazard and risk in locations with social, environmental, and economic consequences. The need for a quantitative and robust means of discerning induced seismicity has thus become a critical concern. This paper reviews a Matlab-based algorithm designed to quantify the statistical confidence of the correlation between two time-series datasets. Similar to prior approaches, our method uses the cross-correlation to delineate the strength and lag of correlated signals. In addition, surrogate reshuffling tests allow dynamic testing against statistical confidence intervals of anticipated spurious correlations. We demonstrate the robustness of our algorithm in a suite of synthetic tests that determine the limits of accurate signal detection in the presence of noise and sub-sampling. Overall, this routine has considerable merit for delineating the strength of correlated signals, including the discernment of induced seismicity from natural seismicity.
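
    The described routine is Matlab-based; as a language-agnostic illustration of the same idea (not the authors' code), the sketch below computes a lagged Pearson correlation and compares it against reshuffled surrogates, which destroy temporal structure while preserving amplitudes.

```python
import math
import random

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    var_a = sum((u - ma) ** 2 for u in a)
    var_b = sum((v - mb) ** 2 for v in b)
    return cov / math.sqrt(var_a * var_b)

def reshuffling_test(x, y, lag=0, n_surrogates=999, seed=0):
    """Correlation of x[t] with y[t + lag], plus a surrogate-based p-value."""
    random.seed(seed)
    xs = x[:len(x) - lag] if lag else list(x)
    obs = pearson(xs, y[lag:])
    exceed = 0
    ys = list(y)
    for _ in range(n_surrogates):
        random.shuffle(ys)  # surrogate: same amplitudes, no time structure
        if abs(pearson(xs, ys[lag:])) >= abs(obs):
            exceed += 1
    return obs, (exceed + 1) / (n_surrogates + 1)

# Toy series: "seismicity" y follows "injection" x with a 2-sample delay
x = [math.sin(0.3 * i) for i in range(60)]
y = [0.0, 0.0] + x[:-2]
obs, p = reshuffling_test(x, y, lag=2)
```

    A genuinely lagged relationship yields a correlation far outside the surrogate distribution (small p); a spurious one does not.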

  4. Contribution to the assessment of the imminent seismic hazard: Geophysical, statistical (and more) challenges in the territory of Greece

    NASA Astrophysics Data System (ADS)

    Adamaki, Angeliki K.; Papadimitriou, Eleftheria E.; Karakostas, Vassilis G.; Tsaklidis, George M.

    2013-04-01

    The need for imminent seismic hazard assessment has a strong social component: people seek to understand natural phenomena they cannot control, whether out of curiosity or self-preservation. Following this path, many seismologists have focused on forecasting the temporal and spatial distribution of earthquakes over short time scales, and knowing with some degree of certainty how an earthquake sequence evolves proves to be an important object of research. More specifically, the present work summarizes applications of seismicity and statistical models to seismic catalogues of areas defined by their tectonic structures and past seismicity, providing information on the temporal and spatial evolution of local seismic activity that can reveal seismicity-rate "irregularities" or changes as precursors of strong events, whether a main shock or a strong aftershock. To study these rate changes both preceding and following a strong earthquake, seismicity models are applied to estimate the Coulomb stress changes resulting from the occurrence of a strong earthquake, and their results are combined with the application of a Restricted Epidemic Type Aftershock Sequence model. Many active tectonic structures in the territory of Greece are associated with the occurrence of strong earthquakes, especially near populated areas, and the aim of this work is to contribute to the assessment of the imminent seismic hazard by applying the aforementioned models and techniques to the temporal evolution of several seismic sequences that occurred in the Aegean area in the recent past.

  5. Seismic Search Engine: A distributed database for mining large scale seismic data

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Vaidya, S.; Kuzma, H. A.

    2009-12-01

    The International Monitoring System (IMS) of the CTBTO collects terabytes worth of seismic measurements from many receiver stations situated around the earth with the goal of detecting underground nuclear testing events and distinguishing them from other benign, but more common events such as earthquakes and mine blasts. The International Data Center (IDC) processes and analyzes these measurements, as they are collected by the IMS, to summarize event detections in daily bulletins. Thereafter, the data measurements are archived into a large format database. Our proposed Seismic Search Engine (SSE) will facilitate a framework for data exploration of the seismic database as well as the development of seismic data mining algorithms. Analogous to GenBank, the annotated genetic sequence database maintained by NIH, through SSE, we intend to provide public access to seismic data and a set of processing and analysis tools, along with community-generated annotations and statistical models to help interpret the data. SSE will implement queries as user-defined functions composed from standard tools and models. Each query is compiled and executed over the database internally before reporting results back to the user. Since queries are expressed with standard tools and models, users can easily reproduce published results within this framework for peer-review and making metric comparisons. As an illustration, an example query is “what are the best receiver stations in East Asia for detecting events in the Middle East?” Evaluating this query involves listing all receiver stations in East Asia, characterizing known seismic events in that region, and constructing a profile for each receiver station to determine how effective its measurements are at predicting each event. The results of this query can be used to help prioritize how data is collected, identify defective instruments, and guide future sensor placements.

  6. Seismicity map tools for earthquake studies

    NASA Astrophysics Data System (ADS)

    Boucouvalas, Anthony; Kaskebes, Athanasios; Tselikas, Nikos

    2014-05-01

    We report on the development of a new online set of tools, built on Google Maps, for earthquake research. We demonstrate this server-based online platform (developed with PHP, JavaScript and MySQL) together with the new tools using a database of earthquake data. The platform allows us to carry out statistical and deterministic analyses of earthquake data on Google Maps and to plot various seismicity graphs. The toolbox has been extended to draw line segments on the map, multiple straight lines horizontally and vertically, and multiple circles, including geodesic lines. The application is demonstrated using localized seismic data from the geographic region of Greece as well as other global earthquake data. The application also offers regional segmentation (NxN), which allows the study of earthquake clustering and of cluster shifts between segments in space. The platform offers many filters, such as plotting selected magnitude ranges or time periods. The plotting facility supports statistical plots such as cumulative earthquake magnitude plots and earthquake magnitude histograms, calculation of the 'b' value, and so on. What is novel about the platform is the set of additional deterministic tools. Using the newly developed horizontal line, vertical line and circle tools we have studied the spatial distribution trends of many earthquakes, and we show here for the first time a proposed link between Fibonacci numbers and the spatiotemporal location of some earthquakes. The new tools are valuable for examining and visualizing trends in earthquake research, as they allow calculation of statistics as well as deterministic precursors. We plan to present many new results based on our newly developed platform.

  7. Early Results of Three-Year Monitoring of Red Wood Ants’ Behavioral Changes and Their Possible Correlation with Earthquake Events

    PubMed Central

    Berberich, Gabriele; Berberich, Martin; Grumpe, Arne; Wöhler, Christian; Schreiber, Ulrich

    2013-01-01

    Simple Summary For three years (2009–2012), two red wood ant mounds (Formica rufa-group), located at the seismically active Neuwied Basin (Eifel, Germany), have been monitored 24/7 by high-resolution cameras. Early results show that ants have a well-identifiable standard daily routine. Correlation with local seismic events suggests changes in the ants’ behavior hours before the earthquake: the nocturnal rest phase and daily activity are suppressed, and standard daily routine does not resume until the next day. At present, an automated image evaluation routine is being applied to the video streams. Based on this automated approach, a statistical analysis of the ant behavior will be carried out. Abstract Short-term earthquake predictions with an advance warning of several hours or days are currently not possible due to both incomplete understanding of the complex tectonic processes and inadequate observations. Abnormal animal behaviors before earthquakes have been reported previously, but create problems in monitoring and reliability. The situation is different with red wood ants (RWA; Formica rufa-group (Hymenoptera: Formicidae)). They have stationary mounds on tectonically active, gas-bearing fault systems. These faults may be potential earthquake areas. For three years (2009–2012), two red wood ant mounds (Formica rufa-group), located at the seismically active Neuwied Basin (Eifel, Germany), have been monitored 24/7 by high-resolution cameras with both a color and an infrared sensor. Early results show that ants have a well-identifiable standard daily routine. Correlation with local seismic events suggests changes in the ants’ behavior hours before the earthquake: the nocturnal rest phase and daily activity are suppressed, and standard daily routine does not resume until the next day. At present, an automated image evaluation routine is being applied to the more than 45,000 hours of video streams. 
Based on this automated approach, a statistical analysis of the ants’ behavior will be carried out. In addition, other parameters (climate, geotectonic and biological), which may influence behavior, will be included in the analysis. PMID:26487310

  8. Precursory seismic quiescence along the Sumatra-Andaman subduction zone: past and present

    NASA Astrophysics Data System (ADS)

    Sukrungsri, Santawat; Pailoplee, Santi

    2017-03-01

    In this study, seismic quiescence prior to hazardous earthquakes was analyzed along the Sumatra-Andaman subduction zone (SASZ). The seismicity data were screened statistically, with mainshocks of Mw ≥ 4.4 reported during 1980-2015 defining the complete database. To examine the possibility of using the seismic quiescence stage as a marker of subsequent earthquakes, the seismicity reported prior to eight major earthquakes along the SASZ was analyzed for changes in rate using the statistical Z test. Iterative tests revealed that parameters of N = 50 events and T = 2 years were optimal for detecting sudden rate changes such as quiescence and for mapping them spatially. The observed quiescence periods conformed to the subsequent major earthquake occurrences both spatially and temporally. Using the suitable conditions obtained from these retrospective tests, the seismicity rate changes were then mapped from the most up-to-date seismicity data available. This revealed three areas along the SASZ that might generate a major earthquake in the future: (i) the Nicobar Islands (Z = 6.7), (ii) offshore west of Sumatra Island (Z = 7.1), and (iii) western Myanmar (Z = 6.7). A stochastic test using a number of synthetic randomized catalogues indicated that these anomalous Z values are unlikely to be due to chance or random fluctuations of earthquake occurrence. Thus, these three areas have a high possibility of generating a strong-to-major earthquake in the future.
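
    The Z test for rate change used above is commonly written as Habermann's Z statistic. The sketch below is a generic illustration; the binned rate values are hypothetical, not the study's data.

```python
import math

def z_value(background_rates, window_rates):
    """Habermann's Z comparing mean seismicity rates of two periods.

    Z = (R1 - R2) / sqrt(s1^2/n1 + s2^2/n2); a large positive Z flags a
    rate decrease (quiescence) in the second period.
    """
    n1, n2 = len(background_rates), len(window_rates)
    r1 = sum(background_rates) / n1
    r2 = sum(window_rates) / n2
    s1 = sum((r - r1) ** 2 for r in background_rates) / n1
    s2 = sum((r - r2) ** 2 for r in window_rates) / n2
    return (r1 - r2) / math.sqrt(s1 / n1 + s2 / n2)

# Hypothetical binned rates: ~10 events/bin dropping to ~2 events/bin
background = [10, 11, 9, 10, 12, 9, 10, 11, 9, 10]
quiet_window = [2, 3, 1, 2]
z = z_value(background, quiet_window)
```

    Mapping Z over a grid of cells and time windows produces the quiescence maps described in the abstract; significance is then judged against randomized catalogs, as the authors do.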

  9. Mechanical and statistical evidence of the causality of human-made mass shifts on the Earth's upper crust and the occurrence of earthquakes

    NASA Astrophysics Data System (ADS)

    Klose, Christian D.

    2013-01-01

    A global catalog of small- to large-sized earthquakes was systematically analyzed to identify causality and correlatives between human-made mass shifts in the upper Earth's crust and the occurrence of earthquakes. The mass shifts, ranging between 1 kt and 1 Tt, result from large-scale geoengineering operations, including mining, water reservoirs, hydrocarbon production, fluid injection/extractions, deep geothermal energy production and coastal management. This article shows evidence that geomechanical relationships exist with statistical significance between (a) seismic moment magnitudes M of observed earthquakes, (b) lateral distances of the earthquake hypocenters to the geoengineering "operation points" and (c) mass removals or accumulations on the Earth's crust. Statistical findings depend on uncertainties, in particular, of source parameter estimations of seismic events before instrumental recoding. Statistical observations, however, indicate that every second, seismic event tends to occur after a decade. The chance of an earthquake to nucleate after 2 or 20 years near an area with a significant mass shift is 25 or 75 %, respectively. Moreover, causative effects of seismic activities highly depend on the tectonic stress regime in which the operations take place (i.e., extensive, transverse or compressive). Results are summarized as follows: First, seismic moment magnitudes increase the more mass is locally shifted on the Earth's crust. Second, seismic moment magnitudes increase the larger the area in the crust is geomechanically polluted. Third, reverse faults tend to be more trigger-sensitive than normal faults due to a stronger alteration of the minimum vertical principal stress component. Pure strike-slip faults seem to rupture randomly and independently from the magnitude of the mass changes. 
    Finally, mainly because of the high estimation uncertainties of source parameters, in particular for shallow seismic events (<10 km), it remains very difficult to discriminate between induced and triggered earthquakes with respect to the data catalog of this study. However, initial analyses indicate that small- to medium-sized earthquakes (M6) seem to be triggered. The rupture propagation of triggered events might be dominated by pre-existing tectonic stress conditions.

  10. Tsunami hazard assessment and monitoring for the Black Sea area

    NASA Astrophysics Data System (ADS)

    Partheniu, Raluca; Ionescu, Constantin; Constantin, Angela; Moldovan, Iren; Diaconescu, Mihail; Marmureanu, Alexandru; Radulian, Mircea; Toader, Victorin

    2016-04-01

    NIEP has recently improved its research on tsunamis in the Black Sea. As part of the routine earthquake and tsunami monitoring activity, the first tsunami early-warning system in the Black Sea was implemented in 2013 and has been active since. To monitor the seismic activity of the Black Sea, NIEP uses a total of 114 real-time stations and 2 seismic arrays; 18 of the stations are located in the Dobrogea area, near the Romanian Black Sea shoreline. Moreover, there is a data exchange with the countries surrounding the Black Sea involving the acquisition of real-time data from 17 stations in Bulgaria, Turkey, Georgia and Ukraine. This improves the capability of the Romanian Seismic Network to monitor, and more accurately locate, earthquakes occurring in the Black Sea area. For tsunami monitoring and warning, 6 sea-level monitoring stations, 1 infrasound barometer, 3 offshore marine buoys and 7 GPS/GNSS stations are installed at different locations along and near the Romanian shoreline. In the framework of the ASTARTE project, several objectives regarding seismic hazard and tsunami wave-height assessment for the Black Sea were accomplished. The seismic hazard estimation was based on statistical studies of the seismic sources and their characteristics, compiled using different seismic catalogues. Two probabilistic methods were used for the evaluation of the seismic hazard: the Cornell method, based on the Gutenberg-Richter distribution parameters, and the Gumbel method, based on extreme-value statistics. The results give maximum values of possible magnitudes and their recurrence periods for each seismic source. Using the Tsunami Analysis Tool (TAT) software, a set of tsunami modelling scenarios was generated for the Shabla area, the seismic source most likely to affect the Romanian shore.
    These simulations are organized in a database in order to establish the maximum possible tsunami waves that could be generated and the minimum magnitude values that could trigger tsunamis in this area. Particularities of the Shabla source include past observed magnitudes > 7 and a recurrence period of 175 years. Other important objectives of NIEP are to continue monitoring the seismic activity of the Black Sea, to improve the database of tsunami simulations for this area, to provide near-real-time fault-plane solution estimates for the warning system, and to add new seismic, GPS/GNSS and sea-level monitoring equipment to the existing network. Acknowledgements: This work was partially supported by the FP7 FP7-ENV2013 6.4-3 "Assessment, Strategy And Risk Reduction For Tsunamis in Europe" (ASTARTE) Project 603839/2013 and PNII, Capacity Module III ASTARTE RO Project 268/2014. This work was partially supported by the "Global Tsunami Informal Monitoring Service - 2" (GTIMS2) Project, JRC/IPR/2015/G.2/2006/NC 260286, Ref. Ares (2015)1440256 - 01.04.2015.
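The Cornell-style hazard step in this abstract rests on the Gutenberg-Richter law. A minimal sketch of that calculation, under stated assumptions (a synthetic magnitude catalogue, the Aki maximum-likelihood b-value estimator, and hypothetical completeness magnitude and catalogue duration; this is not NIEP's actual workflow):

```python
import numpy as np

def gr_recurrence(mags, m_min, years):
    """Maximum-likelihood (Aki) b-value and a Gutenberg-Richter
    recurrence-period function for a completeness-truncated catalogue."""
    m = np.asarray(mags, float)
    m = m[m >= m_min]
    b = np.log10(np.e) / (m.mean() - m_min)       # Aki (1965) estimator
    a = np.log10(len(m) / years) + b * m_min      # annual-rate intercept
    def recurrence_period(m_target):
        rate = 10.0 ** (a - b * m_target)         # events/yr with M >= m_target
        return 1.0 / rate                         # mean recurrence period (yr)
    return b, recurrence_period

# synthetic catalogue: exponential magnitudes above Mc = 3.0 (true b ~ 1)
rng = np.random.default_rng(0)
mags = 3.0 + rng.exponential(scale=1.0 / np.log(10), size=5000)
b, T = gr_recurrence(mags, m_min=3.0, years=100.0)
```

The recurrence period grows exponentially with the target magnitude, which is why the abstract can quote a single recurrence period (175 years) for the largest Shabla events.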

  11. Uncertainty analysis of depth predictions from seismic reflection data using Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Michelioudakis, Dimitrios G.; Hobbs, Richard W.; Caiado, Camila C. S.

    2018-03-01

    Estimating the depths of target horizons from seismic reflection data is an important task in exploration geophysics. To constrain these depths we need a reliable and accurate velocity model. Here, we build an optimum 2D seismic reflection data processing flow focused on pre-stack deghosting filters and velocity model building, and apply Bayesian methods, including Gaussian process emulation and Bayesian History Matching (BHM), to estimate the uncertainties of the depths of key horizons near the borehole DSDP-258 located in the Mentelle Basin, southwest of Australia, and compare the results with the drilled core from that well. Following this strategy, the tie between the modelled and observed depths from the DSDP-258 core was in accordance with the ±2σ posterior credibility intervals, and predictions for depths to key horizons were made for two new drill sites adjacent to the existing borehole of the area. The probabilistic analysis allowed us to generate multiple realizations of pre-stack depth migrated images, which can be directly used to better constrain interpretation and identify potential risk at drill sites. The method will be applied to constrain the drilling targets for the upcoming International Ocean Discovery Program (IODP), leg 369.

  12. Uncertainty analysis of depth predictions from seismic reflection data using Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Michelioudakis, Dimitrios G.; Hobbs, Richard W.; Caiado, Camila C. S.

    2018-06-01

    Estimating the depths of target horizons from seismic reflection data is an important task in exploration geophysics. To constrain these depths we need a reliable and accurate velocity model. Here, we build an optimum 2-D seismic reflection data processing flow focused on pre-stack deghosting filters and velocity model building and apply Bayesian methods, including Gaussian process emulation and Bayesian History Matching, to estimate the uncertainties of the depths of key horizons near the Deep Sea Drilling Project (DSDP) borehole 258 (DSDP-258) located in the Mentelle Basin, southwest of Australia, and compare the results with the drilled core from that well. Following this strategy, the tie between the modelled and observed depths from the DSDP-258 core was in accordance with the ±2σ posterior credibility intervals, and predictions for depths to key horizons were made for the two new drill sites, adjacent to the existing borehole of the area. The probabilistic analysis allowed us to generate multiple realizations of pre-stack depth migrated images, which can be directly used to better constrain interpretation and identify potential risk at drill sites. The method will be applied to constrain the drilling targets for the upcoming International Ocean Discovery Program, leg 369.
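The ±2σ credibility intervals quoted above come from a Gaussian-process posterior. A minimal numpy-only sketch of GP regression with a squared-exponential kernel, assuming hypothetical two-way-time-to-depth calibration points (it is a simplified stand-in for the paper's emulation plus Bayesian History Matching, not their implementation):

```python
import numpy as np

def gp_posterior(x_train, y_train, x_test, ell=1.0, sf=1.0, noise=1e-3):
    """Gaussian-process regression with a squared-exponential kernel:
    posterior mean and variance at the test points."""
    def k(a, b):
        d = a[:, None] - b[None, :]
        return sf**2 * np.exp(-0.5 * (d / ell) ** 2)
    K = k(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = k(x_test, x_train)
    Kss = k(x_test, x_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mu = Ks @ alpha                               # posterior mean
    v = np.linalg.solve(L, Ks.T)
    var = np.diag(Kss) - np.sum(v**2, axis=0)     # posterior variance
    return mu, var

# hypothetical two-way-time (s) -> depth (m) calibration at a borehole
x = np.array([0.5, 1.0, 1.5, 2.0])
y = np.array([400.0, 900.0, 1500.0, 2200.0])
mu, var = gp_posterior(x, y, np.array([1.25]), ell=0.8, sf=1000.0, noise=25.0)
lo, hi = mu - 2 * np.sqrt(var), mu + 2 * np.sqrt(var)  # +/- 2 sigma interval
```

A depth tie is then "in accordance" with the model when the observed core depth falls inside `[lo, hi]` at the well location.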

  13. Cross-correlation earthquake precursors in the hydrogeochemical and geoacoustic signals for the Kamchatka peninsula

    NASA Astrophysics Data System (ADS)

    Ryabinin, Gennadiy; Gavrilov, Valeriy; Polyakov, Yuriy; Timashev, Serge

    2012-06-01

    We propose a new type of earthquake precursor based on the analysis of correlation dynamics between geophysical signals of different nature. The precursor is found using a two-parameter cross-correlation function introduced within the framework of flicker-noise spectroscopy, a general statistical physics approach to the analysis of time series. We consider an example of cross-correlation analysis for water salinity time series, an integral characteristic of the chemical composition of groundwater, and geoacoustic emissions recorded at the G-1 borehole on the Kamchatka peninsula in the time frame from 2001 to 2003, which is characterized by a sequence of three groups of significant seismic events. We found that cross-correlation precursors took place 27, 31, and 35 days ahead of the strongest earthquakes for each group of seismic events, respectively. At the same time, precursory anomalies in the signals themselves were observed only in the geoacoustic emissions for one group of earthquakes.

  14. Fleeing to Fault Zones: Incorporating Syrian Refugees into Earthquake Risk Analysis along the East Anatolian and Dead Sea Rift Fault Zones

    NASA Astrophysics Data System (ADS)

    Wilson, B.; Paradise, T. R.

    2016-12-01

    The influx of millions of Syrian refugees into Turkey has rapidly changed the population distribution along the Dead Sea Rift and East Anatolian Fault zones. In contrast to other countries in the Middle East where refugees are accommodated in camp environments, the majority of displaced individuals in Turkey are integrated into cities, towns, and villages—placing stress on urban settings and increasing potential exposure to strong shaking. Yet displaced populations are not traditionally captured in the data sources used in earthquake risk analysis or loss estimation. Accordingly, we present a district-level analysis assessing the spatial overlap of earthquake hazards and refugee locations in southeastern Turkey to determine how migration patterns are altering seismic risk in the region. Using migration estimates from the U.S. Humanitarian Information Unit, we create three district-level population scenarios that combine official population statistics, refugee camp populations, and low, median, and high bounds for integrated refugee populations. We perform probabilistic seismic hazard analysis alongside these population scenarios to map spatial variations in seismic risk between 2011 and late 2015. Our results show a significant relative southward increase of seismic risk for this period due to refugee migration. Additionally, we calculate earthquake fatalities for simulated earthquakes using a semi-empirical loss estimation technique to determine the degree of underestimation that results from forgoing migration data in loss modeling. We find that including refugee populations increased casualties by 11-12% using median population estimates, and upwards of 20% using high population estimates. These results communicate the ongoing importance of placing environmental hazards in their appropriate regional and temporal context, which unites physical, political, cultural, and socio-economic landscapes. 
Keywords: Earthquakes, Hazards, Loss-Estimation, Syrian Crisis, Migration, Refugees

  15. Some statistical features of the seismic activity related to the recent M8.2 and M7.1 earthquakes in Mexico

    NASA Astrophysics Data System (ADS)

    Guzman, L.; Baeza-Blancas, E.; Reyes, I.; Angulo Brown, F.; Rudolf Navarro, A.

    2017-12-01

    By studying earthquake magnitude catalogs, previous studies have reported evidence that changes in the spatial and temporal organization of earthquake activity are observed before and after a main shock. These studies have used different methods for detecting clustering behavior and event-distance density in order to point out the asymmetric behavior of foreshocks and aftershocks. Here, we present a statistical analysis of the seismic activity related to the M8.2 and M7.1 earthquakes that occurred on September 7th and September 19th, 2017, respectively. First, we calculated the interevent time and distance for the period from September 7th, 2016 until October 20th, 2017 for each seismic region (a radius of 150 km centered at the coordinates of the M8.2 and M7.1 events). Next, we calculated the "velocity" of the walker as the ratio between the interevent distance and interevent time and, similarly, we also constructed the "acceleration". A sliding pointer is used to estimate statistical features within time windows of size τ for the velocity and acceleration sequences before and after the main shocks. Specifically, we applied the fractal dimension method to detect changes in the correlation (persistence) behavior of events in the period before the main events. Our preliminary results point out that the fractal dimension associated with the velocity and acceleration sequences exhibits changes in persistence behavior before the main shock, while the scaling dimension values after the main events resemble a more uncorrelated behavior. Moreover, the relationship between the standard deviation of the velocity and the local mean velocity value for a given time window size τ is described by an exponent close to 1.5, and the cumulative distributions of velocity and acceleration are well described by power-law functions after the main shock and stretched-exponential-like distributions before it. 
On the other hand, we present an analysis of patterns of seismic quiescence before the M8.2 earthquake based on the Schreider algorithm over a period of 27 years. This analysis also includes the modification of the Schreider method proposed by Muñoz-Diosdado et al. (2015).
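The fractal-dimension test the authors apply to their velocity and acceleration sequences can be illustrated with the classic Higuchi estimator (the abstract does not name a specific estimator, so this is one plausible choice, exercised on synthetic signals):

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Higuchi (1988) fractal-dimension estimate of a time series:
    ~1 for smooth/persistent signals, ~2 for uncorrelated noise."""
    x = np.asarray(x, float)
    n = len(x)
    ks = np.arange(1, kmax + 1)
    lk = []
    for k in ks:
        lengths = []
        for m in range(k):                       # one curve per offset m
            idx = np.arange(m, n, k)
            steps = len(idx) - 1
            curve = np.abs(np.diff(x[idx])).sum() * (n - 1) / (steps * k)
            lengths.append(curve / k)
        lk.append(np.mean(lengths))
    # dimension = slope of log L(k) against log(1/k)
    slope, _ = np.polyfit(np.log(1.0 / ks), np.log(lk), 1)
    return slope

rng = np.random.default_rng(5)
fd_noise = higuchi_fd(rng.standard_normal(2000))   # expected near 2
fd_line = higuchi_fd(np.linspace(0.0, 1.0, 2000))  # expected near 1
```

A drop in the estimated dimension of a windowed sequence then signals increased persistence (correlation), which is the kind of pre-mainshock change the abstract reports.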

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lefeuvre, F.E.; Wrolstad, K.H.; Zou, Ke Shan

    Total and Unocal estimated sand-shale ratios in gas reservoirs from the upper Tertiary clastics of Myanmar. They separately used deterministic pre-stack and statistical post-stack seismic attribute analysis calibrated at two wells to objectively extrapolate the lithologies and reservoir properties several kilometers away from the wells. The two approaches were then integrated and led to a unique distribution of the sands and shales in the reservoir which fits the known regional geological model. For the sands, the fluid distributions (gas and brine) were also estimated, as well as the porosity, water saturation, thickness and clay content of the sands. This was made possible by using precise elastic modeling based on the Biot-Gassmann equation in order to integrate the effects of reservoir properties on seismic signatures.

  17. Physics-based forecasting of induced seismicity at Groningen gas field, the Netherlands

    NASA Astrophysics Data System (ADS)

    Dempsey, David; Suckale, Jenny

    2017-08-01

    Earthquakes induced by natural gas extraction from the Groningen reservoir, the Netherlands, put local communities at risk. Responsible operation of a reservoir whose gas reserves are of strategic importance to the country requires understanding of the link between extraction and earthquakes. We synthesize observations and a model for Groningen seismicity to produce forecasts for felt seismicity (M > 2.5) in the period February 2017 to 2024. Our model accounts for poroelastic earthquake triggering and rupture on the 325 largest reservoir faults, using an ensemble approach to model unknown heterogeneity and replicate earthquake statistics. We calculate probability distributions for key model parameters using a Bayesian method that incorporates the earthquake observations with a nonhomogeneous Poisson process. Our analysis indicates that the Groningen reservoir was not critically stressed prior to the start of production. Epistemic uncertainty and aleatoric uncertainty are incorporated into forecasts for three different future extraction scenarios. The largest expected earthquake was similar for all scenarios, with a 5% likelihood of exceeding M 4.0.
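The Bayesian step described above scores candidate parameters through a nonhomogeneous Poisson likelihood. A minimal sketch of that scoring, with an illustrative constant-rate model comparison on a synthetic catalogue (the paper's rate model, faults, and priors are not reproduced here):

```python
import numpy as np

def nhpp_loglik(times, rate_fn, t0, t1, n_grid=2001):
    """Log-likelihood of event times under a nonhomogeneous Poisson
    process with intensity rate_fn(t) on [t0, t1]:
    sum(log lambda(t_i)) - integral of lambda over [t0, t1]."""
    t = np.asarray(times, float)
    grid = np.linspace(t0, t1, n_grid)
    vals = rate_fn(grid)
    integral = np.sum((vals[:-1] + vals[1:]) * 0.5 * np.diff(grid))
    return np.log(rate_fn(t)).sum() - integral

# score three constant-rate candidates against a synthetic catalogue
rng = np.random.default_rng(2)
events = np.sort(rng.uniform(0.0, 10.0, size=50))   # ML rate is 5 per unit time
candidates = [2.0, 5.0, 9.0]
ll = [nhpp_loglik(events, lambda t, r=r: np.full_like(t, r), 0.0, 10.0)
      for r in candidates]
best = candidates[int(np.argmax(ll))]
```

In a Bayesian setting these log-likelihoods would be combined with priors over the model parameters to give the posterior distributions the abstract refers to.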

  18. A semi-empirical analysis of strong-motion peaks in terms of seismic source, propagation path, and local site conditions

    NASA Astrophysics Data System (ADS)

    Kamiyama, M.; O'Rourke, M. J.; Flores-Berrones, R.

    1992-09-01

    A new type of semi-empirical expression for scaling strong-motion peaks in terms of seismic source, propagation path, and local site conditions is derived. Peak acceleration, peak velocity, and peak displacement are analyzed in a similar fashion because they are interrelated. However, emphasis is placed on the peak velocity, which is a key ground motion parameter for lifeline earthquake engineering studies. With the help of seismic source theories, the semi-empirical model is derived using strong motions obtained in Japan. In the derivation, statistical considerations are used in the selection of the model itself and the model parameters. Earthquake magnitude M and hypocentral distance r are selected as independent variables, and dummy variables are introduced to identify the amplification factor due to individual local site conditions. The resulting semi-empirical expressions for the peak acceleration, velocity, and displacement are then compared with strong-motion data observed during three earthquakes in the U.S. and Mexico.
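The regression structure described here, magnitude and distance as independent variables plus site-condition dummy variables, can be sketched as an ordinary least-squares fit. Everything below (functional form, coefficients, site classes) is an illustrative assumption on synthetic data, not the paper's actual model:

```python
import numpy as np

# synthetic records: magnitude M, hypocentral distance r (km), site class
rng = np.random.default_rng(3)
n = 200
M = rng.uniform(5.0, 8.0, n)
r = rng.uniform(10.0, 200.0, n)
site = rng.integers(0, 3, n)                    # 3 hypothetical site classes
amp = np.array([0.0, 0.4, 0.9])                 # true site amplifications (ln units)
ln_pgv = (1.2 * M - 1.0 * np.log(r) + amp[site] - 2.0
          + 0.1 * rng.standard_normal(n))       # ln(peak velocity) + noise

# design matrix: intercept, M, ln r, and dummies for site classes 1 and 2
# (class 0 is the reference, absorbed into the intercept)
X = np.column_stack([np.ones(n), M, np.log(r),
                     (site == 1).astype(float), (site == 2).astype(float)])
coef, *_ = np.linalg.lstsq(X, ln_pgv, rcond=None)
# coef should recover roughly [-2.0, 1.2, -1.0, 0.4, 0.9]
```

The fitted dummy coefficients play the role of the paper's per-site amplification factors relative to the reference site class.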

  19. Synthesis of instrumentally and historically recorded earthquakes and studying their spatial statistical relationship (A case study: Dasht-e-Biaz, Eastern Iran)

    NASA Astrophysics Data System (ADS)

    Jalali, Mohammad; Ramazi, Hamidreza

    2018-06-01

    Earthquake catalogues are the main source of statistical seismology for long-term studies of earthquake occurrence. Therefore, studying spatiotemporal problems is important to reduce the related uncertainties in statistical seismology studies. A statistical tool, the time normalization method, has been used to revise the time-frequency relationship in one of the most active regions of Asia, Eastern Iran and western Afghanistan (a and b were calculated around 8.84 and 1.99 on the exponential scale, not the logarithmic scale). A geostatistical simulation method has been further utilized to reduce the uncertainties in the spatial domain, producing a representative synthetic catalogue with 5361 events. The synthetic database is classified using a Geographical Information System (GIS), based on simulated magnitudes, to reveal the underlying seismicity patterns. Although some regions with high seismicity correspond to known faults, significantly, as far as seismic patterns are concerned, the new method highlights possible locations of interest that have not been previously identified. It also reveals some previously unrecognized lineations and clusters of likely future strain release.

  20. Gas chimney detection based on improving the performance of combined multilayer perceptron and support vector classifier

    NASA Astrophysics Data System (ADS)

    Hashemi, H.; Tax, D. M. J.; Duin, R. P. W.; Javaherian, A.; de Groot, P.

    2008-11-01

    Seismic object detection is a relatively new field in which 3-D bodies are visualized and spatial relationships between objects of different origins are studied in order to extract geologic information. In this paper, we propose a method for finding an optimal classifier with the help of a statistical feature ranking technique and by combining different classifiers. The method, which has general applicability, is demonstrated here on a gas chimney detection problem. First, we evaluate a set of input seismic attributes extracted at locations labeled by a human expert using regularized discriminant analysis (RDA). In order to find the RDA score for each seismic attribute, forward and backward search strategies are used. Subsequently, two non-linear classifiers, a multilayer perceptron (MLP) and a support vector classifier (SVC), are run on the ranked seismic attributes. Finally, to capitalize on the intrinsic differences between the two classifiers, the MLP and SVC results are combined using the logical rules of maximum, minimum and mean. The proposed method optimizes the ranked feature space size and yields the lowest classification error in the final combined result. We will show that the logical minimum reveals gas chimneys that exhibit both the softness of the MLP and the resolution of the SVC classifiers.
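The combination step at the end of this abstract is simple to state concretely: the two classifiers' per-location chimney probabilities are merged element-wise. A minimal sketch with hypothetical posterior values (the trained MLP/SVC themselves are omitted):

```python
import numpy as np

def combine(p_mlp, p_svc, rule="min"):
    """Merge two classifiers' chimney-probability maps with a logical rule:
    'min' is conservative (both must agree), 'max' is permissive,
    'mean' averages the two opinions."""
    p = np.stack([p_mlp, p_svc])
    return {"min": p.min(axis=0),
            "max": p.max(axis=0),
            "mean": p.mean(axis=0)}[rule]

# hypothetical posterior probabilities at four sample locations
p_mlp = np.array([0.9, 0.2, 0.7, 0.4])
p_svc = np.array([0.8, 0.6, 0.3, 0.5])
p_min = combine(p_mlp, p_svc, "min")
p_max = combine(p_mlp, p_svc, "max")
p_mean = combine(p_mlp, p_svc, "mean")
```

The "logical minimum" the authors favor keeps only locations where both classifiers assign high probability, which is why it inherits the MLP's softness and the SVC's resolution at once.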

  1. Influence of fault steps on rupture termination of strike-slip earthquake faults

    NASA Astrophysics Data System (ADS)

    Li, Zhengfang; Zhou, Bengang

    2018-03-01

    A statistical analysis was completed on the rupture data of 29 historical strike-slip earthquakes across the world. The purpose of this study is to examine the effects of fault steps on the rupture termination of these events. The results show good correlations between the type and length of steps and the seismic rupture, and a poor correlation between the step number and the seismic rupture. For different magnitude intervals, the smallest widths of the fault steps (Lt) that can terminate the rupture propagation are variable: Lt = 3 km for Ms 6.5-6.9, Lt = 4 km for Ms 7.0-7.5, Lt = 6 km for Ms 7.5-8.0, and Lt = 8 km for Ms 8.0-8.5. The dilational fault step is easier to rupture through than the compressional fault step. The smallest widths of the fault step for rupture arrest can be used as an indicator to judge the scale of the rupture termination of seismic faults. This is helpful for research on fault segmentation, as well as for estimating the magnitude of potential earthquakes, and is thus of significance for the assessment of seismic risks.
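The magnitude-dependent step-width thresholds above lend themselves to a simple lookup. A sketch encoding the reported intervals (the abstract's ranges overlap at their endpoints, so the half-open binning below is an assumption of this illustration):

```python
def min_terminating_step_width(ms):
    """Smallest fault-step width Lt (km) reported to arrest rupture,
    per the magnitude intervals given in the abstract."""
    if 6.5 <= ms < 7.0:
        return 3.0
    if 7.0 <= ms < 7.5:
        return 4.0
    if 7.5 <= ms < 8.0:
        return 6.0
    if 8.0 <= ms <= 8.5:
        return 8.0
    raise ValueError("magnitude outside the studied range Ms 6.5-8.5")

def step_likely_terminates(ms, step_width_km):
    """A step at least as wide as Lt is taken as able to stop the rupture."""
    return step_width_km >= min_terminating_step_width(ms)
```

Such a rule is how the thresholds could feed fault-segmentation analysis: a mapped step narrower than Lt for the expected magnitude should not be relied on as a rupture barrier.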

  2. Quantitative Seismic Interpretation: Applying Rock Physics Tools to Reduce Interpretation Risk

    NASA Astrophysics Data System (ADS)

    Sondergeld, Carl H.

    This book is divided into seven chapters that cover rock physics, statistical rock physics, seismic inversion techniques, case studies, and work flows. On balance, the emphasis is on rock physics. Included are 56 color figures that greatly help in the interpretation of more complicated plots and displays.The domain of rock physics falls between petrophysics and seismics. It is the basis for interpreting seismic observations and therefore is pivotal to the understanding of this book. The first two chapters are dedicated to this topic (109 pages).

  3. Statistical validation of earthquake related observations

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.

    2011-12-01

    The confirmed fractal nature of earthquakes and their distribution in space and time implies that many traditional estimations of seismic hazard (from term-less to short-term ones) are usually based on erroneous assumptions of easily tractable or, conversely, delicately designed models. The widespread practice of deceptive modeling, considered a "reasonable proxy" of the natural seismic process, leads to seismic hazard assessments of unknown quality, whose errors propagate non-linearly into the resulting estimates of risk and, eventually, into unexpected societal losses of unacceptable level. Studies aimed at the forecast/prediction of earthquakes must include validation in retrospective (at least) and, eventually, prospective tests. In the absence of such control a suggested "precursor/signal" remains a "candidate", whose link to the target seismic event is a model assumption. Predicting in advance is the only decisive test of forecasts/predictions; therefore, the score-card of any "established precursor/signal", represented by the empirical probabilities of alarms and failures-to-predict achieved in prospective testing, must prove statistical significance by rejecting the null hypothesis of random coincidental occurrence in advance of target earthquakes. We reiterate the suggestion of the so-called "Seismic Roulette" null hypothesis as the most adequate undisturbed random alternative accounting for the empirical spatial distribution of earthquakes: (i) consider a roulette wheel with as many sectors as the number of earthquake locations from a sample catalog representing the seismic locus, one sector per location; (ii) make your bet according to the prediction (i.e., determine which locations are inside the area of alarm, and put one chip in each of the corresponding sectors); (iii) Nature turns the wheel; (iv) accumulate statistics of wins and losses along with the number of chips spent. 
If a precursor in charge of prediction exposes an imperfection of the Seismic Roulette then, keeping in mind the optional "antipodal strategy", one can make the predictions efficient, so that the wins systematically outscore the losses. This sounds easy; however, many precursor phenomena lack rigorous control and, in many cases, even the necessary precondition of any scientific study, i.e., an unambiguous definition of the "precursor/signal". On the other hand, understanding the complexity of the seismic process, along with its non-stationary, hierarchically organized behavior, has already led to a reproducible intermediate-term middle-range earthquake prediction technique that has passed control tests in forward real-time applications during at least the last two decades. In particular, the place and time of each of the mega-earthquakes of 27 February 2010 in Chile and 11 March 2011 in Japan were recognized as being in a state of increased probability in advance of their occurrence in the Global Test of the algorithms M8 and MSc, ongoing since 1992. This evidence, in conjunction with a retrospective analysis of the seismic activity preceding the 26 December 2004 Indian Ocean event and other mega-earthquakes of the 20th century, gives grounds for assuming that the algorithms, of validated effectiveness in the magnitude ranges M7.5+ and M8.0+, are applicable to predicting mega-earthquakes as well.
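The Seismic Roulette null hypothesis can be tested by simulation: under the null, each target earthquake independently lands in the alarm set with probability equal to the fraction of catalogue locations (roulette sectors) the alarm covers. A minimal Monte Carlo sketch with a hypothetical score-card (the numbers are illustrative, not from the M8/MSc test):

```python
import numpy as np

def roulette_p_value(n_targets, n_hits, covered_fraction,
                     n_sim=100_000, seed=0):
    """Monte Carlo p-value for the Seismic Roulette null hypothesis:
    probability of scoring at least n_hits alarms out of n_targets
    purely by chance, given the fraction of sectors covered."""
    rng = np.random.default_rng(seed)
    sims = rng.binomial(n_targets, covered_fraction, size=n_sim)
    return (sims >= n_hits).mean()

# hypothetical score-card: 10 target earthquakes, 8 inside alarms that
# covered 30% of the catalogue's locations
p = roulette_p_value(n_targets=10, n_hits=8, covered_fraction=0.3)
```

A small p-value rejects coincidental success; weighting sectors by the empirical spatial distribution of seismicity, as the abstract prescribes, amounts to choosing the covered fraction from the catalogue rather than from map area.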

  4. On the likelihood of post-perovskite near the core-mantle boundary: A statistical interpretation of seismic observations

    NASA Astrophysics Data System (ADS)

    Cobden, Laura; Mosca, Ilaria; Trampert, Jeannot; Ritsema, Jeroen

    2012-11-01

    Recent experimental studies indicate that perovskite, the dominant lower mantle mineral, undergoes a phase change to post-perovskite at high pressures. However, it has been unclear whether this transition occurs within the Earth's mantle, due to uncertainties in both the thermochemical state of the lowermost mantle and the pressure-temperature conditions of the phase boundary. In this study we compare the relative fit to global seismic data of mantle models which do and do not contain post-perovskite, following a statistical approach. Our data comprise more than 10,000 Pdiff and Sdiff travel-times, global in coverage, from which we extract the global distributions of dln VS and dln VP near the core-mantle boundary (CMB). These distributions are sensitive to the underlying lateral variations in mineralogy and temperature even after seismic uncertainties are taken into account, and are ideally suited for investigating the likelihood of the presence of post-perovskite. A post-perovskite-bearing CMB region provides a significantly closer fit to the seismic data than a post-perovskite-free CMB region on both a global and regional scale. These results complement previous local seismic reflection studies, which have shown a consistency between seismic observations and the physical properties of post-perovskite inside the deep Earth.

  5. Progressive reactivation of the volcanic plumbing system beneath Tolbachik volcano (Kamchatka, Russia) revealed by long-period seismicity

    NASA Astrophysics Data System (ADS)

    Frank, William B.; Shapiro, Nikolaï M.; Gusev, Alexander A.

    2018-07-01

    After lying dormant for 36 yr, the Tolbachik volcano of the Klyuchevskoy group started to erupt on 27 November 2012. We investigate the preparatory phase of this eruption via a statistical analysis of the temporal behavior of long-period (LP) earthquakes that occurred beneath this volcanic system. The LP seismicity occurs close to the surface beneath the main volcanic edifices and at 30 km depth in the vicinity of a deep magmatic reservoir. The deep LP earthquakes and those beneath the Klyuchevskoy volcano occur quasi-periodically, while the LP earthquakes beneath Tolbachik are clustered in time. As the seismicity rate increased beneath Tolbachik days before the eruption, the level of the time clustering decreased. We interpret this as a manifestation of the evolution of the volcano plumbing system. We suggest that when a plumbing system awakes after quiescence, multiple cracks and channels are reactivated simultaneously and their interaction results in the strong time clustering of LP earthquakes. With time, this network of channels and cracks evolves into a more stable state with an overall increased permeability, where fluids flow uninhibited throughout the plumbing system except for a few remaining impediments that continue to generate seismic radiation. The inter-seismic source interaction and the level of earthquake time clustering in this latter state is weak. This scenario suggests that the observed evolution of the statistical behavior of the shallow LP seismicity beneath Tolbachik is an indicator of the reactivation and consolidation of the near-surface plumbing system prior to the Tolbachik eruption. The parts of the plumbing system above the deep magmatic reservoir and beneath the Klyuchevskoy volcano remain in nearly permanent activity, as demonstrated by the continuous occurrence of the deep LP earthquakes and very frequent Klyuchevskoy eruptions. 
This implies that these parts of the plumbing system remain in a stable permeable state and contain a few weakly interacting seismogenic sources. Our results provide new constraints on future mechanical models of the magmatic plumbing systems and demonstrate that the level of time clustering of LP earthquakes can be a useful parameter to infer information about the state of the plumbing system.
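The "level of time clustering" contrasted above (quasi-periodic deep LP events versus clustered Tolbachik events) is commonly quantified with the coefficient of variation of interevent times. A sketch on synthetic sequences (this statistic is a standard proxy, not necessarily the exact measure the authors used):

```python
import numpy as np

def interevent_cv(times):
    """Coefficient of variation of interevent times: ~1 for a Poisson
    process, >1 for time-clustered seismicity, <1 for quasi-periodic."""
    dt = np.diff(np.sort(np.asarray(times, float)))
    return dt.std() / dt.mean()

rng = np.random.default_rng(4)
t_poisson = np.cumsum(rng.exponential(1.0, 2000))      # random occurrence
t_periodic = np.cumsum(rng.normal(1.0, 0.1, 2000))     # quasi-periodic
t_clustered = np.cumsum(rng.exponential(rng.choice([0.1, 5.0], 2000)))
cv_p, cv_q, cv_c = map(interevent_cv, (t_poisson, t_periodic, t_clustered))
```

In these terms, the abstract's observation is that the CV of Tolbachik's shallow LP events decreased toward the Poisson value as the eruption approached, while the deep LP events stayed below it.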

  6. Physics-based and statistical earthquake forecasting in a continental rift zone: the case study of Corinth Gulf (Greece)

    NASA Astrophysics Data System (ADS)

    Segou, Margarita

    2016-01-01

    I perform a retrospective forecast experiment in the most rapidly extending continental rift worldwide, the western Corinth Gulf (wCG, Greece), aiming to predict shallow seismicity (depth <15 km) with magnitude M ≥ 3.0 for the time period between 1995 and 2013. I compare two short-term earthquake clustering models, based on epidemic-type aftershock sequence (ETAS) statistics, four physics-based (CRS) models, combining static stress change estimations and the rate-and-state laboratory law, and one hybrid model. For the latter models, I incorporate the stress changes imparted by 31 earthquakes with magnitude M ≥ 4.5 in the extended area of wCG. Special attention is given to the 3-D representation of active faults, acting as potential receiver planes for the estimation of static stress changes. I use reference seismicity between 1990 and 1995, corresponding to the learning phase of the physics-based models, and I evaluate the forecasts for six months following the 1995 M = 6.4 Aigio earthquake using log-likelihood performance metrics. For the ETAS realizations, I use seismic events with magnitude M ≥ 2.5 within daily update intervals to enhance their predictive power. For assessing the role of background seismicity, I implement a stochastic reconstruction (aka declustering) aiming to answer whether M > 4.5 earthquakes correspond to spontaneous events and to identify, if possible, different triggering characteristics between aftershock sequences and swarm-type seismicity periods. 
I find that: (1) ETAS models outperform CRS models in most time intervals, achieving a very low rejection ratio RN = 6 per cent when I test their efficiency to forecast the total number of events inside the study area, (2) the best rejection ratio for CRS models reaches RN = 17 per cent when I use varying target depths and receiver plane geometry, (3) 75 per cent of the 1995 Aigio aftershocks that occurred within the first month can be explained by static stress changes, (4) highly variable performance of both statistical and physical models is suggested by large confidence intervals of information gain per earthquake and (5) generic ETAS models can adequately predict the temporal evolution of seismicity during swarms. Furthermore, stochastic reconstruction of seismicity makes possible the identification of different triggering processes between specific seismic crises (2001, 2003-04, 2006-07) and the 1995 aftershock sequence. I find that: (1) seismic events with M ≥ 5.0 are not part of a preceding earthquake cascade, since they are characterized by a high probability of being a background event (average Pback > 0.8) and (2) triggered seismicity within swarms is characterized by lower event productivity when compared with the corresponding value during aftershock sequences. I conclude that physics-based models contribute to the determination of the `new-normal' seismicity rate at longer time intervals and that their joint implementation with statistical models is beneficial for future operational forecast systems.
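The ETAS models compared above rest on a conditional intensity: a background rate plus magnitude-scaled Omori-type contributions from every earlier event. A minimal sketch with hypothetical parameter values (the study's fitted parameters are not given in the abstract):

```python
import numpy as np

def etas_intensity(t, event_times, event_mags, mu=0.2, K=0.05,
                   alpha=1.0, c=0.01, p=1.1, m0=2.5):
    """ETAS conditional intensity lambda(t): background rate mu plus
    Omori-law aftershock contributions from all events before t,
    each scaled exponentially by its magnitude above the cutoff m0."""
    event_times = np.asarray(event_times, float)
    event_mags = np.asarray(event_mags, float)
    past = event_times < t
    dt = t - event_times[past]
    trig = K * np.exp(alpha * (event_mags[past] - m0)) / (dt + c) ** p
    return mu + trig.sum()

# intensity shortly after a hypothetical M 6.4 mainshock vs long after
times = np.array([0.0])
mags = np.array([6.4])
lam_soon = etas_intensity(0.1, times, mags)     # elevated by the mainshock
lam_late = etas_intensity(100.0, times, mags)   # decayed back toward mu
```

Daily-update ETAS forecasting, as described in the abstract, amounts to re-evaluating this intensity over the region each day as new M ≥ 2.5 events enter the catalogue.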

  7. A new concept in seismic landslide hazard analysis for practical application

    NASA Astrophysics Data System (ADS)

    Lee, Chyi-Tyi

    2017-04-01

    A seismic landslide hazard model can be constructed using a deterministic approach (Jibson et al., 2000) or a statistical approach (Lee, 2014). Both approaches yield the landslide spatial probability under a certain return-period earthquake. In the statistical approach, our recent study found that there are common patterns among different landslide susceptibility models of the same region. The common susceptibility reflects the relative stability of slopes in a region; higher susceptibility indicates lower stability. Using the common susceptibility together with an earthquake-event landslide inventory and a map of topographically corrected Arias intensity, we can build a relationship among probability of failure, Arias intensity and susceptibility. This relationship can immediately be used to construct a seismic landslide hazard map for the region in which the empirical relationship was built. If the common susceptibility model is further normalized and the empirical relationship is built with normalized susceptibility, then the relationship may be practically applied to a different region with similar tectonic environment and climate conditions. This is useful when a region has no existing earthquake-induced landslide data with which to train a susceptibility model and build the relationship. It is worth mentioning that a rain-induced landslide susceptibility model has a common pattern similar to that of the earthquake-induced landslide susceptibility in the same region, and is usable to build the relationship with an earthquake-event landslide inventory and a map of Arias intensity. These will be introduced with examples in the meeting.
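    A relationship among probability of failure, Arias intensity and susceptibility is often cast as a logistic regression; the sketch below assumes that form, with hypothetical coefficients standing in for values that would be regressed from an event landslide inventory:

```python
import numpy as np

def failure_probability(arias_intensity, susceptibility, a=-6.0, b=1.5, c=4.0):
    """Logistic relation between probability of slope failure, topographically
    corrected Arias intensity (m/s) and normalized susceptibility (0-1).
    The coefficients a, b, c are hypothetical; in practice they are regressed
    from an earthquake-event landslide inventory."""
    z = a + b * arias_intensity + c * susceptibility
    return 1.0 / (1.0 + np.exp(-z))

# Stronger shaking on a more susceptible slope yields a higher failure probability
p_strong = failure_probability(3.0, 0.9)
p_weak = failure_probability(1.0, 0.2)
```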

  8. Statistical analysis of the induced Basel 2006 earthquake sequence: introducing a probability-based monitoring approach for Enhanced Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Bachmann, C. E.; Wiemer, S.; Woessner, J.; Hainzl, S.

    2011-08-01

    Geothermal energy is becoming an important clean energy source; however, the stimulation of a reservoir for an Enhanced Geothermal System (EGS) is associated with seismic risk due to induced seismicity. Seismicity occurring due to water injection at depth has to be well recorded and monitored. To mitigate the seismic risk of a damaging event, an appropriate alarm system needs to be in place for each individual experiment. In recent experiments, the so-called traffic-light alarm system, based on public response, local magnitude and peak ground velocity, was used. We aim to improve the pre-defined alarm system by introducing a probability-based approach; we retrospectively model the ongoing seismicity in real time with multiple statistical forecast models and then translate the forecasts to seismic hazard in terms of probabilities of exceeding a ground motion intensity level. One class of models accounts for the water injection rate, the main parameter that can be controlled by the operators during an experiment. By translating the models into time-varying probabilities of exceeding various intensity levels, we provide tools which are well understood by decision makers and can be used to determine non-exceedance thresholds during a reservoir stimulation; this, however, remains an entrepreneurial or political decision of the responsible project coordinators. We introduce forecast models based on the data set of an EGS experiment in the city of Basel. Between 2006 December 2 and 8, approximately 11 500 m3 of water was injected into a 5-km-deep well at high pressures. A six-sensor borehole array was installed by the company Geothermal Explorers Limited (GEL) at depths between 300 and 2700 m around the well to monitor the induced seismicity. The network recorded approximately 11 200 events during the injection phase, more than 3500 of which were located. 
With the traffic-light system, actions were implemented after an ML 2.7 event: the water injection was reduced and then stopped after another ML 2.5 event. A few hours later, an earthquake with ML 3.4, felt within the city, occurred, which led to bleed-off of the well. A risk study was later issued with the outcome that the experiment could not be resumed. We analyse the statistical features of the sequence and show that it is well modelled with the Omori-Utsu law following the termination of water injection. Based on this model, the sequence will take 31+29/-14 years to reach the background level. We introduce statistical models based on Reasenberg and Jones and Epidemic Type Aftershock Sequence (ETAS) models, commonly used to model aftershock sequences. We compare and test different model setups to simulate the sequences, varying the number of fixed and free parameters. For one class of the ETAS models, we account for the flow rate at the injection borehole. We test the models against the observed data with standard likelihood tests and find that the ETAS model accounting for the flow rate performs best. Such a model may in future serve as a valuable tool for designing probabilistic alarm systems for EGS experiments.
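    The estimate of the time needed for an induced sequence to decay to the background rate follows directly from the modified Omori-Utsu law; a minimal sketch, with hypothetical parameter values (K, c, p and the background rate mu are not taken from the Basel study):

```python
def omori_rate(t, K, c, p):
    """Modified Omori-Utsu aftershock rate (events/day) at time t days after shut-in."""
    return K / (c + t) ** p

def time_to_background(K, c, p, mu):
    """Time (days) at which the Omori-Utsu rate decays to the background rate mu,
    obtained by solving K / (c + t)**p = mu for t."""
    return (K / mu) ** (1.0 / p) - c

# Hypothetical parameters (not the Basel values)
K, c, p, mu = 100.0, 0.1, 1.1, 0.01
t_bg = time_to_background(K, c, p, mu)  # days until the background level is reached
```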

  9. On the consistency of tomographically imaged lower mantle slabs

    NASA Astrophysics Data System (ADS)

    Shephard, Grace E.; Matthews, Kara J.; Hosseini, Kasra; Domeier, Mathew

    2017-04-01

    Over the last few decades numerous seismic tomography models have been published, each constructed with choices of data input, parameterization and reference model. The broader geoscience community is increasingly utilizing these models, or a selection thereof, to interpret Earth's mantle structure and processes. It follows that seismically identified remnants of subducted slabs have been used to validate, test or refine relative plate motions, absolute plate reference frames, and mantle sinking rates. With an increasing number of models to include, or exclude, the question arises: how robust is a given positive seismic anomaly, inferred to be a slab, across a given suite of tomography models? Here we generate a series of "vote maps" for the lower mantle by comparing 14 seismic tomography models, comprising 7 S-wave and 7 P-wave models. Considerations include the retention or removal of the mean, the use of a consistent or variable reference model, the statistical value which defines the slab "contour", and the effect of depth interpolation. Preliminary results will be presented that address the depth, location and degree of agreement between seismic tomography models, both for the 14 models combined and between the P-wave and S-wave subsets. The analysis also permits a broader discussion of slab volumes and subduction flux. Whilst the location and geometry of some slabs match documented regions of long-lived subduction, other features do not, illustrating the importance of a robust approach to slab identification.
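    A vote map of the kind described can be sketched in a few lines: each model's anomaly grid is thresholded at a chosen contour and the per-node votes are summed. The random grids below are placeholders for real tomography models:

```python
import numpy as np

rng = np.random.default_rng(0)
# Placeholder stack: 14 tomography models sampled on a common 2-D grid slice
models = rng.normal(0.0, 1.0, size=(14, 6, 8))

def vote_map(models, contour=0.0):
    """Per grid node, count how many models show a positive (fast) anomaly
    exceeding the chosen contour value."""
    return (models > contour).sum(axis=0)

votes = vote_map(models, contour=0.5)  # high vote counts mark robust slab candidates
```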

  10. Depth-Dependent Earthquake Properties Beneath Long-Beach, CA: Implications for the Rheology at the Brittle-Ductile Transition Zone

    NASA Astrophysics Data System (ADS)

    Inbal, A.; Clayton, R. W.; Ampuero, J. P.

    2015-12-01

    Except for a few localities, seismicity along faults in southern California is generally confined to depths shallower than 15 km. Among faults hosting deep seismicity, the Newport-Inglewood Fault (NIF), which traverses the Los Angeles basin, has an exceptionally mild surface expression and low deformation rates. Moreover, the NIF structure is not as well resolved as other, less well instrumented faults because of poor signal-to-noise ratio. Here we use data from three temporary dense seismic arrays, which were deployed for exploration purposes and contain up to several thousand vertical geophones, to investigate the properties of deep seismicity beneath Long Beach (LB), Compton and Santa Fe Springs (SFS). The latter is located 15 km northeast of the NIF, presumably above a major detachment fault underthrusting the basin. Event detection is carried out using a new approach for microseismic multi-channel picking, in which downward-continued data are back-projected onto the volume beneath the arrays, and locations are derived from statistical analysis of back-projection images. Our technique reveals numerous, previously undetected events along the NIF, and confirms the presence of an active shallow structure gently dipping to the north beneath SFS. Seismicity characteristics vary along the NIF strike and dip. While LB seismicity is uncorrelated with the mapped trace of the NIF, Compton seismicity illuminates a sub-vertical fault that extends down to about 20 km. This result, along with the reported high flux of mantle helium along the NIF (Boles et al., 2015), suggests that the NIF is deeply rooted and acts as a major conduit for mantle fluids. We find that the LB size distribution obeys the typical power law at shallow depths, but falls off exponentially for events occurring below 20 km. 
Because deep seismicity occurs uniformly beneath LB, this transition is attributed to a reduction in seismic asperity density with increasing depth, consistent with a transition to a diffuse deformation regime.

  11. Oligo-Miocene Alpine Sediment Routing from Integrated Analysis of Seismic-Reflection Data and Detrital Zircon U-Pb Geochronology

    NASA Astrophysics Data System (ADS)

    Hubbard, S. M.; Sharman, G.; Covault, J. A.

    2014-12-01

    We integrate detrital zircon geochronology and 3D seismic-reflection data to reconstruct Oligo-Miocene paleogeography and sediment routing from the Alpine hinterland to the Austrian Molasse foreland basin. Three-dimensional seismic-reflection data image a network of deepwater tributaries and a long-lived (>8 Ma) foredeep-axial channel belt through which predominantly southerly and westerly turbidity currents are interpreted to have transported Alpine detritus >100 km. We analyzed 793 detrital zircon grains from ten sandstone samples collected from the seismically mapped network of channel fill. Grain age populations correspond with major Alpine orogenic cycles: the Cadomian (750-530 Ma), the Caledonian (500-400 Ma), and the Variscan orogenies (350-250 Ma). Additional age populations correspond with Eocene-Oligocene Periadriatic magmatism (40-30 Ma) and pre-Alpine, Precambrian sources >750 Ma. The abundances of these age populations vary between samples. Sediment that entered the foredeep-axial channel belt from the west (freshwater Molasse) and southwest (Inntal fault zone) is characterized by statistically indistinguishable, well-distributed detrital zircon ages. Sandstone from a shallow marine unit that was deposited proximal to the northern basin margin consists of >75% Variscan (350-300 Ma) zircon, which is believed to have originated from the Bohemian Massif to the north. Mixing calculations based on the Kolmogorov-Smirnov statistic suggest that the Alpine fold-thrust belt was an important source of detritus to the deepwater Molasse basin. We document east-to-west provenance dilution within the axial channel belt via one or more southern tributaries. Our results have important implications for sediment dispersal patterns within continental-scale orogens, including the relative role of longitudinal versus transverse sediment delivery in peripheral foreland basins.
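    The Kolmogorov-Smirnov comparison of detrital zircon age spectra can be sketched as a two-sample test; the synthetic age populations below are hypothetical, not the samples analyzed in this study:

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)
# Hypothetical U-Pb ages (Ma): both samples share a Variscan (~330 Ma) peak,
# but the second also carries a Cadomian (~600 Ma) population
sample_axial = rng.normal(330.0, 20.0, 400)
sample_north = np.concatenate([rng.normal(330.0, 20.0, 300),
                               rng.normal(600.0, 30.0, 100)])

stat, p_value = ks_2samp(sample_axial, sample_north)
# A small p-value means the two age distributions are statistically distinguishable
```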

  12. Identifying tectonic parameters that influence tsunamigenesis

    NASA Astrophysics Data System (ADS)

    van Zelst, Iris; Brizzi, Silvia; van Dinther, Ylona; Heuret, Arnauld; Funiciello, Francesca

    2017-04-01

    The role of tectonics in tsunami generation is at present poorly understood. However, the fact that some regions produce more tsunamis than others indicates that tectonics could influence tsunamigenesis. Here, we complement a global earthquake database that contains geometrical, mechanical, and seismicity parameters of subduction zones with tsunami data. We statistically analyse the database to identify the tectonic parameters that affect tsunamigenesis. The Pearson product-moment correlation coefficients reveal high positive correlations of 0.65 between, amongst others, the maximum water height of tsunamis and the seismic coupling in a subduction zone. However, these correlations are mainly caused by outliers. Spearman's rank correlation coefficient yields more robust correlations of 0.60 between the number of tsunamis in a subduction zone and both subduction velocity (positive correlation) and sediment thickness at the trench (negative correlation). Interestingly, there is a positive correlation between the latter and tsunami magnitude. In an effort towards multivariate statistics, a binary decision tree analysis is conducted with one variable; however, this shows that the data are too scarce. To complement this limited amount of data and to assess the physical causality of the tectonic parameters with regard to tsunamigenesis, we conduct a numerical study of the most promising parameters using a geodynamic seismic cycle model. We show that an increase in sediment thickness on the subducting plate results in a shift in seismic activity from outer-rise normal faults to splay faults. We also show that the splay fault is the preferred rupture path for a strongly velocity-strengthening friction regime in the shallow part of the subduction zone, which increases the tsunamigenic potential. A larger updip limit of the seismogenic zone results in larger vertical surface displacement.
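    The contrast between Pearson's and Spearman's coefficients noted above is easy to demonstrate: a single extreme outlier can produce a high Pearson correlation where the rank-based Spearman correlation stays modest. The data below are hypothetical:

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

# Hypothetical subduction-zone measurements: essentially no monotone trend,
# plus one extreme outlier in both variables
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 50.0])
y = np.array([2.1, 1.9, 2.5, 2.0, 2.3, 1.8, 2.2, 9.0])

r_pearson, _ = pearsonr(x, y)    # inflated by the single outlier
r_spearman, _ = spearmanr(x, y)  # rank-based, hence more robust
```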

  13. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

    NASA Astrophysics Data System (ADS)

    Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.

    2016-06-01

    We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used in place of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating separately subduction and background (crustal) earthquakes allows for optimal use of available information and for avoiding significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, with different proportions that depend on source-target position and tsunami intensity; (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.

  14. Differentiating induced and natural seismicity using space-time-magnitude statistics applied to the Coso Geothermal field

    USGS Publications Warehouse

    Schoenball, Martin; Davatzes, Nicholas C.; Glen, Jonathan M. G.

    2015-01-01

    A remarkable characteristic of earthquakes is their clustering in time and space, displaying their self-similarity. It remains to be tested whether natural and induced earthquakes share the same behavior. We study natural and induced earthquakes comparatively in the same tectonic setting at the Coso Geothermal Field. Covering the preproduction and coproduction periods from 1981 to 2013, we analyze interevent times, spatial dimension, and frequency-size distributions for natural and induced earthquakes. Individually, these distributions are statistically indistinguishable. Determining the distribution of nearest-neighbor distances in a combined space-time-magnitude metric lets us identify clear differences between the two kinds of seismicity. Compared to natural earthquakes, induced earthquakes feature a larger population of background seismicity and nearest neighbors at large magnitude-rescaled times and small magnitude-rescaled distances. Local stress perturbations induced by field operations appear to be strong enough to drive local faults through several seismic cycles and reactivate them after time periods on the order of a year.
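    The combined space-time-magnitude metric follows the nearest-neighbor distance of the kind introduced by Zaliapin and co-workers; a minimal sketch, with hypothetical b-value, fractal dimension and event pairs (not fitted to the Coso catalog):

```python
import numpy as np

def nn_distance(t_parent, t_child, r_km, mag_parent, b=1.0, d_f=1.6):
    """Space-time-magnitude nearest-neighbor distance in the style of Zaliapin:
    eta = dt * r**d_f * 10**(-b * m_parent). Small eta suggests a triggered pair.
    The b and d_f values here are typical assumptions, not fitted parameters."""
    dt = t_child - t_parent            # interevent time (e.g., years)
    if dt <= 0.0:
        return np.inf                  # only later events can be offspring
    return dt * (r_km ** d_f) * 10.0 ** (-b * mag_parent)

# A pair close in space-time after a large parent vs. a distant, late pair
eta_triggered = nn_distance(0.0, 0.01, 2.0, 5.0)
eta_background = nn_distance(0.0, 1.0, 100.0, 3.0)
```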

  15. Accelerated Seismic Release and Related Aspects of Seismicity Patterns on Earthquake Faults

    NASA Astrophysics Data System (ADS)

    Ben-Zion, Y.; Lyakhovsky, V.

    Observational studies indicate that large earthquakes are sometimes preceded by phases of accelerated seismic release (ASR) characterized by cumulative Benioff strain following a power-law time-to-failure relation with a term (tf - t)^m, where tf is the failure time of the large event and observed values of m are close to 0.3. We discuss properties of ASR and related aspects of seismicity patterns associated with several theoretical frameworks. The subcritical crack growth approach, developed to describe deformation on a crack prior to the occurrence of dynamic rupture, predicts great variability and low asymptotic values of the exponent m that are not compatible with observed ASR phases. Statistical physics studies assuming that system-size failures in a deforming region correspond to critical phase transitions predict the establishment of long-range correlations of dynamic variables and power-law statistics before large events. Using stress and earthquake histories simulated by the model of Ben-Zion (1996) for a discrete fault with quenched heterogeneities in a 3-D elastic half space, we show that large model earthquakes are associated with nonrepeating cyclical establishment and destruction of long-range stress correlations, accompanied by nonstationary cumulative Benioff strain release. We then analyze results associated with a regional lithospheric model consisting of a seismogenic upper crust governed by the damage rheology of Lyakhovsky et al. (1997) over a viscoelastic substrate. We demonstrate analytically for a simplified 1-D case that the employed damage rheology leads to a singular power-law equation for strain proportional to (tf - t)^(-1/3), and a nonsingular power-law relation for cumulative Benioff strain proportional to (tf - t)^(1/3). A simple approximate generalization of the latter for regional cumulative Benioff strain is obtained by adding to the result a linear function of time representing a stationary background release. 
To go beyond the analytical expectations, we examine results generated by various realizations of the regional lithospheric model producing seismicity that follows characteristic frequency-size statistics, the Gutenberg-Richter power-law distribution, and mode-switching activity. We find that phases of ASR exist only when the seismicity preceding a given large event has broad frequency-size statistics. In such cases the simulated ASR phases can be fitted well by the singular analytical relation with m = -1/3, the nonsingular equation with m = 0.2, and the generalized version of the latter including a linear term with m = 1/3. The good fits obtained with all three relations highlight the difficulty of deriving reliable information on functional forms and parameter values from such data sets. The activation process in the simulated ASR phases is found to be accommodated both by increasing rates of moderate events and increasing average event size, with the former starting a few years earlier than the latter. The lack of ASR in portions of the seismicity not having broad frequency-size statistics may explain why some large earthquakes are preceded by ASR and others are not. The results suggest that observations of moderate and large events contain two complementary end-member predictive signals on the time of future large earthquakes. In portions of seismicity following the characteristic earthquake distribution, such information exists directly in the associated quasi-periodic temporal distribution of large events. In portions of seismicity having broad frequency-size statistics with random or clustered temporal distribution of large events, the ASR phases have predictive information. The extent to which natural seismicity may be understood in terms of these end-member cases remains to be clarified. 
Continuing studies of evolving stress and other dynamic variables in model calculations combined with advanced analyses of simulated and observed seismicity patterns may lead to improvements in existing forecasting strategies.
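    Fitting the power-law time-to-failure relation to cumulative Benioff strain can be sketched with a nonlinear least-squares fit; the synthetic series below assumes a known failure time tf and the nonsingular form with m = 0.3 (illustrative values, not the paper's simulations):

```python
import numpy as np
from scipy.optimize import curve_fit

t_f = 10.0  # failure time of the large event (assumed known in this sketch)

def asr_model(t, A, B, m):
    """Nonsingular cumulative Benioff strain: eps(t) = A - B * (t_f - t)**m."""
    return A - B * (t_f - t) ** m

# Synthetic noiseless ASR series with m = 0.3
t = np.linspace(0.0, 9.5, 200)
strain = asr_model(t, 5.0, 2.0, 0.3)

popt, _ = curve_fit(asr_model, t, strain, p0=[4.0, 1.0, 0.5])
A_fit, B_fit, m_fit = popt
```

    With noisy data and an unknown tf, the fit becomes much less stable, which is consistent with the difficulty, noted above, of discriminating between the candidate functional forms.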

  16. Possibility of Earthquake-prediction by analyzing VLF signals

    NASA Astrophysics Data System (ADS)

    Ray, Suman; Chakrabarti, Sandip Kumar; Sasmal, Sudipta

    2016-07-01

    Prediction of seismic events is one of the most challenging tasks for the scientific community. The conventional way to predict earthquakes is to monitor crustal structure movements, though this method has not yet yielded satisfactory results and fails to give any short-term prediction. Recently, it has been noticed that prior to a seismic event a huge amount of energy is released, which may create disturbances in the lower part of the D-layer/E-layer of the ionosphere. This ionospheric disturbance may be used as a precursor of earthquakes. Since VLF radio waves propagate inside the waveguide formed by the lower ionosphere and Earth's surface, this signal may be used to identify ionospheric disturbances due to seismic activity. We have analyzed VLF signals to find the correlations, if any, between VLF signal anomalies and seismic activities. We have performed both case-by-case studies and a statistical analysis using a whole year of data. With both methods we found that the night-time amplitude of VLF signals fluctuated anomalously three days before the seismic events. We also found that the terminator time of the VLF signals shifted anomalously towards night-time a few days before major seismic events. We calculate the D-layer preparation time and D-layer disappearance time from the VLF signals, and observe that both become anomalously high 1-2 days before seismic events. We also found strong evidence indicating that it may be possible in future to predict the location of earthquake epicenters by analyzing VLF signals for multiple propagation paths.

  17. 75 FR 13610 - Office of New Reactors; Interim Staff Guidance on Implementation of a Seismic Margin Analysis for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-22

    ... Staff Guidance on Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic... Seismic Margin Analysis for New Reactors Based on Probabilistic Risk Assessment,'' (Agencywide Documents.../COL-ISG-020 ``Implementation of a Seismic Margin Analysis for New Reactors Based on Probabilistic Risk...

  18. Using meta-information of a posteriori Bayesian solutions of the hypocentre location task for improving accuracy of location error estimation

    NASA Astrophysics Data System (ADS)

    Debski, Wojciech

    2015-06-01

    The spatial location of sources of seismic waves is one of the first tasks when transient waves from natural (uncontrolled) sources are analysed in many branches of physics, including seismology and oceanology, to name a few. Source activity and its spatial variability in time, the geometry of the recording network, and the complexity and heterogeneity of the wave velocity distribution are all factors influencing the performance of location algorithms and the accuracy of the achieved results. Although estimating the location of earthquake foci is relatively simple, quantitative estimation of the location accuracy is a challenging task even if the probabilistic inverse method is used, because it requires knowledge of the statistics of observational, modelling and a priori uncertainties. In this paper, we address this task when the statistics of observational and/or modelling errors are unknown. This common situation requires the introduction of a priori constraints on the likelihood (misfit) function, which significantly influence the estimated errors. Based on the results of an analysis of 120 seismic events from the Rudna copper mine operating in southwestern Poland, we propose an approach based on an analysis of Shannon's entropy calculated for the a posteriori distribution. We show that this meta-characteristic of the a posteriori distribution carries some information on the uncertainties of the solution found.
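    Shannon's entropy of a discretized a posteriori distribution can be computed as sketched below; as expected, a sharply peaked (well-constrained) posterior has lower entropy than a broad one. The Gaussian posteriors are illustrative only:

```python
import numpy as np

def shannon_entropy(pdf, dx):
    """Differential Shannon entropy of a discretized 1-D a posteriori PDF."""
    p = pdf / (pdf.sum() * dx)              # normalize to unit integral
    mask = p > 0.0
    return -np.sum(p[mask] * np.log(p[mask]) * dx)

# Illustrative Gaussian posteriors: sigma = 1 km (well located) vs sigma = 10 km
x = np.linspace(-50.0, 50.0, 2001)
dx = x[1] - x[0]
narrow = np.exp(-0.5 * (x / 1.0) ** 2)
broad = np.exp(-0.5 * (x / 10.0) ** 2)

h_narrow = shannon_entropy(narrow, dx)
h_broad = shannon_entropy(broad, dx)
```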

  19. Analysis of the Seismic Activity During the Preparatory Phase of the Mw 8.2 Iquique Earthquake, Chile 2014

    NASA Astrophysics Data System (ADS)

    Aden-Antoniow, F.; Satriano, C.; Poiata, N.; Bernard, P.; Vilotte, J. P.; Aissaoui, E. M.; Ruiz, S.; Schurr, B.; Sobiesiak, M.

    2015-12-01

    The 2014 Iquique seismic crisis, culminating in the Mw 8.2 Iquique earthquake (Chile) on 1 April 2014 and its largest aftershock, Mw 7.7, on 3 April, highlighted a complex unlocking of the North Chile subduction interface. During the many months preceding this event, at least three large seismic clusters were observed: in July 2013, in January 2014 and in March 2014. Their location and final migration towards the mainshock rupture area represent the main motivation of this work. We built a new, more complete catalogue for the period December 2013 to March 2014 in Northern Chile, using a new automated array method for earthquake detection and location [Poiata et al. 2015]. With the data set provided by the IPOC and ILN networks, we detected an average of 8000 events per month, forty times more than the catalogue produced by the Centro Sismológico Nacional de Chile. The new catalogue decreases the magnitude of completeness by more than two units, from 3.3 to 1.2. We observe two shallow clusters offshore of the cities of Iquique and Pisagua in January 2014, and a strong one covering the rupture zone of the Mw 8.2 mainshock in March. A spatio-temporal statistical analysis of these three clusters allows us to better characterize the whole preparatory phase. We interpret our results in light of the location, timing and energy of several aseismic slip events, evidenced by Boudin et al. [AGU 2014], which coincide with the seismic clusters. We propose that the preparatory phase of the Iquique earthquake consists of a complex interplay of seismic and aseismic slip along the subduction surface. Furthermore, our analysis raises new questions regarding the complex slip during the Mw 7.7 aftershock and the spatial variation of the effective coupling along the subduction interface, imaged by GPS studies, suggesting new research directions that will be outlined.

  20. Cultural noise and the night-day asymmetry of the seismic activity recorded at the Bunker-East (BKE) Vesuvian Station

    NASA Astrophysics Data System (ADS)

    Scafetta, Nicola; Mazzarella, Adriano

    2018-01-01

    Mazzarella and Scafetta (2016) showed that the seismic activity recorded at the Bunker-East (BKE) Vesuvian station from 1999 to 2014 suggests a higher nocturnal seismic activity. However, this station is located about 50 m from the main road to the volcano's crater, and since 2009 its seismograms have also recorded significant diurnal cultural noise due mostly to tourist tours to Mt. Vesuvius. Herein, we investigate whether the different seismic frequency between day and night times could be an artifact of the peculiar cultural noise that affects this station mostly from 9:00 am to 5:00 pm from spring to fall. This time-distributed cultural noise should evidently reduce the ability to detect low-magnitude earthquakes during those hours, but not high-magnitude events. Using hourly distributions referring to different magnitude thresholds from M = 0.2 to M = 2.0, the Gutenberg-Richter magnitude-frequency diagram applied to the day-time and night-time sub-catalogs, and Monte Carlo statistical modeling, we demonstrate that the day-night asymmetry persists despite an evident disruption induced by cultural noise during day hours. In particular, for the period 1999-2017 and for earthquakes with M ≥ 2, we find a Gutenberg-Richter exponent b = 1.66 ± 0.07 for the night-time events and b = 2.06 ± 0.07 for the day-time events. Moreover, we repeat the analysis for an older BKE catalog covering the period from 1992 to 2000, when cultural noise was not present. The analysis confirms a higher nocturnal seismic activity, which is also characterized by a smaller Gutenberg-Richter exponent b for M ≥ 2 earthquakes relative to the day-time activity. Thus, the observed night-day seismic asymmetry is likely due to a real physical feature of Mt. Vesuvius.
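    Gutenberg-Richter exponents of the kind quoted above are commonly estimated with Aki's maximum-likelihood formula, including Utsu's correction for binned magnitudes; the synthetic catalog below is illustrative, drawn from a GR law with b = 1 (not the BKE data):

```python
import numpy as np

def b_value_aki(mags, m_c, dm=0.1):
    """Aki (1965) maximum-likelihood b-value with Utsu's correction dm/2 for
    magnitudes binned to width dm; events below completeness m_c are discarded."""
    m = np.asarray(mags)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))

rng = np.random.default_rng(1)
# Synthetic catalog from a GR law with b = 1: continuous magnitudes above the
# lower bin edge (m_c - dm/2 = 1.95), then binned to 0.1 magnitude units
mags = np.round(1.95 + rng.exponential(scale=np.log10(np.e), size=5000), 1)
b = b_value_aki(mags, m_c=2.0)
```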

  1. The Research of Multiple Attenuation Based on Feedback Iteration and Independent Component Analysis

    NASA Astrophysics Data System (ADS)

    Xu, X.; Tong, S.; Wang, L.

    2017-12-01

    Multiple suppression is a difficult problem in seismic data processing. Traditional technology for multiple attenuation is based on the principle of minimum output energy of the seismic signal; this criterion relies on second-order statistics and cannot achieve multiple attenuation when the primaries and multiples are non-orthogonal. To solve this problem, we combine a feedback iteration method based on the wave equation with an improved independent component analysis (ICA) based on higher-order statistics to suppress multiples. We first use the iterative feedback method to predict the free-surface multiples of each order. Then, to match the predicted multiples to the true multiples in amplitude and phase, we design an expanded pseudo-multichannel matching filter to obtain a more accurate matched-multiple result. Finally, we apply an improved FastICA algorithm, based on the maximum non-Gaussianity criterion of the output signal, to the matched multiples and obtain better separation of the primaries and the multiples. The advantage of our method is that no a priori information is needed to predict the multiples, and better separation results are achieved. The method has been applied to several synthetic data sets generated by the finite-difference modelling technique and to the Sigsbee2B model multiple data; the primaries and multiples are non-orthogonal in these models. The experiments show that after three to four iterations we obtain good multiple predictions. Using our matching method and FastICA adaptive multiple subtraction, we can effectively preserve the primary energy in seismic records while suppressing the free-surface multiples, especially those related to the middle and deep areas.

  2. Studies of the Correlation Between Ionospheric Anomalies and Seismic Activities in the Indian Subcontinent

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sasmal, S.; Chakrabarti, S. K.; S. N. Bose National Centre for Basic Sciences, JD Block, Salt-Lake Kolkata-70098

    2010-10-20

    VLF (Very Low Frequency) signals have long been thought to carry important information about lithosphere-ionosphere coupling. It has recently been established that the ionosphere may be perturbed by seismic activities, and the effects of this perturbation can be detected through the VLF wave amplitude. There are several methods to find these correlations, and these methods can be used for the prediction of seismic events. In this paper, we first present a brief history of the use of the VLF propagation method for the study of seismo-ionospheric correlations. Then we present different methods proposed by us to find the seismo-ionospheric correlations. At the Indian Centre for Space Physics, Kolkata, we have been monitoring the VTX station at Vijayanarayanam since 2002. In the initial stage we received the 17 kHz signal, and later we received the 18.2 kHz signal. We first present the results for the 17 kHz signal during the 2004 Sumatra earthquake obtained from the terminator time analysis method. Then we present a more detailed statistical analysis using some new methods and present the results for the 18.2 kHz signal. In order to establish the correlation between ionospheric activities and earthquakes, we need to understand the reference behavior of the signals throughout the year. We present the sunrise and sunset terminators for the 18.2 kHz signal as a function of the day of the year for a period of four years, 2005 to 2008, when solar activity was very low. In this case, the signal would primarily be affected by the Sun through normal sunrise and sunset effects. Any deviation from this standardized calibration curve would point to terrestrial influences (such as earthquakes) or extra-terrestrial ones (such as solar activity and other high-energy phenomena).
We present examples of deviations that occur in a period of sixteen months and show that the correlations with seismic events are significant; typically the largest deviation in terminator shift takes place up to a couple of days prior to the seismic event. We introduce a new method in which we find the effects of seismic activities on the D-layer preparation time (DLPT) and the D-layer disappearance time (DLDT). We identify the days on which DLPT and DLDT deviate from the average value and correlate those days with seismic events. Separately, we compute the energy released by the earthquakes and, using this, the total energy released locally from distant earthquakes, and find correlations of the deviations with them. In this case also we find precursors a few days before the seismic events. In a third approach, we consider the nighttime fluctuation method (quantified differently from the conventional way). We analyzed the nighttime data for the year 2007 to check the correlation between the nighttime fluctuation of the signal amplitude and seismic events. Using the statistical method for all the events of the year and for individual earthquakes (magnitude > 5), we found that the nighttime signal amplitude becomes very high about three days prior to the seismic events.
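
The calibration-curve idea, averaging several quiet years day by day and flagging days whose terminator time deviates beyond a noise threshold, can be sketched as follows (the series, the injected shift, and the 3-sigma threshold are hypothetical stand-ins, not VTX data):

```python
import numpy as np

rng = np.random.default_rng(1)
days = np.arange(4 * 365)
# hypothetical terminator time (minutes): seasonal curve plus receiver noise
seasonal = 350.0 + 40.0 * np.sin(2 * np.pi * days / 365.25)
series = seasonal + rng.normal(0.0, 3.0, days.size)
series[800] += 25.0                      # injected anomalous shift on one day

# reference curve: average the four years day by day, then inspect residuals
ref = series.reshape(4, 365).mean(axis=0)
resid = series - np.tile(ref, 4)
flagged = np.where(np.abs(resid) > 3.0 * resid.std())[0]   # 3-sigma outliers
```

Days surviving the 3-sigma cut are the candidates to be compared against earthquake catalogs; the injected day 800 shows up among them.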

  3. New seismic study begins in Puerto Rico

    USGS Publications Warehouse

    Tarr, A.C.

    1974-01-01

    A new seismological project is now underway in Puerto Rico to provide information needed for accurate assessment of the island's seismic hazard. The project should also help to increase understanding of the tectonics and geologic evolution of the Caribbean region. The Puerto Rico Seismic Program is being conducted by the Geological Survey with support provided by the Puerto Rico Water Resources Authority, an agency responsible for generation and distribution of electric power throughout the Commonwealth. The Program will include the installation of a network of high quality seismograph stations to monitor seismic activity on and around Puerto Rico. These stations will be distributed across the island to record the seismicity as uniformly as possible. The detection and accurate location of small earthquakes, as well as moderate magnitude shocks, will aid in mapping active seismic zones and in compiling frequency of occurrence statistics which ultimately will be useful in seismic risk zoning of the island.

  4. Probabilistic properties of injection induced seismicity - implications for the seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Lasocki, Stanislaw; Urban, Pawel; Kwiatek, Grzegorz; Martinez-Garzón, Patricia

    2017-04-01

    Injection induced seismicity (IIS) is an undesired dynamic rockmass response to massive fluid injections, including, among others, reactions to hydro-fracturing for shale gas exploitation. The complexity and changeability of the technological factors that induce IIS may result in significant deviations of the observed distributions of seismic process parameters from the models that perform well for natural, tectonic seismic processes. Classic formulations of probabilistic seismic hazard analysis for natural seismicity assume the seismic marked point process to be a stationary Poisson process whose marks, the magnitudes, are governed by the exponential distribution implied by the Gutenberg-Richter relation. It is well known that the use of an inappropriate earthquake occurrence model and/or an inappropriate magnitude distribution model leads to significant systematic errors in hazard estimates. It is therefore of paramount importance to check whether these assumptions on the seismic process, commonly used for natural seismicity, can be safely applied to IIS hazard problems. Seismicity accompanying shale gas operations is widely studied in the framework of the project "Shale Gas Exploration and Exploitation Induced Risks" (SHEER). Here we present results of SHEER project investigations of such seismicity from Oklahoma and of a proxy of such seismicity, IIS data from The Geysers geothermal field. We attempt to answer the following questions: • Do IIS earthquakes follow the Gutenberg-Richter law, so that the magnitude distribution can be modelled by an exponential distribution? • Is the occurrence process of IIS earthquakes Poissonian? Is it segmentally Poissonian? If so, how are these segments linked to cycles of technological operations? Statistical tests indicate that the exponential magnitude distribution model implied by the Gutenberg-Richter relation is, in general, inappropriate.
The magnitude distribution can be complex and multimodal, with no ready-to-use functional model. In this connection, we recommend using non-parametric kernel estimators of the magnitude distribution in hazard analyses. The earthquake occurrence process of IIS is not a Poisson process. When earthquake occurrences are influenced by a multitude of inducing factors, the interevent time distribution can be modelled by the Weibull distribution, supporting a negative ageing property of the process. When earthquake occurrences are due to a specific injection activity, the earthquake rate depends directly on the injection rate and responds immediately to changes in the injection rate. Furthermore, this response is not limited to correlated variations of the seismic activity; it also involves significant changes in the shape of the interevent time distribution. Unlike the event rate, the shape of the magnitude distribution does not exhibit correlation with the injection rate. This work was supported within SHEER: "Shale Gas Exploration and Exploitation Induced Risks" project funded from the Horizon 2020 - R&I Framework Programme, call H2020-LCE 16-2014-1, and within statutory activities No3841/E-41/S/2016 of the Ministry of Science and Higher Education of Poland.
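
The recommendation can be illustrated by comparing a Gutenberg-Richter (exponential) fit with a Gaussian kernel estimator on a bimodal magnitude sample; the sample, the bandwidth rule, and the in-sample likelihood comparison are illustrative assumptions, not SHEER data or the authors' exact estimator:

```python
import numpy as np

rng = np.random.default_rng(2)
mc = 0.5                                   # assumed completeness magnitude
# hypothetical bimodal magnitude sample, the kind a single G-R line cannot fit
mags = mc + np.concatenate([rng.exponential(0.30, 700),
                            1.2 + rng.exponential(0.15, 300)])
x = mags - mc

# parametric Gutenberg-Richter model: exponential with maximum-likelihood rate
beta_hat = 1.0 / x.mean()
loglik_exp = np.sum(np.log(beta_hat) - beta_hat * x)

# non-parametric Gaussian kernel estimator (Silverman's rule of thumb)
h = 1.06 * x.std() * x.size ** (-1.0 / 5.0)

def kde(pts):
    u = (pts[:, None] - x[None, :]) / h
    return np.exp(-0.5 * u ** 2).sum(axis=1) / (x.size * h * np.sqrt(2 * np.pi))

loglik_kde = np.sum(np.log(kde(x)))        # rough in-sample comparison only
```

For this multimodal sample the kernel estimator fits the data far better than any single exponential, which is the situation the abstract describes for IIS magnitudes.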

  5. Isolated cases of remote dynamic triggering in Canada detected using cataloged earthquakes combined with a matched-filter approach

    USGS Publications Warehouse

    Bei, Wang; Harrington, Rebecca M.; Liu, Yajing; Yu, Hongyu; Carey, Alex; van der Elst, Nicholas

    2015-01-01

    Here we search for dynamically triggered earthquakes in Canada following global main shocks between 2004 and 2014 with MS > 6, depth < 100 km, and estimated peak ground velocity > 0.2 cm/s. We use the Natural Resources Canada (NRCan) earthquake catalog to calculate β statistic values in 1° × 1° bins in 10 day windows before and after the main shocks. The statistical analysis suggests that triggering may occur near Vancouver Island, along the border of the Yukon and Northwest Territories, in western Alberta, western Ontario, and the Charlevoix seismic zone. We also search for triggering in Alberta, where denser seismic station coverage yields regional earthquake catalogs with lower completeness thresholds. We find remote triggering in Alberta associated with three main shocks using a matched-filter approach on continuous waveform data. The increased number of local earthquakes following the passage of main shock surface waves suggests local faults may be in a critically stressed state.
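
The β statistic used here compares the post-mainshock event count with a uniform-rate expectation over the combined window. A minimal sketch (the counts below are made up; values of β above roughly 2 are often taken as informally significant):

```python
import math

def beta_statistic(n_after, n_total, t_after, t_total):
    """Standardized deviation of the post-mainshock count from a
    uniform-rate (binomial) expectation over the whole window."""
    p = t_after / t_total
    expected = n_total * p
    return (n_after - expected) / math.sqrt(n_total * p * (1.0 - p))

# made-up counts: 20 events in the combined 20-day window, 14 of them
# in the 10 days after the main shock
b = beta_statistic(n_after=14, n_total=20, t_after=10.0, t_total=20.0)
```

Here b is about 1.79, i.e. a mild excess, below the usual informal significance level.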

  6. The exponential rise of induced seismicity with increasing stress levels in the Groningen gas field and its implications for controlling seismic risk

    NASA Astrophysics Data System (ADS)

    Bourne, S. J.; Oates, S. J.; van Elk, J.

    2018-06-01

    Induced seismicity typically arises from the progressive activation of recently inactive geological faults by anthropogenic activity. Faults are mechanically and geometrically heterogeneous, so their extremes of stress and strength govern the initial evolution of induced seismicity. We derive a statistical model of Coulomb stress failures and associated aftershocks within the tail of the distribution of fault stress and strength variations to show initial induced seismicity rates will increase as an exponential function of induced stress. Our model provides operational forecasts consistent with the observed space-time-magnitude distribution of earthquakes induced by gas production from the Groningen field in the Netherlands. These probabilistic forecasts also match the observed changes in seismicity following a significant and sustained decrease in gas production rates designed to reduce seismic hazard and risk. This forecast capability allows reliable assessment of alternative control options to better inform future induced seismic risk management decisions.
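
The qualitative behavior described here, an exponentially rising rate under steady stressing that flattens after a production cut, can be sketched with an assumed rate law r = r0 · exp(ΔC/A); the parameter values below are illustrative, not Groningen-calibrated:

```python
import numpy as np

# assumed rate law: seismicity rate grows exponentially with induced Coulomb
# stress; r0 and A are illustrative, not fitted values
r0, A = 0.01, 0.05                                   # events/yr, MPa
years = np.arange(0.0, 30.0, 0.1)
stress_rate = np.where(years < 20.0, 0.020, 0.005)   # MPa/yr, cut at year 20
stress = np.cumsum(stress_rate) * 0.1                # accumulated stress (MPa)
rate = r0 * np.exp(stress / A)                       # flattens after the cut
```

After the cut the rate keeps growing (stress still accumulates) but at a much smaller relative increment per step, mirroring the observed response to reduced production.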

  7. Epistemic uncertainty in California-wide synthetic seismicity simulations

    USGS Publications Warehouse

    Pollitz, Fred F.

    2011-01-01

    The generation of seismicity catalogs on synthetic fault networks holds the promise of providing key inputs into probabilistic seismic-hazard analysis, for example, the coefficient of variation, mean recurrence time as a function of magnitude, the probability of fault-to-fault ruptures, and conditional probabilities for foreshock–mainshock triggering. I employ a seismicity simulator that includes the following ingredients: static stress transfer, viscoelastic relaxation of the lower crust and mantle, and vertical stratification of elastic and viscoelastic material properties. A cascade mechanism combined with a simple Coulomb failure criterion is used to determine the initiation, propagation, and termination of synthetic ruptures. It is employed on a 3D fault network provided by Steve Ward (unpublished data, 2009) for the Southern California Earthquake Center (SCEC) Earthquake Simulators Group. This all-California fault network, initially consisting of 8000 patches, each of ∼12 square kilometers in size, has been rediscretized into ∼100,000 patches, each of ∼1 square kilometer in size, in order to simulate the evolution of California seismicity and crustal stress at magnitude M∼5–8. Resulting synthetic seismicity catalogs spanning 30,000 yr and about one-half million events are evaluated with magnitude-frequency and magnitude-area statistics. For a priori choices of fault-slip rates and mean stress drops, I explore the sensitivity of various constructs to input parameters, particularly mantle viscosity. Slip maps obtained for the southern San Andreas fault show that the ability of segment boundaries to inhibit slip across the boundaries (e.g., to prevent multisegment ruptures) is systematically affected by mantle viscosity.
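
Magnitude-frequency statistics of such catalogs are commonly summarized by a Gutenberg-Richter b-value. A sketch using the Aki-Utsu maximum-likelihood estimator with Utsu's binning correction, applied to a synthetic exponential catalog (the catalog is simulated here, not the simulator's output):

```python
import numpy as np

def b_value(mags, mc, dm=0.1):
    """Aki-Utsu maximum-likelihood b-value with Utsu's binning correction."""
    m = mags[mags >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

rng = np.random.default_rng(3)
# synthetic Gutenberg-Richter catalog with b = 1, binned at 0.1 units
mc, b_true, dm = 5.0, 1.0, 0.1
mags = (mc - dm / 2.0) + rng.exponential(np.log10(np.e) / b_true, 50000)
mags = np.round(mags / dm) * dm
b_hat = b_value(mags, mc)
```

With 50,000 events the estimate recovers the input b = 1 to within a few percent, which is the kind of check applied to the simulated catalogs.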

  8. Epistemic uncertainty in California-wide synthetic seismicity simulations

    USGS Publications Warehouse

    Pollitz, F.F.

    2011-01-01

    The generation of seismicity catalogs on synthetic fault networks holds the promise of providing key inputs into probabilistic seismic-hazard analysis, for example, the coefficient of variation, mean recurrence time as a function of magnitude, the probability of fault-to-fault ruptures, and conditional probabilities for foreshock-mainshock triggering. I employ a seismicity simulator that includes the following ingredients: static stress transfer, viscoelastic relaxation of the lower crust and mantle, and vertical stratification of elastic and viscoelastic material properties. A cascade mechanism combined with a simple Coulomb failure criterion is used to determine the initiation, propagation, and termination of synthetic ruptures. It is employed on a 3D fault network provided by Steve Ward (unpublished data, 2009) for the Southern California Earthquake Center (SCEC) Earthquake Simulators Group. This all-California fault network, initially consisting of 8000 patches, each of ~12 square kilometers in size, has been rediscretized into ~100,000 patches, each of ~1 square kilometer in size, in order to simulate the evolution of California seismicity and crustal stress at magnitude M ~ 5-8. Resulting synthetic seismicity catalogs spanning 30,000 yr and about one-half million events are evaluated with magnitude-frequency and magnitude-area statistics. For a priori choices of fault-slip rates and mean stress drops, I explore the sensitivity of various constructs to input parameters, particularly mantle viscosity. Slip maps obtained for the southern San Andreas fault show that the ability of segment boundaries to inhibit slip across the boundaries (e.g., to prevent multisegment ruptures) is systematically affected by mantle viscosity.

  9. A description of Seismicity based on Non-extensive Statistical Physics: An introduction to Non-extensive Statistical Seismology.

    NASA Astrophysics Data System (ADS)

    Vallianatos, Filippos

    2015-04-01

    Despite the extreme complexity that characterizes the earthquake generation process, simple phenomenology seems to apply to the collective properties of seismicity, the best known example being the Gutenberg-Richter relation. Short- and long-term clustering, power-law scaling and scale-invariance have been exhibited in the spatio-temporal evolution of seismicity, providing evidence for earthquakes as a nonlinear dynamic process. Regarding the physics of "many" earthquakes and how it can be derived from first principles, one may wonder: how can the collective properties of the set formed by all earthquakes in a given region be derived, and how does the structure of seismicity depend on its elementary constituents, the earthquakes? What are these properties? The physics of many earthquakes has to be studied with a different approach than the physics of one earthquake, making the use of statistical physics necessary to understand the collective properties of earthquakes. A natural question then arises: what type of statistical physics is appropriate to describe effects ranging from the microscale and crack-opening level to the level of large earthquakes? An answer could be non-extensive statistical physics, introduced by Tsallis (1988), as the appropriate methodological tool to describe entities with (multi)fractal distributions of their elements and in which long-range interactions or intermittency are important, as in fracturing phenomena and earthquakes. In the present work, we review some fundamental properties of earthquake physics and how these are derived by means of non-extensive statistical physics. The aim is to understand aspects of the underlying physics that lead to the evolution of the earthquake phenomenon, introducing the new topic of non-extensive statistical seismology.
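
The central object of the non-extensive framework is the Tsallis q-exponential, which replaces the ordinary exponential of Boltzmann-Gibbs statistics and yields power-law tails for q > 1. A small numerical sketch (q = 1.5 is an illustrative choice):

```python
import numpy as np

def q_exp(x, q):
    """Tsallis q-exponential exp_q(x); reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-9:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * np.asarray(x, dtype=float)
    # defined as base^(1/(1-q)) where base > 0, and 0 otherwise
    return np.where(base > 0.0, np.abs(base) ** (1.0 / (1.0 - q)), 0.0)

x = np.linspace(0.0, 5.0, 50)
# q > 1 turns the exponential tail into a power law, the signature behavior
# used in this framework for magnitude and interevent-time statistics
tail_q = q_exp(-x, 1.5)      # (1 + 0.5 x)^(-2): power-law decay
tail_bg = q_exp(-x, 1.0)     # ordinary exponential (Boltzmann-Gibbs limit)
```

At large x the q > 1 tail dominates the exponential one, reproducing the heavy tails reported for seismicity.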
This research has been funded by the European Union (European Social Fund) and Greek national resources under the framework of the "THALES Program: SEISMO FEAR HELLARC" project.
References:
Vallianatos, F., "A non-extensive approach to risk assessment", Nat. Hazards Earth Syst. Sci., 9, 211-216, 2009.
Vallianatos, F. and Sammonds, P., "Is plate tectonics a case of non-extensive thermodynamics?", Physica A: Statistical Mechanics and its Applications, 389(21), 4989-4993, 2010.
Vallianatos, F., Michas, G., Papadakis, G. and Sammonds, P., "A non-extensive statistical physics view to the spatiotemporal properties of the June 1995, Aigion earthquake (M6.2) aftershock sequence (West Corinth rift, Greece)", Acta Geophysica, 60(3), 758-768, 2012.
Vallianatos, F. and Telesca, L., "Statistical mechanics in earth physics and natural hazards" (editorial), Acta Geophysica, 60(3), 499-501, 2012.
Vallianatos, F., Michas, G., Papadakis, G. and Tzanis, A., "Evidence of non-extensivity in the seismicity observed during the 2011-2012 unrest at the Santorini volcanic complex, Greece", Nat. Hazards Earth Syst. Sci., 13, 177-185, 2013.
Vallianatos, F. and Sammonds, P., "Evidence of non-extensive statistical physics of the lithospheric instability approaching the 2004 Sumatran-Andaman and 2011 Honshu mega-earthquakes", Tectonophysics, 590, 52-58, 2013.
Papadakis, G., Vallianatos, F. and Sammonds, P., "Evidence of Nonextensive Statistical Physics behavior of the Hellenic Subduction Zone seismicity", Tectonophysics, 608, 1037-1048, 2013.
Michas, G., Vallianatos, F. and Sammonds, P., "Non-extensivity and long-range correlations in the earthquake activity at the West Corinth rift (Greece)", Nonlin. Processes Geophys., 20, 713-724, 2013.

  10. Stress shadows - a controversial topic

    NASA Astrophysics Data System (ADS)

    Lasocki, Stanislaw; Karakostas, Vassilis G.; Papadimitriou, Eleftheria E.; Orlecka-Sikora, Beata

    2010-05-01

    The spatial correlation between positive Coulomb stress changes and subsequent seismic activity has been firmly confirmed in many recent studies. If, however, static stress transfer is a consistent expression of interaction between earthquakes, one should also observe a decrease of activity in the zones of negative stress changes. Instead, the existence of stress shadows is poorly evidenced and may be questioned. We tested the influence of the static stress changes associated with the coseismic slip of the 1995 Mw6.5 Kozani-Grevena (Greece) earthquake on the locations of its aftershocks. The study was based on a detailed slip model for the main shock and on accurate locations and reliable fault plane solutions for an adequate number of the aftershocks. We developed a statistical testing method, which tested whether the proportions of aftershocks located inside areas determined by a selected criterion on the static stress change could be attained if the stress change due to the main shock had no effect on aftershock locations. The areas of stress change were determined at the focus of every aftershock. The distribution of the test statistic was constructed with the use of a two-dimensional nonparametric kernel density estimator of the reference epicenter distribution. The tests indicated with high confidence a rise in the probability of locating aftershocks inside areas of positive static stress change, which supports the hypothesis of a triggering effect in these areas. Furthermore, a larger stress increase was shown to cause a stronger triggering effect. The analysis, however, did not evidence the existence of stress shadows inside areas of negative stress change. Contrary to expectations, the tests indicated a significant increase of the probability of event location in areas of stress decrease of at least 5.0 and 10.0 bar.
It turned out that for areas of larger absolute stress change this probability increased regardless of the sign of the change, though distinctly more in areas of positive than of negative change. In the case of seismicity accompanying underground mining, the coseismic stress changes expressed in terms of the Coulomb failure function are at least an order of magnitude smaller than those for earthquakes. Furthermore, they are only a small component of the total stress field variations in the mining rockmass, which are mainly controlled by the mining process. Nevertheless, our studies of the induced seismicity in the Rudna mine in the Legnica-Głogow Copper District in Poland showed that the influence of the Coulomb stress changes on the locations of subsequent events was statistically significant. We analyzed series of seismic events, quantifying the triggering and inhibiting effects by the proportion of events in the series whose locations were consistent with the stress-increased and stress-decreased zones, respectively. It was found that more than 60 percent of the analyzed seismic events occurred in areas where stress was enhanced by the occurrence of previous events. The significance of this result was determined by comparing it with 2000 results of the same analysis carried out on random permutations of the original series of events. The test indicated that locations in areas of positive stress change were preferred in a statistically significant way when the stress changes exceeded 0.05 bar. However, no statistically significant inhibiting effect of negative static stress changes, within the considered range of these changes, was ascertained. Here we present details of these two studies and discuss possible reasons behind the negative conclusions on the existence of stress shadows.
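
The significance assessment can be sketched in simplified form. Here the two-outcome statistic (fraction of events in positive-stress zones) is compared against a Monte-Carlo Bernoulli(0.5) null, a simplification of the authors' permutation of the actual event series, with made-up counts:

```python
import numpy as np

rng = np.random.default_rng(4)

# stand-in series: True = event located in a zone of positive stress change
in_positive = rng.random(500) < 0.62       # ~62% "triggered", made-up counts
obs_frac = in_positive.mean()

# null hypothesis: location is independent of the stress-change sign; for this
# two-outcome statistic the resampling null reduces to Bernoulli(0.5) draws
null = np.array([(rng.random(in_positive.size) < 0.5).mean()
                 for _ in range(2000)])
p_value = (null >= obs_frac).mean()        # one-sided Monte-Carlo p-value
```

An observed fraction near 62% of 500 events sits far in the tail of the null, so the triggering effect registers as highly significant, as in the mine study.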

  11. A western gray whale mitigation and monitoring program for a 3-D seismic survey, Sakhalin Island, Russia.

    PubMed

    Johnson, S R; Richardson, W J; Yazvenko, S B; Blokhin, S A; Gailey, G; Jenkerson, M R; Meier, S K; Melton, H R; Newcomer, M W; Perlov, A S; Rutenko, S A; Würsig, B; Martin, C R; Egging, D E

    2007-11-01

    The introduction of anthropogenic sounds into the marine environment can impact some marine mammals. Impacts can be greatly reduced if appropriate mitigation measures and monitoring are implemented. This paper concerns such measures undertaken by Exxon Neftegas Limited, as operator of the Sakhalin-1 Consortium, during the Odoptu 3-D seismic survey conducted during 17 August-9 September 2001. The key environmental issue was protection of the critically endangered western gray whale (Eschrichtius robustus), which feeds in summer and fall primarily in the Piltun feeding area off northeast Sakhalin Island. Existing mitigation and monitoring practices for seismic surveys in other jurisdictions were evaluated to identify best practices for reducing impacts on feeding activity by western gray whales. Two buffer zones were established to protect whales from physical injury or undue disturbance during feeding. A 1 km buffer protected all whales from exposure to levels of sound energy potentially capable of producing physical injury. A 4-5 km buffer was established to avoid displacing western gray whales from feeding areas. Trained Marine Mammal Observers (MMOs) on the seismic ship Nordic Explorer had the authority to shut down the air guns if whales were sighted within these buffers. Additional mitigation measures were also incorporated. Temporal mitigation was provided by rescheduling the program from June-August to August-September to avoid interference with the spring arrival of migrating gray whales. The survey area was reduced by 19% to avoid certain waters <20 m deep where feeding whales concentrated and where seismic acquisition was a lower priority. The number of air guns and the total volume of the air gun array were reduced by about half (from 28 to 14 air guns and from 3,390 in³ to 1,640 in³) relative to initial plans. "Ramp-up" ("soft-start") procedures were implemented.
Monitoring activities were conducted as needed to implement some mitigation measures, and to assess residual impacts. Aerial and vessel-based surveys determined the distribution of whales before, during and after the seismic survey. Daily aerial reconnaissance helped verify whale-free areas and select the sequence of seismic lines to be surveyed. A scout vessel with MMOs aboard was positioned 4 km shoreward of the active seismic vessel to provide better visual coverage of the 4-5 km buffer and to help define the inshore edge of the 4-5 km buffer. A second scout vessel remained near the seismic vessel. Shore-based observers determined whale numbers, distribution, and behavior during and after the seismic survey. Acoustic monitoring documented received sound levels near and in the main whale feeding area. Statistical analyses of aerial survey data indicated that about 5-10 gray whales moved away from waters near (inshore of) the seismic survey during seismic operations. They shifted into the core gray whale feeding area farther south, and the proportion of gray whales observed feeding did not change over the study period. Five shutdowns of the air guns were invoked for gray whales seen within or near the buffer. A previously unknown gray whale feeding area (the Offshore feeding area) was discovered south and offshore from the nearshore Piltun feeding area. The Offshore area has subsequently been shown to be used by feeding gray whales during several years when no anthropogenic activity occurred near the Piltun feeding area. Shore-based counts indicated that whales continued to feed inshore of the Odoptu block throughout the seismic survey, with no significant correlation between gray whale abundance and seismic activity. Average values of most behavioral parameters were similar to those without seismic surveys. Univariate analysis showed no correlation between seismic sound levels and any behavioral parameter.
Multiple regression analyses indicated that, after allowance for environmental covariates, 5 of 11 behavioral parameters were statistically correlated with estimated seismic survey-related variables; 6 of 11 behavioral parameters were not statistically correlated with seismic survey-related variables. Behavioral parameters that were correlated with seismic variables were transient and within the range of variation attributable to environmental effects. Acoustic monitoring determined that the 4-5 km buffer zone, in conjunction with reduction of the air gun array to 14 guns and 1,640 in³, was effective in limiting sound exposure. Within the Piltun feeding area, these mitigation measures were designed to ensure that western gray whales were not exposed to received levels exceeding the 163 dB re 1 microPa (rms) threshold. This was among the most complex and intensive mitigation programs ever conducted for any marine mammal. It provided valuable new information about underwater sounds and gray whale responses during a nearshore seismic program that will be useful in planning future work. Overall, the efforts in 2001 were successful in reducing impacts to levels tolerable by western gray whales. Research in 2002-2005 suggested no biologically significant or population-level impacts of the 2001 seismic survey.

  12. Statistical description of tectonic motions

    NASA Technical Reports Server (NTRS)

    Agnew, Duncan Carr

    1991-01-01

    The behavior of stochastic processes whose power spectra exhibit power-law behavior was studied. The details of the analysis and the conclusions reached are presented. The analysis was extended to compare the detection capabilities of different measurement techniques (e.g., gravimetry and GPS for the vertical, and seismometers and GPS for the horizontal), both in general and for the specific case of the deformations produced by a dislocation in a half-space (which applies to seismic or preseismic sources). The time-domain behavior of power-law noises was also investigated.
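
Power-law noises of this kind are conveniently synthesized by shaping white noise in the frequency domain so that the power spectrum follows f^(-alpha). A sketch (alpha = 1 "flicker" noise, common in geodetic series; all parameters are illustrative):

```python
import numpy as np

def power_law_noise(n, alpha, rng):
    """Real-valued noise whose power spectrum falls as f**(-alpha),
    built by shaping complex white noise in the frequency domain."""
    f = np.fft.rfftfreq(n, d=1.0)
    shape = np.zeros_like(f)
    shape[1:] = f[1:] ** (-alpha / 2.0)    # amplitude = sqrt(power)
    white = rng.standard_normal(f.size) + 1j * rng.standard_normal(f.size)
    return np.fft.irfft(white * shape, n)

rng = np.random.default_rng(5)
flicker = power_law_noise(8192, 1.0, rng)  # alpha = 1: "flicker" noise
```

A log-log fit to the periodogram of the output recovers a spectral slope near -alpha, which is how such synthetic series are validated before comparing measurement techniques.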

  13. Statistical analysis of seismicity rate change in the Tokyo Metropolitan area due to the 2011 Tohoku Earthquake

    NASA Astrophysics Data System (ADS)

    Ishibe, T.; Sakai, S.; Shimazaki, K.; Satake, K.; Tsuruoka, H.; Nakagawa, S.; Hirata, N.

    2012-12-01

    We examined the relationship between the change in the Coulomb Failure Function (ΔCFF) due to the Tohoku earthquake (March 11, 2011; MJMA 9.0) and the seismicity rate change in the Tokyo Metropolitan area after March 2011. Because of the large variation in focal mechanisms in the Kanto region, the receiver faults for the ΔCFF calculation were taken to be the two nodal planes of small (M ≥ 2.0) earthquakes which occurred before and after the Tohoku earthquake. The seismicity rate changes, particularly the rate increases, are well explained by the ΔCFF due to the gigantic thrusting, although other factors (e.g., dynamic stress changes, excess fluid dehydration) may also contribute to the rate changes. Among the 30,746 previous events provided by the National Research Institute for Earth Science and Disaster Prevention (M ≥ 2.0, July 1979 - July 2003) that we used as receiver faults, almost 16,000 indicate a significant increase in ΔCFF, while about 8,000 show a significant decrease. Positive ΔCFF predicts a seismicity rate increase in southwestern Ibaraki and northern Chiba prefectures, where intermediate-depth earthquakes occur, and in the shallow crust of the Izu-Oshima and Hakone regions. In these regions, seismicity rates indeed increased significantly after the Tohoku earthquake. The seismicity has increased since March 2011 relative to the Epidemic Type Aftershock Sequence (ETAS) model (Ogata, 1988), indicating that the rate change was due to the stress increase caused by the Tohoku earthquake. The activated seismicity in the Izu and Hakone regions decayed rapidly following the Omori-Utsu formula, while the increased seismicity rate in southwestern Ibaraki and northern Chiba prefectures is still continuing. We also calculated the ΔCFF due to the 2011 Tohoku earthquake for the focal mechanism solutions of earthquakes between April 2008 and October 2011 recorded on the Metropolitan Seismic Observation network (MeSO-net).
The ΔCFF values for the earthquakes after March 2011 show more positive values than those before March 2011, supporting a triggering hypothesis that the 2011 Tohoku earthquake triggered the seismicity changes in the Kanto region.
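
The decay comparison invoked here is the modified Omori (Omori-Utsu) law, under which sequences with a larger p-value decay faster. A sketch with illustrative parameters (K, c, p below are not fitted to the Kanto data):

```python
import numpy as np

def omori_utsu_rate(t_days, K, c, p):
    """Modified Omori (Omori-Utsu) aftershock rate n(t) = K / (t + c)**p."""
    return K / (t_days + c) ** p

t = np.linspace(0.1, 100.0, 1000)
fast = omori_utsu_rate(t, K=200.0, c=0.1, p=1.3)   # rapidly decaying activation
slow = omori_utsu_rate(t, K=200.0, c=0.1, p=0.8)   # persistent elevated activity
```

After 100 days the high-p sequence has dropped well below the low-p one, qualitatively matching the contrast between the Izu/Hakone and Ibaraki/Chiba activations.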

  14. Earthquake design criteria for small hydro projects in the Philippines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin, P.P.; McCandless, D.H.; Asce, M.

    1995-12-31

    The definition of the seismic environment and seismic design criteria for more than twenty small hydro projects in the northern part of the island of Luzon in the Philippines took on special urgency in the wake of the magnitude 7.7 earthquake that shook the island on July 17, 1990. The paper describes the approach followed to determine design shaking level criteria at each hydro site consistent with the seismic environment estimated at that same site. The approach consisted of three steps: (1) Seismicity: understanding the mechanisms and tectonic features capable of generating seismicity and estimating the associated seismicity levels; (2) Seismic Hazard: in the absence of an accurate historical record, using statistics to determine the expected level of ground shaking at a site during the operational 100-year design life of each project; and (3) Criteria Selection: finally and most importantly, exercising judgment in estimating the final proposed level of shaking at each site. The resulting characteristics of estimated seismicity and seismic hazard and the proposed final earthquake design criteria are provided.
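
Step (2) typically reduces to a Poisson exceedance calculation: the probability that ground motion with a given return period is exceeded at least once during the design life. A sketch (the 475-year return period is an illustrative choice, not a value from the paper):

```python
import math

def exceedance_probability(return_period_yr, design_life_yr):
    """P(at least one exceedance during the design life) under Poisson
    occurrence with mean rate 1/return_period."""
    return 1.0 - math.exp(-design_life_yr / return_period_yr)

# chance that the 475-year ground motion is exceeded in a 100-year design life
p = exceedance_probability(475.0, 100.0)   # about 0.19
```

Such numbers are then tempered by judgment (step 3) when the final design shaking level is selected.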

  15. Prediction for potential landslide zones using seismic amplitude in Liwan gas field, northern South China Sea

    NASA Astrophysics Data System (ADS)

    Li, Xishuang; Liu, Baohua; Liu, Lejun; Zheng, Jiewen; Zhou, Songwang; Zhou, Qingjie

    2017-12-01

    The Liwan (Lw) gas field, located on the northern slope of the South China Sea (SCS), has extremely complex sea-floor topography, which poses a major challenge for the safety of subsea facilities. It is economically impractical to obtain the parameters for a risk assessment of slope stability through extensive sampling over the whole field. The linkage between soil shear strength and the seabed peak amplitude derived from 2D/3D seismic data is helpful for understanding the regional slope-instability risk. In this paper, the relationships among seabed peak amplitude, acoustic impedance and the shear strength of shallow soil in the study area are discussed based on statistical analysis. We obtained relationships similar to those reported in other deep-water areas: a positive correlation between seabed peak amplitude and acoustic impedance, and an exponential relationship between acoustic impedance and sediment shear strength. Acoustic impedance is thus the key factor linking seismic amplitude to shear strength. Infinite slope stability analysis indicates a high potential for shallow landslides on slopes exceeding 15° where the thickness of loose sediments exceeds 8 m in the Lw gas field. Our prediction shows that such areas are mainly located in the heads and walls of submarine canyons.
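The infinite-slope screening used above can be sketched with a simplified undrained factor-of-safety formula; the strength, unit-weight, and geometry values below are hypothetical, not the Lw-field measurements:

```python
import math

def infinite_slope_fs(su_kpa, gamma_sub_knm3, thickness_m, slope_deg):
    """Undrained infinite-slope factor of safety:
    FS = s_u / (gamma' * t * sin(theta) * cos(theta)),
    with s_u the undrained shear strength and gamma' the submerged unit
    weight. FS below 1 indicates potential shallow failure."""
    th = math.radians(slope_deg)
    driving = gamma_sub_knm3 * thickness_m * math.sin(th) * math.cos(th)
    return su_kpa / driving

# Illustrative soft sediment: 8 m of loose cover on a 15 degree slope
fs = infinite_slope_fs(su_kpa=5.0, gamma_sub_knm3=6.0,
                       thickness_m=8.0, slope_deg=15.0)
```

With these assumed values FS falls well below 1, consistent with the paper's finding that slopes above 15° carrying more than 8 m of loose sediment are the high-risk zones; gentler slopes give proportionally larger FS.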

  16. Development of damage probability matrices based on Greek earthquake damage data

    NASA Astrophysics Data System (ADS)

    Eleftheriadou, Anastasia K.; Karabinis, Athanasios I.

    2011-03-01

    A comprehensive study is presented for the empirical seismic vulnerability assessment of typical structural types, representative of the building stock of Southern Europe, based on a large set of damage statistics. The observational database was obtained from post-earthquake surveys carried out in the area struck by the September 7, 1999 Athens earthquake. After analysis of the collected observational data, a unified damage database was created comprising 180,945 damaged buildings from the near-field area of the earthquake. The damaged buildings are classified into specific structural types according to the materials, seismic codes and construction techniques of Southern Europe. The seismic demand is described in terms of both the regional macroseismic intensity and the ratio α_g/a_o, where α_g is the maximum peak ground acceleration (PGA) of the earthquake event and a_o is the unique PGA value that characterizes each municipality on the Greek hazard map. The relative and cumulative frequencies of the different damage states for each structural type and each intensity level are computed in terms of the damage ratio. Damage probability matrices (DPMs) and vulnerability curves are obtained for specific structural types. Finally, a comparative analysis is carried out between the produced and existing vulnerability models.
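Building a DPM from survey records amounts to counting buildings per (structural type, intensity, damage state) and normalizing to relative frequencies. A toy sketch with hypothetical survey entries (the real database holds 180,945 buildings):

```python
from collections import defaultdict

def damage_probability_matrix(surveys, damage_states):
    """DPM[(structural_type, intensity)][damage_state] = relative
    frequency of that damage state among surveyed buildings."""
    counts = defaultdict(lambda: dict.fromkeys(damage_states, 0))
    for building in surveys:
        key = (building["type"], building["intensity"])
        counts[key][building["damage"]] += 1
    dpm = {}
    for key, row in counts.items():
        total = sum(row.values())
        dpm[key] = {state: n / total for state, n in row.items()}
    return dpm

# Hypothetical mini-survey (labels are illustrative, not the paper's typology)
states = ["none", "slight", "moderate", "heavy", "collapse"]
surveys = [
    {"type": "RC-frame", "intensity": "VIII", "damage": "slight"},
    {"type": "RC-frame", "intensity": "VIII", "damage": "slight"},
    {"type": "RC-frame", "intensity": "VIII", "damage": "moderate"},
    {"type": "masonry",  "intensity": "VIII", "damage": "heavy"},
]
dpm = damage_probability_matrix(surveys, states)
```

Cumulative frequencies and vulnerability curves then follow by summing each row from the highest damage state downward.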

  17. Quantitative characterization and modeling of lithologic heterogeneity

    NASA Astrophysics Data System (ADS)

    Deshpande, Anil

    The fundamental goal of this thesis is to gain a better understanding of the vertical and lateral stratigraphic heterogeneities in sedimentary deposits. Two approaches are taken: statistical characterization of lithologic variation recorded by geophysical data such as reflection seismic and wireline logs, and stochastic forward modeling of sediment accumulation in basins. Analysis of reflection seismic and wireline log data from Pleistocene fluvial and deltaic deposits in the Eugene Island 330 field, offshore Gulf of Mexico, reveals scale-invariant statistics and strong anisotropy in rock properties. Systematic quantification of lateral lithologic heterogeneity within a stratigraphic framework, using reflection seismic data, indicates that fluvial and deltaic depositional systems exhibit statistical behavior related to stratigraphic fabric. Well log and seismic data profiles show a decay in power spectra with wavenumber, k, according to k^(-β), with β between 1 and 2.3. The question of how surface processes are recorded in bed thickness distributions as a function of basin accommodation space is addressed with a stochastic sedimentation model. In zones of high accommodation, random, uncorrelated driving events produce a range of spatially correlated lithology fields. In zones of low accommodation, bed thickness distributions deviate from the random forcing imposed (an exponential thickness distribution). Model results are similar to those of a shallowing-upward parasequence recorded in 15 meters of offshore Gulf of Mexico Pleistocene core. These data record a deviation from exponentially distributed bed thicknesses from the deeper-water part of the cycle to the shallow part of the cycle, where bed amalgamation dominates. Finally, a stochastic basin-fill model is used to explore the primary controls on the stratigraphic architecture of turbidite channel fill in the South Timbalier 295 field, offshore Louisiana Gulf Coast. Spatial and temporal changes in topography and subsidence rate are shown to be the main controls on the turbidite channel stacking pattern within this basin. The model predicts the deposition of thick, amalgamated turbidite channel sands in the basin during a period of high initial subsidence, followed by deposition of thinner, less connected sands when the basin subsidence rate and accommodation space are low.
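The spectral exponent β in P(k) ~ k^(-β) is conventionally estimated by a least-squares fit in log-log space. A sketch on a synthetic series with a known β of 2 (the construction of the test series is illustrative, not the thesis data):

```python
import numpy as np

def spectral_slope(signal, dx=1.0):
    """Estimate beta in P(k) ~ k^(-beta) from the positive-wavenumber
    half of the periodogram, by least squares in log-log space."""
    n = len(signal)
    spec = np.abs(np.fft.rfft(signal - np.mean(signal))) ** 2
    k = np.fft.rfftfreq(n, d=dx)
    k, spec = k[1:], spec[1:]              # drop the zero wavenumber
    good = spec > 0
    slope, _ = np.polyfit(np.log(k[good]), np.log(spec[good]), 1)
    return -slope

# Synthetic test: shape white noise so its power goes as k^-2,
# then invert back to the space domain.
rng = np.random.default_rng(0)
n = 4096
white = np.fft.rfft(rng.standard_normal(n))
k = np.fft.rfftfreq(n, d=1.0)
amp = np.zeros_like(k)
amp[1:] = k[1:] ** -1.0                    # amplitude ~ k^-1, power ~ k^-2
series = np.fft.irfft(white * amp, n)
beta = spectral_slope(series)
```

The recovered β should fall near 2, inside the 1-2.3 range reported for the well-log and seismic profiles.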

  18. Complex Seismic Anisotropy at the Edges of a Very-low Velocity Province in the Lowermost Mantle

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Wen, L.

    2005-12-01

    A prominent very-low velocity province (VLVP) in the lowermost mantle has been revealed and extensively mapped in recent seismic studies (e.g., Wang and Wen, 2004). Seismic evidence unambiguously indicates that the VLVP is compositionally distinct, and its seismic structure is best explained by partial melting driven by a compositional change produced early in the Earth's history (Wen, 2001; Wen et al., 2001; Wang and Wen, 2004). In this presentation, we study the seismic anisotropic behavior inside the VLVP and its surrounding area using SKS and SKKS waveform data. We collected 272 deep earthquakes recorded by more than 80 stations of the Kaapvaal seismic array in southern Africa from 1997 to 1999. Based on data quality, we chose SKS and SKKS waveforms from 16 earthquakes to measure the anisotropic parameters, the fast polarization direction and the splitting time, using the method of Silver and Chan (1991). A total of 162 high-quality measurements were obtained based on statistical analysis of the shear-wave splitting results. The obtained anisotropy exhibits different patterns for the SKS and SKKS phases sampling inside the VLVP and at its edges. When the SKS and SKKS phases sample inside the VLVP, their fast polarization directions exhibit a pattern that strongly correlates with the stations, gradually changing from 11°N to 80°N across the seismic array from south to north and rotating back toward north over short distances for several of the northernmost stations. The anisotropy pattern obtained from the SKKS phases is the same as that from the SKS phases. However, when the SKS and SKKS phases sample at the edges of the VLVP, the measured anisotropy exhibits a very complex pattern. The obtained fast polarization directions change rapidly over small distances and no longer correlate with the stations; the measurements obtained from the SKS analysis also differ from those of the SKKS analysis.
    Because the SKS and SKKS phases have similar propagation paths in the lithosphere beneath the array but different sampling points near the core-mantle boundary, anisotropy in the lithosphere should have a similar influence on both phases. Therefore, the similar anisotropy obtained from the SKS and SKKS phases sampling inside the VLVP, and its correlation with the seismic stations, suggests that the observed anisotropy variation across the array is mainly due to anisotropy in the lithosphere beneath the Kaapvaal seismic array, and that the interior of the VLVP is isotropic or weakly anisotropic. On the other hand, for the SKS and SKKS phases sampling at the edges of the VLVP, the observed complex anisotropy pattern and the lack of correlation between the SKS and SKKS results indicate that part of the anisotropy must originate from the lowermost mantle near the exit points of these phases at the core-mantle boundary, revealing a complex flow pattern at the edges of the VLVP.
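The Silver and Chan (1991) measurement is, in essence, a grid search for the (fast direction, delay time) pair whose inverse splitting operator best re-linearizes the particle motion, typically by minimizing the smaller eigenvalue of the horizontal covariance matrix. A minimal noiseless sketch on a synthetic pulse (this toy code is a stand-in, not the authors' implementation):

```python
import numpy as np

def split(north, east, phi_deg, dt_samples):
    """Apply a splitting operator: rotate into the fast/slow frame,
    delay the slow component by dt_samples, rotate back."""
    phi = np.radians(phi_deg)
    c, s = np.cos(phi), np.sin(phi)
    fast = c * north + s * east
    slow = -s * north + c * east
    slow = np.roll(slow, dt_samples)
    return c * fast - s * slow, s * fast + c * slow

def measure_splitting(north, east, max_dt=20):
    """Grid search: the (phi, dt) whose inverse operator minimizes the
    smaller covariance eigenvalue, i.e. re-linearizes the motion."""
    best = (None, None, np.inf)
    for phi in range(0, 180, 2):
        for dt in range(0, max_dt + 1):
            n2, e2 = split(north, east, phi, -dt)   # inverse operator
            lam_min = np.linalg.eigvalsh(np.cov(np.vstack([n2, e2])))[0]
            if lam_min < best[2]:
                best = (phi, dt, lam_min)
    return best[0], best[1]

# Synthetic: a radially polarized pulse, split with phi=40 deg, dt=8 samples
t = np.linspace(-1, 1, 400)
pulse = np.exp(-(t * 6) ** 2) * np.cos(2 * np.pi * 3 * t)
az = np.radians(25.0)                  # incoming polarization azimuth
north, east = split(pulse * np.cos(az), pulse * np.sin(az), 40.0, 8)
phi_est, dt_est = measure_splitting(north, east)
```

On noise-free data the search recovers the imposed parameters exactly; on real SKS/SKKS records, confidence regions around the minimum provide the quality control behind the "162 high-quality measurements".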

  19. Relations among soil radon, environmental parameters, volcanic and seismic events at Mt. Etna (Italy)

    NASA Astrophysics Data System (ADS)

    Giammanco, S.; Ferrera, E.; Cannata, A.; Montalto, P.; Neri, M.

    2013-12-01

    From November 2009 to April 2011, soil radon activity was continuously monitored using a Barasol probe located on the upper NE flank of Mt. Etna volcano (Italy), close both to the Piano Provenzana fault and to the NE Rift. Seismic, volcanological and radon data were analysed together with data on environmental parameters such as air and soil temperature, barometric pressure, snowfall and rainfall. In order to find possible correlations among the above parameters, and hence to reveal possible anomalous trends in the radon time series, we used different statistical methods: i) multivariate linear regression; ii) cross-correlation; iii) coherence analysis through the wavelet transform. Multivariate regression indicated a modest influence of environmental parameters on soil radon (R2 = 0.31). When using 100-day time windows, the R2 values varied widely in time, reaching their maxima (~0.63-0.66) during summer. Cross-correlation analysis over 100-day moving windows showed that, as with multivariate linear regression, the summer period was characterised by the best correlation between radon data and environmental parameters. Lastly, wavelet coherence allowed a multi-resolution coherence analysis of the acquired time series. This approach allowed us to study the relations among the different signals in both the time and frequency domains. It confirmed the results of the previous methods, but also allowed us to recognize correlations between radon and environmental parameters at different observation scales (e.g., radon activity changed during strong precipitation, but also during anomalous variations of soil temperature uncorrelated with seasonal fluctuations). Using the above analysis, we recognized two periods when radon variations were significantly correlated with marked soil temperature changes and also with local seismic or volcanic activity. This allowed us to produce two different physical models of soil gas transport that explain the observed anomalies. Our work suggests that an accurate analysis of the relations among different signals requires different techniques that give complementary analytical information. In particular, the wavelet analysis proved the most effective in discriminating radon changes due to environmental influences from those correlated with impending seismic or volcanic events.
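The windowed-correlation idea behind methods i) and ii) can be sketched as follows, on synthetic daily series in which radon is coupled to temperature only during one interval (all series here are invented for illustration):

```python
import numpy as np

def windowed_correlation(x, y, window):
    """Pearson correlation of x and y over non-overlapping windows,
    e.g. 100-day segments of daily radon and soil-temperature data."""
    r = []
    for i in range(0, len(x) - window + 1, window):
        xs, ys = x[i:i + window], y[i:i + window]
        r.append(np.corrcoef(xs, ys)[0, 1])
    return np.array(r)

# Synthetic daily series: radon follows temperature only in one interval
rng = np.random.default_rng(1)
days = np.arange(400)
temp = np.sin(2 * np.pi * days / 365.0)
radon = rng.standard_normal(400)
radon[100:200] += 3.0 * temp[100:200]     # coupled "summer" interval
r = windowed_correlation(radon, temp, window=100)
```

The coupled window stands out with a high coefficient while the uncoupled windows hover near zero, which is the moving-window signature the study exploits to isolate seasonal influence.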

  20. Evidence of the non-extensive character of Earth's ambient noise.

    NASA Astrophysics Data System (ADS)

    Koutalonis, Ioannis; Vallianatos, Filippos

    2017-04-01

    Investigation of the dynamical features of ambient seismic noise is an important scientific and practical research challenge. At the same time, there is growing interest in approaching Earth physics through the science of complex systems and non-extensive statistical mechanics, a generalization of Boltzmann-Gibbs statistical physics (Vallianatos et al., 2016). This seems to be a promising framework for studying complex systems exhibiting phenomena such as long-range interactions and memory effects. In this work we use non-extensive statistical mechanics and signal analysis methods to explore the nature of ambient noise as measured at the stations of the HSNC in the South Aegean (Chatzopoulos et al., 2016). We analyzed the de-trended increment time series of ambient seismic noise, X(t), in time windows of 20 minutes to 10 seconds within "calm time zones" where human-induced noise is at a minimum. Following the non-extensive statistical physics approach, the probability distribution function of the increments of ambient noise is investigated. Analysis of the probability density function (PDF) p(X), normalized to zero mean and unit variance, shows that the fluctuations of Earth's ambient noise follow a q-Gaussian distribution, as defined in the framework of non-extensive statistical mechanics, indicating the possible existence of memory effects in Earth's ambient noise. References: F. Vallianatos, G. Papadakis, G. Michas, Generalized statistical mechanics approaches to earthquakes and tectonics, Proc. R. Soc. A, 472, 20160497, 2016. G. Chatzopoulos, I. Papadopoulos, F. Vallianatos, The Hellenic Seismological Network of Crete (HSNC): Validation and results of the 2013 aftershock, Advances in Geosciences, 41, 65-72, 2016.
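The q-Gaussian referred to above is the q-generalization of the Gaussian obtained by replacing the exponential with the q-exponential. A small sketch of the (unnormalized) shape, showing the heavy tails for q > 1 and the recovery of the ordinary Gaussian as q approaches 1:

```python
import numpy as np

def q_exponential(x, q):
    """e_q(x) = [1 + (1 - q) x]_+^(1 / (1 - q)); ordinary exp as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    base = np.maximum(1.0 + (1.0 - q) * x, 0.0)
    return base ** (1.0 / (1.0 - q))

def q_gaussian(x, q, beta=1.0):
    """Unnormalized q-Gaussian shape predicted by non-extensive
    statistical mechanics; heavier tails for q > 1."""
    return q_exponential(-beta * x * x, q)

x = np.linspace(-5, 5, 1001)
heavy = q_gaussian(x, q=1.5)    # fat-tailed, q > 1
gauss = q_gaussian(x, q=1.0)    # Boltzmann-Gibbs limit
```

Fitting q (and beta) to the normalized increment PDF p(X) is what distinguishes q-Gaussian fluctuations from simple Gaussian noise in the analysis described above.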

  1. Seasonal patterns of seismicity and deformation at the Alutu geothermal reservoir, Ethiopia, induced by hydrological loading

    NASA Astrophysics Data System (ADS)

    Birhanu, Yelebe; Wilks, Matthew; Biggs, Juliet; Kendall, J.-Michael; Ayele, Atalay; Lewi, Elias

    2018-05-01

    Seasonal variations in the seismicity of volcanic and geothermal reservoirs are usually attributed to the hydrological cycle. Here, we focus on the Aluto-Langano geothermal system, Ethiopia, where the climate is monsoonal and there is abundant shallow seismicity. We deployed temporary networks of seismometers and GPS receivers to understand the drivers of unrest. First, we show that a statistically significant peak in seismicity occurred 2-3 months after the main rainy season, with a second, smaller peak of variable timing. Seasonal seismicity is commonly attributed to variations in either surface loading or reservoir pore pressure. As loading causes subsidence and overpressure causes uplift, comparing seismicity rates with continuous GPS enables us to distinguish between the mechanisms. At Aluto, the major peak in seismicity coincides with the high stand of nearby lakes and with maximum subsidence, indicating that it is driven by surface loading. The magnitude of the loading is insufficient to trigger widespread crustal seismicity, but the geothermal reservoir at Aluto is likely sensitive to small perturbations in the stress field. Thus we demonstrate that monsoonal loading can produce seismicity in geothermal reservoirs, and that the likelihood of both triggered and induced seismicity varies seasonally.

  2. Scenarios for Evolving Seismic Crises: Possible Communication Strategies

    NASA Astrophysics Data System (ADS)

    Steacy, S.

    2015-12-01

    Recent advances in operational earthquake forecasting mean that we are very close to being able to confidently compute changes in earthquake probability as seismic crises develop. For instance, we now have statistical models such as ETAS and STEP, which demonstrate considerable skill in forecasting earthquake rates, and recent advances in Coulomb-based models are also showing much promise. Communicating changes in earthquake probability is likely to be very difficult, however, as the absolute probability of a damaging event will usually remain quite small despite a significant increase in the relative value. Here, we use a hybrid Coulomb/statistical model to compute probability changes for a series of earthquake scenarios in New Zealand. We discuss the strengths and limitations of the forecasts and suggest a number of possible mechanisms that might be used to communicate results during an actual developing seismic crisis.

  3. A Statistical Investigation on a Seismic Transient Occurred in Italy Between the 17th and 20th Centuries

    NASA Astrophysics Data System (ADS)

    Bragato, P. L.

    2017-03-01

    According to the historical earthquake catalog of Italy, the country experienced a pulse of seismicity between the 17th century, when the rate of destructive events increased by more than 100%, and the 20th century, which was characterized by a symmetric decrease. In the present work, I performed a statistical analysis to verify the reliability of this transient, considering different sources of bias and uncertainty, such as the completeness and declustering of the catalog, as well as errors in magnitude estimation. I also searched for confirmation external to the catalog by analyzing the correlation with volcanic activity. The similarity is high for the eruptive history of Vesuvius, which agrees both on the main rate changes of the 17th and 20th centuries and on minor variations in the intermediate period. Of general interest beyond the specific case of Italy, the observed rate changes suggest the existence of large-scale crustal processes that develop within decades and last for centuries, responsible for the synchronous activation/deactivation of remote, loosely connected faults in different tectonic domains. Although their origin remains unexplained (I discuss a possible link with climate change and the consequent variations in sea level), their existence and long duration are critical for seismic hazard computation. In fact, they introduce a time variability that is hard to predict and that undermines any hypothesis of regularity of the earthquake cycle on individual faults and systems of interconnected faults.

  4. Annotated bibliography, seismicity of and near the island of Hawaii and seismic hazard analysis of the East Rift of Kilauea

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klein, F.W.

    1994-03-28

    This bibliography is divided into the following four sections: Seismicity of Hawaii and Kilauea Volcano; Occurrence, locations and accelerations from large historical Hawaiian earthquakes; Seismic hazards of Hawaii; and Methods of seismic hazard analysis. It contains 62 references, most of which are accompanied by short abstracts.

  5. Probabilistic seismic loss estimation via endurance time method

    NASA Astrophysics Data System (ADS)

    Tafakori, Ehsan; Pourzeynali, Saeid; Estekanchi, Homayoon E.

    2017-01-01

    Probabilistic seismic loss estimation is a methodology that provides a quantitative and explicit expression of building performance in terms that address the interests of both owners and insurance companies. Applying the ATC 58 approach to the seismic loss assessment of buildings requires Incremental Dynamic Analysis (IDA), which involves hundreds of time-consuming analyses and thus hinders its wide application. The Endurance Time Method (ETM) is proposed herein as part of a demand propagation prediction procedure and is shown to be an economical alternative to IDA. Various scenarios were considered to this end, and their appropriateness was evaluated using statistical methods. The most precise and efficient scenario was validated through comparison against IDA-driven response predictions for 34 code-conforming benchmark structures, and it proved sufficiently precise while offering considerable efficiency. Loss values were then estimated by replacing IDA with the proposed ETM-based procedure within the ATC 58 framework; these values were found to suffer from varying inaccuracies, which were attributed to the discretized nature of the damage and loss prediction functions provided by ATC 58.

  6. Results of an Analysis of Field Studies of the Intrinsic Dynamic Characteristics Important for the Safety of Nuclear Power Plant Equipment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaznovsky, A. P., E-mail: kaznovskyap@atech.ru; Kasiyanov, K. G.; Ryasnyj, S. I.

    2015-01-15

    A classification of the equipment important for the safety of nuclear power plants is proposed in terms of its dynamic behavior under seismic loading. An extended bank of data from dynamic tests over the entire range of thermal and mechanical equipment in generating units with VVER-1000 and RBMK-1000 reactors is analyzed. Results are presented from a study of the statistical behavior of the distribution of vibrational frequencies and damping decrements, taking into account the "small perturbation" factor that affects the measured damping decrements. A need is identified to adjust the regulatory specifications for choosing the values of the damping decrements, with specified inertial loads on equipment owing to seismic effects, during design calculations. Minimum values of the decrements are determined and proposed for all types of equipment, as functions of the directions and natural vibration frequencies of the dynamic interactions, to be adopted as conservative standard values in the absence of actual experimental data in the course of design studies of seismic resistance.

  7. The Canarian Seismic Monitoring Network: design, development and first result

    NASA Astrophysics Data System (ADS)

    D'Auria, Luca; Barrancos, José; Padilla, Germán D.; García-Hernández, Rubén; Pérez, Aaron; Pérez, Nemesio M.

    2017-04-01

    Tenerife is an active volcanic island that experienced several eruptions of moderate intensity in historical times and a few explosive eruptions in the Holocene. The increasing population density and the substantial number of tourists are constantly raising the volcanic risk. In June 2016 the Instituto Volcanologico de Canarias started the deployment of a seismological volcano monitoring network consisting of 15 broadband seismic stations. The network became fully operational in November 2016. The aims of the network are both volcano monitoring and scientific research. Data are currently recorded continuously and processed in real time. Seismograms, hypocentral parameters, statistical information about the seismicity and other data are published on a web page. We show the technical characteristics of the network and an estimate of its detection threshold and earthquake location performance. Furthermore, we present other near-real-time procedures applied to the data: analysis of ambient noise to determine the shallow velocity model and temporal velocity variations, detection of earthquake multiplets through massive data mining of the seismograms, and automatic relocation of events through double-difference location.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McNally, K.C.; Minster, J.B.

    Revised estimates of seismic slip rates along the Middle America Trench are lower on average than plate convergence rates but match them locally (for example, Oaxaca). Along the Cocos-North American plate boundary this can be explained by nonuniformities in slip at points of aseismic ridge or fracture zone subduction. For at least 81 yr (and possibly several hundred years), no major (M_s ≥ 7.5) shallow earthquake is known to have occurred near the Orozco Fracture Zone and Tehuantepec Ridge areas. Compared with the average recurrence periods for large earthquakes (33 ± 8 yr since 1898 and 35 ± 24 yr between 1542 and 1979), this suggests either that a large (M ≥ 8.4) event may be anticipated at such locations, or that these are points of aseismic subduction. Large coastal terraces and evidence suggesting tectonic uplift are found onshore near the Orozco Fracture Zone. The larger discrepancy between plate convergence and seismic slip rates along the Cocos-Caribbean plate boundary is more likely due to decoupling and downbending of the subducted plate. We used the limited statistical evidence available to characterize both spatial and temporal deficiencies in recent seismic slip. The observations appear consistent with a possible forthcoming episode of more intense seismic activity. Based on a series of comparisons with carefully delineated aftershock zones, we conclude that the zones of anomalous seismic activity can be identified by a systematic, automated analysis of the worldwide earthquake catalog (m_b ≥ 4).

  9. Systematic Detection of Remotely Triggered Seismicity in Africa Following Recent Large Earthquakes

    NASA Astrophysics Data System (ADS)

    Ayorinde, A. O.; Peng, Z.; Yao, D.; Bansal, A. R.

    2016-12-01

    It is well known that large distant earthquakes can trigger micro-earthquakes and tectonic tremor during or immediately after their surface waves. Globally, triggered earthquakes have mostly been found in active plate-boundary regions. It is not clear whether they also occur within the stable intraplate regions of Africa or along the active East African Rift Zone. In this study we conduct a systematic search for remote triggering in Africa following recent large earthquakes, including the 2004 Mw 9.1 Sumatra and 2012 Mw 8.6 Indian Ocean earthquakes. In particular, the 2012 Indian Ocean earthquake is the largest known strike-slip earthquake and triggered a global increase in earthquakes of magnitude larger than 5.5, as well as numerous micro-earthquakes and tectonic tremors around the world. The entire African region was examined for possible remotely triggered seismicity using seismic data downloaded from the Incorporated Research Institutions for Seismology (IRIS) Data Management Center (DMC) and the GFZ German Research Center for Geosciences. We apply a 5-Hz high-pass filter to the continuous waveforms and visually identify high-frequency signals during and immediately after the large-amplitude surface waves. Spectrograms are computed as an additional tool to identify triggered seismicity, which we further confirm by statistical analysis comparing the high-frequency signals before and after the distant mainshocks. So far we have identified possible triggered seismicity in Botswana and northern Madagascar. This study could help us understand dynamic triggering in the diverse tectonic settings of the African continent.
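The 5-Hz high-pass step separates local high-frequency bursts from the long-period teleseismic surface waves. A minimal sketch on a synthetic record; for simplicity this uses a zero-phase FFT cutoff rather than the causal Butterworth filters typical in practice:

```python
import numpy as np

def highpass(signal, fs, corner_hz=5.0):
    """Zero-phase FFT 'brick wall' high-pass (a simple stand-in for the
    Butterworth filters usually applied to continuous waveforms)."""
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spec[freqs < corner_hz] = 0.0
    return np.fft.irfft(spec, len(signal))

# Synthetic record: a 1 Hz "surface wave" plus a brief 20 Hz local burst
fs = 100.0
t = np.arange(0, 60.0, 1.0 / fs)
surface = np.sin(2 * np.pi * 1.0 * t)
burst = np.zeros_like(t)
burst[3000:3100] = 0.5 * np.sin(2 * np.pi * 20.0 * t[3000:3100])
filtered = highpass(surface + burst, fs)
```

After filtering, the surface wave is removed while the burst survives, which is exactly the visual signature scanned for during and after the large-amplitude surface waves.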

  10. Bayesian inference on earthquake size distribution: a case study in Italy

    NASA Astrophysics Data System (ADS)

    Licia, Faenza; Carlo, Meletti; Laura, Sandri

    2010-05-01

    This paper focuses on the statistical distribution of earthquake sizes, studied using Bayesian inference. The strategy consists in defining an a priori distribution based on instrumental seismicity, modeled as a power-law distribution. Using the observed historical data, the power law is then updated to obtain the posterior distribution. The aim of this paper is to define the earthquake size distribution using all available seismic databases (i.e., instrumental and historical catalogs) and a robust statistical technique. We apply this methodology to Italian seismicity, dividing the territory into the source zones used for seismic hazard assessment, taken here as a reference model. The results suggest that each area has its own peculiar trend: while the power law captures the mean behavior of the earthquake size distribution, the posterior emphasizes different slopes in different areas. Our results are in general agreement with those used in seismic hazard assessment in Italy. However, some areas show a flattening of the curve, a significant departure from power-law behavior, implying local features that a power-law distribution cannot capture.
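If magnitudes above the completeness threshold are exponentially distributed (the Gutenberg-Richter power law in magnitude), a gamma prior on the decay rate is conjugate, so the posterior is available in closed form. A sketch on a synthetic catalog; the paper's actual prior, built from instrumental seismicity per source zone, may differ from this generic choice:

```python
import numpy as np

def posterior_beta(magnitudes, m_c, alpha0=1.0, rate0=1.0):
    """Conjugate Bayesian update for the Gutenberg-Richter decay rate
    beta (b-value = beta / ln 10). With a Gamma(alpha0, rate0) prior and
    an exponential likelihood for (m - m_c), the posterior is
    Gamma(alpha0 + n, rate0 + sum(m - m_c))."""
    excess = np.asarray(magnitudes) - m_c
    alpha_post = alpha0 + len(excess)
    rate_post = rate0 + excess.sum()
    beta_mean = alpha_post / rate_post          # posterior mean of beta
    return beta_mean, beta_mean / np.log(10.0)  # (beta, b-value)

# Synthetic catalog with true b = 1.0 (beta = ln 10), completeness m_c = 2.0
rng = np.random.default_rng(2)
mags = 2.0 + rng.exponential(1.0 / np.log(10.0), size=5000)
beta_hat, b_hat = posterior_beta(mags, m_c=2.0)
```

Updating such a prior zone by zone with the sparse historical events is what lets the posterior develop different slopes, or flatten, in different areas.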

  11. Pre-seismic anomalous geomagnetic signature related to the M8.3 earthquake that occurred in Chile on September 16, 2015

    NASA Astrophysics Data System (ADS)

    Armand Stanica, Dragos, ,, Dr.; Stanica, Dumitru, ,, Dr.; Vladimirescu, Nicoleta

    2016-04-01

    In this paper, we retrospectively analyzed geomagnetic data collected via the internet (www.intermagnet.com) over the interval 1 July to 30 September 2015 at the Easter Island (IMP) and Pilar (PIL) observatories, located in Chile and Argentina, respectively, to highlight a possible relationship between the pre-seismic anomalous behavior of the normalized function Bzn and the M8.3 earthquake that occurred offshore Coquimbo (Chile) on September 16, 2015. The daily mean distributions of the normalized function Bzn = Bz/Bperp (where Bz is the vertical component of the geomagnetic field and Bperp is the geomagnetic component perpendicular to the geoelectric strike) and its standard deviation (STDEV) are computed in the ULF frequency range 0.001 Hz to 0.0083 Hz using FFT band-pass filter analysis. It was demonstrated that under pre-seismic conditions Bzn is significantly enhanced owing to changes in crustal electrical conductivity, possibly associated with earthquake-induced rupture processes and high-pressure fluid flow through the fault system developed inside the focal zone and its neighboring area. After analyzing the anomalous values of the normalized function Bzn obtained at the Easter Island and Pilar observatories, the second taken as a reference, we used a statistical analysis based on a standardized random variable equation to identify, on 1-2 September 2015, a pre-seismic signature related to the M8.3 earthquake. The lead time was 14 days before the M8.3 earthquake occurrence. We conclude that the proposed geomagnetic methodology might provide useful information for extreme earthquake hazard assessment.
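The normalized function and the standardized-variable screening can be sketched as follows; the daily values below are hypothetical, invented only to show how a single anomalous day stands out:

```python
import numpy as np

def bzn(bz, bperp):
    """Normalized function Bzn = Bz / Bperp (vertical component over the
    component perpendicular to the geoelectric strike)."""
    return np.asarray(bz, dtype=float) / np.asarray(bperp, dtype=float)

def standardized_anomaly(series):
    """Standardized random variable z = (x - mean) / std; values well
    beyond ~2 flag days when Bzn departs from its background."""
    x = np.asarray(series, dtype=float)
    return (x - x.mean()) / x.std()

# Hypothetical 30-day daily means (nT), with one anomalous day
bz = 25.0 + np.linspace(-0.5, 0.5, 30)   # mild background drift
bperp = np.full(30, 50.0)
bz[20] = 31.0                             # pre-seismic-like excursion
daily = bzn(bz, bperp)
z = standardized_anomaly(daily)
```

A single day whose z-score exceeds several standard deviations, relative to a reference observatory showing no such excursion, is the kind of signature the study identifies on 1-2 September 2015.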

  12. Seismic facies analysis based on self-organizing map and empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Du, Hao-kun; Cao, Jun-xing; Xue, Ya-juan; Wang, Xing-jian

    2015-01-01

    Seismic facies analysis plays an important role in seismic interpretation and reservoir model building by offering an effective way to identify changes in geofacies between wells. The selection of input seismic attributes and their time window has a strong effect on the validity of the classification and requires iterative experimentation and prior knowledge. In general, clustering is sensitive to noise when the waveform itself serves as the input data, especially with a narrow window. To overcome this limitation, the Empirical Mode Decomposition (EMD) method is introduced into waveform classification based on the self-organizing map (SOM). We first de-noise the seismic data using EMD and then cluster the data using a 1D grid SOM. The main advantages of this method are resolution enhancement and noise reduction. 3D seismic data from the western Sichuan basin, China, are used for validation. The application results show that seismic facies analysis is improved and better supports interpretation. Its strong tolerance to noise makes the proposed method a better seismic facies analysis tool than the classical 1D grid SOM method, especially for waveform clustering with a narrow window.
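A 1D grid SOM for waveform classification can be sketched as follows. This toy version omits the EMD de-noising step entirely and clusters synthetic waveform families; it is a generic SOM, not the authors' implementation:

```python
import numpy as np

def train_som_1d(traces, n_nodes=4, epochs=200, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal 1-D grid SOM: each node holds a prototype trace, and
    neighbors on the grid are pulled along with the winner, producing an
    ordered facies map."""
    rng = np.random.default_rng(seed)
    nodes = 0.1 * rng.standard_normal((n_nodes, traces.shape[1]))
    grid = np.arange(n_nodes)
    for epoch in range(epochs):
        lr = lr0 * (1.0 - epoch / epochs)            # decaying learning rate
        sigma = max(sigma0 * (1.0 - epoch / epochs), 0.5)
        for x in traces[rng.permutation(len(traces))]:
            bmu = int(np.argmin(np.linalg.norm(nodes - x, axis=1)))
            h = np.exp(-((grid - bmu) ** 2) / (2.0 * sigma ** 2))
            nodes += lr * h[:, None] * (x - nodes)   # neighborhood update
    return nodes

def classify(traces, nodes):
    return np.array([int(np.argmin(np.linalg.norm(nodes - x, axis=1)))
                     for x in traces])

# Two synthetic waveform families with different dominant frequencies
rng = np.random.default_rng(3)
t = np.linspace(0, 1, 64)
fam_a = np.sin(2 * np.pi * 3 * t) + 0.1 * rng.standard_normal((20, 64))
fam_b = np.sin(2 * np.pi * 8 * t) + 0.1 * rng.standard_normal((20, 64))
traces = np.vstack([fam_a, fam_b])
nodes = train_som_1d(traces, n_nodes=2)
labels = classify(traces, nodes)
```

In the paper's workflow the input traces would first be de-noised with EMD, which is what gives the method its tolerance to noise with narrow windows.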

  13. Pole tide triggering of seismicity

    NASA Astrophysics Data System (ADS)

    Gorshkov, V.

    2015-08-01

    The influence of the pole tide (PT) on the intensity of the seismic process is investigated on the basis of the Harvard Centroid Moment Tensor catalogue (CMT). The normal and shear stresses excited by the PT were calculated for each earthquake (EQ) in the CMT (32.3 thousand EQ events after for- and aftershock declustering). We find two maxima of PT influence on weak (magnitude below 5.5) thrust-slip EQs, near both extrema (minimum and maximum) of the shear stress. This influence is statistically significant at the 95% level by the Schuster and χ² criteria and could explain the 0.6-year periodicity in the seismic intensity spectrum. The PT influence on seismicity becomes negligible when the PT variations decrease to 100 mas, which could explain the 6-7-year periodicity in the seismic intensity spectrum.
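The Schuster criterion mentioned above assigns each event a tidal phase and tests whether the phases cluster: with resultant length R of N unit phase vectors, the chance probability is p = exp(-R²/N). A sketch on synthetic phases (the von Mises sample is illustrative, not the CMT data):

```python
import numpy as np

def schuster_p(phases_rad):
    """Schuster test: probability that the observed clustering of event
    phases (e.g. pole-tide phase at each earthquake) arises by chance;
    small p indicates significant tidal modulation."""
    c = np.sum(np.cos(phases_rad))
    s = np.sum(np.sin(phases_rad))
    n = len(phases_rad)
    return np.exp(-(c * c + s * s) / n)

rng = np.random.default_rng(4)
uniform = rng.uniform(0.0, 2.0 * np.pi, 1000)      # no tidal preference
clustered = rng.vonmises(0.0, 2.0, 1000)           # phases near 0
p_uniform = schuster_p(uniform)
p_clustered = schuster_p(clustered)
```

Uniform phases give p of order 1, while concentrated phases drive p toward zero, which is the 95%-significance evidence cited for the thrust-slip events.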

  14. Improvement of a picking algorithm for real-time P-wave detection by kurtosis

    NASA Astrophysics Data System (ADS)

    Ishida, H.; Yamada, M.

    2016-12-01

    Earthquake early warning (EEW) requires fast and accurate P-wave detection. The current EEW system in Japan uses the STA/LTA algorithm (Allen, 1978) to detect P-wave arrivals. However, some stations did not trigger during the 2011 Great Tohoku Earthquake because of the emergent onset. In addition, the accuracy of P-wave detection is very important: on August 1, 2016, the EEW system issued a false alarm of M9 in the Tokyo region due to thunder noise. To solve these problems, we use a P-wave detection method based on kurtosis statistics, which detects changes in the statistical distribution of the waveform amplitude. This method was developed recently (Saragiotis et al., 2002) and has been used for off-line analysis such as building seismic catalogs. To apply this method to EEW, we need to remove the acausal calculation and enable real-time processing. Here, we propose a real-time P-wave detection method using kurtosis statistics with a noise filter. To avoid false triggering on noise, we incorporated a simple filter to classify seismic signal and noise. Following Kong et al. (2016), we used the interquartile range and zero-crossing rate for the classification. The interquartile range is an amplitude measure equal to the middle 50% of amplitudes in a certain time window. The zero-crossing rate is a simple frequency measure that counts the number of times the signal crosses the zero baseline. A discriminant function combining these measures was constructed by linear discriminant analysis. To test the kurtosis method, we used strong-motion records for 62 earthquakes between April 2005 and July 2015 with seismic intensity greater than or equal to 6-lower on the JMA intensity scale. Records with hypocentral distance < 200 km were used for the analysis. An attached figure shows the error of P-wave detection speed for the STA/LTA and kurtosis methods against manual picks: the median error is 0.13 s for the STA/LTA method and 0.035 s for the kurtosis method.
The kurtosis method tends to be more sensitive to small changes in amplitude. Our approach will contribute to improving the accuracy of earthquake source location and the shaking intensity estimates used for earthquake early warning.
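
A stripped-down causal version of such a kurtosis picker can be sketched as follows (trailing windows only, so the scheme could run in real time; the IQR/zero-crossing noise filter described above is omitted, and the window length is an arbitrary choice):

```python
import numpy as np
from scipy.stats import kurtosis

def kurtosis_pick(trace, win=100):
    """Pick the P onset as the sample where the sliding-window kurtosis
    characteristic function increases most sharply.

    Only trailing (causal) windows are used, so no future samples
    are needed at pick time.
    """
    n = len(trace)
    cf = np.zeros(n)
    for i in range(win, n):
        cf[i] = kurtosis(trace[i - win:i])  # Fisher kurtosis of the window
    dcf = np.diff(cf)
    dcf[dcf < 0] = 0.0                      # keep only increases
    return int(np.argmax(dcf))
```

On a synthetic trace of low-amplitude noise followed by a high-amplitude arrival, the pick lands within a few samples of the true onset, because a single large sample entering the window drives the kurtosis up sharply.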

  15. Study of time dynamics of seismicity for the Mexican subduction zone by means of the visibility graph method.

    NASA Astrophysics Data System (ADS)

    Ramírez-Rojas, Alejandro; Telesca, Luciano; Lovallo, Michele; Flores, Leticia

    2015-04-01

    By using the visibility graph (VG) method, five magnitude time series extracted from the seismic catalog of the Mexican subduction zone were investigated. The five seismic sequences represent the seismicity which occurred between 2005 and 2012 in five seismic areas: Guerrero, Chiapas, Oaxaca, Jalisco and Michoacan. Among the five sequences, the Jalisco sequence shows VG properties significantly different from those of the other four. Such a difference could stem from the different tectonic setting of Jalisco with respect to the other four areas. The VG properties of the seismic sequences have been related to the more typical seismological characteristics (the b-value and a-value of the Gutenberg-Richter law). The present study was supported by the Bilateral Project Italy-Mexico "Experimental stick-slip models of tectonic faults: innovative statistical approaches applied to synthetic seismic sequences", jointly funded by MAECI (Italy) and AMEXCID (Mexico) in the framework of the Bilateral Agreement for Scientific and Technological Cooperation PE 2014-2016.
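
The natural visibility criterion underlying the VG method (Lacasa et al., 2008) is simple to state: two samples are linked if the straight line between them passes above every intermediate sample. A brute-force O(n²) sketch:

```python
import numpy as np

def visibility_edges(y):
    """Natural visibility graph of a (magnitude) time series.

    Nodes are samples; i and j are linked if the straight line between
    (i, y[i]) and (j, y[j]) stays above every intermediate sample.
    Returns the edge set as (i, j) pairs with i < j.
    """
    y = np.asarray(y, dtype=float)
    n = y.size
    edges = set()
    for i in range(n - 1):
        for j in range(i + 1, n):
            k = np.arange(i + 1, j)
            # adjacent samples always see each other (empty k)
            if k.size == 0 or np.all(y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)):
                edges.add((i, j))
    return edges
```

The degree distribution of this graph is the quantity typically compared across seismic areas; a monotone ramp links only neighbors, while a valley lets its two rims see each other.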

  16. Dominant seismic sources for the cities in South Sumatra

    NASA Astrophysics Data System (ADS)

    Sunardi, Bambang; Sakya, Andi Eka; Masturyono, Murjaya, Jaya; Rohadi, Supriyanto; Sulastri, Putra, Ade Surya

    2017-07-01

    The subduction zone west of Sumatra and the Sumatran fault zone are active seismic sources. Seismotectonically, South Sumatra can be affected by earthquakes triggered by these seismic sources. This paper discusses the contribution of each seismic source to the earthquake hazard of the cities of Palembang, Prabumulih, Banyuasin, Ogan Ilir, Ogan Komering Ilir, South Oku, Musi Rawas and Empat Lawang. The hazards are presented in the form of seismic hazard curves. The study was conducted using Probabilistic Seismic Hazard Analysis (PSHA) at 2% probability of exceedance in 50 years. The seismic sources used in the analysis included the megathrust zone M2 of Sumatra and South Sumatra, background seismic sources, and shallow crustal sources consisting of the Ketaun, Musi, Manna and Kumering faults. The results showed that for cities relatively far from the seismic sources, the subduction/megathrust source at depths ≤ 50 km contributed most to the seismic hazard, whereas in the other areas deep background sources at depths of more than 100 km dominated the hazard.
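
For orientation only, a single-source hazard curve of the kind shown in such studies can be sketched by combining a Gutenberg-Richter magnitude distribution with a lognormal ground-motion model. The rate, b-value and GMPE coefficients below are invented placeholders, not the values used in the study; note that 2% in 50 years corresponds to an annual exceedance rate of about -ln(0.98)/50 ≈ 4.04e-4:

```python
import numpy as np
from scipy.stats import norm

def hazard_curve(pga_levels, rate=0.2, b=1.0, mmin=5.0, mmax=8.0, dist_km=50.0):
    """Toy single-source PSHA hazard curve.

    rate : annual rate of events with M >= mmin for this source.
    Returns the annual exceedance rate for each PGA level (g).
    """
    mags = np.arange(mmin, mmax, 0.1)
    # truncated Gutenberg-Richter probability mass per magnitude bin
    beta = b * np.log(10.0)
    pdf = beta * np.exp(-beta * (mags - mmin))
    pmf = pdf / pdf.sum()
    # placeholder GMPE: ln PGA = -1.0 + 0.9 M - 1.3 ln(R + 10), sigma = 0.6
    ln_med = -1.0 + 0.9 * mags - 1.3 * np.log(dist_km + 10.0)
    lam = []
    for a in pga_levels:
        p_exceed = 1.0 - norm.cdf((np.log(a) - ln_med) / 0.6)
        lam.append(rate * np.sum(pmf * p_exceed))
    return np.array(lam)
```

Summing such curves over all sources and reading off the ground motion at the target annual rate yields the design value; the hazard curves in the paper are the per-source ingredients of that sum.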

  17. Using a coupled hydro-mechanical fault model to better understand the risk of induced seismicity in deep geothermal projects

    NASA Astrophysics Data System (ADS)

    Abe, Steffen; Krieger, Lars; Deckert, Hagen

    2017-04-01

    The changes of fluid pressures related to the injection of fluids into the deep underground, for example during geothermal energy production, can potentially reactivate faults and thus cause induced seismic events. Therefore, an important aspect in the planning and operation of such projects, in particular in densely populated regions such as the Upper Rhine Graben in Germany, is the estimation and mitigation of the induced seismic risk. The occurrence of induced seismicity depends on a combination of hydraulic properties of the underground, mechanical and geometric parameters of the fault, and the fluid injection regime. In this study we are therefore employing a numerical model to investigate the impact of fluid pressure changes on the dynamics of the faults and the resulting seismicity. The approach combines a model of the fluid flow around a geothermal well based on a 3D finite difference discretisation of the Darcy-equation with a 2D block-slider model of a fault. The models are coupled so that the evolving pore pressure at the relevant locations of the hydraulic model is taken into account in the calculation of the stick-slip dynamics of the fault model. Our modelling approach uses two subsequent modelling steps. Initially, the fault model is run by applying a fixed deformation rate for a given duration and without the influence of the hydraulic model in order to generate the background event statistics. Initial tests have shown that the response of the fault to hydraulic loading depends on the timing of the fluid injection relative to the seismic cycle of the fault. Therefore, multiple snapshots of the fault's stress- and displacement state are generated from the fault model. In a second step, these snapshots are then used as initial conditions in a set of coupled hydro-mechanical model runs including the effects of the fluid injection. 
This set of models is then compared with the background event statistics to evaluate the change in the probability of seismic events. The event data such as location, magnitude, and source characteristics can be used as input for numerical wave propagation models. This allows the translation of seismic event statistics generated by the model into ground shaking probabilities.
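
The block-slider idea can be caricatured with a single spring-slider whose failure threshold is a friction coefficient times the effective normal stress; raising pore pressure lowers the threshold and shortens the seismic cycle. This is a heavily simplified sketch with invented parameters, not the coupled 3D hydro-mechanical model of the study:

```python
def spring_slider_events(duration, load_rate=1.0, mu=0.6, sigma_n=100.0,
                         pore_pressure=0.0, stress_drop_frac=0.1, dt=0.01):
    """Toy spring-slider fault: tectonic loading raises the shear stress;
    slip occurs when it reaches mu * (sigma_n - pore_pressure), and each
    event drops the stress by a fixed fraction of the threshold.

    Returns the list of event times within `duration`.
    """
    tau = 0.0
    threshold = mu * (sigma_n - pore_pressure)
    events = []
    for step in range(int(duration / dt)):
        tau += load_rate * dt
        if tau >= threshold:
            events.append(step * dt)
            tau -= stress_drop_frac * threshold
    return events
```

Even this caricature reproduces the qualitative effect the study examines: the same loading history produces more frequent events once pore pressure reduces the effective normal stress, and the response depends on where in the loading cycle the pressure change arrives.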

  18. Computing the Distribution of Pareto Sums Using Laplace Transformation and Stehfest Inversion

    NASA Astrophysics Data System (ADS)

    Harris, C. K.; Bourne, S. J.

    2017-05-01

    In statistical seismology, the properties of distributions of total seismic moment are important for constraining seismological models, such as the strain partitioning model (Bourne et al. J Geophys Res Solid Earth 119(12): 8991-9015, 2014). This work was motivated by the need to develop appropriate seismological models for the Groningen gas field in the northeastern Netherlands, in order to address the issue of production-induced seismicity. The total seismic moment is the sum of the moments of individual seismic events, which, in common with many other natural processes, are governed by Pareto or "power law" distributions. The maximum possible moment for an induced seismic event can be constrained by geomechanical considerations, but rather poorly, and for Groningen it cannot be reliably inferred from the frequency distribution of moment magnitude pertaining to the catalogue of observed events. In such cases it is usual to work with the simplest form of the Pareto distribution without an upper bound, and we follow the same approach here. In the case of seismicity, the exponent β appearing in the power-law relation is small enough for the variance of the unbounded Pareto distribution to be infinite, which renders standard statistical methods concerning sums of statistical variables, based on the central limit theorem, inapplicable. Determinations of the properties of sums of moderate to large numbers of Pareto-distributed variables with infinite variance have traditionally been addressed using intensive Monte Carlo simulations. This paper presents a novel method for determining the properties of such sums that is accurate, fast and easily implemented, and is applicable to Pareto-distributed variables for which the power-law exponent β lies within the interval [0, 1]. 
It is based on shifting the original variables so that a non-zero density is obtained exclusively for non-negative values of the parameter and is identically zero elsewhere, a property that is shared by the sum of an arbitrary number of such variables. The technique involves applying the Laplace transform to the normalized sum (which is simply the product of the Laplace transforms of the densities of the individual variables, with a suitable scaling of the Laplace variable), and then inverting it numerically using the Gaver-Stehfest algorithm. After validating the method using a number of test cases, it was applied to address the distribution of total seismic moment, and the quantiles computed for various numbers of seismic events were compared with those obtained in the literature using Monte Carlo simulation. Excellent agreement was obtained. As an application, the method was applied to the evolution of total seismic moment released by tremors due to gas production in the Groningen gas field in the northeastern Netherlands. The speed, accuracy and ease of implementation of the method allows the development of accurate correlations for constraining statistical seismological models using, for example, the maximum-likelihood method. It should also be of value in other natural processes governed by Pareto distributions with exponent less than unity.
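
The Gaver-Stehfest inversion step itself is short to implement. The sketch below follows the standard formulation (even N, with N = 14 a common choice for double precision) and is not the authors' code:

```python
from math import factorial, log

def stehfest_invert(F, t, N=14):
    """Numerically invert a Laplace transform F(s) at time t using the
    Gaver-Stehfest algorithm.

    N must be even; the weights V_k are the standard Stehfest
    coefficients, computed here from factorials.
    """
    ln2 = log(2.0)
    total = 0.0
    for k in range(1, N + 1):
        v = 0.0
        for j in range((k + 1) // 2, min(k, N // 2) + 1):
            v += (j ** (N // 2) * factorial(2 * j)
                  / (factorial(N // 2 - j) * factorial(j) * factorial(j - 1)
                     * factorial(k - j) * factorial(2 * j - k)))
        v *= (-1) ** (k + N // 2)
        total += v * F(k * ln2 / t)
    return ln2 / t * total
```

For example, inverting F(s) = 1/(s + 1) at t = 1 recovers e^(-1) to several digits; for the Pareto-sum application, F would instead be the product of the individual transforms described above.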

  19. Geothermal production and reduced seismicity: Correlation and proposed mechanism

    NASA Astrophysics Data System (ADS)

    Cardiff, Michael; Lim, David D.; Patterson, Jeremy R.; Akerley, John; Spielman, Paul; Lopeman, Janice; Walsh, Patrick; Singh, Ankit; Foxall, William; Wang, Herbert F.; Lord, Neal E.; Thurber, Clifford H.; Fratta, Dante; Mellors, Robert J.; Davatzes, Nicholas C.; Feigl, Kurt L.

    2018-01-01

    At Brady Hot Springs, a geothermal field in Nevada, heated fluids have been extracted, cooled, and re-injected to produce electrical power since 1992. Analysis of daily pumping records and catalogs of microseismicity between 2010 and 2015 indicates a statistically significant correlation between days when the daily volume of production was at or above its long-term average rate and days when no seismic event was detected. Conversely, shutdowns in pumping for plant maintenance correlate with increased microseismicity. We hypothesize that the effective stress in the subsurface has adapted to the long-term normal operations (deep extraction) at the site. Under this hypothesis, extraction of fluids inhibits fault slip by increasing the effective stress on faults; in contrast, brief pumping cessations represent times when effective stress is decreased below its long-term average, increasing the likelihood of microseismicity.
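
A day-by-day correlation of this kind is often summarized as a 2x2 contingency table (high/low production versus quiet/active days) and tested with a chi-square statistic. The sketch below, run on synthetic data, is illustrative only; the paper's actual statistical procedure may differ:

```python
import numpy as np
from scipy.stats import chi2_contingency

def pumping_seismicity_table(volumes, event_counts):
    """2x2 contingency test of the association between above-average
    production days and seismically quiet days.

    Returns the table [[high & quiet, high & active],
                       [low & quiet,  low & active]] and the p-value.
    """
    volumes = np.asarray(volumes, dtype=float)
    quiet = np.asarray(event_counts) == 0
    high = volumes >= volumes.mean()
    table = np.array([
        [np.sum(high & quiet), np.sum(high & ~quiet)],
        [np.sum(~high & quiet), np.sum(~high & ~quiet)],
    ])
    chi2, p, _, _ = chi2_contingency(table)
    return table, p
```

A small p-value indicates that quiet days and above-average production days co-occur more often than chance would allow, which is the shape of the association reported here.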

  20. Surface-wave and refraction tomography at the FACT Site, Sandia National Laboratories, Albuquerque, New Mexico.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abbott, Robert E.; Bartel, Lewis Clark; Pullammanappallil, Satish

    2006-08-01

    We present a technique that allows for the simultaneous acquisition and interpretation of both shear-wave and compressive-wave 3-D velocities. The technique requires no special seismic sources or array geometries, and is suited to studies with small source-receiver offsets. The method also effectively deals with unwanted seismic arrivals by using the statistical properties of the data itself to discriminate against spurious picks. We demonstrate the technique with a field experiment at the Facility for Analysis, Calibration, and Testing at Sandia National Laboratories, Albuquerque, New Mexico. The resulting 3-D shear-velocity and compressive-velocity distributions are consistent with surface geologic mapping. The averaged velocities and Vp/Vs ratio in the upper 30 meters are also consistent with examples found in the scientific literature.

  1. Catalog of earthquake hypocenters at Alaskan volcanoes: January 1 through December 31, 2004

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.; Tytgat, Guy; Estes, Steve; Prejean, Stephanie; Sanchez, John J.; Sanches, Rebecca; McNutt, Stephen R.; Paskievitch, John

    2005-01-01

    The Alaska Volcano Observatory (AVO), a cooperative program of the U.S. Geological Survey, the Geophysical Institute of the University of Alaska Fairbanks, and the Alaska Division of Geological and Geophysical Surveys, has maintained seismic monitoring networks at historically active volcanoes in Alaska since 1988. The primary objectives of the seismic program are the real-time seismic monitoring of active, potentially hazardous, Alaskan volcanoes and the investigation of seismic processes associated with active volcanism. This catalog presents the calculated earthquake hypocenter and phase arrival data, and changes in the seismic monitoring program, for the period January 1 through December 31, 2004. The monitored volcanoes include Mount Wrangell, Mount Spurr, Redoubt Volcano, Iliamna Volcano, Augustine Volcano, the Katmai volcanic cluster (Snowy Mountain, Mount Griggs, Mount Katmai, Novarupta, Trident Volcano, Mount Mageik, Mount Martin), Mount Peulik, Aniakchak Crater, Mount Veniaminof, Pavlof Volcano, Mount Dutton, Isanotski Peaks, Shishaldin Volcano, Fisher Caldera, Westdahl Peak, Akutan Peak, Makushin Volcano, Okmok Caldera, Great Sitkin Volcano, Kanaga Volcano, Tanaga Volcano, and Mount Gareloi. Over the past year, formal monitoring of Okmok, Tanaga and Gareloi was announced following an extended period of monitoring to determine the background seismicity at each volcanic center. The seismicity at Mount Peulik was still being studied at the end of 2004 and had yet to be added to the list of monitored volcanoes in the AVO weekly update.
AVO located 6928 earthquakes in 2004. Monitoring highlights in 2004 include: (1) an earthquake swarm at Westdahl Peak in January; (2) an increase in seismicity at Mount Spurr starting in February and continuing through the end of the year into 2005; (3) low-level tremor and low-frequency events related to intermittent ash and steam emissions at Mount Veniaminof between April and October; (4) low-level tremor at Shishaldin Volcano between April and October; (5) an earthquake swarm at Akutan in July; and (6) low-level tremor at Okmok Caldera throughout the year (Table 2). Instrumentation and data acquisition highlights in 2004 were the installation of subnetworks on Mount Peulik and Korovin Volcano and the installation of broadband stations to augment the Katmai and Spurr subnetworks. This catalog includes: (1) a description of instruments deployed in the field and their locations; (2) a description of earthquake detection, recording, analysis, and data archival systems; (3) a description of velocity models used for earthquake locations; (4) a summary of earthquakes located in 2004; and (5) an accompanying UNIX tar-file with a summary of earthquake origin times, hypocenters, magnitudes, phase arrival times, and location quality statistics; daily station usage statistics; and all HYPOELLIPSE files used to determine the earthquake locations in 2004.

  2. Implementing the effect of the rupture directivity on PSHA maps: Application to the Marmara Region (Turkey)

    NASA Astrophysics Data System (ADS)

    Herrero, Andre; Spagnuolo, Elena; Akinci, Aybige; Pucci, Stefano

    2016-04-01

    In the present study we attempted to improve seismic hazard assessment by taking into account possible sources of epistemic uncertainty and the azimuthal variability of the ground motion which, at a particular site, is significantly influenced by the rupture mechanism and the rupture direction relative to the site. As a study area we selected the Marmara Region (Turkey), especially the city of Istanbul, which is characterized by one of the highest levels of seismic risk in Europe and the Mediterranean region. The seismic hazard in the city is mainly associated with two active fault segments located about 20-30 km south of Istanbul. In this perspective, we first proposed a methodology to incorporate new information, such as the nucleation point, in a probabilistic seismic hazard analysis (PSHA) framework. Secondly, we introduced information about those fault segments by focusing on the fault rupture characteristics that affect the azimuthal variation of the ground motion spatial distribution, i.e. the source directivity effect, and its influence on the PSHA. An analytical model developed by Spudich and Chiou (2008) is used as a corrective factor that modifies the Next Generation Attenuation (NGA, Power et al. 2008) ground motion predictive equations (GMPEs) by introducing rupture-related parameters that are generally lumped together into the term directivity effect. We used the GMPEs derived by Abrahamson and Silva (2008) and Boore and Atkinson (2008); our results are given in terms of 10% probability of exceedance in 50 years (at several periods from 0.5 s to 10 s) on rock site conditions; the correction for directivity contributes significantly to the ratio between the seismic hazard computed using the directivity model and standard-practice seismic hazard.
In particular, we benefited from the dynamic simulations of a previous study (Aochi & Ulrich, 2015), aimed at evaluating the seismic potential of the Marmara region, to derive a statistical distribution for the nucleation position. Our results suggest that accounting for rupture-related parameters in a PSHA using deterministic information from dynamic models is feasible and, in particular, that the use of a non-uniform statistical distribution for the nucleation position has serious consequences for the hazard assessment. Since the directivity effect is conditional on the nucleation position, the hazard map changes with the assumptions made. A worst-case scenario (both faults rupturing towards the city of Istanbul) predicts up to a 25% change relative to the standard formulation at 2 s, and the change increases at longer periods. This result differs markedly if a deterministically based nucleation position is assumed.

  3. Induced Seismicity Monitoring System

    NASA Astrophysics Data System (ADS)

    Taylor, S. R.; Jarpe, S.; Harben, P.

    2014-12-01

    There are many seismological aspects associated with monitoring of permanent storage of carbon dioxide (CO2) in geologic formations. Many of these include monitoring underground gas migration through detailed tomographic studies of rock properties, integrity of the cap rock and micro seismicity with time. These types of studies require expensive deployments of surface and borehole sensors in the vicinity of the CO2 injection wells. Another problem that may exist in CO2 sequestration fields is the potential for damaging induced seismicity associated with fluid injection into the geologic reservoir. Seismic hazard monitoring in CO2 sequestration fields requires a seismic network over a spatially larger region possibly having stations in remote settings. Expensive observatory-grade seismic systems are not necessary for seismic hazard deployments or small-scale tomographic studies. Hazard monitoring requires accurate location of induced seismicity to magnitude levels only slightly less than that which can be felt at the surface (e.g. magnitude 1), and the frequencies of interest for tomographic analysis are ~1 Hz and greater. We have developed a seismo/acoustic smart sensor system that can achieve the goals necessary for induced seismicity monitoring in CO2 sequestration fields. The unit is inexpensive, lightweight, easy to deploy, can operate remotely under harsh conditions and features 9 channels of recording (currently 3C 4.5 Hz geophone, MEMS accelerometer and microphone). An on-board processor allows for satellite transmission of parameter data to a processing center. Continuous or event-detected data is kept on two removable flash SD cards of up to 64+ Gbytes each. If available, data can be transmitted via cell phone modem or picked up via site visits. Low-power consumption allows for autonomous operation using only a 10 watt solar panel and a gel-cell battery. 
The system has been successfully tested for long-term (> 6 months) remote operations over a wide range of environments including summer in Arizona to winter above 9000' in the mountains of southern Colorado. Statistically based on-board processing is used for detection, arrival time picking, back azimuth estimation and magnitude estimates from coda waves and acoustic signals.
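
The on-board detection mentioned above is commonly built on an STA/LTA ratio. One standard formulation (short-term average of signal energy over the average of the preceding long window) can be sketched as:

```python
import numpy as np

def sta_lta_trigger(trace, nsta=50, nlta=500, threshold=4.0):
    """STA/LTA event detector on squared amplitudes.

    STA is averaged over the most recent `nsta` samples; LTA over the
    `nlta` samples immediately before the STA window, so the arrival
    does not contaminate the noise estimate. Returns the first sample
    index whose ratio exceeds `threshold`, or None.
    """
    e = np.asarray(trace, dtype=float) ** 2
    c = np.concatenate(([0.0], np.cumsum(e)))  # prefix sums of energy
    for i in range(nsta + nlta, len(e)):
        sta = (c[i] - c[i - nsta]) / nsta
        lta = (c[i - nsta] - c[i - nsta - nlta]) / nlta
        if lta > 0 and sta / lta > threshold:
            return i
    return None
```

The window lengths and threshold here are illustrative; an operational system would tune them per station and, as the abstract notes, combine the trigger with picking, back-azimuth and magnitude estimation.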

  4. An application of synthetic seismicity in earthquake statistics - The Middle America Trench

    NASA Technical Reports Server (NTRS)

    Ward, Steven N.

    1992-01-01

    It is shown how seismicity calculations based on the concept of fault segmentation, which incorporate the physics of faulting through static dislocation theory, can improve earthquake recurrence statistics and hone the probabilities of hazard. For the Middle America Trench, the spread parameters of the best-fitting lognormal or Weibull distributions (about 0.75) are much larger than the 0.21 intrinsic spread proposed in the Nishenko-Buland (1987) hypothesis. Stress interaction between fault segments disrupts time or slip predictability and causes earthquake recurrence to be far more aperiodic than has been suggested.
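
Fitting lognormal and Weibull distributions to recurrence intervals and reading off their spread/shape parameters, as done in the comparison above, can be sketched with SciPy (fixing the location parameter at zero is an assumption made here for simplicity):

```python
import numpy as np
from scipy.stats import lognorm, weibull_min

def recurrence_spread(intervals):
    """Fit lognormal and Weibull distributions to recurrence intervals.

    Returns (s, c): the lognormal shape s (the standard deviation of
    log intervals, i.e. the 'spread' parameter) and the Weibull shape c.
    """
    s, _, _ = lognorm.fit(intervals, floc=0)       # maximum-likelihood fit
    c, _, _ = weibull_min.fit(intervals, floc=0)
    return s, c
```

On synthetic intervals drawn with a lognormal spread of 0.75, the fit recovers that value, illustrating the kind of estimate being compared against the 0.21 intrinsic spread of the Nishenko-Buland hypothesis.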

  5. Quantification of depositional changes and paleo-seismic activities from laminated sediments using outcrop data

    NASA Astrophysics Data System (ADS)

    Weidlich, O.; Bernecker, M.

    2004-04-01

    Measuring laminations in marine and limnic sediments is commonly a time-consuming procedure. However, the resulting quantitative proxies are important for the interpretation of both climate change and paleo-seismic activity. Digital image analysis accelerates the generation and interpretation of large data sets from laminated sediments based on the contrasting grey values of dark and light laminae. Statistical transformation and correlation of the grey-value signals reflect high-frequency cycles due to changing mean laminae thickness, and thus provide data for monitoring climate change. Perturbations (e.g., slumping structures, seismites, and tsunamites) of the commonly continuous laminae record seismic activity and provide proxies for paleo-earthquake frequency. Using outcrop data from (i) the Pleistocene Lisan Formation of Jordan (Dead Sea Basin) and (ii) the Carboniferous-Permian Copacabana Formation of Bolivia (Lake Titicaca), we present a two-step approach to obtain high-resolution time series from both unconsolidated and lithified outcrops. Step 1 concerns the construction of a continuous digital phototransect, and step 2 covers the creation of a grey-density curve along a line transect using image analysis of the digital photos. The automated image analysis technique provides a continuous digital record of the studied sections and therefore serves as a useful tool for the evaluation of further proxy data. Analysing the grey signal of the light and dark laminae of varves using phototransects, we discuss the potential and limitations of the proposed technique.
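
The grey-value step of such a workflow can be sketched for a digitized photo: extract a grey-level profile along a line transect, smooth it, and count laminae as peaks. The transect width, smoothing window and prominence threshold below are illustrative guesses, not the authors' settings:

```python
import numpy as np
from scipy.signal import find_peaks

def laminae_from_transect(image, col=None):
    """Grey-value curve along a vertical line transect of a lamination
    photo (2-D array of grey levels), with light laminae counted as
    peaks in the smoothed profile.

    Returns (smoothed profile, indices of detected laminae peaks).
    """
    col = image.shape[1] // 2 if col is None else col
    # average a 5-pixel-wide strip to suppress pixel noise
    profile = image[:, max(col - 2, 0):col + 3].mean(axis=1)
    smooth = np.convolve(profile, np.ones(5) / 5.0, mode="same")
    peaks, _ = find_peaks(smooth, prominence=0.1 * np.ptp(smooth))
    return smooth, peaks
```

Counting peaks gives laminae (and hence time-series) resolution, while gaps or irregularities in the otherwise periodic peak sequence flag the perturbations interpreted as event layers.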

  6. Statistical characterization of Earth’s heterogeneities from seismic scattering

    NASA Astrophysics Data System (ADS)

    Zheng, Y.; Wu, R.

    2009-12-01

    The distortion of a teleseismic wavefront carries information about the heterogeneities through which the wave propagates, and it is manifested as logarithmic amplitude (logA) and phase fluctuations of the direct P wave recorded by a seismic network. By cross-correlating the fluctuations (e.g., logA-logA or phase-phase), we obtain coherence functions, which depend on the spatial lags between stations and the incidence angles of the incoming waves. We have mathematically related the depth-dependent heterogeneity spectrum to the observable coherence functions using seismic scattering theory. We will show that our method has sharp depth resolution. Using Hi-net seismic network data in Japan, we have inverted power spectra for two depth ranges, ~0-120 km and below ~120 km. The coherence functions formed by different groups of stations or by different groups of earthquakes at different back azimuths are similar, which demonstrates that the method is statistically stable and the inhomogeneities are statistically stationary. In both depth intervals, the spectral amplitude decays from large scales to small scales in a power-law fashion, with exceptions at ~50 km for the logA data. Owing to the spatial spacing of the seismometers, only information from length scales of 15 km to 200 km is inverted. However, our scattering method provides new information on small to intermediate scales, comparable to the scales of recycled materials, and is thus complementary to global seismic tomography, which reveals mainly large-scale heterogeneities on the order of ~1000 km. The small-scale heterogeneities revealed here are not likely of purely thermal origin. Therefore, the length scale and strength of heterogeneities as a function of depth may provide important constraints on the mechanical mixing of various components in mantle convection.

  7. Fast principal component analysis for stacking seismic data

    NASA Astrophysics Data System (ADS)

    Wu, Juan; Bai, Min

    2018-04-01

    Stacking seismic data plays an indispensable role in many steps of the seismic data processing and imaging workflow. Optimal stacking of seismic data can greatly mitigate seismic noise and enhance the principal components. Traditional average-based stacking methods cannot achieve optimal performance when the ambient noise is extremely strong. We propose a principal component analysis (PCA) algorithm for stacking seismic data that is insensitive to the noise level. Considering the computational bottleneck of the classic PCA algorithm in processing massive seismic data, we propose an efficient PCA algorithm that makes the method readily applicable in industrial applications. Two numerically designed examples and one real seismic dataset are used to demonstrate the performance of the presented method.
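
The core idea, stacking along the first principal component rather than averaging, can be sketched with a plain SVD. This is a generic sketch, not the authors' algorithm; on massive data a randomized SVD would stand in for the full decomposition (the "fast" part of their contribution):

```python
import numpy as np

def pca_stack(gather):
    """PCA stack of an (n_traces, n_samples) gather.

    The first right singular vector carries the waveform shared
    coherently across traces; scaling it by the mean per-trace loading
    keeps the amplitude and sign comparable to a conventional mean stack
    (any sign flip of the singular vectors cancels in the product).
    """
    u, s, vt = np.linalg.svd(gather, full_matrices=False)
    weights = u[:, 0] * s[0]        # per-trace loading on component 1
    return weights.mean() * vt[0]
```

Under strong noise the PCA stack downweights traces that load weakly on the coherent component, which is where it gains over the plain average.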

  8. Earthquake forecasting during the complex Amatrice-Norcia seismic sequence

    PubMed Central

    Marzocchi, Warner; Taroni, Matteo; Falcone, Giuseppe

    2017-01-01

    Earthquake forecasting is the ultimate challenge for seismologists, because it condenses the scientific knowledge about the earthquake occurrence process, and it is an essential component of any sound risk mitigation planning. It is commonly assumed that, in the short term, trustworthy earthquake forecasts are possible only for typical aftershock sequences, where the largest shock is followed by many smaller earthquakes that decay with time according to the Omori power law. We show that the current Italian operational earthquake forecasting system issued statistically reliable and skillful space-time-magnitude forecasts of the largest earthquakes during the complex 2016–2017 Amatrice-Norcia sequence, which is characterized by several bursts of seismicity and a significant deviation from the Omori law. This capability to deliver statistically reliable forecasts is an essential component of any program to assist public decision-makers and citizens in the challenging risk management of complex seismic sequences. PMID:28924610

  9. Earthquake forecasting during the complex Amatrice-Norcia seismic sequence.

    PubMed

    Marzocchi, Warner; Taroni, Matteo; Falcone, Giuseppe

    2017-09-01

    Earthquake forecasting is the ultimate challenge for seismologists, because it condenses the scientific knowledge about the earthquake occurrence process, and it is an essential component of any sound risk mitigation planning. It is commonly assumed that, in the short term, trustworthy earthquake forecasts are possible only for typical aftershock sequences, where the largest shock is followed by many smaller earthquakes that decay with time according to the Omori power law. We show that the current Italian operational earthquake forecasting system issued statistically reliable and skillful space-time-magnitude forecasts of the largest earthquakes during the complex 2016-2017 Amatrice-Norcia sequence, which is characterized by several bursts of seismicity and a significant deviation from the Omori law. This capability to deliver statistically reliable forecasts is an essential component of any program to assist public decision-makers and citizens in the challenging risk management of complex seismic sequences.

  10. Anomalous decrease in relatively large shocks and increase in the p and b values preceding the April 16, 2016, M7.3 earthquake in Kumamoto, Japan

    NASA Astrophysics Data System (ADS)

    Nanjo, K. Z.; Yoshida, A.

    2017-01-01

    The 2016 Kumamoto earthquakes in Kyushu, Japan, started with a magnitude (M) 6.5 quake on April 14 on the Hinagu fault zone (FZ), followed by active seismicity including an M6.4 quake. Eventually, an M7.3 quake occurred on April 16 on the Futagawa FZ. We investigated whether any sign indicative of the M7.3 quake could be found in the space-time changes in seismicity after the M6.5 quake. As a quality control, we determined in advance the threshold magnitude above which all earthquakes are completely recorded. We then showed that the occurrence rate of relatively large (M ≥ 3) earthquakes significantly decreased 1 day before the M7.3 quake. The significance of this decrease was evaluated by one standard deviation of sampled changes in the rate of occurrence. We next confirmed that seismicity with M ≥ 3 was well modeled by the Omori-Utsu law with p = 1.5 ± 0.3, which indicates that the temporal decay of seismicity was significantly faster than a typical decay with p = 1. A larger p value was obtained when we used data from a longer time period in the analysis. This significance was confirmed by a bootstrapping approach. Our detailed analysis shows that the large p value was caused by the rapid decay of the seismicity in the northern area around the Futagawa FZ. Application of the slope (the b value) of the Gutenberg-Richter frequency-magnitude distribution to the spatiotemporal change in the seismicity revealed that the b value in the northern area increased significantly, the increase being Δb = 0.3-0.5. Significance was verified by a statistical test of Δb and a test using bootstrapping errors. Based on our findings, combined with the results obtained by a stress inversion analysis performed by the National Research Institute for Earth Science and Disaster Resilience, we suggested that stress near the Futagawa FZ had been reduced just prior to the occurrence of the M7.3 quake. We proposed, together with some other observations, that this reduction in stress might have been induced by growth of slow slips on the Futagawa FZ.
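    The b-value estimates discussed above are conventionally obtained with Aki's maximum-likelihood formula applied to magnitudes above the completeness threshold. A minimal sketch (not the authors' code; the synthetic catalog and parameter values are illustrative):

```python
import numpy as np

def b_value(mags, mc, dm=0.1):
    """Aki (1965) maximum-likelihood b-value for magnitudes >= mc.

    dm is the catalog's magnitude bin width; the (mc - dm/2) term is
    Utsu's correction for binned magnitudes."""
    m = np.asarray(mags)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

# Synthetic Gutenberg-Richter catalog: above the completeness magnitude
# Mc = 3.0, magnitudes are exponentially distributed with true b = 1.0.
rng = np.random.default_rng(0)
mags = 3.0 + rng.exponential(scale=np.log10(np.e), size=5000)
b = b_value(mags, mc=3.0, dm=0.0)   # should recover a value near 1.0
```

Comparing b values from two subsets of a catalog (as in the Δb test above) then reduces to computing this estimate for each subset and assessing the difference against its sampling uncertainty, for example by bootstrap resampling.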

  11. Analysis and Simulation of Far-Field Seismic Data from the Source Physics Experiment

    DTIC Science & Technology

    2012-09-01

    ANALYSIS AND SIMULATION OF FAR-FIELD SEISMIC DATA FROM THE SOURCE PHYSICS EXPERIMENT Arben Pitarka, Robert J. Mellors, Arthur J. Rodgers, Sean...Security Site (NNSS) provides new data for investigating the excitation and propagation of seismic waves generated by buried explosions. A particular... seismic model. The 3D seismic model includes surface topography. It is based on regional geological data, with material properties constrained by shallow

  12. Short-term earthquake forecasting based on an epidemic clustering model

    NASA Astrophysics Data System (ADS)

    Console, Rodolfo; Murru, Maura; Falcone, Giuseppe

    2016-04-01

    The application of rigorous statistical tools, with the aim of verifying any prediction method, requires a univocal definition of the hypothesis, or the model, characterizing the concerned anomaly or precursor, so that it can be objectively recognized in any circumstance and by any observer. This is mandatory to move beyond the old-fashioned approach consisting only of the retrospective, anecdotal study of past cases. A rigorous definition of an earthquake forecasting hypothesis should lead to the objective identification of particular sub-volumes (usually named alarm volumes) of the total time-space volume within which the probability of occurrence of strong earthquakes is higher than usual. Testing such a hypothesis requires the observation of a sufficient number of past cases upon which a statistical analysis is possible. This analysis should aim to determine the rate at which the precursor has been followed (success rate) or not followed (false alarm rate) by the target seismic event, or the rate at which a target event has been preceded (alarm rate) or not preceded (failure rate) by the precursor. The binary table obtained from this kind of analysis leads to the definition of the model parameters that achieve the maximum number of successes and the minimum number of false alarms for a specific class of precursors. The mathematical tools suitable for this purpose may include the Probability Gain or the R-Score, as well as popular plots such as the Molchan error diagram and the ROC diagram. Another tool for evaluating the validity of a forecasting method is the likelihood ratio (also named performance factor) of occurrence and non-occurrence of seismic events under different hypotheses. Whatever method is chosen for building up a new hypothesis, usually based on retrospective data, the final assessment of its validity should be carried out by a test on a new and independent set of observations. The implementation of this step could be problematic for seismicity characterized by long-term recurrence. However, the separation of the database collected in the past into two sections (one on which the best fit of the parameters is carried out, and the other on which the hypothesis is tested) can be a viable solution, known as retrospective-forward testing. In this study we show examples of the application of the above-mentioned concepts to the analysis of the Italian catalog of instrumental seismicity, making use of an epidemic algorithm developed to model short-term clustering features. This model, for which the precursory anomaly is simply the occurrence of seismic activity, does not need the retrospective categorization of earthquakes into foreshocks, mainshocks, and aftershocks. It was introduced more than 15 years ago and has been tested so far in a number of real cases. It is now being run by several seismological centers around the world in forward real-time mode for testing purposes.
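    The binary alarm table described above maps directly onto a 2x2 contingency computation. A minimal sketch, assuming one common definition of the R-score (hit rate minus false-alarm rate) and of probability gain; the counts here are invented for illustration:

```python
def alarm_scores(a, b, c, d):
    """Scores from a 2x2 alarm table:
    a: alarms followed by a target event (successes)
    b: alarms not followed by an event (false alarms)
    c: target events not preceded by an alarm (failures)
    d: quiet space-time cells correctly left un-alarmed."""
    hit_rate = a / (a + c)            # fraction of events alarmed
    false_alarm_rate = b / (b + d)    # fraction of quiet cells alarmed
    r_score = hit_rate - false_alarm_rate
    # Probability gain: event rate inside alarms vs the unconditional rate
    p_in_alarm = a / (a + b)
    p_background = (a + c) / (a + b + c + d)
    return r_score, p_in_alarm / p_background

# Invented counts: 20 alarms, 10 target events, 100 space-time cells.
r_score, gain = alarm_scores(a=8, b=12, c=2, d=78)
```

Sweeping the model parameters and recomputing these scores traces out the Molchan or ROC curves mentioned in the abstract.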

  13. Relationship between seismic status of Earth and relative position of bodies in sun-earth-moon system

    NASA Astrophysics Data System (ADS)

    Kulanin, N. V.

    1985-03-01

    The time spectrum of variations in seismicity is quite broad: there are seismic seasons as well as multiannual variations. The range of characteristic times of variation from days to about one year is studied here. Seismic activity is examined as a function of the position of the Moon relative to the Earth and the direction toward the Sun. The moments of strong earthquakes, over 5.8 on the Richter scale, between 1968 and June 1980 are plotted in time coordinates relating them to the relative positions of the three bodies in the Sun-Earth-Moon system. Methods of mathematical statistics applied to the points produced indicate with at least 99% probability that the distribution is not random. A periodicity of the Earth's seismic state of 413 days is observed.
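    The abstract does not name the statistical test applied to the plotted points; a Schuster-style phase test is one standard way to assess whether event times are non-randomly distributed with respect to a candidate period (here, 413 days). A hedged sketch with synthetic catalogs:

```python
import numpy as np

def schuster_p(times, period):
    """Schuster test p-value: probability that the phase distribution of
    event times modulo the period is consistent with uniform randomness."""
    phases = 2.0 * np.pi * (np.asarray(times) % period) / period
    r2 = np.cos(phases).sum() ** 2 + np.sin(phases).sum() ** 2
    return np.exp(-r2 / len(phases))

rng = np.random.default_rng(1)
# 200 events concentrated around one phase of a 413-day cycle,
# versus 200 events at uniformly random times.
clustered = 413.0 * (rng.integers(0, 10, size=200)
                     + rng.normal(0.5, 0.05, size=200))
uniform = rng.uniform(0.0, 4130.0, size=200)
p_clustered = schuster_p(clustered, 413.0)   # tiny: periodicity detected
p_uniform = schuster_p(uniform, 413.0)       # typically of order 0.1-1
```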

  14. Dynamic of the volcanic activity of La Soufrière volcano (Guadeloupe, Lesser Antilles): Evidence for shallow fluid seismic sources

    NASA Astrophysics Data System (ADS)

    Ucciani, G.; Beauducel, F.; Bouin, M. P.; Nercessian, A.

    2015-12-01

    La Soufrière is one of the many hazardous volcanoes in the inner arc of the Lesser Antilles. Located in the south of Basse-Terre island, it is the only active volcano of the Guadeloupe archipelago. Since the last significant magmatic eruption in 1535 AD, the activity has been exclusively phreatic. Since 1992 and the abrupt renewal of seismic and fumarolic activity, the Guadeloupe Volcanological and Seismological Observatory (OVSG-IPGP) has recorded a progressive increase in seismicity and degassing that led scientists and authorities to set the alert level ``Vigilance'' and hold it until today. According to recent geophysical, geochemical, and geological studies, the current volcanic activity of La Soufrière seems to be exclusively associated with the hydrothermal system, while the link with seismic activity is still poorly studied. In this context of possible pre-eruptive unrest, we investigated the spatial and temporal variations of the seismicity recorded between 1981 and 2013. Using a consistent seismological framework coupling spectral, statistical, signal-processing, clustering, and inverse-problem methods, we demonstrate that this seismicity is largely generated by shallow hydrothermal fluid sources located in a complex plumbing system. Spatial variations of the Vp/Vs ratio and b-value in seismogenic structures allow us to document three main seismic zones associated with: (1) the migration of magmatic gas, (2) the storage and mixing of underground water and gas, and (3) the shallow migration of hydrothermal fluids in a highly fractured and heterogeneous system. Waveform analysis revealed a low number of significant families consistent with a fracturing process, and the temporal evolution of multiplet activities highlighted several variations associated with surface manifestations and abrupt dynamic changes after the major local tectonic earthquakes of Les Saintes (21 November 2004, Mw = 6.3), its main aftershock (14 February 2005, Mw = 5.7), and the last major earthquake of Martinique (29 November 2007, Mw = 7.4).

  15. Combining active and passive seismic methods for the characterization of urban sites in Cairo, Egypt

    NASA Astrophysics Data System (ADS)

    Adly, Ashraf; Poggi, Valerio; Fäh, Donat; Hassoup, Awad; Omran, Awad

    2017-07-01

    The geology at Kottamiya, Rehab City, and Zahraa-Madinat-Nasr to the east of Cairo (Egypt) is composed of low-velocity sediments on top of a rigid rock basement. These sediments include the loose sands of the Gebel Ahmar formation and the marl and shales of the Maadi formation, in addition to sparse Quaternary soil covers. Due to the contrast in seismic impedance with the underlying bedrock, these soft sediments have the potential to considerably amplify ground motion during an earthquake. For the evaluation of site-specific seismic hazard, we computed the seismic site response in these areas by developing 1-D velocity models and derived average seismic velocities, including Vs30. To do so, we applied different active- and passive-source techniques, including the horizontal-to-vertical Fourier spectral ratio of ambient vibration recordings and multichannel analysis of artificially generated surface waves. A set of models representing the velocity structure of each site is then obtained by combined inversion of Rayleigh wave dispersion curves and ellipticity functions. While dispersion curves are used to constrain the uppermost low-velocity part of the soil profile, ellipticity helps in resolving the structure at the depth of the sediment-bedrock interface. From the retrieved velocity models, numerical ground-motion amplification is finally derived using the 1-D SH-wave transfer function. We account for uncertainty in amplification by using a statistical model that accounts for the misfit of all the inverted velocity profiles. The study reveals that the different sites experience important frequency-dependent amplification, with the largest amplification occurring at the resonance frequencies of the sites. Amplification up to a factor of 5 is found, with some variability depending on the soil type (Vs30 ranges between 340 and 415 m/s). Moreover, amplification is expected in the frequency range that is important for buildings (0.8-10 Hz), which further confirms the need for a microzonation analysis of the area. The obtained results will be used for the development of a new seismic hazard model.
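    The 1-D SH-wave transfer function used in the final step has a simple closed form for a single soil layer over elastic bedrock (vertical incidence, no damping). A sketch with illustrative layer parameters, not the study's actual models:

```python
import numpy as np

def sh_amplification(freqs, h, vs_soil, rho_soil, vs_rock, rho_rock):
    """|Transfer function| of one soil layer of thickness h over elastic
    bedrock for vertically incident SH waves, no damping. The peak sits
    near f0 = vs_soil / (4 h), with amplitude equal to the impedance
    contrast (rho_rock * vs_rock) / (rho_soil * vs_soil)."""
    kh = 2.0 * np.pi * freqs * h / vs_soil
    imp = (rho_soil * vs_soil) / (rho_rock * vs_rock)
    return 1.0 / np.abs(np.cos(kh) + 1j * imp * np.sin(kh))

# Illustrative parameters: 30 m of 400 m/s soil over 2000 m/s bedrock.
f = np.linspace(0.1, 8.0, 800)
amp = sh_amplification(f, h=30.0, vs_soil=400.0, rho_soil=1800.0,
                       vs_rock=2000.0, rho_rock=2400.0)
f0 = f[np.argmax(amp)]   # fundamental resonance, ~ 400 / (4 * 30) Hz
```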

  16. Seismic clusters analysis in Northeastern Italy by the nearest-neighbor approach

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Gentili, Stefania

    2018-01-01

    The main features of earthquake clusters in Northeastern Italy are explored, with the aim of getting new insights into local-scale patterns of seismicity in the area. The study is based on a systematic analysis of robustly and uniformly detected seismic clusters, which are identified by a statistical method based on nearest-neighbor distances of events in the space-time-energy domain. The method permits us to highlight and investigate the internal structure of earthquake sequences, and to differentiate the spatial properties of seismicity according to the different topological features of the cluster structure. To analyze the seismicity of Northeastern Italy, we use information from local OGS bulletins, compiled at the National Institute of Oceanography and Experimental Geophysics since 1977. A preliminary reappraisal of the earthquake bulletins is carried out and the area of sufficient completeness is outlined. Various techniques are considered to estimate the scaling parameters that characterize earthquake occurrence in the region, namely the b-value and the fractal dimension of the epicenter distribution, required for the application of the nearest-neighbor technique. Specifically, average robust estimates of the parameters of the Unified Scaling Law for Earthquakes (USLE) are assessed for the whole outlined region and are used to compute the nearest-neighbor distances. Cluster identification by the nearest-neighbor method turns out to be quite reliable and robust with respect to the minimum magnitude cutoff of the input catalog; the identified clusters are well consistent with those obtained from manual aftershock identification of selected sequences. We demonstrate that the earthquake clusters have distinct preferred geographic locations, and we identify two areas that differ substantially in the examined clustering properties. Specifically, burst-like sequences are associated with the north-western part and swarm-like sequences with the south-eastern part of the study region. The territorial heterogeneity of earthquake clustering is in good agreement with the spatial variability of the scaling parameters identified by the USLE. In particular, the fractal dimension is higher to the west (about 1.2-1.4), suggesting a spatially more distributed seismicity, compared to the eastern part of the investigated territory, where the fractal dimension is very low (about 0.8-1.0).
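    The nearest-neighbor distance underlying the cluster identification is, in the usual Zaliapin-style formulation, the product of the inter-event time, the epicentral distance raised to the fractal dimension, and a magnitude weight. A hedged sketch; the parameter and event values below are illustrative, not the paper's estimates:

```python
def nn_distance(dt_years, r_km, parent_mag, b=1.0, d_f=1.3):
    """Zaliapin-style nearest-neighbor proximity in the space-time-energy
    domain: eta = dt * r^d_f * 10^(-b * m_parent). Small eta marks a
    clustered (triggered) pair; large eta marks background seismicity.
    b and d_f play the role of the regional scaling parameters (USLE)
    discussed in the abstract."""
    return dt_years * (r_km ** d_f) * 10.0 ** (-b * parent_mag)

# An event 0.01 yr and 2 km from an M5 parent, versus one 1 yr and
# 100 km from an M3 event (values invented for illustration):
eta_aftershock = nn_distance(0.01, 2.0, 5.0)
eta_background = nn_distance(1.0, 100.0, 3.0)
```

Thresholding the bimodal distribution of eta over all event pairs is what separates clustered from background events.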

  17. Subspace Dimensionality: A Tool for Automated QC in Seismic Array Processing

    NASA Astrophysics Data System (ADS)

    Rowe, C. A.; Stead, R. J.; Begnaud, M. L.

    2013-12-01

    Because of the great resolving power of seismic arrays, the application of automated processing to array data is critically important in treaty verification work. A significant problem in array analysis is the inclusion of bad sensor channels in the beamforming process. We are testing an approach to automated, on-the-fly quality control (QC) to aid in the identification of poorly performing sensor channels prior to beamforming in routine event detection or location processing. The idea stems from methods used for large computer servers, where monitoring traffic at enormous numbers of nodes is impractical on a node-by-node basis, so the dimensionality of the node traffic is instead monitored for anomalies that could represent malware, cyber-attacks, or other problems. The technique relies upon the use of subspace dimensionality or principal components of the overall system traffic. The subspace technique is not new to seismology, but its most common application has been limited to comparing waveforms to an a priori collection of templates for detecting highly similar events in a swarm or seismic cluster. In the established template application, a detector functions in a manner analogous to waveform cross-correlation, applying a statistical test to assess the similarity of the incoming data stream to known templates for events of interest. In our approach, we seek not to detect matching signals; instead, we examine the signal subspace dimensionality in much the same way that the method addresses node traffic anomalies in large computer systems. Signal anomalies recorded on seismic arrays affect the dimensional structure of the array-wide time series. We have shown previously that this observation is useful in identifying real seismic events, either by looking at the raw signal or derivatives thereof (entropy, kurtosis). Here we explore the effects of malfunctioning channels on the dimension of the data and its derivatives, and how to leverage this effect for identifying bad array elements through a jackknifing process that isolates the anomalous channels, so that an automated analysis system might discard them prior to FK analysis and beamforming on events of interest.
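    The subspace-dimensionality idea can be sketched as follows: a malfunctioning channel adds an incoherent component to the array-wide time series, raising the number of principal components needed to capture most of the variance. This is an illustrative reconstruction with synthetic data, not the authors' implementation:

```python
import numpy as np

def effective_dim(data, var_frac=0.95):
    """Number of principal components needed to explain var_frac of the
    variance of multichannel data (rows = channels, cols = samples)."""
    x = data - data.mean(axis=1, keepdims=True)
    s = np.linalg.svd(x, compute_uv=False)
    explained = np.cumsum(s ** 2) / (s ** 2).sum()
    return int(np.searchsorted(explained, var_frac) + 1)

rng = np.random.default_rng(0)
wavefield = rng.normal(size=2000)  # coherent signal seen by every sensor
good = np.vstack([wavefield + 0.05 * rng.normal(size=2000)
                  for _ in range(9)])
# One malfunctioning channel of pure high-amplitude noise raises the
# dimensionality of the array-wide time series.
bad = np.vstack([good, 5.0 * rng.normal(size=2000)])
```

Jackknifing, as described above, would recompute the dimension with each channel left out in turn; the channel whose removal restores the low dimension is the bad element.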

  18. A model of seismic coda arrivals to suppress spurious events.

    NASA Astrophysics Data System (ADS)

    Arora, N.; Russell, S.

    2012-04-01

    We describe a model of coda arrivals which has been added to NET-VISA (Network processing Vertically Integrated Seismic Analysis), our probabilistic generative model of seismic events, their transmission, and their detection on a global seismic network. The scattered energy that follows a seismic phase arrival tends to deceive typical STA/LTA-based arrival-picking software into believing that a real seismic phase has been detected. These coda arrivals, which tend to follow all seismic phases, cause most network processing software, including NET-VISA, to believe that multiple events have taken place. It is not a simple matter of ignoring closely spaced arrivals, since arrivals from multiple events can indeed overlap. The current practice in NET-VISA of pruning events within a small space-time neighborhood of a larger event works reasonably well, but it may mask real events produced in an aftershock sequence. Our new model allows any seismic arrival, even a coda arrival, to trigger a subsequent coda arrival. The probability of such a triggered arrival depends on the amplitude of the triggering arrival, although real seismic phases are more likely to generate such coda arrivals. Real seismic phases also tend to generate coda arrivals with more strongly correlated parameters, for example azimuth and slowness. However, the SNR (signal-to-noise ratio) of a coda arrival immediately following a phase arrival tends to be lower because of the nature of the SNR calculation. We have calibrated our model on historical statistics of such triggered arrivals, and our inference accounts for them while searching for the best explanation of the seismic events, their association to the arrivals, and the coda arrivals. We have tested our new model on one week of global seismic data spanning March 22, 2009 to March 29, 2009. Our model was trained on two and a half months of data from April 5, 2009 to June 20, 2009. We use the LEB bulletin produced by the IDC (International Data Center) as the ground truth and computed the precision (percentage of reported events which are true) and recall (percentage of true events which are reported). The existing model has a precision of 32.2 and a recall of 88.6, which changes to a precision of 50.7 and a recall of 88.5 after pruning. The new model has a precision of 56.8 and a recall of 86.9 without any pruning, and the corresponding precision-recall curve is dramatically improved. In contrast, the current automated bulletin at the IDC, SEL3, has a precision of 46.2 and a recall of 69.7.
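    Precision and recall against a ground-truth bulletin reduce to matching events between two lists. A minimal sketch that matches on origin time only (a real comparison, like the one against LEB above, would also use location and magnitude); all times are invented:

```python
def precision_recall(auto_times, truth_times, tol=10.0):
    """Greedy one-to-one match of automatic origin times to ground-truth
    origin times within tol seconds; returns (precision, recall) in %."""
    unmatched = sorted(truth_times)
    hits = 0
    for t in sorted(auto_times):
        for i, g in enumerate(unmatched):
            if abs(t - g) <= tol:
                hits += 1
                del unmatched[i]
                break
    return (100.0 * hits / len(auto_times),
            100.0 * hits / len(truth_times))

# Four automatic events against three true events (times in seconds):
precision, recall = precision_recall([0.0, 100.0, 205.0, 500.0],
                                     [0.0, 101.0, 200.0])
```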

  19. 78 FR 13911 - Proposed Revision to Design of Structures, Components, Equipment and Systems

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-01

    ... Analysis Reports for Nuclear Power Plants: LWR Edition,'' Section 3.7.1, ``Seismic Design Parameters,'' Section 3.7.2, ``Seismic System Analysis,'' Section 3.7.3, ``Seismic Subsystem Analysis,'' Section 3.8.1... and analysis issues, (2) updates to review interfaces to improve the efficiency and consistency of...

  20. Long Term RST Analyses of TIR Satellite Radiances in Different Geotectonic Contexts: Results and Implications for a Time-Dependent Assessment of Seismic Hazard (t-DASH)

    NASA Astrophysics Data System (ADS)

    Tramutoli, V.; Armandi, B.; Coviello, I.; Eleftheriou, A.; Filizzola, C.; Genzano, N.; Lacava, T.; Lisi, M.; Paciello, R.; Pergola, N.; Satriano, V.; Vallianatos, F.

    2014-12-01

    A large body of scientific documentation is available to date on the appearance of anomalous space-time patterns of geophysical parameters, measured from days to weeks before earthquake occurrence. Nevertheless, up to now no single measurable parameter or observational methodology has proven sufficiently reliable and effective for the implementation of an operational earthquake prediction system. In this context, the PRE-EARTHQUAKES EU-FP7 project (www.pre-earthquakes.org) investigated to what extent the combined use of different observations/parameters, together with the refinement of data analysis methods, can reduce false alarm rates and improve the reliability and precision (in the space-time domain) of predictions. Among the different parameters/methodologies proposed to provide useful information for earthquake prediction, a statistical approach named RST (Robust Satellite Technique) has been used since 2001 to identify the space-time fluctuations of Earth's emitted Thermal Infrared (TIR) radiation observed from satellite in seismically active regions. In this paper, RST-based long-term analyses of TIR satellite records collected by MSG/SEVIRI over European (Italy and Greece) and by GOES/IMAGER over American (California) regions will be presented. The enhanced potential of the approach, when applied in the framework of a time-Dependent Assessment of Seismic Hazard (t-DASH) system continuously integrating independent observations, will moreover be discussed.

  1. The use of multiwavelets for uncertainty estimation in seismic surface wave dispersion.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poppeliers, Christian

    This report describes a new single-station analysis method to estimate the dispersion and uncertainty of seismic surface waves using the multiwavelet transform. Typically, when estimating the dispersion of a surface wave using only a single seismic station, the seismogram is decomposed into a series of narrow-band realizations using a bank of narrow-band filters. By then enveloping and normalizing the filtered seismograms and identifying the maximum power as a function of frequency, the group velocity can be estimated if the source-receiver distance is known. However, using the filter-bank method, there is no robust way to estimate uncertainty. In this report, I introduce a new method of estimating the group velocity that includes an estimate of uncertainty. The method is similar to the conventional filter-bank method, but uses a class of functions, called Slepian wavelets, to compute a series of wavelet transforms of the data. Each wavelet transform is mathematically similar to a filter bank; however, the time-frequency tradeoff is optimized. By taking multiple wavelet transforms, I form a population of dispersion estimates from which standard statistical methods can be used to estimate uncertainty. I demonstrate the utility of this new method by applying it to synthetic data as well as ambient-noise surface-wave cross-correlograms recorded by the University of Nevada Seismic Network.
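    The conventional filter-bank baseline that the report improves upon can be sketched with Gaussian narrow-band filters; the multiwavelet method replaces these with Slepian wavelets to obtain a population of estimates. This numpy-only sketch (an illustration, not the report's code) uses a synthetic wave packet with a known 3 km/s group velocity:

```python
import numpy as np

def group_velocity(seis, dt, dist_km, freqs, rel_width=0.1):
    """Filter-bank group velocity: Gaussian narrow-band filter around each
    center frequency, envelope via the analytic signal, then
    U(f) = distance / envelope-peak time."""
    n = len(seis)
    spec = np.fft.rfft(seis)
    f = np.fft.rfftfreq(n, dt)
    vels = []
    for fc in freqs:
        band = spec * np.exp(-0.5 * ((f - fc) / (rel_width * fc)) ** 2)
        xa = np.zeros(n, dtype=complex)   # one-sided spectrum -> analytic
        xa[:len(band)] = 2.0 * band
        xa[0] = band[0]
        if n % 2 == 0:
            xa[len(band) - 1] = band[-1]  # Nyquist bin is not doubled
        env = np.abs(np.fft.ifft(xa))
        vels.append(dist_km / (np.argmax(env) * dt))
    return np.array(vels)

# Synthetic arrival: a 1 Hz wave packet from 100 km away traveling at
# a group velocity of 3 km/s (envelope peak at t0 = 33.3 s).
dt, n, dist = 0.05, 2048, 100.0
t = np.arange(n) * dt
t0 = dist / 3.0
seis = np.exp(-0.5 * ((t - t0) / 2.0) ** 2) * np.cos(2 * np.pi * (t - t0))
u = group_velocity(seis, dt, dist, freqs=[1.0])
```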

  2. On the Diurnal Periodicity of Representative Earthquakes in Greece: Comparison of Data from Different Observation Systems

    NASA Astrophysics Data System (ADS)

    Desherevskii, A. V.; Sidorin, A. Ya.

    2017-12-01

    Due to the initiation of the Hellenic Unified Seismic Network (HUSN) in late 2007, the quality of observation had significantly improved by 2011. For example, the representative magnitude level has decreased considerably and the number of annually recorded events has increased. The new observational system greatly expanded the possibilities for studying regularities in seismicity. In view of this, the authors revisited their studies of the diurnal periodicity of representative earthquakes in Greece that was revealed earlier in the earthquake catalog before 2011. We use 18 samples of earthquakes of different magnitudes taken from the catalog of Greek earthquakes from 2011 to June 2016, derive a series of the number of earthquakes for each of them, and calculate its average diurnal course. To increase the reliability of the results, we compared the data for two regions. With a high degree of statistical significance, we have found that no diurnal periodicity can be detected for representative earthquakes. This finding differs from the estimates obtained earlier from an analysis of the catalog of earthquakes in the same area for 1995-2004 and 2005-2010, i.e., before the initiation of the Hellenic Unified Seismic Network. The new results are consistent with the hypothesis of noise discrimination (observational selection), which explains the diurnal variation of earthquakes by the different sensitivity of the seismic network in daytime and nighttime periods.

  3. Source-Type Identification Analysis Using Regional Seismic Moment Tensors

    NASA Astrophysics Data System (ADS)

    Chiang, A.; Dreger, D. S.; Ford, S. R.; Walter, W. R.

    2012-12-01

    Waveform inversion to determine the seismic moment tensor is a standard approach to determining the source mechanism of natural and manmade seismicity, and may be used to identify, or discriminate, different types of seismic sources. The successful applications of the regional moment tensor method at the Nevada Test Site (NTS) and to the 2006 and 2009 North Korean nuclear tests (Ford et al., 2009a, 2009b, 2010) show that the method is robust and capable of source-type discrimination at regional distances. The well-separated populations of explosions, earthquakes, and collapses on a Hudson et al. (1989) source-type diagram enable source-type discrimination; however, the question remains whether the separation of events holds in other regions, where we have limited station coverage and knowledge of Earth structure. Ford et al. (2012) have shown that combining regional waveform data and P-wave first motions removes the CLVD-isotropic tradeoff and uniquely discriminates the 2009 North Korean test as an explosion. Therefore, including additional constraints from regional and teleseismic P-wave first motions enables source-type discrimination in regions with limited station coverage. We present moment tensor analysis of earthquakes and explosions (M6) from the Lop Nor and Semipalatinsk test sites for station paths crossing Kazakhstan and Western China. We also present analyses of smaller events from industrial sites. In these sparse-coverage situations we combine regional long-period waveforms and high-frequency P-wave polarity from the same stations, as well as from teleseismic arrays, to constrain the source type. Discrimination capability with respect to velocity model and station coverage is examined, and additionally we investigate the velocity-model dependence of vanishing free-surface-traction effects on seismic moment tensor inversion of shallow sources and recovery of the explosive scalar moment. Our synthetic data tests indicate that biases in scalar seismic moment and discrimination for shallow sources are small and can be understood in a systematic manner. We are presently investigating the frequency dependence of vanishing traction for a very shallow (10 m depth) M2+ chemical explosion recorded at distances of several kilometers; preliminary results indicate that at the typical frequency passband we employ, the bias does not affect our ability to retrieve the correct source mechanism, but it may affect the retrieval of the correct scalar seismic moment. Finally, we assess discrimination capability in a composite P-value statistical framework.

  4. Flexible kinematic earthquake rupture inversion of tele-seismic waveforms: Application to the 2013 Balochistan, Pakistan earthquake

    NASA Astrophysics Data System (ADS)

    Shimizu, K.; Yagi, Y.; Okuwaki, R.; Kasahara, A.

    2017-12-01

    Kinematic earthquake rupture models are useful for deriving statistics and scaling properties of large and great earthquakes. However, the kinematic rupture models for the same earthquake often differ from one another. Such sensitivity of the modeling prevents us from understanding the statistics and scaling properties of earthquakes. Yagi and Fukahata (2011) introduced the uncertainty of the Green's function into tele-seismic waveform inversion and showed that a stable spatiotemporal distribution of slip-rate can be obtained by using an empirical Bayesian scheme. One of the unsolved problems in the inversion arises from the modeling error originating from an uncertain fault-model setting. The Green's function near the nodal plane of the focal mechanism is known to be sensitive to slight changes in the assumed fault geometry, and thus the spatiotemporal distribution of slip-rate can be distorted by the modeling error originating from the uncertainty of the fault model. We propose a new method accounting for complexity in the fault geometry by additionally solving for the focal mechanism on each space knot. Since a solution of finite source inversion becomes unstable as the flexibility of the model increases, we estimate a stable spatiotemporal distribution of focal mechanisms in the framework of Yagi and Fukahata (2011). We applied the proposed method to 52 tele-seismic P-waveforms of the 2013 Balochistan, Pakistan earthquake. The inverted-potency distribution shows unilateral rupture propagation toward the southwest of the epicenter, and the spatial variation of the focal mechanisms shares the same pattern as the fault curvature along the tectonic fabric. On the other hand, the broad pattern of the rupture process, including the direction of rupture propagation, cannot be reproduced by an inversion analysis under the assumption that the faulting occurred on a single flat plane.
These results show that the modeling error caused by simplifying the fault model is non-negligible in the tele-seismic waveform inversion of the 2013 Balochistan, Pakistan earthquake.

  5. Vibrational modes of hydraulic fractures: Inference of fracture geometry from resonant frequencies and attenuation

    NASA Astrophysics Data System (ADS)

    Lipovsky, Bradley P.; Dunham, Eric M.

    2015-02-01

    Oscillatory seismic signals arising from resonant vibrations of hydraulic fractures are observed in many geologic systems, including volcanoes, glaciers and ice sheets, and hydrocarbon and geothermal reservoirs. To better quantify the physical dimensions of fluid-filled cracks and properties of the fluids within them, we study wave motion along a thin hydraulic fracture waveguide. We present a linearized analysis, valid at wavelengths greater than the fracture aperture, that accounts for quasi-static elastic deformation of the fracture walls, as well as fluid viscosity, inertia, and compressibility. In the long-wavelength limit, anomalously dispersed guided waves known as crack or Krauklis waves propagate with restoring force from fracture wall elasticity. At shorter wavelengths, the waves become sound waves within the fluid channel. Wave attenuation in our model is due to fluid viscosity, rather than seismic radiation from crack tips or fracture wall roughness. We characterize viscous damping at both low frequencies, where the flow is always fully developed, and at high frequencies, where the flow has a nearly constant velocity profile away from viscous boundary layers near the fracture walls. Most observable seismic signals from resonating fractures likely arise in the boundary layer crack wave limit, where fluid-solid coupling is pronounced and attenuation is minimal. We present a method to estimate the aperture and length of a resonating hydraulic fracture using both the seismically observed quality factor and characteristic frequency. Finally, we develop scaling relations between seismic moment and characteristic frequency that might be useful when interpreting the statistics of hydraulic fracture events.

  6. Estimating the Depth of Stratigraphic Units from Marine Seismic Profiles Using Nonstationary Geostatistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chihi, Hayet; Galli, Alain; Ravenne, Christian

    2000-03-15

    The object of this study is to build a three-dimensional (3D) geometric model of the stratigraphic units of the margin of the Rhone River on the basis of geophysical investigations by a network of seismic profiles at sea. The geometry of these units is described by depth charts of each surface identified by seismic profiling, which is done using geostatistics. The modeling starts with a statistical analysis by which we determine the parameters that enable us to calculate the variograms of the identified surfaces. After having determined the statistical parameters, we calculate the variograms of the variable Depth. By analyzing the behavior of the variogram we then can deduce whether the situation is stationary and whether the variable has an anisotropic behavior. We tried the following two nonstationary methods to obtain our estimates: (a) the method of universal kriging, if the underlying variogram was directly accessible; (b) the method of increments, if the underlying variogram was not directly accessible. After having modeled the variograms of the increments and of the variable itself, we calculated the surfaces by kriging the variable Depth on a small-mesh estimation grid. The two methods then are compared and their respective advantages and disadvantages are discussed, as well as their fields of application. These methods are capable of being used widely in the earth sciences for automatic mapping of geometric surfaces or for variables such as a piezometric surface or a concentration, which are not 'stationary,' that is, essentially, possess a gradient or a tendency to develop systematically in space.
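    The first step described above, computing an empirical variogram of the variable Depth and checking for nonstationarity, can be sketched as follows (synthetic 1-D data with a deliberate linear trend, so the variogram keeps growing with lag instead of reaching a sill; not the study's data or code):

```python
import numpy as np

def empirical_variogram(x, z, lags, tol):
    """Empirical semivariogram gamma(h) = mean of 0.5*(z_i - z_j)^2 over
    point pairs whose separation |x_i - x_j| falls within tol of lag h."""
    dx = np.abs(x[:, None] - x[None, :])
    half_sq = 0.5 * (z[:, None] - z[None, :]) ** 2
    gamma = []
    for h in lags:
        mask = (dx > h - tol) & (dx <= h + tol)
        gamma.append(half_sq[mask].mean())
    return np.array(gamma)

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 100.0, 300))
z = 0.05 * x + rng.normal(0.0, 0.5, 300)   # depth with a linear trend
gamma = empirical_variogram(x, z, lags=np.array([5.0, 20.0, 50.0]),
                            tol=2.5)
# gamma grows with lag: the trend makes Depth nonstationary, which is
# what motivates universal kriging or the method of increments.
```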

  7. Interactive Model Visualization for NET-VISA

    NASA Astrophysics Data System (ADS)

    Kuzma, H. A.; Arora, N. S.

    2013-12-01

    NET-VISA is a probabilistic system developed for seismic network processing of data measured on the International Monitoring System (IMS) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). NET-VISA is composed of a Generative Model (GM) and an Inference Algorithm (IA). The GM is an explicit mathematical description of the relationships between various factors in seismic network analysis. Some of the relationships inside the GM are deterministic and some are statistical. Statistical relationships are described by probability distributions, the exact parameters of which (such as mean and standard deviation) are found by training NET-VISA on recent data. The IA uses the GM to evaluate the probability of various events and associations, searching for the seismic bulletin that has the highest overall probability and is consistent with a given set of measured arrivals. An Interactive Model Visualization tool (IMV) has been developed which makes 'peeking into' the GM simple and intuitive through a web-based interface. For example, it is now possible to access the probability distributions for attributes of events and arrivals, such as the detection rate at each station for each of 14 phases. It also clarifies the assumptions and prior knowledge that are incorporated into NET-VISA's event determination. When NET-VISA is retrained, the IMV will serve as a visual quality-control tool, both for testing that the training has been accomplished correctly and for checking that the IMS network has not changed unexpectedly. A preview of the IMV will be shown at this poster presentation. (Figure: the IMV homepage, showing the current model file and a reference image.)

  8. Identifying tectonic parameters that affect tsunamigenesis

    NASA Astrophysics Data System (ADS)

    van Zelst, I.; Brizzi, S.; Heuret, A.; Funiciello, F.; van Dinther, Y.

    2016-12-01

    The role of tectonics in tsunami generation is at present poorly understood. However, the fact thatsome regions produce more tsunamis than others indicates that tectonics could influencetsunamigenesis. Here, we complement a global earthquake database that contains geometrical,mechanical, and seismicity parameters of subduction zones with tsunami data. We statisticallyanalyse the database to identify the tectonic parameters that affect tsunamigenesis. The Pearson'sproduct-moment correlation coefficients reveal high positive correlations of 0.65 between,amongst others, the maximum water height of tsunamis and the seismic coupling in a subductionzone. However, these correlations are mainly caused by outliers. The Spearman's rank correlationcoefficient results in statistically significant correlations of 0.60 between the number of tsunamisin a subduction zone and subduction velocity (positive correlation) and the sediment thickness atthe trench (negative correlation). Interestingly, there is a positive correlation between the latter andtsunami magnitude. These bivariate statistical methods are extended to a binary decision tree(BDT) and multivariate analysis. Using the BDT, the tectonic parameters that distinguish betweensubduction zones with tsunamigenic and non-tsunamigenic earthquakes are identified. To assessphysical causality of the tectonic parameters with regard to tsunamigenesis, we complement ouranalysis by a numerical study of the most promising parameters using a geodynamic seismic cyclemodel. We show that the inclusion of sediments on the subducting plate results in an increase insplay fault activity, which could lead to larger vertical seafloor displacements due to their steeperdips and hence a larger tsunamigenic potential. We also show that the splay fault is the preferredrupture path for a strongly velocity strengthening friction regime in the shallow part of thesubduction zone, which again increases the tsunamigenic potential.
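
The contrast the abstract draws between the outlier-sensitive Pearson coefficient and the rank-based Spearman coefficient can be reproduced with scipy on an invented subduction-zone table (all numbers hypothetical, for illustration only):

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(0)
# Invented table: subduction velocity (cm/yr) against number of recorded
# tsunamis per zone, with one extreme outlier zone appended.
velocity = rng.uniform(2.0, 10.0, 40)
tsunamis = np.rint(2.0 * velocity + rng.normal(0.0, 3.0, 40)).clip(min=0.0)
velocity = np.append(velocity, 30.0)      # outlier zone
tsunamis = np.append(tsunamis, 200.0)

r_p, p_p = pearsonr(velocity, tsunamis)   # moment correlation, outlier-sensitive
r_s, p_s = spearmanr(velocity, tsunamis)  # rank correlation, robust to outliers
print(f"Pearson r = {r_p:.2f}, Spearman rho = {r_s:.2f}")
```

A single extreme zone can inflate the Pearson coefficient while leaving the Spearman coefficient essentially unchanged, which is why the abstract treats the rank correlation as the more trustworthy bivariate statistic.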

  9. Dynamic triggering of low magnitude earthquakes in the Middle American Subduction Zone

    NASA Astrophysics Data System (ADS)

    Escudero, C. R.; Velasco, A. A.

    2010-12-01

    We analyze global and Middle American Subduction Zone (MASZ) seismicity from 1998 to 2008 to quantify transient stress effects at teleseismic distances. We use the Bulletin of the International Seismological Centre Catalog (ISCCD) published by the Incorporated Research Institutions for Seismology (IRIS). To identify MASZ seismicity changes due to distant, large (Mw > 7) earthquakes, we first identify local earthquakes that occurred before and after the mainshocks. We then group the local earthquakes within a cluster radius of 75 to 200 km. We obtain statistics based on characteristics of both the mainshocks and the local earthquake clusters, such as local cluster-mainshock azimuth, mainshock focal mechanism, and the position of local earthquake clusters within the MASZ. Because of lateral variations in dip along the subducted oceanic plate, we divide the Mexican subduction zone into four segments. We then apply a Paired Samples Statistical Test (PSST) to the sorted data to identify increases, decreases, or no change in the local seismicity associated with distant large earthquakes. We identify dynamic triggering in all MASZ segments produced by large earthquakes arriving from specific azimuths, as well as a decrease in seismicity in some cases. We find no dependence of seismicity changes on the mainshock focal mechanism.
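
The abstract does not spell out the exact form of the PSST; a paired t-test on per-cluster event counts before and after each mainshock is one common choice, sketched here on synthetic counts (sample sizes and rates are invented):

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(1)
# Synthetic counts of local MASZ events in fixed windows before and after
# each of 25 distant Mw > 7 mainshocks; triggering raises the mean rate.
before = rng.poisson(10.0, 25)
after = rng.poisson(13.0, 25)

# Paired test: each cluster is its own control, compared across the two epochs.
t_stat, p_value = ttest_rel(after, before)
print(f"t = {t_stat:.2f}, p = {p_value:.3g}")
```

A significantly positive t points to triggering (an increase), a significantly negative t to quiescence, and a non-significant result to no detectable change, matching the three outcomes the study distinguishes.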

  10. Investigation of seismicity after the initiation of a Seismic Electric Signal activity until the main shock

    PubMed Central

    Sarlis, N. V.; Skordas, E. S.; Lazaridou, M. S.; Varotsos, P. A.

    2008-01-01

    The behavior of seismicity in the area candidate to suffer a main shock is investigated after the observation of the Seismic Electric Signal activity until the impending main shock. This is based on the view that the occurrence of earthquakes is a critical phenomenon to which statistical dynamics may be applied. In the present work, analysing the time series of small earthquakes, the concept of natural time χ was used and the results revealed that the approach to criticality itself can be manifested by the probability density function (PDF) of κ1 calculated over an appropriate statistical ensemble. Here, κ1 = 〈χ²〉 − 〈χ〉² is the variance of natural time, resulting from the power spectrum of the function Φ(ω) = ∑_{k=1}^{N} p_k exp(iωχ_k), where p_k is the normalized energy of the k-th small earthquake and ω the natural frequency. This PDF exhibits a maximum at κ1 ≈ 0.070 a few days before the main shock. Examples are presented, referring to the magnitude 6∼7 class earthquakes that occurred in Greece. PMID:18941306
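
The variance κ1 defined above is straightforward to compute directly from an ordered list of event energies; a minimal sketch, following the definitions in the abstract:

```python
import numpy as np

def kappa1(energies):
    """kappa_1 = <chi^2> - <chi>^2: the variance of natural time
    chi_k = k/N, weighted by the normalized event energies p_k."""
    p = np.asarray(energies, dtype=float)
    p = p / p.sum()                       # normalized energy of each event
    n = len(p)
    chi = np.arange(1, n + 1) / n         # natural time of the k-th event
    return np.sum(p * chi**2) - np.sum(p * chi)**2

# For equal event energies kappa_1 tends to 1/12, the variance of a
# uniform distribution; the abstract reports the PDF of kappa_1 peaking
# near 0.070 a few days before the main shock.
print(round(kappa1(np.ones(1000)), 4))    # prints 0.0833
```

In practice the energies p_k would come from the seismic moments or radiated energies of the small earthquakes following the Seismic Electric Signal activity.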

  11. Statistical Analysis of Seismicity in the Sumatra Region

    NASA Astrophysics Data System (ADS)

    Bansal, A.; Main, I.

    2007-12-01

    We examine the effect of the great M = 9.0 Boxing Day 2004 earthquake on the statistics of seismicity in the Sumatra region by dividing data from the NEIC catalogue into two time windows, before and after the earthquake. First we determine a completeness threshold of magnitude 4.5 for the whole dataset from the stability of the maximum likelihood b-value with respect to changes in the threshold. The split data sets have similar statistical sampling, with 2563 events before and 3701 after the event. Temporal clustering is first quantified broadly by the fractal dimension of the time series, found to be 0.137, 0.259 and 0.222 before, after, and for the whole dataset respectively, compared to a Poisson null hypothesis of 0, indicating a significant increase in temporal clustering after the event associated with aftershocks. To quantify this further we apply the Epidemic Type Aftershock Sequence (ETAS) model. The background seismicity rate μ and the coefficient α, a measure of the efficiency of an earthquake of a given magnitude in generating aftershocks, do not change significantly when averaged over the two time periods. In contrast, the amplitude A of aftershock generation changes by a factor of 4 or so, and there is a small but statistically significant increase in the Omori decay exponent p, indicating a faster decay rate of the aftershocks after the Sumatra earthquake. The ETAS model parameters are calculated for different magnitude thresholds (4.5, 5.0, 5.5), with similar results across thresholds. The α value increases from near 1 to near 1.5, possibly reflecting known changes in the scaling exponent between scalar moment and magnitude with increasing magnitude. A simple relation between magnitude and the span of aftershock activity indicates that detectable aftershock activity of the Sumatra earthquake may last up to 8.7 years. Earthquakes are predominantly in the depth range 30-40 km before and 20-30 km after the mainshock, compared to a CMT centroid depth of 28.6 km for the earthquake.
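
The threshold-stability procedure for the b-value mentioned above can be sketched with the standard Aki (1965) maximum-likelihood estimator on a synthetic Gutenberg-Richter catalog (the catalog itself is invented; only the estimator and the stability criterion follow the abstract):

```python
import numpy as np

def b_value_mle(mags, m_c, dm=0.1):
    """Aki (1965) maximum-likelihood b-value for magnitudes >= m_c,
    with the usual half-bin correction dm/2 for 0.1-unit binned catalogs."""
    m = np.asarray(mags)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))

# Synthetic Gutenberg-Richter catalog with true b = 1, complete above ~4.5;
# raw magnitudes start at 4.45 so that 0.1-unit binning is consistent.
rng = np.random.default_rng(42)
mags = 4.45 + rng.exponential(np.log10(np.e), 100000)
mags = np.round(mags, 1)                  # bin to 0.1 magnitude units

# The estimate should be stable as the threshold is raised above the
# completeness magnitude -- the criterion used in the abstract.
for m_c in (4.5, 5.0, 5.5):
    print(f"Mc = {m_c}: b = {b_value_mle(mags, m_c):.2f}")
```

Below the completeness magnitude of a real catalog the estimate would drift because missing small events bias the mean magnitude upward; the threshold at which it stabilizes is taken as Mc.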

  12. Synthetic Earthquake Statistics From Physical Fault Models for the Lower Rhine Embayment

    NASA Astrophysics Data System (ADS)

    Brietzke, G. B.; Hainzl, S.; Zöller, G.

    2012-04-01

    As of today, seismic risk and hazard estimates mostly use purely empirical, stochastic models of earthquake fault systems, tuned specifically to the vulnerable areas of interest. Although such models allow for reasonable risk estimates, they fail to provide a link between the observed seismicity and the underlying physical processes. Solving a state-of-the-art, fully dynamic set of equations for all relevant physical processes in an earthquake fault system is likely not useful, since it comes with a large number of degrees of freedom, poor constraints on model parameters, and a huge computational effort. Here, quasi-static and quasi-dynamic physical fault simulators provide a compromise between physical completeness and computational affordability, and aim at providing a link between basic physical concepts and the statistics of seismicity. Within this framework we investigate a model of the Lower Rhine Embayment (LRE) that is based upon seismological and geological data. We present and discuss statistics of the spatio-temporal behavior of the generated synthetic earthquake catalogs with respect to simplification (e.g. simple two-fault cases) as well as to complication (e.g. hidden faults, geometric complexity, heterogeneities of constitutive parameters).

  13. Geothermal production and reduced seismicity: Correlation and proposed mechanism

    DOE PAGES

    Cardiff, Michael; Lim, David D.; Patterson, Jeremy R.; ...

    2018-01-15

    At Brady Hot Springs, a geothermal field in Nevada, heated fluids have been extracted, cooled, and re-injected to produce electrical power since 1992. Analysis of daily pumping records and catalogs of microseismicity between 2010 and 2015 indicates a statistically significant correlation between days when the daily volume of production was at or above its long-term average rate and days when no seismic event was detected. Conversely, shutdowns in pumping for plant maintenance correlate with increased microseismicity. Our hypothesis is that the effective stress in the subsurface has adapted to the long-term normal operations (deep extraction) at the site. Under this hypothesis, extraction of fluids inhibits fault slip by increasing the effective stress on faults; in contrast, brief pumping cessations represent times when effective stress is decreased below its long-term average, increasing the likelihood of microseismicity.
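
The abstract classifies each day by production level and by seismic detection; the paper's exact test is not stated, but a one-sided Fisher's exact test on the resulting 2x2 day-count table is one standard way to assess such a correlation (the counts below are invented):

```python
import numpy as np
from scipy.stats import fisher_exact

# Hypothetical 2x2 day-count table for a Brady-style analysis:
# rows: production at/above vs below its long-term average,
# columns: no seismic event detected vs at least one event detected.
table = np.array([[700, 300],     # high-production days
                  [250, 350]])    # low-production days

# One-sided test: are quiet days over-represented on high-production days?
odds_ratio, p_value = fisher_exact(table, alternative="greater")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3g}")
```

A significant odds ratio above 1 would mirror the reported association between above-average pumping and seismically quiet days.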

  15. Annual modulation of seismicity along the San Andreas Fault near Parkfield, CA

    USGS Publications Warehouse

    Christiansen, L.B.; Hurwitz, S.; Ingebritsen, S.E.

    2007-01-01

    We analyze seismic data from the San Andreas Fault (SAF) near Parkfield, California, to test for annual modulation in seismicity rates. We use statistical analyses to show that seismicity is modulated with an annual period in the creeping section of the fault and a semiannual period in the locked section of the fault. Although the exact mechanism for seasonal triggering is undetermined, it appears that stresses associated with the hydrologic cycle are sufficient to fracture critically stressed rocks either through pore-pressure diffusion or crustal loading/unloading. These results shed additional light on the state of stress along the SAF, indicating that hydrologically induced stress perturbations of ∼2 kPa may be sufficient to trigger earthquakes.
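
The abstract does not name its statistical analyses; the Schuster walk test is a common choice for detecting periodic modulation of event times, sketched here on a synthetic catalog with a built-in seasonal preference:

```python
import numpy as np

def schuster_p(times_days, period_days):
    """Schuster (1897) test: probability that the observed phase
    clustering of event times at the given period arises by chance,
    p = exp(-R^2 / N) with R the resultant length of the phase walk."""
    phases = 2.0 * np.pi * (np.asarray(times_days) % period_days) / period_days
    r2 = np.sum(np.cos(phases))**2 + np.sum(np.sin(phases))**2
    return np.exp(-r2 / len(phases))

# Synthetic catalog: 2000 events over 10 years, preferentially mid-year.
rng = np.random.default_rng(3)
n = 2000
year = rng.integers(0, 10, n)
phase = 0.5 + rng.vonmises(0.0, 1.0, n) / (2.0 * np.pi)   # peaks mid-year
seasonal = 365.25 * (year + phase)

print(f"annual-period p = {schuster_p(seasonal, 365.25):.3g}")  # small: modulation
```

Testing the same catalog at a half-year period would probe the semiannual modulation reported for the locked section.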

  16. A seismic survey of the Manson disturbed area

    NASA Technical Reports Server (NTRS)

    Sendlein, L. V. A.; Smith, T. A.

    1971-01-01

    The region in north-central Iowa referred to as the Manson disturbed area was investigated with the seismic refraction method and the bedrock configuration mapped. The area is approximately 30 km in diameter and is not detectable from the surface topography; however, water wells that penetrate the bedrock indicate that the bedrock is composed of disturbed Cretaceous sediments with a central region approximately 6 km in diameter composed of Precambrian crystalline rock. Seismic velocity differences between the overlying glacial till and the Cretaceous sediments were so small that a statistical program was developed to analyze the data. The program utilizes existing two-segment regression analyses and extends the method to fit three or more regression lines to seismic data.
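
The multi-segment regression idea can be sketched for the two-segment case: scan every candidate breakpoint of a refraction travel-time curve and keep the split with least total squared error (the data and function names here are invented illustrations, not the survey's program):

```python
import numpy as np

def two_segment_fit(x, y):
    """Fit two straight lines with one breakpoint by scanning all split
    points and keeping the split with least total sum of squared errors."""
    best = None
    for k in range(2, len(x) - 2):            # each segment needs >= 2 points
        sse = 0.0
        for xs, ys in ((x[:k], y[:k]), (x[k:], y[k:])):
            coef = np.polyfit(xs, ys, 1)      # least-squares line per segment
            sse += np.sum((ys - np.polyval(coef, xs))**2)
        if best is None or sse < best[1]:
            best = (k, sse)
    return best

# Synthetic refraction travel-time curve with a slope break at x = 50,
# where the faster refractor takes over from the slower surface layer.
x = np.linspace(0.0, 100.0, 60)
y = np.where(x < 50.0, x / 1.8, 50.0 / 1.8 + (x - 50.0) / 4.0)
k, sse = two_segment_fit(x, y)
print(f"breakpoint near x = {x[k]:.1f}, total SSE = {sse:.2e}")
```

Extending to three or more lines, as the survey's program does, amounts to scanning pairs (or tuples) of breakpoints in the same way.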

  17. A multidisciplinary approach to understand interactions of red wood ants (Formica rufa-group) and geotectonic processes

    NASA Astrophysics Data System (ADS)

    Berberich, G. M.; Berberich, M. B.; Grumpe, A.; Becker, A.; Tejeda, A.; Simpson, H.; Obamwonyi, S.; Schumann, M.; Hartmann, J.; Wöhler, C.; Ellison, A. M.

    2016-12-01

    Red wood ants (RWA; Formica rufa-group) are biological indicators of seismically active, gas-permeable faults and nest most successfully atop them. Exploratory testing of gases in and around RWA nests revealed that geochemical anomalies were absent from nearby, tectonically inactive, areas. Changes in activity patterns of RWA were correlated with regularly changing gas concentrations and tectonic events. Field work was done from March to September 2016 in the seismically active East Eifel Volcanic Field (western Germany) to investigate, at time scales of two weeks (and, during one month, of eight hours), relationships between: activity patterns of F. polyctena recorded and analyzed with an image-based monitoring system (AntCam); gas concentrations (CO2, He, Rn, H2S, CH4) in nests, soil, and nearby mineral springs; d13CCH4 of nest gas to determine the origin (biogenic, geogenic) of CH4; geophysical processes (seismic events, earth-tides); influences from space weather on Earth's magnetic field (e.g., Kp-index, hourly mean values of the magnetic variations); and local weather and climatic conditions. We analyzed geochemical, geophysical, and biological data with spatiotemporal Bayesian statistics and principal component analysis to identify possible causes of associations among RWA activity, degassing, and earthquakes. We observed significantly increased He and Rn concentrations in mineral gas and moderate increases in nest gas after two low-magnitude earthquakes. We expect further analysis of the acquired data to reveal additional geo-bio-correlations. The combination of seismically active fault zones and biological activity in RWA nests may contribute significantly to greenhouse gas emissions and ongoing climatic change. Funded by VW Foundation-Initiative "Experiment!" (Az 91 140).

  18. Evolution of the Lian River coastal basin in response to Quaternary marine transgressions in Southeast China

    NASA Astrophysics Data System (ADS)

    Tang, Yongjie; Zheng, Zhuo; Chen, Cong; Wang, Mengyuan; Chen, Bishan

    2018-04-01

    The coastal basin deposit in the Lian River plain is among the thickest Quaternary sequences along the southeastern coast of China. The clastic sediment accumulated in a variety of environmental settings including fluvial, channel, estuary/coastal and marine conditions. Detailed investigation of lithofacies, grain-size distributions, magnetic susceptibility, microfossils and chronology of marine core CN01, compared with regional cores, and combined with offshore seismic reflection profiles, has allowed us to correlate the spatial stratigraphy in the inner and outer plain and the seismic units. Grain-size distribution analysis of core CN01, using compositional data analysis and multivariate statistics, was applied to clastic sedimentary facies and sedimentary cycles. Results show that these methods derive robust proxy information for the depositional environment of the Lian River plain. We have also been able to reconstruct deltaic evolution in response to marine transgressions. On the basis of dating results and chronostratigraphy, the estimated age of the onset of deposition in the Lian River coastal plain was more than 260 kyr BP. Three transgressive sedimentary cycles revealed in many regional cores support this age model. Detailed lithological and microfossil studies confirm that three marine (M3, M2 and M1) and three terrestrial (T3, T2 and T1) units can be distinguished. Spatial correlation between the inner plain, outer plain (typical cores characterized by marine transgression cycles) and offshore seismic reflectors reveals coherent sedimentary sequences. Two major boundaries (unconformity and erosion surfaces) can be recognized in the seismic profiles, and these correspond to weathered reddish and/or variegated clay in the study core, suggesting that Quaternary sediment changes on the Lian River plain were largely controlled by sea-level variations and coastline shift during glacial/interglacial cycles.

  19. Drift Reliability Assessment of a Four Storey Frame Residential Building Under Seismic Loading Considering Multiple Factors

    NASA Astrophysics Data System (ADS)

    Sil, Arjun; Longmailai, Thaihamdau

    2017-09-01

    The lateral displacement of a Reinforced Concrete (RC) frame building during an earthquake has an important impact on structural stability and integrity. Seismic analysis and design of RC buildings demand particular care because the performance of the structure depends on many influencing parameters and other inherent uncertainties. The reliability approach takes into account the factors and uncertainties in design that influence the performance or response of the structure, so that the safety level or probability of failure can be ascertained. The present study aims to assess the reliability of the seismic performance of a four-storey residential RC building located in seismic Zone-V, as per the code provisions of Indian Standard IS: 1893-2002. The reliability assessment was performed by deriving an explicit expression for the maximum lateral roof displacement as a failure function by the regression method. A total of 319 four-storey RC buildings were analyzed by the linear static method using SAP2000, and the change in lateral roof displacement with variation of the parameters (column dimension, beam dimension, grade of concrete, floor height and total weight of the structure) was observed. A generalized relation was established by regression, which can be used to estimate the expected lateral displacement for those parameters. A comparison between the displacements obtained from the analysis and from the fitted equation shows that the proposed relation can be used directly to determine the expected maximum lateral displacement. The data obtained from the statistical computations were then used to obtain the probability of failure and the reliability.
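
The regression step described above can be sketched as an ordinary least-squares fit of roof displacement on the five parameters (predictor ranges, coefficients, and the response here are all invented stand-ins for the paper's SAP2000 results):

```python
import numpy as np

rng = np.random.default_rng(13)
# Hypothetical training set mimicking the study's 319 analysis runs:
# predictors are column size (m), beam size (m), concrete grade (MPa),
# floor height (m) and total seismic weight (kN); response is drift (mm).
n = 319
X = np.column_stack([
    rng.uniform(0.30, 0.60, n),       # column dimension
    rng.uniform(0.23, 0.45, n),       # beam dimension
    rng.uniform(20.0, 40.0, n),       # concrete grade
    rng.uniform(3.0, 3.6, n),         # floor height
    rng.uniform(4000.0, 9000.0, n),   # total weight
])
true_beta = np.array([-40.0, -25.0, -0.3, 8.0, 0.004])   # invented
drift = 20.0 + X @ true_beta + rng.normal(0.0, 1.0, n)

# Ordinary least squares with an intercept, standing in for the paper's
# regression-derived displacement expression.
A = np.column_stack([np.ones(n), X])
beta, *_ = np.linalg.lstsq(A, drift, rcond=None)
rms = np.sqrt(np.mean((drift - A @ beta)**2))
print(f"in-sample RMS error = {rms:.2f} mm")
```

The fitted coefficients play the role of the paper's generalized relation, which is then propagated into the failure-probability calculation.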

  20. Site-conditions map for Portugal based on VS measurements: methodology and final model

    NASA Astrophysics Data System (ADS)

    Vilanova, Susana; Narciso, João; Carvalho, João; Lopes, Isabel; Quinta Ferreira, Mario; Moura, Rui; Borges, José; Nemser, Eliza; Pinto, Carlos

    2017-04-01

    In this paper we present a statistically significant site-condition model for Portugal based on shear-wave velocity (VS) data and surface geology. We also evaluate the performance of commonly used Vs30 proxies based on exogenous data and analyze the implications of using those proxies for calculating site amplification in seismic hazard assessment. The dataset contains 161 VS profiles acquired in Portugal in the context of research projects, technical reports, academic theses and academic papers. The methodologies involved in characterizing the VS structure at the sites in the database include seismic refraction, multichannel analysis of surface waves and refraction microtremor. Invasive measurements were performed at selected locations in order to compare the VS profiles obtained from invasive and non-invasive techniques. In general there was good agreement in the subsurface structure of Vs30 obtained from the different methodologies. The database flat-file includes information on Vs30, surface geology at 1:50,000 and 1:500,000 scales, and elevation and topographic slope based on the SRTM30 topographic dataset. The procedure used to develop the site-conditions map is a three-step process: defining a preliminary set of geological units based on the literature, performing statistical tests to assess whether or not the differences in the distributions of Vs30 are statistically significant, and merging geological units accordingly. The dataset was, to some extent, affected by clustering and/or preferential sampling, and therefore a declustering algorithm was applied. The final model includes three geological units: 1) igneous, metamorphic and old (Paleogene and Mesozoic) sedimentary rocks; 2) Neogene and Pleistocene formations; and 3) Holocene formations.
The evaluation of proxies indicates that although geological analogues and topographic slope are in general unbiased, the latter shows significant bias for particular geological units and subsequently for some geographical regions.
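
The merge-or-keep decision for geological units rests on testing whether their Vs30 distributions differ; a rank-based test such as Mann-Whitney U is a natural choice for such skewed data (the samples below are invented log-normal stand-ins, not the Portuguese dataset):

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(7)
# Hypothetical log-normal Vs30 samples (m/s) for two candidate units.
rock = rng.lognormal(np.log(600.0), 0.3, 60)       # igneous/metamorphic sites
holocene = rng.lognormal(np.log(200.0), 0.3, 40)   # young soft sediments

# If the distributions differ significantly, the units stay separate;
# otherwise they would be merged, as in the map-building procedure.
u_stat, p_value = mannwhitneyu(rock, holocene, alternative="two-sided")
print(f"U = {u_stat:.0f}, p = {p_value:.3g}")
```

Repeating such pairwise tests over the preliminary unit set, and merging the non-significant pairs, yields the three-unit final model described in the abstract.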

  1. Rethinking the problem of ionosphere-lithosphere coupling

    NASA Astrophysics Data System (ADS)

    Ruzhin, Yuri; Novikov, Victor

    2014-05-01

    An overview of research into possible relations between variations of the geomagnetic field and seismicity is presented, including Sq-variations and geomagnetic storms. Many papers demonstrate positive correlations between geomagnetic field variations and subsequent earthquake occurrence, which has led authors to propose an earthquake-triggering influence of ionospheric processes on the lithosphere. Nevertheless, the opposing view, that geomagnetic disturbances have a negligible impact on the earthquake source, is supported by statistical analysis of the correlation between geomagnetic field variations and global and regional seismicity. Both points of view are based mainly on statistical research, either without detailed consideration of the physical mechanisms that may be involved in the supposed triggering, or with only very rough estimates of the possible stress increase on faults in a critical (near-failure) state. Recently it was shown that fluids may play a very important role in electromagnetic earthquake triggering, and that a secondary triggering mechanism should be considered, in which fluid migrating into a fault under electromagnetic action weakens the fault up to the earthquake-triggering threshold. At the same time, depending on fault orientation, the local hydrological structure of the crust around the fault, the location of fluid reservoirs, etc., fluid migration out of the fault may instead strengthen it, in which case a geomagnetic field variation may produce the opposite effect. A purely statistical approach to the problem of ionosphere-lithosphere coupling is therefore insufficient; in each case the likely behavior of fluids under electromagnetic forcing of the lithosphere should be considered. Experimental results supporting this idea, obtained with a spring-block model simulating the seismic cycle (slow accumulation and sharp drop of stresses in the fault gouge), as well as field observations of water-level variations in a well during ionospheric disturbances, are presented and discussed.

  2. Rescaled earthquake recurrence time statistics: application to microrepeaters

    NASA Astrophysics Data System (ADS)

    Goltz, Christian; Turcotte, Donald L.; Abaimov, Sergey G.; Nadeau, Robert M.; Uchida, Naoki; Matsuzawa, Toru

    2009-01-01

    Slip on major faults primarily occurs during `characteristic' earthquakes. The recurrence statistics of characteristic earthquakes play an important role in seismic hazard assessment. A major problem in determining applicable statistics is the shortness of the sequences of characteristic earthquakes available worldwide. In this paper, we introduce a rescaling technique in which sequences can be superimposed to establish larger numbers of data points. We consider the Weibull and log-normal distributions; in both cases we rescale the data using means and standard deviations. We test our approach utilizing sequences of microrepeaters, micro-earthquakes which recur in the same location on a fault. It seems plausible to regard these earthquakes as a miniature version of the classic characteristic earthquakes. Microrepeaters are much more frequent than major earthquakes, leading to longer sequences for analysis. In this paper, we present results for the analysis of recurrence times for several microrepeater sequences from Parkfield, CA as well as NE Japan. We find that, once the respective sequence can be considered sufficiently stationary, the statistics can be well fitted by either a Weibull or a log-normal distribution. We clearly demonstrate this fact by our technique of rescaled combination. We conclude that the recurrence statistics of the microrepeater sequences we consider are similar to the recurrence statistics of characteristic earthquakes on major faults.
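
The rescaled-combination idea can be sketched with scipy: pool several short synthetic recurrence sequences after rescaling, then fit a Weibull distribution to the superposed sample (here each sequence is simply rescaled to unit mean, a simplified version of the paper's mean/standard-deviation rescaling; all sequences are invented):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
# Hypothetical recurrence intervals (days) for three microrepeater
# sequences with very different mean rates but a common shape.
pooled = []
for mean_dt in (30.0, 90.0, 400.0):
    dt = stats.weibull_min.rvs(1.8, scale=mean_dt, size=50, random_state=rng)
    pooled.append(dt / dt.mean())          # rescale each sequence to unit mean
pooled = np.concatenate(pooled)

# Fit a Weibull to the superposed sample and check the goodness of fit.
shape, loc, scale = stats.weibull_min.fit(pooled, floc=0.0)
ks_stat, ks_p = stats.kstest(pooled, "weibull_min", args=(shape, loc, scale))
print(f"Weibull shape = {shape:.2f}, KS p = {ks_p:.2f}")
```

Superposition triples the sample size here, which is exactly the benefit the rescaling technique provides for the short characteristic-earthquake sequences.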

  3. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative study case regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non building-like industrial components, have been crossed with outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by the loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference.
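
The crossing of fragility curves with PSHA output reduces, in its simplest form, to integrating the conditional failure probability against the hazard density. A minimal numerical sketch (every number invented: a lognormal tank fragility and a power-law hazard curve, not the paper's empirical curves):

```python
import numpy as np
from scipy.stats import norm

# Invented inputs: lognormal tank fragility (median capacity 0.7 g,
# log-standard deviation 0.5) and an annual hazard curve
# lambda(PGA > a) = 1e-3 * a**-2 for the site.
a = np.linspace(0.05, 3.0, 600)                  # peak ground acceleration (g)
p_fail = norm.cdf(np.log(a / 0.7) / 0.5)         # P(failure | PGA = a)
haz_density = 2e-3 * a**-3.0                     # |d lambda / da| of the hazard

# Annual rate of seismically induced loss of containment:
# integrate fragility against the hazard density (trapezoidal rule).
f = p_fail * haz_density
rate = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(a))
print(f"annual rate of seismic loss of containment = {rate:.2e} per year")
```

The resulting failure rate is what would feed the downstream consequence analysis and the local risk contour plots.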

  4. Search for Long Period Solar Normal Modes in Ambient Seismic Noise

    NASA Astrophysics Data System (ADS)

    Caton, R.; Pavlis, G. L.

    2016-12-01

    We search for evidence of solar free oscillations (normal modes) in long period seismic data through multitaper spectral analysis of array stacks. This analysis is similar to that of Thomson & Vernon (2015), who used data from the quietest single stations of the global seismic network. Our approach is to use stacks of large arrays of noisier stations to reduce noise. Arrays have the added advantage of permitting the use of nonparametric statistics (jackknife errors) to provide objective error estimates. We used data from the Transportable Array, the broadband borehole array at Pinyon Flat, and the 3D broadband array in Homestake Mine in Lead, SD. The Homestake Mine array has 15 STS-2 sensors deployed in the mine that are extremely quiet at long periods due to stable temperatures and stable piers anchored to hard rock. The length of time series used ranged from 50 days to 85 days. We processed the data by low-pass filtering with a corner frequency of 10 mHz, followed by an autoregressive prewhitening filter and median stack. We elected to use the median instead of the mean in order to get a more robust stack. We then used G. Prieto's mtspec library to compute multitaper spectrum estimates on the data. We produce delete-one jackknife error estimates of the uncertainty at each frequency by computing median stacks of all data with one station removed. The results from the TA data show tentative evidence for several lines between 290 μHz and 400 μHz, including a recurring line near 379 μHz. This 379 μHz line is near the Earth mode 0T2 and the solar mode 5g5, suggesting that 5g5 could be coupling into the Earth mode. Current results suggest more statistically significant lines may be present in Pinyon Flat data, but additional processing of the data is underway to confirm this observation.
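
The median stack with delete-one jackknife errors described above can be sketched directly in numpy (array size, noise level, and signal are invented; only the stacking and jackknife machinery follows the abstract):

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical array: 15 stations, 1024 spectral samples; a weak common
# signal buried in independent station noise.
n_sta, n_f = 15, 1024
signal = 0.3 * np.sin(np.linspace(0.0, 20.0 * np.pi, n_f))
records = signal + rng.normal(0.0, 1.0, (n_sta, n_f))

stack = np.median(records, axis=0)        # robust median stack
# Delete-one jackknife: restack with each station left out in turn, then
# apply the jackknife variance formula (n-1)/n * sum of squared deviations.
leave_one_out = np.array([np.median(np.delete(records, i, axis=0), axis=0)
                          for i in range(n_sta)])
jk_err = np.sqrt((n_sta - 1) / n_sta
                 * np.sum((leave_one_out - leave_one_out.mean(axis=0))**2, axis=0))
print(f"mean jackknife standard error: {jk_err.mean():.3f}")
```

A spectral line is then judged significant only if it stands above the jackknife error band at its frequency, which is the objectivity the abstract attributes to the array approach.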

  5. Clues on the origin of post-2000 earthquakes at Campi Flegrei caldera (Italy).

    PubMed

    Chiodini, G; Selva, J; Del Pezzo, E; Marsan, D; De Siena, L; D'Auria, L; Bianco, F; Caliro, S; De Martino, P; Ricciolino, P; Petrillo, Z

    2017-06-30

    The inter-arrival times of the post-2000 seismicity at Campi Flegrei caldera are statistically distributed into different populations. The low inter-arrival times population represents swarm events, while the high inter-arrival times population marks background seismicity. Here, we show that the background seismicity is increasing at the same rate as (1) the ground uplift and (2) the concentration of the fumarolic gas species most sensitive to temperature. The seismic temporal increase is strongly correlated with the results of recent simulations modelling the injection of magmatic fluids into the Campi Flegrei hydrothermal system. These concurrent variations point to a unique process of temperature-pressure increase of the hydrothermal system controlling geophysical and geochemical signals at the caldera. Our results thus show that the occurrence of background seismicity is an excellent parameter to monitor the current unrest of the caldera.

  6. Computational Software to Fit Seismic Data Using Epidemic-Type Aftershock Sequence Models and Modeling Performance Comparisons

    NASA Astrophysics Data System (ADS)

    Chu, A.

    2016-12-01

    Modern earthquake catalogs are often analyzed using spatial-temporal point process models such as the epidemic-type aftershock sequence (ETAS) models of Ogata (1998). My work implements three of the homogeneous ETAS models described in Ogata (1998). Using a model's log-likelihood function, my software finds the maximum-likelihood estimates (MLEs) of the model's parameters to estimate the homogeneous background rate and the temporal and spatial parameters that govern triggering effects. The EM algorithm is employed for its stability and robustness (Veen and Schoenberg, 2008). My work also compares the three models in robustness, convergence speed, and implementation from theory to computing practice. Up-to-date regional seismic data from seismically active areas such as Southern California and Japan are used to demonstrate the comparisons. Data analysis has been done in the computer languages Java and R. Java has the advantages of strong typing and fine control of memory resources, while R offers numerous built-in functions for statistical computing. The two languages are also compared in convergence and stability, computational speed, and ease of implementation. Issues that may affect convergence, such as spatial shapes, are discussed.
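    As a rough illustration of the EM idea (not the software described above): the E-step of EM-based ETAS fitting assigns each event a probability of being background versus triggered. A minimal temporal sketch with hypothetical parameter values, for a single mainshock at t = 0:

```python
import numpy as np

# Hypothetical parameter values (illustrative only, not fitted to any catalog):
mu = 0.05                  # background rate (events/day)
K, c, p = 2.0, 0.1, 1.2    # modified Omori parameters for a mainshock at t = 0

def triggered_rate(t):
    """Aftershock rate from the mainshock, modified Omori law K/(t+c)^p."""
    return K / (t + c) ** p

def p_triggered(t):
    """E-step weight: probability an event at time t was triggered by the
    mainshock rather than by the background process."""
    nu = triggered_rate(t)
    return nu / (mu + nu)

times = np.array([0.2, 1.0, 5.0, 30.0, 200.0])   # days after the mainshock
probs = p_triggered(times)
```

    In a full EM fit these weights would feed an M-step that re-estimates mu, K, c, and p; here they simply show how early events are attributed to triggering and late events to the background.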

  7. Kinematics of the 2015 San Ramon, California earthquake swarm: Implications for fault zone structure and driving mechanisms

    NASA Astrophysics Data System (ADS)

    Xue, Lian; Bürgmann, Roland; Shelly, David R.; Johnson, Christopher W.; Taira, Taka'aki

    2018-05-01

    Earthquake swarms represent a sudden increase in seismicity that may indicate a heterogeneous fault zone, the involvement of crustal fluids and/or slow fault slip. Swarms sometimes precede major earthquake ruptures. An earthquake swarm occurred in October 2015 near San Ramon, California, in an extensional right step-over region between the northern Calaveras Fault and the Concord-Mt. Diablo fault zone, which has hosted ten major swarms since 1970. The 2015 San Ramon swarm is examined here from 11 October through 18 November using template matching analysis. The relocated seismicity catalog contains ∼4000 events with magnitudes between −0.2

  8. The generalized truncated exponential distribution as a model for earthquake magnitudes

    NASA Astrophysics Data System (ADS)

    Raschke, Mathias

    2015-04-01

    The random distribution of small, medium and large earthquake magnitudes follows an exponential distribution (ED) according to the Gutenberg-Richter relation. However, the magnitude distribution must be truncated at very large magnitudes because earthquake energy is finite, and the upper tail of the exponential distribution does not fit observations well. Hence the truncated exponential distribution (TED) is frequently applied to model magnitude distributions in seismic hazard and risk analysis. The TED has a weak point: when two TEDs with equal parameters, except the upper bound magnitude, are mixed, the resulting distribution is not a TED. Conversely, it is not possible to split the TED of a seismic region into TEDs of subregions with equal parameters, except the upper bound magnitude. This weakness is a principal problem, as seismic regions are constructed scientific objects and not natural units. It also applies to alternative distribution models. The generalized truncated exponential distribution (GTED) presented here overcomes this weakness; the ED and the TED are special cases of the GTED. Different issues of statistical inference are also discussed, and an example with empirical data is presented.
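    To make the TED concrete, a small sketch (my own illustration, not from the paper): sampling magnitudes from a TED by inverse-CDF and recovering the b-value with the standard Aki/Utsu maximum-likelihood estimator.

```python
import numpy as np

rng = np.random.default_rng(42)

b, m0, mmax = 1.0, 4.0, 8.0     # G-R b-value, lower and upper bound magnitudes
beta = b * np.log(10.0)

# Inverse-CDF sampling from the truncated exponential distribution (TED):
# F(m) = (1 - exp(-beta (m - m0))) / (1 - exp(-beta (mmax - m0)))
u = rng.random(10_000)
norm = 1.0 - np.exp(-beta * (mmax - m0))
m = m0 - np.log(1.0 - u * norm) / beta

# Aki/Utsu maximum-likelihood b-value. It ignores the truncation, so it is
# slightly biased for a TED, but serves as a quick check on synthetic data.
b_hat = np.log10(np.e) / (m.mean() - m0)
```

    With four magnitude units between m0 and mmax the truncation bias is tiny, so b_hat lands close to the b = 1.0 used to generate the sample.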

  9. Ground Motion Prediction Models for Caucasus Region

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, Nato; Godoladze, Tea; Tvaradze, Nino; Tumanova, Nino

    2016-04-01

    Ground motion prediction models (GMPMs) relate ground motion intensity measures to variables describing earthquake source, path, and site effects. Estimation of expected ground motion is fundamental to earthquake hazard assessment. The most commonly used parameters for attenuation relations are peak ground acceleration and spectral acceleration, because these parameters give useful information for seismic hazard assessment. Development of the Georgian Digital Seismic Network began in 2003. In this study, new GMP models are obtained based on new data from the Georgian seismic network and from neighboring countries. The models are estimated in the classical statistical way, by regression analysis. Site ground conditions are additionally considered, because the same earthquake recorded at the same distance may cause different damage depending on ground conditions. Empirical ground-motion prediction models (GMPMs) require adjustment to make them appropriate for site-specific scenarios, but the process of making such adjustments remains a challenge. This work presents a holistic framework for the development of a peak ground acceleration (PGA) or spectral acceleration (SA) GMPE that is easily adjustable to different seismological conditions and does not suffer from the practical problems associated with adjustments in the response spectral domain.
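    The regression step can be sketched as follows. The functional form, coefficient values, and synthetic records are assumptions for illustration, not the Caucasus dataset or the authors' model; a simple log-linear attenuation relation reduces GMPE fitting to linear least squares.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic records: magnitudes M and hypocentral distances R (km).
M = rng.uniform(4.0, 7.0, 300)
R = rng.uniform(5.0, 200.0, 300)

# Assumed "true" attenuation model: ln PGA = a + b*M - c*ln(R) + noise.
a, b, c = -3.5, 1.2, 1.1
ln_pga = a + b * M - c * np.log(R) + rng.normal(0.0, 0.3, 300)

# Classical regression: solve the linear least-squares system for (a, b, c).
A = np.column_stack([np.ones_like(M), M, -np.log(R)])
coef, *_ = np.linalg.lstsq(A, ln_pga, rcond=None)
a_hat, b_hat, c_hat = coef
```

    Real GMPE development adds site terms, magnitude saturation, and mixed-effects residual partitioning, but the core estimation step is this kind of regression.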

  10. Study of changes in the lineament structure, caused by earthquakes in South America by applying the lineament analysis to the Aster (Terra) satellite data

    NASA Astrophysics Data System (ADS)

    Arellano-Baeza, A. A.; Zverev, A. T.; Malinnikov, V. A.

    The region between southern Peru and northern Chile is one of the most seismically and volcanically active regions in South America. This activity is caused by the constant subduction of the Nazca Plate, which converges with the South American Plate in the extreme north of Chile. We used the 15 m and 30 m resolution satellite images provided by the ASTER (VNIR and SWIR) instrument onboard the Terra satellite to study changes in the geological faults close to earthquake epicenters in southern Peru. Visible and infrared spectral bands were analysed using “The Lineament Extraction and Stripes Statistic Analysis” (LESSA) software package to examine changes in the lineament features and stripe density fields caused by seismic activity. We used satellite images acquired 128 and 48 days before and 73 days after a magnitude 5.2 earthquake. Because seasonal variations in southern Peru and northern Chile are very small and vegetation is very limited, we were able to establish substantial changes in the lineament and stripe density field features. We develop a methodology that allows us to evaluate future seismic risk in this region.

  11. Use of high resolution satellite images for tracking of changes in the lineament structure, caused by earthquakes, situated nearly the Pacific coast of the North and South America.

    NASA Astrophysics Data System (ADS)

    Arellano-Baeza, A. A.; Garcia, R. V.; Trejo-Soto, M.

    The Pacific coast of North and South America is one of the most seismically and volcanically active regions in the world, forming part of the so-called Ring of Fire. More than 10 earthquakes with Richter-scale magnitude ≥ 4.5 were analyzed. They were located in regions with small seasonal variations and limited vegetation, to facilitate tracking of features associated with seismic activity only. High-resolution ASTER satellite images were used to extract the principal lineaments using “The Lineament Extraction and Stripes Statistic Analysis” (LESSA) software package. It was found that the number and orientation of lineaments changed significantly about one month before an earthquake, and a few months later the system returned to its initial state. This effect increases with earthquake magnitude and is much more easily detectable in the case of convergent plate boundaries, for example the Nazca and South American plates. The results obtained open the possibility of developing a methodology to evaluate seismic risk in regions with similar geological conditions.

  12. Seismic analysis of a LNG storage tank isolated by a multiple friction pendulum system

    NASA Astrophysics Data System (ADS)

    Zhang, Ruifu; Weng, Dagen; Ren, Xiaosong

    2011-06-01

    The seismic response of a vertical, cylindrical, extra-large liquefied natural gas (LNG) tank isolated by a multiple friction pendulum system (MFPS) is analyzed. Most extra-large LNG tanks have a fundamental frequency that falls within the resonant range of most earthquake ground motions, so an isolation system is an effective way to decrease the response of extra-large LNG storage tanks under a strong earthquake. However, isolation is difficult to implement in practice with common bearings due to issues such as low temperature, soft sites and other severe environmental factors. An extra-large LNG tank isolated by an MFPS is presented in this study to address these problems; an MFPS is appropriate for the large displacements induced by earthquakes with long predominant periods. A simplified finite element model by Malhotra and Dunkerley is used to determine the usefulness of the isolation system. Data reported and statistically sorted include pile shear, wave height, impulsive acceleration, convective acceleration and outer tank acceleration. The results show that the isolation system has excellent adaptability for different liquid levels and is very effective in controlling the seismic response of extra-large LNG tanks.

  13. Analysis of landslide hazard area in Ludian earthquake based on Random Forests

    NASA Astrophysics Data System (ADS)

    Xie, J.-C.; Liu, R.; Li, H.-W.; Lai, Z.-L.

    2015-04-01

    With the development of machine learning theory, more and more algorithms are being evaluated for seismic landslide assessment. After the Ludian earthquake, the research team, drawing on the special geological structure of the Ludian area and the results of seismic field exploration, selected as evaluation factors slope (PODU), river distance (HL), fault distance (DC), seismic intensity (LD), the digital elevation model (DEM), and the normalized difference vegetation index (NDVI) derived from remote sensing images. Because the relationships among these factors are fuzzy and the data are noisy and high-dimensional, we introduce the random forest algorithm, which tolerates these difficulties, to obtain an evaluation of the Ludian landslide areas. To verify the accuracy of the result, we use ROC graphs as the evaluation standard: the area under the curve (AUC) is 0.918, and the random forest's generalization error rate, measured by out-of-bag (OOB) estimation, decreases to 0.08 as the number of classification trees increases. Studying the final landslide inversion results, the paper reaches the statistical conclusion that nearly 80% of all landslides and collapses lie in areas of high or moderate susceptibility, showing that the forecast results are reasonable and can be adopted.
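    A minimal sketch of the ROC-AUC evaluation step, with invented synthetic scores rather than the Ludian data: AUC can be computed directly as the Mann-Whitney statistic, the probability that a randomly chosen positive case outscores a randomly chosen negative one.

```python
import numpy as np

rng = np.random.default_rng(7)

def roc_auc(scores, labels):
    """AUC as the Mann-Whitney statistic: P(random positive outscores a
    random negative), counting ties as one half."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical susceptibility scores for 1000 slope units.
labels = rng.integers(0, 2, 1000)                   # 1 = landslide occurred
informative = labels + rng.normal(0.0, 0.5, 1000)   # score correlated with truth
random_scores = rng.normal(0.0, 1.0, 1000)          # uninformative score

auc_good = roc_auc(informative, labels)
auc_rand = roc_auc(random_scores, labels)
```

    An informative score yields an AUC well above the 0.5 of a random score, which is the sense in which the paper's 0.918 indicates a useful susceptibility map.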

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarlis, N. V., E-mail: nsarlis@phys.uoa.gr; Christopoulos, S.-R. G.; Skordas, E. S.

    It has been recently shown [N. V. Sarlis, Phys. Rev. E 84, 022101 (2011) and N. V. Sarlis and S.-R. G. Christopoulos, Chaos 22, 023123 (2012)] that earthquakes of magnitude M greater than or equal to 7 are globally correlated. Such correlations were identified by studying the variance κ₁ of natural time, which has been proposed as an order parameter for seismicity. Here, we study the fluctuations of this order parameter using the Global Centroid Moment Tensor catalog for a magnitude threshold Mthres = 5.0 and focus on its behavior before major earthquakes. Natural time analysis reveals that distinct minima of the fluctuations of the order parameter of seismicity appear within almost five and a half months on average before all major earthquakes of magnitude larger than 8.4. This phenomenon corroborates the recent finding [N. V. Sarlis et al., Proc. Natl. Acad. Sci. U.S.A. 110, 13734 (2013)] that similar minima of the seismicity order parameter fluctuations preceded all major shallow earthquakes in Japan. Moreover, on the basis of these minima, a statistically significant binary prediction method for earthquakes of magnitude larger than 8.4, with a hit rate of 100% and a false alarm rate of 6.67%, is suggested.

  15. Seismic and Tectonic Regionalization of the State of Michoacan.

    NASA Astrophysics Data System (ADS)

    Vazquez Rosas, R.; Aguirre, J.; Garduño-Monroy, V. H.; Ramirez-Guzman, L.

    2017-12-01

    Mexico is a country with seismically active regions, mainly the zones next to the Pacific coast where the subduction zone is located. In this work we focus on the state of Michoacán, which has not been thoroughly studied in the 30 years since the 1985 Michoacán earthquake. The first and most important step is to determine which are the most seismic zones within the state, and one way is to carry out a regionalization of Michoacán, identifying the sources of earthquakes as well as where they occur most frequently. If we could know every factor that influences seismicity and describe every point of the terrain, every rupture, every rock, etc., then we could describe the seismic process analytically and predict the occurrence of earthquakes as we do eclipses. Unfortunately the number of parameters is so enormous that we cannot arrive at an exact description; however, we can take advantage of statistical properties to evaluate probabilities, even in the case of small systems such as a particular seismic zone. In this paper, epicenter data from 1970 to 2014 were collected and analyzed statistically, using data reported by the National Seismological Service and the IRIS catalog as well as some data from the Institute of Engineering, UNAM. Earthquakes of magnitude M = 4 and greater were used. These were plotted as a function of depth, the faults of the state were overlaid, and the state was divided into four seismic zones according to the faults and the localized seismicity. Zone A is located within the Michoacán Block set of faults, as well as part of the subduction zone on the coast of the state; seismicity in this area is high. Zone B-1 is located between the limits of Jalisco and Michoacán, in the set of faults called the Tepalcatepec depression, and borders the Jorullo-Tacámbaro fracture; at this site seismicity is relatively moderate. Zone B-2 is located at the limits of Michoacán and Guerrero, within the Michoacán-Oaxaca fault complex and the Zitzio and Villa de Santiago faults, with relatively moderate seismicity. Zone C is located at the limits of Guanajuato, Querétaro and the State of Mexico, within the Acambay fault complex and the Morelia fault system, with relatively low seismicity.

  16. Wavelet analysis for the study of the relations among soil radon anomalies, volcanic and seismic events: the case of Mt. Etna (Italy)

    NASA Astrophysics Data System (ADS)

    Ferrera, Elisabetta; Giammanco, Salvatore; Cannata, Andrea; Montalto, Placido

    2013-04-01

    From November 2009 to April 2011, soil radon activity was continuously monitored using a Barasol® probe located on the upper NE flank of Mt. Etna volcano, close to either the Piano Provenzana fault or the NE Rift. Seismic and volcanological data were analyzed together with the radon data. We also analyzed air and soil temperature, barometric pressure, and snow- and rainfall data. To find possible correlations among the above parameters, and hence to reveal possible anomalies in the radon time series, we used different statistical methods: i) multivariate linear regression; ii) cross-correlation; iii) coherence analysis through the wavelet transform. Multivariate regression indicated a modest influence of environmental parameters on soil radon (R2 = 0.31). When using 100-day time windows, the R2 values varied widely in time, reaching their maxima (~0.63-0.66) during summer. Cross-correlation analysis over 100-day moving averages showed that, as in the multivariate linear regression analysis, the summer period is characterised by the best correlation between radon data and environmental parameters. Lastly, wavelet coherence analysis allowed a multi-resolution coherence analysis of the acquired time series. This approach allows the relations among different signals to be studied in both the time and frequency domains. It confirmed the results of the previous methods, but also allowed correlations between radon and environmental parameters to be recognized at different observation scales (e.g., radon activity changed during strong precipitation, but also during anomalous variations of soil temperature uncorrelated with seasonal fluctuations). Our work suggests that an accurate analysis of the relations among distinct signals requires different techniques that give complementary analytical information. 
In particular, wavelet analysis proved very effective in discriminating radon changes due to environmental influences from those correlated with impending seismic or volcanic events.
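    As a toy illustration of the cross-correlation step (synthetic series with an assumed 5-day delay, not the Etna data), the lag of maximum normalized cross-correlation identifies a delayed environmental response:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic daily series: "radon" responds to "rainfall" with a 5-day delay.
n, true_lag = 400, 5
rain = rng.normal(0.0, 1.0, n)
radon = np.roll(rain, true_lag) + rng.normal(0.0, 0.1, n)

def xcorr_at(lag):
    """Normalized cross-correlation of rain leading radon by `lag` days."""
    x, y = rain[: n - lag], radon[lag:]
    x = x - x.mean()
    y = y - y.mean()
    return (x * y).sum() / np.sqrt((x ** 2).sum() * (y ** 2).sum())

lags = np.arange(21)
cc = np.array([xcorr_at(int(l)) for l in lags])
best_lag = int(lags[cc.argmax()])
```

    Wavelet coherence extends this idea by resolving such lagged correlations scale by scale instead of assuming one global delay.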

  17. The discrimination of man-made explosions from earthquakes using seismo-acoustic analysis in the Korean Peninsula

    NASA Astrophysics Data System (ADS)

    Che, Il-Young; Jeon, Jeong-Soo

    2010-05-01

    The Korea Institute of Geoscience and Mineral Resources (KIGAM) operates an infrasound network consisting of seven seismo-acoustic arrays in South Korea. Development of the arrays began in 1999, partially in collaboration with Southern Methodist University, with the goal of detecting distant infrasound signals from natural and anthropogenic phenomena in and around the Korean Peninsula. The main operational purpose of this network is to discriminate man-made seismic events from the regional seismicity, which includes thousands of seismic events per year. Man-made seismic events are a major cause of error in estimating natural seismicity, especially where seismic activity is weak or moderate, as in the Korean Peninsula. To discriminate man-made explosions from earthquakes, we have applied seismo-acoustic analysis, associating seismic and infrasonic signals generated by surface explosions. Observations of infrasound at multiple arrays make it possible to discriminate surface explosions, because small or moderate earthquakes do not generate appreciable infrasound. To date, we have discriminated hundreds of seismic events per year in the seismological catalog as surface explosions by seismo-acoustic analysis. Besides surface explosions, the network has also detected infrasound signals from other sources, such as bolides, typhoons, rocket launches, and an underground nuclear test in and around the Korean Peninsula. In this study, ten years of seismo-acoustic data are reviewed with a recent infrasonic detection algorithm and association method, finally linked to the seismic monitoring system of KIGAM to increase the detection rate of surface explosions. We present the long-term results of seismo-acoustic analysis, the detection capability of the multiple arrays, and implications for seismic source location. 
Since seismo-acoustic analysis has proved to be a reliable method for discriminating surface explosions, it will continue to be used for estimating natural seismicity and understanding infrasonic sources.

  18. Signal Quality and the Reliability of Seismic Observations

    NASA Astrophysics Data System (ADS)

    Zeiler, C. P.; Velasco, A. A.; Pingitore, N. E.

    2009-12-01

    The ability to detect, time and measure seismic phases depends on the location, size, and quality of the recorded signals. Additional constraints are an analyst's familiarity with a seismogenic zone and with the seismic stations that record the energy. Quantification and qualification of an analyst's ability to detect, time and measure seismic signals have not been calculated or fully assessed. The fundamental measurement for computing the accuracy of a seismic measurement is the signal quality. Several methods have been proposed to measure signal quality; the signal-to-noise ratio (SNR), computed as a short-term average over a long-term average, has been the most widely adopted. While the standard SNR is an easy and computationally inexpensive measure, its overall statistical significance has not been computed for seismic measurement analysis. The prospect of canonizing the process of cataloging seismic arrivals hinges on the ability to repeat measurements made by different methods and analysts. The first step in canonizing phase measurements has been taken by the IASPEI, which established a reference for accepted practices in naming seismic phases. The New Manual of Seismological Observatory Practice (NMSOP, 2002) outlines key observations for seismic phases recorded at different distances and proposes to quantify timing uncertainty with a user-specified windowing technique. However, this added measurement does not completely remove the bias introduced by the different techniques analysts use to time seismic arrivals. The general guideline for timing a seismic arrival is to record the time at which a noted change in frequency and/or amplitude begins. This is generally achieved by enhancing the arrivals through filtering or beamforming. However, these enhancements can alter the characteristics of the arrival and how it will be measured. 
Furthermore, each enhancement has user-specified parameters that can vary between analysts, which reduces the ability to repeat measurements between analysts. The SPEAR project (Zeiler and Velasco, 2009) has started to explore the effects of comparing measurements from the same seismograms. Initial results showed that experience and signal quality are the leading contributors to pick differences. However, the traditional SNR measure of signal quality was replaced by a wide-band spectral ratio (WSR) due to a decrease in scatter. This observation raises an important question: what is the best way to measure signal quality? We compare various methods that have been proposed to measure signal quality (traditional SNR, WSR, power spectral density plots, Allan variance) and discuss which provides the best tool for comparing arrival-time uncertainty.
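    The short-term-average over long-term-average SNR mentioned above can be sketched as a classic STA/LTA detector. The synthetic trace, window lengths, and trigger threshold below are illustrative assumptions, not values from the SPEAR project:

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic trace: unit-variance noise with a strong arrival at sample 500.
n, onset = 1000, 500
trace = rng.normal(0.0, 1.0, n)
trace[onset:] += 8.0 * rng.normal(0.0, 1.0, n - onset)

def sta_lta(x, nsta=20, nlta=200):
    """STA/LTA on squared amplitudes, using cumulative sums for speed.
    The LTA window is the nlta samples immediately before the STA window."""
    e = x ** 2
    c = np.concatenate([[0.0], np.cumsum(e)])
    ratio = np.zeros(len(x))
    for i in range(nlta + nsta, len(x)):
        sta = (c[i] - c[i - nsta]) / nsta
        lta = (c[i - nsta] - c[i - nsta - nlta]) / nlta
        ratio[i] = sta / lta
    return ratio

r = sta_lta(trace)
trigger = int(np.argmax(r > 5.0))   # first sample where the ratio exceeds 5
```

    The trigger index depends on the window lengths and threshold, which is precisely the analyst-dependent parameterization the abstract identifies as a source of non-repeatability.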

  19. Surface-Source Downhole Seismic Analysis in R

    USGS Publications Warehouse

    Thompson, Eric M.

    2007-01-01

    This report discusses a method for interpreting a layered slowness or velocity model from surface-source downhole seismic data, originally presented by Boore (2003). I have implemented this method in the statistical computing language R (R Development Core Team, 2007), so that it is freely and easily available to researchers and practitioners who may find it useful. I originally applied an early version of these routines to seismic cone penetration test (SCPT) data to analyze the horizontal variability of shear-wave velocity within the sediments of the San Francisco Bay area (Thompson et al., 2006). A more recent version of these codes was used to analyze the influence of interface selection and model assumptions on velocity/slowness estimates and the resulting differences in site amplification (Boore and Thompson, 2007). The R environment has many benefits for scientific and statistical computation; I chose R to disseminate these routines because it is versatile enough to program specialized routines, is highly interactive, which aids in the analysis of data, and is freely and conveniently available on a wide variety of computer platforms. These scripts are useful for interpreting layered velocity models from surface-source downhole seismic data such as deep boreholes and SCPT data. The inputs are the travel-time data and the offset of the source at the surface. The travel-time arrivals for the P- and S-waves must already be picked from the original data. An option in the inversion is to include estimates of the standard deviation of the travel-time picks for a weighted inversion of the velocity profile. The standard deviation of each travel-time pick is defined relative to the standard deviation of the best pick in a profile and is based on the accuracy with which the travel-time measurement could be determined from the seismogram. 
The analysis of the travel-time data consists of two parts: the identification of layer interfaces, and the inversion for the velocity of each layer. The analyst usually picks layer interfaces by visual inspection of the travel-time data. I have also developed an algorithm that automatically finds boundaries, which can save a significant amount of time when analyzing a large number of sites. The results of the automatic routines should be reviewed to check that they are reasonable. The interactivity of these scripts allows the user to add and remove layers quickly, providing rapid feedback on how the residuals are affected by each additional parameter in the inversion. In addition, the script allows many models to be compared at the same time.
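    Once the interfaces are picked, the inversion for layer slowness is linear. A minimal sketch (in Python rather than the R scripts described above, with an invented layer geometry and a simple vertical-ray approximation):

```python
import numpy as np

# Hypothetical layered model: interface depths already picked; invert for
# layer slownesses. For a vertical ray, the travel time to a receiver at
# depth z is a thickness-weighted sum of layer slownesses, linear in the
# unknowns.
tops = np.array([0.0, 5.0, 15.0])                 # layer top depths (m)
slowness = np.array([1/200.0, 1/400.0, 1/800.0])  # true slownesses (s/m)
receivers = np.array([2.0, 8.0, 12.0, 20.0, 30.0])  # receiver depths (m)

def path_matrix(z, tops, bottom=1e9):
    """Thickness of each layer traversed by a vertical ray to each depth z."""
    edges = np.concatenate([tops, [bottom]])
    seg = np.minimum(z[:, None], edges[None, 1:]) - edges[None, :-1]
    return np.clip(seg, 0.0, None)

G = path_matrix(receivers, tops)
t = G @ slowness    # noise-free synthetic travel times (s)

# Least-squares inversion recovers the layer slownesses exactly here.
s_hat, *_ = np.linalg.lstsq(G, t, rcond=None)
```

    The weighted inversion the report describes corresponds to scaling each row of G and t by the inverse standard deviation of that travel-time pick before solving.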

  20. Analysis of seismic patterns observed at Nevado del Ruiz volcano, Colombia during August-September 1985

    NASA Astrophysics Data System (ADS)

    Martinelli, Bruno

    1990-07-01

    The seismic activity of the Nevado del Ruiz volcano was monitored during August-September 1985 using a three-component portable seismograph station placed on the upper part of the volcano. The objective was to investigate the frequency content of the seismic signals and the possible sources of the volcanic tremor. The seismicity showed a wide spectrum of signals, especially at the beginning of September. Some relevant patterns from the collected records, which have been analyzed by spectrum analysis, are presented. For the purpose of analysis, the records have been divided into several categories such as long-period events, tremor, cyclic tremor episodes, and strong seismic activity on September 8, 1985. The origin of the seismic signals must be considered in relation to the dynamical and acoustical properties of fluids and the shape and dimensions of the volcano's conduits. The main results of the present experiment and analysis show that the sources of the seismic signals are within the volcanic edifice. The signal characteristics indicate that the sources lie in fluid-phase interactions rather than in brittle fracturing of solid components.

  1. A seismic hazard uncertainty analysis for the New Madrid seismic zone

    USGS Publications Warehouse

    Cramer, C.H.

    2001-01-01

    A review of the scientific issues relevant to characterizing earthquake sources in the New Madrid seismic zone has led to the development of a logic tree of possible alternative parameters. A variability analysis, using Monte Carlo sampling of this consensus logic tree, is presented and discussed. The analysis shows that for 2%-exceedance-in-50-year hazard, the best-estimate seismic hazard map is similar to previously published seismic hazard maps for the area. For peak ground acceleration (PGA) and spectral acceleration at 0.2 and 1.0 s (0.2 and 1.0 s Sa), the coefficient of variation (COV) representing the knowledge-based uncertainty in seismic hazard can exceed 0.6 over the New Madrid seismic zone and diminishes to about 0.1 away from areas of seismic activity. Sensitivity analyses show that the largest contributor to PGA, 0.2 and 1.0 s Sa seismic hazard variability is the uncertainty in the location of future 1811-1812 New Madrid-sized earthquakes. This is followed by the variability due to the choice of ground motion attenuation relation, the magnitude of the 1811-1812 New Madrid earthquakes, and the recurrence interval for M>6.5 events. Seismic hazard is not very sensitive to the variability in seismogenic width and length. Published by Elsevier Science B.V.
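    Monte Carlo sampling of a logic tree can be sketched as follows. The branches, weights, and stand-in attenuation relation are invented for illustration and bear no relation to the New Madrid consensus values:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy logic tree: alternative characteristic magnitudes and attenuation
# scalings, each with an epistemic weight (hypothetical values).
mags = np.array([7.5, 7.8, 8.1]); w_m = np.array([0.3, 0.5, 0.2])
attn = np.array([0.8, 1.0, 1.3]); w_a = np.array([0.25, 0.5, 0.25])

def toy_pga(m, scale, r=50.0):
    """Stand-in attenuation relation, illustrative only."""
    return scale * np.exp(0.9 * m - 1.1 * np.log(r + 10.0))

# Monte Carlo over the tree: each sample draws one branch per parameter,
# giving one possible hazard value; the COV summarizes knowledge uncertainty.
n = 5000
m = rng.choice(mags, n, p=w_m)
a = rng.choice(attn, n, p=w_a)
pga = toy_pga(m, a)
cov = pga.std() / pga.mean()
```

    In the real analysis each sample traverses many more branches (source geometry, recurrence, attenuation family), and the COV is mapped spatially rather than computed at one site.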

  2. Seismic Analysis Capability in NASTRAN

    NASA Technical Reports Server (NTRS)

    Butler, T. G.; Strang, R. F.

    1984-01-01

    Seismic analysis is a technique which pertains to loading described in terms of boundary accelerations. Earthquake shocks to buildings are the type of excitation which usually comes to mind when one hears the word seismic, but the technique also applies to a broad class of acceleration excitations applied at the base of a structure, such as vibration shaker testing or shocks to machinery foundations. Four different solution paths are available in NASTRAN for seismic analysis: Direct Seismic Frequency Response, Direct Seismic Transient Response, Modal Seismic Frequency Response, and Modal Seismic Transient Response. This capability, at present, is invoked not as separate rigid formats, but as pre-packaged ALTER packets to existing Rigid Formats 8, 9, 11, and 12. These ALTER packets are included with the delivery of the NASTRAN program and are stored on the computer as a library of callable utilities. The user calls one of these utilities and merges it into the Executive Control Section of the data deck; any of the four options is then invoked by setting parameter values in the bulk data.

  3. Earthquake risk perception in Bucharest, Romania.

    PubMed

    Armaş, Iuliana

    2006-10-01

    The Municipality of Bucharest is one of the capital cities with the highest seismic risk in the world. Bucharest is particularly vulnerable to seismic hazard due to the high density of inhabitants, especially within the residential districts with blocks of flats; the old public utility stock; the out-of-date infrastructure; the numerous industrial parks undergoing restructuring; and the inefficient organization of civil protection and poor education of the population regarding seismic risk. This research was designed to examine the attitudes and perceptions of people living with the risk of an earthquake hazard in Bucharest. We were interested in how attitudes and perceptions differ depending on gender, age, education, residential area, socioeconomic status, characteristics of the seismic hazard, degree of risk exposure, degree of danger, and casualty awareness. At the same time, we compare the results of this study with those of a previous, similar enquiry from 1997. The statistical processing indicated a significant difference between the declared perception of seismic risk and the independent variables of gender, age, level of education, level of attachment to the residential area, and the degree to which the subjects consider they may be affected and could recover their losses. Due to the continuous decrease in their living standard, the aged population is the most vulnerable. Feeling toward the residential area is another factor of statistical significance for the population's perception of seismic danger: a strong affective bond offers a feeling of safety and leads to neglect and even total denial of the hazard. For the independent variables regarding the type of dwelling, its age, and form of ownership, deviations of the empirical values from the theoretical distribution are not relevant for the correlation sought, which indicates that this issue goes beyond the above-mentioned criteria.

  4. Violations of Gutenberg-Richter Relation in Anthropogenic Seismicity

    NASA Astrophysics Data System (ADS)

    Urban, Pawel; Lasocki, Stanislaw; Blascheck, Patrick; do Nascimento, Aderson Farias; Van Giang, Nguyen; Kwiatek, Grzegorz

    2016-05-01

    Anthropogenic seismicity (AS) is the undesired dynamic rock-mass response to technological processes. AS environments are shallow; hence their heterogeneities have an important impact on AS. Moreover, AS is controlled by complex and changeable technological factors. This complicated origin explains why models used for tectonic seismicity may not be suitable for AS. We study four cases of AS, testing statistically whether the magnitudes follow the Gutenberg-Richter relation. The cases comprise data from the Mponeng gold mine in South Africa, data observed during stimulation of the Basel 1 geothermal well in Switzerland, data from the Acu water reservoir region in Brazil, and data from the Song Tranh 2 hydropower plant region in Vietnam. The cases differ in the inducing technology, the duration of the recording period, and the range of magnitudes. In all four cases the observed frequency-magnitude distributions differ statistically significantly from the Gutenberg-Richter relation. Although the Gutenberg-Richter b value changed over time in every case, this factor turns out not to be responsible for the observed deviations from the exponential distribution model underlying the Gutenberg-Richter relation. Though the deviations from the Gutenberg-Richter law are not large, they substantially diminish the accuracy of seismic hazard parameter estimates. We demonstrate that non-parametric kernel estimators of the magnitude distribution functions significantly improve the accuracy of hazard estimates; these estimators are therefore recommended for probabilistic analyses of seismic hazard caused by AS.
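    The core test described above, whether magnitudes above a completeness threshold follow the exponential (Gutenberg-Richter) model, can be sketched as below. This is an illustrative stand-in, not the authors' code; the function names and the Kolmogorov-Smirnov-style distance are assumptions.

```python
import numpy as np

def aki_b_value(mags, m_c, dm=0.1):
    """Maximum-likelihood b value (Aki, 1965) with Utsu's binning correction."""
    m = np.asarray(mags, float)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))

def ks_distance_to_gr(mags, m_c, b):
    """KS distance between observed magnitudes and the G-R exponential model."""
    m = np.sort(np.asarray(mags, float))
    m = m[m >= m_c]
    beta = b * np.log(10.0)
    model_cdf = 1.0 - np.exp(-beta * (m - m_c))
    emp_cdf = np.arange(1, m.size + 1) / m.size
    return float(np.max(np.abs(emp_cdf - model_cdf)))
```

    A large KS distance (relative to its sampling distribution) signals a departure from the exponential model; the kernel-estimator alternative mentioned in the abstract replaces the parametric CDF with a smoothed empirical one.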

  5. Multicomponent ensemble models to forecast induced seismicity

    NASA Astrophysics Data System (ADS)

    Király-Proag, E.; Gischig, V.; Zechar, J. D.; Wiemer, S.

    2018-01-01

    In recent years, human-induced seismicity has become an increasingly relevant topic because of its economic and social implications. Several models and approaches have been developed to explain the underlying physical processes or to forecast induced seismicity, ranging from simple statistical models to coupled numerical models incorporating complex physics. We advocate forecast testing as currently the best method for ascertaining whether models are capable of reasonably accounting for the key governing physical processes. Moreover, operational forecast models are of great interest for on-site decision-making in projects entailing induced earthquakes. We previously introduced a standardized framework following the guidelines of the Collaboratory for the Study of Earthquake Predictability, the Induced Seismicity Test Bench, to test, validate, and rank induced seismicity models. In this study, we describe how to construct multicomponent ensemble models based on Bayesian weightings that deliver more accurate forecasts than individual models for the Basel 2006 and Soultz-sous-Forêts 2004 enhanced geothermal stimulation projects. To this end, we examine five calibrated variants of two significantly different model groups: (1) Shapiro and Smoothed Seismicity, based on the seismogenic index, a simple modified Omori-law-type seismicity decay, and temporally weighted smoothed seismicity; (2) Hydraulics and Seismicity, based on numerically modelled pore pressure evolution that triggers seismicity via the Mohr-Coulomb failure criterion. We also demonstrate how the individual and ensemble models would perform as part of an operational Adaptive Traffic Light System. Investigating seismicity forecasts for a range of potential injection scenarios, we use forecast periods of different durations to compute the occurrence probabilities of seismic events with M ≥ 3. We show that for the Basel 2006 geothermal stimulation the models forecast hazardous levels of seismicity days before the occurrence of felt events.
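    A minimal sketch of likelihood-based Bayesian model weighting of the kind described, assuming Poisson forecasts of event counts per space-time bin. The function names and the simple weighted average are illustrative, not the paper's implementation.

```python
import numpy as np

def poisson_log_likelihood(rates, counts):
    """Log-likelihood of observed bin counts under independent Poisson rates
    (the count-factorial term is dropped: it is common to all models)."""
    rates = np.asarray(rates, float)
    counts = np.asarray(counts, float)
    return float(np.sum(counts * np.log(rates) - rates))

def bayesian_ensemble(forecasts, counts):
    """Weight each model by exp(log-likelihood), normalize, and return the
    weights plus the weighted-average ensemble forecast."""
    ll = np.array([poisson_log_likelihood(f, counts) for f in forecasts])
    w = np.exp(ll - ll.max())          # subtract max for numerical stability
    w /= w.sum()
    ensemble = np.tensordot(w, np.asarray(forecasts, float), axes=1)
    return w, ensemble
```

    In an operational setting the weights would be updated as each new forecast period is scored, so the ensemble adapts toward the better-performing component models.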

  6. Multiparametric statistical investigation of seismicity occurred at El Hierro (Canary Islands) from 2011 to 2014

    NASA Astrophysics Data System (ADS)

    Telesca, Luciano; Lovallo, Michele; Lopez, Carmen; Marti Molist, Joan

    2016-03-01

    A detailed statistical investigation of the seismicity that occurred at El Hierro volcano (Canary Islands) from 2011 to 2014 has been performed by analysing the time variation of four parameters: the Gutenberg-Richter b-value, the local coefficient of variation, the scaling exponent of the magnitude distribution, and the main periodicity of the earthquake sequence calculated using Schuster's test. These four parameters are good descriptors of the time and magnitude distributions of the seismic sequence, and their variations indicate dynamical changes in the volcanic system. These variations can be attributed to the causes and types of seismicity, allowing us to distinguish between different host-rock fracturing processes caused by intrusions of magma at different depths and overpressures. The statistical patterns observed among the studied unrest episodes, and between them and the 2011-2012 eruptive episode, indicate that the response of the host rock to the deformation imposed by magma intrusion did not differ significantly from one episode to another, suggesting that no significant local stress changes induced by magma intrusion occurred. Therefore, although the studied unrest episodes were caused by intrusions of magma at different depths and locations below El Hierro island, the mechanical response of the lithosphere was similar in all cases. This suggests that the reason the first unrest culminated in an eruption while the others did not may be related to the regional and local tectonics acting at that moment, rather than to the forcefulness of the magma intrusion.
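    Two of the four parameters, the local coefficient of variation of inter-event times and the Schuster test p-value for a candidate periodicity, can be computed roughly as follows. This is a hedged sketch; the windowing used in the study is not reproduced.

```python
import numpy as np

def local_cv(event_times):
    """Coefficient of variation of inter-event times: ~1 for a Poisson
    process, >1 for clustered seismicity, <1 for quasi-periodic activity."""
    dt = np.diff(np.sort(np.asarray(event_times, float)))
    return float(dt.std() / dt.mean())

def schuster_p(event_times, period):
    """Schuster's test: probability that the phase concentration at `period`
    arises by chance; small p-values indicate a significant periodicity."""
    phases = 2.0 * np.pi * (np.asarray(event_times, float) % period) / period
    n = phases.size
    r2 = np.sum(np.cos(phases)) ** 2 + np.sum(np.sin(phases)) ** 2
    return float(np.exp(-r2 / n))
```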

  7. Moon meteoritic seismic hum: Steady state prediction

    USGS Publications Warehouse

    Lognonne, P.; Feuvre, M.L.; Johnson, C.L.; Weber, R.C.

    2009-01-01

    We use three different statistical models describing the frequency of meteoroid impacts on Earth to estimate the seismic background noise due to impacts on the lunar surface. Because of diffraction, seismic events on the Moon are typically characterized by long codas, lasting 1 h or more. We find that the small but frequent impacts generate seismic signals whose codas overlap in time, resulting in a permanent seismic noise that we term the "lunar hum" by analogy with the Earth's continuous background seismic hum. We find that the Apollo-era impact detection rates and amplitudes are well explained by a model that parameterizes (1) the net seismic impulse due to the impactor and resulting ejecta and (2) the effects of diffraction and attenuation. The formulation permits the calculation of a composite waveform at any point on the Moon due to simulated impacts at any epicentral distance. The root-mean-square amplitude of this waveform yields a background noise level that is about 100 times lower than the resolution of the Apollo long-period seismometers. At 2 s periods, this noise level is more than 1000 times lower than the low noise model prediction for Earth's microseismic noise. Sufficiently sensitive seismometers will allow the future detection of several impacts per day at body wave frequencies. Copyright 2009 by the American Geophysical Union.

  8. Evaluation of seismic design spectrum based on UHS implementing fourth-generation seismic hazard maps of Canada

    NASA Astrophysics Data System (ADS)

    Ahmed, Ali; Hasan, Rafiq; Pekau, Oscar A.

    2016-12-01

    Two recent developments have come to the forefront with respect to updating seismic design provisions in codes: (1) the publication of new seismic hazard maps for Canada by the Geological Survey of Canada (GSC), and (2) the emergence of a new spectral format that outdates the conventional standardized spectral format. The fourth-generation seismic hazard maps are based on enriched seismic data, enhanced knowledge of regional seismicity, and improved seismic hazard modeling techniques. The new maps are therefore more accurate and need to be incorporated into the next edition of the Canadian Highway Bridge Design Code (CHBDC), as was done for its building counterpart, the National Building Code of Canada (NBCC). Indeed, the code writers expressed similar intentions in the commentary of CHBDC 2006. During their updating processes, the NBCC and the AASHTO Guide Specifications for LRFD Seismic Bridge Design (American Association of State Highway and Transportation Officials, Washington, 2009) lowered the probability level from 10% to 2% and from 10% to 5%, respectively. This study investigates five sets of hazard maps developed by the GSC, corresponding to 2%, 5% and 10% probabilities of exceedance in 50 years. To ensure sound statistical inference, 389 Canadian cities were selected. The study shows the implications of the new hazard maps for the design process (i.e., the extent of magnification or reduction of the design forces).
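    The probability-of-exceedance levels quoted above map to return periods under the usual Poissonian assumption, T = -t / ln(1 - p); for example, 2% in 50 years corresponds to roughly a 2475-year return period. A one-line sketch:

```python
import math

def return_period(p_exceed, t_years=50.0):
    """Return period in years from a probability of exceedance in t_years,
    assuming Poissonian (memoryless) occurrence of exceedances."""
    return -t_years / math.log(1.0 - p_exceed)
```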

  9. Unsupervised seismic facies analysis with spatial constraints using regularized fuzzy c-means

    NASA Astrophysics Data System (ADS)

    Song, Chengyun; Liu, Zhining; Cai, Hanpeng; Wang, Yaojun; Li, Xingming; Hu, Guangmin

    2017-12-01

    Seismic facies analysis techniques combine classification algorithms and seismic attributes to generate a map that describes the main reservoir heterogeneities. However, most current classification algorithms treat the seismic attributes as isolated data regardless of their spatial locations, and the resulting map is generally sensitive to noise. In this paper, a regularized fuzzy c-means (RegFCM) algorithm is used for unsupervised seismic facies analysis. Owing to the regularization term of the RegFCM algorithm, data whose adjacent locations belong to the same class play a more important role in the iterative process than other data. The method can therefore reduce the effect of seismic data noise present in discontinuous regions. Synthetic data with different signal-to-noise ratios are used to demonstrate the noise tolerance of the RegFCM algorithm. Meanwhile, the fuzzy factor, the neighbourhood window size, and the regularization weight are tested over a range of values to provide guidance on setting these parameters. The approach is also applied to a real seismic data set from the F3 block of the Netherlands. The results show improved spatial continuity, with clear facies boundaries and channel morphology, demonstrating that the method is an effective seismic facies analysis tool.
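    For orientation, a plain (unregularized) fuzzy c-means iteration is sketched below; the spatial regularization term that distinguishes RegFCM is deliberately omitted, and the parameter defaults are assumptions.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, iters=100, seed=0):
    """Plain fuzzy c-means on rows of X (n_samples x n_features).
    Returns cluster centers and the fuzzy membership matrix U."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1 per sample
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        # u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))
        U = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0)),
                         axis=2)
    return centers, U
```

    RegFCM adds a penalty coupling each sample's membership to those of its spatial neighbours, which is what suppresses isolated, noise-driven class assignments.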

  10. A preliminary census of engineering activities located in Sicily (Southern Italy) which may "potentially" induce seismicity

    NASA Astrophysics Data System (ADS)

    Aloisi, Marco; Briffa, Emanuela; Cannata, Andrea; Cannavò, Flavio; Gambino, Salvatore; Maiolino, Vincenza; Maugeri, Roberto; Palano, Mimmo; Privitera, Eugenio; Scaltrito, Antonio; Spampinato, Salvatore; Ursino, Andrea; Velardita, Rosanna

    2015-04-01

    The seismic events caused by human engineering activities are commonly termed "triggered" or "induced". This class of earthquakes, though characterized by low-to-moderate magnitudes, has significant social and economic implications, since such events occur close to the engineering activity responsible for triggering or inducing them, can be felt by the inhabitants living nearby, and may even produce damage. One of the first well-documented examples of induced seismicity was observed in 1932 in Algeria, when a shallow magnitude 3.0 earthquake occurred close to the Oued Fodda Dam. With the continuous global improvement of seismic monitoring networks, numerous other examples of human-induced earthquakes have been identified. Induced earthquakes occur at shallow depths and are related to a number of human activities, such as fluid injection under high pressure (e.g. waste-water disposal in deep wells, hydrofracturing in enhanced geothermal systems and oil recovery, shale-gas fracking, natural gas and CO2 storage), hydrocarbon exploitation, groundwater extraction, deep underground mining, large water impoundments, and underground nuclear tests. In Italy, induced/triggered seismicity is suspected to have contributed to the Vajont dam disaster of 1963. Despite this suspected case and the presence on Italian territory of a large number of engineering activities "capable" of inducing seismicity, no extensive research on this topic has been conducted to date. Hence, to improve knowledge and correctly assess the potential hazard at specific locations in the future, we started a preliminary study of the entire range of engineering activities currently located in Sicily (Southern Italy) which may "potentially" induce seismicity.
    To this end, we performed:
    • a preliminary census of all engineering activities located in the study area, collecting all useful information from available on-line catalogues;
    • a detailed compilation of instrumental and historical seismicity, focal mechanism solutions, multidisciplinary stress indicators, the GPS-based ground deformation field, mapped faults, etc., merging data from on-line catalogues with those reported in the literature.
    Finally, for each individual site we analysed: i) the long-term statistical behaviour of instrumental seismicity (magnitude of completeness, seismic release above a threshold magnitude, depth distribution, focal plane solutions); ii) the long-term statistical behaviour of historical seismicity (maximum magnitude estimation, recurrence time intervals, etc.); iii) the properties and orientation of faults (length, estimated geological slip, kinematics, etc.); iv) the regional stress field (from borehole, seismological and geological observations) and strain field (from GPS observations).
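    One of the instrumental-seismicity statistics listed, the magnitude of completeness, is often estimated with the maximum-curvature method; a hedged sketch follows (the +0.2 correction is a common convention, not necessarily the one used in this study).

```python
import numpy as np

def mc_maxcurvature(mags, dm=0.1, correction=0.2):
    """Magnitude of completeness via maximum curvature: the magnitude bin
    holding the most events, plus a customary upward correction."""
    m = np.asarray(mags, float)
    edges = np.arange(m.min(), m.max() + dm, dm)
    counts, edges = np.histogram(m, bins=edges)
    return float(edges[np.argmax(counts)] + correction)
```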

  11. Comparing Low-Frequency Earthquakes During Triggered and Ambient Tremor in Taiwan

    NASA Astrophysics Data System (ADS)

    Alvarado Lara, F., Sr.; Ledezma, C., Sr.

    2014-12-01

    In South America, the larger-magnitude seismic events originate in the subduction zone between the Nazca and South American plates, as opposed to crustal events. Crustal seismic events are important in areas very close to active faults; seismic hazard analyses incorporate crustal events up to a maximum distance from the site under study. To use crustal events in a seismic hazard analysis, attenuation relationships that represent the seismic behavior of the site under study are needed. Unfortunately, in South America the amount of compiled historical data for crustal events is not yet sufficient to derive a firm regional attenuation relationship. In the absence of attenuation relationships for crustal earthquakes in the region, the conventional approach is to adopt attenuation relationships from other regions that have large compiled datasets and seismic conditions similar to the site under study. This practice permits seismic hazard analysis with a certain margin of accuracy. In South American engineering practice, next-generation attenuation relationships (NGA-W) are used, among other alternatives, to incorporate the effect of crustal events in a seismic hazard analysis. In 2014, NGA-W Version 2 (NGA-W2) was presented, with a database containing information from Taiwan, Turkey, Iran, the USA, Mexico, Japan, and Alaska. This paper examines whether it is acceptable to use NGA-W2 in seismic hazard analyses in South America. To support the examination, response spectra predicted with NGA-W2 are compared with actual response spectra of crustal events from Argentina. The seismic data were gathered from equipment installed in the cities of Santiago, Chile, and Mendoza, Argentina.

  12. Revision of the Applicability of the NGA's in South America, Chile - Argentina.

    NASA Astrophysics Data System (ADS)

    Alvarado Lara, F., Sr.; Ledezma, C., Sr.

    2015-12-01

    In South America, the larger-magnitude seismic events originate in the subduction zone between the Nazca and South American plates, as opposed to crustal events. Crustal seismic events are important in areas very close to active faults; seismic hazard analyses incorporate crustal events up to a maximum distance from the site under study. To use crustal events in a seismic hazard analysis, attenuation relationships that represent the seismic behavior of the site under study are needed. Unfortunately, in South America the amount of compiled historical data for crustal events is not yet sufficient to derive a firm regional attenuation relationship. In the absence of attenuation relationships for crustal earthquakes in the region, the conventional approach is to adopt attenuation relationships from other regions that have large compiled datasets and seismic conditions similar to the site under study. This practice permits seismic hazard analysis with a certain margin of accuracy. In South American engineering practice, next-generation attenuation relationships (NGA-W) are used, among other alternatives, to incorporate the effect of crustal events in a seismic hazard analysis. In 2014, NGA-W Version 2 (NGA-W2) was presented, with a database containing information from Taiwan, Turkey, Iran, the USA, Mexico, Japan, and Alaska. This paper examines whether it is acceptable to use NGA-W2 in seismic hazard analyses in South America. To support the examination, response spectra predicted with NGA-W2 are compared with actual response spectra of crustal events from Argentina. The seismic data were gathered from equipment installed in the cities of Santiago, Chile, and Mendoza, Argentina.
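    The comparison of response spectra mentioned above requires computing a spectrum from each accelerogram. Below is a toy sketch of a pseudo-acceleration response spectrum via explicit time-stepping of a damped single-degree-of-freedom oscillator; it is not the authors' processing chain, and stability requires a time step well below the shortest period of interest.

```python
import numpy as np

def response_spectrum(acc, dt, periods, damping=0.05):
    """Pseudo-acceleration response spectrum of a ground acceleration record:
    for each period, integrate u'' = -ag - 2*z*wn*u' - wn^2*u explicitly and
    report wn^2 * max|u| (pseudo-spectral acceleration)."""
    sa = []
    for T in periods:
        wn = 2.0 * np.pi / T
        u_prev = u = 0.0
        umax = 0.0
        for ag in acc:
            # velocity approximated by a backward difference
            accel = -ag - 2.0 * damping * wn * (u - u_prev) / dt - wn * wn * u
            u_next = 2.0 * u - u_prev + dt * dt * accel
            u_prev, u = u, u_next
            umax = max(umax, abs(u))
        sa.append(wn * wn * umax)
    return np.array(sa)
```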

  13. Slope Stability Analysis In Seismic Areas Of The Northern Apennines (Italy)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lo Presti, D.; Fontana, T.; Marchetti, D.

    2008-07-08

    Several research works have been published on slope stability in northern Tuscany (central Italy), particularly in the seismic areas of Garfagnana and Lunigiana (Lucca and Massa-Carrara districts), aimed at analysing slope stability under static and dynamic conditions and at mapping the landslide hazard. In addition, in situ and laboratory investigations are available for the study area, thanks to the activities undertaken by the Tuscany Seismic Survey. Based on this wealth of information, the co-seismic stability of a few idealized slope profiles has been analysed by means of the limit equilibrium method (LEM, pseudo-static) and Newmark sliding block analysis (pseudo-dynamic). The analysis results give indications of the most appropriate seismic coefficient to use in pseudo-static analysis once an allowable permanent displacement has been established. These indications are commented on in the light of the Italian and European prescriptions for seismic stability analysis with the pseudo-static approach. The stability conditions obtained from these analyses could be used to define microzonation criteria for the study area.
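    The Newmark sliding block (pseudo-dynamic) analysis mentioned above can be illustrated with a minimal rigid-block integration; the yield acceleration `k_y` and the units below are placeholders, not values from the study.

```python
def newmark_displacement(acc, dt, k_y):
    """Newmark rigid-block sliding displacement: the block accelerates only
    while ground acceleration exceeds the yield acceleration k_y (same units
    as acc); sliding stops when the relative velocity returns to zero."""
    v = d = 0.0
    for a in acc:
        if v > 0.0 or a > k_y:
            v += (a - k_y) * dt
            v = max(v, 0.0)        # no back-sliding in this simple variant
            d += v * dt
    return d
```

    For a rectangular 1 s pulse of 2 m/s² against a 1 m/s² yield level, the block picks up ~0.5 m during the pulse and another ~0.5 m while decelerating, totalling about 1 m.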

  14. Spatial pattern recognition of seismic events in South West Colombia

    NASA Astrophysics Data System (ADS)

    Benítez, Hernán D.; Flórez, Juan F.; Duque, Diana P.; Benavides, Alberto; Lucía Baquero, Olga; Quintero, Jiber

    2013-09-01

    Recognition of seismogenic zones in geographical regions supports seismic hazard studies. This recognition is usually based on visual, qualitative, and subjective analysis of the data. Spatial pattern recognition provides a well-founded means to obtain relevant information from large amounts of data. The purpose of this work is to identify and classify spatial patterns in instrumental data from the South West Colombian seismic database. In this research, clustering tendency analysis validates whether the seismic database possesses a clustering structure, and a non-supervised fuzzy clustering algorithm creates groups of seismic events. Given the sensitivity of fuzzy clustering algorithms to the initial positions of centroids, we propose a centroid initialization methodology that generates partitions stable with respect to the initialization. As a result of this work, a public software tool provides the routines developed for the clustering methodology. The analysis of the seismogenic zones obtained reveals meaningful spatial patterns in South West Colombia. The clustering analysis provides a quantitative location and dispersion of the seismogenic zones that facilitates seismological interpretation of the seismic activity in South West Colombia.
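    Clustering tendency of the kind described is often quantified with the Hopkins statistic; a hedged sketch follows (the sampling scheme and sample size are assumptions, and this is not the authors' tool).

```python
import numpy as np

def hopkins_statistic(X, n_samples=50, seed=0):
    """Hopkins statistic on rows of X: ~0.5 for spatially random data,
    approaching 1 for strongly clustered data."""
    rng = np.random.default_rng(seed)
    n = min(n_samples, X.shape[0] - 1)
    lo, hi = X.min(axis=0), X.max(axis=0)
    uniform = rng.uniform(lo, hi, size=(n, X.shape[1]))
    idx = rng.choice(X.shape[0], n, replace=False)

    def _nnd(p, pts):
        return float(np.min(np.linalg.norm(pts - p, axis=1)))

    u = sum(_nnd(p, X) for p in uniform)                      # to real events
    w = sum(_nnd(X[i], np.delete(X, i, axis=0)) for i in idx) # within events
    return u / (u + w)
```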

  15. Signal-to-noise ratio application to seismic marker analysis and fracture detection

    NASA Astrophysics Data System (ADS)

    Xu, Hui-Qun; Gui, Zhi-Xian

    2014-03-01

    Seismic data with high signal-to-noise ratios (SNRs) are useful in reservoir exploration. Obtaining such data requires significant noise attenuation effort in seismic data processing, which is costly in material, human, and financial resources. We introduce a method for estimating and improving the SNR of seismic data, with the SNR calculated in the frequency domain. Furthermore, we optimize and discuss the critical parameters and the calculation procedure. We applied the proposed method to real data and found that the SNR is high at seismic markers and low in fracture zones. Consequently, the method can be used to extract detailed information about fracture zones that are inferred from structural analysis but not observed in conventional seismic data.
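    A simple frequency-domain SNR estimate, comparing mean spectral power in a signal band with that in a noise band, can be sketched as follows; this is an illustrative variant, not the paper's exact parameterization.

```python
import numpy as np

def spectral_snr(trace, fs, signal_band, noise_band):
    """SNR in dB from the mean power of the amplitude spectrum inside a
    signal frequency band versus a noise frequency band (bands in Hz)."""
    f = np.fft.rfftfreq(len(trace), d=1.0 / fs)
    power = np.abs(np.fft.rfft(trace)) ** 2
    sig = power[(f >= signal_band[0]) & (f < signal_band[1])].mean()
    noi = power[(f >= noise_band[0]) & (f < noise_band[1])].mean()
    return float(10.0 * np.log10(sig / noi))
```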

  16. Towards Improved Considerations of Risk in Seismic Design (Plinius Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Sullivan, T. J.

    2012-04-01

    The aftermath of recent earthquakes is a reminder that seismic risk is a very relevant issue for our communities. Implicit within the seismic design standards currently in place around the world is that minimum acceptable levels of seismic risk will be ensured through design in accordance with the codes. All the same, none of the design standards specify what the minimum acceptable level of seismic risk actually is. Instead, a series of deterministic limit states are set which engineers then demonstrate are satisfied for their structure, typically through the use of elastic dynamic analyses adjusted to account for non-linear response using a set of empirical correction factors. Since the early 1990s, the seismic engineering community has recognised numerous fundamental shortcomings of such seismic design procedures in modern codes. Deficiencies include the use of elastic dynamic analysis for the prediction of inelastic force distributions, the assignment of uniform behaviour factors for structural typologies irrespective of the structural proportions and expected deformation demands, and the assumption that the hysteretic properties of a structure do not affect the seismic displacement demands, amongst other things. In light of this, a number of possibilities have emerged for improved control of risk through seismic design, with several innovative displacement-based seismic design methods now well developed. For a specific seismic design intensity, such methods provide a more rational means of controlling the response of a structure to satisfy performance limit states. While the development of such methodologies does mark a significant step forward for the control of seismic risk, they do not, on their own, identify the seismic risk of a newly designed structure. In the U.S. a rather elaborate performance-based earthquake engineering (PBEE) framework is under development, with the aim of providing seismic loss estimates for new buildings. 
The PBEE framework consists of the following four main analysis stages: (i) probabilistic seismic hazard analysis, to give the mean occurrence rate of earthquake events having an intensity greater than a threshold value; (ii) structural analysis, to estimate the global structural response given a certain value of seismic intensity; (iii) damage analysis, in which fragility functions express the probability that a building component exceeds a damage state as a function of the global structural response; and (iv) loss analysis, in which the overall performance is assessed based on the damage state of all components. This final step gives estimates of the mean annual frequency with which various repair cost levels (or other decision variables) are exceeded. The realisation of this framework does suggest that risk-based seismic design is now possible. However, comparing current code approaches with the proposed PBEE framework, it becomes apparent that mainstream consulting engineers would face a steep learning curve in applying the new procedures in practice. With this in mind, it is proposed that simplified loss-based seismic design procedures are a logical means of helping the engineering profession transition from the largely deterministic seismic design procedures in current codes to more rational risk-based seismic design methodologies. Examples are provided to illustrate the likely benefits of adopting loss-based seismic design approaches in practice.
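    The four-stage chain above, hazard times fragility times consequence, can be reduced to a toy mean-annual-loss integration for a single component. This is a deliberately simplified stand-in for the PBEE framework; the bin handling and names are assumptions.

```python
import numpy as np

def mean_annual_loss(exceedance_rates, p_damage_given_im, loss_given_damage):
    """Toy PEER-style integration over intensity-measure (IM) bins:
    occurrence rate of each IM bin (difference of annual exceedance rates)
    times P[damage | IM] times the expected loss given damage."""
    rates = np.asarray(exceedance_rates, float)
    occ = -np.diff(np.append(rates, 0.0))     # per-bin occurrence rates
    e_loss = np.asarray(p_damage_given_im, float) * loss_given_damage
    return float(np.sum(occ * e_loss))
```

    Summing such contributions over all components, and over a grid of loss thresholds rather than a single expectation, yields the loss exceedance curves the framework targets.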

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murphy, J.R.; Marshall, M.E.; Barker, B.W.

    In situations where cavity decoupling of underground nuclear explosions is a plausible evasion scenario, comprehensive seismic monitoring of any eventual CTBT will require the routine identification of many small seismic events with magnitudes in the range 2.0 < m_b < 3.5. However, since such events are not expected to be detected teleseismically, their magnitudes will have to be estimated from regional recordings using seismic phases and frequency bands different from those employed in the teleseismic m_b scale generally used to specify monitoring capability. It is therefore necessary to establish the m_b equivalences of any selected regional magnitude measures in order to estimate the expected detection statistics and thresholds of proposed CTBT seismic monitoring networks. In the investigations summarized in this report, this has been accomplished through analyses of synthetic data obtained by theoretically scaling observed regional seismic data, recorded in Scandinavia and Central Asia from various tamped nuclear tests, to estimate the corresponding seismic signals expected from small cavity-decoupled nuclear tests at the same source locations.

  18. A Bayesian approach to the modelling of α Cen A

    NASA Astrophysics Data System (ADS)

    Bazot, M.; Bourguignon, S.; Christensen-Dalsgaard, J.

    2012-12-01

    Determining the physical characteristics of a star is an inverse problem: estimating the parameters of stellar structure and evolution models from certain observable quantities. We use a Bayesian approach to solve this problem for α Cen A, which allows us to incorporate prior information on the parameters to be estimated in order to better constrain the problem. Our strategy is based on a Markov chain Monte Carlo (MCMC) algorithm to estimate the posterior probability densities of the stellar parameters: mass, age, initial chemical composition, etc. We use the stellar evolutionary code ASTEC to model the star, constrained by both seismic and non-seismic observations. Several strategies were tested to fit these values, using either two or five free parameters in ASTEC. We are thus able to show evidence that MCMC methods become more efficient than classical grid-based strategies as the number of parameters increases. The results of our MCMC algorithm allow us to derive estimates of the stellar parameters with robust uncertainties, thanks to the statistical analysis of the posterior probability densities. We are also able to compute odds for the presence of a convective core in α Cen A; when core-sensitive seismic observational constraints are used, these can rise above ~40 per cent. Comparison with previous studies also indicates that these seismic constraints are of critical importance for our knowledge of the structure of this star.
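    The MCMC machinery involved can be illustrated with a one-dimensional random-walk Metropolis sampler on a toy posterior; the stellar case uses ASTEC-based likelihoods over several parameters, and this sketch shows only the sampling step.

```python
import math
import random

def metropolis(log_post, x0, step, n, seed=0):
    """Random-walk Metropolis: propose x' ~ N(x, step), accept with
    probability min(1, post(x')/post(x)); returns the chain of samples."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    chain = []
    for _ in range(n):
        xp = x + rng.gauss(0.0, step)
        lpp = log_post(xp)
        if math.log(rng.random()) < lpp - lp:
            x, lp = xp, lpp
        chain.append(x)
    return chain
```

    In practice one discards an initial burn-in segment and summarizes the remaining samples (means, credible intervals, odds between nested models).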

  19. Microearthquake sequences along the Irpinia normal fault system in Southern Apennines, Italy

    NASA Astrophysics Data System (ADS)

    Orefice, Antonella; Festa, Gaetano; Alfredo Stabile, Tony; Vassallo, Maurizio; Zollo, Aldo

    2013-04-01

    Microearthquakes reflect a continuous readjustment of tectonic structures, such as faults, under the action of local and regional stress fields. Low-magnitude seismicity in the vicinity of active fault zones may provide insights into the mechanics of fault systems during the inter-seismic period and shed light on the role of fluids and other physical parameters in promoting or disfavoring the nucleation of larger events in the same area. Here we analyzed several earthquake sequences concentrated in very limited regions along the 1980 Irpinia earthquake fault zone (Southern Italy), a complex system characterized by a normal stress regime and monitored by the dense, multi-component, high-dynamic-range seismic network ISNet (Irpinia Seismic Network). For one sequence, the May 2008 Laviano swarm, we performed accurate absolute and relative locations and estimated source parameters and scaling laws, which were compared with standard stress drops computed for the area. Additionally, from empirical Green's function (EGF) deconvolution, we computed a slip model for the mainshock and investigated the space-time evolution of the events in the sequence to reveal possible interactions among earthquakes. Through large-scale cross-correlation analysis, scanning the continuous recordings with master events, we also built a catalog of repeating earthquakes and recognized several co-located sequences. For these events, we analyzed the statistical properties, locations, source parameters, and space-time evolution, with the aim of inferring the processes that control the occurrence and size of microearthquakes in a swarm.
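    Repeating-earthquake detection of the kind described rests on normalized waveform cross-correlation against master events; a minimal sketch (threshold choice and preprocessing are omitted, and this is not the authors' pipeline):

```python
import numpy as np

def max_norm_xcorr(a, b):
    """Maximum normalized cross-correlation between two equal-length
    waveforms; repeated (co-located) events typically score close to 1
    at the best-aligned lag."""
    a = np.asarray(a, float)
    b = np.asarray(b, float)
    za = (a - a.mean()) / (a.std() * len(a))   # folds the 1/N into one trace
    zb = (b - b.mean()) / b.std()
    return float(np.max(np.correlate(za, zb, mode="full")))
```

    Candidate detections above a correlation threshold (commonly ~0.9 in repeating-earthquake studies) are then grouped into co-located families.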

  20. Investigation on the Possible Relationship between Magnetic Pulsations and Earthquakes

    NASA Astrophysics Data System (ADS)

    Jusoh, M.; Liu, H.; Yumoto, K.; Uozumi, T.; Takla, E. M.; Yousif Suliman, M. E.; Kawano, H.; Yoshikawa, A.; Asillam, M.; Hashim, M.

    2012-12-01

    The sun is the main source of energy to the solar system, and it plays a major role in affecting the ionosphere, atmosphere and the earth surface. The connection between solar wind and the ground magnetic pulsations has been proven empirically by several researchers previously (H. J. Singer et al., 1977, E. W. Greenstadt, 1979, I. A. Ansari 2006 to name a few). In our preliminary statistical analysis on relationship between solar and seismic activities (Jusoh and Yumoto, 2011, Jusoh et al., 2012), we observed a high possibility of solar-terrestrial coupling. We observed high tendency of earthquakes to occur during lower phase solar cycles which significantly related with solar wind parameters (i.e solar wind dynamic pressure, speed and input energy). However a clear coupling mechanism was not established yet. To connect the solar impact on seismicity, we investigate the possibility of ground magnetic pulsations as one of the connecting agent. In our analysis, the recorded ground magnetic pulsations are analyzed at different ranges of ultra low frequency; Pc3 (22-100 mHz), Pc4 (6.7-22 mHz) and Pc5 (1.7-6.7 mHz) with the occurrence of local earthquake events at certain time periods. This analysis focuses at 2 different major seismic regions; north Japan (mid latitude) and north Sumatera, Indonesia (low latitude). Solar wind parameters were obtained from the Goddard Space Flight Center, NASA via the OMNIWeb Data Explorer and the Space Physics Data Facility. Earthquake events were extracted from the Advanced National Seismic System (ANSS) database. The localized Pc3-Pc5 magnetic pulsations data were extracted from Magnetic Data Acquisition System (MAGDAS)/Circum Pan Magnetic Network (CPMN) located at Ashibetsu (Japan); for earthquakes monitored at north Japan and Langkawi (Malaysia); for earthquakes observed at north Sumatera. This magnetometer arrays has established by International Center for Space Weather Science and Education, Kyushu University, Japan. 
From these results, we observe significant correlations between ground magnetic pulsations and solar wind speed at different earthquake epicenter depths. The details of the analysis will be discussed in the presentation.

  1. Seismogenic response to fluid injection operations in Oklahoma and California: Implications for crustal stresses

    NASA Astrophysics Data System (ADS)

    Goebel, T.; Aminzadeh, F.

    2015-12-01

    The seismogenic response to induced pressure changes provides insight into the proximity to failure of faults close to injection sites. Here, we examine possible seismicity rate changes in response to wastewater disposal and enhanced oil recovery operations in hydrocarbon basins in California and Oklahoma. We test whether a statistically significant rate increase exists within these areas and determine the corresponding timing and location based on nonparametric modeling of background seismicity rates. Annual injection volumes have increased monotonically since ~2001 in California and ~1998 in Oklahoma. While OK experienced a recent surge in seismic activity that exceeded the 95% confidence limit of a stationary Poisson process in ~2010, seismicity in CA showed no increase in background rates between 1980 and 2014. A systematic analysis of frequency-magnitude distributions (FMDs) of likely induced earthquakes in OK indicates that the FMDs are depleted in large-magnitude events. Seismicity in CA hydrocarbon basins, on the other hand, shows Gutenberg-Richter-type FMDs with b~1. Moreover, earthquakes and injection operations occur preferentially in distinct areas in CA, whereas in OK earthquakes occur closer to injection wells than expected from a random uniform process. To test whether injection operations may be responsible for the strongly different seismicity characteristics in CA and OK, we compare overall well density, wellhead pressures, peak and cumulative rates, as well as injection depths. We find that average injection rates, pressures and volumes are comparable between CA and OK, and that injection occurs on average 0.5 km deeper in CA than in OK. Thus, the operational parameters tested here cannot easily explain the vastly different seismogenic response to injection operations in CA and OK, and may only be of secondary importance for the resulting earthquake activity. 
The potential to induce earthquakes by fluid injection operations is likely controlled by the specific geologic setting and stress state on nearby faults.
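The frequency-magnitude comparison in this abstract rests on Gutenberg-Richter b-value estimation. A minimal sketch of the standard maximum-likelihood estimator (Aki, 1965) is shown below on a synthetic catalog; the function name, the synthetic catalog, and the chosen completeness magnitude are illustrative assumptions, not values from the study.

```python
import math
import random

def aki_b_value(mags, mc, dm=0.0):
    """Maximum-likelihood b-value (Aki, 1965) for magnitudes >= mc;
    dm is the catalog's magnitude bin width (half-bin correction)."""
    m = [x for x in mags if x >= mc]
    mean_m = sum(m) / len(m)
    return math.log10(math.e) / (mean_m - (mc - dm / 2.0))

# Synthetic Gutenberg-Richter catalog with b = 1 (illustration only):
# magnitudes above Mc follow an exponential with rate b*ln(10)
random.seed(42)
mags = [2.0 + random.expovariate(math.log(10)) for _ in range(5000)]
b_est = aki_b_value(mags, mc=2.0)
```

A depletion of large-magnitude events, as reported for OK, would show up as a systematic departure from the straight Gutenberg-Richter line that this estimator assumes.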

  2. Earthquake Occurrence in Bangladesh and Surrounding Region

    NASA Astrophysics Data System (ADS)

    Al-Hussaini, T. M.; Al-Noman, M.

    2011-12-01

    The collision of the northward-moving Indian plate with the Eurasian plate is the cause of frequent earthquakes in the region comprising Bangladesh and neighbouring India, Nepal and Myanmar. Historical records indicate that Bangladesh was affected by five major earthquakes of magnitude greater than 7.0 (Richter scale) between 1869 and 1930. This paper presents statistical observations of earthquake occurrence as basic groundwork for seismic hazard assessment of the region. An up-to-date catalogue of earthquakes in the region bounded by 17°-30°N and 84°-97°E, from the historical period to 2010, is derived from various reputable international sources, including the ISC, IRIS, Indian agencies and available publications. The catalogue is carefully scrutinized to remove duplicate or uncertain events. Earthquake magnitudes range from 1.8 to 8.1, and relationships between different magnitude scales have been studied. Aftershocks are removed from the catalogue using magnitude-dependent space and time windows. The mainshock data are then analyzed to obtain completeness periods for different magnitudes by evaluating their temporal homogeneity. Spatial and temporal distributions of earthquakes, magnitude-depth histograms and other statistical analyses are produced to characterize the distribution of seismic activity in the region.
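The magnitude-dependent space/time windowing mentioned above is commonly implemented with the Gardner-Knopoff (1974) windows. The sketch below uses one widely quoted parameterization of those windows on a toy flat-earth catalog; the abstract does not state which windows were actually used, and the catalog, coordinates and function names here are illustrative assumptions.

```python
import math

def gk_windows(m):
    # Gardner-Knopoff (1974) space/time windows, a common parameterization
    d_km = 10 ** (0.1238 * m + 0.983)
    t_days = 10 ** (0.032 * m + 2.7389) if m >= 6.5 else 10 ** (0.5409 * m - 0.547)
    return d_km, t_days

def decluster(events):
    """events: list of (t_days, x_km, y_km, mag) on toy flat-earth coordinates.
    Larger shocks claim later, nearby, smaller events as aftershocks."""
    keep = [True] * len(events)
    order = sorted(range(len(events)), key=lambda i: -events[i][3])
    for i in order:
        if not keep[i]:
            continue
        ti, xi, yi, mi = events[i]
        d_km, t_days = gk_windows(mi)
        for j in range(len(events)):
            if j == i or not keep[j]:
                continue
            tj, xj, yj, mj = events[j]
            if mj <= mi and 0 <= tj - ti <= t_days and math.hypot(xj - xi, yj - yi) <= d_km:
                keep[j] = False
    return [e for k, e in zip(keep, events) if k]

# Toy catalog: an M6 with two aftershocks, then a distant M5 with one aftershock
cat = [(0.0, 0.0, 0.0, 6.0), (1.0, 5.0, 0.0, 4.0), (2.0, 8.0, 2.0, 3.5),
       (400.0, 300.0, 300.0, 5.0), (401.0, 302.0, 301.0, 3.0)]
mains = decluster(cat)
```

This simplified version removes only events following a mainshock; production declustering codes also handle foreshocks and catalog edge effects.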

  3. The 2013–2016 induced earthquakes in Harper and Sumner Counties, southern Kansas

    USGS Publications Warehouse

    Rubinstein, Justin L.; Ellsworth, William L.; Dougherty, Sara L.

    2018-01-01

    We examine the first four years (2013–2016) of the ongoing seismicity in southern Kansas using high‐precision locations derived from a local seismometer network. The earthquakes occur almost exclusively in the shallow crystalline basement, below the wastewater injection horizon of the Arbuckle Group at the base of the sedimentary section. Multiple lines of evidence lead us to conclude that disposal of wastewater from the production of oil and gas by deep injection is the probable cause for the surge of seismicity that began in 2013. First, the seismicity correlates in space and time with the injection. We observe increases in seismicity subsequent to increases in injection and decreases in seismicity in response to decreases in injection. Second, the earthquake‐rate change is statistically improbable to be of natural origin. From 1974 through the time of the injection increase in 2012, no ML 4 or larger earthquakes occurred in the study area, while six occurred between 2012 and 2016. The probability of this rate change occurring randomly is ∼0.16%. Third, the other potential industrial drivers of seismicity (hydraulic fracturing and oil production) do not correlate in space or time with seismicity. Local geological conditions are important in determining whether injection operations will induce seismicity, as shown by absence of seismicity near the largest injection operations in the southwest portion of our study area. In addition to local operations, the presence of seismicity 10+ km from large injection wells indicates that regional injection operations also need to be considered to understand the effects of injection on seismicity.
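A rate-change argument of the kind made above ("six events where the background predicts almost none") is typically quantified with the Poisson survival function. The sketch below computes P(N ≥ 6) for a hypothetical background rate; the rate used here is an illustrative number and does not reproduce the study's ∼0.16% figure, which depends on their actual rate estimate.

```python
import math

def poisson_sf(k, lam):
    """P(N >= k) for N ~ Poisson(lam)."""
    return 1.0 - sum(math.exp(-lam) * lam ** i / math.factorial(i) for i in range(k))

# Hypothetical background: ~0.25 expected M >= 4 events per 4-year window
# (an illustrative number, not the rate estimated in the study)
p = poisson_sf(6, 0.25)
```

A very small p, as here, is the sense in which an observed surge is "statistically improbable to be of natural origin" under a stationary Poisson null.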

  4. Pseudo-dynamic source characterization accounting for rough-fault effects

    NASA Astrophysics Data System (ADS)

    Galis, Martin; Thingbaijam, Kiran K. S.; Mai, P. Martin

    2016-04-01

    Broadband ground-motion simulations, ideally for frequencies up to ~10 Hz or higher, are important for earthquake engineering, for example in seismic hazard analysis for critical facilities. An issue with such simulations is the realistic generation of the radiated wave field in the desired frequency range. Numerical simulations of dynamic ruptures propagating on rough faults suggest that fault roughness is necessary for realistic high-frequency radiation. However, simulations of dynamic ruptures are too expensive for routine applications; therefore, simplified synthetic kinematic models are often used. These are usually based on rigorous statistical analysis of rupture models inferred by inversions of seismic and/or geodetic data. However, due to the limited resolution of the inversions, such models are valid only in the low-frequency range. In addition to slip, parameters such as rupture-onset time, rise time and source time functions are needed for a complete spatiotemporal characterization of the earthquake rupture, but these parameters are poorly resolved in source inversions. To obtain a physically consistent quantification of these parameters, we simulate and analyze spontaneous dynamic ruptures on rough faults. First, by analyzing the impact of fault roughness on the rupture and seismic radiation, we develop equivalent planar-fault kinematic analogues of the dynamic ruptures. Next, we investigate the spatial interdependencies between the source parameters to allow consistent modeling that emulates the observed behavior of dynamic ruptures, capturing the rough-fault effects. Based on these analyses, we formulate a framework for a pseudo-dynamic source model that is physically consistent with dynamic ruptures on rough faults.

  5. Structural vibration passive control and economic analysis of a high-rise building in Beijing

    NASA Astrophysics Data System (ADS)

    Chen, Yongqi; Cao, Tiezhu; Ma, Liangzhe; Luo, Chaoying

    2009-12-01

    Performance analysis of the Pangu Plaza under earthquake and wind loads is described in this paper. The plaza is a 39-story steel high-rise building, 191 m high, located in Beijing close to the 2008 Olympic main stadium. It has both fluid viscous dampers (FVDs) and buckling restrained braces or unbonded brace (BRB or UBB) installed. A repeated iteration procedure in its design and analysis was adopted for optimization. Results from the seismic response analysis in the horizontal and vertical directions show that the FVDs are highly effective in reducing the response of both the main structure and the secondary system. A comparative analysis of structural seismic performance and economic impact was conducted using traditional methods, i.e., increased size of steel columns and beams and/or use of an increased number of seismic braces versus using FVD. Both the structural response and economic analysis show that using FVD to absorb seismic energy not only satisfies the Chinese seismic design code for a “rare” earthquake, but is also the most economical way to improve seismic performance both for one-time direct investment and long term maintenance.

  6. Improvements of the Regional Seismic network of Northwestern Italy in the framework of ALCoTra program activities

    NASA Astrophysics Data System (ADS)

    Bosco, Fabrizio

    2014-05-01

    Arpa Piemonte (Regional Agency for Environmental Protection), in partnership with the University of Genoa, manages the regional seismic network, which is part of the Regional Seismic network of Northwestern Italy (RSNI). The network has operated since the 1980s and, over the years, has developed in technological capability, analysis procedures and geographical coverage. In recent years in particular, the network has been further enhanced through the integration of Swiss and French stations installed in the cross-border area. The environmental context enables the installation of sensors at sites with good conditions as regards ambient noise and limited local amplification effects (as confirmed by PSD analysis, signal quality monitoring via PQLX, and H/V analysis). The instrumental equipment consists of broadband and very-broadband sensors (Nanometrics Trillium 40 and 240) and different technological solutions for real-time signal transmission (cable, satellite, GPRS), according to the local environment, with redundant connections and experimental innovative systems. Digital transmission and acquisition systems operate through standard protocols (Nanometrics, SeedLink), with redundancy across data centers (Genoa, Turin, Rome). Both real-time automatic and manual procedures are in use for signal analysis (event detection, picking, determination of focal parameters and ground shaking). In the framework of the cross-border cooperation program ALCoTra (http://www.interreg-alcotra.org), approved by the European Commission, several projects have been developed to improve the performance of the seismic monitoring systems used by the partners (Arpa Piemonte, Aosta Valley Region, CNRS, Joseph Fourier University). The cross-border context highlights, first of all, the importance of signal sharing (from 14 to 23 stations in the narrow French-Italian border area, an increase of over 50%) and of coordination in planning and installing new stations in the area. 
In the ongoing ALCoTra project "CASSAT" (Coordination and Analysis of Alpine Trans-border Seismic Surveillance), we evaluate the improvement of monitoring system performance in terms of localization precision and number of detections. Furthermore, we update the procedures for producing ground shaking maps, with the installation of accelerometers and the integration of newly available data for site-effects assessment (VS30 map, FA-VS30 correlations from numerical simulations of seismic response), determined for the specific regional context from geophysical survey data and geological analysis. As a consequence of the increase in available data from new stations and recently recorded events, a new local magnitude scaling law is calibrated for the area. We also develop a parametric methodology to improve the network's real-time localization procedures in Northwestern Italy. The area, surrounded by the Western Alps and Northern Apennines, presents a complex system of lithospheric structures characterized by strong heterogeneities in various physical parameters (Ivrea Body, subducting European lithosphere, Ligurian Sea Moho, Po Valley deposits). We work with a localization algorithm (Hypoinverse-2000) suitable for such a heterogeneous context, adopting multi-1D crustal velocity models linked to epicentral coordinates. In this analysis, we first build velocity models by integrating several available geophysical and geo-structural data sets; we then jointly test both the models and the algorithm parameters with specifically developed automatic iterative procedures, using batch scripting, database, GIS and statistical analysis tools.

  7. Analysis of the Earthquake Impact towards water-based fire extinguishing system

    NASA Astrophysics Data System (ADS)

    Lee, J.; Hur, M.; Lee, K.

    2015-09-01

    Fire-extinguishing systems installed in buildings are now subject to separate seismic performance requirements: a water-based extinguishing system should maintain its function during an earthquake, before any collapse of the building. In particular, automatic sprinkler systems must remain leak-tight, with no damage to the piping, even after a major earthquake. In this study, shake-table experiments were conducted to assess the impact of earthquakes on the piping of water-based fire-extinguishing systems installed in buildings. Test structures were prepared with the seismic construction of the piping applied step by step, and the earthquake response of the extinguishing-system piping in the building was measured. The magnitude of the acceleration imposed by the shaking and the resulting displacements were measured and compared with the table input, thereby identifying where seismic reinforcement of the extinguishing-water piping is needed. Seismic design categories (SDCs) are defined for four groups of building structures designed to the Korean seismic criteria (KBC 2009), according to occupancy importance group and seismic intensity. The analysis suggests that, in an actual earthquake, existing fire-fighting facilities in buildings of seismic design categories A and B may retain their seismic performance, whereas for buildings in categories C and D, seismic retrofit design is required to preserve the extinguishing function at the necessary level.

  8. Multi-Decadal analysis of Global Trends in Microseism Intensity: A Proxy for Changes in Extremal Storm Activity and Oceanic Wave State

    NASA Astrophysics Data System (ADS)

    Anthony, R. E.; Aster, R. C.; Rowe, C. A.

    2016-12-01

    The Earth's seismic noise spectrum features two globally ubiquitous peaks near 8 and 16 s periods (the secondary and primary bands) that arise when storm-generated ocean gravity waves are converted to seismic energy, predominantly Rayleigh waves. Because of its regionally integrative nature, microseism intensity and other seismographic data from long-running sites can provide useful proxies for wave state. Expanding an earlier study of global microseism trends (Aster et al., 2010), we analyze digitally archived, up-to-date (through late 2016) multi-decadal seismic data from stations of global seismographic networks to characterize the spatiotemporal evolution of wave climate over the past >20 years. The IRIS Noise Toolkit (Bahavar et al., 2013) is used to produce ground-motion power spectral density (PSD) estimates in 3-hour overlapping time series segments. The result of this effort is a longer-duration and more broadly geographically distributed PSD database than attained in previous studies, particularly for the primary microseism band. Integrating power within the primary and secondary microseism bands enables regional characterization of spatially integrated trends in wave state and in storm-event statistics at varying thresholds. The results of these analyses are then interpreted within the context of recognized modes of atmospheric variability, including the particularly strong 2015-2016 El Niño. We note a number of statistically significant increasing trends in both raw microseism power and storm activity at multiple stations in the Northwest Atlantic and Southeast Pacific, consistent with generally increased wave heights and storminess in these regions. Such trends in wave activity have the potential to significantly influence coastal environments, particularly under rising global sea levels.
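The band-integration step described above (total power within the primary or secondary microseism band from a PSD) can be sketched as follows. The function name, the toy flat spectrum, and the exact band edges are illustrative assumptions; the study's PSDs come from the IRIS Noise Toolkit, not from this snippet.

```python
import math

def band_power_db(freqs_hz, psd_db, f_lo, f_hi):
    """Integrate an acceleration PSD given in dB (rel. 1 (m/s^2)^2/Hz) over
    [f_lo, f_hi] by the trapezoidal rule in linear power units; returns dB."""
    pts = [(f, 10 ** (p / 10.0)) for f, p in zip(freqs_hz, psd_db) if f_lo <= f <= f_hi]
    power = sum(0.5 * (p0 + p1) * (f1 - f0)
                for (f0, p0), (f1, p1) in zip(pts, pts[1:]))
    return 10.0 * math.log10(power)

# Toy flat spectrum at -140 dB sampled from 0.05 to 0.15 Hz;
# integrate across a nominal secondary-microseism band (16 s to 8 s period)
freqs = [round(0.05 + 0.005 * i, 3) for i in range(21)]
psd = [-140.0] * len(freqs)
p_band = band_power_db(freqs, psd, 1.0 / 16.0, 1.0 / 8.0)
```

Repeating this integration per 3-hour PSD segment yields the band-power time series whose long-term trends the study examines.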

  9. Natural time analysis and Tsallis non-additive entropy statistical mechanics.

    NASA Astrophysics Data System (ADS)

    Sarlis, N. V.; Skordas, E. S.; Varotsos, P.

    2016-12-01

    Upon analyzing seismic data in natural time and employing a sliding natural time window comprising the number of events that would occur in a few months, it has recently been found [1] that a precursory Seismic Electric Signals activity [2] initiates almost simultaneously with the appearance of a minimum in the fluctuations of the order parameter of seismicity [3]. Such minima have been identified [4] during periods in which the magnitude time series exhibits long-range correlations [5], a few months before all earthquakes of magnitude 7.6 or larger that occurred in the entire Japanese area from 1 January 1984 to 11 March 2011 (the day of the M9 Tohoku-Oki earthquake). Before and after these minima, characteristic changes of the temporal correlations between earthquake magnitudes are observed that cannot be captured by Tsallis non-additive entropy statistical mechanics, in the frame of which it has been suggested that kappa distributions arise [6]. Here we extend the study of the existence of such minima to a large area that includes the Aegean Sea and its surroundings, whose seismotectonics [7] differ in general from those of the entire Japanese area. References: [1] P. A. Varotsos et al., Tectonophysics 589 (2013) 116. [2] P. Varotsos and M. Lazaridou, Tectonophysics 188 (1991) 321. [3] P. A. Varotsos et al., Phys. Rev. E 72 (2005) 041103. [4] N. V. Sarlis et al., Proc. Natl. Acad. Sci. USA 110 (2013) 13734. [5] P. A. Varotsos, N. V. Sarlis, and E. S. Skordas, J. Geophys. Res. Space Physics 119 (2014) 9192, doi:10.1002/2014JA020580. [6] G. Livadiotis and D. J. McComas, J. Geophys. Res. 114 (2009) A11105, doi:10.1029/2009JA014352. [7] S. Uyeda et al., Tectonophysics 304 (1999) 41.
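In natural time analysis, the k-th event of N is assigned natural time χ_k = k/N, and the order parameter studied above is built from the variance κ1 = ⟨χ²⟩ − ⟨χ⟩², with averages weighted by the normalized event energies p_k. A minimal sketch of κ1, assuming a toy energy series (the function name and inputs are illustrative, not the study's data):

```python
def kappa1(energies):
    """Variance of natural time chi_k = k/N weighted by normalized energies
    p_k = E_k / sum(E) (Varotsos et al.): kappa1 = <chi^2> - <chi>^2."""
    n = len(energies)
    total = sum(energies)
    p = [e / total for e in energies]
    chi = [(k + 1) / n for k in range(n)]
    m1 = sum(pk * ck for pk, ck in zip(p, chi))
    m2 = sum(pk * ck * ck for pk, ck in zip(p, chi))
    return m2 - m1 * m1

# For equal event energies, kappa1 approaches 1/12 ~ 0.0833 for large N
k1 = kappa1([1.0] * 1000)
```

In the natural time literature, fluctuations of κ1 computed in sliding windows over the seismic catalog form the time series in which the precursory minima are sought.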

  10. New discovered Izmir and Busan Mud Volcanoes and Application of Seismic Attributes and AVO Analysis in the Easternmost Black Sea.

    NASA Astrophysics Data System (ADS)

    Okay, S.; Cifci, G.; Ozel, S.; Atgin, O.; Ozel, O.; Barin, B.; Er, M.; Dondurur, D.; Kucuk, M.; Gurcay, S.; Choul Kim, D.; Sung-Ho, B.

    2012-04-01

    The continental margins of the Black Sea have recently become important for their gas content. No scientific surveys had been performed offshore of the Trabzon-Giresun area apart from oil-company exploration; this is the first survey carried out there. 1700 km of high-resolution multichannel seismic and chirp data were collected simultaneously onboard R/V K. Piri Reis. The seismic data reveal BSRs, bright spots and acoustic masking, especially in the eastern part of the survey area. The survey area in the Eastern Black Sea includes the continental slope, apron and deep basin. Two mud volcanoes were discovered and named Busan and Izmir. The observed fold belt is believed to be the main driving force for the growth of the mud volcanoes; faults have developed at the flanks of the diapiric uplift. Seismic attributes and AVO analysis were applied to nine seismic sections with probable gassy sediments and BSR zones. In the seismic attribute analysis, high-amplitude horizons with reverse polarity are observed in the envelope and apparent-polarity sections, together with low frequencies in the instantaneous-frequency sections. These analyses support the existence of gas accumulations in the sediments. AVO cross-plotting and gradient analysis show a Class 1 AVO anomaly, indicating gas in the sediments. Keywords: BSR, bright spot, mud volcano, seismic attributes, AVO

  11. Yield Estimation for Semipalatinsk Underground Nuclear Explosions Using Seismic Surface-wave Observations at Near-regional Distances

    NASA Astrophysics Data System (ADS)

    Adushkin, V. V.

    A statistical procedure is described for estimating the yields of underground nuclear tests at the former Soviet Semipalatinsk test site using the peak amplitudes of short-period surface waves observed at near-regional distances (Δ < 150 km) from these explosions. This methodology is then applied to data recorded from a large sample of the Semipalatinsk explosions, including the Soviet JVE explosion of September 14, 1988, and it is demonstrated that it provides seismic estimates of explosion yield which are typically within 20% of the yields determined for these same explosions using more accurate, non-seismic techniques based on near-source observations.

  12. Temporal static stress drop variations due to injection activity at The Geysers geothermal field, California

    NASA Astrophysics Data System (ADS)

    Staszek, M.; Orlecka-Sikora, B.; Leptokaropoulos, K.; Kwiatek, G.; Martínez-Garzón, P.

    2017-07-01

    We use a high-quality data set from the NW part of The Geysers geothermal field to determine the statistical significance of temporal static stress drop variations and their relation to injection rate changes. We use a group of 322 seismic events which occurred in the proximity of the Prati-9 and Prati-29 injection wells to examine the influence of parameters such as moment magnitude, focal mechanism, hypocentral depth, and normalized hypocentral distance from the open-hole sections of the injection wells on static stress drop changes. Our results indicate that (1) static stress drop variations in time are statistically significant, and (2) statistically significant static stress drop changes are inversely related to injection rate fluctuations. These results suggest that the static stress drop of seismic events is influenced by pore pressure under conditions of underground fluid injection, and depends on the effective normal stress and the strength of the medium.
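For context, static stress drop is commonly estimated from seismic moment and source radius with the circular-crack (Eshelby) relation Δσ = (7/16)·M0/r³, with M0 from moment magnitude via the Hanks-Kanamori scale. The sketch below applies these two textbook formulas; the example magnitude and source radius are hypothetical, and the study's actual spectral-fitting procedure is not reproduced here.

```python
import math

def moment_from_mw(mw):
    """Seismic moment in N*m from moment magnitude (Hanks & Kanamori, 1979)."""
    return 10 ** (1.5 * mw + 9.1)

def static_stress_drop(mw, radius_m):
    """Circular-crack estimate (Eshelby): delta_sigma = (7/16) * M0 / r^3, in Pa."""
    return (7.0 / 16.0) * moment_from_mw(mw) / radius_m ** 3

# e.g. a hypothetical Mw 1.5 microearthquake with a 50 m source radius
ds = static_stress_drop(1.5, 50.0)   # Pa; here a few tenths of an MPa
```

Because Δσ scales with 1/r³, small uncertainties in the corner-frequency-derived radius dominate the scatter, which is why significance testing of temporal variations, as in this study, matters.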

  13. Research on the spatial analysis method of seismic hazard for island

    NASA Astrophysics Data System (ADS)

    Jia, Jing; Jiang, Jitong; Zheng, Qiuhong; Gao, Huiying

    2017-05-01

    Seismic hazard analysis (SHA) is a key component of earthquake disaster prevention for island engineering: at the micro scale, its results provide parameters for seismic design, while at the macro scale it is prerequisite work for the earthquake and comprehensive disaster prevention components of island conservation planning, during the exploitation and construction of both inhabited and uninhabited islands. Existing seismic hazard analysis methods are compared with respect to their application, and their applicability and limitations for islands are analysed. A specialized spatial analysis method of seismic hazard for islands (SAMSHI) is then proposed to support further earthquake disaster prevention planning, based on spatial analysis tools in GIS and a fuzzy comprehensive evaluation model. The basic spatial database of SAMSHI includes fault data, historical earthquake records, geological data and Bouguer gravity anomaly data, which are the data sources for the 11 indices of the fuzzy comprehensive evaluation model; these indices are calculated by a spatial analysis model built on ArcGIS's ModelBuilder platform.
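The fuzzy comprehensive evaluation step can be sketched as a weighted combination of a membership matrix: each index i gets a weight w_i, each index is rated against hazard grades via memberships r_ij, and the grade vector is b_j = Σ_i w_i·r_ij. The three indices, weights and membership values below are hypothetical placeholders, not the 11 indices or weights of SAMSHI.

```python
def fuzzy_evaluate(weights, membership):
    """Weighted fuzzy comprehensive evaluation: b_j = sum_i w_i * r_ij."""
    return [sum(w * row[j] for w, row in zip(weights, membership))
            for j in range(len(membership[0]))]

# Three hypothetical indices (e.g. fault density, historical seismicity,
# gravity-anomaly gradient) rated against three hazard grades (low, med, high)
w = [0.5, 0.3, 0.2]
r = [[0.1, 0.3, 0.6],
     [0.2, 0.5, 0.3],
     [0.6, 0.3, 0.1]]
grade = fuzzy_evaluate(w, r)
# the largest component of `grade` indicates the evaluated hazard grade
```

In a GIS implementation such as the one described, this combination would be evaluated per map cell, with the memberships derived from the spatial layers.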

  14. Spatial Temporal Analysis Of Mine-induced Seismicity

    NASA Astrophysics Data System (ADS)

    Fedotova, I. V.; Yunga, S. L.

    The results of an analysis of the influence of mine-induced seismicity on the state of stress of a rock mass are presented. A spatial-temporal analysis of the influence of mass explosions on rock-mass deformation was carried out for the Yukspor wing of the mine field of the Joined Kirovsk mine of JSC "Apatite". The influence of mass explosions on the massif was estimated, first, from the parameters of the natural seismic regime and, second, by taking into account changes in seismic energy release. After long series of explosions, variations in the average number of seismic events were recorded. It is shown that, as the volume of rock involved in deformation increases, the energy released by seismic events and the characteristic time intervals of their preparation also vary. At the same time, the mechanism of failure changes as well: from shear-separation failure to failure within a quasi-solid heterogeneous massif (in oxidized zones and zones of reactivated faults). Analysis of a database of massif seismicity from 1993 to 1999 confirmed that the response of the massif to explosions is connected to the stress-deformation state of the massif and the parameters of the mine workings. Analysis of the spatial-temporal distribution of hypocenters of seismic events made it possible to identify the migration of active zones of failure after mass explosions. This research was supported by the Russian Foundation for Basic Research, projects 00-05-64758 and 01-05-65340.

  15. Regional Observation of Seismic Activity in Baekdu Mountain

    NASA Astrophysics Data System (ADS)

    Kim, Geunyoung; Che, Il-Young; Shin, Jin-Soo; Chi, Heon-Cheol

    2015-04-01

    Seismic unrest in the Baekdu Mountain area, between North Korea and the Northeast China region, has drawn the attention of the geological research community in Northeast Asia because of the mountain's historical and cultural importance. Seismic bulletins show that the level of seismic activity in the area is higher than that of Jilin Province in Northeast China. Local volcanic observations showed symptoms of magmatic unrest in the period between 2002 and 2006. Regional seismic data have been used to analyze the seismic activity of the area, and this analysis allows the activity to be differentiated from other seismic phenomena in the region.

  16. An alternative approach to probabilistic seismic hazard analysis in the Aegean region using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Weatherill, Graeme; Burton, Paul W.

    2010-09-01

    The Aegean is the most seismically active and tectonically complex region in Europe. Damaging earthquakes have occurred here throughout recorded history, often resulting in considerable loss of life. The Monte Carlo method of probabilistic seismic hazard analysis (PSHA) is used to determine the level of ground motion likely to be exceeded in a given time period. Multiple random simulations of seismicity are generated to calculate, directly, the ground motion for a given site. Within the seismic hazard analysis we explore the impact of different seismic source models, incorporating both uniform zones and distributed seismicity. A new, simplified, seismic source model, derived from seismotectonic interpretation, is presented for the Aegean region. This is combined into the epistemic uncertainty analysis alongside existing source models for the region, and models derived by a K-means cluster analysis approach. Seismic source models derived using the K-means approach offer a degree of objectivity and reproducibility into the otherwise subjective approach of delineating seismic sources using expert judgment. Similar review and analysis is undertaken for the selection of peak ground acceleration (PGA) attenuation models, incorporating into the epistemic analysis Greek-specific models, European models and a Next Generation Attenuation model. Hazard maps for PGA on a "rock" site with a 10% probability of being exceeded in 50 years are produced and different source and attenuation models are compared. These indicate that Greek-specific attenuation models, with their smaller aleatory variability terms, produce lower PGA hazard, whilst recent European models and Next Generation Attenuation (NGA) model produce similar results. The Monte Carlo method is extended further to assimilate epistemic uncertainty into the hazard calculation, thus integrating across several appropriate source and PGA attenuation models. 
Site condition and fault-type are also integrated into the hazard mapping calculations. These hazard maps are in general agreement with previous maps for the Aegean, recognising the highest hazard in the Ionian Islands, Gulf of Corinth and Hellenic Arc. Peak ground accelerations for some sites in these regions reach as high as 500-600 cm s⁻² using European/NGA attenuation models, and 400-500 cm s⁻² using Greek attenuation models.
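The core of the Monte Carlo PSHA approach described above, simulating many synthetic years of seismicity and reading ground motion directly off the simulated distribution, can be sketched as follows. Every numerical ingredient here (the source rate, truncation magnitudes, the toy attenuation relation, the distance distribution and the variability term) is an illustrative placeholder, not a model from the study.

```python
import math
import random

random.seed(1)

RATE = 5.0                 # hypothetical mean events/year with M >= M_MIN in the zone
B = 1.0                    # Gutenberg-Richter b-value
M_MIN, M_MAX = 4.5, 7.5    # truncation magnitudes
SIGMA = 0.5                # aleatory variability (natural-log units)

def draw_magnitude():
    # Inverse-transform sampling of a doubly truncated Gutenberg-Richter law
    beta = B * math.log(10.0)
    c = 1.0 - math.exp(-beta * (M_MAX - M_MIN))
    return M_MIN - math.log(1.0 - random.random() * c) / beta

def toy_gmpe_ln_pga(m, r_km):
    # Toy attenuation relation (not a published model); PGA in arbitrary units
    return -3.5 + 1.0 * m - 1.2 * math.log(r_km + 10.0)

def poisson(lam):
    # Knuth's multiplicative method for Poisson counts
    l, k, p = math.exp(-lam), 0, random.random()
    while p > l:
        k += 1
        p *= random.random()
    return k

def annual_max_pga():
    best = 0.0
    for _ in range(poisson(RATE)):
        m = draw_magnitude()
        r = random.uniform(5.0, 100.0)   # site-to-source distance, km
        best = max(best, math.exp(toy_gmpe_ln_pga(m, r) + random.gauss(0.0, SIGMA)))
    return best

# Simulate many years and read off the ground motion with a 10% chance of
# being exceeded in 50 years (annual exceedance probability ~0.0021)
maxima = sorted(annual_max_pga() for _ in range(20000))
p_annual = 1.0 - (1.0 - 0.10) ** (1.0 / 50.0)
pga_10in50 = maxima[int((1.0 - p_annual) * len(maxima))]
```

Epistemic uncertainty is handled, as the abstract notes, by drawing each simulation's source model and attenuation model from the alternative models, so the resulting distribution integrates across them.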

  17. Groundwater Depletion and the Sharp Increase of Seismicity in the Southern States, How GRACE Data Could Help?

    NASA Astrophysics Data System (ADS)

    Hong, Z.; Hasan, E.; Hong, Y.; Xia, B.; Zhong, H.

    2016-12-01

    This study is a contribution to how NASA's Gravity Recovery and Climate Experiment (GRACE) data may be used to track anthropogenic change in groundwater in the Southern Great Plains (SGP), as well as the recently increased seismicity in the southern states. The SGP contains one of the most important groundwater aquifers in the United States, the Ogallala aquifer, which has been exploited since 1900. Meanwhile, recent oil and gas extraction from unconventional shale reservoir systems has led to significantly increased groundwater withdrawal and injection of wastewater. Consequently, numerous induced, fracturing-related earthquakes have been recorded in Oklahoma and Texas between 2002 and 2016. The current paper investigates the utility of GRACE data, along with Land Water Content (LWC) information from the Global Land Data Assimilation System (GLDAS), to monitor and track groundwater changes in three southern states of the SGP (Oklahoma, Texas and New Mexico). Additionally, the paper investigates links between active seismicity and the injection of wastewater from oil and gas production. GRACE data yield unprecedented information about the inter-annual changes in Total Water Storage (TWS) from 2002 to 2016 over the SGP. The LWC data set sums the soil moisture records with the total canopy water storage to give the total land-surface water content. The arithmetic difference between TWS and LWC is the Groundwater Anomaly (GWA) for any particular region. In the current study, the GWA analysis reveals the following: (1) a statistically significant drop of the GWA of about −27 mm from 2002 to 2007, due to natural and anthropogenic causes; (2) increased precipitation from 2008 to 2011 over the SGP, leading to significant recovery in TWS and an increase in groundwater content of about 40 mm; (3) a GWA change of about −6 mm over the period from 2012 to 2015. 
The available seismicity records show high spatial agreement between seismicity and oil production locations. Additionally, correlation between groundwater changes and seismic activity in the study region shows that changes in groundwater levels are associated with regions of induced seismicity.
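The GWA separation described above is a simple per-epoch arithmetic difference, GWA = TWS − LWC. A minimal sketch with toy monthly anomaly values (the numbers are illustrative, not the GRACE/GLDAS series used in the study):

```python
def groundwater_anomaly(tws_mm, lwc_mm):
    """GWA = TWS - LWC, a per-epoch difference in mm equivalent water height.
    TWS comes from GRACE; LWC = soil moisture + canopy water from GLDAS."""
    return [t - l for t, l in zip(tws_mm, lwc_mm)]

# Toy monthly anomalies (mm), relative to a common baseline
tws = [0.0, -10.0, -20.0, -35.0]
lwc = [0.0, -4.0, -6.0, -8.0]
gwa = groundwater_anomaly(tws, lwc)
```

Because both inputs are anomalies relative to a common baseline, the residual isolates the storage change attributable to groundwater rather than near-surface water.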

  18. Preliminary Analysis of Remote Triggered Seismicity in Northern Baja California Generated by the 2011, Tohoku-Oki, Japan Earthquake

    NASA Astrophysics Data System (ADS)

    Wong-Ortega, V.; Castro, R. R.; Gonzalez-Huizar, H.; Velasco, A. A.

    2013-05-01

    We analyze possible variations of seismicity in northern Baja California due to the passage of seismic waves from the 2011 M9.0 Tohoku-Oki, Japan, earthquake. The northwestern area of Baja California is characterized by a mountain range composed of crystalline rocks. These Peninsular Ranges of Baja California exhibit high microseismic activity and moderate-size earthquakes. In the eastern region of Baja California, shearing between the Pacific and North American plates takes place, and the Imperial and Cerro Prieto faults generate most of the seismicity. The seismicity in these regions is monitored by the RESNOM seismic network, operated by the Centro de Investigación Científica y de Educación Superior de Ensenada (CICESE), which consists of 13 three-component seismic stations. We use the RESNOM seismic catalog to search for changes in local seismicity rates after the passage of surface waves generated by the Tohoku-Oki earthquake. Comparing one month of seismicity before and after the M9.0 earthquake, the preliminary analysis shows an absence of triggered seismicity in the northern Peninsular Ranges and an increase of seismicity south of the Mexicali Valley, where the Imperial fault jumps southwest and the Cerro Prieto fault continues.

  19. Accuracy of three-dimensional seismic ground response analysis in time domain using nonlinear numerical simulations

    NASA Astrophysics Data System (ADS)

    Liang, Fayun; Chen, Haibing; Huang, Maosong

    2017-07-01

    To provide appropriate uses of nonlinear ground response analysis for engineering practice, a three-dimensional soil column with a distributed mass system and a time-domain numerical analysis were implemented on the OpenSees simulation platform. The mesh of the three-dimensional soil column was chosen to satisfy the specified maximum frequency of interest. The layered soil column was divided into multiple sub-soils with different viscous damping matrices according to their shear-wave velocities, since the soil properties differed significantly. Other one-dimensional and three-dimensional nonlinear seismic ground analysis programs were used in combination to confirm the applicability of the nonlinear seismic ground motion response analysis procedures in soft soil and under strong earthquakes. The accuracy of the three-dimensional soil-column finite element method was verified by dynamic centrifuge model testing under different peak accelerations of the earthquake. As a result, nonlinear seismic ground motion response analysis procedures were improved in this study, and the accuracy and efficiency of the three-dimensional seismic ground response analysis can be adapted to the requirements of engineering practice.
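The mesh criterion alluded to above is commonly the Kuhlemeyer-Lysmer rule of thumb: element size must resolve the shortest propagated wavelength with roughly 8-10 elements. The paper does not state its exact criterion, so this is a generic sketch:

```python
def max_element_size(vs_min, f_max, elements_per_wavelength=8):
    """Rule-of-thumb maximum element dimension for wave-propagation meshes:
    the shortest wavelength lambda_min = vs_min / f_max should be sampled
    by about 8-10 elements (Kuhlemeyer-Lysmer-type criterion).
    vs_min: smallest shear-wave velocity in the soil column (m/s)
    f_max:  highest frequency to be resolved (Hz)"""
    return vs_min / (elements_per_wavelength * f_max)

# Soft soil with Vs = 200 m/s, resolved up to 25 Hz:
# max_element_size(200.0, 25.0) -> 1.0 m
```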

  20. Proceedings, Seminar on Probabilistic Methods in Geotechnical Engineering

    NASA Astrophysics Data System (ADS)

    Hynes-Griffin, M. E.; Buege, L. L.

    1983-09-01

    Contents: Applications of Probabilistic Methods in Geotechnical Engineering; Probabilistic Seismic and Geotechnical Evaluation at a Dam Site; Probabilistic Slope Stability Methodology; Probability of Liquefaction in a 3-D Soil Deposit; Probabilistic Design of Flood Levees; Probabilistic and Statistical Methods for Determining Rock Mass Deformability Beneath Foundations: An Overview; Simple Statistical Methodology for Evaluating Rock Mechanics Exploration Data; New Developments in Statistical Techniques for Analyzing Rock Slope Stability.

  1. VERCE: a productive e-Infrastructure and e-Science environment for data-intensive seismology research

    NASA Astrophysics Data System (ADS)

    Vilotte, Jean-Pierre; Atkinson, Malcolm; Carpené, Michele; Casarotti, Emanuele; Frank, Anton; Igel, Heiner; Rietbrock, Andreas; Schwichtenberg, Horst; Spinuso, Alessandro

    2016-04-01

    Seismology pioneers global and open data access, with internationally approved data, metadata and exchange standards facilitated worldwide by the Federation of Digital Seismograph Networks (FDSN) and, in Europe, by the European Integrated Data Archives (EIDA). The growing wealth of data generated by dense observation and monitoring systems, together with recent advances in seismic wave simulation capabilities, is driving a paradigm change. Data-intensive seismology research requires a new holistic approach that combines scalable high-performance wave simulation codes with statistical data analysis methods and integrates distributed data and computing resources. The European e-Infrastructure project "Virtual Earthquake and seismology Research Community e-science environment in Europe" (VERCE) pioneers the federation of autonomous organisations providing data and computing resources, together with a comprehensive, integrated and operational virtual research environment (VRE) and e-Infrastructure devoted to the full path of data use in a research-driven context. VERCE delivers to a broad base of seismology researchers in Europe easy-to-use high-performance full waveform simulations and misfit calculations, together with a data-intensive framework for the collaborative development of innovative statistical data analysis methods, all of which were previously accessible only to a small number of well-resourced groups. It balances flexibility with new integrated capabilities to provide a fluent path from research innovation to production. As such, VERCE is a major contribution to the implementation phase of the European Plate Observing System (EPOS), the ESFRI initiative of the solid-Earth community. The VRE meets a range of seismic research needs by eliminating chores and technical difficulties, allowing users to focus on their research questions. 
It empowers researchers to harvest the new opportunities provided by the community's well-established and mature high-performance wave simulation codes. It enables active researchers to invent and refine scalable methods for innovative statistical analysis of seismic waveforms in a wide range of application contexts. The VRE paves the way towards a flexible shared framework for seismic waveform inversion, lowering the barriers to uptake for the next generation of researchers. The VRE can be accessed through the science gateway, which brings computational and data-intensive research into the same framework, integrating multiple data sources and services. It provides a context for task-oriented and data-streaming workflows, and maps user actions to the full gamut of federated platform resources and procurement policies, activating the necessary behind-the-scenes automation and transformation. The platform manages and produces domain metadata, coupling them with the provenance information that describes the relationships and dependencies characterising the whole workflow process. This dynamic knowledge base can be explored for validation purposes via a graphical interface and a web API. Moreover, it fosters the assisted selection and re-use of data within each phase of the scientific analysis. These phases, identified as Simulation, Data Access, Preprocessing, and Misfit and data processing, are presented to the users of the gateway as dedicated and interactive workspaces. By enabling researchers to share results and provenance information, VERCE fosters open-science behaviour, allowing researchers to discover and build on prior work and thereby progress faster. 
A key asset is the agile strategy that VERCE deployed in a multi-organisational context, engaging seismologists, data scientists, ICT researchers, HPC and data resource providers, and system administrators in short-lived tasks, each with a goal that is a seismology priority, intimately coupling research thinking with technical innovation. This shifts the focus from HPC production environments and community data services to user-focused scenarios, avoiding wasteful bouts of technology centricity in which technologists collect requirements and develop a system that is never used because the ideas of the planned users have moved on. As such, the technologies and concepts developed in VERCE are relevant to many other disciplines in computational and data-driven Earth Sciences and can provide the key technologies for a Europe-wide computational and data-intensive framework in the Earth Sciences.

  2. Relative velocity change measurement based on seismic noise analysis in exploration geophysics

    NASA Astrophysics Data System (ADS)

    Corciulo, M.; Roux, P.; Campillo, M.; Dubuq, D.

    2011-12-01

    Passive monitoring techniques based on noise cross-correlation analysis are still debated in exploration geophysics, even though recent studies have shown impressive performance in seismology at larger scales. Tracking the time evolution of complex geological structures using noise data involves both the localization of noise sources and the measurement of relative velocity variations. Monitoring relative velocity variations only requires measuring the phase shifts of seismic noise cross-correlation functions computed for successive time recordings. Existing algorithms, such as the Stretching and Doublet methods, classically demand great effort in terms of computation time, making them impractical when continuous datasets are acquired on dense arrays. We present here an innovative technique for passive monitoring based on the measurement of the instantaneous phase of noise-correlated signals. The Instantaneous Phase Variation (IPV) technique aims to combine the advantages of the Stretching and Doublet methods while providing a faster measurement of the relative velocity change. The IPV takes advantage of the Hilbert transform to compute, in the time domain, the phase difference between two noise correlation functions. The relative velocity variation is measured through the slope of the linear regression of the phase-difference curve as a function of correlation time. The large number of noise correlation functions classically available at exploration scale on dense arrays allows for a statistical analysis that further improves the precision of the velocity-change estimate. In this work, numerical tests first compare the performance of IPV to the Stretching and Doublet techniques in terms of accuracy, robustness and computation time. Experimental results are then presented using a seismic noise dataset with five days of continuous recording on 397 geophones spread over a ~1 km² area.
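Based on the description above, the core of the IPV measurement can be sketched as follows. This is a minimal reading of the method, not the authors' code: it assumes narrowband correlation functions with a dominant frequency f0, and it estimates dv/v from the slope of the unwrapped instantaneous-phase difference versus correlation time:

```python
import numpy as np
from scipy.signal import hilbert

def ipv_dvv(ref, cur, dt, f0, trim_frac=0.1):
    """Estimate the relative velocity change dv/v between two noise
    correlation functions: instantaneous phases from the Hilbert
    transform, then the slope of a linear regression of the phase
    difference against correlation time.
    ref, cur : reference and current correlation functions (same length)
    dt       : sample interval (s);  f0 : dominant frequency (Hz)"""
    phi_ref = np.unwrap(np.angle(hilbert(ref)))
    phi_cur = np.unwrap(np.angle(hilbert(cur)))
    t = np.arange(len(ref)) * dt
    # trim the edges, where the Hilbert transform is least reliable
    n = int(len(ref) * trim_frac)
    slope = np.polyfit(t[n:-n], (phi_cur - phi_ref)[n:-n], 1)[0]
    # for a homogeneous medium change, d(phase)/dt = 2*pi*f0 * dv/v
    return slope / (2.0 * np.pi * f0)
```

The statistical step mentioned in the abstract would then average such estimates over the many station pairs of a dense array.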

  3. Clustering P-Wave Receiver Functions To Constrain Subsurface Seismic Structure

    NASA Astrophysics Data System (ADS)

    Chai, C.; Larmat, C. S.; Maceira, M.; Ammon, C. J.; He, R.; Zhang, H.

    2017-12-01

    The acquisition of high-quality data from permanent and temporary dense seismic networks provides the opportunity to apply statistical and machine learning techniques to a broad range of geophysical observations. Lekic and Romanowicz (2011) used clustering analysis on tomographic velocity models of the western United States to perform tectonic regionalization, and the velocity-profile clusters agree well with known geomorphic provinces. A complementary and somewhat less restrictive approach is to apply cluster analysis directly to geophysical observations. In this presentation, we apply clustering analysis to teleseismic P-wave receiver functions (RFs), continuing the efforts of Larmat et al. (2015) and Maceira et al. (2015). These earlier studies validated the approach with surface waves and stacked EARS RFs from the USArray stations. In this study, we experiment with both the K-means and hierarchical clustering algorithms. We also test different distance metrics defined in the vector space of RFs, following Lekic and Romanowicz (2011). We cluster data from two distinct data sets. The first, corresponding to the western US, was produced by smoothing/interpolation of the receiver-function wavefield (Chai et al. 2015). Spatial coherence and agreement with geologic regions increase with this simpler, spatially smoothed set of observations. The second data set is composed of RFs for more than 800 stations of the China Digital Seismic Network (CSN). Preliminary results show a first-order agreement between clusters and tectonic regions, and each regional cluster includes a distinct Ps arrival, which probably reflects differences in crustal thickness. Regionalization remains an important step in characterizing a model prior to the application of full-waveform and/or stochastic imaging techniques because of the computational expense of these types of studies. 
Machine learning techniques can provide valuable information for designing and characterizing formal geophysical inversions, offering insight into spatial variability in the subsurface geology.
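As a toy illustration of the clustering step (Euclidean K-means on receiver-function traces; the study also tests hierarchical clustering and alternative distance metrics, which are not shown here), a self-contained sketch with a deterministic farthest-point initialisation might look like:

```python
import numpy as np

def kmeans_traces(X, k, n_iter=100):
    """Minimal K-means for waveform vectors. X has shape
    (n_traces, n_samples); each row is one receiver function."""
    X = np.asarray(X, float)
    # deterministic farthest-point initialisation
    centers = [X[0]]
    for _ in range(1, k):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(n_iter):
        # assign each trace to its nearest center, then update the centers
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dist.argmin(axis=1)
        new_centers = np.array([X[labels == j].mean(axis=0)
                                if np.any(labels == j) else centers[j]
                                for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers
```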

  4. Assessing the need for an update of a probabilistic seismic hazard analysis using a SSHAC Level 1 study and the Seismic Hazard Periodic Reevaluation Methodology

    DOE PAGES

    Payne, Suzette J.; Coppersmith, Kevin J.; Coppersmith, Ryan; ...

    2017-08-23

    A key decision for nuclear facilities is evaluating the need for an update of an existing seismic hazard analysis in light of new data and information that have become available since the time the analysis was completed. We introduce the newly developed risk-informed Seismic Hazard Periodic Review Methodology (SHPRM) and present how a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 probabilistic seismic hazard analysis (PSHA) was performed in an implementation of this new methodology. The SHPRM offers a defensible and documented approach that considers both the changes in seismic hazard and the engineering-based risk information of an existing nuclear facility to assess the need for an update of an existing PSHA. The SHPRM has seven evaluation criteria that are employed at specific analysis, decision, and comparison points and are applied to the seismic design categories established for nuclear facilities in the United States. The SHPRM is implemented using a SSHAC Level 1 study performed for the Idaho National Laboratory, USA. The implementation focuses on the first six of the seven evaluation criteria, which are all provided by the SSHAC Level 1 PSHA. Finally, to illustrate outcomes of the SHPRM that do not lead to the need for an update and those that do, example implementations of the SHPRM are performed for nuclear facilities with target performance goals expressed as a mean annual frequency of unacceptable performance of 1×10⁻⁴, 4×10⁻⁵ and 1×10⁻⁵.

  6. Catalog of earthquake hypocenters at Alaskan volcanoes: January 1, 2000 through December 31, 2001

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.; Tytgat, Guy; Estes, Steve; Moran, Seth C.; Paskievitch, John; McNutt, Stephen R.

    2002-01-01

    The Alaska Volcano Observatory (AVO), a cooperative program of the U.S. Geological Survey, the Geophysical Institute of the University of Alaska Fairbanks, and the Alaska Division of Geological and Geophysical Surveys, has maintained seismic monitoring networks at potentially active volcanoes in Alaska since 1988 (Power and others, 1993; Jolly and others, 1996; Jolly and others, 2001). The primary objectives of this program are the seismic surveillance of active, potentially hazardous, Alaskan volcanoes and the investigation of seismic processes associated with active volcanism. This catalog reflects the status and evolution of the seismic monitoring program, and presents the basic seismic data for the time period January 1, 2000, through December 31, 2001. For an interpretation of these data and previously recorded data, the reader should refer to several recent articles on volcano-related seismicity at Alaskan volcanoes in Appendix G. The AVO seismic network was used to monitor twenty-three volcanoes in real time in 2000-2001. These include Mount Wrangell, Mount Spurr, Redoubt Volcano, Iliamna Volcano, Augustine Volcano, the Katmai Volcanic Group (Snowy Mountain, Mount Griggs, Mount Katmai, Novarupta, Trident Volcano, Mount Mageik, Mount Martin), Aniakchak Crater, Pavlof Volcano, Mount Dutton, Isanotski Peaks, Shishaldin Volcano, Fisher Caldera, Westdahl Peak, Akutan Peak, Makushin Volcano, Great Sitkin Volcano, and Kanaga Volcano (Figure 1). 
AVO located 1551 and 1428 earthquakes in 2000 and 2001, respectively, on and around these volcanoes. Highlights of the catalog period (Table 1) include: volcanogenic seismic swarms at Shishaldin Volcano between January and February 2000 and between May and June 2000; an eruption at Mount Cleveland between February and May 2001; episodes of possible tremor at Makushin Volcano starting in March 2001 and continuing through 2001; and two earthquake swarms at Great Sitkin Volcano in 2001. This catalog includes: (1) earthquake origin times, hypocenters, and magnitudes with summary statistics describing the earthquake location quality; (2) a description of instruments deployed in the field and their locations; (3) a description of earthquake detection, recording, analysis, and data archival systems; (4) station parameters and velocity models used for earthquake locations; (5) a summary of daily station usage throughout the catalog period; and (6) all HYPOELLIPSE files used to determine the earthquake locations presented in this report.

  7. Seismic Risk Perception compared with seismic Risk Factors

    NASA Astrophysics Data System (ADS)

    Crescimbene, Massimo; La Longa, Federica; Pessina, Vera; Pino, Nicola Alessandro; Peruzza, Laura

    2016-04-01

    The communication of natural hazards and their consequences is one of the most relevant ethical issues faced by scientists. In recent years, social studies have provided evidence that risk communication is strongly influenced by people's risk perception. In order to develop effective information and risk communication strategies, the perception of risks and its influencing factors should be known. A theory that offers an integrative approach to understanding and explaining risk perception is still missing; to explain risk perception, it is necessary to consider several perspectives: social, psychological and cultural, and their interactions. This paper presents the results of a CATI survey on seismic risk perception in Italy, conducted by INGV researchers with funding from the DPC. We built a questionnaire to assess seismic risk perception, with particular attention to comparing hazard, vulnerability and exposure perception with real data on the same factors. The Seismic Risk Perception Questionnaire (SRP-Q) is designed using the semantic differential method, with opposite terms rated on a seven-point Likert scale. The questionnaire yields scores for five risk indicators: Hazard, Exposure, Vulnerability, People and Community, and Earthquake Phenomenon. It was administered by computer-assisted telephone interview (CATI) to a national statistical sample of over 4,000 people between January and February 2015. Results show that risk perception seems to be underestimated for all the indicators considered. In particular, scores for the seismic Vulnerability factor are extremely low compared with the housing data reported by the respondents. Other data collected by the questionnaire concern earthquake information level, sources of information, perceived earthquake occurrence relative to other natural hazards, participation in risk reduction activities, and level of involvement. 
Research on risk perception aims to aid risk analysis and policy-making by providing a basis for understanding and anticipating public responses to hazards and improving the communication of risk information among lay people, technical experts, and decision-makers. Those dealing with seismic risk need to understand what people think about this risk and how they respond to it; without such understanding, well-intended policies may be ineffective (Slovic, 1987). For these reasons we believe that comparing the perception factors with the "real factors" of seismic risk is crucial to understanding the relationship between scientific knowledge and public understanding. Without a comparison with reality, research on risk perception is just an intellectual exercise.

  8. Seismic analysis of the frame structure reformed by cutting off column and jacking based on stiffness ratio

    NASA Astrophysics Data System (ADS)

    Zhao, J. K.; Xu, X. S.

    2017-11-01

    The cutting-off-column and jacking technology is a method for increasing story height that has been widely used and has attracted much attention in engineering. The stiffness of the structure changes after the column is cut and the structure is jacked, which directly affects its overall seismic performance, so seismic strengthening measures are usually necessary to restore the stiffness. A five-story frame structure jacking project in the Jinan High-tech Zone was taken as an example, and three finite element models were established: the frame before lifting, after lifting, and after strengthening. Based on the stiffness ratio, dynamic time-history analyses were carried out to study the seismic performance under the El Centro seismic wave, the Taft seismic wave and the Tianjin artificial seismic wave. The research can provide guidance for the design and construction of entire jack-lifting structures.

  9. Seismic Fragility Analysis of a Condensate Storage Tank with Age-Related Degradations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nie, J.; Braverman, J.; Hofmayer, C

    2011-04-01

    The Korea Atomic Energy Research Institute (KAERI) is conducting a five-year research project to develop a realistic seismic risk evaluation system which includes the consideration of aging of structures and components in nuclear power plants (NPPs). The KAERI research project includes three specific areas that are essential to seismic probabilistic risk assessment (PRA): (1) probabilistic seismic hazard analysis, (2) seismic fragility analysis including the effects of aging, and (3) a plant seismic risk analysis. Since 2007, Brookhaven National Laboratory (BNL) has entered into a collaboration agreement with KAERI to support its development of seismic capability evaluation technology for degraded structures and components. The collaborative research effort is intended to continue over a five year period. The goal of this collaboration endeavor is to assist KAERI to develop seismic fragility analysis methods that consider the potential effects of age-related degradation of structures, systems, and components (SSCs). The research results of this multi-year collaboration will be utilized as input to seismic PRAs. This report describes the research effort performed by BNL for the Year 4 scope of work. This report was developed as an update to the Year 3 report by incorporating a major supplement to the Year 3 fragility analysis. In the Year 4 research scope, an additional study was carried out to consider an additional degradation scenario, in which the three basic degradation scenarios, i.e., degraded tank shell, degraded anchor bolts, and cracked anchorage concrete, are combined in a non-perfect correlation manner. A representative operational water level is used for this effort. Building on the same CDFM procedure implemented for the Year 3 Tasks, a simulation method was applied using optimum Latin Hypercube samples to characterize the deterioration behavior of the fragility capacity as a function of age-related degradations. 
The results are summarized in Section 5 and Appendices G through I.

  10. A bird's eye view: the cognitive strategies of experts interpreting seismic profiles

    NASA Astrophysics Data System (ADS)

    Bond, C. E.; Butler, R.

    2012-12-01

    Geoscience is perhaps unique in its reliance on incomplete datasets and on building knowledge from their interpretation. This interpretative basis for the science is fundamental at all levels, from the creation of a geological map to the interpretation of remotely sensed data. To teach and better understand the uncertainties in dealing with incomplete data, we need to understand the strategies that make individual practitioners effective interpreters. The nature of interpretation is such that the interpreter must use their cognitive abilities in analysing the data to propose a sensible solution that is consistent not only with the original data but also with other knowledge and understanding. In a series of experiments, Bond et al. (2007, 2008, 2011, 2012) investigated the strategies and pitfalls of expert and non-expert interpretation of seismic images. These studies used large numbers of participants to provide a statistically sound basis for analysis of the results. The outcome of these experiments showed that techniques and strategies are more important than expert knowledge per se in developing successful interpretations: experts are successful because of their application of these techniques. In a new set of experiments we have focused on a small number of experts to determine how they use their cognitive and reasoning skills in the interpretation of 2D seismic profiles. Live video and practitioner commentary were used to track the evolving interpretation and to gain insight into the decision processes. The outputs of the study allow us to create an educational resource of expert interpretation through online video footage and commentary, with associated further interpretation and analysis of the techniques and strategies employed. 
This resource will be of use to undergraduate, post-graduate, industry and academic professionals seeking to improve their seismic interpretation skills, develop reasoning strategies for dealing with incomplete datasets, and for assessing the uncertainty in these interpretations. Bond, C.E. et al. (2012). 'What makes an expert effective at interpreting seismic images?' Geology, 40, 75-78. Bond, C. E. et al. (2011). 'When there isn't a right answer: interpretation and reasoning, key skills for 21st century geoscience'. International Journal of Science Education, 33, 629-652. Bond, C. E. et al. (2008). 'Structural models: Optimizing risk analysis by understanding conceptual uncertainty'. First Break, 26, 65-71. Bond, C. E. et al., (2007). 'What do you think this is?: "Conceptual uncertainty" In geoscience interpretation'. GSA Today, 17, 4-10.

  11. Trans-dimensional and hierarchical Bayesian approaches toward rigorous estimation of seismic sources and structures in Northeast Asia

    NASA Astrophysics Data System (ADS)

    Kim, Seongryong; Tkalčić, Hrvoje; Mustać, Marija; Rhie, Junkee; Ford, Sean

    2016-04-01

    A framework is presented within which we provide rigorous estimates of seismic sources and structures in Northeast Asia. We use Bayesian inversion methods, which enable statistical estimation of models and their uncertainties based on the information content of the data. Ambiguities in error statistics and model parameterizations are addressed by hierarchical and trans-dimensional (trans-D) techniques, which can be implemented inherently within Bayesian inversions. Reliable estimation of model parameters and their uncertainties is therefore possible, avoiding arbitrary regularizations and parameterizations. Hierarchical and trans-D inversions are performed to develop a three-dimensional velocity model using ambient noise data. To further improve the model, we perform joint inversions with receiver-function data using a newly developed Bayesian method. For source estimation, a novel moment tensor inversion method is presented and applied to regional waveform data from the North Korean nuclear explosion tests. The combination of these new Bayesian techniques and the structural model, coupled with meaningful uncertainties at each stage of processing, enables more quantitative monitoring and discrimination of seismic events.
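The hierarchical aspect mentioned above, treating the noise level as an unknown to be inferred alongside the model rather than fixed in advance, can be illustrated with a deliberately simple Metropolis-Hastings sampler. This is a didactic sketch for a one-parameter Gaussian model, far simpler than the trans-dimensional samplers used in the study:

```python
import numpy as np

def metropolis_hierarchical(data, n_samples=5000, seed=0):
    """Metropolis-Hastings sampler for the posterior of (mu, sigma) given
    data ~ N(mu, sigma^2). The noise level sigma is treated as an unknown
    (hierarchical) parameter and sampled in log space with a flat prior."""
    rng = np.random.default_rng(seed)
    d = np.asarray(data, float)
    n = len(d)

    def log_post(mu, log_sigma):
        sigma = np.exp(log_sigma)
        return -n * log_sigma - 0.5 * np.sum((d - mu) ** 2) / sigma ** 2

    mu, ls = d.mean(), np.log(d.std())  # start at the maximum likelihood
    lp = log_post(mu, ls)
    chain = np.empty((n_samples, 2))
    for i in range(n_samples):
        mu_prop = mu + 0.1 * rng.standard_normal()
        ls_prop = ls + 0.1 * rng.standard_normal()
        lp_prop = log_post(mu_prop, ls_prop)
        if np.log(rng.random()) < lp_prop - lp:  # accept or reject
            mu, ls, lp = mu_prop, ls_prop, lp_prop
        chain[i] = mu, np.exp(ls)
    return chain
```

The payoff of inferring sigma is that the uncertainty on mu automatically reflects the true scatter of the data instead of an assumed error level.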

  12. The Effect Analysis of Strain Rate on Power Transmission Tower-Line System under Seismic Excitation

    PubMed Central

    Wang, Wenming

    2014-01-01

    The effect of strain rate on a power transmission tower-line system under seismic excitation is studied in this paper. A three-dimensional finite element model of a transmission tower-line system is created based on a real project. Using theoretical analysis and numerical simulation, incremental dynamic analysis of the power transmission tower-line system is conducted to investigate the effect of strain rate on the nonlinear responses of the transmission tower and line. The results show that considering strain rate generally decreases the maximum top displacements of the transmission tower but increases the maximum base shear forces, so it is necessary to consider the strain rate effect in the seismic analysis of the transmission tower. The strain rate effect can be ignored in the seismic analysis of the conductors and ground lines, although the responses of the ground lines considering the strain rate effect are larger than those of the conductors. The results could provide a reference for the seismic design of transmission tower-line systems. PMID:25105157

  13. Determining the Positions of Seismically Active Faults in Platform Regions Based on the Integrated Profile Observations

    NASA Astrophysics Data System (ADS)

    Levshenko, V. T.; Grigoryan, A. G.

    2018-03-01

    Using the examples of the Roslavl'skii, Grafskii, and Platava-Varvarinskii faults, we demonstrate the possibility of mapping geological objects with a measurement algorithm that successively measures the spectra of microseisms at the points of a measurement network with movable instruments and statistically accumulates the ratios of the power spectra of the amplitudes. Based on this technique, the positions of these seismically active faults are determined from integrated profile observations of the parameters of the microseismic and radon fields. The refined positions of the faults can be used in estimating the seismic impacts on critical facilities in the vicinity of these faults.
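The "statistical accumulation of the ratios of the power spectra" step might be sketched as follows. This is an interpretation, not the authors' algorithm: Welch power spectra are computed for repeated recordings at a movable survey point and at a fixed reference point, and their ratios are averaged geometrically across recordings:

```python
import numpy as np
from scipy.signal import welch

def accumulated_spectral_ratio(site_records, ref_records, fs, nperseg=256):
    """Geometric-mean ratio of Welch power spectra between a movable
    survey point and a reference point, accumulated over repeated
    microseism recordings. Returns (frequencies, averaged ratio)."""
    log_ratios = []
    for x, y in zip(site_records, ref_records):
        f, pxx = welch(x, fs=fs, nperseg=nperseg)
        _, pyy = welch(y, fs=fs, nperseg=nperseg)
        log_ratios.append(np.log(pxx / pyy))
    # geometric mean suppresses the influence of occasional noisy records
    return f, np.exp(np.mean(log_ratios, axis=0))
```

A persistent peak in the accumulated ratio at a survey point would then mark anomalous ground response, such as that above a fault zone.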

  14. A Global Data Analysis for Representing Sediment and Particulate Organic Carbon Yield in Earth System Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tan, Zeli; Leung, L. Ruby; Li, Hongyi

    Although sediment yield (SY) from water erosion is ubiquitous and its environmental consequences are well recognized, its impacts on the global carbon cycle remain largely uncertain. This knowledge gap is partly due to the lack of soil erosion modeling in Earth System Models (ESMs), which are important tools used to understand the global carbon cycle and explore its changes. This study analyzed sediment and particulate organic carbon yield (CY) data from 1081 and 38 small catchments (0.1-200 km²), respectively, in different environments across the globe. Using multiple statistical analysis techniques, we explored environmental factors and hydrological processes important for SY and CY modeling in ESMs. Our results show clear correlations of high SY with traditional agriculture, seismicity and heavy storms, as well as strong correlations between SY and annual peak runoff. These findings highlight a potential limitation of SY models that represent only interrill and rill erosion, because shallow overland flow and rill flow have limited transport capacity, owing to their hydraulic geometry, to produce high SY. Further, our results suggest that SY modeling in ESMs should be implemented at the event scale to reproduce the catastrophic mass transport during episodic events. Several environmental factors, such as seismicity and land management, that are often not considered in current catchment-scale SY models can be important in controlling global SY. Our analyses show that SY is likely the primary control on CY in small catchments, and a statistically significant empirical relationship is established to calculate SY and CY jointly in ESMs.

  15. A Revised Earthquake Catalogue for South Iceland

    NASA Astrophysics Data System (ADS)

    Panzera, Francesco; Zechar, J. Douglas; Vogfjörd, Kristín S.; Eberhard, David A. J.

    2016-01-01

    In 1991, a new seismic monitoring network named SIL began operating in Iceland, with a digital seismic system and automatic operation. The system is equipped with software that reports the automatic location and magnitude of earthquakes, usually within 1-2 min of their occurrence. Normally, automatic locations are manually checked and re-estimated with corrected phase picks, but locations remain subject to random errors and systematic biases. In this article, we assess the quality of the catalogue and produce a revised catalogue for South Iceland, the area with the highest seismic risk in Iceland. We explore the effects of filtering events using some common recommendations based on network geometry and station spacing and, as an alternative, filtering based on a multivariate analysis that identifies outliers in the hypocentre error distribution. We identify and remove quarry blasts, and we re-estimate the magnitudes of many events. This revised catalogue, which we consider to be filtered, cleaned, and corrected, should be valuable for building future seismicity models and for assessing seismic hazard and risk. We present a comparative seismicity analysis using the original and revised catalogues: we report characteristics of South Iceland seismicity in terms of b value and magnitude of completeness. Our work demonstrates the importance of carefully checking an earthquake catalogue before proceeding with seismicity analysis.
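The b value of a catalogue like this is typically estimated above the magnitude of completeness with the Aki maximum-likelihood formula. A minimal sketch on a synthetic Gutenberg-Richter catalogue follows; the function name and the synthetic data are assumptions for illustration, not the revised SIL catalogue.

```python
import numpy as np

def aki_b_value(mags, m_c):
    """Aki (1965) maximum-likelihood b value for magnitudes >= m_c."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - m_c)

# Synthetic Gutenberg-Richter catalogue: above the completeness level
# m_c = 2.0, magnitude excesses are exponential when the true b value is 1.0
rng = np.random.default_rng(42)
true_b = 1.0
mags = 2.0 + rng.exponential(scale=np.log10(np.e) / true_b, size=20000)

b_est = aki_b_value(mags, m_c=2.0)
print(round(b_est, 2))   # close to the true value of 1.0
```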

  16. Recent Impacts on Mars: Cluster Properties and Seismic Signal Predictions

    NASA Astrophysics Data System (ADS)

    Daubar, Ingrid Justine; Schmerr, Nicholas; Banks, Maria; Marusiak, Angela; Golombek, Matthew P.

    2016-10-01

    Impacts are a key source of seismic waves that are a primary constraint on the formation, evolution, and dynamics of planetary objects. Geophysical missions such as InSight (Banerdt et al., 2013) will monitor seismic signals from internal and external sources. New martian craters have been identified in orbital images (Malin et al., 2006; Daubar et al., 2013). Seismically detecting such impacts and subsequently imaging the resulting craters will provide extremely accurate epicenters and source crater sizes, enabling calibration of seismic velocities, the efficiency of impact-seismic coupling, and retrieval of detailed regional and local internal structure. To investigate recent impact-induced seismicity on Mars, we have assessed ~100 new, dated impact sites. In approximately half of the new impacts, the bolide partially disintegrates in the atmosphere, forming multiple craters in a cluster. We incorporate the resulting, more complex, seismic effects in our model. To characterize the variation between sites, we focus on clustered impacts. We report statistics of craters within clusters: diameters, morphometry indicating subsurface layering, strewn-field azimuths indicating impact direction, and dispersion within clusters indicating the combined effects of bolide strength and elevation of breakup. Measured parameters are converted to seismic predictions for impact sources using a scaling law relating crater diameter to momentum and source duration, calibrated for impacts recorded by Apollo (Lognonne et al., 2009). We use plausible ranges for target properties, bolide densities, and impact velocities to bound the seismic moment. The expected seismic sources are modeled in the near field using a 3-D wave propagation code (Petersson et al., 2010) and in the far field using a 1-D wave propagation code (Friederich et al., 1995), for a martian seismic model.
Thus we calculate the amplitudes of seismic phases at varying distances, which can be used to evaluate the detectability of body and surface wave phases created by different sizes and types of impacts all over Mars.

  17. A Methodology to Separate and Analyze a Seismic Wide Angle Profile

    NASA Astrophysics Data System (ADS)

    Weinzierl, Wolfgang; Kopp, Heidrun

    2010-05-01

    General solutions of inverse problems can often be obtained through the introduction of probability distributions to sample the model space. We present a simple approach of defining an a priori space in a tomographic study and retrieving the velocity-depth posterior distribution by a Monte Carlo method. Utilizing a fitting routine designed for very low statistics to set up and analyze the obtained tomography results, it is possible to statistically separate the velocity-depth model space derived from the inversion of seismic refraction data. An example of a profile acquired in the Lesser Antilles subduction zone reveals the effectiveness of this approach. The resolution analysis of the structural heterogeneity includes a divergence analysis, which proves capable of dissecting long wide-angle profiles for deep crust and upper mantle studies. The complete information of any parameterised physical system is contained in the a posteriori distribution. Methods for analyzing and displaying key properties of the a posteriori distributions of highly nonlinear inverse problems are therefore essential for any interpretation. From this study we infer several conclusions concerning the interpretation of the tomographic approach. By calculating global as well as singular misfits of velocities, we are able to map different geological units along a profile. Comparing velocity distributions with the result of a tomographic inversion along the profile, we can reproduce the subsurface structures in their extent and composition. Gaining a priori information for seismic refraction analysis through a simple solution to an inverse problem, with subsequent resolution of structural heterogeneities through a divergence analysis, is a new and simple way of defining the a priori space and estimating the a posteriori mean and covariance in singular and general form.
The major advantage of a Monte Carlo based approach in our case study is the knowledge obtained of velocity-depth distributions. Certainly, the decision of where on the profile to extract velocity information for setting up a Monte Carlo ensemble limits the a priori space. However, analyzing the velocity field against distinct reference distributions makes it possible to define the covariance for any geological unit for which we have a priori information on the velocity-depth distributions. Using the wide-angle data recorded across the Lesser Antilles arc, we are able to resolve a shallow feature such as the backstop by a robust and simple divergence analysis. We demonstrate the effectiveness of the new methodology in extracting key features and properties from the inversion results, including information concerning the confidence level of the results.
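The Monte Carlo appraisal described above can be illustrated at toy scale: draw velocity-depth models from a uniform a priori space, weight them by their traveltime misfit, and read off the a posteriori mean. The one-layer forward model and all parameter values below are invented for illustration and are not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

def travel_time(v, z, offsets):
    """Toy forward model: straight-ray time through a layer of thickness z."""
    return np.sqrt(z ** 2 + offsets ** 2) / v

offsets = np.linspace(5.0, 50.0, 10)               # source-receiver offsets (km)
t_obs = travel_time(6.0, 10.0, offsets)            # "observed" data: v = 6 km/s, z = 10 km
t_obs = t_obs + rng.normal(0.0, 0.05, t_obs.size)  # 50 ms picking noise

# Uniform a priori model space, importance-weighted by the data misfit
v_prior = rng.uniform(4.0, 8.0, 50000)
z_prior = rng.uniform(5.0, 15.0, 50000)
t_pred = np.sqrt(z_prior[:, None] ** 2 + offsets[None, :] ** 2) / v_prior[:, None]
misfit = np.sum((t_pred - t_obs) ** 2, axis=1) / (2 * 0.05 ** 2)
w = np.exp(-(misfit - misfit.min()))
w = w / w.sum()

v_post = float(np.sum(w * v_prior))   # a posteriori mean velocity
z_post = float(np.sum(w * z_prior))   # a posteriori mean depth
print(v_post, z_post)                 # close to the true model (6, 10)
```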

  18. Towards harmonized seismic analysis across Europe using supervised machine learning approaches

    NASA Astrophysics Data System (ADS)

    Zaccarelli, Riccardo; Bindi, Dino; Cotton, Fabrice; Strollo, Angelo

    2017-04-01

    In the framework of the Thematic Core Services for Seismology of EPOS-IP (European Plate Observing System-Implementation Phase), a service for disseminating a regionalized logic-tree of ground motion models for Europe is under development. While for the Mediterranean area the large availability of strong motion data, qualified and disseminated through the Engineering Strong Motion database (ESM-EPOS), supports the development of both selection criteria and ground motion models, for the low-to-moderate seismicity regions of continental Europe the development of ad-hoc models using weak motion recordings of moderate earthquakes is unavoidable. The aim of this work is to present a platform for creating application-oriented earthquake databases by retrieving information from EIDA (European Integrated Data Archive) and applying supervised learning models for earthquake record selection and processing suitable for any specific application of interest. Supervised learning models, i.e. models that infer a function from labelled training data, have been used extensively in fields such as spam detection, speech and image recognition, and pattern recognition in general. Their suitability for detecting anomalies and performing semi- to fully-automated filtering of large waveform data sets, easing the effort of (or replacing) human expertise, is therefore clear. Because supervised learning algorithms can learn from a relatively small training set to predict and categorize unseen data, they offer a crucial advantage when processing large amounts of data. Moreover, their intrinsic ability to make data-driven predictions makes them suitable (and preferable) in cases where explicit detection algorithms might be unfeasible or too heuristic.
In this study, we consider relatively simple statistical classifiers (e.g., Naive Bayes, Logistic Regression, Random Forest, SVMs) in which labels are assigned to waveform data based on the "recognized classes" needed for our use case. These classes might form a simple binary case (e.g., "good for analysis" vs "bad") or a more complex one (e.g., "good for analysis" vs "low SNR", "multi-event", "bad coda envelope"). It is important to stress that our approach can be generalized to any use case provided that, as in any supervised approach, one supplies an adequate training set of labelled data, a feature set, a statistical classifier, and finally model validation and evaluation. Examples of use cases considered in developing the system prototype are the characterization of ground motion in low-seismicity areas; harmonized spectral analysis across Europe for source and attenuation studies; magnitude calibration; and coda analysis for attenuation studies.
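As a toy version of the waveform-labelling task described above, the sketch below trains a Gaussian Naive Bayes classifier (one of the simple classifiers the authors list) on two invented features, an SNR value and a coda-envelope residual, to separate "good for analysis" records from "low SNR" ones. The feature definitions and class parameters are assumptions for illustration, not EIDA data.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_features(n, good):
    """Two invented waveform features: SNR (dB) and a coda-envelope residual."""
    snr = rng.normal(20.0 if good else 5.0, 3.0, n)
    coda_resid = rng.normal(0.2 if good else 0.8, 0.1, n)
    return np.column_stack([snr, coda_resid])

X = np.vstack([make_features(200, True), make_features(200, False)])
y = np.array([0] * 200 + [1] * 200)   # 0 = "good for analysis", 1 = "low SNR"

def fit_gnb(X, y):
    """Per-class feature means, variances and priors."""
    return {c: (X[y == c].mean(0), X[y == c].var(0) + 1e-9, np.mean(y == c))
            for c in np.unique(y)}

def predict_gnb(params, X):
    """Pick the class with the highest Gaussian log-likelihood + log prior."""
    classes = sorted(params)
    scores = np.column_stack(
        [np.log(params[c][2])
         - 0.5 * np.sum(np.log(2 * np.pi * params[c][1])
                        + (X - params[c][0]) ** 2 / params[c][1], axis=1)
         for c in classes])
    return np.array(classes)[np.argmax(scores, axis=1)]

params = fit_gnb(X, y)
accuracy = (predict_gnb(params, X) == y).mean()
print(accuracy)   # the two synthetic classes are well separated, so this is high
```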

  19. Are seismic hazard assessment errors and earthquake surprises unavoidable?

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2013-04-01

    Why do earthquake occurrences bring us so many surprises? The answer seems evident if we review the relationships that are commonly used to assess seismic hazard. The time-span of physically reliable seismic history is still a small portion of a rupture recurrence cycle at an earthquake-prone site, which makes premature any kind of reliable probabilistic statement about narrowly localized seismic hazard. Moreover, the seismic evidence accumulated to date demonstrates clearly that most of the empirical relations commonly accepted in the early history of instrumental seismology can be proved erroneous when tests of statistical significance are applied. Seismic events, including mega-earthquakes, cluster, displaying behaviors that are far from independent or periodic. Their distribution in space is possibly fractal and definitely far from uniform, even in a single segment of a fault zone. Such a situation contradicts the generally accepted assumptions used for analytically tractable models or computer simulations, and complicates the design of reliable methodologies for realistic earthquake hazard assessment, as well as the search for and definition of precursory behaviors to be used for forecast/prediction purposes. As a result, the conclusions drawn from such simulations and analyses can MISLEAD INTO SCIENTIFICALLY GROUNDLESS APPLICATIONS, which is unwise and extremely dangerous in assessing expected societal risks and losses. For example, a systematic comparison of the GSHAP peak ground acceleration estimates with those related to actual strong earthquakes, unfortunately, discloses the gross inadequacy of this "probabilistic" product, which appears UNACCEPTABLE FOR ANY KIND OF RESPONSIBLE SEISMIC RISK EVALUATION AND KNOWLEDGEABLE DISASTER PREVENTION.
The self-evident shortcomings and failures of GSHAP are an appeal to all earthquake scientists and engineers for an urgent revision of the global seismic hazard maps from first principles, including the background methodologies involved, such that there is: (a) a demonstrated and sufficient justification of hazard assessment protocols; (b) a more complete understanding of the actual range of earthquake hazards to local communities and populations; and (c) a more ethically responsible control over how seismic hazard and seismic risk assessments are implemented to protect public safety. It follows that the international project GEM is on the wrong track if it continues to base seismic risk estimates on the standard method of assessing seismic hazard. The situation is not hopeless and could be improved dramatically thanks to available geological, geomorphologic, seismic, and tectonic evidence and data combined with deterministic pattern recognition methodologies, specifically when intending to PREDICT THE PREDICTABLE, though not the exact size, site, date, and probability of a target event. Understanding the complexity of the non-linear dynamics of hierarchically organized systems of blocks-and-faults has already led to methodologies of neo-deterministic seismic hazard analysis and intermediate-term middle- to narrow-range earthquake prediction algorithms tested in real-time applications over recent decades. This proves that contemporary science can do a better job of disclosing natural hazards, assessing risks, and delivering such information in advance of extreme catastrophes, which are LOW PROBABILITY EVENTS THAT HAPPEN WITH CERTAINTY. Geoscientists must take the initiative in shifting the community's mindset from pessimistic disbelief to the optimistic challenge of neo-deterministic hazard predictability.

  20. HANFORD DOUBLE SHELL TANK (DST) THERMAL & SEISMIC PROJECT SEISMIC ANALYSIS IN SUPPORT OF INCREASED LIQUID LEVEL IN 241-AP TANK FARMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MACKEY TC; ABBOTT FG; CARPENTER BG

    2007-02-16

    The overall scope of the project is to complete an up-to-date comprehensive analysis of record of the DST System at Hanford. The "Double-Shell Tank (DST) Integrity Project - DST Thermal and Seismic Project" is in support of Tri-Party Agreement Milestone M-48-14.

  1. An Analysis of Fundamental Mode Surface Wave Amplitude Measurements

    NASA Astrophysics Data System (ADS)

    Schardong, L.; Ferreira, A. M.; van Heijst, H. J.; Ritsema, J.

    2014-12-01

    Seismic tomography is a powerful tool for deciphering the Earth's interior structure at various scales. Traveltimes of seismic waves are widely used to build velocity models, whereas amplitudes are still only seldom accounted for. This mainly results from our limited ability to separate the various physical effects responsible for observed amplitude variations, such as focussing/defocussing, scattering and source effects. We present new measurements of fundamental-mode Rayleigh and Love wave amplitude anomalies from 50 global earthquakes, measured in the period range 35-275 seconds using two different schemes: (i) a standard time-domain amplitude power ratio technique; and (ii) a mode-branch stripping scheme. For minor-arc data, we observe amplitude anomalies with respect to PREM in the range of 0-4, for which the two measurement techniques show very good overall agreement. We present here a statistical analysis and comparison of these datasets, as well as comparisons with theoretical calculations for a variety of 3-D Earth models. We assess the geographical coherency of the measurements, and investigate the impact of source, path and receiver effects on surface wave amplitudes, as well as their variations with frequency over a wider range than previously studied.

  2. U.S. Geological Survey Oil and Gas Resource Assessment of the Russian Arctic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Donald Gautier; Timothy Klett

    2008-12-31

    The U.S. Geological Survey (USGS) recently completed a study of undiscovered petroleum resources in the Russian Arctic as a part of its Circum-Arctic Resource Appraisal (CARA), which comprised three broad areas of work: geological mapping, basin analysis, and quantitative assessment. The CARA was a probabilistic, geologically based study that used existing USGS methodology, modified somewhat for the circumstances of the Arctic. New map compilation was used to identify assessment units. The CARA relied heavily on geological analysis and analog modeling, with numerical input consisting of lognormal distributions of sizes and numbers of undiscovered accumulations. Probabilistic results for individual assessment units were statistically aggregated, taking geological dependencies into account. The U.S. Department of Energy (DOE) funds were used to support the purchase of crucial seismic data collected in the Barents Sea, East Siberian Sea, and Chukchi Sea for use by USGS in its assessment of the Russian Arctic. DOE funds were also used to purchase a commercial study, which interpreted seismic data from the northern Kara Sea, and for geographic information system (GIS) support of USGS mapping of geological features, province boundaries, total petroleum systems, and assessment units used in the USGS assessment.

  3. Investigation of Volcanic Seismo-Acoustic Signals: Applying Subspace Detection to Lava Fountain Activity at Etna Volcano

    NASA Astrophysics Data System (ADS)

    Sciotto, M.; Rowe, C. A.; Cannata, A.; Arrowsmith, S.; Privitera, E.; Gresta, S.

    2011-12-01

    The current eruption of Mount Etna, which began in January 2011, has produced numerous energetic episodes of lava fountaining, which have been recorded by the INGV seismic and acoustic sensors located on and around the volcano. The source of these events was the pit crater on the east flank of the Southeast Crater of Etna. Simultaneously, small levels of activity were noted in the Bocca Nuova as well, prior to its lava fountaining activity. We will present an analysis of seismic and acoustic signals related to the 2011 activity, wherein we apply the method of subspace detection to determine whether the source exhibits a temporal evolution within or between fountaining events, or otherwise produces repeating, classifiable events occurring through the continuous explosive degassing. We will examine not only the raw waveforms, but also spectral variations in time, as well as time-varying statistical functions such as signal skewness and kurtosis. These results will be compared to a straightforward cross-correlation analysis. In addition to its classification performance, the subspace method promises to outperform standard STA/LTA methods for real-time event detection in cases where similar events can be expected.
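Subspace detection, as applied here, projects each data window onto an orthonormal basis spanning a set of template waveforms and uses the captured energy fraction as the detection statistic. The following self-contained sketch runs on synthetic data; the waveform shape, subspace dimension, and noise level are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200   # template length in samples

t = np.arange(n)
base = np.sin(2 * np.pi * t / 25.0) * np.exp(-t / 60.0)   # synthetic "event"
templates = np.array([base + 0.1 * rng.normal(size=n) for _ in range(8)])

# Signal subspace: leading left singular vectors of the template matrix
U, s, _ = np.linalg.svd(templates.T, full_matrices=False)
Ud = U[:, :2]   # subspace dimension chosen by inspecting the singular values s

def detection_statistic(trace):
    """Fraction of sliding-window energy captured by the signal subspace."""
    stat = np.empty(len(trace) - n + 1)
    for i in range(len(stat)):
        w = trace[i:i + n]
        proj = Ud.T @ w
        stat[i] = (proj @ proj) / (w @ w)
    return stat

trace = 0.1 * rng.normal(size=2000)
trace[700:700 + n] += base            # bury one event in the noise
stat = detection_statistic(trace)
peak = int(np.argmax(stat))
print(peak)                           # detection peak near sample 700
```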

  4. Review of Seismic Hazard Issues Associated with Auburn Dam Project, Sierra Nevada Foothills, California

    USGS Publications Warehouse

    Schwartz, D.P.; Joyner, W.B.; Stein, R.S.; Brown, R.D.; McGarr, A.F.; Hickman, S.H.; Bakun, W.H.

    1996-01-01

    Summary -- The U.S. Geological Survey was requested by the U.S. Department of the Interior to review the design values and the issue of reservoir-induced seismicity for a concrete gravity dam near the site of the previously-proposed Auburn Dam in the western foothills of the Sierra Nevada, central California. The dam is being planned as a flood-control-only dam with the possibility of conversion to a permanent water-storage facility. As a basis for planning studies the U.S. Army Corps of Engineers is using the same design values approved by the Secretary of the Interior in 1979 for the original Auburn Dam. These values were a maximum displacement of 9 inches on a fault intersecting the dam foundation, a maximum earthquake at the site of magnitude 6.5, a peak horizontal acceleration of 0.64 g, and a peak vertical acceleration of 0.39 g. In light of geological and seismological investigations conducted in the western Sierran foothills since 1979 and advances in the understanding of how earthquakes are caused and how faults behave, we have developed the following conclusions and recommendations: Maximum Displacement. Neither the pre-1979 nor the recent observations of faults in the Sierran foothills precisely define the maximum displacement per event on a fault intersecting the dam foundation. Available field data and our current understanding of surface faulting indicate a range of values for the maximum displacement. This may require the consideration of a design value larger than 9 inches. We recommend reevaluation of the design displacement using current seismic hazard methods that incorporate uncertainty into the estimate of this design value. Maximum Earthquake Magnitude. There are no data to indicate that a significant change is necessary in the use of an M 6.5 maximum earthquake to estimate design ground motions at the dam site. 
However, there is a basis for estimating a range of maximum magnitudes using recent field information and new statistical fault relations. We recommend reevaluating the maximum earthquake magnitude using current seismic hazard methodology. Design Ground Motions. A large number of strong-motion records have been acquired and significant advances in the understanding of ground motion have been achieved since the original evaluations. The design value for peak horizontal acceleration (0.64 g) is larger than the median of one recent study and smaller than the median value of another. The value for peak vertical acceleration (0.39 g) is somewhat smaller than the median values of two recent studies. We recommend a reevaluation of the design ground motions that takes into account new ground motion data, with particular attention to rock sites at small source distances. Reservoir-Induced Seismicity. The potential for reservoir-induced seismicity must be considered for the Auburn Dam project. A reservoir-induced earthquake is not expected to be larger than the maximum naturally occurring earthquake. However, the probability of an earthquake may be enhanced by reservoir impoundment. A flood-control-only project may involve a lower probability of significant induced seismicity than a multipurpose water-storage dam. There is a need to better understand and quantify the likelihood of this hazard. A methodology should be developed to quantify the potential for reservoir-induced seismicity using seismicity data from the Sierran foothills, new worldwide observations of induced and triggered seismicity, and current understanding of the earthquake process. Reevaluation of Design Parameters.
The reevaluation of the maximum displacement, maximum magnitude earthquake, and design ground motions can be made using available field observations from the Sierran foothills, updated statistical relations for faulting and ground motions, and current computational seismic hazard methodologies that incorporate uncertainty into the analysis. The reevaluation does not require significant new geological field studies.

  5. A Matched Field Processing Framework for Coherent Detection Over Local and Regional Networks (Postprint)

    DTIC Science & Technology

    2011-12-30

    The single-phase matched field statistic for a given template was demonstrated to be a viable detection statistic, with the resulting gain in resolution termed "superresolution". Cited work: Superresolution with seismic arrays using empirical matched field processing, Geophys. J. Int. 182: 1455–1477.

  6. Statistical classification approach to discrimination between weak earthquakes and quarry blasts recorded by the Israel Seismic Network

    NASA Astrophysics Data System (ADS)

    Kushnir, A. F.; Troitsky, E. V.; Haikin, L. M.; Dainty, A.

    1999-06-01

    A semi-automatic procedure has been developed to achieve statistically optimum discrimination between earthquakes and explosions at local or regional distances, based on a learning set specific to a given region. The method is used for step-by-step testing of candidate discrimination features to find the optimum subset (combination) of features, with the decision taken on a rigorous statistical basis. Linear (LDF) and Quadratic (QDF) Discriminant Functions based on Gaussian distributions of the discrimination features are implemented and statistically grounded; the features may be transformed by the Box-Cox transformation z = (y^α - 1)/α to make them more Gaussian. Tests of the method were successfully conducted on seismograms from the Israel Seismic Network using features consisting of spectral ratios between and within phases. Results showed that the QDF was more effective than the LDF and required five features out of 18 candidates for the optimum set. It was found that discrimination improved with increasing distance within the local range, and that eliminating the transformation of the features, or failing to correct for noise, led to degraded discrimination.
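The pipeline described, a Box-Cox transformation of a feature followed by a Gaussian quadratic discriminant, can be sketched on synthetic spectral-ratio features as follows. The value of α, the class distributions, and the feature itself are invented for illustration and are not the paper's fitted parameters.

```python
import numpy as np

rng = np.random.default_rng(3)

def box_cox(y, alpha):
    """Box-Cox transform z = (y^alpha - 1) / alpha (log transform at alpha = 0)."""
    y = np.asarray(y, dtype=float)
    return np.log(y) if alpha == 0 else (y ** alpha - 1.0) / alpha

# Synthetic spectral-ratio feature for the two source classes
eq = rng.lognormal(mean=0.0, sigma=0.4, size=300)   # earthquakes
ex = rng.lognormal(mean=1.0, sigma=0.4, size=300)   # quarry blasts

alpha = 0.001   # near-log transform makes these features close to Gaussian
z_eq, z_ex = box_cox(eq, alpha), box_cox(ex, alpha)

def qdf(z, mu, var):
    """Quadratic discriminant score: Gaussian log-likelihood."""
    return -0.5 * (np.log(2 * np.pi * var) + (z - mu) ** 2 / var)

mu_eq, var_eq = z_eq.mean(), z_eq.var()
mu_ex, var_ex = z_ex.mean(), z_ex.var()

def classify(y):
    """Assign each spectral ratio to the class with the higher QDF score."""
    z = box_cox(y, alpha)
    return np.where(qdf(z, mu_eq, var_eq) > qdf(z, mu_ex, var_ex),
                    "earthquake", "explosion")

print(classify([0.8, 4.0]))   # small ratio -> earthquake, large -> explosion
```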

  7. Seismic Hazard Analysis — Quo vadis?

    NASA Astrophysics Data System (ADS)

    Klügel, Jens-Uwe

    2008-05-01

    The paper is dedicated to a review of the methods of seismic hazard analysis currently in use, analyzing the strengths and weaknesses of the different approaches. The review is performed from the perspective of a user of the results of seismic hazard analysis for different applications, such as the design of critical and general (non-critical) civil infrastructures and technical and financial risk analysis. A set of criteria is developed for, and applied to, an objective assessment of the capabilities of the different analysis methods. It is demonstrated that traditional probabilistic seismic hazard analysis (PSHA) methods have significant deficiencies, which limit their practical applications. These deficiencies have their roots in the use of inadequate probabilistic models and an insufficient understanding of modern concepts of risk analysis, as revealed in some recent large-scale studies. They result in an inability to treat dependencies between physical parameters correctly and, finally, in an incorrect treatment of uncertainties. As a consequence, the results of PSHA studies have been found to be unrealistic in comparison with empirical information from the real world. The attempt to compensate for these problems through the systematic use of expert elicitation has, so far, not improved the situation. It is also shown that scenario-earthquakes developed by disaggregation from the results of a traditional PSHA may not be conservative with respect to energy conservation and should not be used for the design of critical infrastructures without validation. Although the assessment of the technical as well as the financial risks associated with potential earthquake damage requires a risk analysis, the current method rests on a probabilistic approach with its unsolved deficiencies.
Traditional deterministic or scenario-based seismic hazard analysis methods provide a reliable and in general robust design basis for applications such as the design of critical infrastructures, especially when combined with systematic sensitivity analyses based on validated phenomenological models. Deterministic seismic hazard analysis incorporates uncertainties in safety factors, which are derived from experience as well as from expert judgment. Deterministic methods associated with high safety factors may lead to overly conservative results, especially if applied to generally short-lived civil structures. Scenarios used in deterministic seismic hazard analysis have a clear physical basis. They are related to seismic sources discovered by geological, geomorphologic, geodetic and seismological investigations, or derived from historical references. Scenario-based methods can be expanded for risk analysis applications with an extended data analysis providing the frequency of seismic events. Such an extension provides a better informed risk model that is suitable for risk-informed decision making.

  8. (Multi)fractality of Earthquakes by use of Wavelet Analysis

    NASA Astrophysics Data System (ADS)

    Enescu, B.; Ito, K.; Struzik, Z. R.

    2002-12-01

    The fractal character of earthquake occurrence, in time, space or energy, has by now been established beyond doubt and is in agreement with modern models of seismicity. Moreover, the cascade-like generation process of earthquakes, with one "main" shock followed by many aftershocks, which have their own aftershocks, may well be described through multifractal analysis, well suited to dealing with such multiplicative processes. The (multi)fractal character of seismicity has so far been analysed using traditional techniques, like the box-counting and correlation function algorithms. This work introduces a new approach to characterising the multifractal patterns of seismicity. The application of wavelet analysis, in particular the wavelet transform modulus maxima, to multifractal analysis was pioneered by Arneodo et al. (1991, 1995) and applied successfully in diverse fields, such as the study of turbulence, DNA sequences and heart rate dynamics. The wavelets act like a microscope, revealing details of the analysed data at different times and scales. We introduce and perform such an analysis on earthquake occurrence times and show its advantages. In particular, we analyse shallow seismicity, characterised by a high aftershock "productivity", as well as intermediate and deep seismic activity, known for its scarcity of aftershocks. We also examine declustered (aftershocks removed) versions of the seismic catalogues. Our preliminary results show some degree of multifractality for the undeclustered, shallow seismicity. On the other hand, at large scales, we detect a monofractal scaling behaviour, most clearly evident in the declustered, shallow seismic activity. Moreover, some of the declustered sequences show long-range dependent (LRD) behaviour, characterised by a Hurst exponent H > 0.5, in contrast with the memory-less, Poissonian model.
We demonstrate that the LRD is a genuine characteristic and is not an effect of the time series probability distribution function. One of the most attractive features of wavelet analysis is its ability to determine a local Hurst exponent. We show that this feature together with the possibility of extending the analysis to spatial patterns may constitute a valuable approach to search for anomalous (precursory?) patterns of seismic activity.
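A simple way to test for the long-range dependence (H > 0.5) discussed above is the aggregated-variance estimate of the Hurst exponent: for an LRD series, the variance of block means decays as m^(2H-2) with block size m. The sketch below applies this estimator (one standard choice among several; the block sizes are our assumption) to Poissonian, memoryless inter-event times, for which H should come out near 0.5.

```python
import numpy as np

def hurst_aggvar(x, block_sizes=(4, 8, 16, 32, 64)):
    """Aggregated-variance Hurst estimate: Var(block means) ~ m**(2H - 2)."""
    x = np.asarray(x, dtype=float)
    log_m, log_v = [], []
    for m in block_sizes:
        nblocks = len(x) // m
        means = x[:nblocks * m].reshape(nblocks, m).mean(axis=1)
        log_m.append(np.log(m))
        log_v.append(np.log(means.var()))
    slope = np.polyfit(log_m, log_v, 1)[0]   # slope = 2H - 2
    return 1.0 + slope / 2.0

# Memoryless (Poissonian) catalogue: independent exponential inter-event times
rng = np.random.default_rng(4)
waits = rng.exponential(scale=1.0, size=50000)
h = hurst_aggvar(waits)
print(round(h, 2))   # near 0.5: no long-range dependence
```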

  9. Detailed seismicity analysis revealing the dynamics of the southern Dead Sea area

    NASA Astrophysics Data System (ADS)

    Braeuer, B.; Asch, G.; Hofstetter, R.; Haberland, Ch.; Jaser, D.; El-Kelani, R.; Weber, M.

    2014-10-01

    Within the framework of the international DESIRE (DEad Sea Integrated REsearch) project, a dense temporary local seismological network was operated in the southern Dead Sea area. During 18 recording months, 648 events were detected. Building on an already published tomography study, the clustering, focal mechanisms, statistics, and distribution of the microseismicity are analysed in relation to the velocity models from the tomography. The determined b value of 0.74 implies a relatively high hazard of large earthquakes compared with the moderate microseismic activity. The distribution of the seismicity indicates an asymmetric basin, with a vertical strike-slip fault forming the eastern boundary of the basin and an inclined western boundary made up of strike-slip and normal faults. Furthermore, significant differences between the areas north and south of the Bokek fault were observed. South of the Bokek fault, the western boundary is inactive, while the entire seismicity occurs on the eastern boundary and below the basin-fill sediments. The largest events occurred here, and their focal mechanisms represent the northwards transform motion of the Arabian plate along the Dead Sea Transform. The vertical extension of the spatial and temporal cluster from February 2007 is interpreted as being related to the locking of the region around the Bokek fault. North of the Bokek fault, similar seismic activity occurs on both boundaries, most notably within the basin-fill sediments, displaying mainly small events with strike-slip mechanisms and normal faulting in an EW direction. We therefore suggest that the Bokek fault forms the border between the single transform fault and the pull-apart basin with two active border faults.

  10. Estimation of the behavior factor of existing RC-MRF buildings

    NASA Astrophysics Data System (ADS)

    Vona, Marco; Mastroberti, Monica

    2018-01-01

    In recent years, several research groups have studied a new generation of analysis methods for the seismic response assessment of existing buildings. Nevertheless, many important developments are still needed in order to define more reliable and effective assessment procedures. Moreover, for existing buildings it should be highlighted that, due to the low knowledge level, linear elastic analysis is often the only analysis method allowed. The codes themselves (such as NTC2008 and EC8) consider linear dynamic analysis with a behavior factor as the reference method for the evaluation of seismic demand. This type of analysis is based on a linear-elastic structural model subjected to a design spectrum, obtained by reducing the elastic spectrum through a behavior factor. The behavior factor (reduction factor or q factor in some codes) is used to reduce the elastic spectrum ordinates, or the forces obtained from a linear analysis, in order to take into account the non-linear structural capacity. Behavior factors should be defined based on the several parameters that influence the seismic non-linear capacity, such as the mechanical characteristics of the materials, the structural system, irregularity, and design procedures. In practical applications, there is still an evident lack of detailed rules and of accurate behavior factor values adequate for existing buildings. In this work, investigations of the seismic capacity of the main existing RC-MRF building types have been carried out. In order to make a correct evaluation of the seismic force demand, actual behavior factor values consistent with a force-based seismic safety assessment procedure have been proposed and compared with the values reported in the Italian seismic code, NTC08.

  11. An Algorithm Framework for Isolating Anomalous Signals in Electromagnetic Data

    NASA Astrophysics Data System (ADS)

    Kappler, K. N.; Schneider, D.; Bleier, T.; MacLean, L. S.

    2016-12-01

    QuakeFinder and its international collaborators have installed and currently maintain an array of 165 three-axis induction magnetometer instrument sites in California, Peru, Taiwan, Greece, Chile and Sumatra. Based on research by Bleier et al. (2009), Fraser-Smith et al. (1990), and Freund (2007), the electromagnetic data from these instruments are being analyzed for pre-earthquake signatures. This analysis comprises both private research by QuakeFinder and work with institutional collaborators (PUCP in Peru, NCU in Taiwan, NOA in Greece, LASP at the University of Colorado, Stanford, UCLA, NASA-ESI, NASA-AMES and USC-CSEP). QuakeFinder has developed an algorithm framework aimed at isolating anomalous signals (pulses) in the time series. Results are presented from an application of this framework to induction-coil magnetometer data. Our data-driven approach starts with sliding windows of various lengths and overlaps applied to uniformly resampled array data. Data variance (a proxy for energy) is calculated in each window, and a short-term average/long-term average (STA/LTA) filter is applied to the variance time series. Pulse identification is done by flagging time intervals in the STA/LTA-filtered time series that exceed a threshold. Flagged time intervals are subsequently fed into a feature extraction program which computes statistical properties of the resampled data. These features are then filtered using a Principal Component Analysis (PCA) based method to cluster similar pulses. We explore the extent to which this approach categorizes pulses with known sources (e.g., cars, lightning, etc.), so that the remaining pulses of unknown origin can be analyzed with respect to their relationship to seismicity. We seek a correlation between these daily pulse counts (with known sources removed) and subsequent (days to weeks) seismic events greater than M5 within a 15 km radius.
Thus we explore functions which map daily pulse-counts to a time series representing the likelihood of a seismic event occurring at some future time. These "pseudo-probabilities" can in turn be represented as Molchan diagrams. The Molchan curve provides an effective cost function for optimization and allows for a rigorous statistical assessment of the validity of pre-earthquake signals in the electromagnetic data.
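    The variance / STA-LTA / threshold pipeline described above can be sketched as follows. This is illustrative only: the non-overlapping windows, window lengths, and threshold are arbitrary stand-ins, not QuakeFinder's settings:

```python
import numpy as np

def flag_pulses(x, win=32, sta_n=5, lta_n=50, threshold=3.0):
    """Flag candidate pulses: windowed variance -> STA/LTA -> threshold.

    x: 1-D array of (uniformly resampled) magnetometer samples.
    Returns indices of variance windows whose STA/LTA ratio exceeds the
    threshold.  Window sizes and the threshold are illustrative choices.
    """
    x = np.asarray(x, dtype=float)
    n = len(x) // win
    # variance (an energy proxy) in consecutive non-overlapping windows
    var = x[: n * win].reshape(n, win).var(axis=1)
    ratio = np.zeros(n)
    for i in range(lta_n, n):
        short_avg = var[i - sta_n:i].mean()   # short-term average of variance
        long_avg = var[i - lta_n:i].mean()    # long-term average of variance
        if long_avg > 0.0:
            ratio[i] = short_avg / long_avg
    return np.where(ratio > threshold)[0]
```

    A burst of amplified samples embedded in stationary noise is flagged shortly after it enters the short-term window, while the long-term average keeps slowly varying background energy from triggering.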

  12. An Application of Reassigned Time-Frequency Representations for Seismic Noise/Signal Decomposition

    NASA Astrophysics Data System (ADS)

    Mousavi, S. M.; Langston, C. A.

    2016-12-01

    Seismic data recorded by surface arrays are often strongly contaminated by unwanted noise. This background noise makes the detection of small-magnitude events difficult. An automatic method for seismic noise/signal decomposition is presented based upon an enhanced time-frequency representation. Synchrosqueezing is a time-frequency reassignment method aimed at sharpening the time-frequency picture. Noise can be distinguished from the signal and suppressed more easily in this reassigned domain. The threshold level is estimated using a generalized cross-validation approach that does not rely on any prior knowledge about the noise level. The efficiency of thresholding has been improved by adding a pre-processing step based on higher-order statistics and a post-processing step based on adaptive hard-thresholding. In doing so, both the accuracy and the speed of the denoising have been improved compared to our previous algorithms (Mousavi and Langston, 2016a, 2016b; Mousavi et al., 2016). The proposed algorithm can either kill the noise (white or colored) and keep the signal, or kill the signal and keep the noise. Hence, it can be used either in normal denoising applications or in ambient noise studies. Application of the proposed method to synthetic and real seismic data shows the effectiveness of the method for the denoising/de-signaling of local microseismic and ocean-bottom seismic data. References: Mousavi, S.M., C. A. Langston., and S. P. Horton (2016), Automatic Microseismic Denoising and Onset Detection Using the Synchrosqueezed-Continuous Wavelet Transform. Geophysics. 81, V341-V355, doi: 10.1190/GEO2015-0598.1. Mousavi, S.M., and C. A. Langston (2016a), Hybrid Seismic Denoising Using Higher-Order Statistics and Improved Wavelet Block Thresholding. Bull. Seismol. Soc. Am., 106, doi: 10.1785/0120150345. Mousavi, S.M., and C.A. 
Langston (2016b), Adaptive noise estimation and suppression for improving microseismic event detection, Journal of Applied Geophysics., doi: http://dx.doi.org/10.1016/j.jappgeo.2016.06.008.
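    The method above thresholds coefficients in the synchrosqueezed time-frequency domain, with the threshold chosen by generalized cross-validation. As a deliberately crude stand-in that conveys the basic idea, hard thresholding of Fourier coefficients with a user-supplied threshold:

```python
import numpy as np

def hard_threshold_denoise(x, threshold):
    """Hard thresholding in the Fourier domain: coefficients below the
    threshold are zeroed, the rest are kept untouched.

    A simplified stand-in for transform-domain thresholding; the paper's
    synchrosqueezed transform and GCV threshold selection are not used here.
    """
    X = np.fft.rfft(x)
    X[np.abs(X) < threshold] = 0.0
    return np.fft.irfft(X, n=len(x))
```

    For a narrowband signal buried in broadband noise, nearly all noise coefficients fall below the threshold while the signal's few large coefficients survive, so the reconstruction is much closer to the clean signal than the noisy input.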

  13. Design and analysis of fractional order seismic transducer for displacement and acceleration measurements

    NASA Astrophysics Data System (ADS)

    Veeraian, Parthasarathi; Gandhi, Uma; Mangalanathan, Umapathy

    2018-04-01

    Seismic transducers are widely used for the measurement of displacement, velocity, and acceleration. This paper presents the design of seismic transducers in the fractional domain for the measurement of displacement and acceleration. The fractional order transfer functions for the seismic displacement and acceleration transducers are derived using the Grünwald-Letnikov derivative. Frequency response analysis of the fractional order seismic displacement transducer (FOSDT) and the fractional order seismic acceleration transducer (FOSAT) is carried out for different damping ratios and fractional orders, and the maximum dynamic measurement range is identified. The results demonstrate that the fractional order seismic transducer has an increased dynamic measurement range and less phase distortion compared to the conventional seismic transducer, even with a lower damping ratio. The time responses of the FOSDT and FOSAT are derived analytically in terms of the Mittag-Leffler function, and the effect of fractional behavior in the time domain is evaluated from the impulse and step responses. The fractional order system is found to have significantly reduced overshoot compared to the conventional transducer. The fractional order seismic transducer design proposed in this paper is illustrated with a design example for the FOSDT and FOSAT. Finally, an electrical equivalent of the FOSDT and FOSAT is considered, and its frequency response is found to be in close agreement with that of the proposed fractional order seismic transducer.
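    The Grünwald-Letnikov derivative underlying the fractional transfer functions can be evaluated numerically straight from its defining series. A minimal full-memory sketch (the function name is an assumption; the paper works analytically, not numerically):

```python
import numpy as np

def gl_fractional_derivative(f, alpha, h):
    """Grunwald-Letnikov fractional derivative of order alpha of a sampled
    signal f (1-D array, sample spacing h), via the defining sum
        D^alpha f(t_i) ~ h^(-alpha) * sum_k w_k * f(t_i - k*h),
    where w_k = (-1)^k * C(alpha, k) are generalized binomial weights.
    """
    f = np.asarray(f, dtype=float)
    n = len(f)
    # recursive generalized binomial weights: w_0 = 1, w_k = w_{k-1}(1 - (alpha+1)/k)
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    out = np.empty(n)
    for i in range(n):
        out[i] = np.dot(w[: i + 1], f[i::-1]) / h ** alpha
    return out
```

    Sanity checks: alpha = 1 reduces to the backward difference (f[i] - f[i-1]) / h, and alpha = 0 returns the signal unchanged; intermediate orders interpolate between the two.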

  14. Modelling framework developed for managing and forecasting the El Hierro 2011-2014 unrest processes based on the analysis of the seismicity and deformation data rate.

    NASA Astrophysics Data System (ADS)

    Garcia, Alicia; Fernandez-Ros, Alberto; Berrocoso, Manuel; Marrero, Jose Manuel; Prates, Gonçalo; De la Cruz-Reyna, Servando; Ortiz, Ramon

    2014-05-01

    In July 2011 at El Hierro (Canary Islands, Spain), volcanic unrest was detected, with significant deformations followed by increased seismicity. A submarine eruption started on 10 October 2011 and ceased on 5 March 2012, after the volcanic tremor signals persistently weakened through February 2012. However, the seismic activity did not end with the eruption, as several other seismic crises have followed since. The seismic episodes presented a characteristic pattern: over a few days the number and magnitude of seismic events increased persistently, culminating in events severe enough to be felt all over the island. In all cases the seismic activity was preceded by significant deformations measured on the island's surface that continued during the whole episode. Analysis of the available GNSS-GPS and seismic data suggests that several magma injection processes occurred at depth from the beginning of the unrest. A model combining the geometry of the magma injection process and the variations in the seismic energy released has allowed successful forecasting of the new-vent opening. The model presented here places special emphasis on phenomena associated with moderate eruptions, as well as on volcano-tectonic earthquakes and landslides, which in some cases, as at El Hierro, may be more destructive than an eruption itself.

  15. Quantitative assessments of mantle flow models against seismic observations: Influence of uncertainties in mineralogical parameters

    NASA Astrophysics Data System (ADS)

    Schuberth, Bernhard S. A.

    2017-04-01

    One of the major challenges in studies of Earth's deep mantle is to bridge the gap between geophysical hypotheses and observations. The biggest dataset available to investigate the nature of mantle flow is recordings of seismic waveforms. On the other hand, numerical models of mantle convection can nowadays be simulated on a routine basis for Earth-like parameters, and modern thermodynamic mineralogical models allow us to translate the predicted temperature field to seismic structures. The great benefit of the mineralogical models is that they provide the full non-linear relation between temperature and seismic velocities and thus ensure a consistent conversion in terms of magnitudes. This opens the possibility of quantitative assessments of the theoretical predictions. The often-adopted comparison between geodynamic and seismic models is unsuitable in this respect owing to the effects of damping, limited resolving power and non-uniqueness inherent to tomographic inversions. The most relevant issue, however, is related to wavefield effects that reduce the magnitude of seismic signals (e.g., traveltimes of waves), a phenomenon called wavefront healing. Over the past couple of years, we have developed an approach that takes the next step towards a quantitative assessment of geodynamic models and that enables us to test the underlying geophysical hypotheses directly against seismic observations. It is based solely on forward modelling and warrants a physically correct treatment of the seismic wave equation without theoretical approximations. Fully synthetic 3-D seismic wavefields are computed using a spectral element method for 3-D seismic structures derived from mantle flow models. This way, synthetic seismograms are generated independently of any seismic observations. Furthermore, through the wavefield simulations, it is possible to relate the magnitude of lateral temperature variations in the dynamic flow simulations directly to body-wave traveltime residuals. 
The synthetic traveltime data can then be compared - on statistical grounds - to the traveltime variations observed on Earth. Here, we now investigate the influence of uncertainties in the various input parameters that enter our modelling. This is especially important for the material properties at high pressure and high temperature entering the mineralogical models. In particular, this concerns uncertainties that arise from relating measurements in the laboratory to Earth properties on a global scale. As one example, we will address the question of the influence of anelasticity on the variance of global synthetic traveltime residuals. Owing to the differences in seismic frequency content between laboratory measurements (MHz to GHz) and the Earth (mHz to Hz), the seismic velocities given in the mineralogical models need to be adjusted; that is, corrected for dispersion due to anelastic effects. This correction will increase the sensitivity of the seismic velocities to temperature variations. The magnitude of this increase depends on absolute temperature, frequency, the frequency dependence of attenuation, and the activation enthalpy of the dissipative process. The latter two especially are poorly known for mantle minerals, and our results indicate that variations in activation enthalpy potentially produce the largest differences in temperature sensitivity with respect to the purely elastic case. We will present new wave propagation simulations and corresponding statistical analyses of traveltime measurements for different synthetic seismic models spanning the possible range of anelastic velocity conversions (while being based on the same mantle circulation model).
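    The lab-to-Earth frequency adjustment discussed above can be made concrete with the standard first-order dispersion relation for frequency-independent Q, commonly attributed to Kanamori and Anderson (1977). This is shown for illustration only; it is not necessarily the specific conversion used in the study, and a frequency-dependent Q(f) ∝ f^α would modify it:

```python
import math

def anelastic_velocity_correction(v_ref, q, f, f_ref):
    """First-order anelastic dispersion for frequency-independent Q:
        v(f) = v_ref * (1 + ln(f / f_ref) / (pi * Q)).
    Moving from laboratory frequencies (MHz-GHz) down to seismic
    frequencies (mHz-Hz) lowers the velocity; the correction grows as Q
    decreases, i.e. as attenuation increases.
    """
    return v_ref * (1.0 + math.log(f / f_ref) / (math.pi * q))
```

    For example, correcting a lab-measured velocity from 1 MHz down to 1 Hz with Q = 200 reduces it by roughly 2 per cent, and the size of that reduction is itself temperature dependent, which is the source of the increased temperature sensitivity noted above.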

  16. Post-seismic velocity changes following the 2010 Mw 7.1 Darfield earthquake, New Zealand, revealed by ambient seismic field analysis

    NASA Astrophysics Data System (ADS)

    Heckels, R. E. G.; Savage, M. K.; Townend, J.

    2018-05-01

    Quantifying seismic velocity changes following large earthquakes can provide insights into fault healing and reloading processes. This study presents temporal velocity changes detected following the 2010 September Mw 7.1 Darfield event in Canterbury, New Zealand. We use continuous waveform data from several temporary seismic networks lying on and surrounding the Greendale Fault, with a maximum interstation distance of 156 km. Nine-component, day-long Green's functions were computed for frequencies between 0.1 and 1.0 Hz for continuous seismic records from immediately after the 2010 September 04 earthquake until 2011 January 10. Using the moving-window cross-spectral method, seismic velocity changes were calculated. Over the study period, an increase in seismic velocity of 0.14 ± 0.04 per cent was determined near the Greendale Fault, providing a new constraint on post-seismic relaxation rates in the region. A depth analysis further showed that velocity changes were confined to the uppermost 5 km of the subsurface. We attribute the observed changes to post-seismic relaxation via crack healing of the Greendale Fault and throughout the surrounding region.
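    The study measures delay times with the moving-window cross-spectral method. A simpler alternative that illustrates what a dv/v measurement is, the trace-stretching method, grid-searches the stretch factor ε (with dv/v = -ε for a homogeneous velocity change) that best aligns a current correlation function with a reference; all names below are illustrative:

```python
import numpy as np

def dvv_stretching(reference, current, t, epsilons):
    """Relative velocity change by trace stretching (an alternative to the
    moving-window cross-spectral method used in the study above).

    A homogeneous velocity change dv/v = -epsilon stretches the waveform:
    current(t) ~ reference(t * (1 + epsilon)).  We grid-search epsilon for
    the best correlation coefficient.
    """
    best_eps, best_cc = 0.0, -np.inf
    for eps in epsilons:
        stretched = np.interp(t * (1.0 + eps), t, reference)
        cc = np.corrcoef(stretched, current)[0, 1]
        if cc > best_cc:
            best_eps, best_cc = eps, cc
    return best_eps, best_cc
```

    A velocity increase of 0.14 per cent, as reported above, would correspond to a best-fitting ε of about -0.0014 between correlation functions from the two epochs.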

  17. Experimental Concepts for Testing Seismic Hazard Models

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.; Jordan, T. H.

    2015-12-01

    Seismic hazard analysis is the primary interface through which useful information about earthquake rupture and wave propagation is delivered to society. To account for the randomness (aleatory variability) and limited knowledge (epistemic uncertainty) of these natural processes, seismologists must formulate and test hazard models using the concepts of probability. In this presentation, we will address the scientific objections that have been raised over the years against probabilistic seismic hazard analysis (PSHA). Owing to the paucity of observations, we must rely on expert opinion to quantify the epistemic uncertainties of PSHA models (e.g., in the weighting of individual models from logic-tree ensembles of plausible models). The main theoretical issue is a frequentist critique: subjectivity is immeasurable; ergo, PSHA models cannot be objectively tested against data; ergo, they are fundamentally unscientific. We have argued (PNAS, 111, 11973-11978) that the Bayesian subjectivity required for casting epistemic uncertainties can be bridged with the frequentist objectivity needed for pure significance testing through "experimental concepts." An experimental concept specifies collections of data, observed and not yet observed, that are judged to be exchangeable (i.e., with a joint distribution independent of the data ordering) when conditioned on a set of explanatory variables. We illustrate, through concrete examples, experimental concepts useful in the testing of PSHA models for ontological errors in the presence of aleatory variability and epistemic uncertainty. In particular, we describe experimental concepts that lead to exchangeable binary sequences that are statistically independent but not identically distributed, showing how the Bayesian concept of exchangeability generalizes the frequentist concept of experimental repeatability. We also address the issue of testing PSHA models using spatially correlated data.

  18. Active seismic experiment

    NASA Technical Reports Server (NTRS)

    Kovach, R. L.; Watkins, J. S.; Talwani, P.

    1972-01-01

    The Apollo 16 active seismic experiment (ASE) was designed to generate and monitor seismic waves for the study of the lunar near-surface structure. Several seismic energy sources are used: an astronaut-activated thumper device, a mortar package that contains rocket-launched grenades, and the impulse produced by the lunar module ascent. Analysis of some seismic signals recorded by the ASE has provided data concerning the near-surface structure at the Descartes landing site. Two compressional seismic velocities have so far been recognized in the seismic data. The deployment of the ASE is described, and the significant results obtained are discussed.

  19. SHAKING TABLE TEST AND EFFECTIVE STRESS ANALYSIS ON SEISMIC PERFORMANCE WITH SEISMIC ISOLATION RUBBER TO THE INTERMEDIATE PART OF PILE FOUNDATION IN LIQUEFACTION

    NASA Astrophysics Data System (ADS)

    Uno, Kunihiko; Otsuka, Hisanori; Mitou, Masaaki

    The pile foundation is heavily damaged at the boundary between ground types, liquefied and non-liquefied ground, during an earthquake, and there is a possibility of collapse of the piles. In this study, we conduct a shaking table test and an effective stress analysis of the influence of soil liquefaction and of the seismic inertial force exerted on the pile foundation. When the intermediate part of the pile, which lies at this boundary, is subjected to sectional force, the force on this part exceeds that at the pile head in certain instances. Further, we develop a seismic resistance method for pile foundations in liquefiable ground using seismic isolation rubber, and it is shown that the intermediate-part seismic isolation system is very effective.

  20. Nowcasting Earthquakes and Tsunamis

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Turcotte, D. L.

    2017-12-01

    The term "nowcasting" refers to the estimation of the current, uncertain state of a dynamical system, whereas "forecasting" is a calculation of probabilities of future state(s). Nowcasting is a term that originated in economics and finance, referring to the process of determining the uncertain state of the economy or of market indicators such as GDP at the current time by indirect means. We have applied this idea to seismically active regions, where the goal is to determine the current state of a system of faults and its current level of progress through the earthquake cycle (http://onlinelibrary.wiley.com/doi/10.1002/2016EA000185/full). Advantages of our nowcasting method over forecasting models include: 1) nowcasting is simply data analysis and does not involve a model having parameters that must be fit to data; 2) we use only earthquake catalog data, which generally have known errors and characteristics; and 3) we use area-based rather than fault-based analysis, meaning that the method works equally well on land and in subduction zones. To use the nowcast method to estimate how far the fault system has progressed through the "cycle" of large recurring earthquakes, we use the global catalog of earthquakes, with "small" earthquakes determining the level of hazard from "large" earthquakes in the region. We select a "small" region in which the nowcast is to be made and compute the statistics of a much larger region around it. The statistics of the large region are then applied to the small region. For an application, we can define a small region around major global cities, for example a "small" circle of radius 150 km and a depth of 100 km, together with a "large" earthquake magnitude, for example M6.0. The region of influence of such earthquakes is roughly 150 km radius x 100 km depth, which is why these values were selected. 
We can then rank the world's major cities in terms of their relative seismic risk. As another application, we can define large rectangular regions along subduction zones, at shallow depths, to compute the progress of each fault zone towards the next major tsunami-genic earthquake. We can then rank the relative progress of the major subduction zones of the world through their cycles of large earthquakes, using this method to determine which zones are most at risk.
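    The nowcast described above reduces to an empirical cumulative distribution: the count of small earthquakes since the last large one in the small region is ranked against the counts observed between successive large earthquakes in the surrounding large region. A minimal sketch (function and variable names are assumptions):

```python
def nowcast_eps(counts_between_large, current_count):
    """Earthquake potential score (EPS) in the spirit of natural-time
    nowcasting.

    counts_between_large: numbers of "small" events observed between
    successive "large" events in the surrounding large region (completed
    cycles only).
    current_count: "small" events since the last "large" event in the small
    region of interest.
    Returns the fraction (0-1) of past inter-event counts that do not
    exceed current_count, i.e. the estimated progress through the cycle.
    """
    if not counts_between_large:
        raise ValueError("need at least one completed large-earthquake cycle")
    hits = sum(1 for c in counts_between_large if c <= current_count)
    return hits / len(counts_between_large)
```

    A score near 1 means the region has already accumulated more small-event activity than almost all past cycles, i.e. it is late in its cycle relative to history; this is a state estimate, not a probability of imminent rupture.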

  1. Magma migration at the onset of the 2012-13 Tolbachik eruption revealed by Seismic Amplitude Ratio Analysis

    NASA Astrophysics Data System (ADS)

    Caudron, Corentin; Taisne, Benoit; Kugaenko, Yulia; Saltykov, Vadim

    2015-12-01

    In contrast to the 1975-76 Tolbachik eruption, the 2012-13 Tolbachik eruption was not preceded by any striking change in seismic activity. By processing the Klyuchevskoy volcano group seismic data with the Seismic Amplitude Ratio Analysis (SARA) method, we gain insights into the dynamics of magma movement prior to this important eruption. A clear seismic migration within the seismic swarm started 20 hours before the reported eruption onset (05:15 UTC, 26 November 2012). This migration proceeded in different phases and ended when eruptive tremor, corresponding to lava flows, was recorded (at 11:00 UTC, 27 November 2012). In order to get a first-order approximation of the magma location, we compare the calculated seismic intensity ratios with theoretical ones. As expected, the observations suggest that the seismicity migrated toward the eruption location. However, we explain the observed pre-eruptive ratios by a vertical migration under the northern slope of Plosky Tolbachik volcano followed by a lateral migration toward the eruptive vents. Another migration is also captured by this technique; it coincides with a seismic swarm that started 16-20 km to the south of Plosky Tolbachik at 20:31 UTC on 28 November and lasted for more than 2 days. This seismic swarm is very similar to the seismicity preceding the 1975-76 Tolbachik eruption and can be considered a possible aborted eruption.

  2. Earthquake hazard and risk assessment based on Unified Scaling Law for Earthquakes: Greater Caucasus and Crimea

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir G.; Nekrasova, Anastasia K.

    2018-05-01

    We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing regional seismic hazard maps based on morphostructural analysis, pattern recognition, and the Unified Scaling Law for Earthquakes (USLE), which generalizes the Gutenberg-Richter relationship by making use of the naturally fractal distribution of earthquake sources of different sizes in a seismic region. The USLE is the empirical relationship log10 N(M, L) = A + B·(5 - M) + C·log10 L, where N(M, L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L. We use the parameters A, B, and C of the USLE to estimate, first, the expected maximum magnitude in a given time interval at the seismically prone nodes of the morphostructural scheme of the region under study, and then map the corresponding expected ground shaking parameters (e.g., peak ground acceleration, PGA, or macro-seismic intensity). After a rigorous verification against the available seismic evidence from the past (usually, the observed instrumental PGA or the historically reported macro-seismic intensity), such a seismic hazard map is used to generate maps of specific earthquake risks for population, cities, and infrastructures (e.g., those based on a census of population or a buildings inventory). The methodology of seismic hazard and risk assessment is illustrated by application to the territory of the Greater Caucasus and Crimea.
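    The USLE relationship quoted above is directly computable once the regional coefficients are known. A minimal sketch (the function name is an assumption, and the coefficients in the test are illustrative, not values for any real region):

```python
import math

def usle_annual_number(A, B, C, magnitude, L):
    """Expected annual number N(M, L) of magnitude-M earthquakes within a
    seismically prone area of linear dimension L, under the Unified Scaling
    Law for Earthquakes:
        log10 N(M, L) = A + B * (5 - M) + C * log10(L)
    A sets the overall rate, B plays the role of the Gutenberg-Richter
    b value, and C captures the fractal scaling with area size.
    """
    return 10.0 ** (A + B * (5.0 - magnitude) + C * math.log10(L))
```

    With B = 1, each unit increase in magnitude divides the expected annual count by ten, exactly as in the classical Gutenberg-Richter relation; C rescales that count with the linear dimension of the area considered.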

  3. Seismpol: a Visual-Basic computer program for interactive and automatic earthquake waveform analysis

    NASA Astrophysics Data System (ADS)

    Patanè, Domenico; Ferrari, Ferruccio

    1997-11-01

    A Microsoft Visual-Basic computer program for waveform analysis of seismic signals is presented. The program combines interactive and automatic processing of digital signals using data recorded by three-component seismic stations. The analysis procedure can be used either for interactive earthquake analysis or for automatic on-line processing of seismic recordings. The algorithm works in the time domain using the Covariance Matrix Decomposition (CMD) method, so that polarization characteristics may be computed continuously in real time and seismic phases can be identified and discriminated. Visual inspection of the particle motion in orthogonal planes of projection (hodograms) reduces the danger of misinterpretation derived from the application of the polarization filter. The choice of time window and frequency intervals improves the quality of the extracted polarization information. In fact, the program uses a band-pass Butterworth filter to process the signals in the frequency domain by decomposing a selected signal window into a series of narrow frequency bands. Significant results, supported by well-defined polarizations and source azimuth estimates for P and S phases, are also obtained for short-period seismic events (local microearthquakes).
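    The covariance matrix decomposition at the heart of the program can be sketched in a few lines: the eigenstructure of the 3x3 covariance of a three-component window yields rectilinearity, azimuth, and incidence angle. This is a generic illustration of the CMD idea, not the program's Visual-Basic implementation:

```python
import numpy as np

def polarization_cmd(z, north, east):
    """Polarization attributes via Covariance Matrix Decomposition (CMD).

    z, north, east: equal-length sample windows of the three components.
    Returns (rectilinearity, azimuth_deg, incidence_deg) from the
    eigen-decomposition of the 3x3 covariance matrix.
    """
    cov = np.cov(np.vstack([z, north, east]))
    vals, vecs = np.linalg.eigh(cov)           # eigenvalues in ascending order
    l1, l2 = vals[2], vals[1]                  # the two largest eigenvalues
    rect = 1.0 - l2 / l1 if l1 > 0 else 0.0    # 1 for purely rectilinear motion
    u = vecs[:, 2]                             # principal direction (z, n, e)
    azimuth = np.degrees(np.arctan2(u[2], u[1])) % 360.0   # from north, clockwise
    incidence = np.degrees(np.arccos(min(1.0, abs(u[0]))))  # from vertical
    return rect, azimuth, incidence
```

    A rectilinear P arrival gives rectilinearity near 1 with the principal eigenvector pointing along the ray; elliptical or scattered motion lowers the rectilinearity, which is the basis of the polarization filtering and phase discrimination described above. (The sign of the eigenvector is arbitrary, so the azimuth carries a 180-degree ambiguity unless resolved with the vertical component.)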

  4. What defines an Expert? - Uncertainty in the interpretation of seismic data

    NASA Astrophysics Data System (ADS)

    Bond, C. E.

    2008-12-01

    Studies focusing on the elicitation of information from experts are concentrated primarily in economics and world markets, medical practice, and expert witness testimonies. Expert elicitation theory has been applied in the natural sciences, most notably in the prediction of fluid flow in hydrological studies. In the geological sciences, expert elicitation has been limited to theoretical analysis, with studies focusing on the elicitation element, gaining expert opinion rather than necessarily understanding the basis behind the expert view. In these cases experts are defined in a traditional sense, based for example on standing in the field, number of years of experience, number of peer-reviewed publications, or the expert's position in a company hierarchy or academia. Here, traditional indicators of expertise have been compared for their significance for effective seismic interpretation. Polytomous regression analysis has been used to assess the relative significance of length and type of experience on the outcome of a seismic interpretation exercise. Following the initial analysis, the techniques used by participants to interpret the seismic image were added as additional variables to the analysis. Specific technical skills and techniques were found to be more important for the effective geological interpretation of seismic data than the traditional indicators of expertise. The results of a seismic interpretation exercise, the techniques used to interpret the seismic data, and the participants' prior experience have been combined and analysed to answer the question: who is, and what defines, an expert?

  5. Seismic risk assessment and application in the central United States

    USGS Publications Warehouse

    Wang, Z.

    2011-01-01

    Seismic risk is a somewhat subjective, but important, concept in earthquake engineering and other related decision-making. Another important concept that is closely related to seismic risk is seismic hazard. Although seismic hazard and seismic risk have often been used interchangeably, they are fundamentally different: seismic hazard describes the natural phenomenon or physical property of an earthquake, whereas seismic risk describes the probability of loss or damage that could be caused by a seismic hazard. The distinction between seismic hazard and seismic risk is of practical significance because measures for seismic hazard mitigation may differ from those for seismic risk reduction. Seismic risk assessment is a complicated process and starts with seismic hazard assessment. Although probabilistic seismic hazard analysis (PSHA) is the most widely used method for seismic hazard assessment, recent studies have found that PSHA is not scientifically valid. Use of PSHA will lead to (1) artifact estimates of seismic risk, (2) misleading use of the annual probability of exceedance (i.e., the probability of exceedance in one year) as a frequency (per year), and (3) numerical creation of extremely high ground motion. An alternative approach, which is similar to those used for flood and wind hazard assessments, has been proposed. © 2011 ASCE.
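    The second point above, conflating an annual frequency with an annual probability, can be made concrete with the Poisson relation between the two; they agree only when the rate is small:

```python
import math

def exceedance_probability(annual_rate, t_years=1.0):
    """Poisson probability of at least one exceedance in t_years, given an
    annual rate (frequency) of exceedance:
        p = 1 - exp(-rate * t)  ~=  rate * t   only when rate * t << 1.
    For rates approaching or exceeding 1 per year, probability and
    frequency diverge: the probability is bounded by 1, the rate is not.
    """
    return 1.0 - math.exp(-annual_rate * t_years)
```

    For a rate of 0.002 per year the one-year probability is 0.001998, so treating the two as interchangeable is harmless; for a rate of 2 per year the one-year probability is about 0.86, nowhere near 2.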

  6. Rapid Non-Gaussian Uncertainty Quantification of Seismic Velocity Models and Images

    NASA Astrophysics Data System (ADS)

    Ely, G.; Malcolm, A. E.; Poliannikov, O. V.

    2017-12-01

    Conventional seismic imaging typically provides a single estimate of the subsurface without any error bounds. Noise in the observed raw traces as well as the uncertainty of the velocity model directly impact the uncertainty of the final seismic image and its resulting interpretation. We present a Bayesian inference framework to quantify uncertainty in both the velocity model and seismic images, given noise statistics of the observed data. To estimate velocity model uncertainty, we combine the field expansion method, a fast frequency-domain wave equation solver, with the adaptive Metropolis-Hastings algorithm. The speed of the field expansion method and its reduced parameterization allow us to perform the tens or hundreds of thousands of forward solves needed for non-parametric posterior estimation. We then migrate the observed data with the distribution of velocity models to generate uncertainty estimates of the resulting subsurface image. This procedure allows us to create both qualitative descriptions of seismic image uncertainty and error bounds on quantities of interest, such as the dip angle of a subducting slab or the thickness of a stratigraphic layer.
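    The sampling step can be sketched as a plain random-walk Metropolis loop; the proposal adaptation of the actual adaptive Metropolis-Hastings algorithm is omitted, and log_post stands in for any log-posterior over velocity-model parameters (here a toy Gaussian):

```python
import numpy as np

def metropolis_hastings(log_post, x0, n_samples, step, rng):
    """Random-walk Metropolis sampler: the basic building block of the
    posterior estimation described above (proposal adaptation omitted).

    log_post: function returning the log-posterior density at a parameter
    vector; x0: starting point; step: proposal standard deviation.
    """
    x = np.asarray(x0, dtype=float)
    lp = log_post(x)
    samples = []
    for _ in range(n_samples):
        prop = x + step * rng.normal(size=x.shape)   # symmetric proposal
        lp_prop = log_post(prop)
        # accept with probability min(1, posterior ratio)
        if np.log(rng.uniform()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples.append(x.copy())
    return np.array(samples)
```

    In the imaging context each log_post evaluation requires a forward wave-equation solve, which is why the speed of the field expansion method matters: non-parametric posteriors need on the order of 10^4 to 10^5 such evaluations.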

  7. Effects of magnitude, depth, and time on cellular seismology forecasts

    NASA Astrophysics Data System (ADS)

    Fisher, Steven Wolf

    This study finds that, in most cases analyzed to date, past seismicity tends to delineate zones where future earthquakes are likely to occur. Network seismicity catalogs for the New Madrid Seismic Zone (NMSZ), Australia (AUS), California (CA), and Alaska (AK) are analyzed using modified versions of the Cellular Seismology (CS) method of Kafka (2002, 2007). The percentage of later-occurring earthquakes located near earlier-occurring earthquakes typically exceeds the expected percentage for randomly distributed later-occurring earthquakes, and the specific percentage is influenced by several variables, including magnitude, depth, time, and tectonic setting. At 33% map-area coverage, hit percentages are typically 85-95% in the NMSZ, 50-60% in AUS, 75-85% in CA, and 75-85% in AK. Statistical significance testing is performed on trials analyzing the same variables so that the overall regions can be compared, although some tests are inconclusive due to small earthquake sample sizes. These results offer useful insights into the capabilities and limits of CS studies, which can provide guidance for improving the seismicity-based components of seismic hazard assessments.
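    The hit percentage at the core of the CS method is simply the fraction of later epicentres that fall near at least one earlier epicentre. A minimal sketch (the radius-to-map-area-fraction calibration used in CS studies is a separate step not shown, and the function names are assumptions):

```python
import math

def cs_hit_percent(earlier, later, radius_km):
    """Cellular-seismology-style hit percentage: the share of later
    epicentres within radius_km of at least one earlier epicentre.

    earlier, later: lists of (lat, lon) pairs in degrees.
    """
    def haversine_km(p, q):
        # great-circle distance on a sphere of radius 6371 km
        lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
        a = (math.sin((lat2 - lat1) / 2) ** 2
             + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
        return 2.0 * 6371.0 * math.asin(math.sqrt(a))
    hits = sum(1 for q in later
               if any(haversine_km(p, q) <= radius_km for p in earlier))
    return 100.0 * hits / len(later)
```

    The significance testing mentioned above then compares this percentage against the value expected when the same number of later events is scattered uniformly over the study region.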

  8. High lateral resolution exploration using surface waves from noise records

    NASA Astrophysics Data System (ADS)

    Chávez-García, Francisco José; Yokoi, Toshiaki

    2016-04-01

    Determination of the shear-wave velocity structure at shallow depths is a constant necessity in engineering and environmental projects. Given the sensitivity of Rayleigh waves to shear-wave velocity, subsoil structure exploration using surface waves is frequently used. Methods such as the spectral analysis of surface waves (SASW) or multi-channel analysis of surface waves (MASW) determine phase velocity dispersion from surface waves generated by an active source recorded on a line of geophones. In MASW, it is important that the receiver array be as long as possible to increase the precision at low frequencies. However, this implies that possible lateral variations are discarded. Hayashi and Suzuki (2004) proposed a different way of stacking shot gathers to increase lateral resolution. They combined strategies used in MASW with the common mid-point (CMP) summation commonly used in reflection seismology. In their common mid-point with cross-correlation method (CMPCC), they cross-correlate traces sharing CMP locations before determining phase velocity dispersion. Another recent approach to subsoil structure exploration is based on seismic interferometry. It has been shown that cross-correlation of a diffuse field, such as seismic noise, allows the estimation of the Green's function between two receivers. Thus, a virtual-source seismic section may be constructed from the cross-correlation of seismic noise records obtained on a line of receivers. In this paper, we use the seismic interferometry method to process seismic noise records obtained on seismic refraction lines of 24 geophones, and analyse the results using CMPCC to increase the lateral resolution of the results. Cross-correlation of the noise records allows reconstructing seismic sections with virtual sources at each receiver location. The Rayleigh wave component of the Green's functions is obtained with a high signal-to-noise ratio.
Using CMPCC analysis of the virtual-source seismic lines, we are able to identify lateral variations of phase velocity inside the seismic line, and increase the lateral resolution compared with results of conventional analysis.
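    The interferometric step, cross-correlating two noise records to recover the inter-receiver travel time, can be illustrated with synthetic data. The 7-sample delay and the white-noise model below are hypothetical; real processing also involves spectral whitening and stacking over long noise records.

```python
import random

def xcorr(a, b, max_lag):
    """C(l) = sum_t a[t] * b[t + l] for lags l in [-max_lag, max_lag]."""
    n = len(a)
    return [sum(a[t] * b[t + lag] for t in range(n) if 0 <= t + lag < n)
            for lag in range(-max_lag, max_lag + 1)]

random.seed(1)
n, delay = 2000, 7                      # delay stands in for inter-receiver travel time
noise = [random.gauss(0.0, 1.0) for _ in range(n)]
rec_a = noise
rec_b = [0.0] * delay + noise[:n - delay]   # delayed copy of the same noise field

cc = xcorr(rec_a, rec_b, max_lag=20)
lags = list(range(-20, 21))
best_lag = lags[cc.index(max(cc))]
print(best_lag)
```

    The correlation peak sits at the 7-sample delay, i.e. at the arrival time of the virtual-source (Green's function) response between the two receivers.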

  9. A reliable simultaneous representation of seismic hazard and of ground shaking recurrence

    NASA Astrophysics Data System (ADS)

    Peresan, A.; Panza, G. F.; Magrin, A.; Vaccari, F.

    2015-12-01

    Different earthquake hazard maps may be appropriate for different purposes, such as emergency management, insurance and engineering design. Accounting for the lower occurrence rate of larger sporadic earthquakes may allow the formulation of cost-effective policies in some specific applications, provided that statistically sound recurrence estimates are used, which is typically not the case with PSHA (Probabilistic Seismic Hazard Assessment). We illustrate the procedure to associate the expected ground motions from Neo-deterministic Seismic Hazard Assessment (NDSHA) with an estimate of their recurrence. Neo-deterministic refers to a scenario-based approach, which allows for the construction of a broad range of earthquake scenarios via full waveform modeling. From the synthetic seismograms, estimates of peak ground acceleration, velocity and displacement, or any other parameter relevant to seismic engineering, can be extracted. NDSHA, in its standard form, defines the hazard computed from a wide set of scenario earthquakes (including the largest deterministically or historically defined credible earthquake, MCE) and does not supply the frequency of occurrence of the expected ground shaking. A recent enhanced variant of NDSHA that reliably accounts for recurrence has been developed and is applied here to the Italian territory. The characterization of the frequency-magnitude relation can be performed by any statistically sound method supported by data (e.g. a multi-scale seismicity model), so that a recurrence estimate is associated with each of the pertinent sources. In this way a standard NDSHA map of ground shaking is obtained simultaneously with the map of the corresponding recurrences. The introduction of recurrence estimates in NDSHA naturally allows for the generation of ground shaking maps at specified return periods. This permits a straightforward comparison between NDSHA and PSHA maps.

  10. MUSTANG: A Community-Facing Web Service to Improve Seismic Data Quality Awareness Through Metrics

    NASA Astrophysics Data System (ADS)

    Templeton, M. E.; Ahern, T. K.; Casey, R. E.; Sharer, G.; Weertman, B.; Ashmore, S.

    2014-12-01

    IRIS DMC is engaged in a new effort to provide broad and deep visibility into the quality of data and metadata found in its terabyte-scale geophysical data archive. Taking advantage of large and fast disk capacity, modern advances in open database technologies, and nimble provisioning of virtual machine resources, we are creating an openly accessible treasure trove of data measurements for scientists and the general public to utilize in providing new insights into the quality of this data. We have branded this statistical gathering system MUSTANG, and have constructed it as a component of the web services suite that IRIS DMC offers. MUSTANG measures over forty data metrics addressing issues with archive status, data statistics and continuity, signal anomalies, noise analysis, metadata checks, and station state of health. These metrics could potentially be used both by network operators to diagnose station problems and by data users to sort suitable data from unreliable or unusable data. Our poster details what MUSTANG is, how users can access it, what measurements they can find, and how MUSTANG fits into the IRIS DMC's data access ecosystem. Progress in data processing, approaches to data visualization, and case studies of MUSTANG's use for quality assurance will be presented. We want to illustrate what is possible with data quality assurance, the need for data quality assurance, and how the seismic community will benefit from this freely available analytics service.

  11. Particle precipitation prior to large earthquakes of both the Sumatra and Philippine Regions: A statistical analysis

    NASA Astrophysics Data System (ADS)

    Fidani, Cristiano

    2015-12-01

    A study of the statistical correlation between low L-shell electrons precipitating into the atmosphere and strong earthquakes is presented. More than 11 years of Medium Energy Protons Electrons Detector data from the NOAA-15 Sun-synchronous polar-orbiting satellite were analysed. Electron fluxes were analysed using a set of adiabatic coordinates. From this, significant electron counting rate fluctuations were evidenced during geomagnetically quiet periods. Electron counting rates were compared to earthquakes by defining a seismic event L-shell, obtained by radially projecting the epicentre geographical positions to a given altitude towards the zenith. Counting rates were grouped over every satellite semi-orbit together with strong seismic events whose L-shell coordinates were close to each other. NOAA-15 electron data from July 1998 to December 2011 were compared for nearly 1800 earthquakes with magnitudes larger than or equal to 6 occurring worldwide. When considering 30-100 keV precipitating electrons detected by the vertical NOAA-15 telescope and earthquake epicentre projections at altitudes greater than 1300 km, a significant correlation appeared, with electron precipitation detected 2-3 h prior to large events in the Sumatra and Philippine regions. This is in physical agreement with the different correlation times obtained from past studies that considered particles with greater energies. The discussion of satellite orbits and detectors below is useful for future satellite missions aimed at earthquake mitigation.

  12. Automated seismic waveform location using Multichannel Coherency Migration (MCM)-I. Theory

    NASA Astrophysics Data System (ADS)

    Shi, Peidong; Angus, Doug; Rost, Sebastian; Nowacki, Andy; Yuan, Sanyi

    2018-03-01

    With the proliferation of dense seismic networks sampling the full seismic wavefield, recorded seismic data volumes are getting bigger and automated analysis tools to locate seismic events are essential. Here, we propose a novel Multichannel Coherency Migration (MCM) method to locate earthquakes in continuous seismic data and reveal the location and origin time of seismic events directly from recorded waveforms. By continuously calculating the coherency between waveforms from different receiver pairs, MCM greatly expands the available information which can be used for event location. MCM does not require phase picking or phase identification, which allows fully automated waveform analysis. By migrating the coherency between waveforms, MCM leads to improved source energy focusing. We have tested and compared MCM to other migration-based methods on noise-free and noisy synthetic data. The tests and analysis show that MCM is noise-resistant and can achieve more accurate results than other migration-based methods. MCM is able to suppress strong interference from other seismic sources occurring at a similar time and location. It can be used with arbitrary 3D velocity models and is able to obtain reasonable location results with smooth but inaccurate velocity models. MCM exhibits excellent location performance and can be easily parallelized, giving it great potential to be developed into a real-time location method for very large datasets.
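    A minimal 1-D sketch of the migration idea, assuming a homogeneous velocity and a line of three receivers (all values hypothetical, far simpler than the full MCM method): for each candidate source position, traces are shifted by the predicted traveltimes and the pairwise normalized cross-correlations are stacked; the coherency maximum picks out the true location.

```python
import math

def gaussian_pulse(t, t0, width=0.05):
    return math.exp(-((t - t0) / width) ** 2)

def ncc(a, b):
    """Normalized cross-correlation (zero lag) of two equal-length traces."""
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return sum(x * y for x, y in zip(a, b)) / (na * nb)

dt, v = 0.001, 2000.0                  # sample interval (s), assumed velocity (m/s)
receivers = [0.0, 500.0, 1000.0]       # receiver positions on a line (m)
true_src, t0 = 300.0, 0.2              # true source position (m) and origin time (s)
nt = 1000
traces = [[gaussian_pulse(i * dt, t0 + abs(r - true_src) / v) for i in range(nt)]
          for r in receivers]

def coherency(src):
    """Stack pairwise NCC after removing the predicted moveout for candidate src."""
    shifted = []
    for r, tr in zip(receivers, traces):
        shift = int(round(abs(r - src) / v / dt))
        shifted.append(tr[shift:] if shift else tr[:])
    n = min(len(s) for s in shifted)
    shifted = [s[:n] for s in shifted]
    total, npairs = 0.0, 0
    for i in range(len(shifted)):
        for j in range(i + 1, len(shifted)):
            total += ncc(shifted[i], shifted[j])
            npairs += 1
    return total / npairs

grid = range(0, 1001, 100)             # candidate source positions (m)
best = max(grid, key=coherency)
print(best)
```

    Only the correct candidate aligns all three pulses, so the coherency stack peaks at 300 m; note that no phase picking was needed, which is the point of the method.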

  13. Study on comparison of special moment frame steel structure (SMF) and base isolation special moment frame steel structure (BI-SMF) in Indonesia

    NASA Astrophysics Data System (ADS)

    Setiawan, Jody; Nakazawa, Shoji

    2017-10-01

    This paper discusses the comparison of seismic response behaviors, seismic performance and seismic loss functions of a conventional special moment frame steel structure (SMF) and a special moment frame steel structure with base isolation (BI-SMF). The validation of the proposed simplified method for estimating the maximum deformation of the base isolation system by the equivalent linearization method, and the validation of the design shear force of the superstructure, are investigated using results of nonlinear dynamic response analysis. In recent years, the construction of steel office buildings with seismic isolation systems has been proceeding even in Indonesia, where the risk of earthquakes is high. Although a design code for seismic isolation structures has been proposed, there is no actual construction example of a special moment frame steel structure with base isolation. Therefore, in this research, the SMF and BI-SMF buildings are designed to the Indonesian Building Code and are assumed to be built in Padang City, Indonesia. The base isolation system uses high-damping rubber bearings. Dynamic eigenvalue analysis and nonlinear dynamic response analysis are carried out to show the dynamic characteristics and seismic performance. In addition, the seismic loss function is obtained from damage state probabilities and repair costs. For the response analysis, simulated ground accelerations are used that have the phases of recorded seismic waves (El Centro NS, El Centro EW, Kobe NS and Kobe EW) and are adapted to the response spectrum prescribed by the Indonesian design code.

  14. Reservoir characterization using core, well log, and seismic data and intelligent software

    NASA Astrophysics Data System (ADS)

    Soto Becerra, Rodolfo

    We have developed intelligent software, Oilfield Intelligence (OI), as an engineering tool to improve the characterization of oil and gas reservoirs. OI integrates neural networks and multivariate statistical analysis. It is composed of five main subsystems: data input, preprocessing, architecture design, graphics design, and inference engine modules. More than 1,200 lines of code have been written as MATLAB M-files. The degree of success of many oil and gas drilling, completion, and production activities depends upon the accuracy of the models used in a reservoir description. Neural networks have been applied to the identification of nonlinear systems in almost every scientific field, and solving reservoir characterization problems is no exception. Neural networks have a number of attractive features that help to extract and recognize underlying patterns, structures, and relationships among data. However, before developing a neural network model, we must address the problem of dimensionality, for example by determining which variables are dominant and which are irrelevant. We can apply principal components and factor analysis to reduce the dimensionality and help the neural networks formulate more realistic models. We validated OI by obtaining confident models in three different oil field problems: (1) a neural network in-situ stress model using lithology and gamma ray logs for the Travis Peak formation of east Texas; (2) a neural network permeability model using porosity and gamma ray logs, and a neural network pseudo-gamma ray log model using 3D seismic attributes, for the reservoir VLE 196 Lamar field located in Block V of south-central Lake Maracaibo (Venezuela); and (3) neural network primary ultimate oil recovery (PRUR), initial waterflooding ultimate oil recovery (IWUR), and infill drilling ultimate oil recovery (IDUR) models using reservoir parameters for San Andres and Clearfork carbonate formations in west Texas.
    In all cases, we compared the results from the neural network models with the results from statistical regression and non-parametric models. The results show that the integrated use of multivariate statistical analysis and neural networks in our intelligent software yields the highest cross-correlation coefficients between predicted and actual target variables and the lowest average absolute errors.
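    The dimensionality-reduction step mentioned above can be illustrated with a closed-form 2-D principal component analysis. The porosity and gamma-ray values below are hypothetical, and the code is a sketch of the idea, not OI's MATLAB implementation.

```python
import math

def principal_component_2d(xs, ys):
    """First principal component of 2-D data from the closed-form
    eigenvector angle of the 2x2 covariance matrix: tan(2*theta) = 2*Sxy/(Sxx-Syy)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)
    return math.cos(theta), math.sin(theta)

def standardize(v):
    # Standardize so differing units do not dominate the loadings.
    m = sum(v) / len(v)
    s = math.sqrt(sum((x - m) ** 2 for x in v) / (len(v) - 1))
    return [(x - m) / s for x in v]

# Hypothetical, strongly correlated porosity / gamma-ray readings:
porosity = [0.10, 0.12, 0.15, 0.18, 0.20, 0.22]
gamma = [55.0, 57.1, 60.2, 62.8, 65.1, 67.0]

ux, uy = principal_component_2d(standardize(porosity), standardize(gamma))
print(round(ux, 3), round(uy, 3))
```

    For strongly positively correlated standardized variables the first component loads both about equally (roughly 0.707 each), which is exactly the redundancy one would remove before feeding a network.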

  15. Solar-terrestrial predictions proceedings. Volume 4: Prediction of terrestrial effects of solar activity

    NASA Technical Reports Server (NTRS)

    Donnelly, R. E. (Editor)

    1980-01-01

    Papers about the prediction of ionospheric and radio propagation conditions, based primarily on empirical or statistical relations, are discussed. Predictions of sporadic E, spread F, and scintillations generally involve statistical or empirical predictions. The correlation between solar activity and terrestrial seismic activity, and the possible relation between solar activity and biological effects, are discussed.

  16. Statistical Literacy as the Earth Moves

    ERIC Educational Resources Information Center

    Wild, Chris J.

    2017-01-01

    "The Times They Are a-Changin'" says the old Bob Dylan song. But it is not just the times that are a-changin'. For statistical literacy, the very earth is moving under our feet (apologies to Carole King). The seismic forces are (i) new forms of communication and discourse and (ii) new forms of data, data display and human interaction…

  17. How will induced seismicity in Oklahoma respond to decreased saltwater injection rates?

    PubMed Central

    Langenbruch, Cornelius; Zoback, Mark D.

    2016-01-01

    In response to the marked number of injection-induced earthquakes in north-central Oklahoma, regulators recently called for a 40% reduction in the volume of saltwater being injected in the seismically active areas. We present a calibrated statistical model predicting that the rate of widely felt M ≥ 3 earthquakes in the affected areas, as well as the probability of potentially damaging larger events, should significantly decrease by the end of 2016 and approach historic levels within a few years. Aftershock sequences associated with relatively large magnitude earthquakes that occurred in the Fairview, Cherokee, and Pawnee areas in north-central Oklahoma in late 2015 and 2016 will slow the decrease in seismicity rate in those areas. PMID:28138533

  18. Probabilistic seismic vulnerability and risk assessment of stone masonry structures

    NASA Astrophysics Data System (ADS)

    Abo El Ezz, Ahmad

    Earthquakes represent major natural hazards that regularly impact the built environment in seismically prone areas worldwide and cause considerable social and economic losses. The high losses incurred following past destructive earthquakes promoted the need for assessment of the seismic vulnerability and risk of existing buildings. Many historic buildings in the old urban centers in Eastern Canada, such as Old Quebec City, are built of stone masonry and represent immeasurable architectural and cultural heritage. These buildings were built to resist gravity loads only and generally offer poor resistance to lateral seismic loads. Seismic vulnerability assessment of stone masonry buildings is therefore the first necessary step in developing seismic retrofitting and pre-disaster mitigation plans. The objective of this study is to develop a set of probability-based analytical tools for efficient seismic vulnerability and uncertainty analysis of stone masonry buildings. A simplified probabilistic analytical methodology for vulnerability modelling of stone masonry buildings, with systematic treatment of uncertainties throughout the modelling process, is developed in the first part of this study. Building capacity curves are developed using a simplified mechanical model. A displacement-based procedure is used to develop damage state fragility functions in terms of spectral displacement response based on drift thresholds of stone masonry walls. A simplified probabilistic seismic demand analysis is proposed to capture the combined uncertainty in capacity and demand on fragility functions. In the second part, a robust analytical procedure for the development of seismic hazard compatible fragility and vulnerability functions is proposed. The results are given by sets of seismic hazard compatible vulnerability functions in terms of a structure-independent intensity measure (e.g. spectral acceleration) that can be used for seismic risk analysis.
    The procedure is very efficient for conducting rapid vulnerability assessment of stone masonry buildings. With modification of the input structural parameters, it can be adapted and applied to any other building class. A sensitivity analysis of the seismic vulnerability modelling is conducted to quantify the uncertainties associated with each of the input parameters. The proposed methodology was validated in a scenario-based seismic risk assessment of existing buildings in Old Quebec City. The procedure for hazard compatible vulnerability modelling was used to develop seismic fragility functions in terms of spectral acceleration representative of the inventoried buildings. A total of 1220 buildings were considered. The assessment was performed for a scenario event of magnitude 6.2 at a distance of 15 km, with a probability of exceedance of 2% in 50 years. The study showed that most of the expected damage is concentrated in the old brick and stone masonry buildings.
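    Damage-state fragility functions of the kind described above are conventionally expressed as lognormal CDFs of the intensity measure. A minimal sketch, with hypothetical median and dispersion values for one damage state (not the parameters derived in this study):

```python
import math

def fragility(sa, median, beta):
    """P(damage >= state | Sa): lognormal CDF with median intensity (g)
    and logarithmic standard deviation beta - the standard fragility form."""
    return 0.5 * (1.0 + math.erf(math.log(sa / median) / (beta * math.sqrt(2.0))))

# Hypothetical parameters for one damage state of a stone masonry class:
median, beta = 0.30, 0.6
for sa in (0.1, 0.3, 0.6):
    print(round(fragility(sa, median, beta), 3))
```

    By construction the curve passes through 0.5 at the median intensity; beta controls how steeply the exceedance probability rises around it.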

  19. Seismicity of the Wabash Valley, Ste. Genevieve, and Rough Creek Graben Seismic Zones from the Earthscope Ozarks-Illinois-Indiana-Kentucky (OIINK) FlexArray Experiment

    NASA Astrophysics Data System (ADS)

    Shirley, Matthew Richard

    I analyzed seismic data from the Ozarks-Illinois-Indiana-Kentucky (OIINK) seismic experiment that operated in eastern Missouri, southern Illinois, southern Indiana, and Kentucky from July 2012 through March 2015. A product of this analysis is a new catalog of earthquake locations and magnitudes for small-magnitude local events during this study period. The analysis included a pilot study involving detailed manual analysis of all events in a ten-day test period and determination of the best parameters for a suite of automated detection and location programs. I eliminated events that were not earthquakes (mostly quarry and surface mine blasts) from the output of the automated programs, and reprocessed the locations for the earthquakes with manually picked P- and S-wave arrivals. This catalog consists of earthquake locations, depths, and local magnitudes. The new catalog consists of 147 earthquake locations, including 19 located within the bounds of the OIINK array. Of these events, 16 were newly reported events, too small to be reported in the Center for Earthquake Research and Information (CERI) regional seismic network catalog. I compared the magnitudes reported by CERI for corresponding earthquakes to establish a magnitude calibration factor for all earthquakes recorded by the OIINK array. With the calibrated earthquake magnitudes, I incorporate the previous OIINK results from Yang et al. (2014) to create magnitude-frequency distributions for the seismic zones in the region alongside the magnitude-frequency distributions made from CERI data. This shows that the Ste. Genevieve and Wabash Valley seismic zones experience seismic activity at an order of magnitude lower rate than the New Madrid seismic zone, and the Rough Creek Graben experiences seismic activity two orders of magnitude less frequently than New Madrid.
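    Magnitude-frequency comparisons like the one above rest on Gutenberg-Richter statistics, whose b-value can be estimated with Aki's maximum-likelihood formula. A sketch on a synthetic catalog (the completeness magnitude and catalog are hypothetical, and the magnitude-binning correction is omitted for simplicity):

```python
import math
import random

def b_value(mags, m_c):
    """Maximum-likelihood Gutenberg-Richter b-value (Aki's estimator):
    b = log10(e) / (mean(M >= m_c) - m_c), ignoring binning corrections."""
    above = [m for m in mags if m >= m_c]
    return math.log10(math.e) / (sum(above) / len(above) - m_c)

# Synthetic catalog with a true b-value of 1.0 above completeness m_c = 2.0
# (hypothetical data; a real catalog needs a careful completeness estimate):
random.seed(3)
b_true, m_c = 1.0, 2.0
mags = [m_c + random.expovariate(b_true * math.log(10.0)) for _ in range(5000)]
b_est = b_value(mags, m_c)
print(round(b_est, 2))
```

    With 5000 events the estimate lands close to the true value of 1.0; for the small per-zone samples discussed in the abstract, the estimator's scatter is correspondingly larger.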

  20. An integrated approach to characterization of fractured reservoirs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Datta-Gupta, A.; Majer, E.; Vasco, D.

    1995-12-31

    This paper summarizes an integrated hydrologic and seismic characterization of a fractured limestone formation at the Conoco Borehole Test Facility (CBTF) in Kay County, Oklahoma. Transient responses from pressure interference tests were first inverted in order to identify the location and orientation of dominant fractures at the CBTF. Subsequently, high resolution (1000 to 10000 Hz) cross-well and single-well seismic surveys were conducted to verify the preferential flow paths indicated by the hydrologic analysis. Seismic surveys were conducted before and after an air injection in order to increase the visibility of the fracture zone to seismic imaging. Both seismic and hydrologic analyses were found to yield consistent results in detecting the location of a major fracture zone.

  1. High precision gas hydrate imaging of small-scale and high-resolution marine sparker multichannel seismic data

    NASA Astrophysics Data System (ADS)

    Luo, D.; Cai, F.

    2017-12-01

    Small-scale, high-resolution marine multi-channel seismic surveys using large-energy sparkers are characterized by a high dominant source frequency, wide bandwidth, and high resolution. The technique was designed to improve the imaging quality of shallow sediments. In this study, a 20 kJ sparker and a 24-channel streamer cable with a 6.25 m group interval were used as the seismic source and receiver system, respectively. The key factors for seismic imaging of gas hydrate are enhancement of the S/N ratio, amplitude compensation, and detailed velocity analysis. However, the data in this study have the following characteristics: 1. Small maximum offsets, which are adverse to velocity analysis and multiple attenuation. 2. Lack of low-frequency information; frequencies below 100 Hz are absent. 3. Low S/N ratio, due to low fold (only 12). These characteristics make it difficult to reach the targets of seismic imaging. In this study, targeted processing methods are used to improve the seismic imaging quality of gas hydrate. First, several noise-suppression technologies are applied in combination to the pre-stack seismic data to suppress seismic noise and improve the S/N ratio; these include a spectrum-sharing noise elimination method, median filtering, and an exogenous-interference suppression method. Second, a combination of three technologies, SRME, τ-p deconvolution and high-precision Radon transformation, is used to remove multiples. Third, an accurate velocity field is used in amplitude energy compensation to highlight the Bottom Simulating Reflector (BSR, the indicator of gas hydrates) and gas migration pathways (such as gas chimneys and hot spots). Fourth, fine velocity analysis is used to improve the accuracy of the velocity analysis.
    Fifth, pre-stack deconvolution is used to compensate for low-frequency energy and suppress the ghost, so that formation reflection characteristics are highlighted. The results show that small-scale, high-resolution marine sparker multi-channel seismic surveys are much more effective than conventional seismic acquisition in improving the resolution and quality of gas hydrate imaging.

  2. Using geologic maps and seismic refraction in pavement-deflection analysis

    DOT National Transportation Integrated Search

    1999-10-01

    The researchers examined the relationship between three data types -- geologic maps, pavement deflection, and seismic refraction data -- from diverse geologic settings to determine whether geologic maps and seismic data might be used to interpret def...

  3. The shallow elastic structure of the lunar crust: New insights from seismic wavefield gradient analysis

    NASA Astrophysics Data System (ADS)

    Sollberger, David; Schmelzbach, Cedric; Robertsson, Johan O. A.; Greenhalgh, Stewart A.; Nakamura, Yosio; Khan, Amir

    2016-10-01

    Enigmatic lunar seismograms recorded during the Apollo 17 mission in 1972 have so far precluded the identification of shear-wave arrivals and hence the construction of a comprehensive elastic model of the shallow lunar subsurface. Here, for the first time, we extract shear-wave information from the Apollo active seismic data using a novel waveform analysis technique based on spatial seismic wavefield gradients. The star-like recording geometry of the active seismic experiment lends itself surprisingly well to computing spatial wavefield gradients and rotational ground motion as a function of time. These observables, which are new to seismic exploration in general, allowed us to identify shear waves in the complex lunar seismograms, and to derive a new model of seismic compressional and shear-wave velocities in the shallow lunar crust, critical to understand its lithology and constitution, and its impact on other geophysical investigations of the Moon's deep interior.
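    The gradient-based observable can be illustrated for a plane wave: since u(t, x) = f(t - x/c) implies du/dx = -(1/c) du/dt, the ratio of finite-difference estimates of the spatial and temporal derivatives recovers the apparent slowness. The velocity, receiver spacing, and wavelet below are hypothetical, and the scheme is a simplification of the actual wavefield gradient analysis.

```python
import math

c = 1500.0                 # assumed apparent velocity (m/s)
dt, dx = 0.001, 10.0       # sample interval (s), receiver spacing (m)
nt = 1000

def wavelet(t):
    return math.exp(-((t - 0.3) / 0.05) ** 2)

def trace(x):
    # Plane wave u(t, x) = f(t - x/c) sampled at receiver position x.
    return [wavelet(i * dt - x / c) for i in range(nt)]

u_l, u_0, u_r = trace(-dx), trace(0.0), trace(dx)   # three adjacent receivers
i = 250                                             # sample on the pulse flank
dudt = (u_0[i + 1] - u_0[i - 1]) / (2 * dt)         # central difference in time
dudx = (u_r[i] - u_l[i]) / (2 * dx)                 # central difference in space
slowness_ratio = -dudx / dudt * c                   # c * (1/c): close to 1.0
print(round(slowness_ratio, 2))
```

    The recovered ratio is near 1.0, i.e. the gradients encode the local slowness (and hence velocity) without any travel-time picking, which is what makes the technique useful on the noisy lunar records.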

  4. Seismic imaging of post-glacial sediments - test study before Spitsbergen expedition

    NASA Astrophysics Data System (ADS)

    Szalas, Joanna; Grzyb, Jaroslaw; Majdanski, Mariusz

    2017-04-01

    This work presents results of the analysis of reflection seismic data acquired in a test area in central Poland. For this experiment we used a total of 147 vertical-component seismic stations (DATA-CUBE and Reftek "Texan") with an accelerated weight drop (PEG-40). The profile was 350 metres long. It is part of a pilot study for a future research project on Spitsbergen. The purpose of the study is to recognise the characteristics of the seismic response of post-glacial sediments in order to design the most adequate survey acquisition parameters and processing sequence for the data from Spitsbergen. Multiple tests and comparisons have been performed to obtain the best possible quality of seismic image. In this research we examine the influence of receiver interval size, front mute application and surface wave attenuation attempts. Although seismic imaging is the main technique, we plan to support this analysis with additional data from traveltime tomography, MASW and other a priori information.

  5. Back analysis of fault-slip in burst prone environment

    NASA Astrophysics Data System (ADS)

    Sainoki, Atsushi; Mitri, Hani S.

    2016-11-01

    In deep underground mines, stress re-distribution induced by mining activities can cause fault-slip. Seismic waves arising from fault-slip occasionally induce rock ejection when hitting the boundary of mine openings, and as a result, severe damage can be inflicted. In general, it is difficult to estimate fault-slip-induced ground motion in the vicinity of mine openings because of the complexity of the dynamic response of faults and the presence of geological structures. In this paper, a case study is conducted for a Canadian underground mine, herein called "Mine-A", which is known for its seismic activity. Using a microseismic database collected from the mine, a back analysis of fault-slip is carried out with mine-wide 3-dimensional numerical modeling. The back analysis is conducted to estimate the physical and mechanical properties of the causative fracture or shear zones. One large seismic event, identified as fault-slip related, has been selected for the back analysis. In the back analysis, the shear zone properties are estimated with respect to the moment magnitude of the seismic event and the peak particle velocity (PPV) recorded by a strong ground motion sensor. The estimated properties are then validated through comparison with peak ground accelerations recorded by accelerometers. Lastly, ground motion in active mining areas is estimated by conducting dynamic analysis with the estimated values. The present study implies that it would be possible to estimate the magnitude of seismic events that might occur in the near future by applying the estimated properties to the numerical model. Although the case study is conducted for a specific mine, the developed methodology can be equally applied to other mines suffering from fault-slip related seismic events.

  6. Accelerated Seismic Release and Related Aspects of Seismicity Patterns on Earthquake Faults

    NASA Astrophysics Data System (ADS)

    Ben-Zion, Y.; Lyakhovsky, V.

    2001-05-01

    Observational studies indicate that large earthquakes are sometimes preceded by phases of accelerated seismic release (ASR) characterized by cumulative Benioff strain following a power law time-to-failure relation with a term (tf - t)^m, where tf is the failure time of the large event and observed values of m are close to 0.3. We discuss properties of ASR and related aspects of seismicity patterns associated with several theoretical frameworks, with a focus on models of heterogeneous faults in continuum solids. Using stress and earthquake histories simulated by the model of Ben-Zion (1996) for a discrete fault with quenched heterogeneities in a 3D elastic half space, we show that large model earthquakes are associated with non-repeating cyclical establishment and destruction of long-range stress correlations, accompanied by non-stationary cumulative Benioff strain release. We then analyze results associated with a regional lithospheric model consisting of a seismogenic upper crust governed by the damage rheology of Lyakhovsky et al. (1997) over a viscoelastic substrate. We demonstrate analytically for a simplified 1D case that the employed damage rheology leads to a singular power law equation for strain proportional to (tf - t)^(-1/3), and a non-singular power law relation for cumulative Benioff strain proportional to (tf - t)^(1/3). A simple approximate generalization of the latter for regional cumulative Benioff strain is obtained by adding to the result a linear function of time representing a stationary background release. To go beyond the analytical expectations, we examine results generated by various realizations of the regional lithospheric model producing seismicity following the characteristic frequency-size statistics, Gutenberg-Richter power law distribution, and mode switching activity. We find that phases of ASR exist only when the seismicity preceding a given large event has broad frequency-size statistics.
    In such cases the simulated ASR phases can be fitted well by the singular analytical relation with m = -1/3, the non-singular equation with m = 0.2, and the generalized version of the latter including a linear term with m = 1/3. The obtained good fits with all three relations highlight the difficulty of deriving reliable information on functional forms and parameter values from such data sets. The activation process in the simulated ASR phases is found to be accommodated both by increasing rates of moderate events and increasing average event size, with the former starting a few years earlier than the latter. The lack of ASR in portions of the seismicity not having broad frequency-size statistics may explain why some large earthquakes are preceded by ASR and others are not.
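    Fitting the time-to-failure relation discussed above reduces, for a fixed exponent, to linear least squares; a grid search over m then selects the best exponent. The synthetic, noise-free release history below (generated with m = 1/3) is purely illustrative; as the abstract notes, noisy data can be fitted almost equally well by several exponents.

```python
import math

def fit_asr(times, strain, t_f, m_grid):
    """Grid-search the exponent m in S(t) = A + B * (t_f - t)**m,
    solving for A and B by linear least squares at each candidate m."""
    best = None
    for m in m_grid:
        xs = [(t_f - t) ** m for t in times]
        n = len(xs)
        mx, my = sum(xs) / n, sum(strain) / n
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, strain))
        sxx = sum((x - mx) ** 2 for x in xs)
        b = sxy / sxx
        a = my - b * mx
        sse = sum((a + b * x - y) ** 2 for x, y in zip(xs, strain))
        if best is None or sse < best[0]:
            best = (sse, m)
    return best[1]

# Synthetic accelerating release with m = 1/3, as in the analytical result:
t_f = 10.0
times = [i * 0.1 for i in range(95)]                      # stop short of t_f
strain = [5.0 - 2.0 * (t_f - t) ** (1.0 / 3.0) for t in times]
m_grid = [0.1, 0.2, 1.0 / 3.0, 0.5, 0.7]
m_best = fit_asr(times, strain, t_f, m_grid)
print(round(m_best, 3))
```

    On clean data the search recovers m = 1/3 exactly; adding realistic noise flattens the misfit surface across the grid, which is the non-uniqueness the abstract emphasizes.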

  7. Uncertainties in evaluation of hazard and seismic risk

    NASA Astrophysics Data System (ADS)

    Marmureanu, Gheorghe; Marmureanu, Alexandru; Ortanza Cioflan, Carmen; Manea, Elena-Florinela

    2015-04-01

    Two methods are commonly used for seismic hazard assessment: probabilistic (PSHA) and deterministic (DSHA) seismic hazard analysis. Selection of a ground motion for engineering design requires a clear understanding of seismic hazard and risk among stakeholders, seismologists and engineers. What is wrong with traditional PSHA or DSHA? PSHA as commonly used in engineering rests on four assumptions developed by Cornell in 1968: (1) a constant-in-time average occurrence rate of earthquakes; (2) a single point source; (3) independent variability of ground motion at a site; (4) Poisson (or "memory-less") behavior of earthquake occurrences. It is a probabilistic method, and "when the causality dies, its place is taken by probability, a prestigious term meant to define our inability to predict the course of nature" (Niels Bohr). The DSHA method was used for the original design of Fukushima Daiichi, but Japanese authorities moved to probabilistic assessment methods, and the probability of exceeding the design basis acceleration was expected to be 10^-4 to 10^-6. It was exceeded, and this was a violation of the principles of deterministic hazard analysis (ignoring historical events) (Klügel, J.-U., EGU, 2014, ISSO). PSHA was developed from mathematical statistics and is not based on earthquake science (invalid physical models, i.e., point source and Poisson distribution; invalid mathematics; misinterpretation of annual probability of exceedance or return period, etc.), and it has become a pure numerical "creation" (Wang, PAGEOPH 168 (2011), 11-25). A key component of seismic hazard assessment, for both PSHA and DSHA, is the ground motion attenuation relationship, or so-called ground motion prediction equation (GMPE), which describes a relationship between a ground motion parameter (e.g., PGA, MMI), earthquake magnitude M, source-to-site distance R, and an uncertainty term. So far, no one takes into consideration the strong nonlinear behavior of soils during strong earthquakes.
But how many cities, villages, and metropolitan areas in seismic regions are constructed on rock? Most of them are located on soil deposits. A soil is of basic type sand or gravel (termed coarse soils) or silt or clay (termed fine soils), etc. The effect of nonlinearity is very large. For example, if we maintain the same spectral amplification factor (SAF = 5.8942) as for the relatively strong earthquake of May 3, 1990 (MW = 6.4), then at the Bacǎu seismic station the peak acceleration for the Vrancea earthquake of May 30, 1990 (MW = 6.9) would have to be a*max = 0.154 g, while the actual recorded value was only amax = 0.135 g (-14.16%). Likewise, for the Vrancea earthquake of August 30, 1986 (MW = 7.1), the peak acceleration would have to be a*max = 0.107 g instead of the recorded value of 0.0736 g (-45.57%). There are similar data for more than 60 seismic stations. There is a strong nonlinear dependence of SAF on earthquake magnitude at each site. The authors propose an alternative approach called "real spectral amplification factors", instead of GMPEs, for the whole extra-Carpathian area, where all cities and villages are located on soil deposits. Key words: Probabilistic Seismic Hazard; Uncertainties; Nonlinear seismology; Spectral amplification factors (SAF).
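The percent differences quoted above can be checked against the stated accelerations; the convention of expressing the difference relative to the recorded value is an assumption made for this illustration.

```python
# Peak accelerations quoted in the abstract for the Bacau seismic station:
# predicted under a constant SAF vs. actually recorded. The percent-difference
# convention (relative to the recorded value) is assumed here.
cases = {
    "1990-05-30 (MW 6.9)": {"predicted_g": 0.154, "recorded_g": 0.135},
    "1986-08-30 (MW 7.1)": {"predicted_g": 0.107, "recorded_g": 0.0736},
}
for name, c in cases.items():
    diff_pct = 100.0 * (c["recorded_g"] - c["predicted_g"]) / c["recorded_g"]
    print(name, round(diff_pct, 2))
```

The computed values land within a fraction of a percent of the abstract's quoted -14.16% and -45.57%; the small residual presumably reflects rounding of the predicted accelerations in the abstract.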

  8. EMERALD: Coping with the Explosion of Seismic Data

    NASA Astrophysics Data System (ADS)

    West, J. D.; Fouch, M. J.; Arrowsmith, R.

    2009-12-01

    The geosciences are currently generating an unparalleled quantity of new public broadband seismic data with the establishment of large-scale seismic arrays such as the EarthScope USArray, which are enabling new and transformative scientific discoveries of the structure and dynamics of the Earth’s interior. Much of this explosion of data is a direct result of the formation of the IRIS consortium, which has enabled an unparalleled level of open exchange of seismic instrumentation, data, and methods. The production of these massive volumes of data has generated new and serious data management challenges for the seismological community. A significant challenge is the maintenance and updating of seismic metadata, which includes information such as station location, sensor orientation, instrument response, and clock timing data. This key information changes at unknown intervals, and the changes are not generally communicated to data users who have already downloaded and processed data. Another basic challenge is the ability to handle massive seismic datasets when waveform file volumes exceed the fundamental limitations of a computer’s operating system. A third, long-standing challenge is the difficulty of exchanging seismic processing codes between researchers; each scientist typically develops his or her own unique directory structure and file naming convention, requiring that codes developed by another researcher be rewritten before they can be used. To address these challenges, we are developing EMERALD (Explore, Manage, Edit, Reduce, & Analyze Large Datasets). The overarching goal of the EMERALD project is to enable more efficient and effective use of seismic datasets ranging from just a few hundred to millions of waveforms with a complete database-driven system, leading to higher quality seismic datasets for scientific analysis and enabling faster, more efficient scientific research. 
We will present a preliminary (beta) version of EMERALD, an integrated, extensible, standalone database server system based on the open-source PostgreSQL database engine. The system is designed for fast and easy processing of seismic datasets, and provides the necessary tools to manage very large datasets and all associated metadata. EMERALD provides methods for efficient preprocessing of seismic records; large record sets can be easily and quickly searched, reviewed, revised, reprocessed, and exported. EMERALD can retrieve and store station metadata and alert the user to metadata changes. The system provides many methods for visualizing data, analyzing dataset statistics, and tracking the processing history of individual datasets. EMERALD allows development and sharing of visualization and processing methods using any of 12 programming languages. EMERALD is designed to integrate existing software tools; the system provides wrapper functionality for existing widely-used programs such as GMT, SOD, and TauP. Users can interact with EMERALD via a web browser interface, or they can directly access their data from a variety of database-enabled external tools. Data can be imported and exported from the system in a variety of file formats, or can be directly requested and downloaded from the IRIS DMC from within EMERALD.
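The metadata-change alerting described above can be sketched with a toy relational schema. The table and column names here are hypothetical, and sqlite3 stands in for the PostgreSQL engine EMERALD actually uses; this is only an illustration of the idea, not EMERALD's schema.

```python
import sqlite3

# Hypothetical station-metadata table with one row per metadata epoch.
# EMERALD itself is PostgreSQL-based; sqlite3 is used here only as a stand-in.
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE station_metadata (
    network TEXT, station TEXT, latitude REAL, longitude REAL,
    sensor_azimuth REAL, valid_from TEXT)""")

# Two epochs for the same station: the second records a sensor re-orientation,
# the kind of silent change that invalidates already-downloaded data.
rows = [("TA", "109C", 32.889, -117.105, 0.0, "2004-05-04"),
        ("TA", "109C", 32.889, -117.105, 3.5, "2006-11-01")]
con.executemany("INSERT INTO station_metadata VALUES (?,?,?,?,?,?)", rows)

# A user can be alerted whenever more than one distinct orientation exists.
n_orientations = con.execute(
    "SELECT COUNT(DISTINCT sensor_azimuth) FROM station_metadata "
    "WHERE network='TA' AND station='109C'").fetchone()[0]
print(n_orientations)  # 2 distinct orientations -> metadata changed
```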

  9. Comprehensive analysis of earthquake source spectra and swarms in the Salton Trough, California

    NASA Astrophysics Data System (ADS)

    Chen, X.; Shearer, P. M.

    2011-09-01

    We study earthquakes within California's Salton Trough from 1981 to 2009 using a precisely relocated catalog. We process the seismic waveforms to isolate source spectra, station spectra, and travel-time-dependent spectra. The results suggest an average P wave Q of 340, agreeing with previous results indicating relatively high attenuation in the Salton Trough. Stress drops estimated from the source spectra using an empirical Green's function (EGF) method show large scatter among individual events but a low median stress drop of 0.56 MPa for the region. The distribution of stress drops after applying a spatial-median filter indicates lower stress drops near geothermal sites. We explore the relationships between seismicity, stress drops, and geothermal injection activities. Seismicity within the Salton Trough shows strong spatial clustering, including 20 distinct earthquake swarms with at least 50 events each. They can be separated into early-Mmax and late-Mmax groups based on the normalized occurrence time of their largest event. These swarms generally have a low skew of the moment release history, ranging from -9 to 3.0. The major temporal difference between the two groups is the excess of seismicity and an inverse power law increase of seismicity before the largest event for the late-Mmax group. All swarms exhibit spatial migration of seismicity at a statistical significance greater than 85%. A weighted L1-norm inversion of linear migration parameters yields migration velocities from 0.008 to 0.8 km/hour. To explore the influence of fluid injection at geothermal sites, we also model the migration behavior with the diffusion equation, and obtain a hydraulic diffusion coefficient of approximately 0.25 m^2/s for the Salton Sea geothermal site, which is within the range of expected values for a typical geothermal reservoir.
The swarms with migration velocities over 0.1 km/hour cannot be explained by the diffusion curve; rather, their velocity is consistent with the propagation velocity of creep and slow slip events. These variations in migration behavior allow us to distinguish among different driving processes.
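The two migration models can be compared numerically. The sketch below uses a Shapiro-type triggering front r(t) = sqrt(4*pi*D*t); treating that particular front definition as the authors' exact model is an assumption, but it illustrates why swarms faster than ~0.1 km/hour quickly outrun diffusion.

```python
import math

# Compare a pore-pressure diffusion front with linear migration at the
# threshold velocity discussed in the abstract.
D = 0.25                  # hydraulic diffusivity, m^2/s (abstract's value)
v = 0.1 * 1000 / 3600.0   # 0.1 km/hour expressed in m/s

for hours in (1, 10, 100):
    t = hours * 3600.0
    r_diff = math.sqrt(4 * math.pi * D * t)  # diffusion front distance, m
    r_lin = v * t                            # linear migration distance, m
    print(hours, round(r_diff), round(r_lin))
```

At one hour the two distances are comparable (~100 m), but after tens of hours linear migration at 0.1 km/hour is far ahead of the diffusion front, consistent with attributing the fast swarms to a different (creep/slow-slip) driving process.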

  10. The effect of directivity in a PSHA framework

    NASA Astrophysics Data System (ADS)

    Spagnuolo, E.; Herrero, A.; Cultrera, G.

    2012-09-01

    We propose a method to introduce a refined representation of the ground motion in the framework of Probabilistic Seismic Hazard Analysis (PSHA). This study is especially oriented to the incorporation of a priori information about source parameters, focusing on the directivity effect and its influence on seismic hazard maps. Two strategies have been followed. The first considers the seismic source as an extended source, and is valid when the PSHA seismogenic sources are represented as fault segments. We show that the incorporation of variables related to the directivity effect can lead to variations of up to 20 per cent in the hazard level for dip-slip faults with a uniform distribution of hypocentre locations, in terms of the spectral acceleration response at 5 s with an exceedance probability of 10 per cent in 50 yr. The second concerns the more general problem of seismogenic areas, where each point is a seismogenic source having the same chance of nucleating a seismic event. In our proposition, the point source is associated with rupture-related parameters defined using a statistical description. As an example, we consider a source point in an area characterized by a strike-slip faulting style. With the introduction of the directivity correction, the modulation of the hazard map reaches values of up to 100 per cent (for strike-slip, unilateral faults). The introduction of directivity does not uniformly increase the hazard level; it acts more as a redistribution of the estimates, consistent with the fault orientation. A general increase appears only when no a priori information is available. However, good a priori knowledge now exists on the style of faulting, dip, and orientation of faults associated with the majority of the seismogenic zones of present seismic hazard maps. The percentage of variation obtained is strongly dependent on the type of model chosen to represent the directivity effect analytically.
Our aim is therefore to emphasize the methodology by which all the collected information may be converted into a more comprehensive and meaningful probabilistic seismic hazard formulation.

  11. Bayesian identification of multiple seismic change points and varying seismic rates caused by induced seismicity

    NASA Astrophysics Data System (ADS)

    Montoya-Noguera, Silvana; Wang, Yu

    2017-04-01

    The Central and Eastern United States (CEUS) has experienced an abnormal increase in seismic activity, which is believed to be related to anthropogenic activities. The U.S. Geological Survey has acknowledged this situation and developed the CEUS 2016 one-year seismic hazard model using the 2015 catalog, assuming stationary seismicity in that period. However, due to the nonstationary nature of induced seismicity, it is essential to identify change points for accurate probabilistic seismic hazard analysis (PSHA). We present a Bayesian procedure to identify the most probable change points in seismicity and define their respective seismic rates. It uses prior distributions in agreement with conventional PSHA and updates them with recent data to identify seismicity changes. It can determine change points at a regional scale and may incorporate different types of information in an objective manner. It is first successfully tested with simulated data and then used to evaluate Oklahoma's regional seismicity.
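A toy version of the change-point idea can be written in a few lines: model event counts as Poisson with a conjugate Gamma prior on the rate, and score each candidate change point by the marginal likelihood of the two resulting segments. The prior, counts, and single-change-point restriction below are illustrative simplifications, not the authors' actual procedure.

```python
import math

def log_marginal(counts, a=1.0, b=1.0):
    """Log marginal likelihood of i.i.d. Poisson counts under a Gamma(a, b) prior."""
    n, s = len(counts), sum(counts)
    return (a * math.log(b) - math.lgamma(a) + math.lgamma(a + s)
            - (a + s) * math.log(b + n)
            - sum(math.lgamma(c + 1) for c in counts))

# Synthetic monthly counts with a rate jump after month 5 (2/month -> 10/month).
counts = [2, 3, 1, 2, 2, 9, 11, 8, 10, 12]

# Pick the split that maximizes the combined evidence of the two segments.
best = max(range(1, len(counts)),
           key=lambda k: log_marginal(counts[:k]) + log_marginal(counts[k:]))
print(best)  # index of the most probable change point
```

The full procedure in the paper additionally handles multiple change points and priors consistent with conventional PSHA; the sketch only shows the evidence-comparison mechanics.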

  12. Systematic detection and classification of earthquake clusters in Italy

    NASA Astrophysics Data System (ADS)

    Poli, P.; Ben-Zion, Y.; Zaliapin, I. V.

    2017-12-01

    We perform a systematic analysis of spatio-temporal clustering of 2007-2017 earthquakes in Italy with magnitudes m > 3. The study employs the nearest-neighbor approach of Zaliapin and Ben-Zion [2013a, 2013b] with basic data-driven parameters. The results indicate that seismicity in Italy (an extensional tectonic regime) is dominated by clustered events, with a smaller proportion of background events than in California. Evaluation of internal cluster properties allows separation of swarm-like from burst-like seismicity. This classification highlights a strong geographical coherence of cluster properties. Swarm-like seismicity is dominant in regions characterized by relatively slow deformation with possibly elevated temperature and/or fluids (e.g., Alto Tiberina, Pollino), while burst-like seismicity is observed in crystalline tectonic regions (Alps and Calabrian Arc) and in Central Italy, where moderate to large earthquakes are frequent (e.g., L'Aquila, Amatrice). To better assess the variation of seismicity style across Italy, we also perform a clustering analysis with region-specific parameters. This analysis highlights clear spatial changes in the threshold separating background and clustered seismicity, and permits better resolution of different clusters in specific geological regions. For example, a large proportion of repeaters is found in the Etna region, as expected for volcanically induced seismicity. A similar behavior is observed in the northern Apennines, with high pore pressure associated with mantle degassing. The observed variations of earthquake properties highlight shortcomings of practices using large-scale average seismic properties, and point to connections between seismicity and local properties of the lithosphere. The observations help to improve the understanding of the physics governing the occurrence of earthquakes in different regions.
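The nearest-neighbor proximity at the core of the Zaliapin and Ben-Zion approach combines the time, distance, and parent magnitude of candidate parent events as eta = dt * r^df * 10^(-b*m). The sketch below evaluates it on a tiny invented catalog; df = 1.6 and b = 1.0 are typical parameter choices, assumed here rather than taken from this study.

```python
import math

df, b = 1.6, 1.0  # fractal dimension of epicenters and b-value (typical choices)

def proximity(t_parent, t_child, r_km, m_parent):
    """Nearest-neighbor proximity eta; infinite for non-causal (later) parents."""
    dt = t_child - t_parent
    if dt <= 0:
        return math.inf
    return dt * (r_km ** df) * 10 ** (-b * m_parent)

# Tiny hypothetical catalog: (time in years, x_km, y_km, magnitude).
catalog = [(0.00, 0.0, 0.0, 5.0),   # mainshock
           (0.02, 2.0, 0.0, 3.0),   # nearby event one week later
           (1.50, 80.0, 60.0, 3.2)] # distant event much later

for j, (tj, xj, yj, mj) in enumerate(catalog[1:], start=1):
    etas = [proximity(ti, tj, math.hypot(xj - xi, yj - yi), mi)
            for ti, xi, yi, mi in catalog[:j]]
    print(j, min(etas))  # small eta -> clustered (aftershock-like) event
```

A threshold on eta (data-driven in the paper) then separates clustered events (small eta, like event 1 here) from background events (large eta, like event 2).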

  13. Earthquake Hazard and Risk Assessment based on Unified Scaling Law for Earthquakes: Altai-Sayan Region

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.; Nekrasova, A.

    2017-12-01

    We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing regional seismic hazard maps based on morphostructural analysis, pattern recognition, and the Unified Scaling Law for Earthquakes (USLE), which generalizes the Gutenberg-Richter relationship by making use of the naturally fractal distribution of earthquake sources of different size in a seismic region. The USLE is the empirical relationship log10 N(M, L) = A + B·(5 - M) + C·log10 L, where N(M, L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L. We use the parameters A, B, and C of the USLE to estimate, first, the expected maximum credible magnitude in a time interval at seismically prone nodes of the morphostructural scheme of the region under study, and then map the corresponding expected ground shaking parameters (e.g., peak ground acceleration, PGA, or macroseismic intensity). After rigorous testing against the available seismic evidence from the past (usually the observed instrumental PGA or the historically reported macroseismic intensity), such a seismic hazard map is used to generate maps of specific earthquake risks for population, cities, and infrastructure (e.g., those based on population census and building inventories). This USLE-based methodology of seismic hazard and risk assessment is applied to the territory of the Altai-Sayan region of Russia. The study was supported by the Russian Science Foundation, Grant No. 15-17-30020.
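The USLE relationship quoted above is easy to evaluate directly. The parameter values in this sketch are illustrative placeholders, not estimates for the Altai-Sayan region.

```python
import math

# Direct numerical reading of the USLE:
#   log10 N(M, L) = A + B*(5 - M) + C*log10(L)
# A, B, C below are invented for illustration.
def usle_annual_number(M, L_km, A=-1.0, B=1.0, C=1.2):
    """Expected annual number of magnitude-M events in an area of linear dimension L (km)."""
    return 10 ** (A + B * (5.0 - M) + C * math.log10(L_km))

# One magnitude unit up divides N by 10**B (the Gutenberg-Richter analogue);
# doubling L multiplies N by 2**C (the fractal source-size term).
print(round(usle_annual_number(5.0, 100.0), 3))
```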

  14. Mantle-crust interaction at the Blanco Ridge segment of the Blanco Transform Fault Zone: Results from the Blanco Transform Fault OBS Experiment

    NASA Astrophysics Data System (ADS)

    Kuna, V. M.; Nabelek, J.; Braunmiller, J.

    2016-12-01

    We present results of the Blanco Transform OBS Experiment, which consists of the deployment of 55 three-component broadband and short-period ocean bottom seismometers in the vicinity of the Blanco Fault Zone for the period between September 2012 and October 2013. Our research concentrates on the Blanco Ridge, a purely transform segment of the Blanco Fault Zone that spans over 130 km between the Cascadia and the Gorda pull-apart depressions. Almost 3,000 well-constrained earthquakes were detected and located along the Blanco Ridge by an automatic procedure (using BRTT Antelope) and relocated using a relative location algorithm (hypoDD). The catalog magnitude of completeness is M=2.2 with an overall b value of 1. Earthquakes extend from 0 km to 20 km depth, but cluster predominantly at two depth levels: in the crust (5-7 km) and in the uppermost mantle (12-17 km). Statistical analysis reveals striking differences between crustal and mantle seismicity. The temporal distribution of crustal events follows common patterns given by Omori's law, while most mantle seismicity occurs in spatially tight sequences of unusually short durations lasting 30 minutes or less. These sequences cannot be described by known empirical laws. Moreover, we observe increased seismic activity in the uppermost mantle about 30 days before the largest (M=5.4) earthquake. Two mantle sequences occurred in a small area of 3x3 km about 4 and 2 weeks before the M=5.4 event. In the week leading up to the M=5.4 event we observe a significant downward migration of crustal seismicity, culminating in the nucleation of the main event at the base of the crust. We hypothesize that the highly localized uppermost mantle seismicity is triggered by aseismic slow slip of the surrounding ductile mantle. We also suggest that the mantle slip loads the crust, eventually resulting in relatively large crustal earthquakes.
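The catalog statistics quoted above (completeness M = 2.2, b-value of 1) can be illustrated with the standard maximum-likelihood b-value estimator of Aki (1965), which is the usual tool for such catalogs; the magnitudes below are synthetic, and whether this exact estimator was used in the study is an assumption.

```python
import math

def b_value(mags, Mc, dm=0.1):
    """Aki (1965) maximum-likelihood b-value for magnitudes binned at dm,
    complete above Mc: b = log10(e) / (mean(m) - (Mc - dm/2))."""
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - (Mc - dm / 2.0))

# Synthetic magnitudes above the completeness threshold Mc = 2.2.
mags = [2.2, 2.3, 2.2, 2.5, 2.9, 2.4, 2.2, 3.1, 2.6, 2.3]
print(round(b_value(mags, Mc=2.2), 2))
```

With a real catalog one would also estimate the uncertainty (e.g., the Shi and Bolt standard error) before quoting a single b-value.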

  15. Hazard Assessment in a Big Data World

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir; Nekrasova, Anastasia

    2017-04-01

    Open data in a Big Data World provide unprecedented opportunities for enhancing scientific studies and for better understanding of the Earth System. At the same time, they open wide avenues for deceptive associations in inter- and transdisciplinary data that mislead to erroneous predictions, which are unacceptable for implementation. Even advanced tools of data analysis may lead to wrong assessments when used inappropriately to describe the phenomenon under consideration. A (self-)deceptive conclusion can be avoided only by verification of candidate models in experiments on empirical data. Seismology is not an exception. Moreover, the seismic evidence accumulated to date demonstrates clearly that most of the empirical relations commonly accepted in the early history of instrumental seismology can be proved erroneous when subjected to objective hypothesis testing. In many cases of seismic hazard assessment (SHA), whether probabilistic or deterministic, term-less or short-term, claims of a high potential of model forecasts are based on flawed applications of statistics and are therefore hardly suitable for communication to decision makers; this situation creates numerous deception points and has resulted in controversies. So far, most, if not all, of the standard probabilistic methods to assess seismic hazard and associated risks are based on subjective, commonly unrealistic, and even erroneous assumptions about seismic recurrence, and none of the proposed short-term precursory signals has shown sufficient evidence to be used as a reliable precursor of catastrophic earthquakes. Accurate testing against real observations must be done before claiming seismically hazardous areas and/or times. The set of errors of the first and second kind in such a comparison permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to a user-defined cost-benefit function.
The information obtained in testing experiments may supply us with realistic estimates of the confidence and accuracy of SHA predictions. If proved reliable, though not necessarily perfect, forecast/prediction-related recommendations on risk levels can support efficient decision making in engineering design, insurance, and emergency management.
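The error-counting idea behind alarm-based testing can be sketched as a simple contingency tally over discrete space-time bins: misses (errors of the first kind, target events outside declared alarms) versus false alarms (errors of the second kind), traded off against the fraction of space-time covered by alarms. The bins and outcomes below are invented for illustration.

```python
# 1 = bin declared hazardous (alarm); 1 = target earthquake occurred (event).
alarms = [1, 1, 0, 0, 1, 0, 0, 1, 0, 0]
events = [1, 0, 0, 0, 1, 1, 0, 0, 0, 0]

misses = sum(1 for a, e in zip(alarms, events) if e and not a)
false_alarms = sum(1 for a, e in zip(alarms, events) if a and not e)
miss_rate = misses / sum(events)            # error of the first kind
alarm_fraction = sum(alarms) / len(alarms)  # cost: space-time kept on alert
print(misses, false_alarms, round(miss_rate, 3), alarm_fraction)
```

A cost-benefit function over (miss_rate, alarm_fraction) is exactly the kind of user-defined trade-off the abstract refers to: a trivial strategy of alarming everywhere achieves zero misses at maximal cost, so both numbers must be reported together.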

  16. Composition and variation of noise recorded at the Yellowknife Seismic Array, 1991-2007

    USGS Publications Warehouse

    Koper, K.D.; De Foy, B.; Benz, H.

    2009-01-01

    We analyze seismic noise recorded on the 18 short-period, vertical component seismometers of the Yellowknife Seismic Array (YKA). YKA has an aperture of 23 km and is sited on cratonic lithosphere in an area with low cultural noise. These properties make it ideal for studying natural seismic noise at periods of 1-3 s. We calculated frequency-wave number spectra in this band for over 6,000 time windows that were extracted once per day for 17 years (1991-2007). Slowness analysis reveals a rich variety of seismic phases originating from distinct source regions: Rg waves from the Great Slave Lake; Lg waves from the Atlantic, Pacific, and Arctic Oceans; and teleseismic P waves from the north Pacific and equatorial mid-Atlantic regions. The surface wave energy is generated along coastlines, while the body wave energy is generated at least in part in deep-water, pelagic regions. Surface waves tend to dominate at the longer periods and, just as in earthquake seismograms, Lg is the most prominent arrival. Although the periods we study are slightly shorter than the classic double-frequency microseismic band of 4-10 s, the noise at YKA has clear seasonal behavior that is consistent with the ocean wave climate in the Northern Hemisphere. The temporal variation of most of the noise sources can be well fit using just two Fourier components: yearly and biyearly terms that combine to give a fast rise in microseismic power from mid-June through mid-October, followed by a gradual decline. The exception is the Rg energy from the Great Slave Lake, which shows a sharp drop in noise power over a 2-week period in November as the lake freezes. The Lg noise from the east has a small but statistically significant positive slope, perhaps implying increased ocean wave activity in the North Atlantic over the last 17 years. Copyright 2009 by the American Geophysical Union.
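The two-component Fourier fit described above is a linear least-squares problem: noise power is modeled as a constant plus yearly and half-yearly sine/cosine terms. The "observed" series below is synthetic; this is a sketch of the fitting recipe, not the paper's actual data or code.

```python
import numpy as np

# Model: P(d) = c0 + c1*cos(w d) + c2*sin(w d) + c3*cos(2w d) + c4*sin(2w d),
# with w = 2*pi / 365.25 (yearly) and 2w (biyearly/half-yearly) terms.
days = np.arange(0.0, 365.25 * 2)  # two years of daily samples
w = 2 * np.pi / 365.25
# Synthetic "noise power": yearly cycle with a phase shift plus a half-yearly term.
power = 3.0 + 1.5 * np.cos(w * days - 1.0) + 0.4 * np.sin(2 * w * days)

# Design matrix for linear least squares.
G = np.column_stack([np.ones_like(days),
                     np.cos(w * days), np.sin(w * days),
                     np.cos(2 * w * days), np.sin(2 * w * days)])
coeffs, *_ = np.linalg.lstsq(G, power, rcond=None)
print(np.round(coeffs, 3))  # recovers the yearly and half-yearly amplitudes
```

A phase-shifted yearly cosine appears as a combination of the cos and sin columns (c1 = 1.5 cos 1, c2 = 1.5 sin 1 here), which is why fitting the sine/cosine pair rather than amplitude and phase keeps the problem linear.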

  17. Seismic velocity structure of the crust and upper mantle beneath the Texas-Gulf of Mexico margin from joint inversion of Ps and Sp receiver functions and surface wave dispersion

    NASA Astrophysics Data System (ADS)

    Agrawal, M.; Pulliam, J.; Sen, M. K.

    2013-12-01

    The seismic structure beneath the Texas Gulf Coast Plain (GCP) is determined via velocity analysis of stacked common conversion point (CCP) Ps and Sp receiver functions and surface wave dispersion. The GCP is a portion of an ocean-continent transition zone, or 'passive margin', where seismic imaging of lithospheric Earth structure via passive seismic techniques has been rare. Seismic data from a temporary array of 22 broadband stations, spaced 16-20 km apart, on a ~380-km-long profile from Matagorda Island, a barrier island in the Gulf of Mexico, to Johnson City, Texas, were employed to construct a coherent image of the crust and uppermost mantle. CCP stacking was applied to data from teleseismic earthquakes to enhance the signal-to-noise ratios of converted phases, such as Ps phases. An inaccurate velocity model, used for time-to-depth conversion in CCP stacking, may produce large errors, especially in a region of substantial lateral velocity variations. An accurate velocity model is therefore essential to constructing high-quality depth-domain images. To find accurate P- and S-wave velocity models, we applied a joint modeling approach that searches for best-fitting models via simulated annealing. This joint inversion approach, which we call 'multi-objective optimization in seismology' (MOOS), simultaneously models Ps receiver functions, Sp receiver functions, and group velocity surface wave dispersion curves after assigning relative weights to each objective function. Weights are computed from the standard deviations of the data. Statistical tools such as the posterior parameter correlation matrix and the posterior probability density (PPD) function are used to evaluate the constraints that each data type places on model parameters. They allow us to identify portions of the model that are well or poorly constrained.
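The weighted multi-objective combination can be sketched as a single scalar objective: each data type contributes a mean-squared residual scaled by a weight derived from its data standard deviation. The inverse-variance weighting rule shown here is one common convention and is assumed, not quoted from the paper; the residuals are invented.

```python
# Joint objective Phi = sum_k w_k * Phi_k, with w_k = 1 / sigma_k**2 (assumed
# convention) and Phi_k the mean-squared residual of data type k.
def joint_misfit(residuals_by_type, sigmas):
    total = 0.0
    for res, sigma in zip(residuals_by_type, sigmas):
        w = 1.0 / sigma ** 2                      # weight from data std. dev.
        total += w * sum(r * r for r in res) / len(res)
    return total

# Illustrative residuals for Ps RFs, Sp RFs, and dispersion data, with their
# (hypothetical) data standard deviations.
resids = [[0.02, -0.01, 0.03], [0.05, -0.04], [0.10, 0.08, -0.12, 0.05]]
sigmas = [0.02, 0.05, 0.10]
print(round(joint_misfit(resids, sigmas), 3))
```

Inside a simulated-annealing loop this scalar would be the "energy" minimized; the inverse-variance weights keep a noisy data type (large sigma) from dominating the cleaner ones.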

  18. Assessing the seismic risk potential of South America

    USGS Publications Warehouse

    Jaiswal, Kishor; Petersen, Mark D.; Harmsen, Stephen; Smoczyk, Gregory M.

    2016-01-01

    We present here a simplified approach to quantifying regional seismic risk. The seismic risk for a given region can be expressed in terms of the average annual loss (AAL), which represents the long-term expected value of earthquake losses in any one year caused by the long-term seismic hazard. AAL is commonly measured in the form of earthquake shaking-induced deaths, direct economic impacts, or indirect losses due to loss of functionality. In the context of the South American subcontinent, the analysis makes use of readily available public data on seismicity and population exposure, and the hazard and vulnerability models for the region. The seismic hazard model was derived using available seismic catalogs, fault databases, and hazard methodologies analogous to the U.S. Geological Survey’s national seismic hazard mapping process. The Prompt Assessment of Global Earthquakes for Response (PAGER) system’s direct empirical vulnerability functions, in terms of fatalities and economic impact, were used for the exposure and risk analyses. The broad findings presented and the risk maps produced herein are preliminary, yet they offer important insights into the underlying zones of high and low seismic risk in the South American subcontinent. A more detailed analysis of risk may be warranted by engaging local experts, especially in some of the high-risk zones identified through the present investigation.
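The AAL concept above reduces, in its simplest discrete form, to a rate-weighted sum of scenario losses. The scenarios, rates, and losses below are invented for illustration; a real calculation would integrate over the full hazard curve and vulnerability functions.

```python
# AAL = sum_i (annual rate of scenario i) * (expected loss given scenario i).
# All numbers are hypothetical.
scenarios = [
    {"rate_per_yr": 0.010, "loss_musd": 5000.0},  # rare, large event
    {"rate_per_yr": 0.100, "loss_musd": 300.0},   # moderate event
    {"rate_per_yr": 1.000, "loss_musd": 5.0},     # frequent, small event
]
aal = sum(s["rate_per_yr"] * s["loss_musd"] for s in scenarios)
print(aal)  # average annual loss, million USD per year
```

Note how the rare large event and the frequent small ones can contribute comparably to AAL, which is why long-term hazard and vulnerability must both be modeled across the full magnitude range.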

  19. Pattern Informatics Approach to Earthquake Forecasting in 3D

    NASA Astrophysics Data System (ADS)

    Toya, Y.; Tiampo, K. F.; Rundle, J. B.; Chen, C.; Li, H.; Klein, W.

    2009-05-01

    Natural seismicity is correlated across multiple spatial and temporal scales, but correlations in seismicity prior to a large earthquake are locally subtle (e.g., seismic quiescence) and often prominent at broad scale (e.g., seismic activation), resulting in local and regional seismicity patterns, e.g., a Mogi donut. Recognizing that patterns in seismicity rate reflect the regional dynamics of the directly unobservable crustal stresses, the Pattern Informatics (PI) approach was introduced by Tiampo et al. (2002) [Europhys. Lett., 60(3), 481-487] and Rundle et al. (2002) [PNAS 99, suppl. 1, 2514-2521]. In this study, we expand the PI approach to forecasting earthquakes into the third (vertical) dimension, and illustrate the resulting improvement in forecasting performance through case studies of both natural and synthetic data. The PI characterizes rapidly evolving spatio-temporal seismicity patterns as angular drifts of a unit state vector in a high-dimensional correlation space, and systematically identifies anomalous shifts in seismic activity with respect to the regional background. 3D PI analysis is particularly advantageous over 2D analysis in resolving vertically overlapped seismicity anomalies in a highly complex tectonic environment. Case studies help to illustrate some important properties of the PI forecasting tool. [Submitted to: Concurrency and Computation: Practice and Experience, Wiley, Special Issue: ACES2008.]

  20. Impacts of potential seismic landslides on lifeline corridors.

    DOT National Transportation Integrated Search

    2015-02-01

    This report presents a fully probabilistic method for regional seismically induced landslide hazard analysis and mapping. The method considers the most current predictions for strong ground motions and seismic sources through use of the U.S.G.S. ...

  1. Viking-2 Seismometer Measurements on Mars: PDS Data Archive and Meteorological Applications

    NASA Astrophysics Data System (ADS)

    Lorenz, Ralph D.; Nakamura, Yosio; Murphy, James R.

    2017-11-01

    A data product has been generated and archived on the NASA Planetary Data System (Geosciences Node), which presents the seismometer readings of Viking Lander 2 in an easy-to-access form, for both the raw ("high rate") waveform records and the compressed ("event mode") amplitude and frequency records. In addition to the records themselves, a separate summary file for each instrument mode lists key statistics of each record together with the meteorological measurements made closest in time to the seismic record. This juxtaposition facilitates correlation of the seismometer instrument response to different meteorological conditions, or the selection of seismic data during which wind disturbances can be expected to be small. We summarize data quality issues and also discuss lander-generated seismic signals, due to operation of the sampling arm or other systems, which may be of interest for prospective missions to other bodies. We review wind-seismic correlation, the "Martian solar day (sol) 80" candidate seismic event, and identify the seismic signature of a probable dust devil vortex on sol 482: the seismometer data allow an estimate of the peak wind, occurring between coarsely spaced meteorology measurements. We present code to generate the plots in this paper to illustrate use of the data product.

  2. LANL seismic screening method for existing buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickson, S.L.; Feller, K.C.; Fritz de la Orta, G.O.

    1997-01-01

    The purpose of the Los Alamos National Laboratory (LANL) Seismic Screening Method is to provide a comprehensive, rational, and inexpensive method for evaluating the relative seismic integrity of a large building inventory using substantial life-safety as the minimum goal. The substantial life-safety goal is deemed to be satisfied if the extent of structural damage or nonstructural component damage does not pose a significant risk to human life. The screening is limited to Performance Category (PC) -0, -1, and -2 buildings and structures. Because of their higher performance objectives, PC-3 and PC-4 buildings automatically fail the LANL Seismic Screening Method and will be subject to a more detailed seismic analysis. The Laboratory has also designated that PC-0, PC-1, and PC-2 unreinforced masonry bearing wall and masonry infill shear wall buildings fail the LANL Seismic Screening Method because of their historically poor seismic performance or complex behavior. These building types are also recommended for a more detailed seismic analysis. The results of the LANL Seismic Screening Method are expressed in terms of separate scores for potential configuration or physical hazards (Phase One) and calculated capacity/demand ratios (Phase Two). This two-phase method allows the user to quickly identify buildings that have adequate seismic characteristics and structural capacity and screen them out from further evaluation. The resulting scores also provide a ranking of those buildings found to be inadequate. Thus, buildings not passing the screening can be rationally prioritized for further evaluation. For the purpose of complying with Executive Order 12941, the buildings failing the LANL Seismic Screening Method are deemed to have seismic deficiencies, and cost estimates for mitigation must be prepared. Mitigation techniques and cost-estimate guidelines are not included in the LANL Seismic Screening Method.

  3. Spatial Distribution of Seismic Anisotropy in the Crust in the Northeast Front Zone of Tibetan Plateau

    NASA Astrophysics Data System (ADS)

Gao, Y.; Wang, Q.; Shi, Y.

    2017-12-01

The northeastern margin of the Tibetan Plateau contains orogenic belts and undergoes strong deformation, and the media of the crust and upper mantle there are seismically anisotropic. This study uses records from permanent seismic stations and portable seismic arrays, together with body-wave analysis techniques, to obtain the spatial distribution of anisotropy in the northeastern front zone of the Tibetan Plateau. Using records of small local earthquakes, we study shear-wave splitting in the upper crust. The polarization of the fast shear wave (PFS) can be obtained; the PFS is considered parallel to the strike of cracks and to the direction of maximum horizontal compressive stress. However, the results show a strong influence from tectonic structures such as faults, suggesting multiple controls that include both stress and faulting. The spatial distribution of anisotropy in the study zone varies over short distances: the PFS at a station on a strike-slip fault can differ markedly from the PFS at a station only hundreds of meters away from the fault. Using teleseismic waveforms, we obtain whole-crust anisotropy from receiver functions. The PFS directions from Pms receiver functions are consistent, generally WNW, and the time delays of the slow S phases are significant. Using SKS, PKS and SKKS phases, we detect upper-mantle anisotropy by splitting analysis. The fast directions of these phases are also consistent, generally WNW, similar to those from the receiver functions but with larger time delays. This suggests significant seismic anisotropy in the crust and that crustal deformation is coherent with deformation in the upper mantle. The differences and tectonic implications of seismic anisotropy in the upper crust, the whole crust and the upper mantle are discussed. [Grateful to the support by NSFC Project 41474032]
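    Shear-wave splitting measurements such as the PFS estimates above can be illustrated with a toy grid search. The sketch below uses a simple rotation-plus-cross-correlation variant (the study's actual measurement technique is not specified in the abstract): it synthesizes a split S-wave on two horizontal components, then searches over fast-axis angle and delay.

```python
import numpy as np

def rotate(n, e, angle_deg):
    """Rotate N/E components into a candidate (fast, slow) frame."""
    a = np.radians(angle_deg)
    fast = n * np.cos(a) + e * np.sin(a)
    slow = -n * np.sin(a) + e * np.cos(a)
    return fast, slow

def splitting_parameters(n, e, dt, max_delay_s=0.2):
    """Grid search for the fast angle and delay that maximize the
    cross-correlation between the rotated components."""
    max_lag = int(max_delay_s / dt)
    best = (-np.inf, 0.0, 0.0)
    for ang in range(0, 180, 2):
        f, s = rotate(n, e, ang)
        for lag in range(1, max_lag + 1):
            c = np.corrcoef(f[:-lag], s[lag:])[0, 1]
            if c > best[0]:
                best = (c, ang, lag * dt)
    return best[1], best[2]

# synthetic split S-wave: fast axis at 30 deg from north, slow wave delayed 0.08 s
dt = 0.01
t = np.arange(0.0, 2.0, dt)
fast = np.exp(-((t - 0.5) / 0.05) ** 2)
slow = np.exp(-((t - 0.58) / 0.05) ** 2)
phi = np.radians(30.0)
north = fast * np.cos(phi) - slow * np.sin(phi)
east = fast * np.sin(phi) + slow * np.cos(phi)

ang, dly = splitting_parameters(north, east, dt)
print(ang, dly)  # recovers ~30 deg and ~0.08 s
```

    Real measurements add error surfaces and window selection; this sketch only shows why rotating into the fast/slow frame and removing the delay restores the correlation between the two components.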

  4. Inverse and Forward Modeling of The 2014 Iquique Earthquake with Run-up Data

    NASA Astrophysics Data System (ADS)

    Fuentes, M.

    2015-12-01

The April 1, 2014 Mw 8.2 Iquique earthquake excited a moderate tsunami that triggered the national tsunami alert. The earthquake was located in the well-known seismic gap of northern Chile, which has had a high seismic potential (~Mw 9.0) since the two large historical events of 1868 and 1877. Nonetheless, studies of the seismic source based on inversion of seismic data suggest that the event exhibited a main slip patch located around 19.8°S at 40 km depth, with a seismic moment equivalent to Mw 8.2. A large seismic deficit therefore remains in the gap, capable of releasing an event of Mw 8.8-8.9. To understand the importance of the tsunami threat in this zone, a seismic source modeling of the Iquique earthquake is performed. A new approach based on stochastic k-squared (k²) seismic sources is presented: a set of such sources is generated, and for each one a full numerical tsunami model is run to obtain run-up heights along the coastline. The results are compared with the available field run-up measurements and with the tide gauges that recorded the signal. The comparison is not uniform; it penalizes discrepancies more heavily near the peak run-up location. This criterion identifies, in a statistical sense, the seismic source from the set of scenarios that best explains the observations. In addition, an L2-norm minimization is used to invert the seismic source by comparing the peak nearshore tsunami amplitude (PNTA) with the run-up observations; this method searches a solution space for the best seismic configuration by retrieving the Green's-function coefficients that explain the field measurements. The results confirm that a concentrated down-dip slip patch adequately models the run-up data.

  5. MASW on the standard seismic prospective scale using full spread recording

    NASA Astrophysics Data System (ADS)

    Białas, Sebastian; Majdański, Mariusz; Trzeciak, Maciej; Gałczyński, Edward; Maksym, Andrzej

    2015-04-01

Multichannel Analysis of Surface Waves (MASW) is a seismic survey method that uses the dispersion curve of surface waves to characterize the stiffness of the near surface. It is used mainly at the geotechnical-engineering scale, with total spread lengths between 5 and 450 m and spread offsets between 1 and 100 m; a hammer is typically the seismic source in such surveys. The standard MASW procedure comprises data acquisition, dispersion analysis, and inversion of the extracted dispersion curve to obtain the closest theoretical curve. The final result includes shear-wave velocity (Vs) values at different depths along the surveyed lines. The main goal of this work is to extend this engineering method to a much larger scale, using a standard prospecting spread 20 km long with 4.5 Hz vertical-component geophones. Standard vibroseis and explosive sources are used. Acquisition was conducted on the full spread throughout every single shot. The seismic data used for this analysis were acquired during the Braniewo 2014 project in northern Poland. The results obtained with the standard MASW procedure show that the method can also be applied at a much larger scale; the main methodological difference is the need for a much stronger seismic source.

  6. Modeling earthquake rate changes in Oklahoma and Arkansas: possible signatures of induced seismicity

    USGS Publications Warehouse

    Llenos, Andrea L.; Michael, Andrew J.

    2013-01-01

    The rate of ML≥3 earthquakes in the central and eastern United States increased beginning in 2009, particularly in Oklahoma and central Arkansas, where fluid injection has occurred. We find evidence that suggests these rate increases are man‐made by examining the rate changes in a catalog of ML≥3 earthquakes in Oklahoma, which had a low background seismicity rate before 2009, as well as rate changes in a catalog of ML≥2.2 earthquakes in central Arkansas, which had a history of earthquake swarms prior to the start of injection in 2009. In both cases, stochastic epidemic‐type aftershock sequence models and statistical tests demonstrate that the earthquake rate change is statistically significant, and both the background rate of independent earthquakes and the aftershock productivity must increase in 2009 to explain the observed increase in seismicity. This suggests that a significant change in the underlying triggering process occurred. Both parameters vary, even when comparing natural to potentially induced swarms in Arkansas, which suggests that changes in both the background rate and the aftershock productivity may provide a way to distinguish man‐made from natural earthquake rate changes. In Arkansas we also compare earthquake and injection well locations, finding that earthquakes within 6 km of an active injection well tend to occur closer together than those that occur before, after, or far from active injection. Thus, like a change in productivity, a change in interevent distance distribution may also be an indicator of induced seismicity.
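    The authors use ETAS models precisely because earthquakes are clustered, so a plain Poisson comparison of pre- and post-injection rates overstates significance. Still, a first-order Poisson check conveys the flavor of a rate-change test. The counts below are made up for illustration and are not the paper's catalog values.

```python
import math

def poisson_sf(k, lam):
    """P(N >= k) for N ~ Poisson(lam)."""
    return 1.0 - sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k))

# hypothetical counts (illustrative only, not the paper's catalog values)
n_pre, years_pre = 4, 35     # background M>=3 events before 2009
n_post, years_post = 12, 3   # events observed after injection began
lam = (n_pre / years_pre) * years_post   # expected count under the old rate
p = poisson_sf(n_post, lam)
print(p)  # tiny p-value -> the rate change looks significant under Poisson assumptions
```

    Because aftershock clustering inflates counts relative to a Poisson process, the ETAS-based tests in the paper are the defensible version of this idea; the sketch only illustrates the comparison of an observed count against the pre-2009 rate.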

  7. Full Moment Tensor Analysis Using First Motion Data at The Geysers Geothermal Field

    NASA Astrophysics Data System (ADS)

    Boyd, O.; Dreger, D. S.; Lai, V. H.; Gritto, R.

    2012-12-01

Seismicity associated with geothermal energy production at The Geysers Geothermal Field in northern California has been increasing over the last forty years. We investigate source models of over fifty earthquakes with magnitudes ranging from Mw 3.5 to Mw 4.5. We invert three-component, complete waveform data from broadband stations of the Berkeley Digital Seismic Network, the Northern California Seismic Network and the USArray deployment (2005-2007) for the complete, six-element moment tensor. Some solutions are double-couple while others have substantial non-double-couple components. To assess the stability and significance of the non-double-couple components, we use a suite of diagnostic tools including the F-test, jackknife test, bootstrap and network sensitivity solution (NSS). The full moment tensor solutions of the studied events tend to plot in the upper half of the Hudson source-type diagram, where the fundamental source types include +CLVD, +LVD, tensile crack, DC and explosion. Using the F-test to compare the goodness-of-fit values between the full and deviatoric moment tensor solutions, most of the full moment tensor solutions do not show a statistically significant improvement in fit over the deviatoric solutions. Because a small isotropic component may not significantly improve the fit, we include first-motion polarity data to better constrain the full moment tensor solutions.
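    The F-test logic above, comparing a 5-parameter deviatoric solution against a 6-parameter full moment tensor, is an instance of the generic nested-model F-test. The sketch below uses the standard least-squares formulation with toy residual sums of squares; the authors' exact formulation and degrees-of-freedom accounting may differ.

```python
def nested_f_stat(rss_reduced, p_reduced, rss_full, p_full, n):
    """F statistic for nested least-squares models: does the extra
    parameter of the full model significantly reduce the misfit?"""
    num = (rss_reduced - rss_full) / (p_full - p_reduced)
    den = rss_full / (n - p_full)
    return num / den

# toy numbers (illustrative): deviatoric (5 parameters) vs full (6) moment tensor
n = 300  # assumed number of independent waveform samples
f = nested_f_stat(rss_reduced=10.2, p_reduced=5, rss_full=10.0, p_full=6, n=n)
print(round(f, 2))  # compare against the F(1, n-6) critical value
```

    If the computed statistic falls below the critical value of the F distribution at the chosen confidence level, the isotropic component is not statistically required, which is the situation the abstract reports for most events.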

  8. A Viscoelastic earthquake simulator with application to the San Francisco Bay region

    USGS Publications Warehouse

    Pollitz, Fred F.

    2009-01-01

    Earthquake simulation on synthetic fault networks carries great potential for characterizing the statistical patterns of earthquake occurrence. I present an earthquake simulator based on elastic dislocation theory. It accounts for the effects of interseismic tectonic loading, static stress steps at the time of earthquakes, and postearthquake stress readjustment through viscoelastic relaxation of the lower crust and mantle. Earthquake rupture initiation and termination are determined with a Coulomb failure stress criterion and the static cascade model. The simulator is applied to interacting multifault systems: one, a synthetic two-fault network, and the other, a fault network representative of the San Francisco Bay region. The faults are discretized both along strike and along dip and can accommodate both strike slip and dip slip. Stress and seismicity functions are evaluated over 30,000 yr trial time periods, resulting in a detailed statistical characterization of the fault systems. Seismicity functions such as the coefficient of variation and a- and b-values exhibit systematic patterns with respect to simple model parameters. This suggests that reliable estimation of the controlling parameters of an earthquake simulator is a prerequisite to the interpretation of its output in terms of seismic hazard.
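    Two of the seismicity functions named above, the coefficient of variation of interevent times and the b-value, are easy to compute from a simulated catalog. The sketch below uses a synthetic Poissonian catalog (illustrative, not the simulator's output) and the Aki (1965) maximum-likelihood b-value estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic catalog (illustrative): Poissonian occurrence times and
# Gutenberg-Richter magnitudes with b = 1.0 above completeness Mc = 3.0
b_true, mc, n = 1.0, 3.0, 5000
mags = mc + rng.exponential(scale=np.log10(np.e) / b_true, size=n)
times = np.sort(rng.uniform(0.0, 30000.0, size=n))  # years

# coefficient of variation of interevent times (1.0 for a Poisson process)
dt = np.diff(times)
cov = dt.std() / dt.mean()

# Aki (1965) maximum-likelihood b-value: b = log10(e) / (mean(M) - Mc)
b_est = np.log10(np.e) / (mags.mean() - mc)
print(round(cov, 2), round(b_est, 2))  # both near 1.0 for this catalog
```

    In the simulator's output, departures of the coefficient of variation from 1 would indicate clustering (>1) or quasi-periodic recurrence (<1), which is why it serves as a diagnostic of the controlling model parameters.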

  9. A synoptic view of the Third Uniform California Earthquake Rupture Forecast (UCERF3)

    USGS Publications Warehouse

Field, Edward; Jordan, Thomas H.; Page, Morgan T.; Milner, Kevin R.; Shaw, Bruce E.; Dawson, Timothy E.; Biasi, Glenn; Parsons, Thomas E.; Hardebeck, Jeanne L.; Michael, Andrew J.; Weldon, Ray; Powers, Peter; Johnson, Kaj M.; Zeng, Yuehua; Bird, Peter; Felzer, Karen; van der Elst, Nicholas; Madden, Christopher; Arrowsmith, Ramon; Werner, Maximilian J.; Thatcher, Wayne R.

    2017-01-01

    Probabilistic forecasting of earthquake‐producing fault ruptures informs all major decisions aimed at reducing seismic risk and improving earthquake resilience. Earthquake forecasting models rely on two scales of hazard evolution: long‐term (decades to centuries) probabilities of fault rupture, constrained by stress renewal statistics, and short‐term (hours to years) probabilities of distributed seismicity, constrained by earthquake‐clustering statistics. Comprehensive datasets on both hazard scales have been integrated into the Uniform California Earthquake Rupture Forecast, Version 3 (UCERF3). UCERF3 is the first model to provide self‐consistent rupture probabilities over forecasting intervals from less than an hour to more than a century, and it is the first capable of evaluating the short‐term hazards that result from multievent sequences of complex faulting. This article gives an overview of UCERF3, illustrates the short‐term probabilities with aftershock scenarios, and draws some valuable scientific conclusions from the modeling results. In particular, seismic, geologic, and geodetic data, when combined in the UCERF3 framework, reject two types of fault‐based models: long‐term forecasts constrained to have local Gutenberg–Richter scaling, and short‐term forecasts that lack stress relaxation by elastic rebound.

  10. Seismic signatures of carbonate caves affected by near-surface absorptions

    NASA Astrophysics Data System (ADS)

    Rao, Ying; Wang, Yanghua

    2015-12-01

The near-surface absorption within a low-velocity zone has a generally exponential attenuation effect on seismic waves. But how does this absorption affect the seismic signatures of karstic caves in deep carbonate reservoirs? Seismic simulation and analysis reveal that, although near-surface absorption attenuates the wave energy of a continuous reflection, it does not alter the basic kinematic shape of bead-string reflections, a distinctive seismic characteristic associated with carbonate caves in the Tarim Basin, China. Therefore, the bead-strings in seismic profiles can be utilized with high confidence for interpreting the existence of caves within the deep carbonate reservoirs and for evaluating their pore spaces. Nevertheless, the difference between the central frequency and the peak frequency increases as the absorption increases. While the wave energy of bead-string reflections remains strong, owing to the interference of seismic multiples generated by the large impedance contrast between the infill materials of a cave and the surrounding carbonate rocks, the central frequency shifts linearly with the near-surface absorption. These two features can be exploited simultaneously for a stable attenuation analysis of field seismic data.
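    The frequency-shift behaviour described above can be reproduced with a minimal numerical sketch. It assumes a Ricker source spectrum and the common exp(-pi f t*) absorption operator with made-up t* values; both the source spectrum and the t* values are assumptions, not the paper's model.

```python
import numpy as np

# amplitude spectrum of a Ricker wavelet with dominant frequency f0 (Hz)
f = np.linspace(0.1, 120.0, 2000)
f0 = 30.0
ricker = (f / f0) ** 2 * np.exp(-((f / f0) ** 2))

peaks, diffs = [], []
for t_star in (0.0, 0.01, 0.02):  # t* = travel time / Q of the low-velocity zone (assumed)
    spec = ricker * np.exp(-np.pi * f * t_star)  # frequency-dependent absorption
    f_peak = f[np.argmax(spec)]
    f_cent = np.sum(f * spec) / np.sum(spec)     # spectral centroid ("central frequency")
    peaks.append(f_peak)
    diffs.append(f_cent - f_peak)
    print(f"t*={t_star:.2f} s: peak {f_peak:.1f} Hz, centroid {f_cent:.1f} Hz")
```

    The peak frequency moves down as absorption grows, and the centroid-minus-peak difference widens relative to the unattenuated case, which is the diagnostic the abstract proposes to exploit.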

  11. A new approach to geographic partitioning of probabilistic seismic hazard using seismic source distance with earthquake extreme and perceptibility statistics: an application to the southern Balkan region

    NASA Astrophysics Data System (ADS)

    Bayliss, T. J.

    2016-02-01

The southeastern European cities of Sofia and Thessaloniki are explored as example site-specific scenarios by geographically zoning their individual localized seismic sources based on the highest probabilities of magnitude exceedance, with the aim of determining the major components contributing to each city's seismic hazard. Discrete contributions from the selected input earthquake catalogue are investigated to determine the areas that dominate each city's prevailing seismic hazard with respect to magnitude and source-to-site distance. This work is based on an earthquake catalogue developed and described in a previously published paper by the author, together with components of a magnitude probability density function. Binned magnitude and distance classes are defined using a joint magnitude-distance distribution. The seismicity prevailing at each city, as defined by a child data set extracted from the parent earthquake catalogue for each city considered, is divided into distinct constrained data bins of small discrete magnitude and source-to-site distance intervals. These are then used to describe seismic hazard in terms of univariate modal values, M* and D*, the modal magnitude and modal source-to-site distance in each city's local historical seismicity. This work highlights that Sofia's dominant seismic hazard, that is, the modal magnitudes possessing the highest probabilities of occurrence, is located in zones confined to two regions at 60-80 km and 170-180 km from the city, for magnitude intervals of 5.75-6.00 Mw and 6.00-6.25 Mw respectively. Similarly, Thessaloniki appears prone to its highest levels of hazard over a wider epicentral distance interval, from 80 to 200 km, in the moment magnitude range 6.00-6.25 Mw.
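    The modal values M* and D* come from binning a city's catalog into a joint magnitude-distance histogram and taking the most populated cell. A minimal sketch, with a randomly generated stand-in catalog (the real analysis would use the city-specific child data set and probability weighting):

```python
import numpy as np

rng = np.random.default_rng(42)

# hypothetical catalog for one city: magnitudes (Mw) and epicentral distances (km)
mags = rng.uniform(4.0, 7.0, 500)
dists = rng.uniform(10.0, 300.0, 500)

# bin into 0.25-Mw by 10-km classes and take the modal (most populated) cell
m_edges = np.arange(4.0, 7.25, 0.25)
d_edges = np.arange(0.0, 310.0, 10.0)
hist, _, _ = np.histogram2d(mags, dists, bins=[m_edges, d_edges])
i, j = np.unravel_index(np.argmax(hist), hist.shape)
m_star = 0.5 * (m_edges[i] + m_edges[i + 1])   # modal magnitude M*
d_star = 0.5 * (d_edges[j] + d_edges[j + 1])   # modal source-to-site distance D*
print(m_star, d_star)
```

    With a real catalog the modal cell concentrates where the hazard-dominating events sit, which is how the 60-80 km and 170-180 km zones for Sofia emerge.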

  12. Tsallis non-additive entropy and natural time analysis of seismicity

    NASA Astrophysics Data System (ADS)

    Sarlis, N. V.; Skordas, E. S.; Varotsos, P.

    2017-12-01

Within the context of Tsallis non-additive entropy statistical mechanics [1], in the frame of which kappa distributions arise [2,3], a derivation of the Gutenberg-Richter (GR) law of seismicity has been proposed [4,5]. Such an analysis leads to a generalized GR law [6,7], which is applied here to the earthquakes in Japan and California. These seismic data are also studied in natural time [6], revealing that although some properties of seismicity may be recovered by the non-additive entropy approach, temporal correlations between successive earthquake magnitudes should also be taken into account [6,8]. The importance of such correlations is strengthened by the observation of periods of long-range correlated earthquake magnitude time series [9] a few months before all earthquakes of magnitude 7.6 or larger in the entire Japanese area from 1 January 1984 to 11 March 2011 (the day of the magnitude 9.0 Tohoku-Oki earthquake), almost simultaneously with characteristic order-parameter variations of seismicity [10]. These variations appear approximately when low-frequency (below about 1 Hz) abnormal changes of the electric and magnetic field of the Earth are recorded [11] before strong earthquakes, such as the magnitude 9.0 Tohoku-Oki earthquake in Japan in 2011 [12].
[1] C. Tsallis, J. Stat. Phys. 52 (1988) 479.
[2] G. Livadiotis and D. J. McComas, J. Geophys. Res. 114 (2009) A11105.
[3] G. Livadiotis, Kappa Distributions (Elsevier, Amsterdam, 2017). doi: 10.1016/B978-0-12-804638-8.01001-9
[4] O. Sotolongo-Costa and A. Posadas, Phys. Rev. Lett. 92 (2004) 048501.
[5] R. Silva, G. França, C. Vilar, and J. Alcaniz, Phys. Rev. E 73 (2006) 026102.
[6] N. Sarlis, E. Skordas, and P. Varotsos, Phys. Rev. E 82 (2010) 021110.
[7] L. Telesca, Bull. Seismol. Soc. Am. 102 (2012) 886-891.
[8] P. Varotsos, N. Sarlis, and E. Skordas, Natural Time Analysis: The New View of Time (Springer, Berlin, 2011). doi: 10.1007/978-3-642-16449-1
[9] P. Varotsos, N. Sarlis, and E. Skordas, J. Geophys. Res. Space Physics 119 (2014) 9192.
[10] N. Sarlis, E. Skordas, P. Varotsos, T. Nagao, M. Kamogawa, H. Tanaka, and S. Uyeda, Proc. Natl. Acad. Sci. USA 110 (2013) 13734.
[11] P. Varotsos, N. Sarlis, E. Skordas, and M. Lazaridou-Varotsos, Earthq. Sci. 30 (2017). doi: 10.1007/s11589-017-0189-0
[12] P. Varotsos, N. Sarlis, and E. Skordas, Earthq. Sci. 30 (2017). doi: 10.1007/s11589-017-0182-7
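    In natural time analysis, the k-th event of a series of N events is assigned the "natural time" chi_k = k/N, and the events are weighted by their released energy. The variance kappa_1 = <chi^2> - <chi>^2 of natural time under these weights serves as an order parameter of seismicity. A minimal sketch of this quantity, following the published definition, with an illustrative random magnitude series rather than a real catalog:

```python
import numpy as np

def kappa1(magnitudes):
    """Natural-time order parameter kappa_1 = <chi^2> - <chi>^2,
    with chi_k = k/N and weights p_k proportional to seismic
    energy, taken here as 10**(1.5 * M)."""
    m = np.asarray(magnitudes, dtype=float)
    n = len(m)
    chi = np.arange(1, n + 1) / n
    p = 10.0 ** (1.5 * m)
    p /= p.sum()
    return np.sum(p * chi ** 2) - np.sum(p * chi) ** 2

# illustrative Gutenberg-Richter-like magnitude series (not a real catalog)
rng = np.random.default_rng(1)
mags = 3.0 + rng.exponential(np.log10(np.e), size=200)  # b ~ 1
k1 = kappa1(mags)
print(round(k1, 3))
```

    Because the weights are dominated by the largest events, kappa_1 is sensitive to where in the sequence the strong events occur, which is what makes it useful as a precursory order parameter in the studies cited above.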

  13. Post-seismic relaxation following the 2009 April 6, L'Aquila (Italy), earthquake revealed by the mass position of a broad-band seismometer

    NASA Astrophysics Data System (ADS)

    Pino, Nicola Alessandro

    2012-06-01

Post-seismic relaxation is known to occur after large or moderate earthquakes, on time scales ranging from days to years or even decades. In general, long-term deformation following seismic events has been detected by means of standard geodetic measurements, while seismic instruments are typically used only to estimate short-timescale transient processes. Although inertial seismic sensors are also sensitive to rotation around their sensitive axes, recording very slow inclination of the ground surface through their standard output channels is practically impossible because of their design characteristics. However, modern force-balance broadband seismometers offer the possibility of detecting and measuring slow surface inclination through analysis of the mass-position signal. This output channel represents the integral of the broadband velocity and is generally considered only for state-of-health diagnostics. In fact, analysis of the mass-position data recorded at the time of the 2009 April 6, L'Aquila (MW = 6.3) earthquake by a closely located STS-2 seismometer revealed a very low frequency signal starting right at the time of the seismic event. This waveform is visible only on the horizontal components and is not related to the usual drift coupled with temperature changes. The analysis suggests that the observed signal is to be ascribed to slowly developing ground inclination at the station site, caused by post-seismic relaxation following the main shock. The observed tilt reached 1.7 × 10-5 rad in about 2 months. This estimate is in very good agreement with the geodetic observations, which give comparable tilt magnitude and direction at the same site. This study represents the first seismic analysis of the mass-position signal, suggesting useful applications for these usually neglected data.
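    The physical basis of such a measurement is that a static tilt theta of a horizontal sensor projects gravity onto its sensitive axis, producing an apparent horizontal acceleration a = g sin(theta). A small sketch of this equivalence (the conversion is standard; the specific instrument response is not modeled here):

```python
import math

g = 9.81  # m/s^2

def tilt_from_acceleration(a):
    """Static tilt (rad) producing an apparent horizontal acceleration a,
    since a tilted horizontal sensor senses the gravity component g*sin(theta)."""
    return math.asin(a / g)

# the ~1.7e-5 rad tilt reported in the study corresponds to an apparent acceleration:
theta = 1.7e-5
a = g * math.sin(theta)
print(a)  # ~1.7e-4 m/s^2, tiny but within reach of a broadband mass-position channel
```

    This is why the signal appears only on the horizontal components: a vertical sensor sees g cos(theta), which changes only to second order in theta.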

  14. Geodynamic Evolution of Northeastern Tunisia During the Maastrichtian-Paleocene Time: Insights from Integrated Seismic Stratigraphic Analysis

    NASA Astrophysics Data System (ADS)

    Abidi, Oussama; Inoubli, Mohamed Hédi; Sebei, Kawthar; Amiri, Adnen; Boussiga, Haifa; Nasr, Imen Hamdi; Salem, Abdelhamid Ben; Elabed, Mahmoud

    2017-05-01

The Maastrichtian-Paleocene El Haria formation was defined and studied in Tunisia on the basis of outcrops and borehole data, but few studies have addressed its three-dimensional extent. In this paper, the El Haria formation is reviewed in the context of a tectono-stratigraphic interval using an integrated seismic stratigraphic analysis based on borehole lithology logs, electrical well logging, well shots, vertical seismic profiles and post-stack surface data. The seismic analysis benefits from appropriate calibration with borehole data, conventional interpretation, velocity mapping, seismic attributes and post-stack model-based inversion. The applied methodology proved powerful for characterizing the marly Maastrichtian-Paleocene interval of the El Haria formation. Migrated seismic sections together with borehole measurements are used to detail the three-dimensional changes in thickness, facies and depositional environment in the Cap Bon and Gulf of Hammamet regions during Maastrichtian-Paleocene time. Furthermore, dating based on microfossil content reveals multiple local internal hiatuses within the El Haria formation, related to the geodynamic evolution of the depositional floor since the Campanian stage. Interpreted seismic sections display concordances, unconformities, pinch-outs, sedimentary gaps, incised valleys and syn-sedimentary normal faulting. Based on the seismic reflection geometry and terminations, seven sequences are delineated. These sequences are related to base-level changes resulting from the combination of depositional-floor paleo-topography, tectonic forces, subsidence and the accommodation space that developed; these factors controlled the occurrence of the various parts of the Maastrichtian-Paleocene interval. Detailed examination of these deposits, together with analysis of the structural deformation at different time periods, allowed us to better understand the sediment architecture at depth and to delineate the geodynamic evolution of the region.

  15. A global database of seismically and non-seismically triggered landslides for 2D/3D numerical modeling

    NASA Astrophysics Data System (ADS)

    Domej, Gisela; Bourdeau, Céline; Lenti, Luca; Pluta, Kacper

    2017-04-01

Landsliding is a common phenomenon worldwide. Every year, landslides ranging in size from very small to enormous all too often cause loss of life and disastrous damage to infrastructure, property and the environment. One main reason for more frequent catastrophes is the growth of the world's population, which extends urbanization into areas at risk. Landslides are triggered by a variety and combination of causes, among which water and seismic activity appear to have the most serious consequences. In this regard, seismic shaking is of particular interest, since topographic relief as well as the landslide mass itself can trap waves and hence amplify incoming surface waves, a phenomenon known as "site effects". Research on landsliding due to seismic and non-seismic activity is extensive, and a broad spectrum of methods for modeling slope deformation is available. These methods range from pseudo-static and rigid-block models to numerical models. Most are limited to 2D modeling, since more sophisticated 3D approaches are still under development or calibration. However, the effect of lateral confinement as well as the mechanical properties of the adjacent bedrock might be of great importance, because they may enhance the focusing of trapped waves in the landslide mass. A database was created to study 3D landslide geometries. It currently contains 277 distinct seismically and non-seismically triggered landslides spread around the globe, whose rupture bodies were measured in all available detail. A specific methodology was developed to maintain predefined standards, to keep bias as low as possible and to set up a query tool for exploring the database. Besides geometry, additional information such as location, date, triggering factors, material, sliding mechanisms, event chronology, consequences and related literature is stored for every case. The aim of the database is to enable statistical analysis on a vast and newly updated data set and, in the future, to create numerical models. Groups of landslides sharing the same characteristics can be defined, or cases belonging to different groups can be compared in their responses to external loads. Different options thus exist for creating input data for numerical models. This is especially promising considering the possibility of comparing 2D and 3D models that share the same framework conditions (i.e. geometry, material, etc.). Such comparison of 2D and 3D approaches might contribute to a better understanding of landsliding phenomena and thus to improved hazard prevention.

  16. Added-value joint source modelling of seismic and geodetic data

    NASA Astrophysics Data System (ADS)

    Sudhaus, Henriette; Heimann, Sebastian; Walter, Thomas R.; Krueger, Frank

    2013-04-01

In tectonically active regions, earthquake source studies strongly support the analysis of current faulting processes, as they reveal the location and geometry of active faults, the average slip released, and more. For source modelling of shallow, moderate to large earthquakes, a combination of geodetic (GPS, InSAR) and seismic data is often used. A truly joint use of these data, however, usually takes place only at a higher modelling level, where some of the first-order characteristics (time, centroid location, fault orientation, moment) have already been fixed. These required basic model parameters have to be given, assumed or inferred in a previous, separate and highly non-linear modelling step using one of these data sets alone. We present a new earthquake rupture model implementation that realizes a fully combined integration of surface displacement measurements and seismic data in a non-linear optimization of simple but extended planar ruptures. The implementation allows fast forward calculation of full seismograms and surface deformation, and therefore enables us to use Monte Carlo global search algorithms. Furthermore, we benefit from the complementary character of seismic and geodetic data, e.g. the high precision of the source location from geodetic data and the sensitivity of seismic data to moment release at greater depth. These increased constraints from the combined data set make optimizations efficient, even for larger model parameter spaces and with very limited a priori assumptions about the source. A vital part of our approach is rigorous data weighting based on empirically estimated data errors. We construct full error variance-covariance matrices for the geodetic data to account for correlated data noise, and we also weight the seismic data based on their signal-to-noise ratio. The estimation of the data errors and the fast forward modelling open the door for Bayesian inference of the source model parameters. The resulting source model features parameter uncertainty estimates and reveals parameter trade-offs that arise from imperfect data coverage and data errors. We applied our new source modelling approach to the 2010 Haiti earthquake, for which a number of apparently different seismic, geodetic and joint source models have already been reported, mostly without model parameter uncertainty estimates. We show that the variability among these source models seems to arise from inherent model parameter trade-offs and mostly has little statistical significance; e.g., even using a large data set comprising seismic and geodetic data, the confidence interval of the fault dip remains as wide as about 20 degrees.
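    The data-weighting idea above, full covariance for correlated geodetic noise and per-trace noise estimates for seismic data, combines into a single misfit of the form r^T C^-1 r. A toy sketch (the residual values and covariance entries are illustrative, not from the study):

```python
import numpy as np

def joint_misfit(r_geo, C_geo, r_seis, sigma_seis):
    """Combined weighted misfit: geodetic residuals weighted by the full
    data covariance, seismic residuals by per-trace noise estimates."""
    chi2_geo = r_geo @ np.linalg.solve(C_geo, r_geo)       # r^T C^-1 r
    chi2_seis = np.sum((r_seis / sigma_seis) ** 2)         # diagonal weighting
    return chi2_geo + chi2_seis

# toy numbers: correlated geodetic noise, independent seismic noise
C_geo = np.array([[1.0, 0.6], [0.6, 1.0]])
r_geo = np.array([0.5, -0.2])
r_seis = np.array([0.1, 0.3, -0.2])
sigma_seis = np.array([0.2, 0.2, 0.2])
chi2 = joint_misfit(r_geo, C_geo, r_seis, sigma_seis)
print(round(chi2, 3))
```

    With empirically estimated C and sigma, this misfit is (up to a constant) a negative log-likelihood, which is what makes the Monte Carlo sampling Bayesian rather than merely a best-fit search.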

  17. Data Quality Control of the French Permanent Broadband Network in the RESIF Framework

    NASA Astrophysics Data System (ADS)

    Grunberg, Marc; Lambotte, Sophie; Engels, Fabien; Dretzen, Remi; Hernandez, Alain

    2014-05-01

In the framework of the RESIF (Réseau Sismologique et géodésique Français) project, a new information system is being set up to improve the management and distribution of high-quality data from the different elements of RESIF and the associated networks. Within this information system, EOST (in Strasbourg) is in charge of collecting real-time permanent broadband seismic waveforms and performing quality control on these data. The real-time and validated data sets are pushed to the French national distribution center (ISTerre, Grenoble) to make them publicly available. Furthermore, EOST hosts the BCSF-ReNaSS, in charge of the French metropolitan seismic bulletin, which allows it to benefit from high-end quality control based on the national and worldwide seismicity. Here we first present the real-time seismic data flow from the stations of the French national broadband network to EOST, and then the recently installed data quality control procedures, including some new developments. Quality control consists of applying a variety of subprocesses to check the consistency of the whole system and processing chain, from the stations to the data center. This allows us to verify that instruments and data transmission are operating correctly; in addition, analysis of the ambient noise helps characterize the intrinsic seismic quality of the stations and identify other kinds of disturbances. The deployed quality control is a pipeline that starts with low-level procedures: checking the real-time miniSEED data files (file-naming convention, data integrity), checking for inconsistencies between waveforms and metadata (channel name, sample rate, etc.), and computing waveform statistics (data availability, gaps/overlaps, mean, rms, timing quality, spikes). It is followed by higher-level procedures such as power spectral density (PSD) computation, STA/LTA computation to be correlated with the seismicity, phase picking, and station magnitude discrepancy checks. The results of quality control are visualized through a web interface, which gathers data from different information systems to provide a global view of recent events that could affect the data (such as on-site interventions or seismic events). This is still an ongoing project; we intend to add more sophisticated procedures to enhance our data quality control, among them a seismic moment tensor inversion tool for amplitude, time and polarity control, and a noise-correlation procedure for detecting clock drift.
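    The STA/LTA computation mentioned among the higher-level procedures compares short-term to long-term average signal energy to flag transients. A minimal sketch with centered moving-average windows on a synthetic trace (production triggers, e.g. in ObsPy, typically use trailing recursive windows; the window lengths here are assumed values):

```python
import numpy as np

def sta_lta(x, dt, sta_win=1.0, lta_win=30.0):
    """Simplified STA/LTA ratio on a squared-amplitude (energy) trace,
    using centered moving averages."""
    e = x ** 2
    ns, nl = int(sta_win / dt), int(lta_win / dt)
    sta = np.convolve(e, np.ones(ns) / ns, mode="same")
    lta = np.convolve(e, np.ones(nl) / nl, mode="same")
    lta[lta <= 0] = np.finfo(float).tiny   # avoid division by zero
    return sta / lta

# synthetic trace: unit noise with a transient "event" at t = 60 s
dt = 0.01
t = np.arange(0.0, 120.0, dt)
rng = np.random.default_rng(7)
x = rng.normal(0.0, 1.0, t.size)
x[6000:6200] += 10.0 * np.sin(2 * np.pi * 5.0 * t[:200])  # 2-s, 5-Hz burst

ratio = sta_lta(x, dt)
t_peak = t[np.argmax(ratio)]
print(t_peak)  # peak of the ratio near 60 s
```

    In a QC pipeline, the detection times from such a trigger are compared against the known seismicity; stations that systematically miss expected detections, or fire spuriously, are flagged for inspection.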

  18. Mobility Effect on Poroelastic Seismic Signatures in Partially Saturated Rocks With Applications in Time-Lapse Monitoring of a Heavy Oil Reservoir

    NASA Astrophysics Data System (ADS)

    Zhao, Luanxiao; Yuan, Hemin; Yang, Jingkang; Han, De-hua; Geng, Jianhua; Zhou, Rui; Li, Hui; Yao, Qiuliang

    2017-11-01

Conventional seismic analysis in partially saturated rocks normally emphasizes estimating pore fluid content and saturation, typically ignoring the effect of mobility, which governs the ability of fluids to move through the porous rock. Deformation caused by a seismic wave in heterogeneous, partially saturated media can induce pore-fluid pressure relaxation at the mesoscopic scale, thereby making fluid mobility inherently associated with poroelastic reflectivity. For two typical gas-brine reservoir models with given rock and fluid properties, numerical analysis suggests that variations in patchy fluid saturation, fluid compressibility contrast, and the acoustic stiffness of the rock frame collectively affect the dependence of seismic reflection on mobility. In particular, the realistic compressibility contrast of fluid patches in shallow and deep reservoir environments plays an important role in determining the reflection sensitivity to mobility. We also use a time-lapse seismic data set from a Steam-Assisted Gravity Drainage heavy-oil reservoir to demonstrate that mobility change coupled with patchy saturation can lead to seismic spectral energy shifting from the baseline to the monitor survey. Our workflow starts by performing seismic spectral analysis on the targeted reflectivity interface. Then, on the basis of mesoscopic fluid-pressure diffusion between patches of steam and heavy oil, poroelastic reflectivity modeling is conducted to understand the shift of the central frequency toward lower frequencies after steam injection. The presented results open the possibility of monitoring mobility changes of a partially saturated geological formation from dissipation-related seismic attributes.

  19. Probabilistic seismic hazard analysis for a nuclear power plant site in southeast Brazil

    NASA Astrophysics Data System (ADS)

    de Almeida, Andréia Abreu Diniz; Assumpção, Marcelo; Bommer, Julian J.; Drouet, Stéphane; Riccomini, Claudio; Prates, Carlos L. M.

    2018-05-01

A site-specific probabilistic seismic hazard analysis (PSHA) has been performed for the only nuclear power plant site in Brazil, located 130 km southwest of Rio de Janeiro at Angra dos Reis. Logic trees were developed for both the seismic source characterisation and ground-motion characterisation models, in both cases seeking to capture the appreciable ranges of epistemic uncertainty with relatively few branches. This logic-tree structure allowed the hazard calculations to be performed efficiently while obtaining results that reflect the inevitable uncertainty in long-term seismic hazard assessment in this tectonically stable region. An innovative feature of the study is an additional seismic source zone added to capture the potential contributions of characteristic earthquakes associated with geological faults in the region surrounding the coastal site.

  20. Induced earthquake magnitudes are as large as (statistically) expected

    USGS Publications Warehouse

    Van Der Elst, Nicholas; Page, Morgan T.; Weiser, Deborah A.; Goebel, Thomas; Hosseini, S. Mehran

    2016-01-01

    A major question for the hazard posed by injection-induced seismicity is how large induced earthquakes can be. Are their maximum magnitudes determined by injection parameters or by tectonics? Deterministic limits on induced earthquake magnitudes have been proposed based on the size of the reservoir or the volume of fluid injected. However, if induced earthquakes occur on tectonic faults oriented favorably with respect to the tectonic stress field, then they may be limited only by the regional tectonics and connectivity of the fault network. In this study, we show that the largest magnitudes observed at fluid injection sites are consistent with the sampling statistics of the Gutenberg-Richter distribution for tectonic earthquakes, assuming no upper magnitude bound. The data pass three specific tests: (1) the largest observed earthquake at each site scales with the log of the total number of induced earthquakes, (2) the order of occurrence of the largest event is random within the induced sequence, and (3) the injected volume controls the total number of earthquakes rather than the total seismic moment. All three tests point to an injection control on earthquake nucleation but a tectonic control on earthquake magnitude. Given that the largest observed earthquakes are exactly as large as expected from the sampling statistics, we should not conclude that these are the largest earthquakes possible. Instead, the results imply that induced earthquake magnitudes should be treated with the same maximum magnitude bound that is currently used to treat seismic hazard from tectonic earthquakes.
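Test (1) above, the largest observed magnitude growing with the log of the event count, follows directly from sampling an unbounded Gutenberg-Richter distribution. A minimal sketch; the b value, completeness magnitude, and event counts are illustrative assumptions, not the study's data:

```python
import math
import random

def sample_gr(n, b=1.0, m_c=3.0, rng=random):
    """Draw n magnitudes from an unbounded Gutenberg-Richter
    (exponential) distribution above completeness magnitude m_c."""
    return [m_c - math.log10(rng.random()) / b for _ in range(n)]

random.seed(0)
b, m_c = 1.0, 3.0
for n in (100, 1000, 10000):
    m_max = max(sample_gr(n, b, m_c))
    # Sampling statistics predict E[m_max] ~ m_c + log10(n)/b, i.e. the
    # largest event grows with the log of the number of induced events.
    print(n, round(m_max, 2), round(m_c + math.log10(n) / b, 2))
```

With no upper magnitude bound, tenfold more events raises the expected maximum by 1/b magnitude units, which is the scaling the data pass in test (1).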

  1. Variation of b and p values from aftershocks sequences along the Mexican subduction zone and their relation to plate characteristics

    NASA Astrophysics Data System (ADS)

    Ávila-Barrientos, L.; Zúñiga, F. R.; Rodríguez-Pérez, Q.; Guzmán-Speziale, M.

    2015-11-01

Aftershock sequences along the Mexican subduction margin (between 110°W and 91°W) were analyzed by means of the p value of the Omori-Utsu relation and the b value of the Gutenberg-Richter relation. We focused on recent medium to large (Mw > 5.6) events considered capable of generating aftershock sequences suitable for analysis. The main goal was to search for a possible correlation between aftershock parameters and plate characteristics, such as displacement rate, age, and segmentation. The Mexican subduction regime is one of the most active regions of the world, with a high frequency of medium to large events, and plate characteristics change along the subduction margin. Previous studies have observed differences in seismic source characteristics along the subduction regime, which may indicate differences in rheology and possible segmentation. The results of the aftershock-sequence analysis indicate a slight tendency for p values to decrease from west to east with increasing plate age, although statistical significance is undermined by the small number of aftershocks in the sequences, a feature distinctive of the region compared with other subduction regimes worldwide. The b values show the opposite, increasing trend towards the east, even though statistical significance is not sufficient to validate this trend. A linear regression between the two parameters provides additional support for the inverse relation. Moreover, we calculated the seismic coupling coefficient, which shows a direct relation with the p and b values. While we cannot definitively confirm the hypothesis that aftershock generation depends on certain tectonic characteristics (age, thickness, temperature), our results do not reject it, encouraging further study of this question.
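The two parameters analyzed above can be estimated with standard formulas: the maximum-likelihood b value of Aki (1965), with Utsu's correction for binned magnitudes, and the Omori-Utsu aftershock decay rate. A minimal sketch with illustrative numbers, not the catalogs used in the study:

```python
import math

def b_value_aki(mags, m_c, dm=0.1):
    """Maximum-likelihood b value (Aki 1965), with Utsu's correction
    dm/2 for magnitudes binned at interval dm."""
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - (m_c - dm / 2.0))

def omori_rate(t, K, c, p):
    """Omori-Utsu aftershock rate n(t) = K / (c + t)**p."""
    return K / (c + t) ** p

# For b = 1 the mean magnitude sits log10(e) ~ 0.4343 above m_c:
mags = [3.4343] * 10
print(round(b_value_aki(mags, 3.0, dm=0.0), 3))   # ~1.0
# A higher p value means faster aftershock decay at a given time:
print(omori_rate(1.0, K=100.0, c=0.1, p=0.9) > omori_rate(1.0, K=100.0, c=0.1, p=1.3))
```

The small aftershock counts mentioned in the abstract matter because both estimators have uncertainties that shrink only as the square root of the number of events.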

  2. A new database on subduction seismicity at the global scale

    NASA Astrophysics Data System (ADS)

    Presti, D.; Heuret, A.; Funiciello, F.; Piromallo, C.

    2012-04-01

In the framework of the EURYI Project 'Convergent margins and seismogenesis: defining the risk of great earthquakes by using statistical data and modelling', a global collection of recent intraslab seismicity has been performed. Based on the EHB hypocenter and Harvard CMT catalogues, the hypocenters, nodal planes and seismic moments of worldwide subduction-related earthquakes were extracted for the period 1976-2007. Data were collected for centroid depths between sea level and 700 km and for magnitude Mw ≥ 5.5. For each subduction zone, a set of trench-normal transects was constructed, with a cross-section width of 120 km on each side of a vertical plane and a spacing of 1 degree along the trench. For each of the 505 resulting transects, the whole subduction seismogenic zone was mapped by projecting focal mechanisms onto a vertical plane after classifying their faulting type according to the Aki-Richards convention. Transect by transect, first the seismicity considered unrelated to the subduction process under investigation was removed; then the upper-plate seismicity (i.e., earthquakes generated within the upper plate as a result of the subduction process) was selected. After also removing from the resulting subset the interplate seismicity identified within this project by Heuret et al. (2011), we can be reasonably confident that the remaining seismicity is related to the subducting plate. Among these earthquakes we then selected the intermediate-depth and deep seismicity. The upper limit of intermediate-depth seismicity is generally fixed at 70 km depth to avoid possible mixing with interplate seismicity. Intermediate-depth and deep seismicity were in most cases defined as earthquakes with focal depths of 70-300 km and exceeding 300 km, respectively. Outer-rise seismicity was also selected. Following Heuret et al. (2011), the 505 transects were merged into 62 larger segments, each ideally homogeneous in terms of its seismogenic zone characteristics. Comparisons of the main seismic parameters (e.g., cumulative seismic moment, P- and T-axis distributions, spatial and temporal distribution of the largest magnitudes) across the selected categories and segments have been performed in order to obtain a snapshot of the general behaviour of global subduction-related seismicity.
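The depth-based ranking described above (a 70 km lower bound for intermediate-depth events and 300 km for deep ones) amounts to a simple classifier; the cutoffs below are those quoted in the abstract:

```python
def classify_slab_depth(depth_km):
    """Depth-based ranking used for subduction catalogs: intermediate-
    depth seismicity between 70 and 300 km (the 70 km cutoff avoids
    mixing with interplate events) and deep seismicity below 300 km."""
    if depth_km < 70.0:
        return "shallow/interplate range"
    if depth_km <= 300.0:
        return "intermediate"
    return "deep"

for d in (25.0, 120.0, 450.0):
    print(d, classify_slab_depth(d))
```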

  3. Improvement of Epicentral Direction Estimation by P-wave Polarization Analysis

    NASA Astrophysics Data System (ADS)

    Oshima, Mitsutaka

    2016-04-01

Polarization analysis has been used to characterize the polarization of waves in various fields, for example electromagnetics, optics, and seismology. In seismology, polarization analysis is used to discriminate seismic phases or to enhance a specific phase (e.g., Flinn, 1965) [1], taking advantage of differences in the polarization characteristics of seismic phases. In earthquake early warning, polarization analysis is used to estimate the epicentral direction from a single station, based on the polarization direction of the P-wave portion of seismic records (e.g., Smart and Sproules, 1981 [2]; Noda et al., 2012 [3]). Improving the Estimation of Epicentral Direction by Polarization Analysis (EEDPA) therefore directly enhances the accuracy and promptness of earthquake early warning. In this study, the author tried to improve EEDPA using seismic records of events that occurred around Japan from 2003 to 2013. Events were selected that satisfy the following conditions: MJMA larger than 6.5 (JMA: Japan Meteorological Agency), and seismic records available at at least 3 stations within 300 km epicentral distance. Records obtained at stations with no information on seismometer orientation were excluded, so that a precise and quantitative evaluation of the accuracy of EEDPA is possible. In the analysis, polarization was calculated by the method of Vidale (1986) [4], which extended the method of Montalbetti and Kanasewich (1970) [5] to use the analytic signal. As a result, the author found that the accuracy of EEDPA improves by about 15% if velocity records, rather than displacement records, are used, contrary to the author's expectation. Use of velocity records also reduces the CPU time spent integrating seismic records and improves the promptness of EEDPA, although this analysis is still rough and further scrutiny is essential. At this stage, the author used seismic records obtained by simply integrating acceleration records, with no filtering applied. Further study of the optimal filter type and its application frequency band is necessary. The results of the aforementioned study will be shown in the poster presentation. [1] Flinn, E. A. (1965). Signal analysis using rectilinearity and direction of particle motion. Proceedings of the IEEE, 53(12), 1874-1876. [2] Smart, E., & Sproules, H. (1981). Regional phase processors (No. SDAC-TR-81-1). Teledyne Geotech, Alexandria, VA, Seismic Data Analysis Center. [3] Noda, S., Yamamoto, S., Sato, S., Iwata, N., Korenaga, M., & Ashiya, K. (2012). Improvement of back-azimuth estimation in real time by using a single station record. Earth, Planets and Space, 64(3), 305-308. [4] Vidale, J. E. (1986). Complex polarization analysis of particle motion. Bulletin of the Seismological Society of America, 76(5), 1393-1405. [5] Montalbetti, J. F., & Kanasewich, E. R. (1970). Enhancement of teleseismic body phases with a polarization filter. Geophysical Journal International, 21(2), 119-129.
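The covariance-matrix idea underlying the Montalbetti-Kanasewich and Vidale methods can be sketched for a single P-wave window. This is a simplified real-valued version (no analytic signal), assuming NumPy and a synthetic, perfectly linearly polarized arrival; sign ambiguity of the principal eigenvector is resolved with the vertical component, assuming compressional upward first motion:

```python
import numpy as np

def p_wave_azimuth(z, n, e):
    """Estimate the horizontal polarization azimuth of a P-wave window
    from 3C data by eigenanalysis of the covariance matrix (in the
    style of Montalbetti & Kanasewich, 1970). Returns the azimuth of
    the epicenter-station line (degrees) and a rectilinearity measure
    (1 for purely linear particle motion)."""
    cov = np.cov(np.vstack([z, n, e]))
    w, v = np.linalg.eigh(cov)           # eigenvalues in ascending order
    pz, pn, pe = v[:, -1]                # principal polarization vector
    if pz < 0:                           # force upward vertical motion
        pn, pe = -pn, -pe
    azimuth = np.degrees(np.arctan2(pe, pn)) % 360.0
    rectilinearity = 1.0 - w[1] / w[-1]
    return azimuth, rectilinearity

# Synthetic damped P wavelet polarized along azimuth 40 degrees:
t = np.linspace(0.0, 1.0, 200)
sig = np.sin(2 * np.pi * 5.0 * t) * np.exp(-3.0 * t)
az_true = np.radians(40.0)
z, n, e = 0.8 * sig, np.cos(az_true) * sig, np.sin(az_true) * sig
az, lin = p_wave_azimuth(z, n, e)
print(round(az, 1), round(lin, 2))  # 40.0 1.0
```

Real records are noisier and the P azimuth carries a 180° ambiguity that the vertical first motion resolves, which is why orientation metadata for the seismometer is essential, as the abstract notes.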

  4. A quantitative analysis of global intermediate and deep seismicity

    NASA Astrophysics Data System (ADS)

    Ruscic, Marija; Becker, Dirk; Le Pourhiet, Laetitita; Agard, Philippe; Meier, Thomas

    2017-04-01

The seismic activity in subduction zones around the world shows large spatial variability, with some regions exhibiting strong seismic activity down to depths of almost 700 km while in others seismicity terminates at depths of about 200-300 km. The decay with depth of the number of seismic events, or of the seismic moment, is also more pronounced in some regions than in others, and the same holds for the ratio of large to small events (the b value of the Gutenberg-Richter relation), which varies with depth. These observations are often linked to parameters of the downgoing plate, such as age or subduction velocity. In this study we investigate a subset of subduction zones, utilizing the revised ISC catalogue of intermediate and deep seismicity, to determine statistical parameters well suited to describing the properties of intermediate-depth and deep events. The seismicity is separated into three depth intervals, 50-175 km, 175-400 km and >400 km, based on the depth at which the plate contact decouples, the observed nearly exponential decay of the event rate with depth, and the presumed phase transition at 410 km depth, where an increase in event number with depth is also observed. For estimation of the b value and of the exponential decay with depth, restricting the investigated time interval to the period after 1997 produced significantly better results, indicating a globally homogeneous magnitude scale with a magnitude of completeness of about Mw 5. On a global scale the b value decreases with depth, from about 1 at 50-175 km to slightly below 0.8 for events below 400 km. There is also a slight increase of the b value with the age of the subducting plate. These changes in the b value with depth and with age may indicate varying fragmentation of the slab. With respect to the ratio of seismic moment between the deeper and shallower parts of subduction zones, a dependence on age is apparent: older slabs exhibit higher ratios, indicating stronger hydration and consequently stronger seismic activity at depth in older, thicker slabs. Furthermore, older slabs tend toward larger b values, indicating stronger fragmentation that favors smaller events. Between 50 and 300 km depth, seismicity in subduction zones decays nearly exponentially with depth. However, most subduction zones show lower seismic activity between about 60 and 100 km than expected from an exponential decay. This observation correlates well with findings from petrological studies that rocks are rarely scraped off the downgoing plate at these depths, indicating low seismic coupling and low stresses at the plate interface in the depth range below the seismogenic zone and above 100 km, where dehydration reactions become active. Interestingly, the percentage of this deficit increases with plate age for event frequency (fewer events) but decreases for moment release (events have larger magnitudes). The forearc high is observed to lie above the plate interface with reduced seismic coupling, and is thus an indication of upward-directed return flow along the seismically decoupled plate interface. In addition, the topography of the forearc high is found to be larger above shallow-dipping slabs. A correlation of the depth-dependent seismic behavior with the subduction or trench velocity is not observed for the investigated subduction zones; plate age seems to be the dominant factor controlling the properties of intermediate-depth and deep seismicity.
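The nearly exponential decay of event rate with depth mentioned above can be recovered from a catalog with a log-linear fit to binned counts. A sketch with a synthetic catalog, assuming NumPy; the 60 km e-folding depth and the 50-300 km window are illustrative choices, not fitted values from the study:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic intermediate-depth catalog: event rate decays exponentially
# below the ~50 km decoupling depth, with an assumed 60 km e-folding.
depths = 50.0 + rng.exponential(scale=60.0, size=5000)
depths = depths[depths < 300.0]

# Log-linear fit of binned counts recovers the decay constant.
counts, edges = np.histogram(depths, bins=np.arange(50, 301, 25))
mid = 0.5 * (edges[:-1] + edges[1:])
slope, intercept = np.polyfit(mid, np.log(counts), 1)
print(round(-1.0 / slope))  # e-folding depth in km, near the input 60
```

The regional deficit between 60 and 100 km that the abstract describes would show up as bins falling below this fitted line.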

  5. A synthetic seismicity model for the Middle America Trench

    NASA Technical Reports Server (NTRS)

    Ward, Steven N.

    1991-01-01

A novel iterative technique for building models of seismicity and fault interaction that are physically acceptable and geometrically and kinematically correct, based on the concept of fault segmentation and computed using 2D static dislocation theory, is presented. The technique is applied in two steps to seismicity observed at the Middle America Trench. The first step constructs generic models that randomly draw segment strengths and lengths from a 2D probability distribution. The second constructs predictive models in which segment lengths and strengths are adjusted to mimic the actual geography and timing of large historical earthquakes. Both types of models reproduce the statistics of seismicity over five units of magnitude and duplicate other aspects of the catalog, including foreshock and aftershock sequences, migration of foci, and the capacity to produce both characteristic and noncharacteristic earthquakes. Over a period of about 150 yr, the complex interaction of fault segments and the nonlinear failure conditions conspire to transform an apparently deterministic model into a chaotic one.

  6. An Improved Statistical Solution for Global Seismicity by the HIST-ETAS Approach

    NASA Astrophysics Data System (ADS)

    Chu, A.; Ogata, Y.; Katsura, K.

    2010-12-01

For long-term global seismic model fitting, recent work by Chu et al. (2010) applied the space-time ETAS model (Ogata 1998) to global data partitioned into tectonic zones based on geophysical characteristics (Bird 2003), and showed substantial improvement in model fit compared with a single overall global model. While the ordinary ETAS model assumes constant parameter values across the whole region analyzed, the hierarchical space-time ETAS model (HIST-ETAS, Ogata 2004) allows regional variation of the parameters for more accurate seismic prediction. As the HIST-ETAS model has been fit to regional data in Japan (Ogata 2010), our work applies it to describe global seismicity. Employing Akaike's Bayesian Information Criterion (ABIC) for assessment, we compare the MLE results with zone divisions to those obtained from a single overall global model. Location-dependent parameters of the model and Gutenberg-Richter b values are optimized, and seismological interpretations are discussed.
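The conditional intensity of the ordinary (temporal) ETAS model that HIST-ETAS generalizes can be sketched as follows; the parameter values are illustrative, not fitted values from this or any cited study:

```python
import math

def etas_intensity(t, history, mu=0.2, K=0.05, alpha=1.5, c=0.01, p=1.1, m_c=4.0):
    """Conditional intensity of the temporal ETAS model: a background
    rate mu plus Omori-type aftershock contributions from each past
    event, scaled by the exponential productivity of its magnitude."""
    rate = mu
    for t_i, m_i in history:
        if t_i < t:
            rate += K * math.exp(alpha * (m_i - m_c)) / (t - t_i + c) ** p
    return rate

history = [(0.0, 6.0), (1.0, 4.5)]   # (time in days, magnitude)
print(etas_intensity(0.5, history))  # elevated shortly after the M6
print(etas_intensity(50.0, history)) # decayed back toward mu
```

HIST-ETAS replaces the constant parameters (mu, K, alpha, p) with location-dependent fields, which is what allows the regional distinctions described in the abstract.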

  7. Extreme Threshold Failures Within a Heterogeneous Elastic Thin Sheet and the Spatial-Temporal Development of Induced Seismicity Within the Groningen Gas Field

    NASA Astrophysics Data System (ADS)

    Bourne, S. J.; Oates, S. J.

    2017-12-01

    Measurements of the strains and earthquakes induced by fluid extraction from a subsurface reservoir reveal a transient, exponential-like increase in seismicity relative to the volume of fluids extracted. If the frictional strength of these reactivating faults is heterogeneously and randomly distributed, then progressive failures of the weakest fault patches account in a general manner for this initial exponential-like trend. Allowing for the observable elastic and geometric heterogeneity of the reservoir, the spatiotemporal evolution of induced seismicity over 5 years is predictable without significant bias using a statistical physics model of poroelastic reservoir deformations inducing extreme threshold frictional failures of previously inactive faults. This model is used to forecast the temporal and spatial probability density of earthquakes within the Groningen natural gas reservoir, conditional on future gas production plans. Probabilistic seismic hazard and risk assessments based on these forecasts inform the current gas production policy and building strengthening plans.
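The exponential-like onset described above arises naturally from extreme-threshold failures: as induced strain grows, the exceeded fraction of the fault-strength tail rises steeply at first. A toy sketch with a Gaussian strength distribution, which is an illustrative choice; the paper's statistical physics model is more elaborate:

```python
import math

def fraction_failed(strain, mean_strength=1.0, sd=0.2):
    """Fraction of fault patches whose randomly distributed frictional
    strength (Gaussian here, illustrative) has been exceeded by the
    induced strain. Deep in the tail this grows exponential-like,
    mimicking the observed onset of induced seismicity."""
    z = (strain - mean_strength) / (sd * math.sqrt(2.0))
    return 0.5 * (1.0 + math.erf(z))

for eps in (0.4, 0.5, 0.6, 0.7):
    print(eps, f"{fraction_failed(eps):.2e}")
```

The successive ratios between these fractions grow, which is the transient exponential-like increase in seismicity per extracted volume that the abstract describes.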

  8. A test of present-day plate geometries for northeast Asia and Japan

    NASA Technical Reports Server (NTRS)

    Demets, Charles

    1992-01-01

    Alternative geometries for the present-day configuration of plate boundaries in northeast Asia and Japan are tested using NUVEL-1 and 256 horizontal earthquake slip vectors from the Japan and northern Kuril trenches. Statistical analysis of the slip vectors is used to determine whether the North American, Eurasian, or Okhotsk plate overlies the trench. Along the northern Kuril trench, slip vectors are well-fit by the NUVEL-1 Pacific-North America Euler pole, but are poorly fit by the Pacific-Eurasia Euler pole. Results for the Japan trench are less conclusive, but suggest that much of Honshu and Hokkaido are also part of the North American plate. The simplest geometry consistent with the trench slip vectors is a geometry in which the North American plate extends south to 41 deg N, and possibly includes northern Honshu and southern Hokkaido. Although these results imply that the diffuse seismicity that connects the Lena River delta to Sakhalin Island and the eastern Sea of Japan records motion between Eurasia and North America, onshore geologic and seismic data define an additional belt of seismicity in Siberia that cannot be explained with this geometry. Assuming that these two seismic belts constitute evidence for an Okhotsk block, two published kinematic models for motion of the Okhotsk block are tested. The first model, which predicts motion of up to 15 mm/yr relative to North America, is rejected because Kuril and Japan trench slip vectors are fit more poorly than for the simpler geometry described above. The second model gives a good fit to the trench slip vectors, but only if Okhotsk-North America motion is slower than 5 mm/yr.

  9. Complex spatiotemporal evolution of the 2008 Mw 4.9 Mogul earthquake swarm (Reno, Nevada): Interplay of fluid and faulting

    NASA Astrophysics Data System (ADS)

    Ruhl, C. J.; Abercrombie, R. E.; Smith, K. D.; Zaliapin, I.

    2016-11-01

    After approximately 2 months of swarm-like earthquakes in the Mogul neighborhood of west Reno, NV, seismicity rates and event magnitudes increased over several days culminating in an Mw 4.9 dextral strike-slip earthquake on 26 April 2008. Although very shallow, the Mw 4.9 main shock had a different sense of slip than locally mapped dip-slip surface faults. We relocate 7549 earthquakes, calculate 1082 focal mechanisms, and statistically cluster the relocated earthquake catalog to understand the character and interaction of active structures throughout the Mogul, NV earthquake sequence. Rapid temporary instrument deployment provides high-resolution coverage of microseismicity, enabling a detailed analysis of swarm behavior and faulting geometry. Relocations reveal an internally clustered sequence in which foreshocks evolved on multiple structures surrounding the eventual main shock rupture. The relocated seismicity defines a fault-fracture mesh and detailed fault structure from approximately 2-6 km depth on the previously unknown Mogul fault that may be an evolving incipient strike-slip fault zone. The seismicity volume expands before the main shock, consistent with pore pressure diffusion, and the aftershock volume is much larger than is typical for an Mw 4.9 earthquake. We group events into clusters using space-time-magnitude nearest-neighbor distances between events and develop a cluster criterion through randomization of the relocated catalog. Identified clusters are largely main shock-aftershock sequences, without evidence for migration, occurring within the diffuse background seismicity. The migration rate of the largest foreshock cluster and simultaneous background events is consistent with it having triggered, or having been triggered by, an aseismic slip event.
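The space-time-magnitude nearest-neighbor clustering used above can be sketched with a Zaliapin-style distance; the b value and fractal dimension below are typical literature values, not the ones fitted in the study:

```python
import math

def nn_distance(parent, child, b=1.0, d_f=1.6):
    """Space-time-magnitude nearest-neighbor distance in the style of
    Zaliapin & Ben-Zion: eta = dt * dr**d_f * 10**(-b * m_parent).
    parent/child are (t_days, x_km, y_km, magnitude); a small eta marks
    a likely parent-offspring (clustered) pair."""
    t_p, x_p, y_p, m_p = parent
    t_c, x_c, y_c, _ = child
    dt = t_c - t_p
    if dt <= 0:
        return math.inf            # only later events can be offspring
    dr = math.hypot(x_c - x_p, y_c - y_p)
    return dt * dr ** d_f * 10.0 ** (-b * m_p)

main = (0.0, 0.0, 0.0, 4.9)
aftershock = (0.5, 1.0, 0.5, 2.0)       # close in space and time
background = (30.0, 40.0, -25.0, 2.0)   # far away, much later
print(nn_distance(main, aftershock) < nn_distance(main, background))  # True
```

Thresholding eta, for example against a randomized catalog as the study does, separates clustered pairs from diffuse background seismicity.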

  10. Seismological and geodetic constraints on the 2011 Mw5.3 Trinidad, Colorado earthquake and induced deformation in the Raton Basin

    USGS Publications Warehouse

    Barnhart, William D.; Benz, Harley M.; Hayes, Gavin P.; Rubinstein, Justin L.; Bergman, E.

    2014-01-01

The Raton Basin of southern Colorado and northern New Mexico is an actively produced hydrocarbon basin that has experienced increased seismicity since 2001, including the August 2011 Mw 5.3 Trinidad normal-faulting event. Following the 2011 earthquake, regional seismic observations were used to relocate 21 events, including the 2011 main shock, two foreshocks, and 13 aftershocks. Additionally, interferometric synthetic aperture radar (InSAR) observations of both the 2011 event and pre-event basin deformation place constraints on the spatial kinematics of the 2011 event and on localized basin subsidence due to groundwater or gas withdrawal. We find that the 2011 earthquake ruptured an 8-10 km long segment of a normal fault at depths of 1.5-6.0 km within the crystalline Precambrian basement underlying the Raton Basin sedimentary rocks. The earthquake also nucleated within the crystalline basement in the vicinity of an active wastewater disposal site. The ensuing aftershock sequence showed the statistical properties expected for intraplate earthquakes, though the rupture length of the 2011 earthquake is unexpectedly large for an Mw 5.3 event, suggesting that wastewater disposal may have triggered a low-stress-drop, otherwise natural earthquake. Additionally, pre-event and post-event seismicity in the Raton Basin correlates spatially with regions of subsidence observed in InSAR time-series analysis. While these observations cannot establish a causal link between hydrocarbon production and seismicity, they constrain spatial relationships between active basin deformation and geological and anthropogenic features. Furthermore, the InSAR observations highlight the utility of space-based geodetic observations for monitoring and assessing anthropogenically induced and triggered deformation.

  11. Airborne LiDAR analysis and geochronology of faulted glacial moraines in the Tahoe-Sierra frontal fault zone reveal substantial seismic hazards in the Lake Tahoe region, California-Nevada USA

    USGS Publications Warehouse

    Howle, James F.; Bawden, Gerald W.; Schweickert, Richard A.; Finkel, Robert C.; Hunter, Lewis E.; Rose, Ronn S.; von Twistern, Brent

    2012-01-01

    We integrated high-resolution bare-earth airborne light detection and ranging (LiDAR) imagery with field observations and modern geochronology to characterize the Tahoe-Sierra frontal fault zone, which forms the neotectonic boundary between the Sierra Nevada and the Basin and Range Province west of Lake Tahoe. The LiDAR imagery clearly delineates active normal faults that have displaced late Pleistocene glacial moraines and Holocene alluvium along 30 km of linear, right-stepping range front of the Tahoe-Sierra frontal fault zone. Herein, we illustrate and describe the tectonic geomorphology of faulted lateral moraines. We have developed new, three-dimensional modeling techniques that utilize the high-resolution LiDAR data to determine tectonic displacements of moraine crests and alluvium. The statistically robust displacement models combined with new ages of the displaced Tioga (20.8 ± 1.4 ka) and Tahoe (69.2 ± 4.8 ka; 73.2 ± 8.7 ka) moraines are used to estimate the minimum vertical separation rate at 17 sites along the Tahoe-Sierra frontal fault zone. Near the northern end of the study area, the minimum vertical separation rate is 1.5 ± 0.4 mm/yr, which represents a two- to threefold increase in estimates of seismic moment for the Lake Tahoe basin. From this study, we conclude that potential earthquake moment magnitudes (Mw) range from 6.3 ± 0.25 to 6.9 ± 0.25. A close spatial association of landslides and active faults suggests that landslides have been seismically triggered. Our study underscores that the Tahoe-Sierra frontal fault zone poses substantial seismic and landslide hazards.

  12. Prospective Evaluation of the Global Earthquake Activity Rate Model (GEAR1) Earthquake Forecast: Preliminary Results

    NASA Astrophysics Data System (ADS)

    Strader, Anne; Schorlemmer, Danijel; Beutin, Thomas

    2017-04-01

The Global Earthquake Activity Rate Model (GEAR1) is a hybrid seismicity model, constructed from a log-linear combination of smoothed seismicity from the Global Centroid Moment Tensor (CMT) earthquake catalog and geodetic strain rates (Global Strain Rate Map, version 2.1). For the 2005-2012 retrospective evaluation period, GEAR1 outperformed both its parent strain-rate and smoothed-seismicity forecasts. Since 1 October 2015, GEAR1 has been prospectively evaluated by the Collaboratory for the Study of Earthquake Predictability (CSEP) testing center. Here, we present initial one-year test results for the GEAR1, GSRM and GSRM2.1 forecasts, as well as a localized evaluation of GEAR1 performance. The models were evaluated on the consistency in number (N-test), spatial (S-test) and magnitude (M-test) distributions of forecasted and observed earthquakes, as well as on overall data consistency (CL- and L-tests). Performance at target earthquake locations was compared between models using the classical paired T-test and its non-parametric equivalent, the W-test, to determine whether one model could be rejected in favor of another at the 0.05 significance level. For the evaluation period from 1 October 2015 to 1 October 2016, the GEAR1, GSRM and GSRM2.1 forecasts pass all CSEP likelihood tests. Comparative test results show a statistically significant improvement of GEAR1 over both strain-rate-based forecasts, both of which can be rejected in favor of GEAR1. Using point-process residual analysis, we investigate the spatial distribution of differences in GEAR1, GSRM and GSRM2.1 model performance to identify regions where the GEAR1 model should be adjusted, something that cannot be inferred from the CSEP test results alone. Furthermore, we investigate whether the optimal combination of smoothed seismicity and strain rates remains stable over space and time.
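The N-test mentioned above checks Poisson consistency between forecast and observed earthquake counts. A minimal sketch with illustrative counts (not the real GEAR1 numbers):

```python
import math

def poisson_cdf(k, lam):
    """P(X <= k) for X ~ Poisson(lam); returns 0 for k < 0."""
    return sum(math.exp(-lam) * lam ** i / math.factorial(i) for i in range(k + 1))

def n_test(n_obs, n_forecast):
    """CSEP-style number test: the forecast fails (one-sided, ~0.05
    each way) if the observed count is improbably low or high under a
    Poisson distribution with the forecast rate."""
    delta_1 = 1.0 - poisson_cdf(n_obs - 1, n_forecast)  # P(X >= n_obs)
    delta_2 = poisson_cdf(n_obs, n_forecast)            # P(X <= n_obs)
    return delta_1, delta_2

d1, d2 = n_test(n_obs=12, n_forecast=10.0)
print(d1 > 0.05 and d2 > 0.05)  # True: 12 observed events are consistent
```

The S-, M-, CL- and L-tests extend the same likelihood-based logic to the spatial and magnitude dimensions of the forecast.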

  13. Seismic slope-performance analysis: from hazard map to decision support system

    USGS Publications Warehouse

    Miles, Scott B.; Keefer, David K.; Ho, Carlton L.

    1999-01-01

In response to the growing recognition by engineers and decision-makers of the regional effects of earthquake-induced landslides, this paper presents a general approach to seismic landslide zonation based on the popular Newmark sliding-block analogy for modeling coherent landslides. Four existing models based on the sliding-block analogy are compared. The comparison shows that the models forecast notably different levels of slope performance. Considering this discrepancy, along with the limitations of static maps as a decision tool, a spatial decision support system (SDSS) for seismic landslide analysis is proposed, which will support investigations over multiple scales for any number of earthquake scenarios and input conditions. Most importantly, the SDSS will allow use of any seismic landslide analysis model and zonation approach. Developments associated with the SDSS will produce an object-oriented model for encapsulating spatial data, an object-oriented specification allowing construction of models from modular objects, and a direct-manipulation, dynamic user interface that adapts to the particular seismic landslide model configuration.
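The Newmark sliding-block analogy underlying the compared models can be sketched as a direct time-domain integration: the block slides whenever ground acceleration exceeds the critical (yield) acceleration, and slip accumulates until the relative velocity returns to zero. The pulse train and yield acceleration below are illustrative:

```python
import math

def newmark_displacement(acc, dt, a_c):
    """One-way (downslope) Newmark rigid sliding-block displacement.
    acc: ground acceleration samples (m/s^2), dt: sample interval (s),
    a_c: critical acceleration (m/s^2)."""
    v = 0.0   # relative sliding velocity
    d = 0.0   # accumulated sliding displacement
    for a in acc:
        rel = a - a_c if (a > a_c or v > 0.0) else 0.0
        v = max(0.0, v + rel * dt)   # sliding stops when v returns to 0
        d += v * dt
    return d

# One-sided synthetic pulse train (m/s^2), sampled at 100 Hz:
dt = 0.01
acc = [max(3.0 * math.sin(2 * math.pi * 1.0 * i * dt), 0.0) for i in range(200)]
print(newmark_displacement(acc, dt, a_c=1.0))  # meters of slip
```

The differing slope-performance forecasts the abstract mentions come mainly from how each model converts shaking parameters into this displacement.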

  14. 6C polarization analysis - seismic direction finding in coherent noise, automated event identification, and wavefield separation

    NASA Astrophysics Data System (ADS)

    Schmelzbach, C.; Sollberger, D.; Greenhalgh, S.; Van Renterghem, C.; Robertsson, J. O. A.

    2017-12-01

    Polarization analysis of standard three-component (3C) seismic data is an established tool to determine the propagation directions of seismic waves recorded by a single station. A major limitation of seismic direction finding methods using 3C recordings, however, is that a correct propagation-direction determination is only possible if the wave mode is known. Furthermore, 3C polarization analysis techniques break down in the presence of coherent noise (i.e., when more than one event is present in the analysis time window). Recent advances in sensor technology (e.g., fibre-optical, magnetohydrodynamic angular rate sensors, and ring laser gyroscopes) have made it possible to accurately measure all three components of rotational ground motion exhibited by seismic waves, in addition to the conventionally recorded three components of translational motion. Here, we present an extension of the theory of single station 3C polarization analysis to six-component (6C) recordings of collocated translational and rotational ground motions. We demonstrate that the information contained in rotation measurements can help to overcome some of the main limitations of standard 3C seismic direction finding, such as handling multiple arrivals simultaneously. We show that the 6C polarisation of elastic waves measured at the Earth's free surface does not only depend on the seismic wave type and propagation direction, but also on the local P- and S-wave velocities just beneath the recording station. Using an adaptation of the multiple signal classification algorithm (MUSIC), we demonstrate how seismic events can univocally be identified and characterized in terms of their wave type. Furthermore, we show how the local velocities can be inferred from single-station 6C data, in addition to the direction angles (inclination and azimuth) of seismic arrivals. 
A major benefit of our proposed 6C method is that it also allows the accurate recovery of the wave type, propagation directions, and phase velocities of multiple, interfering arrivals in one time window. We demonstrate how this property can be exploited to separate the wavefield into its elastic wave-modes and to isolate or suppress waves arriving from specific directions (directional filtering), both in a fully automated fashion.
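    The record above adapts the MUSIC algorithm to 6C data. As a rough illustration of the underlying principle only (plain narrowband MUSIC on a simulated uniform linear array, not the authors' 6C formulation; the array geometry, arrival angles, and noise level are all invented for this sketch), resolving two interfering arrivals from the noise subspace of the data covariance might look like:

```python
import numpy as np

# Classic narrowband MUSIC sketch: two interfering plane-wave arrivals are
# resolved from the noise subspace of the sample covariance matrix.
rng = np.random.default_rng(0)
n_sensors, n_snapshots = 8, 400
true_angles = np.deg2rad([20.0, 55.0])          # two simultaneous arrivals

def steer(theta):
    # Steering vector for a half-wavelength-spaced uniform linear array
    k = np.pi * np.sin(theta)                   # d = lambda/2
    return np.exp(1j * k * np.arange(n_sensors))

# Simulated snapshots: two uncorrelated sources plus white noise
S = rng.standard_normal((2, n_snapshots)) + 1j * rng.standard_normal((2, n_snapshots))
A = np.column_stack([steer(t) for t in true_angles])
X = A @ S + 0.1 * (rng.standard_normal((n_sensors, n_snapshots))
                   + 1j * rng.standard_normal((n_sensors, n_snapshots)))

R = X @ X.conj().T / n_snapshots                # sample covariance
w, V = np.linalg.eigh(R)                        # eigenvalues ascending
En = V[:, :-2]                                  # noise subspace (2 sources assumed)

# MUSIC pseudo-spectrum: peaks where steering vector is orthogonal to En
grid = np.deg2rad(np.linspace(0.0, 90.0, 901))
spectrum = np.array([1.0 / np.linalg.norm(En.conj().T @ steer(t)) ** 2 for t in grid])
loc = np.where((spectrum[1:-1] > spectrum[:-2]) & (spectrum[1:-1] > spectrum[2:]))[0] + 1
top = loc[np.argsort(spectrum[loc])[-2:]]
est = np.sort(np.rad2deg(grid[top]))            # estimated arrival angles (deg)
```

The key property exploited in the abstract, handling multiple interfering arrivals in one time window, comes from the subspace decomposition rather than from single-event polarization fitting.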

  15. Design, analysis, and seismic performance of a hypothetical seismically isolated bridge on legacy highway.

    DOT National Transportation Integrated Search

    2011-01-01

    The need to maintain the functionality of critical transportation lifelines after a large seismic event motivates the strategy to design certain bridges for performance standards beyond the minimum required by bridge design codes. To design a bri...

  16. On the nature and dynamics of the seismogenetic systems of North California, USA: An analysis based on Non-Extensive Statistical Physics

    NASA Astrophysics Data System (ADS)

    Efstathiou, Angeliki; Tzanis, Andreas; Vallianatos, Filippos

    2017-09-01

    We examine the nature of the seismogenetic system in North California, USA, by searching for evidence of complexity and non-extensivity in the earthquake record. We attempt to determine whether earthquakes are generated by a self-excited Poisson process, in which case they obey Boltzmann-Gibbs thermodynamics, or by a Critical process, in which long-range interactions in non-equilibrium states are expected (correlation) and the thermodynamics deviate from the Boltzmann-Gibbs formalism. Emphasis is given to background seismicity, since it is generally agreed that aftershock sequences comprise correlated sets. We use the complete and homogeneous earthquake catalogue published by the Northern California Earthquake Data Center, in which aftershocks are either included or have been removed by a stochastic declustering procedure. We examine multivariate cumulative frequency distributions of earthquake magnitude, interevent time and interevent distance in the context of Non-Extensive Statistical Physics, which is a generalization of extensive Boltzmann-Gibbs thermodynamics to non-equilibrating (non-extensive) systems. Our results indicate that the seismogenetic systems of North California are generally sub-extensive, complex and non-Poissonian. The background seismicity exhibits long-range interaction, as evidenced by the overall increase of correlation observed on declustering the earthquake catalogues, as well as by the high correlation observed for earthquakes separated by long interevent distances. It is also important to emphasize that two subsystems with rather different properties appear to exist. The correlation observed along the Sierra Nevada Range - Walker Lane is quasi-stationary and indicates a Self-Organized Critical (SOC) fault system. 
Conversely, the northern segment of the San Andreas Fault exhibits changes in the level of correlation with reference to the large 1989 Loma Prieta event and thus has attributes of Critical Point behaviour, albeit without acceleration of seismic release rates. SOC appears to be a likely explanation of the complexity mechanisms, but since there are other ways by which complexity may emerge, additional work is required before assertive conclusions can be drawn.
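    In Non-Extensive Statistical Physics analyses of this kind, distributions such as that of interevent times are typically modeled with the Tsallis q-exponential, which reduces to the ordinary exponential (the Poisson, uncorrelated case) as q approaches 1. A minimal sketch of this standard form, with illustrative parameter values not taken from the paper:

```python
import numpy as np

def q_exp(x, q):
    """Tsallis q-exponential: [1 + (1-q)x]^(1/(1-q)); reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-9:
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x   # > 0 over the domain used here (q > 1, x <= 0)
    return base ** (1.0 / (1.0 - q))

def survival(t, q, tau):
    """P(interevent time > t) under a q-exponential model with scale tau."""
    return q_exp(-t / tau, q)

t = np.linspace(0.0, 10.0, 101)
poisson_tail = survival(t, 1.0, 1.0)      # q = 1: memoryless Poisson case
correlated_tail = survival(t, 1.3, 1.0)   # q > 1: power-law tail, long-range correlation
```

The diagnostic used in such studies is exactly this contrast: an entropic index q significantly above 1 signals the slowly decaying, correlated (non-Poissonian) tails the abstract reports.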

  17. A Bimodal Hybrid Model for Time-Dependent Probabilistic Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Yaghmaei-Sabegh, Saman; Shoaeifar, Nasser; Shoaeifar, Parva

    2018-03-01

    The evaluation of evidence provided by geological studies and historical catalogs indicates that in some seismic regions and faults, multiple large earthquakes occur in clusters. The occurrence of large earthquakes then gives way to quiescence, during which only small-to-moderate earthquakes take place. Clustering of large earthquakes is the most distinguishable departure from the assumption of constant hazard from the random occurrence of earthquakes in conventional seismic hazard analysis. In the present study, a time-dependent recurrence model is proposed to account for series of large earthquakes that occur in clusters. The model is flexible enough to reflect the quasi-periodic behavior of large earthquakes with long-term clustering, and can be used in time-dependent probabilistic seismic hazard analysis for engineering purposes. In this model, the time-dependent hazard is estimated by a hazard function comprising three parts: a decreasing hazard associated with the last large-earthquake cluster, an increasing hazard associated with the next large-earthquake cluster, and a constant hazard for the random occurrence of small-to-moderate earthquakes. In the final part of the paper, the time-dependent seismic hazard of the New Madrid Seismic Zone at different time intervals is calculated for illustrative purposes.
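    The three-part hazard function described above can be sketched as follows. The exponential forms and all numerical parameters here are illustrative assumptions, not the paper's fitted model:

```python
import math

def hazard(t, h_last=0.05, decay=0.1, h_next=0.08, rise=0.02, h_bg=0.01):
    """Toy three-part hazard rate at time t (years since the last cluster)."""
    decreasing = h_last * math.exp(-decay * t)          # fading hazard of last cluster
    increasing = h_next * (1.0 - math.exp(-rise * t))   # loading toward next cluster
    return decreasing + increasing + h_bg               # + constant background hazard

rates = [hazard(t) for t in (0.0, 10.0, 50.0, 200.0)]
```

The shape this produces, elevated hazard just after a cluster, a minimum in between, and a slow rise toward the next cluster, is what distinguishes such a model from the flat hazard of a Poisson assumption.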

  18. Gas hydrate characterization from a 3D seismic dataset in the deepwater eastern Gulf of Mexico

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McConnell, Daniel; Haneberg, William C.

    Seismic stratigraphic features are delineated using principal component analysis of the band-limited data at potential gas hydrate sands, and compared and calibrated with spectral decomposition thicknesses to constrain thickness in the absence of well control. Layers in the abyssal fan sediments are thinner than can be resolved with 50 Hz seismic data and thus comprise composite thin-bed reflections. Amplitude-versus-frequency analysis is used to indicate gas and gas hydrate reflections. Synthetic seismic wedge models show that with 50 Hz seismic data, a 40% saturation of a Plio-Pleistocene GoM sand in the hydrate stability zone with no subjacent gas can produce a phase change (negative to positive), with a strong correlation between amplitude and hydrate saturation. The synthetic seismic response is more complicated if the gas hydrate filled sediments overlie gassy sediments. Hydrate (or gas) saturation in thin beds enhances the amplitude response and can be used to estimate saturation. Gas hydrate saturation from rock physics, amplitude, and frequency analysis is compared to saturation derived from inversion at several interpreted gas hydrate accumulations in the eastern Gulf of Mexico.

  19. 3D seismic data de-noising and reconstruction using Multichannel Time Slice Singular Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Rekapalli, Rajesh; Tiwari, R. K.; Sen, Mrinal K.; Vedanti, Nimisha

    2017-05-01

    Noise and data gaps complicate seismic data processing and subsequently cause difficulties in geological interpretation. We discuss a recent development and application of Multichannel Time Slice Singular Spectrum Analysis (MTSSSA) for 3D seismic data de-noising in the time domain. In addition, L1-norm-based simultaneous data-gap filling of 3D seismic data using MTSSSA is also discussed. We discriminate noise from individual time slices of 3D volumes by analyzing the eigentriplets of the trajectory matrix. We first tested the efficacy of the method on 3D synthetic seismic data contaminated with noise and then applied it to post-stack seismic reflection data acquired from the Sleipner CO2 storage site in Norway (pre- and post-CO2 injection). Our analysis suggests that the MTSSSA algorithm is effective in enhancing the S/N for better identification of amplitude anomalies, along with simultaneous data-gap filling. The bright spots identified in the de-noised data indicate upward migration of CO2 towards the top of the Utsira formation. The reflections identified by applying MTSSSA to pre- and post-injection data correlate well with the geology of the Southern Viking Graben (SVG).
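    The eigentriplet-based de-noising at the heart of MTSSSA can be illustrated with plain 1-D singular spectrum analysis: embed the trace in a trajectory (Hankel) matrix, keep only the leading eigentriplets, and average the anti-diagonals to reconstruct. The window length, rank, and test signal below are arbitrary illustrative choices, not the multichannel time-slice formulation of the paper:

```python
import numpy as np

def ssa_denoise(x, window=30, rank=2):
    """De-noise a 1-D series by low-rank truncation of its trajectory matrix."""
    n = len(x)
    k = n - window + 1
    traj = np.column_stack([x[i:i + window] for i in range(k)])  # Hankel trajectory matrix
    U, s, Vt = np.linalg.svd(traj, full_matrices=False)
    low_rank = (U[:, :rank] * s[:rank]) @ Vt[:rank]              # keep leading eigentriplets
    # Anti-diagonal (Hankel) averaging back to a 1-D series
    out = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):
        out[j:j + window] += low_rank[:, j]
        counts[j:j + window] += 1.0
    return out / counts

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 200)
clean = np.sin(2 * np.pi * 5 * t)                 # a sinusoid has rank-2 embedding
noisy = clean + 0.3 * rng.standard_normal(t.size)
denoised = ssa_denoise(noisy)
```

A pure sinusoid occupies only two eigentriplets of the trajectory matrix, so the rank-2 truncation discards most of the noise energy; MTSSSA applies the same separation to the eigentriplets of each time slice of a 3D volume.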

  20. Seismic reflection response from cross-correlations of ambient vibrations on a non-conventional hydrocarbon reservoir

    NASA Astrophysics Data System (ADS)

    Huerta, F. V.; Granados, I.; Aguirre, J.; Carrera, R. Á.

    2017-12-01

    Nowadays, in the hydrocarbon industry, there is a need to optimize and reduce exploration costs for the different types of reservoirs, motivating the specialized community to search for and develop alternative geophysical exploration methods. This study shows the reflection response obtained from a shale gas/oil deposit through seismic interferometry of ambient vibrations in combination with wavelet analysis and conventional seismic reflection techniques (CMP and NMO). The method generates seismic responses from virtual sources through cross-correlation of records of ambient seismic vibrations (ASV) collected at different receivers. The seismic response obtained is interpreted as the response that would be measured at one of the receivers given a virtual source at the other. The ASV records were acquired in northern Mexico using semi-rectangular arrays of multi-component geophones with a 10 Hz instrumental response. The in-line distance between geophones was 40 m, the cross-line distance 280 m, the sampling interval 2 ms, and the total record duration 6 hours. The results show the reflection response of two lines in the in-line direction and two in the cross-line direction, for which the continuity of coherent events has been identified and interpreted as reflectors. There is certainty that the identified events correspond to reflections because time-frequency analysis with the wavelet transform identified the frequency band in which body waves are present. The CMP and NMO techniques allowed us to emphasize and correct the reflection response obtained from the correlation process in the frequency band of interest. 
The processing and analysis of ASV records through the seismic interferometry method, in combination with wavelet analysis and conventional seismic reflection techniques, yielded encouraging results: it was possible to recover the seismic response for each analyzed source-receiver pair and thus obtain the reflection response of each analyzed seismic line.
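    The virtual-source principle this survey relies on can be sketched in a few lines: cross-correlating a common noise wavefield recorded at two receivers recovers the inter-receiver travel time as the lag of the correlation peak. The 2 ms sampling matches the survey; the delay, record length, and noise levels are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)
dt = 0.002                              # 2 ms sampling, as in the acquisition
n = 5000                                # 10 s of synthetic ambient noise
delay = 125                             # 0.25 s propagation between the receivers

noise = rng.standard_normal(n)          # common ambient wavefield
rec_a = noise                           # receiver A
rec_b = np.roll(noise, delay) + 0.5 * rng.standard_normal(n)  # delayed + local noise

# Cross-correlation: the peak lag is the virtual-source travel time A -> B
xcorr = np.correlate(rec_b, rec_a, mode="full")
lags = np.arange(-n + 1, n) * dt
t_est = lags[np.argmax(xcorr)]
```

In the field workflow the correlations are stacked over hours of noise and sorted into CMP gathers before NMO correction; this sketch only shows why the peak lag carries the travel-time information.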

  1. MatSeis and the GNEM R&E regional seismic analysis tools.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chael, Eric Paul; Hart, Darren M.; Young, Christopher John

    2003-08-01

    To improve the nuclear event monitoring capability of the U.S., the NNSA Ground-based Nuclear Explosion Monitoring Research & Engineering (GNEM R&E) program has been developing a collection of products known as the Knowledge Base (KB). Though much of the focus for the KB has been on the development of calibration data, we have also developed numerous software tools for various purposes. The Matlab-based MatSeis package and the associated suite of regional seismic analysis tools were developed to aid in the testing and evaluation of some Knowledge Base products for which existing applications were either not available or ill-suited. This presentation will provide brief overviews of MatSeis and each of the tools, emphasizing features added in the last year. MatSeis was begun in 1996 and is now a fairly mature product. It is a highly flexible seismic analysis package that provides interfaces to read data from either flatfiles or an Oracle database. All of the standard seismic analysis tasks are supported (e.g. filtering, three-component rotation, phase picking, event location, magnitude calculation), as well as a variety of array processing algorithms (beaming, FK, coherency analysis, vespagrams). The simplicity of Matlab coding and the tremendous number of available functions make MatSeis/Matlab an ideal environment for developing new monitoring research tools (see the regional seismic analysis tools below). New MatSeis features include: addition of evid information to events in MatSeis, options to screen picks by author, input and output of origerr information, improved performance in reading flatfiles, improved speed in FK calculations, and significant improvements to Measure Tool (filtering, multiple phase display), Free Plot (filtering, phase display and alignment), Mag Tool (maximum likelihood options), and Infra Tool (improved calculation speed, display of an F statistic stream). 
Work on the regional seismic analysis tools (CodaMag, EventID, PhaseMatch, and Dendro) began in 1999 and the tools vary in their level of maturity. All rely on MatSeis to provide necessary data (waveforms, arrivals, origins, and travel time curves). CodaMag Tool implements magnitude calculation by scaling to fit the envelope shape of the coda for a selected phase type (Mayeda, 1993; Mayeda and Walter, 1996). New tool features include: calculation of a yield estimate based on the source spectrum, display of a filtered version of the seismogram based on the selected band, and the output of codamag data records for processed events. EventID Tool implements event discrimination using phase ratios of regional arrivals (Hartse et al., 1997; Walter et al., 1999). New features include: bandpass filtering of displayed waveforms, screening of reference events based on SNR, multivariate discriminants, use of libcgi to access correction surfaces, and the output of discrim_data records for processed events. PhaseMatch Tool implements match filtering to isolate surface waves (Herrin and Goforth, 1977). New features include: display of the signal's observed dispersion and an option to use a station-based dispersion surface. Dendro Tool implements agglomerative hierarchical clustering using dendrograms to identify similar events based on waveform correlation (Everitt, 1993). New features include: modifications to include arrival information within the tool, and the capability to automatically add/re-pick arrivals based on the picked arrivals for similar events.

  2. Report on studies to monitor the interactions between offshore geophysical exploration activities and bowhead whales in the Alaskan Beaufort Sea, Fall 1982

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reeves, R.; Ljungblad, D.; Clarke, J.T.

    1983-07-01

    A total of 34 survey flights were initiated between 27 August and 4 October 1982 to assess the potential effects of marine geophysical survey work on westward migrating bowhead whales (Balaena mysticetus). No overt changes in whale behavior were observed that could unequivocally be interpreted as responses to seismic noise, with the possible exception of huddling behavior observed on 14-15 September that may have been caused by the onset of seismic sounds. Statistical analyses were performed on four categories of respiratory behavior (blows per surfacing, mean blow interval per surfacing, surface times and dive times) to test for differences between times when whales were and were not exposed to seismic sounds.

  3. Comparison of Earthquake Damage Patterns and Shallow-Depth Vs Structure Across the Napa Valley, Inferred From Multichannel Analysis of Surface Waves (MASW) and Multichannel Analysis of Love Waves (MALW) Modeling of Basin-Wide Seismic Profiles

    NASA Astrophysics Data System (ADS)

    Chan, J. H.; Catchings, R.; Strayer, L. M.; Goldman, M.; Criley, C.; Sickler, R. R.; Boatwright, J.

    2017-12-01

    We conducted an active-source seismic investigation across the Napa Valley (Napa Valley Seismic Investigation-16) in September of 2016 consisting of two basin-wide seismic profiles; one profile was 20 km long and N-S-trending (338°), and the other 15 km long and E-W-trending (80°) (see Catchings et al., 2017). Data from the NVSI-16 seismic investigation were recorded using a total of 666 vertical- and horizontal-component seismographs, spaced 100 m apart on both seismic profiles. Seismic sources were generated by a total of 36 buried explosions spaced 1 km apart. The two seismic profiles intersected in downtown Napa, where a large number of buildings were red-tagged by the City following the 24 August 2014 Mw 6.0 South Napa earthquake. From the recorded Rayleigh and Love waves, we developed 2-dimensional S-wave velocity models to depths of about 0.5 km using the multichannel analysis of surface waves (MASW) method. Our MASW (Rayleigh) and MALW (Love) models show two prominent low-velocity (Vs = 350 to 1300 m/s) sub-basins that were also previously identified from gravity studies (Langenheim et al., 2010). These basins trend NW and coincide with the locations of more than 1500 red- and yellow-tagged buildings in the City of Napa following the 2014 South Napa earthquake. The observed correlation between low Vs, deep basins, and red- and yellow-tagged buildings in Napa suggests that similar large-scale seismic investigations can provide insights into the likely locations of significant structural damage from future earthquakes that occur adjacent to or within sedimentary basins.

  4. Travel-time source-specific station correction improves location accuracy

    NASA Astrophysics Data System (ADS)

    Giuntini, Alessandra; Materni, Valerio; Chiappini, Stefano; Carluccio, Roberto; Console, Rodolfo; Chiappini, Massimo

    2013-04-01

    Accurate earthquake locations are crucial for investigating seismogenic processes, as well as for applications like verifying compliance with the Comprehensive Test Ban Treaty (CTBT). Earthquake location accuracy is related to the degree of knowledge about the 3-D structure of seismic wave velocity in the Earth. It is well known that modeling errors in calculated travel times may shift the computed epicenters far from the real locations, by distances even larger than the size of the statistical error ellipses, regardless of the accuracy in picking seismic phase arrivals. The consequences of large mislocations of seismic events are particularly critical in the context of CTBT verification, where they may affect the triggering of a possible On-Site Inspection (OSI). In fact, the Treaty establishes that an OSI area cannot exceed 1000 km², and its largest linear dimension cannot exceed 50 km. Moreover, depth accuracy is crucial for the application of the depth event screening criterion. In the present study, we develop a method of source-specific travel-time corrections based on a set of well-located events recorded by dense national seismic networks in seismically active regions. The applications concern seismic sequences recorded in Japan, Iran and Italy. We show that mislocations of the order of 10-20 km affecting the epicenters, as well as larger mislocations in hypocentral depths, calculated from a global seismic network and using the standard IASPEI91 travel times, can be effectively removed by applying source-specific station corrections.
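    The correction scheme described above amounts, in its simplest form, to averaging the travel-time residuals (observed minus predicted) of well-located reference events at each station and removing that systematic bias from subsequent observations. A minimal sketch with synthetic residual values (the station names and numbers are invented):

```python
import numpy as np

# Residuals (s) at each station from a set of well-located reference events;
# a consistent sign indicates a systematic path/site bias at that station.
residuals = {
    "STA1": [0.42, 0.38, 0.45, 0.40],       # consistent positive bias
    "STA2": [-0.21, -0.18, -0.25, -0.20],   # consistent negative bias
}

# Source-specific station correction = mean residual per station
corrections = {sta: float(np.mean(r)) for sta, r in residuals.items()}

def corrected_travel_time(observed, station):
    """Remove the station's systematic bias before relocation."""
    return observed - corrections[station]
```

In practice the corrections are source-region specific (hence "source-specific"), so a separate set would be maintained for each calibrated sequence region rather than one global value per station.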

  5. Static Corrections to Improve Seismic Monitoring of the North Korean Nuclear Test Site with Regional Arrays

    NASA Astrophysics Data System (ADS)

    Wilkins, N.; Wookey, J. M.; Selby, N. D.

    2017-12-01

    Seismology is an important part of the International Monitoring System (IMS) installed to detect, identify, and locate nuclear detonations in breach of the Comprehensive Nuclear-Test-Ban Treaty (CTBT), both prior to and after its entry into force. Seismic arrays in particular provide a means not only of detecting and locating underground nuclear explosions, but also of discriminating them from naturally occurring earthquakes of similar magnitude. One potential discriminant is the amplitude ratio of high-frequency (>2 Hz) P waves to S waves (P/S) measured at regional distances (3° to 17°). Accurate measurement of such discriminants, and the ability to detect low-magnitude seismicity from a suspicious event, rely on high signal-to-noise ratio (SNR) data. A correction to the slowness vector of the incident seismic wavefield, together with static corrections applied to the waveforms recorded at each receiver within the array, can be shown to improve the SNR. We apply codes we have developed to calculate slowness-azimuth station corrections (SASCs) and static corrections to the arrival time and amplitude of the seismic waveform at seismic arrays regional to the DPRK nuclear test site at Punggye-ri, North Korea. We use the F-statistic to demonstrate the SNR improvement for data from the nuclear tests and other seismic events in the vicinity of the test site. We also make new measurements of P/S with the corrected waveforms and compare these with existing measurements.
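    The SNR gain from static corrections can be demonstrated with a toy delay-and-sum beam: removing each station's residual time shift before stacking lets the signal add coherently while noise averages down. The waveform, static shifts, and noise level below are synthetic, not DPRK data:

```python
import numpy as np

rng = np.random.default_rng(3)
n_sta, n_samp = 10, 500
t = np.arange(n_samp)
wavelet = np.exp(-((t - 250) / 10.0) ** 2)      # common signal, peak at sample 250
statics = rng.integers(-20, 21, size=n_sta)     # per-station static time shifts

# Each trace: statically shifted signal plus independent noise
traces = np.array([np.roll(wavelet, int(s)) for s in statics])
traces = traces + 0.3 * rng.standard_normal((n_sta, n_samp))

raw_beam = traces.mean(axis=0)                  # stack without static corrections
aligned = np.array([np.roll(tr, -int(s)) for tr, s in zip(traces, statics)])
static_beam = aligned.mean(axis=0)              # stack after static corrections
```

The statically corrected beam recovers nearly the full signal amplitude while the uncorrected stack smears it out; the F-statistic mentioned in the abstract quantifies exactly this coherent-to-incoherent power ratio across the array.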

  6. First seismic shear wave velocity profile of the lunar crust as extracted from the Apollo 17 active seismic data by wavefield gradient analysis

    NASA Astrophysics Data System (ADS)

    Sollberger, David; Schmelzbach, Cedric; Robertsson, Johan O. A.; Greenhalgh, Stewart A.; Nakamura, Yosio; Khan, Amir

    2016-04-01

    We present a new seismic velocity model of the shallow lunar crust, including, for the first time, shear wave velocity information. So far, the shear wave velocity structure of the lunar near-surface was effectively unconstrained due to the complexity of lunar seismograms. Intense scattering and low attenuation in the lunar crust lead to characteristic long-duration reverberations on the seismograms. The reverberations obscure later arriving shear waves and mode conversions, rendering them impossible to identify and analyze. Additionally, only vertical component data were recorded during the Apollo active seismic experiments, which further compromises the identification of shear waves. We applied a novel processing and analysis technique to the data of the Apollo 17 lunar seismic profiling experiment (LSPE), which involved recording seismic energy generated by several explosive packages on a small areal array of four vertical component geophones. Our approach is based on the analysis of the spatial gradients of the seismic wavefield and yields key parameters such as apparent phase velocity and rotational ground motion as a function of time (depth), which cannot be obtained through conventional seismic data analysis. These new observables significantly enhance the data for interpretation of the recorded seismic wavefield and allow, for example, for the identification of S wave arrivals based on their lower apparent phase velocities and distinctly higher amount of generated rotational motion relative to compressional (P-) waves. Using our methodology, we successfully identified pure-mode and mode-converted refracted shear wave arrivals in the complex LSPE data and derived a P- and S-wave velocity model of the shallow lunar crust at the Apollo 17 landing site. 
The extracted elastic-parameter model supports the current understanding of the lunar near-surface structure, suggesting a thin layer of low-velocity lunar regolith overlying a heavily fractured crust of basaltic material with high Poisson's ratios (>0.4 down to 60 m depth). Our new model can be used in future studies to better constrain the deep interior of the Moon. Given the rich information derived from the minimalistic recording configuration, our results demonstrate that wavefield gradient analysis should be seriously considered for future space missions that aim to explore the interior structure of extraterrestrial objects by seismic methods. Additionally, we anticipate that the proposed shear wave identification methodology can also be applied to the routinely recorded vertical-component data from land seismic exploration on Earth.
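    The core observable of wavefield gradient analysis, apparent horizontal phase velocity, follows from the plane-wave relation du/dx = -(1/c) du/dt: the ratio of the temporal to the spatial derivative of the recorded wavefield gives c directly. A synthetic sketch (the pulse shape, velocity, and grid spacing are invented, not LSPE values):

```python
import numpy as np

c_true = 300.0                      # apparent phase velocity (m/s), hypothetical
dx, dt = 1.0, 1e-3                  # receiver spacing (m) and sample interval (s)
x = np.arange(0.0, 50.0, dx)
t = np.arange(0.0, 1.0, dt)
T, X = np.meshgrid(t, x, indexing="ij")

# Propagating Gaussian pulse: u(x, t) = f(t - x/c)
u = np.exp(-((T - X / c_true - 0.3) / 0.02) ** 2)

dudt = np.gradient(u, dt, axis=0)   # temporal derivative
dudx = np.gradient(u, dx, axis=1)   # spatial derivative across the array

# Apparent velocity from the gradient ratio, avoiding near-zero denominators
mask = np.abs(dudx) > 1e-3 * np.abs(dudx).max()
c_est = np.median(-dudt[mask] / dudx[mask])
```

Because S waves travel more slowly than P waves, this per-sample velocity estimate is what lets the method flag shear arrivals even on vertical-component-only data.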

  7. Ischia Island: Historical Seismicity and Dynamics

    NASA Astrophysics Data System (ADS)

    Carlino, S.; Cubellis, E.; Iannuzzi, R.; Luongo, G.; Obrizzo, F.

    2003-04-01

    The seismic energy release in volcanic areas is a complex process, and the island of Ischia provides a significant scenario of historical seismicity, characterized by the occurrence of earthquakes of low energy and high intensity. Information on the seismicity of the island spans about eight centuries, starting from 1228. With regard to effects, the most recent earthquake of 1883 is extensively documented in both the literature and unpublished sources. The earthquake caused 2333 deaths and the destruction of the historical and environmental heritage of some areas of the island. The most severe damage occurred in Casamicciola. This event, which was the first great catastrophe after the unification of Italy in the 1860s (Imax = XI degree MCS), represents an important date in the prevention of natural disasters, in that it was after this earthquake that the first Seismic Safety Act in Italy was passed, by which lower-risk zones were identified for new settlements. Thanks to such detailed analysis, reliable modelling of the seismic source was also obtained. The historical data make it possible to identify the epicentral area of all known earthquakes as the northern slope of Monte Epomeo, while analysis of the effects of earthquakes and the geological structures allows us to evaluate the stress fields that generate the earthquakes. In a volcanic area, interpretation of the mechanisms of release and propagation of seismic energy is made even more complex because the stress field that acts at a regional level is compounded by that generated by the migration of magmatic masses towards the surface, as well as by the rheological properties of the rocks, which depend on the high geothermal gradient. Such structural and dynamic conditions make the island of Ischia a seismic area of considerable interest. 
It appears necessary to evaluate the expected damage from a new event linked to renewed dynamics of the island, where high population density and high economic value (the island is a tourist destination and holiday resort) increase the seismic risk. A seismic hazard map of the island is proposed based on a comparative analysis of various types of data: geology, tectonics, historical seismicity, and the damage caused by the 28 July 1883 Casamicciola earthquake. The analysis was essentially based on a GIS-aided cross-correlation of these data. The GIS is thus able to support both in-depth analysis of the dynamic processes on the island and the extension of the assessment to other natural risks (volcanic, landslides, flooding, etc.).

  8. Development of hazard-compatible building fragility and vulnerability models

    USGS Publications Warehouse

    Karaca, E.; Luco, N.

    2008-01-01

    We present a methodology for transforming the structural and non-structural fragility functions in HAZUS into a format that is compatible with conventional seismic hazard analysis information. The methodology makes use of the building capacity (or pushover) curves and related building parameters provided in HAZUS. Instead of the capacity spectrum method applied in HAZUS, building response is estimated by inelastic response history analysis of corresponding single-degree-of-freedom systems under a large number of earthquake records. Statistics of the building response are used with the damage state definitions from HAZUS to derive fragility models conditioned on spectral acceleration values. Using the developed fragility models for structural and nonstructural building components, with corresponding damage state loss ratios from HAZUS, we also derive building vulnerability models relating spectral acceleration to repair costs. Whereas in HAZUS the structural and nonstructural damage states are treated as if they are independent, our vulnerability models are derived assuming "complete" nonstructural damage whenever the structural damage state is complete. We show the effects of considering this dependence on the final vulnerability models. The use of spectral acceleration (at selected vibration periods) as the ground motion intensity parameter, coupled with the careful treatment of uncertainty, makes the new fragility and vulnerability models compatible with conventional seismic hazard curves and hence useful for extensions to probabilistic damage and loss assessment.
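    Fragility models conditioned on spectral acceleration of the kind derived above are conventionally lognormal. A hedged sketch of such a curve, and of combining damage-state fragilities with loss ratios into a vulnerability (expected-loss) estimate, follows; the medians, dispersion, and loss ratios are purely illustrative, not HAZUS values:

```python
import math

def fragility(sa, median=0.6, beta=0.5):
    """P(damage >= state | Sa): lognormal CDF with median capacity and dispersion beta."""
    z = math.log(sa / median) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_loss(sa, medians=(0.3, 0.6, 1.2), loss_ratios=(0.1, 0.4, 1.0)):
    """Expected repair-cost ratio at Sa from successive damage-state fragilities."""
    p = [fragility(sa, m) for m in medians]  # P(>= each state), medians increasing
    # Probability of being in exactly each state, times that state's loss ratio
    p_exact = [p[i] - (p[i + 1] if i + 1 < len(p) else 0.0) for i in range(len(p))]
    return sum(pe * lr for pe, lr in zip(p_exact, loss_ratios))
```

The abstract's dependence assumption (complete nonstructural damage whenever structural damage is complete) would enter this combination step, by tying the top nonstructural state to the top structural one instead of treating the two fragilities as independent.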

  9. Geophysical analysis for the Ada Tepe region (Bulgaria) - case study

    NASA Astrophysics Data System (ADS)

    Trifonova, Petya; Metodiev, Metodi; Solakov, Dimcho; Simeonova, Stela; Vatseva, Rumiana

    2013-04-01

    According to the current archeological investigations, Ada Tepe is the oldest gold mine in Europe, dating from the Late Bronze and Early Iron Age. It is a typical low-sulfidation epithermal gold deposit hosted in Maastrichtian-Paleocene sedimentary rocks above a detachment-fault contact with underlying Paleozoic metamorphic rocks. Ada Tepe (25°39'E; 41°25'N) is located in the Eastern Rhodope unit. The region is highly segmented despite the low altitude (470-750 m), owing to widespread volcanic and sedimentary rocks susceptible to torrential erosion during the cold season. Besides the thorough geological exploration focused on identifying cost-effective stocks of mineral resources, a detailed geophysical analysis concerning different stages of the gold extraction project was accomplished. We present the main results of the geophysical investigation aimed at clarifying the complex seismotectonic setting of the Ada Tepe site region. The overall study methodology consists of collecting, reviewing and estimating geophysical and seismological information to constrain the model used for seismic hazard assessment of the area. The geophysical information used in the present work consists of gravity, geomagnetic and seismological data. Interpretation of the gravity data is applied to outline the axes of steep gravity transitions, marked as potential axes of faults, flexures and other dislocation structures. Direct and inverse techniques are also utilized to estimate the form and depth of anomalous sources. For the purposes of the seismological investigation of the Ada Tepe site region, an earthquake catalogue was compiled for the time period 510 BC - 2011 AD. Statistical parameters of seismicity, namely the annual seismic rate parameter and the b-value of the Gutenberg-Richter exponential relation, are estimated for the Ada Tepe site region. 
All geophysical datasets and derived results are integrated using GIS techniques ensuring interoperability of data when combining, processing and visualizing obtained information from different sources.
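    The b-value of the Gutenberg-Richter relation estimated in studies like this one is commonly obtained with the Aki (1965) maximum-likelihood formula, b = log10(e) / (mean(M) - Mc), where Mc is the completeness magnitude. A sketch on a synthetic catalogue, using the simple form without the magnitude-binning correction:

```python
import math

def b_value(magnitudes, mc):
    """Aki (1965) maximum-likelihood b-value for events at or above completeness mc."""
    above = [m for m in magnitudes if m >= mc]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - mc)

# Synthetic magnitudes, not the Ada Tepe catalogue
mags = [2.1, 2.3, 2.05, 2.8, 2.5, 2.15, 3.2, 2.4, 2.6, 2.2]
b = b_value(mags, mc=2.0)
```

For binned catalogue magnitudes the usual refinement replaces Mc with Mc minus half the bin width; the estimate is only meaningful above the completeness threshold of the catalogue.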

  10. Robust Satellite Techniques analysis of ten years (2004-2013) of MSG/SEVIRI TIR radiances over Greece region

    NASA Astrophysics Data System (ADS)

    Genzano, N.; Eleftheriou, A.; Filizzola, C.; Paciello, R.; Pergola, N.; Vallianatos, F.; Tramutoli, V.

    2014-12-01

    Space-time fluctuations of Earth's emitted Thermal InfraRed (TIR) radiation have been observed from satellite months to weeks before earthquake occurrence. Among the different approaches proposed to discern transient anomalous signals possibly associated with seismic activity from normal TIR signal fluctuations (i.e., those related to changes in natural factors and/or observation conditions), the Robust Satellite Techniques (RST) have been used since 2001 to investigate tens of earthquakes with a wide range of magnitudes (from 4.0 to 7.9) that occurred on different continents and in various geo-tectonic settings (e.g., the Athens earthquake, 7 September 1999; the Abruzzo earthquake, 6 April 2009). The RST approach gives a statistically based definition of "TIR anomalies" and offers a suitable method for their identification even under very different local (e.g., atmospheric and/or surface) and observational (e.g., time/season, solar and satellite zenith angle) conditions. It has always been carried out using a validation/confutation approach, verifying the presence/absence of anomalous space-time TIR transients in the presence/absence of seismic activity. In this paper, the RST approach is extensively applied to 10 years of TIR satellite records collected over the Greece region by the geostationary satellite sensor MSG/SEVIRI. The results of the analysis, performed to investigate possible correlations (within predefined space-time windows) of anomalous TIR transients with the time and place of occurrence of earthquakes with M>4, will be discussed in terms of reliability and effectiveness, also in the perspective of a time-Dependent Assessment of Seismic Hazard (t-DASH) system.
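    The "statistically based definition" of a TIR anomaly used by RST can be sketched as a pixel-wise index: the current radiance minus the pixel's multi-year reference mean, in units of that pixel's multi-year standard deviation, flagged when it exceeds a sigma threshold. The scene size, threshold, and implanted transient below are illustrative, not SEVIRI values:

```python
import numpy as np

rng = np.random.default_rng(4)
years, ny, nx = 10, 10, 10

# Historical TIR fields (K) for the same place and acquisition time, one per year
reference = 290.0 + rng.standard_normal((years, ny, nx))
mu = reference.mean(axis=0)        # pixel-wise multi-year mean
sigma = reference.std(axis=0)      # pixel-wise normal variability

# Current scene with one strong implanted thermal transient
current = 290.0 + rng.standard_normal((ny, nx))
current[4, 6] += 10.0

index = (current - mu) / sigma                 # RST-style local variation index
anomalies = np.argwhere(index > 3.0)           # pixels beyond the sigma threshold
```

Normalizing each pixel by its own historical statistics is what makes the definition robust to site, season, and observation-geometry differences, the property the abstract emphasizes.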

  11. Analysis of the 2003-2004 microseismic sequence in the western part of the Corinth Rift

    NASA Astrophysics Data System (ADS)

    Godano, Maxime; Bernard, Pascal; Dublanchet, Pierre; Canitano, Alexandre; Marsan, David

    2013-04-01

The Corinth rift is one of the most seismically active zones in Europe. The seismic activity follows a swarm organization, alternating between intense crises and more quiescent periods. The seismicity mainly occurs under the Gulf of Corinth in a 3-4 km thick, north-dipping layer between 5 and 12 km depth. Several hypotheses have been proposed to explain this seismic layer. Nevertheless, the relationships between seismicity, deep structures and faults mapped at the surface remain unclear. Moreover, fluids seem to play a key role in the occurrence of the seismic activity (Bourouis and Cornet 2009, Pacchiani and Lyon-Caen 2009). Recently, a detailed analysis of the microseismicity (multiplet identification, precise relocation, focal mechanism determination) between 2000 and 2007 in the western part of the Corinth rift has highlighted north-dipping (and some south-dipping) planar active microstructures in the seismic layer with normal fault mechanisms (Lambotte et al., in preparation; Godano et al., in preparation). A multiplet (a group of earthquakes with similar waveforms) can be interpreted as repeated ruptures of the same asperity due to transient forcing such as silent creep on a fault segment or fluid circulation. The detailed analysis of the multiplets in the Corinth rift is an opportunity to better understand the coupling between seismic and aseismic processes. In the present study we focus on the seismic crisis that occurred from October 2003 to July 2004 in the western part of the Gulf of Corinth. This crisis consists of 2431 relocated events with magnitudes ranging from 0.5 to 3.1 (b-value = 1.4). The joint analysis of (1) the position of the multiplets with respect to the faults mapped at the surface, (2) the geometry of the main multiplets and (3) the fault plane solutions shows that the seismic crisis is probably related to the activation at depth of the Fassouleika and Aigion faults.
The spatio-temporal analysis of the microseismicity highlights an overall migration from south-east to north-west characterized by the successive activation of the multiplets. We next perform a spectral analysis to determine source parameters for each multiplet in order to estimate the size of the asperities and the cumulative coseismic slip. From the preceding observations and results we finally try to reproduce the 2003-2004 microseismic sequence using a rate-and-state 3D asperity model (Dublanchet et al., submitted). The deformation measured during the crisis by the strainmeter installed on Trizonia island is used in the modeling to constrain the maximum slip amplitude.
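Multiplet identification as described in this abstract (grouping events with similar waveforms) is commonly done by thresholding waveform cross-correlation coefficients. The sketch below assumes pre-aligned, equal-length traces and a single-link grouping rule; the threshold value and function names are illustrative.

```python
import numpy as np

def correlation_matrix(waveforms):
    """Zero-lag normalized cross-correlation between all event pairs.
    `waveforms` is an (n_events, n_samples) array of aligned traces."""
    w = waveforms - waveforms.mean(axis=1, keepdims=True)
    w = w / np.linalg.norm(w, axis=1, keepdims=True)
    return w @ w.T

def group_multiplets(cc, threshold=0.9):
    """Greedy single-link grouping: an event joins a multiplet when its
    correlation with any current member exceeds the threshold."""
    n = cc.shape[0]
    labels = -np.ones(n, dtype=int)
    current = 0
    for i in range(n):
        if labels[i] >= 0:
            continue
        labels[i] = current
        stack = [i]
        while stack:                      # flood-fill over correlated pairs
            j = stack.pop()
            for k in range(n):
                if labels[k] < 0 and cc[j, k] > threshold:
                    labels[k] = current
                    stack.append(k)
        current += 1
    return labels
```

In practice the traces would first be aligned on a common phase pick, and the correlation computed in a band-limited window around the arrival.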

  12. Full waveform seismic modelling of Chalk Group rocks from the Danish North Sea - implications for velocity analysis

    NASA Astrophysics Data System (ADS)

    Montazeri, Mahboubeh; Moreau, Julien; Uldall, Anette; Nielsen, Lars

    2015-04-01

This study aims at understanding seismic wave propagation in the fine-layered Chalk Group, which constitutes the main reservoir for oil and gas production in the Danish North Sea. The starting point of our analysis is the Nana-1XP exploration well, which shows strong seismic contrasts inside the Chalk Group. For the purposes of seismic waveform modelling, we here assume a one-dimensional model with homogeneous and isotropic layers designed to capture the main fluctuations in petrophysical properties observed in the well logs. The model is representative of the stratigraphic sequences of the area and illustrates the highly contrasting properties of the Chalk Group. A finite-difference (FD) full-waveform technique, using both acoustic and elastic formulations, is applied to the model. Velocity analysis of seismic data is a crucial step for stacking, multiple suppression, migration, and depth conversion of the seismic record. Semblance analysis of the synthetic seismic records shows strong amplitude peaks outside the expected range for the time interval representing the Chalk Group, especially at the base. The various synthetic results illustrate the occurrence and the impact of different types of waves, including multiples, converted waves and refracted waves. The interference of these different wave types with the primary reflections can explain the strong anomalous amplitudes in the semblance plot. In particular, the effect of strongly contrasting thin beds plays an important role in the generation of the high anomalous amplitude values. If these anomalous amplitudes are used to pick velocities, proper stacking of the data is impeded, and sub-optimal migration and depth conversion may result. Consequently, this may lead to erroneous or sub-optimal seismic images of the Chalk Group and the underlying layers. Our results highlight the importance of detailed velocity analysis and proper picking of velocity functions in the Chalk Group intervals.
We show that application of standard front mutes in the mid- and far-offset ranges does not significantly improve the results of the standard semblance analysis. These synthetic modelling results could be used as starting points for defining optimized processing flows for the seismic data sets acquired in the study area with the aim of improving the imaging of the Chalk Group.
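The semblance analysis used here to pick stacking velocities measures the coherence of traces along trial NMO hyperbolae. A minimal sketch, assuming a simple 1-D gather, nearest-sample interpolation, and illustrative parameter names (the actual processing flow of the study is not reproduced):

```python
import numpy as np

def semblance(gather, t0_idx, v, offsets, dt, win=5):
    """Semblance coherence for one (t0, v) trial: follow the hyperbolic
    moveout t(x) = sqrt(t0^2 + x^2/v^2) and measure the ratio of stacked
    energy to total energy in a small time window around t0."""
    nt, ntr = gather.shape
    t0 = t0_idx * dt
    num = 0.0
    den = 0.0
    for k in range(-win, win + 1):
        s = 0.0    # stacked amplitude at this window sample
        ss = 0.0   # total energy at this window sample
        for j, x in enumerate(offsets):
            t = np.sqrt((t0 + k * dt) ** 2 + (x / v) ** 2)
            i = int(round(t / dt))
            if 0 <= i < nt:
                a = gather[i, j]
                s += a
                ss += a * a
        num += s * s
        den += ss
    return num / (ntr * den) if den > 0 else 0.0
```

Scanning this function over a grid of (t0, v) pairs produces the semblance panel; the anomalous peaks discussed in the abstract appear where multiples or converted waves happen to align along a trial hyperbola.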

  13. Reflection imaging of the Moon's interior using deep-moonquake seismic interferometry

    NASA Astrophysics Data System (ADS)

    Nishitsuji, Yohei; Rowe, C. A.; Wapenaar, Kees; Draganov, Deyan

    2016-04-01

The internal structure of the Moon has been investigated over many years using a variety of seismic methods, such as travel-time analysis, receiver functions, and tomography. Here we propose to apply body-wave seismic interferometry to deep moonquakes in order to retrieve zero-offset reflection responses (and thus images) beneath the Apollo stations on the nearside of the Moon from virtual sources colocated with the stations. This method is called deep-moonquake seismic interferometry (DMSI). Our results show a laterally coherent acoustic boundary around 50 km depth beneath all four Apollo stations. We interpret this boundary as the lunar seismic Moho. This depth agrees with the Japan Aerospace Exploration Agency's (JAXA) SELenological and Engineering Explorer (SELENE) result and with previous travel-time analysis at the Apollo 12/14 sites. The deeper part of the image we obtain from DMSI shows laterally incoherent structures. We interpret this lateral inhomogeneity as representing a zone characterized by strong scattering and constant apparent seismic velocity at our resolution scale (0.2-2.0 Hz).
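The zero-offset retrieval underlying this kind of interferometry rests on the classical result that the autocorrelation of a transmission response recorded at the surface approximates the reflection response beneath the receiver (Claerbout's relation). A one-trace sketch, with illustrative names; the study's actual processing (e.g. handling of the moonquake source signature and stacking over event clusters) is omitted:

```python
import numpy as np

def zero_offset_reflection(trace, water_level=1e-8):
    """Autocorrelation of a transmission recording, computed in the
    frequency domain with zero padding (linear, not circular, lags),
    normalized by the zero-lag energy. Peaks at positive lags mark
    two-way travel times of reflectors beneath the receiver."""
    n = len(trace)
    spec = np.fft.rfft(trace, 2 * n)
    auto = np.fft.irfft(spec * np.conj(spec))[:n]
    return auto / (auto[0] + water_level)
```

For a trace containing a direct arrival plus one reverberation delayed by L samples, the normalized autocorrelation shows a secondary peak at lag L, which maps to reflector depth once a velocity model is assumed.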

  14. Teaching hands-on geophysics: examples from the Rū seismic network in New Zealand

    NASA Astrophysics Data System (ADS)

    van Wijk, Kasper; Simpson, Jonathan; Adam, Ludmila

    2017-03-01

    Education in physics and geosciences can be effectively illustrated by the analysis of earthquakes and the subsequent propagation of seismic waves in the Earth. Educational seismology has matured to a level where both the hard- and software are robust and user friendly. This has resulted in successful implementation of educational networks around the world. Seismic data recorded by students are of such quality that these can be used in classic earthquake location exercises, for example. But even ocean waves weakly coupled into the Earth’s crust can now be recorded on educational seismometers. These signals are not just noise, but form the basis of more recent developments in seismology, such as seismic interferometry, where seismic waves generated by ocean waves—instead of earthquakes—can be used to infer information about the Earth’s interior. Here, we introduce an earthquake location exercise and an analysis of ambient seismic noise, and present examples. Data are provided, and all needed software is freely available.

  15. Detecting and characterizing high-frequency oscillations in epilepsy: a case study of big data analysis

    NASA Astrophysics Data System (ADS)

    Huang, Liang; Ni, Xuan; Ditto, William L.; Spano, Mark; Carney, Paul R.; Lai, Ying-Cheng

    2017-01-01

    We develop a framework to uncover and analyse dynamical anomalies from massive, nonlinear and non-stationary time series data. The framework consists of three steps: preprocessing of massive datasets to eliminate erroneous data segments, application of the empirical mode decomposition and Hilbert transform paradigm to obtain the fundamental components embedded in the time series at distinct time scales, and statistical/scaling analysis of the components. As a case study, we apply our framework to detecting and characterizing high-frequency oscillations (HFOs) from a big database of rat electroencephalogram recordings. We find a striking phenomenon: HFOs exhibit on-off intermittency that can be quantified by algebraic scaling laws. Our framework can be generalized to big data-related problems in other fields such as large-scale sensor data and seismic data analysis.
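The Hilbert-transform step of the framework yields instantaneous amplitude and frequency for each component extracted by empirical mode decomposition. A numpy-only sketch of that step (the EMD sifting itself, which would supply the intrinsic mode functions, is not shown; function names are illustrative):

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the frequency-domain Hilbert transform
    (equivalent in effect to scipy.signal.hilbert)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

def instantaneous_attributes(imf, dt):
    """Instantaneous amplitude and frequency (Hz) of one intrinsic mode
    function, from the envelope and the unwrapped phase derivative."""
    z = analytic_signal(imf)
    amp = np.abs(z)
    phase = np.unwrap(np.angle(z))
    freq = np.gradient(phase) / (2 * np.pi * dt)
    return amp, freq
```

Applied to an HFO-band IMF, bursts of elevated instantaneous amplitude are the on-off intermittency events whose statistics the paper characterizes with scaling laws.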

  16. Gas Reservoir Identification Basing on Deep Learning of Seismic-print Characteristics

    NASA Astrophysics Data System (ADS)

    Cao, J.; Wu, S.; He, X.

    2016-12-01

Reservoir identification based on seismic data analysis is the core task in oil and gas geophysical exploration. The essence of reservoir identification is to identify the properties of the rock pore fluid. We developed a novel gas reservoir identification method named seismic-print analysis, in imitation of the voice-print analysis techniques used in speaker identification. The term "seismic-print" refers to the characteristics of the seismic waveform that can unambiguously identify the property of a geological objective, for instance a natural gas reservoir. A seismic-print can be characterized by one or a few parameters, named seismic-print parameters. It has been shown that gas reservoirs exhibit both a negative first-order cepstrum coefficient anomaly and a positive second-order cepstrum coefficient anomaly. The method is valid for sandstone gas reservoirs, carbonate reservoirs and shale gas reservoirs, and its accuracy rate may reach up to 90%. There are two main problems to deal with in the application of the seismic-print analysis method. One is to identify the "ripple" of a reservoir on the seismogram, and the other is to construct the mapping relationship between the seismic-print and the gas reservoir. Deep learning, developed in recent years, has the ability to reveal complex non-linear relationships between attributes and data, and to automatically extract features of the objective from the data. Thus, deep learning can be used to deal with these two problems. There are many algorithms for deep learning, which can be roughly divided into two categories: Deep Belief Networks (DBNs) and Convolutional Neural Networks (CNNs). A DBN is a probabilistic generative model, which can establish a joint distribution of the observed data and tags. A CNN is a feedforward neural network, which can be used to extract the 2D structural features of the input data.
Both DBNs and CNNs can be used to deal with seismic data. We used an improved DBN to identify carbonate rocks from log data; the accuracy rate can reach up to 83%. When DBNs are applied to seismic waveform data, more information is obtained. The work was supported by NSFC under grants No. 41430323 and No. 41274128, and by the State Key Lab. of Oil and Gas Reservoir Geology and Exploration.
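The cepstrum coefficients used as seismic-print parameters can be sketched as the low-order terms of the real cepstrum of a trace (inverse FFT of the log amplitude spectrum). This generic formulation is an assumption for illustration only and not necessarily the authors' exact parameterization.

```python
import numpy as np

def cepstrum_coefficients(trace, n_coeff=4):
    """Real cepstrum of a seismic trace: inverse FFT of the log amplitude
    spectrum. The low-order coefficients (first, second, ...) are the
    kind of 'seismic-print' parameters discussed in the abstract."""
    spec = np.abs(np.fft.rfft(trace))
    log_spec = np.log(np.where(spec > 0, spec, np.finfo(float).tiny))
    ceps = np.fft.irfft(log_spec)
    return ceps[:n_coeff]
```

In the scheme described above, the signs of the first- and second-order coefficients computed trace by trace would be mapped over the survey to flag candidate gas reservoirs.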

  17. Structure of the Suasselkä postglacial fault in northern Finland obtained by analysis of local events and ambient seismic noise

    NASA Astrophysics Data System (ADS)

    Afonin, Nikita; Kozlovskaya, Elena; Kukkonen, Ilmo; Dafne/Finland Working Group

    2017-04-01

Understanding the inner structure of seismogenic faults and their ability to reactivate is particularly important in investigating the continental intraplate seismicity regime. In our study we address this problem using analysis of local seismic events and ambient seismic noise recorded by the temporary DAFNE array in the northern Fennoscandian Shield. The main purpose of the DAFNE/FINLAND passive seismic array experiment was to characterize the present-day seismicity of the Suasselkä postglacial fault (SPGF), which was proposed as one potential target for the DAFNE (Drilling Active Faults in Northern Europe) project. The DAFNE/FINLAND array comprised an area of about 20 to 100 km and consisted of eight short-period and four broadband three-component autonomous seismic stations installed in the close vicinity of the fault area. The array recorded continuous seismic data during September 2011-May 2013. Recordings of the array have been analysed in order to identify and locate natural earthquakes from the fault area and to discriminate them from the blasts in the Kittilä gold mine. As a result, we found a number of natural seismic events originating from the fault area, which proves that the fault is still seismically active. In order to study the inner structure of the SPGF we use cross-correlation of ambient seismic noise recorded by the array. Analysis of the azimuthal distribution of noise sources demonstrated that, during the time interval under consideration, the distribution of noise sources is close to uniform. The continuous data were processed in several steps, including single-station data analysis, instrument response removal and time-domain stacking. The data were used to estimate empirical Green's functions between pairs of stations in the frequency band of 0.1-1 Hz and to calculate corresponding surface wave dispersion curves. The S-wave velocity models were obtained as a result of dispersion curve inversion.
The results suggest that the area of the SPGF corresponds to a narrow region of low S-wave velocities surrounded by rocks with high S-wave velocities. We interpret this low-velocity region as a non-healed mechanically weak fault damage zone (FDZ) formed due to the last major earthquake that occurred after the last glaciation.
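The noise cross-correlation step that yields empirical Green's functions between station pairs can be sketched as segment-wise correlation and stacking. Preprocessing such as instrument response removal (which the study applies) and common normalization steps like spectral whitening are omitted; names are illustrative.

```python
import numpy as np

def noise_cross_correlation(u1, u2, nwin, maxlag):
    """Estimate the inter-station empirical Green's function by
    cross-correlating ambient noise records u1, u2 in `nwin` equal
    segments and stacking the results over lags -maxlag..+maxlag."""
    n = len(u1) // nwin
    stack = np.zeros(2 * maxlag + 1)
    for w in range(nwin):
        a = u1[w * n:(w + 1) * n]
        b = u2[w * n:(w + 1) * n]
        cc = np.correlate(a - a.mean(), b - b.mean(), mode="full")
        mid = len(cc) // 2                      # zero-lag index
        stack += cc[mid - maxlag:mid + maxlag + 1]
    return stack / nwin
```

The lag of the stacked peak gives the inter-station travel time of the dominant (surface) wave; repeating this band by band yields the dispersion curves that are then inverted for S-wave velocity.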

  18. Seismo-acoustic analysis of the near quarry blasts using Plostina small aperture array

    NASA Astrophysics Data System (ADS)

    Ghica, Daniela; Stancu, Iulian; Ionescu, Constantin

    2013-04-01

Seismic and acoustic signals are important for recognizing different types of industrial blasting sources, in order to discriminate between them and natural earthquakes. We have analyzed the seismic events listed in the Romanian catalogue (Romplus) that occurred in the Dobrogea region between 2011 and 2012, in order to assess the detection of seismo-acoustic signals from quarry blasts by the Plostina array stations. Dobrogea is known as a seismic region characterized by crustal earthquakes with low magnitudes; at the same time, over 40 quarry mines are located in the area, being sources of blasts recorded by both the seismic and the infrasound sensors of the Romanian Seismic Network. The Plostina seismo-acoustic array, deployed in the central part of Romania, consists of 7 seismic sites (3C broad-band instruments and accelerometers) collocated with 7 infrasound instruments. The array is particularly used for the seismic monitoring of local and regional events, as well as for the detection of infrasonic signals produced by various sources. Considering the characteristics of the infrasound sensors (frequency range, dynamics, sensitivity), the array has proved its efficiency in observing the signals produced by explosions, mine explosions and quarry blasts. The quarry mines included in this study lie within about two hundred kilometers of the station and routinely generate explosions that are detected as seismic and infrasonic signals by the Plostina array. The combined seismo-acoustic analysis uses two types of detectors for signal identification: one, applied for seismic signal identification, is based on array processing techniques (beamforming and frequency-wavenumber analysis), while the other, used for infrasound detection and characterization, is the automatic detector DFX-PMCC (Progressive Multi-Channel Correlation Method).
Infrasonic waves generated by quarry blasts have frequencies ranging from 0.05 Hz up to at least 6 Hz and amplitudes below 5 Pa. Seismic data analysis shows that the frequency range of the signals is above 2 Hz. Surface explosions such as quarry blasts are useful sources for checking detection and location efficiency when seismic measurements are added. The process is crucial for discrimination purposes and for establishing a set of ground-truth infrasound events. Ground-truth information plays a key role in the interpretation of infrasound signals, by including near-field observations from industrial blasts.

  19. Seismicity of Cascade Volcanoes: Characterization and Comparison

    NASA Astrophysics Data System (ADS)

    Thelen, W. A.

    2016-12-01

Here we summarize and compare the seismicity around each of the Very High Threat Volcanoes of the Cascade Range of Washington, Oregon and California, as defined by the National Volcanic Early Warning System (NVEWS) threat assessment (Ewert et al., 2005). Understanding the background seismic activity and the processes controlling it is critical for assessing changes in seismicity and their implications for volcanic hazards. Comparing seismicity at different volcanic centers can help determine what critical factors or processes affect the observed seismic behavior. Of the ten Very High Threat Volcanoes in the Cascade Range, five are consistently seismogenic when considering earthquakes within 10 km of the volcanic center or caldera edge (Mount Rainier, Mount St. Helens, Mount Hood, Newberry Caldera, Lassen Volcanic Center). The other Very High Threat volcanoes (South Sister, Mount Baker, Glacier Peak, Crater Lake and Mount Shasta) have comparatively low rates of seismicity and not enough recorded earthquakes to calculate catalog statistics. Using a swarm definition of 3 or more earthquakes occurring in a day with magnitudes above the largest magnitude of completeness across the networks (M 0.9), we find that Lassen Volcanic Center is the "swarmiest" in terms of the percentage of seismicity occurring in swarms, followed by Mount Hood, Mount St. Helens and Mount Rainier. The predominance of swarms at Mount Hood may be overstated, as much of the seismicity is occurring on surrounding crustal faults (Jones and Malone, 2005). Newberry Caldera has a relatively short record of seismicity, the permanent network having been installed in 2011; no swarms as defined here have been detected there. Future work will include developing discriminants for volcanic versus tectonic seismicity to better filter the seismic catalog, and more precise binning of depths at some volcanoes so that we may better consider different processes. Ewert, J. W., Guffanti, M., and Murray, T. L. (2005).
An Assessment of Volcanic Threat and Monitoring Capabilities in the United States: Framework for a National Volcano Early Warning System, USGS Open-File Report 2005-1164, 62 pp. Jones, J., and Malone, S. D. (2005). Mount Hood earthquake activity: Volcanic or tectonic origins? Bulletin of the Seismological Society of America, 95(3), 818-832.
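The swarm criterion stated above (three or more earthquakes in one day with magnitudes above the common completeness threshold, M 0.9) is directly implementable; the function and variable names below are illustrative.

```python
import numpy as np

def find_swarm_days(event_days, magnitudes, mc=0.9, min_events=3):
    """Return the days that qualify as swarm days under the abstract's
    definition: at least `min_events` earthquakes in a single day with
    magnitude above the completeness threshold `mc`."""
    days = np.asarray(event_days)
    mags = np.asarray(magnitudes)
    days = days[mags > mc]                      # keep only complete events
    uniq, counts = np.unique(days, return_counts=True)
    return uniq[counts >= min_events]
```

The "swarminess" comparison in the abstract then follows by dividing the number of events falling on swarm days by the total number of cataloged events at each volcano.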

  20. The New Italian Seismic Hazard Model

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.; Meletti, C.; Albarello, D.; D'Amico, V.; Luzi, L.; Martinelli, F.; Pace, B.; Pignone, M.; Rovida, A.; Visini, F.

    2017-12-01

In 2015 the Seismic Hazard Center (Centro Pericolosità Sismica, CPS) of the National Institute of Geophysics and Volcanology was commissioned to coordinate the national scientific community in elaborating a new reference seismic hazard model, mainly aimed at updating the seismic building code. The CPS designed a roadmap for releasing, within three years, a significantly renewed PSHA model, with regard both to the updated input elements and to the strategies to be followed. The main requirements of the model were discussed in meetings with the earthquake engineering experts who will then participate in the revision of the building code. The activities were organized in six tasks: program coordination, input data, seismicity models, ground motion predictive equations (GMPEs), computation and rendering, and testing. The input data task has selected the most updated information about seismicity (historical and instrumental), seismogenic faults, and deformation (from both seismicity and geodetic data). The seismicity models have been elaborated in terms of classic source areas, fault sources and gridded seismicity, based on different approaches. The GMPE task has selected the most recent models, accounting for their tectonic suitability and forecasting performance. The testing phase has been planned to design statistical procedures for testing, against the available data, the whole seismic hazard model as well as single components such as the seismicity models and the GMPEs. In this talk we show some preliminary results, summarize the overall strategy for building the new Italian PSHA model, and discuss in detail important novelties that we put forward. Specifically, we adopt a new formal probabilistic framework to interpret the outcomes of the model and to test it meaningfully; this requires a proper definition and characterization of both aleatory variability and epistemic uncertainty, which we accomplish through an ensemble modeling strategy.
We use a weighting scheme for the different components of the PSHA model that has been built through three independent steps: a formal expert elicitation, the outcomes of the testing phase, and the correlation between the outcomes. Finally, we explore through different techniques the influence of the declustering procedure on seismic hazard.

  1. Statistical forecasting of repetitious dome failures during the waning eruption of Redoubt Volcano, Alaska, February-April 1990

    USGS Publications Warehouse

    Page, R.A.; Lahr, J.C.; Chouet, B.A.; Power, J.A.; Stephens, C.D.

    1994-01-01

    The waning phase of the 1989-1990 eruption of Redoubt Volcano in the Cook Inlet region of south-central Alaska comprised a quasi-regular pattern of repetitious dome growth and destruction that lasted from February 15 to late April 1990. The dome failures produced ash plumes hazardous to airline traffic. In response to this hazard, the Alaska Volcano Observatory sought to forecast these ash-producing events using two approaches. One approach built on early successes in issuing warnings before major eruptions on December 14, 1989 and January 2, 1990. These warnings were based largely on changes in seismic activity related to the occurrence of precursory swarms of long-period seismic events. The search for precursory swarms of long-period seismicity was continued through the waning phase of the eruption and led to warnings before tephra eruptions on March 23 and April 6. The observed regularity of dome failures after February 15 suggested that a statistical forecasting method based on a constant-rate failure model might also be successful. The first statistical forecast was issued on March 16 after seven events had occurred, at an average interval of 4.5 days. At this time, the interval between dome failures abruptly lengthened. Accordingly, the forecast was unsuccessful and further forecasting was suspended until the regularity of subsequent failures could be confirmed. Statistical forecasting resumed on April 12, after four dome failure episodes separated by an average of 7.8 days. One dome failure (April 15) was successfully forecast using a 70% confidence window, and a second event (April 21) was narrowly missed before the end of the activity. The cessation of dome failures after April 21 resulted in a concluding false alarm. 
Although forecasting success during the eruption was limited, retrospective analysis shows that early and consistent application of the statistical method using a constant-rate failure model and a 90% confidence window could have yielded five successful forecasts and two false alarms; no events would have been missed. On closer examination, the intervals between successive dome failures are not uniform but tend to increase with time. This increase attests to the continuous, slowly decreasing supply of magma to the surface vent during the waning phase of the eruption. The domes formed in a precarious position in a breach in the summit crater rim where they were susceptible to gravitational collapse. The instability of the February 15-April 21 domes relative to the earlier domes is attributed to reaming the lip of the vent by a laterally directed explosion during the major dome-destroying eruption of February 15, a process which would leave a less secure foundation for subsequent domes. © 1994.
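The constant-rate forecasting idea can be sketched as projecting the mean repeat interval forward from the last event, with a confidence window derived from the spread of past intervals. The normal approximation and helper names below are assumptions for illustration; the operational window construction used during the eruption may have differed.

```python
import numpy as np
from statistics import NormalDist

def forecast_window(event_times, confidence=0.70):
    """Constant-rate failure forecast sketch: the next event is expected
    one mean repeat interval after the last event, inside a symmetric
    window sized from the standard deviation of past intervals."""
    t = np.sort(np.asarray(event_times, dtype=float))
    intervals = np.diff(t)
    mean, sd = intervals.mean(), intervals.std(ddof=1)
    # two-sided z-score for the requested confidence level
    z = NormalDist().inv_cdf(0.5 + confidence / 2)
    center = t[-1] + mean
    return center - z * sd, center + z * sd
```

With the roughly 4.5-day spacing reported for the March events, such a rule issues a window centered 4.5 days after each failure; the abrupt lengthening of the repeat interval is exactly the failure mode that defeated the first forecast.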

  2. Comparison of smoothing methods for the development of a smoothed seismicity model for Alaska and the implications for seismic hazard

    NASA Astrophysics Data System (ADS)

    Moschetti, M. P.; Mueller, C. S.; Boyd, O. S.; Petersen, M. D.

    2013-12-01

In anticipation of the update of the Alaska seismic hazard maps (ASHMs) by the U. S. Geological Survey, we report progress on the comparison of smoothed seismicity models developed using fixed and adaptive smoothing algorithms, and investigate the sensitivity of seismic hazard to the models. While fault-based sources, such as those for great earthquakes in the Alaska-Aleutian subduction zone and for the ~10 shallow crustal faults within Alaska, dominate the seismic hazard estimates for locations near the sources, smoothed seismicity rates make important contributions to seismic hazard away from fault-based sources and where knowledge of recurrence and magnitude is not sufficient for use in hazard studies. Recent developments in adaptive smoothing methods and statistical tests for evaluating and comparing rate models prompt us to investigate the appropriateness of adaptive smoothing for the ASHMs. We develop smoothed seismicity models for Alaska using fixed and adaptive smoothing methods and compare the resulting models by calculating and evaluating the joint likelihood test. We use the earthquake catalog, and associated completeness levels, developed for the 2007 ASHM to produce fixed-bandwidth-smoothed models with smoothing distances varying from 10 to 100 km, as well as adaptively smoothed models. Adaptive smoothing follows the method of Helmstetter et al. and defines a unique smoothing distance for each earthquake epicenter from the distance to the nth nearest neighbor. The consequence of the adaptive smoothing method is to reduce smoothing distances where seismicity rates are high, causing locally increased rates, and to increase smoothing distances where seismicity is sparse. We follow guidance from previous studies to optimize the neighbor number (n-value) by comparing model likelihood values, which estimate the likelihood that the observed earthquake epicenters from the recent catalog are derived from the smoothed rate models.
We compare likelihood values from all rate models to rank the smoothing methods. We find that adaptively smoothed seismicity models yield better likelihood values than the fixed smoothing models. Holding all other (source and ground motion) models constant, we calculate seismic hazard curves for all points across Alaska on a 0.1 degree grid, using the adaptively smoothed and fixed smoothed seismicity models separately. Because adaptively smoothed models concentrate seismicity near the earthquake epicenters where seismicity rates are high, the corresponding hazard values are higher, locally, but reduced with distance from observed seismicity, relative to the hazard from fixed-bandwidth models. We suggest that adaptively smoothed seismicity models be considered for implementation in the update to the ASHMs because of their improved likelihood estimates relative to fixed smoothing methods; however, concomitant increases in seismic hazard will cause significant changes in regions of high seismicity, such as near the subduction zone, northeast of Kotzebue, and along the NNE trending zone of seismicity in the Alaskan interior.
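The adaptive smoothing described here (a Helmstetter-style bandwidth set by the distance to the nth-nearest epicenter, then a kernel sum over events) can be sketched as follows. Cartesian distances and the Gaussian kernel form are simplifying assumptions for illustration; an operational model would use geographic distances and normalized rates.

```python
import numpy as np

def adaptive_bandwidths(xs, ys, n_neighbor=3):
    """Adaptive smoothing distance per epicenter: the distance to the
    n-th nearest neighboring epicenter, so dense clusters get short
    bandwidths and sparse regions get long ones."""
    pts = np.column_stack([xs, ys])
    d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))
    d_sorted = np.sort(d, axis=1)          # column 0 is the self-distance (0)
    return d_sorted[:, n_neighbor]

def smoothed_rate(grid_pts, epicenters, bandwidths):
    """Relative rate at grid points: a sum of 2-D Gaussian kernels, one
    per event, each with its own adaptive bandwidth."""
    rate = np.zeros(len(grid_pts))
    for (x, y), h in zip(epicenters, bandwidths):
        r2 = (grid_pts[:, 0] - x) ** 2 + (grid_pts[:, 1] - y) ** 2
        rate += np.exp(-r2 / (2 * h ** 2)) / (2 * np.pi * h ** 2)
    return rate
```

The fixed-bandwidth alternative is the same kernel sum with a single smoothing distance for every event, which is what the joint likelihood test compares against.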

  3. Comparison of smoothing methods for the development of a smoothed seismicity model for Alaska and the implications for seismic hazard

    USGS Publications Warehouse

    Moschetti, Morgan P.; Mueller, Charles S.; Boyd, Oliver S.; Petersen, Mark D.

    2014-01-01

In anticipation of the update of the Alaska seismic hazard maps (ASHMs) by the U. S. Geological Survey, we report progress on the comparison of smoothed seismicity models developed using fixed and adaptive smoothing algorithms, and investigate the sensitivity of seismic hazard to the models. While fault-based sources, such as those for great earthquakes in the Alaska-Aleutian subduction zone and for the ~10 shallow crustal faults within Alaska, dominate the seismic hazard estimates for locations near the sources, smoothed seismicity rates make important contributions to seismic hazard away from fault-based sources and where knowledge of recurrence and magnitude is not sufficient for use in hazard studies. Recent developments in adaptive smoothing methods and statistical tests for evaluating and comparing rate models prompt us to investigate the appropriateness of adaptive smoothing for the ASHMs. We develop smoothed seismicity models for Alaska using fixed and adaptive smoothing methods and compare the resulting models by calculating and evaluating the joint likelihood test. We use the earthquake catalog, and associated completeness levels, developed for the 2007 ASHM to produce fixed-bandwidth-smoothed models with smoothing distances varying from 10 to 100 km, as well as adaptively smoothed models. Adaptive smoothing follows the method of Helmstetter et al. and defines a unique smoothing distance for each earthquake epicenter from the distance to the nth nearest neighbor. The consequence of the adaptive smoothing method is to reduce smoothing distances where seismicity rates are high, causing locally increased rates, and to increase smoothing distances where seismicity is sparse. We follow guidance from previous studies to optimize the neighbor number (n-value) by comparing model likelihood values, which estimate the likelihood that the observed earthquake epicenters from the recent catalog are derived from the smoothed rate models.
We compare likelihood values from all rate models to rank the smoothing methods. We find that adaptively smoothed seismicity models yield better likelihood values than the fixed smoothing models. Holding all other (source and ground motion) models constant, we calculate seismic hazard curves for all points across Alaska on a 0.1 degree grid, using the adaptively smoothed and fixed smoothed seismicity models separately. Because adaptively smoothed models concentrate seismicity near the earthquake epicenters where seismicity rates are high, the corresponding hazard values are higher, locally, but reduced with distance from observed seismicity, relative to the hazard from fixed-bandwidth models. We suggest that adaptively smoothed seismicity models be considered for implementation in the update to the ASHMs because of their improved likelihood estimates relative to fixed smoothing methods; however, concomitant increases in seismic hazard will cause significant changes in regions of high seismicity, such as near the subduction zone, northeast of Kotzebue, and along the NNE trending zone of seismicity in the Alaskan interior.

  4. Evaluation of tunnel seismic prediction (TSP) result using the Japanese highway rock mass classification system for Pahang-Selangor Raw Water Transfer Tunnel

    NASA Astrophysics Data System (ADS)

    Von, W. C.; Ismail, M. A. M.

    2017-10-01

Knowledge of the geological profile ahead of the tunnel face is important for minimizing risk in tunnel excavation work and for cost control of preventive measures. Because the Pahang-Selangor Raw Water Transfer project crosses mountainous terrain, site investigation by vertical boring is impractical for obtaining the geological profile. Hence, the tunnel seismic prediction (TSP) method was adopted to predict the geological profile ahead of the tunnel face. To evaluate the TSP results, IBM SPSS Statistics 22 was used to run an artificial neural network (ANN) analysis that back-calculates predicted Rock Grade Points (JH) from actual Rock Grade Points (JH) using Vp, Vs and Vp/Vs from TSP. The results show good correlation between predicted and actual Rock Grade Points (JH). In other words, TSP can usefully predict the geological profile ahead of the tunnel face while allowing TBM excavation to continue. Identifying weak zones or faults ahead of the tunnel face is crucial so that preventive measures can be carried out in advance for safer tunnel excavation.
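The back-calculation idea - predicting Rock Grade Points from Vp, Vs and Vp/Vs - can be illustrated without SPSS. The study used an ANN; the sketch below substitutes an ordinary least-squares fit on invented velocity and grade values, purely to show the workflow.

```python
import numpy as np

# Hypothetical TSP measurements: Vp and Vs (km/s) for several tunnel
# sections, with the Rock Grade Points (JH) assigned from face mapping.
# All numbers are illustrative, not from the study.
vp = np.array([4.8, 5.2, 3.9, 5.5, 4.2, 5.0])
vs = np.array([2.7, 3.0, 2.1, 3.2, 2.3, 2.9])
jh = np.array([62.0, 70.0, 45.0, 78.0, 50.0, 66.0])

# Design matrix [1, Vp, Vs, Vp/Vs]; the paper used an ANN in SPSS, but a
# least-squares fit demonstrates the same back-calculation idea.
X = np.column_stack([np.ones_like(vp), vp, vs, vp / vs])
coef, *_ = np.linalg.lstsq(X, jh, rcond=None)

predicted = X @ coef
# correlation between predicted and assigned rock grade points
r = np.corrcoef(predicted, jh)[0, 1]
print(f"fit coefficients: {coef}, correlation r = {r:.3f}")
```

A high correlation on held-out sections, rather than on the fitting data as here, would be the meaningful validation in practice.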

  5. Global earthquake catalogs and long-range correlation of seismic activity (Invited)

    NASA Astrophysics Data System (ADS)

    Ogata, Y.

    2009-12-01

For the study of long-term seismic activity in the world, homogeneity of a global catalog is indispensable. Engdahl and Villaseñor (2002) compiled a global catalog of earthquakes of magnitude (M) 7.0 or larger during the last century (1900-1999). This catalog is based on various existing catalogs, such as the Abe catalog (Abe, 1981, 1984; Abe and Noguchi, 1983a, b) of world seismicity (1894-1980), its modifications by Perez and Scholz (1984) and by Pacheco and Sykes (1992), and the Harvard University catalog since 1975. However, the original surface-wave magnitudes of the Abe catalog were systematically changed by Perez and Scholz (1984) and Pacheco and Sykes (1992). They suspected inhomogeneity of the Abe catalog and claimed that two apparent changes in the occurrence rate, around 1922 and 1948, resulted from magnitude shifts due to instrumental causes. Their statistical test assumed that such a series of large earthquakes worldwide should behave as a stationary Poisson process (uniform occurrence). Their claim therefore depends strongly on an a priori assumption of independent, or at most short-range dependent, earthquake occurrence. We question this assumption from the viewpoint of long-range dependence of seismicity. We carry out statistical analyses of the spectrum, dispersion-time diagrams, and R/S statistics for estimating and testing long-range correlations. We also show that the apparent rate change in global seismicity can be simulated by a suitable long-range correlated process. Further, if we divide the globe into two regions of high and low latitudes, for example, the cumulative curves of the two regions differ in shape, and the above-mentioned apparent change points disappear from both regions. 
This suggests that the Abe catalog records genuine seismic activity rather than artifacts of the suspected magnitude shifts, which should appear in any sufficiently wide region. We also use a local catalog for the wide region around Japan (Utsu, 1982a, b; Japan Meteorological Agency), which covers the period 1885-1999, is complete for M >= 6.0, and accounts for about 10% of world seismicity. The synchronous variation of seismic frequency in the high-latitude area of the world and in the region around Japan, obtained from independent catalogs, is suggestive of an external effect, such as a large-scale motion of the earth, rather than the presupposed inhomogeneity of the catalogs.
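Of the long-range-dependence diagnostics listed (spectrum, dispersion-time diagrams, R/S), the R/S statistic is the easiest to sketch. Below is a minimal Hurst-exponent estimate assuming dyadic chunk sizes and a simple log-log fit; it is a crude estimator, biased for short series, and not the authors' implementation.

```python
import numpy as np

def rescaled_range(x):
    """R/S statistic of a 1-D series: range of the mean-adjusted cumulative
    sum divided by the standard deviation."""
    x = np.asarray(x, dtype=float)
    z = np.cumsum(x - x.mean())
    return (z.max() - z.min()) / x.std()

def hurst_exponent(x, min_chunk=8):
    """Crude Hurst estimate: slope of log(R/S) vs log(n) over dyadic chunk
    sizes. H ~ 0.5 for an independent series, H > 0.5 for long-range
    positive correlation."""
    x = np.asarray(x, dtype=float)
    sizes, rs = [], []
    n = min_chunk
    while n <= len(x) // 2:
        # non-overlapping chunks of length n, averaged R/S per size
        chunks = [x[i:i + n] for i in range(0, len(x) - n + 1, n)]
        rs.append(np.mean([rescaled_range(c) for c in chunks]))
        sizes.append(n)
        n *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope
```

Applied to an interevent-rate series, an estimate well above 0.5 would support long-range correlation over the Poisson alternative the earlier studies assumed.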

  6. Estimation of anisotropy parameters in organic-rich shale: Rock physics forward modeling approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herawati, Ida, E-mail: ida.herawati@students.itb.ac.id; Winardhi, Sonny; Priyono, Awali

Anisotropy analysis is an important step in the processing and interpretation of seismic data. One of the most important tasks in anisotropy analysis is estimation of the anisotropy parameters, which can be done using well data, core data, or seismic data. With seismic data, anisotropy parameter calculation is generally based on velocity moveout analysis; however, its accuracy depends on data quality, available offset, and the velocity moveout picking. Anisotropy estimation using seismic data is needed to obtain wide coverage of a particular layer's anisotropy. In an anisotropic reservoir, analysis of anisotropy parameters also helps us better understand the reservoir characteristics. Anisotropy parameters, especially ε, are related to rock properties and lithology determination. This research aims to estimate anisotropy parameters from seismic data and to integrate well data, with a case study in a potential shale gas reservoir. Due to the complexity of organic-rich shale reservoirs, extensive study from different disciplines is needed to understand them. Shale itself has intrinsic anisotropy caused by the lamination of its constituent minerals. In order to link rock physics with seismic response, it is necessary to build a forward model of organic-rich shale. This paper focuses on the relationships between reservoir properties such as clay content, porosity, and total organic content, and anisotropy. Organic content, which defines the prospectivity of shale gas, can be treated as solid background, solid inclusion, or both. The forward modeling results show that the presence of organic matter increases anisotropy in shale. The relationships between total organic content and other seismic properties, such as acoustic impedance and Vp/Vs, are also presented.
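The anisotropy parameters referred to above (ε in particular) are the Thomsen (1986) parameters, which follow directly from the elastic stiffnesses of a VTI medium. A sketch with illustrative shale-like stiffness values; the numbers are not from this study.

```python
# Thomsen (1986) anisotropy parameters for a VTI medium from the elastic
# stiffnesses Cij (any consistent units, e.g. GPa).
def thomsen(c11, c33, c44, c66, c13):
    eps = (c11 - c33) / (2.0 * c33)          # P-wave anisotropy
    gam = (c66 - c44) / (2.0 * c44)          # S-wave anisotropy
    dlt = ((c13 + c44) ** 2 - (c33 - c44) ** 2) / (
        2.0 * c33 * (c33 - c44))             # near-vertical moveout term
    return eps, gam, dlt

# illustrative shale-like stiffnesses, GPa (not from the study)
eps, gam, dlt = thomsen(c11=34.3, c33=22.7, c44=5.4, c66=10.5, c13=10.7)
print(eps, gam, dlt)
```

Positive ε, as produced here, corresponds to horizontally faster P-wave propagation, the behavior laminated, organic-rich shale is expected to show.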

  7. Post-blasting seismicity in Rudna copper mine, Poland - source parameters analysis.

    NASA Astrophysics Data System (ADS)

    Caputa, Alicja; Rudziński, Łukasz; Talaga, Adam

    2017-04-01

A major hazard in Polish copper mines is high seismicity and the associated rockbursts. Many methods are used to reduce the seismic hazard; among the most effective is preventive blasting in potentially hazardous mining panels. The method is expected to provoke small to moderate tremors (up to M 2.0) and thereby reduce stress accumulation in the rock mass. This work presents an analysis of post-blasting events in the Rudna copper mine, Poland. Using full moment tensor (MT) inversion and seismic spectral analysis, we look for characteristic features of post-blasting seismic sources. Source parameters estimated for post-blasting events are compared with those of non-provoked mining events that occurred in the vicinity of the provoked sources. Our studies show that the focal mechanisms of events occurring after blasts have similar MT decompositions, namely a fairly strong isotropic component compared with that of non-provoked events. Source parameters obtained from spectral analysis also show that provoked seismicity has a distinctive source physics, visible, among other features, in the S-to-P wave energy ratio, which is higher for non-provoked events. The comparison of all our results reveals three possible groups of sources: (a) events occurring just after blasts, (b) events occurring from 5 min to 24 h after blasts, and (c) non-provoked seismicity (more than 24 h after blasting). Acknowledgements: This work was supported within statutory activities No. 3841/E-41/S/2016 of the Ministry of Science and Higher Education of Poland.
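The three source groups reduce to a simple classification by time elapsed since the blast, using the 5 min and 24 h thresholds quoted above. A sketch that assumes the event does not precede the blast; the group labels are mine, not the authors'.

```python
from datetime import datetime, timedelta

def classify_event(blast_time, event_time):
    """Sort an event into the three source groups used in the study:
    'immediate' (just after the blast, here within 5 min), 'delayed'
    (5 min to 24 h after), or 'not-provoked' (more than 24 h after).
    Assumes event_time >= blast_time."""
    dt = event_time - blast_time
    if dt < timedelta(minutes=5):
        return "immediate"
    if dt <= timedelta(hours=24):
        return "delayed"
    return "not-provoked"

blast = datetime(2016, 3, 1, 12, 0)
print(classify_event(blast, datetime(2016, 3, 1, 12, 2)))  # immediate
```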

  8. Incorporating induced seismicity in the 2014 United States National Seismic Hazard Model: results of the 2014 workshop and sensitivity studies

    USGS Publications Warehouse

    Petersen, Mark D.; Mueller, Charles S.; Moschetti, Morgan P.; Hoover, Susan M.; Rubinstein, Justin L.; Llenos, Andrea L.; Michael, Andrew J.; Ellsworth, William L.; McGarr, Arthur F.; Holland, Austin A.; Anderson, John G.

    2015-01-01

The U.S. Geological Survey National Seismic Hazard Model for the conterminous United States was updated in 2014 to account for new methods, input models, and data necessary for assessing the seismic ground shaking hazard from natural (tectonic) earthquakes. The U.S. Geological Survey National Seismic Hazard Model project uses probabilistic seismic hazard analysis to quantify the rate of exceedance for earthquake ground shaking (ground motion). For the 2014 National Seismic Hazard Model assessment, the seismic hazard from potentially induced earthquakes was intentionally not considered because we had not determined how to properly treat these earthquakes in the seismic hazard analysis. The phrases “potentially induced” and “induced” are used interchangeably in this report; however, it is acknowledged that this classification is based on circumstantial evidence and scientific judgment. For the 2014 National Seismic Hazard Model update, the potentially induced earthquakes were removed from the NSHM’s earthquake catalog, and the documentation states that we would consider alternative models for including induced seismicity in a future version of the National Seismic Hazard Model. As part of the process of incorporating induced seismicity into the seismic hazard model, we evaluate the sensitivity of the seismic hazard from induced seismicity to five parts of the hazard model: (1) the earthquake catalog, (2) earthquake rates, (3) earthquake locations, (4) earthquake Mmax (maximum magnitude), and (5) earthquake ground motions. We describe alternative input models for each of the five parts that represent differences in scientific opinion on induced seismicity characteristics. In this report, however, we do not weight these input models to produce a preferred final model. Instead, we present a sensitivity study showing uniform seismic hazard maps obtained by applying the alternative input models for induced seismicity. 
The final model will be released after further consideration of the reliability and scientific acceptability of each alternative input model. Forecasting the seismic hazard from induced earthquakes is fundamentally different from forecasting the seismic hazard for natural, tectonic earthquakes. This is because the spatio-temporal patterns of induced earthquakes are reliant on economic forces and public policy decisions regarding extraction and injection of fluids. As such, the rates of induced earthquakes are inherently variable and nonstationary. Therefore, we only make maps based on an annual rate of exceedance rather than the 50-year rates calculated for previous U.S. Geological Survey hazard maps.
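The switch from 50-year to annual exceedance maps described above follows directly from the Poisson hazard formulation. A minimal sketch; the 1/475-per-year rate is the familiar example hazard level, not a value from the report.

```python
import math

def exceedance_probability(annual_rate, years):
    """Poisson probability of at least one exceedance of a ground-motion
    level within a time window, given its annual rate of exceedance."""
    return 1.0 - math.exp(-annual_rate * years)

# An annual exceedance rate of 1/475 per year gives the familiar
# ~10%-in-50-years hazard level used for tectonic hazard maps; for
# rapidly changing induced seismicity, a 1-year probability is mapped
# instead, since the rate cannot be assumed stationary over 50 years.
p50 = exceedance_probability(1.0 / 475.0, 50)
p1 = exceedance_probability(1.0 / 475.0, 1)
print(p50, p1)
```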

  9. Elements of the tsunami precursors' detection physics

    NASA Astrophysics Data System (ADS)

    Novik, Oleg; Ruzhin, Yuri; Ershov, Sergey; Volgin, Max; Smirnov, Fedor

In accordance with basic physical principles and geophysical data, we formulated a nonlinear mathematical model of seismo-hydro-electromagnetic (EM) field interaction and calculated the generation and propagation of elastic, EM, temperature, and hydrodynamic seismically generated disturbances (signals) in the basin of a marginal sea. We show the transfer of seismic and EM energy from the upper mantle beneath the sea into its depths, and EM emission from the sea surface into the atmosphere. Based on the calculated characteristics of signals of different physical nature (the computations are consistent with measurements by other authors), we develop the design of a Lithosphere-Ocean-Atmosphere Monitoring System (LOAMS) comprising a bottom complex, a moored ocean-surface buoy complex, an observational balloon complex, and a satellite complex. The underwater stations of the bottom complex will record the earliest signals of seismic activation beneath the seafloor (according to the above calculations, the ULF EM signals outrun the seismic ones) and localize the seafloor epicenter of an expected seaquake. These stations will be equipped with, among other instruments, magnetometers, lines for electric field measurements, and magneto-telluric blocks to track the dynamics of physical parameters beneath the sea floor as signs of seaquake and/or tsunami preparation. The buoy and balloon complexes will record variations of meteorological and oceanographic parameters, including changes of reflection from the sea surface (tsunami ‘shadows’) caused by tsunami wave propagation. The cables of the balloon and moored buoy will serve as receiving antennas and for multidisciplinary measurements, including field gradients (we show that cases are possible in which the first seismic EM signal is registered by an antenna above the sea). 
The project also includes radio tomography with satellite instrumentation and sounding of the ionosphere from the buoy, balloon, and satellite complexes. The balloon and buoy complexes will transmit data to a shore station over a satellite link. The frequency ranges and sensitivity thresholds of all LOAMS sensors will be adapted to the characteristics of the expected seismic signals according to the numerical research above. Computational methods and statistical analysis of the recorded multidimensional time series (e.g., seismically induced changes in coherence between spatially distributed sensors of different types) will be used for prognostic interpretation. The multilevel recordings will provide detection of seismic events that is stable against noise (e.g., ionospheric Pc pulsations, heavy seas, industry). The intensive heat flow typical of tectonically active lithosphere zones may be considered an energy source for advanced modifications of the LOAMS, which may then serve as a warning system for continental and marine technologies, e.g., sea-bottom geothermal energy production. Indeed, the seismic destruction of the Fukushima I nuclear power station demonstrates that such technology is hardly able to solve the energy problems of seismically active regions. On the other hand, the LOAMS may be considered a scientific observatory for the development of seaquake/tsunami precursor physics, i.e., seismo-hydro-electromagnetics.

  10. Nowcasting Earthquakes: A Comparison of Induced Earthquakes in Oklahoma and at the Geysers, California

    NASA Astrophysics Data System (ADS)

    Luginbuhl, Molly; Rundle, John B.; Hawkins, Angela; Turcotte, Donald L.

    2018-01-01

Nowcasting is a new method of statistically classifying seismicity and seismic risk (Rundle et al. 2016). In this paper, the method is applied to the induced seismicity at the Geysers geothermal region in California and to the induced seismicity due to fluid injection in Oklahoma. Nowcasting utilizes the catalogs of seismicity in these regions. Two earthquake magnitudes are selected, one large, say Mλ ≥ 4, and one small, say Mσ ≥ 2. The method utilizes the number of small earthquakes that occur between pairs of large earthquakes, and obtains the cumulative probability distribution of these interevent counts. The earthquake potential score (EPS) is then defined by the number of small earthquakes that have occurred since the last large earthquake: the point where this number falls on the cumulative distribution of interevent counts defines the EPS. A major advantage of nowcasting is that it utilizes "natural time", the earthquake count between events, rather than clock time. Thus, it is not necessary to decluster aftershocks, and the results remain applicable if the level of induced seismicity varies in time. The application of natural time to the accumulation of seismic hazard depends on the applicability of Gutenberg-Richter (GR) scaling: the increasing number of small earthquakes that occur after a large earthquake can be scaled to give the risk of a large earthquake occurring. To illustrate our approach, we utilize the number of Mσ ≥ 2.75 earthquakes in Oklahoma to nowcast the number of Mλ ≥ 4.0 earthquakes in Oklahoma. The applicability of the scaling is illustrated during the rapid build-up of injection-induced seismicity between 2012 and 2016 and the subsequent reduction in seismicity associated with a reduction in fluid injection. The same method is applied to the geothermal-induced seismicity at the Geysers, California, for comparison.
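The EPS construction described above can be stated compactly: it is the empirical cumulative distribution of historical interevent counts, evaluated at the count accumulated since the last large event. A sketch with hypothetical counts; the numbers are invented for illustration.

```python
import numpy as np

def earthquake_potential_score(interevent_counts, current_count):
    """EPS in the nowcasting sense: the fraction of historical
    small-earthquake counts between successive large events that are
    less than or equal to the count accumulated since the last large
    earthquake (i.e. the empirical CDF at current_count)."""
    counts = np.sort(np.asarray(interevent_counts))
    return np.searchsorted(counts, current_count, side="right") / len(counts)

# hypothetical counts of small (e.g. M >= 2) shocks between successive
# large (e.g. M >= 4) events
history = [12, 40, 7, 55, 23, 31, 18, 60, 9, 44]
print(earthquake_potential_score(history, 35))
```

An EPS near 1 means the current cycle has already accumulated more small events than almost all past cycles, i.e. the region is far along in natural time toward the next large event.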

  11. 5 years of continuous seismic monitoring of a mountain river in the Pyrenees

    NASA Astrophysics Data System (ADS)

    Diaz, Jordi; Sanchez-Pastor, Pilar S.; Gallart, Josep

    2017-04-01

The analysis of background seismic noise variations near river channels has proven to be a useful tool for monitoring river flow, even for modest discharges. Nevertheless, such monitoring is usually carried out with temporary deployments of seismic stations. The CANF broad-band seismic station, acquiring data continuously since 2010 and located inside an old railway tunnel in the Central Pyrenees, about 400 m from the Aragón River channel, provides an excellent opportunity to extend this approach and present long-term monitoring of a mountain river. Seismic signals in the 2-10 Hz band clearly related to river discharge have been identified in the records. Discharge increases due to rainfall, large storms resulting in floods, and snowmelt periods can be discriminated from the seismic data. So far, two large rainfall events resulting in large discharge and damaging floods have been recorded; both share similar properties, which can be used to implement automatic procedures for seismically identifying potentially damaging floods. Another natural process that can be characterized using continuously acquired seismic data is mountain snowmelt, which produces characteristic discharge patterns identifiable in the seismic data. The timing and intensity of the snowmelt stages can be identified for each season, and the 5 seasons available so far compared to detect possible trends. So-called fluvial seismology can also provide important clues for evaluating bedload transport in rivers, an important parameter for estimating erosion rates in mountain environments. By analyzing both the amplitude and frequency variations of the seismic data and their hysteresis cycles, it seems possible to estimate the relative contributions of water flow and bedload transport to the seismic signal. 
The available results suggest that most of the river-generated seismic signal is related to bedload transport, while water turbulence is only significant above a discharge threshold. Since 2015 we have been operating 2 additional stations located beside the Cinca and Segre Rivers, also in the Pyrenean range. First results confirm that the river-generated signal can also be identified at these sites, although wind-related signals are recorded in a nearby frequency band, and some further analysis is required to discriminate between the two processes. (Funding: MISTERIOS project, CGL2013-48601-C2-1-R)
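Tracking river noise comes down to measuring seismic power in the 2-10 Hz band. A minimal FFT-based sketch with a synthetic check; the band edges come from the abstract, everything else is illustrative.

```python
import numpy as np

def band_power(trace, fs, fmin=2.0, fmax=10.0):
    """Mean spectral power of a seismic trace in a frequency band, via the
    FFT. The 2-10 Hz band is the one linked to river discharge above."""
    trace = np.asarray(trace, dtype=float)
    spec = np.abs(np.fft.rfft(trace - trace.mean())) ** 2
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fs)
    sel = (freqs >= fmin) & (freqs <= fmax)
    return spec[sel].mean()

# synthetic check: a 5 Hz oscillation carries far more 2-10 Hz power
# than one at 20 Hz
fs = 100.0
t = np.arange(0, 10, 1 / fs)
in_band = band_power(np.sin(2 * np.pi * 5 * t), fs)
out_band = band_power(np.sin(2 * np.pi * 20 * t), fs)
print(in_band > out_band)
```

In a monitoring setting this quantity would be computed on successive windows (hourly, say) and its time series compared against gauged discharge.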

  12. Infrasonic and seismic signals from earthquakes and explosions observed with Plostina seismo-acoustic array

    NASA Astrophysics Data System (ADS)

    Ghica, D.; Ionescu, C.

    2012-04-01

The Plostina seismo-acoustic array has recently been deployed by the National Institute for Earth Physics in the central part of Romania, near the Vrancea epicentral area. The array has a 2.5 km aperture and consists of 7 seismic sites (PLOR) and 7 collocated infrasound instruments (IPLOR). The array is being used to assess the value of collocated seismic and acoustic sensors for (1) seismic monitoring of local and regional events, and (2) acoustic measurement, consisting of detection of infrasound events (explosions, mine and quarry blasts, earthquakes, aircraft, etc.). This paper focuses on the characterization of infrasonic and seismic signals from earthquakes and explosions (accidental and mining related). Two Vrancea earthquakes with magnitude above 5.0 were selected for this study: one that occurred on 1 May 2011 (MD = 5.3, h = 146 km) and another on 4 October 2011 (MD = 5.2, h = 142 km). The infrasonic signals from the earthquakes resemble the vertical component of the seismic signals. Because the mechanism of infrasonic wave formation is the coupling of seismic waves with the atmosphere, the trace velocity values of such signals are compatible with the characteristics of the various seismic phases observed with the PLOR array. The study also evaluates and characterizes infrasound and seismic data recorded from the explosion caused by the military accident at the Evangelos Florakis Naval Base, Cyprus, on 11 July 2011. Additionally, seismo-acoustic signals presumed to be related to strong mine and quarry blasts were investigated; ground truth from mine observations validates this interpretation. 
The combined seismo-acoustic analysis uses two types of detectors for signal identification: the automatic detector DFX-PMCC, applied for infrasound detection and characterization, and, for the seismic data, array processing techniques (beamforming and frequency-wavenumber analysis). Spectrograms of the recorded infrasonic and seismic data were examined, showing that an earthquake produces acoustic signals with high energy in the 1 to 5 Hz frequency range, while for the explosion this range lies below 0.6 Hz. Using the combined analysis of seismic and acoustic data, the Plostina array can greatly enhance event detection and localization in the region. The analysis can also be particularly important in identifying sources of industrial explosions and, therefore, in monitoring the hazard created both by earthquakes and by anthropogenic sources of pollution (chemical factories, nuclear and power plants, refineries, mines).

  13. Seismic data compression speeds exploration projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galibert, P.Y.

As part of an ongoing commitment to ensure industry-wide distribution of its revolutionary seismic data compression technology, Chevron Petroleum Technology Co. (CPTC) has entered into licensing agreements with Compagnie Generale de Geophysique (CGG) and other seismic contractors for use of its software in oil and gas exploration programs. CPTC expects use of the technology to reach all of its industry partners involved in seismic data collection, processing, analysis, and storage. Here, CGG, one of the world's leading seismic acquisition and processing companies, describes its success in applying the new methodology to replace full on-board seismic processing. Chevron's technology is already being applied on large offshore 3-D seismic surveys. Worldwide, CGG has acquired more than 80,000 km of seismic data using the data compression technology.

  14. Sequential geophysical and flow inversion to characterize fracture networks in subsurface systems

    DOE PAGES

    Mudunuru, Maruti Kumar; Karra, Satish; Makedonska, Nataliia; ...

    2017-09-05

Subsurface applications, including geothermal, geological carbon sequestration, and oil and gas, typically involve maximizing either the extraction of energy or the storage of fluids. Fractures form the main pathways for flow in these systems, and locating these fractures is critical for predicting flow. However, fracture characterization is a highly uncertain process, and data from multiple sources, such as flow and geophysical data, are needed to reduce this uncertainty. We present a nonintrusive, sequential inversion framework for integrating data from geophysical and flow sources to constrain fracture networks in the subsurface. In this framework, we first estimate bounds on the statistics for the fracture orientations using microseismic data. These bounds are estimated through a combination of focal mechanism analysis (a physics-based approach) and clustering analysis (a statistical approach) of seismic data. Then, the fracture lengths are constrained using flow data. Finally, the efficacy of this inversion is demonstrated through a representative example.
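The clustering step of the framework is not specified in detail here; as a stand-in illustration, a minimal 1-D k-means over hypothetical strike angles shows how microseismic-derived orientations could be separated into fracture sets. The use of k-means, the angle values, and k=2 are all assumptions.

```python
import numpy as np

def kmeans_1d(values, k=2, iters=50, seed=0):
    """Minimal 1-D k-means: alternate nearest-center assignment and
    center recomputation. A stand-in for the clustering applied to
    fracture orientation statistics."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(values, size=k, replace=False).astype(float)
    for _ in range(iters):
        # assign each value to its nearest center
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        # move each center to the mean of its members
        for j in range(k):
            if np.any(labels == j):
                centers[j] = values[labels == j].mean()
    return np.sort(centers)

# hypothetical strike angles (degrees) from two fracture sets
strikes = np.array([28., 31., 25., 30., 118., 122., 115., 120.])
print(kmeans_1d(strikes, k=2))
```

A real orientation clustering would work on the circle (strikes wrap at 180°); the linear version suffices here because the two hypothetical sets are far from the wrap-around.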

  15. Sequential geophysical and flow inversion to characterize fracture networks in subsurface systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mudunuru, Maruti Kumar; Karra, Satish; Makedonska, Nataliia

Subsurface applications, including geothermal, geological carbon sequestration, and oil and gas, typically involve maximizing either the extraction of energy or the storage of fluids. Fractures form the main pathways for flow in these systems, and locating these fractures is critical for predicting flow. However, fracture characterization is a highly uncertain process, and data from multiple sources, such as flow and geophysical data, are needed to reduce this uncertainty. We present a nonintrusive, sequential inversion framework for integrating data from geophysical and flow sources to constrain fracture networks in the subsurface. In this framework, we first estimate bounds on the statistics for the fracture orientations using microseismic data. These bounds are estimated through a combination of focal mechanism analysis (a physics-based approach) and clustering analysis (a statistical approach) of seismic data. Then, the fracture lengths are constrained using flow data. Finally, the efficacy of this inversion is demonstrated through a representative example.

  16. Impact of Demographic Siting Criteria and Environmental Suitability on Land Availability for Nuclear Reactor Siting

    NASA Technical Reports Server (NTRS)

    Hansen, K. L.

    1982-01-01

The effect of population and certain environmental characteristics on the availability of land for siting nuclear power plants was assessed. The study area, consisting of the 48 contiguous states, was divided into 5 kilometer (km) square grid cells, yielding a total of 600,000 cells. Through the use of a modern geographic information system, it was possible to provide a detailed analysis of a very large area. Numerous maps and statistical tables were produced, the detail of which was limited only by the available data. Evaluation issues included population density, restricted lands, seismic hardening, site preparation, water availability, and cost factors.

  17. Structural Identification And Seismic Analysis Of An Existing Masonry Building

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Del Monte, Emanuele; Galano, Luciano; Ortolani, Barbara

    2008-07-08

The paper presents the diagnostic investigation and seismic analysis performed on an ancient masonry building in Florence. The building is of historical interest and is subject to conservation restrictions. The investigation involves a preliminary phase of research into the historic documents and a second phase of in situ and laboratory tests to determine the mechanical characteristics of the masonry. This investigation was designed to reach the 'LC2 Knowledge Level' and to allow non-linear pushover analysis according to the new Italian Standards for the seismic upgrading of existing masonry buildings.

  18. Results of application of automatic computation of static corrections on data from the South Banat Terrain

    NASA Astrophysics Data System (ADS)

    Milojević, Slavka; Stojanovic, Vojislav

    2017-04-01

Owing to the continuous development of seismic acquisition and processing methods, increasing the signal-to-noise ratio is a perennial goal. Correct application of the latest software solutions improves the processing results and justifies their development. Correct computation and application of static corrections is one of the most important tasks in pre-processing, and this phase is of great importance for subsequent processing steps. Static corrections are applied to seismic data to compensate for the effects of irregular topography, for differences between the elevations of source and receiver points relative to the reduction datum, for the near-surface low-velocity layer (weathering correction), and for any other factors that influence the spatial and temporal position of seismic traces. The refraction statics method is the most common method for computing static corrections. It is successful both in resolving long-period statics problems and in determining statics differences caused by abrupt lateral velocity changes in the near-surface layer. XtremeGeo Flatirons™ is a program whose main purpose is the computation of static corrections by the refraction statics method; it supports first-arrival picking, geometry checking, multiple methods for the analysis and modelling of statics, refractor anisotropy analysis, and tomography (Eikonal tomography). The exploration area is located on the southern edge of the Pannonian Plain, in flat terrain with altitudes of 50 to 195 meters. The largest part of the exploration area covers the Deliblato Sands, where the geological structure of the terrain and large differences in altitude significantly affect the calculation of static corrections. 
The Flatirons™ software has powerful visualization and statistical analysis tools, which contribute to significantly more accurate assessment of the near-surface geometry and therefore to more accurately computed static corrections.
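The elevation component of a static correction reduces to a simple time shift. A sketch using elevations in the 50-195 m range quoted for the survey area, with an assumed datum and replacement velocity (both illustrative, not from the study).

```python
def elevation_static(elevation, datum, replacement_velocity):
    """Elevation static correction in seconds: the time shift that moves a
    source or receiver from its surface elevation to the reduction datum,
    assuming a replacement velocity for the removed column. Negative
    values shift the trace earlier. Units: metres and m/s."""
    return (datum - elevation) / replacement_velocity

# a receiver at 195 m, an assumed 50 m datum, 2000 m/s replacement velocity
t = elevation_static(195.0, 50.0, 2000.0)
print(f"{t * 1000:.1f} ms")
```

The weathering (refraction) statics computed by a package such as Flatirons are added on top of this purely geometric term.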

  19. Refining locations of the 2005 Mukacheve, West Ukraine, earthquakes based on similarity of their waveforms

    NASA Astrophysics Data System (ADS)

    Gnyp, Andriy

    2009-06-01

Based on the results of applying correlation analysis to records of the 2005 Mukacheve group of recurrent events, and on their subsequent relocation relative to the reference event of 7 July 2005, we conclude that all the events most likely occurred on the same rupture plane. Station terms have been estimated for seismic stations of the Transcarpathians, accounting for the variation of seismic velocities beneath their locations relative to the travel-time tables used in the study. Methodologically, the study demonstrates the potential and usefulness of correlation analysis of seismic records for more detailed study of seismic processes, tectonics, and geodynamics of the Carpathian region.
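The similarity measure underlying such correlation analysis is normalized waveform cross-correlation: recurrent events on the same rupture plane produce nearly identical traces at a common station, with the peak lag giving the relative delay. A self-contained sketch; the synthetic wavelet and 3-sample delay are illustrative.

```python
import numpy as np

def max_norm_xcorr(a, b):
    """Maximum normalized cross-correlation between two traces.
    Returns (peak value in [-1, 1], lag in samples); a negative lag
    means the second trace is delayed relative to the first."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    cc = np.correlate(a, b, mode="full")
    k = np.argmax(cc)
    return cc[k], k - (len(b) - 1)

# a synthetic "repeating event": the same wavelet delayed by 3 samples
t = np.linspace(0, 1, 200)
w = np.exp(-60 * (t - 0.3) ** 2) * np.sin(40 * t)
shifted = np.roll(w, 3)
peak, lag = max_norm_xcorr(w, shifted)
print(peak, lag)
```

In relative relocation, such sub-sample-refined lags between event pairs at several stations replace absolute picks, which is what suppresses the common path and station errors.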

  20. Forecasting induced seismicity rate and Mmax using calibrated numerical models

    NASA Astrophysics Data System (ADS)

    Dempsey, D.; Suckale, J.

    2016-12-01

At Groningen, The Netherlands, several decades of induced seismicity from gas extraction have culminated in an M 3.6 event (mid 2012). From a public safety and commercial perspective, it is desirable to anticipate future seismicity outcomes at Groningen. One way to quantify earthquake risk is Probabilistic Seismic Hazard Analysis (PSHA), which requires an estimate of the future seismicity rate and its magnitude-frequency distribution (MFD). This approach is effective at quantifying risk from tectonic events because the seismicity rate, once measured, is almost constant over the timescales of interest. In contrast, rates of induced seismicity vary significantly over building lifetimes, largely in response to changes in injection or extraction. Thus, the key to extending PSHA to induced earthquakes is to estimate future changes of the seismicity rate in response to a proposed operating schedule. Numerical models can describe the physical link between fluid pressure, effective stress change, and the earthquake process (triggering and propagation). However, models with predictive potential for individual earthquakes face the difficulty of characterizing site-specific heterogeneity - stress, strength, roughness, etc. - at the locations of interest. Modeling catalogs of earthquakes provides a means of averaging over this uncertainty, focusing instead on the collective features of the seismicity, e.g., its rate and MFD. The model we use incorporates fluid pressure and stress changes to describe the nucleation and crack-like propagation of earthquakes on stochastically characterized 1D faults. This enables simulation of synthetic catalogs of induced seismicity from which the seismicity rate, locations, and MFD are extracted. A probability distribution for Mmax - the largest event in some specified time window - is also computed. 
Because the model captures the physics linking seismicity to changes in the reservoir, earthquake observations and operating information can be used to calibrate a model at a specific site (or, ideally, many models). This restricts analysis of future seismicity to likely parameter sets and provides physical justification for linking operational changes to subsequent seismicity. To illustrate these concepts, a recent study of prior and forecast seismicity at Groningen will be presented.
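The Mmax distribution mentioned above can be illustrated generically: given a forecast rate and a Gutenberg-Richter MFD, Monte Carlo sampling of synthetic catalogs yields the distribution of the largest event in a window. This is not the authors' physics-based model; the rate, b-value, and window below are invented.

```python
import numpy as np

def mmax_distribution(rate, b, m_min, years, n_sim=10000, seed=0):
    """Monte Carlo distribution of the largest magnitude in a time window,
    for Poisson occurrence above m_min with Gutenberg-Richter magnitudes.
    rate: events/yr with M >= m_min; b: GR b-value. Returns n_sim samples
    of Mmax (m_min is returned for windows with no events)."""
    rng = np.random.default_rng(seed)
    counts = rng.poisson(rate * years, size=n_sim)
    beta = b * np.log(10.0)  # GR magnitudes are exponential above m_min
    out = np.empty(n_sim)
    for i, n in enumerate(counts):
        if n == 0:
            out[i] = m_min
        else:
            mags = m_min + rng.exponential(1.0 / beta, size=n)
            out[i] = mags.max()
    return out

# illustrative forecast: 50 M>=2 events/yr, b = 1, over a 5-year window
samples = mmax_distribution(rate=50.0, b=1.0, m_min=2.0, years=5.0)
print(np.percentile(samples, 50), np.percentile(samples, 95))
```

In the calibrated-model setting described above, the Poisson rate would itself come from the simulated catalogs under a proposed operating schedule rather than being fixed in advance.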
