Science.gov

Sample records for accurate earthquake locations

  1. One dimensional P wave velocity structure of the crust beneath west Java and accurate hypocentre locations from local earthquake inversion

    SciTech Connect

    Supardiyono; Santosa, Bagus Jaya

    2012-06-20

    A one-dimensional (1-D) velocity model and station corrections for the West Java zone were computed by inverting P-wave arrival times recorded on a local seismic network of 14 stations. A total of 61 local events with a minimum of 6 P-phases, an rms of 0.56 s and a maximum gap of 299° were selected. Comparison with previous earthquake locations shows an improvement for the relocated earthquakes. Tests were carried out to verify the robustness of the inversion results in order to corroborate the conclusions drawn from our research. The obtained minimum 1-D velocity model can be used to improve routine earthquake locations and represents a further step toward more detailed seismotectonic studies in this area of West Java.

  2. Earthquake location in island arcs

    USGS Publications Warehouse

    Engdahl, E.R.; Dewey, J.W.; Fujita, K.

    1982-01-01

    -velocity lithospheric slab. In application, JHD has the practical advantage that it does not require the specification of a theoretical velocity model for the slab. Considering earthquakes within a 260 km long by 60 km wide section of the Aleutian main thrust zone, our results suggest that the theoretical velocity structure of the slab is presently not sufficiently well known that accurate locations can be obtained independently of locally recorded data. Using a locally recorded earthquake as a calibration event, JHD gave excellent results over the entire section of the main thrust zone here studied, without showing a strong effect that might be attributed to spatially varying source-station anomalies. We also calibrated the ray-tracing method using locally recorded data and obtained results generally similar to those obtained by JHD. © 1982.

  3. Wave-equation Based Earthquake Location

    NASA Astrophysics Data System (ADS)

    Tong, P.; Yang, D.; Yang, X.; Chen, J.; Harris, J.

    2014-12-01

    Precisely locating earthquakes is fundamentally important for studying earthquake physics, fault orientations and Earth's deformation. In industry, accurately determining the hypocenters of microseismic events triggered in the course of a hydraulic fracturing treatment can help improve the production of oil and gas from unconventional reservoirs. We develop a novel earthquake location method based on solving full wave equations to accurately locate earthquakes (including microseismic earthquakes) in complex and heterogeneous structures. Traveltime residuals or differential traveltime measurements obtained with the waveform cross-correlation technique are iteratively inverted to obtain the locations of earthquakes. The inversion process involves the computation of the Fréchet derivative with respect to the source (earthquake) location via the interaction between a forward wavefield emitted from the source to the receiver and an adjoint wavefield propagating in reverse from the receiver to the source. When there is a source perturbation, the Fréchet derivative measures not only the influence of source location but also the effects of heterogeneity, anisotropy and attenuation of the subsurface structure on the arrival of seismic waves at the receiver. This is essential for the accuracy of earthquake location in complex media. In addition, to reduce the computational cost, we can first assume that seismic waves propagate only in a vertical plane passing through the source and the receiver. The forward wavefield, adjoint wavefield and Fréchet derivative with respect to the source location are all computed in a 2D vertical plane. By transferring the Fréchet derivative along the horizontal direction of the 2D plane into components along latitude and longitude coordinates or local 3D Cartesian coordinates, the source location can be updated in a 3D geometry. The earthquake location obtained with this combined 2D-3D approach can then be used as the initial location for a true 3D wave

  4. Acoustic wave-equation-based earthquake location

    NASA Astrophysics Data System (ADS)

    Tong, Ping; Yang, Dinghui; Liu, Qinya; Yang, Xu; Harris, Jerry

    2016-04-01

    We present a novel earthquake location method using acoustic wave-equation-based traveltime inversion. The linear relationship between the location perturbation (δt0, δxs) and the resulting traveltime residual δt of a particular seismic phase, represented by the traveltime sensitivity kernel K(t0, xs) with respect to the earthquake location (t0, xs), is theoretically derived based on the adjoint method. Traveltime sensitivity kernel K(t0, xs) is formulated as a convolution between the forward and adjoint wavefields, which are calculated by numerically solving two acoustic wave equations. The advantage of this newly derived traveltime kernel is that it not only takes into account the earthquake-receiver geometry but also accurately honours the complexity of the velocity model. The earthquake location is obtained by solving a regularized least-squares problem. In 3-D realistic applications, it is computationally expensive to conduct full wave simulations. Therefore, we propose a 2.5-D approach which assumes the forward and adjoint wave simulations within a 2-D vertical plane passing through the earthquake and receiver. Various synthetic examples show the accuracy of this acoustic wave-equation-based earthquake location method. The accuracy and efficiency of the 2.5-D approach for 3-D earthquake location are further verified by its application to the 2004 Big Bear earthquake in Southern California.
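
    The core of each iteration described above is a small regularized least-squares problem that maps the traveltime residuals into a perturbation of origin time and hypocenter through the sensitivity kernels. The Python sketch below illustrates only that algebraic step under simplifying assumptions; it is not the authors' implementation, the kernel matrix here is synthetic, and all names and the damping value are placeholders.

        import numpy as np

        def update_location(K, residuals, damping=1.0):
            """One damped least-squares update of (dt0, dxs, dys, dzs).

            K         : (n_stations, 4) traveltime sensitivities with respect to
                        origin time and the three location coordinates (in a
                        wave-equation method these would come from adjoint kernels).
            residuals : (n_stations,) observed-minus-predicted traveltimes (s).
            damping   : Tikhonov regularization weight (assumed, not from the paper).
            """
            A = K.T @ K + damping * np.eye(K.shape[1])
            b = K.T @ residuals
            return np.linalg.solve(A, b)  # perturbation (dt0, dxs, dys, dzs)

        # Toy usage: six stations with a made-up kernel matrix and residual vector.
        rng = np.random.default_rng(0)
        K = np.hstack([np.ones((6, 1)), rng.normal(scale=0.2, size=(6, 3))])
        res = rng.normal(scale=0.05, size=6)
        print(update_location(K, res))

    In the full method the kernel rows would be recomputed from new forward and adjoint simulations after each update, and the iteration stopped once the residuals no longer decrease.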

  5. Precisely locating the Klamath Falls, Oregon, earthquakes

    USGS Publications Warehouse

    Qamar, A.; Meagher, K.L.

    1993-01-01

    In this article we present preliminary results of a close-in, instrumental study of the Klamath Falls earthquake sequence, carried out as a cooperative effort by scientists from the U.S. Geological Survey (USGS) and universities in Washington, Oregon, and California. In addition to obtaining much more accurate earthquake locations, this study has improved our understanding of the relationship between seismicity and mapped faults in the region.

  6. Probabilistic earthquake location and 3-D velocity models in routine earthquake location

    NASA Astrophysics Data System (ADS)

    Lomax, A.; Husen, S.

    2003-12-01

    Earthquake monitoring agencies, such as local networks or CTBTO, are faced with the dilemma of providing routine earthquake locations in near real-time with high precision and meaningful uncertainty information. Traditionally, routine earthquake locations are obtained from linearized inversion using layered seismic velocity models. This approach is fast and simple. However, uncertainties derived from a linear approximation to a set of non-linear equations can be imprecise, unreliable, or even misleading. In addition, 1-D velocity models are a poor approximation to real Earth structure in tectonically complex regions. In this paper, we discuss the routine location of earthquakes in near real-time with high precision using non-linear, probabilistic location methods and 3-D velocity models. The combination of non-linear, global search algorithms with probabilistic earthquake location provides a fast and reliable tool for earthquake location that can be used with any kind of velocity model. The probabilistic solution to the earthquake location includes a complete description of location uncertainties, which may be irregular and multimodal. We present applications of this approach to determine seismicity in Switzerland and in Yellowstone National Park, WY. Comparing our earthquake locations to earthquake locations obtained using linearized inversion and 1-D velocity models clearly demonstrates the advantages of probabilistic earthquake location and 3-D velocity models. For example, the more complete and reliable uncertainty information of non-linear, probabilistic earthquake location greatly facilitates the identification of poorly constrained hypocenters. Such events are often not identified in linearized earthquake location, since the location uncertainties are determined with a simplified, localized and approximate Gaussian statistic.
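
    A minimal illustration of the probabilistic, global-search flavour of this approach (not the operational software) is sketched below in Python: travel times from every node of a coarse grid to every station are computed for an assumed uniform velocity standing in for a 3-D model, and each node is assigned a posterior probability proportional to exp(-misfit/2) of the origin-time-reduced residuals, so the output is a full, possibly multimodal, location PDF rather than a single point. The velocity, geometry and noise level sigma are all invented for the example.

        import numpy as np

        V = 6.0  # km/s, uniform velocity standing in for a 3-D model

        def location_pdf(grid_xyz, stations_xyz, picks, sigma=0.2):
            """Posterior probability over grid nodes given P picks (seconds).
            The unknown origin time is removed by demeaning the residuals."""
            d = np.linalg.norm(grid_xyz[:, None, :] - stations_xyz[None, :, :], axis=-1)
            tt = d / V                                   # predicted travel times
            resid = picks[None, :] - tt                  # still contains origin time
            resid -= resid.mean(axis=1, keepdims=True)   # remove it per node
            misfit = (resid ** 2).sum(axis=1) / sigma ** 2
            pdf = np.exp(-0.5 * (misfit - misfit.min()))
            return pdf / pdf.sum()

        # Toy usage: 8 surface stations, true source at (10, -5, 12) km, origin time 3 s.
        rng = np.random.default_rng(5)
        stations = np.column_stack([rng.uniform(-30, 30, 8), rng.uniform(-30, 30, 8), np.zeros(8)])
        true = np.array([10.0, -5.0, 12.0])
        picks = np.linalg.norm(stations - true, axis=1) / V + 3.0
        gx, gy, gz = np.meshgrid(np.arange(-20, 21, 2.0), np.arange(-20, 21, 2.0), np.arange(0, 21, 2.0))
        grid = np.column_stack([gx.ravel(), gy.ravel(), gz.ravel()])
        pdf = location_pdf(grid, stations, picks)
        print(grid[np.argmax(pdf)])  # node of highest posterior probability, near the true source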

  7. Optimizing correlation techniques for improved earthquake location

    USGS Publications Warehouse

    Schaff, D.P.; Bokelmann, G.H.R.; Ellsworth, W.L.; Zanzerkia, E.; Waldhauser, F.; Beroza, G.C.

    2004-01-01

    Earthquake location using relative arrival time measurements can lead to dramatically reduced location errors and a view of fault-zone processes with unprecedented detail. There are two principal reasons why this approach reduces location errors. The first is that the use of differenced arrival times to solve for the vector separation of earthquakes removes from the earthquake location problem much of the error due to unmodeled velocity structure. The second reason, on which we focus in this article, is that waveform cross correlation can substantially reduce measurement error. While cross correlation has long been used to determine relative arrival times with subsample precision, we extend correlation measurements to less similar waveforms, and we introduce a general quantitative means to assess when correlation data provide an improvement over catalog phase picks. We apply the technique to local earthquake data from the Calaveras Fault in northern California. Tests for an example streak of 243 earthquakes demonstrate that relative arrival times with normalized cross correlation coefficients as low as ~70%, interevent separation distances as large as 2 km, and magnitudes up to 3.5 as recorded on the Northern California Seismic Network are more precise than relative arrival times determined from catalog phase data. Also discussed are improvements made to the correlation technique itself. We find that for large time offsets, our implementation of time-domain cross correlation is often more robust and that it recovers more observations than the cross spectral approach. Longer time windows give better results than shorter ones. Finally, we explain how thresholds and empirical weighting functions may be derived to optimize the location procedure for any given region of interest, taking advantage of the respective strengths of diverse correlation and catalog phase data on different length scales.
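
    As a rough illustration of the measurement step only, the Python sketch below cross-correlates two windowed waveforms and refines the lag to subsample precision by fitting a parabola to the correlation peak. This is a generic textbook procedure rather than the authors' code; the sampling interval, pulse shape and window choices are arbitrary.

        import numpy as np

        def relative_delay(x, y, dt):
            """Time shift of y relative to x (seconds) from normalized
            cross-correlation, refined by parabolic interpolation of the peak."""
            x = (x - x.mean()) / x.std()
            y = (y - y.mean()) / y.std()
            cc = np.correlate(y, x, mode="full") / len(x)
            k = np.argmax(cc)
            # Three-point parabolic interpolation around the discrete maximum.
            if 0 < k < len(cc) - 1:
                denom = cc[k - 1] - 2 * cc[k] + cc[k + 1]
                frac = 0.5 * (cc[k - 1] - cc[k + 1]) / denom if denom != 0 else 0.0
            else:
                frac = 0.0
            lag_samples = k - (len(x) - 1) + frac
            return lag_samples * dt, cc[k]

        # Toy example: a Ricker-like pulse shifted by 0.013 s, sampled at 100 Hz.
        dt = 0.01
        t = np.arange(0, 2, dt)
        pulse = lambda t0: (1 - 2 * (np.pi * 5 * (t - t0)) ** 2) * np.exp(-(np.pi * 5 * (t - t0)) ** 2)
        shift, ccmax = relative_delay(pulse(1.0), pulse(1.013), dt)
        print(f"measured shift ~ {shift:.4f} s, peak correlation {ccmax:.2f}")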

  8. Accurate source location from P waves scattered by surface topography

    NASA Astrophysics Data System (ADS)

    Wang, N.; Shen, Y.

    2015-12-01

    Accurate source locations of earthquakes and other seismic events are fundamental in seismology. The location accuracy is limited by several factors, including velocity models, which are often poorly known. In contrast, surface topography, the largest velocity contrast in the Earth, is often precisely mapped at the seismic wavelength (> 100 m). In this study, we explore the use of P-coda waves generated by scattering at surface topography to obtain high-resolution locations of near-surface seismic events. The Pacific Northwest region is chosen as an example. The grid search method is combined with a 3D strain Green's tensor database method to improve the search efficiency as well as the quality of the hypocenter solution. The strain Green's tensor is calculated by the 3D collocated-grid finite difference method on curvilinear grids. Solutions in the search volume are then obtained based on the least-square misfit between the 'observed' and predicted P and P-coda waves. A 95% confidence interval of the solution is also provided as an a posteriori error estimation. We find that the scattered waves are mainly due to topography in comparison with random velocity heterogeneity characterized by the von Kármán-type power spectral density function. When only P wave data is used, the 'best' solution is offset from the real source location mostly in the vertical direction. The incorporation of P coda significantly improves solution accuracy and reduces its uncertainty. The solution remains robust with a range of random noises in data, un-modeled random velocity heterogeneities, and uncertainties in moment tensors that we tested.

  9. Accurate source location from waves scattered by surface topography

    NASA Astrophysics Data System (ADS)

    Wang, Nian; Shen, Yang; Flinders, Ashton; Zhang, Wei

    2016-06-01

    Accurate source locations of earthquakes and other seismic events are fundamental in seismology. The location accuracy is limited by several factors, including velocity models, which are often poorly known. In contrast, surface topography, the largest velocity contrast in the Earth, is often precisely mapped at the seismic wavelength (>100 m). In this study, we explore the use of P coda waves generated by scattering at surface topography to obtain high-resolution locations of near-surface seismic events. The Pacific Northwest region is chosen as an example to provide realistic topography. A grid search algorithm is combined with the 3-D strain Green's tensor database to improve search efficiency as well as the quality of hypocenter solutions. The strain Green's tensor is calculated using a 3-D collocated-grid finite difference method on curvilinear grids. Solutions in the search volume are obtained based on the least squares misfit between the "observed" and predicted P and P coda waves. The 95% confidence interval of the solution is provided as an a posteriori error estimation. For shallow events tested in the study, scattering is mainly due to topography in comparison with stochastic lateral velocity heterogeneity. The incorporation of P coda significantly improves solution accuracy and reduces solution uncertainty. The solution remains robust with wide ranges of random noises in data, unmodeled random velocity heterogeneities, and uncertainties in moment tensors. The method can be extended to locate pairs of sources in close proximity by differential waveforms using source-receiver reciprocity, further reducing errors caused by unmodeled velocity structures.

  10. Waveform Cross-Correlation for Improved North Texas Earthquake Locations

    NASA Astrophysics Data System (ADS)

    Phillips, M.; DeShon, H. R.; Oldham, H. R.; Hayward, C.

    2014-12-01

    In November 2013, a sequence of earthquakes began in Reno and Azle, TX, two communities located northwest of Fort Worth in an area of active oil and gas extraction. Only one felt earthquake had been reported within the area before the occurrence of probable injection-induced earthquakes at the Dallas-Fort Worth airport in 2008. The USGS National Earthquake Information Center (NEIC) has reported 27 felt earthquakes in the Reno-Azle area through January 28, 2014. A temporary seismic network was installed beginning in December 2013 to acquire data to improve location and magnitude estimates and characterize the earthquake sequence. Here, we present high-resolution relative earthquake locations derived using differential time data from waveform cross-correlation. Cross-correlation is computed using the GISMO software suite and event relocation is done using double difference relocation techniques. Waveform cross-correlation of the local data indicates high (>70%) similarity between 4 major swarms of events lasting between 18 and 24 hours. These swarms are temporal zones of high event frequency; 1.4% of the time series data accounts for 42.1% of the identified local earthquakes. Local earthquakes are occurring along the Newark East Fault System, a NE-SW striking normal fault system previously thought inactive, at depths between 2 and 8 km in the Ellenburger limestone formation and underlying Precambrian basement. Data analysis is ongoing and continued characterization of the associated fault will provide improved location estimates.

  11. Improved Teleseismic Locations of Shallow Subduction Zone Earthquakes

    NASA Astrophysics Data System (ADS)

    Bisrat, S. T.; Deshon, H. R.; Engdahl, E. R.; Bilek, S. L.

    2009-12-01

    Improved precision teleseismic earthquake locations in subduction zones are being used to better understand shallow megathrust frictional conditions and determine the global distribution of tsunami earthquakes. Most global teleseismic catalogs fail to accurately locate shallow subduction zone earthquakes, especially mid-magnitude events, leading to increased error in determining source time functions useful for identifying tsunami earthquakes. The Engdahl, van der Hilst and Buland (EHB) method has addressed this problem in part by including the teleseismic depth phases pP, pwP and sP in the relocation algorithm. The EHB catalog relies on phase times reported to the ISC and NEIC, but additional high quality depth phase onsets can be incorporated in the relocation procedure to enhance the robustness of individual locations. We present improvements to an automated frequency-based picker that identifies depth phases not reported in the standard catalogs. The revised autopicker uses abrupt amplitude changes of the power spectral density (PSD) function calculated at optimized frequencies for each waveform. It is being used to pick onsets for P and the depth phases pP, pwP or sP for inclusion in the EHB phase catalog. In the case of events with an emergent P-wave onset or with a complex waveform consisting of sub-events, the autopicker may either overlook a relatively small change in frequency of the first arrival or misidentify the onset arrival time of associated later arrivals, leading to erroneous results. We track those waveforms by comparing the difference of the P-wave arrival time from ISC/NEIC and the autopicker. The phase arrivals can then be adjusted manually, as they usually make up only a few percent of the whole dataset. Epicentral changes following relocation using additional depth phases are generally small (<5 km). Changes in depth may be on the order of tens of km for some events, though the standard deviation of depth changes within each subduction zone is ~5 km. We
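
    The picking idea described above, flagging abrupt changes in band-limited spectral power, can be caricatured with the short Python sketch below, which marks samples where short-term power jumps relative to the long-term level (a crude STA/LTA on squared amplitude standing in for the optimized-frequency PSD measure). Window lengths, threshold and the synthetic trace are all placeholders, and the real autopicker is considerably more elaborate.

        import numpy as np

        def power_jump_picks(trace, dt, short_win=1.0, long_win=10.0, threshold=4.0):
            """Candidate onsets where short-term power rises sharply above the
            long-term average (crude stand-in for the PSD-change criterion)."""
            power = trace ** 2
            ns, nl = int(short_win / dt), int(long_win / dt)
            sta = np.convolve(power, np.ones(ns) / ns, mode="same")
            lta = np.convolve(power, np.ones(nl) / nl, mode="same") + 1e-12
            ratio = sta / lta
            onsets = np.flatnonzero((ratio[1:] >= threshold) & (ratio[:-1] < threshold)) + 1
            return onsets * dt  # candidate pick times in seconds

        # Toy trace: noise with two "arrivals" 8 s apart, sampled at 20 Hz.
        rng = np.random.default_rng(1)
        dt = 0.05
        trace = rng.normal(scale=0.1, size=2000)
        trace[400:440] += 1.0   # direct P
        trace[560:600] += 0.8   # a later depth phase
        print(power_jump_picks(trace, dt))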

  12. Nonexposure accurate location K-anonymity algorithm in LBS.

    PubMed

    Jia, Jinying; Zhang, Fengli

    2014-01-01

    This paper tackles location privacy protection in current location-based services (LBS), where mobile users have to report their exact location information to an LBS provider in order to obtain their desired services. Location cloaking has been proposed and well studied to protect user privacy. It blurs the user's accurate coordinate and replaces it with a well-shaped cloaked region. However, to obtain such an anonymous spatial region (ASR), nearly all existing cloaking algorithms require knowing the accurate locations of all users. Therefore, location cloaking without exposing the user's accurate location to any party is urgently needed. In this paper, we present two such nonexposure accurate location cloaking algorithms. They are designed for K-anonymity, and cloaking is performed based on the identifications (IDs) of the grid areas reported by all the users, instead of directly on their accurate coordinates. Experimental results show that our algorithms are more secure than the existing cloaking algorithms, do not need all the users to report their locations all the time, and can generate smaller ASRs. PMID:24605060
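
    A bare-bones sketch of the grid-ID idea (illustrative only, not the algorithms of the paper) is given below in Python: each user reports only the ID of the grid cell containing it, and the anonymizer grows a square block of cells around the querying user's cell until at least K reported users fall inside, returning that block as the cloaked region. The cell size, growth strategy and user coordinates are arbitrary choices for the example.

        from collections import Counter

        CELL = 0.01  # grid cell size in degrees (arbitrary for this sketch)

        def cell_id(lat, lon):
            """Grid-cell ID a user reports instead of exact coordinates."""
            return (int(lat // CELL), int(lon // CELL))

        def cloak(user_cell, reported_cells, k=5, max_radius=50):
            """Grow a square block of cells around user_cell until it contains
            at least k reported users; return the block's cell-ID bounds."""
            counts = Counter(reported_cells)
            ci, cj = user_cell
            for r in range(max_radius + 1):
                block = [(i, j) for i in range(ci - r, ci + r + 1)
                                for j in range(cj - r, cj + r + 1)]
                if sum(counts[c] for c in block) >= k:
                    return (ci - r, cj - r), (ci + r, cj + r)
            return None  # requested anonymity not reachable within max_radius

        # Toy usage: ten users report cell IDs only; cloak the first of them.
        users = [(39.90 + 0.002 * i, 116.40 + 0.003 * i) for i in range(10)]
        ids = [cell_id(lat, lon) for lat, lon in users]
        print(cloak(ids[0], ids, k=5))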

  13. Nonexposure Accurate Location K-Anonymity Algorithm in LBS

    PubMed Central

    2014-01-01

    This paper tackles location privacy protection in current location-based services (LBS), where mobile users have to report their exact location information to an LBS provider in order to obtain their desired services. Location cloaking has been proposed and well studied to protect user privacy. It blurs the user's accurate coordinate and replaces it with a well-shaped cloaked region. However, to obtain such an anonymous spatial region (ASR), nearly all existing cloaking algorithms require knowing the accurate locations of all users. Therefore, location cloaking without exposing the user's accurate location to any party is urgently needed. In this paper, we present two such nonexposure accurate location cloaking algorithms. They are designed for K-anonymity, and cloaking is performed based on the identifications (IDs) of the grid areas reported by all the users, instead of directly on their accurate coordinates. Experimental results show that our algorithms are more secure than the existing cloaking algorithms, do not need all the users to report their locations all the time, and can generate smaller ASRs. PMID:24605060

  14. How well do earthquake locations forecast future ones?

    NASA Astrophysics Data System (ADS)

    González, Álvaro

    2016-04-01

    It is debated whether the spatial distribution of past earthquakes is a good predictor of the locations of future ones. This is especially discussed for intraplate regions, where the few large earthquakes might alternate from one location to another, instead of recurring at the same sites where previous ones originated. This work points out that this debate may well have a geometric solution, and that the crucial issue is how many earthquakes are available for analysis. If earthquakes recurred exactly at the same locations, past epicentres would perfectly forecast the sites of future ones. In the opposite case, if epicentres were distributed with uniform probability over an area, past earthquake locations would be uninformative about future ones. Reality lies in an intermediate case, in which earthquakes group in space (approximately in a fractal or multifractal way). So earthquakes in general tend to occur close to previous ones, but not necessarily at the same sites. The smaller the fractal dimension of this spatial distribution, the closer to each other earthquakes tend to occur, and the better past earthquake locations forecast future ones. Here, a simple spatial forecast method is extensively used to test to what extent past epicentres forecast the locations of future ones. The method calculates maps of spatial probabilities based on the empirical distribution of nearest-neighbour distances between epicentres. According to these maps, earthquakes are more likely to occur in the vicinity of past ones. As new earthquakes happen, the maps improve and self-sharpen. This method has no free parameters, and assigns equal weight to the location of any past earthquake, regardless of its magnitude or origin time. Tests are made with complete earthquake catalogues for different tectonic environments, and with up to tens of thousands of events, in: the whole Earth; Southern California (a transcurrent plate boundary); the Iberian Region (a "diffuse" plate boundary
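
    A bare-bones version of such a nearest-neighbour forecast can be written in a few lines of Python, as sketched below: the empirical distribution of nearest-neighbour distances within the past catalogue is used to score each node of a target grid by how close it lies to a past epicentre. This is only an illustration of the idea; the actual method, its normalization and its testing are as described in the abstract, and the catalogue here is synthetic.

        import numpy as np

        def nn_forecast(past_xy, grid_xy):
            """Relative probability at each grid node, from the empirical
            distribution of nearest-neighbour distances among past epicentres:
            nodes closer than typical inter-event spacing score higher."""
            past = np.asarray(past_xy, float)
            grid = np.asarray(grid_xy, float)

            # Nearest-neighbour distances within the past catalogue.
            d_pp = np.linalg.norm(past[:, None, :] - past[None, :, :], axis=-1)
            np.fill_diagonal(d_pp, np.inf)
            nn_past = d_pp.min(axis=1)

            # Distance from each grid node to its nearest past epicentre.
            d_gp = np.linalg.norm(grid[:, None, :] - past[None, :, :], axis=-1)
            nn_grid = d_gp.min(axis=1)

            # Fraction of past nearest-neighbour distances exceeding each node's distance.
            score = (nn_past[None, :] > nn_grid[:, None]).mean(axis=1)
            total = score.sum()
            return score / total if total > 0 else score

        # Toy usage: a clustered synthetic catalogue, scored on a 21 x 21 grid.
        rng = np.random.default_rng(2)
        past = rng.normal(loc=[0.0, 0.0], scale=0.1, size=(200, 2))
        gx, gy = np.meshgrid(np.linspace(-1, 1, 21), np.linspace(-1, 1, 21))
        grid = np.column_stack([gx.ravel(), gy.ravel()])
        probs = nn_forecast(past, grid)
        print(probs.reshape(21, 21)[10, 10], probs.max())  # centre node scores near the maximum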

  15. The travel-time sequence method for rapid earthquake locating in Taiwan

    NASA Astrophysics Data System (ADS)

    Tsai, Cheng-Yung; Lin, Ting-Li; Wu, Yih-Min

    2015-04-01

    Taiwan is constantly threatened by large and damaging earthquakes as the tectonic consequence of the persistent collision between the Philippine Sea Plate and the Eurasian Plate. Nowadays, the earthquake early warning (EEW) system is one of the practical tools for seismic hazard mitigation, and has been developed in Taiwan for almost 20 years (Wu et al., 1997; Wu et al., 2000). Earthquake location for EEW purposes in Taiwan is based on the traditional method with a 1-D velocity structure but uses fewer stations. In this study, we developed a new EEW locating method using a 3-D velocity structure and a pre-calculated travel-time database. The seismic network used in this study is the Rapid Earthquake Information Release System (RTD; Wu et al., 1997; Wu et al., 2000) operated by the Central Weather Bureau, Taiwan. We divided the Taiwan area (119°-123°E, 21°-26°N) into a 2×2 km grid, and each grid point is assumed to be a hypocenter with a constant focal depth of 10 km. Therefore, each grid point has its specific travel-time sequence for the RTD stations computed with the 3-D velocity model (Wu et al., 2009). When an earthquake occurs, we compare the arrival sequence of the first ten stations with the travel-time sequence database and take the grid point with the least difference as the hypocenter. By using the travel-time sequence method, we can rapidly determine the earthquake location more accurately than the present method in Taiwan
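
    The matching step of the travel-time sequence method can be illustrated with the small Python sketch below (not the operational code): each candidate grid point stores the order in which the network stations should be reached, and the observed order of the first few triggered stations is compared against those stored sequences, the grid point with the smallest rank mismatch being taken as the location. The station geometry and the uniform velocity below are invented stand-ins for the RTD network and the 3-D travel-time database.

        import numpy as np

        def build_sequences(grid_pts, stations, v=6.0):
            """For each grid point, the order in which stations are reached,
            assuming a uniform velocity v (stand-in for 3-D travel times)."""
            d = np.linalg.norm(grid_pts[:, None, :] - stations[None, :, :], axis=-1)
            return np.argsort(d / v, axis=1)          # station indices, earliest first

        def locate(observed_order, sequences, n_first=10):
            """Grid point whose predicted arrival order best matches the order
            of the first n_first triggered stations (smallest rank mismatch)."""
            obs = list(observed_order[:n_first])
            costs = []
            for seq in sequences:
                rank = {s: r for r, s in enumerate(seq)}
                pred = np.array([rank[s] for s in obs])
                relative = np.argsort(np.argsort(pred))          # 0..n-1 if orders agree
                costs.append(np.abs(relative - np.arange(len(obs))).sum())
            return int(np.argmin(costs))

        # Toy usage: 15 stations, a coarse 6 x 6 grid, "true" source at grid node 7.
        rng = np.random.default_rng(3)
        stations = rng.uniform(0, 100, size=(15, 2))
        gx, gy = np.meshgrid(np.linspace(0, 100, 6), np.linspace(0, 100, 6))
        grid = np.column_stack([gx.ravel(), gy.ravel()])
        seqs = build_sequences(grid, stations)
        observed = list(seqs[7][:10])                  # arrival order generated from node 7
        print("best grid node:", locate(observed, seqs))  # should recover node 7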

  16. Improved Epicentral Locations for Earthquakes Near Explorer Ridge

    NASA Astrophysics Data System (ADS)

    Clemens-Sewall, D.; Trehu, A. M.

    2014-12-01

    The tectonics and structure of the Explorer region, which is the northern boundary of the subducting Juan de Fuca plate, help to inform our assessments of the seismic hazard in the Pacific Northwest. Our understanding of this tectonically complex area is largely based on morphology of the seafloor from swath bathymetric data, potential field anomalies, and the calculated locations of contemporary earthquakes in the region. However, the Navy Sound Surveillance System hydrophone network, the Canadian National Seismic Network, the U.S. Advanced National Seismic System, and the Harvard Centroid Moment Tensor Catalog report significantly different epicentral locations for swarms of earthquakes near Explorer Ridge in August and October 2008. We relocated the larger (M>5) earthquakes in the August 2008 swarm using data from both U.S. and Canadian networks to improve azimuthal coverage. Absolute locations were determined for the largest events in the swarm, and the smaller events were relocated relative to the largest using a double difference method. To better understand why the locations from land-based seismic networks differ from those computed from the hydrophone arrays, we also examine T-phases from regional events recorded on Ocean Bottom Seismometers from the COLZA and Cascadia Initiative experiments and evaluate the potential for using T-phases to improve the epicentral locations of submarine earthquakes in the Pacific Northwest region.

  17. Accurate eye center location through invariant isocentric patterns.

    PubMed

    Valenti, Roberto; Gevers, Theo

    2012-09-01

    Locating the center of the eyes allows for valuable information to be captured and used in a wide range of applications. Accurate eye center location can be determined using commercial eye-gaze trackers, but additional constraints and expensive hardware make these existing solutions unattractive and impossible to use on standard (i.e., visible wavelength), low-resolution images of eyes. Systems based solely on appearance are proposed in the literature, but their accuracy does not allow us to accurately locate and distinguish eye center movements in these low-resolution settings. Our aim is to bridge this gap by locating the center of the eye within the area of the pupil on low-resolution images taken from a webcam or a similar device. The proposed method makes use of isophote properties to gain invariance to linear lighting changes (contrast and brightness), to achieve in-plane rotational invariance, and to keep computational costs low. To further gain scale invariance, the approach is applied to a scale space pyramid. In this paper, we extensively test our approach for its robustness to changes in illumination, head pose, scale, occlusion, and eye rotation. We demonstrate that our system can achieve a significant improvement in accuracy over state-of-the-art techniques for eye center location in standard low-resolution imagery. PMID:22813958

  18. Relative earthquake location for remote offshore and tectonically active continental regions using surface waves

    NASA Astrophysics Data System (ADS)

    Cleveland, M.; Ammon, C. J.; Vandemark, T. F.

    2015-12-01

    Earthquake locations are a fundamental parameter necessary for reliable seismic monitoring and seismic event characterization. Within dense continental seismic networks, event locations can be accurately and precisely estimated. However, for many regions of interest, existing catalog data and traditional location methods provide neither accurate nor precise hypocenters. In particular, for isolated continental and offshore areas, seismic event locations are estimated primarily using distant observations, often resulting in inaccurate and imprecise locations. The use of larger, moderate-size events is critical to the construction of useful travel-time corrections in regions of strong geologic heterogeneity. Double difference methods applied to cross-correlation measured Rayleigh and Love wave time shifts are an effective tool at providing improved epicentroid locations and relative origin-time shifts in these regions. Previous studies have applied correlation of R1 and G1 waveforms to moderate-magnitude vertical strike-slip transform-fault and normal faulting earthquakes from nearby ridges. In this study, we explore the utility of phase-match filtering techniques applied to surface waves to improve cross-correlation measurements, particularly for smaller magnitude seismic events. We also investigate the challenges associated with applying surface-wave location methods to shallow earthquakes in tectonically active continental regions.

  19. Locating earthquakes with surface waves and centroid moment tensor estimation

    NASA Astrophysics Data System (ADS)

    Wei, Shengji; Zhan, Zhongwen; Tan, Ying; Ni, Sidao; Helmberger, Don

    2012-04-01

    Traditionally, P wave arrival times have been used to locate regional earthquakes. In contrast, the travel times of surface waves depend on source excitation, and the source parameters and depth must be determined independently. Thus surface wave path delays need to be known before such data can be used for location. These delays can be estimated from previous earthquakes using the cut-and-paste technique, ambient seismic noise tomography, and from 3D models. Taking the Chino Hills event as an example, we show consistency of path corrections for (>10 s) Love and Rayleigh waves to within about 1 s obtained from these methods. We then use these empirically derived delay maps to determine centroid locations of 138 Southern California moderate-sized (3.5 < Mw < 5.7) earthquakes using surface waves alone. It appears that these methods are capable of locating the main zone of rupture within a few (˜3) km accuracy relative to Southern California Seismic Network locations with 5 stations that are well distributed in azimuth. We also address the timing accuracy required to resolve non-double-couple source parameters, which trades off with location, with less than a km of error required for a 10% Compensated Linear Vector Dipole resolution.

  20. Accurate tremor locations from coherent S and P waves

    NASA Astrophysics Data System (ADS)

    Armbruster, John G.; Kim, Won-Young; Rubin, Allan M.

    2014-06-01

    Nonvolcanic tremor is an important component of the slow slip processes which load faults from below, but accurately locating tremor has proven difficult because tremor rarely contains clear P or S wave arrivals. Here we report the observation of coherence in the shear and compressional waves of tremor at widely separated stations, which allows us to detect and accurately locate tremor events. An event detector using data from two stations sees the onset of tremor activity in the Cascadia tremor episodes of February 2003, July 2004, and September 2005 and confirms the previously reported south to north migration of the tremor. Event detectors using data from three and four stations give S and P arrival times of high accuracy. The hypocenters of the tremor events fall at depths of ˜30 to ˜40 km and define a narrow plane dipping at a shallow angle to the northeast, consistent with the subducting plate interface. The S wave polarizations and P wave first motions define a source mechanism in agreement with the northeast convergence seen in geodetic observations of slow slip. Tens of thousands of locations determined by constraining the events to the plate interface show tremor sources highly clustered in space with a strongly similar pattern of sources in the three episodes examined. The deeper sources generate tremor in minor episodes as well. The extent to which the narrow bands of tremor sources overlap between the three major episodes suggests relative epicentral location errors as small as 1-2 km.

  1. Magnitude and location of historical earthquakes in Japan and implications for the 1855 Ansei Edo earthquake

    USGS Publications Warehouse

    Bakun, W.H.

    2005-01-01

    Japan Meteorological Agency (JMA) intensity assignments IJMA are used to derive intensity attenuation models suitable for estimating the location and an intensity magnitude Mjma for historical earthquakes in Japan. The intensity for shallow crustal earthquakes on Honshu is equal to -1.89 + 1.42MJMA - 0.00887Δh - 1.66 log Δh, where MJMA is the JMA magnitude, Δh = (Δ² + h²)^1/2, and Δ and h are epicentral distance and focal depth (km), respectively. Four earthquakes located near the Japan Trench were used to develop a subducting plate intensity attenuation model where intensity is equal to -8.33 + 2.19MJMA - 0.00550Δh - 1.14 log Δh. The IJMA assignments for the MJMA 7.9 great 1923 Kanto earthquake on the Philippine Sea-Eurasian plate interface are consistent with the subducting plate model. Using the subducting plate model and 226 IJMA IV-VI assignments, the location of the intensity center is 25 km north of the epicenter, Mjma is 7.7, and MJMA is 7.3-8.0 at the 1σ confidence level. Intensity assignments and reported aftershock activity for the enigmatic 11 November 1855 Ansei Edo earthquake are consistent with an MJMA 7.2 Philippine Sea-Eurasian interplate source or Philippine Sea intraslab source at about 30 km depth. If the 1855 earthquake was a Philippine Sea-Eurasian interplate event, the intensity center was adjacent to and downdip of the rupture area of the great 1923 Kanto earthquake, suggesting that the 1855 and 1923 events ruptured adjoining sections of the Philippine Sea-Eurasian plate interface.
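
    As a worked example of the attenuation relations quoted above (coefficients reproduced exactly as printed, with log taken as base-10, the usual convention for such relations; both should be checked against the original paper), the short Python snippet below evaluates the predicted JMA intensity for a hypothetical event with the crustal and the subducting-plate models.

        import math

        def ijma_crustal(mjma, delta_km, h_km):
            """Predicted JMA intensity, shallow crustal Honshu model."""
            dh = math.hypot(delta_km, h_km)   # slant distance (km)
            return -1.89 + 1.42 * mjma - 0.00887 * dh - 1.66 * math.log10(dh)

        def ijma_subducting(mjma, delta_km, h_km):
            """Predicted JMA intensity, subducting-plate model."""
            dh = math.hypot(delta_km, h_km)
            return -8.33 + 2.19 * mjma - 0.00550 * dh - 1.14 * math.log10(dh)

        # Hypothetical example: MJMA 7.2 source at 30 km depth, 50 km epicentral distance.
        print(round(ijma_crustal(7.2, 50.0, 30.0), 2))
        print(round(ijma_subducting(7.2, 50.0, 30.0), 2))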

  2. Accurate Tremor Locations in Japan from Coherent S-Waves

    NASA Astrophysics Data System (ADS)

    Armbruster, J. G.

    2014-12-01

    The tremor detectors developed for accurately locating tectonic tremor in Cascadia [Armbruster et al., JGR 2014] have been applied to data from the HINET seismic network in Japan. The best results were obtained in the Tokai region with stations ASU, ASH and TYE having relatively close spacing (11-18 km). 330 days with active tremor, 2004-2014, near these stations were found on the daily epicentral distributions of tremor on the HINET web site. The detector sees numbers of detections per day comparable to minor tremor episodes in Cascadia. Major tremor episodes in Cascadia are associated with geodetic signals stronger than those seen in Japan. If the tremor is located by constraining it to the plate interface, a pattern of persistent sources is seen, with some intense sources. This is similar to what was seen in Cascadia. In southwest Shikoku 139 days with tremor were identified. Stations UWA, OOZ and IKT see tremor with persistent patterns and strong sources but with approximately one fifth as many detections per day on active days, compared to ASU-ASH-TYE. The web site tremor distributions show activity here as strong as in Tokai. We believe the lesser number of detections in Shikoku is primarily the result of wider station spacing, 19-30 km, than in Tokai, although there may be other factors. Yabe and Ide [EPS 2013] detect and locate tremor in Kyushu on July 17-18 2005 and December 4-6 2008. A detector with stations NRA, SUK and KTM, station spacing 21-22 km, sees tremor which resembles minor episodes in Cascadia. The relative arrival times are consistent with their locations. We conclude that the methods developed in Cascadia will work in Japan but the typical spacing of HINET stations, ~20 km, is greater than the optimum distance found in analysis of data from Cascadia, 8 to 15 km.

  3. Effects of heterogeneity on earthquake location at ISC

    NASA Astrophysics Data System (ADS)

    Adams, R. D.

    1992-12-01

    Earthquake location at the International Seismological Centre is carried out by routine least-squares analysis using Jeffreys-Bullen travel times. It is impossible to examine every earthquake in detail, but when obvious discrepancies in location become apparent, adjustments can be made by analysts, usually in phase identification or the restraint of depth. Such discrepancies often result from inappropriateness of the Jeffreys-Bullen model. The effect is most apparent in subduction zones, where it is often difficult to reconcile local and teleseismic observations, and differences from the standard model can result in substantial mislocations. Large events, located by steeply descending teleseismic phases, may be only slightly misplaced, with large residuals at close stations giving a true indication of velocity anomalies. Small events, however, are often significantly misplaced, although giving small residuals at a few close stations. These apparently well located events give compensating misinformation about velocities and location. In other areas, especially mid-oceanic ridges, difficulties in depth determination are likely to be related to deviations from a laterally homogeneous velocity model.

  4. Improved Earthquake Location in the area of N. Euboean Gulf

    NASA Astrophysics Data System (ADS)

    Mouzakiotis, A. S.; Karastathis, V. K.

    2012-12-01

    Considerably improved hypocentral locations of the seismic events recorded during the period from 2009 to 2010 by the Hellenic Unified Seismographic Network (HUSN) have been obtained for the area of the North Euboean Gulf, after implementation of a 3D non-linear location algorithm and a local 3D velocity model for both P and S-waves. The velocity model was produced in previous studies using local earthquake tomography techniques (1D minimum velocity model and simultaneous 3D inversion techniques). In total, 280 events have been recorded in the area covered by the 3D velocity model by at least 5 local stations. Of these, 223 were well located by the local stations, with an azimuthal gap lower than 180°. Within the area covered by the 3D velocity model, there are 7 HUSN stations and two more from other networks. To optimize the hypocentral parameter estimation of the selected events, we used a probabilistic non-linear earthquake location method, utilizing the 3D velocity model of the area. The program used produces a misfit function, "optimal" hypocenters and an estimate of the posterior probability density function (PDF) for the spatial hypocenter location. The calculated travel-times are obtained using a 3D version of the Eikonal finite difference scheme and the complete location PDF is calculated by the EDT (equal differential time) function. The results were compared with the ones obtained by the implementation of other 1D velocity models, such as a) the 1D velocity model used for the daily earthquake data analysis by NOA and b) the 1D minimum velocity model. In spite of the fact that the local 3D velocity model was based on a completely different dataset than the present one, it produced considerably improved event locations with significantly smaller location errors than both the 1D models. This shows the validity of the 3D velocity model. Although the 1D minimum model produced better locations than the NOA model, it was not as effective as the 3D model

  5. Precise relative locations for earthquakes in the northeast Pacific region

    SciTech Connect

    Cleveland, K. Michael; VanDeMark, Thomas F.; Ammon, Charles J.

    2015-10-09

    We report that double-difference methods applied to cross-correlation measured Rayleigh wave time shifts are an effective tool to improve epicentroid locations and relative origin time shifts in remote regions. We apply these methods to seismicity offshore of southwestern Canada and the U.S. Pacific Northwest, occurring along the boundaries of the Pacific and Juan de Fuca (including the Explorer Plate and Gorda Block) Plates. The Blanco, Mendocino, Revere-Dellwood, Nootka, and Sovanco fracture zones host the majority of this seismicity, largely consisting of strike-slip earthquakes. The Explorer, Juan de Fuca, and Gorda spreading ridges join these fracture zones and host normal faulting earthquakes. Our results show that at least the moderate-magnitude activity clusters along fault strike, supporting suggestions of large variations in seismic coupling along oceanic transform faults. Our improved relative locations corroborate earlier interpretations of the internal deformation in the Explorer and Gorda Plates. North of the Explorer Plate, improved locations support models that propose northern extension of the Revere-Dellwood fault. Relocations also support interpretations that favor multiple parallel active faults along the Blanco Transform Fault Zone. Seismicity of the western half of the Blanco appears more scattered and less collinear than the eastern half, possibly related to fault maturity. We use azimuthal variations in the Rayleigh wave cross-correlation amplitude to detect and model rupture directivity for a moderate size earthquake along the eastern Blanco Fault. Lastly, the observations constrain the seismogenic zone geometry and suggest a relatively narrow seismogenic zone width of 2 to 4 km.

  6. Precise relative locations for earthquakes in the northeast Pacific region

    DOE PAGESBeta

    Cleveland, K. Michael; VanDeMark, Thomas F.; Ammon, Charles J.

    2015-10-09

    We report that double-difference methods applied to cross-correlation measured Rayleigh wave time shifts are an effective tool to improve epicentroid locations and relative origin time shifts in remote regions. We apply these methods to seismicity offshore of southwestern Canada and the U.S. Pacific Northwest, occurring along the boundaries of the Pacific and Juan de Fuca (including the Explorer Plate and Gorda Block) Plates. The Blanco, Mendocino, Revere-Dellwood, Nootka, and Sovanco fracture zones host the majority of this seismicity, largely consisting of strike-slip earthquakes. The Explorer, Juan de Fuca, and Gorda spreading ridges join these fracture zones and host normal faulting earthquakes. Our results show that at least the moderate-magnitude activity clusters along fault strike, supporting suggestions of large variations in seismic coupling along oceanic transform faults. Our improved relative locations corroborate earlier interpretations of the internal deformation in the Explorer and Gorda Plates. North of the Explorer Plate, improved locations support models that propose northern extension of the Revere-Dellwood fault. Relocations also support interpretations that favor multiple parallel active faults along the Blanco Transform Fault Zone. Seismicity of the western half of the Blanco appears more scattered and less collinear than the eastern half, possibly related to fault maturity. We use azimuthal variations in the Rayleigh wave cross-correlation amplitude to detect and model rupture directivity for a moderate size earthquake along the eastern Blanco Fault. Lastly, the observations constrain the seismogenic zone geometry and suggest a relatively narrow seismogenic zone width of 2 to 4 km.

  7. Accurate Focal Depth Determination of Oceanic Earthquakes Using Water-column Reverberation and Some Implications for the Shrinking Plate Hypothesis

    NASA Astrophysics Data System (ADS)

    Niu, F.; Huang, J.; Gordon, R. G.

    2015-12-01

    Investigation of oceanic earthquakes can play an important role in constraining the lateral and depth variations of the stress and strain-rate fields in oceanic lithosphere and of the thickness of the seismogenic layer as a function of lithosphere age, thereby providing us with critical insight into thermal and dynamic processes associated with the cooling and evolution of oceanic lithosphere. With the goal of estimating hypocentral depths more accurately, we observe clear water reverberations after the direct P wave on teleseismic records of oceanic earthquakes and develop a technique to estimate earthquake depths by using these reverberations. The Z-H grid search method allows the simultaneous determination of the sea floor depth (H) and earthquake depth (Z) with an uncertainty less than 1 km, which compares favorably with alternative approaches. We apply this method to two closely located earthquakes beneath the eastern Pacific. These earthquakes occur in ≈25 Ma-old lithosphere and were previously estimated to have very similar depths of ≈10-12 km. We find that the two events actually occurred at dissimilar depths of 2.5 km and 16.8 km beneath the seafloor, respectively within the oceanic crust and lithospheric mantle. The shallow and deep events are determined to be a thrust and normal earthquake, respectively, indicating that the stress field within the oceanic lithosphere changes from horizontal compression to horizontal extension as depth increases, which is consistent with the prediction of the lithospheric cooling model. Furthermore, we show that the P-axis of the newly investigated thrust-faulting earthquake is roughly perpendicular to that of the previously studied thrust event, consistent with the predictions of the shrinking-plate hypothesis.

  8. Accurate focal depth determination of oceanic earthquakes using water-column reverberation and some implications for the shrinking plate hypothesis

    NASA Astrophysics Data System (ADS)

    Huang, Jianping; Niu, Fenglin; Gordon, Richard G.; Cui, Chao

    2015-12-01

    Investigation of oceanic earthquakes is useful for constraining the lateral and depth variations of the stress and strain-rate fields in oceanic lithosphere, and the thickness of the seismogenic layer as a function of lithosphere age, thereby providing us with critical insight into thermal and dynamic processes associated with the cooling and evolution of oceanic lithosphere. With the goal of estimating hypocentral depths more accurately, we observe clear water reverberations after the direct P wave on teleseismic records of oceanic earthquakes and develop a technique to estimate earthquake depths by using these reverberations. The Z-H grid search method allows the simultaneous determination of the sea floor depth (H) and earthquake depth (Z) with an uncertainty less than 1 km, which compares favorably with alternative approaches. We apply this method to two closely located earthquakes beneath the eastern Pacific. These earthquakes occurred in ∼25 Ma-old lithosphere and were previously estimated to have similar depths of ∼10-12 km. We find that the two events actually occurred at dissimilar depths of 2.5 km and 16.8 km beneath the seafloor, respectively, within the oceanic crust and lithospheric mantle. The shallow and deep events are determined to be a thrust and normal earthquake, respectively, indicating that the stress field within the oceanic lithosphere changes from horizontal deviatoric compression to horizontal deviatoric tension as depth increases, which is consistent with the prediction of lithospheric cooling models. Furthermore, we show that the P-axis of the newly investigated thrust-faulting earthquake is perpendicular to that of the previously studied thrust event, consistent with the predictions of the shrinking-plate hypothesis.
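
    The Z-H search exploits the fact that the delay of the surface-reflected depth phase depends mainly on source depth Z, while each additional water-column bounce adds a delay controlled by water depth H. The toy Python sketch below does a grid search over (Z, H) that matches two such delays under straight-ray, near-vertical-incidence, constant-velocity assumptions; the velocities and "observed" delays are invented, and the actual method fits the full reverberation waveform rather than two picked delays.

        import numpy as np

        V_CRUST = 6.5   # km/s, assumed P speed below the seafloor
        V_WATER = 1.5   # km/s, assumed sound speed in the water column

        def predicted_delays(z_km, h_km):
            """Near-vertical-incidence approximations: pP-P delay from the two-way
            crustal leg above the source (depends on Z); the first water-column
            reverberation adds a two-way bounce in the water (depends on H)."""
            t_pp = 2.0 * z_km / V_CRUST
            t_pwp = t_pp + 2.0 * h_km / V_WATER
            return t_pp, t_pwp

        def zh_grid_search(obs_pp, obs_pwp, z_grid, h_grid):
            """(Z, H) pair whose predicted delays best fit the two observed delays."""
            best, best_misfit = None, np.inf
            for z in z_grid:
                for h in h_grid:
                    t_pp, t_pwp = predicted_delays(z, h)
                    misfit = (t_pp - obs_pp) ** 2 + (t_pwp - obs_pwp) ** 2
                    if misfit < best_misfit:
                        best, best_misfit = (z, h), misfit
            return best

        # Toy usage: synthesize delays for Z = 16.8 km, H = 4.0 km, then recover them.
        obs_pp, obs_pwp = predicted_delays(16.8, 4.0)
        zgrid = np.arange(0.5, 30.0, 0.1)
        hgrid = np.arange(1.0, 6.0, 0.1)
        print(zh_grid_search(obs_pp, obs_pwp, zgrid, hgrid))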

  9. Locating Local Earthquakes Using Single 3-Component Broadband Seismological Data

    NASA Astrophysics Data System (ADS)

    Das, S. B.; Mitra, S.

    2015-12-01

    We devised a technique to locate local earthquakes using a single 3-component broadband seismograph and analyze the factors governing the accuracy of our results. The need for devising such a technique arises in regions with a sparse seismic network. In state-of-the-art location algorithms, a minimum of three station recordings are required for obtaining well resolved locations. However, the problem arises when an event is recorded by fewer than three stations. This may happen for the following reasons: (a) down time of stations in a sparse network; (b) geographically isolated regions with limited logistic support to set up a large network; (c) regions with insufficient economy for financing a multi-station network; and (d) poor signal-to-noise ratio for smaller events at most stations, except the one in its closest vicinity. Our technique provides a workable solution to the above problematic scenarios. However, our methodology is strongly dependent on the velocity model of the region. Our method uses a three-step process: (a) ascertain the back-azimuth of the event from the P-wave particle motion recorded on the horizontal components; (b) estimate the hypocentral distance using the S-P time; and (c) ascertain the emergent angle from the vertical and radial components. Once this is obtained, one can ray-trace through the 1-D velocity model to estimate the hypocentral location. We tested our method on synthetic data, which produces results with 99% precision. With observed data, the accuracy of our results is very encouraging. The precision of our results depends on the signal-to-noise ratio (SNR) and the choice of the right band-pass filter to isolate the P-wave signal. We used our method on minor aftershocks (3 < mb < 4) of the 2011 Sikkim earthquake using data from the Sikkim Himalayan network. The locations of these events highlight the transverse strike-slip structure within the Indian plate, which was observed from source mechanism studies of the mainshock and larger aftershocks.
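
    The three-step procedure can be caricatured for a uniform half-space with the Python sketch below: the back-azimuth is taken from the principal direction of P-wave particle motion on the horizontals, the hypocentral distance from the S-P time, and the epicentre by stepping out from the station along that back-azimuth. The real method ray-traces through a 1-D velocity model and also uses the emergent angle; the velocities, synthetic waveforms and flat-Earth step-out are placeholders, and the 180-degree ambiguity of the particle-motion direction is ignored here.

        import numpy as np

        VP, VS = 6.0, 3.46   # km/s, assumed uniform half-space velocities

        def back_azimuth(north, east):
            """Back-azimuth (degrees from north) from the principal direction of
            P-wave particle motion on the two horizontal components."""
            cov = np.cov(np.vstack([north, east]))
            _, v = np.linalg.eigh(cov)
            n, e = v[:, -1]                      # eigenvector of the largest eigenvalue
            return np.degrees(np.arctan2(e, n)) % 360.0   # 180-deg ambiguity ignored

        def sp_distance(ts_minus_tp):
            """Hypocentral distance (km) from the S-P time for constant VP and VS."""
            return ts_minus_tp / (1.0 / VS - 1.0 / VP)

        def locate(stn_lat, stn_lon, north, east, ts_minus_tp):
            """Epicentre estimate: step out from the station along the back-azimuth
            by the S-P distance (treated as horizontal; flat-Earth approximation)."""
            baz = back_azimuth(north, east)
            dist_km = sp_distance(ts_minus_tp)
            dlat = dist_km * np.cos(np.radians(baz)) / 111.19
            dlon = dist_km * np.sin(np.radians(baz)) / (111.19 * np.cos(np.radians(stn_lat)))
            return stn_lat + dlat, stn_lon + dlon, baz, dist_km

        # Toy usage: synthetic P-wave motion arriving from a back-azimuth of ~40 degrees.
        rng = np.random.default_rng(4)
        amp = np.sin(np.linspace(0, 4 * np.pi, 200)) + 0.05 * rng.normal(size=200)
        north, east = amp * np.cos(np.radians(40.0)), amp * np.sin(np.radians(40.0))
        print(locate(27.3, 88.6, north, east, ts_minus_tp=12.0))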

  10. Effect of earthquake locations on Rayleigh wave azimuthal anisotropy

    NASA Astrophysics Data System (ADS)

    Ma, Z.; Masters, G.

    2013-12-01

    We have compiled a large dataset of Rayleigh wave phase arrival times from 5 mHz to 35 mHz using a cluster analysis method. Estimation of the source phase is improved by using a second order approximation of the associated Legendre functions. Currently, we have about 300,000 measurements at 5 mHz, 600,000 at 10 mHz, 400,000 at 20 mHz and 280,000 at 35 mHz. We use our new dataset to invert for the 2φ terms of Rayleigh wave azimuthal anisotropy. We have found differences in the inverted fast directions when using PDE versus CMT source locations, especially near subduction zones where most earthquakes happen. Allowing small changes in earthquake locations (latitude and longitude) in our inversion greatly reduces such discrepancies. Residual patterns and checkerboard tests both indicate that the azimuthal anisotropy patterns in ocean basins are likely coherent over large distances, especially in the Pacific. To model the change of anisotropy amplitudes in the Pacific at different frequencies, we follow the approach proposed by Montagner and Nataf (1986). Values of elastic constants are compiled from Anderson and Isaak (1995) and Abramson et al. (1997). The depth extent of anisotropy will be discussed.

  11. Double Difference Earthquake Locations at the Salton Sea Geothermal Reservoir

    SciTech Connect

    Boyle, K L; Hutchings, L J; Bonner, B P; Foxall, W; Kasameyer, P W

    2007-08-08

    The purpose of this paper is to report on the processing of raw waveform data from 4547 events recorded at 12 stations between 2001 and 2005 by the Salton Sea Geothermal Field (SSGF) seismic network. We identified a central region of the network where vertically elongated distributions of hypocenters had previously been located from regional network analysis. We process the data from the local network by first autopicking the first P and S arrivals; second, improving these with hand picks when necessary; and then using cross-correlation to provide very precise relative P and S arrival times. We used the HypoDD earthquake location algorithm to locate the events. We found that the originally elongated distributions of hypocenters became more tightly clustered and extend down through the study volume to 10 km depth. However, we found the shapes to depend on the choice of location parameters. We speculate that these narrow elongated zones of seismicity may be due to stress release caused by fluid flow.

  12. Locating and Modeling Regional Earthquakes with Broadband Waveform Data

    NASA Astrophysics Data System (ADS)

    Tan, Y.; Zhu, L.; Helmberger, D.

    2003-12-01

    Retrieving the source parameters of small earthquakes (Mw < 4.5), including mechanism, depth, location and origin time, relies on local and regional seismic data. Although source characterization for such small events has reached a satisfactory stage in some places with a dense seismic network, such as TriNet in Southern California, revisiting historical events in these places, or effective, real-time investigation of small events in many other places where normally only a few local waveforms plus some short-period recordings are available, is still a problem. To address this issue, we introduce a new type of approach that estimates location, depth, origin time and fault parameters based on 3-component waveform matching in terms of separated Pnl, Rayleigh and Love waves. We show that most local waveforms can be well modeled by a regionalized 1-D model plus different timing corrections for Pnl, Rayleigh and Love waves at relatively long periods, i.e., 4-100 sec for Pnl and 8-100 sec for surface waves, except for a few anomalous paths involving greater structural complexity; meanwhile, these timing corrections reveal similar azimuthal patterns for well-located cluster events, despite their different focal mechanisms. Thus, we can calibrate the paths separately for Pnl, Rayleigh and Love waves with the timing corrections from well-determined events widely recorded by a dense modern seismic network or a temporary PASSCAL experiment. In return, we can locate events and extract their fault parameters by waveform matching for the available waveform data, which could come from as few as two stations, assuming timing corrections from the calibration. The accuracy of the obtained source parameters is subject to the error carried by the events used for the calibration. The detailed method requires a Green's function library constructed from a regionalized 1-D model together with the necessary calibration information, and adopts a grid search strategy for both hypocenter and

  13. Epistemic uncertainty in the location and magnitude of earthquakes in Italy from Macroseismic data

    USGS Publications Warehouse

    Bakun, W.H.; Gomez, Capera A.; Stucchi, M.

    2011-01-01

    Three independent techniques (Bakun and Wentworth, 1997; Boxer from Gasperini et al., 1999; and Macroseismic Estimation of Earthquake Parameters [MEEP; see Data and Resources section, deliverable D3] from R.M.W. Musson and M.J. Jimenez) have been proposed for estimating an earthquake location and magnitude from intensity data alone. The locations and magnitudes obtained for a given set of intensity data are almost always different, and no one technique is consistently best at matching instrumental locations and magnitudes of recent well-recorded earthquakes in Italy. Rather than attempting to select one of the three solutions as best, we use all three techniques to estimate the location and the magnitude and the epistemic uncertainties among them. The estimates are calculated using bootstrap resampled data sets with Monte Carlo sampling of a decision tree. The decision-tree branch weights are based on goodness-of-fit measures of location and magnitude for recent earthquakes. The location estimates are based on the spatial distribution of locations calculated from the bootstrap resampled data. The preferred source location is the locus of the maximum bootstrap location spatial density. The location uncertainty is obtained from contours of the bootstrap spatial density: 68% of the bootstrap locations are within the 68% confidence region, and so on. For large earthquakes, our preferred location is not associated with the epicenter but with a location on the extended rupture surface. For small earthquakes, the epicenters are generally consistent with the location uncertainties inferred from the intensity data if an epicenter inaccuracy of 2-3 km is allowed. The preferred magnitude is the median of the distribution of bootstrap magnitudes. As with location uncertainties, the uncertainties in magnitude are obtained from the distribution of bootstrap magnitudes: the bounds of the 68% uncertainty range enclose 68% of the bootstrap magnitudes, and so on. The instrumental

  14. Correlation of Earthquake Locations with Volumetric Source Components in TauTona Gold Mine, South Africa

    NASA Astrophysics Data System (ADS)

    Kane, D. L.; Boettcher, M. S.

    2013-12-01

    We investigate the source characteristics of earthquakes in TauTona Gold Mine, South Africa, to test if the location of earthquakes relative to mining structures is correlated with significant isotropic source behavior. Earthquakes are well monitored in TauTona Mine, where underground near-source stations record smaller events and higher frequency energy than can generally be observed using surface stations. Our dataset includes -4 < Mw < 4 earthquakes recorded at hypocentral distances of tens of meters to a few kilometers. The locations of structures in the mine, including faults, dikes, tunnels, and stopes, are well known from detailed geologic mapping and surveyed mine plans. We use data collected between 2004 and 2009 from the in-mine array (1-6 kHz), the Natural Earthquake Laboratory in South African Mines (NELSAM) project stations (6-12 kHz), and a short-term PASSCAL experiment (200 Hz) to study source mechanism variability and correlation with mapped structures within the mine. Previous studies of earthquakes in mines suggest a relationship between earthquake size and isotropic moment tensor source characteristics. In TauTona Mine, earthquakes with significant implosive source characteristics tend to be infrequent, larger events (Mw > 1.5), whereas earthquakes with significant explosive source characteristics tend to be smaller (Mw < 0). A possible model for this variability in source behavior relates earthquake size to earthquake location relative to mining and natural structures. Larger events are more likely to produce closure of tunnels and stopes within the mine, whereas the smallest recorded explosive events can be interpreted as opening cracks that form at the edges of mining structures. Double-couple type sources occur throughout the full magnitude range, and are often located along mapped faults and dikes. We focus our analysis on earthquakes located near the NELSAM stations in the deepest part of the mine, and on earthquakes located at depths

  15. What controls the location where large earthquakes nucleate along the North Anatolian Fault?

    NASA Astrophysics Data System (ADS)

    Bouchon, M.; Karabulut, H.; Schmittbuhl, J.; Durand, V.; Marsan, D.; Renard, F.

    2012-12-01

    We review several sets of observations which suggest that the location of the epicenters of the 1939-1999 sequence of large earthquakes along the NAF obeys some mechanical logic. The 1999 Izmit earthquake nucleated in a zone of localized crustal extension oriented N10E (Crampin et al., 1985; Evans et al., 1987), nearly orthogonal to the strike of the NAF, thus releasing the normal stress on the fault in the area and facilitating rupture nucleation. The 1999 Duzce epicenter, located about 25 km from the end of the Izmit rupture, is precisely near the start of a simple linear segment of the fault (Pucci et al., 2006) where supershear rupture occurred (Bouchon et al., 2001; Konca et al., 2010). Aftershock locations of the Izmit earthquake in the region (Gorgun et al., 2009) show that Duzce, at its start, was the first significant Izmit aftershock to occur on this simple segment. The rupture nucleated on the part of this simple segment which had been most loaded in Coulomb stress by the Izmit earthquake. Once rupture of this segment began, it seems logical that the whole segment would break, as its simple geometry suggests that no barrier was present to arrest rupture. Rupture of this segment, in turn, led to the rupture of adjacent segments. Like the Izmit earthquake, the 1943 Tosya and the 1944 Bolu-Gerede earthquakes nucleated near a zone of localized crustal extension. The long-range delayed triggering of extensional clusters observed after the Izmit/Duzce earthquakes (Durand et al., 2010) suggests a possible long-range delayed triggering of the 1943 shock by the 1942 Niksar earthquake. The nucleation locations of the 1942 Niksar, 1957 Abant and 1967 Mudurnu earthquakes further suggest that, as observed for the Duzce earthquake, the previous earthquake ruptures stopped when encountering geometrically complex segments and rupture nucleated again past these segments.

  16. Earthquake precise locations catalog for the Lesser Antilles subduction zone (1972-2013)

    NASA Astrophysics Data System (ADS)

    Massin, Frederick; Amorese, Daniel; Beauducel, Francois; Bengoubou-Valérius, Mendy; Bernard, Marie-Lise; Bertil, Didier

    2014-05-01

    Locations for earthquakes recorded in the Lesser Antilles subduction zone are processed separately by the regional observatories, NEIC and ISC. No earthquake location catalog is available that compiles all available phase arrival data. We propose a new, more complete earthquake catalog obtained by merging all available phase arrival data for better constraints on earthquake locations. ISC provides the phase arrival data of 29243 earthquakes (magnitude range from 1.4 to 6.4) recorded by PRSN (Puerto Rico), SRC (British West Indies), and from FUNVISIS (Venezuela). We add phase data from IPGP observatories for 68718 earthquakes from magnitudes 0.1 to 7.5 (OVSG, Guadeloupe, recorded 53226 earthquakes since 1981, and OVSM, Martinique, recorded 29931 earthquakes since 1972). IPGP also provides the accelerometer waveform data of the GIS-RAP network. We achieved automatic picking on the GIS-RAP data using the Component Energy Correlation Method (CECM). The CECM provides high-precision phase detection, a realistic estimation of picking error and realistic weights that can be used with manual pick weights. The CECM adds an average of 3 P-wave and 2 S-wave arrivals to 3846 earthquakes recorded by the GIS-RAP network since 2002. The final catalog contains 84979 earthquakes between 1972 and 2013, for 24528 of which we compiled additional data. We achieve earthquake location using NonLinLoc, regional P- and S-wave data and a set of one-dimensional velocity models. We produce improved locations for 22974 earthquakes (better residuals, on an equal or larger arrival dataset) and improved duration magnitudes for 6258 earthquakes (using duration data and improved locations). A subset of the 15626 best-constrained hypocenters (with more than 8 phases and an average RMS of 0.48±0.77 s) could be used for structural analysis and local earthquake tomography. Relative locations are to be applied in order to image active faulting. We aim to understand coupling in the seismogenic zone as well as triggering mechanisms of

  17. Quiet zone within a seismic gap near western Nicaragua: Possible location of a future large earthquake

    USGS Publications Warehouse

    Harlow, D.H.; White, R.A.; Cifuentes, I.L.; Aburto, Q.A.

    1981-01-01

    A 5700-square-kilometer quiet zone occurs in the midst of the locations of more than 4000 earthquakes off the Pacific coast of Nicaragua. The region is indicated by the seismic gap technique to be a likely location for an earthquake of magnitude larger than 7. The quiet zone has existed since at least 1950; the last large earthquake originating from this area occurred in 1898 and was of magnitude 7.5. A rough estimate indicates that the magnitude of an earthquake rupturing the entire quiet zone could be as large as that of the 1898 event. It is not yet possible to forecast a time frame for the occurrence of such an earthquake in the quiet zone. Copyright © 1981 AAAS.

  18. Seismic swarm associated with the 2008 eruption of Kasatochi Volcano, Alaska: earthquake locations and source parameters

    USGS Publications Warehouse

    Ruppert, Natalia G.; Prejean, Stephanie G.; Hansen, Roger A.

    2011-01-01

    An energetic seismic swarm accompanied an eruption of Kasatochi Volcano in the central Aleutian volcanic arc in August of 2008. In retrospect, the first earthquakes in the swarm were detected about 1 month prior to the eruption onset. Activity in the swarm quickly intensified less than 48 h prior to the first large explosion and subsequently subsided with decline of eruptive activity. The largest earthquake measured as moment magnitude 5.8, and a dozen additional earthquakes were larger than magnitude 4. The swarm exhibited both tectonic and volcanic characteristics. Its shear failure earthquake features were b value = 0.9, most earthquakes with impulsive P and S arrivals and higher-frequency content, and earthquake faulting parameters consistent with regional tectonic stresses. Its volcanic or fluid-influenced seismicity features were volcanic tremor, large CLVD components in moment tensor solutions, and increasing magnitudes with time. Earthquake location tests suggest that the earthquakes occurred in a distributed volume elongated in the NS direction either directly under the volcano or within 5-10 km south of it. Following the MW 5.8 event, earthquakes occurred in a new crustal volume slightly east and north of the previous earthquakes. The central Aleutian Arc is a tectonically active region with seismicity occurring in the crusts of the Pacific and North American plates in addition to interplate events. We postulate that the Kasatochi seismic swarm was a manifestation of the complex interaction of tectonic and magmatic processes in the Earth's crust. Although magmatic intrusion triggered the earthquakes in the swarm, the earthquakes failed in the context of the regional stress field.

  19. Seismic swarm associated with the 2008 eruption of Kasatochi Volcano, Alaska: Earthquake locations and source parameters

    USGS Publications Warehouse

    Ruppert, N.A.; Prejean, S.; Hansen, R.A.

    2011-01-01

    An energetic seismic swarm accompanied an eruption of Kasatochi Volcano in the central Aleutian volcanic arc in August of 2008. In retrospect, the first earthquakes in the swarm were detected about 1 month prior to the eruption onset. Activity in the swarm quickly intensified less than 48 h prior to the first large explosion and subsequently subsided with decline of eruptive activity. The largest earthquake measured as moment magnitude 5.8, and a dozen additional earthquakes were larger than magnitude 4. The swarm exhibited both tectonic and volcanic characteristics. Its shear failure earthquake features were b value = 0.9, most earthquakes with impulsive P and S arrivals and higher-frequency content, and earthquake faulting parameters consistent with regional tectonic stresses. Its volcanic or fluid-influenced seismicity features were volcanic tremor, large CLVD components in moment tensor solutions, and increasing magnitudes with time. Earthquake location tests suggest that the earthquakes occurred in a distributed volume elongated in the NS direction either directly under the volcano or within 5-10 km south of it. Following the MW 5.8 event, earthquakes occurred in a new crustal volume slightly east and north of the previous earthquakes. The central Aleutian Arc is a tectonically active region with seismicity occurring in the crusts of the Pacific and North American plates in addition to interplate events. We postulate that the Kasatochi seismic swarm was a manifestation of the complex interaction of tectonic and magmatic processes in the Earth's crust. Although magmatic intrusion triggered the earthquakes in the swarm, the earthquakes failed in the context of the regional stress field. Copyright © 2011 by the American Geophysical Union.

  20. Earthquake Rupture Dynamics using Adaptive Mesh Refinement and High-Order Accurate Numerical Methods

    NASA Astrophysics Data System (ADS)

    Kozdon, J. E.; Wilcox, L.

    2013-12-01

    Our goal is to develop scalable and adaptive (spatial and temporal) numerical methods for coupled, multiphysics problems using high-order accurate numerical methods. To do so, we are developing an open-source, parallel library known as bfam (available at http://bfam.in). The first application to be developed on top of bfam is an earthquake rupture dynamics solver using high-order discontinuous Galerkin methods and summation-by-parts finite difference methods. In earthquake rupture dynamics, wave propagation in the Earth's crust is coupled to frictional sliding on fault interfaces. This coupling is two-way, requiring the simultaneous simulation of both processes. The use of laboratory-measured friction parameters requires near-fault resolution that is 4-5 orders of magnitude higher than that needed to resolve the frequencies of interest in the volume. This, along with earlier simulations using a low-order, finite volume based adaptive mesh refinement framework, suggests that adaptive mesh refinement is ideally suited for this problem. The use of high-order methods is motivated by the high level of resolution required off the fault in the earlier low-order finite volume simulations; we believe this need for resolution is a result of the excessive numerical dissipation of low-order methods. In bfam, spatial adaptivity is handled using the p4est library and temporal adaptivity will be accomplished through local time stepping. In this presentation we will present the guiding principles behind the library as well as verification of the code against the Southern California Earthquake Center dynamic rupture code validation test problems.

  1. Event Detection and Location of Earthquakes Using the Cascadia Initiative Dataset

    NASA Astrophysics Data System (ADS)

    Morton, E.; Bilek, S. L.; Rowe, C. A.

    2015-12-01

    The Cascadia subduction zone (CSZ) produces a range of slip behavior along the plate boundary megathrust, from great earthquakes to episodic slow slip and tremor (ETS). Unlike other subduction zones that produce great earthquakes and ETS, the CSZ is notable for the lack of small and moderate magnitude earthquakes recorded. The seismogenic zone extent is currently estimated to be primarily offshore, thus the lack of observed small, interplate earthquakes may be partially due to the use of only land seismometers. The Cascadia Initiative (CI) community seismic experiment seeks to address this issue by including ocean bottom seismometers (OBS) deployed directly over the locked seismogenic zone, in addition to land seismometers. We use these seismic data to explore whether small magnitude earthquakes are occurring on the plate interface, but have gone undetected by the land-based seismic networks. We select a subset of small magnitude (M0.1-3.7) earthquakes from existing earthquake catalogs, based on land seismic data, whose preliminary hypocentral locations suggest they may have occurred on the plate interface. We window the waveforms on CI OBS and land seismometers around the phase arrival times for these earthquakes to generate templates for subspace detection, which allows for additional flexibility over traditional matched filter detection methods. Here we present event detections from the first year of CI deployment and preliminary locations for the detected events. Initial results of scanning the first year of the CI deployment using one cluster of template events, located near a previously identified subducted seamount, include 473 detections on OBS station M08A (~61.6 km offshore) and 710 detections on OBS station J25A (~44.8 km northeast of M08A). Ongoing efforts include detection using additional OBS stations along the margin, as well as determining locations of clusters detected in the first year of deployment.

  2. Comparison of mid-oceanic earthquake epicentral differences of travel time, centroid locations, and those determined by autonomous underwater hydrophone arrays

    NASA Astrophysics Data System (ADS)

    Pan, Jianfeng; Dziewonski, Adam M.

    2005-07-01

    Mid-oceanic interplate earthquakes are difficult to locate accurately because they normally occur far away from land-based seismic stations. Use of water-borne T waves recorded by autonomous underwater hydrophone (AUH) arrays yields an order of magnitude more of the low-level regional seismicity along the north Mid-Atlantic Ridge, with higher location accuracy, than the International Seismological Centre (ISC) catalog. Even though the physical meaning of an AUH location is still not well known, the small AUH location errors are important for better constraining mid-oceanic earthquakes. In this study, such AUH locations are compared with those in the ISC and Harvard centroid moment tensor (CMT) catalogs, and with epicenters relocated using high-resolution bathymetry and teleseismic P phases. AUH locations are used as a reference to compare the teleseismically determined locations. For large earthquakes with known focal mechanisms, we find that the relocated locations agree better with the AUH locations than with the ISC locations. We also note that the centroid vectors from relocated epicenters are usually larger than the AUH centroid vectors. The relocated epicenters and AUH locations lie in similar azimuthal directions to the associated CMT epicenters. The larger relocated and AUH centroid vectors (than the error ellipses of AUH, CMT, and relocated ones combined) might be explained by the fault rupture process. For smaller events, relocated location confidence ellipses are usually large enough to cover AUH locations and their error ellipses. Overall, the highly accurate AUH locations can be used to confirm the mid-oceanic earthquake hypocenters and seismicity characteristics and for detailed studies of the low-level seismicity associated with the plate motions.

  3. Locations and magnitudes of historical earthquakes in the Sierra of Ecuador (1587-1996)

    NASA Astrophysics Data System (ADS)

    Beauval, Céline; Yepes, Hugo; Bakun, William H.; Egred, José; Alvarado, Alexandra; Singaucho, Juan-Carlos

    2010-06-01

    The whole territory of Ecuador is exposed to seismic hazard. Great earthquakes can occur in the subduction zone (e.g. Esmeraldas, 1906, Mw 8.8), whereas lower magnitude but shallower and potentially more destructive earthquakes can occur in the highlands. This study focuses on the historical crustal earthquakes of the Andean Cordillera. Several large cities are located in the Interandean Valley, among them Quito, the capital (~2.5 million inhabitants). A total population of ~6 million inhabitants currently lives in the highlands, raising the seismic risk. At present, precise instrumental data for the Ecuadorian territory are not available for periods earlier than 1990 (beginning date of the revised instrumental Ecuadorian seismic catalogue); therefore historical data are of utmost importance for assessing seismic hazard. In this study, the Bakun & Wentworth method is applied in order to determine magnitudes, locations, and associated uncertainties for historical earthquakes of the Sierra over the period 1587-1976. An intensity-magnitude equation is derived from the four most reliable instrumental earthquakes (Mw between 5.3 and 7.1). Intensity data available per historical earthquake vary between 10 (Quito, 1587, Intensity >=VI) and 117 (Riobamba, 1797, Intensity >=III). The bootstrap resampling technique is coupled to the B&W method for deriving geographical confidence contours for the intensity centre depending on the data set of each earthquake, as well as confidence intervals for the magnitude. The extension of the area delineating the intensity centre location at the 67 per cent confidence level (±1σ) depends on the amount of intensity data, on their internal coherence, on the number of intensity degrees available, and on their spatial distribution. Special attention is dedicated to the few earthquakes described by intensities reaching IX, X and XI degrees. Twenty-five events are studied, and nineteen new epicentral locations are obtained, yielding

  4. Foreshocks and aftershocks locations of the 2014 Pisagua, N. Chile earthquake: history of a megathrust earthquake nucleation

    NASA Astrophysics Data System (ADS)

    Fuenzalida Velasco, Amaya; Rietbrock, Andreas; Tavera, Hernando; Ryder, Isabelle; Ruiz, Sergio; Thomas, Reece; De Angelis, Silvio; Bondoux, Francis

    2015-04-01

    The April 2014 Mw 8.1 Pisagua earthquake occurred in the Northern Chile seismic gap: a region of the South American subduction zone lying between Arica city and the Mejillones Peninsula. It is believed that this part of the subduction zone has not experienced a large earthquake since 1877. Thanks to the identification of this seismic gap, the north of Chile was well instrumented before the Pisagua earthquake, including the Integrated Plate boundary Observatory Chile (IPOC) network and the Chilean local network installed by the Centro Sismologico Nacional (CSN). These instruments were able to record the full foreshock and aftershock sequences, providing a unique opportunity to study the nucleation process of large megathrust earthquakes. To improve azimuthal coverage of the Pisagua seismic sequence, after the earthquake and in collaboration with the Instituto Geofisico del Peru (IGP), we installed a temporary seismic network in the south of Peru. The network comprised 12 short-period stations located in the coastal area between Moquegua and Tacna, operational from 1 May 2014. We also installed three stations on the slopes of the Ticsiani volcano to monitor any possible change in volcanic activity following the Pisagua earthquake. In this work we analysed the continuous seismic data recorded by the CSN and IPOC networks from 1 March to 30 June to obtain the catalogue of the sequence, including foreshocks and aftershocks. Using an automatic algorithm based on STA/LTA we obtained the picks for P and S waves. Picks were then associated in time and space to define events, and initial locations were computed using Hypo71 and the 1D local velocity model. More than 11,000 events were identified with this method for the whole period, but we selected the best-resolved events, those with more than 7 observed arrivals including at least 2 S picks, for relocation with the NonLinLoc software. For the main events of the sequence we carefully estimated event locations and we obtained

  5. Detection, location, and analysis of earthquakes using seismic surface waves (Beno Gutenberg Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Ekström, Göran

    2015-04-01

    For shallow sources, Love and Rayleigh waves are the largest seismic phases recorded at teleseismic distances. The utility of these waves for earthquake characterization was traditionally limited to magnitude estimation, since geographically variable dispersion makes it difficult to determine useful travel-time information from the waveforms. Path delays due to heterogeneity of several tens of seconds are typical for waves at 50 sec period, and these delays must be accounted for with precision and accuracy in order to extract propagation-phase and source-phase information. Advances in tomographic mapping of global surface-wave phase velocities, and continuous growth and improvements of seismographic networks around the world, now make possible new applications of surface waves for earthquake monitoring and analysis. Through continuous back propagation of the long-period seismic wave field recorded by globally distributed stations, nearly all shallow earthquakes greater than M=5 can be detected and located with a precision of 25 km. Some of the detected events do not appear in standard earthquake catalogs and correspond to non-tectonic earthquakes, including landslides, glacier calving, and volcanic events. With the improved ability to predict complex propagation effects of surface waves across a heterogeneous Earth, moment-tensor and force representations of seismic sources can be routinely determined for all earthquakes greater than M=5 by waveform fitting of surface waves. A current area of progress in the use of surface waves for earthquake studies is the determination of precise relative locations of remote seismicity by systematic cross correlation and analysis of surface waves generated by neighboring sources. Preliminary results indicate that a location precision of 5 km may be achievable in many areas of the world.

  6. Regional intensity attenuation models for France and the estimation of magnitude and location of historical earthquakes

    USGS Publications Warehouse

    Bakun, W.H.; Scotti, O.

    2006-01-01

    Intensity assignments for 33 calibration earthquakes were used to develop intensity attenuation models for the Alps, Armorican, Provence, Pyrenees and Rhine regions of France. Intensity decreases with epicentral distance Δ most rapidly in the French Alps, Provence and Pyrenees regions, and least rapidly in the Armorican and Rhine regions. The comparable Armorican and Rhine region attenuation models are aggregated into a French stable continental region model and the comparable Provence and Pyrenees region models are aggregated into a Southern France model. We analyse MSK intensity assignments using the technique of Bakun & Wentworth, which provides an objective method for estimating epicentral location and intensity magnitude MI. MI for the 1356 October 18 earthquake in the French stable continental region is 6.6 for a location near Basle, Switzerland, and moment magnitude M is 5.9-7.2 at the 95 per cent (±2σ) confidence level. MI for the 1909 June 11 Trevaresse (Lambesc) earthquake near Marseilles in the Southern France region is 5.5, and M is 4.9-6.0 at the 95 per cent confidence level. Bootstrap resampling techniques are used to calculate objective, reproducible 67 per cent and 95 per cent confidence regions for the locations of historical earthquakes. These confidence regions for location provide an attractive alternative to the macroseismic epicentre and qualitative location uncertainties used heretofore. © 2006 The Authors, Journal compilation © 2006 RAS.

  7. Co-located ionospheric and geomagnetic disturbances caused by great earthquakes

    NASA Astrophysics Data System (ADS)

    Hao, Yongqiang; Zhang, Donghe; Xiao, Zuo

    2016-07-01

    Despite the Sun being the primary source of energy disturbances, oscillations of the Earth's surface due to a large earthquake will couple with the atmosphere and therefore the ionosphere, generating so-called coseismic ionospheric disturbances (CIDs). In the cases of the 2008 Wenchuan and 2011 Tohoku earthquakes, infrasonic waves accompanying the propagation of seismic Rayleigh waves were observed in the ionosphere by a combination of techniques: total electron content, HF Doppler, and ground magnetometers. This is the first report to present CIDs recorded by different techniques at co-located sites and profiled with regard to changes of both ionospheric plasma and current (geomagnetic field) simultaneously. Comparison between the oceanic (2011 Tohoku) and inland (2008 Wenchuan) earthquakes revealed that the main directional lobe of the latter case is more distinct and is perpendicular to the direction of the fault rupture. We argue that the different fault slip (inland or submarine) may affect the way the lithosphere couples with the atmosphere. Zhao, B., and Y. Hao (2015), Ionospheric and geomagnetic disturbances caused by the 2008 Wenchuan earthquake: A revisit, J. Geophys. Res., doi:10.1002/2015JA021035. Hao, Y. Q., et al. (2013), Teleseismic magnetic effects (TMDs) of 2011 Tohoku earthquake, J. Geophys. Res., doi:10.1002/jgra.50326. Hao, Y. Q., et al. (2012), Multi-instrument observation on co-seismic ionospheric effects after great Tohoku earthquake, J. Geophys. Res., doi:10.1029/2011JA017036.

  8. Estimating locations and magnitudes of earthquakes in eastern North America from Modified Mercalli intensities

    USGS Publications Warehouse

    Bakun, W.H.; Johnston, A.C.; Hopper, M.G.

    2003-01-01

    We use 28 calibration events (3.7 ≤ M ≤ 7.3) from Texas to the Grand Banks, Newfoundland, to develop a Modified Mercalli intensity (MMI) model and associated site corrections for estimating source parameters of historical earthquakes in eastern North America. The model, MMI = 1.41 + 1.68M - 0.00345Δ - 2.08 log(Δ), where Δ is the distance in kilometers from the epicenter and M is moment magnitude, provides unbiased estimates of M and its uncertainty, and, if site corrections are used, of source location. The model can be used for the analysis of historical earthquakes with only a few MMI assignments. We use this model, MMI site corrections, and Bakun and Wentworth's (1997) technique to estimate M and the epicenter for three important historical earthquakes. The intensity magnitude MI is 6.1 for the 18 November 1755 earthquake near Cape Ann, Massachusetts; 6.0 for the 5 January 1843 earthquake near Marked Tree, Arkansas; and 6.0 for the 31 October 1895 earthquake. The 1895 event probably occurred in southern Illinois, about 100 km north of the site of significant ground failure effects near Charleston, Missouri.
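
    Since the record quotes the attenuation model explicitly, a short worked example may help; this is illustrative only (site corrections are omitted, and the MMI assignments and distances below are invented):

        # Hedged sketch: evaluate the quoted MMI model and invert it for an intensity magnitude.
        import numpy as np

        def predicted_mmi(M, dist_km):
            """MMI = 1.41 + 1.68*M - 0.00345*dist - 2.08*log10(dist), dist in km from the epicenter."""
            return 1.41 + 1.68 * M - 0.00345 * dist_km - 2.08 * np.log10(dist_km)

        def magnitude_from_mmi(mmi, dist_km):
            """Invert the same relation for M given one MMI assignment at a known distance."""
            return (mmi - 1.41 + 0.00345 * dist_km + 2.08 * np.log10(dist_km)) / 1.68

        # Forward problem: intensity expected 50 km from an M 6.0 event
        print("MMI at 50 km for M 6.0: %.1f" % predicted_mmi(6.0, 50.0))

        # Inverse problem: a crude intensity magnitude as the mean of per-site estimates
        mmi_obs = np.array([7.0, 6.0, 5.0, 4.5])          # hypothetical MMI assignments
        dist_obs = np.array([20.0, 60.0, 150.0, 300.0])   # km from a trial epicenter
        print("intensity magnitude MI ~ %.2f" % magnitude_from_mmi(mmi_obs, dist_obs).mean())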

  9. Testing small-aperture array analysis on well-located earthquakes, and application to the location of deep tremor

    USGS Publications Warehouse

    La Rocca, M.; Galluzzo, D.; Malone, S.; McCausland, W.; Saccorotti, G.; Del Pezzo, E.

    2008-01-01

    We have here analyzed local and regional earthquakes using array techniques with the double aim of quantifying the errors associated with the estimation of propagation parameters of seismic signals and testing the suitability of a probabilistic location method for the analysis of nonimpulsive signals. We have applied the zero-lag cross-correlation method to earthquakes recorded by three dense arrays in Puget Sound and Vancouver Island to estimate the slowness and back azimuth of direct P waves and S waves. The results are compared with the slowness and back azimuth computed from the source location obtained by the analysis of data recorded by the Pacific Northwest seismic network (PNSN). This comparison has allowed a quantification of the errors associated with the estimation of slowness and back azimuth obtained through the analysis of array data. The statistical analysis gives σBP = 10° and σBS = 8° as standard deviations for the back azimuth, and σSP = 0.021 s/km and σSS = 0.033 s/km for the slowness results of the P and S phases, respectively. These values are consistent with the theoretical relationship between slowness and back azimuth and their uncertainties. We have tested a probabilistic source location method on the local earthquakes based on the use of the slowness estimated for two or three arrays without taking into account travel-time information. Then we applied the probabilistic method to the deep, nonvolcanic tremor recorded by the arrays during July 2004. The results of the tremor location using the probabilistic method are in good agreement with those obtained by other techniques. The wide depth range, of between 10 and 70 km, and the source migration with time are evident in our results. The method is useful for locating the source of signals characterized by the absence of pickable seismic phases.

  10. Development of an accurate transmission line fault locator using the global positioning system satellites

    NASA Technical Reports Server (NTRS)

    Lee, Harry

    1994-01-01

    A highly accurate transmission line fault locator based on the traveling-wave principle was developed and successfully operated within B.C. Hydro. A transmission line fault produces a fast-risetime traveling wave at the fault point which propagates along the transmission line. This fault locator system consists of traveling wave detectors located at key substations which detect and time tag the leading edge of the fault-generated traveling wave as it passes through. A master station gathers the time-tagged information from the remote detectors and determines the location of the fault. Precise time is a key element to the success of this system. This fault locator system derives its timing from the Global Positioning System (GPS) satellites. System tests confirmed the accuracy of locating faults to within the design objective of +/-300 meters.
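
    The timing arithmetic behind the traveling-wave principle can be sketched as follows (a back-of-the-envelope illustration, not B.C. Hydro's implementation; the line length, propagation speed and arrival times are invented). Meeting the stated +/-300 m objective requires the GPS time tags at the two ends to agree to roughly a microsecond.

        # Hedged sketch: locate a fault from GPS-time-tagged surge arrivals at the two line ends.
        LINE_LENGTH_KM = 120.0        # hypothetical line length between substations A and B
        WAVE_SPEED_KM_PER_S = 2.9e5   # assumed surge speed, a little below the speed of light

        def fault_distance_from_a(t_a, t_b):
            """Distance (km) from substation A given synchronised arrival times (s) at A and B."""
            return 0.5 * (LINE_LENGTH_KM + WAVE_SPEED_KM_PER_S * (t_a - t_b))

        # Example: the wavefront reaches A 103 microseconds later than B, so the fault is nearer B.
        print("fault located %.2f km from substation A" % fault_distance_from_a(1.000403, 1.000300))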

  11. Seismicity patterns along the Ecuadorian subduction zone: new constraints from earthquake location in a 3-D a priori velocity model

    NASA Astrophysics Data System (ADS)

    Font, Yvonne; Segovia, Monica; Vaca, Sandro; Theunissen, Thomas

    2013-04-01

    To improve earthquake location, we create a 3-D a priori P-wave velocity model (3-DVM) that approximates the large velocity variations of the Ecuadorian subduction system. The 3-DVM is constructed from the integration of geophysical and geological data that depend on the structural geometry and velocity properties of the crust and the upper mantle. In addition, specific station selection is carried out to compensate for the high station density on the Andean Chain. 3-D synthetic experiments are then designed to evaluate the network capacity to recover the event position using only P arrivals and the MAXI technique. Three synthetic earthquake location experiments are proposed: (1) noise-free and (2) noisy arrivals used in the 3-DVM, and (3) noise-free arrivals used in a 1-DVM. Synthetic results indicate that, under the best conditions (exact arrival data set and 3-DVM), the spatiotemporal configuration of the Ecuadorian network can accurately locate 70 per cent of events in the frontal part of the subduction zone (average azimuthal gap is 289° ± 44°). With noisy P arrivals (up to ±0.3 s), 50 per cent of earthquakes can be accurately located. Processing earthquake location within a 1-DVM almost never allows accurate hypocentre positions for offshore earthquakes (15 per cent), which highlights the importance of using a 3-DVM in subduction zones. For the application to real data, the seismicity distribution from the 3-D-MAXI catalogue is also compared to the determinations obtained in a 1-D-layered VM. In addition to good-quality location uncertainties, the clustering and the depth distribution confirm the 3-D-MAXI catalogue reliability. The pattern of the seismicity distribution (a 13 yr record during the inter-seismic period of the seismic cycle) is compared to the pattern of rupture zone and asperity of the Mw = 7.9 1942 and the Mw = 7.7 1958 events (the Mw = 8.8 1906 asperity patch is not defined). We observe that the nucleation of the 1942, 1958 and 1906 events coincides with

  12. Location capability of a sparse regional network (RSTN) using a multi-phase earthquake location algorithm (REGLOC)

    SciTech Connect

    Hutchings, L.

    1994-01-01

    The Regional Seismic Test Network (RSTN) was deployed by the US Department of Energy (DOE) to determine whether data recorded by a regional network could be used to detect and accurately locate seismic events that might be clandestine nuclear tests. The purpose of this paper is to evaluate the location capability of the RSTN. A major part of this project was the development of the location algorithm REGLOC and the application of Bayesian a priori statistics for determining the accuracy of the location estimates. REGLOC utilizes all identifiable phases, including backazimuth, in the location. Ninety-four events, distributed throughout the network area, detected by the RSTN and located by local networks, were used in the study. The location capability of the RSTN was evaluated by estimating the location accuracy, error ellipse accuracy, and the percentage of events that could be located, as a function of magnitude. The location accuracy was verified by comparing the RSTN results for the 94 events with published locations based on data from the local networks. The error ellipse accuracy was evaluated by determining whether the error ellipse includes the actual location. The percentage of events located was assessed by combining detection capability with location capability to determine the percentage of events that could be located within the study area. Events were located both with an average crustal model for the entire region and with regional velocity models along with station corrections obtained from master events. Most events with a magnitude <3.0 can only be located with arrivals from one station. Their average location errors are 453 and 414 km for the average- and regional-velocity model locations, respectively. Single-station locations are very unreliable because they depend on accurate backazimuth estimates, and backazimuth proved to be a very unreliable computation.

  13. Hydrogen atoms can be located accurately and precisely by x-ray crystallography.

    PubMed

    Woińska, Magdalena; Grabowsky, Simon; Dominiak, Paulina M; Woźniak, Krzysztof; Jayatilaka, Dylan

    2016-05-01

    Precise and accurate structural information on hydrogen atoms is crucial to the study of energies of interactions important for crystal engineering, materials science, medicine, and pharmacy, and to the estimation of physical and chemical properties in solids. However, hydrogen atoms only scatter x-radiation weakly, so x-rays have not been used routinely to locate them accurately. Textbooks and teaching classes still emphasize that hydrogen atoms cannot be located with x-rays close to heavy elements; instead, neutron diffraction is needed. We show that, contrary to widespread expectation, hydrogen atoms can be located very accurately using x-ray diffraction, yielding bond lengths involving hydrogen atoms (A-H) that are in agreement with results from neutron diffraction mostly within a single standard deviation. The precision of the determination is also comparable between x-ray and neutron diffraction results. This has been achieved at resolutions as low as 0.8 Å using Hirshfeld atom refinement (HAR). We have applied HAR to 81 crystal structures of organic molecules and compared the A-H bond lengths with those from neutron measurements for A-H bonds sorted into bonds of the same class. We further show in a selection of inorganic compounds that hydrogen atoms can be located in bridging positions and close to heavy transition metals accurately and precisely. We anticipate that, in the future, conventional x-radiation sources at in-house diffractometers can be used routinely for locating hydrogen atoms in small molecules accurately instead of large-scale facilities such as spallation sources or nuclear reactors. PMID:27386545

  14. Hydrogen atoms can be located accurately and precisely by x-ray crystallography

    PubMed Central

    Woińska, Magdalena; Grabowsky, Simon; Dominiak, Paulina M.; Woźniak, Krzysztof; Jayatilaka, Dylan

    2016-01-01

    Precise and accurate structural information on hydrogen atoms is crucial to the study of energies of interactions important for crystal engineering, materials science, medicine, and pharmacy, and to the estimation of physical and chemical properties in solids. However, hydrogen atoms only scatter x-radiation weakly, so x-rays have not been used routinely to locate them accurately. Textbooks and teaching classes still emphasize that hydrogen atoms cannot be located with x-rays close to heavy elements; instead, neutron diffraction is needed. We show that, contrary to widespread expectation, hydrogen atoms can be located very accurately using x-ray diffraction, yielding bond lengths involving hydrogen atoms (A–H) that are in agreement with results from neutron diffraction mostly within a single standard deviation. The precision of the determination is also comparable between x-ray and neutron diffraction results. This has been achieved at resolutions as low as 0.8 Å using Hirshfeld atom refinement (HAR). We have applied HAR to 81 crystal structures of organic molecules and compared the A–H bond lengths with those from neutron measurements for A–H bonds sorted into bonds of the same class. We further show in a selection of inorganic compounds that hydrogen atoms can be located in bridging positions and close to heavy transition metals accurately and precisely. We anticipate that, in the future, conventional x-radiation sources at in-house diffractometers can be used routinely for locating hydrogen atoms in small molecules accurately instead of large-scale facilities such as spallation sources or nuclear reactors. PMID:27386545

  15. A hierarchical Bayesian approach for earthquake location and data uncertainty estimation in 3D heterogeneous media

    NASA Astrophysics Data System (ADS)

    Arroucau, Pierre; Custódio, Susana

    2015-04-01

    Solving inverse problems requires an estimate of data uncertainties. This usually takes the form of a data covariance matrix, which determines the shape of the model posterior distribution. Yet those uncertainties are not always known precisely, and it is common practice to simply set them to a fixed, reasonable value. In the case of earthquake location, the hypocentral parameters (longitude, latitude, depth and origin time) are typically inverted for using seismic phase arrival times. But quantitative data variance estimates are rarely provided. Instead, arrival time catalogs usually associate phase picks with a quality factor, which is subsequently interpreted more or less arbitrarily in terms of data uncertainty in the location procedure. Here, we present a hierarchical Bayesian algorithm for earthquake location in 3D heterogeneous media, in which not only the earthquake hypocentral parameters, but also the P- and S-wave arrival time uncertainties, are inverted for, hence allowing more realistic posterior model covariance estimates. Forward modeling is achieved by means of the Fast Marching Method (FMM), an eikonal solver which has the ability to take interfaces into account, so direct, reflected and refracted phases can be used in the inversion. We illustrate the ability of our algorithm to retrieve earthquake hypocentral parameters as well as data uncertainties through synthetic examples and using a subset of arrival time catalogs for mainland Portugal and its Atlantic margin.

  16. Seismicity and crustal structure studies of southern California: tectonic implications from improved earthquake locations

    SciTech Connect

    Corbett, E.J.

    1984-01-01

    The ML 5.1 Santa Barbara earthquake of 13 August 1978 was located 3 km southeast of Santa Barbara at a focal depth of 12.7 km. The temporal-spatial development of the aftershock zone may indicate that the initial rupture plane was considerably smaller than that of the eventual aftershock zone. The aftershock hypocenters outline a nearly horizontal plane (dipping 15° or less) at 13-km depth and the preferred focal mechanism indicates north-over-south thrusting. To further test the decollement hypothesis, Caltech catalog locations were reviewed to determine the depth distribution of earthquakes in the Transverse Ranges. The seismogenic zone is thickest along the southern front of the Transverse Ranges and is thinnest in the southern Mojave Desert and at the east end of the Transverse Ranges. The seismicity of the western Transverse Ranges is typified by north-dipping planar structures and the eastern Transverse Ranges are typified by pervasive seismicity extending down to the floor of the seismogenic zone. Data from a large quarry explosion on Catalina Island were utilized to derive a 3-layer Continental Borderland velocity structure to improve the locations of the 1981 Santa Barbara Island earthquakes. The Santa Barbara Island earthquake (ML 5.3) occurred on September 4, 1981. Aftershocks exhibited a clear northwest-southeast alignment that coincides with the submarine escarpment of the Santa Cruz-Catalina fault and was consistent with focal mechanisms.

  17. A Hierarchical Bayesian Approach for Earthquake Location and Data Uncertainty Estimation in 3D Heterogeneous Media

    NASA Astrophysics Data System (ADS)

    Arroucau, P.; Custodio, S.

    2014-12-01

    Solving inverse problems requires an estimate of data uncertainties. This usually takes the form of a data covariance matrix, which determines the shape of the model posterior distribution. Yet those uncertainties are not always known precisely, and it is common practice to simply set them to a fixed, reasonable value. In the case of earthquake location, the hypocentral parameters (longitude, latitude, depth and origin time) are typically inverted for using seismic phase arrival times. But quantitative data variance estimates are rarely provided. Instead, arrival time catalogs usually associate phase picks with a quality factor, which is subsequently interpreted more or less arbitrarily in terms of data uncertainty in the location procedure. Here, we present a hierarchical Bayesian algorithm for earthquake location in 3D heterogeneous media, in which not only the earthquake hypocentral parameters, but also the P- and S-wave arrival time uncertainties, are inverted for, hence allowing more realistic posterior model covariance estimates. Forward modeling is achieved by means of the Fast Marching Method (FMM), an eikonal solver which has the ability to take interfaces into account, so direct, reflected and refracted phases can be used in the inversion. We illustrate the ability of our algorithm to retrieve earthquake hypocentral parameters as well as data uncertainties through synthetic examples and using a subset of arrival time catalogs for mainland Portugal and its Atlantic margin.

  18. A Double-difference Earthquake location algorithm: Method and application to the Northern Hayward Fault, California

    USGS Publications Warehouse

    Waldhauser, F.; Ellsworth, W.L.

    2000-01-01

    We have developed an efficient method to determine high-resolution hypocenter locations over large distances. The location method incorporates ordinary absolute travel-time measurements and/or cross-correlation P- and S-wave differential travel-time measurements. Residuals between observed and theoretical travel-time differences (or double-differences) are minimized for pairs of earthquakes at each station while linking together all observed event-station pairs. A least-squares solution is found by iteratively adjusting the vector difference between hypocentral pairs. The double-difference algorithm minimizes errors due to unmodeled velocity structure without the use of station corrections. Because catalog and cross-correlation data are combined into one system of equations, interevent distances within multiplets are determined to the accuracy of the cross-correlation data, while the relative locations between multiplets and uncorrelated events are simultaneously determined to the accuracy of the absolute travel-time data. Statistical resampling methods are used to estimate data accuracy and location errors. Uncertainties in double-difference locations are improved by more than an order of magnitude compared to catalog locations. The algorithm is tested, and its performance is demonstrated on two clusters of earthquakes located on the northern Hayward fault, California. There it collapses the diffuse catalog locations into sharp images of seismicity and reveals horizontal lineations of hypocenters that define the narrow regions on the fault where stress is released by brittle failure.
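
    A minimal numerical sketch may make the double-difference system of equations concrete (this is not the published algorithm or the hypoDD code: one Gauss-Newton step on a single event pair in a constant-velocity medium, with invented geometry and catalog-style differential times):

        # Hedged sketch: double-difference residuals for one event pair, solved by least squares.
        import numpy as np

        V = 5.0                                            # assumed uniform P velocity (km/s)
        stations = np.array([[30.0, 5.0], [-20.0, 25.0], [10.0, -35.0], [-15.0, -10.0]])

        def tt(event, sta):
            """Straight-ray travel time plus origin time; event = (x, y, t0) in km and s."""
            return np.hypot(*(sta - event[:2])) / V + event[2]

        true_events = np.array([[0.0, 0.0, 0.0], [1.5, -1.0, 0.2]])   # "unknown" truth
        guess = np.array([[0.5, 0.8, 0.05], [0.5, 0.8, 0.05]])        # poor starting locations

        rows, rhs = [], []
        for sta in stations:
            # double-difference: (observed differential time) - (calculated differential time)
            dd = (tt(true_events[0], sta) - tt(true_events[1], sta)) \
               - (tt(guess[0], sta) - tt(guess[1], sta))
            grads = []
            for e in guess:                                # partial derivatives of travel time
                d = np.hypot(*(sta - e[:2]))
                grads.append([-(sta[0] - e[0]) / (V * d), -(sta[1] - e[1]) / (V * d), 1.0])
            # one row per event pair and station: +partials of event 0, -partials of event 1
            rows.append(np.concatenate([grads[0], -np.array(grads[1])]))
            rhs.append(dd)

        # Only the relative vector is constrained; lstsq returns the minimum-norm update.
        dm, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
        updated = guess + dm.reshape(2, 3)
        print("relative vector, true   :", true_events[0] - true_events[1])
        print("relative vector, updated:", updated[0] - updated[1])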

  19. Assessing the benefit of 3D a priori models for earthquake location

    NASA Astrophysics Data System (ADS)

    Tilmann, F. J.; Manzanares, A.; Peters, K.; Kahle, R. L.; Lange, D.; Saul, J.; Nooshiri, N.

    2014-12-01

    Earthquake location in 1D Earth models is a routine procedure. Particularly in environments such as subduction zones, where the network geometry is biased and lateral velocity variations are large, the use of a 1D model can lead to strongly biased solutions. This is well known, and it is therefore usually preferred to use three-dimensional models, e.g. from local earthquake tomography. Efficient codes for earthquake location in 3D models are available for routine use, for example NonLinLoc. However, tomographic studies are time-consuming to carry out, and a sufficient amount of data might not always be available. In many cases, though, information about the three-dimensional velocity structure is available in the form of refraction surveys or other constraints such as gravity or receiver-function-based models. Failing that, global or regional scale crustal models could be employed. However, it is not obvious that models derived using different types of data lead to better location results than an optimised 1D velocity model. On the other hand, correct interpretation of seismicity patterns often requires comparison and exact positioning within pre-existing velocity models. In this presentation we draw on examples from the Chilean and Sumatran margins as well as mid-ocean ridge environments, using both data and synthetic examples to investigate under what conditions the use of a priori 3D models is expected to result in improved locations and modified interpretations. Furthermore, we introduce MATLAB tools that facilitate the creation of three-dimensional models suitable for earthquake location from refraction profiles, CRUST1 and SLAB1.0 and other model types.

  20. P-wave propagation heterogeneity and earthquake location in the Mediterranean region

    NASA Astrophysics Data System (ADS)

    Piromallo, Claudia; Morelli, Andrea

    1998-10-01

    We analyse P-wave traveltimes for the Mediterranean area, using both teleseismic and regional arrivals for shallow earthquakes reported in the Bulletins of the International Seismological Centre. We model delays between pairs of 0.5° × 0.5° cells, obtaining a detailed representation of the P traveltime heterogeneities. Examination of these anomalies shows the clear presence of geographically coherent patterns-consistent with known geological features-due to significant structure in the upper mantle. We present a scheme, based on an empirical heterogeneity correction (EHC) to P-wave traveltimes, to improve earthquake location. This method provides similar benefits to those of a location procedure based on ray tracing in a 3-D model, but it is simpler and computationally more efficient. The definition of the traveltime heterogeneity model, being based on a statistical procedure, bypasses most of the critical points and possible instabilities involved in model inversion. EHC relocation, applied to Mediterranean earthquakes, allows one to predict about 70 per cent of the estimated signal due to heterogeneity and produces epicentral and origin time-shifts of, respectively, 4.22 km and 0.35 s (rms). From a synthetic experiment, in which we use the proposed algorithm to retrieve known source locations, we estimate that the rms improvement achieved by the EHC relocation over a simpler, standard, 1-D location is more than 20 per cent for both epicentral mislocation and origin time-shifts.

  1. Testing continuous earthquake detection and location in Alentejo (South Portugal) by waveform coherency analysis

    NASA Astrophysics Data System (ADS)

    Matos, Catarina; Grigoli, Francesco; Cesca, Simone; Custódio, Susana

    2015-04-01

    In the last decade a permanent seismic network of 30 broadband stations, complemented by dense temporary deployments, covered Portugal. This extraordinary network coverage now enables the computation of a high-resolution image of the seismicity of Portugal, which in turn will shed light on the seismotectonics of Portugal. The large data volumes available cannot be analyzed by traditional time-consuming manual location procedures. In this presentation we show first results on the automatic detection and location of earthquakes that occurred in a selected region in the south of Portugal. Our main goal is to implement an automatic earthquake detection and location routine in order to have a tool to quickly process large data sets, while at the same time detecting low magnitude earthquakes (i.e., lowering the detection threshold). We present a modified version of the automatic seismic event location by waveform coherency analysis developed by Grigoli et al. (2013, 2014), designed to perform earthquake detections and locations in continuous data. The event detection is performed by continuously computing the short-term-average/long-term-average of two different characteristic functions (CFs). For the P phases we used a CF based on the vertical energy trace, while for S phases we used a CF based on the maximum eigenvalue of the instantaneous covariance matrix (Vidale 1991). Seismic event detection and location is obtained by performing waveform coherence analysis scanning different hypocentral coordinates. We apply this technique to earthquakes in the Alentejo region (South Portugal), taking advantage of a small-aperture seismic network installed in the south of Portugal for two years (2010 - 2011) during the DOCTAR experiment. In addition to the good network coverage, the Alentejo region was chosen for its simple tectonic setting and also because the relationship between seismicity, tectonics and local lithospheric structure is intriguing and still poorly understood. Inside
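
    The STA/LTA detection step mentioned above can be illustrated with a toy example (not the authors' routine; window lengths, threshold and the synthetic trace are arbitrary choices):

        # Hedged sketch: moving-window STA/LTA on an energy characteristic function.
        import numpy as np

        def sta_lta(cf, n_sta, n_lta):
            """Ratio of short-term to long-term average of a non-negative characteristic function."""
            ratio = np.zeros(len(cf))
            for i in range(n_lta, len(cf)):
                sta = cf[i - n_sta:i].mean()
                lta = cf[i - n_lta:i].mean()
                ratio[i] = sta / max(lta, 1e-12)
            return ratio

        rng = np.random.default_rng(1)
        fs = 100                                            # sampling rate (Hz)
        trace = rng.normal(0.0, 1.0, 60 * fs)               # 60 s of noise
        trace[3000:3200] += 8.0 * np.sin(2 * np.pi * 5.0 * np.arange(200) / fs)  # "event" at 30 s

        cf = trace ** 2                                     # vertical-energy-style CF
        ratio = sta_lta(cf, n_sta=1 * fs, n_lta=10 * fs)

        trigger_on = 4.0
        onsets = np.flatnonzero((ratio[1:] >= trigger_on) & (ratio[:-1] < trigger_on)) + 1
        print("triggers at t =", onsets / fs, "s")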

  2. An Improved Source-Scanning Algorithm for Locating Earthquake Clusters or Aftershock Sequences

    NASA Astrophysics Data System (ADS)

    Liao, Y.; Kao, H.; Hsu, S.

    2010-12-01

    The Source-scanning Algorithm (SSA) was originally introduced in 2004 to locate non-volcanic tremors. Its application was later expanded to the identification of earthquake rupture planes and the near-real-time detection and monitoring of landslides and mud/debris flows. In this study, we further improve SSA for the purpose of locating earthquake clusters or aftershock sequences when only a limited number of waveform observations are available. The main improvements include the application of a ground motion analyzer to separate P and S waves, the automatic determination of resolution based on the grid size and time step of the scanning process, and a modified brightness function to utilize constraints from multiple phases. Specifically, the improved SSA (named ISSA) addresses two major issues related to locating earthquake clusters/aftershocks. The first is the massive amount of time and labour required to locate a large number of seismic events manually. The second is to efficiently and correctly identify the same phase across the entire recording array when multiple events occur closely in time and space. To test the robustness of ISSA, we generate synthetic waveforms consisting of 3 separate events such that individual P and S phases arrive at different stations in different order, thus making correct phase picking nearly impossible. Using these very complicated waveforms as the input, the ISSA scans the entire model space for possible combinations of time and location of seismic sources. The scanning results successfully associate various phases from each event at all stations, and correctly recover the input. To further demonstrate the advantage of ISSA, we apply it to the waveform data collected by a temporary OBS array for the aftershock sequence of an offshore earthquake southwest of Taiwan. The overall signal-to-noise ratio is inadequate for locating small events; and the precise arrival times of P and S phases are difficult to
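
    The brightness-stacking idea that underlies source scanning can be sketched in a few lines (a toy version, not the ISSA implementation; the constant velocity, geometry and synthetic characteristic functions are invented):

        # Hedged sketch: stack station CFs at predicted arrival times over a grid of trial sources.
        import numpy as np

        rng = np.random.default_rng(2)
        fs, v = 50, 6.0                                     # samples per second, km/s
        stations = np.array([[40.0, 0.0], [-30.0, 20.0], [0.0, -45.0], [25.0, 35.0]])
        true_src, true_t0 = np.array([5.0, -10.0]), 4.0     # hidden source used to build the data

        n = 30 * fs
        cfs = np.abs(rng.normal(0.0, 0.3, (len(stations), n)))          # noise-level CFs
        for s, sta in enumerate(stations):
            i = int(round((true_t0 + np.hypot(*(sta - true_src)) / v) * fs))
            cfs[s, i:i + 10] += 3.0                                      # transient at the P arrival

        def brightness(src, t0):
            """Sum of station CF values sampled at the arrival times predicted for (src, t0)."""
            total = 0.0
            for s, sta in enumerate(stations):
                i = int(round((t0 + np.hypot(*(sta - src)) / v) * fs))
                if 0 <= i < n:
                    total += cfs[s, i]
            return total

        best = max(((brightness(np.array([x, y], dtype=float), t0), x, y, t0)
                    for x in range(-50, 51, 5)
                    for y in range(-50, 51, 5)
                    for t0 in np.arange(0.0, 10.01, 0.5)), key=lambda node: node[0])
        print("brightest node: x=%d km, y=%d km, t0=%.1f s (brightness %.2f)"
              % (best[1], best[2], best[3], best[0]))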

  3. ISC-GEM: Global Instrumental Earthquake Catalogue (1900-2009) I. Location and Seismicity Patterns

    NASA Astrophysics Data System (ADS)

    Bondar, I.; Engdahl, E. R.; Villasenor, A.; Storchak, D. A.

    2012-12-01

    We present the final results of a two-year project sponsored by the GEM (Global Earthquake Model) Foundation. The ISC-GEM global catalogue consists of some 19 thousand instrumentally recorded, moderate to large earthquakes, spanning 110 years of seismicity. We relocated all events in the catalogue using a two-tier approach. The EHB location methodology (Engdahl et al., 1998) was applied first to obtain improved hypocentres with special focus on the depth determination. The locations were further refined in the next step by fixing the depths to those from the EHB analysis and applying the new ISC location algorithm (Bondár and Storchak, 2011) that reduces location bias by accounting for correlated travel-time prediction error structure. To facilitate the relocation effort, some 900,000 seismic P and S wave arrival-time data were added to the ISC database for the period between 1904 and 1963, either from original station bulletins in the ISC archive or by digitizing the scanned images of the ISS bulletin (Villaseñor and Engdahl, 2005; 2007). Although no substantial amount of new phase data were acquired for the modern period (1964-2009), the number of phases used in the location has still increased by 3 million, owing to the fact that both the EHB and ISC locators use all ak135 (Kennett et al., 1995) phases in the location. We show that the relocation effort yielded substantially improved locations, especially in the first half of the 20th century; we demonstrate significant improvements in focal depth estimates in subduction zones and other seismically active regions; and we show that the ISC-GEM catalogue provides an improved view of 110 years of global seismicity of the Earth. The ISC-GEM Global Instrumental Earthquake Catalogue represents the final product of one of the ten global components in the GEM program, and will be made available to researchers at the ISC (www.isc.ac.uk) website.

  4. Modelling earthquake location errors at a reservoir scale: a case study in the Upper Rhine Graben

    NASA Astrophysics Data System (ADS)

    Kinnaert, X.; Gaucher, E.; Achauer, U.; Kohl, T.

    2016-05-01

    Earthquake absolute location errors which can be encountered in an underground reservoir are investigated. In such an exploitation context, earthquake hypocentre errors can have an impact on field development and economic consequences. The approach, using state-of-the-art techniques, covers both the location uncertainty and the location inaccuracy (or bias) problems. It consists, first, in creating a 3D synthetic seismic cloud of events in the reservoir and calculating the seismic travel times to a monitoring network assuming certain propagation conditions. In a second phase, the earthquakes are relocated with assumptions different from the initial conditions. Finally, the initial and relocated hypocentres are compared. As a result, location errors driven by the seismic onset time picking uncertainties and inaccuracies are quantified in 3D. Effects induced by erroneous assumptions associated with the velocity model are also modelled. In particular, 1D velocity model uncertainties, a local 3D perturbation of the velocity and a 3D geo-structural model are considered. The present approach is applied to the site of Rittershoffen (Alsace, France), which is one of the deep geothermal fields existing in the Upper Rhine Graben. This example allows setting realistic scenarios based on the knowledge of the site. In that case, the zone of interest, monitored by an existing seismic network, ranges between 1 and 5 km depth in a radius of 2 km around a geothermal well. Well log data provided a reference 1D velocity model used for the synthetic earthquake relocation. The 3D analysis highlights the role played by the seismic network coverage and the velocity model in the amplitude and orientation of the location uncertainties and inaccuracies at subsurface levels. The location errors are neither isotropic nor aleatoric in the zone of interest. This suggests that although location inaccuracies may be smaller than location uncertainties, both quantities can have a cumulative
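
    As a rough illustration of the synthetic workflow described above, the sketch below forward-models P traveltimes for one event with a "true" velocity, adds picking noise, relocates the event by grid search with a deliberately biased velocity, and reports the resulting hypocentre bias. All numbers are invented and are not taken from the Rittershoffen case study.

```python
# Illustrative sketch of the synthetic error-modelling workflow: forward P
# traveltimes with a "true" velocity, relocation by grid search with a
# deliberately biased velocity and noisy picks, then comparison of hypocentres.
# All values (geometry, velocities, noise) are invented, not site data.
import numpy as np

rng = np.random.default_rng(0)
stations = rng.uniform(-2.0, 2.0, size=(8, 2))        # surface stations, x/y in km
true_hypo = np.array([0.3, -0.4, 2.5])                 # x, y, z (km)
v_true, v_wrong, pick_sigma = 5.8, 6.1, 0.02           # km/s, km/s, s

def ptimes(hypo, v):
    """P traveltimes from a hypocentre to all surface stations (constant velocity)."""
    dx = stations - hypo[:2]
    return np.sqrt((dx ** 2).sum(axis=1) + hypo[2] ** 2) / v

obs = ptimes(true_hypo, v_true) + rng.normal(0.0, pick_sigma, len(stations))

# Relocate on a coarse grid using the wrong velocity; origin time is absorbed
# by demeaning the residuals.
xs = ys = np.linspace(-1.0, 1.0, 41)
zs = np.linspace(1.0, 5.0, 41)
best, best_rms = None, np.inf
for x in xs:
    for y in ys:
        for z in zs:
            res = obs - ptimes(np.array([x, y, z]), v_wrong)
            res -= res.mean()
            rms = np.sqrt((res ** 2).mean())
            if rms < best_rms:
                best, best_rms = np.array([x, y, z]), rms

print("relocated hypocentre:", best, " location bias (km):", best - true_hypo)
```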

  5. Modelling earthquake location errors at a reservoir scale: a case study in the Upper Rhine Graben

    NASA Astrophysics Data System (ADS)

    Kinnaert, X.; Gaucher, E.; Achauer, U.; Kohl, T.

    2016-08-01

    Earthquake absolute location errors which can be encountered in an underground reservoir are investigated. In such an exploitation context, earthquake hypocentre errors can have an impact on field development and economic consequences. The approach, using state-of-the-art techniques, covers both the location uncertainty and the location inaccuracy (or bias) problems. It consists, first, in creating a 3-D synthetic seismic cloud of events in the reservoir and calculating the seismic traveltimes to a monitoring network assuming certain propagation conditions. In a second phase, the earthquakes are relocated with assumptions different from the initial conditions. Finally, the initial and relocated hypocentres are compared. As a result, location errors driven by the seismic onset time picking uncertainties and inaccuracies are quantified in 3-D. Effects induced by erroneous assumptions associated with the velocity model are also modelled. In particular, 1-D velocity model uncertainties, a local 3-D perturbation of the velocity and a 3-D geostructural model are considered. The present approach is applied to the site of Rittershoffen (Alsace, France), which is one of the deep geothermal fields existing in the Upper Rhine Graben. This example allows setting realistic scenarios based on the knowledge of the site. In that case, the zone of interest, monitored by an existing seismic network, ranges between 1 and 5 km depth in a radius of 2 km around a geothermal well. Well log data provided a reference 1-D velocity model used for the synthetic earthquake relocation. The 3-D analysis highlights the role played by the seismic network coverage and the velocity model in the amplitude and orientation of the location uncertainties and inaccuracies at subsurface levels. The location errors are neither isotropic nor aleatoric in the zone of interest. This suggests that although location inaccuracies may be smaller than location uncertainties, both quantities can have a

  6. Analysis of Geomagnetic Variations Related to Earthquakes Location: Occurred in and around the Korean Peninsula from 2012 to 2014

    NASA Astrophysics Data System (ADS)

    Min, D.; Oh, S.; Hong, J.

    2015-12-01

    This study presents a correlation analysis of geomagnetic variations related to the locations of earthquakes that occurred in and around the Korean Peninsula from 2012 to 2014. A wavelet-based semblance technique was used to confirm the geomagnetic variations related to earthquakes. As a result of the analysis, a pattern of consistent geomagnetic variations was found for earthquakes that occurred within a 100 km radius of the observation site, and a similar correlation between earthquake location and Z-field geomagnetic data was also confirmed by the wavelet-based semblance analysis of the geomagnetic data. The analysis mainly used geomagnetic data obtained from the Cheong-yang observatory, which are of high quality. Geomagnetic variations from earthquakes with magnitude greater than 3 within a 100 km radius of the Cheong-yang observatory (figure 1) showed meaningful results. In addition, geomagnetic data from the Bohyunsan observatory were also used to verify the validity of the correlation between earthquake location and Z-field geomagnetic data.

  7. Earthquake!

    ERIC Educational Resources Information Center

    Markle, Sandra

    1987-01-01

    A learning unit about earthquakes includes activities for primary grade students, including making inferences and defining operationally. Task cards are included for independent study on earthquake maps and earthquake measuring. (CB)

  8. Earthquakes

    MedlinePlus

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a ...

  9. Earthquakes

    MedlinePlus

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  10. Double-Difference Earthquake Locations Using imaging Magma Under St. Helens (iMUSH) Data

    NASA Astrophysics Data System (ADS)

    Williams, M. C. B.; Ulberg, C. W.; Creager, K. C.

    2015-12-01

    The imaging Magma Under St. Helens (iMUSH) project deployed a magnetotelluric survey, a high-resolution active-source experiment and a two-year passive-source experiment, and gathered geochemical-petrological data to better understand the magmatic architecture of Mount St. Helens. A primary goal of the passive-source experiment is to create 3-D P-wave and S-wave velocity models under the volcano from the surface to the slab. We use hypoDD, a double-difference algorithm, to obtain high-precision relative earthquake locations for several hundred events within tens of kilometers of the Mount St. Helens crater. We use data from the first half (June 2014 to July 2015) of the two-year passive-source component of the iMUSH array, which recorded six hundred usable earthquakes with a high event density near the volcanic crater. The array includes seventy evenly spaced broadband seismometers continuously sampling at 50 Hz within a 50 km radius of Mount St. Helens, and is augmented by dozens of permanent network stations. Precise relative earthquake locations are determined for spatially clustered hypocenters using a combination of hand-picked P-wave arrivals and high-precision relative times determined by cross-correlation of waveforms recorded at a common station for event pairs, using a 1-D velocity structure. These high-quality relative times will also be used to help constrain seismic tomography models. We will interpret earthquake clusters in the context of emerging 3-D wave-speed models from the active-source and passive-source observations. We are examining the relationship between hypocentral locations and regions of partial melt, as well as the relationship between hypocentral locations and the NNW-SSE trending St. Helens seismic zone.
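
    The double-difference inputs mentioned here are relative arrival times for event pairs observed at a common station. A minimal sketch of how such a differential time can be measured by waveform cross-correlation follows; the synthetic wavelet, sampling rate and time shift are assumptions for illustration, not iMUSH data.

```python
# Hedged sketch: measuring a relative arrival time between two events at a
# common station by waveform cross-correlation, the kind of differential time
# used as double-difference input. Wavelet, sampling rate and shift are
# synthetic assumptions, not iMUSH data.
import numpy as np

def cc_delay(trace_a, trace_b, dt):
    """Return the delay (s) of trace_b relative to trace_a at the correlation peak,
    plus an approximate correlation coefficient."""
    a = (trace_a - trace_a.mean()) / np.std(trace_a)
    b = (trace_b - trace_b.mean()) / np.std(trace_b)
    cc = np.correlate(a, b, mode="full") / len(a)
    lag = np.argmax(cc) - (len(b) - 1)
    return -lag * dt, float(cc.max())

dt = 0.02                                     # 50 Hz sampling, as in the array
t = np.arange(0.0, 2.0, dt)
wavelet = np.exp(-((t - 0.5) / 0.05) ** 2) * np.sin(2 * np.pi * 8 * (t - 0.5))
shifted = np.interp(t - 0.06, t, wavelet)     # second event arrives 0.06 s later
delay, coeff = cc_delay(wavelet, shifted, dt)
print(f"differential time {delay:.3f} s, correlation {coeff:.2f}")
```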

  11. Structure of the subducted Cocos Plate from locations of intermediate-depth earthquakes

    NASA Astrophysics Data System (ADS)

    Lomnitz, C.; Rodríguez-Padilla, L. D.; Castaños, H.

    2013-05-01

    Locations of 3,000 earthquakes at 40 to 300 km depth are used to define the 3-D structure of the subducted Cocos Plate under central and southern Mexico. Discrepancies between deep-seated lineaments and surface tectonics are described. Features of particular interest include: (1) a belt of moderate activity at 40 to 80 km depth that parallels the southern boundary of the Mexican Volcanic Plateau; (2) an offset of 150 km across the Isthmus of Tehuantepec, where all seismic activity is displaced toward the northeast; (3) three nests of frequent, deep-seated events (80 to 300 km depth) under southern Veracruz, Chiapas and the Mexico-Guatemala coast. The active subduction process is sharply delimited along a NW-SE lineament from the Yucatan Peninsula, which shows insignificant earthquake activity. The focal distribution of intermediate-depth earthquakes in south-central Mexico provides evidence of stepwise deepening of the subduction angle along the Trench, from 15 degrees under Michoacan-Guerrero to 45 degrees under NW Guatemala. Historical evidence suggests that the hazard to Mexico City from large intermediate-depth earthquakes may have been underestimated.

  12. Reliability of the automatic procedures for locating earthquakes in southwestern Alps and northern Apennines (Italy)

    NASA Astrophysics Data System (ADS)

    Turino, Chiara; Morasca, Paola; Ferretti, Gabriele; Scafidi, Davide; Spallarossa, Daniele

    2010-04-01

    A reliable automatic procedure for locating earthquakes in quasi-real time is strongly needed for seismic warning systems, earthquake preparedness, and producing shaking maps. The reliability of an automatic location algorithm is influenced by several factors such as errors in picking seismic phases, network geometry, and velocity model uncertainties. The main purpose of this work is to investigate the performance of different automatic procedures in order to choose the most suitable one for quasi-real-time earthquake location in northwestern Italy. The reliability of two automatic picking algorithms (one based on characteristic function (CF) analysis, the CF picker, and the other based on the Akaike information criterion, the AIC picker) and two location methods (the “Hypoellipse” and “NonLinLoc” codes) is analysed by comparing the automatically determined hypocentral coordinates with reference ones. Reference locations are computed by the “Hypoellipse” code from manually revised data and tested using quarry blasts. The comparison is made on a dataset of 575 seismic events recorded by the Regional Seismic Network of Northwestern Italy during the period 2000-2007. For P phases, similar results, in terms of both the number of detected picks and the magnitude of travel-time differences with respect to manual picks, are obtained with the AIC and CF pickers; on the contrary, for S phases, the AIC picker provides a significantly greater number of readings than the CF picker. Furthermore, the “NonLinLoc” software (applied to a 3D velocity model) proves to be more reliable than the “Hypoellipse” code (applied to layered 1D velocity models), leading to more reliable automatic locations even when outliers (wrong picks) are present.
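
    For reference, the AIC picker compared in this study can be sketched in a few lines: a two-segment Akaike information criterion is evaluated at every sample of a window containing the onset, and the pick is taken at its minimum. The formulation below is a common one often attributed to Maeda (1985); the trace is synthetic and practical details such as window selection and pre-filtering are omitted.

```python
# Minimal two-segment AIC onset picker of the kind compared in this study
# (a common formulation often attributed to Maeda, 1985); window selection
# and pre-filtering, which matter in practice, are omitted. Toy data only.
import numpy as np

def aic_pick(x):
    """Return the sample index that minimises AIC(k) = k*log(var(x[:k]))
    + (n-k-1)*log(var(x[k:]))."""
    n = len(x)
    aic = np.full(n, np.inf)
    for k in range(1, n - 1):
        var1, var2 = np.var(x[:k]), np.var(x[k:])
        if var1 > 0 and var2 > 0:
            aic[k] = k * np.log(var1) + (n - k - 1) * np.log(var2)
    return int(np.argmin(aic))

rng = np.random.default_rng(1)
trace = rng.normal(0.0, 0.1, 800)          # pre-event noise
trace[400:] += rng.normal(0.0, 1.0, 400)   # stronger arrival at sample 400
print("picked onset at sample", aic_pick(trace))
```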

  13. Accurate Damage Location in Complex Composite Structures and Industrial Environments using Acoustic Emission

    NASA Astrophysics Data System (ADS)

    Eaton, M.; Pearson, M.; Lee, W.; Pullin, R.

    2015-07-01

    The ability to accurately locate damage in any given structure is a highly desirable attribute for an effective structural health monitoring system and could help to reduce operating costs and improve safety. This becomes a far greater challenge in complex geometries and materials, such as modern composite airframes. The poor translation of promising laboratory-based SHM demonstrators to industrial environments forms a barrier to commercial uptake of the technology. The acoustic emission (AE) technique is a passive NDT method that detects elastic stress waves released by the growth of damage. It offers very sensitive damage detection, using a sparse array of sensors to detect and globally locate damage within a structure. However, its application to complex structures commonly yields poor accuracy due to anisotropic wave propagation and the interruption of wave propagation by structural features such as holes and thickness changes. This work adopts an empirical mapping technique for AE location, known as Delta T Mapping, which uses experimental training data to account for such structural complexities. The technique is applied to a complex-geometry composite aerospace structure undergoing certification testing. The component consists of a carbon fibre composite tube with varying wall thickness and multiple holes, which was loaded in bending. The damage location was validated using X-ray CT scanning, and the Delta T Mapping technique was shown to improve location accuracy when compared with commercial algorithms. The onset and progression of damage were monitored throughout the test and used to inform future design iterations.
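
    The core idea of Delta T Mapping is that arrival-time differences between sensor pairs, mapped over a grid of training sources, are later matched against the differences observed for a new event. A rough sketch follows; here the training delta-T values are computed from an assumed constant wave speed rather than measured experimentally, and the sensor layout and grid are invented.

```python
# Rough sketch of the Delta T Mapping idea: arrival-time differences between
# sensor pairs, stored for training sources on a grid, are matched against
# the differences observed for a new event. Here the training values are
# computed from an assumed constant wave speed instead of being measured,
# and the sensor layout and grid are invented.
import numpy as np

sensors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])   # m
pairs = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
v = 5000.0                                                              # m/s (assumed)

# "Training" phase: delta-T map for every grid node.
gx, gy = np.meshgrid(np.linspace(0, 1, 51), np.linspace(0, 1, 51))
nodes = np.column_stack([gx.ravel(), gy.ravel()])
tt = np.linalg.norm(nodes[:, None, :] - sensors[None, :, :], axis=2) / v
delta_map = np.stack([tt[:, i] - tt[:, j] for i, j in pairs], axis=1)

# "Testing" phase: observed delta-T from an unknown source, with timing noise.
src = np.array([0.37, 0.62])
obs_tt = np.linalg.norm(sensors - src, axis=1) / v
obs_dt = np.array([obs_tt[i] - obs_tt[j] for i, j in pairs])
obs_dt += np.random.default_rng(2).normal(0.0, 1e-6, len(pairs))

best = nodes[np.argmin(((delta_map - obs_dt) ** 2).sum(axis=1))]
print("estimated source location (m):", best)
```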

  14. Comparison of ground truth location of earthquake from InSAR and from ambient seismic noise: A case study of the 1998 Zhangbei earthquake

    NASA Astrophysics Data System (ADS)

    Xie, Jun; Zeng, Xiangfang; Chen, Weiwen; Zhan, Zhongwen

    2011-04-01

    Because ambient seismic noise provides the estimated Green's function (EGF) between two sites with high accuracy, Rayleigh wave propagation along the path connecting the two sites is well resolved. Therefore, earthquakes that are close to one seismic station can be well located with calibration extracted from the EGF. We test two algorithms for locating the 1998 Zhangbei earthquake, one waveform-based and the other traveltime-based. We first compute EGFs between station ZHB (a station about 40 km away from the epicenter) and five IC/IRIS stations. With the waveform-based approach, we calculate 1D synthetic single-force Green's functions between ZHB and the other four stations, and obtain traveltime corrections by correlating the synthetic Green's functions with the EGFs in the period band of 10-30 s. We then locate the earthquake by minimizing the differential travel times between the observed earthquake waveforms and 1D synthetic earthquake waveforms computed with the focal mechanism provided by the Global CMT, after traveltime correction from the EGFs. This waveform-based approach yields a location about 13 km away from the location observed with InSAR. With the traveltime-based approach, we begin by measuring group velocity from the EGFs as well as group arrival times on the observed earthquake waveforms, and then locate the earthquake by minimizing the difference between the observed group arrival times and the arrival times measured on the EGFs. This traveltime-based approach yields an accuracy of 3 km; therefore, it is feasible to achieve GT5 (ground truth location with 5 km accuracy) with ambient seismic noise. The lower accuracy of the waveform-based approach was mainly caused by uncertainty in the focal mechanism.

  15. Probabilistic evaluation of earthquake detection and location capability for Illinois, Indiana, Kentucky, Ohio, and West Virginia

    SciTech Connect

    Mauk, F.J.; Christensen, D.H.

    1980-09-01

    Probabilistic estimations of earthquake detection and location capabilities for the states of Illinois, Indiana, Kentucky, Ohio and West Virginia are presented in this document. The algorithm used in these epicentrality and minimum-magnitude estimations is a version of the program NETWORTH by Wirth, Blandford, and Husted (DARPA Order No. 2551, 1978) which was modified for local array evaluation at the University of Michigan Seismological Observatory. Estimations of earthquake detection capability for the years 1970 and 1980 are presented in four regional minimum mb magnitude contour maps. Regional 90% confidence error ellipsoids are included for mb magnitude events from 2.0 through 5.0 at 0.5 mb unit increments. The close agreement between these predicted epicentral 90% confidence estimates and the calculated error ellipses associated with actual earthquakes within the studied region suggests that these error determinations can be used to estimate the reliability of epicenter location. 8 refs., 14 figs., 2 tabs.

  16. Distributed fiber sensing system with wide frequency response and accurate location

    NASA Astrophysics Data System (ADS)

    Shi, Yi; Feng, Hao; Zeng, Zhoumo

    2016-02-01

    A distributed fiber sensing system merging a Mach-Zehnder interferometer and a phase-sensitive optical time domain reflectometer (Φ-OTDR) is demonstrated for vibration measurement, which requires wide frequency response and accurate location. Two narrow-linewidth lasers with slightly different wavelengths are used to constitute the interferometer and the reflectometer, respectively. A narrow-band fiber Bragg grating is responsible for separating the two wavelengths. In addition, heterodyne detection is applied to maintain the signal-to-noise ratio of the locating signal. Experimental results show that the novel system has a wide frequency response from 1 Hz to 50 MHz, limited by the sample frequency of the data acquisition card, and a spatial resolution of 20 m, corresponding to a 200 ns pulse width, along a 2.5 km fiber link.

  17. Earthquakes.

    ERIC Educational Resources Information Center

    Walter, Edward J.

    1977-01-01

    Presents an analysis of the causes of earthquakes. Topics discussed include (1) geological and seismological factors that determine the effect of a particular earthquake on a given structure; (2) description of some large earthquakes such as the San Francisco quake; and (3) prediction of earthquakes. (HM)

  18. Earthquakes.

    ERIC Educational Resources Information Center

    Pakiser, Louis C.

    One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake…

  19. Delineating the Fault Planes of the 2006 Pingtung Doublet Earthquakes by Aftershock Locations

    NASA Astrophysics Data System (ADS)

    Liao, Y.; Hsu, S.

    2011-12-01

    The 2006 Pingtung doublet earthquakes (Mw = 6.9) occurred in the offshore region of southwestern Taiwan, where large earthquakes were rarely expected. Based on the global centroid-moment-tensor (CMT) inversion results, the first event is associated with normal faulting and the other with strike-slip faulting. In this study, the aftershock sequences recorded by an OBS array deployed over the source zone for one week were relocated to estimate the true fault planes. The preliminary relocation results indicate that most events in the northern part were aligned with the eastward-dipping fault plane of the first mainshock, while the remainder were spread sparsely but seemed to follow the westward-dipping fault plane of the second mainshock. This result is not what is usually expected, because the hypocenter of the first event was located farther south than that of the second one. However, a more detailed examination is still needed.

  20. Accurate Vehicle Location System Using RFID, an Internet of Things Approach

    PubMed Central

    Prinsloo, Jaco; Malekian, Reza

    2016-01-01

    Modern infrastructure, such as dense urban areas and underground tunnels, can effectively block all GPS signals, which implies that effective position triangulation will not be achieved. The main problem that is addressed in this project is the design and implementation of an accurate vehicle location system using radio-frequency identification (RFID) technology in combination with GPS and the Global system for Mobile communication (GSM) technology, in order to provide a solution to the limitation discussed above. In essence, autonomous vehicle tracking will be facilitated with the use of RFID technology where GPS signals are non-existent. The design of the system and the results are reflected in this paper. An extensive literature study was done on the field known as the Internet of Things, as well as various topics that covered the integration of independent technology in order to address a specific challenge. The proposed system is then designed and implemented. An RFID transponder was successfully designed and a read range of approximately 31 cm was obtained in the low frequency communication range (125 kHz to 134 kHz). The proposed system was designed, implemented, and field tested and it was found that a vehicle could be accurately located and tracked. It is also found that the antenna size of both the RFID reader unit and RFID transponder plays a critical role in the maximum communication range that can be achieved. PMID:27271638

  1. Accurate Vehicle Location System Using RFID, an Internet of Things Approach.

    PubMed

    Prinsloo, Jaco; Malekian, Reza

    2016-01-01

    Modern infrastructure, such as dense urban areas and underground tunnels, can effectively block all GPS signals, which implies that effective position triangulation will not be achieved. The main problem that is addressed in this project is the design and implementation of an accurate vehicle location system using radio-frequency identification (RFID) technology in combination with GPS and the Global system for Mobile communication (GSM) technology, in order to provide a solution to the limitation discussed above. In essence, autonomous vehicle tracking will be facilitated with the use of RFID technology where GPS signals are non-existent. The design of the system and the results are reflected in this paper. An extensive literature study was done on the field known as the Internet of Things, as well as various topics that covered the integration of independent technology in order to address a specific challenge. The proposed system is then designed and implemented. An RFID transponder was successfully designed and a read range of approximately 31 cm was obtained in the low frequency communication range (125 kHz to 134 kHz). The proposed system was designed, implemented, and field tested and it was found that a vehicle could be accurately located and tracked. It is also found that the antenna size of both the RFID reader unit and RFID transponder plays a critical role in the maximum communication range that can be achieved. PMID:27271638

  2. Fault structure and mechanics of the Hayward Fault, California, from double-difference earthquake locations

    NASA Astrophysics Data System (ADS)

    Waldhauser, Felix; Ellsworth, William L.

    2002-03-01

    The relationship between small-magnitude seismicity and large-scale crustal faulting along the Hayward Fault, California, is investigated using a double-difference (DD) earthquake location algorithm. We used the DD method to determine high-resolution hypocenter locations of the seismicity that occurred between 1967 and 1998. The DD technique incorporates catalog travel time data and relative P and S wave arrival time measurements from waveform cross correlation to solve for the hypocentral separation between events. The relocated seismicity reveals a narrow, near-vertical fault zone at most locations. This zone follows the Hayward Fault along its northern half and then diverges from it to the east near San Leandro, forming the Mission trend. The relocated seismicity is consistent with the idea that slip from the Calaveras Fault is transferred over the Mission trend onto the northern Hayward Fault. The Mission trend is not clearly associated with any mapped active fault as it continues to the south and joins the Calaveras Fault at Calaveras Reservoir. In some locations, discrete structures adjacent to the main trace are seen, features that were previously hidden in the uncertainty of the network locations. The fine structure of the seismicity suggests that the fault surface on the northern Hayward Fault is curved or that the events occur on several substructures. Near San Leandro, where the more westerly striking trend of the Mission seismicity intersects with the surface trace of the (aseismic) southern Hayward Fault, the seismicity remains diffuse after relocation, with strong variation in focal mechanisms between adjacent events indicating a highly fractured zone of deformation. The seismicity is highly organized in space, especially on the northern Hayward Fault, where it forms horizontal, slip-parallel streaks of hypocenters of only a few tens of meters width, bounded by areas almost absent of seismic activity. During the interval from 1984 to 1998, when

  3. Fault structure and mechanics of the Hayward Fault, California from double-difference earthquake locations

    USGS Publications Warehouse

    Waldhauser, F.; Ellsworth, W.L.

    2002-01-01

    The relationship between small-magnitude seismicity and large-scale crustal faulting along the Hayward Fault, California, is investigated using a double-difference (DD) earthquake location algorithm. We used the DD method to determine high-resolution hypocenter locations of the seismicity that occurred between 1967 and 1998. The DD technique incorporates catalog travel time data and relative P and S wave arrival time measurements from waveform cross correlation to solve for the hypocentral separation between events. The relocated seismicity reveals a narrow, near-vertical fault zone at most locations. This zone follows the Hayward Fault along its northern half and then diverges from it to the east near San Leandro, forming the Mission trend. The relocated seismicity is consistent with the idea that slip from the Calaveras Fault is transferred over the Mission trend onto the northern Hayward Fault. The Mission trend is not clearly associated with any mapped active fault as it continues to the south and joins the Calaveras Fault at Calaveras Reservoir. In some locations, discrete structures adjacent to the main trace are seen, features that were previously hidden in the uncertainty of the network locations. The fine structure of the seismicity suggests that the fault surface on the northern Hayward Fault is curved or that the events occur on several substructures. Near San Leandro, where the more westerly striking trend of the Mission seismicity intersects with the surface trace of the (aseismic) southern Hayward Fault, the seismicity remains diffuse after relocation, with strong variation in focal mechanisms between adjacent events indicating a highly fractured zone of deformation. The seismicity is highly organized in space, especially on the northern Hayward Fault, where it forms horizontal, slip-parallel streaks of hypocenters of only a few tens of meters width, bounded by areas almost absent of seismic activity. During the interval from 1984 to 1998, when digital

  4. Automatic Earthquake Detection and Location by Waveform coherency in Alentejo (South Portugal) Using CatchPy

    NASA Astrophysics Data System (ADS)

    Custodio, S.; Matos, C.; Grigoli, F.; Cesca, S.; Heimann, S.; Rio, I.

    2015-12-01

    Seismic data processing is currently undergoing a step change, benefitting from high-volume datasets and advanced computing power. In the last decade, a permanent seismic network of 30 broadband stations, complemented by dense temporary deployments, has covered mainland Portugal. This outstanding regional coverage currently enables the computation of a high-resolution image of the seismicity of Portugal, which contributes to fitting together the pieces of the regional seismo-tectonic puzzle. Although traditional manual inspections are valuable for refining automatic results, they are impracticable with the big data volumes now available. When conducted alone, they are also less objective, since the criteria are defined by the analyst. In this work we present CatchPy, a scanning algorithm to detect earthquakes in continuous datasets. Our main goal is to implement an automatic earthquake detection and location routine in order to have a tool that quickly processes large data sets while at the same time detecting low-magnitude earthquakes (i.e. lowering the detection threshold). CatchPy is designed to produce an event database that can be easily located using existing location codes (e.g. Grigoli et al. 2013, 2014). We use CatchPy to perform automatic detection and location of earthquakes that occurred in the Alentejo region (South Portugal), taking advantage of a dense seismic network deployed in the region for two years during the DOCTAR experiment. Results show that our automatic procedure is particularly suitable for small-aperture networks. Event detection is performed by continuously computing the short-term-average/long-term-average of two different characteristic functions (CFs). For P phases we use a CF based on the vertical energy trace, while for S phases we use a CF based on the maximum eigenvalue of the instantaneous covariance matrix (Vidale 1991). Seismic event location is performed by waveform coherence analysis, scanning different hypocentral coordinates
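
    The detection step described above rests on the classic short-term-average/long-term-average (STA/LTA) ratio of a characteristic function. A minimal sketch follows; the characteristic function used here (the squared trace), the window lengths and the trigger threshold are illustrative assumptions and do not reproduce CatchPy's actual characteristic functions.

```python
# Minimal STA/LTA detector for the event-detection step; the characteristic
# function (here simply the squared trace), window lengths and threshold are
# illustrative assumptions and do not reproduce CatchPy's actual CFs.
import numpy as np

def sta_lta(cf, dt, sta_win=0.5, lta_win=10.0):
    """Short-term / long-term average ratio of a characteristic function,
    with both windows ending at the same sample."""
    ns, nl = int(sta_win / dt), int(lta_win / dt)
    csum = np.cumsum(np.concatenate([[0.0], cf]))
    sta = (csum[ns:] - csum[:-ns]) / ns
    lta = (csum[nl:] - csum[:-nl]) / nl
    m = len(lta)
    return sta[nl - ns : nl - ns + m] / np.maximum(lta[:m], 1e-12)

dt = 0.01
rng = np.random.default_rng(3)
trace = rng.normal(0.0, 1.0, 6000)
trace[3000:3200] += rng.normal(0.0, 8.0, 200)     # a small "event"
ratio = sta_lta(trace ** 2, dt)
print("triggered" if ratio.max() > 4.0 else "no trigger",
      "- max STA/LTA =", round(float(ratio.max()), 1))
```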

  5. Location and moment tensor inversion of small earthquakes using 3D Green's functions in models with rugged topography: application to the Longmenshan fault zone

    NASA Astrophysics Data System (ADS)

    Zhou, Li; Zhang, Wei; Shen, Yang; Chen, Xiaofei; Zhang, Jie

    2016-06-01

    With dense seismic arrays and advanced imaging methods, regional three-dimensional (3D) Earth models have become more accurate. It is now increasingly feasible and advantageous to use a 3D Earth model to better locate earthquakes and invert their source mechanisms by fitting synthetics to observed waveforms. In this study, we develop an approach to determine both the earthquake location and the source mechanism from waveform information. The observed waveforms are filtered in different frequency bands and separated into windows for the individual phases. Instead of picking arrival times, the traveltime differences are measured by cross-correlation between synthetic waveforms based on the 3D Earth model and observed waveforms. The earthquake location is determined by minimizing the cross-correlation traveltime differences. We then fix the horizontal location of the earthquake and perform a grid search in depth to determine the source mechanism at each point by fitting the synthetic and observed waveforms. This new method is verified by a synthetic test with noise added to the synthetic waveforms and a realistic station distribution. We apply this method to a series of Mw 3.4-5.6 earthquakes in the Longmenshan fault (LMSF) zone, a region with rugged topography between the eastern margin of the Tibetan plateau and the western part of the Sichuan basin. The results show that our solutions yield improved waveform fits compared to the source parameters from the catalogs we used, and that the location can be better constrained than with the amplitude-only approach. Furthermore, the source solutions with realistic topography provide a better fit to the observed waveforms than those without the topography, indicating the need to take the topography into account in regions with rugged relief.

  6. Robust method to detect and locate local earthquakes by means of amplitude measurements.

    NASA Astrophysics Data System (ADS)

    del Puy Papí Isaba, María; Brückl, Ewald

    2016-04-01

    In this study we present a robust new method to detect and locate medium and low magnitude local earthquakes. This method is based on an empirical model of the ground motion obtained from amplitude data of earthquakes in the area of interest, which were located using traditional methods. The first step of our method is the computation of maximum resultant ground velocities in sliding time windows covering the whole period of interest. In the second step, these maximum resultant ground velocities are back-projected to every point of a grid covering the whole area of interest while applying the empirical amplitude - distance relations. We refer to these back-projected ground velocities as pseudo-magnitudes. The number of operating seismic stations in the local network equals the number of pseudo-magnitudes at each grid-point. Our method introduces the new idea of selecting the minimum pseudo-magnitude at each grid-point for further analysis instead of searching for a minimum of the L2 or L1 norm. In case no detectable earthquake occurred, the spatial distribution of the minimum pseudo-magnitudes constrains the magnitude of weak earthquakes hidden in the ambient noise. In the case of a detectable local earthquake, the spatial distribution of the minimum pseudo-magnitudes shows a significant maximum at the grid-point nearest to the actual epicenter. The application of our method is restricted to the area confined by the convex hull of the seismic station network. Additionally, one must ensure that there are no dead traces involved in the processing. Compared to methods based on L2 and even L1 norms, our new method is almost wholly insensitive to outliers (data from locally disturbed seismic stations). A further advantage is the fast determination of the epicenter and magnitude of a seismic event located within a seismic network. This is possible due to the method of obtaining and storing a back-projected matrix, independent of the registered amplitude, for each seismic
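
    The back-projection of peak amplitudes into minimum pseudo-magnitudes can be sketched as follows. The amplitude-distance relation, the station layout and the peak velocities below are invented placeholders; in the actual method the relation is calibrated empirically from previously located earthquakes in the area of interest.

```python
# Sketch of the pseudo-magnitude back-projection: peak ground velocities are
# projected to every grid node through an assumed amplitude-distance relation
# and the minimum over stations is kept at each node; its spatial maximum
# marks the epicentre. Relation coefficients, stations and amplitudes invented.
import numpy as np

def pseudo_magnitude(amp, r, c=1.66, k=3.0):
    """Invert an assumed amplitude-distance relation M = log10(A) + c*log10(r) + k."""
    return np.log10(amp) + c * np.log10(np.maximum(r, 0.1)) + k

stations = np.array([[0.0, 0.0], [30.0, 5.0], [10.0, 25.0], [28.0, 22.0]])   # km
peak_vel = np.array([2.0e-4, 8.0e-5, 1.1e-4, 6.0e-5])                         # m/s

gx, gy = np.meshgrid(np.linspace(0, 30, 61), np.linspace(0, 30, 61))
nodes = np.column_stack([gx.ravel(), gy.ravel()])
dist = np.linalg.norm(nodes[:, None, :] - stations[None, :, :], axis=2)

pm = pseudo_magnitude(peak_vel[None, :], dist)   # (node, station)
min_pm = pm.min(axis=1)                          # minimum pseudo-magnitude per node
best = nodes[np.argmax(min_pm)]                  # spatial maximum -> epicentre
print("estimated epicentre (km):", best, " pseudo-magnitude:", round(float(min_pm.max()), 1))
```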

  7. Two-year survey comparing earthquake activity and injection-well locations in the Barnett Shale, Texas

    PubMed Central

    Frohlich, Cliff

    2012-01-01

    Between November 2009 and September 2011, temporary seismographs deployed under the EarthScope USArray program were situated on a 70-km grid covering the Barnett Shale in Texas, recording data that allowed sensing and locating regional earthquakes with magnitudes 1.5 and larger. I analyzed these data and located 67 earthquakes, more than eight times as many as reported by the National Earthquake Information Center. All 24 of the most reliably located epicenters occurred in eight groups within 3.2 km of one or more injection wells. These included wells near Dallas–Fort Worth and Cleburne, Texas, where earthquakes near injection wells were reported by the media in 2008 and 2009, as well as wells in six other locations, including several where no earthquakes have been reported previously. This suggests injection-triggered earthquakes are more common than is generally recognized. All the wells nearest to the earthquake groups reported maximum monthly injection rates exceeding 150,000 barrels of water per month (24,000 m3/mo) since October 2006. However, while 9 of 27 such wells in Johnson County were near earthquakes, elsewhere no earthquakes occurred near wells with similar injection rates. A plausible hypothesis to explain these observations is that injection only triggers earthquakes if injected fluids reach and relieve friction on a suitably oriented, nearby fault that is experiencing regional tectonic stress. Testing this hypothesis would require identifying geographic regions where there is interpreted subsurface structure information available to determine whether there are faults near seismically active and seismically quiescent injection wells. PMID:22869701

  8. Two-year survey comparing earthquake activity and injection-well locations in the Barnett Shale, Texas.

    PubMed

    Frohlich, Cliff

    2012-08-28

    Between November 2009 and September 2011, temporary seismographs deployed under the EarthScope USArray program were situated on a 70-km grid covering the Barnett Shale in Texas, recording data that allowed sensing and locating regional earthquakes with magnitudes 1.5 and larger. I analyzed these data and located 67 earthquakes, more than eight times as many as reported by the National Earthquake Information Center. All 24 of the most reliably located epicenters occurred in eight groups within 3.2 km of one or more injection wells. These included wells near Dallas-Fort Worth and Cleburne, Texas, where earthquakes near injection wells were reported by the media in 2008 and 2009, as well as wells in six other locations, including several where no earthquakes have been reported previously. This suggests injection-triggered earthquakes are more common than is generally recognized. All the wells nearest to the earthquake groups reported maximum monthly injection rates exceeding 150,000 barrels of water per month (24,000 m(3)/mo) since October 2006. However, while 9 of 27 such wells in Johnson County were near earthquakes, elsewhere no earthquakes occurred near wells with similar injection rates. A plausible hypothesis to explain these observations is that injection only triggers earthquakes if injected fluids reach and relieve friction on a suitably oriented, nearby fault that is experiencing regional tectonic stress. Testing this hypothesis would require identifying geographic regions where there is interpreted subsurface structure information available to determine whether there are faults near seismically active and seismically quiescent injection wells. PMID:22869701

  9. High precision Differential Earthquake Location in 3D models: Evidence for a rheological barrier controlling the microseismicity at the Irpinia fault zone in southern Apennines

    NASA Astrophysics Data System (ADS)

    De Landro, Grazia; Amoroso, Ortensia; Alfredo Stabile, Tony; Matrullo, Emanuela; Lomax, Anthony; Zollo, Aldo

    2015-04-01

    A non-linear, global-search, probabilistic, double-difference earthquake location technique is illustrated. The main advantages of this method are the determination of comprehensive and complete solutions through the probability density function (PDF), the use of differential arrival times as data, and the possibility of using a 3D velocity model for both absolute and relative locations, which is essential to obtain accurate differential locations in structurally complex geological media. The joint use of this methodology and an accurate differential-time data set allowed us to carry out a high-resolution earthquake location analysis, which helped to characterize the active fault geometries in the studied region. We investigated the recent micro-seismicity occurring in the Campanian-Lucanian Apennines, in the crustal volume embedding the fault system which generated the 1980 M 6.9 Irpinia earthquake. In order to obtain highly accurate seismicity locations we applied the method to the P and S arrival-time data set from 1312 events (M<3) that occurred from August 2005 to April 2011, and used the 3D P- and S-wave velocity models optimized for the area under study. Both catalogue and cross-correlation first arrival-times have been used. The refined seismicity locations show that the events occur in a volume delimited by the faults activated during the 1980 Irpinia M 6.9 earthquake, on sub-parallel, predominantly normal faults. Corresponding to a contact zone between rock formations of different rheology (carbonate platform and basin residuals), we find an abrupt interruption of the seismicity across a SW-NE oriented structural discontinuity. This "barrier" appears to be located in the area bounded by the fault segments activated during the first (0 s) and the second (20 s) rupture episodes of the 1980 Irpinia earthquake. We hypothesize that this geometrical barrier may have played a key role during the 1980 Irpinia event, and possibly controlled the delayed times of

  10. High-precision differential earthquake location in 3-D models: evidence for a rheological barrier controlling the microseismicity at the Irpinia fault zone in southern Apennines

    NASA Astrophysics Data System (ADS)

    De Landro, Grazia; Amoroso, Ortensia; Stabile, Tony Alfredo; Matrullo, Emanuela; Lomax, Antony; Zollo, Aldo

    2015-12-01

    A non-linear, global-search, probabilistic, double-difference earthquake location technique is illustrated. The main advantages of this method are the determination of comprehensive and complete solutions through the probability density function (PDF), the use of differential arrival times as data and the possibility to use a 3-D velocity model both for absolute and double-difference locations, all of which help to obtain accurate differential locations in structurally complex geological media. The joint use of this methodology and an accurate differential time data set allowed us to carry out a high-resolution, earthquake location analysis, which helps to characterize the active fault geometries in the studied region. We investigated the recent microseismicity occurring at the Campanian-Lucanian Apennines in the crustal volume embedding the fault system that generated the 1980 MS 6.9 earthquake in Irpinia. In order to obtain highly accurate seismicity locations, we applied the method to the P and S arrival time data set from 1312 events (ML < 3.1) that occurred from August 2005 to April 2011 and used the 3-D P- and S-wave velocity models optimized for the area under study. Both manually refined and cross-correlation refined absolute arrival times have been used. The refined seismicity locations show that the events occur in a volume delimited by the faults activated during the 1980 MS 6.9 Irpinia earthquake on subparallel, predominantly normal faults. We find an abrupt interruption of the seismicity across an SW-NE oriented structural discontinuity corresponding to a contact zone between different rheology rock formations (carbonate platform and basin residuals). This `barrier' appears to be located in the area bounded by the fault segments activated during the first (0 s) and the second (18 s) rupture episodes of the 1980s Irpinia earthquake. We hypothesize that this geometrical barrier could have played a key role during the 1980 Irpinia event, and possibly
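
    As a much-simplified illustration of the probabilistic, global-search ingredient (absolute location only, without the double-difference part), the sketch below evaluates a traveltime misfit over a 2-D grid, converts it to an unnormalized Gaussian likelihood, and keeps the full probability density function rather than a single best point. Velocity, noise level and geometry are invented.

```python
# Much-simplified illustration of a probabilistic grid-search location
# (absolute location only, without the double-difference part): a traveltime
# misfit is converted to an unnormalized Gaussian likelihood and the full PDF
# is kept rather than a single best point. Velocity, noise and geometry invented.
import numpy as np

stations = np.array([[0.0, 0.0], [12.0, 2.0], [3.0, 14.0], [11.0, 12.0]])   # km
vp, sigma = 6.0, 0.05                                                        # km/s, s

def tt(node):
    return np.linalg.norm(stations - node, axis=1) / vp

true = np.array([6.5, 7.0])
obs = tt(true) + np.random.default_rng(4).normal(0.0, sigma, len(stations))

gx, gy = np.meshgrid(np.linspace(0, 15, 121), np.linspace(0, 15, 121))
nodes = np.column_stack([gx.ravel(), gy.ravel()])
res = np.array([obs - tt(n) for n in nodes])
res -= res.mean(axis=1, keepdims=True)            # absorb the origin time
pdf = np.exp(-0.5 * (res ** 2).sum(axis=1) / sigma ** 2)
pdf /= pdf.sum()

print("maximum-likelihood node:", nodes[np.argmax(pdf)])
print("PDF mean (expectation):", (nodes * pdf[:, None]).sum(axis=0))
```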

  11. Optimized sensor location for estimating story-drift angle for tall buildings subject to earthquakes

    NASA Astrophysics Data System (ADS)

    Ozawa, Sayuki; Mita, Akira

    2016-04-01

    Structural Health Monitoring (SHM) is a technology that can quantitatively evaluate the extent of deterioration or damage in a building. Most SHM systems utilize only a few sensors, and the sensors are placed at equal spacing, including one on the roof. However, the placement of the sensors has not been verified. Therefore, in this study, the optimal location of the sensors is studied for estimating the inter-story drift angle, which is used for immediate diagnosis after an earthquake. This study proposes a practical optimal sensor location method after testing all possible sensor location combinations. From the simulation results of all location patterns, it was shown that placing a sensor on the roof is not always optimal. This result is practically useful, as it is difficult to place a sensor on the roof in most cases. The Modal Assurance Criterion (MAC) is one of the practical optimal sensor location methods. I propose the Mass Modal Assurance Criterion (MAC*), which incorporates the mass matrix of the building into the MAC. Either the mass matrix or the stiffness matrix needs to be considered for the orthogonality of the mode vectors, a condition that the normal MAC does not take into account. The locations of sensors determined by MAC* were superior to those from the previous method, MAC. In this study, important knowledge about the location of sensors was provided for implementing SHM systems.
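
    For reference, the Modal Assurance Criterion between two mode-shape vectors, and a mass-weighted variant in the spirit of the proposed MAC*, can be written in a few lines. The toy mode shapes and lumped mass matrix below are invented for illustration and are not taken from the study.

```python
# Hedged sketch of the Modal Assurance Criterion (MAC) between two mode-shape
# vectors and a mass-weighted variant in the spirit of the proposed MAC*;
# the toy mode shapes and lumped mass matrix are invented for illustration.
import numpy as np

def mac(phi_i, phi_j, weight=None):
    """MAC = |phi_i' W phi_j|^2 / ((phi_i' W phi_i)(phi_j' W phi_j)); W = identity
    gives the classic MAC, W = mass matrix gives a mass-weighted variant."""
    w = np.eye(len(phi_i)) if weight is None else weight
    num = (phi_i @ w @ phi_j) ** 2
    den = (phi_i @ w @ phi_i) * (phi_j @ w @ phi_j)
    return float(num / den)

n = 5                                                    # five-storey toy model
mass = np.diag([1.0, 1.0, 1.0, 1.0, 0.8])                # lumped masses (invented)
floors = np.arange(1, n + 1)
phi1 = np.sin(np.pi * floors / (2 * n + 1))              # first-mode-like shape
phi2 = np.sin(2 * np.pi * floors / (2 * n + 1))          # second-mode-like shape

print("MAC :", round(mac(phi1, phi2), 3))
print("MAC*:", round(mac(phi1, phi2, mass), 3))
```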

  12. Soufrière Hills eruption, Montserrat, 1995 - 1997: volcanic earthquake locations and fault plane solutions

    USGS Publications Warehouse

    Aspinall, W.P.; Miller, A.D.; Lynch, L.L.; Latchman, J.L.; Stewart, R.C.; White, R.A.; Power, J.A.

    1998-01-01

    A total of 9242 seismic events, recorded since the start of the eruption on Montserrat in July 1995, have been uniformly relocated with station travel-time corrections. Early seismicity was generally diffuse under southern Montserrat, and mostly restricted to depths less than 7 km. However, a NE-SW alignment of epicentres beneath the NE flank of the volcano emerged in one swarm of volcano-tectonic earthquakes (VTs) and later nests of VT hypocentres developed beneath the volcano and at a separated location, under St. George's Hill. The overall spatial distribution of hypocentres suggests a minimum depth of about 5 km for any substantial magma body. Activity associated with the opening of a conduit to the surface became increasingly shallow, with foci concentrated below the crater and, after dome building started in Fall 1995, VTs diminished and repetitive swarms of ‘hybrid’ seismic events became predominant. By late-1996, as magma effusion rates escalated, most seismic events were originating within a volume about 2 km diameter which extended up to the surface from only about 3 km depth - the diminution of shear failure earthquakes suggests the pathway for magma discharge had become effectively unconstricted. Individual and composite fault plane solutions have been determined for a few larger earthquakes. We postulate that localised extensional stress conditions near the linear VT activity, due to interaction with stresses in the overriding lithospheric plate, may encourage normal fault growth and promote sector weaknesses in the volcano.

  13. Fine-scale structure of the San Andreas fault zone and location of the SAFOD target earthquakes

    USGS Publications Warehouse

    Thurber, C.; Roecker, S.; Zhang, H.; Baher, S.; Ellsworth, W.

    2004-01-01

    We present results from the tomographic analysis of seismic data from the Parkfield area using three different inversion codes. The models provide a consistent view of the complex velocity structure in the vicinity of the San Andreas, including a sharp velocity contrast across the fault. We use the inversion results to assess our confidence in the absolute location accuracy of a potential target earthquake. We derive two types of accuracy estimates, one based on a consideration of the location differences from the three inversion methods, and the other based on the absolute location accuracy of "virtual earthquakes." Location differences are on the order of 100-200 m horizontally and up to 500 m vertically. Bounds on the absolute location errors based on the "virtual earthquake" relocations are ??? 50 m horizontally and vertically. The average of our locations places the target event epicenter within about 100 m of the SAF surface trace. Copyright 2004 by the American Geophysical Union.

  14. Earthquakes

    ERIC Educational Resources Information Center

    Roper, Paul J.; Roper, Jere Gerard

    1974-01-01

    Describes the causes and effects of earthquakes, defines the meaning of magnitude (measured on the Richter Magnitude Scale) and intensity (measured on a modified Mercalli Intensity Scale) and discusses earthquake prediction and control. (JR)

  15. Current progress in using multiple electromagnetic indicators to determine location, time, and magnitude of earthquakes in California and Peru (Invited)

    NASA Astrophysics Data System (ADS)

    Bleier, T. E.; Dunson, C.; Roth, S.; Heraud, J.; Freund, F. T.; Dahlgren, R.; Bryant, N.; Bambery, R.; Lira, A.

    2010-12-01

    Since ultra-low frequency (ULF) magnetic anomalies were discovered prior to the 1989 Loma Prieta, California M7.0 earthquake, QuakeFinder, a small R&D group based in Palo Alto, California, has systematically monitored ULF magnetic signals with a network of 3-axis induction magnetometers in California since 2000. The raw magnetometer data were collected at 20-50 samples per second, with no preprocessing, in an attempt to collect an accurate time history of electromagnetic waveforms prior to, during, and after large earthquakes within 30 km of these sensors. In October 2007, the QuakeFinder team observed a series of strange magnetic pulsations at the Alum Rock, California site, 14 days prior to an M5.4 earthquake. The magnetic signals observed were relatively short, random pulsations, not continuous waveform signals like Pc1 or Pc3 micropulsations. The magnetic pulses have characteristic uni-polar shapes and 0.5 s to 30 s durations, much longer than lightning signals. In May 2010, very similar pulses were observed at Tacna, Peru, 13 days prior to an M6.2 earthquake, using a QuakeFinder station jointly operated in collaboration with the Catholic University in Lima, Peru (PUCP). More examples of these pulsations were sought, and a historical review of older California magnetic data revealed fewer but similar pulsations at the Hollister, California site operated by UC Berkeley (e.g. the San Juan Bautista M5.1 earthquake on August 12, 1998). Further analysis of the direction of arrival of the magnetic pulses showed an interesting “azimuth clustering” in both the Alum Rock, California and Tacna, Peru data. The complete time series of the Alum Rock data allowed the team to analyze subsequent changes observed in magnetometer “filter banks” (0.001 Hz to 10 Hz filter bands, similar to those used by Fraser-Smith in 1989), but this time using time-adjusted limits based on time of day, time of year, Kp, and site background noise. These site-customized limits

  16. Accurate prediction of V1 location from cortical folds in a surface coordinate system

    PubMed Central

    Hinds, Oliver P.; Rajendran, Niranjini; Polimeni, Jonathan R.; Augustinack, Jean C.; Wiggins, Graham; Wald, Lawrence L.; Rosas, H. Diana; Potthast, Andreas; Schwartz, Eric L.; Fischl, Bruce

    2008-01-01

    Previous studies demonstrated substantial variability of the location of primary visual cortex (V1) in stereotaxic coordinates when linear volume-based registration is used to match volumetric image intensities (Amunts et al., 2000). However, other qualitative reports of V1 location (Smith, 1904; Stensaas et al., 1974; Rademacher et al., 1993) suggested a consistent relationship between V1 and the surrounding cortical folds. Here, the relationship between folds and the location of V1 is quantified using surface-based analysis to generate a probabilistic atlas of human V1. High-resolution (about 200 μm) magnetic resonance imaging (MRI) at 7 T of ex vivo human cerebral hemispheres allowed identification of the full area via the stria of Gennari: a myeloarchitectonic feature specific to V1. Separate, whole-brain scans were acquired using MRI at 1.5 T to allow segmentation and mesh reconstruction of the cortical gray matter. For each individual, V1 was manually identified in the high-resolution volume and projected onto the cortical surface. Surface-based intersubject registration (Fischl et al., 1999b) was performed to align the primary cortical folds of individual hemispheres to those of a reference template representing the average folding pattern. An atlas of V1 location was constructed by computing the probability of V1 inclusion for each cortical location in the template space. This probabilistic atlas of V1 exhibits low prediction error compared to previous V1 probabilistic atlases built in volumetric coordinates. The increased predictability observed under surface-based registration suggests that the location of V1 is more accurately predicted by the cortical folds than by the shape of the brain embedded in the volume of the skull. In addition, the high quality of this atlas provides direct evidence that surface-based intersubject registration methods are superior to volume-based methods at superimposing functional areas of cortex, and therefore are better

  17. Precise hypocenter locations of midcrustal low-frequency earthquakes beneath Mt. Fuji, Japan

    USGS Publications Warehouse

    Nakamichi, H.; Ukawa, M.; Sakai, S.

    2004-01-01

    Midcrustal low-frequency earthquakes (MLFs) have been observed at seismic stations around Mt. Fuji, Japan. In September - December 2000 and April - May 2001, abnormally high numbers of MLFs occurred. We located hypocenters for the 80 MLFs during 1998-2003 by using the hypoDD earthquake location program (Waldhauser and Ellsworth, 2000). The MLF hypocenters define an ellipsoidal volume some 5 km in diameter ranging from 11 to 16 km in focal depth. This volume is centered 3 km northeast of the summit and its long axis is directed NW-SE. The direction of the axis coincides with the major axis of tectonic compression around Mt. Fuji. The center of the MLF epicenters gradually migrated upward and 2-3 km from southeast to northwest during 1998-2001. We interpret that the hypocentral migration of MLFs reflects magma movement associated with a NW-SE oriented dike beneath Mt. Fuji. Copyright ?? The Society of Geomagnetism and Earth, Planetary and Space Sciences (SGEPSS); The Seismological Society of Japan; The Volcanological Society of Japan; The Geodetic Society of Japan; The Japanese Society for Planetary Sciences.

  18. An automatic procedure for high-resolution earthquake locations: a case study from the TABOO near fault observatory (Northern Apennines, Italy)

    NASA Astrophysics Data System (ADS)

    Valoroso, Luisa; Chiaraluce, Lauro; Di Stefano, Raffaele; Latorre, Diana; Piccinini, Davide

    2014-05-01

    The characterization of the geometry, kinematics and rheology of fault zones from seismological data depends on our capability to accurately locate the largest possible number of low-magnitude seismic events. To this aim, we have been working for the past three years to develop an advanced modular earthquake location procedure able to automatically retrieve high-resolution earthquake catalogues directly from continuous waveform data. We use seismograms recorded at about 60 seismic stations located both at the surface and at depth. The network covers an area of about 80x60 km with a mean inter-station distance of 6 km. These stations are part of a Near Fault Observatory (TABOO; http://taboo.rm.ingv.it/), consisting of multi-sensor stations (seismic, geodetic, geochemical and electromagnetic). This permanent scientific infrastructure, managed by the INGV, is devoted to studying the earthquake preparatory phase and the fast/slow (i.e., seismic/aseismic) deformation processes active along the Alto Tiberina fault (ATF) located in the northern Apennines (Italy). The ATF is potentially one of the rare worldwide examples of an active low-angle (< 15°) normal fault accommodating crustal extension and characterized by a regular occurrence of micro-earthquakes. The modular procedure combines: i) a sensitive detection algorithm optimized to declare low-magnitude events; ii) an accurate picking procedure that provides consistently weighted P- and S-wave arrival times, P-wave first motion polarities and the maximum waveform amplitude for local magnitude calculation; iii) both linearized iterative and non-linear global-search earthquake location algorithms to compute accurate absolute locations of single events in a 3D geological model (see Latorre et al., same session); iv) cross-correlation and double-difference location methods to compute high-resolution relative event locations. This procedure is now running off-line with a delay of one week relative to real time. We are now implementing this

  19. Resolving Rupture Directivity of Moderate Strike-Slip Earthquakes in Sparse Network with Ambient Noise Location: A Case Study with the 2011 M5.6 Oklahoma Earthquake

    NASA Astrophysics Data System (ADS)

    He, X.; Ni, S.

    2015-12-01

    Earthquake rupture directivity is essential for improving the reliability of shakemaps and for understanding seismogenic processes by resolving the ruptured fault. Compared with field geological surveys and the InSAR technique, rupture directivity analysis based on seismological data provides rapid characterization of the rupture finiteness parameters, or is almost the only way to resolve the ruptured fault for earthquakes weaker than M5. In recent years, ambient seismic noise has been widely used in tomography as well as in earthquake location. Barmin et al. (2011) and Levshin et al. (2012) proposed to locate the epicenter by interpolating the estimated Green's functions (EGFs) determined by cross-correlation of ambient noise to arbitrary hypothetical event locations. This method does not rely on an earth model, but it requires a dense local array. Zhan et al. (2011) and Zeng et al. (2014) used the EGFs between a nearby station and remote stations as calibration for 3D velocity structure and then obtained the centroid location. In contrast, the hypocenter can be determined from P wave onsets. Assuming unilateral rupture, we can resolve the rupture directivity from the relative location of the centroid and the hypocenter. We apply this method to the 2011 M5.6 Oklahoma earthquake. One M4.8 foreshock and one M4+ aftershock are chosen as reference events to calibrate the systematic bias of the ambient noise location. The resolved rupture plane strikes southwest-northeast, consistent with the spatial distribution of aftershocks (McNamara et al., 2015) and the finite fault inversion result (Sun et al., 2014). This method works for unilaterally rupturing strike-slip earthquakes, and more case studies are needed to test its effectiveness.

  20. Location and local magnitude of the Tocopilla earthquake sequence of Northern Chile

    NASA Astrophysics Data System (ADS)

    Fuenzalida, A.; Lancieri, M.; Madariaga, R. I.; Sobiesiak, M.

    2010-12-01

    The Northern Chile gap is generally considered to be the site of the next megathrust event in Chile. The Tocopilla earthquake of 14 November 2007 (Mw 7.8) and its aftershock series broke the southern end of this gap. The Tocopilla event ruptured a narrow strip of 120 km in length and a width that Peyrat et al. and Delouis et al. (2009) estimated at 30 km. The aftershock sequence comprises five large thrust events with magnitude greater than 6. The main aftershock of Mw 6.7 occurred on November 15, at 15:06 (UTC), seawards of the Mejillones Peninsula. One month later, on 16 December 2007, a strong (Mw 6.8) intraplate event with a slab-push mechanism occurred near the bottom of the rupture zone. These events represent a unique opportunity for the study of earthquakes in Northern Chile because of the quantity and quality of available data. In the epicentral area, the IPOC network was deployed by GFZ, CNRS/INSU and DGF before the main event. This is a digital, continuously recording network, equipped with both strong-motion and broad-band instruments. On 29 November 2007 a second network named “Task Force” (TF) was deployed by GFZ to study the aftershocks. This is a dense network, installed near the Mejillones Peninsula and composed of 20 short-period instruments. The slab-push event of 16 December 2007 occurred in the middle of the area covered by the TF network. Aftershocks were detected using an automatic procedure and manually revised in order to pick P and S arrivals. In the 14-28 November period, we detected 635 events recorded by the IPOC network, and a further 552 events were detected with the TF network between 29 November and 16 December, before the slab-push event. The events were located with the NLLoc software of Lomax et al., using a vertically layered velocity model (Husen et al. 1999). From the broadband data we estimated the moment magnitude from the displacement spectra of the events. From the short-period instruments we evaluated local magnitudes using the
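    The abstract breaks off at the local-magnitude step. For orientation only, a minimal sketch of a generic local-magnitude calculation is given below; it uses the Hutton and Boore (1987) attenuation terms, which is an assumption made here and not necessarily the calibration applied to the Tocopilla data.

    import math

    def local_magnitude(amp_mm_wood_anderson, hypo_dist_km):
        """Generic ML from the maximum Wood-Anderson amplitude (mm) and the
        hypocentral distance (km), with Hutton & Boore (1987) attenuation."""
        return (math.log10(amp_mm_wood_anderson)
                + 1.110 * math.log10(hypo_dist_km / 100.0)
                + 0.00189 * (hypo_dist_km - 100.0)
                + 3.0)

    print(local_magnitude(1.0, 100.0))  # ML 3.0 by definition at 100 km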

  1. Single-earthquake Location Using 3-D Vp and Vs Model - Applications in the Central USA and Taiwan Regions

    NASA Astrophysics Data System (ADS)

    Chiu, J.; Chen, H.; Kim, K.; Pujol, J.; Chiu, S.; Withers, M.

    2003-12-01

    Traditional local earthquake location using a horizontally layered homogeneous velocity model is always limited in its resolution and reliability due to the frequently overlooked 3-dimensional complexity of the real earth. Simultaneous earthquake relocation during traditional 3-D seismic tomography has only been applied to a limited set of selected earthquakes, so that more than 50% of the earthquakes in a catalog are basically ignored. A new earthquake location program has been developed to locate every local earthquake using the best available 3-D Vp and Vs model for a region. Many modern seismic networks have provided excellent spatial coverage of seismic stations to record high-resolution earthquake data and allow the determination of high-resolution 3-D Vp and Vs velocity models for the region. Once Vp and Vs information for all 3-D grid points is available, travel times from each grid point to all seismic stations can be calculated using any available 3-D ray-tracing technique and stored in computer files for later usage. Travel times from a trial hypocenter to the recording stations can then be interpolated simply from those of the 8 adjacent grid points available in the computer files, without the very time-consuming 3-D ray tracing. Iterations continue until the hypocenter adjustments are less than the given criteria and the travel time residual, or the difference between the observed and the calculated travel times, is a minimum. Therefore, any earthquake, no matter how small or how big it is, will be efficiently and reliably located using the 3-D velocity model. This new location program has been applied to the New Madrid seismic zone of the central USA and to various seismic zones in the Taiwan region. Preliminary results in these two regions indicate that earthquake hypocenters can be reliably relocated in spite of the very significant lateral structural variations. This location program can also be applied to routine earthquake location for any seismic network.
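    The interpolation step described above can be pictured with a short sketch: given a precomputed travel-time table on a regular 3-D grid for one station, the travel time at a trial hypocenter is obtained by trilinear interpolation from the 8 surrounding nodes. The grid geometry and the constant-velocity table below are hypothetical, for illustration only.

    import numpy as np

    def interp_travel_time(tt_grid, origin, spacing, xyz):
        """Trilinearly interpolate a precomputed travel-time table (one 3-D
        array per station) at a trial hypocenter xyz."""
        fx, fy, fz = (np.asarray(xyz, float) - np.asarray(origin, float)) / np.asarray(spacing, float)
        i, j, k = int(fx), int(fy), int(fz)
        dx, dy, dz = fx - i, fy - j, fz - k
        c = tt_grid[i:i+2, j:j+2, k:k+2]          # the 8 surrounding nodes
        w = np.array([[[(1-dx)*(1-dy)*(1-dz), (1-dx)*(1-dy)*dz],
                       [(1-dx)*dy*(1-dz),     (1-dx)*dy*dz]],
                      [[dx*(1-dy)*(1-dz),     dx*(1-dy)*dz],
                       [dx*dy*(1-dz),         dx*dy*dz]]])
        return float(np.sum(w * c))

    # Hypothetical 2-km grid and a homogeneous 6 km/s medium:
    x = np.arange(0.0, 10.0, 2.0)
    X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
    tt = np.sqrt(X**2 + Y**2 + Z**2) / 6.0
    print(interp_travel_time(tt, (0.0, 0.0, 0.0), (2.0, 2.0, 2.0), (3.1, 4.7, 5.3)))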

  2. Location and mechanism of the 1933 Diexi earthquake and its association with the regional tectonic deformation prior to the 2008 Wenchuan earthquake

    NASA Astrophysics Data System (ADS)

    Wang, K.; Shen, Z.

    2010-12-01

    The east margin of the Tibetan plateau is composed of the Longmenshan and Minjiang-Huya fault systems, which are tectonically active and produced the 1933 M7.5 Diexi, 1976 M7.2 Songpan doublet, and 2008 M7.9 Wenchuan earthquakes. Among all these large events the 1933 Diexi earthquake is the least known, and its location and mechanism, despite their importance for understanding the regional tectonic process and assessing the seismic hazards, have been subject to controversy. We collect worldwide seismic records of this earthquake, from which polarities of the first-arrival phases were picked, and use the data to relocate this earthquake and obtain the fault plane solution. The relocated epicenter is at (31.9°N, 103.6°E) and one of the nodal planes trends NNW, with the azimuth ranging from N5°W to N30°W. Taking this as the rupture plane of the Diexi earthquake, we conclude that the seismogenic structure was the southern segment of the Minjiang fault, dominated by sinistral slip with a minor thrust component. A present-day GPS velocity profile across the Minshan Mountains indicates that the Huya fault absorbs ~2 mm/yr of crustal shortening, associated with the rapid uplift of the Minshan Mountains since the Quaternary. A discrepancy between the focal mechanism solution of the 1933 Diexi earthquake and the GPS-determined present sense of motion across the Minjiang fault may be attributed to the crustal deformation processes of the Longmenshan and Minjiang-Huya fault systems and their earthquake cycles, particularly the role that the Longmenshan fault system played in altering the regional deformation field late into the earthquake cycle prior to the 2008 great Wenchuan earthquake. We are using a visco-elastic FEM code to simulate the process, taking into account the layering and lateral change of the crustal and mantle materials. A 3-D evolution of the deformation field will be evaluated, and its temporal change due to crustal and mantle rheology across the

  3. Improving the location of induced earthquakes associated with an underground gas storage in the Gulf of Valencia (Spain)

    NASA Astrophysics Data System (ADS)

    Gaite, Beatriz; Ugalde, Arantza; Villaseñor, Antonio; Blanch, Estefania

    2016-05-01

    In September 2013, increased seismic activity was recorded near the CASTOR offshore underground gas storage (UGS) facility in the Gulf of Valencia (Spain). According to the reports by the Spanish Instituto Geográfico Nacional (IGN), more than 550 events occurred during two months, the strongest of which (Mw = 4.2) took place two weeks after the gas injection stopped. The low magnitude of the events (with only 17 earthquakes having mbLg greater than 3), the lack of nearby stations, and the inhomogeneous station distribution made the location problem a great challenge. Here we present improved locations for a subset of 161 well-recorded events from the earthquake sequence using a probabilistic nonlinear earthquake location method. A new 3-D shear-wave velocity model is also estimated in this work from surface-wave ambient noise tomography. To further improve the locations, waveform cross-correlations are computed at each station for every event pair and new locations are obtained from an inverted set of adjusted travel time picks. The resulting hypocentral solutions show a tighter clustering with respect to the initial locations and they are distributed in a NW-SE direction. Most of the earthquakes are located near the injection well at depths of about 6 km. Our results indicate that the observed seismicity is closely associated with the injection activities at the CASTOR underground gas storage and may have resulted from the reactivation of pre-existing unmapped faults located a few kilometers below the reservoir.
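    The cross-correlation measurement behind the adjusted picks can be sketched as follows: for a pair of events recorded at the same station, the lag of the peak of their cross-correlation gives the relative time shift used to refine the picks. The traces and sampling rate below are synthetic; the actual processing of the CASTOR sequence may differ in detail.

    import numpy as np

    def cc_time_shift(trace_a, trace_b, dt):
        """Relative time shift (s) between two events at one station, from
        the peak of the full normalised cross-correlation."""
        a = (trace_a - trace_a.mean()) / trace_a.std()
        b = (trace_b - trace_b.mean()) / trace_b.std()
        cc = np.correlate(a, b, mode="full")
        lag = np.argmax(cc) - (len(b) - 1)        # samples by which a lags b
        return lag * dt

    # Synthetic pair: identical pulses offset by 0.05 s, sampled at 100 Hz
    t = np.arange(0.0, 2.0, 0.01)
    pulse = lambda t0: np.exp(-((t - t0) / 0.05) ** 2)
    print(cc_time_shift(pulse(1.05), pulse(1.00), dt=0.01))  # ~ +0.05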

  4. Aftershock locations and rupture characteristics of the 2006 May 27, Yogyakarta-Indonesia earthquake

    NASA Astrophysics Data System (ADS)

    Irwan, M.; Ando, M.; Kimata, F.; Tadokoro, K.; Nakamichi, H.; Muto, D.; Okuda, T.; Hasanuddin, A.; Mipi A., K.; Setyadji, B.; Andreas, H.; Gamal, M.; Arif, R.

    2006-12-01

    A strong earthquake (M6.3) rocked the Bantul district, south of Yogyakarta Special Province (DIY), on the morning of May 27, 2006. We installed a temporary array of 6 seismographs to record aftershocks of the earthquake. The area of aftershocks, which may be interpreted as the mainshock rupture area, has dimensions of about 25 km in length and 20 km in width, elongated in the N48°E direction. At depth the seismicity is mainly concentrated between 5 and 15 km. The distribution of aftershocks does not appear to come very close to the surface. There is no obvious surface evidence of a causative fault in this area, though we find many cracks and fissures that seem to have been produced by the strong ground motion. We used the orientation and size of the fault determined from our aftershock results to carry out an inversion of teleseismic data for the slip distribution. We used broad-band seismograms of the IRIS network with epicentral distances between 30 and 90 degrees. We assume a single fault plane, with strike 48 degrees and dip 80 degrees, which is inferred from the aftershock distribution. The total seismic moment is 0.369 × 10^19 Nm with a maximum slip of 0.4 m. The asperity is located about 5 km southwest of the USGS-estimated epicenter. Although the distances from the seismic source to the heavily damaged areas of Bantul and Klaten are 10 to 50 km, soft sedimentary soil is likely to have generated very damaging motions within the area.
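    As a quick consistency check on the quoted numbers, the standard Hanks and Kanamori (1979) relation converts the stated total moment to a moment magnitude of about 6.3, matching the mainshock magnitude given above; a minimal sketch:

    import math

    def moment_magnitude(m0_newton_meters):
        """Moment magnitude from seismic moment in N·m (Hanks & Kanamori, 1979)."""
        return (math.log10(m0_newton_meters) - 9.1) / 1.5

    print(round(moment_magnitude(0.369e19), 1))  # ~6.3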

  5. Artefacts of earthquake location errors and short-term incompleteness on seismicity clusters in southern California

    NASA Astrophysics Data System (ADS)

    Zaliapin, Ilya; Ben-Zion, Yehuda

    2015-09-01

    We document and quantify the effects of two types of catalogue uncertainties, earthquake location errors and short-term incompleteness, on the results of statistical cluster analyses of seismicity in southern California. In the main part of the study we analyse 117 076 events with m ≥ 2 in southern California during 1981-2013 from the waveform-relocated catalogue of Hauksson et al. We present statistical evidence for three artefacts caused by the absolute and relative location errors: (1) Increased distance between offspring and parents. (2) Underestimated clustering, quantified by the number of offspring per event, the total number of clustered events, and some other statistics. (3) Overestimated background rates. We also find that short-term incompleteness leads to (4) Apparent magnitude dependence and temporal fluctuations of b-values. The reported artefacts are robustly observed in three additional catalogues of southern California: the relocated catalogue of Richards-Dinger & Shearer during 1975-1998, and the two subcatalogues (1961-1981 and 1981-2013) of the Advanced National Seismic System catalogue. This implies that the reported artefacts are not specific to a particular (re)location method. The comparative quality of the four examined catalogues is reflected in the magnitude of the artefacts. The location errors in the examined catalogues mostly affect events with m < 3.5, while for larger magnitudes the location error effects are negligible. This is explained by comparing the location errors and rupture lengths of events and their parents. Finally, our analysis suggests that selected aggregated cluster statistics (e.g. proportion of singles) are less prone to location artefacts than individual statistics (e.g. the distance to parent or parent-offspring assignment). The results can inform a range of studies focused on small-magnitude seismicity patterns in the presence of catalogue uncertainties.

  6. Subduction megathrust segmentation correlated with earthquake swarm locations appears to be caused by increased stress heterogeneity

    NASA Astrophysics Data System (ADS)

    Holtkamp, S.; Brudzinski, M. R.

    2011-12-01

    For each Mw ≥ 8.5 earthquake with a publicly available finite fault rupture model, we find slip is closely bounded along-strike by earthquake swarms, either prior or subsequent. These earthquake swarms tend to have much larger spatial extents than their cumulative moment would suggest, arguing against a static stress triggering mechanism. In Japan, Chile, Sumatra, and Alaska, earthquake swarms correlate with regions of the plate interface that exhibit low interseismic strain accumulation. This low fault coupling could be a result of aseismic slip during swarms or of stress heterogeneity that leads to both swarm occurrence and great earthquake termination. Geodetic studies of earthquake swarms are limited but show several cases with no evidence for aseismic slip during swarms. Moreover, the 1964 Alaska and 2010 Maule earthquakes ruptured through regions with lower coupling than where they terminated, arguing that a factor other than small pre-stress controls where large earthquakes terminate. Large variations in coupling over small spatial scales could produce a fragmented set of small asperities conducive to generating a swarm of smaller earthquakes (Figure). Great earthquakes would be unlikely to rupture through such a region, as homogeneity of fault zone properties seems to be conducive to generating the largest megathrust earthquakes. Earthquake swarms are one of the better proxies for along-strike segmentation of subduction megathrusts, thereby potentially providing a new method for finding margins with the potential for devastating Mw~9 scale earthquakes. Figure: Cartoon illustrating our preferred hypothesis that increased stress heterogeneity causes earthquake swarm activity and stops large earthquake rupture propagation. Stress on the fault is in grayscale with black being high fault pre-stress. In this model, the heterogeneous stress distribution fosters swarm activity by limiting the size to which an earthquake can grow (leading to a high b

  7. Analysis of Low Frequency Earthquakes (LFE) as a means of locating Non-Volcanic Tremor (NVT)

    NASA Astrophysics Data System (ADS)

    Husker, A. L.; Novo, X.; Shapiro, N. M.; Kostoglodov, V.

    2009-12-01

    Low Frequency Earthquakes (LFEs) that occur during Non-Volcanic Tremor (NVT) are being increasingly used to determine the locations of the NVT in which they occur, suggesting that NVT is made up of many LFEs (e.g. Shelly et al., 2006; Shelly et al., 2007; La Rocca et al., 2009). Those studies used cross-correlation techniques to determine relative arrival times of the supposed LFE signals, which are viewed as pulses that are sometimes barely visible within the NVT. Our study analyzes these cross-correlated pulses, or LFE locations, that occurred during the NVT recorded from 2005 to 2007 by the MesoAmerican Seismic Experiment (MASE). MASE was a 550 km profile of seismic stations placed every ~5 km, running nearly perpendicular to the trench and starting at the coast in Acapulco, Mexico, and thus provided detailed coverage of the NVT during the 2-year period. The LFE locations are compared with their parent NVT locations obtained from the maximum energy (the square of the measured velocity summed over the entire event) for each NVT event across all stations. It is found that fewer than half of the LFEs are located within 10 km of the NVT peak energy. In addition, the NVT energy occurs over 100 km of the MASE profile, while LFEs can be measured as point sources. Although no LFEs are found outside of the NVT, the fact that LFEs are rarely measured within the peak energy of the NVT and that the scale of NVT is so much greater than that of LFEs leads to the conclusion that NVT is not purely made up of many small LFEs. It also calls into question the use of LFEs to locate NVT. LFEs seem to be good at mapping the zone of NVT, but they are not more or less concentrated in any part across the zone and therefore do not reflect where the NVT energy is greatest.
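    The energy measure quoted above (squared ground velocity summed over the event window, computed per station) is straightforward to express directly; the sketch below uses synthetic traces and hypothetical station codes purely for illustration.

    import numpy as np

    def tremor_energy(velocity_traces):
        """Per-station tremor energy: squared ground velocity summed over the
        event window. velocity_traces maps station code -> 1-D array (m/s)."""
        return {sta: float(np.sum(tr ** 2)) for sta, tr in velocity_traces.items()}

    # Synthetic traces; the station with the largest value marks the
    # along-profile position of peak NVT energy.
    rng = np.random.default_rng(0)
    traces = {f"ST{i:02d}": rng.normal(0.0, 1e-6 * (1 + i % 5), 6000) for i in range(10)}
    energies = tremor_energy(traces)
    print(max(energies, key=energies.get))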

  8. Detection and location of earthquakes in the central Aleutian subduction zone using island and ocean bottom seismograph stations

    SciTech Connect

    Frohlich, C.; Billington, S.; Engdahl, E.R.; Malahoff, A.

    1982-08-10

    A network of eight University of Texas ocean bottom seismographs (OBS) operated for 6 weeks in 1978 about 50 km offshore of Adak Island, Alaska, and nearby islands. In 1979 a similar network of nine instruments was deployed for 7 weeks farther offshore, within and up to 100 km seaward of the Aleutian trench. For shallow earthquakes on the outer trench slope, for shallow earthquakes in the thrust zone, and for intermediate-depth events we have analyzed the OBS and island-based network data and evaluated the island network's capabilities for earthquake detection and location and for focal mechanism determination. Our three major conclusions are presented. The first concerns shallow earthquakes on the outer trench slope. In 1979 about 30 earthquakes occurred within the Aleutian trench and up to 60 km seaward of the trench axis. The island network located none of these events and detected P phases for only three of them. Ray tracing shows that the islands lie in a geometric shadow zone for events on the outer trench slope. The best-located events are shallower than 20 km and exhibit first motions consistent with normal faulting. Several authors have suggested that these events are caused by bending of the oceanic lithosphere at the outer rise prior to subduction. If so, then the event locations reported here show that the bending stresses exceed the strength of the lithosphere only in a narrow zone extending about 10 km landward and 60 km seaward of the trench axis. The second conclusion concerns shallow earthquakes in the thrust zone. Epicenters determined by island stations alone are virtually identical to epicenters determined using data from both island and OBS stations. The third conclusion concerns earthquakes deeper than 70 km. Epicenters determined using island network stations alone lie 10 to 80 km south of those determined using OBS and island stations, with the differences between epicenters depending both on event depth and on the velocity model used.

  9. Earthquake locations determined by the Southern Alaska seismograph network for October 1971 through May 1989

    USGS Publications Warehouse

    Fogleman, Kent A.; Lahr, John C.; Stephens, Christopher D.; Page, Robert A.

    1993-01-01

    This report describes the instrumentation and evolution of the U.S. Geological Survey's regional seismograph network in southern Alaska, provides phase and hypocenter data for seismic events from October 1971 through May 1989, reviews the location methods used, and discusses the completeness of the catalog and the accuracy of the computed hypocenters. Included are arrival time data for explosions detonated under the Trans-Alaska Crustal Transect (TACT) in 1984 and 1985. The U.S. Geological Survey (USGS) operated a regional network of seismographs in southern Alaska from 1971 to the mid 1990s. The principal purpose of this network was to record seismic data to be used to precisely locate earthquakes in the seismic zones of southern Alaska, delineate seismically active faults, assess seismic risks, document potential premonitory earthquake phenomena, investigate current tectonic deformation, and study the structure and physical properties of the crust and upper mantle. A task fundamental to all of these goals was the routine cataloging of parameters for earthquakes located within and adjacent to the seismograph network. The initial network of 10 stations, 7 around Cook Inlet and 3 near Valdez, was installed in 1971. In subsequent summers additions or modifications to the network were made. By the fall of 1973, 26 stations extended from western Cook Inlet to eastern Prince William Sound, and 4 stations were located to the east between Cordova and Yakutat. A year later 20 additional stations were installed. Thirteen of these were placed along the eastern Gulf of Alaska with support from the National Oceanic and Atmospheric Administration (NOAA) under the Outer Continental Shelf Environmental Assessment Program to investigate the seismicity of the outer continental shelf, a region of interest for oil exploration. Since then the region covered by the network remained relatively fixed while efforts were made to make the stations more reliable through improved electronic

  10. Determination of the Fault Plane of the 2013 Santa Cruz Earthquake, Bolivia, Through Relative Location of Aftershocks

    NASA Astrophysics Data System (ADS)

    Rivadeneyra Vera, J. C.; Assumpcao, M.

    2015-12-01

    The Central Andes of southern Bolivia is a highly seismic region due to the faults present in this area, which could eventually generate earthquakes of up to Mw 8.5. Nevertheless, most events are shallow and have low magnitude. In 2013, an earthquake of Mw 5.0 occurred in Santa Cruz de la Sierra; it was followed by five aftershocks in the two months after the mainshock. Distances between the epicenters of the aftershocks and the mainshock are up to 34 km, which is greater than expected for an earthquake of this magnitude. Additionally, the uncertainty of the epicenters is around 20 km; this scenario is not suitable for studies looking to determine the seismogenic fault orientation. Using data from South American stations of the international network of the Incorporated Research Institutions for Seismology (IRIS) and a relative location technique that uses surface waves (usually the clearest phase in noisy seismograms), the epicenters of five aftershocks of the Santa Cruz series were determined relative to the mainshock. This method enabled us to achieve epicentral locations with uncertainties smaller than 2 km; the distances between the aftershocks and the mainshock are up to 7 km, in accordance with the magnitude of the earthquake. The result of the relative location showed a N-S trend of the epicenters, in agreement with the location and orientation of the Mandeyapecua fault, the largest reverse fault in Bolivia. Key words: Relative location, Surface waves

  11. Improvements of the Ray-Tracing Based Method Calculating Hypocentral Loci for Earthquake Location

    NASA Astrophysics Data System (ADS)

    Zhao, A. H.

    2014-12-01

    Hypocentral loci are very useful for reliable and visual earthquake location. However, they can hardly be expressed analytically when the velocity model is complex. One of the methods for calculating them numerically is based on a minimum traveltime tree algorithm for tracing rays: a focal locus is represented in terms of ray paths in its residual field from the minimum point (namely the initial point) to low-residual points (referred to as reference points of the focal locus). The method has no restrictions on the complexity of the velocity model but still lacks the ability to deal correctly with multi-segment loci. Additionally, it is rather laborious to set calculation parameters for obtaining loci with satisfying completeness and fineness. In this study, we improve the ray-tracing based numerical method to overcome its disadvantages. (1) Reference points of a hypocentral locus are selected from nodes of the model cells that it passes through, by means of a so-called peeling method. (2) The calculation domain of a hypocentral locus is defined as a low-residual area whose connected regions each include one segment of the locus, and hence all the focal locus segments are calculated separately with the minimum traveltime tree algorithm for tracing rays, by repeatedly assigning the minimum-residual reference point among those that have not yet been traced as an initial point. (3) Short ray paths without branching are removed to make the calculated locus finer. Numerical tests show that the improved method becomes capable of efficiently calculating complete and fine hypocentral loci of earthquakes in a complex model.

  12. Aftershocks of the 2014 M6 South Napa Earthquake: Detection, Location, and Focal Mechanisms

    NASA Astrophysics Data System (ADS)

    Hardebeck, J.; Shelly, D. R.

    2014-12-01

    The aftershock sequence of the South Napa earthquake is notable both for its low productivity and for its geometric complexity. The aftershocks do not clearly define a fault plane consistent with the NNW-striking vertical plane implied by the mainshock moment tensor and the mapped surface rupture, but instead seem to delineate multiple secondary structures at depth. We investigate this unusual sequence by identifying additional aftershocks that do not appear in the network catalog, relocating the combined aftershock catalog using waveform cross-correlation arrival times and double-difference techniques, and determining focal mechanisms for individual events and event clusters. Additional aftershocks are detected by applying a matched filter approach to the continuous seismic data at nearby stations, with the catalog earthquakes serving as the waveform templates. In tandem with new event detections, we measure precise differential arrival times between events, which we then use in double-difference event location. We detect about 4 times as many well-located aftershocks as in the network catalog. We relocate the events using double-difference in both a 1D and a 3D velocity model. Most of the aftershocks occur between 8 and 11 km depth, similar depth to the mainshock hypocenter and deeper than most of the slip imaged seismically and geodetically. The aftershocks form a diffuse NNW-trending structure, primarily to the north of the mainshock hypocenter and on the west side of the main surface rupture. Within this diffuse trend there are clusters of aftershocks, some suggesting a N-S strike, and some that appear to dip to the east or west. Preliminary single-event and composite focal mechanisms also imply N-S striking strike-slip structures. The mainshock hypocenter and many of the aftershocks occur near the intersection of a sharply defined NE-dipping seismicity structure and the probable location of the West Napa fault, suggesting that stress is concentrated at a
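    A generic matched-filter detector of the kind described above can be sketched in a few lines. The thresholding scheme (a multiple of the median absolute deviation of the correlation trace) and the synthetic data are assumptions made for illustration, not the exact implementation used in the South Napa study.

    import numpy as np

    def matched_filter_detect(continuous, template, threshold=8.0):
        """Slide a normalised template along continuous data and flag samples
        where the correlation coefficient exceeds `threshold` times the median
        absolute deviation of the correlation trace."""
        n = len(template)
        t = (template - template.mean()) / (template.std() + 1e-20)
        cc = np.empty(len(continuous) - n + 1)
        for i in range(len(cc)):
            w = continuous[i:i + n]
            cc[i] = np.dot(t, (w - w.mean()) / (w.std() + 1e-20)) / n
        mad = np.median(np.abs(cc - np.median(cc)))
        return np.flatnonzero(np.abs(cc) > threshold * mad), cc

    # Synthetic stream containing two copies of the template in noise:
    rng = np.random.default_rng(2)
    template = np.sin(np.linspace(0.0, 20.0, 200)) * np.hanning(200)
    stream = rng.normal(0.0, 0.2, 5000)
    stream[1000:1200] += template
    stream[3300:3500] += template
    print(matched_filter_detect(stream, template)[0])  # indices near 1000 and 3300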

  13. Influence of static stress changes on earthquake locations in southern California

    NASA Astrophysics Data System (ADS)

    Harris, Ruth A.; Simpson, Robert W.; Reasenberg, Paul A.

    1995-05-01

    EARTHQUAKES induce changes in static stress on neighbouring faults that may delay, hasten or even trigger subsequent earthquakes [1-10]. The length of time over which such effects persist has a bearing on the potential contribution of stress analyses to earthquake hazard assessment, but is presently unknown. Here we use an elastic half-space model [11] to estimate the static stress changes generated by damaging (magnitude M>=5) earthquakes in southern California over the past 26 years, and to investigate the influence of these changes on subsequent earthquake activity. We find that, in the 1.5-year period following a M>=5 earthquake, any subsequent nearby M>=5 earthquake almost always ruptures a fault that is loaded towards failure by the first earthquake. After this period, damaging earthquakes are equally likely to rupture loaded and relaxed faults. Our results suggest that there is a short period of time following a damaging earthquake in southern California in which simple Coulomb failure stress models could be used to identify regions of increased seismic hazard.
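    The Coulomb failure stress change referred to above is conventionally written as dCFS = d(tau) + mu' * d(sigma_n), with the normal-stress change positive for unclamping and mu' an effective friction coefficient. A minimal sketch with illustrative numbers (not values taken from the study):

    def coulomb_stress_change(delta_shear_mpa, delta_normal_mpa, friction=0.4, skempton=0.0):
        """Change in Coulomb failure stress on a receiver fault (MPa):
        dCFS = d(tau) + mu * (1 - B) * d(sigma_n), unclamping positive."""
        return delta_shear_mpa + friction * (1.0 - skempton) * delta_normal_mpa

    # A receiver fault loaded by 0.05 MPa of shear and unclamped by 0.02 MPa:
    print(coulomb_stress_change(0.05, 0.02))  # positive -> moved toward failure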

  14. Earthquake!

    ERIC Educational Resources Information Center

    Hernandez, Hildo

    2000-01-01

    Examines the types of damage experienced by California State University at Northridge during the 1994 earthquake and discusses the lessons learned in handling this emergency. The problem of loose asbestos is addressed. (GR)

  15. User's guide to HYPOINVERSE-2000, a Fortran program to solve for earthquake locations and magnitudes

    USGS Publications Warehouse

    Klein, Fred W.

    2002-01-01

    Hypoinverse is a computer program that processes files of seismic station data for an earthquake (such as P-wave arrival times and seismogram amplitudes and durations) into earthquake locations and magnitudes. It is one of a long line of similar USGS programs including HYPOLAYR (Eaton, 1969), HYPO71 (Lee and Lahr, 1972), and HYPOELLIPSE (Lahr, 1980). If you are new to Hypoinverse, you may want to start by glancing at the section “SOME SIMPLE COMMAND SEQUENCES” to get a feel for some simpler sessions. This document is essentially an advanced user’s guide, and reading it sequentially will probably plow the reader into more detail than he/she needs. Every user must have a crust model, station list and phase data input files, and glancing at these sections is a good place to begin. The program has many options because it has grown over the years to meet the needs of one of the largest seismic networks in the world, but small networks with just a few stations do use the program and can ignore most of the options and commands. History and availability. Hypoinverse was originally written for the Eclipse minicomputer in 1978 (Klein, 1978). A revised version for VAX and Pro-350 computers (Klein, 1985) was later expanded to include multiple crustal models and other capabilities (Klein, 1989). This current report documents the expanded Y2000 version and supersedes the earlier documents. It serves as a detailed user's guide to the current version running on unix and VAX-alpha computers, and to the version supplied with the Earthworm earthquake digitizing system. Fortran-77 source code (Sun and VAX compatible) and copies of this documentation are available via anonymous ftp from computers in Menlo Park. At present, the computer is swave.wr.usgs.gov and the directory is /ftp/pub/outgoing/klein/hyp2000. If you are running Hypoinverse on one of the Menlo Park EHZ or NCSN unix computers, the executable currently is ~klein/hyp2000/hyp2000. New features. The Y2000 version of

  16. Using XTE as Part of the IPN to Derive Accurate GRB Locations

    NASA Technical Reports Server (NTRS)

    Barthelmy, S.

    1998-01-01

    The objective of this final report was to integrate the Rossi X-Ray Timing Explorer PCA into the 3rd Interplanetary Network of gamma-ray burst detectors, to allow more bursts to be detected and accurately localized. Although the necessary software was implemented to do this at Goddard and at UC Berkeley, several factors made a full integration impossible or impractical.

  17. Accurate identification of centromere locations in yeast genomes using Hi-C.

    PubMed

    Varoquaux, Nelle; Liachko, Ivan; Ay, Ferhat; Burton, Joshua N; Shendure, Jay; Dunham, Maitreya J; Vert, Jean-Philippe; Noble, William S

    2015-06-23

    Centromeres are essential for proper chromosome segregation. Despite extensive research, centromere locations in yeast genomes remain difficult to infer, and in most species they are still unknown. Recently, the chromatin conformation capture assay, Hi-C, has been re-purposed for diverse applications, including de novo genome assembly, deconvolution of metagenomic samples and inference of centromere locations. We describe a method, Centurion, that jointly infers the locations of all centromeres in a single genome from Hi-C data by exploiting the centromeres' tendency to cluster in three-dimensional space. We first demonstrate the accuracy of Centurion in identifying known centromere locations from high coverage Hi-C data of budding yeast and a human malaria parasite. We then use Centurion to infer centromere locations in 14 yeast species. Across all microbes that we consider, Centurion predicts 89% of centromeres within 5 kb of their known locations. We also demonstrate the robustness of the approach in datasets with low sequencing depth. Finally, we predict centromere coordinates for six yeast species that currently lack centromere annotations. These results show that Centurion can be used for centromere identification for diverse species of yeast and possibly other microorganisms. PMID:25940625

  18. Seismicity in 2010 and major earthquakes recorded and located in Costa Rica from 1983 until 2012, by the local OVSICORI-UNA seismic network

    NASA Astrophysics Data System (ADS)

    Ronnie, Q.; Segura, J.; Burgoa, B.; Jimenez, W.; McNally, K. C.

    2013-05-01

    This work is the result of the analysis of existing information in the earthquake database of the Observatorio Sismológico y Vulcanológico de Costa Rica, Universidad Nacional (OVSICORI-UNA), and presents basic seismological information recorded and processed in 2010. In this year there was a transition between the software packages used to record, store and locate earthquakes. During the first three months of 2010, we used Earthworm (http://folkworm.ceri.memphis.edu/ew-doc), SEISAN (Havskov and Ottemoller, 1999) and Hypocenter (Lienert and Havskov, 1995) to capture, store and locate the earthquakes, respectively; in April 2010, ANTELOPE (http://www.brtt.com/software.html) started to be used for recording and storing, and GENLOC (Fan et al., 2006) and LOCSAT (Bratt and Bache, 1988) to locate earthquakes. GENLOC was used for local events and LOCSAT for regional and distant earthquakes. The local earthquakes were located using the 1D velocity model of Quintero and Kissling (2001), and for regional and distant earthquakes IASPEI91 (Kennett and Engdahl, 1991) was used. All the events for 2010 shown in this work were rechecked by the authors. We located 3903 earthquakes in and around Costa Rica, and 746 regional and distant seismic events were recorded (see Figure 1). In this work we also give a summary of major earthquakes recorded and located by the OVSICORI-UNA network between 1983 and 2012. Seismicity recorded by the OVSICORI-UNA network in 2010

  19. Oceanic transform earthquakes with unusual mechanisms or locations - Relation to fault geometry and state of stress in the adjacent lithosphere

    NASA Technical Reports Server (NTRS)

    Wolfe, Cecily J.; Bergman, Eric A.; Solomon, Sean C.

    1993-01-01

    Results are presented of a search for transform earthquakes departing from the pattern whereby they occur on the principal transform displacement zone (PTDZ) and have strike-slip mechanisms consistent with transform-parallel motion. The search was conducted on the basis of source mechanisms and locations taken from the Harvard centroid moment tensor catalog and the bulletin of the International Seismological Center. The source mechanisms and centroid depths of 10 such earthquakes on the St. Paul's, Marathon, Owen, Heezen, Tharp, Menard, and Rivera transforms are determined from inversions of long-period body waveforms. Much of the anomalous earthquake activity on oceanic transforms is associated with complexities in the geometry of the PTDZ or the presence of large structural features that may influence slip on the fault.

  20. Use of Loran-C navigation system to accurately determine sampling site location in an above ground cooling reservoir

    SciTech Connect

    Lockwood, R.E.; Blankinship, D.R.

    1994-12-31

    Environmental monitoring programs often require accurate determination of sampling site locations in aquatic environments. This is especially true when a "picture" of high resolution is needed for observing a changing variable in a given area and location is assumed to be important to the distribution of that variable. Sample site location can be difficult if few visible landmarks are available for reference on a large body of water. The use of navigational systems such as the Global Positioning System (GPS) and its predecessor, Loran-C, provides an excellent method for sample site location. McFarland (1992) discusses the practicality of GPS for location determination. This article discusses the use of Loran-C in a sampling scheme implemented at the South Texas Project Electrical Generating Station (STPEGS), Wadsworth, Texas.

  1. Accurate estimation of object location in an image sequence using helicopter flight data

    NASA Technical Reports Server (NTRS)

    Tang, Yuan-Liang; Kasturi, Rangachar

    1994-01-01

    In autonomous navigation, it is essential to obtain a three-dimensional (3D) description of the static environment in which the vehicle is traveling. For a rotorcraft conducting low-altitude flight, this description is particularly useful for obstacle detection and avoidance. In this paper, we address the problem of 3D position estimation for static objects from a monocular sequence of images captured from a low-altitude flying helicopter. Since the environment is static, it is well known that the optical flow in the image will produce a radiating pattern from the focus of expansion. We propose a motion analysis system which utilizes the epipolar constraint to accurately estimate 3D positions of scene objects in a real world image sequence taken from a low-altitude flying helicopter. Results show that this approach gives good estimates of object positions near the rotorcraft's intended flight-path.

  2. Improving automatic earthquake locations in subduction zones: a case study for GEOFON catalog of Tonga-Fiji region

    NASA Astrophysics Data System (ADS)

    Nooshiri, Nima; Heimann, Sebastian; Saul, Joachim; Tilmann, Frederik; Dahm, Torsten

    2015-04-01

    Automatic earthquake locations are sometimes associated with very large residuals of up to 10 s even for clear arrivals, especially at regional stations in subduction zones because of the strongly heterogeneous velocity structure there. Although these residuals are most likely related not to measurement errors but to unmodelled velocity heterogeneity, these stations are usually removed from, or down-weighted in, the location procedure. While this is possible for large events, it may not be useful if the earthquake is weak. In this case, implementation of travel-time station corrections may significantly improve the automatic locations. Here, the shrinking-box source-specific station term method (SSST) [Lin and Shearer, 2005] has been applied to improve the relative location accuracy of 1678 events that occurred in the Tonga subduction zone between 2010 and mid-2014. Picks were obtained from the GEOFON earthquake bulletin for all available station networks. We calculated a set of timing corrections for each station which vary as a function of source position. A separate time correction was computed for each source-receiver path at the given station by smoothing the residual field over nearby events. We begin with a very large smoothing radius, essentially encompassing the whole event set, and iterate by progressively shrinking the smoothing radius. In this way, we attempted to correct for the systematic errors that are introduced into the locations by inaccuracies in the assumed velocity structure, without solving for a new velocity model itself. One of the advantages of the SSST technique is that the event location part of the calculation is separate from the station term calculation and can be performed using any single-event location method. In this study, we applied a non-linear, probabilistic, global-search earthquake location method using the software package NonLinLoc [Lomax et al., 2000]. The non-linear location algorithm implemented in NonLinLoc is less
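    A single smoothing pass of the station-term calculation described above can be sketched as follows. The shrinking-box scheme repeats such passes with progressively smaller radii and relocates the events in between; that outer loop is omitted here, so this is only a schematic illustration with hypothetical numbers.

    import numpy as np

    def station_terms_one_pass(event_xyz, residuals, radius_km):
        """One smoothing pass of a source-specific station term for a single
        station: the correction attached to each event is the mean travel-time
        residual of the events within radius_km of it."""
        xyz = np.asarray(event_xyz, float)
        res = np.asarray(residuals, float)
        terms = np.empty(len(res))
        for i in range(len(res)):
            near = np.linalg.norm(xyz - xyz[i], axis=1) <= radius_km
            terms[i] = res[near].mean()
        return terms

    # Hypothetical cluster of 5 events (km coordinates) with residuals in seconds:
    xyz = [(0, 0, 50), (3, 1, 52), (2, 2, 55), (40, 5, 60), (41, 6, 61)]
    print(station_terms_one_pass(xyz, [1.2, 1.0, 1.1, -0.6, -0.4], radius_km=10.0))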

  3. Revision of earthquake hypocenter locations in GEOFON bulletin data using global source-specific station terms technique

    NASA Astrophysics Data System (ADS)

    Nooshiri, N.; Saul, J.; Heimann, S.; Tilmann, F. J.; Dahm, T.

    2015-12-01

    The use of a 1D velocity model for seismic event location is often associated with significant travel-time residuals. Particularly for regional stations in subduction zones, where the velocity structure strongly deviates from the assumed 1D model, residuals of up to ±10 seconds are observed even for clear arrivals, which leads to strongly biased locations. In fact, due to mostly regional travel-time anomalies, arrival times at regional stations do not match the location obtained with teleseismic picks, and vice versa. If the earthquake is weak and only recorded regionally, or if fast locations based on regional stations are needed, the location may be far off the corresponding teleseismic location. In this case, implementation of travel-time corrections may lead to a reduction of the travel-time residuals at regional stations and, in consequence, significantly improve the relative location accuracy. Here, we have extended the source-specific station terms (SSST) technique to regional and teleseismic distances and adapted the algorithm for probabilistic, non-linear, global-search earthquake location. The method has been applied to specific test regions using P and pP phases from the GEOFON bulletin data for all available station networks. By using this method, a set of timing corrections has been calculated for each station, varying as a function of source position. In this way, an attempt is made to correct for the systematic errors introduced by limitations and inaccuracies in the assumed velocity structure, without solving for a new earth model itself. In this presentation, we draw on examples of the application of this global SSST technique to relocate earthquakes from the Tonga-Fiji subduction zone and from the Chilean margin. Our results show a considerable decrease of the root-mean-square (RMS) residual in the final earthquake location catalogs and a major reduction of the median absolute deviation (MAD) of the travel

  4. Condensation of earthquake location distributions: Optimal spatial information encoding and application to multifractal analysis of south Californian seismicity.

    PubMed

    Kamer, Yavor; Ouillon, Guy; Sornette, Didier; Wössner, Jochen

    2015-08-01

    We present the "condensation" method that exploits the heterogeneity of the probability distribution functions (PDFs) of event locations to improve the spatial information content of seismic catalogs. As its name indicates, the condensation method reduces the size of seismic catalogs while improving the access to the spatial information content of seismic catalogs. The PDFs of events are first ranked by decreasing location errors and then successively condensed onto better located and lower variance event PDFs. The obtained condensed catalog differs from the initial catalog by attributing different weights to each event, the set of weights providing an optimal spatial representation with respect to the spatially varying location capability of the seismic network. Synthetic tests on fractal distributions perturbed with realistic location errors show that condensation improves spatial information content of the original catalog, which is quantified by the likelihood gain per event. Applied to Southern California seismicity, the new condensed catalog highlights major mapped fault traces and reveals possible additional structures while reducing the catalog length by ∼25%. The condensation method allows us to account for location error information within a point-based spatial analysis. We demonstrate this by comparing the multifractal properties of the condensed catalog locations with those of the original catalog. We evidence different spatial scaling regimes characterized by distinct multifractal spectra and separated by transition scales. We interpret the upper scale as agreeing with the thickness of the brittle crust, while the lower scale (2.5 km) might depend on the relocation procedure. Accounting for these new results, the epidemic type aftershock model formulation suggests that, contrary to previous studies, large earthquakes dominate the earthquake triggering process. This implies that the limited capability of detecting small magnitude events cannot be used

  5. Condensation of earthquake location distributions: Optimal spatial information encoding and application to multifractal analysis of south Californian seismicity

    NASA Astrophysics Data System (ADS)

    Kamer, Yavor; Ouillon, Guy; Sornette, Didier; Wössner, Jochen

    2015-08-01

    We present the "condensation" method that exploits the heterogeneity of the probability distribution functions (PDFs) of event locations to improve the spatial information content of seismic catalogs. As its name indicates, the condensation method reduces the size of seismic catalogs while improving the access to the spatial information content of seismic catalogs. The PDFs of events are first ranked by decreasing location errors and then successively condensed onto better located and lower variance event PDFs. The obtained condensed catalog differs from the initial catalog by attributing different weights to each event, the set of weights providing an optimal spatial representation with respect to the spatially varying location capability of the seismic network. Synthetic tests on fractal distributions perturbed with realistic location errors show that condensation improves spatial information content of the original catalog, which is quantified by the likelihood gain per event. Applied to Southern California seismicity, the new condensed catalog highlights major mapped fault traces and reveals possible additional structures while reducing the catalog length by ~25%. The condensation method allows us to account for location error information within a point-based spatial analysis. We demonstrate this by comparing the multifractal properties of the condensed catalog locations with those of the original catalog. We evidence different spatial scaling regimes characterized by distinct multifractal spectra and separated by transition scales. We interpret the upper scale as agreeing with the thickness of the brittle crust, while the lower scale (2.5 km) might depend on the relocation procedure. Accounting for these new results, the epidemic type aftershock model formulation suggests that, contrary to previous studies, large earthquakes dominate the earthquake triggering process. This implies that the limited capability of detecting small magnitude events cannot be

  6. Analysis of spatiotemporal variation in b-value for the Sunda arc using high precision earthquake location

    NASA Astrophysics Data System (ADS)

    Nugraha, Andri Dian; Shiddiqi, Hasbi Ash; Widiyantoro, Sri; Sutiyono, Handayani, Titi

    2016-05-01

    The Sunda arc is one of the most active tectonic regions, with a complex tectonic setting due to the different tectonic regimes and subduction geometries along the arc. We analyzed the variation in b-value for this region in order to obtain better information regarding the state of stress in this region. As a first step, we relocated earthquake hypocenters taken from the BMKG catalog for the period 2009-2015 by employing a teleseismic double-difference (DD) relocation method and using a 3D velocity model. A total of 10,440 earthquakes were successfully relocated, with greatly reduced residual errors. Based on tectonic features and earthquake distribution, we divided the study area into 8 regions, i.e. northern Sumatra, central Sumatra, southern Sumatra, the Sunda strait, western Java, eastern Java, the Lesser Sunda Islands, and the Sunda-Banda transition zone. For the b-value analysis we combined the BMKG catalog with the International Seismological Centre (ISC) catalog from 2006 to 2009 to obtain a longer time period. We analyzed the spatial variation in b-value for the western Sunda arc and found a low b-value that matches well with the earthquake locations.
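    The b-value estimation referred to above is commonly done with the Aki-Utsu maximum-likelihood estimator; the sketch below shows that standard estimator on a synthetic catalogue and is not necessarily the exact procedure used in the study.

    import numpy as np

    def b_value_max_likelihood(magnitudes, completeness_mag, bin_width=0.1):
        """Aki (1965) / Utsu (1966) maximum-likelihood b-value for events at or
        above the completeness magnitude, with the usual half-bin correction."""
        m = np.asarray(magnitudes, float)
        m = m[m >= completeness_mag]
        mean_excess = m.mean() - (completeness_mag - bin_width / 2.0)
        return np.log10(np.e) / mean_excess

    # Synthetic catalogue drawn with a true b-value of 1.0 (continuous magnitudes,
    # so no bin correction is applied):
    rng = np.random.default_rng(1)
    mags = 4.5 + rng.exponential(scale=1.0 / np.log(10), size=5000)
    print(round(b_value_max_likelihood(mags, 4.5, bin_width=0.0), 2))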

  7. FCaZm intelligent recognition system for locating areas prone to strong earthquakes in the Andean and Caucasian mountain belts

    NASA Astrophysics Data System (ADS)

    Gvishiani, A. D.; Dzeboev, B. A.; Agayan, S. M.

    2016-07-01

    The fuzzy clustering and zoning method (FCAZm) of systems analysis is suggested for recognizing the areas of probable generation of the epicenters of significant, strong, and the strongest earthquakes. FCAZm is a modified version of the previous FCAZ algorithmic system, advanced by the creation of artificial intelligence blocks that develop the system-forming algorithms. FCAZm has been applied to recognize areas where the epicenters of the strongest (M ≥ 7.75) earthquakes within the Andes mountain belt in South America and of significant (M ≥ 5) earthquakes in the Caucasus can emerge. The reliability of the obtained results was assessed by seismic-history-type control experiments. The recognized highly seismic zones were compared with those previously recognized by the EPA method and by the initial version of the FCAZ system. The modified FCAZm system enabled us to pass from simple pattern recognition in the problem of recognizing the locations of probable emergence of strong earthquakes to systems analysis. In particular, using FCAZm we managed to uniquely recognize a subsystem of highly seismically active zones from the nonempty complement using the exact boundary.

  8. Seismic monitoring of EGS tests at the Coso Geothermal area, California, using accurate MEQ locations and full moment tensors

    SciTech Connect

    Foulger, G.R.; Julian, B.R.; Monastero, F.

    2008-04-01

    We studied high-resolution relative locations and full moment tensors of microearthquakes (MEQs) occurring before, during and following Enhanced Geothermal Systems (EGS) experiments in two wells at the Coso geothermal area, California. The objective was to map new fractures, determine the mode and sense of failure, and characterize the stress cycle associated with injection. New software developed for this work combines waveform cross-correlation measurement of arrival times with relative relocation methods, and assesses confidence regions for moment tensors derived using linear-programming methods. For moment tensor determination we also developed a convenient Graphical User Interface (GUI) to streamline the work. We used data from the U.S. Navy’s permanent network of three-component digital borehole seismometers and from 14 portable three-component digital instruments. The latter supplemented the permanent network during injection experiments in well 34A-9 in 2004 and well 34-9RD2 in 2005. In the experiment in well 34A-9, the co-injection earthquakes were more numerous, smaller, more explosive and had more horizontal motion than the pre-injection earthquakes. In the experiment in well 34-9RD2 the relocated hypocenters reveal a well-defined planar structure, 700 m long and 600 m high in the depth range 0.8 to 1.4 km below sea level, striking N 20° E and dipping at 75° to the WNW. The moment tensors show that it corresponds to a mode I (opening) crack. For both wells, the perturbed stress state near the bottom of the well persisted for at least two months following the injection.

  9. Accurate location of nuclear explosions at Azgir, Kazakhstan, from satellite images and seismic data: Implications for monitoring decoupled explosions

    NASA Astrophysics Data System (ADS)

    Sykes, Lynn R.; Deng, Jishu; Lyubomirskiy, Paul

    1993-09-01

    The 10 largest tamped nuclear explosions detonated by the Former Soviet Union in and near two salt domes near Azgir were relocated using seismic data and the locations of shot points on a SPOT satellite image taken in 1988. Many of the shot points are clearly recognized on the satellite image and can be located with an accuracy of 60 m even though testing was carried out at those points many years earlier, i.e., between 1966 and 1979. Onsite inspections and a local seismic monitoring network, combined with our accurate locations of previous explosions, would ensure that any cavities that remain standing from those events could not be used for undetected decoupled nuclear testing down to a very small yield. Since the Azgir area, like much of the Pre-Caspian depression, is arid, it would not be a suitable place for constructing large cavities in salt by solution mining and then using them for clandestine nuclear testing.

  10. Earthquake.

    PubMed

    Cowen, A R; Denney, J P

    1994-04-01

    On January 25, 1 week after the most devastating earthquake in Los Angeles history, the Southern California Hospital Council released the following status report: 928 patients evacuated from damaged hospitals. 805 beds available (136 critical, 669 noncritical). 7,757 patients treated/released from EDs. 1,496 patients treated/admitted to hospitals. 61 dead. 9,309 casualties. Where do we go from here? We are still waiting for the "big one." We'll do our best to be ready when Mother Nature shakes, rattles and rolls. The efforts of Los Angeles City Fire Chief Donald O. Manning cannot be overstated. He maintained department command of this major disaster and is directly responsible for implementing the fire department's Disaster Preparedness Division in 1987. Through the chief's leadership and ability to forecast consequences, the city of Los Angeles was better prepared than ever to cope with this horrendous earthquake. We also pay tribute to the men and women who are out there each day, where "the rubber meets the road." PMID:10133439

  11. High precision earthquake locations reveal seismogenic structure beneath Mammoth Mountain, California

    USGS Publications Warehouse

    Prejean, S.; Stork, A.; Ellsworth, W.; Hill, D.; Julian, B.

    2003-01-01

    In 1989, an unusual earthquake swarm occurred beneath Mammoth Mountain that was probably associated with magmatic intrusion. To improve our understanding of this swarm, we relocated Mammoth Mountain earthquakes using a double difference algorithm. Relocated hypocenters reveal that most earthquakes occurred on two structures, a near-vertical plane at 7-9 km depth that has been interpreted as an intruding dike, and a circular ring-like structure at ~5.5 km depth, above the northern end of the inferred dike. Earthquakes on this newly discovered ring structure form a conical section that dips outward away from the aseismic interior. Fault-plane solutions indicate that in 1989 the seismicity ring was slipping as a ring-normal fault as the center of the mountain rose with respect to the surrounding crust. Seismicity migrated around the ring, away from the underlying dike at a rate of ~0.4 km/month, suggesting that fluid movement triggered seismicity on the ring fault. Copyright 2003 by the American Geophysical Union.

  12. Location and source mechanism of the Karlsruhe earthquake of 24 September 2014

    NASA Astrophysics Data System (ADS)

    Barth, Andreas

    2016-02-01

    On 24 September 2014, a ML 2.3 earthquake occurred southwest of the urban area of Karlsruhe, Germany, which was felt by a few people (maximum intensity I 0 = III). It was the first seismic event in this highly populated area since an I 0 = VII earthquake in 1948. Data of 35 permanent and temporary seismometers were analysed to localise the event and to determine the focal mechanism to compare it to previous seismicity. Restricting the data to P- and S-phases from 18 nearby stations and optimising the local earth model result in an epicentre in the southwest of the city at 48.986°N/8.302°E and in a hypocentral depth of 10 km. To calculate the focal mechanism, 22 P- and 5 SH-polarities were determined that constrain a stable left lateral strike-slip focal mechanism with a minor thrusting component and nodal planes striking NE-SW and NW-SE. The epicentre lies in the vicinity of the I 0 = VII earthquake of 1948. Both events are part of the graben-parallel flower structure beneath the Upper Rhine Graben, parallel to the active Rastatt source zone, which runs 5 km further east and included the epicentre of the 1933 Rastatt I 0 = VII earthquake. The focal mechanisms of the 2014 and 1948 earthquakes show NE-SW striking nodal planes that dip to the southeast. However, for the 1948 event, a normal faulting mechanism was determined earlier. Taking the uncertainty of the epicentre and focal mechanism in 1948 and its fault dimensions into account, both events might have happened on the same fault plane.

  13. Location and source mechanism of the Karlsruhe earthquake of 24 September 2014

    NASA Astrophysics Data System (ADS)

    Barth, Andreas

    2016-07-01

    On 24 September 2014, an ML 2.3 earthquake occurred southwest of the urban area of Karlsruhe, Germany, which was felt by a few people (maximum intensity I0 = III). It was the first seismic event in this highly populated area since an I0 = VII earthquake in 1948. Data from 35 permanent and temporary seismometers were analysed to localise the event and to determine the focal mechanism to compare it to previous seismicity. Restricting the data to P- and S-phases from 18 nearby stations and optimising the local earth model result in an epicentre in the southwest of the city at 48.986°N/8.302°E and in a hypocentral depth of 10 km. To calculate the focal mechanism, 22 P- and 5 SH-polarities were determined that constrain a stable left-lateral strike-slip focal mechanism with a minor thrusting component and nodal planes striking NE-SW and NW-SE. The epicentre lies in the vicinity of the I0 = VII earthquake of 1948. Both events are part of the graben-parallel flower structure beneath the Upper Rhine Graben, parallel to the active Rastatt source zone, which runs 5 km further east and included the epicentre of the 1933 Rastatt I0 = VII earthquake. The focal mechanisms of the 2014 and 1948 earthquakes show NE-SW striking nodal planes that dip to the southeast. However, for the 1948 event, a normal faulting mechanism was determined earlier. Taking the uncertainty of the epicentre and focal mechanism in 1948 and its fault dimensions into account, both events might have happened on the same fault plane.

  14. A more accurate relocation of the 2013 Ms 7.0 Lushan, Sichuan, China, earthquake sequence, and the seismogenic structure analysis

    NASA Astrophysics Data System (ADS)

    Long, F.; Wen, X. Z.; Ruan, X.; Zhao, M.; Yi, G. X.

    2015-07-01

    We use a combined earthquake location technique to relocate the Ms 7.0 Lushan, Sichuan, China, earthquake sequence of April 20, 2013. A stepwise approach, employing three existing location methods (the HYPOINVERSE method, the Minimum 1-D model, and the Double Difference method), is used to improve location precision by iteratively revising the velocity model, station corrections, and hypocenter relocations throughout the process. Our stepwise approach has significantly improved the location precision of the Lushan earthquake sequence, yielding hypocenter locations with final errors of 359, 309, and 605 m in the E-W, N-S, and vertical directions, respectively, with average travel time residuals of 0.12 s. Furthermore, we analyzed the seismogenic structure surrounding the Lushan earthquake sequence by combining the results of the relocated hypocenter distribution with new focal mechanism solutions and information from regional geological and geophysical investigations. From our analysis, we conclude that the vast majority of the aftershocks of the Lushan earthquake sequence occurred at depths of 6-9 km, near the front of the southwestern segment of the NE-trending Longmenshan fault zone. Densely aligned hypocenters clearly suggest that the seismogenic structure of the mainshock consists of a set of basal thrust faults dipping to the NW at 40-50°, at a ramp of the deep basal décollement-thrust system at depths of 7-18 km. Focal mechanism solutions suggest that the seismogenic faults have produced almost pure thrusting. At least one SE-dipping back-thrust is also observed within the basement, as indicated by the hypocenter relocations, which points to either a secondary rupture plane during the mainshock or a plane of aftershock slips. A small number of minor events in the Lushan sequence are located at depths of 0-6 km, with a distribution suggesting that the three NE-trending faults with surface traces running through or passing close to the aftershock area are
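
    The stepwise relocation described above ultimately relies on iterative, linearized least-squares updates of the hypocenter (Geiger's method). The sketch below shows one such update loop in a uniform half-space; the velocity, station layout, and starting location are illustrative assumptions, not parameters of the Lushan study.

```python
import numpy as np

# Geiger-style iterative hypocenter update in a uniform half-space.
# Velocity, stations, and starting model are illustrative assumptions.
V = 6.0                                   # assumed P velocity, km/s
stations = np.array([[0., 0., 0.], [30., 5., 0.], [10., 25., 0.],
                     [-20., 15., 0.], [5., -30., 0.]])

true_hypo = np.array([4.0, 6.0, 12.0])    # km
true_t0 = 1.5                             # origin time, s

def tt(h, s):
    return np.linalg.norm(h - s) / V

t_obs = np.array([true_t0 + tt(true_hypo, s) for s in stations])

# Starting model: shallow location near the network centre, t0 = 0.
m = np.array([0.0, 0.0, 5.0, 0.0])        # x, y, z, origin time

for it in range(10):
    hypo, t0 = m[:3], m[3]
    pred = np.array([t0 + tt(hypo, s) for s in stations])
    resid = t_obs - pred
    # Partial derivatives of arrival time w.r.t. x, y, z and origin time.
    G = np.zeros((len(stations), 4))
    for k, s in enumerate(stations):
        d = np.linalg.norm(hypo - s)
        G[k, :3] = (hypo - s) / (d * V)
        G[k, 3] = 1.0
    dm, *_ = np.linalg.lstsq(G, resid, rcond=None)
    m = m + dm
    if np.linalg.norm(dm[:3]) < 1e-4:      # stop once the update is tiny
        break

final_resid = t_obs - np.array([m[3] + tt(m[:3], s) for s in stations])
print("recovered hypocenter (km):", np.round(m[:3], 3))
print("recovered origin time (s):", round(float(m[3]), 3))
print("rms residual (s):", round(float(np.sqrt(np.mean(final_resid**2))), 6))
```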

  15. Accurate modeling and inversion of electrical resistivity data in the presence of metallic infrastructure with known location and dimension

    SciTech Connect

    Johnson, Timothy C.; Wellman, Dawn M.

    2015-06-26

    Electrical resistivity tomography (ERT) has been widely used in environmental applications to study processes associated with subsurface contaminants and contaminant remediation. Anthropogenic alterations in subsurface electrical conductivity associated with contamination often originate from highly industrialized areas with significant amounts of buried metallic infrastructure. The deleterious influence of such infrastructure on imaging results generally limits the utility of ERT where it might otherwise prove useful for subsurface investigation and monitoring. In this manuscript we present a method of accurately modeling the effects of buried conductive infrastructure within the forward modeling algorithm, thereby removing them from the inversion results. The method is implemented in parallel using immersed interface boundary conditions, whereby the global solution is reconstructed from a series of well-conditioned partial solutions. Forward modeling accuracy is demonstrated by comparison with analytic solutions. Synthetic imaging examples are used to investigate imaging capabilities within a subsurface containing electrically conductive buried tanks, transfer piping, and well casing, using both well casings and vertical electrode arrays as current sources and potential measurement electrodes. Results show that, although accurate infrastructure modeling removes the dominating influence of buried metallic features, the presence of metallic infrastructure degrades imaging resolution compared to standard ERT imaging. However, accurate imaging results may be obtained if electrodes are appropriately located.

  16. 3-D P- and S-wave velocity structure and low-frequency earthquake locations in the Parkfield, California region

    NASA Astrophysics Data System (ADS)

    Zeng, Xiangfang; Thurber, Clifford H.; Shelly, David R.; Harrington, Rebecca M.; Cochran, Elizabeth S.; Bennington, Ninfa L.; Peterson, Dana; Guo, Bin; McClement, Kara

    2016-09-01

    To refine the 3-D seismic velocity model in the greater Parkfield, California region, a new data set including regular earthquakes, shots, quarry blasts and low-frequency earthquakes (LFEs) was assembled. Hundreds of traces of each LFE family at two temporary arrays were stacked with the time-frequency domain phase-weighted stacking method to improve signal-to-noise ratio. We extend our model resolution to lower crustal depth with LFE data. Our results image not only previously identified features but also low-velocity zones (LVZs) in the area around the LFEs and the lower crust beneath the southern Rinconada Fault. The former LVZ is consistent with high fluid pressure that can account for several aspects of LFE behaviour. The latter LVZ is consistent with a high conductivity zone in magnetotelluric studies. A new Vs model was developed with S picks that were obtained with a new autopicker. At shallow depth, the low Vs areas underlie the strongest shaking areas in the 2004 Parkfield earthquake. We relocate LFE families and analyse the location uncertainties with the NonLinLoc and tomoDD codes. The two methods yield similar results.
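
    The stacking step above uses a time-frequency domain phase-weighted stack. The sketch below illustrates the simpler time-domain variant of phase weighting, in which the linear stack is scaled by the inter-trace phase coherence derived from the analytic signal; the synthetic traces and the sharpness exponent are assumptions for illustration only, not the study's processing parameters.

```python
import numpy as np
from scipy.signal import hilbert

# Time-domain phase-weighted stack (PWS) of noisy traces; a simpler cousin of
# the time-frequency PWS cited above. Traces and exponent nu are synthetic.
rng = np.random.default_rng(0)
n_traces, npts, dt = 50, 500, 0.01
t = np.arange(npts) * dt

# Identical low-amplitude wavelet buried in noise on every trace.
wavelet = np.exp(-((t - 2.5) / 0.05) ** 2) * np.sin(2 * np.pi * 8 * (t - 2.5))
traces = 0.2 * wavelet + rng.normal(0.0, 1.0, (n_traces, npts))

linear = traces.mean(axis=0)

# Instantaneous phase of each trace from its analytic signal.
phase = np.angle(hilbert(traces, axis=1))
coherence = np.abs(np.exp(1j * phase).mean(axis=0))   # ~1 where phases align

nu = 2.0                                   # sharpness exponent (assumed)
pws = linear * coherence ** nu

# Crude SNR comparison using the pre-signal window as the noise estimate.
print("linear-stack SNR :", round(np.abs(linear).max() / linear[:100].std(), 2))
print("phase-weighted SNR:", round(np.abs(pws).max() / pws[:100].std(), 2))
```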

  17. Fault location and source process of the Boumerdes, Algeria, earthquake inferred from geodetic and strong motion data

    NASA Astrophysics Data System (ADS)

    Semmane, Fethi; Campillo, Michel; Cotton, Fabrice

    2005-01-01

    The Boumerdes earthquake occurred on a fault whose precise location, offshore the Algerian coast, was unknown. Geodetic data are used to determine the absolute position of the fault. The fault might emerge at about 15 km offshore. Accelerograms are used to infer the space-time history of the rupture using a two-step inversion in the spectral domain. The observed strong motion records agree with the synthetics for the fault location inferred from geodetic data. The fault plane ruptured for about 18 seconds. The slip distribution on the fault indicates one asperity northwest of the hypocenter with maximum slip amplitude about 3 m. This asperity is probably responsible for most of the damage. Another asperity with slightly smaller slip amplitude is located southeast of the hypocenter. The rupture stops its westward propagation close to the Thenia fault, a structure almost perpendicular to the main fault.

  18. Detection and location of shallow very low frequency earthquakes along the Nankai trough and the Ryukyu trench

    NASA Astrophysics Data System (ADS)

    Asano, Y.; Matsuzawa, T.; Obara, K.

    2013-12-01

    We have investigated the spatiotemporal distribution of shallow very low frequency earthquakes (VLFEs) along the Nankai trough and the Ryukyu trench. Three-component seismograms recorded at broadband stations of the NIED F-net were analyzed by using waveform-correlation and back-projection techniques after applying a band-pass filter (0.02 to 0.05 Hz). Here we used known VLFEs and regular interplate earthquakes near the trench axis as template events. Time series of the cross-correlation function (CC) at each station were calculated from continuous waveform data and triggered seismograms of template events with a length of 180 s. Assuming surface wave propagation with a velocity of 3.8 km/s, CCs are back-propagated onto possible origin times and horizontal locations. We obtained VLFE epicenters by performing a grid search in time and space domains with spacing of 1 s and 0.025 degrees, respectively, to maximize the averaged CCs from all stations. First, we choose grid points with averaged CCs larger than 0.5. If these grid points have similar origin times within 180 s, we assume that they reflect the same event and choose the VLFE candidate having the largest averaged CC. If some grid points are detected in the same time window from different template events, we choose the VLFE candidate with the largest averaged CC from grid points located within 100 km from the template event. VLFEs were finally identified by removing regular earthquakes listed in the JMA catalogue from all candidates. As a result of the analysis for data from October 2009 to February 2010, two episodes of VLFE activity were detected. One episode was located east of the M6.8 interplate earthquake which occurred on October 30, 2009 along the Ryukyu trench. The VLFE seismicity was quite active just after the M6.8 earthquake and decreased smoothly with elapsed time. Such time-dependent seismicity may be related to the post-seismic slip following the M6.8 earthquake. Another
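
    The detection scheme above back-propagates station CC time series onto a grid of trial origin times and epicentres. A minimal sketch of that grid search is given below, assuming the same 3.8 km/s surface-wave speed but with a made-up station layout and synthetic CC traces rather than F-net data.

```python
import numpy as np

# Back-project station CC time series onto a horizontal grid of trial
# epicentres and origin times, keeping the point with the largest mean CC.
# Station layout, grid extent and the synthetic CC traces are assumptions.
v = 3.8                                    # km/s, assumed surface-wave speed
dt = 1.0                                   # s, sampling of the CC series
npts = 400

stations = np.array([[0., 120.], [150., 40.], [60., -130.], [-110., -60.]])
true_epi, true_t0 = np.array([20.0, -10.0]), 60.0

times = np.arange(npts) * dt
cc = np.zeros((len(stations), npts))
for k, s in enumerate(stations):
    arrival = true_t0 + np.linalg.norm(true_epi - s) / v
    cc[k] = np.exp(-0.5 * ((times - arrival) / 5.0) ** 2)   # synthetic CC peak

xs = np.arange(-50, 51, 5.0)
ys = np.arange(-50, 51, 5.0)
best = (-1.0, None, None)
for x in xs:
    for y in ys:
        delays = np.linalg.norm(np.array([x, y]) - stations, axis=1) / v
        for t0 in times:
            idx = np.round((t0 + delays) / dt).astype(int)
            if idx.max() >= npts:
                continue
            score = cc[np.arange(len(stations)), idx].mean()
            if score > best[0]:
                best = (score, (x, y), t0)

print("best averaged CC:", round(best[0], 3))
print("estimated epicentre (km):", best[1], " origin time (s):", best[2])
```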

  19. Magnitudes and locations of the 1811-1812 New Madrid, Missouri, and the 1886 Charleston, South Carolina, earthquakes

    USGS Publications Warehouse

    Bakun, W.H.; Hopper, M.G.

    2004-01-01

    We estimate locations and moment magnitudes M and their uncertainties for the three largest events in the 1811-1812 sequence near New Madrid, Missouri, and for the 1 September 1886 event near Charleston, South Carolina. The intensity magnitude MI, our preferred estimate of M, is 7.6 for the 16 December 1811 event that occurred in the New Madrid seismic zone (NMSZ) on the Bootheel lineament or on the Blytheville seismic zone. MI is 7.5 for the 23 January 1812 event for a location on the New Madrid north zone of the NMSZ and 7.8 for the 7 February 1812 event that occurred on the Reelfoot blind thrust of the NMSZ. Our preferred locations for these events lie on the NMSZ segments preferred by Johnston and Schweig (1996). Our estimates of M are 0.1-0.4 M units less than those of Johnston (1996b) and 0.3-0.5 M units greater than those of Hough et al. (2000). MI is 6.9 for the 1 September 1886 event for a location at the Summerville-Middleton Place cluster of recent small earthquakes located about 30 km northwest of Charleston.

  20. Accurate Analysis of the Change in Volume, Location, and Shape of Metastatic Cervical Lymph Nodes During Radiotherapy

    SciTech Connect

    Takao, Seishin; Tadano, Shigeru; Taguchi, Hiroshi; Yasuda, Koichi; Onimaru, Rikiya; Ishikawa, Masayori; Bengua, Gerard; Suzuki, Ryusuke; Shirato, Hiroki

    2011-11-01

    Purpose: To establish a method for the accurate acquisition and analysis of the variations in tumor volume, location, and three-dimensional (3D) shape of tumors during radiotherapy in the era of image-guided radiotherapy. Methods and Materials: Finite element models of lymph nodes were developed based on computed tomography (CT) images taken before the start of treatment and every week during the treatment period. A surface geometry map with a volumetric scale was adopted and used for the analysis. Six metastatic cervical lymph nodes, 3.5 to 55.1 cm³ before treatment, in 6 patients with head and neck carcinomas were analyzed in this study. Three fiducial markers implanted in mouthpieces were used for the fusion of CT images. Changes in the location of the lymph nodes were measured on the basis of these fiducial markers. Results: The surface geometry maps showed convex regions in red and concave regions in blue to ensure that the characteristics of the 3D tumor geometries are simply understood visually. After the irradiation of 66 to 70 Gy in 2 Gy daily doses, the patterns of the colors had not changed significantly, and the maps before and during treatment were strongly correlated (average correlation coefficient was 0.808), suggesting that the tumors shrank uniformly, maintaining the original characteristics of the shapes in all 6 patients. The movement of the gravitational center of the lymph nodes during the treatment period was everywhere less than ±5 mm except in 1 patient, in whom the change reached nearly 10 mm. Conclusions: The surface geometry map was useful for an accurate evaluation of the changes in volume and 3D shapes of metastatic lymph nodes. The fusion of the initial and follow-up CT images based on fiducial markers enabled an analysis of changes in the location of the targets. Metastatic cervical lymph nodes in patients were suggested to decrease in size without significant changes in the 3D shape during radiotherapy. The movements of the

  1. Locations and types of ruptures involved in the 2008 Wenchuan earthquake revealed by SAR image matching

    NASA Astrophysics Data System (ADS)

    Kobayashi, T.; Takada, Y.; Furuya, M.; Murakami, M.

    2009-12-01

    Introduction: A catastrophic earthquake with a moment magnitude of 7.9 struck China’s Sichuan area on 12 May 2008. The rupture was thought to proceed northeastward along the Longmen Shan fault zone (LMSFZ), but it remained uncertain where and how the faults were involved in the seismic event. Interferometric SAR (InSAR) analysis has an advantage of detecting ground deformation in a vast region with high precision. However, for the Sichuan event, the standard InSAR approach could not identify the faults directly related to the seismic rupture, due to a wide area of coherence loss in the proximity of the fault zone. Thus, in order to reveal the unknown surface displacements, we conducted a SAR image matching procedure that enables us to robustly detect large ground deformation even in an incoherent area. Although similar approaches can be taken with optical images to detect surface displacements, SAR images are advantageous because of the radar’s all-weather detection capability. In this presentation we will show a strong advantage of SAR data for inland large earthquakes. Analysis Method: We use ALOS/PALSAR data on the ascending orbital paths. We process the SAR data from a level-1.0 product using the software package Gamma. After conducting coregistration between two images acquired before and after the mainshock, we divide the single-look SAR amplitude images into patches and calculate an offset between the corresponding patches by an intensity tracking method. This method is performed by cross-correlating samples of backscatter intensity of a master image with those of a slave image. To reduce the artificial offsets in range component, we apply an elevation dependent correction incorporating SRTM3 DEM data. Results: We have successfully obtained the surface deformation in range component: A sharp displacement discontinuity with a relative motion of 1-2 m appears over a length of 200 km along the LMSFZ, which demonstrates that the main rupture has proceeded
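
    The intensity-tracking step above estimates patch offsets by cross-correlating backscatter intensity between the master and slave images. Below is a minimal sketch of that offset estimation for a single patch pair using FFT-based cross-correlation; the random image and the imposed pixel shift are synthetic stand-ins for real SAR amplitude patches.

```python
import numpy as np

# Estimate the integer pixel offset between two intensity patches by
# FFT-based cross-correlation; the image and the 3x5-pixel shift are synthetic.
rng = np.random.default_rng(1)
master = rng.random((256, 256))
shift = (3, 5)                                       # rows, cols
slave = np.roll(master, shift, axis=(0, 1))          # slave = shifted master

def patch_offset(a, b):
    """Return the integer shift of a relative to b from the correlation peak."""
    A = np.fft.fft2(a - a.mean())
    B = np.fft.fft2(b - b.mean())
    xcorr = np.fft.ifft2(A * np.conj(B)).real
    peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
    rows, cols = a.shape
    # Wrap peaks beyond half the patch back to negative shifts.
    dr = peak[0] if peak[0] <= rows // 2 else peak[0] - rows
    dc = peak[1] if peak[1] <= cols // 2 else peak[1] - cols
    return dr, dc

print("estimated offset (rows, cols):", patch_offset(slave, master))
```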

  2. Finding faces among faces: human faces are located more quickly and accurately than other primate and mammal faces.

    PubMed

    Simpson, Elizabeth A; Buchin, Zachary; Werner, Katie; Worrell, Rey; Jakobsen, Krisztina V

    2014-11-01

    We tested the specificity of human face search efficiency by examining whether there is a broad window of detection for various face-like stimuli (human and animal faces) or whether own-species faces receive greater attentional allocation. We assessed the strength of the own-species face detection bias by testing whether human faces are located more efficiently than other animal faces, when presented among various other species' faces, in heterogeneous 16-, 36-, and 64-item arrays. Across all array sizes, we found that, controlling for distractor type, human faces were located faster and more accurately than primate and mammal faces, and that, controlling for target type, searches were faster when distractors were human faces compared to animal faces, revealing more efficient processing of human faces regardless of their role as targets or distractors (Experiment 1). Critically, these effects remained when searches were for specific species' faces (human, chimpanzee, otter), ruling out a category-level explanation (Experiment 2). Together, these results suggest that human faces may be processed more efficiently than animal faces, both when task-relevant (targets) and task-irrelevant (distractors), even in direct competition with other faces. These results suggest that there is not a broad window of detection for all face-like patterns but that human adults process own-species' faces more efficiently than other species' faces. Such own-species search efficiencies may arise through experience with own-species faces throughout development or may be privileged early in development, due to the evolutionary importance of conspecifics' faces. PMID:25113852

  3. Fault location and source process of the 2003 Boumerdes, Algeria, earthquake inferred from geodetic and strong motion data.

    NASA Astrophysics Data System (ADS)

    Semmane, F.; Campillo, M.; Cotton, F.

    2004-12-01

    The Boumerdes earthquake occurred on a fault whose precise location, offshore the Algerian coast, was unknown. Geodetic data consist of GPS measurements, levelling points and coastal uplifts. They are first used to determine the absolute position of the fault. We performed a series of inversions assuming different positions and chose the model giving the smallest misfit. According to this analysis, the fault emerges at about 15 km offshore. Accelerograms are then used to infer the space-time history of rupture on the fault plane using a two-step inversion in the spectral domain. The observed strong motion records are in good agreement with the synthetics for the fault location inferred from geodetic data. The fault plane ruptured for about 16 seconds. The slip distribution on the fault indicates one asperity north-west of the hypocenter with a maximum slip amplitude larger than 2.5 m. Another asperity with slightly smaller slip amplitude is located south-east of the hypocenter. The rupture seems to stop its propagation westward when it encounters the Thenia fault, a structure almost perpendicular to the main fault. We computed the spatial distribution of ground motion predicted by this fault model and compared it with the observed damage.

  4. Time-Reversal Location of the 2004 M6.0 Parkfield Earthquake Using the Vertical Component of Seismic Data.

    NASA Astrophysics Data System (ADS)

    Larmat, C. S.; Johnson, P.; Huang, L.; Randall, G.; Patton, H.; Montagner, J.

    2007-12-01

    In this work we describe Time Reversal experiments applying seismic waves recorded from the 2004 M6.0 Parkfield Earthquake. The reverse seismic wavefield is created by time-reversing recorded seismograms and then injecting them from the seismograph locations into a whole-Earth velocity model. The concept is identical to acoustic Time-Reversal Mirror laboratory experiments except that the seismic data are numerically backpropagated through a velocity model (Fink, 1996; Ulrich et al, 2007). Data are backpropagated using the spectral-element code SPECFEM3D (Komatitsch et al, 2002), employing the velocity model s20rts (Ritsema et al, 2000). In this paper, we backpropagate only the vertical component of seismic data from about 100 broadband surface stations located worldwide (FDSN), using the period band of 23-120 s. We use only those waveforms that are highly correlated with forward-propagated synthetics. The focusing quality depends upon the type of waves back-propagated; for the vertical displacement component the possible types include body waves, Rayleigh waves, or their combination. We show that Rayleigh waves, both real and artifact, dominate the reverse movie in all cases. They are created during rebroadcast of the time-reversed signals, including body wave phases, because we use point-like-force sources for injection. The artifact waves, termed "ghosts", manifest as surface waves and do not correspond to real wave phases during the forward propagation. The surface ghost waves can significantly blur the focusing at the source. We find that the ghosts cannot be easily eliminated in the manner described by Tsogka & Papanicolaou (2002). It is necessary to understand how they are created in order to remove them during TRM studies, particularly when using only the body waves. For this moderate-magnitude earthquake we demonstrate the robustness of the TRM as an alternative location method despite the restriction to vertical component phases. One advantage of TRM location

  5. Constraining the source location of the 30 May 2015 (Mw 7.9) Bonin deep-focus earthquake using seismogram envelopes of high-frequency P waveforms: Occurrence of deep-focus earthquake at the bottom of a subducting slab

    NASA Astrophysics Data System (ADS)

    Takemura, Shunsuke; Maeda, Takuto; Furumura, Takashi; Obara, Kazushige

    2016-05-01

    In this study, the source location of the 30 May 2015 (Mw 7.9) deep-focus Bonin earthquake was constrained using P wave seismograms recorded across Japan. We focus on the propagation characteristics of high-frequency P waves. Deep-focus intraslab earthquakes typically show spindle-shaped seismogram envelopes with peak delays of several seconds and subsequent long-duration coda waves; however, both the main shock and aftershock of the 2015 Bonin event exhibited pulse-like P wave propagations with high apparent velocities (~12.2 km/s). Such P wave propagation features were reproduced by finite-difference method simulations of seismic wave propagation in the case of a slab-bottom source. The pulse-like P wave seismogram envelopes observed from the 2015 Bonin earthquake show that its source was located at the bottom of the Pacific slab at a depth of ~680 km, rather than within its middle or upper regions.
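
    The ~12.2 km/s apparent velocity quoted above is the kind of quantity one reads off a linear fit of P arrival time against epicentral distance across a network. A toy version of that fit, using synthetic distances and arrival times rather than the study's data, is sketched below.

```python
import numpy as np

# Apparent velocity from a linear fit of arrival time versus distance.
# Distances, times and noise level are synthetic illustrations.
rng = np.random.default_rng(2)
distance = np.linspace(800.0, 2000.0, 25)             # km across a network
true_slowness = 1.0 / 12.2                             # s/km
t_arr = 50.0 + true_slowness * distance + rng.normal(0.0, 0.3, distance.size)

slope, intercept = np.polyfit(distance, t_arr, 1)      # slope = apparent slowness
print("apparent velocity (km/s):", round(1.0 / slope, 2))
```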

  6. Compressional environment in the location and orientation of planetary dorsa and terrestrial earthquake fault structures

    NASA Technical Reports Server (NTRS)

    Raitala, J.

    1985-01-01

    Lunar mare ridges are not pure compressional ridges; their locations and orientations are most likely controlled by shear zones, as seen from their Riedel-shear-like arrangements. On the Moon the crustal shortening has mostly taken place within mare areas, but some young terra ridges are also observed, indicating some crustal shortening outside the mare areas as well. This shortening has, however, not reached the same intensity as in the case of lobate scarp overthrusts on Mercury.

  7. Long Period (LP) volcanic earthquake source location at Merapi volcano by using dense array techniques

    NASA Astrophysics Data System (ADS)

    Metaxian, Jean Philippe; Budi Santoso, Agus; Laurin, Antoine; Subandriyo, Subandriyo; Widyoyudo, Wiku; Arshab, Ghofar

    2015-04-01

    Since 2010, Merapi has shown unusual activity compared to previous decades. Powerful phreatic explosions are observed; some of them are preceded by LP signals. In the literature, LP seismicity is thought to originate within the fluid, and therefore to be representative of the pressurization state of the volcano plumbing system. Another model suggests that LP events are caused by slow, quasi-brittle, low stress-drop failure driven by transient upper-edifice deformations. Knowledge of the spatial distribution of LP events is fundamental for better understanding the physical processes occurring in the conduit, as well as for monitoring and improving eruption forecasting. LP events recorded at Merapi have a spectral content dominated by frequencies between 0.8 and 3 Hz. To locate the source of these events, we installed a seismic antenna composed of 4 broadband CMG-6TD Güralp stations. This network has an aperture of 300 m. It is located on the site of Pasarbubar, between 500 and 800 m from the crater rim. Two multi-parameter stations (seismic, tiltmeter, S-P) located in the same area, equipped with broadband CMG-40T Güralp sensors, may also be used to complement the antenna data. The source of LP events is located by using different approaches. In the first one, we use a method based on measuring the time delays between the onsets of LP events at each array receiver. The observed differences of time delays obtained for each pair of receivers are compared to theoretical values calculated from the travel times computed between grid nodes, which are positioned in the structure, and each receiver. In a second approach, we estimate the slowness vector by using the MUSIC algorithm applied to three-component data. From the slowness vector, we deduce the back-azimuth and the incidence angle, which give an estimate of the LP source depth in the conduit. This work is part of the Domerapi project funded by the French Agence Nationale de la Recherche (https
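
    The first approach above compares observed inter-station delays with theoretical travel times; a closely related array calculation fits a plane wave to the delays to recover horizontal slowness, back-azimuth and apparent velocity. The sketch below does that least-squares fit; the array geometry, apparent velocity and back-azimuth are invented for illustration and are not the Pasarbubar values.

```python
import numpy as np

# Least-squares plane-wave fit to inter-station delays of a small array,
# giving horizontal slowness and hence back-azimuth toward the source.
# Geometry, velocity and delays are illustrative assumptions.

# Station coordinates relative to a reference sensor (east, north), in km.
r = np.array([[0.00, 0.00],
              [0.15, 0.10],
              [-0.05, 0.22],
              [0.20, -0.12]])

# Plane wave arriving from back-azimuth 310 deg, apparent velocity 1.8 km/s.
baz_true = np.deg2rad(310.0)
p_app = 1.0 / 1.8                                    # s/km
# Propagation direction is opposite to the back-azimuth.
p_vec = -p_app * np.array([np.sin(baz_true), np.cos(baz_true)])  # (east, north)

delays = r @ p_vec                                   # t_k - t_ref = p . r_k

# Least-squares slowness estimate from the observed delays.
p_est, *_ = np.linalg.lstsq(r, delays, rcond=None)
baz_est = np.rad2deg(np.arctan2(-p_est[0], -p_est[1])) % 360.0
v_app = 1.0 / np.linalg.norm(p_est)

print("back-azimuth (deg):", round(baz_est, 1))
print("apparent velocity (km/s):", round(v_app, 2))
```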

  8. The May 29 2008 earthquake aftershock sequence within the South Iceland Seismic Zone: Fault locations and source parameters of aftershocks

    NASA Astrophysics Data System (ADS)

    Brandsdottir, B.; Parsons, M.; White, R. S.; Gudmundsson, O.; Drew, J.

    2010-12-01

    The mid-Atlantic plate boundary breaks up into a series of segments across Iceland. The South Iceland Seismic Zone (SISZ) is a complex transform zone where left-lateral E-W shear between the Reykjanes Peninsula Rift Zone and the Eastern Volcanic Zone is accommodated by bookshelf faulting along N-S lateral strike-slip faults. The SISZ is also a transient feature, migrating sideways in response to the southward propagation of the Eastern Volcanic Zone. Sequences of large earthquakes (M > 6) lasting from days to years and affecting most of the seismic zone have occurred repeatedly in historical time (last 1100 years), separated by intervals of relative quiescence lasting decades to more than a century. On May 29 2008, an Mw 6.1 earthquake struck the western part of the South Iceland Seismic Zone, followed within seconds by a slightly smaller event on a second fault ~5 km further west. Aftershocks, detected by a temporary array of 11 seismometers and three permanent Icelandic Meteorological Office stations, were located using an automated Coalescence Microseismic Mapping technique. The epicenters delineate two major and several smaller N-S faults as well as an E-W zone of activity stretching further west into the Reykjanes Peninsula Rift Zone. Fault plane solutions show both right-lateral and oblique strike-slip mechanisms along the two major N-S faults. The aftershocks deepen from 3-5 km in the north to 8-9 km in the south, suggesting that the main faults dip southwards. The faulting is interpreted to be driven by the local stress due to transform motion between two parallel segments of the divergent plate boundary crossing Iceland.

  9. 3D Travel Time Prediction for Earthquake Location - An Assessment of Methods and Models

    NASA Astrophysics Data System (ADS)

    Begnaud, M. L.; Ballard, S.; Rowe, C. A.; Young, C. J.; Steck, L.; Hipp, J. R.

    2009-12-01

    We have selected several crustal and mantle 3D models to test for travel-time prediction in a global event location context. Included are the ak135, DoE Unified, Sun et al. (2004) and MITP08 models, among others. Using the recently published tessellated 3D global ray tracing algorithm of Ballard et al., we compare and contrast our travel-time predictions through these models for a set of ~500 Ground Truth (GT) 5 or better events, most of which are chemical or nuclear explosions. We explore the degree of travel-time misfit that can be expected when integrating rays through a model using a different method, or different parameterization, from that which generated the model. For instance, we compare the effect of dynamic ray tracing vs. fixed rays through a mantle tomographic model that was generated by inverting travel-time residuals for pre-calculated, fixed rays in the 1D radial AK135 model. We examine the success of these models for not only teleseismic P arrivals but also Pn and Pg. We explore the geographic biases observed for each phase and the trade-offs encountered when models are integrated. We find that our GT travel times are best predicted through any model when the calculation is performed using methods as close as possible to those used in generation of the model, as expected. Such considerations as Earth ellipticity correction and fixed ray vs. dynamic ray tracing need to be applied appropriately for a fair evaluation. Models available to the community are thus of little practical use for global location unless their methods of derivation are also provided, although they may independently provide enlightening images of tectonic features. We conclude that towards our development of a seamless, global model and locator, existing models may best serve as starting models for a global inversion using a single, consistent ray tracing and travel-time calculation approach; thus we view our evaluation of available models as a search for the best starting

  10. The detection and location of low magnitude earthquakes in northern Norway using multi-channel waveform correlation at regional distances

    NASA Astrophysics Data System (ADS)

    Gibbons, Steven J.; Bøttger Sørensen, Mathilde; Harris, David B.; Ringdal, Frode

    2007-03-01

    A fortuitous sequence of closely spaced earthquakes in the Rana region of northern Norway, during 2005, has provided an ideal natural laboratory for investigating event detectability using waveform correlation over networks and arrays at regional distances. A small number of events between magnitude 2.0 and 3.5 were recorded with a high SNR by the Fennoscandian IMS seismic arrays at distances over 600 km and three of these events, including the largest on 24 June, displayed remarkable waveform similarity even at relatively high frequencies. In an effort to detect occurrences of smaller earthquakes in the immediate geographical vicinity of the 24 June event, a multi-channel correlation detector for the NORSAR array was run for the whole calendar year 2005 using the signal from the master event as a template. A total of 32 detections were made and all but 2 of these coincided with independent correlation detections using the other Nordic IMS array stations; very few correspond to signals detectable using traditional energy detectors. Permanent and temporary stations of the Norwegian National Seismic Network (NNSN) at far closer epicentral distances have confirmed that all but one of the correlation detections at NORSAR in fact correspond to real events. The closest stations at distances of approximately 10 km can confirm that the smallest of these events have magnitudes down to 0.5 which represents a detection threshold reduction of over 1.5 for the large-aperture NORSAR array and over 1.0 for the almost equidistant regional ARCES array. The incompleteness of the local network recordings precludes a comprehensive double-difference location for the full set of events. However, stable double-difference relative locations can be obtained for eight of the events using only the Lg phase recorded at the array stations. All events appear to be separated by less than 0.5 km. Clear peaks were observed in the NORSAR correlation coefficient traces during the coda of some of the
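
    A single-channel sketch of the correlation-detector logic described above is given below: a master-event template is slid along continuous data and windows whose normalised correlation exceeds a threshold are flagged. The synthetic waveform, noise level and the 0.7 threshold are illustrative assumptions, not the NORSAR processing parameters.

```python
import numpy as np

# Single-channel correlation detector: slide a template over continuous data
# and flag windows with high normalised correlation. All values are synthetic.
rng = np.random.default_rng(3)
fs = 50.0                                      # samples per second
tw = np.arange(0, 2.0, 1 / fs)
template = np.sin(2 * np.pi * 4.0 * tw) * np.exp(-2.0 * tw)   # master waveform

data = rng.normal(0.0, 0.2, 60 * int(fs))      # 60 s of background noise
for onset in (700, 1850):                      # two buried repeats of the master
    data[onset:onset + template.size] += template

nt = template.size
t_norm = (template - template.mean()) / template.std()
cc = np.empty(data.size - nt)
for i in range(cc.size):
    win = data[i:i + nt]
    cc[i] = np.dot(t_norm, (win - win.mean()) / win.std()) / nt

detections = np.flatnonzero(cc > 0.7)
print("samples where CC > 0.7:", detections)
```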

  11. Problematic Location and Focal Mechanism of Weak Earthquakes: Example From The February-july 2001 Sequence In Aegion, Greece

    NASA Astrophysics Data System (ADS)

    Sokos, E.; Zahradnik, J.; Jansky, J.; Serpetsidaki, A.

    An earthquake sequence comprising almost 200 events, with ML 2.0 to 4.7, occurred at about 10 km south of Aegion, the town heavily damaged by the ML = 6.2 earthquake of 1995. This region is of interest to the EC project cluster, the "Corinth Rift Laboratory". The sequence started in February 2001 and ended in July 2001. It was located by the regional short-period network of the University of Patras, PATNET, covering western Greece. The HYPO71PC method of Lee and Valdes (using various constant-velocity layered crustal models and various starting depths) indicated that the earthquakes occupied a relatively large volume, whose horizontal and vertical extent is 10 x 10 km, and 20 km, respectively. The grid-search method applied in the same models, and also in a crustal model composed of gradient layers, confirmed this result but it also revealed the existence of a dense cluster in the depth range 14 to 20 km, close to the mainshock depth. To retrieve the focal mechanism of the mainshock (April 8, ML=4.7), the amplitude spectra of complete waveforms at 0.1 to 0.2 Hz, below the corner frequency, were grid-searched for the strike, dip, and rake, using several trial depths. Three stations were employed: KER (Kernitsa, distance 17 km) and SER (Sergoula, 26 km) with digital accelerographs CMG5-T (Guralp), and DES (Desfina, 56 km), equipped with an LE-3D/5s (Lennartz) instrument; DES is operated by the Seismological Laboratory of the University of Athens. The synthetic spectra were calculated by the discrete wavenumber method of Bouchon. The best solution (depth 8 km) is given by the strike = 220, dip = 40, rake = -160, and its conjugate 114, 77, -52. The scalar seismic moment is 2.5e15 Nm, which gives the moment magnitude Mw=4.3. To assess the fault size, the waveforms (0.1 to 5.0 Hz) were fitted with a triangular moment-rate function. The best fit was obtained for the source duration of 0.3-0.4 sec, corresponding to the fault size of about 1 km. This implies average

  12. Quantifying capability of a local seismic network in terms of locations and focal mechanism solutions of weak earthquakes

    NASA Astrophysics Data System (ADS)

    Fojtíková, Lucia; Kristeková, Miriam; Málek, Jiří; Sokos, Efthimios; Csicsay, Kristián; Zahradník, Jiří

    2016-01-01

    Extension of permanent seismic networks is usually governed by a number of technical, economic, logistic, and other factors. Planned upgrade of the network can be justified by theoretical assessment of the network capability in terms of reliable estimation of the key earthquake parameters (e.g., location and focal mechanisms). It could be useful not only for scientific purposes but also as a concrete proof during the process of acquisition of the funding needed for upgrade and operation of the network. Moreover, the theoretical assessment can also identify the configuration where no improvement can be achieved with additional stations, establishing a tradeoff between the improvement and additional expenses. This paper suggests a combination of suitable methods and applies them to the Little Carpathians local seismic network (Slovakia, Central Europe), which monitors an epicentral zone important from the point of view of seismic hazard. Three configurations of the network are considered: 13 stations existing before 2011, 3 stations already added in 2011, and 7 new planned stations. Theoretical errors of the relative location are estimated by a new method, specifically developed in this paper. The resolvability of focal mechanisms determined by waveform inversion is analyzed by a recent approach based on 6D moment-tensor error ellipsoids. We consider potential seismic events situated anywhere in the studied region, thus enabling "mapping" of the expected errors. Results clearly demonstrate that the network extension remarkably decreases the errors, mainly in the planned 23-station configuration. The three-station extension of the network already made in 2011 allowed for a few real-data examples. Free software made available by the authors enables similar application in any other existing or planned networks.

  13. OMG Earthquake! Can Twitter improve earthquake response?

    NASA Astrophysics Data System (ADS)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 and 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information
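
    The detection idea above amounts to flagging minutes in which the tweet rate jumps far above its background level. A toy version of that spike test, with made-up counts and an arbitrary 10x-background threshold, is sketched below.

```python
import numpy as np

# Flag minutes whose "earthquake" tweet count far exceeds the background rate.
# Counts and the 10x-background threshold are made-up illustrations.
rng = np.random.default_rng(4)
per_minute = rng.poisson(0.02, 120)         # background: well under 1 tweet/hour
per_minute[90] = 150                        # burst in the minute after an event
per_minute[91] = 110

background = per_minute[:60].mean() + 1.0   # crude baseline, padded to avoid zero
alerts = np.flatnonzero(per_minute > 10 * background)
print("alert minutes:", alerts)
```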

  14. Fast, Accurate and Precise Mid-Sagittal Plane Location in 3D MR Images of the Brain

    NASA Astrophysics Data System (ADS)

    Bergo, Felipe P. G.; Falcão, Alexandre X.; Yasuda, Clarissa L.; Ruppert, Guilherme C. S.

    Extraction of the mid-sagittal plane (MSP) is a key step for brain image registration and asymmetry analysis. We present a fast MSP extraction method for 3D MR images, based on automatic segmentation of the brain and on heuristic maximization of the cerebro-spinal fluid within the MSP. The method is robust to severe anatomical asymmetries between the hemispheres, caused by surgical procedures and lesions. The method is also accurate with respect to MSP delineations done by a specialist. The method was evaluated on 64 MR images (36 pathological, 20 healthy, 8 synthetic), and it found a precise and accurate approximation of the MSP in all of them with a mean time of 60.0 seconds per image, mean angular variation within the same image (precision) of 1.26° and mean angular difference from specialist delineations (accuracy) of 1.64°.

  15. Joint inversion of teleseismic body-waves and geodetic data for the Mw6.8 aftershock of the Balochistan earthquake with refined epicenter location

    NASA Astrophysics Data System (ADS)

    Wei, S.; Wang, T.; Jonsson, S.; Avouac, J. P.; Helmberger, D. V.

    2014-12-01

    Aftershocks of the 2013 Balochistan earthquake are mainly concentrated along the northeastern end of the mainshock rupture despite much larger coseismic slip to the southwest. The largest event among them is an Mw 6.8 earthquake that occurred three days after the mainshock. A kinematic slip model of the mainshock was obtained by joint inversion of the teleseismic body-waves and the horizontal static deformation field derived from remote sensing optical and SAR data; the model is composed of seven fault segments with gradually changing strikes and dips [Avouac et al., 2014]. The remote sensing data provide good constraints on the fault geometry and spatial distribution of slip but no timing information. Meanwhile, the initiation of the teleseismic waveform is very sensitive to the fault geometry of the epicenter segment (strike and dip) and the spatial slip distribution but much less sensitive to the absolute location of the epicenter. The combination of the two data sets allows a much better determination of the absolute epicenter location, which is about 25 km to the southwest of the NEIC epicenter location. The well-located mainshock epicenter is used to establish path calibrations for teleseismic P-waves, which are essential for relocating the Mw 6.8 aftershock. Our grid search shows that the refined epicenter is located right at the northeastern end of the mainshock rupture. This is confirmed by the SAR offsets calculated from images acquired after the mainshock. The azimuth and range offsets display a discontinuity across the rupture trace of the mainshock. Teleseismic-only and static-only inversions, as well as joint inversions, all indicate that the aftershock ruptured an asperity extending 25 km along strike and from 8 km to 20 km in depth. The earthquake originated in a region of positive Coulomb stress change due to the mainshock and has a slip distribution complementary to the mainshock rupture at the northeastern end, suggesting that the entire seismogenic zone in the crust was

  16. Earthquake location determination using data from DOMERAPI and BMKG seismic networks: A preliminary result of DOMERAPI project

    SciTech Connect

    Ramdhan, Mohamad; Nugraha, Andri Dian; Widiyantoro, Sri; Métaxian, Jean-Philippe; Valencia, Ayunda Aulia

    2015-04-24

    The DOMERAPI project has been conducted to comprehensively study the internal structure of Merapi volcano, especially its deep structural features. The DOMERAPI earthquake monitoring network consists of 46 broad-band seismometers installed around the Merapi volcano. Earthquake hypocenter determination is a very important step for further studies, such as hypocenter relocation and seismic tomographic imaging. Ray paths from earthquake events occurring outside the Merapi region can be utilized to delineate the deep magma structure. Earthquakes occurring outside the DOMERAPI seismic network will produce an azimuthal gap greater than 180°. Owing to this situation, stations from the BMKG seismic network can be used jointly to minimize the azimuthal gap. We identified earthquake events manually and carefully, and then picked arrival times of P and S waves. The data from the DOMERAPI seismic network were combined with the BMKG data catalogue to determine earthquake events outside the Merapi region. For future work, we will also use the BPPTKG (Center for Research and Development of Geological Disaster Technology) data catalogue in order to study shallow structures beneath the Merapi volcano. The application of all data catalogues will provide good information as input for further advanced studies and volcano hazards mitigation.

  17. Alignment of leading-edge and peak-picking time of arrival methods to obtain accurate source locations

    SciTech Connect

    Roussel-Dupre, R.; Symbalisty, E.; Fox, C.; and Vanderlinde, O.

    2009-08-01

    The location of a radiating source can be determined by time-tagging the arrival of the radiated signal at a network of spatially distributed sensors. The accuracy of this approach depends strongly on the particular time-tagging algorithm employed at each of the sensors. If different techniques are used across the network, then the time tags must be referenced to a common fiducial for maximum location accuracy. In this report we derive the time corrections needed to temporally align leading-edge, time-tagging techniques with peak-picking algorithms. We focus on broadband radio frequency (RF) sources, an ionospheric propagation channel, and narrowband receivers, but the final results can be generalized to apply to any source, propagation environment, and sensor. Our analytic results are checked against numerical simulations for a number of representative cases and agree with the specific leading-edge algorithm studied independently by Kim and Eng (1995) and Pongratz (2005 and 2007).
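
    The correction derived in the report aligns leading-edge picks with peak picks. The sketch below illustrates the two pick styles on a single synthetic pulse and the offset between them; the pulse shape and the 10%-of-peak threshold are assumptions, not the report's waveforms or algorithms.

```python
import numpy as np

# Leading-edge pick (first threshold crossing) versus peak pick on one
# synthetic pulse; the pulse and the 10%-of-peak threshold are assumptions.
fs = 1.0e6                                    # 1 MHz sampling
t = np.arange(0, 2e-3, 1 / fs)                # 2 ms record
pulse = np.exp(-0.5 * ((t - 1.0e-3) / 5e-5) ** 2)    # Gaussian pulse at 1 ms

peak_time = t[np.argmax(pulse)]
threshold = 0.1 * pulse.max()
leading_edge_time = t[np.flatnonzero(pulse >= threshold)[0]]

correction = peak_time - leading_edge_time    # add to leading-edge time tags
print("leading-edge pick (ms):", round(leading_edge_time * 1e3, 4))
print("peak pick (ms):", round(peak_time * 1e3, 4))
print("alignment correction (ms):", round(correction * 1e3, 4))
```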

  18. Swift Gamma-Ray Burst Explorer: Mission Design for Rapid, Accurate Location of Gamma-ray Bursts

    NASA Technical Reports Server (NTRS)

    Bundas, David J.

    2004-01-01

    The Swift Gamma-ray Burst Explorer is a NASA Mid-sized Explorer (MIDEX) with the primary mission of determining the origins of Gamma-Ray Bursts (GRBs). It will be the first mission to autonomously respond to newly-discovered GRBs and provide immediate follow-up with narrow field instruments capable of multi-wavelength (UV, Optical, X-ray) observations. The characteristics of GRBs that are the key mission design drivers are their non-repeating and brief duration bursts of multi-wavelength photons. In addition, rapid notification of the location and characteristics of the GRBs to ground- and space-based observatories drives the end-to-end data analysis and distribution requirements. The Swift mission is managed by the GSFC, and includes an international team of contributors that each bring a unique perspective that has proven invaluable to the mission. The spacecraft bus, provided by Spectrum Astro, Inc., was procured through a Rapid Spacecraft Development Office (RSDO) contract by the GSFC. There are three instruments: the Burst Alert Telescope (BAT) provided by the GSFC; the X-Ray Telescope (XRT) provided by a team led by the Pennsylvania State University (PSU); and the Ultra-Violet Optical Telescope (UVOT), again managed by PSU. The Mission Operations Center (MOC) was developed by and is located at PSU. Science archiving and data analysis centers are located at the GSFC, in the UK and in Italy.

  19. Helicopter Based Magnetic Detection Of Wells At The Teapot Dome (Naval Petroleum Reserve No. 3 Oilfield: Rapid And Accurate Geophysical Algorithms For Locating Wells

    NASA Astrophysics Data System (ADS)

    Harbert, W.; Hammack, R.; Veloski, G.; Hodge, G.

    2011-12-01

    In this study, airborne magnetic data were collected by Fugro Airborne Surveys from a helicopter platform (Figure 1) using the Midas II system over the 39 km² NPR3 (Naval Petroleum Reserve No. 3) oilfield in east-central Wyoming. The Midas II system employs two Scintrex CS-2 cesium vapor magnetometers on opposite ends of a transversely mounted, 13.4-m long horizontal boom located amidships (Fig. 1). Each magnetic sensor had an in-flight sensitivity of 0.01 nT. Real-time compensation of the magnetic data for magnetic noise induced by maneuvering of the aircraft was accomplished using two fluxgate magnetometers mounted just inboard of the cesium sensors. The total area surveyed was 40.5 km² (NPR3) near Casper, Wyoming. The purpose of the survey was to accurately locate wells that had been drilled there during more than 90 years of continuous oilfield operation. The survey was conducted at low altitude and with closely spaced flight lines to improve the detection of wells with weak magnetic response and to increase the resolution of closely spaced wells. The survey was in preparation for a planned CO2 flood to enhance oil recovery, which requires a complete well inventory with accurate locations for all existing wells. The magnetic survey was intended to locate wells that are missing from the well database and to provide accurate locations for all wells. The well-location method combined an input dataset (for example, the leveled total magnetic field reduced to the pole) with its first and second horizontal spatial derivatives, which were then analyzed using focal statistics and finally merged using a fuzzy combination operation. Analytic signal and the Shi and Butt (2004) ZS attribute were also analyzed using this algorithm. A parameter could be adjusted to determine sensitivity. Depending on the input dataset, 88% to 100% of the wells were located, with typical values being 95% to 99% for the NPR3 field site.
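
    The well-detection recipe above combines horizontal derivatives of the gridded field and merges them with a fuzzy combination operation. The sketch below follows that outline using a standard fuzzy gamma operator on a synthetic anomaly grid; the grid, the anomalies, and the gamma value are assumptions, and the actual survey processing certainly differed in detail.

```python
import numpy as np

# Horizontal-derivative layers of a synthetic magnetic grid, normalised to
# fuzzy memberships and merged with a fuzzy gamma operator. All values assumed.
ny, nx, cell = 200, 200, 10.0                       # grid cells, 10 m spacing
y, x = np.mgrid[0:ny, 0:nx] * cell

# Synthetic total-field grid with two compact, well-casing-like anomalies.
field = np.zeros((ny, nx))
for cy, cx, amp in [(500.0, 700.0, 80.0), (1400.0, 300.0, 60.0)]:
    r2 = (x - cx) ** 2 + (y - cy) ** 2
    field += amp * np.exp(-r2 / (2 * 30.0 ** 2))

gy, gx = np.gradient(field, cell)                   # first horizontal derivatives
h1 = np.hypot(gx, gy)
g2y, g2x = np.gradient(h1, cell)                    # derivative of the gradient
h2 = np.hypot(g2x, g2y)

def fuzzify(a):
    """Rescale a layer to [0, 1] fuzzy membership values."""
    return (a - a.min()) / (a.max() - a.min() + 1e-12)

m1, m2 = fuzzify(h1), fuzzify(h2)
gamma = 0.8
# Fuzzy gamma operator: blend of fuzzy algebraic sum and algebraic product.
combined = (1 - (1 - m1) * (1 - m2)) ** gamma * (m1 * m2) ** (1 - gamma)

candidates = np.argwhere(combined > 0.6)
print("candidate cells above 0.6:", len(candidates))
```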

  20. Current Status of a Near-Real Time High Rate (1Hz) GPS Processing applied to a Network located in Spain and surrounding for Quick Earthquake Magnitude Determination

    NASA Astrophysics Data System (ADS)

    Mendoza, Leonor; Garate, Jorge; Davila, Jose Martin; Becker, Matthias; Drescher, Ralf

    2010-05-01

    The true size and tsunami potential of an earthquake can be determined using GPS data within only 15 minutes of earthquake initiation, by tracking the mean displacement of Earth's surface associated with the arrival of seismic waves (Blewitt, 2006). We are using this approach to get quick assessments of earthquake magnitudes. Data files with 1 Hz sampling from Continuous GPS (CGPS) networks located in Spain and surrounding regions are analyzed with the Bernese 5.0 software. Relative movements are computed to detect horizontal, as well as vertical, surface deformation due to large-magnitude earthquakes. Accuracy is expected at the millimetre level. Moreover, CGPS 1 Hz data are less sensitive to noise contamination than seismic data (Larson et al, 2003). UNIX scripts written in Perl make Bernese run batch processes every 15 minutes: CGPS network station data files are downloaded and analyzed automatically. The process output is a new set of coordinates for each station, which is compared with earlier solutions, looking for deformations in near real time. The poster shows the implementation and the present status of the analysis. We present results for the chosen network, and some time-series examples in the three components are also shown.
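
    The near-real-time step described above reduces to differencing each new coordinate solution against a reference and flagging displacements above a noise floor. A minimal sketch of that comparison, with hypothetical station names, offsets, and threshold, is given below.

```python
import numpy as np

# Difference the latest station solution against a reference epoch and flag
# displacements above a noise floor. Names, offsets and threshold are assumed.
stations = ["STA1", "STA2", "STA3"]
reference = np.zeros((3, 3))                        # E, N, U offsets in metres

# Latest 15-minute solution relative to the reference epoch (synthetic):
latest = np.array([[0.002, -0.001, 0.003],
                   [0.085, -0.040, -0.015],         # coseismic offset at STA2
                   [0.001,  0.002, -0.002]])

displacement = np.linalg.norm(latest - reference, axis=1)
threshold = 0.010                                   # 10 mm, assumed noise floor
for name, d in zip(stations, displacement):
    flag = "DEFORMATION" if d > threshold else "stable"
    print(f"{name}: {d * 1000:.1f} mm  {flag}")
```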

  1. Swift Gamma-ray Burst Explorer: Mission Design for Rapid, Accurate Location of Gamma-ray Bursts

    NASA Technical Reports Server (NTRS)

    Bundas, David J.

    2005-01-01

    The Swift Gamma-ray Burst Explorer is a NASA Mid-sized Explorer (MIDEX) with the primary mission of determining the origins of Gamma-Ray Bursts (GRBs). It will be the first mission to autonomously respond to newly-discovered GRBs and provide immediate follow-up with narrow field instruments capable of multi-wavelength (UV, Optical, X-ray) observations. The characteristics of GRBs that are the key mission design drivers are their non-repeating and brief duration bursts of multi-wavelength photons. In addition, rapid notification of the location and characteristics of the GRBs to ground- and space-based observatories drives the end-to-end data analysis and distribution requirements.

  2. Swift Gamma-Ray Burst Explorer: Mission Design for Rapid, Accurate Location of Gamma-ray Bursts

    NASA Technical Reports Server (NTRS)

    Bundas, David J.

    2004-01-01

    The Swift Gamma-ray Burst Explorer is a NASA Mid-sized Explorer (MIDEX) with the primary mission of determining the origins of Gamma-Ray Bursts (GRBs). It will be the first mission to autonomously respond to newly-discovered GRBs and provide immediate follow-up with narrow field instruments capable of multi-wavelength (UV, Optical, X-ray) observations. The characteristics of GRBs that are the key mission design drivers are their non-repeating and brief duration bursts of multi-wavelength photons. In addition, rapid notification of the location and characteristics of the GRBs to ground- and space-based observatories drives the end-to-end data analysis and distribution requirements.

  3. Accounts of damage from historical earthquakes in the northeastern Caribbean to aid in the determination of their location and intensity magnitudes

    USGS Publications Warehouse

    Flores, Claudia H.; ten Brink, Uri S.; Bakun, William H.

    2012-01-01

    Documentation of an event in the past depended on the population and political trends of the island, and the availability of historical documents is limited by the physical resource digitization schedule and by the copyright laws of each archive. Examples of documents accessed are governors' letters, newspapers, and other circulars published within the Caribbean, North America, and Western Europe. Key words were used to search for publications that contain eyewitness accounts of various large earthquakes. Finally, this catalog provides descriptions of damage to buildings used in previous studies for the estimation of moment intensity (MI) and location of significantly damaging or felt earthquakes in Hispaniola and in the northeastern Caribbean, all of which have been described in other studies.

  4. Earthquake location, active faulting, and P-wave velocity structure near a metamorphic massif in the eastern syntaxis of the Tibetan Plateau

    NASA Astrophysics Data System (ADS)

    Brown, L. E.; Meltzer, A.

    2011-12-01

    Within the core of the eastern syntaxis of the Himalaya, the Namche Barwa - Gyala Peri massif is a site of rapid exhumation where high-grade metamorphic rocks from the mid to lower crust are exposed at the surface. Some of the world's highest relief is observed in this massif. The two peaks, standing over 7000 m tall, are only 20 km apart and are separated by a major river, the Tsangpo, at an elevation of 2500 m. This impressive relief is maintained because these mountains constitute an actively forming, localized, antiformal structure, which is rapidly uplifting, while the Tsangpo downcuts through the structure. This tectonic situation is interesting because there appear to be feedbacks between topography and tectonics. As part of an effort to understand the dynamics associated with this localized structure, a temporary seismic network was used to record earthquakes near the massif. In this study approximately 2000 local earthquakes are used to define a 3-D velocity model and the locations of active faulting. The majority of events are part of an impressive spatial cluster which occurred during a series of earthquake swarms. This NW-trending cluster has a vertical dip, extending to 15 km in depth, and closely correlates to a topographic ridge immediately to the north of Namche Barwa. The Tsangpo takes a sharp turn when it reaches this ridge, flowing parallel to the base of the ridge, and the river then makes a dramatic 180° turn around the ridge. Given that the river's erosional power is thought to be responsible for localizing deformation into this area, it is significant that the Tsangpo's course through this area might be fault controlled. A second cluster of events is located to the west of Gyala Peri and trends to the north. Comparing this cluster to a geologic map shows that the events fall on a mapped thrust fault. This fault extends to the south of the massif, where there were no recorded events. This portion of the fault is interpreted to be locked. The

  5. The 7.9 Denali Fault, Alaska Earthquake of November 3, 2002: Aftershock Locations, Moment Tensors and Focal Mechanisms from the Regional Seismic Network Data

    NASA Astrophysics Data System (ADS)

    Ratchkovski, N. A.; Hansen, R. A.; Kore, K. R.

    2003-04-01

    The largest earthquake ever recorded on the Denali fault system (magnitude 7.9) struck central Alaska on November 3, 2002. It was preceded by a magnitude 6.7 earthquake on October 23. This earlier earthquake and its zone of aftershocks were located ~20 km to the west of the 7.9 quake. Aftershock locations and surface slip observations from the 7.9 quake indicate that the rupture was predominately unilateral in the eastward direction. Geologists mapped a ~300-km-long rupture and measured maximum offsets of 8.8 meters. The 7.9 event ruptured three different faults. The rupture began on the northeast-trending Susitna Glacier Thrust fault, a splay fault south of the Denali fault. Then the rupture transferred to the Denali fault and propagated eastward for 220 km. At about 143°W the rupture moved onto the adjacent southeast-trending Totschunda fault and propagated for another 55 km. The cumulative length of the 6.7 and 7.9 aftershock zones along the Denali and Totschunda faults is about 380 km. The earthquakes were recorded and processed by the Alaska Earthquake Information Center (AEIC). The AEIC acquires and processes data from the Alaska Seismic Network, consisting of over 350 seismograph stations. Nearly 40 of these sites are equipped with broad-band sensors, some of which also have strong motion sensors. The rest of the stations are either 1- or 3-component short-period instruments. The data from these stations are collected, processed and archived at the AEIC. The AEIC staff installed a temporary seismic network of 6 instruments following the 6.7 earthquake and an additional 20 stations following the 7.9 earthquake. Prior to the 7.9 Denali Fault event, the AEIC was locating 35 to 50 events per day. After the event, the processing load increased to over 300 events per day during the first week following the event. In this presentation, we will present and interpret the aftershock location patterns, first motion focal mechanism solutions, and regional seismic

  6. Three-dimensional P- and S-wave velocity structure and low-frequency earthquake locations in the Parkfield, California region

    NASA Astrophysics Data System (ADS)

    Zeng, Xiangfang; Thurber, Clifford H.; Shelly, David R.; Harrington, Rebecca M.; Cochran, Elizabeth S.; Bennington, Ninfa L.; Peterson, Dana; Guo, Bin; McClement, Kara

    2016-06-01

    To refine the 3D seismic velocity model in the greater Parkfield, California region, a new dataset including regular earthquakes, shots, quarry blasts, and low-frequency earthquakes (LFEs) was assembled. Hundreds of traces of each LFE family at two temporary arrays were stacked with the time-frequency domain phase-weighted stacking (tf-PWS) method to improve the signal-to-noise ratio. We extend our model resolution to lower crustal depth with LFE data. Our result images not only previously identified features but also low velocity zones (LVZs) in the area around the LFEs and the lower crust beneath the southern Rinconada Fault. The former LVZ is consistent with high fluid pressure that can account for several aspects of LFE behavior. The latter LVZ is consistent with a high-conductivity zone imaged in magnetotelluric studies. A new Vs model was developed with S picks that were obtained with a new auto-picker. At shallow depth, the low Vs areas underlie the strongest shaking areas in the 2004 Parkfield earthquake. A high Vp/Vs zone in the middle crust on the northwest side of the San Andreas Fault was also revealed. We relocate LFE families and analyze the location uncertainties with the NonLinLoc and tomoDD codes. The two methods yield similar results.
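
    To illustrate the kind of coherence weighting that underlies this stacking step, the sketch below implements a plain time-domain phase-weighted stack using the analytic signal. The actual tf-PWS of the study operates in the time-frequency domain (via an S-transform), so this is only a simplified analog, and all function names and parameter values are illustrative rather than taken from the paper.

```python
# Simplified time-domain phase-weighted stack (PWS), a rough analog of tf-PWS.
# Names and parameters are illustrative, not from the original study.
import numpy as np
from scipy.signal import hilbert

def phase_weighted_stack(traces, nu=2.0):
    """traces: (n_traces, n_samples) array of aligned LFE waveforms.
    Returns a stack whose samples are down-weighted where the
    instantaneous phases of the traces are incoherent."""
    analytic = hilbert(traces, axis=1)              # analytic signal per trace
    phasors = np.exp(1j * np.angle(analytic))       # unit phasors
    coherence = np.abs(phasors.mean(axis=0))        # 1 = coherent, 0 = random
    linear = traces.mean(axis=0)                    # ordinary linear stack
    return linear * coherence**nu                   # phase-weighted stack

# Toy example: stack 100 noisy copies of a synthetic wavelet.
rng = np.random.default_rng(0)
t = np.linspace(0, 4, 400)
wavelet = np.sin(2 * np.pi * 2.0 * t) * np.exp(-(t - 2.0) ** 2 / 0.1)
traces = wavelet + 2.0 * rng.standard_normal((100, t.size))
stack = phase_weighted_stack(traces)
```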

  7. Connecting slow earthquakes to huge earthquakes

    NASA Astrophysics Data System (ADS)

    Obara, Kazushige; Kato, Aitaro

    2016-07-01

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes.

  8. Connecting slow earthquakes to huge earthquakes.

    PubMed

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. PMID:27418504

  9. Using a modified time-reverse imaging technique to locate low-frequency earthquakes on the San Andreas Fault near Cholame, California

    NASA Astrophysics Data System (ADS)

    Horstmann, Tobias; Harrington, Rebecca M.; Cochran, Elizabeth S.

    2015-11-01

    We present a new method to locate low-frequency earthquakes (LFEs) within tectonic tremor episodes based on time-reverse imaging techniques. The modified time-reverse imaging technique presented here is the first method that locates individual LFEs within tremor episodes within 5 km uncertainty without relying on high-amplitude P-wave arrivals and that produces similar hypocentral locations to methods that locate events by stacking hundreds of LFEs without having to assume event co-location. In contrast to classic time-reverse imaging algorithms, we implement a modification to the method that searches for phase coherence over a short time period rather than identifying the maximum amplitude of a superpositioned wavefield. The method is independent of amplitude and can help constrain event origin time. The method uses individual LFE origin times, but does not rely on a priori information on LFE templates and families. We apply the method to locate 34 individual LFEs within tremor episodes that occur between 2010 and 2011 on the San Andreas Fault, near Cholame, California. Individual LFE location accuracies range from 2.6 to 5 km horizontally and 4.8 km vertically. Other methods that have been able to locate individual LFEs with accuracy of less than 5 km have mainly used large-amplitude events where a P-phase arrival can be identified. The method described here has the potential to locate a larger number of individual low-amplitude events with only the S-phase arrival. Location accuracy is controlled by the velocity model resolution and the wavelength of the dominant energy of the signal. Location results are also dependent on the number of stations used and are negligibly correlated with other factors such as the maximum gap in azimuthal coverage, source-station distance and signal-to-noise ratio.
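
    The key modification described above is to score candidate sources by phase coherence over a short window rather than by the peak amplitude of the superpositioned wavefield. As a much-simplified sketch of that idea (not the authors' implementation, which back-propagates full wavefields through a velocity model), the snippet below grid-searches trial sources, removes the predicted S-wave moveout assuming a constant velocity, and picks the source that maximizes inter-station phase coherence; all names, the constant velocity, and the coherence measure are assumptions.

```python
import numpy as np
from scipy.signal import hilbert

def coherence_locate(traces, dt, station_xyz, grid_xyz, vs=3.5):
    """Grid search for the trial source that maximizes inter-station phase
    coherence of back-shifted S waves (constant velocity assumed).
    traces: (n_sta, n_samp); station_xyz, grid_xyz in km; vs in km/s; dt in s."""
    phasors = np.exp(1j * np.angle(hilbert(traces, axis=1)))  # unit phasors
    n_sta = traces.shape[0]
    best_val, best_idx = -1.0, None
    for k, src in enumerate(grid_xyz):
        # predicted S travel time from trial source to each station
        tt = np.linalg.norm(station_xyz - src, axis=1) / vs
        shifts = np.round((tt - tt.min()) / dt).astype(int)
        # remove the moveout (np.roll wraps samples; acceptable for a sketch)
        aligned = np.array([np.roll(phasors[i], -shifts[i]) for i in range(n_sta)])
        coh = np.abs(aligned.mean(axis=0)).max()   # peak phase coherence
        if coh > best_val:
            best_val, best_idx = coh, k
    return grid_xyz[best_idx], best_val
```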

  10. Using a modified time-reverse imaging technique to locate low-frequency earthquakes on the San Andreas Fault near Cholame, California

    USGS Publications Warehouse

    Horstmann, Tobias; Harrington, Rebecca M.; Cochran, Elizabeth S.

    2015-01-01

    We present a new method to locate low-frequency earthquakes (LFEs) within tectonic tremor episodes based on time-reverse imaging techniques. The modified time-reverse imaging technique presented here is the first method that locates individual LFEs within tremor episodes within 5 km uncertainty without relying on high-amplitude P-wave arrivals and that produces similar hypocentral locations to methods that locate events by stacking hundreds of LFEs without having to assume event co-location. In contrast to classic time-reverse imaging algorithms, we implement a modification to the method that searches for phase coherence over a short time period rather than identifying the maximum amplitude of a superpositioned wavefield. The method is independent of amplitude and can help constrain event origin time. The method uses individual LFE origin times, but does not rely on a priori information on LFE templates and families. We apply the method to locate 34 individual LFEs within tremor episodes that occur between 2010 and 2011 on the San Andreas Fault, near Cholame, California. Individual LFE location accuracies range from 2.6 to 5 km horizontally and 4.8 km vertically. Other methods that have been able to locate individual LFEs with accuracy of less than 5 km have mainly used large-amplitude events where a P-phase arrival can be identified. The method described here has the potential to locate a larger number of individual low-amplitude events with only the S-phase arrival. Location accuracy is controlled by the velocity model resolution and the wavelength of the dominant energy of the signal. Location results are also dependent on the number of stations used and are negligibly correlated with other factors such as the maximum gap in azimuthal coverage, source–station distance and signal-to-noise ratio.

  11. The October 17, 1989, Loma Prieta, California, earthquake and its aftershocks: Geometry of the sequence from high-resolution locations

    SciTech Connect

    Dietz, L.D.; Ellsworth, W.L.

    1990-08-01

    Hypocenters of the Loma Prieta sequence form a dipping zone that rises from the mainshock hypocenter and is parallel to the mainshock nodal plane. Most aftershocks cluster around the perimeter of the zone, surrounding a relatively aseismic center which approximates the region of mainshock rupture. At its southeastern end, the dipping aftershock zone warps into a vertical surface that corresponds to the San Andreas fault. In the central and northwestern parts of the zone at depths above ~10 km, the aftershocks define numerous disjoint fault structures. The large component of reverse-slip observed in this event agrees with a simple model for slip on a dipping plane within a compressional fault bend. The authors do not believe that the Loma Prieta earthquake occurred on the Sargent fault. However, they are unable to conclude whether it ruptured the principal plate boundary fault or a less frequently active fault.

  12. Twitter earthquake detection: Earthquake monitoring in a social world

    USGS Publications Warehouse

    Earle, Paul; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
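
    A minimal sketch of a short-term-average/long-term-average detector applied to a tweet-frequency time series is shown below; the window lengths and trigger threshold are illustrative placeholders, not the tuned values used by the USGS system.

```python
import numpy as np

def sta_lta_detect(counts, sta_len=2, lta_len=60, threshold=5.0):
    """counts: tweets per minute containing the word 'earthquake'.
    Flags minutes where the short-term average exceeds `threshold`
    times the long-term average (window lengths in minutes)."""
    counts = np.asarray(counts, dtype=float)
    kernel = lambda n: np.ones(n) / n
    sta = np.convolve(counts, kernel(sta_len), mode="same")
    lta = np.convolve(counts, kernel(lta_len), mode="same") + 1e-3  # avoid /0
    ratio = sta / lta
    return np.where(ratio > threshold)[0], ratio

# Toy example: background chatter with a burst of tweets after a felt event.
minutes = np.random.poisson(2, 180)
minutes[120:126] += np.array([40, 80, 60, 30, 15, 8])
triggers, ratio = sta_lta_detect(minutes)
```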

  13. High-Resolution Locations and Focal Mechanisms of Aftershocks of the September 5, 2012 Mw=7.6 Nicoya, Costa Rica Earthquake

    NASA Astrophysics Data System (ADS)

    Duboeuf, Laure; Schwartz, Susan

    2015-04-01

    Subduction beneath the Nicoya Peninsula, Costa Rica generates the largest underthrusting earthquakes in the country with a recurrence interval of about 50 years. The most recent of these events occurred on September 5th 2012 (Mw 7.6). A vigorous aftershock sequence of more than 6400 earthquakes was recorded by a local seismic network within the first 4 months of the mainshock. We identify those aftershocks occurring on the mainshock fault plane and compare their locations to the 2012 mainshock slip distribution, the location of past interplate seismicity, and slow slip phenomena to better understand the mechanical behavior of this plate interface. Our focal mechanism determination includes all aftershocks occurring within the first nine days after the mainshock and aftershocks with magnitude greater than four occurring through the end of December 2012. We use the HASH (Hardebeck and Shearer, 2002) software package, based on first motion polarities, to obtain aftershock focal mechanisms. We are able to determine reliable focal mechanisms for 583 of the aftershocks and identify 264 of them as occurring on the plate interface. All of these are relocated using HypoDD (Waldhauser and Ellsworth, 2000) and their locations are compared with other plate boundary activity. We find no significant seismicity patterns as a function of time or magnitude, but confirm that deeper underthrusting events occur in the north compared to the south as revealed by previous studies (Newman et al., 2002). Most of the aftershocks occur in and around the updip part of the coseismic rupture zone. This suggests that the Nicoya mainshock released all of the accumulated strain in the deeper part of the plate interface, leaving none to occur as aftershocks. Previous interface seismicity in this region reveals a similar distribution to the aftershocks, however it extends to deeper depth and defines the entire seismogenic zone. The coseismic slip occurs even deeper than the background interface

  14. Real Earthquakes, Real Learning

    ERIC Educational Resources Information Center

    Schomburg, Aaron

    2003-01-01

    One teacher took her class on a year-long earthquake expedition. The goal was to monitor the occurrences of real earthquakes during the year and mark their locations with push pins on a wall-sized world map in the hallway outside the science room. The purpose of the project was to create a detailed picture of the earthquakes that occurred…

  15. Estimating earthquake potential

    USGS Publications Warehouse

    Page, R.A.

    1980-01-01

    The hazards to life and property from earthquakes can be minimized in three ways. First, structures can be designed and built to resist the effects of earthquakes. Second, the location of structures and human activities can be chosen to avoid or to limit the use of areas known to be subject to serious earthquake hazards. Third, preparations for an earthquake in response to a prediction or warning can reduce the loss of life and damage to property as well as promote a rapid recovery from the disaster. The success of the first two strategies, earthquake engineering and land use planning, depends on being able to reliably estimate the earthquake potential. The key considerations in defining the potential of a region are the location, size, and character of future earthquakes and frequency of their occurrence. Both historic seismicity of the region and the geologic record are considered in evaluating earthquake potential. 

  16. Safety and survival in an earthquake

    USGS Publications Warehouse

    U.S. Geological Survey

    1969-01-01

    Many earth scientists in this country and abroad are focusing their studies on the search for means of predicting impending earthquakes, but, as yet, an accurate prediction of the time and place of such an event cannot be made. From past experience, however, one can assume that earthquakes will continue to harass mankind and that they will occur most frequently in the areas where they have been relatively common in the past. In the United States, earthquakes can be expected to occur most frequently in the western states, particularly in Alaska, California, Washington, Oregon, Nevada, Utah, and Montana. The danger, however, is not confined to any one part of the country; major earthquakes have occurred at widely scattered locations.

  17. Crowdsourced earthquake early warning

    PubMed Central

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing. PMID:26601167

  18. Crowdsourced earthquake early warning.

    PubMed

    Minson, Sarah E; Brooks, Benjamin A; Glennie, Craig L; Murray, Jessica R; Langbein, John O; Owen, Susan E; Heaton, Thomas H; Iannucci, Robert A; Hauser, Darren L

    2015-04-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California's Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing. PMID:26601167

  19. BlueDetect: An iBeacon-Enabled Scheme for Accurate and Energy-Efficient Indoor-Outdoor Detection and Seamless Location-Based Service

    PubMed Central

    Zou, Han; Jiang, Hao; Luo, Yiwen; Zhu, Jianjie; Lu, Xiaoxuan; Xie, Lihua

    2016-01-01

    The location and contextual status (indoor or outdoor) is fundamental and critical information for upper-layer applications, such as activity recognition and location-based services (LBS) for individuals. In addition, optimizations of building management systems (BMS), such as the pre-cooling or heating process of the air-conditioning system according to the human traffic entering or exiting a building, can utilize the information, as well. The emerging mobile devices, which are equipped with various sensors, become a feasible and flexible platform to perform indoor-outdoor (IO) detection. However, power-hungry sensors, such as GPS and WiFi, should be used with caution due to the constrained battery storage on mobile devices. We propose BlueDetect: an accurate, fast response and energy-efficient scheme for IO detection and seamless LBS running on the mobile device based on the emerging low-power iBeacon technology. By leveraging the on-board Bluetooth module and our proposed algorithms, BlueDetect provides a precise IO detection service that can turn on/off on-board power-hungry sensors smartly and automatically, optimize their performances and reduce the power consumption of mobile devices simultaneously. Moreover, seamless positioning and navigation services can be realized by it, especially in a semi-outdoor environment, which cannot be achieved by GPS or an indoor positioning system (IPS) easily. We prototype BlueDetect on Android mobile devices and evaluate its performance comprehensively. The experimental results have validated the superiority of BlueDetect in terms of IO detection accuracy, localization accuracy and energy consumption. PMID:26907295

  20. BlueDetect: An iBeacon-Enabled Scheme for Accurate and Energy-Efficient Indoor-Outdoor Detection and Seamless Location-Based Service.

    PubMed

    Zou, Han; Jiang, Hao; Luo, Yiwen; Zhu, Jianjie; Lu, Xiaoxuan; Xie, Lihua

    2016-01-01

    The location and contextual status (indoor or outdoor) is fundamental and critical information for upper-layer applications, such as activity recognition and location-based services (LBS) for individuals. In addition, optimizations of building management systems (BMS), such as the pre-cooling or heating process of the air-conditioning system according to the human traffic entering or exiting a building, can utilize the information, as well. The emerging mobile devices, which are equipped with various sensors, become a feasible and flexible platform to perform indoor-outdoor (IO) detection. However, power-hungry sensors, such as GPS and WiFi, should be used with caution due to the constrained battery storage on mobile devices. We propose BlueDetect: an accurate, fast response and energy-efficient scheme for IO detection and seamless LBS running on the mobile device based on the emerging low-power iBeacon technology. By leveraging the on-board Bluetooth module and our proposed algorithms, BlueDetect provides a precise IO detection service that can turn on/off on-board power-hungry sensors smartly and automatically, optimize their performances and reduce the power consumption of mobile devices simultaneously. Moreover, seamless positioning and navigation services can be realized by it, especially in a semi-outdoor environment, which cannot be achieved by GPS or an indoor positioning system (IPS) easily. We prototype BlueDetect on Android mobile devices and evaluate its performance comprehensively. The experimental results have validated the superiority of BlueDetect in terms of IO detection accuracy, localization accuracy and energy consumption. PMID:26907295

  1. New geological perspectives on earthquake recurrence models

    SciTech Connect

    Schwartz, D.P.

    1997-02-01

    In most areas of the world the record of historical seismicity is too short or uncertain to accurately characterize the future distribution of earthquakes of different sizes in time and space. Most faults have not ruptured once, let alone repeatedly. Ultimately, the ability to correctly forecast the magnitude, location, and probability of future earthquakes depends on how well one can quantify the past behavior of earthquake sources. Paleoseismological trenching of active faults, historical surface ruptures, liquefaction features, and shaking-induced ground deformation structures provides fundamental information on the past behavior of earthquake sources. These studies quantify (a) the timing of individual past earthquakes and fault slip rates, which lead to estimates of recurrence intervals and the development of recurrence models and (b) the amount of displacement during individual events, which allows estimates of the sizes of past earthquakes on a fault. When timing and slip per event are combined with information on fault zone geometry and structure, models that define individual rupture segments can be developed. Paleoseismicity data, in the form of timing and size of past events, provide a window into the driving mechanism of the earthquake engine--the cycle of stress build-up and release.
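
    As a back-of-the-envelope illustration of how slip per event and fault slip rate combine into a mean recurrence estimate (the values below are hypothetical, not from the text):

```python
# Illustrative arithmetic only: hypothetical paleoseismic values.
slip_per_event_m = 2.0          # average displacement in one paleoearthquake (m)
slip_rate_mm_per_yr = 5.0       # long-term fault slip rate (mm/yr)
recurrence_yr = slip_per_event_m * 1000.0 / slip_rate_mm_per_yr
print(recurrence_yr)            # -> 400 years between characteristic events
```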

  2. The Rupture Process and Location of the 2003 Zemmouri-Boumerdes Earthquake (Mw 6.8) Inferred from Seismic and Geodetic Data

    NASA Astrophysics Data System (ADS)

    Santos, Rúben; Caldeira, Bento; Bezzeghoud, Mourad; Borges, José Fernando

    2015-09-01

    This work is a study of the earthquake (Mw 6.8) that occurred on May 21, 2003, in Zemmouri-Boumerdes (Algeria) using methodology based on teleseismic data, uplift measurements, and synthetic aperture radar data. As a starting point, we fix the two source fault models obtained in this work (Solution 1: strike = 64°, dip = 50°, and rake = 97°; Solution 2: strike = 256°, dip = 40°, and rake = 91°) with a length of 60 km and width of 20 km to calculate the slip distribution that best explains the seismic and geodetic observations. The interferometric fringes revealed a strong displacement in the satellite direction (~53 cm) along the coast of Algeria between the cities of Boumerdes and Zemmouri. The inversion of teleseismic body waves for the two focal solution types (one plane dipping to the SE and the second plane dipping to the NW) showed distinct ruptures. However, both bilateral ruptures included two asperities, one near the hypocentre and the other at a shallower location. The maximum slip (Solution 1 = 3.8 m and Solution 2 = 4.0 m) occurred near the hypocentre in both seismic source models. The surface displacement model was obtained with Okada's equations using the EDCMP algorithm. The three components of the displacements calculated were projected onto the satellite line-of-sight (LOS) direction for comparison with the interferogram. The geographic location of the fault plane was determined by comparing the uplift measurements with the vertical displacement models calculated with the source at several locations. The surface displacements calculated from these source models indicate that the model based on the SE plane and the epicentre location at 36.846°N and 3.660°E produces results closer to the interferogram and the uplift measurements.
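
    The comparison with the interferogram hinges on projecting the modeled east, north, and up displacements onto the satellite line of sight. A minimal sketch of that projection is given below; the unit look vector in the example is hypothetical, not the actual acquisition geometry used in the study.

```python
import numpy as np

def project_to_los(disp_enu, look_enu):
    """Dot an E/N/U surface displacement (m) with the unit line-of-sight
    vector pointing from the ground to the satellite; positive values mean
    motion toward the satellite (range decrease)."""
    look_enu = np.asarray(look_enu, dtype=float)
    look_enu /= np.linalg.norm(look_enu)
    return np.dot(np.asarray(disp_enu, dtype=float), look_enu)

# Hypothetical look vector (E, N, U) and a displacement dominated by uplift.
print(project_to_los([0.05, -0.02, 0.50], [-0.38, -0.09, 0.92]))
```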

  3. Earthquake classification, location, and error analysis in a volcanic environment: implications for the magmatic system of the 1989-1990 eruptions at redoubt volcano, Alaska

    USGS Publications Warehouse

    Lahr, J.C.; Chouet, B.A.; Stephens, C.D.; Power, J.A.; Page, R.A.

    1994-01-01

    Determination of the precise locations of seismic events associated with the 1989-1990 eruptions of Redoubt Volcano posed a number of problems, including poorly known crustal velocities, a sparse station distribution, and an abundance of events with emergent phase onsets. In addition, the high relief of the volcano could not be incorporated into the Hypoellipse earthquake location algorithm. This algorithm was modified to allow hypocenters to be located above the elevation of the seismic stations. The velocity model was calibrated on the basis of a posteruptive seismic survey, in which four chemical explosions were recorded by eight stations of the permanent network supplemented with 20 temporary seismographs deployed on and around the volcanic edifice. The model consists of a stack of homogeneous horizontal layers; setting the top of the model at the summit allows events to be located anywhere within the volcanic edifice. Detailed analysis of hypocentral errors shows that the long-period (LP) events constituting the vigorous 23-hour swarm that preceded the initial eruption on December 14 could have originated from a point 1.4 km below the crater floor. A similar analysis of LP events in the swarm preceding the major eruption on January 2 shows they also could have originated from a point, the location of which is shifted 0.8 km northwest and 0.7 km deeper than the source of the initial swarm. We suggest this shift in LP activity reflects a northward jump in the pathway for magmatic gases caused by the sealing of the initial pathway by magma extrusion during the last half of December. Volcano-tectonic (VT) earthquakes did not occur until after the initial 23-hour-long swarm. They began slowly just below the LP source and their rate of occurrence increased after the eruption at 01:52 AST on December 15, when they shifted to depths of 6 to 10 km. After January 2 the VT activity migrated gradually northward; this migration suggests northward propagating withdrawal of

  4. Location of largest earthquake slip and fast rupture controlled by along-strike change in fault structural maturity due to fault growth

    NASA Astrophysics Data System (ADS)

    Perrin, Clément; Manighetti, Isabelle; Ampuero, Jean-Paul; Cappa, Frédéric; Gaudemer, Yves

    2016-05-01

    Earthquake slip distributions are asymmetric along strike, but the reasons for the asymmetry are unknown. We address this question by establishing empirical relations between earthquake slip profiles and fault properties. We analyze the slip distributions of 27 large continental earthquakes in the context of available information on their causative faults, in particular on the directions of their long-term lengthening. We find that the largest slips during each earthquake systematically occurred on that half of the ruptured fault sections most distant from the long-term fault propagating tips, i.e., on the most mature half of the broken fault sections. Meanwhile, slip decreased linearly over most of the rupture length in the direction of long-term fault propagation, i.e., of decreasing structural maturity along strike. We suggest that this earthquake slip asymmetry is governed by along-strike changes in fault properties, including fault zone compliance and fault strength, induced by the evolution of off-fault damage, fault segmentation, and fault planarity with increasing structural maturity. We also find higher rupture speeds in more mature rupture sections, consistent with predicted effects of low-velocity damage zones on rupture dynamics. Since the direction(s) of long-term fault propagation can be determined from geological evidence, it might be possible to anticipate in which direction earthquake slip, once nucleated, may increase, accelerate, and possibly lead to a large earthquake. Our results could thus contribute to earthquake hazard assessment and Earthquake Early Warning.

  5. Real-time forecasts of tomorrow's earthquakes in California

    USGS Publications Warehouse

    Gerstenberger, M.C.; Wiemer, S.; Jones, L.M.; Reasenberg, P.A.

    2005-01-01

    Despite a lack of reliable deterministic earthquake precursors, seismologists have significant predictive information about earthquake activity from an increasingly accurate understanding of the clustering properties of earthquakes. In the past 15 years, time-dependent earthquake probabilities based on a generic short-term clustering model have been made publicly available in near-real time during major earthquake sequences. These forecasts describe the probability and number of events that are, on average, likely to occur following a mainshock of a given magnitude, but are not tailored to the particular sequence at hand and contain no information about the likely locations of the aftershocks. Our model builds upon the basic principles of this generic forecast model in two ways: it recasts the forecast in terms of the probability of strong ground shaking, and it combines an existing time-independent earthquake occurrence model based on fault data and historical earthquakes with increasingly complex models describing the local time-dependent earthquake clustering. The result is a time-dependent map showing the probability of strong shaking anywhere in California within the next 24 hours. The seismic hazard modelling approach we describe provides a better understanding of time-dependent earthquake hazard, and increases its usefulness for the public, emergency planners and the media.

  6. Earthquakes in South Carolina and Vicinity 1698-2009

    USGS Publications Warehouse

    Dart, Richard L.; Talwani, Pradeep; Stevenson, Donald

    2010-01-01

    This map summarizes more than 300 years of South Carolina earthquake history. It is one in a series of three similar State earthquake history maps. The current map and the previous two for Virginia and Ohio are accessible at http://pubs.usgs.gov/of/2006/1017/ and http://pubs.usgs.gov/of/2008/1221/. All three State earthquake maps were collaborative efforts between the U.S. Geological Survey and respective State agencies. Work on the South Carolina map was done in collaboration with the Department of Geological Sciences, University of South Carolina. As with the two previous maps, the history of South Carolina earthquakes was derived from letters, journals, diaries, newspaper accounts, academic journal articles, and, beginning in the early 20th century, instrumental recordings (seismograms). All historical (preinstrumental) earthquakes that were large enough to be felt have been located based on felt reports. Some of these events caused damage to buildings and their contents. The more recent widespread use of seismographs has allowed many smaller earthquakes, previously undetected, to be recorded and accurately located. The seismicity map shows historically located and instrumentally recorded earthquakes in and near South Carolina

  7. Absolute and relative locations of earthquakes at Mount St. Helens, Washington, using continuous data: implications for magmatic processes: Chapter 4 in A volcano rekindled: the renewed eruption of Mount St. Helens, 2004-2006

    USGS Publications Warehouse

    Thelen, Weston A.; Crosson, Robert S.; Creager, Kenneth C.

    2008-01-01

    This study uses a combination of absolute and relative locations from earthquake multiplets to investigate the seismicity associated with the eruptive sequence at Mount St. Helens between September 23, 2004, and November 20, 2004. Multiplets, a prominent feature of seismicity during this time period, occurred as volcano-tectonic, hybrid, and low-frequency earthquakes spanning a large range of magnitudes and lifespans. Absolute locations were improved through the use of a new one-dimensional velocity model with excellent shallow constraints on P-wave velocities. We used jackknife tests to minimize possible biases in absolute and relative locations resulting from station outages and changing station configurations. In this paper, we show that earthquake hypocenters shallowed before the October 1 explosion along a north-dipping structure under the 1980-86 dome. Relative relocations of multiplets during the initial seismic unrest and ensuing eruption showed rather small source volumes before the October 1 explosion and larger tabular source volumes after October 5. All multiplets possess absolute locations very close to each other. However, the highly dissimilar waveforms displayed by each of the multiplets analyzed suggest that different sources and mechanisms were present within a very small source volume. We suggest that multiplets were related to pressurization of the conduit system that produced a stationary source that was highly stable over long time periods. On the basis of their response to explosions occurring in October 2004, earthquakes not associated with multiplets also appeared to be pressure dependent. The pressure source for these earthquakes appeared, however, to be different from the pressure source of the multiplets.

  8. Location, Location, Location!

    ERIC Educational Resources Information Center

    Ramsdell, Kristin

    2004-01-01

    Of prime importance in real estate, location is also a key element in the appeal of romances. Popular geographic settings and historical periods sell, unpopular ones do not--not always with a logical explanation, as the author discovered when she conducted a survey on this topic last year. (Why, for example, are the French Revolution and the…

  9. Astronomical tides and earthquakes

    NASA Astrophysics Data System (ADS)

    Wu, Xiaoping; Mao, Wei; Huang, Yong

    2001-03-01

    A review of studies on the correlation between astronomical tides and earthquakes is given in three categories: (1) earthquakes and the relative locations of the sun, the moon and the earth, (2) earthquakes and the periods and phases of tides and (3) earthquakes and tidal stress. The first two categories mainly investigate whether there exists any dominant pattern in the relative locations of the sun, the moon and the earth during earthquakes, whether the occurrences of earthquakes cluster in any particular phase of a tidal period, and whether there is any tidal periodicity in seismic activity. By emphasizing the tidal stress at the seismic focus, the third category investigates the relationship between various seismic faults and the triggering effects of tidal stress, which reaches the crux of the issue. Possible reasons for the inconsistent results obtained with various methods and samples are analyzed, and further investigations are proposed.

  10. Array Measurements of Earthquake Rupture.

    NASA Astrophysics Data System (ADS)

    Goldstein, Peter

    Accurate measurements of earthquake rupture are an essential step in the development of an understanding of the earthquake source process. In this dissertation new array analysis techniques are developed and used to make the first measurements of two-dimensional earthquake rupture propagation. In order to measure earthquake rupture successfully it is necessary to account for the nonstationary behavior of seismic waves and nonplanar wavefronts due to time delays caused by local heterogeneities. Short time windows are also important because they determine the precision with which it is possible to measure rupture times of earthquake sources. The subarray spatial averaging and seismogram alignment methods were developed for these reasons. The basic algorithm which is used to compute frequency-wavenumber power spectra is the multiple signal characterization (MUSIC) method. Although a variety of methods could be applied with subarray spatial averaging and seismogram alignment, MUSIC is used because it has better resolution of multiple sources than other currently available methods and it provides a unique solution. Power spectra observed at the array are converted into source locations on the fault plane by tracing rays through a layered medium. A dipping layer correction factor is introduced to account for a laterally varying basin structure such as that found beneath the SMART 1 array in Taiwan. A framework is presented that allows for the estimation of precision and resolution of array measurements of source locations and can be used to design an optimum array for a given source. These methods are used to show that the November 14, 1986, ML = 7.0 Hualien, Taiwan earthquake began as a shallow event with unilateral rupture from southwest to northeast. A few seconds later a second, deeper and larger event began rupturing from below the hypocentral region from southwest to northeast slightly down-dip. Energy density estimates indicate larger energy sources at greater
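
    For readers unfamiliar with MUSIC, the sketch below computes a frequency-wavenumber pseudospectrum from narrow-band array snapshots via an eigendecomposition of the covariance matrix. It is a textbook-style illustration under a plane-wave assumption, not the subarray-averaged, seismogram-aligned variant developed in the dissertation, and all names are illustrative.

```python
import numpy as np

def music_fk(spectra, coords, kx, ky, n_sources=1):
    """MUSIC frequency-wavenumber pseudospectrum for a small array.
    spectra: (n_snapshots, n_sensors) complex Fourier coefficients at one frequency;
    coords: (n_sensors, 2) sensor positions in km; kx, ky: trial wavenumber axes
    in rad/km. Returns the pseudospectrum on the (ky, kx) grid."""
    n_sensors = coords.shape[0]
    R = spectra.conj().T @ spectra / spectra.shape[0]       # covariance matrix
    w, v = np.linalg.eigh(R)                                # eigenvalues ascending
    noise = v[:, : n_sensors - n_sources]                   # noise subspace
    P = np.empty((ky.size, kx.size))
    for iy, kyv in enumerate(ky):
        for ix, kxv in enumerate(kx):
            a = np.exp(-1j * (coords[:, 0] * kxv + coords[:, 1] * kyv))  # steering
            proj = noise.conj().T @ a                        # projection onto noise
            P[iy, ix] = 1.0 / np.real(np.vdot(proj, proj))   # peaks at source k
    return P
```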

  11. Prompt Assessment of Global Earthquakes for Response (PAGER): A System for Rapidly Determining the Impact of Earthquakes Worldwide

    USGS Publications Warehouse

    Earle, Paul; Wald, David J.; Jaiswal, Kishor S.; Allen, Trevor I.; Hearne, Michael G.; Marano, Kristin D.; Hotovec, Alicia J.; Fee, Jeremy

    2009-01-01

    Within minutes of a significant earthquake anywhere on the globe, the U.S. Geological Survey (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system assesses its potential societal impact. PAGER automatically estimates the number of people exposed to severe ground shaking and the shaking intensity at affected cities. Accompanying maps of the epicentral region show the population distribution and estimated ground-shaking intensity. A regionally specific comment describes the inferred vulnerability of the regional building inventory and, when available, lists recent nearby earthquakes and their effects. PAGER's results are posted on the USGS Earthquake Program Web site (http://earthquake.usgs.gov/), consolidated in a concise one-page report, and sent in near real-time to emergency responders, government agencies, and the media. Both rapid and accurate results are obtained through manual and automatic updates of PAGER's content in the hours following significant earthquakes. These updates incorporate the most recent estimates of earthquake location, magnitude, faulting geometry, and first-hand accounts of shaking. PAGER relies on a rich set of earthquake analysis and assessment tools operated by the USGS and contributing Advanced National Seismic System (ANSS) regional networks. A focused research effort is underway to extend PAGER's near real-time capabilities beyond population exposure to quantitative estimates of fatalities, injuries, and displaced population.

  12. Local earthquake tomography of Scotland

    NASA Astrophysics Data System (ADS)

    Luckett, Richard; Baptie, Brian

    2015-03-01

    Scotland is a relatively aseismic region for the use of local earthquake tomography, but 40 yr of earthquakes recorded by a good and growing network make it possible. A careful selection is made from the earthquakes located by the British Geological Survey (BGS) over the last four decades to provide a data set maximising arrival time accuracy and ray path coverage of Scotland. A large number of 1-D velocity models with different layer geometries are considered and differentiated by employing quarry blasts as ground-truth events. Then, SIMULPS14 is used to produce a robust 3-D tomographic P-wave velocity model for Scotland. In areas of high resolution the model shows good agreement with previously published interpretations of seismic refraction and reflection experiments. However, the model shows relatively little lateral variation in seismic velocity except at shallow depths, where sedimentary basins such as the Midland Valley are apparent. At greater depths, higher velocities in the northwest parts of the model suggest that the thickness of crust increases towards the south and east. This observation is also in agreement with previous studies. Quarry blasts used as ground truth events and relocated with the preferred 3-D model are shown to be markedly more accurate than when located with the existing BGS 1-D velocity model.
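
    Quarry blasts with known coordinates serve as ground-truth events for discriminating between candidate 1-D models. A minimal sketch of that selection step, assuming each candidate model has already been used to relocate the blasts, is shown below; names and structure are illustrative, not from the study.

```python
import numpy as np

def rank_models_by_mislocation(relocated, true_xy):
    """relocated: dict mapping model name -> (n_blasts, 2) epicenters (km)
    obtained by relocating ground-truth quarry blasts with that 1-D model;
    true_xy: (n_blasts, 2) surveyed blast coordinates (km).
    Returns model names sorted by median epicentral mislocation."""
    scores = {name: np.median(np.linalg.norm(xy - true_xy, axis=1))
              for name, xy in relocated.items()}
    return sorted(scores.items(), key=lambda item: item[1])

# Hypothetical example with two candidate models and three blasts.
true_xy = np.array([[0.0, 0.0], [5.0, 2.0], [-3.0, 4.0]])
relocated = {
    "model_A": true_xy + np.array([[0.4, -0.2], [0.6, 0.1], [-0.3, 0.5]]),
    "model_B": true_xy + np.array([[1.8, -1.1], [2.2, 0.9], [-1.5, 2.0]]),
}
print(rank_models_by_mislocation(relocated, true_xy))  # model_A ranks first
```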

  13. The Role of Color Cues in Facilitating Accurate and Rapid Location of Aided Symbols by Children with and without Down Syndrome

    ERIC Educational Resources Information Center

    Wilkinson, Krista; Carlin, Michael; Thistle, Jennifer

    2008-01-01

    Purpose: This research examined how the color distribution of symbols within a visual aided augmentative and alternative communication array influenced the speed and accuracy with which participants with and without Down syndrome located a target picture symbol. Method: Eight typically developing children below the age of 4 years, 8 typically…

  14. Earthquake swarms in Greenland

    NASA Astrophysics Data System (ADS)

    Larsen, Tine B.; Voss, Peter H.; Dahl-Jensen, Trine

    2014-05-01

    Earthquake swarms occur primarily near active volcanoes and in areas with frequent tectonic activity. However, intraplate earthquake swarms are not an unknown phenomenon. They are located near zones of weakness, e.g. in regions with geological contrasts, where dynamic processes are active. An earthquake swarm is defined as a period of increased seismicity, in the form of a cluster of earthquakes of similar magnitude, occurring in the same general area, during a limited time period. There is no obvious main shock among the earthquakes in a swarm. Earthquake swarms occur in Greenland, which is a tectonically stable, intraplate environment. The first earthquake swarms in Greenland were detected more than 30 years ago in Northern and North-Eastern Greenland. However, detection of these low-magnitude events is challenging due to the enormous distances and the relatively sparse network of seismographs. The seismograph coverage of Greenland has vastly improved since the international GLISN project was initiated in 2008. Greenland is currently covered by an open network of 19 broadband seismographs, most of them transmitting data in real time. Additionally, earthquake activity in Greenland is monitored by seismographs in Canada, Iceland, on Jan Mayen, and on Svalbard. The time series of data from the GLISN network is still short, with the latest station added in NW Greenland in 2013. However, the network has already proven useful in detecting several earthquake swarms. In this study we will focus on two swarms: one occurring near/on the East Greenland coast in 2008, and another swarm occurring in the Disko area near the west coast of Greenland in 2010. Both swarms consist of earthquakes with local magnitudes between 1.9 and 3.2. The areas where the swarms are located are regularly active with small earthquakes. The earthquake swarms are analyzed in the context of the general seismicity and the possible relationship to the local geological conditions.

  15. Earthquake activity in Oklahoma

    SciTech Connect

    Luza, K.V.; Lawson, J.E., Jr.

    1989-08-01

    Oklahoma is one of the most seismically active areas in the southern Mid-Continent. From 1897 to 1988, over 700 earthquakes are known to have occurred in Oklahoma. The earliest documented Oklahoma earthquake took place on December 2, 1897, near Jefferson, in Grant County. The largest known Oklahoma earthquake happened near El Reno on April 9, 1952. This magnitude 5.5 (mb) earthquake was felt from Austin, Texas, to Des Moines, Iowa, and covered a felt area of approximately 362,000 km². Prior to 1962, all earthquakes in Oklahoma (59) were either known from historical accounts or from seismograph stations outside the state. Over half of these events were located in Canadian County. In late 1961, the first seismographs were installed in Oklahoma. From 1962 through 1976, 70 additional earthquakes were added to the earthquake database. In 1977, a statewide network of seven semipermanent and three radio-telemetry seismograph stations was installed. The additional stations have improved earthquake detection and location in the state of Oklahoma. From 1977 to 1988, over 570 additional earthquakes were located in Oklahoma, mostly of magnitudes less than 2.5. Most of these events occurred on the eastern margin of the Anadarko basin along a zone 135 km long by 40 km wide that extends from Canadian County to the southern edge of Garvin County. Another general area of earthquake activity lies along and north of the Ouachita Mountains in the Arkoma basin. A few earthquakes have occurred in the shelves that border the Arkoma and Anadarko basins.

  16. Everyday Earthquakes.

    ERIC Educational Resources Information Center

    Svec, Michael

    1996-01-01

    Describes methods to access current earthquake information from the National Earthquake Information Center. Enables students to build genuine learning experiences using real data from earthquakes that have recently occurred. (JRH)

  17. Regional location in western China

    SciTech Connect

    Cogbill, A.H.; Steck, L.K.

    1996-10-01

    Accurately locating seismic events in western China using only regional seismic stations is a challenge. Not only is the number of seismic stations available for locating events small, but most stations available to researchers are often over 10° distant. Here the authors describe the relocation, using regional stations, of both nuclear and earthquake sources near the Lop Nor test site in western China. For such relocations, they used the Earthquake Data Reports provided by the US Geological Survey (USGS) for the reported travel times. Such reports provide a listing of all phases reported to the USGS from stations throughout the world, including many stations in the People's Republic of China. LocSAT was used as the location code. The authors systematically relocated each event in this study several times, using fewer and fewer stations at each relocation, with the farther stations being eliminated at each step. They found that location accuracy, judged by comparing solutions from few stations to the solution provided using all available stations, remained good typically until fewer than seven stations remained. With a good station distribution, location accuracy remained surprisingly good (within 7 km) using as few as 3 stations. Because these relocations were computed without good station corrections and without source-specific station corrections (that is, path corrections), they believe that such regional locations can be substantially improved, largely using static station corrections and source-specific station corrections, at least in the Lop Nor area, where sources have known locations. Elsewhere in China, one must rely upon known locations of regionally recorded explosions. Locating such sources is clearly one of the major problems to be overcome before one can provide event locations with any assurance from regional stations.

  18. Hypocenters (1977-1984) around the Richton Dome and the Melvin, Alabama, 1978 earthquake

    SciTech Connect

    Not Available

    1987-08-01

    Seventeen detected earthquakes (1977 to 1984) in the eastern Mississippi and Alabama region are relocated to determine how accurately these earthquakes can be located and what depth constraints are available. Arrival time data from the Southeastern US Seismic Network (SEUSSN) bulletins and five different velocity models are used to recalculate the hypocenter locations. Differences in locations depending on the velocity model used are small both inside the seismograph network in Alabama and at the edge of the network in eastern Mississippi. The calculated standard horizontal location errors range from 1 to 19 km, although most of the locations have errors from 2 to 10 km. In most cases, the depth is unconstrained. Since only 17 earthquakes occurred during a 7-year period in a large geographical area, no simple conclusions can be drawn about the rate of seismic activity or correspondence between earthquakes and geologic structures. The December 11, 1978, Melvin, Alabama, earthquake (mbLg = 3.5) is relocated and its possible mechanism is discussed because of its proximity to the Richton Dome. The epicenter is located near the Pickens-Gilbertown fault zone and near the Mississippi-Alabama state line. The mechanism of the Melvin earthquake cannot be determined, but the event is interpreted to be a natural tectonic event rather than an artificially induced event. 45 refs., 3 figs., 7 tabs.

  19. Repeating Earthquakes on the Queen Charlotte Plate Boundary

    NASA Astrophysics Data System (ADS)

    Hayward, T. W.; Bostock, M. G.

    2015-12-01

    The Queen Charlotte Fault (QCF) is a major plate boundary located off the northwest coast of North America that has produced large earthquakes in 1949 (M8.1) and more recently in October, 2012 (M7.8). The 2012 event was dominated by thrusting despite the fact that plate motions at the boundary are nearly transcurrent. It is now widely believed that the plate boundary comprises the QCF (i.e., a dextral strike-slip fault) as well as an element of subduction of the Pacific Plate beneath the North American Plate. Repeating earthquakes and seismic tremor have been observed in the vicinity of the QCF; providing insight into the spatial and temporal characteristics of repeating earthquakes is the goal of this research. Due to poor station coverage and data quality, traditional methods of locating earthquakes are not applicable to these events. Instead, we have implemented an algorithm to locate local (i.e., < 100 km distance to epicenter) earthquakes using a single, three-component seismogram. This algorithm relies on the P-wave polarization and, through comparison with larger local events in the Geological Survey of Canada catalogue, is shown to yield epicentral locations accurate to within 5-10 km. A total of 24 unique families of repeating earthquakes has been identified, and 4 of these families have been located with high confidence. Their epicenters locate directly on the trace of the QCF and their depths are shallow (i.e., 5-15 km), consistent with the proposed depth of the QCF. Analysis of temporal recurrence leading up to the 2012 M7.8 event reveals a non-random pattern, with an approximately 15 day periodicity. Further analysis is planned to study whether this behaviour persists after the 2012 event and to gain insight into the effects of the 2012 event on the stress field and frictional properties of the plate boundary.
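
    As a rough illustration of the single-station location idea (not the authors' algorithm, whose details are not given in the abstract), the sketch below estimates back-azimuth from the dominant horizontal polarization in the P-wave window and epicentral distance from the S-minus-P time; the velocities and names are assumed values, and the 180° polarization ambiguity would normally be resolved with the vertical component.

```python
import numpy as np

def single_station_epicenter(p_window_ne, sp_time, vp=6.0, vs=3.5):
    """Rough single-station epicenter estimate.
    p_window_ne: (n_samples, 2) north and east components in the P-wave window.
    sp_time: S-minus-P time in seconds; vp, vs in km/s (assumed values).
    Returns (distance_km, back_azimuth_deg)."""
    # Dominant horizontal polarization direction from the covariance matrix.
    cov = np.cov(p_window_ne.T)
    w, v = np.linalg.eigh(cov)
    n_comp, e_comp = v[:, -1]              # eigenvector of the largest eigenvalue
    baz = np.degrees(np.arctan2(e_comp, n_comp)) % 360.0
    # Epicentral distance from the S-P time.
    dist = sp_time * vp * vs / (vp - vs)
    return dist, baz
```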

  20. Induced Earthquakes Are Not All Alike: Examples from Texas Since 2008 (Invited)

    NASA Astrophysics Data System (ADS)

    Frohlich, C.

    2013-12-01

    The EarthScope Transportable Array passed through Texas between 2008 and 2011, providing an opportunity to identify and accurately locate earthquakes near and/or within oil/gas fields and injection waste disposal operations. In five widely separated geographical locations, the results suggest seismic activity may be induced/triggered. However, the different regions exhibit different relationships between injection/production operations and seismic activity: In the Barnett Shale of northeast Texas, small earthquakes occurred only near higher-volume (volume rate > 150,000 BWPM) injection disposal wells. These included widely reported earthquakes occurring near Dallas-Fort Worth and Cleburne in 2008 and 2009. Near Alice in south Texas, M3.9 earthquakes occurred in 1997 and 2010 on the boundary of the Stratton Field, which had been highly productive for both oil and gas since the 1950s. Both earthquakes occurred during an era of net declining production, but their focal depths and location at the field boundary suggest an association with production activity. In the Eagle Ford of south central Texas, earthquakes occurred near wells following significant increases in extraction (water+produced oil) volumes as well as injection. The largest earthquake, the M4.8 Fashing earthquake of 20 October 2011, occurred after significant increases in extraction. In the Cogdell Field near Snyder (west Texas), a sequence of earthquakes beginning in 2006 followed significant increases in the injection of CO2 at nearby wells. The largest, with M4.4, occurred on 11 September 2011. This is the largest known earthquake possibly attributable to CO2 injection. Near Timpson in east Texas a sequence of earthquakes beginning in 2008, including an M4.8 earthquake on 17 May 2012, occurred within three km of two high-volume injection disposal wells that had begun operation in 2007. These were the first known earthquakes at this location. In summary, the observations find possible induced

  1. Do earthquakes generate EM signals?

    NASA Astrophysics Data System (ADS)

    Walter, Christina; Onacha, Stephen; Malin, Peter; Shalev, Eylon; Lucas, Alan

    2010-05-01

    In recent years there has been significant interest in the seismoelectric effect, which is the conversion of acoustic energy into electromagnetic energy. At the onset of the earthquake and at layer interfaces, it is postulated that the seismoelectric signal propagates at the speed of light and thus travels much faster than the acoustic wave. The focus has mainly been to use this method as a tool for predicting earthquakes. Our main objective is to study the possibility of using the seismoelectric effect to determine the origin time of an earthquake, establish an accurate velocity model and accurately locate microearthquakes. Another aspect of this research is to evaluate the possibility of detecting porous zones where seismic activity is postulated to generate fluid movement through a porous medium. The displacement of pore fluid relative to the solid grains of the porous medium generates electromagnetic signals. The Institute of Earth Science and Engineering (IESE) has installed electromagnetic coils in 3 different areas to investigate the seismoelectric effect. Two of the research areas (Krafla in Iceland and Wairakei in New Zealand) are in active geothermal fields where high microearthquake activity has been recorded. The other area of research is at the site of the San Andreas Fault Observatory at Depth (SAFOD) in the Parkfield area on the active San Andreas Fault, which is associated with repeating earthquakes. In the Wairakei and Parkfield cases a single borehole electromagnetic coil close to borehole seismometers has been used, whereas in the Krafla study area, 3 borehole electromagnetic coils coupled to borehole seismometers have been used. The technical difficulties of working in the borehole environment mean that some of these deployments had a short life span. Nevertheless in all cases data was gathered and is being analysed. At the SAFOD site, the electromagnetic coil recorded seismoelectric signals very close to a magnitude 2 earthquake. In the Wairakei and Krafla

  2. Some characteristics of the complex El Mayor-Cucapah, MW7.2, April 4, 2010, Baja California, Mexico, earthquake, from well-located aftershock data from local and regional networks.

    NASA Astrophysics Data System (ADS)

    Frez, J.; Nava Pichardo, F. A.; Acosta, J.; Munguia, L.; Carlos, J.; García, R.

    2015-12-01

    Aftershocks from the El Mayor-Cucapah (EMC), MW7.2, April 4, 2010, Baja California, Mexico, earthquake were recorded over two months by a 31-station local array (Reftek RT130 seismographs loaned from IRIS-PASSCAL), complemented by regional data from SCSN and CICESE. The resulting data base includes 518 aftershocks with ML ≥ 3.0, plus 181 smaller events. Reliable hypocenters were determined using HYPODD and a velocity structure determined from refraction data for a mesa located to the west of the Mexicali-Imperial Valley. Aftershock hypocenters show that the El Mayor-Cucapah earthquake was a multiple event comprising two or three different ruptures of which the last one constituted the main event. The main event rupture, which extends in a roughly N45°W direction, is complex with well-defined segments having different characteristics. The main event central segment, located close to the first event epicenter, is roughly vertical, the northwest segment dips ~68°NE, while the two southeast segments dip ~60°SW and ~52°SW, respectively, which agrees with results of previous studies based on teleseismic long-period data and on GPS-InSAR. All main rupture aftershock hypocenters have depths above 10-11 km and, except for the central segment, they delineate the edges of zones with largest coseismic displacement. The two southern segments show seismicity concentrated below 5 km and 3.5 km, respectively; the paucity of shallow seismicity may be caused by the thick layer of non-consolidated sediments in this region. The ruptures delineated by aftershocks in the southern regions correspond to the Indiviso fault, unidentified until the occurrence of the EMC earthquake. The first event was relocated together with the aftershocks; the epicenter lies slightly westwards of published locations, but it definitely does not lie on, or close to, the main rupture. The focal mechanism of the first event, based on first arrival polarities, is predominantly strike-slip; the focal plane

  3. Earthquake prediction

    SciTech Connect

    Ma, Z.; Fu, Z.; Zhang, Y.; Wang, C.; Zhang, G.; Liu, D.

    1989-01-01

    Mainland China is situated at the eastern edge of the Eurasian seismic system and is the largest intra-continental region of shallow strong earthquakes in the world. Based on nine earthquakes with magnitudes ranging between 7.0 and 7.9, the book provides observational data and discusses successes and failures of earthquake prediction. Observations of various phenomena and of seismic activity occurring before and after these individual earthquakes led to the establishment of some general characteristics valid for earthquake prediction.

  4. Scientists Engage South Carolina Community in Earthquake Education and Preparedness

    NASA Astrophysics Data System (ADS)

    Hall, C.; Beutel, E.; Jaumé, S.; Levine, N.; Doyle, B.

    2008-12-01

    Scientists at the College of Charleston are working with the state of South Carolina's Emergency Management Division to increase awareness and understanding of earthquake hazards throughout South Carolina. As part of this mission, the SCEEP (South Carolina Earthquake Education and Preparedness) program was formed at the College of Charleston to promote earthquake research, outreach, and education in the state of South Carolina. Working with local, regional, state and federal offices, SCEEP has developed education programs for everyone from professional hazard management teams to formal and informal educators. SCEEP also works with the media to ensure accurate reporting of earthquake and other hazard information and to increase the public's understanding of earthquake science and earthquake seismology. As part of this program, we have developed a series of activities that can be checked out by educators for use in their classrooms and in informal education venues. These activities are designed to provide educators with the information and tools they need to teach about earthquakes and earth science adequately, informatively, and enjoyably. The toolkits contain seven activities meeting a variety of National Education Standards, not only in Science, but also in Geography, Math, Social Studies, Arts Education, History and Language Arts - providing a truly multidisciplinary toolkit for educators. The activities provide information on earthquake myths, seismic waves, elastic rebound, vectors, liquefaction, location of an epicenter, and finally South Carolina earthquakes. The activities are engaging and inquiry based, implementing strategies proven effective for piquing learners' interest in scientific phenomena. All materials are provided within the toolkit, so it is truly check and go. While the SCEEP team has provided instructions and grade-level suggestions for implementing the activity in an educational setting, the educator has full rein over what to showcase

  5. Unexpected earthquake of June 25th, 2015 in Madiun, East Java

    NASA Astrophysics Data System (ADS)

    Nugraha, Andri Dian; Supendi, Pepen; Shiddiqi, Hasbi Ash; Widiyantoro, Sri

    2016-05-01

    An earthquake with magnitude 4.2 struck Madiun and its vicinity on June 25, 2015. According to the Indonesian Meteorology, Climatology, and Geophysics Agency (BMKG), the earthquake occurred at 10:35:29 GMT+7 and was located at 7.73°S, 111.69°E, with a depth of 10 km. At least 57 houses suffered light to moderate damage. We reprocessed the earthquake waveform data to obtain an accurate hypocenter location. We manually picked P- and S-wave arrival times from 12 seismic stations in the eastern part of Java. The earthquake location was determined using the Hypoellipse code, which employs a single-event determination method. Our inversion is able to resolve the focal depth and shows that the earthquake occurred at 10:35:27.6 GMT+7 and was located at 7.6305°S, 111.7529°E with a focal depth of 14.81 km. Our location yields a smaller travel-time residual than the BMKG location. The focal mechanism of the earthquake was determined using the HASH code. We used first-arrival polarities of 9 seismic records with an azimuthal gap of less than 90°, and estimated take-off angles under the assumption of a homogeneous medium. Our focal mechanism solution shows a strike-slip mechanism with a strike direction of 163°, which may be related to a strike-slip fault in Klangon, an area to the east of Madiun.
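
    The single-event determination described above amounts to finding the origin time and hypocenter that minimize arrival-time residuals at the recording stations. As an illustration only (this is not the Hypoellipse algorithm, and the station coordinates, picks, and 6 km/s velocity below are hypothetical), a brute-force grid search over a homogeneous half-space might look like this:

```python
import numpy as np

# Hypothetical station coordinates (km, local Cartesian) and observed P arrivals (s).
stations = np.array([[0.0, 0.0, 0.0], [30.0, 5.0, 0.0], [10.0, 40.0, 0.0], [45.0, 35.0, 0.0]])
t_obs = np.array([3.1, 4.8, 5.6, 7.2])
vp = 6.0  # assumed homogeneous P velocity, km/s

def locate(stations, t_obs, vp, step=1.0, extent=60.0, zmax=30.0):
    """Brute-force grid search for the hypocenter and origin time minimizing RMS P residuals."""
    best_rms, best_xyz, best_t0 = np.inf, None, None
    for x in np.arange(0.0, extent, step):
        for y in np.arange(0.0, extent, step):
            for z in np.arange(0.0, zmax, step):
                dist = np.linalg.norm(stations - np.array([x, y, z]), axis=1)
                tt = dist / vp                    # predicted travel times
                t0 = np.mean(t_obs - tt)          # best-fitting origin time at this node
                rms = np.sqrt(np.mean((t_obs - (t0 + tt)) ** 2))
                if rms < best_rms:
                    best_rms, best_xyz, best_t0 = rms, (x, y, z), t0
    return best_xyz, best_t0, best_rms

hypo, t0, rms = locate(stations, t_obs, vp)
print(f"hypocenter (km): {hypo}, origin time: {t0:.2f} s, rms residual: {rms:.3f} s")
```

    In practice a layered velocity model, S-wave picks, and formal uncertainty estimates (as Hypoellipse provides) are needed to obtain results of the quality quoted above.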

  6. The USGS National Earthquake Information Center's Response to the Wenchuan, China Earthquake

    NASA Astrophysics Data System (ADS)

    Earle, P. S.; Wald, D. J.; Benz, H.; Sipkin, S.; Dewey, J.; Allen, T.; Jaiswal, K.; Buland, R.; Choy, G.; Hayes, G.; Hutko, A.

    2008-12-01

    Immediately after detecting the May 12th, 2008 Mw 7.9 Wenchuan Earthquake, the USGS National Earthquake Information Center (NEIC) began a coordinated effort to understand and communicate the earthquake's seismological characteristics, tectonic context, and humanitarian impact. NEIC's initial estimates of magnitude and location were distributed within 30 minutes of the quake by e-mail and text message to 70,000 users via the Earthquake Notification System. The release of these basic parameters automatically triggered the generation of more sophisticated derivative products that were used by relief and government agencies to plan their humanitarian response to the disaster. Body-wave and centroid moment tensors identified the earthquake's mechanism. Predictive ShakeMaps provided the first estimates of the geographic extent and amplitude of shaking. The initial automated population exposure estimate generated and distributed by the Prompt Assessment of Global Earthquakes for Response (PAGER) system stated that 1.2 million people were exposed to severe-to-extreme shaking (Modified Mercalli Intensity VIII or greater), indicating a large-scale disaster had occurred. NEIC's modeling of the mainshock and aftershocks was continuously refined and expanded. The length and orientation of the fault were determined from aftershocks, finite-fault models, and back-projection source imaging. Firsthand accounts of shaking intensity were collected and mapped by the "Did You Feel It" system. These results were used to refine our ShakeMaps and PAGER exposure estimates providing a more accurate assessment of the extent and enormity of the disaster. The products were organized and distributed in an event-specific summary poster and via the USGS Earthquake Program web pages where they were viewed by millions and reproduced by major media outlets (over 1/2 billion hits were served that month). Rather than just a point showing magnitude and epicenter, several of the media's schematic maps

  7. Prioritizing earthquake and tsunami alerting efforts

    NASA Astrophysics Data System (ADS)

    Allen, R. M.; Allen, S.; Aranha, M. A.; Chung, A. I.; Hellweg, M.; Henson, I. H.; Melgar, D.; Neuhauser, D. S.; Nof, R. N.; Strauss, J. A.

    2015-12-01

    The timeline of hazards associated with earthquakes ranges from seconds for the strong shaking at the epicenter, to minutes for strong shaking at more distant locations in big quakes, to tens of minutes for a local tsunami. Earthquake and tsunami warning systems must therefore include very fast initial alerts, while also taking advantage of available time in bigger and tsunami-generating quakes. At the UC Berkeley Seismological Laboratory we are developing a suite of algorithms to provide the fullest possible information about earthquake shaking and tsunami inundation from seconds to minutes after a quake. The E-larmS algorithm uses the P-wave to rapidly detect an earthquake and issue a warning. It is currently issuing alerts to test users in as little as 3 sec after the origin time. Development of a new waveform detector may lead to even faster alerts. G-larmS uses permanent deformation estimates from GNSS stations to estimate the geometry and extent of the rupture underway, providing more accurate ground shaking estimates in big (M>~7) earthquakes. It performed well in the M6.0 2014 Napa earthquake. T-larmS is a new algorithm designed to extend alert capabilities to tsunami inundation. Rapid estimates of source characteristics for subduction zone events can be used to warn not only of the shaking hazard, but also of the local tsunami inundation hazard. These algorithms are being developed, implemented and tested with a focus on the western US, but are also now being tested in other parts of the world including Israel, Turkey, Korea and Chile. Beta users in the Bay Area are receiving the alerts and beginning to implement automated actions. They also provide feedback on user needs, which has led to the development of the MyEEW smartphone app. This app allows beta users to receive the alerts on their cell phones. All these efforts feed into our ongoing assessment of directions and priorities for future development and implementation efforts.
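
    The P-wave detection step mentioned above is, in its most generic form, a trigger on a sudden increase in short-term signal energy relative to the background. The sketch below is a textbook STA/LTA trigger on a synthetic trace, not the actual E-larmS detector; the window lengths and threshold are illustrative assumptions.

```python
import numpy as np

def sta_lta_trigger(trace, fs, sta_win=0.5, lta_win=10.0, threshold=4.0):
    """Return the first sample index where the trailing STA/LTA energy ratio exceeds threshold."""
    sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
    energy = trace ** 2
    csum = np.concatenate(([0.0], np.cumsum(energy)))
    for i in range(lta_n, trace.size):
        sta = (csum[i + 1] - csum[i + 1 - sta_n]) / sta_n   # short trailing window
        lta = (csum[i + 1] - csum[i + 1 - lta_n]) / lta_n   # long trailing window
        if sta / max(lta, 1e-12) > threshold:
            return i
    return None

# Synthetic trace: background noise with a P-like onset at t = 20 s.
fs = 100.0
t = np.arange(0.0, 40.0, 1.0 / fs)
trace = 0.1 * np.random.randn(t.size)
onset = t >= 20.0
trace[onset] += np.sin(2 * np.pi * 5.0 * t[onset]) * np.exp(-(t[onset] - 20.0))
trig = sta_lta_trigger(trace, fs)
print(f"trigger at t = {trig / fs:.2f} s" if trig is not None else "no trigger")
```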

  8. The Seminole Serpent Warrior At Miramar, FL, Shows Settlement Locations Enabled Environmental Monitoring Reminiscent Of the Four-corners Kokopelli-like EMF Phenomena, and Related to Earthquakes, Tornados and Hurricanes.

    NASA Astrophysics Data System (ADS)

    Balam Matagamon, Chan; Pawa Matagamon, Sagamo

    2004-03-01

    Certain Native Americans of the past seem to have correctly deduced that significant survival information for their tradition-respecting cultures resided in EMF-based phenomena that they were monitoring. This is based upon their myths and the place or cult-hero names they bequeathed us. The sites we have located in FL have been detectable by us visually, usually by faint blue light, or by the elicitation of pin-like prickings, by somewhat intense nervous-system response, by EMF interactions with aural electrochemical systems that can elicit tinnitus, and other ways. In the northeast, Cautantowit served as a harbinger of Indian summer, and appears to be another alter ego of the EMF. The Miami, FL Tequesta site along the river clearly correlates with tornado, earthquake and hurricane locations. Sites like the Mohave Desert's giant man may have had similar significance.

  9. Exaggerated Claims About Success Rate of Earthquake Predictions: "Amazing Success" or "Remarkably Unremarkable"?

    NASA Astrophysics Data System (ADS)

    Kafka, A. L.; Ebel, J. E.

    2005-12-01

    On October 1, 2004, NASA announced on its web site, "Earthquake Forecast Program Has Amazing Success Rate." This announcement claimed that the Rundle-Tiampo earthquake forecast method has accurately predicted the locations of 15 of California's 16 largest earthquakes this decade. Since words like "amazing" carry a lot of meaning to consumers of scientific information, claims of "amazing success" should be limited only to cases where the success is truly amazing. We evaluated the statistical likelihood of the reported success rate of the Rundle-Tiampo prediction method by applying a cellular seismology approach to investigate whether proximity to past earthquakes is a sufficient hypothesis to yield the same level of success as the Rundle-Tiampo method. To delineate where to expect future earthquakes, we used the epicenters of the ANSS earthquake catalog for California from 1932 through 1999 with magnitude≥4.0 ("before" earthquakes). We then tested how many of the 15 events that are shown on the NASA web page ("after" earthquakes) occurred near the "before" earthquake epicenters. We found that with only a 4 km radius around each "before" earthquake epicenter, we successfully forecast the locations of 13/15 (87%) of the "after" earthquakes, and with a 7 km radius we successfully forecast 14/15 (93%) of the earthquakes. The zones created by filling in a 7 km radius around the "before" epicenters cover 18% of the study area. The scorecard maps on the JPL "QuakeSim" web site show an 11 km margin of error for the epicenters of the forecast earthquakes. With an 11 km radius around the past epicenters (covering 31% of the map area), we catch 14/15 of the "after" earthquakes. We conclude that the success rate referred to in the NASA announcement is perhaps better characterized as "remarkably unremarkable", rather than "amazing." The 14/15 success rate for the earthquakes listed on the NASA scorecard is not a rigorous test of the Rundle-Tiampo method, since it appears that
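
    The cellular-seismology comparison described above reduces to a simple membership test: does each "after" epicenter fall within a fixed radius of any "before" epicenter? A minimal sketch of that test is given below; the catalogs are made up for illustration and are not the ANSS data used in the study.

```python
import numpy as np

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance (km) between points given in decimal degrees."""
    p1, p2 = np.radians(lat1), np.radians(lat2)
    dp, dl = np.radians(lat2 - lat1), np.radians(lon2 - lon1)
    a = np.sin(dp / 2.0) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dl / 2.0) ** 2
    return 2.0 * 6371.0 * np.arcsin(np.sqrt(a))

def hit_rate(before, after, radius_km):
    """Fraction of 'after' epicenters lying within radius_km of any 'before' epicenter."""
    hits = sum(1 for lat, lon in after
               if np.min(haversine_km(before[:, 0], before[:, 1], lat, lon)) <= radius_km)
    return hits / len(after)

# Hypothetical (latitude, longitude) catalogs, degrees.
before = np.array([[35.00, -118.00], [36.20, -117.80], [34.50, -119.10]])
after = [(35.03, -118.02), (37.00, -116.00), (34.52, -119.08)]
for r in (4.0, 7.0, 11.0):
    print(f"radius {r:4.1f} km: hit rate = {hit_rate(before, after, r):.2f}")
```

    A fair comparison also has to track the fraction of the map area covered by the radius buffers, as the abstract emphasizes, since a large enough buffer will "forecast" nearly everything.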

  10. Hidden Earthquakes.

    ERIC Educational Resources Information Center

    Stein, Ross S.; Yeats, Robert S.

    1989-01-01

    Points out that large earthquakes can take place not only on faults that cut the earth's surface but also on blind faults under folded terrain. Describes four examples of fold earthquakes. Discusses the fold earthquakes using several diagrams and pictures. (YP)

  11. The May 20 (MW 6.1) and 29 (MW 6.0), 2012, Emilia (Po Plain, northern Italy) earthquakes: New seismotectonic implications from subsurface geology and high-quality hypocenter location

    NASA Astrophysics Data System (ADS)

    Carannante, Simona; Argnani, Andrea; Massa, Marco; D'Alema, Ezio; Lovati, Sara; Moretti, Milena; Cattaneo, Marco; Augliera, Paolo

    2015-08-01

    This study presents new geological and seismological data that are used to assess the seismic hazard of a sector of the Po Plain (northern Italy), a large alluvial basin hit by two strong earthquakes on May 20 (MW 6.1) and May 29 (MW 6.0), 2012. The proposed interpretation is based on high-quality relocation of 5369 earthquakes ('Emilia sequence') and a dense grid of seismic profiles and exploration wells. The analyzed seismicity was recorded by 44 seismic stations, and initially used to calibrate new one-dimensional and three-dimensional local Vp and Vs velocity models for the area. Considering these new models, the initial sparse hypocenters were then relocated in absolute mode and adjusted using the double-difference relative location algorithm. These data define a seismicity that is elongated in the W-NW to E-SE directions. The aftershocks of the May 20 mainshock appear to be distributed on a rupture surface that dips ~ 45° SSW, and the surface projection indicates an area ~ 10 km wide and 23 km long. The aftershocks of the May 29 mainshock followed a steep rupture surface that is well constrained within the investigated volume, whereby the surface projection of the blind source indicates an area ~ 6 km wide and 33 km long. Multichannel seismic profiles highlight the presence of relevant lateral variations in the structural style of the Ferrara folds that developed during the Pliocene and Pleistocene. There is also evidence of a Mesozoic extensional fault system in the Ferrara arc, with faults that in places have been seismically reactivated. These geological and seismological observations suggest that the 2012 Emilia earthquakes were related to ruptures along blind fault surfaces that are not part of the Pliocene-Pleistocene structural system, but are instead related to a deeper system that is itself closely related to re-activation of a Mesozoic extensional fault system.

  12. The characteristic of the building damage from historical large earthquakes in Kyoto

    NASA Astrophysics Data System (ADS)

    Nishiyama, Akihito

    2016-04-01

    The city of Kyoto, which is located in the northern part of the Kyoto basin in Japan, has a long history of >1,200 years since it was initially constructed. The city has been a populated area with many buildings and the center of politics, economy and culture in Japan for nearly 1,000 years. Some of these buildings are now inscribed as World Cultural Heritage sites. Kyoto has experienced six damaging large earthquakes during the historical period: in 976, 1185, 1449, 1596, 1662, and 1830. Among these, the last three earthquakes, which caused severe damage in Kyoto, occurred during the period in which the urban area had expanded. These earthquakes are considered to be inland earthquakes that occurred around the Kyoto basin. The damage distribution in Kyoto from historical large earthquakes is strongly controlled by ground conditions and the earthquake resistance of buildings rather than by distance from the estimated source fault. Therefore, it is necessary to consider not only the strength of ground shaking but also the condition of the buildings, such as the years elapsed since construction or the last repair, in order to estimate the seismic intensity distribution from historical earthquakes in Kyoto more accurately and reliably. The resulting seismic intensity map would be helpful for reducing and mitigating disasters from future large earthquakes.

  13. Sun, Moon and Earthquakes

    NASA Astrophysics Data System (ADS)

    Kolvankar, V. G.

    2013-12-01

    During a study conducted to find the effect of Earth tides on the occurrence of earthquakes, for small areas [typically 1000 km X 1000 km] of high-seismicity regions, it was noticed that the Sun's position in terms of universal time [GMT] shows links to the sum of EMD [longitude of earthquake location - longitude of the Moon's footprint on Earth] and SEM [Sun-Earth-Moon angle]. This paper provides the details of this relationship after studying earthquake data for over forty high-seismicity regions of the world. It was found that over 98% of the earthquakes for these different regions, examined for the period 1973-2008, show a direct relationship between the Sun's position [GMT] and [EMD+SEM]. As the time changes from 00-24 hours, the factor [EMD+SEM] changes through 360 degrees, and plotting these two variables for earthquakes from different small regions reveals a simple 45-degree straight-line relationship between them. This relationship was tested for all earthquakes and earthquake sequences of magnitude 2.0 and above. This study conclusively proves how the Sun and the Moon govern all earthquakes. Fig. 12 [A+B]: the left-hand figure provides a 24-hour plot for forty consecutive days including the main event (00:58:23 on 26.12.2004, Lat. +3.30, Long. +95.980, Mb 9.0, EQ count 376); the right-hand figure provides an earthquake plot of (EMD+SEM) vs GMT timings for the same data. All 376 events, including the main event, faithfully follow the straight-line curve.

  14. The Pulse Azimuth effect as seen in induction coil magnetometers located in California and Peru 2007-2010, and its possible association with earthquakes

    NASA Astrophysics Data System (ADS)

    Dunson, J. C.; Bleier, T. E.; Roth, S.; Heraud, J.; Alvarez, C. H.; Lira, A.

    2011-07-01

    The QuakeFinder network of magnetometers has recorded geomagnetic field activity in California since 2000. Established as an effort to follow up observations of ULF activity reported from before and after the M = 7.1 Loma Prieta earthquake in 1989 by Stanford University, the QuakeFinder network has over 50 sites, fifteen of which are high-resolution QF1005 and QF1007 systems. Pairs of high-resolution sites have also been installed in Peru and Taiwan. Increases in pulse activity preceding nearby seismic events are followed by decreases in activity afterwards in the three cases that are discussed here. In addition, longer-term data are shown, revealing a rich signal structure not previously known in QuakeFinder data, or by many other authors who have reported on pre-seismic ULF phenomena. These pulses occur as separate ensembles, with demonstrable repeatability and uniqueness across a number of properties such as waveform, angle of arrival, amplitude, and duration. Yet they appear to arrive with exponentially distributed inter-arrival times, which indicates a Poisson process rather than a periodic, i.e., stationary, process. These pulses were observed using three-axis induction coil magnetometers that are buried 1-2 m under the surface of the Earth. Our sites use a Nyquist frequency of 16 Hertz (25 Hertz for the new QF1007 units), and they record these pulses at amplitudes from 0.1 to 20 nano-Tesla with durations of 0.1 to 12 s. They are predominantly unipolar pulses, which may imply charge migration, and they are stronger in the two horizontal (north-south and east-west) channels than they are in the vertical channels. Pulses have been seen to occur in bursts lasting many hours. The pulses have large amplitudes, and study of the three-axis data shows that the amplitude ratios of the pulses taken from pairs of orthogonal coils are stable across the bursts, suggesting a similar source. This paper presents three instances of increases in pulse activity in the 30 days prior
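
    The Poisson-process inference above rests on the pulse inter-arrival times being exponentially distributed. One simple way to check that on a list of pulse times is a Kolmogorov-Smirnov test against a fitted exponential; the sketch below uses synthetic arrival times, not the QuakeFinder data or processing.

```python
import numpy as np
from scipy import stats

# Synthetic pulse arrival times (s): a homogeneous Poisson process with mean spacing 5 s.
rng = np.random.default_rng(0)
arrivals = np.cumsum(rng.exponential(scale=5.0, size=500))

inter = np.diff(arrivals)               # inter-arrival times
scale = inter.mean()                    # maximum-likelihood exponential scale
stat, p_value = stats.kstest(inter, "expon", args=(0.0, scale))
print(f"mean inter-arrival = {scale:.2f} s, KS statistic = {stat:.3f}, p = {p_value:.3f}")
# A large p-value is consistent with exponentially distributed inter-arrival times,
# i.e., with a memoryless (Poisson) pulse-generation process. Because the scale is
# estimated from the same data, the p-value is only approximate.
```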

  15. The 1868 Hayward Earthquake Alliance: A Case Study - Using an Earthquake Anniversary to Promote Earthquake Preparedness

    NASA Astrophysics Data System (ADS)

    Brocher, T. M.; Garcia, S.; Aagaard, B. T.; Boatwright, J. J.; Dawson, T.; Hellweg, M.; Knudsen, K. L.; Perkins, J.; Schwartz, D. P.; Stoffer, P. W.; Zoback, M.

    2008-12-01

    simulations of a Hayward Fault earthquake, (5) a new USGS Fact Sheet about the earthquake and the Hayward Fault, (6) a virtual tour of the 1868 earthquake, and (7) a new online field trip guide to the Hayward Fault using locations accessible by car and public transit. Finally, the California Geological Survey and many other Alliance members sponsored the Third Conference on Earthquake Hazards in the East Bay at CSU East Bay in Hayward for the three days following the 140th anniversary. The 1868 Alliance hopes to commemorate the anniversary of the 1868 Hayward Earthquake every year to maintain and increase public awareness of this fault, the hazards it and other East Bay Faults pose, and the ongoing need for earthquake preparedness and mitigation.

  16. Testing mechanisms of subduction zone segmentation and seismogenesis with slip distributions from recent Andean earthquakes

    NASA Astrophysics Data System (ADS)

    Loveless, J. P.; Pritchard, M. E.; Kukowski, N.

    2010-11-01

    A long-standing goal of subduction zone earthquake studies is to determine whether or not there are physical processes that control seismogenesis and the along-strike segmentation of the megathrust. Studies of individual earthquakes and global compilations of earthquakes find favorable comparison between coseismic interplate slip distributions and several different long-lived forearc characteristics, such as bathymetry, coastline morphology, crustal structure, and interplate frictional properties, but no single explanation seems to govern the location and slip distribution of all earthquakes. One possible reason for the lack of a unifying explanation is that the inferred earthquake parameters, most importantly the slip distribution, calculated in some areas were inaccurate, blurring correlation between earthquake and physical parameters. In this paper, we seek to test this possibility by comparing accurate slip distributions constrained by multiple datasets along several segments of a single subduction zone with the various physical properties that have been proposed to control or correlate with seismogenesis. We examine the rupture area and slip distribution of 6 recent and historical large ( Mw > 7) earthquakes on the Peru-northern Chile subduction zone. This analysis includes a new slip distribution of the 14 November 2007 Mw = 7.7 earthquake offshore Tocopilla, Chile constrained by teleseismic body wave and InSAR data. In studying the 6 events, we find that no single mechanism can explain the location or extent of rupture of all earthquakes, but analysis of the forearc gravity field and its gradients shows correlation with many of the observed slip patterns, as suggested by previous studies. Additionally, large-scale morphological features including the Nazca Ridge, Arica Bend, Mejillones Peninsula, and transverse crustal fault systems serve as boundaries between distinct earthquake segments.

  17. Update NEMC Database using Arcgis Software and Example of Simav-Kutahya earthquake sequences

    NASA Astrophysics Data System (ADS)

    Altuncu Poyraz, S.; Kalafat, D.; Kekovali, K.

    2011-12-01

    In this study, a total of 144,043 earthquakes (2.0≤M≤7.9) that occurred in Turkey in the interval 1900-2011, taken from the Kandilli Observatory and Earthquake Research Institute & National Earthquake Monitoring Center (KOERI-NEMC) seismic catalog, were used. The database includes not only the coordinates, dates, magnitudes and depths of these earthquakes but also the location and installation information, field studies, geology and technical properties of 154 seismic stations. Additionally, 1063 historical earthquakes were included in the database. Source parameters of 738 earthquakes with M≥4.0 that occurred between 1938 and 2008 were added to the database, and source parameters of a further 103 earthquakes with M≥4.5 have been calculated since 2008. To test querying, visualization and analysis of earthquake characteristics, the aftershock sequence of the 19 May 2011 Simav-Kutahya earthquake was selected and added to the database. The Simav earthquake (western Anatolia), with magnitude Ml=5.9, occurred at 23:15 local time and is investigated in terms of accurate event locations and the source properties of the largest events. The aftershock distribution of the Simav earthquake shows the activation of a 17-km-long zone, which extends in depth between 5 and 10 km. To contribute to a better understanding of the neotectonics of this region, we analysed the earthquakes using the KOERI (Kandilli Observatory and Earthquake Research Institute) seismic stations along with seismic stations operated by other institutions that successfully recorded the Simav seismic activity in 2011. Source mechanisms of 19 earthquakes with magnitudes 3.8≤ML<6.0 were calculated by means of the Regional Moment Tensor Inversion (RMT) technique. The mechanism solutions show the presence of east-west-trending normal faults in the region; as a result, an extensional regime dominates in the study area. The aim of this study is to store and compile earthquake

  18. Source Parameter Studies of Historical and Recent Earthquakes of the Cascadia Subduction Zone and Mendocino Triple Junction Region

    NASA Astrophysics Data System (ADS)

    Wiest, K. R.; Theiner, T. R.; Velasco, A. A.; Doser, D. I.

    2003-12-01

    We are comparing the seismograms of recent (post-1989) earthquakes to those of historic earthquakes (pre-1966) recorded at the same (or similar) station locations to determine how comparable the historic earthquakes are to recent events with well-determined hypocenters and rupture parameters. The seismograms of recent events will be used as empirical Green's functions in a deconvolution process to more accurately determine the directivity and rupture process of the older events. Our initial work in the Cascadia subduction zone has focused on earthquakes within the subducting Pacific plate occurring in 1939, 1946, 1949 and 1965, in locations similar to the 1999 Satsop and 2001 Nisqually earthquakes. Preliminary analysis of waveform information suggests that the 1949 Olympia earthquake has a different rupture history to the east and south of the epicenter, as compared to the Nisqually earthquake. In the Mendocino Triple Junction region, we have focused on modeling smaller-magnitude earthquakes using an earth simplification transform. We will calibrate the technique using several intermediate-magnitude events that have occurred on the Mendocino fault and within the Gorda plate.

  19. Intraslab Earthquakes: Dehydration of the Cascadia Slab

    USGS Publications Warehouse

    Preston, L.A.; Creager, K.C.; Crosson, R.S.; Brocher, T.M.; Trehu, A.M.

    2003-01-01

    We simultaneously invert travel times of refracted and wide-angle reflected waves for three-dimensional compressional-wave velocity structure, earthquake locations, and reflector geometry in northwest Washington state. The reflector, interpreted to be the crust-mantle boundary (Moho) of the subducting Juan de Fuca plate, separates intraslab earthquakes into two groups, permitting a new understanding of the origins of intraslab earthquakes in Cascadia. Earthquakes up-dip of the Moho's 45-kilometer depth contour occur below the reflector, in the subducted oceanic mantle, consistent with serpentinite dehydration; earthquakes located down-dip occur primarily within the subducted crust, consistent with the basalt-to-eclogite transformation.

  20. Intraslab earthquakes: dehydration of the Cascadia slab.

    PubMed

    Preston, Leiph A; Creager, Kenneth C; Crosson, Robert S; Brocher, Thomas M; Trehu, Anne M

    2003-11-14

    We simultaneously invert travel times of refracted and wide-angle reflected waves for three-dimensional compressional-wave velocity structure, earthquake locations, and reflector geometry in northwest Washington state. The reflector, interpreted to be the crust-mantle boundary (Moho) of the subducting Juan de Fuca plate, separates intraslab earthquakes into two groups, permitting a new understanding of the origins of intraslab earthquakes in Cascadia. Earthquakes up-dip of the Moho's 45-kilometer depth contour occur below the reflector, in the subducted oceanic mantle, consistent with serpentinite dehydration; earthquakes located down-dip occur primarily within the subducted crust, consistent with the basalt-to-eclogite transformation. PMID:14615535

  1. Earthquakes and the urban environment. Volume I

    SciTech Connect

    Berlin, G.L.

    1980-01-01

    Because of the complex nature of earthquake effects, current investigations encompass many disciplines, including those of both the physical and social sciences. Research activities center on such diversified topics as earthquake mechanics, earthquake prediction and control, the prompt and accurate detection of tsunamis (seismic sea waves), earthquake-resistant construction, seismic building code improvements, land use zoning, earthquake risk and hazard perception, disaster preparedness, plus the study of the concerns and fears of people who have experienced the effects of an earthquake. This monograph attempts to amalgamate recent research input comprising the vivifying components of urban seismology at a level useful to those having an interest in the earthquake and its effects upon an urban environment. Volume 1 contains chapters on earthquake parameters and hazards.

  2. Earthquakes and the urban environment. Volume II

    SciTech Connect

    Berlin, G.L.

    1980-01-01

    Because of the complex nature of earthquake effects, current investigations encompass many disciplines, including those of both the physical and social sciences. Research activities center on such diversified topics as earthquake mechanics, earthquake prediction and control, the prompt and accurate detection of tsunamis (seismic sea waves), earthquake-resistant construction, seismic building code improvements, land use zoning, earthquake risk and hazard perception, disaster preparedness, plus the study of the concerns and fears of people who have experienced the effects of an earthquake. This monograph attempts to amalgamate recent research input comprising the vivifying components of urban seismology at a level useful to those having an interest in the earthquake and its effects upon an urban environment. Volume 2 contains chapters on earthquake prediction, control, building design and building response.

  3. Injection-induced earthquakes.

    PubMed

    Ellsworth, William L

    2013-07-12

    Earthquakes in unusual locations have become an important topic of discussion in both North America and Europe, owing to the concern that industrial activity could cause damaging earthquakes. It has long been understood that earthquakes can be induced by impoundment of reservoirs, surface and underground mining, withdrawal of fluids and gas from the subsurface, and injection of fluids into underground formations. Injection-induced earthquakes have, in particular, become a focus of discussion as the application of hydraulic fracturing to tight shale formations is enabling the production of oil and gas from previously unproductive formations. Earthquakes can be induced as part of the process to stimulate the production from tight shale formations, or by disposal of wastewater associated with stimulation and production. Here, I review recent seismic activity that may be associated with industrial activity, with a focus on the disposal of wastewater by injection in deep wells; assess the scientific understanding of induced earthquakes; and discuss the key scientific challenges to be met for assessing this hazard. PMID:23846903

  4. Turkish Compulsory Earthquake Insurance (TCIP)

    NASA Astrophysics Data System (ADS)

    Erdik, M.; Durukal, E.; Sesetyan, K.

    2009-04-01

    Through a World Bank project, a government-sponsored Turkish Catastrophic Insurance Pool (TCIP) was created in 2000 with the essential aim of transferring the government's financial burden of replacing earthquake-damaged housing to international reinsurance and capital markets. Providing coverage to about 2.9 million homeowners, TCIP is the largest insurance program in the country, with about 0.5 billion USD in its own reserves and about 2.3 billion USD in total claims-paying capacity. The total payment for earthquake damage since 2000 (226 mostly small earthquakes) amounts to about 13 million USD. The country-wide penetration rate is about 22%, highest in the Marmara region (30%) and lowest in southeast Turkey (9%). TCIP is the sole-source provider of earthquake loss coverage up to 90,000 USD per house. The annual premium, categorized on the basis of earthquake zone and type of structure, is about 90 USD for a 100 square meter reinforced concrete building in the most hazardous zone, with a 2% deductible. The earthquake-engineering-related shortcomings of the TCIP are exemplified by the fact that the average rate of 0.13% (for reinforced concrete buildings) with only a 2% deductible is rather low compared to countries with similar earthquake exposure. From an earthquake engineering point of view, the risk underwriting of the TCIP (typification of the housing units to be insured, earthquake intensity zonation and the sum insured) needs to be overhauled. Especially for large cities, models can be developed in which the expected earthquake performance of a housing unit (and consequently the insurance premium) can be assessed on the basis of its location (microzoned earthquake hazard) and basic structural attributes (earthquake vulnerability relationships). With such an approach, the TCIP can in the future contribute to the control of construction through differentiation of premia on the basis of earthquake vulnerability.

  5. Hidden earthquakes

    SciTech Connect

    Stein, R.S.; Yeats, R.S.

    1989-06-01

    Seismologists generally look for earthquakes to happen along visible fault lines, e.g., the San Andreas fault. The authors maintain that another source of dangerous quakes has been overlooked: the release of stress along a fault that is hidden under a fold in the earth's crust. The paper describes the differences between an earthquake which occurs on a visible fault and one which occurs under an anticline, and warns that Los Angeles' greatest earthquake threat may come from a small quake originating under downtown Los Angeles, rather than a larger earthquake occurring 50 miles away at the San Andreas fault.

  6. Triggering of repeated earthquakes

    NASA Astrophysics Data System (ADS)

    Sobolev, G. A.; Zakrzhevskaya, N. A.; Sobolev, D. G.

    2016-03-01

    Based on the analysis of the world's earthquakes with magnitudes M ≥ 6.5 for 1960-2013, it is shown that they cause global-scale coherent seismic oscillations which most distinctly manifest themselves in the period interval of 4-6 min during 1-3 days after the event. After these earthquakes, a repeated shock has an increased probability of occurring in different seismically active regions located as far away as a few thousand km from the previous event, i.e., a remote interaction of seismic events takes place. The number of repeated shocks N(t) decreases with time, which characterizes the lithosphere's memory of the impact that has occurred. The time decay N(t) can be approximated by linear, exponential, and power-law dependences. No distinct correlation between the spatial locations of the initial and repeated earthquakes is revealed. The probable triggering mechanisms of the remote interaction between the earthquakes are discussed. Surface seismic waves traveling several times around the Earth, coherent oscillations, and a global source are the most plausible candidates. This may lead to the accumulation and coalescence of ruptures in the highly stressed or weakened domains of a seismically active region, which increases the probability of a repeated earthquake.
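
    The competing decay forms for N(t) mentioned above can be compared by simple least-squares fitting. The sketch below fits exponential and power-law (Omori-like) curves to synthetic daily counts with scipy; it illustrates the comparison only and does not reproduce the authors' data or analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

def exp_decay(t, n0, tau):
    return n0 * np.exp(-t / tau)

def power_decay(t, k, p, c):
    return k / (t + c) ** p          # Omori-like form

# Synthetic daily counts of repeated shocks over 10 days after a large event.
t = np.arange(1, 11, dtype=float)
n = np.array([42, 25, 18, 14, 11, 9, 8, 7, 6, 6], dtype=float)

for name, f, p0 in [("exponential", exp_decay, (40.0, 3.0)),
                    ("power-law", power_decay, (40.0, 1.0, 0.5))]:
    params, _ = curve_fit(f, t, n, p0=p0, maxfev=10000)
    rss = np.sum((n - f(t, *params)) ** 2)
    print(f"{name:12s} fit: params = {np.round(params, 2)}, RSS = {rss:.1f}")
```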

  7. Twitter Seismology: Earthquake Monitoring and Response in a Social World

    NASA Astrophysics Data System (ADS)

    Bowden, D. C.; Earle, P. S.; Guy, M.; Smoczyk, G.

    2011-12-01

    detections is very small compared to the 5,175 earthquakes in the USGS PDE global earthquake catalog for the same five month time period, and no accurate location or magnitude can be assigned based on Tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 80% occurred within 2 minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided (very) short first-impression narratives from people who experienced the shaking. The USGS will continue investigating how to use Twitter and other forms of social media to augment its current suite of seismographically derived products.

  8. Preferential earthquake-nucleating locations on faults determined by heterogeneous direct- and evolution-effect parameters of rate- and state-dependent friction

    NASA Astrophysics Data System (ADS)

    Viesca, R. C.; Ray, S.

    2015-12-01

    Rock friction experiments show that low-velocity fault friction may have a direct and subsequent evolutionary response to changes in slip velocity; the magnitudes of these responses are respectively proportional to parameters a and b in constitutive relations of such rate- and state-dependent friction [e.g., Dieterich 1979; Ruina, 1983]. When a and b are uniform on a fault, translational invariance implies any location is a potential nucleation site, the choice determined by pre-instability conditions and external forcing. With heterogeneous parameters, symmetry is broken, which can create preferred nucleation sites. Recent work showed such heterogeneity does create favorable sites (Ray and Viesca, AGU '14). Here we study how distributions of (i) relative rate-weakening (0 < a/b < 1) and (ii) absolute rate-weakening affect the location of preferred sites. We examine the influence of (i) and (ii) by varying one or varying both (similarly or disparately). The smallest wavelength of variation is comparable to or larger than the size of the developing instability. We consider that elasticity may set either nonlocal (slip between half-spaces) or local (slip below and near a free surface) interactions. We use a dynamical system approach (Viesca, AGU'14) complemented by solutions for slip rate and state evolution during instability development to determine the preferred sites. When (i) varies and (ii) is fixed or varied, an instability develops where relative rate-weakening is locally or globally strongest (a minimum of i) for both types of elastic interactions. This may or may not coincide with the strongest absolute rate-weakening (a minimum of ii). This indicates that parameter (i) is comparatively dominant in deciding the location of a slip instability. However, fixing (i) and varying (ii), we find that elasticity contributes to determining the preferred site: i.e., nucleation occurs at the local minimum and maximum of (ii) for nonlocal and local interactions, respectively
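
    For context, the rate- and state-dependent friction law referenced above (Dieterich 1979; Ruina 1983) is commonly written as mu = mu0 + a ln(V/V0) + b ln(V0*theta/Dc), with the aging law d(theta)/dt = 1 - V*theta/Dc, so that steady-state friction varies as (a - b) ln(V/V0). The sketch below simply evaluates steady-state friction for a heterogeneous distribution of a along a fault, to make the roles of the relative (a/b) and absolute (b - a) weakening parameters concrete; all numerical values are illustrative and not taken from the study.

```python
import numpy as np

mu0, v0, dc = 0.6, 1e-6, 1e-4   # reference friction, reference slip rate (m/s), slip distance (m)

def steady_state_mu(v, a, b):
    """Steady-state rate-and-state friction: theta_ss = dc / v, so
    mu_ss = mu0 + (a - b) * ln(v / v0)."""
    return mu0 + (a - b) * np.log(v / v0)

# Heterogeneous a (and uniform b) along a 1-D fault coordinate x; values illustrative.
x = np.linspace(0.0, 10.0, 5)                       # km along strike
a = 0.010 + 0.002 * np.sin(2 * np.pi * x / 10.0)
b = 0.015 * np.ones_like(x)

for xi, ai, bi in zip(x, a, b):
    dmu = steady_state_mu(1e-3, ai, bi) - steady_state_mu(v0, ai, bi)
    print(f"x={xi:4.1f} km  a/b={ai/bi:.2f}  b-a={bi-ai:.4f}  "
          f"velocity-weakening={ai < bi}  d(mu_ss) at 1 mm/s = {dmu:+.4f}")
```

    In this toy example every point is velocity-weakening (a < b), but the relative weakening a/b varies along strike, which is the kind of heterogeneity the abstract argues controls where nucleation prefers to occur.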

  9. Earthquake Hazards.

    ERIC Educational Resources Information Center

    Donovan, Neville

    1979-01-01

    Provides a survey and a review of earthquake activity and global tectonics from the advancement of the theory of continental drift to the present. Topics include: an identification of the major seismic regions of the earth, seismic measurement techniques, seismic design criteria for buildings, and the prediction of earthquakes. (BT)

  10. Earthquake prediction

    NASA Technical Reports Server (NTRS)

    Turcotte, Donald L.

    1991-01-01

    The state of the art in earthquake prediction is discussed. Short-term prediction based on seismic precursors, changes in the ratio of compressional velocity to shear velocity, tilt and strain precursors, electromagnetic precursors, hydrologic phenomena, chemical monitors, and animal behavior is examined. Seismic hazard assessment is addressed, and the applications of dynamical systems to earthquake prediction are discussed.

  11. Analog earthquakes

    SciTech Connect

    Hofmann, R.B.

    1995-09-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.

  12. Recent earthquakes in northern New York

    SciTech Connect

    Revetta, F.A.; Bockus, C.; O'Brian, B. (Dept. of Geology)

    1993-03-01

    The Massena, New York area, located along the St. Lawrence River in northern New York, has been the site of significant earthquake activity, including the largest earthquake in New York (m = 6.0) on September 5, 1944. Historic earthquake data indicate the Cornwall-Massena area is a region of relatively high seismic activity, and the earthquake activity has been persistent for over a 400-year period. During the past year eleven small earthquakes have been recorded by the Potsdam Seismic Network in northern New York. Four of these earthquakes had epicenters located in the Massena-Cornwall area. One epicenter was located along the Carthage-Colton Mylonite Zone and one epicenter was located in the epicentral region of the October 7, 1984 Goodnow earthquake. Five earthquakes had epicenters located in Ontario and Quebec. These earthquake epicenters lie in a belt of seismicity that extends north-westerly from the northern Adirondacks into the Canadian Shield of western Quebec. Several explanations that have been presented for these earthquakes are (1) mafic intrusions, (2) unmapped northwest-trending faults, (3) extension of the New England seamount chain, and (4) crustal fractures due to the area passing over a hotspot. Four earthquakes in the Massena area lie very near extensions of the Gloucester and Winchester Spring faults into New York and may be related to these faults. Focal mechanism solutions of two earthquakes indicate thrusting along NW-striking fault planes. Another possibility is that the earthquakes are related to the Carthage-Colton Mylonite Zone. One earthquake is within four km of the CCMZ and, if the zone is extended northward beneath the lower Paleozoics, it passes through the epicenters in the Cornwall-Massena area.

  13. Complex earthquake rupture and local tsunamis

    USGS Publications Warehouse

    Geist, E.L.

    2002-01-01

    In contrast to far-field tsunami amplitudes that are fairly well predicted by the seismic moment of subduction zone earthquakes, there exists significant variation in the scaling of local tsunami amplitude with respect to seismic moment. From a global catalog of tsunami runup observations, this variability is greatest for the most frequently occurring tsunamigenic subduction zone earthquakes, in the magnitude range of 7 < Mw < 8.5. Variability in local tsunami runup scaling can be ascribed to tsunami source parameters that are independent of seismic moment: variations in the water depth in the source region, the combination of higher slip and lower shear modulus at shallow depth, and rupture complexity in the form of heterogeneous slip distribution patterns. The focus of this study is on the effect that rupture complexity has on the local tsunami wave field. A wide range of slip distribution patterns are generated using a stochastic, self-affine source model that is consistent with the falloff of far-field seismic displacement spectra at high frequencies. The synthetic slip distributions generated by the stochastic source model are discretized and the vertical displacement fields from point-source elastic dislocation expressions are superimposed to compute the coseismic vertical displacement field. For shallow subduction zone earthquakes it is demonstrated that self-affine irregularities of the slip distribution result in significant variations in local tsunami amplitude. The effects of rupture complexity are less pronounced for earthquakes at greater depth or along faults with steep dip angles. For a test region along the Pacific coast of central Mexico, peak nearshore tsunami amplitude is calculated for a large number (N = 100) of synthetic slip distribution patterns, all with identical seismic moment (Mw = 8.1). Analysis of the results indicates that for earthquakes of a fixed location, geometry, and seismic moment, peak nearshore tsunami amplitude can vary by a
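
    The stochastic source model described above generates slip distributions whose wavenumber spectrum falls off in a self-affine way, consistent with the high-frequency falloff of far-field displacement spectra. A minimal one-dimensional spectral-synthesis sketch of such a slip distribution is shown below; the k^-2 falloff, grid, and scaling are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

def self_affine_slip(n=256, length_km=100.0, falloff=2.0, mean_slip=2.0, seed=1):
    """Generate a 1-D along-strike slip distribution with a |k|^-falloff amplitude
    spectrum and random phases (spectral synthesis), scaled to a target mean slip."""
    rng = np.random.default_rng(seed)
    k = np.fft.rfftfreq(n, d=length_km / n)          # wavenumbers (1/km)
    amp = np.zeros_like(k)
    amp[1:] = k[1:] ** (-falloff)                    # self-affine amplitude decay
    phase = rng.uniform(0.0, 2.0 * np.pi, size=k.size)
    slip = np.fft.irfft(amp * np.exp(1j * phase), n=n)
    slip -= slip.min()                               # keep slip non-negative
    slip *= mean_slip / slip.mean()                  # scale to the desired mean slip (m)
    return np.linspace(0.0, length_km, n), slip

x, slip = self_affine_slip()
print(f"mean slip = {slip.mean():.2f} m, max slip = {slip.max():.2f} m over {x[-1]:.0f} km")
```

    Repeating this synthesis with different random seeds, as the study does with many realizations of equal seismic moment, is what produces the spread in peak nearshore tsunami amplitude discussed above.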

  14. Joint Determination of Event Location and Magnitude from Historical Seismic Damage Records

    NASA Astrophysics Data System (ADS)

    Park, S.; Hong, T. K.

    2014-12-01

    Large earthquakes have long recurrence intervals, so it is crucial to consider long-term seismicity for a proper assessment of potential seismic hazards, and historical earthquake records are required to complement the instrumental seismicity record. Historical earthquake records survive as descriptions of seismic damage, with limited accuracy in source parameters, including event location and size. It is therefore important to determine the epicenters and magnitudes of historical earthquakes accurately. A novel method to determine event location and magnitude from historical seismic damage records is introduced. Seismic damage is typically proportional to the event magnitude and inversely proportional to the distance. This feature allows us to deduce the event magnitude and location from the spatial distribution of seismic intensities. However, magnitude and distance trade off against each other, inhibiting unique determination of event magnitude and location. The Gutenberg-Richter frequency-magnitude relationship is therefore additionally considered to constrain the source parameters, and is assumed to be consistent between instrumental and historical seismicity. A combination of event location and magnitude that is consistent with the expected chance of event occurrence according to the Gutenberg-Richter frequency-magnitude relationship is selected. The accuracy of the method is tested on synthetic data sets, and the validity of the method is examined; the synthetic tests demonstrate the high accuracy of the method. The method is applied to historical seismic damage records, which allows us to calibrate the source parameters of historical earthquakes.
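
    The joint search described above can be pictured as a grid search over candidate epicenters and magnitudes, scoring each candidate by the misfit to an intensity attenuation relation and by how probable the magnitude is under a Gutenberg-Richter law. The sketch below uses a generic attenuation form I = c0 + c1*M - c2*log10(R) with made-up coefficients, made-up intensity data, and a crude Gutenberg-Richter penalty as a stand-in for the paper's calibrated constraint; it illustrates only the structure of such an inversion.

```python
import numpy as np

# Hypothetical felt-intensity observations: (x_km, y_km, intensity).
obs = np.array([[0.0, 0.0, 7.0], [30.0, 10.0, 5.5], [60.0, -20.0, 4.5], [90.0, 40.0, 3.5]])
c0, c1, c2 = 1.0, 1.5, 3.0     # assumed attenuation coefficients (illustrative only)
a_gr, b_gr = 4.0, 1.0          # assumed Gutenberg-Richter a and b values

def predicted_intensity(m, ex, ey, x, y, depth=10.0):
    r = np.sqrt((x - ex) ** 2 + (y - ey) ** 2 + depth ** 2)   # hypocentral distance, km
    return c0 + c1 * m - c2 * np.log10(r)

best = (np.inf, None)
for m in np.arange(5.0, 7.6, 0.1):
    for ex in np.arange(-50.0, 101.0, 5.0):
        for ey in np.arange(-50.0, 101.0, 5.0):
            resid = obs[:, 2] - predicted_intensity(m, ex, ey, obs[:, 0], obs[:, 1])
            misfit = np.sum(resid ** 2)
            # Crude stand-in for the G-R consistency constraint: larger magnitudes are
            # rarer under log10 N = a - b*M, so they receive a larger penalty.
            score = misfit + (b_gr * m - a_gr)
            if score < best[0]:
                best = (score, (m, ex, ey))

score, (m, ex, ey) = best
print(f"preferred magnitude M={m:.1f}, epicenter=({ex:.0f} km, {ey:.0f} km), score={score:.2f}")
```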

  15. Earthquakes and the urban environment. Volume III

    SciTech Connect

    Berlin, G.L.

    1980-01-01

    Because of the complex nature of earthquake effects, current investigations encompass many disciplines, including those of both the physical and social sciences. Research activities center on such diversified topics as earthquake mechanics, earthquake prediction and control, the prompt and accurate detection of tsunamis (seismic sea waves), earthquake-resistant construction, seismic building code improvements, land use zoning, earthquake risk and hazard perception, disaster preparedness, plus the study of the concerns and fears of people who have experienced the effects of an earthquake. This monograph attempts to amalgamate recent research input comprising the vivifying components of urban seismology at a level useful to those having an interest in the earthquake and its effects upon an urban environment. Volume 3 contains chapters on seismic planning, social aspects and future prospects.

  16. Application of collocated GPS and seismic sensors to earthquake monitoring and early warning.

    PubMed

    Li, Xingxing; Zhang, Xiaohong; Guo, Bofeng

    2013-01-01

    We explore the use of collocated GPS and seismic sensors for earthquake monitoring and early warning. The GPS and seismic data collected during the 2011 Tohoku-Oki (Japan) and the 2010 El Mayor-Cucapah (Mexico) earthquakes are analyzed by using a tightly-coupled integration. The performance of the integrated results is validated by both time and frequency domain analysis. We detect the P-wave arrival and observe small-scale features of the movement from the integrated results and locate the epicenter. Meanwhile, permanent offsets are extracted from the integrated displacements highly accurately and used for reliable fault slip inversion and magnitude estimation. PMID:24284765

  17. Application of Collocated GPS and Seismic Sensors to Earthquake Monitoring and Early Warning

    PubMed Central

    Li, Xingxing; Zhang, Xiaohong; Guo, Bofeng

    2013-01-01

    We explore the use of collocated GPS and seismic sensors for earthquake monitoring and early warning. The GPS and seismic data collected during the 2011 Tohoku-Oki (Japan) and the 2010 El Mayor-Cucapah (Mexico) earthquakes are analyzed by using a tightly-coupled integration. The performance of the integrated results is validated by both time and frequency domain analysis. We detect the P-wave arrival and observe small-scale features of the movement from the integrated results and locate the epicenter. Meanwhile, permanent offsets are extracted from the integrated displacements highly accurately and used for reliable fault slip inversion and magnitude estimation. PMID:24284765
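
    The integration described in the two records above is tightly coupled at the GPS observation level. A much simpler, loosely coupled illustration of the same idea is to combine a strong-motion acceleration record with lower-rate GPS displacements in a small Kalman filter, as sketched below; all signals are synthetic and this is not the authors' algorithm.

```python
import numpy as np

def integrate_acc_gps(acc, gps_disp, dt, gps_every, q=1e-3, r=1e-4):
    """Kalman filter with state [displacement, velocity]; acceleration drives the
    prediction step, GPS displacement (every `gps_every` samples) updates it."""
    x = np.zeros(2)                          # [displacement (m), velocity (m/s)]
    p = np.eye(2)
    f = np.array([[1.0, dt], [0.0, 1.0]])    # state transition
    g = np.array([0.5 * dt ** 2, dt])        # acceleration input
    h = np.array([[1.0, 0.0]])               # GPS observes displacement only
    qmat = q * np.eye(2)
    out = np.zeros(acc.size)
    for i, a in enumerate(acc):
        x = f @ x + g * a                    # predict with acceleration
        p = f @ p @ f.T + qmat
        if i % gps_every == 0:               # GPS displacement update
            y = gps_disp[i // gps_every] - h @ x
            s = h @ p @ h.T + r
            k = (p @ h.T) / s
            x = x + (k * y).ravel()
            p = (np.eye(2) - k @ h) @ p
        out[i] = x[0]
    return out

# Synthetic test: a 0.5 m permanent-offset ramp sampled at 100 Hz (acc) and 1 Hz (GPS).
dt, n = 0.01, 2000
t = np.arange(n) * dt
true_disp = 0.5 * np.clip((t - 5.0) / 5.0, 0.0, 1.0)
acc = np.gradient(np.gradient(true_disp, dt), dt) + 0.02 * np.random.randn(n)
gps = true_disp[::100] + 0.01 * np.random.randn(n // 100)
disp = integrate_acc_gps(acc, gps, dt, gps_every=100)
print(f"recovered permanent offset ~ {disp[-1]:.3f} m (true 0.5 m)")
```

    The GPS updates remove the drift that plagues double integration of accelerometer data, which is why the permanent offset, and hence the fault slip inversion described above, becomes reliable.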

  18. Predicting Predictable: Accuracy and Reliability of Earthquake Forecasts

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.

    2014-12-01

    Earthquake forecast/prediction is an uncertain profession. The famous Gutenberg-Richter relationship limits the magnitude range of prediction to about one unit. Otherwise, the statistics of outcomes would be related to the smallest earthquakes and may be misleading when attributed to the largest earthquakes. Moreover, the intrinsic uncertainty of earthquake sizing allows self-deceptive picking of justification "just from below" the targeted magnitude range. This might be important encouraging evidence but, by no means, can it be a "helpful" additive to the statistics of the rigid testing that determines the reliability and efficiency of a forecast/prediction method. Usually, earthquake prediction is classified with respect to expectation time, while overlooking term-less identification of earthquake-prone areas, as well as spatial accuracy. The forecasts are often made for a "cell" or "seismic region" whose area is not linked to the size of target earthquakes. This might be another source for making a wrong choice in the parameterization of a forecast/prediction method and, eventually, for unsatisfactory performance in a real-time application. Summing up, prediction of the time and location of an earthquake of a certain magnitude range can be classified by accuracy as follows (temporal window in years; spatial window in units of source zone size L): long-term, 10 years / long-range, up to 100 L; intermediate-term, 1 year / middle-range, 5-10 L; short-term, 0.01-0.1 years / narrow-range, 2-3 L; immediate, 0.001 years / exact, 1 L. Note that the variety of possible combinations that exist is much larger than the usually considered "short-term exact" one. In principle, such an accurate statement about an anticipated seismic extreme might be futile due to the complexities of the Earth's lithosphere, its blocks-and-faults structure, and the evidently nonlinear dynamics of the seismic process. The observed scaling of source size and preparation zone with earthquake magnitude implies exponential scales for

  19. Rapid Assessment of Earthquakes with Radar and Optical Geodetic Imaging and Finite Fault Models (Invited)

    NASA Astrophysics Data System (ADS)

    Fielding, E. J.; Sladen, A.; Simons, M.; Rosen, P. A.; Yun, S.; Li, Z.; Avouac, J.; Leprince, S.

    2010-12-01

    Earthquake responders need to know where the earthquake has caused damage and what is the likely intensity of damage. The earliest information comes from global and regional seismic networks, which provide the magnitude and locations of the main earthquake hypocenter and moment tensor centroid and also the locations of aftershocks. Location accuracy depends on the availability of seismic data close to the earthquake source. Finite fault models of the earthquake slip can be derived from analysis of seismic waveforms alone, but the results can have large errors in the location of the fault ruptures and spatial distribution of slip, which are critical for estimating the distribution of shaking and damage. Geodetic measurements of ground displacements with GPS, LiDAR, or radar and optical imagery provide key spatial constraints on the location of the fault ruptures and distribution of slip. Here we describe the analysis of interferometric synthetic aperture radar (InSAR) and sub-pixel correlation (or pixel offset tracking) of radar and optical imagery to measure ground coseismic displacements for recent large earthquakes, and lessons learned for rapid assessment of future events. These geodetic imaging techniques have been applied to the 2010 Leogane, Haiti; 2010 Maule, Chile; 2010 Baja California, Mexico; 2008 Wenchuan, China; 2007 Tocopilla, Chile; 2007 Pisco, Peru; 2005 Kashmir; and 2003 Bam, Iran earthquakes, using data from ESA Envisat ASAR, JAXA ALOS PALSAR, NASA Terra ASTER and CNES SPOT5 satellite instruments and the NASA/JPL UAVSAR airborne system. For these events, the geodetic data provided unique information on the location of the fault or faults that ruptured and the distribution of slip that was not available from the seismic data and allowed the creation of accurate finite fault source models. In many of these cases, the fault ruptures were on previously unknown faults or faults not believed to be at high risk of earthquakes, so the area and degree of

  20. Southeast Indian Ocean-Ridge Earthquake Sequences from Cross-correlation Analysis of Hydro-acoustic Data

    NASA Astrophysics Data System (ADS)

    Yun, S.; Ni, S.; Park, M.

    2006-12-01

    Earthquake sequences (location and timing of foreshocks and aftershocks) are critical for understanding the dynamics of mid-ocean ridge and transform faults. Unfortunately, whole sequences (including very small earthquakes) in the ocean cannot be well recorded by land-based seismometers, mostly because of large epicentral distances. Recent hydro-acoustic studies have demonstrated that T waves are very effective in detecting small submarine earthquakes because of the little energy loss of T waves propagating in the SOFAR channel. For example, a Mw6.2 (03/06/2006, Latitude -40.11, Longitude 78.49) transform earthquake occurred at the Southeastern Indian Ocean Ridge (an intermediate-spreading-rate ridge, 58-76 mm/year), but NEIC reports only 3 aftershocks in the first following week. We applied progressive multi-channel cross-correlation methods to hydro-acoustic data from the IMS arrays in the Indian Ocean to detect the whole earthquake sequence. We also correlate waveform envelopes to accurately locate aftershocks and found a consistent pattern of earthquake migration along the transform fault. In contrast to transform fault earthquake sequences at a fast-spreading ridge (East Pacific Rise, 142 mm/year), where foreshocks are observed, we failed to detect any foreshocks for the Mw6.2 earthquake, though we found many aftershocks. The lack of foreshocks may be caused by the lower spreading rate (hence lower temperature) or by too-small-scale ridge segmentation. Still, the number of aftershocks is much smaller than that of typical tectonic earthquakes such as subduction or continental earthquakes, arguing for different fault dynamics in mid-ocean ridge systems, perhaps due to higher water content or the presence of melt.
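
    The envelope-correlation step mentioned above amounts to finding the time lag that maximizes the cross-correlation between smoothed envelopes of two T-wave records. A minimal sketch using the Hilbert-transform envelope is given below; the traces are synthetic, not IMS hydro-acoustic data.

```python
import numpy as np
from scipy.signal import hilbert

def envelope(trace, smooth=51):
    """Smoothed amplitude envelope from the analytic signal."""
    env = np.abs(hilbert(trace))
    return np.convolve(env, np.ones(smooth) / smooth, mode="same")

def envelope_lag(trace_a, trace_b, fs):
    """Delay (s) of trace_b relative to trace_a that maximizes envelope cross-correlation."""
    ea, eb = envelope(trace_a), envelope(trace_b)
    ea -= ea.mean()
    eb -= eb.mean()
    cc = np.correlate(eb, ea, mode="full")
    return (np.argmax(cc) - (ea.size - 1)) / fs

# Synthetic T-phase-like packets; the second is delayed by 2.5 s.
fs = 50.0
t = np.arange(0.0, 60.0, 1.0 / fs)
packet = np.exp(-0.5 * ((t - 20.0) / 3.0) ** 2) * np.sin(2 * np.pi * 8.0 * t)
trace_a = packet + 0.05 * np.random.randn(t.size)
trace_b = np.roll(packet, int(2.5 * fs)) + 0.05 * np.random.randn(t.size)
print(f"estimated delay = {envelope_lag(trace_a, trace_b, fs):.2f} s (true 2.50 s)")
```

    Such relative arrival-time differences across several hydrophone arrays are what allow the aftershocks to be located and their along-fault migration tracked, as described above.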

  1. The U.S. Geological Survey Earthquake Hazards Program Website: Summary of Recent and Ongoing Developments

    NASA Astrophysics Data System (ADS)

    Wald, L. A.; Zirbes, M.; Robert, S.; Wald, D.; Presgrace, B.; Earle, P.; Schwarz, S.; Haefner, S.; Haller, K.; Rhea, S.

    2003-12-01

    The U.S. Geological Survey (USGS) Earthquake Hazards Program (EHP) website (http://earthquake.usgs.gov/) focuses on 1) earthquake reporting for informed decisions after an earthquake, 2) hazards information for informed decisions and planning before an earthquake, and 3) the basics of earthquake science to help the users of the information understand what is presented. The majority of website visitors are looking for information about current earthquakes in the U.S. and around the world, and the second most visited portion of the website is the education-related pages. People are eager for information, and they are most interested in "what's in my backyard?" Recent and future web developments are aimed at answering this question, making the information more relevant to users, and enabling users to more quickly and easily find the information they are looking for. Recent and/or current web developments include the new enhanced Recent Global Earthquakes and U.S. Earthquakes webpages, the Earthquake in the News system, the Rapid Accurate Tectonic Summaries (RATS), online Significant Earthquake Summary Posters (ESPs), and the U.S. Quaternary Fault & Fold Database, which are covered individually in greater detail in this or other sessions. Future planned developments include a consistent look across all EHP webpages, an integrated one-stop-shopping earthquake notification (EQMail) subscription webpage, new navigation tabs, and a backend database allowing the user to search for earthquake information across all the various EHP websites (on different webservers) based on a topic or region. Another goal is to eventually allow a user to input their address (Zip Code?) and in return receive all the relevant EHP information (and links to more detailed information) such as the closest fault, the last significant nearby earthquake, a local seismicity map, and a local hazard map, for example. This would essentially be a dynamic report based on the entered location

  2. Earthquake Simulations and Historical Patterns of Events: Forecasting the Next Great Earthquake in California

    NASA Astrophysics Data System (ADS)

    Sachs, M. K.; Rundle, J. B.; Heien, E. M.; Turcotte, D. L.; Yikilmaz, M.; Kellogg, L. H.

    2013-12-01

    The fault system in California, combined with some of the United States' most densely populated regions, is a recipe for devastation. It has been estimated that a repeat of the 1906 M=7.8 San Francisco earthquake could cause as much as $84 billion in damage. Earthquake forecasting can help alleviate the effects of these events by targeting disaster relief and preparedness in regions that will need it the most. However, accurate earthquake forecasting has proven difficult. We present a forecasting technique that uses simulated earthquake catalogs generated by Virtual California and patterns of historical events. As background, we also describe internal details of the Virtual California earthquake simulator.

  3. Deep Earthquakes.

    ERIC Educational Resources Information Center

    Frohlich, Cliff

    1989-01-01

    Summarizes research to find the nature of deep earthquakes occurring hundreds of kilometers down in the earth's mantle. Describes further research problems in this area. Presents several illustrations and four references. (YP)

  4. Earthquake Analysis.

    ERIC Educational Resources Information Center

    Espinoza, Fernando

    2000-01-01

    Indicates the importance of the development of students' measurement and estimation skills. Analyzes earthquake data recorded at seismograph stations and explains how to read and modify the graphs. Presents an activity for student evaluation. (YDS)

  5. Optimizing the real-time automatic location of the events produced in Romania using an advanced processing system

    NASA Astrophysics Data System (ADS)

    Neagoe, Cristian; Grecu, Bogdan; Manea, Liviu

    2016-04-01

    The National Institute for Earth Physics (NIEP) operates a real-time seismic network designed to monitor seismic activity on Romanian territory, which is dominated by intermediate-depth earthquakes (60-200 km) from the Vrancea area. The ability to reduce the impact of earthquakes on society depends on the existence of a large volume of high-quality observational data. The development of the network in recent years and an advanced seismic acquisition system are crucial to achieving this objective. The software package used to perform the automatic real-time locations is Seiscomp3. An accurate choice of the Seiscomp3 parameter settings is necessary to ensure the best performance of the real-time system, i.e., the most accurate earthquake locations while avoiding false events. The aim of this study is to optimize the algorithms of the real-time system that detect and locate the earthquakes in the monitored area. This goal is pursued by testing different parameters (e.g., STA/LTA settings, filters applied to the waveforms) on a data set of earthquakes representative of the local seismicity. The results are compared with the locations from the Romanian catalogue ROMPLUS.
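
    A minimal sketch of the kind of STA/LTA tuning experiment described above, using ObsPy's trigger routines; the file name, filter band, window lengths, and thresholds are hypothetical placeholders rather than the operational NIEP/Seiscomp3 settings.

    ```python
    from obspy import read
    from obspy.signal.trigger import classic_sta_lta, trigger_onset

    # Hypothetical waveform file; in practice one would loop over a data set
    # of representative Vrancea earthquakes and a grid of candidate settings.
    tr = read("example_vrancea_event.mseed")[0]
    tr.detrend("demean")
    tr.filter("bandpass", freqmin=2.0, freqmax=10.0)

    df = tr.stats.sampling_rate
    cft = classic_sta_lta(tr.data, int(1.0 * df), int(30.0 * df))  # 1 s STA, 30 s LTA
    triggers = trigger_onset(cft, 3.5, 1.0)  # trigger-on / trigger-off thresholds

    for on, off in triggers:
        print("trigger on at %.1f s, off at %.1f s" % (on / df, off / df))
    ```

    Counting triggers against known picks for each candidate parameter set, and against noise-only windows to count false detections, gives a simple objective basis for comparing settings.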

  6. The Uniform California Earthquake Rupture Forecast, Version 2 (UCERF 2)

    USGS Publications Warehouse

    2007 Working Group on California Earthquake Probabilities

    2008-01-01

    California's 35 million people live among some of the most active earthquake faults in the United States. Public safety demands credible assessments of the earthquake hazard to maintain appropriate building codes for safe construction and earthquake insurance for loss protection. Seismic hazard analysis begins with an earthquake rupture forecast: a model of probabilities that earthquakes of specified magnitudes, locations, and faulting types will occur during a specified time interval. This report describes a new earthquake rupture forecast for California developed by the 2007 Working Group on California Earthquake Probabilities (WGCEP 2007).

  7. Development of fragility functions to estimate homelessness after an earthquake

    NASA Astrophysics Data System (ADS)

    Brink, Susan A.; Daniell, James; Khazai, Bijan; Wenzel, Friedemann

    2014-05-01

    Immediately after an earthquake, many stakeholders need to make decisions about their response. These decisions often need to be made in a data-poor environment, as accurate information on the impact can take months or even years to be collected and publicized. Social fragility functions have been developed and applied to provide an estimate of the impact in terms of building damage, deaths and injuries in near real time. These rough estimates can help governments and response agencies determine what aid may be required, which can improve their emergency response and facilitate planning for longer term response. Due to building damage, lifeline outages, fear of aftershocks, or other causes, people may become displaced or homeless after an earthquake. Especially in cold and dangerous locations, the rapid provision of safe emergency shelter can be a lifesaving necessity. However, immediately after an event there is little information available to aid response agencies in decision making about the number of homeless, their locations, and whether they require public shelter. In this research, we analyze homelessness after historic earthquakes using the CATDAT Damaging Earthquakes Database. CATDAT includes information on the hazard as well as the physical and social impact of over 7200 damaging earthquakes from 1900-2013 (Daniell et al. 2011). We explore the relationship of both earthquake characteristics and area characteristics with homelessness after the earthquake. We consider modelled variables over the period 1900-2013, such as population density, HDI, year, measures of ground motion intensity developed in Daniell (2014), as well as temperature. Starting from the methodology used for the PAGER fatality fragility curves developed by Jaiswal and Wald (2010), but applying regression through time with the socioeconomic parameters developed in Daniell et al. (2012) for "socioeconomic fragility functions", we develop a set of fragility curves that can be
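
    To make the idea of a fragility function concrete, the sketch below fits a lognormal fragility curve (the generic form also used for the PAGER fatality curves) relating a ground-motion intensity measure to the fraction of the exposed population left homeless. The data points and parameter values are hypothetical, and the actual model above additionally regresses the parameters through time against socioeconomic covariates.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    def fragility(im, theta, beta):
        # Lognormal fragility: expected fraction affected given intensity
        # measure im, with median theta and dispersion beta.
        return norm.cdf(np.log(im / theta) / beta)

    # Hypothetical historical observations: intensity measure (e.g. PGA in g)
    # versus fraction of the exposed population left homeless.
    im_obs = np.array([0.05, 0.10, 0.20, 0.30, 0.50, 0.80])
    frac_homeless = np.array([0.00, 0.01, 0.05, 0.12, 0.30, 0.55])

    (theta, beta), _ = curve_fit(fragility, im_obs, frac_homeless, p0=(0.4, 0.8))
    print("median IM = %.2f g, dispersion beta = %.2f" % (theta, beta))
    ```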

  8. The threat of silent earthquakes

    USGS Publications Warehouse

    Cervelli, Peter

    2004-01-01

    Not all earthquakes shake the ground. The so-called silent types are forcing scientists to rethink their understanding of the way quake-prone faults behave. In rare instances, silent earthquakes that occur along the flanks of seaside volcanoes may cascade into monstrous landslides that crash into the sea and trigger towering tsunamis. Silent earthquakes that take place within fault zones created by one tectonic plate diving under another may increase the chance of ground-shaking shocks. In other locations, however, silent slip may decrease the likelihood of destructive quakes, because it releases stress along faults that might otherwise seem ready to snap.

  9. Improved centroid moment tensor analyses in the NIED AQUA (Accurate and QUick Analysis system for source parameters)

    NASA Astrophysics Data System (ADS)

    Kimura, H.; Asano, Y.; Matsumoto, T.

    2012-12-01

    The rapid determination of hypocentral parameters and their transmission to the public are valuable components of disaster mitigation. We have operated an automatic system for this purpose—termed the Accurate and QUick Analysis system for source parameters (AQUA)—since 2005 (Matsumura et al., 2006). In this system, the initial hypocenter, the moment tensor (MT), and the centroid moment tensor (CMT) solutions are automatically determined and posted on the NIED Hi-net Web site (www.hinet.bosai.go.jp). This paper describes improvements made to the AQUA to overcome limitations that became apparent after the 2011 Tohoku Earthquake (05:46:17 UTC, March 11, 2011). The improvements included the processing of NIED F-net velocity-type strong motion records, because NIED F-net broadband seismographs saturate for great earthquakes such as the 2011 Tohoku Earthquake. These velocity-type strong motion seismographs provide unsaturated records not only for the 2011 Tohoku Earthquake, but also for recording stations located close to the epicenters of M>7 earthquakes. We used 0.005-0.020 Hz records for M>7.5 earthquakes, in contrast to the 0.01-0.05 Hz records employed in the original system. The initial hypocenters, determined from arrival times picked on seismograms recorded by NIED Hi-net stations, can have large errors in magnitude and hypocenter location, especially for great earthquakes or earthquakes located far from the onland Hi-net network. The size of the 2011 Tohoku Earthquake was initially underestimated in the AQUA to be around M5 at the initial stage of rupture. Numerous aftershocks occurred at the outer rise east of the Japan trench, where a great earthquake is anticipated to occur. Hence, we modified the system to repeat the MT analyses assuming a larger size, for all earthquakes for which the magnitude was initially underestimated. We also broadened the search range of centroid depth for earthquakes located far from the onland Hi

  10. Earthquakes for Kids

    MedlinePlus


  11. Testing earthquake predictions

    NASA Astrophysics Data System (ADS)

    Luen, Brad; Stark, Philip B.

    2008-01-01

    Statistical tests of earthquake predictions require a null hypothesis to model occasional chance successes. To define and quantify 'chance success' is knotty. Some null hypotheses ascribe chance to the Earth: Seismicity is modeled as random. The null distribution of the number of successful predictions - or any other test statistic - is taken to be its distribution when the fixed set of predictions is applied to random seismicity. Such tests tacitly assume that the predictions do not depend on the observed seismicity. Conditioning on the predictions in this way sets a low hurdle for statistical significance. Consider this scheme: When an earthquake of magnitude 5.5 or greater occurs anywhere in the world, predict that an earthquake at least as large will occur within 21 days and within an epicentral distance of 50 km. We apply this rule to the Harvard centroid-moment-tensor (CMT) catalog for 2000-2004 to generate a set of predictions. The null hypothesis is that earthquake times are exchangeable conditional on their magnitudes and locations and on the predictions - a common "nonparametric" assumption in the literature. We generate random seismicity by permuting the times of events in the CMT catalog. We consider an event successfully predicted only if (i) it is predicted and (ii) there is no larger event within 50 km in the previous 21 days. The P-value for the observed success rate is <0.001: The method successfully predicts about 5% of earthquakes, far better than 'chance' because the predictor exploits the clustering of earthquakes - occasional foreshocks - which the null hypothesis lacks. Rather than condition on the predictions and use a stochastic model for seismicity, it is preferable to treat the observed seismicity as fixed, and to compare the success rate of the predictions to the success rate of simple-minded predictions like those just described. If the proffered predictions do no better than a simple scheme, they have little value.
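
    The permutation test described above is simple to implement. The sketch below uses a simplified reading of the prediction rule (an event counts as successfully predicted if some earlier M>=5.5 event lies within 21 days and 50 km and no larger event does) and permutes origin times to build the null distribution; the array names are hypothetical and the brute-force loops are written for clarity, not speed.

    ```python
    import numpy as np

    def haversine_km(la1, lo1, la2, lo2):
        # Great-circle distance on a spherical Earth (radius 6371 km).
        la1, lo1, la2, lo2 = map(np.radians, (la1, lo1, la2, lo2))
        a = np.sin((la2 - la1) / 2) ** 2 + np.cos(la1) * np.cos(la2) * np.sin((lo2 - lo1) / 2) ** 2
        return 2.0 * 6371.0 * np.arcsin(np.sqrt(a))

    def n_successes(t, lat, lon, mag, window_days=21.0, radius_km=50.0):
        count = 0
        for i in range(len(t)):
            prior = [j for j in range(len(t))
                     if 0.0 < t[i] - t[j] <= window_days
                     and haversine_km(lat[i], lon[i], lat[j], lon[j]) <= radius_km]
            if prior and any(mag[j] >= 5.5 for j in prior) \
                    and all(mag[j] <= mag[i] for j in prior):
                count += 1
        return count

    def permutation_p_value(t, lat, lon, mag, n_perm=999, seed=0):
        # Null: permute origin times while keeping locations and magnitudes fixed.
        rng = np.random.default_rng(seed)
        observed = n_successes(t, lat, lon, mag)
        null = [n_successes(rng.permutation(t), lat, lon, mag) for _ in range(n_perm)]
        return observed, (1 + sum(n >= observed for n in null)) / (n_perm + 1.0)
    ```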

  12. Improve earthquake hypocenter using adaptive simulated annealing inversion in regional tectonic, volcano tectonic, and geothermal observation

    NASA Astrophysics Data System (ADS)

    Ry, Rexha Verdhora; Nugraha, Andri Dian

    2015-04-01

    Earthquake observation is routinely and widely used in monitoring tectonic activity, and also at local scales such as volcano-tectonic and geothermal activity observation. Determining a precise hypocenter is necessary; the process involves finding the hypocenter location that minimizes the misfit between the observed and calculated travel times. When solving this nonlinear inverse problem, the simulated annealing inversion method can be applied as a global optimization approach, for which the convergence of the solution is independent of the initial model. In this study, we developed our own program code applying adaptive simulated annealing inversion in the Matlab environment. We applied this method to determine earthquake hypocenters using several data cases: regional tectonic, volcano-tectonic, and geothermal field data. The travel times were calculated using the ray-tracing shooting method. We then compared the results with those obtained using Geiger's method to assess reliability. Our results show that the hypocenter locations have smaller RMS errors than the Geiger's-method results, which can be statistically associated with a better solution. The earthquake hypocenters also correlate well with the geological structure in the study area. We recommend using adaptive simulated annealing inversion to relocate hypocenters in order to obtain precise and accurate earthquake locations.
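
    A minimal sketch of simulated-annealing hypocenter location, using SciPy's dual_annealing as the global optimizer and a constant-velocity half-space with straight rays in place of the ray-tracing shooting forward solver used above; the station coordinates, arrival times, velocity, and bounds are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import dual_annealing

    VP = 6.0  # km/s, assumed uniform P-wave velocity

    def travel_time(hypo, station):
        # Straight-ray travel time in a homogeneous medium.
        return np.linalg.norm(np.asarray(hypo) - np.asarray(station)) / VP

    def rms_misfit(params, stations, t_obs):
        x, y, z, t0 = params
        t_calc = np.array([t0 + travel_time((x, y, z), s) for s in stations])
        return np.sqrt(np.mean((t_obs - t_calc) ** 2))

    # Hypothetical station coordinates (km) and observed P arrival times (s).
    stations = np.array([[0.0, 0.0, 0.0], [20.0, 5.0, 0.0], [5.0, 25.0, 0.0], [-15.0, 10.0, 0.0]])
    t_obs = np.array([3.1, 3.6, 4.4, 4.0])

    bounds = [(-50, 50), (-50, 50), (0, 40), (-10, 10)]  # x, y, depth, origin time
    result = dual_annealing(rms_misfit, bounds, args=(stations, t_obs), seed=1)
    print("hypocenter (x, y, z) and origin time:", result.x, "RMS misfit:", result.fun)
    ```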

  13. Teleseismic Double-Difference Earthquake Hypocenter Relocation in the Indonesian Region

    NASA Astrophysics Data System (ADS)

    Nugraha, A. D.; Shiddiqi, H. A.; Widiyantoro, S.; Ramdhan, M.; Wandono, W.; Sutiyono, S.; Handayani, T.

    2014-12-01

    Hypocenter location accuracy is crucial for seismicity studies. Therefore, it is important to obtain accurate earthquake locations using an adequate relocation method. We have relocated nearly 30,000 earthquakes (with magnitude greater than 2.0) compiled by BMKG from April 2009 to June 2014 around the Indonesian region using a teleseismic double-difference relocation algorithm. We used arrival time data from local, regional and teleseismic stations. For the inversion procedure, we have applied 1-D and 3-D seismic velocity models to determine the earthquake hypocenter locations. Our relocation results show that the travel-time RMS errors were greatly reduced. The distribution of hypocenters shows significantly improved locations after the relocation. The relocated hypocenters also exhibit improvement in depths, particularly for shallow earthquakes. Overall, our relocation results correlate well with tectonic features in this region, e.g., the major subduction zones beneath Sumatra, Java, Bali, Banda, Sulawesi and Molucca, and inland fault zones such as the Sumatra fault zone. These results will provide better information for updating seismic hazard maps and further advanced studies in the Indonesian region.
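
    For readers unfamiliar with the approach, the double-difference method inverts residuals defined for pairs of events i and j observed at a common station k,

    \[ dr_k^{ij} = \left(t_k^i - t_k^j\right)^{\mathrm{obs}} - \left(t_k^i - t_k^j\right)^{\mathrm{cal}}, \]

    so that path and station terms common to the two events largely cancel. This is the standard formulation (Waldhauser and Ellsworth, 2000), quoted here generically rather than as the exact teleseismic implementation used above.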

  14. Improve earthquake hypocenter using adaptive simulated annealing inversion in regional tectonic, volcano tectonic, and geothermal observation

    SciTech Connect

    Ry, Rexha Verdhora; Nugraha, Andri Dian

    2015-04-24

    Earthquake observation is routinely and widely used in monitoring tectonic activity, and also at local scales such as volcano-tectonic and geothermal activity observation. Determining a precise hypocenter is necessary; the process involves finding the hypocenter location that minimizes the misfit between the observed and calculated travel times. When solving this nonlinear inverse problem, the simulated annealing inversion method can be applied as a global optimization approach, for which the convergence of the solution is independent of the initial model. In this study, we developed our own program code applying adaptive simulated annealing inversion in the Matlab environment. We applied this method to determine earthquake hypocenters using several data cases: regional tectonic, volcano-tectonic, and geothermal field data. The travel times were calculated using the ray-tracing shooting method. We then compared the results with those obtained using Geiger's method to assess reliability. Our results show that the hypocenter locations have smaller RMS errors than the Geiger's-method results, which can be statistically associated with a better solution. The earthquake hypocenters also correlate well with the geological structure in the study area. We recommend using adaptive simulated annealing inversion to relocate hypocenters in order to obtain precise and accurate earthquake locations.

  15. United States earthquakes, 1984

    SciTech Connect

    Stover, C.W.

    1988-01-01

    The report contains information for earthquakes in the 50 states and Puerto Rico and the area near their shorelines. The data consist of earthquake locations (date, time, geographic coordinates, depth, and magnitudes), intensities, macroseismic information, and isoseismal and seismicity maps. Also included are sections detailing the activity of seismic networks operated by universities and other government agencies and a list of results from strong-motion seismograph records.

  16. Using DART-recorded Rayleigh waves for rapid CMT and finite fault analyses of large megathrust earthquakes.

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Polet, J.; Ryan, K. J.

    2015-12-01

    We study the use of long-period Rayleigh waves recorded by the DART-type ocean bottom pressure sensors. The determination of accurate moment and slip distribution after a megathrust subduction zone earthquake is essential for tsunami early warning. The two main reasons why the DART data are of interest to this problem are: (1) contrary to the broadband data used in the early stages of earthquake analysis, the DART data do not saturate for large-magnitude earthquakes, and (2) DART stations are located offshore and thus often fill gaps in the instrumental coverage at local and regional distances. Thus, by including DART-recorded Rayleigh waves in the rapid response systems we may be able to gain valuable time in determining accurate moment estimates and slip distributions needed for tsunami warning and other rapid response products. Large megathrust earthquakes are among the most destructive natural disasters in history but also pose a significant challenge for real-time analysis. The scales involved in such large earthquakes, with ruptures as long as a thousand kilometers and durations of several minutes, are formidable. There are still issues with rapid analysis at short timescales, such as minutes after the event, since many of the nearby seismic stations will saturate due to the large ground motions. Also, on the seaward side of megathrust earthquakes, the nearest seismic stations are often thousands of kilometers away on oceanic islands. The deployment of DART buoys can fill this gap, since these instruments do not saturate and are located close in on the seaward side of the megathrusts. We are evaluating the use of DART-recorded Rayleigh waves, by including them in the dataset used for Centroid Moment Tensor analyses, and by using the near-field DART stations to constrain source finiteness for megathrust earthquakes such as the recent Tohoku, Haida Gwaii and Chile earthquakes.

  17. Earthquake Apparent Stress Scaling

    NASA Astrophysics Data System (ADS)

    Mayeda, K.; Walter, W. R.

    2003-04-01

    There is currently a disagreement within the geophysical community on the way earthquake energy scales with magnitude. One set of recent papers finds evidence that energy release per seismic moment (apparent stress) is constant (e.g., Choy and Boatwright, 1995; McGarr, 1999; Ide and Beroza, 2001). Another set of recent papers finds the apparent stress increases with magnitude (e.g., Kanamori et al., 1993; Abercrombie, 1995; Mayeda and Walter, 1996; Izutani and Kanamori, 2001). The resolution of this issue is complicated by the difficulty of accurately accounting for and determining the seismic energy radiated by earthquakes over a wide range of event sizes in a consistent manner. We have just started a project to reexamine this issue by applying the same methodology to a series of datasets that spans roughly 10 orders in seismic moment, M0. We will summarize recent results using the coda envelope methodology of Mayeda et al. (2003), which provides the most stable source spectral estimates to date. This methodology eliminates the complicating effects of lateral path heterogeneity, source radiation pattern, directivity, and site response (e.g., amplification, f-max and kappa). We find that in tectonically active continental crustal areas the total radiated energy scales as M0^0.25, whereas in regions of relatively younger oceanic crust the stress drop is generally lower and exhibits a 1-to-1 scaling with moment. In addition to answering a fundamental question in earthquake source dynamics, this study addresses how one would scale small earthquakes in a particular region up to a future, more damaging earthquake. This work was performed under the auspices of the U.S. Department of Energy by the University of California, Lawrence Livermore National Laboratory under Contract No. W-7405-Eng-48.
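
    For context, apparent stress is defined from the radiated energy E_R and seismic moment M_0 as

    \[ \sigma_a = \mu \, \frac{E_R}{M_0}, \]

    with rigidity \mu, so any power-law scaling of one quantity implies the corresponding scaling of the other: \sigma_a \propto M_0^{\,n} is equivalent to E_R \propto M_0^{\,1+n}. This standard relation is noted here only to connect the two ways in which the scaling debate is usually stated.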

  18. The EM Earthquake Precursor

    NASA Astrophysics Data System (ADS)

    Jones, K. B., II; Saxton, P. T.

    2013-12-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and warn the public in turn. Presently, the animal kingdom leads the precursor list alluding to a transmission related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous, high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After the 1989 Loma Prieta Earthquake, American earthquake investigators predetermined magnetometer use and a minimum earthquake magnitude necessary for EM detection. This action was set in motion, due to the extensive damage incurred and public outrage concerning earthquake forecasting; however, the magnetometers employed, grounded or buried, are completely subject to static and electric fields and have yet to correlate to an identifiable precursor. Secondly, there is neither a networked array for finding any epicentral locations, nor have there been any attempts to find even one. This methodology needs dismissal, because it is overly complicated, subject to continuous change, and provides no response time. As for the minimum magnitude threshold, which was set at M5, this is simply higher than what modern technological advances have gained. Detection can now be achieved at approximately M1, which greatly improves forecasting chances. A propagating precursor has now been detected in both the field and laboratory. Field antenna testing conducted outside the NE Texas town of Timpson in February, 2013, detected three strong EM sources along with numerous weaker signals. The antenna had mobility, and observations were noted for recurrence, duration, and frequency response. Next, two

  19. A new reference global instrumental earthquake catalogue (1900-2009)

    NASA Astrophysics Data System (ADS)

    Di Giacomo, D.; Engdahl, B.; Bondar, I.; Storchak, D. A.; Villasenor, A.; Bormann, P.; Lee, W.; Dando, B.; Harris, J.

    2011-12-01

    For seismic hazard studies on a global and/or regional scale, accurate knowledge of the spatial distribution of seismicity, the magnitude-frequency relation and the maximum magnitudes is of fundamental importance. However, such information is normally not homogeneous (or not available) for the various seismically active regions of the Earth. To achieve the GEM objectives (www.globalquakemodel.org) of calculating and communicating earthquake risk worldwide, an improved reference global instrumental catalogue for large earthquakes spanning the entire 100+ years period of instrumental seismology is an absolute necessity. To accomplish this task, we apply the most up-to-date techniques and standard observatory practices for computing the earthquake location and magnitude. In particular, the re-location procedure benefits both from the depth determination according to Engdahl and Villaseñor (2002), and the advanced technique recently implemented at the ISC (Bondár and Storchak, 2011) to account for correlated error structure. With regard to magnitude, starting from the re-located hypocenters, the classical surface and body-wave magnitudes are determined following the new IASPEI standards and by using amplitude-period data of phases collected from historical station bulletins (up to 1970), which were not available in digital format before the beginning of this work. Finally, the catalogue will provide moment magnitude values (including uncertainty) for each seismic event via seismic moment, via surface wave magnitude or via other magnitude types using empirical relationships. References Engdahl, E.R., and A. Villaseñor (2002). Global seismicity: 1900-1999. In: International Handbook of Earthquake and Engineering Seismology, eds. W.H.K. Lee, H. Kanamori, J.C. Jennings, and C. Kisslinger, Part A, 665-690, Academic Press, San Diego. Bondár, I., and D. Storchak (2011). Improved location procedures at the International Seismological Centre, Geophys. J. Int., doi:10.1111/j

  20. Ground Motion Prediction of Subduction Earthquakes using the Onshore-Offshore Ambient Seismic Field

    NASA Astrophysics Data System (ADS)

    Viens, L.; Miyake, H.; Koketsu, K.

    2014-12-01

    Seismic waves produced by earthquakes have already caused extensive damage around the world and remain a real threat to human beings. To reduce the seismic risk associated with future earthquakes, accurate ground motion predictions are required, especially for cities located atop sedimentary basins that can trap and amplify these seismic waves. We focus this study on long-period ground motions produced by subduction earthquakes in Japan, which have the potential to damage large-scale structures such as high-rise buildings, bridges, and oil storage tanks. We extracted the impulse response functions from the ambient seismic field recorded by two stations, using one as a virtual source, without any preprocessing. This method allows us to recover reliable phases and relative, rather than absolute, amplitudes. To retrieve the corresponding Green's functions, the impulse response amplitudes need to be calibrated using observational records of an earthquake that occurred close to the virtual source. We show that Green's functions can be extracted between offshore submarine cable-based sea-bottom seismographic observation systems deployed by JMA atop subduction zones and on-land NIED Hi-net stations. In contrast with physics-based simulations, this approach has the great advantage of predicting ground motions of moderate earthquakes (Mw ~5) at long periods in highly populated sedimentary basins without the need for any external information about the velocity structure.
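
    A minimal single-segment sketch of the extraction step: cross-correlating long ambient-noise records from a "virtual source" station and a receiver station approximates the inter-station impulse response up to an amplitude factor. In practice many daily segments are correlated and stacked, and the lag-sign convention should be checked before interpreting causal and acausal parts; the function and variable names here are hypothetical.

    ```python
    import numpy as np

    def impulse_response_estimate(u_source, u_receiver, dt, max_lag_s=300.0):
        """Estimate the inter-station impulse response by cross-correlation
        of two equal-length ambient-noise records sampled at interval dt (s)."""
        a = u_source - np.mean(u_source)
        b = u_receiver - np.mean(u_receiver)
        cc = np.correlate(b, a, mode="full") / len(a)
        lags = (np.arange(len(cc)) - (len(a) - 1)) * dt
        keep = np.abs(lags) <= max_lag_s
        return lags[keep], cc[keep]
    ```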

  1. Relocation of Gulf of Aqaba earthquakes using the JSOP Bulletin

    SciTech Connect

    Sweeney, J.J.

    1997-07-03

    Ground truth information (i.e., precise information about the hypocenter and origin time of a seismic event) is difficult to obtain in the Middle East and North Africa region. One source of ground truth we are attempting to exploit is data from local seismic networks. An electronic bulletin from the second phase of the Joint Seismic Observation Period (JSOP), with participating countries in the eastern Mediterranean region, provides a source of local network data not ordinarily available. I have used JSOP bulletin data for the period January 1996 through June 1996 to relocate over 100 earthquakes occurring in and around the Gulf of Aqaba. Fourteen of these earthquakes have picks in the bulletin for stations surrounding the Gulf (Egypt, Saudi Arabia, Israel, and Jordan). The rest of the data involves picks for stations either in Israel, Jordan, and Saudi Arabia (east side and north of the Gulf) or for stations in Israel, Jordan, and Egypt (west side and north of the Gulf). The VELEST code (Joint Hypocenter Determination method) was used to calculate improved locations (over what can be obtained from single-event determinations (SED) with poor station configurations) for all the earthquakes in the data set. Location differences between the JHD solution and SED are discussed, along with determination of the minimum 1-D velocity model. Waveform correlation was used to validate observed event clusters in the VELEST solutions. This provided evidence that some of the VELEST solutions are more accurate than NEIC solutions. The subset of 14 events with good station coverage provides a good set of ground truth (location uncertainty <5 km). The rest of the events are probably located more accurately with local data than is available from NEIC determinations, but such a conclusion needs to be supported by further study.

  2. Extraction Method for Earthquake-Collapsed Building Information Based on High-Resolution Remote Sensing

    NASA Astrophysics Data System (ADS)

    Chen, Peng; Wu, Jian; Liu, Yaolin; Wang, Jing

    2014-03-01

    At present, the extraction of earthquake disaster information from remote sensing data relies on visual interpretation. However, this technique cannot effectively and quickly obtain precise and efficient information for earthquake relief and emergency management. Collapsed buildings in the town of Zipingpu after the Wenchuan earthquake were used as a case study to validate two kinds of rapid extraction methods for earthquake-collapsed building information based on pixel-oriented and object-oriented theories. The pixel-oriented method is based on multi-layer regional segments that embody the core layers and segments of the object-oriented method. The key idea is to mask, layer by layer, all image information, including that on the collapsed buildings. Compared with traditional techniques, the pixel-oriented method is innovative because it allows considerably faster computer processing. As for the object-oriented method, a multi-scale segmentation algorithm was applied to build a three-layer hierarchy. By analyzing the spectrum, texture, shape, location, and context of individual object classes in different layers, a fuzzy rule system was established for the extraction of earthquake-collapsed building information. We compared the two sets of results using three criteria: precision assessment, visual effect, and principle. Both methods can extract earthquake-collapsed building information quickly and accurately. The object-oriented method successfully overcomes the salt-and-pepper noise caused by the spectral diversity of high-resolution remote sensing data and solves the problems of "same object, different spectra" and "same spectrum, different objects". With an overall accuracy of 90.38%, the method achieves more scientific and accurate results compared with the pixel-oriented method (76.84%). The object-oriented image analysis method can be extensively applied in the extraction of earthquake disaster information based on high-resolution remote sensing.

  3. Fracking, wastewater disposal, and earthquakes

    NASA Astrophysics Data System (ADS)

    McGarr, Arthur

    2016-03-01

    In the modern oil and gas industry, fracking of low-permeability reservoirs has resulted in a considerable increase in the production of oil and natural gas, but these fluid-injection activities also can induce earthquakes. Earthquakes induced by fracking are an inevitable consequence of the injection of fluid at high pressure, where the intent is to enhance permeability by creating a system of cracks and fissures that allow hydrocarbons to flow to the borehole. The micro-earthquakes induced during these highly-controlled procedures are generally much too small to be felt at the surface; indeed, the creation or reactivation of a large fault would be contrary to the goal of enhancing permeability evenly throughout the formation. Accordingly, the few case histories for which fracking has resulted in felt earthquakes have been due to unintended fault reactivation. Of greater consequence for inducing earthquakes, modern techniques for producing hydrocarbons, including fracking, have resulted in considerable quantities of coproduced wastewater, primarily formation brines. This wastewater is commonly disposed by injection into deep aquifers having high permeability and porosity. As reported in many case histories, pore pressure increases due to wastewater injection were channeled from the target aquifers into fault zones that were, in effect, lubricated, resulting in earthquake slip. These fault zones are often located in the brittle crystalline rocks in the basement. Magnitudes of earthquakes induced by wastewater disposal often exceed 4, the threshold for structural damage. Even though only a small fraction of disposal wells induce earthquakes large enough to be of concern to the public, there are so many of these wells that this source of seismicity contributes significantly to the seismic hazard in the United States, especially east of the Rocky Mountains where standards of building construction are generally not designed to resist shaking from large earthquakes.

  4. Earthquake Alert System feasibility study

    SciTech Connect

    Harben, P.E.

    1991-12-01

    An Earthquake Alert System (EAS) could give several seconds to several tens of seconds warning before the strong motion from a large earthquake arrives. Such a system would include a large network of sensors distributed within an earthquake-prone region. The sensors closest to the epicenter of a particular earthquake would transmit data at the speed of light to a central processing center, which would broadcast an area-wide alarm in advance of the spreading elastic wave energy from the earthquake. This is possible because seismic energy travels slowly (3--6 km/s) compared to the speed of light. Utilities, public and private institutions, businesses, and the general public would benefit from an EAS. Although many earthquake protection systems exist that automatically shut down power, gas mains, etc. when ground motion at a facility reaches damaging levels, no EAS -- that is, a system that can provide warning in advance of elastic wave energy arriving at a facility -- has ever been developed in the United States. A recent study by the National Academy of Sciences (NRC, 1991) concludes that an EAS is technically feasible and strongly recommends installing a prototype system that makes use of existing microseismic stations as much as possible. The EAS concept discussed here consists of a distributed network of remote seismic stations that measure weak and strong earth motion and transmit the data in real time to a central facility. This facility processes the data and issues warning broadcasts in the form of information packets containing estimates of earthquake location, zero time (the time the earthquake began), magnitude, and reliability of the predictions. Users of the warning broadcasts have a dedicated receiver that monitors the warning broadcast frequency. The user also has preprogrammed responses that are automatically executed when the warning information packets contain location and magnitude estimates above a facility's tolerance.
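
    A back-of-the-envelope sketch of the warning time such a system could provide, under the simple geometry described above: the nearest sensors detect the first P arrivals, telemetry at the speed of light is treated as instantaneous, and the damaging shaking travels at S-wave speed. All numbers are placeholders, not values from the study.

    ```python
    def warning_time_s(epicentral_distance_km, detection_distance_km=20.0,
                       vp_km_s=6.0, vs_km_s=3.5, latency_s=5.0):
        # Time until strong shaking (S waves) reaches the facility, minus the
        # time for the nearest sensors to record the first P arrivals and the
        # processing/broadcast latency.
        return (epicentral_distance_km / vs_km_s
                - detection_distance_km / vp_km_s
                - latency_s)

    print(round(warning_time_s(150.0), 1), "s of warning at 150 km")
    ```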

  6. Earthquakes in Arkansas and vicinity 1699-2010

    USGS Publications Warehouse

    Dart, Richard L.; Ausbrooks, Scott M.

    2011-01-01

    This map summarizes approximately 300 years of earthquake activity in Arkansas. It is one in a series of similar State earthquake history maps. Work on the Arkansas map was done in collaboration with the Arkansas Geological Survey. The earthquake data plotted on the map are from several sources: the Arkansas Geological Survey, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Mississippi Department of Environmental Quality. In addition to earthquake locations, other materials presented include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Arkansas and parts of adjacent states. Arkansas has undergone a number of significant felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931, and a magnitude 4.3 earthquake in 1967. The map shows all historical and instrumentally located earthquakes in Arkansas and vicinity between 1811 and 2010. The largest historic earthquake in the vicinity of the State was an intensity XI event, on December 16, 1811; the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region.

  7. Earthquake and tsunami forecasts: relation of slow slip events to subsequent earthquake rupture.

    PubMed

    Dixon, Timothy H; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

    2014-12-01

    The 5 September 2012 Mw 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr-Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential. PMID:25404327
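
    The "Mohr-Coulomb failure stress" change referred to above is conventionally evaluated as the Coulomb stress change resolved on a receiver fault,

    \[ \Delta \mathrm{CFS} = \Delta\tau + \mu' \, \Delta\sigma_n, \]

    where \Delta\tau is the shear stress change in the slip direction, \Delta\sigma_n the normal stress change (unclamping positive), and \mu' the effective friction coefficient; this is the standard expression, stated here for context rather than taken from the paper.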

  8. Earthquake and tsunami forecasts: Relation of slow slip events to subsequent earthquake rupture

    PubMed Central

    Dixon, Timothy H.; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

    2014-01-01

    The 5 September 2012 Mw 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr–Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential. PMID:25404327

  9. Characterization of earthquake rupture characteristics using hydroacoustic data

    NASA Astrophysics Data System (ADS)

    de Groot-Hedlin, C.

    2006-12-01

    Hydroacoustic signals (T-waves) generated by the 2004 Great Sumatra earthquake were recorded by a network of 5 small hydroacoustic arrays located in the Indian Ocean at distances of 2800 to 7000 km from the epicenter. The array configurations allow for accurate determination of the receiver-to-source azimuth given coherent arrivals. Analysis of a series of short time windows within the T-wave coda shows that the receiver-to-source azimuth varies smoothly as a function of time, suggesting a non-stationary T-wave source. The data indicate that the rupture proceeded in two distinct phases: initially it progressed northwest at approximately 2.4 km/s along the Sunda trench. At 600 km from the epicenter the rupture slowed to approximately 1.5 km/s. However, T-waves from small earthquakes are also generated over a wide range of azimuths, reflecting seismic-to-acoustic conversion over a broad expanse of the seafloor. Although the azimuthal variations for the great Sumatra event are shown to be inconsistent with a small-scale source, it is difficult in general to distinguish between azimuthal variations associated with the physics of T-wave excitation and those associated with an extended rupture zone. A method of determining rupture length based on the apparent motion of the T-wave source location is presented here and applied to several events, including the Great Sumatra earthquake of Dec 26, 2004 and the magnitude 8.6 event of March 28, 2005.

  10. America's faulty earthquake plans

    SciTech Connect

    Rosen, J

    1989-10-01

    In this article, the author discusses the likelihood of major earthquakes in both the western and eastern United States as well as the level of preparedness of each region of the U.S. for a major earthquake. Current technology in both earthquake-resistant design and earthquake detection is described. Governmental programs for earthquake hazard reduction are outlined and critiqued.

  11. Darwin's earthquake.

    PubMed

    Lee, Richard V

    2010-07-01

    Charles Darwin experienced a major earthquake in the Concepción-Valdivia region of Chile 175 years ago, in February 1835. His observations dramatically illustrated the geologic principles of James Hutton and Charles Lyell which maintained that the surface of the earth was subject to alterations by natural events, such as earthquakes, volcanoes, and the erosive action of wind and water, operating over very long periods of time. Changes in the land created new environments and fostered adaptations in life forms that could lead to the formation of new species. Without the demonstration of the accumulation of multiple crustal events over time in Chile, the biologic implications of the specific species of birds and tortoises found in the Galapagos Islands and the formulation of the concept of natural selection might have remained dormant. PMID:21038753

  12. Earthquakes in Mississippi and vicinity 1811-2010

    USGS Publications Warehouse

    Dart, Richard L.; Bograd, Michael B.E.

    2011-01-01

    This map summarizes two centuries of earthquake activity in Mississippi. Work on the Mississippi map was done in collaboration with the Mississippi Department of Environmental Quality, Office of Geology. The earthquake data plotted on the map are from several sources: the Mississippi Department of Environmental Quality, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Arkansas Geological Survey. In addition to earthquake locations, other materials include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Mississippi and parts of adjacent States. Mississippi has undergone a number of felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931, and a magnitude 4.3 earthquake in 1967. The map shows all historical and instrumentally located earthquakes in Mississippi and vicinity between 1811 and 2010. The largest historic earthquake in the vicinity of the State was an intensity XI event, on December 16, 1811; the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region.

  13. Defeating Earthquakes

    NASA Astrophysics Data System (ADS)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the 2010 M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question, how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake. GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  14. Nucleation of the 2014 Pisagua, N. Chile earthquake : seismic analysis of the foreshock sequence.

    NASA Astrophysics Data System (ADS)

    Fuenzalida, A.; Tavera, H.; Ruiz, S.; Ryder, I. M. A.; Fernandez, E.; Garth, T.; Neto, O. D. L.; Metois, M.; De Angelis, S.; Rietbrock, A.

    2014-12-01

    The April 2014 Mw 8.1 Pisagua earthquake occurred in the Northern Chile seismic gap. This part of the subduction zone was believed to have not experienced a large earthquake since 1877. As part of an international collaboration, the Integrated Plate Boundary Observatory Chile (IPOC) network was installed in 2007 to monitor this region. As well as recording the 2014 Pisagua mainshock, the IPOC network was able to record the full foreshock and aftershock sequences, providing a unique opportunity to study the nucleation and rupture process of large megathrust earthquakes. As most seismic activity occurred ~100 km offshore, the onshore network covers the rupture area only from the east, resulting in poor azimuthal coverage and hindering accurate depth estimation of seismic events. To improve the location accuracy of the Pisagua seismic sequences, we installed a temporary seismic network that was operative from 1 May 2014. The network comprised 12 short-period stations located in the coastal area between Moquegua and Tacna and three stations on the slopes of Ticsiani volcano to monitor any possible change in volcanic activity following the Pisagua earthquake. Our study focuses on the nucleation area, where part of the precursory sequence and a slow slip event occurred (Ruiz et al., 2014). Seismic activity in this region intensified significantly in the two weeks preceding the Pisagua mainshock. On 16 March 2014 the strongest foreshock (Mw 6.7) occurred offshore of Pisagua with a centroid depth of 10 km, shallower than the estimated subduction interface. In this study, aftershock locations are further constrained using observations from the new network installed in Peru. We carefully estimate event locations and we compute regional moment tensor solutions by 1-D full waveform inversion of the broadband data. To improve our solutions, we are currently relocating aftershocks and correcting for foreshock mislocations by using the double-difference earthquake

  15. Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California

    PubMed Central

    Lee, Ya-Ting; Turcotte, Donald L.; Holliday, James R.; Sachs, Michael K.; Rundle, John B.; Chen, Chien-Chih; Tiampo, Kristy F.

    2011-01-01

    The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M≥4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M≥4.95 earthquakes occurred in the test region. These earthquakes occurred in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor–Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most “successful” in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts. PMID:21949355
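
    For context, gridded forecasts of this kind are commonly scored in the RELM/CSEP likelihood tests by the joint log-likelihood of the observed counts under a Poisson assumption in each space-magnitude cell,

    \[ L = \sum_i \left( -\lambda_i + n_i \ln \lambda_i - \ln n_i! \right), \]

    where \lambda_i is the forecast expected number and n_i the observed number of events in cell i. The paper above instead emphasizes a comparison based on the locations of the future earthquakes.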

  16. Low-Frequency Earthquakes in Cascadia

    NASA Astrophysics Data System (ADS)

    Sweet, J. R.; Creager, K. C.; Ghosh, A.; Vidale, J. E.

    2009-12-01

    Low-frequency earthquakes (LFEs) are a recently identified class of earthquakes that have been observed to occur coincidentally with non-volcanic tremor in time and space. These LFEs also have frequency spectra that are nearly identical to those of tremor, implying a common source for the two phenomena. Indeed, it has been proposed that tremor may simply be a superposition of many individual LFEs (Shelly et al., 2006, 2007, Nature). As such, LFEs have been used to constrain the location of tremor. We first reported LFEs in Cascadia last year, following the deployment of an 80-station, 1-km aperture seismic array on the Olympic Peninsula of western Washington State. This past year we have deployed 8 small-aperture, three-component seismic arrays across the northern Olympic Peninsula in the hopes of recording and locating additional tremor and LFEs. These arrays are composed of 10 three-component and 10 vertical-component EarthScope seismometers. We use a combination of methods to identify and locate LFEs in our new, expanded dataset. Potential LFEs are first flagged by searching for peaks in the cross correlation of vertical and horizontal components that correspond to S minus P times of arriving energy (La Rocca, 2009, Science). These targets are then used as template events and are cross correlated with several hours of continuous data to find matching events. Using stacking and correlation we obtain accurate S minus P times for some arrays, and differential S and P times between arrays. We use these times to obtain robust estimates of LFE hypocenters. Unfortunately none of the 2009 data from the array of arrays covers a period of Episodic Tremor and Slip (ETS), but several smaller tremor bursts were recorded.

  17. Detecting Earthquakes--Part 2.

    ERIC Educational Resources Information Center

    Isenberg, C.; And Others

    1983-01-01

    Basic concepts associated with seismic wave propagation through the earth and the location of seismic events were explained in part 1 (which appeared in the January 1983 issue). This part focuses on the construction of a student seismometer for detecting earthquakes and underground nuclear explosions anywhere on the earth's surface. (Author/JN)

  18. Source Parameters Inversion for Recent Large Undersea Earthquakes from GRACE Data

    NASA Astrophysics Data System (ADS)

    Dai, Chunli

    region. For the 2011 Tohoku earthquake, the inversions from two different GRACE data products and two different forward modeling approaches produce similar source characteristics, with the centroid located southwest of, and the slip azimuth about 10° larger than, the GPS/seismic solutions. The GRACE-estimated dip angles are larger than those from GPS/seismic data for the 2004 Sumatra-Andaman and 2005 Nias earthquakes, the 2010 Maule, Chile earthquake, and the 2007 Bengkulu earthquake. These differences potentially reflect the additional offshore constraint from GRACE data compared to GPS/seismic data. With more accurate and higher spatial resolution measurements anticipated from the GRACE Follow-on mission, with a scheduled launch date in 2017, we anticipate the data will be sensitive to even smaller earthquake signals. Therefore, GRACE-type observations will hopefully become a more viable measurement to further constrain earthquake focal mechanisms.

  19. Evaluation of earthquake parameters used in the Indonesian Tsunami Early Warning System

    NASA Astrophysics Data System (ADS)

    Madlazim; Prastowo, Tjipto

    2016-02-01

    Twenty-two of a total of 30 earthquake events reported by the Indonesian Agency for Geophysics, Climatology and Meteorology during the period 2007-2010 were falsely issued as tsunamigenic by the Indonesian Tsunami Early Warning System (Ina-TEWS). These 30 earthquakes were of different magnitudes and occurred in different locations. This study aimed to evaluate the performance of the Ina-TEWS using common earthquake parameters, including the earthquake magnitude, origin time, depth, and epicenter. In total, 298 datasets from the Ina-TEWS were compared with solutions from the global centroid moment tensor (CMT) method, which is considered by almost all seismologists to be a reference for the determination of these parameters because its solutions have proved to be accurate. It was found that the earthquake magnitude, origin time, and depth provided by the Ina-TEWS were significantly different from those given in the global CMT catalog, whereas the latitude and longitude positions of the events provided by both systems were coincident. The performance of the Ina-TEWS, particularly in terms of accuracy, remains questionable and needs to be improved.

  20. Earthquake history, 1769-1989

    SciTech Connect

    Ellsworth, W.L.

    1990-01-01

    Motion between the North American and Pacific plates at the latitude of the San Andreas fault produces a broad zone of large-magnitude earthquake activity extending more than 500 km into the continental interior. The San Andreas fault system defines the western limits of plate interaction and dominates the overall pattern of seismic strain release. Few of the M ≥ 6 earthquakes that have occurred in the past 2 centuries were located on the San Andreas fault proper, an observation emphasizing the importance of secondary faults for both seismic-hazard assessment and tectonic processes.

  1. Earthquakes triggered by fluid extraction

    USGS Publications Warehouse

    Segall, P.

    1989-01-01

    Seismicity is correlated in space and time with production from some oil and gas fields where pore pressures have declined by several tens of megapascals. Reverse faulting has occurred both above and below petroleum reservoirs, and normal faulting has occurred on the flanks of at least one reservoir. The theory of poroelasticity requires that fluid extraction locally alter the state of stress. Calculations with simple geometries predict stress perturbations that are consistent with observed earthquake locations and focal mechanisms. Measurements of surface displacement and strain, pore pressure, stress, and poroelastic rock properties in such areas could be used to test theoretical predictions and improve our understanding of earthquake mechanics. -Author

  2. Earthquake networks based on similar activity patterns.

    PubMed

    Tenenbaum, Joel N; Havlin, Shlomo; Stanley, H Eugene

    2012-10-01

    Earthquakes are a complex spatiotemporal phenomenon, the underlying mechanism for which is still not fully understood despite decades of research and analysis. We propose and develop a network approach to earthquake events. In this network, a node represents a spatial location while a link between two nodes represents similar activity patterns in the two different locations. The strength of a link is proportional to the strength of the cross correlation in activities of two nodes joined by the link. We apply our network approach to a Japanese earthquake catalog spanning the 14-year period 1985-1998. We find strong links representing large correlations between patterns in locations separated by more than 1000 kilometers, corroborating prior observations that earthquake interactions have no characteristic length scale. We find network characteristics not attributable to chance alone, including a large number of network links, high node assortativity, and strong stability over time. PMID:23214652
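    A link in such a network can be built by cross-correlating the binned activity histories of two spatial cells. A minimal sketch with synthetic catalogs (the binning interval and the example data are hypothetical, not taken from the paper):

      import numpy as np

      def activity_series(event_times, t_max, bin_days=7.0):
          """Event counts per time bin for one spatial cell."""
          bins = np.arange(0.0, t_max + bin_days, bin_days)
          counts, _ = np.histogram(event_times, bins=bins)
          return counts.astype(float)

      def link_strength(series_a, series_b):
          """Zero-lag correlation between the activity patterns of two cells."""
          a = series_a - series_a.mean()
          b = series_b - series_b.mean()
          return float(np.sum(a * b) / (np.sqrt(np.sum(a**2) * np.sum(b**2)) + 1e-12))

      # Hypothetical example: two cells sharing part of their bursts of activity
      rng = np.random.default_rng(1)
      shared = rng.uniform(0, 5000, 40)  # days on which both cells are active
      cell_a = np.concatenate([shared, rng.uniform(0, 5000, 60)])
      cell_b = np.concatenate([shared, rng.uniform(0, 5000, 60)])
      print("link strength:", link_strength(activity_series(cell_a, 5000),
                                            activity_series(cell_b, 5000)))

    Thresholding such pairwise correlations over all cell pairs yields the network whose link count, assortativity and stability the study examines.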

  3. A smartphone application for earthquakes that matter!

    NASA Astrophysics Data System (ADS)

    Bossu, Rémy; Etivant, Caroline; Roussel, Fréderic; Mazet-Roux, Gilles; Steed, Robert

    2014-05-01

    Smartphone applications have swiftly become one of the most popular tools for rapid reception of earthquake information by the public, some of them having been downloaded more than 1 million times! The advantages are obvious: wherever users are located, they can be automatically informed when an earthquake has struck. Just by setting a magnitude threshold and an area of interest, there is no longer any need to browse the internet, as the information reaches them automatically and instantaneously! One question remains: are the provided earthquake notifications always relevant for the public? Which earthquakes really matter to laypeople? One clue may be derived from newspaper reports showing that, a while after damaging earthquakes, many eyewitnesses delete the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of the notifications relate to unfelt earthquakes, thereby only increasing anxiety among the population at each new update. Felt and damaging earthquakes are the ones that matter most to the public (and authorities). They are the ones of societal importance, even when of small magnitude. A smartphone application developed by EMSC (Euro-Med Seismological Centre), with the financial support of the Fondation MAIF, aims at providing suitable notifications for earthquakes by collating different information threads covering tsunamigenic, potentially damaging, and felt earthquakes. Tsunamigenic earthquakes are considered here to be those that are the subject of alert or information messages from the PTWC (Pacific Tsunami Warning Centre). Potentially damaging earthquakes are identified through an automated system called EQIA (Earthquake Qualitative Impact Assessment) developed and operated at EMSC, which rapidly assesses earthquake impact by comparing the population exposed to each expected

  4. Relocation of earthquakes at southwestern Indian Ocean Ridge and its tectonic significance

    NASA Astrophysics Data System (ADS)

    Luo, W.; Zhao, M.; Haridhi, H.; Lee, C. S.; Qiu, X.; Zhang, J.

    2015-12-01

    The southwest Indian Ridge (SWIR) is a typical ultra-slow spreading ridge (Dick et al., 2003) and a plate boundary where earthquakes frequently occur. Due to the lack of seismic stations near the SWIR, locations of earthquakes and micro-earthquakes are not accurate. An Ocean Bottom Seismometer (OBS) experiment was carried out for the first time at the SWIR near 49°39'E from January to March 2010 (Zhao et al., 2013). The deployed OBSs also recorded earthquake waveforms during the experiment. Two earthquakes occurred on 7 and 9 February 2010, both with magnitude 4.4 mb. These two earthquakes were relocated using the software HYPOSAT after spectral analysis, band-pass (3-5 Hz) filtering, and picking of Pn and Sn travel times. Results of the hypocentral determinations show that the location error is decreased significantly by adding the OBS recordings. This study not only provides experience for the next step of deploying long-term broadband OBSs, but also deepens understanding of the structure of the SWIR and helps clarify the nature of plate tectonic motion. This research was supported by the Natural Science Foundation of China (41176053, 91028002, 91428204). Keywords: southwest Indian Ridge (SWIR), relocation of earthquakes, Ocean Bottom Seismometers (OBS), HYPOSAT. References: [1] Dick, H. J. B., Lin, J., Schouten, H., 2003. An ultraslow-spreading class of ocean ridge. Nature, 426(6965): 405-412. [2] Zhao, M. H., et al., 2013. Three-dimensional seismic structure of the Dragon Flag oceanic core complex at the ultraslow spreading Southwest Indian Ridge (49°39'E). Geochemistry Geophysics Geosystems, 14(10): 4544-4563.

  5. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2013-07-01 2013-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  6. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2010-07-01 2010-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  7. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2011-07-01 2011-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  8. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2014-07-01 2014-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  9. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2012-07-01 2012-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  10. A COMPARATIVE STUDY ON CHARACTERISTICS OF EARTHQUAKES IN 2011 TOHOKU-PACIFIC OCEAN EARTHQUAKE AND 2003 SANRIKU MINAMI EARTHQUAKE

    NASA Astrophysics Data System (ADS)

    Nakaaki, Shusuke; Sakai, Kimitoshi; Murono, Yoshitaka

    In the 2011 Tohoku-Pacific Ocean Earthquake (TPO-EQ), damage to railway structures was limited even though the earthquake was much larger than past large earthquakes. In this paper, relationships between the properties of seismic waves and the damage to viaducts were investigated. As a comparative example, the 2003 Sanriku-minami Earthquake (SM-EQ) was chosen, in which the damage to viaducts was almost the same as in the TPO-EQ. Analytical results showed that differences in structural response were not significant even though the scales of the earthquakes were very different. In addition, it was found that structures located in the areas where severe damage was observed in the TPO-EQ showed 1.1 to 1.5 times larger responses in the TPO-EQ than in the SM-EQ.

  11. The 8 February 1843 Lesser Antilles Earthquake: A "Missing" Great Earthquake

    NASA Astrophysics Data System (ADS)

    Hough, S. E.

    2012-12-01

    The seismic potential of the Lesser Antilles subduction zone and the adjacent Puerto Rico trench remains a matter of debate. The central arc of the Lesser Antilles subduction zone is currently accumulating elastic strain at a rate slower than the plate motion (Manaker et al., 2008), and a recent study concludes that no major subduction zone earthquake has occurred along the Puerto Rico trench during the 500-year historical record (ten Brink et al., 2012). The 8 February 1843 earthquake is the largest historical event on the Lesser Antilles arc. A recent study estimated a preferred magnitude of 8.5 based on near-field macroseismic effects (Feuillet et al., 2011), but the generally accepted value has been 7.5-8. A consideration of the regional and far-field macroseismic effects reveals a felt distribution comparable to those of recent great (Mw ≥ 9.0) earthquakes. Credible archival accounts provide compelling evidence that the earthquake was felt throughout a wide region of the eastern United States, as far north as New York City. The event was also felt at three locations in South America, and in Bermuda. A modest tsunami is described by two witnesses; two other accounts describe uplift of a stone wharf in Antigua and along the northern coast of Guadeloupe. The distribution and character of intensities in the near field of any earthquake will reflect the complexity of the source; the pattern of high- and low-frequency radiation from the 2011 Tohoku, Japan, earthquake demonstrates that these patterns can be complex for great earthquakes. For the 1843 earthquake, the far-field intensity distribution provides a stronger constraint on magnitude. The observations support the inference of a high (M ≥ 8.5) magnitude and significant moment release towards, or possibly around, the northern corner of the Lesser Antilles arc. Possible modern analogs for such an event can be identified, including the Mw 8.6 2005 Nias earthquake and the 11 April 2012 Mw 8.6 strike

  12. Recent Mega-Thrust Tsunamigenic Earthquakes and PTHA

    NASA Astrophysics Data System (ADS)

    Lorito, S.

    2013-05-01

    The occurrence of several mega-thrust tsunamigenic earthquakes in the last decade, including but not limited to the 2004 Sumatra-Andaman, the 2010 Maule, and the 2011 Tohoku earthquakes, has been a dramatic reminder of the limitations of our capability to assess earthquake and tsunami hazard and risk. However, increasingly high-quality geophysical observational networks have allowed the retrieval of more accurate models of the rupture process of mega-thrust earthquakes than ever before, thus paving the way for future improved hazard assessments. Probabilistic Tsunami Hazard Analysis (PTHA) methodology, in particular, is less mature than its seismic counterpart, PSHA. Recent worldwide research efforts of the tsunami science community have started to fill this gap and to define best practices that are progressively being employed in PTHA for different regions and coasts at threat. In the first part of my talk, I will briefly review some rupture models of recent mega-thrust earthquakes and highlight some of their surprising features, which likely result in larger error bars associated with PTHA results. More specifically, recent events of unexpected size at a given location, and with unexpected rupture process features, posed first-order open questions that prevent the definition of a heterogeneous rupture probability along a subduction zone, despite several recent promising results on the subduction zone seismic cycle. In the second part of the talk, I will dig a bit deeper into a specific ongoing effort to improve PTHA methods, in particular as regards the determination of epistemic and aleatory uncertainties, and the computational feasibility of PTHA when considering the full assumed source variability. Usually only logic trees are made explicit in PTHA studies, accounting for different possible assumptions on the source zone properties and behavior. The selection of the earthquakes to be actually modelled is then in general made on a qualitative basis or remains implicit

  13. Differences in tsunami generation between the December 26, 2004 and March 28, 2005 Sumatra earthquakes

    USGS Publications Warehouse

    Geist, E.L.; Bilek, S.L.; Arcas, D.; Titov, V.V.

    2006-01-01

    Source parameters affecting tsunami generation and propagation for the Mw > 9.0 December 26, 2004 and the Mw = 8.6 March 28, 2005 earthquakes are examined to explain the dramatic difference in tsunami observations. We evaluate both scalar measures (seismic moment, maximum slip, potential energy) and finite-source representations (distributed slip and far-field beaming from finite source dimensions) of tsunami generation potential. There exists significant variability in local tsunami runup with respect to the most readily available measure, seismic moment. The local tsunami intensity for the December 2004 earthquake is similar to other tsunamigenic earthquakes of comparable magnitude. In contrast, the March 2005 local tsunami was deficient relative to its earthquake magnitude. Tsunami potential energy calculations more accurately reflect the difference in tsunami severity, although these calculations are dependent on knowledge of the slip distribution and therefore difficult to implement in a real-time system. A significant factor affecting tsunami generation unaccounted for in these scalar measures is the location of regions of seafloor displacement relative to the overlying water depth. The deficiency of the March 2005 tsunami seems to be related to concentration of slip in the down-dip part of the rupture zone and the fact that a substantial portion of the vertical displacement field occurred in shallow water or on land. The comparison of the December 2004 and March 2005 Sumatra earthquakes presented in this study is analogous to previous studies comparing the 1952 and 2003 Tokachi-Oki earthquakes and tsunamis, in terms of the effect slip distribution has on local tsunamis. Results from these studies indicate the difficulty in rapidly assessing local tsunami runup from magnitude and epicentral location information alone.

  14. Earthquake history of the United States

    USGS Publications Warehouse

    Coffman, Jerry L., (Edited By); Von Hake, Carl A.; Stover, Carl W.

    1982-01-01

    part of Texas located in the Western Mountain Region. The map facing page 1 shows locations of all earthquakes in the regions that follow. A small map showing the area covered by each region immediately precedes the résumé of each chapter (except for the Alaska, Puerto Rico, and Hawaii regions). The seismic risk map below was developed in January 1969 for the conterminous United States by Dr. S. T. Algermissen of NOAA's Environmental Research Laboratories. Subject to revision as continuing research warrants, it is an updated edition of a map that divides the United States into four zones: Zone 0, areas with no reasonable expectancy of earthquake damage; Zone 1, expected minor damage; Zone 2, expected moderate damage; and Zone 3, where major destructive earthquakes may occur.

  15. Earthquake occurrence and effects.

    PubMed

    Adams, R D

    1990-01-01

    Although earthquakes are mainly concentrated in zones close to boundaries of tectonic plates of the Earth's lithosphere, infrequent events away from the main seismic regions can cause major disasters. The major cause of damage and injury following earthquakes is elastic vibration, rather than fault displacement. This vibration at a particular site will depend not only on the size and distance of the earthquake but also on the local soil conditions. Earthquake prediction is not yet generally fruitful in avoiding earthquake disasters, but much useful planning to reduce earthquake effects can be done by studying the general earthquake hazard in an area, and taking some simple precautions. PMID:2347628

  16. Statistical earthquake focal mechanism forecasts

    NASA Astrophysics Data System (ADS)

    Kagan, Yan Y.; Jackson, David D.

    2014-04-01

    Forecasts of the focal mechanisms of future shallow (depth 0-70 km) earthquakes are important for seismic hazard estimates, Coulomb stress calculations, and other models of earthquake occurrence. Here we report on a high-resolution global forecast of earthquake rate density as a function of location, magnitude and focal mechanism. In previous publications we reported forecasts of 0.5° spatial resolution, covering the latitude range from -75° to +75°, based on the Global Centroid Moment Tensor earthquake catalogue. In the new forecasts we have improved the spatial resolution to 0.1° and extended the latitude range from pole to pole. Our focal mechanism estimates require distance-weighted combinations of observed focal mechanisms within 1000 km of each grid point. Simultaneously, we calculate an average rotation angle between the forecast mechanism and all the surrounding mechanisms, using the method proposed by Kagan & Jackson in 1994. This average angle reveals the level of tectonic complexity of a region and indicates the accuracy of the prediction. The procedure becomes problematic where longitude lines are not approximately parallel, and where shallow earthquakes are so sparse that an adequate sample spans very large distances. North or south of 75°, the azimuths of points 1000 km away may vary by about 35°. We solved this problem by calculating focal mechanisms on a plane tangent to the Earth's surface at each forecast point, correcting for the rotation of the longitude lines at the locations of earthquakes included in the averaging. The corrections are negligible between -30° and +30° latitude, but outside that band uncorrected rotations can be significantly in error. Improved forecasts at 0.5° and 0.1° resolution are posted at http://eq.ess.ucla.edu/kagan/glob_gcmt_index.html.
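    As a heavily simplified illustration of the distance-weighted combination described here (the published method averages double couples via rotation angles; this proxy simply averages unit moment tensors with a hypothetical weighting kernel and illustrative input values):

      import numpy as np

      def weighted_mechanism(grid_point, eq_locs, eq_tensors, r_max_km=1000.0, eps_km=50.0):
          """Distance-weighted average of nearby unit moment tensors (simplified proxy).

          eq_locs    : (N, 2) earthquake positions in km (local coordinates)
          eq_tensors : (N, 3, 3) moment tensors
          """
          d = np.linalg.norm(np.asarray(eq_locs, float) - np.asarray(grid_point, float), axis=1)
          mask = d < r_max_km
          if not mask.any():
              return None
          w = 1.0 / (d[mask] + eps_km)  # hypothetical weighting kernel
          unit = eq_tensors[mask] / np.linalg.norm(eq_tensors[mask], axis=(1, 2), keepdims=True)
          m = np.tensordot(w, unit, axes=1) / w.sum()
          return m / np.linalg.norm(m)

      # Hypothetical example: two thrust-like tensors near the grid point
      tensors = np.array([np.diag([1.0, 0.0, -1.0]), np.diag([0.9, 0.1, -1.0])])
      locs = np.array([[100.0, 0.0], [300.0, 50.0]])
      print(weighted_mechanism((0.0, 0.0), locs, tensors))

    The spread of the contributing mechanisms about this average plays a role analogous to the rotation-angle complexity measure mentioned in the abstract.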

  17. Aftershock mechanisms from the 2010 Mw 8.8 Maule, Chile earthquake: detailed analysis using full waveform inversion

    NASA Astrophysics Data System (ADS)

    Rietbrock, A.; Hicks, S. P.; Chagas, B.; Detzel, H. A.

    2014-12-01

    Since the earthquake rupture process is extremely heterogeneous, it is vital to understand how structural variations in the overriding plate and downgoing slab may control slip style along the subduction megathrust. The large-scale 3-D geometry of subduction plate boundaries is rapidly becoming well understood; however, the nature of any finer-scale structure along the plate interface remains elusive. A detailed study of earthquake source mechanisms along a megathrust region can shed light on the nature of fine-scale structures along the megathrust. The Mw 8.8 Maule earthquake that struck central Chile in 2010 is the sixth largest earthquake ever recorded. Following the earthquake, there was an international deployment of seismic stations in the rupture area, making this one of the best datasets of an aftershock sequence following a large earthquake. This dataset provides a unique opportunity to perform a detailed study of megathrust earthquake source mechanisms. Based on a high-resolution 3-D velocity model and robust earthquake locations [Hicks et al., 2014], we calculate regional moment tensors using the ISOLA software package [Sokos & Zahradnik, 2008]. We incorporate accelerometer recordings, important for constraining solutions of large earthquakes in the overriding plate. We also validate the robustness of our solutions by assessing the consistency of mechanisms with P-wave polarities observed at both onshore and offshore seismic stations, and compare them with previously published solutions. We find that accurate earthquake locations are vital for the fine-scale interpretation of focal mechanisms, particularly for offshore events. Our results show that most moment tensor solutions with thrusting mechanisms have a nodal plane dipping parallel to the subducting plate interface. Interestingly, we also find earthquakes with normal faulting mechanisms lying along the megathrust plate interface in the south of the rupture area. This finding suggests that megathrust

  18. Rupture process of the 2013 Okhotsk deep mega earthquake from iterative backprojection and compress sensing methods

    NASA Astrophysics Data System (ADS)

    Qin, W.; Yin, J.; Yao, H.

    2013-12-01

    On May 24th 2013 a Mw 8.3 normal-faulting earthquake occurred at a depth of approximately 600 km beneath the Sea of Okhotsk, Russia. It is a rare example of a mega-earthquake occurring at such a great depth. We use the time-domain iterative backprojection (IBP) method [1] and also the frequency-domain compressive sensing (CS) technique [2] to investigate the rupture process and energy radiation of this mega earthquake. We currently use the teleseismic P-wave data from about 350 stations of USArray. IBP is an improved version of the traditional backprojection method, which more accurately locates subevents (energy bursts) during earthquake rupture and determines rupture speeds. The total rupture duration of this earthquake is about 35 s with a nearly N-S rupture direction. We find that the rupture is bilateral in the first 15 seconds with slow rupture speeds: about 2.5 km/s for the northward rupture and about 2 km/s for the southward rupture. After that, the northward rupture stopped while the rupture towards the south continued. The average southward rupture speed between 20-35 s is approximately 5 km/s, lower than the shear wave speed (about 5.5 km/s) at the hypocenter depth. The total rupture length is about 140 km, in a nearly N-S direction, with a southward rupture length of about 100 km and a northward rupture length of about 40 km. We also use the CS method, a sparse source inversion technique, to study the frequency-dependent seismic radiation of this mega earthquake. We observe clear along-strike frequency dependence of the spatial and temporal distribution of seismic radiation and the rupture process. The results from both methods are generally similar. In the next step, we will use data from dense arrays in southwest China and also global stations for further analysis in order to more comprehensively study the rupture process of this deep mega earthquake. Reference: [1] Yao H, Shearer P M, Gerstoft P. Subevent location and rupture imaging using iterative backprojection for
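    The backprojection idea used here can be illustrated by stacking time-shifted array waveforms over a grid of trial source points; the strongest beam marks a subevent. Below is a minimal 2-D, constant-apparent-velocity sketch with synthetic data (the published method is iterative and uses teleseismic travel-time tables, so everything here is an illustrative assumption):

      import numpy as np

      def backproject(traces, station_xy, grid_xy, dt, v_app=10.0):
          """Stack squared, time-shifted seismograms over candidate source points.

          traces              : (n_sta, n_samp) array of aligned waveforms
          station_xy, grid_xy : positions in km (local 2-D approximation)
          v_app               : apparent horizontal velocity, km/s (illustrative)
          Assumes the implied sample shifts are shorter than the traces.
          """
          station_xy = np.asarray(station_xy, float)
          n_sta, n_samp = traces.shape
          power = np.zeros(len(grid_xy))
          for g, src in enumerate(np.asarray(grid_xy, float)):
              delays = np.linalg.norm(station_xy - src, axis=1) / v_app
              shifts = np.round((delays - delays.min()) / dt).astype(int)
              stack = np.zeros(n_samp)
              for k in range(n_sta):
                  s = shifts[k]
                  stack[: n_samp - s] += traces[k, s:]
              power[g] = np.max(stack**2) / n_sta**2
          return power

      # Hypothetical example: one synthetic burst radiated from (50, 50) km
      rng = np.random.default_rng(5)
      stations = rng.uniform(-500, 500, (20, 2))
      grid = np.array([[0.0, 0.0], [50.0, 50.0], [100.0, -50.0]])
      dt, traces = 0.05, rng.normal(0, 0.1, (20, 2000))
      for k, d in enumerate(np.linalg.norm(stations - grid[1], axis=1) / 10.0):
          traces[k, int(d / dt):int(d / dt) + 40] += np.hanning(40)
      print("beam power per grid point:", backproject(traces, stations, grid, dt))

    In the iterative variant, the strongest subevent is located, its contribution is removed from the data, and the stacking is repeated to recover later or weaker bursts.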

  19. Bayesian historical earthquake relocation: an example from the 1909 Taipei earthquake

    USGS Publications Warehouse

    Minson, Sarah E.; Lee, William H.K.

    2014-01-01

    Locating earthquakes from the beginning of the modern instrumental period is complicated by the fact that there are few good-quality seismograms and what traveltimes do exist may be corrupted by both large phase-pick errors and clock errors. Here, we outline a Bayesian approach to simultaneous inference of not only the hypocentre location but also the clock errors at each station and the origin time of the earthquake. This methodology improves the solution for the source location and also provides an uncertainty analysis on all of the parameters included in the inversion. As an example, we applied this Bayesian approach to the well-studied 1909 Mw 7 Taipei earthquake. While our epicentre location and origin time for the 1909 Taipei earthquake are consistent with earlier studies, our focal depth is significantly shallower suggesting a higher seismic hazard to the populous Taipei metropolitan area than previously supposed.

  20. Bayesian historical earthquake relocation: an example from the 1909 Taipei earthquake

    NASA Astrophysics Data System (ADS)

    Minson, Sarah E.; Lee, William H. K.

    2014-09-01

    Locating earthquakes from the beginning of the modern instrumental period is complicated by the fact that there are few good-quality seismograms and what traveltimes do exist may be corrupted by both large phase-pick errors and clock errors. Here, we outline a Bayesian approach to simultaneous inference of not only the hypocentre location but also the clock errors at each station and the origin time of the earthquake. This methodology improves the solution for the source location and also provides an uncertainty analysis on all of the parameters included in the inversion. As an example, we applied this Bayesian approach to the well-studied 1909 Mw 7 Taipei earthquake. While our epicentre location and origin time for the 1909 Taipei earthquake are consistent with earlier studies, our focal depth is significantly shallower suggesting a higher seismic hazard to the populous Taipei metropolitan area than previously supposed.
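    A minimal sketch of the kind of sampler such an approach implies, jointly drawing hypocentre, origin time and per-station clock errors under a uniform-velocity travel-time model and Gaussian pick errors (all values are hypothetical, and with few stations the posteriors are expectedly broad):

      import numpy as np

      rng = np.random.default_rng(42)
      V = 8.0  # km/s, hypothetical uniform velocity

      def predicted(params, stations):
          """Arrival times = origin time + travel time + per-station clock error."""
          x, y, z, t0 = params[:4]
          clock = params[4:]
          dist = np.sqrt((stations[:, 0] - x)**2 + (stations[:, 1] - y)**2 + z**2)
          return t0 + dist / V + clock

      def log_posterior(params, stations, picks, sigma_pick=2.0, sigma_clock=5.0):
          resid = picks - predicted(params, stations)
          return -0.5 * np.sum((resid / sigma_pick)**2) - 0.5 * np.sum((params[4:] / sigma_clock)**2)

      def metropolis(stations, picks, n_iter=20000, step=1.0):
          cur = np.zeros(4 + len(stations))
          cur[:3] = [50.0, 50.0, 10.0]  # crude starting hypocentre (km)
          cur_lp = log_posterior(cur, stations, picks)
          samples = []
          for _ in range(n_iter):
              prop = cur + rng.normal(0.0, step, cur.size)
              prop_lp = log_posterior(prop, stations, picks)
              if np.log(rng.uniform()) < prop_lp - cur_lp:
                  cur, cur_lp = prop, prop_lp
              samples.append(cur.copy())
          return np.array(samples)

      # Hypothetical example: 6 stations, picks from a known source plus clock errors
      stations = np.array([[0, 0], [100, 10], [20, 120], [150, 80], [60, 200], [180, 160]], float)
      true = np.concatenate([[80.0, 90.0, 15.0, 0.0], rng.normal(0, 3.0, len(stations))])
      picks = predicted(true, stations) + rng.normal(0, 0.5, len(stations))
      post = metropolis(stations, picks)
      print("posterior mean depth (km):", post[5000:, 2].mean())

    The width of the depth marginal, rather than a single best-fit value, is what makes the hazard implications of such relocations quantifiable.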

  1. Assessment of Liquefaction Susceptibility of Kutahya Soils Based on Recent Earthquakes in Turkey

    NASA Astrophysics Data System (ADS)

    Zengin, Enes; Abiddin Erguler, Zeynal

    2014-05-01

    The plate tectonic setting of Turkey has resulted in many destructive earthquakes of magnitude higher than 7 in several cities situated close to fault systems. The city of Kutahya and its surrounding counties are notable examples, being located in an earthquake-prone region; several earthquakes have recently been recorded, particularly in its Simav district. A significant part of the residential area of Kutahya lies on alluvial deposits dominated by silt- and fine-sand-sized material, and its southern boundary is controlled by the Kutahya fault zone (KFZ) extending parallel to the city settlement. In this study, considering the possibility of a destructive earthquake in the future as well as the increasing population and the resulting demand for new buildings in this city, we aim to investigate the liquefaction potential of these soils for use in earthquake risk mitigation strategies. For this purpose, physical properties, groundwater conditions, and standard penetration test (SPT) results from 283 boreholes spread over a wide area were examined to understand the behaviour of these soils under earthquake-induced dynamic loading. The total assessed drilling depth is about 2140 m. The required corrections were applied to all SPT data to obtain SPT-(N1)60 values for the liquefaction analyses. Estimating a representative magnitude, depth of epicentre, and maximum ground acceleration (amax) based on previous earthquakes and the faulting characteristics of the KFZ was an initial target for accurately assessing liquefaction phenomena in this city. For the determination of amax in this region, in addition to an attenuation relationship based on Turkish strong ground motion data, individual measurements from earthquake stations close to the study site were also collected. As a result of all analyses, and reviewing previous earthquake records in this region, earthquake magnitudes varying between 5.0 and 7.4 and amax values varying between 400 and 800 gal were used in liquefaction

  2. A revised “earthquake report” questionnaire

    USGS Publications Warehouse

    Stover, C.; Reagor, G.; Simon, R.

    1976-01-01

    The U.S. Geological Survey is responsible for conducting intensity and damage surveys following felt or destructive earthquakes in the United States. Shortly after a felt or damaging earthquake occurs, a canvass of the affected area is made. Specially developed questionnaires are mailed to volunteer observers located within the estimated felt area. These questionnaires, "Earthquake Reports," are filled out by the observers and returned to the Survey's National Earthquake Information Service, which is located in Colorado. They are then evaluated and, based on answers to questions about physical effects seen or felt, each canvassed location is assigned an intensity value. Once intensities have been assigned to the various locations, they are plotted on an intensity distribution map. When all of the intensity data have been plotted, isoseismals can be contoured through places where equal intensity was experienced. The completed isoseismal map yields a detailed picture of the earthquake, its effects, and its felt area. All of the data and maps are published quarterly in a U.S. Geological Survey Circular series entitled "Earthquakes in the United States".

  3. Seismicity map tools for earthquake studies

    NASA Astrophysics Data System (ADS)

    Boucouvalas, Anthony; Kaskebes, Athanasios; Tselikas, Nikos

    2014-05-01

    We report on the development of a new online set of tools, for use within Google Maps, for earthquake research. We demonstrate this server-based online platform (developed with PHP, Javascript, MySQL) and the new tools using a database of earthquake data. The platform allows us to carry out statistical and deterministic analysis of earthquake data on top of Google Maps and to plot various seismicity graphs. The toolbox has been extended to draw on the map line segments, multiple straight lines horizontally and vertically, and multiple circles, including geodesic lines. The application is demonstrated using localized seismic data from the geographic region of Greece as well as other global earthquake data. The application also offers regional segmentation (NxN), which allows the study of earthquake clustering and of earthquake cluster shifts between segments in space. The platform offers many filters, for example for plotting selected magnitude ranges or time periods. The plotting facility allows statistically based plots such as cumulative earthquake magnitude plots and earthquake magnitude histograms, calculation of the 'b' value, etc. What is novel about the platform is the additional deterministic tools. Using the newly developed horizontal and vertical line and circle tools, we have studied the spatial distribution trends of many earthquakes, and we show here for the first time the link between Fibonacci numbers and the spatiotemporal location of some earthquakes. The new tools are valuable for examining and visualizing trends in earthquake research, as they allow the calculation of statistics as well as deterministic precursors. We plan to show many new results based on our newly developed platform.
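    The 'b' value mentioned here is commonly computed with the Aki maximum-likelihood formula. A minimal sketch, assuming a catalog complete above a chosen magnitude mc and synthetic magnitudes (not data from the platform):

      import numpy as np

      def b_value(magnitudes, mc, dm=0.0):
          """Aki maximum-likelihood b-value for events at or above completeness mc.

          For catalog magnitudes binned to width dm, the usual correction is to
          pass dm so that mc is replaced by mc - dm/2 (Utsu's adjustment).
          """
          m = np.asarray(magnitudes, dtype=float)
          m = m[m >= mc]
          return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

      # Hypothetical example: synthetic catalog drawn with a true b-value of 1.0
      rng = np.random.default_rng(3)
      mags = 2.0 + rng.exponential(1.0 / np.log(10), 5000)
      print("estimated b:", b_value(mags, mc=2.0))

    Computing this estimate per map segment is one way to use the regional NxN segmentation the platform provides.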

  4. Remote triggering of deep earthquakes in the 2002 Tonga sequences.

    PubMed

    Tibi, Rigobert; Wiens, Douglas A; Inoue, Hiroshi

    2003-08-21

    It is well established that an earthquake in the Earth's crust can trigger subsequent earthquakes, but such triggering has not been documented for deeper earthquakes. Models for shallow fault interactions suggest that static (permanent) stress changes can trigger nearby earthquakes, within a few fault lengths from the causative earthquake, whereas dynamic (transient) stresses carried by seismic waves may trigger earthquakes both nearby and at remote distances. Here we present a detailed analysis of the 19 August 2002 Tonga deep earthquake sequences and show evidence for both static and dynamic triggering. Seven minutes after a magnitude 7.6 earthquake occurred at a depth of 598 km, a magnitude 7.7 earthquake (664 km depth) occurred 300 km away, in a previously aseismic region. We found that nearby aftershocks of the first mainshock are preferentially located in regions where static stresses are predicted to have been enhanced by the mainshock. But the second mainshock and other triggered events are located at larger distances where static stress increases should be negligible, thus suggesting dynamic triggering. The origin times of the triggered events do not correspond to arrival times of the main seismic waves from the mainshocks and the dynamically triggered earthquakes frequently occur in aseismic regions below or adjacent to the seismic zone. We propose that these events are triggered by transient effects in regions near criticality, but where earthquakes have difficulty nucleating without external influences. PMID:12931183

  5. Catalog of earthquake hypocenters at Alaskan volcanoes: January 1 through December 31, 2007

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.

    2008-01-01

    Between January 1 and December 31, 2007, AVO located 6,664 earthquakes of which 5,660 occurred within 20 kilometers of the 33 volcanoes monitored by the Alaska Volcano Observatory. Monitoring highlights in 2007 include: the eruption of Pavlof Volcano, volcanic-tectonic earthquake swarms at the Augustine, Iliamna, and Little Sitkin volcanic centers, and the cessation of episodes of unrest at Fourpeaked Mountain, Mount Veniaminof and the northern Atka Island volcanoes (Mount Kliuchef and Korovin Volcano). This catalog includes descriptions of: (1) locations of seismic instrumentation deployed during 2007; (2) earthquake detection, recording, analysis, and data archival systems; (3) seismic velocity models used for earthquake locations; (4) a summary of earthquakes located in 2007; and (5) an accompanying UNIX tar-file with a summary of earthquake origin times, hypocenters, magnitudes, phase arrival times, location quality statistics, daily station usage statistics, and all files used to determine the earthquake locations in 2007.

  6. Catalog of earthquake hypocenters at Alaskan volcanoes: January 1 through December 31, 2006

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.; Searcy, Cheryl

    2008-01-01

    Between January 1 and December 31, 2006, AVO located 8,666 earthquakes of which 7,783 occurred on or near the 33 volcanoes monitored within Alaska. Monitoring highlights in 2006 include: an eruption of Augustine Volcano, a volcanic-tectonic earthquake swarm at Mount Martin, elevated seismicity and volcanic unrest at Fourpeaked Mountain, and elevated seismicity and low-level tremor at Mount Veniaminof and Korovin Volcano. A new seismic subnetwork was installed on Fourpeaked Mountain. This catalog includes: (1) descriptions and locations of seismic instrumentation deployed in the field during 2006, (2) a description of earthquake detection, recording, analysis, and data archival systems, (3) a description of seismic velocity models used for earthquake locations, (4) a summary of earthquakes located in 2006, and (5) an accompanying UNIX tar-file with a summary of earthquake origin times, hypocenters, magnitudes, phase arrival times, location quality statistics, daily station usage statistics, and all files used to determine the earthquake locations in 2006.

  7. A Filter Bank Approach to Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Meier, Men-Andrin; Heaton, Tom; Clinton, John

    2014-05-01

    Earthquake Early Warning (EEW) is a race against time. The longer it takes to detect and characterize an ongoing event, the larger is the blind zone, the region where a warning arrives only after the most damaging ground motion has occurred. The problem is most acute during medium-size earthquakes, where damaging ground motion is confined to a small zone around the epicenter. An ideal EEW algorithm that is fast enough to provide relevant alerts for such scenario events would have to produce reliable event characterization based on observations of very short snippets of data recorded at only very few stations. For such a scheme to work without significant numbers of false alarms (which continue to hamper both single-station and network-based approaches today), the real-time information available for an earthquake has to be exploited more effectively than is currently done. Our approach is to fully mine the broadband frequency content of incoming waveforms, which contains significant information on the size and epicentral distance of the ongoing event. We propose a filter bank approach with minimum-phase-delay filters, which allows us to use frequency information from each band at each triggered station at the earliest possible time. We have compiled and processed an extensive dataset of near-field earthquake waveforms. In an empirical maximum likelihood scheme, we use the filter bank output from the first seconds after the P-wave onset of each waveform to estimate the most likely magnitude and epicentral distance to have caused this waveform. We show how our single-station approach can be integrated into an evolutionary and fully probabilistic network EEW system. We demonstrate that our method can allow sufficiently accurate characterization of an ongoing event with two stations, with consistent characterization of the evolving uncertainty of the location and magnitude.
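    As a rough illustration of the filter-bank idea (the paper uses minimum-phase-delay filters; standard causal Butterworth band-passes stand in for them here, and the band edges, sampling rate and noise levels are all hypothetical):

      import numpy as np
      from scipy.signal import butter, lfilter

      def filter_bank_amplitudes(waveform, fs, bands=None):
          """Peak absolute amplitude of a P-wave snippet in each frequency band."""
          if bands is None:
              bands = [(0.1875 * 2**k, 0.375 * 2**k) for k in range(7)]  # ~0.19-24 Hz octaves
          features = []
          for f_lo, f_hi in bands:
              b, a = butter(4, [f_lo / (0.5 * fs), f_hi / (0.5 * fs)], btype="band")
              features.append(np.max(np.abs(lfilter(b, a, waveform))))
          return np.array(features)

      # Hypothetical example: 3 s of synthetic data sampled at 100 Hz
      rng = np.random.default_rng(7)
      fs = 100.0
      snippet = rng.normal(0.0, 1e-4, int(3 * fs))
      print(filter_bank_amplitudes(snippet, fs))

    In the scheme described above, such band amplitudes measured in the first seconds after the P onset are compared against empirical distributions to yield likelihoods of magnitude and epicentral distance.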

  8. Earthquake Education in Prime Time

    NASA Astrophysics Data System (ADS)

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

    Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in the spring season 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and The Department of Homeland Security Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU) on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs. We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and

  9. Catalog of Hawaiian earthquakes, 1823-1959

    USGS Publications Warehouse

    Klein, Fred W.; Wright, Thomas L.

    2000-01-01

    This catalog of more than 17,000 Hawaiian earthquakes (of magnitude greater than or equal to 5), principally located on the Island of Hawaii, from 1823 through the third quarter of 1959 is designed to expand our ability to evaluate seismic hazard in Hawaii, as well as our knowledge of Hawaiian seismic rhythms as they relate to eruption cycles at Kilauea and Mauna Loa volcanoes and to subcrustal earthquake patterns related to the tectonic evolution of the Hawaiian chain.

  10. Structural Aspects of the Iquique Area With Possible Influence on the Mw 8.2, 2014, Pisagua Earthquake

    NASA Astrophysics Data System (ADS)

    Sobiesiak, M.; Schaller, T.; Meneses, G.; Goetze, H. J.; Satriano, C.; Poiata, N.; Ruiz, S.; Comte, D.; Bernard, P.; Vilotte, J. P.; Métois, M.; Olcay, M.; Tassera, C.; Campos, J. A.

    2014-12-01

    The Mw 8.2, 2014, Pisagua earthquake in Northern Chile did not come as a complete surprise, as it had been anticipated that in the "near future" a large earthquake could happen in the North Chile seismic gap. Whether the gap would rupture in a single M~9 event or in several M 7-8 events has been a subject of debate. It is now clear that the Pisagua earthquake ruptured the shallower part of one segment of the North Chilean seismogenic subduction interface, leaving open the questions of why the new rupture started here and what a future failure scenario for the remaining, unruptured parts of the seismic gap might look like. To identify seismogenic structures that define areas where large events might nucleate, asperities develop, or segment boundaries form, we need large catalogues of accurately located seismic events across all magnitude ranges. We therefore apply a new method to automatically detect and locate seismic events based on the backprojection algorithm and a multi-band kurtosis signal representation (see also abstracts by Satriano et al. and Poiata et al.), using the data of the Iquique Local Network and the Integrated Plate Boundary Observatory in North Chile. Precise earthquake locations, seismicity rate changes, and spatial b-value distributions can then be related to material boundaries and can distinguish between locked and creeping sections, pointing to the sites where actual deformation, also on small scales, is taking place. While the seismicity distribution and its temporal changes help to identify the outlines of seismogenic structures, congruent gravity isostatic residual anomalies and modeled density distributions tell us something about the physical nature of earthquake nucleation zones and asperities. We present new results from density modeling on narrow profiles over the entire Pisagua earthquake rupture plane, revealing dense bodies which we suggest influenced the start of the mainshock rupture as well as its propagation, by linking spatial background and aftershock distributions.

  11. Izmit, Turkey 1999 Earthquake Interferogram

    NASA Technical Reports Server (NTRS)

    2001-01-01

    This image is an interferogram that was created using pairs of images taken by Synthetic Aperture Radar (SAR). The images, acquired at two different times, have been combined to measure surface deformation or changes that may have occurred during the time between data acquisition. The images were collected by the European Space Agency's Remote Sensing satellite (ERS-2) on 13 August 1999 and 17 September 1999 and were combined to produce these image maps of the apparent surface deformation, or changes, during and after the 17 August 1999 Izmit, Turkey earthquake. This magnitude 7.6 earthquake was the largest in 60 years in Turkey and caused extensive damage and loss of life. Each of the color contours of the interferogram represents 28 mm (1.1 inches) of motion towards the satellite, or about 70 mm (2.8 inches) of horizontal motion. White areas are outside the SAR image or water of seas and lakes. The North Anatolian Fault that broke during the Izmit earthquake moved more than 2.5 meters (8.1 feet) to produce the pattern measured by the interferogram. Thin red lines show the locations of fault breaks mapped on the surface. The SAR interferogram shows that the deformation and fault slip extended west of the surface faults, underneath the Gulf of Izmit. Thick black lines mark the fault rupture inferred from the SAR data. Scientists are using the SAR interferometry along with other data collected on the ground to estimate the pattern of slip that occurred during the Izmit earthquake. This is then used to improve computer models that predict how this deformation transferred stress to other faults and to the continuation of the North Anatolian Fault, which extends to the west past the large city of Istanbul. These models show that the Izmit earthquake further increased the already high probability of a major earthquake near Istanbul.
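    The 28 mm colour cycle quoted above follows directly from the radar wavelength: one fringe corresponds to half a wavelength of motion along the line of sight. A small sketch, assuming the ERS C-band wavelength of roughly 56.6 mm and a nominal incidence angle of about 23 degrees (both values are assumptions, not taken from the caption):

      import math

      WAVELENGTH_MM = 56.6  # approximate ERS C-band radar wavelength

      def los_displacement_mm(n_fringes):
          """Line-of-sight displacement: each fringe is half a wavelength."""
          return n_fringes * WAVELENGTH_MM / 2.0

      def horizontal_displacement_mm(n_fringes, incidence_deg=23.0):
          """Equivalent horizontal motion if the deformation is assumed to be purely
          horizontal along the radar look direction."""
          return los_displacement_mm(n_fringes) / math.sin(math.radians(incidence_deg))

      print(los_displacement_mm(1))         # ~28 mm per colour cycle
      print(horizontal_displacement_mm(1))  # ~70 mm of horizontal motion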

  12. Earthquake rate and magnitude distributions of great earthquakes for use in global forecasts

    NASA Astrophysics Data System (ADS)

    Kagan, Yan Y.; Jackson, David D.

    2016-04-01

    We have obtained new results in the statistical analysis of global earthquake catalogs with special attention to the largest earthquakes, and we examined the statistical behavior of earthquake rate variations. These results can serve as an input for updating our recent earthquake forecast, known as the "Global Earthquake Activity Rate 1" model (GEAR1), which is based on past earthquakes and geodetic strain rates. The GEAR1 forecast is expressed as the rate density of all earthquakes above magnitude 5.8 within 70 km of sea level everywhere on earth at 0.1 by 0.1 degree resolution, and it is currently being tested by the Collaboratory for Study of Earthquake Predictability. The seismic component of the present model is based on a smoothed version of the Global Centroid Moment Tensor (GCMT) catalog from 1977 through 2013. The tectonic component is based on the Global Strain Rate Map, a "General Earthquake Model" (GEM) product. The forecast was optimized to fit the GCMT data from 2005 through 2012, but it also fit well the earthquake locations from 1918 to 1976 reported in the International Seismological Centre-Global Earthquake Model (ISC-GEM) global catalog of instrumental and pre-instrumental magnitude determinations. We have improved the recent forecast by optimizing the treatment of larger magnitudes and including a longer duration (1918-2011) ISC-GEM catalog of large earthquakes to estimate smoothed seismicity. We revised our estimates of upper magnitude limits, described as corner magnitudes, based on the massive earthquakes since 2004 and the seismic moment conservation principle. The new corner magnitude estimates are somewhat larger than but consistent with our previous estimates. For major subduction zones we find the best estimates of corner magnitude to be in the range 8.9 to 9.6 and consistent with a uniform average of 9.35. Statistical estimates tend to grow with time as larger earthquakes occur. However, by using the moment conservation principle that

  13. Earthquake rate and magnitude distributions of great earthquakes for use in global forecasts

    NASA Astrophysics Data System (ADS)

    Kagan, Yan Y.; Jackson, David D.

    2016-07-01

    We have obtained new results in the statistical analysis of global earthquake catalogues with special attention to the largest earthquakes, and we examined the statistical behaviour of earthquake rate variations. These results can serve as an input for updating our recent earthquake forecast, known as the `Global Earthquake Activity Rate 1' model (GEAR1), which is based on past earthquakes and geodetic strain rates. The GEAR1 forecast is expressed as the rate density of all earthquakes above magnitude 5.8 within 70 km of sea level everywhere on earth at 0.1 × 0.1 degree resolution, and it is currently being tested by the Collaboratory for Study of Earthquake Predictability. The seismic component of the present model is based on a smoothed version of the Global Centroid Moment Tensor (GCMT) catalogue from 1977 through 2013. The tectonic component is based on the Global Strain Rate Map, a `General Earthquake Model' (GEM) product. The forecast was optimized to fit the GCMT data from 2005 through 2012, but it also fit well the earthquake locations from 1918 to 1976 reported in the International Seismological Centre-Global Earthquake Model (ISC-GEM) global catalogue of instrumental and pre-instrumental magnitude determinations. We have improved the recent forecast by optimizing the treatment of larger magnitudes and including a longer duration (1918-2011) ISC-GEM catalogue of large earthquakes to estimate smoothed seismicity. We revised our estimates of upper magnitude limits, described as corner magnitudes, based on the massive earthquakes since 2004 and the seismic moment conservation principle. The new corner magnitude estimates are somewhat larger than but consistent with our previous estimates. For major subduction zones we find the best estimates of corner magnitude to be in the range 8.9 to 9.6 and consistent with a uniform average of 9.35. Statistical estimates tend to grow with time as larger earthquakes occur. However, by using the moment conservation

  14. The initial subevent of the 1994 Northridge, California, earthquake: Is earthquake size predictable?

    USGS Publications Warehouse

    Kilb, Debi; Gomberg, J.

    1999-01-01

    We examine the initial subevent (ISE) of the Mw 6.7, 1994 Northridge, California, earthquake in order to discriminate between two end-member rupture initiation models: the 'preslip' and 'cascade' models. Final earthquake size may be predictable from an ISE's seismic signature in the preslip model but not in the cascade model. In the cascade model ISEs are simply small earthquakes that can be described as purely dynamic ruptures. In this model a large earthquake is triggered by smaller earthquakes; there is no size scaling between triggering and triggered events and a variety of stress transfer mechanisms are possible. Alternatively, in the preslip model, a large earthquake nucleates as an aseismically slipping patch in which the patch dimension grows and scales with the earthquake's ultimate size; the byproduct of this loading process is the ISE. In this model, the duration of the ISE signal scales with the ultimate size of the earthquake, suggesting that nucleation and earthquake size are determined by a more predictable, measurable, and organized process. To distinguish between these two end-member models we use short period seismograms recorded by the Southern California Seismic Network. We address questions regarding the similarity in hypocenter locations and focal mechanisms of the ISE and the mainshock. We also compare the ISE's waveform characteristics to those of small earthquakes and to the beginnings of earthquakes with a range of magnitudes. We find that the focal mechanisms of the ISE and mainshock are indistinguishable, and both events may have nucleated on and ruptured the same fault plane. These results satisfy the requirements for both models and thus do not discriminate between them. However, further tests show the ISE's waveform characteristics are similar to those of typical small earthquakes in the vicinity and more importantly, do not scale with the mainshock magnitude. These results are more consistent with the cascade model.

  15. Detection of hydrothermal precursors to large northern california earthquakes.

    PubMed

    Silver, P G; Valette-Silver, N J

    1992-09-01

    During the period 1973 to 1991 the interval between eruptions from a periodic geyser in Northern California exhibited precursory variations 1 to 3 days before the three largest earthquakes within a 250-kilometer radius of the geyser. These include the magnitude 7.1 Loma Prieta earthquake of 18 October 1989 for which a similar preseismic signal was recorded by a strainmeter located halfway between the geyser and the earthquake. These data show that at least some earthquakes possess observable precursors, one of the prerequisites for successful earthquake prediction. All three earthquakes were further than 130 kilometers from the geyser, suggesting that precursors might be more easily found around rather than within the ultimate rupture zone of large California earthquakes. PMID:17738277

  16. Tracking Earthquake Cascades

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2011-12-01

    In assessing their risk to society, earthquakes are best characterized as cascades that can propagate from the natural environment into the socio-economic (built) environment. Strong earthquakes rarely occur as isolated events; they usually cluster in foreshock-mainshock-aftershock sequences, seismic swarms, and extended sequences of large earthquakes that propagate along major fault systems. These cascades are regulated by stress-mediated interactions among faults driven by tectonic loading. Within these cascades, each large event can itself cause a chain reaction in which the primary effects of faulting and ground shaking induce secondary effects, including tsunamis, landslides, and liquefaction, and set off destructive processes within the built environment, such as fires and radiation leakage from nuclear plants. Recent earthquakes have demonstrated how the socio-economic effects of large earthquakes can reverberate for many years. To reduce earthquake risk and improve the resiliency of communities to earthquake damage, society depends on five geotechnologies for tracking earthquake cascades: long-term probabilistic seismic hazard analysis (PSHA), short-term (operational) earthquake forecasting, earthquake early warning, tsunami warning, and the rapid production of post-event information for response and recovery (see figure). In this presentation, I describe how recent advances in earthquake system science are leading to improvements in this geotechnology pipeline. In particular, I will highlight the role of earthquake simulations in predicting strong ground motions and their secondary effects before and during earthquake cascades

  17. Crowd-Sourced Global Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Minson, S. E.; Brooks, B. A.; Glennie, C. L.; Murray, J. R.; Langbein, J. O.; Owen, S. E.; Iannucci, B. A.; Hauser, D. L.

    2014-12-01

    Although earthquake early warning (EEW) has shown great promise for reducing loss of life and property, it has only been implemented in a few regions due, in part, to the prohibitive cost of building the required dense seismic and geodetic networks. However, many cars and consumer smartphones, tablets, laptops, and similar devices contain low-cost versions of the same sensors used for earthquake monitoring. If a workable EEW system could be implemented based on either crowd-sourced observations from consumer devices or very inexpensive networks of instruments built from consumer-quality sensors, EEW coverage could potentially be expanded worldwide. Controlled tests of several accelerometers and global navigation satellite system (GNSS) receivers typically found in consumer devices show that, while they are significantly noisier than scientific-grade instruments, they are still accurate enough to capture displacements from moderate and large magnitude earthquakes. The accuracy of these sensors varies greatly depending on the type of data collected. Raw coarse acquisition (C/A) code GPS data are relatively noisy. These observations have a surface displacement detection threshold approaching ~1 m and would thus only be useful in large Mw 8+ earthquakes. However, incorporating either satellite-based differential corrections or using a Kalman filter to combine the raw GNSS data with low-cost acceleration data (such as from a smartphone) decreases the noise dramatically. These approaches allow detection thresholds as low as 5 cm, potentially enabling accurate warnings for earthquakes as small as Mw 6.5. Simulated performance tests show that, with data contributed from only a very small fraction of the population, a crowd-sourced EEW system would be capable of warning San Francisco and San Jose of a Mw 7 rupture on California's Hayward fault and could have accurately issued both earthquake and tsunami warnings for the 2011 Mw 9 Tohoku-oki, Japan earthquake.
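
    The Kalman-filter fusion of raw GNSS displacements with low-cost acceleration data described above can be illustrated with a minimal one-dimensional sketch; the noise levels, array names, and filter layout below are placeholder assumptions for illustration, not the authors' implementation.

        # Minimal 1-D sketch of fusing accelerometer and GNSS data with a Kalman
        # filter, in the spirit described above. Noise values and array names are
        # illustrative assumptions only.
        import numpy as np

        def fuse_gnss_accel(gnss_disp, accel, dt, sig_gnss=0.5, sig_acc=0.05):
            """Estimate displacement by integrating acceleration and correcting
            with noisy GNSS displacement samples (both sampled at interval dt)."""
            x = np.zeros(2)                               # state: [displacement, velocity]
            P = np.eye(2)                                 # state covariance
            F = np.array([[1.0, dt], [0.0, 1.0]])         # state transition
            B = np.array([0.5 * dt**2, dt])               # control (acceleration) input
            Q = sig_acc**2 * np.outer(B, B)               # process noise from accelerometer noise
            H = np.array([[1.0, 0.0]])                    # we observe displacement only
            R = np.array([[sig_gnss**2]])                 # GNSS measurement noise
            out = np.zeros(len(accel))
            for k in range(len(accel)):
                # predict using the accelerometer sample as the control input
                x = F @ x + B * accel[k]
                P = F @ P @ F.T + Q
                # update with the (noisier) GNSS displacement sample
                y = gnss_disp[k] - H @ x
                S = H @ P @ H.T + R
                K = P @ H.T @ np.linalg.inv(S)
                x = x + K @ y
                P = (np.eye(2) - K @ H) @ P
                out[k] = x[0]
            return out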

  18. Earthquake sources near Uturuncu Volcano

    NASA Astrophysics Data System (ADS)

    Keyson, L.; West, M. E.

    2013-12-01

    Uturuncu, located in southern Bolivia near the Chile and Argentina border, is a dacitic volcano that was last active 270 ka. It is part of the Altiplano-Puna Volcanic Complex, which spans 50,000 km2 and comprises a series of ignimbrite flare-ups since ~23 Ma. Two sets of evidence suggest that the region is underlain by a significant magma body. First, seismic velocities show a low-velocity layer consistent with a magmatic sill below depths of 15-20 km. This inference is corroborated by high electrical conductivity between 10 km and 30 km. This magma body, the so-called Altiplano-Puna Magma Body (APMB), is the likely source of volcanic activity in the region. InSAR studies show that during the 1990s, the volcano experienced an average uplift of about 1 to 2 cm per year. The deformation is consistent with an expanding source at depth. Though the Uturuncu region exhibits high rates of crustal seismicity, any connection between the inflation and the seismicity is unclear. We investigate the root causes of these earthquakes using a temporary network of 33 seismic stations - part of the PLUTONS project. Our primary approach is based on hypocenter locations and magnitudes paired with correlation-based relative relocation techniques. We find a strong tendency toward earthquake swarms that cluster in space and time. These swarms often last a few days and consist of numerous earthquakes with similar source mechanisms. Most seismicity occurs in the top 10 kilometers of the crust and is characterized by well-defined phase arrivals and significant high-frequency content. The frequency-magnitude relationship of this seismicity demonstrates b-values consistent with tectonic sources. There is a strong clustering of earthquakes around the Uturuncu edifice. Earthquakes elsewhere in the region align in bands striking northwest-southeast, consistent with regional stresses.
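
    The b-value comparison mentioned above is commonly made with the maximum-likelihood estimator of Aki (1965); the following minimal sketch shows that calculation on synthetic magnitudes, with the completeness magnitude and catalog being placeholder assumptions rather than PLUTONS data.

        # Minimal sketch of the maximum-likelihood b-value estimate (Aki, 1965)
        # used to characterize frequency-magnitude relationships. Magnitudes and
        # the completeness threshold below are placeholders.
        import numpy as np

        def b_value(magnitudes, mag_complete):
            """Aki (1965): b = log10(e) / (mean(M) - Mc) for magnitudes M >= Mc.
            (For binned catalogs, Mc is usually reduced by half a bin width.)"""
            m = np.asarray(magnitudes, dtype=float)
            m = m[m >= mag_complete]
            return np.log10(np.e) / (m.mean() - mag_complete)

        # Example with synthetic magnitudes drawn from a Gutenberg-Richter law (b = 1):
        rng = np.random.default_rng(0)
        mags = 1.0 + rng.exponential(scale=1.0 / np.log(10), size=2000)
        print(round(b_value(mags, mag_complete=1.0), 2))   # close to 1.0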

  19. Catalog of earthquake hypocenters at Alaskan volcanoes: January 1 through December 31, 2011

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.; Searcy, Cheryl K.

    2012-01-01

    Between January 1 and December 31, 2011, the Alaska Volcano Observatory (AVO) located 4,364 earthquakes, of which 3,651 occurred within 20 kilometers of the 33 volcanoes with seismograph subnetworks. There was no significant seismic activity above background levels in 2011 at these instrumented volcanic centers. This catalog includes locations, magnitudes, and statistics of the earthquakes located in 2011 with the station parameters, velocity models, and other files used to locate these earthquakes.

  20. Catalog of significant historical earthquakes in the Central United States

    USGS Publications Warehouse

    Bakun, W.H.; Hopper, M.G.

    2004-01-01

    We use Modified Mercalli intensity assignments to estimate source locations and moment magnitude M for eighteen 19th-century and twenty early-20th-century earthquakes in the central United States (CUS) for which estimates of M are otherwise not available. We use these estimates, and locations and M estimated elsewhere, to compile a catalog of significant historical earthquakes in the CUS. The 1811-1812 New Madrid earthquakes apparently dominated CUS seismicity in the first two decades of the 19th century. M5-6 earthquakes occurred in the New Madrid Seismic Zone in 1843 and 1878, but none have occurred since 1878. There has been persistent seismic activity in the Illinois Basin in southern Illinois and Indiana, with M > 5.0 earthquakes in 1895, 1909, 1917, 1968, and 1987. Four other M > 5.0 CUS historical earthquakes have occurred: in Kansas in 1867, in Nebraska in 1877, in Oklahoma in 1882, and in Kentucky in 1980.

  1. Real-time earthquake monitoring using a search engine method.

    PubMed

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-01-01

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a fast computer search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake's parameters in <1 s after receiving the long-period surface wave data. PMID:25472861
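
    The "exact search" baseline against which the search engine's speed is compared can be pictured as a brute-force scan of the seismogram database for the most similar waveform; the sketch below illustrates only that baseline, with the database layout and variable names being illustrative assumptions.

        # Minimal sketch of a brute-force "exact search": correlate the incoming
        # waveform against every entry of a synthetic-seismogram database and
        # return the source parameters stored with the best-matching entry.
        # Database layout and names are illustrative assumptions.
        import numpy as np

        def best_match(observed, database):
            """database: list of (waveform, params) pairs, all on a common time
            base and length; returns params of the most similar waveform."""
            obs = (observed - observed.mean()) / observed.std()
            best, best_cc = None, -np.inf
            for waveform, params in database:
                w = (waveform - waveform.mean()) / waveform.std()
                cc = np.dot(obs, w) / len(obs)     # normalized correlation coefficient
                if cc > best_cc:
                    best, best_cc = params, cc
            return best, best_cc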

  2. Earthquakes: Predicting the unpredictable?

    USGS Publications Warehouse

    Hough, S.E.

    2005-01-01

    The earthquake prediction pendulum has swung from optimism in the 1970s to rather extreme pessimism in the 1990s. Earlier work revealed evidence of possible earthquake precursors: physical changes in the planet that signal that a large earthquake is on the way. Some respected earthquake scientists argued that earthquakes are fundamentally unpredictable. The fate of the Parkfield prediction experiment appeared to support their arguments: A moderate earthquake had been predicted along a specified segment of the central San Andreas fault within five years of 1988, but had failed to materialize on schedule. At some point, however, the pendulum began to swing back. Reputable scientists began using the "P-word" in not only polite company, but also at meetings and even in print. If the optimism regarding earthquake prediction can be attributed to any single cause, it might be scientists' burgeoning understanding of the earthquake cycle.

  3. Earthquakes: hydrogeochemical precursors

    USGS Publications Warehouse

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  4. Speeding earthquake disaster relief

    USGS Publications Warehouse

    Mortensen, Carl; Donlin, Carolyn; Page, Robert A.; Ward, Peter

    1995-01-01

    In coping with recent multibillion-dollar earthquake disasters, scientists and emergency managers have found new ways to speed and improve relief efforts. This progress is founded on the rapid availability of earthquake information from seismograph networks.

  5. Testing hypotheses of earthquake occurrence

    NASA Astrophysics Data System (ADS)

    Kagan, Y. Y.; Jackson, D. D.; Schorlemmer, D.; Gerstenberger, M.

    2003-12-01

    We present a relatively straightforward likelihood method for testing those earthquake hypotheses that can be stated as vectors of earthquake rate density in defined bins of area, magnitude, and time. We illustrate the method as it will be applied to the Regional Earthquake Likelihood Models (RELM) project of the Southern California Earthquake Center (SCEC). Several earthquake forecast models are being developed as part of this project, and additional contributed forecasts are welcome. Various models are based on fault geometry and slip rates, seismicity, geodetic strain, and stress interactions. We would test models in pairs, requiring that both forecasts in a pair be defined over the same set of bins. Thus we offer a standard "menu" of bins and ground rules to encourage standardization. One menu category includes five-year forecasts of magnitude 5.0 and larger. Forecasts would be in the form of a vector of yearly earthquake rates on a 0.05 degree grid at the beginning of the test. Focal mechanism forecasts, when available, would also be archived and used in the tests. The five-year forecast category may be appropriate for testing hypotheses of stress shadows from large earthquakes. Interim progress will be evaluated yearly, but final conclusions would be made on the basis of cumulative five-year performance. The second category includes forecasts of earthquakes above magnitude 4.0 on a 0.05 degree grid, evaluated and renewed daily. Final evaluation would be based on cumulative performance over five years. Other types of forecasts with different magnitude, space, and time sampling are welcome and will be tested against other models with shared characteristics. All earthquakes would be counted, and no attempt made to separate foreshocks, main shocks, and aftershocks. Earthquakes would be considered as point sources located at the hypocenter. For each pair of forecasts, we plan to compute alpha, the probability that the first would be wrongly rejected in favor of
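
    For forecasts expressed as rates in space-magnitude bins, the core of such a likelihood comparison can be sketched with a Poisson log-likelihood score; the example below is a simplified illustration under that assumption, not the RELM test code, and all rates and counts are placeholders.

        # Minimal sketch of scoring two gridded earthquake-rate forecasts against
        # observed counts with a Poisson log-likelihood, in the spirit of the
        # binned likelihood tests described above (not the RELM implementation).
        import numpy as np
        from scipy.stats import poisson

        def log_likelihood(forecast_rates, observed_counts):
            """Sum of log Poisson probabilities over all space-magnitude bins."""
            return poisson.logpmf(observed_counts, forecast_rates).sum()

        # Example with two hypothetical forecasts defined on the same bins:
        rates_a = np.array([0.10, 0.02, 0.30, 0.05])
        rates_b = np.array([0.05, 0.05, 0.20, 0.10])
        observed = np.array([0, 0, 1, 0])
        print(log_likelihood(rates_a, observed) - log_likelihood(rates_b, observed))
        # A positive difference favors forecast A; its significance would be
        # assessed by simulation, e.g. drawing synthetic catalogs from each forecast.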

  6. Coseismic slip distribution of the 2011 off the Pacific Coast of Tohoku Earthquake (M9.0) refined by means of seafloor geodetic data

    NASA Astrophysics Data System (ADS)

    Iinuma, T.; Hino, R.; Kido, M.; Inazu, D.; Osada, Y.; Ito, Y.; Ohzono, M.; Tsushima, H.; Suzuki, S.; Fujimoto, H.; Miura, S.

    2012-07-01

    On 11 March 2011, the devastating M9.0 Tohoku Earthquake occurred on the interface of the subducting Pacific plate, and was followed by a huge tsunami that killed about 20,000 people. Several geophysical studies have already suggested that the very shallow portion of the plate interface might have played an important role in producing such a large earthquake and tsunami. However, the sparsity of seafloor observations leads to insufficient spatial resolution of the fault slip on such a shallow plate interface. For this reason, the location and degree of the slip have not yet been estimated accurately enough to assess future seismic risks. Thus, we estimated the coseismic slip distribution based on terrestrial GPS observations and all available seafloor geodetic data that significantly improve the spatial resolution at the shallow portion of the plate interface. The results reveal that an extremely large (greater than 50 m) slip occurred in a small (about 40 km in width and 120 km in length) area near the Japan Trench and generated the huge tsunami. The estimated slip distribution and a comparison of it with the coupling coefficient distribution deduced from the analysis of the small repeating earthquakes suggest that the 2011 Tohoku Earthquake released strain energy that had accumulated over the past 1000 years, probably since the Jogan Earthquake in 869. Accurate assessments of seismic risks on very shallow plate interfaces in subduction zones throughout the world can be obtained by improving the quality and quantity of seafloor geodetic observations.

  7. Analysis of recent glacial earthquakes in Greenland

    NASA Astrophysics Data System (ADS)

    Olsen, K.; Nettles, M.

    2015-12-01

    Large calving events at Greenland's outlet glaciers produce teleseismically detectable glacial earthquakes. These events have been observed in the seismic record for the past 22 years, but the complete catalog of glacial earthquakes still numbers only ~300. The annual occurrence of these long-period events has increased over time, which makes recent years especially valuable in expanding the global dataset. Glacial earthquakes from 1993-2010 have been analyzed systematically (Tsai and Ekström, 2007; Veitch and Nettles, 2012). Here, we analyze more recent events using the same centroid-single-force (CSF) approach as previous authors, focusing initially on data from 2013. In addition, we perform a focused study of selected events from 2009-2010 to assess the reliability of the force azimuths obtained from such inversions. Recent spatial and temporal patterns of glacial earthquakes in Greenland differ from those in previous years. In 2013, three times as many events occurred on the west coast as on the east, and these events originated predominantly from two glaciers: Jakobshavn Glacier on the west coast and Helheim Glacier on the east. Kangerdlugssuaq Glacier, on the east coast, produced no glacial earthquakes in 2013, though it produced many events in earlier years. Previous CSF results for glacial earthquakes show force azimuths perpendicular to the glacier front during a calving event, with force plunges near horizontal. However, some azimuths indicate forces initially oriented upglacier, while others are oriented downglacier (seaward). We perform a set of experiments on events from 2009 and 2010 and find two acceptable solutions for each glacial earthquake, oriented 180° apart with plunges of opposite sign and centroid times differing by approximately one half of the assumed duration of the earthquake time function. These results suggest the need for a more complex time function to model glacial earthquakes more accurately.

  8. Analysis of Recent Earthquakes in the Ft. Worth Basin near Dallas, Texas

    NASA Astrophysics Data System (ADS)

    Frohlich, C. A.; Potter, E.; Hayward, C.; Stump, B.

    2009-12-01

    Between 31 October 2008 and 16 May 2009, the USGS National Earthquake Information Center (NEIC) reported 14 earthquakes with magnitudes between 2.5 and 3.3 and epicenters between Dallas and Fort Worth (DFW), Texas. Local residents felt nine of these earthquakes, and news stories expressed locals' concern and speculation that the earthquakes might be related to recent natural gas drilling in the Barnett Shale. We report on analysis of data collected by continuously operating regional stations and by a five-station temporary local network operated in the DFW area between 9 November 2008 and 2 January 2009. We also discuss preliminary results from temporary networks operated since June 2009 near DFW and Cleburne, TX. Both portable deployments were supported by PASSCAL. Among the observations at regional distances, the station WMOK (Wichita Mountain, Oklahoma; Δ = 2.36°) has clear recordings of DFW quakes, and initial data processing showed that waveforms from the reported DFW quakes were all highly correlated. Cross-correlation of the continuous signal at WMOK with known DFW quake signals found no DFW quakes occurring prior to 30 October 2008, and then ~180 similar events occurring between 30 October 2008 and 17 May 2009. These earthquakes had magnitudes of 1.5 and greater, and most occurred in four time periods: 100 occurred from 30 October to 1 November; 11 on 20 November; 18 on 26 December; and 28 on 15-17 May. Eleven of these earthquakes were exceptionally well recorded by the 2008 temporary network, allowing us to determine accurate relative locations using velocities measured in a nearby deep well. Our preferred hypocentral locations have focal depths of ~4.5 km and lie along a 1.1 km SW-NE line on the DFW airport property, south of the main terminal area. The mean epicenter determined for these 11 events is less than 0.5 km from a 4.2 km deep saltwater disposal (SWD) well drilled to dispose of brines associated with natural gas production. Injection of brines in this well commenced
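
    The cross-correlation scan of the continuous WMOK signal against known DFW waveforms can be illustrated with a simple template-matching sketch; the threshold and variable names below are placeholder assumptions, not the processing actually used in the study.

        # Minimal sketch of a cross-correlation template scan: slide a known event
        # waveform along a continuous record and flag windows whose normalized
        # correlation exceeds a threshold. Threshold and names are illustrative.
        import numpy as np

        def template_scan(continuous, template, threshold=0.7):
            """Return (sample index, correlation) pairs where the normalized
            correlation between template and continuous record exceeds threshold."""
            n = len(template)
            t = (template - template.mean()) / (template.std() * n)
            detections = []
            for i in range(len(continuous) - n):
                window = continuous[i:i + n]
                w = (window - window.mean()) / window.std()
                cc = np.dot(t, w)                 # Pearson correlation coefficient
                if cc > threshold:
                    detections.append((i, cc))
            return detections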

  9. Mexican Earthquakes and Tsunamis Catalog Reviewed

    NASA Astrophysics Data System (ADS)

    Ramirez-Herrera, M. T.; Castillo-Aja, R.

    2015-12-01

    Today the availability of information on the internet makes online catalogs very easy to access by both scholars and the general public. The catalog in the "Significant Earthquake Database", managed by the National Center for Environmental Information (NCEI, formerly NCDC), NOAA, allows access by deploying tabular and cartographic data related to earthquakes and tsunamis contained in the database. The NCEI catalog is the product of compiling previously existing catalogs, historical sources, newspapers, and scientific articles. Because the NCEI catalog has global coverage, the information is not homogeneous. The existence of historical information depends on the presence of people in the places where a disaster occurred and on whether the description was preserved in documents and oral tradition. In the case of instrumental data, availability depends on the distribution and quality of seismic stations. Therefore, the availability of information for the first half of the 20th century can be improved by careful analysis of the available information and by searching for and resolving inconsistencies. This study shows the advances we have made in upgrading and refining data for the earthquake and tsunami catalog of Mexico from 1500 CE until today, presented in the form of a table and a map. Data analysis allowed us to identify the following sources of error in the epicenter locations in existing catalogs: • incorrect coordinate entry • erroneous or mistaken place names • data too general to locate the epicenter precisely, mainly for older earthquakes • inconsistency between earthquake and tsunami occurrence: epicenters located too far inland reported as tsunamigenic. The process of completing the catalogs directly depends on the availability of information; as new archives are opened for inspection, there are more opportunities to complete the history of large earthquakes and tsunamis in Mexico. Here, we also present new earthquake and

  10. CyberTEAM Interactive Epicenter Locator Tool

    NASA Astrophysics Data System (ADS)

    Ouyang, Y.; Hayden, K.; Lehmann, M.; Kilb, D.

    2008-12-01

    News coverage showing collapsed buildings, broken bridges and smashed cars helps middle school students visualize the hazardous nature of earthquakes. However, few students understand how scientists investigate earthquakes through analysis of data collected using technology devices from around the world. The important findings by Muawia Barazangi and James Dorman in 1969 revealed how earthquakes charted between 1961 and 1967 delineated narrow belts of seismicity. This important discovery prompted additional research that eventually led to the theory of plate tectonics. When a large earthquake occurs, people from distances near and far can feel it to varying degrees. But how do scientists examine data to identify the locations of earthquake epicenters? The scientific definition of an earthquake: "a movement within the Earth's crust or mantle, caused by the sudden rupture or repositioning of underground material as it releases stress" can be confusing for students first studying Earth science in 6th grade. Students struggle with understanding how scientists can tell when and where a rupture occurs, when the inner crust and mantle are not visible to us. Our CyberTEAM project provides 6th grade teachers with the opportunity to engage adolescents in activities that make textbooks come alive as students manipulate the same data that today's scientists use. We have developed an Earthquake Epicenter Location Tool that includes two Flash-based interactive learning objects that can be used to study basic seismology concepts and lets the user determine earthquake epicenters from current data. Through the Wilber II system maintained at the IRIS (Incorporated Research Institutions for Seismology) Web site, this project retrieves seismic data of recent earthquakes and makes them available to the public. Students choose an earthquake to perform further explorations. For each earthquake, a selection of USArray seismic stations are marked on a Google Map. Picking a station on the
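
    The classroom epicenter exercise this tool supports typically rests on converting S-P arrival-time differences to distances and intersecting the resulting circles; the sketch below illustrates that idea under assumed crustal velocities and hypothetical station coordinates.

        # Minimal sketch of an S-P epicenter estimate: convert S-P times to
        # epicentral distances and find the point that best fits circles around
        # three stations. Velocities and coordinates are illustrative assumptions.
        import numpy as np
        from scipy.optimize import least_squares

        VP, VS = 6.0, 3.5          # assumed crustal P and S velocities, km/s

        def sp_distance(ts_minus_tp):
            """Epicentral distance (km) from an S-P time difference (s)."""
            return ts_minus_tp / (1.0 / VS - 1.0 / VP)

        def locate(stations_xy, sp_times):
            dists = np.array([sp_distance(t) for t in sp_times])
            def residuals(xy):
                return np.hypot(stations_xy[:, 0] - xy[0],
                                stations_xy[:, 1] - xy[1]) - dists
            return least_squares(residuals, x0=stations_xy.mean(axis=0)).x

        # Example: three stations (km) and S-P times (s) for a hypothetical event
        stations = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 120.0]])
        print(locate(stations, sp_times=[8.0, 10.5, 12.0]))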

  11. Earthquake prediction: the interaction of public policy and science.

    PubMed Central

    Jones, L M

    1996-01-01

    Earthquake prediction research has searched for both informational phenomena, those that provide information about earthquake hazards useful to the public, and causal phenomena, causally related to the physical processes governing failure on a fault, to improve our understanding of those processes. Neither informational nor causal phenomena are a subset of the other. I propose a classification of potential earthquake predictors of informational, causal, and predictive phenomena, where predictors are causal phenomena that provide more accurate assessments of the earthquake hazard than can be gotten from assuming a random distribution. Achieving higher, more accurate probabilities than a random distribution requires much more information about the precursor than just that it is causally related to the earthquake. PMID:11607656

  12. Earthquake prediction: The interaction of public policy and science

    USGS Publications Warehouse

    Jones, L.M.

    1996-01-01

    Earthquake prediction research has searched for both informational phenomena, those that provide information about earthquake hazards useful to the public, and causal phenomena, causally related to the physical processes governing failure on a fault, to improve our understanding of those processes. Neither informational nor causal phenomena are a subset of the other. I propose a classification of potential earthquake predictors of informational, causal, and predictive phenomena, where predictors are causal phenomena that provide more accurate assessments of the earthquake hazard than can be gotten from assuming a random distribution. Achieving higher, more accurate probabilities than a random distribution requires much more information about the precursor than just that it is causally related to the earthquake.

  13. Redefining Earthquakes and the Earthquake Machine

    ERIC Educational Resources Information Center

    Hubenthal, Michael; Braile, Larry; Taber, John

    2008-01-01

    The Earthquake Machine (EML), a mechanical model of stick-slip fault systems, can increase student engagement and facilitate opportunities to participate in the scientific process. This article introduces the EML model and an activity that challenges ninth-grade students' misconceptions about earthquakes. The activity emphasizes the role of models…

  14. Pre-earthquake magnetic pulses

    NASA Astrophysics Data System (ADS)

    Scoville, J.; Heraud, J.; Freund, F.

    2015-08-01

    A semiconductor model of rocks is shown to describe unipolar magnetic pulses, a phenomenon that has been observed prior to earthquakes. These pulses are suspected to be generated deep in the Earth's crust, in and around the hypocentral volume, days or even weeks before earthquakes. Their extremely long wavelength allows them to pass through kilometers of rock. Interestingly, when the sources of these pulses are triangulated, the locations coincide with the epicenters of future earthquakes. We couple a drift-diffusion semiconductor model to a magnetic field in order to describe the electromagnetic effects associated with electrical currents flowing within rocks. The resulting system of equations is solved numerically and it is seen that a volume of rock may act as a diode that produces transient currents when it switches bias. These unidirectional currents are expected to produce transient unipolar magnetic pulses similar in form, amplitude, and duration to those observed before earthquakes, and this suggests that the pulses could be the result of geophysical semiconductor processes.

  15. Using Smartphones to Detect Earthquakes

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.

    2012-12-01

    We are using the accelerometers in smartphones to record earthquakes. In the future, these smartphones may serve as a supplementary network to the current traditional networks for scientific research and real-time applications. Given the potential number of smartphones and the small separation of sensors, this new type of seismic dataset has significant potential, provided that the signal can be separated from the noise. We developed an application for Android phones to record acceleration in real time. These records can be saved on the phone or transmitted back to a server in real time. The accelerometers in the phones were evaluated by comparing their performance with a high-quality accelerometer on controlled shake tables in a variety of tests. The results show that the accelerometer in a smartphone can reproduce the characteristics of the shaking very well, even when the phone is left unsecured on the shake table. The nature of these datasets is also quite different from traditional networks because smartphones move around with their owners. Therefore, we must distinguish earthquake signals from those of other daily use. In addition to the shake-table tests that accumulated earthquake records, we also recorded different human activities such as running, walking, and driving. An artificial-neural-network-based approach was developed to distinguish these different records. It shows a 99.7% success rate in distinguishing earthquakes from the other typical human activities in our database. We are now ready to develop the basic infrastructure for a smartphone seismic network.
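
    The classification step described above can be illustrated with a minimal sketch that extracts a few waveform features and trains a small neural network; the features, labels, and network size below are placeholder assumptions, not the authors' implementation.

        # Minimal sketch of classifying acceleration windows as earthquake shaking
        # versus everyday handling: extract simple features and fit a small neural
        # network. Features, labels, and network size are placeholder assumptions.
        import numpy as np
        from sklearn.neural_network import MLPClassifier

        def features(window):
            """window: (n_samples, 3) acceleration; return a small feature vector."""
            return np.array([
                np.abs(window).max(),                        # peak amplitude
                window.std(),                                # overall variability
                np.mean(np.abs(np.diff(window, axis=0))),    # roughness / frequency proxy
            ])

        def train(windows, labels):
            """labels: 1 = earthquake, 0 = human activity."""
            X = np.array([features(w) for w in windows])
            clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
            clf.fit(X, labels)
            return clf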

  16. Children's Ideas about Earthquakes

    ERIC Educational Resources Information Center

    Simsek, Canan Lacin

    2007-01-01

    Earthquakes, a natural disaster, are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced to that extent. In particular, good training about earthquakes received in primary schools is considered…

  17. School Safety and Earthquakes.

    ERIC Educational Resources Information Center

    Dwelley, Laura; Tucker, Brian; Fernandez, Jeanette

    1997-01-01

    A recent assessment of earthquake risk to Quito, Ecuador, concluded that many of its public schools are vulnerable to collapse during major earthquakes. A subsequent examination of 60 buildings identified 15 high-risk buildings. These schools were retrofitted to meet standards that would prevent injury even during Quito's largest earthquakes. US…

  18. Operational earthquake forecasting can enhance earthquake preparedness

    USGS Publications Warehouse

    Jordan, T.H.; Marzocchi, W.; Michael, A.J.; Gerstenberger, M.C.

    2014-01-01

    We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time‐dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard—ground‐motion exceedance probabilities as well as short‐term rupture probabilities—in concert with the long‐term forecasts of probabilistic seismic‐hazard analysis (PSHA).

  19. Laser measuring system accurately locates point coordinates on photograph

    NASA Technical Reports Server (NTRS)

    Doede, J. H.; Lindenmeyer, C. W.; Vonderohe, R. H.

    1966-01-01

    Laser activated ultraprecision ranging apparatus interfaced with a computer determines point coordinates on a photograph. A helium-neon gas CW laser provides collimated light for a null balancing optical system. This system has no mechanical connection between the ranging apparatus and the photograph.

  20. Mapping of earthquakes vulnerability area in Papua

    NASA Astrophysics Data System (ADS)

    Muhammad Fawzy Ismullah, M.; Massinai, Muh. Altin

    2016-05-01

    A geohazard is a geological occurrence that may lead to large losses for humans. Mitigation of such natural disasters must be carried out properly in order to reduce the risks. One of the natural disasters that frequently occurs in Papua Province is the earthquake. This study applies geospatial principles to mapping earthquake-prone areas in the Papua region. It uses 36 years of earthquake data (1973-2009), a fault location map, and a ground acceleration map of the area. The earthquake and fault data are rearranged into an earthquake density map, an earthquake depth density map, and a fault density map. These three maps are then overlaid on the ground acceleration map to obtain an earthquake unit map. Some districts, such as Sarmi, Nabire, and Dogiyai, show a high vulnerability index. In contrast, the Waropen, Puncak, Merauke, Asmat, Mappi, and Bouven Digoel areas show lower indices. The vulnerability index elsewhere is moderate.
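
    The overlay step described above can be illustrated with a simple weighted combination of co-registered raster layers; the weights and array names below are placeholder assumptions, not the study's actual scheme.

        # Minimal sketch of a weighted map overlay: combine normalized density
        # rasters and a ground-acceleration raster into a relative vulnerability
        # index. Weights and names are illustrative assumptions only.
        import numpy as np

        def vulnerability_index(eq_density, depth_density, fault_density, pga,
                                weights=(0.3, 0.2, 0.2, 0.3)):
            """All inputs are co-registered 2-D arrays; output is scaled 0-1."""
            def norm(a):
                return (a - a.min()) / (a.max() - a.min() + 1e-12)
            layers = [eq_density, depth_density, fault_density, pga]
            index = sum(w * norm(a) for w, a in zip(weights, layers))
            return norm(index)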

  1. Earthquake Apparent Stress Scaling

    NASA Astrophysics Data System (ADS)

    Walter, W. R.; Mayeda, K.; Ruppert, S.

    2002-12-01

    There is currently a disagreement within the geophysical community on the way earthquake energy scales with magnitude. One set of recent papers finds evidence that energy release per seismic moment (apparent stress) is constant (e.g. Choy and Boatwright, 1995; McGarr, 1999; Ide and Beroza, 2001). Another set of recent papers finds the apparent stress increases with magnitude (e.g. Kanamori et al., 1993; Abercrombie, 1995; Mayeda and Walter, 1996; Izutani and Kanamori, 2001). The resolution of this issue is complicated by the difficulty of accurately accounting for and determining the seismic energy radiated by earthquakes over a wide range of event sizes in a consistent manner. We have just started a project to reexamine this issue by analyzing aftershock sequences in the Western U.S. and Turkey using two different techniques. First we examine the observed regional S-wave spectra by fitting with a parametric model (Walter and Taylor, 2002) with and without variable stress drop scaling. Because the aftershock sequences have common stations and paths we can examine the S-wave spectra of events by size to determine what type of apparent stress scaling, if any, is most consistent with the data. Second we use regional coda envelope techniques (e.g. Mayeda and Walter, 1996; Mayeda et al, 2002) on the same events to directly measure energy and moment. The coda technique corrects for path and site effects using an empirical Green's function technique and independent calibration with surface wave derived moments. Our hope is that by carefully analyzing a very large number of events in a consistent manner using two different techniques we can start to resolve this apparent stress scaling issue. This work was performed under the auspices of the U.S. Department of Energy by the University of California, Lawrence Livermore National Laboratory under Contract No. W-7405-Eng-48.
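
    The quantity at issue, apparent stress, is the radiated energy per unit seismic moment scaled by rigidity; the following minimal sketch evaluates that definition with placeholder numbers.

        # Minimal sketch of the apparent-stress definition: sigma_a = mu * E_R / M0.
        # The rigidity, energy, and moment below are placeholder values.
        def apparent_stress(radiated_energy_j, seismic_moment_nm, rigidity_pa=3.0e10):
            """Apparent stress in Pa for radiated energy (J) and moment (N m)."""
            return rigidity_pa * radiated_energy_j / seismic_moment_nm

        # Example: E_R = 1e12 J and M0 = 1e18 N m (roughly Mw 5.9) give 0.03 MPa.
        print(apparent_stress(1.0e12, 1.0e18) / 1.0e6, "MPa")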

  2. Parameterization of historical earthquakes in Switzerland

    NASA Astrophysics Data System (ADS)

    Álvarez-Rubio, Sonia; Kästli, Philipp; Fäh, Donat; Sellami, Souad; Giardini, Domenico

    2012-01-01

    Macroseismic earthquake parameters of historical events have been reassessed in the framework of the update of the Earthquake Catalogue of Switzerland ECOS-09. The Bakun and Wentworth method (Bakun and Wentworth 1997) has been used to assess location, magnitude, and, when possible, focal depth. We apply a two-step procedure. Intensity attenuation is assessed first by fitting a model with a logarithmic and a linear term, using a set of 111 earthquakes. The magnitude range is 3 to 5.8. An intensity-to-magnitude relation is then developed. A subset of the 111 events, all having an instrumental moment magnitude, was used to perform this intensity-to-magnitude calibration. Five final calibration strategies were developed based on different intensity calibration datasets, regionalized or non-regionalized models, and fixed or variable source depth. The final assessment of the macroseismic earthquake parameters is based on an expert judgment procedure, using the results derived from all five strategies, and taking into consideration the historical knowledge available for the particular earthquake. A bootstrap procedure has been applied to assess the uncertainty of parameters. Indicative lower and upper bounds of uncertainty are derived from distributions of location and magnitude for a number of events, obtained through bootstrap sampling of the intensity field and of the single intensity values. The final uncertainties are given in terms of parameter uncertainty classes already used in previous versions of the earthquake catalogue of Switzerland.
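
    The first calibration step, fitting an intensity-attenuation model with a logarithmic and a linear distance term, can be sketched as a linear least-squares problem; the functional form below follows that description, but the coefficient names and any data passed in are placeholders rather than the ECOS-09 calibration.

        # Minimal sketch of fitting an intensity-attenuation model with a
        # logarithmic plus a linear distance term by least squares. Data and
        # coefficient names are placeholders, not the ECOS-09 calibration.
        import numpy as np

        def fit_attenuation(intensities, distances_km, magnitudes):
            """Fit I = a*M + b - c*log10(r) - d*r and return (a, b, c, d)."""
            r = np.asarray(distances_km, dtype=float)
            G = np.column_stack([magnitudes, np.ones_like(r), -np.log10(r), -r])
            coeffs, *_ = np.linalg.lstsq(G, np.asarray(intensities, dtype=float), rcond=None)
            return coeffs

        def predict_intensity(coeffs, magnitude, distance_km):
            a, b, c, d = coeffs
            return a * magnitude + b - c * np.log10(distance_km) - d * distance_km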

  3. Simulating Earthquake Early Warning Systems in the Classroom as a New Approach to Teaching Earthquakes

    NASA Astrophysics Data System (ADS)

    D'Alessio, M. A.

    2010-12-01

    A discussion of P- and S-waves seems a ubiquitous part of studying earthquakes in the classroom. Textbooks from middle school through university level typically define the differences between the waves and illustrate the sense of motion. While many students successfully memorize the differences between wave types (often utilizing the first letter as a memory aid), textbooks rarely give tangible examples of how the two waves would "feel" to a person sitting on the ground. One reason for introducing the wave types is to explain how to calculate earthquake epicenters using seismograms and travel time charts, very abstract representations of earthquakes. Even when the skill is mastered using paper-and-pencil activities or one of the excellent online interactive versions, locating an epicenter simply does not excite many of our students because it evokes little emotional impact, even in students located in earthquake-prone areas. Despite these limitations, huge numbers of students are mandated to complete the task. At the K-12 level, California requires that all students be able to locate earthquake epicenters in Grade 6; in New York, the skill is a required part of the Regent's Examination. Recent innovations in earthquake early warning systems around the globe give us the opportunity to address the same content standard, but with substantially more emotional impact on students. I outline a lesson about earthquakes focused on earthquake early warning systems. The introductory activities include video clips of actual earthquakes and emphasize the differences between the way P- and S-waves feel when they arrive (P arrives first, but is weaker). I include an introduction to the principle behind earthquake early warning (including a summary of possible uses of a few seconds warning about strong shaking) and show examples from Japan. Students go outdoors to simulate P-waves, S-waves, and occupants of two different cities who are talking to one another on cell phones

  4. Least Square Support Vector Machine for Detection of - Ionospheric Anomalies Associated with the Powerful Nepal Earthquake (Mw = 7.5) of 25 April 2015

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2016-06-01

    Due to the irreparable devastation caused by strong earthquakes, accurate anomaly detection in time series of different precursors, needed to create a trustworthy early warning system, has brought new challenges. In this paper the predictive capability of the Least Squares Support Vector Machine (LSSVM) has been investigated by forecasting the GPS-TEC (Total Electron Content) variations around the time and location of the Nepal earthquake. About 77 km NW of Kathmandu, Nepal (28.147° N, 84.708° E, depth = 15.0 km), a powerful earthquake of Mw = 7.8 took place at 06:11:26 UTC on April 25, 2015. For comparison purposes, two other methods, the Median method and an ANN (Artificial Neural Network), have been implemented. All implemented algorithms indicate striking TEC anomalies 2 days prior to the main shock. The results reveal that the LSSVM method is promising for the detection of TEC seismo-ionospheric anomalies.
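
    One of the comparison methods, the median-based test, is commonly implemented by flagging samples outside interquartile-range bounds; the sketch below illustrates that kind of test, with the threshold and series being placeholder assumptions rather than the paper's exact procedure.

        # Minimal sketch of a median/IQR anomaly test on a TEC time series: flag
        # samples outside median +/- k * IQR. Threshold and input are placeholders.
        import numpy as np

        def tec_anomalies(tec, k=1.5):
            """Return a boolean mask of samples outside median +/- k * IQR."""
            tec = np.asarray(tec, dtype=float)
            q1, q3 = np.percentile(tec, [25, 75])
            median, iqr = np.median(tec), q3 - q1
            lower, upper = median - k * iqr, median + k * iqr
            return (tec < lower) | (tec > upper)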

  5. Shallow moonquakes - How they compare with earthquakes

    NASA Technical Reports Server (NTRS)

    Nakamura, Y.

    1980-01-01

    Of three types of moonquakes strong enough to be detectable at large distances - deep moonquakes, meteoroid impacts and shallow moonquakes - only shallow moonquakes are similar in nature to earthquakes. A comparison of various characteristics of moonquakes with those of earthquakes indeed shows a remarkable similarity between shallow moonquakes and intraplate earthquakes: (1) their occurrences are not controlled by tides; (2) they appear to occur in locations where there is evidence of structural weaknesses; (3) the relative abundances of small and large quakes (b-values) are similar, suggesting similar mechanisms; and (4) even the levels of activity may be close. The shallow moonquakes may be quite comparable in nature to intraplate earthquakes, and they may be of similar origin.

  6. Multiple event location analysis of aftershock sequences in the Pannonian basin

    NASA Astrophysics Data System (ADS)

    Bekesi, Eszter; Sule, Balint; Bondar, Istvan

    2016-04-01

    Accurate seismic event location is crucial for understanding tectonic processes such as crustal faulting, which is most commonly investigated by studying seismic activity. Location errors can be significantly reduced using multiple event location methods. We applied the double-difference method to relocate the earthquake that occurred near Oroszlány and its 200 aftershocks in order to identify the geometry of the related fault. We used the extended ISC location algorithm, iLoc, to determine absolute single-event locations for the Oroszlány aftershock sequence and applied the double-difference algorithm to the new hypocenters. To improve location precision, we added differential times from waveform cross-correlation to the multiple event location process to increase the accuracy of arrival-time readings. We also tested the effect of various local 1-D velocity models on the results. We compared hypoDD results for bulletin and iLoc hypocenters to investigate the effect of initial hypocenter parameters on the relocation process. We show that hypoDD collapses the initial, rather diffuse locations into a smaller cluster, and the vertical cross-sections show sharp images of seismicity. Unsurprisingly, the combined use of catalog and cross-correlation data sets provides the most accurate locations. Some of the relocated events in the cluster are of ground-truth quality, with a location accuracy of 5 km or better. Having achieved accurate locations for the event cluster, we are able to resolve the fault plane ambiguity in the moment tensor solutions and determine an accurate strike for the fault.
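
    The double-difference idea used here pairs nearby events observed at a common station and works with residuals of differential travel times; the sketch below illustrates one such residual under deliberately simplified assumptions (straight rays, constant velocity), which is not the hypoDD implementation.

        # Minimal sketch of a double-difference residual for two nearby events
        # observed at one station: (observed differential time) minus (calculated
        # differential time). Straight rays and a constant velocity are simplifying
        # placeholder assumptions.
        import numpy as np

        def double_difference(t_obs_i, t_obs_j, hypo_i, hypo_j, station, v=5.8):
            """Observed minus calculated differential travel time for events i and j."""
            t_calc_i = np.linalg.norm(np.subtract(hypo_i, station)) / v
            t_calc_j = np.linalg.norm(np.subtract(hypo_j, station)) / v
            return (t_obs_i - t_obs_j) - (t_calc_i - t_calc_j)

        # In hypoDD, such residuals from catalog picks and cross-correlation delays
        # are minimized by adjusting the relative locations of the paired events.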

  7. Earthquake forecasting and warning

    SciTech Connect

    Rikitake, T.

    1983-01-01

    This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States is updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings of precursory phenomena are included. A summary of circumstances surrounding the 1975 Haicheng earthquake, the 1976 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all magnitudes ≥ 7.0) in China and the 1978 Izu-Oshima earthquake in Japan is presented. This book fails to comprehensively summarize recent advances in earthquake prediction research.

  8. Rapid telemetry and earthquake early warning

    NASA Astrophysics Data System (ADS)

    Allen, R.; Bose, M.; Brown, H.; Cua, G.; Given, D.; Hauksson, E.; Heaton, T.; Hellweg, M.; Jordan, T.; Kireev, A.; Maechling, P.; Neuhauser, D.; Oppenheimer, D.; Solanki, K.; Zeleznik, M.

    2008-05-01

    The California Integrated Seismic Network (CISN) is currently testing algorithms for earthquake early warning on the realtime seismic systems in the state. An earthquake warning system rapidly detects the initiation of earthquakes and assesses the associated hazard. The goal is to provide warning of potentially damaging ground motion in a target region prior to the arrival of seismic waves. The network-based approach to early warning requires station data to be gathered at a central site for joint processing. ElarmS, one network-based approach being tested, currently runs 15 sec behind realtime in order to gather ~90% of station data before processing. Even with this delay the recent Mw 5.4 Alum Rock earthquake near San Jose was detected and an accurate hazard assessment was available before ground shaking in San Francisco. The Virtual Seismologist (VS) method, another network-based approach, is a Bayesian method that incorporates information such as network topology, previously observed seismicity, and the Gutenberg-Richter relationship in magnitude and location estimation. The VS method is currently being transitioned from off-line to real-time testing and will soon be running 15 sec behind real-time, as in the case of ElarmS. We are also testing an on-site warning approach, which is based on single-station observations. On-site systems can deliver earthquake information faster than regional systems, and the warning could possibly reach potential users at much closer epicentral distances before the damaging shaking starts. By definition, on-site systems do not require a central processing facility or delivery of data from a distant seismic station, but they are less robust than network-based systems and need fast and reliable telemetry to deliver warnings to local users. The range of possible warning times is typically seconds to tens of seconds and every second of data latency translates into an equal reduction in the available warning time. Minimal latency

  9. High precision earthquake source and wave properties of the Yellowstone volcanic-tectonic system using automated seismic waveform analysis

    NASA Astrophysics Data System (ADS)

    Farrell, J.; Smith, R. B.; Massin, F.; Husen, S.; Burlacu, R.; Koper, K. D.; Drobeck, D.

    2011-12-01

    The University of Utah operates the Yellowstone Seismograph Network (YSN) that over the past decade has been upgraded with USGS, NPS, NSF and University of Utah support, with a recent major upgrade using USGS ARRA support. The YSN consists of an upgraded digitally telemetered network of 26 seismic stations (3-component short period, broadband, and accelerometers deployed in different configurations) and 6 PBO borehole stations. Data recorded with the YSN are intensively used for earthquake research on the youthful Yellowstone caldera, its crustal magma reservoir, and related faults, and on their behavior during the dramatic episodes of Yellowstone's crustal deformation. Since 1984, more than 32,000 earthquakes have been located in the Yellowstone volcano/tectonic area (on average 1,000 to 3,000 events per year), with a preponderance of earthquakes occurring in distinct swarms. For monitoring purposes, hypocenters are determined using linearized earthquake location and a 1D P-wave velocity model. For research, 3D P- and S-velocity models and nonlinear probabilistic location algorithms provide more accurate locations, with consistent error estimates. We compare the 1D and 3D locations for the Yellowstone system from 1995-2010. Results show that hypocenters are most noticeably improved in depth, with the 3D nonlinear locations tending to be ~2 km deeper than the routine locations. The improved 3D hypocenter locations, combined with associated source mechanisms, are necessary to evaluate temporal and spatial variations of hypocenters with respect to volcanic features, as well as to evaluate wave-path characteristics and Q^-1. To reliably obtain 3D velocity models for Yellowstone, an automatic P- and S-wave picker is being implemented for all of the continuous Yellowstone seismic data. Using a reference dataset of ~100 earthquakes, the automatic picker is calibrated to reliably pick P- and S-wave arrivals with consistent errors. The entire catalog of ~30,000 earthquakes will be analyzed
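
    Automatic pickers of the kind being implemented are often built around an STA/LTA (short-term over long-term average) trigger; the sketch below shows that generic trigger, with window lengths and threshold as illustrative assumptions rather than the calibrated Yellowstone picker.

        # Minimal sketch of a classic STA/LTA trigger for automatic phase picking.
        # Window lengths and threshold are illustrative assumptions.
        import numpy as np

        def sta_lta_trigger(trace, dt, sta_win=0.5, lta_win=10.0, threshold=4.0):
            """Return the index of the first sample where STA/LTA exceeds threshold."""
            e = trace ** 2                                   # signal energy
            n_sta, n_lta = int(sta_win / dt), int(lta_win / dt)
            csum = np.concatenate(([0.0], np.cumsum(e)))
            for i in range(n_lta, len(trace) - n_sta):
                sta = (csum[i + n_sta] - csum[i]) / n_sta    # short-term average ahead of i
                lta = (csum[i] - csum[i - n_lta]) / n_lta    # long-term average behind i
                if lta > 0 and sta / lta > threshold:
                    return i                                 # candidate onset sample
            return None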

  10. Forecasting the Next Great San Francisco Earthquake

    NASA Astrophysics Data System (ADS)

    Rundle, P.; Rundle, J. B.; Turcotte, D. L.; Donnellan, A.; Yakovlev, G.; Tiampo, K. F.

    2005-12-01

    The great San Francisco earthquake of 18 April 1906 and its subsequent fires killed more than 3,000 persons and destroyed much of the city, leaving 225,000 out of 400,000 inhabitants homeless. The 1906 earthquake occurred on a segment of the San Andreas fault that runs from San Juan Bautista north to Cape Mendocino and is estimated to have had a moment magnitude of about 7.9. Surface displacements of up to several meters were observed across the fault. As we approach the 100-year anniversary of this event, a critical concern is the hazard posed by another such earthquake. In this talk we examine the assumptions presently used to compute the probability of occurrence of these earthquakes. We also present the results of a numerical simulation of interacting faults on the San Andreas system. Called Virtual California, this simulation can be used to compute the times, locations and magnitudes of simulated earthquakes on the San Andreas fault in the vicinity of San Francisco. Of particular importance are new results for the statistical distribution of interval times between great earthquakes, results that are difficult or impossible to obtain from a purely field-based approach. We find that our results are fit well under most circumstances by the Weibull statistical distribution, and we compute waiting times to future earthquakes based upon our simulation results. A contrasting approach to the same problem has been adopted by the Working Group on California Earthquake Probabilities, who use observational data combined with statistical assumptions to compute probabilities of future earthquakes.
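
    Once simulated inter-event times are fit by a Weibull distribution, waiting times can be expressed as conditional probabilities; the sketch below shows that standard calculation, with the shape and scale values being placeholders rather than the Virtual California fit.

        # Minimal sketch of a Weibull conditional waiting-time probability: the
        # probability of an event within the next dt years, given t years have
        # already elapsed. Shape and scale values are placeholders.
        import numpy as np

        def weibull_conditional_prob(t_elapsed, dt, shape, scale):
            """P(event before t_elapsed + dt | no event up to t_elapsed)."""
            survival = lambda t: np.exp(-(t / scale) ** shape)
            return 1.0 - survival(t_elapsed + dt) / survival(t_elapsed)

        # Example: shape 1.5, mean recurrence ~210 yr, 99 yr elapsed, next 30 yr
        print(weibull_conditional_prob(99.0, 30.0, shape=1.5, scale=230.0))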

  11. Exploring Earthquakes in Real-Time

    NASA Astrophysics Data System (ADS)

    Bravo, T. K.; Kafka, A. L.; Coleman, B.; Taber, J. J.

    2013-12-01

    Earthquakes capture the attention of students and inspire them to explore the Earth. Adding the ability to view and explore recordings of significant and newsworthy earthquakes in real-time makes the subject even more compelling. To address this opportunity, the Incorporated Research Institutions for Seismology (IRIS), in collaboration with Moravian College, developed 'jAmaSeis', a cross-platform application that enables students to access real-time earthquake waveform data. Students can watch as the seismic waves are recorded on their computer, and can be among the first to analyze the data from an earthquake. jAmaSeis facilitates student centered investigations of seismological concepts using either a low-cost educational seismograph or streamed data from other educational seismographs or from any seismic station that sends data to the IRIS Data Management System. After an earthquake, students can analyze the seismograms to determine characteristics of earthquakes such as time of occurrence, distance from the epicenter to the station, magnitude, and location. The software has been designed to provide graphical clues to guide students in the analysis and assist in their interpretations. Since jAmaSeis can simultaneously record up to three stations from anywhere on the planet, there are numerous opportunities for student driven investigations. For example, students can explore differences in the seismograms from different distances from an earthquake and compare waveforms from different azimuthal directions. Students can simultaneously monitor seismicity at a tectonic plate boundary and in the middle of the plate regardless of their school location. This can help students discover for themselves the ideas underlying seismic wave propagation, regional earthquake hazards, magnitude-frequency relationships, and the details of plate tectonics. The real-time nature of the data keeps the investigations dynamic, and offers students countless opportunities to explore.

  12. Catalog of Earthquake Hypocenters at Alaskan Volcanoes: January 1 through December 31, 2008

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.

    2009-01-01

    Between January 1 and December 31, 2008, the Alaska Volcano Observatory (AVO) located 7,097 earthquakes of which 5,318 occurred within 20 kilometers of the 33 volcanoes monitored by the AVO. Monitoring highlights in 2008 include the eruptions of Okmok Caldera, and Kasatochi Volcano, as well as increased unrest at Mount Veniaminof and Redoubt Volcano. This catalog includes descriptions of: (1) locations of seismic instrumentation deployed during 2008; (2) earthquake detection, recording, analysis, and data archival systems; (3) seismic velocity models used for earthquake locations; (4) a summary of earthquakes located in 2008; and (5) an accompanying UNIX tar-file with a summary of earthquake origin times, hypocenters, magnitudes, phase arrival times, location quality statistics, daily station usage statistics, and all files used to determine the earthquake locations in 2008.

  13. Microearthquake networks and earthquake prediction

    USGS Publications Warehouse

    Lee, W.H.K.; Steward, S. W.

    1979-01-01

    A microearthquake network is a group of highly sensitive seismographic stations designed primarily to record local earthquakes of magnitudes less than 3. Depending on the application, a microearthquake network will consist of several stations or as many as a few hundred. They are usually classified as either permanent or temporary. In a permanent network, the seismic signal from each station is telemetered to a central recording site to cut down on the operating costs and to allow more efficient and up-to-date processing of the data. However, telemetering can restrict the location sites because of the line-of-sight requirement for radio transmission or the need for telephone lines. Temporary networks are designed to be extremely portable and completely self-contained so that they can be very quickly deployed. They are most valuable for recording aftershocks of a major earthquake or for studies in remote areas.

  14. Modified-Fibonacci-Dual-Lucas method for earthquake prediction

    NASA Astrophysics Data System (ADS)

    Boucouvalas, A. C.; Gkasios, M.; Tselikas, N. T.; Drakatos, G.

    2015-06-01

    The FDL method makes use of Fibonacci, Dual and Lucas numbers and has shown considerable success in predicting earthquake events locally as well as globally. Predicting the location of the epicenter of an earthquake is one difficult challenge; the others are the timing and magnitude. One technique for predicting the onset of earthquakes is the use of cycles and the discovery of periodicity. Part of this category is the reported FDL method. The basis of the reported FDL method is the creation of FDL future dates based on the onset dates of significant earthquakes, the assumption being that each earthquake discontinuity that has occurred can be thought of as a generating source of an FDL time series. The connection between past earthquakes and future earthquakes based on FDL numbers has also been reported with sample earthquakes since 1900. Using clustering methods it has been shown that significant earthquakes (>6.5R) can be predicted within a very good accuracy window (+-1 day). In this contribution we present an improved modification of the FDL method, the MFDL method, which performs better than the FDL. We use the FDL numbers to develop possible earthquake dates, but with the important difference that the starting seed date is a trigger planetary aspect prior to the earthquake. Typical planetary aspects are Moon conjunct Sun, Moon opposite Sun, and Moon conjunct or opposite the North or South Nodes. In order to test the improvement of the method we used all 8+R earthquakes recorded since 1900 (86 earthquakes from USGS data). We developed the FDL numbers for each of those seeds and examined the earthquake hit rates (for a window of 3 days, i.e. +-1 day of the target date) for earthquakes >6.5R. The successes are counted for each one of the 86 earthquake seeds and we compare the MFDL method with the FDL method. In every case we find improvement when the starting seed date is on the planetary trigger date prior to the earthquake. We observe no improvement only when a planetary trigger coincided with
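
    The date-generation step described above can be sketched by adding Fibonacci- and Lucas-number day offsets to a seed date; the sketch below is only an illustration of that idea, and the exact series, limits, and rules used by the authors may differ.

        # Rough sketch of generating candidate dates from a seed date by adding
        # Fibonacci and Lucas day counts, as an illustration of the date-generation
        # step described above. The exact series and rules used by the authors may
        # differ; all choices below are assumptions.
        from datetime import date, timedelta

        def series(a, b, limit):
            out = [a, b]
            while out[-1] + out[-2] <= limit:
                out.append(out[-1] + out[-2])
            return out

        def fdl_candidate_dates(seed, max_days=1000):
            fib = series(1, 2, max_days)      # Fibonacci-type numbers
            luc = series(1, 3, max_days)      # Lucas numbers
            offsets = sorted(set(fib + luc))
            return [seed + timedelta(days=n) for n in offsets]

        # Example with a hypothetical seed date:
        print(fdl_candidate_dates(date(1994, 1, 17))[:8])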

  15. Did you feel it? : citizens contribute to earthquake science

    USGS Publications Warehouse

    Wald, David J.; Dewey, James W.

    2005-01-01

    Since the early 1990s, the magnitude and location of an earthquake have been available within minutes on the Internet. Now, as a result of work by the U.S. Geological Survey and with the cooperation of various regional seismic networks, people who experience an earthquake can go online and share information about its effects to help create a map of shaking intensities and damage. Such “Community Internet Intensity Maps” (CIIMs) contribute greatly toward the quick assessment of the scope of an earthquake emergency and provide valuable data for earthquake research.

  16. Highlights of the 13 March 1992 Erzincan (Turkey) earthquake

    USGS Publications Warehouse

    Celebi, Mehmet

    1992-01-01

    The March 13, 1992 Ms = 6.8 Erzincan earthquake in Turkey is highlighted here. The epicenter of this earthquake was located 7.7 km from the eastern end of the North Anatolian fault. The strong motions recorded in Erzincan had peak ground accelerations of approximately 0.5 g, accompanied by a pulse of 2 seconds. The duration of the earthquake was 7 seconds. This earthquake caused the collapse of about 150 buildings, mainly 4- to 5-story reinforced-concrete-framed buildings with infill walls. This damage, which is discussed, can be attributed to non-compliance with seismic codes.

  17. Association of earthquakes and faults in the San Francisco Bay area using Bayesian inference

    USGS Publications Warehouse

    Wesson, R.L.; Bakun, W.H.; Perkins, D.M.

    2003-01-01

    Bayesian inference provides a method to use seismic intensity data or instrumental locations, together with geologic and seismologic data, to make quantitative estimates of the probabilities that specific past earthquakes are associated with specific faults. Probability density functions are constructed for the location of each earthquake, and these are combined with prior probabilities through Bayes' theorem to estimate the probability that an earthquake is associated with a specific fault. Results using this method are presented here for large, preinstrumental, historical earthquakes and for recent earthquakes with instrumental locations in the San Francisco Bay region. The probabilities for individual earthquakes can be summed to construct a probabilistic frequency-magnitude relationship for a fault segment. Other applications of the technique include the estimation of the probability of background earthquakes, that is, earthquakes not associated with known or considered faults, and the estimation of the fraction of the total seismic moment associated with earthquakes less than the characteristic magnitude. Results for the San Francisco Bay region suggest that potentially damaging earthquakes with magnitudes less than the characteristic magnitudes should be expected. Comparisons of earthquake locations and the surface traces of active faults as determined from geologic data show significant disparities, indicating that a complete understanding of the relationship between earthquakes and faults remains elusive.
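
    The Bayes-theorem combination described above multiplies a spatial likelihood for each candidate fault by a prior probability and normalizes; the sketch below uses a simplified Gaussian distance likelihood and placeholder priors, not the paper's probability density construction.

        # Minimal sketch of Bayesian earthquake-fault association: combine spatial
        # likelihoods with prior probabilities and normalize. The Gaussian distance
        # model, sigma, and priors are simplified placeholder assumptions.
        import numpy as np

        def fault_association(distances_km, priors, sigma_km=10.0):
            """distances_km: event-to-fault distances; priors: prior probability of
            association with each candidate fault. Returns posterior probabilities."""
            likelihoods = np.exp(-0.5 * (np.asarray(distances_km) / sigma_km) ** 2)
            posterior = likelihoods * np.asarray(priors)
            return posterior / posterior.sum()

        # Example: three candidate faults at different distances from the event
        print(fault_association([5.0, 18.0, 40.0], priors=[0.4, 0.3, 0.3]))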

  18. The development of the International Network for Frontier Research on Earthquake Precursors (INFREP) by designing new analysing software and by setting up new recording locations of radio VLF/LF signals in Romania

    NASA Astrophysics Data System (ADS)

    Moldovan, Iren-Adelina; Petruta Constantin, Angela; Emilian Toader, Victorin; Toma-Danila, Dragos; Biagi, Pier Francesco; Maggipinto, Tommaso; Dolea, Paul; Septimiu Moldovan, Adrian

    2014-05-01

    Based on scientific evidence supporting the causality between earthquake preparatory stages, space weather and solar activity and different types of electromagnetic (EM) disturbances, together with the benefit of having full access to ground- and space-based EM data, INFREP proposes a complex and cross-correlated investigation of phenomena that occur in the coupled Lithosphere-Atmosphere-Ionosphere system in order to identify possible causes responsible for anomalous effects observed in the propagation characteristics of radio waves, especially at low (LF) and very low frequency (VLF). INFREP, a network of VLF (20-60 kHz) and LF (150-300 kHz) radio receivers, was put into operation in Europe in 2009, with the principal goal of studying the disturbances produced by earthquakes on the propagation properties of these signals. The Romanian NIEP VLF/LF monitoring system, consisting of a radio receiver -made by Elettronika S.R.L. (Italy) and provided by the Bari University- and the infrastructure necessary to record and transmit the collected data, is part of the international INFREP initiative. The NIEP VLF/LF receiver installed in Romania was put into operation in February 2009 in Bucharest and relocated to the Black Sea shore (Dobruja Seismologic Observatory) in December 2009. The first development of the Romanian EM monitoring system was needed because, after changing the receiving site from Bucharest to Eforie, we obtained unsatisfactory monitoring data, characterized by large fluctuations of the received signals' intensities. Trying to understand this behavior led to the conclusion that the electric component of the electromagnetic field was possibly influenced by local conditions. Starting from this observation we ran some tests and replaced the vertical antenna with a loop-type antenna that is more appropriate in highly electric-field polluted environments. Since the amount of recorded data is huge, for streamlining the research process

  19. Implications for prediction and hazard assessment from the 2004 Parkfield earthquake.

    PubMed

    Bakun, W H; Aagaard, B; Dost, B; Ellsworth, W L; Hardebeck, J L; Harris, R A; Ji, C; Johnston, M J S; Langbein, J; Lienkaemper, J J; Michael, A J; Murray, J R; Nadeau, R M; Reasenberg, P A; Reichle, M S; Roeloffs, E A; Shakal, A; Simpson, R W; Waldhauser, F

    2005-10-13

    Obtaining high-quality measurements close to a large earthquake is not easy: one has to be in the right place at the right time with the right instruments. Such a convergence happened, for the first time, when the 28 September 2004 Parkfield, California, earthquake occurred on the San Andreas fault in the middle of a dense network of instruments designed to record it. The resulting data reveal aspects of the earthquake process never before seen. Here we show what these data, when combined with data from earlier Parkfield earthquakes, tell us about earthquake physics and earthquake prediction. The 2004 Parkfield earthquake, with its lack of obvious precursors, demonstrates that reliable short-term earthquake prediction still is not achievable. To reduce the societal impact of earthquakes now, we should focus on developing the next generation of models that can provide better predictions of the strength and location of damaging ground shaking. PMID:16222291

  20. Next-Level ShakeZoning for Earthquake Hazard Definition in Nevada

    NASA Astrophysics Data System (ADS)

    Louie, J. N.; Savran, W. H.; Flinchum, B. A.; Dudley, C.; Prina, N.; Pullammanappallil, S.; Pancha, A.

    2011-12-01

    We are developing "Next-Level ShakeZoning" procedures tailored for defining earthquake hazards in Nevada. The current Federally sponsored tools- the USGS hazard maps and ShakeMap, and FEMA HAZUS- were developed as statistical summaries to match earthquake data from California, Japan, and Taiwan. The 2008 Wells and Mogul events in Nevada showed in particular that the generalized statistical approach taken by ShakeMap cannot match actual data on shaking from earthquakes in the Intermountain West, even to first order. Next-Level ShakeZoning relies on physics and geology to define earthquake shaking hazards, rather than statistics. It follows theoretical and computational developments made over the past 20 years, to capitalize on detailed and specific local data sets to more accurately model the propagation and amplification of earthquake waves through the multiple geologic basins of the Intermountain West. Excellent new data sets are now available for Las Vegas Valley. Clark County, Nevada has completed the nation's very first effort to map earthquake hazard class systematically through an entire urban area using Optim's SeisOpt° ReMi technique, which was adapted for large-scale data collection. Using the new Parcel Map in computing shaking in the Valley for scenario earthquakes is crucial for obtaining realistic predictions of ground motions. In an educational element of the project, a dozen undergraduate students have been computing 50 separate earthquake scenarios affecting Las Vegas Valley, using the Next-Level ShakeZoning process. Despite affecting only the upper 30 meters, the Vs30 geotechnical shear-velocity from the Parcel Map shows clear effects on 3-d shaking predictions computed so far at frequencies from 0.1 Hz up to 1.0 Hz. The effect of the Parcel Map on even the 0.1-Hz waves is prominent even with the large mismatch of wavelength to geotechnical depths. Amplifications and de-amplifications affected by the Parcel Map exceed a factor of two, and are

  1. Seismogenic tectonics of the Qian-Gorlos earthquake in Jilin Province, China

    NASA Astrophysics Data System (ADS)

    Shen, Jun; Shao, Bo; Yu, Xiao-hui; Yu, Yang; Qi, Gao; Deng, Mei; Zhang, Hanwen

    2016-04-01

    The Qian-Gorlos earthquake, which occurred in the Songliao basin in Jilin Province in 1119 AD, was the largest earthquake to occur in NE China before the 1975 Haicheng earthquake. Based on historical records and surface geological investigations, it has been suggested previously that the earthquake epicenter was in the Longkeng area. However, other workers have considered the epicenter to be in the Halamaodu area based on the landslides and faults found in this region. No seismogenic structure has yet been found in either of these two regions. We tried to detect active faults in the urban areas of Songyuan City, where the historical earthquake was probably located. One of the aims of this work was to clarify the seismogenic structure so that the seismic risk in the city could be more accurately evaluated. The area was investigated and analyzed using information from remote sensing and topographic surveys, seismic data from petroleum exploration, shallow seismic profiles, exploratory geological trenches on fault outcrops, and borehole data. The geophysical data did not reveal any evidence of faults cutting through Cretaceous or later strata under the Longkeng scarp, which has been suggested to be structural evidence of the Qian-Gorlos earthquake. The continuous fault surfaces on the back edge of terraces in the Halamaodu area stretch for >3.5 km and were probably formed by tectonic activity. However, results from shallow seismic profiles showed that the faults did not extend downward, with the corresponding deep structure being identified as a gentle kink band. A new reverse fault was found to the west of the two suggested epicenters, which presented as a curvilinear fault extending to the west, and was formed by two groups of NE- and NW-trending faults intersecting the Gudian fault. Three-dimensional seismic and shallow seismic data from petroleum exploration revealed its distinct spatial distribution and showed that the fault may cut through Late Quaternary strata

  2. The loma prieta, california, earthquake: an anticipated event.

    PubMed

    1990-01-19

    The first major earthquake on the San Andreas fault since 1906 fulfilled a long-term forecast for its rupture in the southern Santa Cruz Mountains. Severe damage occurred at distances of up to 100 kilometers from the epicenter in areas underlain by ground known to be hazardous in strong earthquakes. Stronger earthquakes will someday strike closer to urban centers in the United States, most of which also contain hazardous ground. The Loma Prieta earthquake demonstrated that meaningful predictions can be made of potential damage patterns and that, at least in well-studied areas, long-term forecasts can be made of future earthquake locations and magnitudes. Such forecasts can serve as a basis for action to reduce the threat major earthquakes pose to the United States. PMID:17735847

  3. Nonlinear acoustic/seismic waves in earthquake processes

    SciTech Connect

    Johnson, Paul A.

    2012-09-04

    Nonlinear dynamics induced by seismic sources and seismic waves are common in Earth. Observations include seismic strong ground motion (the most damaging aspect of earthquakes), intense near-source effects, and distant nonlinear effects from the source that have important consequences. The distant effects include dynamic earthquake triggering, one of the most fascinating topics in seismology today, which may be elastically nonlinearly driven. Dynamic earthquake triggering is the phenomenon whereby seismic waves generated by one earthquake trigger slip events on a nearby or distant fault. Dynamic triggering may take place at distances of thousands of kilometers from the triggering earthquake, and includes triggering of the entire spectrum of slip behaviors currently identified. These include triggered earthquakes and triggered slow, silent slip during which little seismic energy is radiated. It appears that the elasticity of the fault gouge, the granular material located between the fault blocks, is key to the triggering phenomenon.

  4. Earthquake Observation through Groundwater Monitoring in South Korea

    NASA Astrophysics Data System (ADS)

    Piao, J.; Woo, N. C.

    2014-12-01

    According to previous research, the influence of some earthquakes can be detected by groundwater monitoring; in some countries groundwater monitoring is even used as an important tool to identify earthquake precursors and prediction measures. Thus, in this study we attempt to catch the anomalous changes in groundwater produced by earthquakes that occurred in Korea, using the National Groundwater Monitoring Network (NGMN). To observe the earthquake impacts on groundwater more effectively, we selected 28 stations of the National Groundwater Monitoring Network located in the five earthquake-prone zones of South Korea, and we searched for responses to eight earthquakes with M ≥ 2.5 that occurred in the vicinity of these five earthquake-prone zones in 2012. So far, we have examined the groundwater monitoring data (water level, temperature and electrical conductivity), which had only been treated to remove barometric pressure changes. We found 29 anomalous changes, confirming that groundwater monitoring data can provide valuable information on earthquake effects. To identify the effect of an earthquake in the mixed water-level signal, other signals must be separated from the original data. Periodic signals will be separated from the original data using the Fast Fourier Transform (FFT), as sketched below. After that we will attempt to separate the precipitation effect and determine whether the anomalies were generated by earthquakes or not.
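
    The FFT-based separation step might look roughly like the following minimal sketch: build a synthetic hourly water-level series containing a tidal component and a hypothetical coseismic step, zero out a narrow band around the tidal frequency in the spectrum, and inverse-transform to obtain a residual. The sampling rate, tidal period and amplitudes are illustrative assumptions, not NGMN values.

    import numpy as np

    # Minimal sketch: remove the dominant periodic (tidal) component from an hourly
    # water-level series with an FFT, leaving a residual in which an earthquake-related
    # step is easier to see. All values below are synthetic.

    np.random.seed(0)
    dt_hours = 1.0
    t = np.arange(0, 90 * 24, dt_hours)                   # ~90 days of hourly samples
    tide = 0.05 * np.sin(2 * np.pi * t / 12.42)           # semidiurnal (M2-like) signal
    step = 0.08 * (t > 60 * 24)                           # hypothetical coseismic step
    noise = 0.01 * np.random.randn(t.size)
    level = 10.0 + tide + step + noise                    # water level (m)

    # Forward FFT of the demeaned series.
    spec = np.fft.rfft(level - level.mean())
    freqs = np.fft.rfftfreq(t.size, d=dt_hours)           # cycles per hour

    # Zero out a narrow band around the tidal frequency (1/12.42 cycles per hour).
    band = np.abs(freqs - 1.0 / 12.42) < 0.002
    spec[band] = 0.0

    residual = np.fft.irfft(spec, n=t.size) + level.mean()
    print("std before:", round(level.std(), 4), "std after:", round(residual.std(), 4))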

  5. A moment-tensor catalog for intermediate magnitude earthquakes in Mexico

    NASA Astrophysics Data System (ADS)

    Rodríguez Cardozo, Félix; Hjörleifsdóttir, Vala; Martínez-Peláez, Liliana; Franco, Sara; Iglesias Mendoza, Arturo

    2016-04-01

    Located among five tectonic plates, Mexico is one of the world's most seismically active regions. Earthquake focal mechanisms provide important information on the active tectonics. A widespread technique for estimating the earthquake magnitude and focal mechanism is the inversion for the moment tensor, obtained by minimizing a misfit function that estimates the difference between synthetic and observed seismograms. An important element in the estimation of the moment tensor is an appropriate velocity model, which allows for the calculation of accurate Green's functions so that the differences between observed and synthetic seismograms are due to the source of the earthquake rather than the velocity model. However, calculating accurate synthetic seismograms gets progressively more difficult as the magnitude of the earthquakes decreases. Large earthquakes (M>5.0) excite waves of longer periods that interact weakly with lateral heterogeneities in the crust. For these events, using 1D velocity models to compute Green's functions works well, and they are well characterized by the seismic moment tensors reported in global catalogs (e.g. USGS fast moment tensor solutions and GCMT). The opposite occurs for small and intermediate-sized events, where the relatively shorter periods excited interact strongly with lateral heterogeneities in the crust and upper mantle. Accurately modeling the Green's functions for the smaller events in a large heterogeneous area requires 3D or regionalized 1D models. To obtain a rapid estimate of earthquake magnitude, the National Seismological Survey in Mexico (Servicio Sismológico Nacional, SSN) automatically calculates seismic moment tensors for events in the Mexican territory (Franco et al., 2002; Nolasco-Carteño, 2006). However, for intermediate-magnitude and small earthquakes the signal-to-noise ratio can be low for many of the seismic stations, and without careful selection and filtering of the data, obtaining a stable focal mechanism
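
    For a point source with known Green's functions, the misfit minimization described above reduces to a linear least-squares problem. The sketch below illustrates that structure only; the Green's functions, noise level and "observed" data are random placeholders rather than SSN data.

    import numpy as np

    # Minimal sketch of a linear moment-tensor inversion: observed seismograms are
    # modelled as a linear combination of Green's-function seismograms, one per
    # independent moment-tensor component, and the weights are found by least squares.
    # The Green's functions and "observed" data here are random placeholders.

    np.random.seed(1)
    n_samples = 2000                    # concatenated samples from all stations/components
    n_mt = 6                            # independent moment-tensor components

    G = np.random.randn(n_samples, n_mt)                     # placeholder Green's functions
    m_true = np.array([1.0, -0.4, -0.6, 0.3, 0.1, -0.2])     # hypothetical source
    d_obs = G @ m_true + 0.05 * np.random.randn(n_samples)   # noisy "observations"

    # Least-squares solution minimizing ||G m - d||^2 (the misfit function).
    m_est, *_ = np.linalg.lstsq(G, d_obs, rcond=None)

    # Assemble the symmetric moment tensor and compute a scalar moment (arbitrary units).
    M = np.array([[m_est[0], m_est[3], m_est[4]],
                  [m_est[3], m_est[1], m_est[5]],
                  [m_est[4], m_est[5], m_est[2]]])
    M0 = np.sqrt(np.sum(M * M) / 2.0)
    print("recovered components:", np.round(m_est, 2), " scalar moment:", round(float(M0), 3))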

  6. Importance of macroseismic data from moderate local earthquakes for seismic microzoning effects distribution during the 2003 Bardo, Tunisia, earthquake

    NASA Astrophysics Data System (ADS)

    Kacem, J.; Hfaiedh, M.

    2012-04-01

    The area considered in this study is located in northern Tunisia. Being part of the western Mediterranean region, the geodynamic evolution of northern Tunisia is closely related to the convergence between the African and European tectonic plates. Numerous Quaternary fold, reverse and strike-slip faults and historical earthquakes indicate that the seismic hazard of Tunisia is considerable and that a better strategy for seismic risk evaluation needs to be developed. In fact, recent Quaternary activity in Tunisia has been proved and described by numerous authors. This activity sometimes affects Holocene to historic deposits. In particular, evidence of damage can be seen in several sites where constructions dating back to the Roman epoch have been affected. The large number of sites showing Holocene to historic tectonic deformations cannot be explained by the relatively weak magnitude (M < 5) that characterizes the seismicity of Tunisia. These results suggest that Tunisia is characterized either by relatively important seismicity during the recent Quaternary period or by very shallow seismicity. The second hypothesis is supported by recent macroseismic data, in which surface effects are observed in many examples of moderate earthquakes. To verify the results of seismic microzoning and to improve techniques, the macroseismic data of past strongly felt earthquakes are an important reference. The macroseismic and accelerometric data of the 2003 Bardo, Tunisia, earthquake in the epicentral region were collected and compiled to produce the most reliable and detailed isoseismal map. The area enclosed in the isoseismal of degree IV EMS is not symmetric with respect to the isoseismal of higher degree (V EMS). From this point of view, we can affirm that the attenuation was stronger on the western side than on the eastern one. Moreover, due to very local site effects, we found sporadic small areas with intensity up to degree IV EMS randomly distributed

  7. 1/f and the Earthquake Problem: Scaling constraints that facilitate operational earthquake forecasting

    NASA Astrophysics Data System (ADS)

    yoder, M. R.; Rundle, J. B.; Turcotte, D. L.

    2012-12-01

    The difficulty of forecasting earthquakes can fundamentally be attributed to the self-similar, or "1/f", nature of seismic sequences. Specifically, the rate of occurrence of earthquakes is inversely proportional to their magnitude m, or more accurately to their scalar moment M. With respect to this "1/f problem," it can be argued that catalog selection (or equivalently, determining catalog constraints) constitutes the most significant challenge to seismicity based earthquake forecasting. Here, we address and introduce a potential solution to this most daunting problem. Specifically, we introduce a framework to constrain, or partition, an earthquake catalog (a study region) in order to resolve local seismicity. In particular, we combine Gutenberg-Richter (GR), rupture length, and Omori scaling with various empirical measurements to relate the size (spatial and temporal extents) of a study area (or bins within a study area) to the local earthquake magnitude potential - the magnitude of earthquake the region is expected to experience. From this, we introduce a new type of time dependent hazard map for which the tuning parameter space is nearly fully constrained. In a similar fashion, by combining various scaling relations and also by incorporating finite extents (rupture length, area, and duration) as constraints, we develop a method to estimate the Omori (temporal) and spatial aftershock decay parameters as a function of the parent earthquake's magnitude m. From this formulation, we develop an ETAS type model that overcomes many point-source limitations of contemporary ETAS. These models demonstrate promise with respect to earthquake forecasting applications. Moreover, the methods employed suggest a general framework whereby earthquake and other complex-system, 1/f type, problems can be constrained from scaling relations and finite extents. [Figure caption: Record-breaking hazard map of southern California, 2012-08-06; "warm" colors indicate local acceleration (elevated hazard).]

  8. 1/f and the Earthquake Problem: Scaling constraints to facilitate operational earthquake forecasting

    NASA Astrophysics Data System (ADS)

    Yoder, M. R.; Rundle, J. B.; Glasscoe, M. T.

    2013-12-01

    The difficulty of forecasting earthquakes can fundamentally be attributed to the self-similar, or '1/f', nature of seismic sequences. Specifically, the rate of occurrence of earthquakes is inversely proportional to their magnitude m, or more accurately to their scalar moment M. With respect to this '1/f problem,' it can be argued that catalog selection (or equivalently, determining catalog constraints) constitutes the most significant challenge to seismicity based earthquake forecasting. Here, we address and introduce a potential solution to this most daunting problem. Specifically, we introduce a framework to constrain, or partition, an earthquake catalog (a study region) in order to resolve local seismicity. In particular, we combine Gutenberg-Richter (GR), rupture length, and Omori scaling with various empirical measurements to relate the size (spatial and temporal extents) of a study area (or bins within a study area), in combination with a metric to quantify rate trends in local seismicity, to the local earthquake magnitude potential - the magnitudes of earthquakes the region is expected to experience. From this, we introduce a new type of time dependent hazard map for which the tuning parameter space is nearly fully constrained. In a similar fashion, by combining various scaling relations and also by incorporating finite extents (rupture length, area, and duration) as constraints, we develop a method to estimate the Omori (temporal) and spatial aftershock decay parameters as a function of the parent earthquake's magnitude m. From this formulation, we develop an ETAS type model that overcomes many point-source limitations of contemporary ETAS. These models demonstrate promise with respect to earthquake forecasting applications. Moreover, the methods employed suggest a general framework whereby earthquake and other complex-system, 1/f type, problems can be constrained from scaling relations and finite extents.
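
    One piece of the scaling machinery described in this and the preceding abstract can be illustrated compactly: estimating the Gutenberg-Richter b-value with the Aki (1965) maximum-likelihood formula and converting an event count into the largest magnitude such a sample would be expected to contain. The catalog below is synthetic and assumed complete above m_min; it is only a sketch of the GR part of the framework.

    import numpy as np

    # Minimal sketch: estimate the Gutenberg-Richter b-value with the Aki (1965)
    # maximum-likelihood formula, then translate the sample size into the largest
    # magnitude such a region/time window would be expected to contain under GR
    # statistics. The catalog is synthetic and complete above m_min.

    np.random.seed(2)
    m_min, b_true, n_events = 2.0, 1.0, 500
    # GR magnitudes above m_min follow an exponential distribution with rate b*ln(10).
    mags = m_min + np.random.exponential(scale=1.0 / (b_true * np.log(10)), size=n_events)

    # Aki maximum-likelihood b-value estimate.
    b_hat = np.log10(np.e) / (mags.mean() - m_min)

    # Under GR, N(>=m) = n_events * 10**(-b*(m - m_min)); setting N = 1 gives the
    # magnitude the sampled region is "expected" to reach (its magnitude potential).
    m_potential = m_min + np.log10(n_events) / b_hat
    print(f"b-value ~ {b_hat:.2f}, expected largest magnitude ~ {m_potential:.2f}")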

  9. 3D Dynamic Earthquake Fracture Simulation (Test Case)

    NASA Astrophysics Data System (ADS)

    Korkusuz Öztürk, Yasemin; Meral Özel, Nurcan; Ando, Ryosuke

    2016-04-01

    A 3D dynamic earthquake fracture simulation is being developed for non-planar fault structures in order to understand the heterogeneous stress states in the Marmara Sea. A large earthquake is expected in the center of the Sea of Marmara, which lies in a seismic gap. Given that İstanbul, with more than 14 million inhabitants, is located very close to the Marmara Sea, the importance of analyzing the central Marmara Sea is extremely high. A few 3D dynamic earthquake fracture studies have already been done in the Sea of Marmara for pure right-lateral strike-slip stress regimes (Oglesby and Mai, 2012; Aochi and Ulrich, 2015). In this study, a 3D dynamic earthquake fracture model with heterogeneous stress patches from TPV5, a SCEC code validation case, is adapted. In this test model, the fault and the ground surfaces are gridded by a scalene triangulation technique using the GMSH program. For a grid size varying between 0.616 km and 1.050 km, the number of elements is 1984 for the fault surface and 1216 for the ground surface. When these results are compared with Kaneko's results for TPV5 from SPECFEM3D, reliable findings are observed for the first 6.5 seconds (stations on the fault), although a stability problem is encountered after this time threshold. To solve this problem the grid sizes are made smaller, so the number of elements increases to 7986 for the fault surface and 4867 for the ground surface. On the other hand, computational problems arise in that case, since the computation time is directly proportional to the total number of elements and the required memory increases with its square. Therefore, it is expected that this method can be adapted to finer grids, the main difficulty being the need for an effective supercomputer and run-time limitations. The main objective of this research is to obtain 3D dynamic earthquake rupture scenarios, concerning not only planar and non-planar faults but also

  10. Earthquake detection through computationally efficient similarity search

    PubMed Central

    Yoon, Clara E.; O’Reilly, Ossian; Bergen, Karianne J.; Beroza, Gregory C.

    2015-01-01

    Seismology is experiencing rapid growth in the quantity of data, which has outpaced the development of processing algorithms. Earthquake detection—identification of seismic events in continuous data—is a fundamental operation for observational seismology. We developed an efficient method to detect earthquakes using waveform similarity that overcomes the disadvantages of existing detection methods. Our method, called Fingerprint And Similarity Thresholding (FAST), can analyze a week of continuous seismic waveform data in less than 2 hours, or 140 times faster than autocorrelation. FAST adapts a data mining algorithm, originally designed to identify similar audio clips within large databases; it first creates compact “fingerprints” of waveforms by extracting key discriminative features, then groups similar fingerprints together within a database to facilitate fast, scalable search for similar fingerprint pairs, and finally generates a list of earthquake detections. FAST detected most (21 of 24) cataloged earthquakes and 68 uncataloged earthquakes in 1 week of continuous data from a station located near the Calaveras Fault in central California, achieving detection performance comparable to that of autocorrelation, with some additional false detections. FAST is expected to realize its full potential when applied to extremely long duration data sets over a distributed network of seismic stations. The widespread application of FAST has the potential to aid in the discovery of unexpected seismic signals, improve seismic monitoring, and promote a greater understanding of a variety of earthquake processes. PMID:26665176
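
    The grouping step of the fingerprint-and-hash idea can be sketched with a simplified bit-sampling locality-sensitive hash. This is an illustration of the general idea only, not the published FAST implementation; the binary fingerprints below are synthetic stand-ins for features extracted from waveform windows.

    import numpy as np
    from collections import defaultdict

    # Simplified bit-sampling locality-sensitive hashing over binary fingerprints.
    # Near-duplicate fingerprints collide in at least one hash table with high
    # probability, so candidate pairs are found without an all-pairs comparison.

    rng = np.random.default_rng(0)
    n_windows, n_bits = 2000, 256
    fingerprints = rng.integers(0, 2, size=(n_windows, n_bits), dtype=np.uint8)

    # Plant three nearly identical fingerprints (a repeating earthquake signal).
    base = rng.integers(0, 2, size=n_bits, dtype=np.uint8)
    for idx in (100, 900, 1500):
        noisy = base.copy()
        noisy[rng.choice(n_bits, size=5, replace=False)] ^= 1    # a few noisy bits
        fingerprints[idx] = noisy

    n_tables, bits_per_table = 20, 16
    subsets = [rng.choice(n_bits, size=bits_per_table, replace=False)
               for _ in range(n_tables)]

    candidates = set()
    for bit_idx in subsets:
        buckets = defaultdict(list)
        for i, fp in enumerate(fingerprints):
            buckets[fp[bit_idx].tobytes()].append(i)
        for members in buckets.values():
            candidates.update((a, b) for a in members for b in members if a < b)

    # Verify candidate pairs with the exact Hamming distance.
    matches = [(a, b) for a, b in candidates
               if np.count_nonzero(fingerprints[a] != fingerprints[b]) <= 10]
    print(sorted(matches))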

  11. Earthquake detection through computationally efficient similarity search.

    PubMed

    Yoon, Clara E; O'Reilly, Ossian; Bergen, Karianne J; Beroza, Gregory C

    2015-12-01

    Seismology is experiencing rapid growth in the quantity of data, which has outpaced the development of processing algorithms. Earthquake detection-identification of seismic events in continuous data-is a fundamental operation for observational seismology. We developed an efficient method to detect earthquakes using waveform similarity that overcomes the disadvantages of existing detection methods. Our method, called Fingerprint And Similarity Thresholding (FAST), can analyze a week of continuous seismic waveform data in less than 2 hours, or 140 times faster than autocorrelation. FAST adapts a data mining algorithm, originally designed to identify similar audio clips within large databases; it first creates compact "fingerprints" of waveforms by extracting key discriminative features, then groups similar fingerprints together within a database to facilitate fast, scalable search for similar fingerprint pairs, and finally generates a list of earthquake detections. FAST detected most (21 of 24) cataloged earthquakes and 68 uncataloged earthquakes in 1 week of continuous data from a station located near the Calaveras Fault in central California, achieving detection performance comparable to that of autocorrelation, with some additional false detections. FAST is expected to realize its full potential when applied to extremely long duration data sets over a distributed network of seismic stations. The widespread application of FAST has the potential to aid in the discovery of unexpected seismic signals, improve seismic monitoring, and promote a greater understanding of a variety of earthquake processes. PMID:26665176

  12. NCEER seminars on earthquakes

    USGS Publications Warehouse

    Pantelic, J.

    1987-01-01

    In May of 1986, the National Center for Earthquake Engineering Research (NCEER) in Buffalo, New York, held the first seminar in its new monthly forum called Seminars on Earthquakes. The Center's purpose in initiating the seminars was to educate the audience about earthquakes, to facilitate cooperation between the NCEER and visiting researchers, and to enable visiting speakers to learn more about the NCEER   

  13. Earthquake at 40 feet

    USGS Publications Warehouse

    Miller, G. J.

    1976-01-01

    The earthquake that struck the island of Guam on November 1, 1975, at 11:17 a.m. had many unique aspects, not the least of which was the experience of an earthquake of 6.25 Richter magnitude while at 40 feet. My wife Bonnie, a fellow diver, Greg Guzman, and I were diving at Gabgab Beach in the outer harbor of Apra Harbor, engaged in underwater photography when the earthquake struck.

  14. Relocations and 3-D Velocity Structure for Aftershocks of the 2000 W. Tottori (Japan) Earthquake and 2001 Gujarat (India) Earthquake, Using Waveform Cross-correlations

    NASA Astrophysics Data System (ADS)

    Enescu, B.; Mori, J.

    2004-12-01

    The newly developed double-difference tomography method (Zhang and Thurber, 2003) makes use of both absolute and relative arrival times to produce an improved velocity model and highly accurate hypocenter locations. Using this technique, we relocate the aftershocks of the 2000 Western Tottori earthquake (Mw 6.7) and the 2001 Gujarat (Mw 7.7) earthquake and obtain a 3D velocity model of the aftershock region. The first data set consists of 1035 aftershocks recorded at 62 stations during a period of about a month following the mainshock (Shibutani et al., 2002). In order to get the best arrival times, a cross-correlation analysis was used to align the waveforms. The epicentral distribution of the relocated events reveals clear earthquake lineations, some of them close to the mainshock, and increased clustering. The aftershocks' depth distribution shows a mean shift of the hypocenters' centroid of about 580 m; a clear upper cutoff of the seismic activity and some clustering can also be seen. The final P-wave velocity model shows higher-value anomalies in the vicinity of the mainshock's hypocenter, in good agreement with the results of Shibutani et al. (2004). The second data set consists of about 1300 earthquakes, recorded during one week of observations by a Japanese-Indian research team in the aftershock region of the Gujarat earthquake (Sato et al., 2001). Using the double-difference algorithm and waveform cross-correlations, we were able to identify a clearer alignment of hypocenters that defines the mainshock's fault and an area of relatively few aftershocks in the region of the mainshock's hypocenter. Both studies demonstrate that cross-correlation techniques applied to events with inter-event distances as large as 10 km and cross-correlation coefficients as low as 50% can produce more accurate locations than those determined from catalog phase data. We are going to discuss briefly the critical role of frequency filtering and of the time window used for cross
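
    The waveform cross-correlation alignment mentioned above can be illustrated with a minimal sketch: the lag at which the cross-correlation of two similar traces peaks gives their differential arrival time. The synthetic wavelet, noise level and shift below are illustrative, not data from either aftershock sequence.

    import numpy as np

    # Minimal sketch: measure the differential arrival time between two similar
    # waveforms from the lag of their cross-correlation maximum, as used to refine
    # the picks that feed double-difference relocation. Waveforms are synthetic.

    fs = 100.0                                    # sampling rate (Hz)
    t = np.arange(0, 5, 1 / fs)
    wavelet = np.exp(-((t - 2.0) ** 2) / 0.01) * np.sin(2 * np.pi * 8 * (t - 2.0))

    rng = np.random.default_rng(4)
    trace_a = wavelet + 0.05 * rng.standard_normal(t.size)
    true_shift = 0.07                             # seconds; event B arrives later
    trace_b = np.roll(wavelet, int(round(true_shift * fs))) \
              + 0.05 * rng.standard_normal(t.size)

    # Full cross-correlation; the lag of its maximum is the differential time.
    cc = np.correlate(trace_b, trace_a, mode="full")
    lags = np.arange(-t.size + 1, t.size)         # lag in samples
    dt = lags[np.argmax(cc)] / fs
    print(f"measured differential time: {dt:.2f} s (true {true_shift:.2f} s)")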

  15. Real-time earthquake monitoring using a search engine method

    NASA Astrophysics Data System (ADS)

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-12-01

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake’s parameters in <1 s after receiving the long-period surface wave data.
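
    A minimal sketch of the database-matching idea follows: compare an input long-period waveform against a bank of precomputed synthetics, each tagged with source parameters, and return the parameters of the best-fitting entry. A real search engine indexes the database so the lookup is far faster than this linear scan; the bank entries and parameter tags here are random placeholders.

    import numpy as np

    # Minimal sketch of a waveform "search engine": scan a bank of precomputed,
    # parameter-tagged synthetics for the entry that best fits an input record.
    # The bank, tags and input are random placeholders.

    rng = np.random.default_rng(5)
    n_entries, n_samples = 5000, 512

    bank = rng.standard_normal((n_entries, n_samples))
    bank /= np.linalg.norm(bank, axis=1, keepdims=True)          # unit-norm traces
    tags = [{"depth_km": float(rng.uniform(5, 40)),
             "strike_deg": float(rng.uniform(0, 360))} for _ in range(n_entries)]

    # Incoming record: a noisy copy of bank entry 1234.
    observed = bank[1234] + 0.02 * rng.standard_normal(n_samples)
    observed /= np.linalg.norm(observed)

    # Best match = maximum correlation coefficient (equivalently, minimum L2
    # misfit for unit-norm traces); its tags are the reported source parameters.
    scores = bank @ observed
    best = int(np.argmax(scores))
    print(best, round(float(scores[best]), 3), tags[best])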

  16. Real-time earthquake monitoring using a search engine method

    PubMed Central

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-01-01

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake’s parameters in <1 s after receiving the long-period surface wave data. PMID:25472861

  17. Seismic databases and earthquake catalogue of the Caucasus

    NASA Astrophysics Data System (ADS)

    Godoladze, Tea; Javakhishvili, Zurab; Tvaradze, Nino; Tumanova, Nino; Jorjiashvili, Nato; Gok, Rengen

    2016-04-01

    The Caucasus has a documented historical catalog stretching back to the beginning of the Christian era. Most of the largest historical earthquakes prior to the 19th century are assumed to have occurred on active faults of the Greater Caucasus. Important earthquakes include the Samtskhe earthquake of 1283, Ms~7.0, Io=9; the Lechkhumi-Svaneti earthquake of 1350, Ms~7.0, Io=9; and the Alaverdi earthquake of 1742, Ms~6.8, Io=9. Two significant historical earthquakes that may have occurred within the Javakheti plateau in the Lesser Caucasus are the Tmogvi earthquake of 1088, Ms~6.5, Io=9 and the Akhalkalaki earthquake of 1899, Ms~6.3, Io=8-9. Large earthquakes that occurred in the Caucasus within the period of instrumental observation are: Gori 1920; Tabatskuri 1940; Chkhalta 1963; the 1991 Ms=7.0 Racha earthquake, the largest event ever recorded in the region; the 1992 M=6.5 Barisakho earthquake; and the 1988 Ms=6.9 Spitak, Armenia earthquake (100 km south of Tbilisi), which killed over 50,000 people in Armenia. Recently, permanent broadband stations have been deployed across the region as part of various national networks (Georgia (~25 stations), Azerbaijan (~35 stations), Armenia (~14 stations)). The data from the last 10 years of observation provide an opportunity to perform modern, fundamental scientific investigations. A catalog of all instrumentally recorded earthquakes has been compiled by the IES (Institute of Earth Sciences, Ilia State University). The catalog consists of more than 80,000 events. Together with our colleagues from Armenia, Azerbaijan and Turkey, we compiled the database of Caucasus seismic events. We tried to improve the locations of the events and to calculate moment magnitudes for events above magnitude 4 in order to obtain a unified magnitude catalogue of the region. The results will serve as the input for the seismic hazard assessment of the region.

  18. Induced earthquake magnitudes are as large as (statistically) expected

    NASA Astrophysics Data System (ADS)

    van der Elst, N.; Page, M. T.; Weiser, D. A.; Goebel, T.; Hosseini, S. M.

    2015-12-01

    Key questions with implications for seismic hazard and industry practice are how large injection-induced earthquakes can be, and whether their maximum size is smaller than for similarly located tectonic earthquakes. Deterministic limits on induced earthquake magnitudes have been proposed based on the size of the reservoir or the volume of fluid injected. McGarr (JGR 2014) showed that for earthquakes confined to the reservoir and triggered by pore-pressure increase, the maximum moment should be limited to the product of the shear modulus G and total injected volume ΔV. However, if induced earthquakes occur on tectonic faults oriented favorably with respect to the tectonic stress field, then they may be limited only by the regional tectonics and connectivity of the fault network, with an absolute maximum magnitude that is notoriously difficult to constrain. A common approach for tectonic earthquakes is to use the magnitude-frequency distribution of smaller earthquakes to forecast the largest earthquake expected in some time period. In this study, we show that the largest magnitudes observed at fluid injection sites are consistent with the sampling statistics of the Gutenberg-Richter (GR) distribution for tectonic earthquakes, with no assumption of an intrinsic upper bound. The GR law implies that the largest observed earthquake in a sample should scale with the log of the total number induced. We find that the maximum magnitudes at most sites are consistent with this scaling, and that maximum magnitude increases with log ΔV. We find little in the size distribution to distinguish induced from tectonic earthquakes. That being said, the probabilistic estimate exceeds the deterministic GΔV cap only for expected magnitudes larger than ~M6, making a definitive test of the models unlikely in the near future. In the meantime, however, it may be prudent to treat the hazard from induced earthquakes with the same probabilistic machinery used for tectonic earthquakes.
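
    The sampling argument above can be made concrete with a short simulation: if induced magnitudes follow an unbounded Gutenberg-Richter distribution, the largest observed magnitude grows with the log of the number of events, and can be compared with a McGarr-style cap M0 <= G*ΔV. The b-value, shear modulus and injected volume below are illustrative assumptions, not values from the study.

    import numpy as np

    # Minimal sketch: draw magnitudes from an unbounded Gutenberg-Richter (GR)
    # distribution and watch the sample maximum grow with log N, then compare with
    # a McGarr-style deterministic cap M0_max = G * dV.

    rng = np.random.default_rng(6)
    b, m_min = 1.0, 0.0

    for n in (100, 1_000, 10_000, 100_000):
        mags = m_min + rng.exponential(scale=1.0 / (b * np.log(10)), size=n)
        print(f"N = {n:>7d}  GR expectation ~ {m_min + np.log10(n) / b:.2f}  "
              f"sampled max = {mags.max():.2f}")

    # McGarr-style cap: seismic moment <= shear modulus * injected volume.
    G = 3.0e10              # shear modulus (Pa), assumed typical value
    dV = 1.0e5              # injected volume (m^3), illustrative
    M0_max = G * dV         # N*m
    Mw_cap = (2.0 / 3.0) * (np.log10(M0_max) - 9.1)
    print(f"McGarr cap for dV = {dV:.0e} m^3: Mw <= {Mw_cap:.1f}")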

  19. Earthquakes and Plate Boundaries

    ERIC Educational Resources Information Center

    Lowman, Paul; And Others

    1978-01-01

    Contains the contents of the Student Investigation booklet of a Crustal Evolution Education Project (CEEP) instructional modules on earthquakes. Includes objectives, procedures, illustrations, worksheets, and summary questions. (MA)

  20. Are Earthquake Magnitudes Clustered?

    SciTech Connect

    Davidsen, Joern; Green, Adam

    2011-03-11

    The question of earthquake predictability is a long-standing and important challenge. Recent results [Phys. Rev. Lett. 98, 098501 (2007); ibid.100, 038501 (2008)] have suggested that earthquake magnitudes are clustered, thus indicating that they are not independent in contrast to what is typically assumed. Here, we present evidence that the observed magnitude correlations are to a large extent, if not entirely, an artifact due to the incompleteness of earthquake catalogs and the well-known modified Omori law. The latter leads to variations in the frequency-magnitude distribution if the distribution is constrained to those earthquakes that are close in space and time to the directly following event.

  1. Ant Colony Optimization detects anomalous aerosol variations associated with the Chile earthquake of 27 February 2010

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2015-04-01

    This study attempts to identify AOD (Aerosol Optical Depth) seismo-atmospheric anomalies around the time of the Chile earthquake of 27 February 2010. Since the AOD precursor alone might not be useful as an accurate, stand-alone criterion for detecting earthquake anomalies, it is more appropriate to integrate a variety of other precursors to reduce the uncertainty of potential detected seismic anomalies. To achieve this aim, eight other precursors, including GPS-TEC (Total Electron Content); H+, He+, O+ densities (cm-3) and total ion density (cm-3) from the IAP experiment; electron density (cm-3) and electron temperature (K) from the ISL experiment; and the VLF electric field from the ICE experiment, have been surveyed to detect unusual variations around the time and location of the Chile earthquake. Moreover, three methods, namely the interquartile method, ANN (Artificial Neural Network) and ACO (Ant Colony Optimization), have been implemented to observe discordant patterns in the time series of the AOD precursor. All of the methods indicate a clear abnormal increase in the AOD time series 2 days prior to the event. A striking anomaly is also observed in the TEC time series 6 days preceding the earthquake. Analysis of the ICE data reveals a prominent anomaly in the VLF electric field measurement 1 day before the earthquake. The time series of H+, He+, O+ densities (cm-3) and total ion density (cm-3) from IAP, and of electron density (cm-3) and electron temperature (K) from ISL, show abnormal behavior 3 days before the event. It should be noted that interpreting the different lead times in the outcomes of the implemented precursors depends strictly on a proper understanding of the Lithosphere-Atmosphere-Ionosphere (LAI) coupling mechanism during seismic activity. These different anomaly dates among the LAI precursors can thus be taken as a hint of the validity of multi-precursor analysis. The interquartile approach is sketched below.
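
    A minimal sketch of the interquartile method referred to above: values that fall outside the band [Q1 - k*IQR, Q3 + k*IQR] built from the time series are flagged as anomalies. The synthetic AOD series, the anomalous day and the factor k = 1.5 are illustrative choices, not the study's data.

    import numpy as np

    # Minimal sketch of interquartile anomaly detection: flag values outside
    # [Q1 - k*IQR, Q3 + k*IQR]. The synthetic series and anomaly are illustrative.

    rng = np.random.default_rng(7)
    aod = 0.2 + 0.05 * rng.standard_normal(60)     # 60 days of AOD values
    aod[45] += 0.35                                # hypothetical pre-seismic anomaly

    q1, q3 = np.percentile(aod, [25, 75])
    iqr = q3 - q1
    k = 1.5
    lower, upper = q1 - k * iqr, q3 + k * iqr

    anomalous_days = np.where((aod < lower) | (aod > upper))[0]
    print("anomalous day indices:", anomalous_days)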

  2. Missing Great Earthquakes

    NASA Astrophysics Data System (ADS)

    Hough, S. E.; Martin, S.

    2013-12-01

    The occurrence of three earthquakes with Mw greater than 8.8, and six earthquakes larger than Mw8.5, since 2004 has raised interest in the long-term rate of great earthquakes. Past studies have focused on rates since 1900, which roughly marks the start of the instrumental era. Yet substantial information is available for earthquakes prior to 1900. A re-examination of the catalog of global historical earthquakes reveals a paucity of Mw ≥ 8.5 events during the 18th and 19th centuries compared to the rate during the instrumental era (Hough, 2013, JGR), suggesting that the magnitudes of some documented historical earthquakes have been underestimated, with approximately half of all Mw≥8.5 earthquakes missing or underestimated in the 19th century. Very large (Mw≥8.5) magnitudes have traditionally been estimated for historical earthquakes only from tsunami observations given a tautological assumption that all such earthquakes generate significant tsunamis. Magnitudes would therefore tend to be underestimated for deep megathrust earthquakes that generated relatively small tsunamis, deep earthquakes within continental collision zones, earthquakes that produced tsunamis that were not documented, outer rise events, and strike-slip earthquakes such as the 11 April 2012 Sumatra event. We further show that, where magnitudes of historical earthquakes are estimated from earthquake intensities using the Bakun and Wentworth (1997, BSSA) method, magnitudes of great earthquakes can be significantly underestimated. Candidate 'missing' great 19th century earthquakes include the 1843 Lesser Antilles earthquake, which recent studies suggest was significantly larger than initial estimates (Feuillet et al., 2012, JGR; Hough, 2013), and an 1841 Kamchatka event, for which Mw9 was estimated by Gusev and Shumilina (2004, Izv. Phys. Solid Ear.). We consider cumulative moment release rates during the 19th century compared to that during the 20th and 21st centuries, using both the Hough

  3. Decision making biases in the communication of earthquake risk

    NASA Astrophysics Data System (ADS)

    Welsh, M. B.; Steacy, S.; Begg, S. H.; Navarro, D. J.

    2015-12-01

    The L'Aquila trial, with 6 scientists convicted of manslaughter, shocked the scientific community, leading to urgent re-appraisal of communication methods for low-probability, high-impact events. Before the trial, a commission investigating the earthquake recommended that risk assessment be formalised via operational earthquake forecasts and that social scientists be enlisted to assist in developing communication strategies. Psychological research has identified numerous decision biases relevant to this, including hindsight bias, where people (after the fact) overestimate an event's predictability. This affects experts as well as naïve participants, as it relates to their ability to construct a plausible causal story rather than the likelihood of the event. Another problem is availability, which causes overestimation of the likelihood of observed rare events due to their greater noteworthiness. This, however, is complicated by the 'description-experience' gap, whereby people underestimate probabilities for events they have not experienced. That is, people who have experienced strong earthquakes judge them more likely while those who have not judge them less likely, relative to actual probabilities. Finally, format changes alter people's decisions: people treat '1 in 10,000' as different from 0.01% despite their mathematical equivalence. Such effects fall under the broad term framing, which describes how different framings of the same event alter decisions. In particular, people's attitude to risk depends significantly on how scenarios are described. We examine the effect of biases on the communication of change in risk. South Australian participants gave responses to scenarios describing familiar (bushfire) or unfamiliar (earthquake) risks. While bushfires are rare in specific locations, significant fire events occur each year and are extensively covered. By comparison, our study location (Adelaide) last had a M5 quake in 1954. Preliminary results suggest the description

  4. P and S automatic picks for 3D earthquake tomography in NE Italy

    NASA Astrophysics Data System (ADS)

    Lovisa, L.; Bragato, P.; Gentili, S.

    2006-12-01

    Earthquake tomography is useful for studying structural and geological features of the crust. In particular, it uses P and S arrival times to reconstruct wave velocity fields and to locate earthquake hypocenters. However, tomography requires a large effort to provide a high number of manual picks. On the other side, many automatic picking methods have been proposed, but they are usually applied to preliminary elaboration of the data (fast alert and automatic bulletin generation); they are generally considered not reliable enough for tomography. In this work, we present and discuss the results of Vp, Vs and Vp/Vs tomographies obtained using automatic picks generated by the system TAPNEI (Gentili and Bragato, 2006), applied in NE Italy. Preliminarily, in order to estimate the error in comparison with the unknown true arrival times, an analysis of the picking quality is done. The tests have been performed using two data sets: the first is made up of 240 earthquakes automatically picked by TAPNEI; the second consists of the same earthquakes but manually picked (OGS database). The grid and the software used to perform the tomography (Sim28; Michelini and McEvilly, 1991) are the same in the two cases. The Vp, Vs and Vp/Vs fields of the two tomographies and their differences are shown on vertical sections. In addition, the differences in earthquake locations are studied; in particular, the accuracy of the locations has been analyzed by estimating the distance of the hypocenter distributions with respect to the manual locations. The analysis also includes a qualitative comparison with an independent tomography (Gentile et al., 2000) performed using Simulps (Evans et al., 1994) on a set of 224 earthquakes accurately selected and manually relocated. The quality of the picks and the comparison with the tomography obtained from manual data suggest that earthquake tomography with automatic data can provide reliable results. We suggest the use of such data when a large

  5. Earthquake Analysis (EA) Software for The Earthquake Observatories

    NASA Astrophysics Data System (ADS)

    Yanik, K.; Tezel, T.

    2009-04-01

    There are many software packages that can be used to observe seismic signals and locate earthquakes, some of them commercial and with technical support. For this reason, many seismological observatories have developed and use their own seismological software packages, suited to their seismological networks. In this study, we introduce our software, which can read seismic signals, process them and locate earthquakes. This software is used by the General Directorate of Disaster Affairs Earthquake Research Department Seismology Division (hereafter ERD) and will improve according to new requirements. The ERD network consists of 87 seismic stations: 63 of them are equipped with 24-bit digital Guralp CMG-3T broadband seismometers, 16 with analogue short-period S-13 Geometrics seismometers, and 8 with 24-bit digital short-period S-13j-DR-24 Geometrics seismometers. Data are transmitted by satellite from the broadband stations, whereas leased lines are used for the short-period stations. The daily data archive capacity is 4 GB. In big networks, it is very important to observe the seismic signals and locate the earthquakes as soon as possible, and this is possible if the software is developed with the network's properties in mind. When we started to develop software for a big network such as ours, we recognized several requirements: all known seismic data formats should be read without any conversion; only selected stations should be displayed, directly on the map; seismic files should be added with an import command; relations should be established between P and S phase readings and location solutions (as sketched below); data should be stored in a database; and the program should be entered with a user name and password. In this way, we can prevent data disorder and repeated phase readings. There are many advantages when data are stored in a database. These advantages include easy access to the data from anywhere using ethernet, publication of bulletins and catalogues on a website, and easy sending of short messages (SMS) and e
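
    The pick-to-solution relation and database storage mentioned above might look roughly like the sketch below; the table and column names are invented for illustration and are not the ERD software's actual schema.

    import sqlite3

    # Illustrative schema: keep P and S phase readings linked to the event
    # (location solution) they contributed to. Names are invented for this sketch.

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE event (
        event_id     INTEGER PRIMARY KEY,
        origin_time  TEXT,
        latitude     REAL,
        longitude    REAL,
        depth_km     REAL,
        magnitude    REAL
    );
    CREATE TABLE phase_reading (
        pick_id      INTEGER PRIMARY KEY,
        event_id     INTEGER REFERENCES event(event_id),
        station_code TEXT,
        phase        TEXT CHECK (phase IN ('P', 'S')),
        arrival_time TEXT
    );
    """)

    conn.execute("INSERT INTO event VALUES (1, '2009-02-01T10:15:03', 39.9, 32.8, 8.0, 3.4)")
    conn.executemany("INSERT INTO phase_reading VALUES (?, ?, ?, ?, ?)",
                     [(1, 1, 'ANKR', 'P', '2009-02-01T10:15:10'),
                      (2, 1, 'ANKR', 'S', '2009-02-01T10:15:16')])

    # All picks that contributed to event 1, together with its location solution.
    for row in conn.execute("""
            SELECT p.station_code, p.phase, p.arrival_time, e.latitude, e.longitude
            FROM phase_reading p JOIN event e ON p.event_id = e.event_id
            WHERE e.event_id = 1"""):
        print(row)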

  6. Napa earthquake: An earthquake in a highly connected world

    NASA Astrophysics Data System (ADS)

    Bossu, R.; Steed, R.; Mazet-Roux, G.; Roussel, F.

    2014-12-01

    The Napa earthquake recently occurred close to Silicon Valley. This makes it a good candidate for studying what social networks, wearable objects and website traffic analysis (flashsourcing) can tell us about the way eyewitnesses react to ground shaking. In the first part, we compare the ratios of people publishing tweets and of people visiting the EMSC (European Mediterranean Seismological Centre) real-time information website in the first minutes following the earthquake with the results published by Jawbone, which show that the proportion of people waking up depends (naturally) on the epicentral distance. The key question is whether the proportions of inhabitants tweeting or visiting the EMSC website are similar to the proportion of people waking up as shown by the Jawbone data. If so, this supports the premise that all methods provide a reliable image of the relative ratio of people waking up. The second part of the study focuses on the reaction time for both Twitter and EMSC website access. We show, similarly to what was demonstrated for the Mineral, Virginia, earthquake (Bossu et al., 2014), that hit times on the EMSC website follow the propagation of the P waves and that 2 minutes of website traffic are sufficient to determine the epicentral location of an earthquake on the other side of the Atlantic. We also compare these with the publication times of messages on Twitter. Finally, we check whether the numbers of tweets and of visitors relative to the number of inhabitants are correlated with the local level of shaking. Together these results will tell us whether the reaction of eyewitnesses to ground shaking as observed through Twitter and the EMSC website analysis is tool specific (i.e. specific to Twitter or the EMSC website) or whether it reflects people's actual reactions.

  7. Prediction of earthquake-triggered landslide event sizes

    NASA Astrophysics Data System (ADS)

    Braun, Anika; Havenith, Hans-Balder; Schlögel, Romy

    2016-04-01

    Seismically induced landslides are a major environmental effect of earthquakes, which may contribute significantly to related losses. Moreover, in paleoseismology, landslide event sizes are an important proxy for estimating the intensity and magnitude of past earthquakes, thus allowing us to improve seismic hazard assessment over longer time scales. Not only earthquake intensity, but also factors such as the fault characteristics, topography, climatic conditions and the geological environment have a major impact on the intensity and spatial distribution of earthquake-induced landslides. We present here a review of factors contributing to earthquake-triggered slope failures based on an "event-by-event" classification approach. The objective of this analysis is to enable the short-term prediction of earthquake-triggered landslide event sizes, in terms of numbers and size of the affected area, right after an earthquake event has occurred. Five main factors, 'Intensity', 'Fault', 'Topographic energy', 'Climatic conditions' and 'Surface geology', were used to establish a relationship to the number and spatial extent of landslides triggered by an earthquake (a simple weighted-score sketch is given below). The relative weight of these factors was extracted from published data for numerous past earthquakes; topographic inputs were checked in Google Earth and through geographic information systems. Based on well-documented recent earthquakes (e.g. Haiti 2010, Wenchuan 2008) and on older events for which reliable extensive information was available (e.g. Northridge 1994, Loma Prieta 1989, Guatemala 1976, Peru 1970) the combination and relative weights of the factors were calibrated. The calibrated factor combination was then applied to more than 20 earthquake events for which landslide distribution characteristics could be cross-checked. One of our main findings is that the 'Fault' factor, which is based on characteristics of the fault, the surface rupture and its location with respect to mountain areas, has the most important
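
    The factor-combination idea might be sketched as a weighted score mapped to an event-size class. The weights, scores and class boundaries below are invented for illustration; in the study they are calibrated against well-documented past earthquakes.

    # Illustrative weighted-score sketch: five factor scores (each scaled to 0..1)
    # are combined with weights into a single score and mapped to an event-size
    # class. Weights, scores and class boundaries are invented for illustration.

    WEIGHTS = {"intensity": 0.35, "fault": 0.25, "topography": 0.20,
               "climate": 0.10, "geology": 0.10}

    def landslide_event_size(scores):
        """Map factor scores (each in 0..1) to a qualitative event-size class."""
        total = sum(WEIGHTS[name] * scores[name] for name in WEIGHTS)
        if total > 0.7:
            return total, "very large (thousands of landslides)"
        if total > 0.4:
            return total, "moderate (hundreds of landslides)"
        return total, "small (isolated failures)"

    example = {"intensity": 0.9, "fault": 0.8, "topography": 0.7,
               "climate": 0.5, "geology": 0.6}
    print(landslide_event_size(example))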

  8. Pre-earthquake Magnetic Pulses

    NASA Astrophysics Data System (ADS)

    Scoville, J.; Heraud, J. A.; Freund, F. T.

    2015-12-01

    A semiconductor model of rocks is shown to describe unipolar magnetic pulses, a phenomenon that has been observed prior to earthquakes. These pulses are suspected to be generated deep in the Earth's crust, in and around the hypocentral volume, days or even weeks before earthquakes. Their extremely long wavelength allows them to pass through kilometers of rock. Interestingly, when the sources of these pulses are triangulated, the locations coincide with the epicenters of future earthquakes. We couple a drift-diffusion semiconductor model to a magnetic field in order to describe the electromagnetic effects associated with electrical currents flowing within rocks. The resulting system of equations is solved numerically, and it is seen that a volume of rock may act as a diode that produces transient currents when it switches bias. These unidirectional currents are expected to produce transient unipolar magnetic pulses similar in form, amplitude, and duration to those observed before earthquakes, which suggests that the pulses could be the result of geophysical semiconductor processes.

  9. Differential energy radiation from two earthquakes in Japan with identical Mw: The Kyushu 1996 and Tottori 2000 earthquakes

    USGS Publications Warehouse

    Choy, G.L.; Boatwright, J.

    2009-01-01

    We examine two closely located earthquakes in Japan that had identical moment magnitudes Mw but significantly different energy magnitudes Me. We use teleseismic data from the Global Seismograph Network and strong-motion data from the National Research Institute for Earth Science and Disaster Prevention's K-Net to analyze the 19 October 1996 Kyushu earthquake (Mw 6.7, Me 6.6) and the 6 October 2000 Tottori earthquake (Mw 6.7, Me 7.4). To obtain regional estimates of radiated energy ES we apply a spectral technique to regional (<200 km) waveforms that are dominated by S and Lg waves. For the thrust-fault Kyushu earthquake, we estimate an average regional attenuation Q(f) = 230 f^0.65. For the strike-slip Tottori earthquake, the average regional attenuation is Q(f) = 180 f^0.6. These attenuation functions are similar to those derived from studies of both California and Japan earthquakes. The regional estimate of ES for the Kyushu earthquake, 3.8 x 10^14 J, is significantly smaller than that for the Tottori earthquake, ES = 1.3 x 10^15 J. These estimates correspond well with the teleseismic estimates of 3.9 x 10^14 J and 1.8 x 10^15 J, respectively. The apparent stress (τa = μES/M0, with μ equal to rigidity) for the Kyushu earthquake is 4 times smaller than the apparent stress for the Tottori earthquake. In terms of the fault maturity model, the significantly greater release of energy by the strike-slip Tottori earthquake can be related to strong deformation in an immature intraplate setting. The relatively lower energy release of the thrust-fault Kyushu earthquake can be related to rupture on mature faults in a subduction environment. The consistency between teleseismic and regional estimates of ES is particularly significant, as teleseismic data for computing ES are routinely available for all large earthquakes whereas there are often no near-field data.
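
    The apparent-stress comparison can be reproduced approximately from the numbers quoted above using τa = μES/M0, with M0 obtained from Mw through the standard moment-magnitude relation. The rigidity below is an assumed typical crustal value, so the results are only indicative.

    import numpy as np

    # Apparent-stress sketch: tau_a = mu * Es / M0, with M0 from Mw via the standard
    # moment-magnitude relation. Es values are the regional estimates quoted above;
    # the rigidity mu is an assumed typical crustal value.

    mu = 3.0e10                                  # rigidity (Pa), assumed
    Mw = 6.7                                     # both earthquakes
    M0 = 10 ** (1.5 * Mw + 9.1)                  # scalar moment (N*m)

    Es = {"Kyushu 1996": 3.8e14, "Tottori 2000": 1.3e15}     # radiated energy (J)
    tau = {name: mu * e / M0 for name, e in Es.items()}

    for name, t in tau.items():
        print(f"{name}: apparent stress ~ {t / 1e6:.1f} MPa")
    print("Tottori/Kyushu ratio:", round(tau["Tottori 2000"] / tau["Kyushu 1996"], 1))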

  10. Eldercare Locator

    MedlinePlus

    Welcome to the Eldercare Locator, a public service of the U.S. Administration on Aging connecting you to services for older ...

  11. Pre-Earthquake Unipolar Electromagnetic Pulses

    NASA Astrophysics Data System (ADS)

    Scoville, J.; Freund, F.

    2013-12-01

    Transient ultralow frequency (ULF) electromagnetic (EM) emissions have been reported to occur before earthquakes [1,2]. They suggest powerful transient electric currents flowing deep in the crust [3,4]. Prior to the M=5.4 Alum Rock earthquake of Oct. 21, 2007 in California a QuakeFinder triaxial search-coil magnetometer located about 2 km from the epicenter recorded unusual unipolar pulses with the approximate shape of a half-cycle of a sine wave, reaching amplitudes up to 30 nT. The number of these unipolar pulses increased as the day of the earthquake approached. These pulses clearly originated around the hypocenter. The same pulses have since been recorded prior to several medium to moderate earthquakes in Peru, where they have been used to triangulate the location of the impending earthquakes [5]. To understand the mechanism of the unipolar pulses, we first have to address the question how single current pulses can be generated deep in the Earth's crust. Key to this question appears to be the break-up of peroxy defects in the rocks in the hypocenter as a result of the increase in tectonic stresses prior to an earthquake. We investigate the mechanism of the unipolar pulses by coupling the drift-diffusion model of semiconductor theory to Maxwell's equations, thereby producing a model describing the rock volume that generates the pulses in terms of electromagnetism and semiconductor physics. The system of equations is then solved numerically to explore the electromagnetic radiation associated with drift-diffusion currents of electron-hole pairs. [1] Sharma, A. K., P. A. V., and R. N. Haridas (2011), Investigation of ULF magnetic anomaly before moderate earthquakes, Exploration Geophysics 43, 36-46. [2] Hayakawa, M., Y. Hobara, K. Ohta, and K. Hattori (2011), The ultra-low-frequency magnetic disturbances associated with earthquakes, Earthquake Science, 24, 523-534. [3] Bortnik, J., T. E. Bleier, C. Dunson, and F. Freund (2010), Estimating the seismotelluric current

  12. The use of subsurface thermal data, isotopic tracers and earthquake hypocenter locations to unravel deep regional flow systems within the crystalline basement beneath the Rio Grande rift, New Mexico. (Invited)

    NASA Astrophysics Data System (ADS)

    Person, M. A.; Woolsey, E.; Pepin, J.; Crossey, L. J.; Karlstrom, K. E.; Phillips, F. M.; Kelley, S.; Timmons, S.

    2013-12-01

    The Rio Grande rift in New Mexico hosts a number of low-temperature geothermal systems as well as the 19 km deep Socorro Magma Body. The presence of a mantle helium anomaly measured at San Acacia spring (3He/4He = 0.295 RA) and in an adjacent shallow well (50 m; < 0.8 RA) overlying the Socorro Magma Body at the southern terminus of the Albuquerque Basin suggests that deeply sourced fluids mix with the sedimentary-basin groundwater flow system. Temperatures recorded at the base of the San Acacia well are elevated (29 °C). Published estimates of uplift rates and heat flow suggest that the magma body was emplaced about 1-3 ka and reflects a long-lived (several Ma) magmatic system. Further south, near the southern terminus of the Engle Basin, much warmer temperatures (42 °C) occur at shallow depths within the spa district in the town of Truth or Consequences, also suggesting deep fluid circulation. 14C-constrained apparent groundwater residence times in the spa district range between 6-10 ka. We have developed two 6-19 km deep crustal-scale, cross-sectional models that simulate subsurface fluid flow, heat and isotope (3He/4He) transport as well as groundwater residence times along the Rio Grande rift. The north-south oriented model of the Albuquerque Basin incorporates a high-permeability conduit 100 m wide having hydrologic properties differing from the surrounding crystalline basement units. We use these models to constrain the crustal permeability structure and fluid circulation patterns beneath the Albuquerque and Engle Basins. Model results are compared to measurements of groundwater temperatures, residence times (14C), and 3He/4He data. We also use the distribution of earthquake hypocenters to constrain likely fault-crystalline basement hydraulic interactions in the seismogenic crust above the Socorro Magma Body. For the case of the southern Albuquerque Basin, conduit permeability associated with the Indian Hill conduit/fault zone must range between

  13. Grading More Accurately

    ERIC Educational Resources Information Center

    Rom, Mark Carl

    2011-01-01

    Grades matter. College grading systems, however, are often ad hoc and prone to mistakes. This essay focuses on one factor that contributes to high-quality grading systems: grading accuracy (or "efficiency"). I proceed in several steps. First, I discuss the elements of "efficient" (i.e., accurate) grading. Next, I present analytical results…

  14. Interaction of the San Jacinto and San Andreas fault zones, southern California: triggered earthquake migration and coupled recurrence intervals.

    PubMed

    Sanders, C O

    1993-05-14

    Two lines of evidence suggest that large earthquakes that occur on either the San Jacinto fault zone (SJFZ) or the San Andreas fault zone (SAFZ) may be triggered by large earthquakes that occur on the other. First, the great 1857 Fort Tejon earthquake in the SAFZ seems to have triggered a progressive sequence of earthquakes in the SJFZ. These earthquakes occurred at times and locations that are consistent with triggering by a strain pulse that propagated southeastward at a rate of 1.7 kilometers per year along the SJFZ after the 1857 earthquake. Second, the similarity in average recurrence intervals in the SJFZ (about 150 years) and in the Mojave segment of the SAFZ (132 years) suggests that large earthquakes in the northern SJFZ may stimulate the relatively frequent major earthquakes on the Mojave segment. Analysis of historic earthquake occurrence in the SJFZ suggests little likelihood of extended quiescence between earthquake sequences. PMID:17818388

  15. Earthquake Lights and Electric Ground Potentials Following the South Napa Earthquake

    NASA Astrophysics Data System (ADS)

    Freund, F. T.; Scoville, J.; Heraud, J. A.; Spremo, S.; Sornette, J.; Kosovichev, P.; Baney, O. N.

    2014-12-01

    Earthquake lights (EQLs) in the form of bright flashes have been documented multiple times by private security cameras during and after the M6.0 South Napa earthquake of Aug. 24, 2014. On the video records, series of flashes are seen rising out of the ground, sometimes in rapid succession, at other times as single events brightly illuminating the night sky. The EQLs appear to come from extended sources, probably up to hundreds of meters in lateral extent. Though few video records display accurate GPS timing, most of the flashes were clearly co-seismic in the sense that they coincided with the local arrival of the seismic waves. This pattern is consistent with records obtained by a surveillance camera and a seismometer co-located on the PUCP campus in Lima, Peru, during the arrival of the P and S waves from the M8.0 Pisco earthquake about 150 km to the southeast of Lima. Analysis of the PUCP and other video records, plus a number of eyewitness reports, indicates that the EQLs were associated (i) with the S waves and (ii) with mafic dykes. Attempts to detect the Napa EQLs in records of the GOES satellite were unsuccessful. Unusual conditions have to exist to produce electric discharges at the Earth's surface that can rise 100-200 m into the sky. Key to understanding the underlying processes is the fact that, when mafic rocks are stressed, positive hole charge carriers become activated, i.e. defect electrons in the oxygen anion sublattice. The higher the stress rate, the higher the currents, reaching values on the order of 1-2 billion A/km3 during compaction of gabbro within 1-2 msec. Obviously, when S waves pass through rocks at velocities around 3.4 km/sec, large numbers of positive holes appear. Flowing out of the stressed rock volume, they can create very high electric fields, leading to a number of follow-on processes including corona discharges. In Campbell near San Jose, about 60 km south of Napa, a security camera, motion-triggered by the arrival of the

  16. Mechanics of Multifault Earthquake Ruptures

    NASA Astrophysics Data System (ADS)

    Fletcher, J. M.; Oskin, M. E.; Teran, O.

    2015-12-01

    The 2010 El Mayor-Cucapah earthquake of magnitude Mw 7.2 produced the most complex rupture ever documented on the Pacific-North American plate margin, and the network of high- and low-angle faults activated in the event records systematic changes in kinematics with fault orientation. Individual faults have a broad and continuous spectrum of slip sense, ranging from endmember dextral strike slip to normal slip, and even faults with a thrust sense of dip slip were commonly observed in the aftershock sequence. Patterns of coseismic slip are consistent with three-dimensional constrictional strain and show that integrated transtensional shearing can be accommodated in a single earthquake. Stress inversions of coseismic surface rupture and aftershock focal mechanisms define two coaxial, but permuted, stress states. The maximum (σ1) and intermediate (σ2) principal stresses are close in magnitude, but flip orientations due to topography- and density-controlled gradients in lithostatic load along the length of the rupture. Although most large earthquakes throughout the world activate slip on multiple faults, the mechanical conditions of their genesis remain poorly understood. Our work attempts to answer several key questions. 1) Why do complex fault systems exist? They must do something that simple, optimally oriented fault systems cannot, because the two types of faults are commonly located in close proximity. 2) How are faults with diverse orientations and slip senses prepared throughout the interseismic period to fail spontaneously together in a single earthquake? 3) Can a single stress state produce multi-fault failure? 4) Are variations in pore pressure, friction and cohesion required to produce simultaneous rupture? 5) How is the fabric of surface rupture affected by variations in orientation, kinematics, total geologic slip and fault zone architecture?

  17. The great 1933 Sanriku-oki earthquake: reappraisal of the mainshock and its aftershocks and implications for its tsunami using regional tsunami and seismic data

    NASA Astrophysics Data System (ADS)

    Uchida, Naoki; Kirby, Stephen; Umino, Norihito; Hino, Ryota; Kazakami, Tomoe

    2016-06-01

    The aftershock distribution of the 1933 Sanriku-oki outer-trench earthquake is estimated by using modern relocation methods and a newly developed velocity structure to examine the spatial extent of the source fault and the possibility of triggered interplate seismicity. In this study, we first examined the regional data quality of the 1933 earthquake based on smoked-paper records and then relocated the earthquakes by using the 3-D velocity structure and the double-difference method. The improvement of hypocenter locations using these methods was confirmed by examining recent earthquakes that are accurately located based on OBS data. The results show that the 1933 aftershocks occurred under both the outer- and inner-trench-slope regions. In the outer-trench-slope region, aftershocks are distributed in a ~280-km-long area and their depths are shallower than 50 km. Although we could not constrain the fault geometry from the hypocenter distribution, the depth distribution suggests the whole lithosphere was probably not under deviatoric tension at the time of the 1933 earthquake. The occurrence of aftershocks under the inner trench slope was also confirmed by an investigation of waveform frequency differences between outer- and inner-trench earthquakes as recorded at Mizusawa. The earthquakes under the inner trench slope were shallow (depth ≦ 30 km) and the waveforms show a low-frequency character similar to the waveforms of recent, precisely located earthquakes in the same area. They are also located where recent activity of interplate thrust earthquakes is high. These observations suggest that the 1933 outer-trench-slope mainshock triggered interplate earthquakes, an unusual order of occurrence in contrast with the more common pairing of a large initial interplate shock with subsequent outer-slope earthquakes. The off-trench earthquakes are distributed over a width of about 80 km in the trench-perpendicular direction. This wide width cannot be explained from a single

  18. Automated seismic event location by arrival time stacking: Applications to local and micro-seismicity

    NASA Astrophysics Data System (ADS)

    Grigoli, F.; Cesca, S.; Braun, T.; Philipp, J.; Dahm, T.

    2012-04-01

    Locating seismic events is one of the oldest problems in seismology. In microseismicity applications, when the number of events is very large, it is not possible to locate earthquakes manually and automated location procedures must be established. Automated seismic event location at different scales is very important in different application areas, including mining monitoring, reservoir geophysics and early warning systems. Location is needed to start rescue operations rapidly. Locating and mapping microearthquakes or acoustic emission sources in mining environments is important for monitoring mine stability. Mapping fractures through the microseismicity distribution inside hydrocarbon reservoirs is needed to find areas with higher permeability and enhance oil production. In the last 20 years a large number of picking algorithms were developed in order to locate seismic events automatically. While P onsets can now be accurately picked using automatic routines, the automatic picking of later seismic phases (including the S onset) is still problematic, thus limiting the location performance. In this work we present a picking-free location method based on the use of the Short-Term-Average/Long-Term-Average (STA/LTA) traces at different stations as observed data. For different locations and origin times, the observed STA/LTA traces are stacked along the travel-time surface corresponding to the selected hypocentre. Iterating this procedure on a three-dimensional grid, we retrieve a multidimensional matrix whose absolute maximum corresponds to the spatio-temporal coordinates of the seismic event. We tested our methodology on synthetic data, simulating different environments and network geometries. Finally, we apply our method to real datasets related to microseismic activity in mines and earthquake swarms in Italy. This work has been funded by the German BMBF "Geotechnologien" project MINE (BMBF03G0737A).
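
    The stacking idea described above can be illustrated with a minimal grid-search sketch: compute an STA/LTA characteristic function per station, shift each one by the predicted travel time for a trial source, stack, and keep the trial source with the largest stack. The geometry, constant velocity and window lengths below are illustrative assumptions, not the authors' configuration.

```python
import numpy as np

def sta_lta(trace, nsta=20, nlta=200):
    """Classic STA/LTA characteristic function of a 1-D trace."""
    sq = trace ** 2
    sta = np.convolve(sq, np.ones(nsta) / nsta, mode="same")
    lta = np.convolve(sq, np.ones(nlta) / nlta, mode="same")
    return sta / (lta + 1e-12)

def locate(traces, stations, grid, v=3.0, dt=0.01):
    """Grid search: shift each station's STA/LTA by its predicted travel time
    for a trial source, stack, and keep the best-stacking source/origin time."""
    cfs = np.array([sta_lta(tr) for tr in traces])
    nsamp = cfs.shape[1]
    best = (-np.inf, None, None)
    for src in grid:
        tt = np.linalg.norm(stations - src, axis=1) / v      # travel times (s)
        shifts = np.round(tt / dt).astype(int)
        stack = np.zeros(nsamp)
        for cf, s in zip(cfs, shifts):
            s = min(s, nsamp)                                 # guard against long moveouts
            stack[: nsamp - s] += cf[s:]                      # align to origin time
        if stack.max() > best[0]:
            best = (stack.max(), src, stack.argmax() * dt)
    return best  # (stack value, source coordinates, origin time in s)
```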

  19. Can we control earthquakes?

    USGS Publications Warehouse

    Raleigh, B.

    1977-01-01

    In 1966, it was discovered that high pressure injection of industrial waste fluids into the subsurface near Denver, Colo., was triggering earthquakes. While this was disturbing at the time, it was also exciting because there was immediate speculation that here at last was a mechanism to control earthquakes.  

  20. Earthquake history of Texas

    USGS Publications Warehouse

    von Hake, C. A.

    1977-01-01

    Seventeen earthquakes, intensity V or greater, have centered in Texas since 1882, when the first shock was reported. The strongest earthquake, a maximum intensity VIII, was in western Texas in 1931 and was felt over 1 165 000 km2. Three shocks in the Panhandle region in 1925, 1936, and 1943 were widely felt.

  1. Earthquake research in China

    USGS Publications Warehouse

    Raleigh, B.

    1977-01-01

    The prediction of the Haicheng earthquake was an extraordinary achievement by the geophysical workers of the People's Republic of China, whose national program in earthquake research was less than 10 years old at the time. To study the background to this prediction, a delegation of 10 U.S. scientists, which I led, visited China in June 1976.

  2. Earthquakes and Schools

    ERIC Educational Resources Information Center

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  3. Observation of ionospheric disturbances for earthquakes (M>4) occurred during June 2013 to July 2014 in Indonesia using wavelets

    NASA Astrophysics Data System (ADS)

    Revathi, R.; Lakshminarayana, S.; Koteswara Rao, S.; Ramesh, K. S.; Uday Kiran, K.

    2016-05-01

    Seismo-ionospheric perturbations have been extensively studied for large earthquakes that occurred over various parts of the world. Specific signatures of these natural events are observed in total electron content (TEC) data prior to their occurrence. Analysis of these natural disasters occurring at a specific location will help in their accurate detection and prediction. In this paper, the Java region of Indonesia, which comprises a belt of volcanic mountains and offers a considerable number of events whose ionospheric characteristics can be analyzed, is considered for study. This region of Indonesia has an International Global Navigation Satellite System station at Bakosturnal, Indonesia. Vertical total electron content data on the earthquake day are analyzed for 13 events that occurred during June 2013 to July 2014.

  4. Lacustrine turbidites as a tool for quantitative earthquake reconstruction: New evidence for a variable rupture mode in south central Chile

    NASA Astrophysics Data System (ADS)

    Moernaut, Jasper; Daele, Maarten Van; Heirman, Katrien; Fontijn, Karen; Strasser, Michael; Pino, Mario; Urrutia, Roberto; De Batist, Marc

    2014-03-01

    Understanding the long-term earthquake recurrence pattern at subduction zones requires continuous paleoseismic records with excellent temporal and spatial resolution and stable threshold conditions. South central Chilean lakes are typically characterized by laminated sediments providing a quasi-annual resolution. Our sedimentary data show that lacustrine turbidite sequences accurately reflect the historical record of large interplate earthquakes (among others the 2010 and 1960 events). Furthermore, we found that a turbidite's spatial extent and thickness are a function of the local seismic intensity and can be used for reconstructing paleo-intensities. Consequently, our multilake turbidite record aids in pinpointing magnitudes, rupture locations, and extent of past subduction earthquakes in south central Chile. Comparison of the lacustrine turbidite records with historical reports, a paleotsunami/subsidence record, and a marine megaturbidite record demonstrates that the Valdivia Segment is characterized by a variable rupture mode over the last 900 years including (i) full ruptures (Mw ~9.5: 1960, 1575, 1319 ± 9, 1127 ± 44), (ii) ruptures covering half of the Valdivia Segment (Mw ~9: 1837), and (iii) partial ruptures of much smaller coseismic slip and extent (Mw ~7.5-8: 1737, 1466 ± 4). Also, distant or smaller local earthquakes can leave a specific sedimentary imprint which may resolve subtle differences in seismic intensity values. For instance, the 2010 event at the Maule Segment produced higher seismic intensities toward southeastern localities compared to previous megathrust ruptures of similar size and extent near Concepción.

  5. LLNL-Generated Content for the California Academy of Sciences, Morrison Planetarium Full-Dome Show: Earthquake

    SciTech Connect

    Rodgers, A J; Petersson, N A; Morency, C E; Simmons, N A; Sjogreen, B

    2012-01-23

    The California Academy of Sciences (CAS) Morrison Planetarium is producing a 'full-dome' planetarium show on earthquakes and asked LLNL to produce content for the show. Specifically the show features numerical ground motion simulations of the M 7.9 1906 San Francisco and a possible future M 7.05 Hayward fault scenario earthquake. The show also features concepts of plate tectonics and mantle convection using images from LLNL's G3D global seismic tomography. This document describes the data that was provided to the CAS in support of production of the 'Earthquake' show. The CAS is located in Golden Gate Park, San Francisco and hosts over 1.6 million visitors. The Morrison Planetarium, within the CAS, is the largest all digital planetarium in the world. It features a 75-foot diameter spherical section projection screen tilted at a 30-degree angle. Six projectors cover the entire field of view and give a three-dimensional immersive experience. CAS shows strive to use scientifically accurate digital data in their productions. The show, entitled simply 'Earthquake', will debut on 26 May 2012. They are working on graphics and animations based on the same data sets for display on LLNL powerwalls and flat-screens as well as for public release.

  6. Tracking the rupture of the Mw = 9.3 Sumatra earthquake over 1,150 km at teleseismic distance.

    PubMed

    Krüger, Frank; Ohrnberger, Matthias

    2005-06-16

    On 26 December 2004, a moment magnitude Mw = 9.3 earthquake occurred along Northern Sumatra, the Nicobar and Andaman islands, resulting in a devastating tsunami in the Indian Ocean region. The rapid and accurate estimation of the rupture length and direction of such tsunami-generating earthquakes is crucial for constraining both tsunami wave-height models and the seismic moment of the events. Compressional seismic waves generated at the hypocentre of the Sumatra earthquake arrived after about 12 min at the broadband seismic stations of the German Regional Seismic Network (GRSN), located approximately 9,000 km from the event. Here we present a modification of a standard array-seismological approach and show that it is possible to track the propagating rupture front of the Sumatra earthquake over a total rupture length of 1,150 km. We estimate the average rupture speed to be 2.3-2.7 km/s and the total duration of rupture to be at least 430 s, and probably between 480 and 500 s. PMID:15908983
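
    The quoted length and speed bracket the reported duration; a one-line consistency check:

```python
# 1,150 km of rupture at the estimated speeds implies ~426-500 s of rupture,
# consistent with the 430-500 s duration reported in the abstract.
length_km = 1150.0
for v in (2.3, 2.7):  # rupture speed, km/s
    print(f"{v} km/s -> {length_km / v:.0f} s")
```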

  7. Using Novel Earthquake Early Warning (EEW) with Optimized Sensor Model to Determine How Establishments Will Be Affected in a 7.0 Hayward Earthquake Scenario

    NASA Astrophysics Data System (ADS)

    Munnangi, P.

    2015-12-01

    Onsite was better at issuing quick warnings, but less reliable due to its one-station dependency. ElarmS had more delay, but was more accurate since it used four stations. Knowing how the different variables affect EEW can provide citizens with an outlook of how to be prepared for a big earthquake in the Bay Area based on their location and proximity to faults and seismic stations.

  8. Fluid injection triggering of 2011 earthquake sequence in Oklahoma

    NASA Astrophysics Data System (ADS)

    Keranen, K. M.; Savage, H. M.; Abers, G. A.; Cochran, E. S.

    2012-12-01

    Significant earthquakes are increasingly occurring within the United States midcontinent, with nine having moment-magnitude (Mw) ≥4.0 and five with Mw≥5.0 in 2011 alone. In parallel, wastewater injection into deep sedimentary formations has increased as unconventional oil and gas resources are developed. Injected fluids may lower normal stress on existing fault planes, and the correlation between injection wells and earthquake locations led to speculation that many 2011 earthquakes were triggered by injection. The largest earthquake potentially related to injection (Mw5.7) struck in November 2011 in central Oklahoma. Here we use aftershocks to document the fault patterns responsible for the M5.7 earthquake and a prolific sequence of related events, and use the timing and spatial correlation of the earthquakes with injection wells and subsurface structures to show that the earthquakes were likely triggered by fluid injection. The aftershock sequence details rupture along three distinct fault planes, the first of which reaches within 250 meters of active injection wells and within 1 km of the surface. This earthquake sequence began where fluids are injected at low pressure into a depleted oil reservoir bound by faults that effectively seal fluid flow. Injection into sealed compartments allows reservoir pressure to increase gradually over time, suggesting that reservoir volume, in this case, controls the triggering timescale. This process allows multi-year lags between the commencement of fluid injection and triggered earthquakes.

  9. Modeling earthquake dynamics

    NASA Astrophysics Data System (ADS)

    Charpentier, Arthur; Durand, Marilou

    2015-07-01

    In this paper, we investigate questions arising in Parsons and Geist (Bull Seismol Soc Am 102:1-11, 2012). Pseudo-causal models connecting magnitudes and waiting times are considered, through generalized regression. We use conditional models (magnitude given previous waiting time, and conversely) as an extension of the joint distribution model described in Nikoloulopoulos and Karlis (Environmetrics 19:251-269, 2008). On the one hand, we fit a Pareto distribution for earthquake magnitudes, where the tail index is a function of the waiting time following the previous earthquake; on the other hand, waiting times are modeled using a Gamma or a Weibull distribution, where the parameters are functions of the magnitude of the previous earthquake. We use those two models, alternatively, to generate the dynamics of earthquake occurrence, and to estimate the probability of occurrence of several earthquakes within a year or a decade.
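
    A toy version of this alternating conditional scheme can be simulated as below: a Pareto draw for the magnitude whose tail index depends on the previous waiting time, then a Gamma draw for the waiting time whose scale depends on the previous magnitude. The link functions and constants are illustrative placeholders, not the fitted values from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def next_magnitude(prev_wait_days, m_min=3.0):
    """Pareto(alpha, m_min) draw; alpha depends on the previous waiting time (assumed link)."""
    alpha = 1.5 + 0.002 * prev_wait_days
    return m_min * (1.0 - rng.random()) ** (-1.0 / alpha)

def next_waiting_time(prev_magnitude):
    """Gamma draw (days); scale depends on the previous magnitude (assumed link)."""
    shape = 1.2
    scale = 5.0 * np.exp(0.5 * (prev_magnitude - 3.0))
    return rng.gamma(shape, scale)

mag, wait, catalog = 4.0, 30.0, []
for _ in range(1000):
    mag = next_magnitude(wait)
    wait = next_waiting_time(mag)
    catalog.append((mag, wait))

big = sum(1 for m, _ in catalog if m >= 6.0)
print(f"{big} simulated events with M >= 6 out of {len(catalog)}")
```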

  10. Search for seismic forerunners to earthquakes in central California

    USGS Publications Warehouse

    Wesson, R.L.; Robinson, R.; Bufe, C.G.; Ellsworth, W.L.; Pfluke, J.H.; Steppe, J.A.; Seekins, L.C.

    1977-01-01

    The relatively high seismicity of the San Andreas fault zone in central California provides an excellent opportunity to search for seismic forerunners to moderate earthquakes. Analysis of seismic traveltime and earthquake location data has resulted in the identification of two possible seismic forerunners. The first is a period of apparently late (0.3 sec) P-wave arrival times lasting several weeks preceding one earthquake of magnitude 5.0. The rays for these travel paths passed through - or very close to - the aftershock volume of the subsequent earthquake. The sources for these P-arrival time data were earthquakes in the distance range 20-70 km. Uncertainties in the influence of small changes in the hypocenters of the source earthquakes and in the identification of small P-arrivals raise the possibility that the apparently delayed arrivals are not the result of a decrease in P-velocity. The second possible precursor is an apparent increase in the average depth of earthquakes preceding two moderate earthquakes. This change might be only apparent, caused by a location bias introduced by a decrease in P-wave velocity, but numerical modeling for realistic possible changes in velocity suggests that the observed effect is more likely a true migration of earthquakes. To carry out this work - involving the manipulation of several thousand earthquake hypocenters and several hundred thousand readings of arrival time - a system of data storage was designed and manipulation programs for a large digital computer have been executed. This system allows, for example, the automatic selection of earthquakes from a specific region, the extraction of all the observed arrival times for these events, and their relocation under a chosen set of assumptions. ?? 1977.

  11. Precise Relative Earthquake Magnitudes from Cross Correlation

    DOE PAGES Beta

    Cleveland, K. Michael; Ammon, Charles J.

    2015-04-21

    We present a method to estimate precise relative magnitudes using cross correlation of seismic waveforms. Our method incorporates the intercorrelation of all events in a group of earthquakes, as opposed to individual event pairings relative to a reference event. This method works well when a reliable reference event does not exist. We illustrate the method using vertical strike-slip earthquakes located in the northeast Pacific and Panama fracture zone regions. Our results are generally consistent with the Global Centroid Moment Tensor catalog, which we use to establish a baseline for the relative event sizes.
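
    A minimal sketch of the intercorrelation idea: every event pair yields a relative (log-amplitude) measurement, and all pairs are inverted jointly for per-event offsets instead of chaining through a single reference event. The zero-mean least-squares formulation and the toy data below are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def relative_magnitudes(pairs, n_events):
    """pairs: list of (i, j, log10 amplitude ratio A_i/A_j).
    Returns zero-mean relative magnitude offsets for all events."""
    rows, data = [], []
    for i, j, dij in pairs:
        row = np.zeros(n_events)
        row[i], row[j] = 1.0, -1.0          # m_i - m_j = d_ij
        rows.append(row)
        data.append(dij)
    # Pin the mean to zero to remove the rank deficiency of pure differences.
    rows.append(np.ones(n_events))
    data.append(0.0)
    m, *_ = np.linalg.lstsq(np.array(rows), np.array(data), rcond=None)
    return m

# Example: three events whose pairwise measurements imply sizes 0.0, +0.3, -0.1.
obs = [(0, 1, -0.3), (0, 2, 0.1), (1, 2, 0.4)]
print(relative_magnitudes(obs, 3))   # ~[-0.07, 0.23, -0.17] after demeaning
```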

  12. Scenario earthquake hazards for the Long Valley Caldera-Mono Lake area, east-central California

    USGS Publications Warehouse

    Chen, Rui; Branum, David M.; Wills, Chris J.; Hill, David P.

    2014-01-01

    than on the hill side. The effect of fault rupture displacements may be localized along the surface trace of the mapped earthquake fault if fault geometry is simple and the fault traces are accurately located. However, surface displacement hazards can spread over a few hundred meters to a few kilometers if the earthquake fault has numerous splays or branches, such as the Hilton Creek Fault. The amplitude of rupture displacement is estimated to be about 1 meter along normal faults in the study area and close to 2 meters along the White Mountains Fault Zone. All scenarios show the possibility of widespread ground failure. Liquefaction damage would likely occur in the areas of higher ground shaking near the faults where there are sandy/silty sediments and the depth to groundwater is 20 feet or less. Generally, this means damage is most common near lakes and streams in the areas of strongest shaking. Landslide potential exists throughout the study region. All steep slopes (>30 degrees) present a potential hazard at any level of shaking. Lesser slopes may have landslides within the areas of the higher ground shaking. The landslide hazard zones also are likely sources for snow avalanches during winter months and for large boulders that can be shaken loose and roll hundreds of feet down hill, which happened during the 1980 Mammoth Lakes Earthquakes. Whereas methodologies used in estimating ground shaking, liquefaction, and landslides have been well developed and have been applied in published hazard maps, methodologies used in estimating surface fault displacement are still being developed. Therefore, this report provides a more in-depth and detailed discussion of methodologies used for deterministic and probabilistic fault displacement hazard analyses for this project.

  13. Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake

    NASA Astrophysics Data System (ADS)

    Durukal, E.; Sesetyan, K.; Erdik, M.

    2009-04-01

    The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake with an annual probability of occurrence of about 2%. This paper dwells on the expected building losses in terms of probable maximum and average annualized losses and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The TCIP system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and approximately one-half of all policies in highly earthquake-prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, an inadequate premium structure and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering the building losses incurred in Istanbul in the event of a large earthquake. The annualized earthquake losses in Istanbul are between 140-300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses that need to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification to the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, purchase of larger re-insurance covers and development of a claim processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. With such a model the losses would not be indemnified, but would be directly calculated on the basis of indexed ground motion levels and damages. The immediate improvement of a parametric insurance model over the existing one will be the elimination of the claim processing

  14. Crustal heterogeneities beneath the 2011 Talala, Saurashtra earthquake, Gujarat, India source zone: Seismological evidence for neo-tectonics

    NASA Astrophysics Data System (ADS)

    Singh, A. P.; Mishra, O. P.; Rastogi, B. K.; Kumar, Santosh

    2013-01-01

    During the first decade of the 21st century, the study area of Talala, Saurashtra, in western India witnessed three damaging earthquakes of moderate magnitude, two in 2007 [Mw 5.0; Mw 4.8] and one in 2011 [Mw 5.1], that generated public panic in the region. The last damaging moderate earthquake, on 20 October 2011 in the Talala region (21.09°N, 70.45°E), located about 200 km south of the devastating 2001 Bhuj (23.412°N, 70.232°E) mainshock (Mw 7.6), jolted the entire Saurashtra region of Gujarat. A long series of aftershocks followed, recorded at nine seismograph/accelerograph stations. Hypocenters of aftershocks were relocated accurately using absolute and relative travel-time (double-difference) methods. In this study we, for the first time, determined 3-D tomographic images of the upper crust beneath the 2011 Talala earthquake source zone by inverting about 1135 P- and 1125 S-wave arrival time data. Estimates of the seismic velocity (Vp, Vs) and Poisson's ratio (σ) structures offer a reliable interpretation of crustal heterogeneities and their bearing on the genesis of moderate earthquakes and their aftershock sequences beneath the source zone. It is found that the 2011 Talala mainshock hypocenter depth (6 km) is located near the boundary of low and high velocities (Vp, Vs), and the source zone is associated with low-σ anomalies guarded by prominent high-σ anomalies along the active fault zone having strike-slip motion beneath the earthquake source zone. The pattern of distribution of (Vp, Vs, σ) and its association with the occurrence of aftershocks provide seismological evidence for neo-tectonics in the region, with left-lateral strike-slip motion on the fault.
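
    For reference, the σ values discussed above follow from the Vp/Vs ratio through the standard isotropic-elasticity relation; the helper below evaluates it for a few illustrative Vp/Vs values (not values from the study).

```python
# Poisson's ratio from Vp/Vs for an isotropic elastic medium:
# sigma = ((Vp/Vs)^2 - 2) / (2 * ((Vp/Vs)^2 - 1))

def poissons_ratio(vp_over_vs):
    r2 = vp_over_vs ** 2
    return (r2 - 2.0) / (2.0 * (r2 - 1.0))

for r in (1.68, 1.73, 1.80):
    print(f"Vp/Vs = {r:.2f} -> sigma = {poissons_ratio(r):.3f}")
# ~0.226, 0.250, 0.277: low-sigma vs. high-sigma anomalies map to low vs. high Vp/Vs
```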

  15. Accurate monotone cubic interpolation

    NASA Technical Reports Server (NTRS)

    Huynh, Hung T.

    1991-01-01

    Monotone piecewise cubic interpolants are simple and effective. They are generally third-order accurate, except near strict local extrema where accuracy degenerates to second-order due to the monotonicity constraint. Algorithms for piecewise cubic interpolants, which preserve monotonicity as well as uniform third and fourth-order accuracy are presented. The gain of accuracy is obtained by relaxing the monotonicity constraint in a geometric framework in which the median function plays a crucial role.
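
    For orientation, the snippet below builds a standard monotone piecewise-cubic interpolant of the kind discussed above (SciPy's PCHIP), which preserves monotonicity but, as the abstract notes, loses accuracy near strict local extrema. It is a generic baseline illustration, not the higher-order scheme proposed in the report.

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

x = np.linspace(0.0, 1.0, 9)
y = np.sin(2.5 * np.pi * x)          # data with interior extrema
f = PchipInterpolator(x, y)          # monotonicity-preserving cubic interpolant

xf = np.linspace(0.0, 1.0, 401)
err = np.abs(f(xf) - np.sin(2.5 * np.pi * xf))
print(f"max interpolation error: {err.max():.3e}")
```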

  16. Accurate Finite Difference Algorithms

    NASA Technical Reports Server (NTRS)

    Goodrich, John W.

    1996-01-01

    Two families of finite difference algorithms for computational aeroacoustics are presented and compared. All of the algorithms are single-step explicit methods, they have the same order of accuracy in both space and time, with examples up to eleventh order, and they have multidimensional extensions. One of the algorithm families has spectral-like high resolution. Propagation with high-order and high-resolution algorithms can produce accurate results after O(10^6) periods of propagation with eight grid points per wavelength.
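
    The accuracy gain from raising the order of an explicit finite difference stencil can be seen with a generic comparison of 2nd- and 6th-order central first derivatives on a periodic grid; the textbook stencils below are illustrative, not the specific algorithms of the report.

```python
import numpy as np

def derivative(u, dx, order):
    """Central first derivative on a periodic grid, 2nd or 6th order."""
    if order == 2:
        return (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)
    if order == 6:
        return (45 * (np.roll(u, -1) - np.roll(u, 1))
                - 9 * (np.roll(u, -2) - np.roll(u, 2))
                + (np.roll(u, -3) - np.roll(u, 3))) / (60 * dx)
    raise ValueError("unsupported order")

n = 64
x = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
u, du_exact = np.sin(x), np.cos(x)
dx = x[1] - x[0]
for p in (2, 6):
    err = np.abs(derivative(u, dx, p) - du_exact).max()
    print(f"order {p}: max error {err:.2e}")
```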

  17. Earthquake Early Warning: User Education and Designing Effective Messages

    NASA Astrophysics Data System (ADS)

    Burkett, E. R.; Sellnow, D. D.; Jones, L.; Sellnow, T. L.

    2014-12-01

    The U.S. Geological Survey (USGS) and partners are transitioning from test-user trials of a demonstration earthquake early warning system (ShakeAlert) to deciding and preparing how to implement the release of earthquake early warning information, alert messages, and products to the public and other stakeholders. An earthquake early warning system uses seismic station networks to rapidly gather information about an occurring earthquake and send notifications to user devices ahead of the arrival of potentially damaging ground shaking at their locations. Earthquake early warning alerts can thereby allow time for actions to protect lives and property before arrival of damaging shaking, if users are properly educated on how to use and react to such notifications. A collaboration team of risk communications researchers and earth scientists is researching the effectiveness of a chosen subset of potential earthquake early warning interface designs and messages, which could be displayed on a device such as a smartphone. Preliminary results indicate, for instance, that users prefer alerts that include 1) a map to relate their location to the earthquake and 2) instructions for what to do in response to the expected level of shaking. A number of important factors must be considered to design a message that will promote appropriate self-protective behavior. While users prefer to see a map, how much information can be processed in limited time? Are graphical representations of wavefronts helpful or confusing? The most important factor to promote a helpful response is the predicted earthquake intensity, or how strong the expected shaking will be at the user's location. Unlike Japanese users of early warning, few Californians are familiar with the earthquake intensity scale, so we are exploring how differentiating instructions between intensity levels (e.g., "Be aware" for lower shaking levels and "Drop, cover, hold on" at high levels) can be paired with self-directed supplemental

  18. Precursors to Great Earthquakes Along the Nankai Trough; Slow Earthquakes, Non-volcanic Deep Tremor and Slow Slip

    NASA Astrophysics Data System (ADS)

    Sacks, S.; Linde, A. T.

    2007-12-01

    Just before (and during) the December 1944 Tonankai great earthquake, the Military Survey Institute of Japan carried out leveling surveys in the anticipated region of the earthquake. Mogi (1985) pointed out that large closure errors, for line segments measured on the day before and the morning of the earthquake, could be indicative of continuing tilting due to precursory slip. Before the 1946 Nankaido great earthquake there were level changes recorded by tide gauges and also large changes in water levels in wells. We have shown (Linde and Sacks 2002) that the pre-earthquake changes for both events are indicative of slow slip on the down-dip extension of the seismogenic zone, a region that can store strain energy but fails with slow slip. The coseismic slip for both earthquakes averages about 4 meters; the down-dip slow slip was determined to be about half the seismogenic value. In his most recent studies of the same area, Obara (2006) reports that small (~cm) slow slip events and non-volcanic tremor occur on the upper surface of the subducting plate. The locations for these events correspond rather closely to the areas we proposed as having slow slip precursory to the great earthquakes. Additionally the slip rate from Obara's work would result in about 2 meters being released in 100 years, the approximate return interval for the great earthquakes. This is consistent with the deficit being released as a large slow event just before those great earthquakes.

  19. The dynamic implication of focal mechanism solutions of Wenchuan earthquake sequence

    NASA Astrophysics Data System (ADS)

    Hu, X.; Cui, X.; Chen, L.

    2010-12-01

    At 14:28 CST on May 12, 2008, the disastrous MS8.0 Wenchuan earthquake took place in Sichuan province of China, followed by tens of thousands of aftershocks. It occurred on the Longmenshan fault zone, which is a high-angle inland over-thrust fault. The tectonic implication behind the focal mechanisms is an important issue for understanding the dynamic mechanism of the Wenchuan earthquake sequence. To address it, we carried out the following research. 1) We extensively collected digital waveform records, carefully and strictly read P-wave first motions, used more accurate location results, employed the improved grid point test method, computed the focal mechanism solutions of the Wenchuan earthquake sequence, and obtained 125 reliable focal mechanism solutions (M ≥ 4.0, including one M 3.9 earthquake). The results show that most of the earthquakes are thrust or strike-slip. The focal mechanism solutions have the characteristic of subsection distribution. The P axes mainly lie in the E-W direction within a certain range. 2) According to the main fault information and fine medium parameters, using ANSYS software, we established a three-dimensional elastic finite element model of the Longmenshan fault zone and its surrounding areas. According to Global Positioning System observations, and considering three representative patterns of tectonic deformation velocity varying with depth, we loaded three different boundary conditions and obtained a numerical simulation of the tectonic stress field in this region for each. The simulation results of the three patterns consistently show that the orientation of the principal compressive stress is almost E-W in the Longmenshan fault zone and its surrounding area, and from north to south there is a clockwise rotation within a narrow range. The result is consistent with previous research in these areas. 3) Based on the numerical simulation results of the tectonic stress field of the three structural deformation patterns

  20. Earthquake induced landslide hazard field observatory in the Avcilar peninsula

    NASA Astrophysics Data System (ADS)

    Bigarre, Pascal; Coccia, Stella; Theoleyre, Fiona; Ergintav, Semih; Özel, Oguz; Yalçinkaya, Esref; Lenti, Luca; Martino, Salvatore; Gamba, Paolo; Zucca, Francesco; Moro, Marco

    2015-04-01

    Earthquake-triggered landslides have an increasingly disastrous impact in seismic regions due to fast-growing urbanization and infrastructure. Just considering disasters from the last fifteen years, among them the 1999 Chi-Chi earthquake, the 2008 Wenchuan earthquake, and the 2011 Tohoku earthquake, these events generated tens of thousands of coseismic landslides. Those resulted in staggering death tolls and considerable damage, affecting the regional landscape including its main hydrological features. Despite a strong impetus in research during past decades, knowledge of these geohazards is still fragmentary, while databases of high-quality observational data are lacking. These phenomena call for further collaborative research aiming eventually to enhance preparedness and crisis management. The MARSITE project gathers research groups in a comprehensive monitoring activity developed in the Sea of Marmara Region, one of the most densely populated parts of Europe, rated at a high seismic risk level since the 1999 Izmit and Duzce devastating earthquakes. Besides the seismic threat, landslides in Turkey and in this region constitute an important source of loss. The 6th Work Package of the MARSITE project gathers 9 research groups to study earthquake-induced landslides, focusing on two sub-regional areas of high interest, among which is the Cekmece-Avcilar peninsula, located west of Istanbul: a highly urbanized, landslide-prone area showing high susceptibility to rainfall while also being affected by very significant seismic site effects. A multidisciplinary research program based on pre-existing studies has been designed, with objectives and tasks linked to progressively constrain and tackle some challenging issues related to data integration, modeling, monitoring and mapping technologies. Since the start of the project, progress has been made on several important points, as follows. The photogeological interpretation and analysis of ENVISAT-ERS DIn

  1. The mass balance of earthquakes and earthquake sequences

    NASA Astrophysics Data System (ADS)

    Marc, O.; Hovius, N.; Meunier, P.

    2016-04-01

    Large, compressional earthquakes cause surface uplift as well as widespread mass wasting. Knowledge of their trade-off is fragmentary. Combining a seismologically consistent model of earthquake-triggered landsliding and an analytical solution of coseismic surface displacement, we assess how the mass balance of single earthquakes and earthquake sequences depends on fault size and other geophysical parameters. We find that intermediate-size earthquakes (Mw 6-7.3) may cause more erosion than uplift, controlled primarily by seismic source depth and landscape steepness, and less so by fault dip and rake. Such earthquakes can limit topographic growth, but our model indicates that both smaller and larger earthquakes (Mw < 6, Mw > 7.3) systematically cause mountain building. Earthquake sequences with a Gutenberg-Richter distribution have a greater tendency to lead to predominant erosion than repeating earthquakes of the same magnitude, unless a fault can produce earthquakes with Mw > 8.

  2. Likelihood testing of seismicity-based rate forecasts of induced earthquakes in Oklahoma and Kansas

    NASA Astrophysics Data System (ADS)

    Moschetti, M. P.; Hoover, S. M.; Mueller, C. S.

    2016-05-01

    Likelihood testing of induced earthquakes in Oklahoma and Kansas has identified the parameters that optimize the forecasting ability of smoothed seismicity models and quantified the recent temporal stability of the spatial seismicity patterns. Use of the most recent 1 year period of earthquake data and use of 10-20 km smoothing distances produced the greatest likelihood. The likelihood that the locations of January-June 2015 earthquakes were consistent with optimized forecasts decayed with increasing elapsed time between the catalogs used for model development and testing. Likelihood tests with two additional sets of earthquakes from 2014 exhibit a strong sensitivity of the rate of decay to the smoothing distance. Marked reductions in likelihood are caused by the nonstationarity of the induced earthquake locations. Our results indicate a multiple-fold benefit from smoothed seismicity models in developing short-term earthquake rate forecasts for induced earthquakes in Oklahoma and Kansas, relative to the use of seismic source zones.
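
    A minimal sketch of a smoothed-seismicity rate forecast of the type tested above is given below: past epicenters are smoothed with a fixed-width Gaussian kernel onto a grid and scaled to an expected event count. The coordinates, the 15 km smoothing distance and the synthetic catalog are illustrative assumptions, not the study's configuration.

```python
import numpy as np

def smoothed_rate(epicenters_km, grid_x, grid_y, sigma_km=15.0, n_forecast=100.0):
    """Return a grid of expected event counts that sums to n_forecast."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    rate = np.zeros_like(gx)
    for ex, ey in epicenters_km:
        rate += np.exp(-((gx - ex) ** 2 + (gy - ey) ** 2) / (2.0 * sigma_km ** 2))
    return n_forecast * rate / rate.sum()

rng = np.random.default_rng(0)
past = rng.normal(loc=[100.0, 80.0], scale=20.0, size=(500, 2))   # synthetic catalog (km)
grid = np.arange(0.0, 200.0, 5.0)
forecast = smoothed_rate(past, grid, grid)
print(f"total expected events: {forecast.sum():.1f}, peak cell: {forecast.max():.2f}")
```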

  3. High-speed rupture during the initiation of the 2015 Bonin Islands deep earthquake

    NASA Astrophysics Data System (ADS)

    Zhan, Z.; Ye, L.; Shearer, P. M.; Lay, T.; Kanamori, H.

    2015-12-01

    Among the long-standing questions on how deep earthquakes rupture, the nucleation phase of large deep events is one of the most puzzling parts. Resolving the rupture properties of the initiation phase is difficult to achieve with far-field data because of the need for accurate corrections for structural effects on the waveforms (e.g., attenuation, scattering, and site effects) and alignment errors. Here, taking the 2015 Mw 7.9 Bonin Islands earthquake (depth = 678 km) as an example, we jointly invert its far-field P waves at multiple stations for the average rupture speed during the first second of the event. We use waveforms from a closely located aftershock as empirical Green's functions, and correct for possible differences in focal mechanisms and waveform misalignments with an iterative approach. We find that the average initial rupture speed is over 5 km/s, significantly higher than the average rupture speed of 3 km/s later in the event. This contrast suggests that rupture speeds of deep earthquakes can be highly variable during individual events and may define different stages of rupture, potentially with different mechanisms.

  4. Somalian Earthquakes of May, 1980, East Africa

    SciTech Connect

    Ruegg, J.C.; Lepine, J.C.; Tarantola, A.; Leveque, J.J.

    1981-04-01

    A seismic crisis, with an mb = 5.3 main shock, occurred in the Somali Republic, East Africa (10°N, 43°E), from April to November 1980. Up to 2000 earthquakes with ML > 2 were recorded during this period. This earthquake sequence is of particular interest because it occurred in a seismically inactive zone and includes a rather long aftershock sequence. Two groups of epicenters were identified using a relative location procedure. Aftershocks observed during the first two weeks fall very close to Borama City, while later shocks are situated 10 km west. This may suggest that the second group of earthquakes has been induced. The continental margin between the Somalian Plateau shield and the quasi-oceanic crust of the Afar-Gulf of Aden region remains active today and is relevant to intraplate seismicity.

  5. Geodetic Imaging for Rapid Assessment of Earthquakes: Airborne Laser Scanning (ALS)

    NASA Astrophysics Data System (ADS)

    Carter, W. E.; Shrestha, R. L.; Glennie, C. L.; Sartori, M.; Fernandez-Diaz, J.; National Center for Airborne Laser Mapping Operational Center

    2010-12-01

    To the residents of an area struck by a strong earthquake, quantitative information on damage to the infrastructure, and its attendant impact on relief and recovery efforts, is urgent and of primary concern. To earth scientists a strong earthquake offers an opportunity to learn more about earthquake mechanisms, and to compare their models with the real world, in hopes of one day being able to accurately predict the precise locations, magnitudes, and times of large (and potentially disastrous) earthquakes. Airborne laser scanning (also referred to as airborne LiDAR or Airborne Laser Swath Mapping) is particularly well suited for rapid assessment of earthquakes, both for immediately estimating the damage to infrastructure and for providing information for the scientific study of earthquakes. ALS observations collected at low altitude (500-1000 m) from a relatively slow (70-100 m/sec) aircraft can provide dense (5-15 points/m2) sets of surface features (buildings, vegetation, ground), extending over hundreds of square kilometers, with turnaround times of several hours to a few days. The actual response time to any given event depends on several factors, including bureaucratic issues such as approval of funds, export license formalities, and clearance to fly over the area to be mapped; operational factors such as the deployment of the aircraft and ground teams may also take a number of days for remote locations. Of course, the need for immediate mapping of earthquake damage generally is not as urgent in remote regions with less infrastructure and few inhabitants. During August 16-19, 2010 the National Center for Airborne Laser Mapping (NCALM) mapped the area affected by the magnitude 7.2 El Mayor-Cucapah Earthquake (Northern Baja California Earthquake), which occurred on April 4, 2010, and was felt throughout southern California, Arizona, Nevada, and Baja California North, Mexico. From initial ground observations the fault rupture appeared to extend 75 km

  6. New Zealand Earthquake Forecast Testing Centre

    NASA Astrophysics Data System (ADS)

    Gerstenberger, Matthew C.; Rhoades, David A.

    2010-08-01

    The New Zealand Earthquake Forecast Testing Centre is being established as one of several similar regional testing centres under the umbrella of the Collaboratory for the Study of Earthquake Predictability (CSEP). The Centre aims to encourage the development of testable models of time-varying earthquake occurrence in the New Zealand region, and to conduct verifiable prospective tests of their performance over a period of five or more years. The test region, data-collection region and requirements for testing are described herein. Models must specify in advance the expected number of earthquakes with epicentral depths h ≤ 40 km in bins of time, magnitude and location within the test region. Short-term models will be tested using 24-h time bins at magnitude M ≥ 4. Intermediate-term and long-term models will be tested at M ≥ 5 using 3-month, 6-month and 5-year bins, respectively. The tests applied will be the same as at other CSEP testing centres: the so-called N test of the total number of earthquakes expected over the test period; the L test of the likelihood of the earthquake catalogue under the model; and the R test of the ratio of the likelihoods under alternative models. Four long-term, three intermediate-term and two short-term models have been installed to date in the testing centre, with tests of these models commencing on the New Zealand earthquake catalogue from the beginning of 2008. Submission of models is open to researchers worldwide. New models can be submitted at any time. The New Zealand testing centre makes extensive use of software produced by the CSEP testing centre in California. It is envisaged that, in time, the scope of the testing centre will be expanded to include new testing methods and differently specified models, and that the New Zealand testing centre will develop in parallel with other regional testing centres through the CSEP international collaborative process.
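
    As a hedged sketch of the N test mentioned above, the snippet below asks how consistent an observed event count is with a forecast total under a Poisson assumption; the forecast total and observed count are invented numbers for illustration.

```python
from math import exp, factorial

def poisson_cdf(k, lam):
    """P(N <= k) for a Poisson variable with mean lam."""
    return sum(lam ** i * exp(-lam) / factorial(i) for i in range(k + 1))

forecast_total, observed = 12.4, 18                 # illustrative values
p_too_few = poisson_cdf(observed, forecast_total)              # P(N <= observed)
p_too_many = 1.0 - poisson_cdf(observed - 1, forecast_total)   # P(N >= observed)
print(f"P(N <= {observed}) = {p_too_few:.3f}, P(N >= {observed}) = {p_too_many:.3f}")
# the forecast count is deemed inconsistent if either tail probability is very small
```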

  7. Contradicting Estimates of Location, Geometry, and Rupture History of Highly Active Faults in Central Japan

    NASA Astrophysics Data System (ADS)

    Okumura, K.

    2011-12-01

    Accurate location and geometry of seismic sources are critical to estimate strong ground motion. A complete and precise rupture history is also critical to estimate the probability of future events. In order to better forecast future earthquakes and to reduce seismic hazards, we should consider all options and choose the most likely parameters. Multiple options for logic trees are acceptable only after thorough examination of contradicting estimates and should not result from easy compromise or epoché. In the process of preparation and revision of the Japanese probabilistic and deterministic earthquake hazard maps by the Headquarters for Earthquake Research Promotion since 1996, many decisions were made to select plausible parameters, but many contradicting estimates have been left without thorough examination. There are several highly active faults in central Japan, such as the Itoigawa-Shizuoka Tectonic Line active fault system (ISTL), the West Nagano Basin fault system (WNBF), the Inadani fault system (INFS), and the Atera fault system (ATFS). The highest slip rate and the shortest recurrence interval are respectively ~1 cm/yr and 500 to 800 years, and the estimated maximum magnitude is 7.5 to 8.5. Those faults are very hazardous because almost the entire population and industry are located above the faults within tectonic depressions. As to the fault location, most uncertainties arise from the interpretation of geomorphic features. Geomorphological interpretation without geological and structural insight often leads to wrong mapping. Though assuming a longer, possibly non-existent fault may seem a safer estimate, incorrectness harms the reliability of the forecast. Also, this does not greatly affect strong-motion estimates, but it is misleading for surface displacement issues. Fault geometry, on the other hand, is very important for estimating intensity distribution. For the middle portion of the ISTL, fast-moving left-lateral strike slip up to 1 cm/yr is obvious. Recent seismicity possibly induced by the 2011 Tohoku

  8. Travel-time source-specific station correction improves location accuracy

    NASA Astrophysics Data System (ADS)

    Giuntini, Alessandra; Materni, Valerio; Chiappini, Stefano; Carluccio, Roberto; Console, Rodolfo; Chiappini, Massimo

    2013-04-01

    Accurate earthquake locations are crucial for investigating seismogenic processes, as well as for applications like verifying compliance with the Comprehensive Test Ban Treaty (CTBT). Earthquake location accuracy is related to the degree of knowledge about the 3-D structure of seismic wave velocity in the Earth. It is well known that modeling errors in calculated travel times may have the effect of shifting the computed epicenters far from the real locations, by a distance even larger than the size of the statistical error ellipses, regardless of the accuracy in picking seismic phase arrivals. The consequences of large mislocations of seismic events are particularly critical in the context of CTBT verification, where they may affect the triggering of a possible On-Site Inspection (OSI). In fact, the Treaty establishes that an OSI area cannot be larger than 1000 km2, and its largest linear dimension cannot be larger than 50 km. Moreover, depth accuracy is crucial for the application of the depth event screening criterion. In the present study, we develop a method of source-specific travel-time corrections based on a set of well-located events recorded by dense national seismic networks in seismically active regions. The applications concern seismic sequences recorded in Japan, Iran and Italy. We show that mislocations of the order of 10-20 km affecting the epicenters, as well as larger mislocations in hypocentral depths, calculated from a global seismic network using the standard IASPEI91 travel times, can be effectively removed by applying source-specific station corrections.
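
    A toy illustration of the correction idea: for a set of well-located calibration events, the systematic part of the travel-time residual (observed minus model-predicted) at each station is taken as a source-specific station correction and subtracted from later observations. The station names and residual values below are invented for the example.

```python
import numpy as np

def station_corrections(residuals_by_station):
    """Map station -> mean travel-time residual (s) over the calibration events."""
    return {sta: float(np.mean(res)) for sta, res in residuals_by_station.items()}

calib = {
    "STA1": [0.42, 0.35, 0.40, 0.38],   # model systematically too fast along this path
    "STA2": [-0.21, -0.18, -0.25],      # model systematically too slow along this path
}
corr = station_corrections(calib)

observed_tt = {"STA1": 63.91, "STA2": 71.02}   # travel times (s) for a new event
corrected = {sta: t - corr[sta] for sta, t in observed_tt.items()}
print(corr, corrected)
```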

  9. Photocopy of photograph (original located at Mare Island Archives). Original ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Photocopy of photograph (original located at Mare Island Archives). Original photographer unknown. View of sawmill after earthquake of 1898. - Mare Island Naval Shipyard, East of Nave Drive, Vallejo, Solano County, CA

  10. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    SciTech Connect

    Saragoni, G. Rodolfo

    2008-07-08

    The recent commemoration of the centennial of the San Francisco and Valparaiso 1906 earthquakes has given the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus Messina-Reggio Calabria 1908, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of some still-standing buildings that survived centennial earthquakes represents a challenge to better understand the limitations of the earthquake design methods in use. Of the three centennial earthquakes considered, only the Valparaiso 1906 earthquake has been repeated, as the Central Chile 1985, Ms = 7.8 earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is made in the neighborhood of Valparaiso harbor. In this study, the only three centennial buildings of 3 stories that survived both earthquakes almost undamaged were identified. Since accelerograms for the 1985 earthquake were recorded at El Almendral soil conditions as well as on rock, the vulnerability analysis of these buildings is done considering instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. Therefore, it is recommended that more suitable instrumental parameters, such as the destructiveness potential factor, be used in the future to describe earthquake demand.

  11. Determination of Fault Plane and Rupture Direction of the April 18, 2008 Earthquake, Mt. Carmel, Illinois

    NASA Astrophysics Data System (ADS)

    Yang, H.; Chu, R.; Zhu, L.

    2008-12-01

    We located a large number of aftershocks to determine the fault plane of the April 18, 2008, Illinois earthquake. The aftershocks were detected by a sliding-window cross-correlation (SCC) technique that we developed in this study. We applied this technique to continuous waveforms recorded by the Cooperative New Madrid Seismic Network stations. It detected 86 aftershocks down to magnitude 0.8 in the two-week window following the mainshock, more than twice the number of aftershocks reported by the seismic network. Most aftershocks occurred within 24 hours of the mainshock. We then relocated all events with the double-difference relocation algorithm. Accurate differential P- and S-wave arrival times were obtained by waveform cross-correlation. After relocation, all events align along a SW-NE trend that delineates an N40E-oriented strike-slip fault. The fault is nearly vertical down to ~20 km. To determine the direction of mainshock rupture propagation, we used the waveforms of a small-magnitude aftershock as empirical Green's functions to estimate the source time function of the mainshock. Results show that the rupture propagated nearly horizontally to the north in the fault plane oriented N30E, consistent with the fault plane determined from the earthquake locations.
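
    The detection step can be sketched in a few lines of Python: a waveform template is slid along the continuous record and a detection is declared wherever the normalized cross-correlation exceeds a threshold. The normalization details and the threshold value are illustrative assumptions, not the SCC implementation developed in the study.

      import numpy as np

      def sliding_cc_detect(continuous, template, threshold=0.7):
          """Return (sample_offset, correlation) pairs where the normalized
          cross-correlation of the template with the continuous record exceeds threshold."""
          n = len(template)
          t = (template - template.mean()) / (template.std() * n)
          detections = []
          for i in range(len(continuous) - n + 1):
              window = continuous[i:i + n]
              sigma = window.std()
              if sigma == 0.0:
                  continue
              cc = float(np.sum(t * (window - window.mean()) / sigma))
              if cc >= threshold:
                  detections.append((i, cc))
          return detections

    In practice, detections from adjacent samples would be grouped into single events, and differential arrival times would be refined by cross-correlation before double-difference relocation.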

  12. Spatio-temporal properties and evolution of the 2013 Aigion earthquake swarm (Corinth Gulf, Greece)

    NASA Astrophysics Data System (ADS)

    Mesimeri, M.; Karakostas, V.; Papadimitriou, E.; Schaff, D.; Tsaklidis, G.

    2016-04-01

    The 2013 Aigion earthquake swarm that took place in the western part of the Corinth Gulf is investigated to reveal the faulting and seismicity properties of the activated area. The activity started on May 21 and remained appreciably intense over the next 3 months. The recordings of the Hellenic Unified Seismological Network (HUSN), which is adequately dense around the affected area, were used to accurately locate 1501 events. The double-difference (hypoDD) technique was employed for the manually picked P and S phases, along with differential times derived from waveform cross-correlation, to improve location accuracy. The activated area, with dimensions 6 × 2 km, is located approximately 5 km SE of Aigion. Focal mechanisms of 77 events with M ≥ 2.0 were determined from P-wave first motions and used to identify the geometry of the ruptured segments. The spatio-temporal distribution of earthquakes revealed an eastward and westward hypocentral migration from the starting point, suggesting the division of the seismic swarm into four major clusters. The hypocentral migration was corroborated by the Coulomb stress change calculation, indicating that the four fault segments involved in the rupture process failed successively, encouraged by the stress changes. Examination of fluid flow showed that it cannot be unambiguously considered the driving mechanism for the successive failures.
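
    The double-difference residual minimized in hypoDD-style relocation can be written in one line; the short Python sketch below shows it only to clarify the quantity being fit, using toy arrival times rather than catalog data.

      def double_difference(obs_i, obs_j, calc_i, calc_j):
          """Residual for one event pair at one station: the observed differential
          arrival time minus the differential time predicted by the velocity model."""
          return (obs_i - obs_j) - (calc_i - calc_j)

      # Usage: P arrivals (seconds) of two nearby events recorded at the same station.
      print(double_difference(obs_i=12.84, obs_j=12.31, calc_i=12.80, calc_j=12.35))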

  13. A filter bank approach to earthquake early warning

    NASA Astrophysics Data System (ADS)

    Meier, M.; Heaton, T. H.; Clinton, J. F.

    2013-12-01

    sufficiently accurate characterization of an ongoing event with two stations, with consistent characterization of the evolving uncertainty of the location and magnitude.

  14. Extreme Magnitude Earthquakes and their Economical Consequences

    NASA Astrophysics Data System (ADS)

    Chavez, M.; Cabrera, E.; Ashworth, M.; Perea, N.; Emerson, D.; Salazar, A.; Moulinec, C.

    2011-12-01

    The frequency of occurrence of extreme magnitude earthquakes varies from tens to thousands of years, depending on the considered seismotectonic region of the world. However, when their hypocenters are located in the neighborhood of heavily populated and/or industrialized regions, the human and economic losses can be very large, as recently observed for the 1985 Mw 8.01 Michoacan, Mexico, and the 2011 Mw 9 Tohoku, Japan, earthquakes. Here, a methodology is proposed to estimate the probability of exceedance of the intensities of extreme magnitude earthquakes (PEI) and of their direct economic consequences (PEDEC). The PEI are obtained by using supercomputing facilities to generate samples of the 3D wave propagation for plausible extreme earthquake scenarios, and by enlarging those samples through Monte Carlo simulation. The PEDEC are computed by using appropriate vulnerability functions combined with the scenario intensity samples and Monte Carlo simulation. An example of the application of the methodology to the potential occurrence of extreme Mw 8.5 subduction earthquakes affecting Mexico City is presented.
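
    A short Python sketch of how probabilities of exceedance can be estimated from scenario intensity samples by Monte Carlo simulation is given below; the lognormal resampling scatter and the linear vulnerability function are illustrative assumptions, not the models used by the authors.

      import numpy as np

      rng = np.random.default_rng(0)

      # Intensity samples (e.g. peak ground acceleration, in g) from scenario simulations.
      scenario_pga = np.array([0.18, 0.25, 0.31, 0.42, 0.55])

      # Enlarge the sample set by Monte Carlo resampling with illustrative lognormal scatter.
      enlarged = rng.choice(scenario_pga, size=100_000) * rng.lognormal(0.0, 0.3, 100_000)

      # Probability of exceeding a given intensity level (a "PEI"-type estimate).
      pei = np.mean(enlarged > 0.4)

      # Illustrative vulnerability function: loss ratio as a saturating function of PGA.
      loss_ratio = np.clip(1.5 * (enlarged - 0.1), 0.0, 1.0)
      pedec = np.mean(loss_ratio > 0.5)

      print(f"P(PGA > 0.4 g) = {pei:.3f}   P(loss ratio > 0.5) = {pedec:.3f}")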

  15. Earthquake swarms on Mount Erebus, Antarctica

    NASA Astrophysics Data System (ADS)

    Kaminuma, Katsutada; Baba, Megumi; Ueki, Sadato

    1986-12-01

    Mount Erebus (3794 m), located on Ross Island in McMurdo Sound, is one of the few active volcanoes in Antarctica. A high-sensitivity seismic network has been operated by Japanese and US parties on and around the volcano since December 1980. The results of these observations show two kinds of seismic activity on Ross Island: activity concentrated near the summit of Mount Erebus associated with Strombolian eruptions, and micro-earthquake activity spread through Mount Erebus and the surrounding area. Seismicity on Mount Erebus has been quite high, usually exceeding 20 volcanic earthquakes per day. They frequently occur in swarms with daily counts exceeding 100 events. Sixteen earthquake swarms with more than 250 events per day were recorded by the seismic network during the three-year period 1982-1984, and three notable earthquake swarms out of the sixteen were recognized: October 1982 (named 82-C), March-April 1984 (84-B) and July 1984 (84-F). Swarms 84-B and 84-F have a large total number of earthquakes and a large Ishimoto-Iida "m"; hence these two swarms are presumed to constitute one of the precursor phenomena to the new eruption, which took place on 13 September 1984 and lasted a few months.

  16. Earthquake engineering in Peru

    USGS Publications Warehouse

    Vargas, N.J

    1983-01-01

    During the last decade, earthquake engineering research in Peru has been carried out at the Catholic University of Peru and at the Universidad Nacional de Ingeniería (UNI). The Geophysical Institute (IGP), under the auspices of the Organization of American States (OAS), has initiated other efforts in Peru in regional seismic hazard assessment programs with direct impact on the earthquake engineering program. Further details on these programs have been reported by L. Ocola in the Earthquake Information Bulletin, January-February 1982, vol. 14, no. 1, pp. 33-38.

  17. AGU develops earthquake curriculum

    NASA Astrophysics Data System (ADS)

    Blue, Charles

    AGU, in cooperation with the Federal Emergency Management Agency (FEMA), announces the production of a new curriculum package for grades 7-12 on the engineering and geophysical aspects of earthquakes. According to Frank Ireton, AGU's precollege education manager, “Both AGU and FEMA are working to promote the understanding of earthquake processes and their impact on the built environment. We are designing a program that involves students in learning how science, mathematics, and social studies concepts can be applied to reduce earthquake hazards.”

  18. The USGS Earthquake Notification Service (ENS): Customizable notifications of earthquakes around the globe

    USGS Publications Warehouse

    Wald, Lisa A.; Wald, David J.; Schwarz, Stan; Presgrave, Bruce; Earle, Paul; Martinez, Eric; Oppenheimer, David

    2008-01-01

    At the beginning of 2006, the U.S. Geological Survey (USGS) Earthquake Hazards Program (EHP) introduced a new automated Earthquake Notification Service (ENS) to take the place of the National Earthquake Information Center (NEIC) "Bigquake" system and the various other individual EHP e-mail list-servers for separate regions in the United States. These included northern California, southern California, and the central and eastern United States. ENS is a "one-stop shopping" system that allows Internet users to subscribe to flexible and customizable notifications for earthquakes anywhere in the world. The customization capability allows users to define the what (magnitude threshold), the when (day and night thresholds), and the where (specific regions) for their notifications. Customization is achieved by employing a per-user based request profile, allowing the notifications to be tailored for each individual's requirements. Such earthquake-parameter-specific custom delivery was not possible with simple e-mail list-servers. Now that event and user profiles are in a structured query language (SQL) database, additional flexibility is possible. At the time of this writing, ENS had more than 114,000 subscribers, with more than 200,000 separate user profiles. On a typical day, more than 188,000 messages get sent to a variety of widely distributed users for a wide range of earthquake locations and magnitudes. The purpose of this article is to describe how ENS works, highlight the features it offers, and summarize plans for future developments.

  19. HOLOCENE AND LATE PLEISTOCENE(?) EARTHQUAKE-INDUCED SAND BLOWS IN COASTAL SOUTH CAROLINA.

    USGS Publications Warehouse

    Obermeier, S.F.; Jacobson, R.B.; Powars, D.S.; Weems, R.E.; Hallbick, D.C.; Gohn, G.S.; Markewich, H.W.

    1986-01-01

    Multiple generations of prehistoric sand blows, interpreted as earthquake-induced, have been discovered throughout coastal South Carolina. These sand blows extend far beyond the 1886 earthquake-induced sand blows, in sediments having approximately the same liquefaction susceptibility. The seismic source zone for the prehistoric sand blows is unknown. The different distributions of prehistoric and 1886 sand blows have two possible explanations: (1) moderate to strong earthquakes originated in different seismic source locations through time, or (2) at least one earthquake much stronger than the 1886 event also originated from the same seismic source as the 1886 earthquake.

  20. The 2015 Illapel earthquake, central Chile: A type case for a characteristic earthquake?

    NASA Astrophysics Data System (ADS)

    Tilmann, F.; Zhang, Y.; Moreno, M.; Saul, J.; Eckelmann, F.; Palo, M.; Deng, Z.; Babeyko, A.; Chen, K.; Baez, J. C.; Schurr, B.; Wang, R.; Dahm, T.

    2016-01-01

    On 16 September 2015, the MW = 8.2 Illapel megathrust earthquake ruptured the Central Chilean margin. Combining inversions of displacement measurements and seismic waveforms with high frequency (HF) teleseismic backprojection, we derive a comprehensive description of the rupture, which also predicts deep ocean tsunami wave heights. We further determine moment tensors and obtain accurate depth estimates for the aftershock sequence. The earthquake nucleated near the coast but then propagated to the north and updip, attaining a peak slip of 5-6 m. In contrast, HF seismic radiation is mostly emitted downdip of the region of intense slip and arrests earlier than the long period rupture, indicating smooth slip along the shallow plate interface in the final phase. A superficially similar earthquake in 1943 with a similar aftershock zone had a much shorter source time function, which matches the duration of HF seismic radiation in the recent event, indicating that the 1943 event lacked the shallow slip.

  1. Sensitivity of tsunami wave profiles and inundation simulations to earthquake slip and fault geometry for the 2011 Tohoku earthquake

    NASA Astrophysics Data System (ADS)

    Goda, Katsuichiro; Mai, Paul Martin; Yasuda, Tomohiro; Mori, Nobuhito

    2014-12-01

    In this study, we develop stochastic random-field slip models for the 2011 Tohoku earthquake and conduct a rigorous sensitivity analysis of tsunami hazards with respect to the uncertainty of earthquake slip and fault geometry. Synthetic earthquake slip distributions generated from the modified Mai-Beroza method captured key features of inversion-based source representations of the mega-thrust event, which were calibrated against rich geophysical observations of this event. Using original and synthesised earthquake source models (varied for strike, dip, and slip distributions), tsunami simulations were carried out and the resulting variability in tsunami hazard estimates was investigated. The results highlight significant sensitivity of the tsunami wave profiles and inundation heights to the coastal location and the slip characteristics, and indicate that earthquake slip characteristics are a major source of uncertainty in predicting tsunami risks due to future mega-thrust events.

  2. Moment Tensor Inversion of the Mw 5.8 May 16 2010 Deep Earthquake in Puerto Rico

    NASA Astrophysics Data System (ADS)

    Soto-Cordero, L.; Convers, J. A.; Dreger, D. S.; Allstadt, K.

    2011-12-01

    Daily seismicity in the Puerto Rico/Virgin Islands (PR/VI) region is characterized by shallow micro- and minor events in response to the interaction of the Caribbean and North American plates. This complex and active plate boundary has been responsible for the generation of historical shallow tsunamigenic events (e.g. 1867, 1918, and 1946) that caused extensive damage and loss of life in the Northeastern Caribbean. However, in 2010, three deep (>90 km) moderate earthquakes (Mw>5.4) occurred in this region and were reported as having been felt moderately strongly by local residents. The largest of these, an M5.8 event, which occurred on May 16, 2010, caused slight damage to reinforced concrete structures. We calculate the complete moment tensor solution for this earthquake using the moment tensor inversion method of Dreger, and compute additional source parameters (stress drop, apparent stress, rise time) from broadband waveform modeling. The 1D Puerto Rico seismic velocity model used for automatic, real-time and reviewed locations by the Puerto Rico Seismic Network is used successfully to generate Green's functions for stations located within the PR/VI region. Preliminary results suggest a normal mechanism with a strike-slip component. Accurate moment tensor solutions using regional seismic data for these earthquakes will improve our understanding of the deformation of the subducting slab: from possible tearing of the slab to intra-slab shearing.
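
    At the heart of such a moment tensor inversion is a linear least-squares fit of the observed waveforms to Green's functions computed for the six independent tensor components. The Python sketch below shows only that linear step, with synthetic matrices standing in for real Green's functions and data; it is an illustration, not the Dreger method's code.

      import numpy as np

      def invert_moment_tensor(green_functions, data):
          """green_functions: (n_samples, 6) matrix, one column per independent moment
          tensor component; data: (n_samples,) observed waveforms concatenated over
          stations and components. Returns the six moment tensor elements."""
          m, residuals, rank, _ = np.linalg.lstsq(green_functions, data, rcond=None)
          return m

      # Usage (synthetic): recover a known tensor from noise-free "data".
      rng = np.random.default_rng(1)
      G = rng.normal(size=(600, 6))
      m_true = np.array([1.0, -0.5, -0.5, 0.2, 0.0, 0.1])   # Mxx, Myy, Mzz, Mxy, Mxz, Myz
      print(invert_moment_tensor(G, G @ m_true))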

  3. Nonlinear processes in earthquakes

    SciTech Connect

    Jones, E.M.; Frohlich, C.

    1998-12-31

    This is the final report of a one-year, Laboratory Directed Research and Development (LDRD) project at Los Alamos National Laboratory (LANL). Three-dimensional, elastic-wave-propagation calculations were performed to define the effects of near-source geologic structure on the degree to which seismic signals produced by earthquakes resemble "non-double-couple" sources. Signals from sources embedded in a subducting slab showed significant phase and amplitude differences compared with a "no-slab" case. Modifications to the LANL elastic-wave propagation code enabled improved simulations of path effects on earthquake and explosion signals. These simulations demonstrate that near-source, shallow, low-velocity basins can introduce earthquake-like features into explosion signatures through conversion of compressive (P-wave) energy to shear (S- and R-wave) modes. Earthquake sources simulated to date do not show significant modifications.

  4. Forecasting southern california earthquakes.

    PubMed

    Raleigh, C B; Sieh, K; Sykes, L R; Anderson, D L

    1982-09-17

    Since 1978 and 1979, California has had a significantly higher frequency of moderate to large earthquakes than in the preceding 25 years. In the past such periods have also been associated with major destructive earthquakes, of magnitude 7 or greater, and the annual probability of occurrence of such an event is now 13 percent in California. The increase in seismicity is associated with a marked deviation in the pattern of strain accumulation, a correlation that is physically plausible. Although great earthquakes (magnitude greater than 7.5) are too infrequent to have clear associations with any pattern of seismicity that is now observed, the San Andreas fault in southern California has accumulated sufficient potential displacement since the last rupture in 1857 to generate a great earthquake along part or all of its length. PMID:17740956

  5. To capture an earthquake

    SciTech Connect

    Ellsworth, W.L.

    1990-11-01

    An earthquake model based on the theory of plate tectonics is presented. It is assumed that the plates behave elastically in response to slow, steady motions and the strains concentrate within the boundary zone between the plates. When the accumulated stresses exceed the bearing capacity of the rocks, the rocks break, producing an earthquake and releasing the accumulated stresses. As the steady movement of the plates continues, strain begins to reaccumulate. The cycle of strain accumulation and release is modeled using the motion of a block, pulled across a rough surface by a spring. A model earthquake can be predicted by taking into account a precursory event or the peak spring force prior to slip as measured in previous cycles. The model can be applied to faults, e.g., the San Andreas fault, if the past earthquake history of the fault and the rate of strain accumulation are known.
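
    The block-and-spring analogy described above can be made concrete with a short Python sketch: a spring is loaded at a constant rate, and the block slips whenever the spring force exceeds a static friction threshold, after which the force drops to a residual level. All parameter values are illustrative.

      def spring_block_cycles(load_rate=1.0, stiffness=1.0, static_friction=10.0,
                              dynamic_friction=4.0, duration=100.0, dt=0.01):
          """Simulate simple stick-slip cycles; returns (time, spring_force) at slip events."""
          force, events, t = 0.0, [], 0.0
          while t < duration:
              force += stiffness * load_rate * dt   # steady loading by plate motion
              if force >= static_friction:          # strength exceeded: an "earthquake"
                  events.append((round(t, 2), force))
                  force = dynamic_friction          # stress drops to a residual level
              t += dt
          return events

      # Usage: print the first few slip events of the cycle.
      for slip_time, slip_force in spring_block_cycles()[:3]:
          print(f"slip at t = {slip_time} s, spring force = {slip_force:.2f}")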

  6. First Results of the Regional Earthquake Likelihood Models Experiment

    USGS Publications Warehouse

    Schorlemmer, D.; Zechar, J.D.; Werner, M.J.; Field, E.H.; Jackson, D.D.; Jordan, T.H.

    2010-01-01

    The ability to successfully predict the future behavior of a system is a strong indication that the system is well understood. Certainly many details of the earthquake system remain obscure, but several hypotheses related to earthquake occurrence and seismic hazard have been proffered, and predicting earthquake behavior is a worthy goal and one demanded by society. Along these lines, one of the primary objectives of the Regional Earthquake Likelihood Models (RELM) working group was to formalize earthquake occurrence hypotheses in the form of prospective earthquake rate forecasts in California. RELM members, working in small research groups, developed more than a dozen 5-year forecasts; they also outlined a performance evaluation method and provided a conceptual description of a Testing Center in which to perform predictability experiments. Subsequently, researchers working within the Collaboratory for the Study of Earthquake Predictability (CSEP) have begun implementing Testing Centers in different locations worldwide, and the RELM predictability experiment, a truly prospective earthquake prediction effort, is underway within the U.S. branch of CSEP. The experiment, designed to compare time-invariant 5-year earthquake rate forecasts, is now approximately halfway to its completion. In this paper, we describe the models under evaluation and present, for the first time, preliminary results of this unique experiment. While these results are preliminary (the forecasts were meant for an application of 5 years), we find interesting results: most of the models are consistent with the observations, and one model forecasts the distribution of earthquakes best. We discuss the observed sample of target earthquakes in the context of historical seismicity within the testing region, highlight potential pitfalls of the current tests, and suggest plans for future revisions to experiments such as this one. © 2010 The Author(s).

  7. Accurate measurement of time

    NASA Astrophysics Data System (ADS)

    Itano, Wayne M.; Ramsey, Norman F.

    1993-07-01

    The paper discusses current methods for accurate measurements of time by conventional atomic clocks, with particular attention given to the principles of operation of atomic-beam frequency standards, atomic hydrogen masers, and atomic fountains, and to the potential use of strings of trapped mercury ions as a time device more stable than conventional atomic clocks. The areas of application of the ultraprecise and ultrastable time-measuring devices that tax the capacity of modern atomic clocks include radio astronomy and tests of relativity. The paper also discusses practical applications of ultraprecise clocks, such as navigation of space vehicles and pinpointing the exact position of ships and other objects on Earth using GPS.

  8. Accurate quantum chemical calculations

    NASA Technical Reports Server (NTRS)

    Bauschlicher, Charles W., Jr.; Langhoff, Stephen R.; Taylor, Peter R.

    1989-01-01

    An important goal of quantum chemical calculations is to provide an understanding of chemical bonding and molecular electronic structure. A second goal, the prediction of energy differences to chemical accuracy, has been much harder to attain. First, the computational resources required to achieve such accuracy are very large, and second, it is not straightforward to demonstrate that an apparently accurate result, in terms of agreement with experiment, does not result from a cancellation of errors. Recent advances in electronic structure methodology, coupled with the power of vector supercomputers, have made it possible to solve a number of electronic structure problems exactly using the full configuration interaction (FCI) method within a subspace of the complete Hilbert space. These exact results can be used to benchmark approximate techniques that are applicable to a wider range of chemical and physical problems. The methodology of many-electron quantum chemistry is reviewed. Methods are considered in detail for performing FCI calculations. The application of FCI methods to several three-electron problems in molecular physics is discussed. A number of benchmark applications of FCI wave functions are described. Atomic basis sets and the development of improved methods for handling very large basis sets are discussed; these are then applied to a number of chemical and spectroscopic problems, to transition metals, and to problems involving potential energy surfaces. Although the experiences described give considerable grounds for optimism about the general ability to perform accurate calculations, there are several problems that have proved less tractable, at least with current computer resources, and these and possible solutions are discussed.

  9. Earthquake education in California

    USGS Publications Warehouse

    MacCabe, M. P.

    1980-01-01

    In a survey of community response to the earthquake threat in southern California, Ralph Turner and his colleagues in the Department of Sociology at the University of California, Los Angeles, found that the public very definitely wants to be educated about the kinds of problems and hazards they can expect during and after a damaging earthquake; and they also want to know how they can prepare themselves to minimize their vulnerability. Decisionmakers, too, are recognizing this new wave of public concern. 

  10. Prediction of earthquake response spectra

    USGS Publications Warehouse

    Joyner, W.B.; Boore, David M.

    1982-01-01

    We have developed empirical equations for predicting earthquake response spectra in terms of magnitude, distance, and site conditions, using a two-stage regression method similar to the one we used previously for peak horizontal acceleration and velocity. We analyzed horizontal pseudo-velocity response at 5 percent damping for 64 records of 12 shallow earthquakes in Western North America, including the recent Coyote Lake and Imperial Valley, California, earthquakes. We developed predictive equations for 12 different periods between 0.1 and 4.0 s, both for the larger of two horizontal components and for the random horizontal component. The resulting spectra show amplification at soil sites compared to rock sites for periods greater than or equal to 0.3 s, with maximum amplification exceeding a factor of 2 at 2.0 s. For periods less than 0.3 s there is slight deamplification at the soil sites. These results are generally consistent with those of several earlier studies. A particularly significant aspect of the predicted spectra is the change of shape with magnitude (confirming earlier results by McGuire and by Trifunac and Anderson). This result indicates that the conventional practice of scaling a constant spectral shape by peak acceleration will not give accurate answers. The Newmark and Hall method of spectral scaling, using both peak acceleration and peak velocity, largely avoids this error. Comparison of our spectra with the Nuclear Regulatory Commission's Regulatory Guide 1.60 spectrum anchored at the same value at 0.1 s shows that the Regulatory Guide 1.60 spectrum is exceeded at soil sites for a magnitude of 7.5 at all distances for periods greater than about 0.5 s. Comparison of our spectra for soil sites with the corresponding ATC-3 curve of lateral design force coefficient for the highest seismic zone indicates that the ATC-3 curve is exceeded within about 7 km of a magnitude 6.5 earthquake and within about 15 km of a magnitude 7.5 event. The amount by
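
    The general shape of such empirical prediction equations can be illustrated with a short Python sketch: a magnitude term, a geometric-spreading term with a fictitious depth, and a site flag. The functional form and all coefficient values below are illustrative assumptions for one spectral period, not the coefficients derived in this study.

      import numpy as np

      def log_psv(magnitude, distance_km, soil, a=-1.0, b=0.35, c=-1.0, h=8.0, s=0.2):
          """Illustrative prediction of log10 pseudo-velocity response at one period:
          log10(PSV) = a + b*(M - 6) + c*log10(r) + s*soil, with r = sqrt(d^2 + h^2)."""
          r = np.sqrt(distance_km ** 2 + h ** 2)
          return a + b * (magnitude - 6.0) + c * np.log10(r) + s * soil

      # Usage: compare a soil site (soil=1) and a rock site (soil=0) at 20 km from an M 6.5 event.
      print(10 ** log_psv(6.5, 20.0, soil=1), 10 ** log_psv(6.5, 20.0, soil=0))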

  11. Measuring the effectiveness of earthquake forecasting in insurance strategies

    NASA Astrophysics Data System (ADS)

    Mignan, A.; Muir-Wood, R.

    2009-04-01

    Given the difficulty of judging whether the skill of a particular methodology of earthquake forecasts is offset by the inevitable false alarms and missed predictions, it is important to find a means to weigh the successes and failures according to a common currency. Rather than judge subjectively the relative costs and benefits of predictions, we develop a simple method to determine if the use of earthquake forecasts can increase the profitability of active financial risk management strategies employed in standard insurance procedures. Three types of risk management transactions are employed: (1) insurance underwriting, (2) reinsurance purchasing and (3) investment in CAT bonds. For each case premiums are collected based on modelled technical risk costs and losses are modelled for the portfolio in force at the time of the earthquake. A set of predetermined actions follow from the announcement of any change in earthquake hazard, so that, for each earthquake forecaster, the financial performance of an active risk management strategy can be compared with the equivalent passive strategy in which no notice is taken of earthquake forecasts. Overall performance can be tracked through time to determine which strategy gives the best long term financial performance. This will be determined by whether the skill in forecasting the location and timing of a significant earthquake (where loss is avoided) is outweighed by false predictions (when no premium is collected). This methodology is to be tested in California, where catastrophe modeling is reasonably mature and where a number of researchers issue earthquake forecasts.

  12. A prospective earthquake forecast experiment in the western Pacific

    NASA Astrophysics Data System (ADS)

    Eberhard, David A. J.; Zechar, J. Douglas; Wiemer, Stefan

    2012-09-01

    Since the beginning of 2009, the Collaboratory for the Study of Earthquake Predictability (CSEP) has been conducting an earthquake forecast experiment in the western Pacific. This experiment is an extension of the Kagan-Jackson experiments begun 15 years earlier and is a prototype for future global earthquake predictability experiments. At the beginning of each year, seismicity models make a spatially gridded forecast of the number of Mw ≥ 5.8 earthquakes expected in the next year. For the three participating statistical models, we analyse the first two years of this experiment. We use likelihood-based metrics to evaluate the consistency of the forecasts with the observed target earthquakes and we apply measures based on Student's t-test and the Wilcoxon signed-rank test to compare the forecasts. Overall, a simple smoothed seismicity model (TripleS) performs the best, but there are some exceptions that indicate continued experiments are vital to fully understand the stability of these models, the robustness of model selection and, more generally, earthquake predictability in this region. We also estimate uncertainties in our results that are caused by uncertainties in earthquake location and seismic moment. Our uncertainty estimates are relatively small and suggest that the evaluation metrics are relatively robust. Finally, we consider the implications of our results for a global earthquake forecast experiment.
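
    A short Python sketch of the kind of likelihood-based consistency metric used to evaluate gridded rate forecasts is given below, assuming Poisson-distributed counts in each spatial-magnitude bin; the forecast rates and observed counts are toy values, not data from this experiment.

      import numpy as np
      from scipy.stats import poisson

      def joint_log_likelihood(forecast_rates, observed_counts):
          """Sum of Poisson log-likelihoods of the observed counts over all bins,
          given the forecast expected rates."""
          return float(np.sum(poisson.logpmf(observed_counts, forecast_rates)))

      # Usage: a toy 4-cell forecast of expected Mw >= 5.8 counts versus what was observed.
      forecast = np.array([0.1, 0.4, 1.2, 0.3])
      observed = np.array([0, 1, 2, 0])
      print(joint_log_likelihood(forecast, observed))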

  13. MyShake - A smartphone app to detect earthquake

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.; Schreier, L.; Kwon, Y. W.

    2015-12-01

    We designed an Android app that harnesses the accelerometers in personal smartphones to record earthquake-shaking data for research, hazard information and warnings. The app can distinguish earthquake shaking from daily human activities based on the different patterns behind the movements. It can also be triggered by the traditional earthquake early warning (EEW) system to record for a certain amount of time to collect earthquake data. When the app is triggered by earthquake-like movements, it sends the trigger information, containing the time and location of the trigger, back to our server; at the same time, it stores the waveform data locally on the phone first and uploads it to our server later. Trigger information from multiple phones is processed in real time on the server to find coherent signals that confirm the earthquakes. The app therefore provides the basis for a smartphone seismic network that can detect earthquakes and even provide warnings. A planned public roll-out of MyShake could collect millions of seismic recordings for large earthquakes in many regions around the world.

  14. a Collaborative Cyberinfrastructure for Earthquake Seismology

    NASA Astrophysics Data System (ADS)

    Bossu, R.; Roussel, F.; Mazet-Roux, G.; Lefebvre, S.; Steed, R.

    2013-12-01

    One of the challenges in real-time seismology is the prediction of an earthquake's impact. This is particularly true for moderate earthquakes (around magnitude 6) located close to urbanised areas, where the slightest uncertainty in event location, depth or magnitude estimates, and/or misevaluation of propagation characteristics, site effects and building vulnerability, can dramatically change the impact scenario. The Euro-Med Seismological Centre (EMSC) has developed a cyberinfrastructure to collect observations from eyewitnesses in order to provide in-situ constraints on actual damage. This cyberinfrastructure takes advantage of the natural convergence of earthquake eyewitnesses on the EMSC website (www.emsc-csem.org), the second global earthquake information website, within tens of seconds of the occurrence of a felt event. It includes classical crowdsourcing tools such as online questionnaires available in 39 languages, and tools to collect geolocated pictures. It also comprises information derived from the real-time analysis of the traffic on the EMSC website, a method named flashsourcing: in the case of a felt earthquake, eyewitnesses reach the EMSC website within tens of seconds to find out the cause of the shaking they have just been through. By analysing their geographical origin through their IP addresses, we automatically detect felt earthquakes and in some cases map the damaged areas through the loss of Internet visitors. We recently implemented a Quake Catcher Network (QCN) server in collaboration with Stanford University and the USGS to collect ground motion records made by volunteers, and we are also involved in a project to detect earthquakes from ground motion sensors in smartphones. Strategies have been developed for several social media (Facebook, Twitter...) not only to distribute earthquake information, but also to engage with citizens and optimise data collection. A smartphone application is currently under development. We will present an overview of this

  15. Retrospective Seismological Observations: Recording yesterday's earthquakes on seismometers installed today

    NASA Astrophysics Data System (ADS)

    Entwistle, E.; Curtis, A.; Baptie, B.; Meles, G. A.

    2013-12-01

    Earthquake seismograms are usually available only at seismometers that are active at the time of the event. However, recently Source-Receiver Interferometry (SRI) was shown to combine spatial and temporal redatuming to construct seismograms at seismometers deployed only before, during or after the earthquake occurred. Thus seismometers can be redeployed post-earthquake in more useful locations, and earthquake seismograms can nevertheless be obtained (Curtis et al., 2012). We identify suitable SRI source and receiver geometries to construct new earthquake seismograms across the USA. Suitable geometries satisfy: 1) minimum and maximum source-to-receiver distances, 2) a source-to-receiver-array ray path that intersects one or more other seismometers, 3) a dense receiver array that lies approximately perpendicular (70-110 degrees) to a point on that ray, 4) sufficiently long ambient noise records. We also improve SRI receiver integration by embedding seismometer arrays within 2D spatial Voronoi cells. Using data from the USArray TA network we successfully reconstructed M5.5 earthquake seismograms at seven virtual locations in New Mexico. Thus, a new database of retrospective earthquake seismograms can be constructed across the USA.
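
    Candidate geometries can be screened automatically against criteria such as 1) and 3) above. The short Python sketch below checks a distance window and near-perpendicularity of the array to the source-receiver ray; the flat-Earth coordinate conversion, thresholds and station coordinates are simplifications for illustration, not the selection code used in the study.

      import math

      def suitable_geometry(source, receiver, array_azimuth_deg,
                            min_dist_km=100.0, max_dist_km=2000.0, km_per_deg=111.19):
          """source, receiver: (lat, lon) in degrees. Checks the distance window
          (criterion 1) and whether the array orientation lies 70-110 degrees from the
          source-to-receiver ray, i.e. within 20 degrees of perpendicular (criterion 3)."""
          dx = (receiver[1] - source[1]) * km_per_deg * math.cos(math.radians(source[0]))
          dy = (receiver[0] - source[0]) * km_per_deg
          dist = math.hypot(dx, dy)
          ray_azimuth = math.degrees(math.atan2(dx, dy)) % 360.0
          angle = abs(array_azimuth_deg - ray_azimuth) % 180.0
          angle = min(angle, 180.0 - angle)   # orientation difference, 0-90 degrees
          return (min_dist_km <= dist <= max_dist_km) and angle >= 70.0

      # Usage: a receiver ~750 km ENE of the source, with an array striking at 165 degrees.
      print(suitable_geometry((35.0, -106.0), (36.5, -98.0), array_azimuth_deg=165.0))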

  16. Charles Darwin's earthquake reports

    NASA Astrophysics Data System (ADS)

    Galiev, Shamil

    2010-05-01

    As it is the 200th anniversary of Darwin's birth, 2009 has also been marked as 170 years since the publication of his book Journal of Researches. During the voyage Darwin landed at Valdivia and Concepcion, Chile, j