Science.gov

Sample records for earthquake monitoring center

  1. Earthquake Monitoring in Haiti

    Following the devastating 2010 Haiti earthquake, the USGS has been helping with earthquake awareness and monitoring in the country, with continued support from the U.S. Agency for International Development (USAID). This assistance has helped the Bureau des Mines et de l'Energie (BME) in Port-au-Prin...

  2. Comprehensive Seismic Monitoring for Emergency Response and Hazards Assessment: Recent Developments at the USGS National Earthquake Information Center

    NASA Astrophysics Data System (ADS)

    Buland, R. P.; Guy, M.; Kragness, D.; Patton, J.; Erickson, B.; Morrison, M.; Bryon, C.; Ketchum, D.; Benz, H.

    2009-12-01

The USGS National Earthquake Information Center (NEIC) has put into operation a new generation of seismic acquisition, processing and distribution subsystems that seamlessly integrate regional, national and global seismic network data for routine monitoring of earthquake activity and response to large, damaging earthquakes. The system, Bulletin Hydra, was designed to meet Advanced National Seismic System (ANSS) design goals to handle thousands of channels of real-time seismic data, compute and distribute time-critical seismic information for emergency response applications, and manage the integration of contributed earthquake products and information, arriving from near-real-time up to six weeks after an event. Bulletin Hydra is able to meet these goals due to a modular, scalable, and flexible architecture that supports on-the-fly consumption of new data, readily allows for the addition of new scientific processing modules, and provides distributed client workflow management displays. Through the Edge subsystem, Bulletin Hydra accepts waveforms in half a dozen formats. In addition, Bulletin Hydra accepts contributed seismic information including hypocenters, magnitudes, moment tensors, unassociated and associated picks, and amplitudes in a variety of formats including earthworm import/export pairs and EIDS. Bulletin Hydra has state-driven algorithms for computing all IASPEI standard magnitudes (e.g. mb, mb_BB, ML, mb_LG, Ms_20, and Ms_BB) as well as Md and Ms(VMAX), moment tensor algorithms for modeling different portions of the wave-field at different distances (e.g. teleseismic body-wave, centroid, and regional moment tensors), and broadband depth. All contributed and derived data are centrally managed in an Oracle database. To improve on single-station observations, Bulletin Hydra also performs continuous real-time beamforming of high-frequency arrays. Finally, workflow management displays are used to assist NEIC analysts in their day-to-day duties. All combined, Bulletin Hydra processes data from more than 3,000 stations worldwide to produce the Preliminary Determination of Epicenters (PDE) and associated bulletin products. The system architecture is flexible and cost-effective enough that the NEIC operates four full-up systems, two for operations and two for development and QA, and plans to deploy a warm, off-site backup system. New features planned for Bulletin Hydra include W-phase modeling, automated single-event cross-correlation procedures for improved phase picking, computation of source-time functions from empirical Green's function analysis, and integration of two-and-a-half-dimensional (2.5D) ray tracing to improve single-event locations.
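
    As a hedged illustration of one of the magnitude formulas mentioned above, the sketch below evaluates a single-station IASPEI-standard local magnitude ML (the Hutton and Boore form) from a Wood-Anderson-simulated amplitude; the station amplitude and distance are invented for the example, and the operational system applies many additional quality checks and network averaging.

      import math

      def local_magnitude_iaspei(amplitude_nm, hypocentral_distance_km):
          """IASPEI-standard ML: amplitude_nm is the maximum Wood-Anderson-simulated
          horizontal trace amplitude in nanometres, distance in kilometres."""
          A, R = amplitude_nm, hypocentral_distance_km
          return math.log10(A) + 1.11 * math.log10(R) + 0.00189 * R - 2.09

      # Hypothetical single-station value; a network ML would combine many stations.
      print(round(local_magnitude_iaspei(amplitude_nm=2500.0, hypocentral_distance_km=80.0), 1))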

  3. Earthquake engineering research center annual report, 1991-1992

    SciTech Connect

    Not Available

    1992-10-01

    The Earthquake Engineering Research Center exists to conduct research and develop technical information in all areas pertaining to earthquake engineering, including strong ground motion, response of natural and manmade structures to earthquakes, design of structures to resist earthquakes, development of new systems for earthquake protection, and development of architectural and public policy aspects of earthquake engineering. The purpose of the Center is achieved through three major functions. The first and primary function is academic research that is performed by graduate students, research engineers, and visiting postdoctoral scholars working with the Center's faculty participants. The research is funded by extramural grants awarded to individual faculty participants from private, state, and federal agencies.

  4. Earthquake Processing System at the Alaska Earthquake Information Center

    NASA Astrophysics Data System (ADS)

Hansen, Roger; AEIC Staff

    2010-05-01

    The Alaska Earthquake Information Center (AEIC) has the responsibility to record, locate, catalog, and alert Government entities and the public about the occurrence of earthquakes originating within the State of Alaska. Currently, we catalog about 25,000 events per year in and around the State of Alaska, utilizing a network of over 550 seismic stations. In order to handle this many stations recording such a large number of events, we have had to choose operating procedures that are both efficient and robust to be able to function with our staff of 12 people. After much evaluation of competing systems, we chose Antelope as the architecture that would allow us to best grow our capabilities in the proper directions. In this presentation we will illustrate many of our unique implementations of the Antelope tools, and the many additional modules constructed with the Antelope toolbox that have been developed to fit particular needs of AEIC. In addition to simply cataloging the many events in Alaska, we are responsible for rapid notification, ShakeMaps, several local, regional and teleseismic magnitudes (including regional moment tensors), early warning of critical structures such as the Trans-Alaska Oil Pipeline, and assistance with tsunami mitigation and warnings.

  5. An overview of the National Earthquake Information Center acquisition software system, Edge/Continuous Waveform Buffer

    USGS Publications Warehouse

    Patton, John M.; Ketchum, David C.; Guy, Michelle R.

    2015-01-01

    This document provides an overview of the capabilities, design, and use cases of the data acquisition and archiving subsystem at the U.S. Geological Survey National Earthquake Information Center. The Edge and Continuous Waveform Buffer software supports the National Earthquake Information Center’s worldwide earthquake monitoring mission in direct station data acquisition, data import, short- and long-term data archiving, data distribution, query services, and playback, among other capabilities. The software design and architecture can be configured to support acquisition and (or) archiving use cases. The software continues to be developed in order to expand the acquisition, storage, and distribution capabilities.

  6. Earthquake Observation through Groundwater Monitoring in South Korea

    NASA Astrophysics Data System (ADS)

    Piao, J.; Woo, N. C.

    2014-12-01

According to previous research, the influence of some earthquakes can be detected through groundwater monitoring; in some countries groundwater monitoring is even used as a tool to identify earthquake precursors and prediction measures. In this study we therefore attempt to capture anomalous changes in groundwater produced by earthquakes that occurred in Korea, using the National Groundwater Monitoring Network (NGMN). To observe earthquake impacts on groundwater more effectively, we selected from the NGMN 28 stations located in the five earthquake-prone zones in South Korea and examined their responses to eight earthquakes with M ≥ 2.5 that occurred in the vicinity of those zones in 2012. So far we have examined the groundwater monitoring data (water level, temperature and electrical conductivity), treated only to remove barometric pressure changes, and found 29 anomalous changes, confirming that groundwater monitoring data can provide valuable information on earthquake effects. To identify the effect of an earthquake in the mixed water-level signal, other signals must be separated from the original data. Periodic signals will be separated from the original data using the Fast Fourier Transform (FFT); we will then attempt to separate the precipitation effect and determine whether the anomalies were generated by earthquakes or not.
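
    The FFT-based separation of periodic (e.g. tidal and diurnal) components described above can be sketched as follows; the synthetic hourly water-level series, cut-off periods and band width are assumptions for illustration, not values from the study.

      import numpy as np

      def remove_periodic_components(series, dt_hours=1.0, periods_hours=(12.42, 24.0), rel_width=0.02):
          """Zero out narrow spectral bands around the given periods using a
          forward/inverse FFT, returning a residual series in which
          earthquake-related anomalies are easier to identify."""
          n = len(series)
          spec = np.fft.rfft(series - series.mean())
          freqs = np.fft.rfftfreq(n, d=dt_hours)           # cycles per hour
          for period in periods_hours:
              f0 = 1.0 / period
              spec[np.abs(freqs - f0) < rel_width * f0] = 0.0
          return np.fft.irfft(spec, n) + series.mean()

      # Synthetic example: trend + diurnal cycle + noise over 30 days of hourly samples.
      t = np.arange(24 * 30)
      level = 10.0 + 0.001 * t + 0.05 * np.sin(2 * np.pi * t / 24.0) + 0.01 * np.random.randn(t.size)
      residual = remove_periodic_components(level)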

  7. Twitter earthquake detection: Earthquake monitoring in a social world

    USGS Publications Warehouse

    Earle, Paul; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
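
    A minimal sketch of the short-term-average/long-term-average trigger described above, applied to a per-minute count of tweets containing the word "earthquake"; the window lengths, threshold and synthetic counts are illustrative assumptions, not the tuned USGS values.

      import numpy as np

      def sta_lta_triggers(counts, sta_len=2, lta_len=60, threshold=5.0):
          """Return indices where the short-term average of the tweet-count series
          exceeds `threshold` times the long-term average of the preceding window."""
          counts = np.asarray(counts, dtype=float)
          triggers = []
          for i in range(lta_len, len(counts) - sta_len + 1):
              lta = counts[i - lta_len:i].mean() + 1e-3    # small floor avoids division by zero
              sta = counts[i:i + sta_len].mean()
              if sta / lta > threshold:
                  triggers.append(i)
          return triggers

      # Quiet background of ~2 tweets per minute, then a burst after a widely felt event.
      minute_counts = [2] * 120 + [40, 90, 70, 30] + [5] * 20
      print(sta_lta_triggers(minute_counts))               # first minutes of the burst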

  8. Monitoring the Pollino Earthquake Swarm (Italy)

    NASA Astrophysics Data System (ADS)

    Roessler, D.; Passarelli, L.; Govoni, A.; Rivalta, E.

    2014-12-01

The Mercure Basin (MB) and the Castrovillari Fault (CF) in the Pollino range (southern Apennines, Italy) represent one of the most prominent seismic gaps in the Italian seismic catalog, with no M>6 earthquakes during the last centuries. In recent times, the MB has repeatedly been affected by seismic swarms. The most energetic swarm started in 2010 and was still active in 2014. The seismicity culminated in autumn 2012 with a M=5 event on October 25. In contrast, the CF appears aseismic; only the northern part of the CF has experienced microseismicity. The range hosts a number of additional sub-parallel faults whose rheology is unclear. Current debates include the potential of the MB and the CF to host large earthquakes and the level and style of deformation. Understanding the seismicity and the behaviour of the faults is therefore necessary to assess the seismic hazard. The GFZ German Research Centre for Geosciences and INGV, Italy, have been jointly monitoring the ongoing seismicity using a small-aperture seismic array, integrated in a temporary seismic network. Using the array, we automatically detect about ten times more earthquakes than currently included in local catalogues, corresponding to complete event detection down to M~0.5. In the course of the swarm, seismicity has mainly migrated within the Mercure Basin. However, the eastward spread towards the northern tip of the CF in 2013 marks a phase with seismicity located outside of the Mercure Basin. The event locations indicate spatially distinct clusters with different mechanisms across the E-W trending Pollino Fault; the clusters differ in strike and dip. Calibration of the local magnitude scale confirms earlier studies further north in the Apennines. The station corrections show N-S variation, indicating that the Pollino Fault forms an important structural boundary.

  9. Enhanced Earthquake Monitoring of the European Arctic

    NASA Astrophysics Data System (ADS)

    Kvaerna, Tormod; Schweitzer, Johannes; Antonovskaya, Galina; Kremenetskaya, Elena O.

    2014-05-01

We present preliminary results from a cooperative initiative between NORSAR and seismological institutions in NW Russia (Arkhangelsk and Apatity), which each operate seismic networks. To indicate the potential of combining resources to improve the seismic coverage of the European Arctic, we have carried out a comparison based on the first six months of 2013 between the Reviewed Event Bulletin of the CTBT International Data Centre, the NORSAR reviewed regional seismic bulletin (using data from Fennoscandia, Spitsbergen and the Kola Peninsula) and the bulletin produced by the Arkhangelsk seismological center (using data from their own network in combination with the data used to produce the NORSAR bulletin). We show that the addition of the Arkhangelsk network leads to a considerable increase in the number of located seismic events, both at local distances from the individual stations and in the High Arctic. The latter increase is particularly pronounced along the Gakkel Ridge to the north of the Svalbard and Franz-Josef Land archipelagos. A closer investigation shows that the additional events in the High Arctic are included due to the contribution from the station ZFI on Franz-Josef Land in combination with the Spitsbergen stations SPITS and KBS. We also note that the vast majority of the events along the Gakkel Ridge have been located slightly to the south of the ridge. We interpret this as an effect of the lack of recording stations closer to and north of the Gakkel Ridge, and the use of a one-dimensional velocity model which is not fully representative for travel-times along observed propagation paths. We conclude that while the characteristics of earthquake activity in the European Arctic are currently poorly known, the knowledge can be expected to be significantly improved by establishing the appropriate cooperative seismic recording infrastructures.

  10. Southern California Earthquake Center (SCEC) Summer Internship Programs

    NASA Astrophysics Data System (ADS)

    Benthien, M. L.; Perry, S.; Jordan, T. H.

    2004-12-01

For the eleventh consecutive year, the Southern California Earthquake Center (SCEC) coordinated undergraduate research experiences in summer 2004, allowing 35 students with a broad array of backgrounds and interests to work with the world's preeminent earthquake scientists and specialists. Students participate in interdisciplinary, system-level earthquake science and information technology research, and several group activities throughout the summer. Funding for student stipends and activities is made possible by the NSF Research Experiences for Undergraduates (REU) program. SCEC coordinates two intern programs: the SCEC Summer Undergraduate Research Experience (SCEC/SURE) and the SCEC Undergraduate Summer in Earthquake Information Technology (SCEC/USEIT). SCEC/SURE interns work one-on-one with SCEC scientists at their institutions on a variety of earthquake science research projects. The goals of the program are to expand student participation in the earth sciences and related disciplines, encourage students to consider careers in research and education, and increase the diversity of students and researchers in the earth sciences. Thirteen students participated in this program in 2004. SCEC/USEIT is an NSF REU site that brings undergraduate students from across the country to the University of Southern California each summer. SCEC/USEIT interns interact in a team-oriented research environment and are mentored by some of the nation's most distinguished geoscience and computer science researchers. The goals of the program are to allow undergraduates to use advanced tools of information technology to solve problems in earthquake research; close the gap between computer science and geoscience; and engage non-geoscience majors in the application of earth science to the practical problems of reducing earthquake risk. SCEC/USEIT summer research goals are structured around a grand challenge problem in earthquake information technology. For the past three years the students have developed a new earthquake and fault visualization platform named "LA3D." Twenty-two students participated in this program in 2004. SCEC interns come together several times during the summer, beginning with a Communication Workshop that develops the students' oral and written communication skills. In mid-summer, a one-day SCEC Intern Colloquium is held, where student researchers present status reports on their research, followed by a three-day field trip of southern California geology and SCEC research locations. Finally, at the end of the summer each student presents a poster at the SCEC Annual Meeting.

  11. Monitoring of soft high damping elastomeric bearings for earthquake isolation

    SciTech Connect

    Coveney, V.A.; Kuroda, T.; Kobatake, M.; Nita, Y.; Kulak, R.F.; Chang, Y.W.; Seidensticker, R.W.

    1993-07-01

Over the last 20 years several practical systems for the protection of buildings and their contents against the effects of earthquakes have been developed. These systems rely on effectively decoupling the building from the strong horizontal ground accelerations that are among the most damaging features of an earthquake. The isolation of small buildings against earthquakes poses particular problems for high damping elastomer systems. It was recognized that one way to overcome these problems was to use elastomers with particularly low moduli, high damping and other necessary characteristics. This paper describes some key features of the development of soft high damping natural bearings. Their use for the earthquake isolation of a small highly instrumented building at Tohoku University in Sendai, Japan is discussed. The paper focuses on the monitoring of the bearings during production and their performance in situ under static and earthquake (dynamic) conditions.

  12. Earthquake monitoring for multi-temporal images of Ziyuan-3

    NASA Astrophysics Data System (ADS)

    Wang, Qing; Jiang, Yong-hua; Zhang, Guo; Sheng, Qing-hong

    2015-12-01

With the frequent occurrence of earthquake disasters, earthquake monitoring has become a matter of increasing concern. Global observation by optical remote sensing is an emerging technology widely applied to monitoring temporal changes of topography caused by earthquakes. It offers the advantages of a large observation swath, fast data acquisition and high timeliness. The technique relies on accurate registration of pre-seismic and post-seismic images to spot surface rupture zones. The spatial alignment accuracy of multi-temporal images therefore becomes a problem that hinders earthquake monitoring. Considering the adverse impact of different imaging angles, camera lens distortion and other factors on image registration, a new high-accuracy registration approach based on constraining positioning consistency in the rational function model (RFM) is proposed. Ziyuan-3 images of Yutian County in Xinjiang are used to perform the earthquake monitoring experiment. After applying the proposed method, the registration accuracy of pre-seismic and post-seismic images is better than 0.6 pixel, and surface rupture zones caused by the earthquake are acquired promptly.

  13. Recent improvements in earthquake and tsunami monitoring in the Caribbean

    NASA Astrophysics Data System (ADS)

    Gee, L.; Green, D.; McNamara, D.; Whitmore, P.; Weaver, J.; Huang, P.; Benz, H.

    2007-12-01

Following the catastrophic loss of life from the December 26, 2004, Sumatra-Andaman Islands earthquake and tsunami, the U.S. Government appropriated funds to improve monitoring along a major portion of vulnerable coastal regions in the Caribbean Sea, the Gulf of Mexico, and the Atlantic Ocean. Partners in this project include the United States Geological Survey (USGS), the National Oceanic and Atmospheric Administration (NOAA), the Puerto Rico Seismic Network (PRSN), the Seismic Research Unit of the University of the West Indies, and other collaborating institutions in the Caribbean region. As part of this effort, the USGS is coordinating with Caribbean host nations to design and deploy nine new broadband and strong-motion seismic stations. The instrumentation consists of an STS-2 seismometer, an Episensor accelerometer, and a Q330 high-resolution digitizer. Six stations are currently transmitting data to the USGS National Earthquake Information Center, where the data are redistributed to NOAA's Tsunami Warning Centers, regional monitoring partners, and the IRIS Data Management Center. Operating stations include: Isla Barro Colorado, Panama; Gun Hill, Barbados; Grenville, Grenada; Guantanamo Bay, Cuba; Sabaneta Dam, Dominican Republic; and Tegucigalpa, Honduras. Three additional stations in Barbuda, Grand Turk, and Jamaica will be completed during the fall of 2007. These nine stations are affiliates of the Global Seismographic Network (GSN) and complement existing GSN stations as well as regional stations. The new seismic stations improve azimuthal coverage, increase network density, and provide on-scale recording throughout the region. Complementary to this network, NOAA has placed Deep-ocean Assessment and Reporting of Tsunami (DART) stations at sites in regions with a history of generating destructive tsunamis. Recently, NOAA completed deployment of 7 DART stations off the coasts of Montauk Pt, NY; Charleston, SC; Miami, FL; San Juan, Puerto Rico; New Orleans, LA; and Bermuda as part of the U.S. tsunami warning system expansion. DART systems consist of an anchored seafloor bottom pressure recorder (BPR) and a companion moored surface buoy for real-time communications. The new stations are a second-generation design (DART II) equipped with two-way satellite communications that allow NOAA's Tsunami Warning Centers to set stations in event mode in anticipation of possible tsunamis or retrieve the high-resolution (15-s intervals) data in one-hour blocks for detailed analysis. Combined with development of sophisticated wave propagation and site-specific inundation models, the DART data are being used to forecast wave heights for at-risk coastal communities. NOAA expects to deploy a total of 39 DART II buoy stations by 2008 (32 in the Pacific and 7 in the Atlantic, Caribbean and Gulf regions). The seismic and DART networks are two components in a comprehensive and fully-operational global observing system to detect and warn the public of earthquake and tsunami threats. NOAA and USGS are working together to make important strides in enhancing communication networks so residents and visitors can receive earthquake and tsunami watches and warnings around the clock.

  14. Monitoring seismic velocity changes associated with the 2014 Mw 6.0 South Napa earthquake

    NASA Astrophysics Data System (ADS)

    Taira, T.; Brenguier, F.; Kong, Q.

    2014-12-01

We analyze the ambient seismic noise wavefield to explore temporal variations in seismic velocity associated with the 24 August 2014 Mw 6.0 South Napa earthquake. We estimate relative velocity changes (dv/v) with MSNoise [Lecocq et al., 2014, SRL] by analyzing continuous waveforms collected at 10 seismic stations located near the epicenter of the 2014 South Napa earthquake. Following Brenguier et al. [2008, Science], our preliminary analysis focuses on the vertical-component waveforms in a frequency range of 0.1-0.9 Hz. We determine the reference Green's function (GF) for each station pair as the average of 1-day stacks of GFs obtained in the time interval January through July 2014. We estimate the time history of dv/v by measuring delay times between 10-day stacks of GFs and the reference GF. We find about 0.07% velocity reduction immediately after the 2014 South Napa earthquake by measuring the delay times between stacked and reference GFs. Our preliminary result also reveals a post-seismic relaxation process: the velocity reduction decays to about 0.04% roughly 20 days after the 2014 South Napa earthquake. We have implemented an automated system to monitor the time history of dv/v (http://earthquakes.berkeley.edu/~taira/SNapa/SNapa_Noise.html) using waveforms archived at the Northern California Earthquake Data Center. We will characterize the detailed temporal evolution of velocity change associated with the 2014 South Napa earthquake.
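
    A hedged sketch of the velocity-change measurement: MSNoise implements the moving-window cross-spectral method, but the simpler stretching technique below conveys the same dt/t = -dv/v idea; the synthetic correlation functions and the stretching grid are assumptions for illustration.

      import numpy as np

      def stretching_dvv(reference, current, dt, max_dvv=0.005, steps=101):
          """Estimate dv/v by stretching the time axis of the current correlation
          function and keeping the stretch that maximises the correlation with the
          reference (a velocity drop delays later arrivals: dt/t = -dv/v)."""
          t = np.arange(len(reference)) * dt
          best_cc, best_dvv = -1.0, 0.0
          for dvv in np.linspace(-max_dvv, max_dvv, steps):
              stretched = np.interp(t, t * (1.0 + dvv), current)
              cc = np.corrcoef(reference, stretched)[0, 1]
              if cc > best_cc:
                  best_cc, best_dvv = cc, dvv
          return best_dvv, best_cc

      # Synthetic test: a "current" trace equivalent to a 0.07 per cent velocity decrease.
      dt = 0.05
      t = np.arange(0, 120, dt)
      reference = np.sin(2 * np.pi * 0.5 * t) * np.exp(-t / 60.0)
      current = np.interp(t * (1.0 - 0.0007), t, reference)
      print(stretching_dvv(reference, current, dt))        # dv/v close to -0.0007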

  15. Long Baseline Tilt Meter Array to Monitor Cascadia's Slow Earthquakes

    NASA Astrophysics Data System (ADS)

    Suszek, N.; Bilham, R.; Flake, R.; Melbourne, T. I.; Miller, M.

    2004-12-01

Five biaxial Michelson tilt meters are currently being installed in the Puget Lowlands near Seattle to monitor dynamic tilt changes accompanying episodic slow earthquakes that occur at 20-40 km depth. Each tilt meter consists of a 1-2 m deep, 500-m-long, 15-cm diameter, horizontal, half-filled water pipe, terminated by float sensors with sub-micron water-level resolution, similar to those that have operated unattended for the past decade within the Long Valley caldera. The sensors measure water height relative to the base of a pile driven to 10 m depth. A wide-body LVDT attached to this pile outside the reservoir senses the motion of the core attached to the float within. The voltage indicating the position of the core is sampled 16 times a second and digitally filtered before transmission via radio modem for storage as 1-minute samples in a remote computer. The computer gathers 16-bit water height, vault temperature, air pressure and various housekeeping data once per minute using remote telemetry. In the first of the tilt meters, installed during 2004, float sensors at each end, and one in the center of each pipe, permit us to examine tilt signal coherence and local noise. Each adjacent pair of sensors has a tilt resolution of 2e-9 and a range of 8 microradians. We anticipate tilt signals with durations of 0.3-30 days and amplitudes of less than 0.1 microradian associated with slow earthquakes. Anticipated noise levels in the tilt meters are 10-1000 times lower than these expected signals, similar to or better than signal-to-noise levels from planned strain meters of the PBO array.
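
    A back-of-the-envelope check of the quoted sensitivity, assuming the stated sub-micron water-level resolution and the 500-m baseline of each pipe, written in LaTeX notation:

      \theta \;=\; \frac{\Delta h}{L} \;\approx\; \frac{1\,\mu\mathrm{m}}{500\,\mathrm{m}} \;=\; \frac{10^{-6}\,\mathrm{m}}{5\times10^{2}\,\mathrm{m}} \;=\; 2\times10^{-9}\ \mathrm{rad},

    which is consistent with the 2e-9 tilt resolution quoted for each adjacent pair of sensors.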

  16. Web Services and Other Enhancements at the Northern California Earthquake Data Center

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Zuzlewski, S.; Allen, R. M.

    2012-12-01

    The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, or MiniSEED depending on the service, and are compatible with the equivalent IRIS DMC web services. The NCEDC is currently providing the following Web Services: (1) Station inventory and channel response information delivered in StationXML format, (2) Channel response information delivered in RESP format, (3) Time series availability delivered in text and XML formats, (4) Single channel and bulk data request delivered in MiniSEED format. The NCEDC is also developing a rich Earthquake Catalog Web Service to allow users to query earthquake catalogs based on selection parameters such as time, location or geographic region, magnitude, depth, azimuthal gap, and rms. It will return (in QuakeML format) user-specified results that can include simple earthquake parameters, as well as observations such as phase arrivals, codas, amplitudes, and computed parameters such as first motion mechanisms, moment tensors, and rupture length. The NCEDC will work with both IRIS and the International Federation of Digital Seismograph Networks (FDSN) to define a uniform set of web service specifications that can be implemented by multiple data centers to provide users with a common data interface across data centers. The NCEDC now hosts earthquake catalogs and waveforms from the US Department of Energy (DOE) Enhanced Geothermal Systems (EGS) monitoring networks. These data can be accessed through the above web services and through special NCEDC web pages.
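
    As a hedged sketch of how such web services are typically queried, the example below uses ObsPy's generic FDSN client (ObsPy is not mentioned in the abstract and is an assumption here); the network, station and channel codes and the time window are illustrative and not guaranteed holdings.

      from obspy import UTCDateTime
      from obspy.clients.fdsn import Client

      client = Client("NCEDC")                      # FDSN-style services hosted by the NCEDC
      t0 = UTCDateTime("2014-08-24T10:20:00")

      # Station inventory (StationXML under the hood), including instrument responses.
      inventory = client.get_stations(network="BK", channel="BH?", level="response",
                                      starttime=t0, endtime=t0 + 3600)

      # A single-channel waveform request, returned as MiniSEED and parsed into a Stream.
      stream = client.get_waveforms(network="BK", station="CMB", location="*",
                                    channel="BHZ", starttime=t0, endtime=t0 + 600)
      print(inventory)
      print(stream)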

  17. Towards an Earthquake Monitoring System for Indian Ocean Tsunami Early Warning

    NASA Astrophysics Data System (ADS)

    Kraft, T.; Hanka, W.; Saul, J.; Heinloo, A.; Reinhardt, J.; Weber, B.; Becker, J.; Thoms, H.; Pahlke, D.

    2006-12-01

The Mw=9.3 Sumatra earthquake of December 26, 2004, generated a tsunami that affected the entire Indian Ocean region and caused approximately 230,000 fatalities. The German humanitarian aid program for the Indian Ocean region started immediately after the disaster with substantial funding of 45M Euro for the proposed German Indian Ocean Tsunami Early Warning System (GITEWS). In this presentation we describe the concept of the Earthquake Monitoring System and report on its present status. The major challenge for an Earthquake Monitoring System (EMS) is to deliver information about location, size, source parameters and possibly rupture process as early as possible, before the potential tsunami hits the neighboring coastal areas. Tsunamigenic earthquakes are expected to occur in subduction zones close to coast lines. This is particularly true for the Sunda trench offshore Indonesia, but also for the Makran subduction zone offshore Iran. Key for an Indian Ocean monitoring system with short warning times is therefore a dense real-time seismic network in Indonesia, supplemented by a substantial number of stations in other countries and territories within and around the Indian Ocean. 40 new broadband and strong motion stations will be installed during the GITEWS project until 2010. The EMS Control Center will be based on an enhanced version of the widely used SeisComP software and the GEOFON earthquake information system prototype presently operated at the GFZ-Potsdam (http://geofon.gfz-potsdam.de/db/eqinfo.php). However, the Control Center software under development at the moment will be more reliable, faster and automatic, but with operator supervision. It will use sophisticated visualisation tools, offer the possibility for manual correction and re-calculation, flexible configuration and support for distributed processing. Its large redundancy in algorithms, modules and hardware assures easy integration into larger multi-sensor, multi-hazard control centers and decision support systems. A first prototype of the EMS Control Center software will be ready in mid 2007.

  18. Real-time earthquake monitoring: Early warning and rapid response

    NASA Technical Reports Server (NTRS)

    1991-01-01

    A panel was established to investigate the subject of real-time earthquake monitoring (RTEM) and suggest recommendations on the feasibility of using a real-time earthquake warning system to mitigate earthquake damage in regions of the United States. The findings of the investigation and the related recommendations are described in this report. A brief review of existing real-time seismic systems is presented with particular emphasis given to the current California seismic networks. Specific applications of a real-time monitoring system are discussed along with issues related to system deployment and technical feasibility. In addition, several non-technical considerations are addressed including cost-benefit analysis, public perceptions, safety, and liability.

  19. Lessons Learned from Creating the Public Earthquake Resource Center at CERI

    NASA Astrophysics Data System (ADS)

    Patterson, G. L.; Michelle, D.; Johnston, A.

    2004-12-01

    The Center for Earthquake Research and Information (CERI) at the University of Memphis opened the Public Earthquake Resource Center (PERC) in May 2004. The PERC is an interactive display area that was designed to increase awareness of seismology, Earth Science, earthquake hazards, and earthquake engineering among the general public and K-12 teachers and students. Funding for the PERC is provided by the US Geological Survey, The NSF-funded Mid America Earthquake Center, and the University of Memphis, with input from the Incorporated Research Institutions for Seismology. Additional space at the facility houses local offices of the US Geological Survey. PERC exhibits are housed in a remodeled residential structure at CERI that was donated by the University of Memphis and the State of Tennessee. Exhibits were designed and built by CERI and US Geological Survey staff and faculty with the help of experienced museum display subcontractors. The 600 square foot display area interactively introduces the basic concepts of seismology, real-time seismic information, seismic network operations, paleoseismology, building response, and historical earthquakes. Display components include three 22" flat screen monitors, a touch sensitive monitor, 3 helicorder elements, oscilloscope, AS-1 seismometer, life-sized liquefaction trench, liquefaction shake table, and building response shake table. All displays include custom graphics, text, and handouts. The PERC website at www.ceri.memphis.edu/perc also provides useful information such as tour scheduling, ask a geologist, links to other institutions, and will soon include a virtual tour of the facility. Special consideration was given to address State science standards for teaching and learning in the design of the displays and handouts. We feel this consideration is pivotal to the success of any grass roots Earth Science education and outreach program and represents a valuable lesson that has been learned at CERI over the last several years. Another critical lesson that has been learned is to employ K-12 education professionals and utilize undergrad and graduate student workers in the University's Department of Education. Such staff members are keenly aware of the pressures and needs in diverse communities such as Shelby County, Tennessee and are uniquely suited to design and implement new and innovative programs that provide substantive short-term user benefits and promote long-term relationships with the K-12 teachers, students, and teacher's organizations.

  20. Artificial neural network model for earthquake prediction with radon monitoring.

    PubMed

Külahcı, Fatih; İnceöz, Murat; Doğru, Mahmut; Aksoy, Ercan; Baykara, Oktay

    2009-01-01

Apart from linear monitoring studies concerning the relationship between radon and earthquakes, an artificial neural network (ANN) model approach is presented, starting out from the non-linear changes of eight different parameters during earthquake occurrence. A three-layer feedforward network trained with the Levenberg-Marquardt learning algorithm is used to model the earthquake prediction process in the East Anatolian Fault System (EAFS). The proposed ANN system employs an individual training strategy with fixed-weight and supervised models leading to estimations. The average relative error between the magnitudes of the earthquakes obtained by the ANN and the measured data is about 2.3%. The relative error between the test and earthquake data varies between 0% and 12%. In addition, factor analysis was applied to all data and the model output values to examine the statistical variation; a total variance of 80.18% was explained with four factors by this analysis. Consequently, it can be concluded that the ANN approach is a potential alternative to other models involving complex mathematical operations. PMID:18789709
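
    A minimal sketch of a three-layer feedforward regressor of the kind described above; scikit-learn's MLPRegressor with the L-BFGS solver stands in for the Levenberg-Marquardt training used in the paper, and the feature matrix of eight monitored parameters and the target magnitudes are synthetic.

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(0)

      # Synthetic stand-in for eight monitored parameters (radon, meteorological and
      # seismic variables) and the associated earthquake magnitudes.
      X = rng.normal(size=(200, 8))
      y = 3.0 + 0.4 * X[:, 0] - 0.2 * X[:, 3] + 0.05 * rng.normal(size=200)

      scaler = StandardScaler().fit(X[:150])
      model = MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs", max_iter=2000,
                           random_state=0).fit(scaler.transform(X[:150]), y[:150])

      pred = model.predict(scaler.transform(X[150:]))
      print("mean relative error:", np.mean(np.abs(pred - y[150:]) / np.abs(y[150:])))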

  1. Enhanced Earthquake Monitoring in the European Arctic

    NASA Astrophysics Data System (ADS)

Antonovskaya, Galina; Konechnaya, Yana; Kremenetskaya, Elena O.; Asming, Vladimir; Kvaerna, Tormod; Schweitzer, Johannes; Ringdal, Frode

    2015-03-01

This paper presents preliminary results from a cooperative initiative between the Norwegian Seismic Array (NORSAR) institution in Norway and seismological institutions in NW Russia (Arkhangelsk and Apatity). We show that the joint processing of data from the combined seismic networks of all these institutions leads to a considerable increase in the number of located seismic events in the European Arctic compared to standard seismic bulletins such as the NORSAR reviewed regional seismic bulletin and the Reviewed Event Bulletin (REB) issued by the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) organization. The increase is particularly pronounced along the Gakkel Ridge to the north of the Svalbard and Franz-Josef Land archipelagos. We also note that the vast majority of the events along the Gakkel Ridge have been located slightly to the south of the ridge. We interpret this as an effect of the lack of recording stations closer to and north of the Gakkel Ridge, and the use of a one-dimensional velocity model which is not fully representative for travel-times along observed propagation paths. We conclude that while the characteristics of earthquake activity in the European Arctic are currently poorly known, the knowledge can be expected to be significantly improved by establishing the appropriate cooperative seismic recording infrastructures.

  2. USGS NEIC Earthquake Monitoring, Response and Research in the Northern Pacific Region

    NASA Astrophysics Data System (ADS)

    Hayes, G. P.

    2012-12-01

A major component of USGS National Earthquake Information Center (NEIC) operations is the monitoring of and response to global earthquakes. In this presentation I will discuss the monitoring capabilities of the NEIC in the Alaska-Aleutians and Kuril-Kamchatka arc regions, and how these might affect our response to major subduction zone earthquakes. I will focus in particular on our capabilities for the rapid characterization of earthquake magnitude and mechanism, issues vital for subsequent real-time shaking and tsunami risk assessments. Such rapid assessments are made possible by the availability of nearby long-period data, from the Global Seismic Network and other regional networks available in real time at the NEIC. In Alaska, available data facilitate accurate magnitude assessments for even the largest earthquakes in as little as 5-10 minutes. In Kamchatka, however, such response times are delayed a further 10-15 minutes by the limited availability of regional data. These issues impact the generation and accuracy of downstream response products produced in real time after a major global event. In the second part of this presentation, I will highlight the timeline over which these products, such as ShakeMap and PAGER, become available, and when and how they were produced in real time following the 2011 M9.0 Tohoku earthquake in Japan. The NEIC was aware of the great size of this earthquake in less than 20 minutes; with more regional data from the region, this time could be reduced to less than 10 minutes for future earthquakes. Our response to this event was a demonstration of the major advances made since a similarly sized earthquake in Sumatra in 2004, while at the same time highlighting where further improvements are necessary in the future, in response to the growing needs of our society for immediate, accurate and actionable information. Many of the advances and improvements made to rapid earthquake characterization and response stem from research efforts at the USGS and in the academic community. In the last part of this presentation, I will focus on some of the USGS-based research efforts in the northern Pacific region - such as paleoseismic investigations, and studies of subduction zone geometry, structure and historic moment release - and will discuss how they can and are improving our understanding of the history and future seismic potential of these two hazardous arc systems.

  3. Korea Integrated Seismic System (KISS) and Earthquake Monitoring for Korea Train eXpress (KTX).

    NASA Astrophysics Data System (ADS)

Park, Jung Ho; Chi, Heon Cheol; Lim, In Seub; Kim, Geun Young; Shin, Jin Soo

    2010-05-01

Since 2002 the Korea Integrated Seismic System (KISS) has played the main role in real-time seismic data exchange between the different seismic networks operated by four earthquake monitoring institutes: KMA, KEPRI, KINS and KIGAM. Seismic data from the different networks are gathered into the KISS data pool, from which clients can receive data in real time. Before expanding and modernizing the Korean seismic stations, the consortium of the four institutes established standard criteria for seismic observation, such as instrument, data format, and communication protocol, for the purpose of integrating the seismic networks. Thanks to these standard criteria, more than 160 digital stations (velocity or accelerometer) installed in Korea from 1998 to 2009 could easily be linked to KISS in real time. When a big earthquake happens, the observed peak acceleration value can be used as the instrumental intensity at the local site, and the distribution of peak accelerations roughly shows the severity of the damaged area. Real Time Intensity Color Mapping (RTICOM) was developed to generate, every second, a contour map of the nationwide intensity based on the peak acceleration values retrieved through KISS from local stations. RTICOM can be used for rapid evaluation of intensity and for decision making in response to earthquake damage. For the purpose of rapid response to earthquake hazard, Korea Train eXpress (KTX) constructed a real-time monitoring system using accelerometers installed on bridges and tunnels. The KTX monitoring center receives PGA data every second and the monitoring system displays these data on a dedicated screen. Only the frequency band below 10 Hz is considered, in order to reduce false alarms from artificial noise. If a PGA value exceeds the pre-determined level, an alarm is raised with sound and with red and yellow warning lights. The KTX control center then makes a rapid decision on whether the express train should be stopped immediately.
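
    A hedged sketch of the kind of threshold check described above: low-pass the acceleration record below 10 Hz, take the peak ground acceleration, and compare it with a pre-determined alarm level; the sampling rate, threshold and synthetic record are assumptions for the example, not KTX operational values.

      import numpy as np
      from scipy.signal import butter, filtfilt

      def pga_alarm(accel_g, fs=100.0, corner_hz=10.0, alarm_level_g=0.05):
          """Low-pass the record below `corner_hz` to suppress high-frequency
          artificial noise, compute the peak ground acceleration of the window,
          and report whether it exceeds the alarm level."""
          b, a = butter(4, corner_hz / (0.5 * fs), btype="low")
          filtered = filtfilt(b, a, accel_g)
          pga = float(np.max(np.abs(filtered)))
          return pga, pga > alarm_level_g

      # One second of synthetic 100-Hz acceleration data (in g) containing a strong pulse.
      record = 0.002 * np.random.randn(100)
      record[50:60] += 0.08
      print(pga_alarm(record))                      # e.g. (~0.07, True) -> trigger sound and lights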

  4. Helping safeguard Veterans Affairs' hospital buildings by advanced earthquake monitoring

    USGS Publications Warehouse

    Kalkan, Erol; Banga, Krishna; Ulusoy, Hasan S.; Fletcher, Jon Peter B.; Leith, William S.; Blair, James L.

    2012-01-01

    In collaboration with the U.S. Department of Veterans Affairs (VA), the National Strong Motion Project of the U.S. Geological Survey has recently installed sophisticated seismic systems that will monitor the structural integrity of hospital buildings during earthquake shaking. The new systems have been installed at more than 20 VA medical campuses across the country. These monitoring systems, which combine sensitive accelerometers and real-time computer calculations, are capable of determining the structural health of each structure rapidly after an event, helping to ensure the safety of patients and staff.

  5. Earthquake Monitoring at Different Scales with Seiscomp3

    NASA Astrophysics Data System (ADS)

    Grunberg, M.; Engels, F.

    2013-12-01

In the last few years, the French National Network of Seismic Survey (BCSF-RENASS) had to modernize its old and aging earthquake monitoring system, which came from an in-house development. After having tried and conducted intensive tests on several real-time frameworks such as Earthworm and SeisComP3, we finally adopted SeisComP3 in 2012. Our current system runs with two pipelines in parallel: the first one is tuned at a global scale to monitor world seismicity (for events of magnitude > 5.5) and the second one is tuned at a national scale for the monitoring of metropolitan France. The seismological stations used for the "world" pipeline come mainly from the Global Seismographic Network (GSN), whereas for the "national" pipeline the stations come from the RENASS short-period network and from the RESIF broadband network. More recently we have started to tune SeisComP3 at a smaller scale to monitor in real time a geothermal project (an R&D program in deep geothermal energy) in the northeastern part of France. Besides the real-time monitoring capabilities of SeisComP3, we have also used a very handy feature to play back a four-month dataset at a local scale for the Rambervillers earthquake (22/02/2003, Ml=5.4), leading to roughly 2000 aftershock detections and localisations.

  6. Monitoring the ionosphere during the earthquake on GPS data

    NASA Astrophysics Data System (ADS)

    Smirnov, V. M.; Smirnova, E. V.

The problem of estimating the stability of the physical state of the atmosphere attracts rapt attention from the world community, but it is still far from being solved. Many global atmospheric processes that have a direct influence upon all forms of life on Earth have been detected. Comprehending the cause-effect relations that govern their origin and development is possible only on the basis of long-term sequences of observational data on the time-space variations of atmospheric characteristics, which should be received on a global scale and over an interval of altitudes as broad as possible. Such data can be obtained only with the application of satellite systems. The latest research has shown that satellite systems can be successfully used for global and continuous monitoring of the Earth's ionosphere. In turn, the ionosphere can serve as a reliable indicator of different kinds of effects on the environment, both of natural and anthropogenic origin. Nowadays the problem of the short-term forecast of earthquakes has achieved a new level of understanding. Indisputable factors have been revealed which show that the ionospheric anomalies observed during the preparation of seismic events contain information allowing them to be detected and interpreted as earthquake precursors. A partial solution of the earthquake forecast problem based on ionospheric variations requires processing data received simultaneously from extensive territories. Such requirements can be met only on the basis of a ground-space system of ionosphere monitoring. The navigating systems

  7. Integrating geomatics and structural investigation in post-earthquake monitoring of ancient monumental Buildings

    NASA Astrophysics Data System (ADS)

    Dominici, Donatella; Galeota, Dante; Gregori, Amedeo; Rosciano, Elisa; Alicandro, Maria; Elaiopoulos, Michail

    2014-06-01

    The old city center of L’Aquila is rich in historical buildings of considerable merit. On April 6th 2009 a devastating earthquake caused significant structural damages, affecting especially historical and monumental masonry buildings. The results of a study carried out on a monumental building, former headquarters of the University of L’Aquila (The Camponeschi building, XVI century) are presented in this paper. The building is situated in the heart of the old city center and was seriously damaged by the earthquake. Preliminary visual damage analysis carried out immediately after the quake, clearly evidenced the building’s complexity, raising the need for direct and indirect investigation on the structure. Several non-destructive test methods were then performed in situ to better characterize the masonry typology and the damage distribution, as well. Subsequently, a number of representative control points were identified on the building’s facades to represent, by their motion over time, the evolution of the structural displacements and deformations. In particular, a surveying network consisting of 27 different points was established. A robotic total station mounted on top of a concrete pillar was used for periodically monitoring the surveying control network. Stability of the pillar was checked through a GNSS static survey repeated before any set of measurements. The present study evidences the interesting possibilities of combining geomatics with structural investigation during post-earthquake monitoring of ancient monumental buildings.

  8. Major improvements in progress for Southern California Earthquake Monitoring

    NASA Astrophysics Data System (ADS)

    Mori, Jim; Kanamori, Hiroo; Davis, James; Hauksson, Egill; Clayton, Robert; Heaton, Thomas; Jones, Lucile; Shakal, Anthony; Porcella, Ron

    Major improvements in seismic and strong-motion monitoring networks are being implemented in southern California to better meet the needs of emergency response personnel, structural engineers, and the research community in promoting earthquake hazard reduction. Known as the TriNet project, the improvements are being coordinated by the California Institute of Technology (Caltech), the U.S. Geological Survey (USGS), and the California Division of Mines and Geology (CDMG) of the state's Department of Conservation. Already the ambitious instrument and system development project has started to record and disseminate ground motions from a spatially dense and robust network of high quality seismographs.

  9. Collaborative Projects at the Northern California Earthquake Data Center (NCEDC)

    NASA Astrophysics Data System (ADS)

    Neuhauser, D.; Oppenheimer, D.; Zuzlewski, S.; Gee, L.; Murray, M.; Bassett, A.; Prescott, W.; Romanowicz, B.

    2001-12-01

    The NCEDC is a joint project of the UC Berkeley Seismological Laboratory and the USGS Menlo Park to provide a long-term archive and distribution center for geophysical data for northern California. Most data are available via the Web at http://quake.geo.berkeley.edu and research accounts are available for access to specialized datasets. Current efforts continue to expand the available datasets and to enhance distribution methods. The NCEDC currently archives continuous and event seismic waveform data from the BDSN and the USGS NCSN. Data from the BDSN are available in SEED and work is underway to make NCSN data available in this format. This massive project requires assembling and tracking the instrument responses from over 5000 current and historic NCSN data channels. Event waveforms from specialized networks, such as Geysers and Parkfield, are also available. In collaboration with the USGS, the NCEDC has archived a total of 887 channels from 139 sites of the "USGS low-frequency" geophysical network (UL), including data from strainmeters, creep meters, magnetometers, water well levels, and tiltmeters. There are 486 current data channels being updated at the NCEDC on a daily basis. All UL data are available in SEED. Data from the BARD network of over 40 continuously recording GPS sites are archived at the NCEDC in both raw and RINEX format. The NCEDC is now the primary archive for survey-mode GPS and other geodetic data collected in northern California by the USGS, universities, and other agencies. All of the BARD data and GPS data archived from USGS Menlo Park surveys are now available from the NCEDC via FTP. To support more portable and uniform data query programs among data centers, the NCEDC developed a set of Generic Data Center Views (GDVs) that incorporates the basic information that most datacenters maintain about data channels, instrument responses, and waveform inventory. We defined MSQL (Meta SeismiQuery Language), a query language based on the SQL SELECT command, to perform queries on the GDVs, and developed a program which converts the MSQL to an SQL request. MSQL2SQL converts the MSQL command into a parse tree, and defines an API allowing each datacenter to traverse the parse tree and revise it to produce a data center-specific SQL request. The NCEDC converted the IRIS SeismiQuery program to use the GDVs and MSQL, installed it at the NCEDC, and distributed the software to IRIS, SCEC-DC, and other interested parties. The resulting program should be much easier to install and support at other data centers. The NCEDC is also working on several data center integration projects in order to provide users with seamless access to data. The NCEDC is collaborating with IRIS on the NETDC project and with UNAVCO on the GPS Seamless Archive Centers initiative. Through the newly formed California Integrated Seismic Network, we are working with the SCEC-DC to provide unified access to California earthquake data.

  10. Monitoring fault zone environments with correlations of earthquake waveforms

    NASA Astrophysics Data System (ADS)

    Roux, Philippe; Ben-Zion, Yehuda

    2014-02-01

We develop a new technique for monitoring temporal changes in fault zone environments based on cross-correlation of earthquake waveforms recorded by pairs of stations. The method is applied to waveforms of 10 000 earthquakes observed during 100 d around the 1999 M 7.1 Duzce mainshock by a station located in the core damage zone of the North Anatolian Fault and a nearby station. To overcome clock problems, the correlation functions are realigned on a dominant peak. Consequently, the analysis focuses on measurements of coherency rather than traveltimes, and is based on the correlation coefficient of groups of events with a reference wavelet. Examination of coherency in different frequency bands reveals clear changes at a narrow band centred around 0.8 Hz. The results show a rapid drop of 1-2 per cent of the coherency at the time of the Duzce event followed by gradual recovery with several prominent oscillations over 4 d. The observed changes likely reflect evolution of permeability and fluid motion in the core damage zone of the North Anatolian Fault. Compared to noise correlation processing, our analysis of earthquake waveform correlation (i) benefits from a high level of coherence with short-duration recorded signals, (ii) has considerably finer temporal sampling of fault dynamics after mainshocks than is possible with noise correlation, (iii) uses the coherence level to track property variations, which may be more robust than traveltime fluctuations in the coda of noise correlations. Studies utilizing both earthquake and noise waveforms at multiple pairs of stations across fault damage zones can improve significantly the understanding of fault zone processes.
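
    A hedged sketch of the coherency measurement described above: band-pass two correlation functions around 0.8 Hz, realign them on their dominant peaks, and compute the correlation coefficient against a reference wavelet; the band limits, alignment scheme and synthetic traces are assumptions, not the exact processing of the study.

      import numpy as np
      from scipy.signal import butter, filtfilt

      def bandpass(trace, fs, fmin=0.6, fmax=1.0):
          b, a = butter(4, [fmin / (0.5 * fs), fmax / (0.5 * fs)], btype="band")
          return filtfilt(b, a, trace)

      def aligned_coherency(corr_func, reference, fs):
          """Realign a correlation function on its dominant peak and return its
          correlation coefficient with the reference wavelet."""
          cf, ref = bandpass(corr_func, fs), bandpass(reference, fs)
          shift = int(np.argmax(np.abs(cf)) - np.argmax(np.abs(ref)))
          cf = np.roll(cf, -shift)                  # peak-to-peak realignment
          return float(np.corrcoef(cf, ref)[0, 1])

      # Synthetic illustration: a reference wavelet and a shifted, noisier copy.
      fs = 20.0
      t = np.arange(0, 30, 1.0 / fs)
      reference = np.exp(-(t - 10.0) ** 2) * np.sin(2 * np.pi * 0.8 * t)
      perturbed = np.roll(reference, 12) + 0.05 * np.random.randn(t.size)
      print(aligned_coherency(perturbed, reference, fs))   # near 1; drops as the medium changes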

  11. Array monitoring of swarm earthquakes in the Pollino range (Italy)

    NASA Astrophysics Data System (ADS)

    Roessler, Dirk; Passarelli, Luigi; Govoni, Aladino; Rivalta, Eleonora

    2014-05-01

The Mercure Basin (MB) and the Castrovillari Fault (CF) in the Pollino range (southern Apennines, Italy) represent one of the most prominent seismic gaps in the Italian seismic catalog, with no M>6 earthquakes during the last centuries. In recent times, the MB has repeatedly been affected by seismic swarms; the most energetic swarm started in 2010 and was still active in 2013. The seismic activity culminated in autumn 2012 with a M=5 event on October 25. In contrast, the CF appears aseismic. Only the northern part of the CF has experienced microseismicity. The rheology of these faults is unclear. Current debates include the potential of the MB and the CF to host large earthquakes and the level and the style of deformation. Understanding the seismicity and the behaviour of the faults is therefore necessary to assess the seismic hazard. We have been monitoring the ongoing seismicity using a small-aperture seismic array, integrated in a temporary seismic network. The instruments are provided by the GFZ German Research Centre for Geosciences and INGV, Italy, and are operated in close collaboration between both institutes. Automated seismic array methods are applied to resolve the spatio-temporal evolution of the seismicity in great detail. Using the GFZ array, we detect about ten times more earthquakes than currently included in automatic local catalogues. The increase corresponds to an improvement in complete event detection down to M~0.5. Event locations and the magnitude-frequency distribution are analysed to characterise the swarm and investigate the possible role of fluids in earthquake triggering. In the course of the swarm, seismicity has mainly migrated within the Mercure Basin. However, the spread towards the northern end of the Castrovillari fault to the east in 2013 marks a swarm phase with seismicity located outside of the Mercure Basin. The observations characterize the behaviour of the faults and their inter-connection.

  12. Earthquakes

    MedlinePlus

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  13. Earthquakes.

    ERIC Educational Resources Information Center

    Pakiser, Louis C.

    One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake…

  14. Earthquakes.

    ERIC Educational Resources Information Center

    Walter, Edward J.

    1977-01-01

    Presents an analysis of the causes of earthquakes. Topics discussed include (1) geological and seismological factors that determine the effect of a particular earthquake on a given structure; (2) description of some large earthquakes such as the San Francisco quake; and (3) prediction of earthquakes. (HM)

  15. Southern California Earthquake Center--Virtual Display of Objects (SCEC-VDO): An Earthquake Research and Education Tool

    NASA Astrophysics Data System (ADS)

    Perry, S.; Maechling, P.; Jordan, T.

    2006-12-01

    Interns in the program Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT, an NSF Research Experience for Undergraduates Site) have designed, engineered, and distributed SCEC-VDO (Virtual Display of Objects), interactive software used by earthquake scientists and educators to integrate and visualize global and regional, georeferenced datasets. SCEC-VDO is written in Java/Java3D with an extensible, scalable architecture. An increasing number of SCEC-VDO datasets are obtained on the fly through web services and connections to remote databases, and user sessions may be saved in XML-encoded files. Currently users may display time-varying sequences of earthquake hypocenters and focal mechanisms, several 3-dimensional fault and rupture models, satellite imagery (optionally draped over digital elevation models), and cultural datasets including political boundaries. The ability to juxtapose and interactively explore these data and their temporal and spatial relationships has been particularly important to SCEC scientists who are evaluating fault and deformation models, or who must quickly evaluate the menace of evolving earthquake sequences. Additionally, SCEC-VDO users can annotate the display and can script and render animated movies with adjustable compression levels. SCEC-VDO movies are excellent communication tools and have been featured in scientific presentations, classrooms, press conferences, and television reports.

  16. Southern California Earthquake Center (SCEC) Communication, Education and Outreach Program

    NASA Astrophysics Data System (ADS)

    Benthien, M. L.

    2003-12-01

    The SCEC Communication, Education, and Outreach Program (CEO) offers student research experiences, web-based education tools, classroom curricula, museum displays, public information brochures, online newsletters, and technical workshops and publications. This year, much progress has been made on the development of the Electronic Encyclopedia of Earthquakes (E3), a collaborative project with CUREE and IRIS. The E3 development system is now fully operational, and 165 entries are in the pipeline. When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth Science, engineering, physics and mathematics. To coordinate activities for the 10-year anniversary of the Northridge Earthquake in 2004 (and beyond), the "Earthquake Country Alliance" is being organized by SCEC CEO to present common messages, to share or promote existing resources, and to develop new activities and products jointly (such as a new version of Putting Down Roots in Earthquake Country). The group includes earthquake science and engineering researchers and practicing professionals, preparedness experts, response and recovery officials, news media representatives, and education specialists. A web portal, http://www.earthquakecountry.info, is being developed with links to web pages and descriptions of other resources and services that the Alliance members provide. Another ongoing strength of SCEC is the Summer Intern program, which now has a year-round counterpart with students working on IT projects at USC. Since Fall 2002, over 32 students have participated in the program, including 7 students working with scientists throughout SCEC, 17 students involved in the USC "Earthquake Information Technology" intern program, and 7 students involved in CEO projects. These and other activities of the SCEC CEO program will be presented, along with lessons learned during program design and implementation.

  17. CSEP Testing Center and the first results of the earthquake forecast testing experiment in Japan

    NASA Astrophysics Data System (ADS)

    Tsuruoka, H.; Hirata, N.; Schorlemmer, D.; Euchner, F.; Nanjo, K. Z.; Jordan, T. H.

    2012-08-01

    Major objectives of the Japanese earthquake prediction research program for the period 2009-2013 are to create earthquake forecasting models and begin the prospective testing of these models against recorded seismicity. For this purpose, the Earthquake Research Institute of the University of Tokyo has joined an international partnership to create a Collaboratory for the Study of Earthquake Predictability (CSEP). Here, we describe a new infrastructure for developing and evaluating forecasting models—the CSEP Japan Testing Center—as well as some preliminary testing results. On 1 November 2009, the Testing Center started a prospective and competitive earthquake predictability experiment using the seismically active and well-instrumented region of Japan as a natural laboratory.

  18. Athens Neutron Monitor Data Processing Center - ANMODAP Center

    NASA Astrophysics Data System (ADS)

    Mavromichalaki, H.; Gerontidou, M.; Mariatos, G.; Papailiou, M.; Papaioannou, A.; Plainaki, C.; Sarlanis, C.; Souvatzoglou, G.

    2009-11-01

    Cosmic ray measurements in Athens were initiated in November 2000 with a standard 6NM-64 neutron monitor. In recent years an effort has been made to construct an effective real-time database of neutron monitor (NM) and satellite data to serve the needs of space weather monitoring (Athens Neutron Monitor Data Processing Center - ANMODAP Center). The long-term goal of this network is to receive all data in real time, in close sequence, from all servers around the globe. A graphical representation of all these data in real time is available through the website of the station (http://cosray.phys.uoa.gr). Moreover, a second database that collects data with 1-min resolution operates in parallel. The online services, such as a special 'Alert' algorithm for Ground Level Enhancements (GLEs), and models created to analyze aspects of GLEs, such as the neutron monitor Basic Anisotropic Neutron Ground Level Enhancement (BANGLE) model and the Forbush Decrease (FORD) model, are presented. Finally, a short account of work on the possible relationship between geomagnetic activity levels and biological effects is given.

  19. Monitoring road losses for Lushan 7.0 earthquake disaster utilization multisource remote sensing images

    NASA Astrophysics Data System (ADS)

    Huang, He; Yang, Siquan; Li, Suju; He, Haixia; Liu, Ming; Xu, Feng; Lin, Yueguan

    2015-12-01

    Earthquakes are among the major natural disasters in the world. At 8:02 on 20 April 2013, a catastrophic earthquake with Ms 7.0 in surface wave magnitude occurred in Sichuan province, China. The epicenter was located in the administrative region of Lushan County, so the event was named the Lushan earthquake. The Lushan earthquake caused heavy casualties and property losses in Sichuan province. After the earthquake, various emergency relief supplies had to be transported to the affected areas. The transportation network is the basis for transporting and allocating emergency relief supplies, so the road losses of the Lushan earthquake had to be monitored. The road-loss monitoring results for the Lushan earthquake disaster, obtained using multisource remote sensing images, are reported in this paper. The monitoring results indicate that 166 m of national roads, 3707 m of provincial roads, 3396 m of county roads, 7254 m of township roads, and 3943 m of village roads were damaged during the Lushan earthquake disaster. The damaged roads were mainly located in Lushan County, Baoxing County, Tianquan County, Yucheng County, Mingshan County, and Qionglai County. The results can also be used as a decision-making information source for disaster management authorities in China.

  20. Earthquakes

    ERIC Educational Resources Information Center

    Roper, Paul J.; Roper, Jere Gerard

    1974-01-01

    Describes the causes and effects of earthquakes, defines the meaning of magnitude (measured on the Richter Magnitude Scale) and intensity (measured on a modified Mercalli Intensity Scale) and discusses earthquake prediction and control. (JR)

  1. Earthquakes

    MedlinePlus

    ... National Science Foundation National Institute of Standards and Technology Publications If you require more information about any of these topics, the following resources may be helpful. America’s PrepareAthon! How to Prepare for Earthquakes Earthquake Preparedness: ...

  2. Romanian Data Center: A modern way for seismic monitoring

    NASA Astrophysics Data System (ADS)

    Neagoe, Cristian; Marius Manea, Liviu; Ionescu, Constantin

    2014-05-01

    The main seismic monitoring of Romania is performed by the National Institute for Earth Physics (NIEP), which operates a real-time digital seismic network. The NIEP real-time network currently consists of 102 stations and two seismic arrays equipped with different high-quality digitizers (Kinemetrics K2, Quanterra Q330, Quanterra Q330HR, PS6-26, Basalt), broadband and short-period seismometers (CMG3ESP, CMG40T, KS2000, KS54000, KS2000, CMG3T, STS2, SH-1, S13, Mark l4c, Ranger, gs21, Mark l22) and acceleration sensors (Episensor Kinemetrics). The data are transmitted to the National Data Center (NDC) and the Eforie Nord (EFOR) Seismic Observatory. EFOR is the back-up for the NDC and also a monitoring center for Black Sea tsunami events. NIEP is a data acquisition node for the seismic network of Moldova (FDSN code MD), composed of five seismic stations. NIEP has installed eight seismic stations equipped with broadband sensors and Episensors in the northern part of Bulgaria, and nine accelerometers (Episensors) in nine districts along the Danube River. All the data are acquired at NIEP for the Early Warning System and for primary estimation of earthquake parameters. Real-time acquisition (RT) and data exchange are done with Antelope software and SeedLink (from Seiscomp3). Real-time data communication is ensured by different types of transmission: GPRS, satellite, radio, Internet and a dedicated line provided by a governmental network. For data processing and analysis at the two data centers, Antelope 5.2 is used, running on three workstations: one on a CentOS platform and two on MacOS. A Seiscomp3 server also stands as back-up for Antelope 5.2. Both the acquisition and analysis systems produce information about local and global earthquake parameters. In addition, Antelope is used for manual processing (event association, magnitude calculation, database creation, distribution of seismic bulletins, calculation of PGA and PGV, etc.), for generating ShakeMap products and for interaction with global data centers. The National Data Center has developed tools to centralize data from software such as Antelope and Seiscomp3. These tools allow rapid distribution to the public of information about damage observed after an earthquake. Another feature of the developed application is the alerting of designated persons, via email and SMS, based on the earthquake parameters. In parallel, Seiscomp3 sends automatic notifications (emails) with the earthquake parameters. The real-time seismic network and the acquisition and data-processing software used at the National Data Center have increased the number of events detected locally and globally, improved the quality of the parameters obtained from data processing, and raised its national and international visibility.

  3. The Pacific Tsunami Warning Center response to the Mw8.1 Samoan earthquake of September 29, 2009

    NASA Astrophysics Data System (ADS)

    Hirshorn, B. F.; Becker, N.; Weinstein, S.

    2009-12-01

    Over 90% of tsunami-related casualties occur within a few hundred km of the causative event, usually an earthquake. The Mw 8.1 (GCMT) Samoan earthquake and tsunami of September 29, 2009, represents a best-case scenario for response and self-evacuation by a population near the epicenter of a tsunamigenic earthquake. The Samoan population felt over 60 seconds of strong ground shaking and saw the first tsunami wave motion as a recession rather than an onshore wave. Their observations coupled with effective public awareness saved many lives. Such phenomena do not precede all dangerous tsunamis, however, and Samoans may not receive these natural warnings for future local tsunamis. For example, a “tsunami earthquake” (Kanamori, 1972) can generate a destructive tsunami with little or no strong ground motion (cf. Nicaragua 1992 and Java 2006). Furthermore, if the Samoan earthquake had ruptured as a thrust mechanism more typical for the nearby subduction zone, then the first observed tsunami wave would have likely caused inundation, and thus the ocean would not have warned the population. The Pacific Tsunami Warning Center (PTWC) mitigates such hazards by monitoring earthquakes in real time and using semi-automated analysis to rapidly characterize seismic sources for their tsunami-generating potential in order to warn coastlines of any tsunami threats. As part of its mission PTWC also uses a dense local seismic network in order to produce local warnings for the State of Hawaii within 3 minutes of earthquake origin time. In this presentation we detail the analysis and response performed by the PTWC for the Samoan event. We highlight how the current sparse deployment of seismometers in the southwest Pacific Ocean resulted in PTWC issuing a warning 16 minutes after the earthquake's origin time, as compared to what can be done using a denser seismic network. Therefore, we advocate for a denser network of seismometers in the region that will allow the PTWC to halve the time needed to issue tsunami warnings after future earthquakes in the region that may not be as well suited for local response and self-evacuation as this recent event. Currently, there are new and developing seismic networks in Tonga, Fiji and Samoa. These data will be needed to reduce the time lapse between the earthquake and the tsunami warning.

  4. Remote monitoring of the earthquake cycle using satellite radar interferometry

    NASA Astrophysics Data System (ADS)

    Wright, Tim J.

    2002-12-01

    The earthquake cycle is poorly understood. Earthquakes continue to occur on previously unrecognized faults. Earthquake prediction seems impossible. These remain the facts despite nearly 100 years of intensive study since the earthquake cycle was first conceptualized. Using data acquired from satellites in orbit 800 km above the Earth, a new technique, radar interferometry (InSAR), has the potential to solve these problems. For the first time, detailed maps of the warping of the Earth's surface during the earthquake cycle can be obtained with a spatial resolution of a few tens of metres and a precision of a few millimetres. InSAR does not need equipment on the ground or expensive field campaigns, so it can gather crucial data on earthquakes and the seismic cycle from some of the remotest areas of the planet. In this article, I review some of the remarkable observations of the earthquake cycle already made using radar interferometry and speculate on breakthroughs that are tantalizingly close.

  5. Quantifying 10 years of improved earthquake-monitoring performance in the Caribbean region

    USGS Publications Warehouse

    McNamara, Daniel E.; Hillebrandt-Andrade, Christa; Saurel, Jean-Marie; Huerfano-Moreno, V.; Lynch, Lloyd

    2015-01-01

    Over 75 tsunamis have been documented in the Caribbean and adjacent regions during the past 500 years. Since 1500, at least 4484 people are reported to have perished in these killer waves. Hundreds of thousands are currently threatened along the Caribbean coastlines. Were a great tsunamigenic earthquake to occur in the Caribbean region today, the effects would potentially be catastrophic due to an increasingly vulnerable region that has seen significant population increases in the past 40–50 years and currently hosts an estimated 500,000 daily beach visitors from North America and Europe, a majority of whom are not likely aware of tsunami and earthquake hazards. Following the magnitude 9.1 Sumatra–Andaman Islands earthquake of 26 December 2004, the United Nations Educational, Scientific and Cultural Organization (UNESCO) Intergovernmental Coordination Group (ICG) for the Tsunami and other Coastal Hazards Early Warning System for the Caribbean and Adjacent Regions (CARIBE‐EWS) was established and developed minimum performance standards for the detection and analysis of earthquakes. In this study, we model earthquake‐magnitude detection threshold and P‐wave detection time and demonstrate that the requirements established by the UNESCO ICG CARIBE‐EWS are met with 100% of the network operating. We demonstrate that earthquake‐monitoring performance in the Caribbean Sea region has improved significantly in the past decade as the number of real‐time seismic stations available to the National Oceanic and Atmospheric Administration tsunami warning centers have increased. We also identify weaknesses in the current international network and provide guidance for selecting the optimal distribution of seismic stations contributed from existing real‐time broadband national networks in the region.

  6. Real-time earthquake monitoring using a search engine method.

    PubMed

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-01-01

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake's parameters in <1 s after receiving the long-period surface wave data. PMID:25472861
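
    As a conceptual illustration of the matching step, the sketch below does a brute-force nearest-neighbour comparison of an observed long-period waveform against a database of pre-computed synthetics, returning the source parameters of the best-fitting entry. The published method owes its speed to a fast search index, which this toy version deliberately omits; all names are illustrative.

```python
# Conceptual sketch of database waveform matching by brute-force search.
import numpy as np

def normalise(w):
    """Remove the mean and scale to unit energy."""
    w = np.asarray(w, dtype=float) - np.mean(w)
    return w / (np.linalg.norm(w) + 1e-12)

def best_match(observed, database):
    """observed: 1-D waveform; database: list of (source_params, waveform)
    with synthetics sampled identically to the observation."""
    obs = normalise(observed)
    scores = [float(np.dot(obs, normalise(syn))) for _, syn in database]
    k = int(np.argmax(scores))
    return database[k][0], scores[k]

# usage: params, fit = best_match(obs_trace, synth_db)
# params would carry the candidate location, magnitude and focal mechanism.
```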

  7. The Development of an Earthquake Preparedness Plan for a Child Care Center in a Geologically Hazardous Region.

    ERIC Educational Resources Information Center

    Wokurka, Linda

    The director of a child care center at a community college in California developed an earthquake preparedness plan for the center which met state and local requirements for earthquake preparedness at schools. The plan consisted of: (1) the identification and reduction of nonstructural hazards in classrooms, office, and staff rooms; (2) storage of…

  8. Advanced Real-time Monitoring System and Simulation Researches for Earthquakes and Tsunamis in Japan -Towards Disaster Mitigation on Earthquakes and Tsunamis-

    NASA Astrophysics Data System (ADS)

    Hyodo, M.; Kaneda, Y.; Takahashi, N.; Baba, T.; Hori, T.; Kawaguchi, K.; Araki, E.; Matsumoto, H.; Nakamura, T.; Kamiya, S.; Ariyoshi, K.; Nakano, M.; Choi, J. K.; Nishida, S.

    2014-12-01

    How can damage from earthquakes and tsunamis be mitigated and reduced? This is a very important and unavoidable question for Japan and other countries with high seismicity. Based on lessons learned from the 2004 Sumatra earthquake/tsunami and the 2011 East Japan earthquake/tsunami, we recognized the importance of real-time monitoring of these natural hazards. As a real-time monitoring system, DONET1 (Dense Ocean floor Network for Earthquakes and Tsunamis) was deployed, and DONET2 is being developed, around the Nankai trough, southwestern Japan, for seismology and earthquake/tsunami early warning. Based on simulation studies, DONET1 and DONET2, with multiple kinds of sensors such as broadband seismometers and precise pressure gauges, are expected to monitor slow events such as low-frequency tremors and slow earthquakes, for estimation of the seismic stage, i.e. whether the fault is in the inter-seismic or pre-seismic stage. In advanced simulation research, such as modelling the recurrence cycle of megathrust earthquakes, data assimilation is a very powerful tool for improving reliability. Furthermore, simulations of tsunami inundation, of the seismic response of buildings and cities, and agent-based simulations are very important for future disaster mitigation programs and related measures. Finally, real-time monitoring data and advanced simulations will be integrated for precise earthquake/tsunami early warning and for estimation of damage in future compound earthquake and tsunami disasters. We introduce the present progress of this advanced research and the future scope of disaster mitigation research on earthquakes and tsunamis.

  9. Real-time earthquake monitoring using a search engine method

    PubMed Central

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-01-01

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake’s parameters in <1 s after receiving the long-period surface wave data. PMID:25472861

  10. Real-time earthquake monitoring using a search engine method

    NASA Astrophysics Data System (ADS)

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-12-01

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake’s parameters in <1 s after receiving the long-period surface wave data.

  11. Remote monitoring of the earthquake cycle using satellite radar interferometry.

    PubMed

    Wright, Tim J

    2002-12-15

    The earthquake cycle is poorly understood. Earthquakes continue to occur on previously unrecognized faults. Earthquake prediction seems impossible. These remain the facts despite nearly 100 years of intensive study since the earthquake cycle was first conceptualized. Using data acquired from satellites in orbit 800 km above the Earth, a new technique, radar interferometry (InSAR), has the potential to solve these problems. For the first time, detailed maps of the warping of the Earth's surface during the earthquake cycle can be obtained with a spatial resolution of a few tens of metres and a precision of a few millimetres. InSAR does not need equipment on the ground or expensive field campaigns, so it can gather crucial data on earthquakes and the seismic cycle from some of the remotest areas of the planet. In this article, I review some of the remarkable observations of the earthquake cycle already made using radar interferometry and speculate on breakthroughs that are tantalizingly close. PMID:12626271

  12. Real-Time GPS Monitoring for Earthquake Rapid Assessment in the San Francisco Bay Area

    NASA Astrophysics Data System (ADS)

    Guillemot, C.; Langbein, J. O.; Murray, J. R.

    2012-12-01

    The U.S. Geological Survey Earthquake Science Center has deployed a network of eight real-time Global Positioning System (GPS) stations in the San Francisco Bay area and is implementing software applications to continuously evaluate the status of the deformation within the network. Real-time monitoring of the station positions is expected to provide valuable information for rapidly estimating source parameters should a large earthquake occur in the San Francisco Bay area. Because earthquake response applications require robust data access, as a first step we have developed a suite of web-based applications which are now routinely used to monitor the network's operational status and data streaming performance. The web tools provide continuously updated displays of important telemetry parameters such as data latency and receive rates, as well as source voltage and temperature information within each instrument enclosure. Automated software on the backend uses the streaming performance data to mitigate the impact of outages, radio interference and bandwidth congestion on deformation monitoring operations. A separate set of software applications manages the recovery of lost data due to faulty communication links. Displacement estimates are computed in real-time for various combinations of USGS, Plate Boundary Observatory (PBO) and Bay Area Regional Deformation (BARD) network stations. We are currently comparing results from two software packages (one commercial and one open-source) used to process 1-Hz data on the fly and produce estimates of differential positions. The continuous monitoring of telemetry makes it possible to tune the network to minimize the impact of transient interruptions of the data flow, from one or more stations, on the estimated positions. Ongoing work is focused on using data streaming performance history to optimize the quality of the position, reduce drift and outliers by switching to the best set of stations within the network, and automatically select the "next best" station to use as reference. We are also working towards minimizing the loss of streamed data during concurrent data downloads by improving file management on the GPS receivers.

  13. Current Development at the Southern California Earthquake Data Center (SCEDC)

    NASA Astrophysics Data System (ADS)

    Appel, V. L.; Clayton, R. W.

    2005-12-01

    Over the past year, the SCEDC completed or is near completion of three featured projects: Station Information System (SIS) Development: The SIS will provide users with an interface into complete and accurate station metadata for all current and historic data at the SCEDC. The goal of this project is to develop a system that can interact with a single database source to enter, update and retrieve station metadata easily and efficiently. The system will provide accurate station/channel information for active stations to the SCSN real-time processing system, as well as station/channel information for stations that have parametric data at the SCEDC, i.e., for users retrieving data via STP. Additionally, the SIS will supply information required to generate dataless SEED and COSMOS V0 volumes and allow stations to be added to the system with a minimal but incomplete set of information, using predefined defaults that can be easily updated as more information becomes available. Finally, the system will facilitate statewide metadata exchange for both real-time processing and provide a common approach to CISN historic station metadata. Moment Tensor Solutions: The SCEDC is currently archiving and delivering Moment Magnitudes and Moment Tensor Solutions (MTS) produced by the SCSN in real-time and post-processing solutions for events spanning back to 1999. The automatic MTS runs on all local events with magnitudes > 3.0, and all regional events > 3.5. The distributed solution automatically creates links from all USGS Simpson Maps to a text e-mail summary solution, creates a .gif image of the solution, and updates the moment tensor database tables at the SCEDC. Searchable Scanned Waveforms Site: The Caltech Seismological Lab has made available 12,223 scanned images of pre-digital analog recordings of major earthquakes recorded in Southern California between 1962 and 1992 at http://www.data.scec.org/research/scans/. The SCEDC has developed a searchable web interface that allows users to search the available files, select multiple files for download and then retrieve a zipped file containing the results. Scanned images of paper records for M>3.5 southern California earthquakes and several significant teleseisms are available for download via the SCEDC through this search tool.

  14. Twitter Seismology: Earthquake Monitoring and Response in a Social World

    NASA Astrophysics Data System (ADS)

    Bowden, D. C.; Earle, P. S.; Guy, M.; Smoczyk, G.

    2011-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment USGS earthquake response products and the delivery of hazard information. The potential uses of Twitter for earthquake response include broadcasting earthquake alerts, rapidly detecting widely felt events, qualitatively assessing earthquake damage effects, communicating with the public, and participating in post-event collaboration. Several seismic networks and agencies are currently distributing Twitter earthquake alerts, including the European-Mediterranean Seismological Centre (@LastQuake), Natural Resources Canada (@CANADAquakes), and the Indonesian meteorological agency (@infogempabmg); the USGS will soon distribute alerts via the @USGSted and @USGSbigquakes Twitter accounts. Beyond broadcasting alerts, the USGS is investigating how to use tweets that originate near the epicenter to detect and characterize shaking events. This is possible because people begin tweeting immediately after feeling an earthquake, and their short narratives and exclamations are available for analysis within tens of seconds of the origin time. Using five months of tweets that contain the word "earthquake" and its equivalent in other languages, we generate a tweet-frequency time series. The time series clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a simple Short-Term-Average / Long-Term-Average algorithm similar to that commonly used to detect seismic phases. As with most auto-detection algorithms, the parameters can be tuned to catch more or fewer events at the cost of more or fewer false triggers. When tuned to a moderate sensitivity, the detector found 48 globally-distributed, confirmed seismic events with only 2 false triggers. A space-shuttle landing and "The Great California ShakeOut" caused the false triggers. This number of detections is very small compared to the 5,175 earthquakes in the USGS PDE global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on Tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 80% occurred within 2 minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided (very) short first-impression narratives from people who experienced the shaking. The USGS will continue investigating how to use Twitter and other forms of social media to augment its current suite of seismographically derived products.
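
    The following is a minimal sketch of the kind of Short-Term-Average / Long-Term-Average trigger described above, applied to a per-second count of earthquake-related tweets. Window lengths and the threshold are illustrative rather than the operational USGS settings, and a real-time version would use trailing (causal) windows.

```python
# Sketch of an STA/LTA trigger on a tweet-count time series (illustrative settings).
import numpy as np

def tweet_sta_lta_triggers(counts, sta_len=60, lta_len=600, threshold=5.0):
    """counts: 1-D array of tweets per second; returns one index per burst."""
    counts = np.asarray(counts, dtype=float)
    sta = np.convolve(counts, np.ones(sta_len) / sta_len, mode="same")
    lta = np.convolve(counts, np.ones(lta_len) / lta_len, mode="same") + 1e-6
    hits = np.flatnonzero(sta / lta > threshold)
    if hits.size == 0:
        return hits
    # keep only the first sample of each contiguous burst of triggers
    keep = np.insert(np.diff(hits) > sta_len, 0, True)
    return hits[keep]
```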

  15. Korea Integrated Seismic System tool(KISStool) for seismic monitoring and data sharing at the local data center

    NASA Astrophysics Data System (ADS)

    Park, J.; Chi, H. C.; Lim, I.; Jeong, B.

    2011-12-01

    The Korea Integrated Seismic System (KISS) is a backbone seismic network which distributes seismic data to different organizations in near-real time in Korea. The association of earthquake monitoring institutes has shared its seismic data through the KISS since 2003. Local data centers operating several remote stations are required to send their free-field seismic data to NEMA (National Emergency Management Agency) under Korean earthquake-hazard countermeasure law. An efficient tool is therefore very important for local data centers that want to rapidly detect local seismic intensity and to transfer seismic event information, including PGA, PGV, dominant frequency of the P wave, and raw data, to the nationwide data center. We developed KISStool (Korea Integrated Seismic System tool) for easy and convenient operation of a seismic network at a local data center. KISStool can display real-time waveforms when a station icon is clicked on the Google map, and real-time variations of PGA, PGV, and other data in a bar-type monitoring section. With KISStool, any local data center can transfer event information to NEMA, KMA (Korea Meteorological Agency) or other institutes through the KISS using UDP or TCP/IP protocols. KISStool is one of the most efficient ways to monitor and transfer earthquake event information at local data centers in Korea. KIGAM will provide KISStool not only to members of the monitoring association but also to local governments.
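
    The abstract does not describe the KISS wire format, so the snippet below is purely illustrative: a minimal UDP sender for an event-parameter message of the kind mentioned above (PGA, PGV, dominant P-wave frequency), with a hypothetical JSON payload, host and port.

```python
# Purely illustrative UDP sender; payload format, host and port are hypothetical.
import json
import socket

def send_event(params, host="127.0.0.1", port=9999):
    """Serialize an event-parameter dictionary and send it as one UDP datagram."""
    payload = json.dumps(params).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))

send_event({"station": "TEST", "pga_gal": 12.3, "pgv_cm_s": 0.8,
            "p_dominant_hz": 4.5, "origin_time": "2011-01-01T00:00:00Z"})
```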

  16. Potential utilization of the NASA/George C. Marshall Space Flight Center in earthquake engineering research

    NASA Technical Reports Server (NTRS)

    Scholl, R. E. (Editor)

    1979-01-01

    Earthquake engineering research capabilities of the National Aeronautics and Space Administration (NASA) facilities at George C. Marshall Space Flight Center (MSFC), Alabama, were evaluated. The results indicate that the NASA/MSFC facilities and supporting capabilities offer unique opportunities for conducting earthquake engineering research. Specific features that are particularly attractive for large scale static and dynamic testing of natural and man-made structures include the following: large physical dimensions of buildings and test bays; high loading capacity; wide range and large number of test equipment and instrumentation devices; multichannel data acquisition and processing systems; technical expertise for conducting large-scale static and dynamic testing; sophisticated techniques for systems dynamics analysis, simulation, and control; and capability for managing large-size and technologically complex programs. Potential uses of the facilities for near and long term test programs to supplement current earthquake research activities are suggested.

  17. Earthquake!

    ERIC Educational Resources Information Center

    Hernandez, Hildo

    2000-01-01

    Examines the types of damage experienced by California State University at Northridge during the 1994 earthquake and discusses the lessons learned in handling this emergency. The problem of loose asbestos is addressed. (GR)

  18. Earthquakes

    USGS Publications Warehouse

    Shedlock, Kaye M.; Pakiser, Louis Charles

    1998-01-01

    One of the most frightening and destructive phenomena of nature is a severe earthquake and its terrible aftereffects. An earthquake is a sudden movement of the Earth, caused by the abrupt release of strain that has accumulated over a long time. For hundreds of millions of years, the forces of plate tectonics have shaped the Earth as the huge plates that form the Earth's surface slowly move over, under, and past each other. Sometimes the movement is gradual. At other times, the plates are locked together, unable to release the accumulating energy. When the accumulated energy grows strong enough, the plates break free. If the earthquake occurs in a populated area, it may cause many deaths and injuries and extensive property damage. Today we are challenging the assumption that earthquakes must present an uncontrollable and unpredictable hazard to life and property. Scientists have begun to estimate the locations and likelihoods of future damaging earthquakes. Sites of greatest hazard are being identified, and definite progress is being made in designing structures that will withstand the effects of earthquakes.

  19. 88 hours: the U.S. Geological Survey National Earthquake Information Center response to the March 11, 2011 Mw 9.0 Tohoku earthquake

    USGS Publications Warehouse

    Wald, David J.; Hayes, Gavin P.; Benz, Harley M.; Earle, Paul; Briggs, Richard W.

    2011-01-01

    The M 9.0 11 March 2011 Tohoku, Japan, earthquake and associated tsunami near the east coast of the island of Honshu caused tens of thousands of deaths and potentially over one trillion dollars in damage, resulting in one of the worst natural disasters ever recorded. The U.S. Geological Survey National Earthquake Information Center (USGS NEIC), through its responsibility to respond to all significant global earthquakes as part of the National Earthquake Hazards Reduction Program, quickly produced and distributed a suite of earthquake information products to inform emergency responders, the public, the media, and the academic community of the earthquake's potential impact and to provide scientific background for the interpretation of the event's tectonic context and potential for future hazard. Here we present a timeline of the NEIC response to this devastating earthquake in the context of rapidly evolving information emanating from the global earthquake-response community. The timeline includes both internal and publicly distributed products, the relative timing of which highlights the inherent tradeoffs between the requirement to provide timely alerts and the necessity for accurate, authoritative information. The timeline also documents the iterative and evolutionary nature of the standard products produced by the NEIC and includes a behind-the-scenes look at the decisions, data, and analysis tools that drive our rapid product distribution.

  20. Integrated monitoring of pre-earthquake signals in Taiwan

    NASA Astrophysics Data System (ADS)

    Lee, L. C.; Lin, C. H.

    2014-12-01

    In the search for possible earthquake precursors in the Taiwan area, there have been continuous measurements of gravity, geomagnetic perturbations, crustal deformation, ionospheric disturbances, groundwater level, and gas (radon) leaking from the crust over the past two decades. In 2010, a major project, "Integrated Earthquake Precursors and Early Warning for Seismic Disaster Prevention in Taiwan", was initiated by the Ministry of Science and Technology. Under this project, gamma-ray sensors, downhole strainmeters, telluric electric field measurements and thermal infrared analysis have also been established. In addition, an electric coupling model for the lithosphere-atmosphere-ionosphere was developed. In this talk, some important results from the integrated observations and theoretical models for pre-earthquake signals will be presented.

  1. The Pacific Tsunami Warning Center's Response to the Tohoku Earthquake and Tsunami

    NASA Astrophysics Data System (ADS)

    Weinstein, S. A.; Becker, N. C.; Shiro, B.; Koyanagi, K. K.; Sardina, V.; Walsh, D.; Wang, D.; McCreery, C. S.; Fryer, G. J.; Cessaro, R. K.; Hirshorn, B. F.; Hsu, V.

    2011-12-01

    The largest Pacific basin earthquake in 47 years, and also the largest magnitude earthquake since the Sumatra 2004 earthquake, struck off of the east coast of the Tohoku region of Honshu, Japan at 5:46 UTC on 11 March 2011. The Tohoku earthquake (Mw 9.0) generated a massive tsunami with runups of up to 40m along the Tohoku coast. The tsunami waves crossed the Pacific Ocean causing significant damage as far away as Hawaii, California, and Chile, thereby becoming the largest, most destructive tsunami in the Pacific Basin since 1960. Triggers on the seismic stations at Erimo, Hokkaido (ERM) and Matsushiro, Honshu (MAJO), alerted Pacific Tsunami Warning Center (PTWC) scientists 90 seconds after the earthquake began. Four minutes after its origin, and about one minute after the earthquake's rupture ended, PTWC issued an observatory message reporting a preliminary magnitude of 7.5. Eight minutes after origin time, the Japan Meteorological Agency (JMA) issued its first international tsunami message in its capacity as the Northwest Pacific Tsunami Advisory Center. In accordance with international tsunami warning system protocols, PTWC then followed with its first international tsunami warning message using JMA's earthquake parameters, including an Mw of 7.8. Additional Mwp, mantle wave, and W-phase magnitude estimations based on the analysis of later-arriving seismic data at PTWC revealed that the earthquake magnitude reached at least 8.8, and that a destructive tsunami would likely be crossing the Pacific Ocean. The earthquake damaged the nearest coastal sea-level station located 90 km from the epicenter in Ofunato, Japan. The NOAA DART sensor situated 600 km off the coast of Sendai, Japan, at a depth of 5.6 km recorded a tsunami wave amplitude of nearly two meters, making it by far the largest tsunami wave ever recorded by a DART sensor. Thirty minutes later, a coastal sea-level station at Hanasaki, Japan, 600 km from the epicenter, recorded a tsunami wave amplitude of nearly three meters. The evacuation of Hawaii's coastlines commenced at 7:31 UTC. Concurrent with this tsunami event, a widely-felt Mw 4.6 earthquake occurred beneath the island of Hawai`i at 8:58 UTC. PTWC responded within three minutes of origin time with a Tsunami Information Statement stating that the Hawaii earthquake would not generate a tsunami. After issuing 27 international tsunami bulletins to Pacific basin countries, and 16 messages to the State of Hawaii during a period of 25 hours after the event began, PTWC concluded its role during the Tohoku tsunami event with the issuance of the corresponding warning cancellation message at 6:36 UTC on 12 March 2011. During the following weeks, however, the PTWC would continue to respond to dozens of aftershocks related to the earthquake. We will present a complete timeline of PTWC's activities, both domestic and international, during the Tohoku tsunami event. We will also illustrate the immense number of website hits, phone calls, and media requests that flooded PTWC during the course of the event, as well as the growing role social media plays in communicating tsunami hazard information to the public.

  2. Towards implementation of the GRiD MT algorithm for near real-time calculation of moment tensors at the Alaska Earthquake Information Center

    NASA Astrophysics Data System (ADS)

    Macpherson, K. A.; Ruppert, N. A.; Dreger, D. S.; Lombard, P. N.; Freymueller, J. T.; Nicolsky, D.; Guilhem, A.

    2013-12-01

    The Alaska Earthquake Information Center (AEIC) locates approximately 30,000 earthquakes a year and is the primary source for earthquake information for the state of Alaska. This information is vital for the state, the most seismically active in the Union and home to significant infrastructure such as the Trans-Alaska Pipeline and Anchorage, a city with a population of over 295,000. The ability to quickly characterize an earthquake's moment and mechanism and make this information available to the public is a fundamental component of the AEIC's mission. In order to enhance the AEIC's capabilities in this regard, we are implementing the GRiD MT algorithm. This algorithm monitors a grid of potential sources by continuously cross-correlating pre-computed Green's functions with a data stream, allowing source locations and mechanisms to be determined rapidly. The algorithm has been employed effectively by the Berkeley Seismological Laboratory, the Earthquake Research Institute at the University of Tokyo, and by Academia Sinica in Taiwan. We show preliminary results for Alaska obtained by running the off-line, research version of the GRiD MT code for an 8°×8° grid that covers Anchorage and a segment of the Aleutian megathrust. Because even broad-band instruments may be off scale in the event of a large earthquake, we applied the algorithm to both strong-motion and high-rate GPS data. The results show that the algorithm is able to quickly produce accurate moment tensors for test cases employing both synthetic and real data. Based on these encouraging initial results, we are now incorporating GRiD MT into the AEIC's monitoring infrastructure by developing an interface for the Antelope real-time system and by expanding the grid to cover a larger portion of the Alaska region. Moment tensors determined by GRiD MT will complement the AEIC's existing real-time monitoring capability.
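
    The sketch below reduces the grid-scanning idea to its simplest form: for one time window, the long-period data at each monitored grid node are fit by a linear combination of that node's pre-computed Green's functions, and the node with the best variance reduction is reported together with its moment-tensor components. This is a conceptual illustration, not the GRiD MT code; array shapes and names are assumptions.

```python
# Conceptual grid scan: least-squares moment tensor at every node, best
# variance reduction wins. Not the GRiD MT implementation.
import numpy as np

def scan_grid(data, greens):
    """data: (n_samp,) concatenated long-period records for one window;
    greens: dict mapping node -> (n_samp, 6) Green's functions for the six
    independent moment-tensor components at that node."""
    best_node, best_vr, best_m = None, -np.inf, None
    for node, G in greens.items():
        m, *_ = np.linalg.lstsq(G, data, rcond=None)           # least-squares MT
        resid = data - G @ m
        vr = 1.0 - np.dot(resid, resid) / np.dot(data, data)   # variance reduction
        if vr > best_vr:
            best_node, best_vr, best_m = node, vr, m
    return best_node, best_vr, best_m
```

    Run continuously over sliding windows, this kind of scan flags both a preferred source location and a mechanism as soon as the variance reduction at some node exceeds a chosen detection level.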

  3. Glacier quakes mimicking volcanic earthquakes: The challenge of monitoring ice-clad volcanoes and some solutions

    NASA Astrophysics Data System (ADS)

    Allstadt, K.; Carmichael, J. D.; Malone, S. D.; Bodin, P.; Vidale, J. E.; Moran, S. C.

    2012-12-01

    Swarms of repeating earthquakes at volcanoes are often a sign of volcanic unrest. However, glaciers also can generate repeating seismic signals, so detecting unrest at glacier-covered volcanoes can be a challenge. We have found that multi-day swarms of shallow, low-frequency, repeating earthquakes occur regularly at Mount Rainier, a heavily glaciated stratovolcano in Washington, but that most swarms had escaped recognition until recently. Typically such earthquakes were too small to be routinely detected by the seismic network and were often buried in the noise on visual records, making the few swarms that had been detected seem more unusual and significant at the time they were identified. Our comprehensive search for repeating earthquakes through the past 10 years of continuous seismic data uncovered more than 30 distinct swarms of low-frequency earthquakes at Rainier, each consisting of hundreds to thousands of events. We found that these swarms locate high on the glacier-covered edifice, occur almost exclusively between late fall and early spring, and that their onset coincides with heavy snowfalls. We interpret the correlation with snowfall to indicate a seismically observable glacial response to snow loading. Efforts are underway to confirm this by monitoring glacier motion before and after a major snowfall event using ground based radar interferometry. Clearly, if the earthquakes in these swarms reflect a glacial source, then they are not directly related to volcanic activity. However, from an operational perspective they make volcano monitoring difficult because they closely resemble earthquakes that often precede and accompany volcanic eruptions. Because we now have a better sense of the background level of such swarms and know that their occurrence is seasonal and correlated with snowfall, it will now be easier to recognize if future swarms at Rainier are unusual and possibly related to volcanic activity. To methodically monitor for such unusual activity, we are implementing an automatic detection algorithm to continuously search for repeating earthquakes at Mount Rainier, an algorithm that we eventually intend to apply to other Cascade volcanoes. We propose that a comprehensive routine that characterizes background levels of repeating earthquakes and the degree of correlation with weather and seasonal forcing, combined with real-time monitoring for repeating earthquakes, will provide a means to more rapidly discriminate between glacier seismicity and seismicity related to volcanic activity on monitored glacier-clad volcanoes.
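
    One common way to find such repeating events in continuous data is template matching by normalised cross-correlation, sketched below in a deliberately simple, loop-based form; production detectors typically use FFT-based correlation across many stations and channels, and the threshold here is illustrative rather than the value used at Mount Rainier.

```python
# Template matching (normalised cross-correlation) for repeating events.
import numpy as np

def matched_filter(continuous, template, threshold=0.7):
    """Return (sample offset, correlation) where the template matches above threshold."""
    n = template.size
    t = (template - template.mean()) / (np.std(template) * n)
    detections = []
    for i in range(continuous.size - n + 1):
        win = continuous[i:i + n]
        sd = win.std()
        if sd == 0.0:
            continue  # skip flat (e.g. gap-filled) windows
        cc = float(np.dot(t, (win - win.mean()) / sd))
        if cc > threshold:
            detections.append((i, cc))
    return detections
```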

  4. Data Sets and Data Services at the Northern California Earthquake Data Center

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Zuzlewski, S.; Allen, R. M.

    2014-12-01

    The Northern California Earthquake Data Center (NCEDC) houses a unique and comprehensive data archive and provides real-time services for a variety of seismological and geophysical data sets that encompass northern and central California. We have over 80 terabytes of continuous and event-based time series data from broadband, short-period, strong-motion, and strain sensors as well as continuous and campaign GPS data at both standard and high sample rates in both raw and RINEX format. The Northern California Seismic System (NCSS), operated by UC Berkeley and USGS Menlo Park, has recorded over 890,000 events from 1984 to the present, and the NCEDC provides catalog, parametric information, moment tensors and first motion mechanisms, and time series data for these events. We also host and provide event catalogs, parametric information, and event waveforms for DOE enhanced geothermal system monitoring in northern California and Nevada. The NCEDC provides a variety of ways for users to access these data. The most recent development is web services, which provide interactive, command-line, or program-based workflow access to data. Web services use well-established server and client protocols and RESTful software architecture that allow users to easily submit queries and receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format, such as XML, RESP, simple text, or MiniSEED, depending on the service and selected output format. The NCEDC supports all FDSN-defined web services as well as a number of IRIS-defined and NCEDC-defined services. We also continue to support older email-based and browser-based access to data. NCEDC data and web services can be found at http://www.ncedc.org and http://service.ncedc.org.
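
    For example, continuous miniSEED can be pulled through the standard FDSN dataselect web service; the query below follows the service.ncedc.org address given above, while the network/station/channel codes and time window are placeholders.

```python
# Example FDSN dataselect request; channel codes and times are placeholders.
import requests

params = {
    "net": "BK", "sta": "CMB", "loc": "00", "cha": "BHZ",
    "starttime": "2014-08-24T10:20:00", "endtime": "2014-08-24T10:30:00",
    "format": "miniseed",
}
resp = requests.get("http://service.ncedc.org/fdsnws/dataselect/1/query",
                    params=params, timeout=60)
resp.raise_for_status()
with open("ncedc_waveform.mseed", "wb") as f:
    f.write(resp.content)  # raw miniSEED bytes, readable with standard tools
```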

  5. Earthquakes and submarine volcanism in the Northeast Pacific: Exploration in the time domain based on 21-years of hydroacoustic monitoring

    NASA Astrophysics Data System (ADS)

    Hammond, S. R.; Dziak, R. P.; Fox, C. G.

    2012-12-01

    Monitoring of regional seismic activity in the Northeast Pacific has been accomplished for the past 21 years using the US Navy's Sound Surveillance System (SOSUS) hydrophone arrays. Seafloor seismic activity in this region occurs along the spreading center and transform boundaries between the Juan de Fuca, Pacific and North American plates. During this time span, from 1991 through 2011, nearly 50,000 earthquakes were detected and located. The majority of these events were associated with these tectonic boundaries, but sections of several plate boundaries were largely aseismic during this period. While most of the earthquakes were associated with geological structures revealed in bathymetric maps of the region, there were also less easily explained intraplate events, including a swarm of events within the interior of the southern portion of the Juan de Fuca plate. The location and sequential timing of events on portions of the plate boundaries also suggest ordered patterns of stress release. Among the most scientifically significant outcomes of acoustic monitoring was the discovery that deep seafloor magmatic activity can be accompanied by intense (> 1000 events/day) earthquake swarms. The first swarm detected by SOSUS, in 1993, was confirmed to have been associated with an extrusive volcanic eruption which occurred along a segment of the Juan de Fuca spreading center. Notably, this was the first deep spreading center eruption detected, located, and studied while it was active. Subsequently, two more swarms were confirmed to have been associated with volcanic eruptions, one on the Gorda spreading center in 1996 and the other at Axial volcano in 1998. One characteristic of these swarm events is migration of their earthquake locations tens of km along the ridge axis, tracking the movement of magma down-rift. The most rapid magma propagation events have been shown to be associated with seafloor eruptions and dramatic, transient changes in hydrothermal circulation as well as discharges of large volumes of hot water, i.e., megaplumes. Hydroacoustic monitoring using SOSUS, now augmented with hydrophones deployed on stationary moorings as well as mobile platforms (e.g. gliders), provides a unique means for gaining knowledge concerning a broad diversity of present-day topics of scientific importance, including sources and fate of carbon in the deep ocean, deep ocean micro- and macro-ecosystems, and changes in ocean ambient noise levels.

  6. Earthquake Education and Public Information Centers: A Collaboration Between the Earthquake Country Alliance and Free-Choice Learning Institutions in California

    NASA Astrophysics Data System (ADS)

    Degroot, R. M.; Springer, K.; Brooks, C. J.; Schuman, L.; Dalton, D.; Benthien, M. L.

    2009-12-01

    In 1999 the Southern California Earthquake Center initiated an effort to expand its reach to multiple target audiences through the development of an interpretive trail on the San Andreas fault at Wallace Creek and an earthquake exhibit at Fingerprints Youth Museum in Hemet. These projects and involvement with the San Bernardino County Museum in Redlands beginning in 2007 led to the creation of Earthquake Education and Public Information Centers (EPIcenters) in 2008. The impetus for the development of the network was to broaden participation in The Great Southern California ShakeOut. By 2009 it had grown more comprehensive in scope, including its evolution into a statewide network. EPIcenters constitute a variety of free-choice learning institutions, representing museums, science centers, libraries, universities, parks, and other places visited by a variety of audiences including families, seniors, and school groups. They share a commitment to demonstrating and encouraging earthquake preparedness. EPIcenters coordinate Earthquake Country Alliance activities in their county or region, lead presentations or organize events in their communities, or in other ways demonstrate leadership in earthquake education and risk reduction. The San Bernardino County Museum (Southern California) and The Tech Museum of Innovation (Northern California) serve as EPIcenter regional coordinating institutions. They interact with over thirty institutional partners who have implemented a variety of activities from displays and talks to earthquake exhibitions. While many activities are focused on the time leading up to and just after the ShakeOut, most EPIcenter members conduct activities year round. Network members at Kidspace Museum in Pasadena and San Diego Natural History Museum have formed EPIcenter focus groups on early childhood education and safety and security. This presentation highlights the development of the EPIcenter network, synergistic activities resulting from this collaboration, and lessons learned from interacting with free-choice learning institutions.

  7. EQInfo - earthquakes world-wide

    NASA Astrophysics Data System (ADS)

    Weber, Bernd; Herrnkind, Stephan

    2014-05-01

    EQInfo is a free Android app providing recent earthquake information from various earthquake monitoring centers such as GFZ, EMSC, USGS and others. It allows filtering by agency, region and magnitude, as well as control of the update interval, institute priority and alarm types. With more than 25k active users and a place in the top ten list of Google Play, EQInfo is one of the most popular apps for earthquake information.

  8. Providing Seismotectonic Information to the Public Through Continuously Updated National Earthquake Information Center Products

    NASA Astrophysics Data System (ADS)

    Bernardino, M. J.; Hayes, G. P.; Dannemann, F.; Benz, H.

    2012-12-01

    One of the main missions of the United States Geological Survey (USGS) National Earthquake Information Center (NEIC) is the dissemination of information to national and international agencies, scientists, and the general public through various products such as ShakeMap and earthquake summary posters. During the summer of 2012, undergraduate and graduate student interns helped to update and improve our series of regional seismicity posters and regional tectonic summaries. The "Seismicity of the Earth (1900-2007)" poster placed over a century's worth of global seismicity data in the context of plate tectonics, highlighting regions that have experienced great (M8.0 or larger) earthquakes, and the tectonic settings of those events. This endeavor became the basis for a series of more regionalized seismotectonic posters that focus on major subduction zones and their associated seismicity, including the Aleutian and Caribbean arcs. The first round of these posters was inclusive of events through 2007 and was made with the intent of being continually updated. Each poster includes a regional tectonic summary, a seismic hazard map, focal depth cross-sections, and a main map that illustrates the following: the main subduction zone and other physiographic features, seismicity, and rupture zones of historic great earthquakes. Many of the existing regional seismotectonic posters have been updated and new posters highlighting regions of current seismological interest have been created, including the Sumatra and Java arcs, the Middle East region and the Himalayas (all of which are currently in review). These new editions include updated lists of earthquakes, expanded tectonic summaries, updated relative plate motion vectors, and major crustal faults. These posters thus improve upon previous editions that included only brief tectonic discussions of the most prominent features and historic earthquakes, and which did not systematically represent non-plate boundary faults. Regional tectonic summaries provide the public with immediate background information useful for teaching and media related purposes and are an essential component to many NEIC products. As part of the NEIC's earthquake response, rapid earthquake summary posters are created in the hours following a significant global earthquake. These regional tectonic summaries are included in each earthquake summary poster along with a discussion of the event, written by research scientists at the NEIC, often with help from regional experts. Now, through the efforts of this and related studies, event webpages will automatically contain a regional tectonic summary immediately after an event has been posted. These new summaries include information about plate boundary interactions and other associated tectonic elements, trends in seismicity and brief descriptions of significant earthquakes that have occurred in a region. The tectonic summaries for the following regions have been updated as part of this work: South America, the Caribbean, Alaska and the Aleutians, Kuril-Kamchatka, Japan and vicinity, and Central America, with newly created summaries for Sumatra and Java, the Mediterranean, Middle East, and the Himalayas. The NEIC is currently planning to integrate concise stylized maps with each tectonic summary for display on the USGS website.

  9. Development of regional earthquake early warning and structural health monitoring system and real-time ground motion forecasting using front-site waveform data (Invited)

    NASA Astrophysics Data System (ADS)

    Motosaka, M.

    2009-12-01

    This paper presents, firstly, the development of an integrated regional earthquake early warning (EEW) system with an on-line structural health monitoring (SHM) function in Miyagi prefecture, Japan. The system makes it possible to provide more accurate, reliable and immediate earthquake information for society by combining it with the national (JMA/NIED) EEW system, based on advanced real-time communication technology. The author has planned to install the EEW/SHM system in public buildings around Sendai, a city of one million in north-eastern Japan. The system has so far been implemented in two buildings; one is in Sendai, and the other in Oshika, a front site on the Pacific Ocean coast for the anticipated Miyagi-ken Oki earthquake. The data from the front site and the on-site building are processed by the analysis system installed at the analysis center of the Disaster Control Research Center, Tohoku University. The real-time earthquake information from JMA is also received at the analysis center. The utilization of the integrated EEW/SHM system is addressed together with future perspectives. Examples of the obtained data are also described, including the amplitude-dependent dynamic characteristics of the building in Sendai before, during, and after the 2008/6/14 Iwate-Miyagi Nairiku earthquake, together with the historical change of dynamic characteristics over 40 years. Secondly, this paper presents an advanced methodology based on Artificial Neural Networks (ANN) for forward forecasting of ground motion parameters, not only PGA and PGV but also spectral information, before S-wave arrival, using the initial part of the P-waveform at a front site. The estimated ground motion information can be used as a warning alarm for earthquake damage reduction. The Fourier Amplitude Spectra (FAS), estimated with high accuracy before strong shaking, can be used for advanced engineering applications, e.g. feed-forward structural control of a building of interest. The validity and applicability of the method have been verified using observation data sets from K-NET sites for 39 earthquakes that occurred in the Miyagi-Oki area. The initial part of the P-waveform data at the Oshika site (MYG011) of K-NET was used as the front-site waveform data. The earthquake observation data for 35 of the 39 earthquakes, as well as the positional information and site repartition information, were used as training data to construct the ANN structure. The data set for the remaining 4 earthquakes was used as the test data in the blind prediction of PGA and PGV at the 4 sites, namely, Sendai (MYG013), Taiwa (MYG009), Shiogama (MYG012), and Ishinomaki (MYG010).
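
    As an illustration of the forecasting step described above, the following is a minimal sketch, not the author's implementation, of training a small neural network to map features of the initial front-site P-wave onto target-site peak ground motions. The feature set, the placeholder arrays, and the library choice (scikit-learn) are assumptions made for the example.

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      # X: one row per training earthquake, e.g. peak amplitude and predominant
      # period of the first seconds of P wave at the front site plus simple
      # source-to-site geometry; y: log10 of observed PGA and PGV at a target site.
      # Random arrays are placeholders; real features would come from observed records.
      X_train = np.random.rand(35, 4)      # 35 training events, 4 assumed features
      y_train = np.random.rand(35, 2)      # log10(PGA), log10(PGV) targets

      model = make_pipeline(
          StandardScaler(),
          MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0),
      )
      model.fit(X_train, y_train)

      X_test = np.random.rand(4, 4)        # placeholder for the 4 blind-test events
      print(model.predict(X_test))         # forecast log10(PGA), log10(PGV)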

  10. Earthquakes

    MedlinePlus


  11. Space Radiation Monitoring Center at SINP MSU

    NASA Astrophysics Data System (ADS)

    Kalegaev, Vladimir; Barinova, Wera; Barinov, Oleg; Bobrovnikov, Sergey; Dolenko, Sergey; Mukhametdinova, Ludmila; Myagkova, Irina; Nguen, Minh; Panasyuk, Mikhail; Shiroky, Vladimir; Shugay, Julia

    2015-04-01

    Data on energetic particle fluxes from Russian satellites have been collected at the Space Monitoring Data Center at Moscow State University in near-real-time mode. The web portal http://smdc.sinp.msu.ru/ provides operational information on the radiation state of near-Earth space. Operational data come from the ELECTRO-L1 and Meteor-M2 space missions. High-resolution data on energetic electron fluxes from MSU's VERNOV satellite, with RELEC instrumentation on board, are also available. Specific tools allow visual representation of the satellite orbit in 3D space simultaneously with particle flux variations. Concurrent operational data coming from other spacecraft (ACE, GOES, SDO) and from the Earth's surface (geomagnetic indices) are used to represent the geomagnetic and radiation state of the near-Earth environment. The Internet portal http://swx.sinp.msu.ru provides real-time access to data characterizing the level of solar activity and the geomagnetic and radiation conditions in the heliosphere and the Earth's magnetosphere. Operational forecasting services automatically generate alerts on particle flux enhancements above threshold values, both for SEP and relativistic electrons, using data from LEO and GEO orbits. Models of the space environment working in autonomous mode are used to generalize the information obtained from different missions to the whole magnetosphere. On-line applications created on the basis of these models provide short-term forecasting of SEP and relativistic electron fluxes at GEO and LEO, as well as online forecasting of the Dst and Kp indices up to 1.5 hours ahead. Velocities of high-speed solar wind streams at the Earth's orbit are estimated 3-4 days in advance. The visualization system provides representation of experimental and modeling data in 2D and 3D.

  12. Earthquake.

    PubMed

    Cowen, A R; Denney, J P

    1994-04-01

    On January 25, 1 week after the most devastating earthquake in Los Angeles history, the Southern California Hospital Council released the following status report: 928 patients evacuated from damaged hospitals. 805 beds available (136 critical, 669 noncritical). 7,757 patients treated/released from EDs. 1,496 patients treated/admitted to hospitals. 61 dead. 9,309 casualties. Where do we go from here? We are still waiting for the "big one." We'll do our best to be ready when Mother Nature shakes, rattles and rolls. The efforts of Los Angeles City Fire Chief Donald O. Manning cannot be overstated. He maintained department command of this major disaster and is directly responsible for implementing the fire department's Disaster Preparedness Division in 1987. Through the chief's leadership and ability to forecast consequences, the city of Los Angeles was better prepared than ever to cope with this horrendous earthquake. We also pay tribute to the men and women who are out there each day, where "the rubber meets the road." PMID:10133439

  13. Implications of the World Trade Center Health Program (WTCHP) for the Public Health Response to the Great East Japan Earthquake

    PubMed Central

    CRANE, Michael A.; CHO, Hyunje G.; LANDRIGAN, Phillip J.

    2013-01-01

    The attacks on the World Trade Center (WTC) on September 11, 2001 resulted in a serious burden of physical and mental illness for the 50,000 rescue workers that responded to 9/11 as well as the 400,000 residents and workers in the surrounding areas of New York City. The Zadroga Act of 2010 established the WTC Health Program (WTCHP) to provide monitoring and treatment of WTC exposure-related conditions and health surveillance for the responder and survivor populations. Several reports have highlighted the applicability of insights gained from the WTCHP to the public health response to the Great East Japan Earthquake. Optimal exposure monitoring processes and attention to the welfare of vulnerable exposed sub-groups are critical aspects of the response to both incidents. The ongoing mental health care concerns of 9/11 patients accentuate the need for accessible and appropriately skilled mental health care in Fukushima. Active efforts to demonstrate transparency and to promote community involvement in the public health response will be highly important in establishing successful long-term monitoring and treatment programs for the exposed populations in Fukushima. PMID:24317449

  14. Implications of the World Trade Center Health Program (WTCHP) for the public health response to the Great East Japan Earthquake.

    PubMed

    Crane, Michael A; Cho, Hyunje G; Landrigan, Phillip J

    2014-01-01

    The attacks on the World Trade Center (WTC) on September 11, 2001 resulted in a serious burden of physical and mental illness for the 50,000 rescue workers that responded to 9/11 as well as the 400,000 residents and workers in the surrounding areas of New York City. The Zadroga Act of 2010 established the WTC Health Program (WTCHP) to provide monitoring and treatment of WTC exposure-related conditions and health surveillance for the responder and survivor populations. Several reports have highlighted the applicability of insights gained from the WTCHP to the public health response to the Great East Japan Earthquake. Optimal exposure monitoring processes and attention to the welfare of vulnerable exposed sub-groups are critical aspects of the response to both incidents. The ongoing mental health care concerns of 9/11 patients accentuate the need for accessible and appropriately skilled mental health care in Fukushima. Active efforts to demonstrate transparency and to promote community involvement in the public health response will be highly important in establishing successful long-term monitoring and treatment programs for the exposed populations in Fukushima. PMID:24317449

  15. Application of Collocated GPS and Seismic Sensors to Earthquake Monitoring and Early Warning

    PubMed Central

    Li, Xingxing; Zhang, Xiaohong; Guo, Bofeng

    2013-01-01

    We explore the use of collocated GPS and seismic sensors for earthquake monitoring and early warning. The GPS and seismic data collected during the 2011 Tohoku-Oki (Japan) and the 2010 El Mayor-Cucapah (Mexico) earthquakes are analyzed by using a tightly-coupled integration. The performance of the integrated results is validated by both time and frequency domain analysis. We detect the P-wave arrival and observe small-scale features of the movement from the integrated results and locate the epicenter. Meanwhile, permanent offsets are extracted from the integrated displacements highly accurately and used for reliable fault slip inversion and magnitude estimation. PMID:24284765

  16. Application of collocated GPS and seismic sensors to earthquake monitoring and early warning.

    PubMed

    Li, Xingxing; Zhang, Xiaohong; Guo, Bofeng

    2013-01-01

    We explore the use of collocated GPS and seismic sensors for earthquake monitoring and early warning. The GPS and seismic data collected during the 2011 Tohoku-Oki (Japan) and the 2010 El Mayor-Cucapah (Mexico) earthquakes are analyzed by using a tightly-coupled integration. The performance of the integrated results is validated by both time and frequency domain analysis. We detect the P-wave arrival and observe small-scale features of the movement from the integrated results and locate the epicenter. Meanwhile, permanent offsets are extracted from the integrated displacements highly accurately and used for reliable fault slip inversion and magnitude estimation. PMID:24284765

  17. A new Automatic Phase Picker for the National Earthquake Information Center

    NASA Astrophysics Data System (ADS)

    Earle, P. S.; Buland, R.

    2002-12-01

    The increasing need for rapid, accurate earthquake locations for timely notification and damage assessment has placed greater demands on automatic phase picking technology. We are developing a new automatic phase picker for use by the National Earthquake Information Center (NEIC). Since the NEIC provides rapid notification for all felt earthquakes in the US and significant events worldwide, the picking algorithm must provide accurate arrival times for the wide range of waveforms generated by local, regional, and teleseismic events. The current picker applies a Short-Term-Average over Long-Term-Average (STA/LTA) algorithm to vertical-component records that have been narrow-band filtered into two data streams with peaks at 1.5 Hz and 3.0 Hz. The use of this relatively high-frequency narrow-band data provides accurate arrival-time estimates. The travel-time residuals for 10,000 teleseismic P-wave picks have a spread (scaled median average deviation) of 1.3 seconds; this is similar to the spread of human-made picks. Additionally, at these high frequencies, teleseismic picks are generally limited to compressional waves. This aids identification of arrival type and therefore simplifies the association of picks to events. Although the current picker works well, plans to improve the accuracy, reliability, and detection threshold of automatic locations require the picking of secondary phases and analysis of a larger frequency band. Several previous studies have presented picking methods, but few published studies test them on numerous seismograms selected from a wide range of distances and magnitudes. Published techniques include STA/LTA, auto-regressive, cross-correlation, and neural-network approaches. We will present comparisons of several methods and discuss their fitness for implementation on our real-time system. Preference will be given to methods that provide the most reliable and accurate earthquake locations, not necessarily those which best reproduce human picks.
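
    To make the detection scheme concrete, here is a minimal, illustrative sketch of an STA/LTA pick on a narrow band-passed vertical trace using ObsPy; it is not the NEIC production picker, and the file name, band edges, window lengths, and trigger thresholds are assumptions chosen for the example.

      from obspy import read
      from obspy.signal.trigger import classic_sta_lta, trigger_onset

      # Read a vertical-component record (hypothetical file name) and isolate a
      # narrow band roughly centered on 1.5 Hz, as in the description above.
      tr = read("vertical_component.mseed")[0]
      tr.detrend("demean")
      tr.filter("bandpass", freqmin=1.0, freqmax=2.0)

      df = tr.stats.sampling_rate
      # Characteristic function: ratio of a 1 s short-term average to a
      # 30 s long-term average of the signal energy.
      cft = classic_sta_lta(tr.data, int(1.0 * df), int(30.0 * df))

      # Declare candidate picks where the ratio exceeds 4.0, holding the
      # trigger until it drops back below 2.0.
      for onset, _ in trigger_onset(cft, 4.0, 2.0):
          print("candidate P pick at", tr.stats.starttime + onset / df)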

  18. Basin-centered asperities in great subduction zone earthquakes: A link between slip, subsidence, and subduction erosion?

    USGS Publications Warehouse

    Wells, R.E.; Blakely, R.J.; Sugiyama, Y.; Scholl, D. W.; Dinterman, P.A.

    2003-01-01

    Published areas of high coseismic slip, or asperities, for 29 of the largest Circum-Pacific megathrust earthquakes are compared to forearc structure revealed by satellite free-air gravity, bathymetry, and seismic profiling. On average, 71% of an earthquake's seismic moment and 79% of its asperity area occur beneath the prominent gravity low outlining the deep-sea terrace; 57% of an earthquake's asperity area, on average, occurs beneath the forearc basins that lie within the deep-sea terrace. In SW Japan, slip in the 1923, 1944, 1946, and 1968 earthquakes was largely centered beneath five forearc basins whose landward edge overlies the 350°C isotherm on the plate boundary, the inferred downdip limit of the locked zone. Basin-centered coseismic slip also occurred along the Aleutian, Mexico, Peru, and Chile subduction zones but was ambiguous for the great 1964 Alaska earthquake. Beneath intrabasin structural highs, seismic slip tends to be lower, possibly due to higher temperatures and fluid pressures. Kilometers of late Cenozoic subsidence and crustal thinning above some of the source zones are indicated by seismic profiling and drilling and are thought to be caused by basal subduction erosion. The deep-sea terraces and basins may evolve not just by growth of the outer arc high but also by interseismic subsidence not recovered during earthquakes. Basin-centered asperities could indicate a link between subsidence, subduction erosion, and seismogenesis. Whatever the cause, forearc basins may be useful indicators of long-term seismic moment release. The source zone for Cascadia's 1700 A.D. earthquake contains five large, basin-centered gravity lows that may indicate potential asperities at depth. The gravity gradient marking the inferred downdip limit to large coseismic slip lies offshore, except in northwestern Washington, where the low extends landward beneath the coast. Transverse gravity highs between the basins suggest that the margin is seismically segmented and could produce a variety of large earthquakes. Published in 2003 by the American Geophysical Union.

  19. Basin-centered asperities in great subduction zone earthquakes: A link between slip, subsidence, and subduction erosion?

    NASA Astrophysics Data System (ADS)

    Wells, Ray E.; Blakely, Richard J.; Sugiyama, Yuichi; Scholl, David W.; Dinterman, Philip A.

    2003-10-01

    Published areas of high coseismic slip, or asperities, for 29 of the largest Circum-Pacific megathrust earthquakes are compared to forearc structure revealed by satellite free-air gravity, bathymetry, and seismic profiling. On average, 71% of an earthquake's seismic moment and 79% of its asperity area occur beneath the prominent gravity low outlining the deep-sea terrace; 57% of an earthquake's asperity area, on average, occurs beneath the forearc basins that lie within the deep-sea terrace. In SW Japan, slip in the 1923, 1944, 1946, and 1968 earthquakes was largely centered beneath five forearc basins whose landward edge overlies the 350°C isotherm on the plate boundary, the inferred downdip limit of the locked zone. Basin-centered coseismic slip also occurred along the Aleutian, Mexico, Peru, and Chile subduction zones but was ambiguous for the great 1964 Alaska earthquake. Beneath intrabasin structural highs, seismic slip tends to be lower, possibly due to higher temperatures and fluid pressures. Kilometers of late Cenozoic subsidence and crustal thinning above some of the source zones are indicated by seismic profiling and drilling and are thought to be caused by basal subduction erosion. The deep-sea terraces and basins may evolve not just by growth of the outer arc high but also by interseismic subsidence not recovered during earthquakes. Basin-centered asperities could indicate a link between subsidence, subduction erosion, and seismogenesis. Whatever the cause, forearc basins may be useful indicators of long-term seismic moment release. The source zone for Cascadia's 1700 A.D. earthquake contains five large, basin-centered gravity lows that may indicate potential asperities at depth. The gravity gradient marking the inferred downdip limit to large coseismic slip lies offshore, except in northwestern Washington, where the low extends landward beneath the coast. Transverse gravity highs between the basins suggest that the margin is seismically segmented and could produce a variety of large earthquakes.

  20. Monitoring of Ecological Restoration at the Central Quake-Hit Areas of Wenchuan Earthquake Using RS & GIS Remote Sensing

    NASA Astrophysics Data System (ADS)

    Zhang, Q.

    2014-12-01

    The 2008 Sichuan earthquake, which occurred on 12 May 2008 with a magnitude of 8.0 and its epicenter at Wenchuan (31.021°N, 103.367°E), not only caused a large number of human casualties and heavy property loss, but also severely damaged the ecological system in the surrounding 10 counties, threatening local ecological safety. As part of the post-disaster reconstruction services, a systematic monitoring of ecological restoration at the central quake-hit areas has been carried out based on RS & GIS remote sensing. In this paper we selected the Dujiangyan area for analysis. This region was chosen because Dujiangyan is about 40 km from the epicenter and, as a region in the subtropical monsoon climate zone, it had a well-developed forest ecosystem in its northern part before the earthquake; the coverage of grassland in this region is relatively low. Since ecological restoration after the earthquake is a long-term process, the restoration of different vegetation types has different characteristics. From the analysis of the spatiotemporal change of land use and vegetation cover in the Dujiangyan area from the post-earthquake period in 2008 to 2013, we found: (1) During the earthquake, the major vegetation type destroyed was woodland, which accounts for 99.34% of the destroyed area; the next were arable land and grassland. (2) The ecological restoration started from the grassland and gradually transitioned to shrub. In the two years after the earthquake, the most significant increase in both coverage area and magnitude was in grassland; by 2013 the area of grassland had decreased slightly while the area of shrub increased, demonstrating a transition from grassland to shrub. (3) From the map of vegetation cover, these changes occur mainly in the northern mountain area, while changes in land use occurred mainly in the southern part of the city. These changes can be linked clearly to the earthquake disaster and to post-reconstruction human activities.

  1. Disasters; the 2010 Haitian earthquake and the evacuation of burn victims to US burn centers.

    PubMed

    Kearns, Randy D; Holmes, James H; Skarote, Mary Beth; Cairns, Charles B; Strickland, Samantha Cooksey; Smith, Howard G; Cairns, Bruce A

    2014-09-01

    Response to the 2010 Haitian earthquake included an array of diverse yet critical actions. This paper will briefly review the evacuation of a small group of patients with burns to burn centers in the southeastern United States (US). This particular evacuation brought together for the first time plans, groups, and organizations that had previously only exercised this process. The response to the Haitian earthquake was a glimpse at what the international community working together can do to help others, and relieve suffering following a catastrophic disaster. The international response was substantial. This paper will trace one evacuation, one day for one unique group of patients with burns to burn centers in the US and review the lessons learned from this process. The patient population with burns being evacuated from Haiti was very small compared to the overall operation. Nevertheless, the outcomes included a better understanding of how a larger event could challenge the limited resources for all involved. This paper includes aspects of the patient movement, the logistics needed, and briefly discusses reimbursement for the care provided. PMID:24411582

  2. Illnesses and injuries reported at Disaster Application Centers following the 1994 Northridge Earthquake.

    PubMed

    Teeter, D S

    1996-09-01

    The 1994 Northridge, California, earthquake caused extensive structural damage and disrupted lives for thousands of residents. Local resources treated those initially injured. Many victims were unable or unwilling to reenter their dwellings. Record numbers of victims spent many hours at Disaster Application Centers (DACs) applying for financial assistance and other services. This created a concern for the provision of primary health care services at these centers. Under the Federal Response Plan, registered nurses, nurse practitioners, and physician assistants from the Department of Veterans Affairs treated 17,883 patients at the DACs. This report documents the injuries and illnesses sustained by the public and response workers at the DACs. The findings demonstrate that this care eased the burden on the local health care system. This article illustrates applications for estimating health services needs and demands at similar mass gatherings that might be experienced in response to catastrophic events and in U.S. military operations involving humanitarian relief missions. PMID:8840792

  3. The German Task Force for Earthquakes - A temporary network for aftershock monitoring

    NASA Astrophysics Data System (ADS)

    Sobiesiak, M.; Eggert, S.; Grosser, H.; Hainzl, S.; Günther, E.

    2009-04-01

    The German Task Force for Earthquakes (GTF) is an interdisciplinary group for immediate response to disastrous earthquakes, with the aim of monitoring post-seismic processes and assessing the impact of the seismic event on the disaster-stricken area. To accomplish this task, 20 short-period seismic stations for aftershock monitoring and 10 strong-motion instruments for engineering seismology purposes are available exclusively for the use of the GTF. Furthermore, the GTF is equipped with tools for hydro-geological investigations and has 6 GPS instruments at hand for studying post-seismic deformation. Geological, sociological and remote sensing expertise is provided by a number of scientists at the GFZ or other national and international universities and organisations. We would like to present a variety of results achieved in the 20 missions we have been able to conduct to date. This will give an overview of the scientific opportunities that lie in collecting and investigating high-resolution local earthquake data. In future, online data transmission is envisaged to allow aftershock hazard assessment to benefit the work of rescue teams and local authorities in the area concerned. In general, decision making at the beginning of a Task Force activity requires a system for fast dissemination of earthquake information, which in our case is provided by GEOFON.

  4. Novel Algorithms Enabling Rapid, Real-Time Earthquake Monitoring and Tsunami Early Warning Worldwide

    NASA Astrophysics Data System (ADS)

    Lomax, A.; Michelini, A.

    2012-12-01

    We have recently introduced new methods to rapidly determine the tsunami potential and magnitude of large earthquakes (e.g., Lomax and Michelini, 2009ab, 2011, 2012). To validate these methods we have implemented them along with other new algorithms within the Early-est earthquake monitor at INGV-Rome (http://early-est.rm.ingv.it, http://early-est.alomax.net). Early-est is a lightweight software package for real-time earthquake monitoring (including phase picking, phase association and event detection, location, magnitude determination, first-motion mechanism determination, ...), and for tsunami early warning based on discriminants for earthquake tsunami potential. In a simulation using archived broadband seismograms for the devastating M9, 2011 Tohoku earthquake and tsunami, Early-est determines: the epicenter within 3 min after the event origin time, discriminants showing very high tsunami potential within 5-7 min, and magnitude Mwpd(RT) 9.0-9.2 and a correct shallow-thrusting mechanism within 8 min. Real-time monitoring with Early-est gives similar results for most large earthquakes using currently available, real-time seismogram data. Here we summarize some of the key algorithms within Early-est that enable rapid, real-time earthquake monitoring and tsunami early warning worldwide: >>> FilterPicker - a general-purpose, broad-band phase detector and picker (http://alomax.net/FilterPicker); >>> Robust, simultaneous association and location using a probabilistic global search; >>> Period-duration discriminants TdT0 and TdT50Ex for tsunami potential available within 5 min; >>> Mwpd(RT) magnitude for very large earthquakes available within 10 min; >>> Waveform P polarities determined on broad-band displacement traces, focal mechanisms obtained with the HASH program (Hardebeck and Shearer, 2002); >>> SeisGramWeb - a portable-device ready seismogram viewer using web-services in a browser (http://alomax.net/webtools/sgweb/info.html). References (see also: http://alomax.net/pub_list.html): Lomax, A. and A. Michelini (2012), Tsunami early warning within 5 minutes, Pure and Applied Geophysics, 169, nnn-nnn, doi: 10.1007/s00024-012-0512-6. Lomax, A. and A. Michelini (2011), Tsunami early warning using earthquake rupture duration and P-wave dominant period: the importance of length and depth of faulting, Geophys. J. Int., 185, 283-291, doi: 10.1111/j.1365-246X.2010.04916.x. Lomax, A. and A. Michelini (2009b), Tsunami early warning using earthquake rupture duration, Geophys. Res. Lett., 36, L09306, doi:10.1029/2009GL037223. Lomax, A. and A. Michelini (2009a), Mwpd: A Duration-Amplitude Procedure for Rapid Determination of Earthquake Magnitude and Tsunamigenic Potential from P Waveforms, Geophys. J. Int., 176, 200-214, doi:10.1111/j.1365-246X.2008.03974.x
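
    As a rough illustration of the period-based discriminants listed above, the following sketch computes the characteristic period tau_c of Kanamori (2005) from the first seconds of a P-wave record; it is not the Td, T0, or T50Ex computation used in Early-est, and the file name, the window length, and the assumption that the trace is already in ground velocity are choices made for the example.

      import numpy as np
      from obspy import read

      # Hypothetical broadband vertical record, assumed to be in ground velocity
      # and assumed to start at the P onset.
      vel = read("broadband_vertical.mseed")[0]
      vel.detrend("demean")
      vel = vel.slice(vel.stats.starttime, vel.stats.starttime + 8.0)  # first ~8 s of P wave

      disp = vel.copy().integrate()          # displacement obtained by integration
      dt = vel.stats.delta

      # r = integral(v^2 dt) / integral(u^2 dt); tau_c = 2*pi / sqrt(r)
      r = (np.sum(vel.data ** 2) * dt) / (np.sum(disp.data ** 2) * dt)
      tau_c = 2.0 * np.pi / np.sqrt(r)
      print(f"tau_c = {tau_c:.2f} s")        # larger values point to larger, longer sources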

  5. Volcano and Earthquake Monitoring Plan for the Yellowstone Volcano Observatory, 2006-2015

    USGS Publications Warehouse

    Yellowstone Volcano Observatory

    2006-01-01

    To provide Yellowstone National Park (YNP) and its surrounding communities with a modern, comprehensive system for volcano and earthquake monitoring, the Yellowstone Volcano Observatory (YVO) has developed a monitoring plan for the period 2006-2015. Such a plan is needed so that YVO can provide timely information during seismic, volcanic, and hydrothermal crises and can anticipate hazardous events before they occur. The monitoring network will also provide high-quality data for scientific study and interpretation of one of the largest active volcanic systems in the world. Among the needs of the observatory are to upgrade its seismograph network to modern standards and to add five new seismograph stations in areas of the park that currently lack adequate station density. In cooperation with the National Science Foundation (NSF) and its Plate Boundary Observatory Program (PBO), YVO seeks to install five borehole strainmeters and two tiltmeters to measure crustal movements. The boreholes would be located in developed areas close to existing infrastructure and away from sensitive geothermal features. In conjunction with the park's geothermal monitoring program, installation of new stream gages and gas-measuring instruments will allow YVO to compare geophysical phenomena, such as earthquakes and ground motions, to hydrothermal events, such as anomalous water and gas discharge. In addition, YVO seeks to characterize the behavior of geyser basins, both to detect any precursors to hydrothermal explosions and to monitor earthquakes related to fluid movements that are difficult to detect with the current monitoring system. Finally, a monitoring network consists not solely of instruments but also requires a secure system for real-time transmission of data. The current telemetry system is vulnerable to failures that could jeopardize data transmission out of Yellowstone. Future advances in monitoring technologies must be accompanied by improvements in the infrastructure for data transmission. Overall, our strategy is to (1) maximize our ability to provide rapid assessments of changing conditions to ensure public safety, (2) minimize environmental and visual impact, and (3) install instrumentation in developed areas.

  6. New approach for earthquake/tsunami monitoring using dense GPS networks

    PubMed Central

    Li, Xingxing; Ge, Maorong; Zhang, Yong; Wang, Rongjiang; Xu, Peiliang; Wickert, Jens; Schuh, Harald

    2013-01-01

    In recent times, increasing numbers of high-rate GPS stations have been installed around the world and set up to provide data in real time. These networks provide a great opportunity to quickly capture surface displacements, which makes them important as potential constituents of earthquake/tsunami monitoring and warning systems. Appropriate real-time GPS data analysis with sufficient accuracy for this purpose is a main focus of current GPS research. In this paper we propose an augmented point positioning method for GPS-based hazard monitoring, which can achieve fast or even instantaneous precise positioning without relying on data from a specific reference station. The proposed method overcomes the limitations of the currently most used GPS processing approaches of relative positioning and global precise point positioning. The advantages of the proposed approach are demonstrated using GPS data recorded during the 2011 Tohoku-Oki earthquake in Japan. PMID:24045328

  7. Quantifying 10 years of Improvements in Earthquake and Tsunami Monitoring in the Caribbean and Adjacent Regions

    NASA Astrophysics Data System (ADS)

    von Hillebrandt-Andrade, C.; Huerfano Moreno, V. A.; McNamara, D. E.; Saurel, J. M.

    2014-12-01

    The magnitude-9.3 Sumatra-Andaman Islands earthquake of December 26, 2004, increased global awareness of the destructive hazard posed by earthquakes and tsunamis. Post-event assessments of global coastline vulnerability highlighted the Caribbean as a region of high hazard and risk that was poorly monitored. Nearly 100 tsunamis have been reported for the Caribbean and adjacent regions in the past 500 years, and they continue to pose a threat to its nations, coastal areas along the Gulf of Mexico, and the Atlantic seaboard of North and South America. Significant efforts to improve monitoring capabilities have been undertaken since this time, including an expansion of the United States Geological Survey (USGS) Global Seismographic Network (GSN) (McNamara et al., 2006) and establishment of the United Nations Educational, Scientific and Cultural Organization (UNESCO) Intergovernmental Coordination Group (ICG) for the Tsunami and other Coastal Hazards Warning System for the Caribbean and Adjacent Regions (CARIBE EWS). The minimum performance standards it recommended for initial earthquake locations include: 1) earthquake detection within 1 minute, 2) minimum magnitude threshold = M4.5, and 3) initial hypocenter error of <30 km. In this study, we assess current compliance with these performance standards and model improvements in earthquake and tsunami monitoring capabilities in the Caribbean region since the first meeting of the UNESCO ICG-Caribe EWS in 2006. The three measures of network capability modeled in this study are: 1) minimum Mw detection threshold; 2) P-wave detection time of an automatic processing system; and 3) theoretical earthquake location uncertainty. By modeling these three measures of seismic network capability, we can optimize the distribution of ICG-Caribe EWS seismic stations and select an international network that will be contributed from existing real-time broadband national networks in the region. Sea level monitoring improvements both offshore and along the coast will also be addressed. With the support of Member States and other countries and organizations, it has been possible to significantly expand the sea level network, thus reducing the time it now takes to verify tsunamis.

  8. Network of seismo-geochemical monitoring observatories for earthquake prediction research in India

    NASA Astrophysics Data System (ADS)

    Chaudhuri, Hirok; Barman, Chiranjib; Iyengar, A.; Ghose, Debasis; Sen, Prasanta; Sinha, Bikash

    2013-08-01

    This paper presents a brief review of the research carried out to develop multi-parametric gas-geochemical monitoring facilities dedicated to earthquake prediction research in India by installing a network of seismo-geochemical monitoring observatories in different regions of the country. In an attempt to detect earthquake precursors, the concentrations of helium, argon, nitrogen, methane, radon-222 (222Rn), polonium-218 (218Po), and polonium-214 (214Po) emanating from hydrothermal systems are monitored continuously and round the clock at these observatories. In this paper, we make a cross-correlation study of a number of geochemical anomalies recorded at these observatories. With the data received from each of the above observatories, we attempt a time-series analysis to relate earthquake magnitude and epicentral distance through statistical methods and empirical formulations that relate the area of influence to earthquake scale. Application of linear and nonlinear statistical techniques to the recorded geochemical data sets reveals a clear signature of long-range correlation in the data.
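
    As a simple illustration of the cross-correlation step mentioned above, the sketch below computes a normalized cross-correlation between two regularly sampled geochemical series (for example, radon and helium concentrations); it is a generic example rather than the observatories' processing chain, and the file names, equal series lengths, and hourly sampling are assumptions.

      import numpy as np

      # Hypothetical hourly concentration series, assumed to be the same length
      # and sampled on the same time base.
      radon = np.loadtxt("radon_hourly.txt")
      helium = np.loadtxt("helium_hourly.txt")

      a = (radon - radon.mean()) / radon.std()
      b = (helium - helium.mean()) / helium.std()

      # Approximately normalized cross-correlation as a function of lag (in samples).
      cc = np.correlate(a, b, mode="full") / len(a)
      lags = np.arange(-len(a) + 1, len(a))
      best = lags[np.argmax(cc)]
      print(f"maximum correlation {cc.max():.2f} at a lag of {best} hours")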

  9. Real-time seismic monitoring of the integrated cape girardeau bridge array and recorded earthquake response

    USGS Publications Warehouse

    Celebi, M.

    2006-01-01

    This paper introduces the state-of-the-art, real-time, broad-band seismic monitoring network implemented for the 1206 m [3956 ft] long, cable-stayed Bill Emerson Memorial Bridge in Cape Girardeau (MO), a new Mississippi River crossing approximately 80 km from the epicentral region of the 1811-1812 New Madrid earthquakes. The bridge was designed for a strong earthquake (magnitude 7.5 or greater) during its design life. The monitoring network comprises a total of 84 channels of accelerometers deployed on the superstructure, pier foundations and at surface and downhole free-field arrays of the bridge. The paper also presents the high-quality response data obtained from the network. These data are intended to be used by the owner, researchers and engineers to assess the performance of the bridge, to check design parameters, including the comparison of dynamic characteristics with actual response, and to better design future similar bridges. Preliminary analyses of ambient and low-amplitude small-earthquake data reveal specific response characteristics of the bridge and the free field. There is evidence of coherent tower, cable and deck interaction that sometimes results in amplified ambient motions. Motions at the lowest tri-axial downhole accelerometers on both the MO and IL sides are practically free of any feedback from the bridge. Motions at the mid-level and surface downhole accelerometers are influenced significantly by feedback due to amplified ambient motions of the bridge. Copyright ASCE 2006.

  10. Cloud-based systems for monitoring earthquakes and other environmental quantities

    NASA Astrophysics Data System (ADS)

    Clayton, R. W.; Olson, M.; Liu, A.; Chandy, M.; Bunn, J.; Guy, R.

    2013-12-01

    There are many advantages to using a cloud-based system to record and analyze environmental quantities such as earthquakes, radiation, various gases, dust and meteorological parameters. These advantages include robustness, dynamic scalability, and reduced costs. In this paper, we present our experiences over the last three years in developing a cloud-based earthquake monitoring system (the Community Seismic Network). This network consists of over 600 sensors (accelerometers) in the southern California region that send data directly to the Google App Engine, where they are analyzed. The system is capable of handling many other types of sensor data and of generating a situation-awareness analysis as a product. Other advantages of the cloud-based system are integration with other peer networks and the ability to deploy anywhere in the world without having to build additional computing infrastructure.
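
    For a flavor of how a sensor host might report to such a cloud back end, here is a minimal sketch using only the Python standard library; the endpoint URL and message fields are hypothetical, this is not the Community Seismic Network client, and a real deployment would add authentication, batching, and retry logic.

      import json
      import time
      import urllib.request

      # Hypothetical detection message from one sensor host.
      event = {
          "station_id": "CSN-EXAMPLE-001",
          "time": time.time(),          # detection time (epoch seconds)
          "peak_accel_g": 0.012,        # peak acceleration since the last report
      }

      req = urllib.request.Request(
          "https://example-monitoring-app.appspot.com/api/picks",  # hypothetical endpoint
          data=json.dumps(event).encode("utf-8"),
          headers={"Content-Type": "application/json"},
      )
      with urllib.request.urlopen(req, timeout=10) as resp:
          print("server replied with status", resp.status)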

  11. Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT): Towards the Next Generation of Internship

    NASA Astrophysics Data System (ADS)

    Perry, S.; Benthien, M.; Jordan, T. H.

    2005-12-01

    The SCEC/UseIT internship program is training the next generation of earthquake scientists, with methods that can be adapted to other disciplines. UseIT interns work collaboratively, in multi-disciplinary teams, conducting computer science research that is needed by earthquake scientists. Since 2002, the UseIT program has welcomed 64 students, in some two dozen majors, at all class levels, from schools around the nation. Each summer's work is posed as a "Grand Challenge." The students then organize themselves into project teams, decide how to proceed, and pool their diverse talents and backgrounds. They have traditional mentors, who provide advice and encouragement, but they also mentor one another, and this has proved to be a powerful relationship. Most begin with fear that their Grand Challenge is impossible, and end with excitement and pride about what they have accomplished. The 22 UseIT interns in summer 2005 were primarily computer science and engineering majors, with others in geology, mathematics, English, digital media design, physics, history, and cinema. The 2005 Grand Challenge was to "build an earthquake monitoring system" to aid scientists who must visualize rapidly evolving earthquake sequences and convey information to emergency personnel and the public. Most UseIT interns were engaged in software engineering, bringing new datasets and functionality to SCEC-VDO (Virtual Display of Objects), a 3D visualization software that was prototyped by interns last year, using Java3D and an extensible, plug-in architecture based on the Eclipse Integrated Development Environment. Other UseIT interns used SCEC-VDO to make animated movies, and experimented with imagery in order to communicate concepts and events in earthquake science. One movie-making project included the creation of an assessment to test the effectiveness of the movie's educational message. Finally, one intern created an interactive, multimedia presentation of the UseIT program.

  12. Grand Canyon Monitoring and Research Center

    USGS Publications Warehouse

    Hamill, John F.

    2009-01-01

    The Grand Canyon of the Colorado River, one of the world's most spectacular gorges, is a premier U.S. National Park and a World Heritage Site. The canyon supports a diverse array of distinctive plants and animals and contains cultural resources significant to the region's Native Americans. About 15 miles upstream of Grand Canyon National Park sits Glen Canyon Dam, completed in 1963, which created Lake Powell. The dam provides hydroelectric power for 200 wholesale customers in six western States, but it has also altered the Colorado River's flow, temperature, and sediment-carrying capacity. Over time this has resulted in beach erosion, invasion and expansion of nonnative species, and losses of native fish. Public concern about the effects of Glen Canyon Dam operations prompted the passage of the Grand Canyon Protection Act of 1992, which directs the Secretary of the Interior to operate the dam 'to protect, mitigate adverse impacts to, and improve values for which Grand Canyon National Park and Glen Canyon National Recreation Area were established...' This legislation also required the creation of a long-term monitoring and research program to provide information that could inform decisions related to dam operations and protection of downstream resources.

  13. Recorded earthquake responses from the integrated seismic monitoring network of the Atwood Building, Anchorage, Alaska

    USGS Publications Warehouse

    Celebi, M.

    2006-01-01

    An integrated seismic monitoring system with a total of 53 channels of accelerometers is now operating in and at the nearby free-field site of the 20-story steel-framed Atwood Building in highly seismic Anchorage, Alaska. The building has a single-story basement and a reinforced concrete foundation without piles. The monitoring system comprises a 32-channel structural array and a 21-channel site array. Accelerometers are deployed on 10 levels of the building to assess translational, torsional, and rocking motions, interstory drift (displacement) between selected pairs of adjacent floors, and average drift between floors. The site array, located approximately a city block from the building, comprises seven triaxial accelerometers, one at the surface and six in boreholes ranging in depth from 15 to 200 feet (~5-60 meters). The arrays have already recorded low-amplitude shaking responses of the building and the site caused by numerous earthquakes at distances ranging from tens to a couple of hundred kilometers. Data from an earthquake that occurred 186 km away traces the propagation of waves from the deepest borehole to the roof of the building in approximately 0.5 seconds. Fundamental structural frequencies [0.58 Hz (NS) and 0.47 Hz (EW)], low damping percentages (2-4%), mode coupling, and beating effects are identified. The fundamental site frequency at approximately 1.5 Hz is close to the second modal frequencies (1.83 Hz NS and 1.43 Hz EW) of the building, which may cause resonance of the building. Additional earthquakes prove repeatability of these characteristics; however, stronger shaking may alter these conclusions. © 2006, Earthquake Engineering Research Institute.
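
    As a worked illustration of the drift quantity mentioned above, the sketch below computes an interstory drift ratio from displacement histories of two adjacent instrumented floors (assumed to have already been derived from the recorded accelerations); the file names, units, and story height are assumptions, and this is not the monitoring system's own software.

      import numpy as np

      # Hypothetical displacement time histories (cm) for two adjacent floors,
      # sampled on the same time base.
      d_upper = np.loadtxt("floor11_disp_cm.txt")
      d_lower = np.loadtxt("floor10_disp_cm.txt")
      story_height_cm = 380.0               # assumed story height

      drift = d_upper - d_lower             # relative displacement between the floors
      drift_ratio = drift / story_height_cm # dimensionless interstory drift ratio
      print(f"peak interstory drift ratio: {np.abs(drift_ratio).max():.5f}")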

  14. Continuous Monitoring of Potential Geochemical and Geomagnetic Earthquake Precursors: Lessons Learned

    NASA Astrophysics Data System (ADS)

    Burjanek, J.; Faeh, D.; Surbeck, H.; Balderer, W.; Kaestli, P.; Gassner, G.

    2014-12-01

    In the last decades, different studies have addressed short-term earthquake precursors by focusing on the observation of a wide variety of physical phenomena that can precede strong earthquakes. These include anomalous seismicity patterns, ground water level changes, gas emissions, geochemical changes in groundwater, seismo-electromagnetic phenomena, seismo-ionospheric coupling, surface deformations, etc. Some potential precursors observed in the past were later found to be artifacts of sensor-system malfunction. Therefore, such monitoring needs to be long-term, time-stamped and continuous, with a professional quality assurance procedure. Moreover, it is important to correlate data recorded with multi-sensor systems. We present a case study of running a multi-sensor system in Valais, Switzerland. The Valais is the area of highest seismic hazard in Switzerland and has experienced a magnitude 6 or larger event every 100 years on average. The system consists of seismic, geodetic (GPS), geochemical and geomagnetic instruments. Here we focus on the latter two. In particular, the observation of possible geochemical earthquake precursory signals is carried out with two instruments: 1) a field fluorometer for fluorescence spectral analysis of water, which monitors three wavelength bands, temperature and turbidity; 2) geochemical gas concentration sensors, which monitor radon, CO2, and CH4. The geomagnetic observations are performed with three-component coil magnetometers. All instruments are designed to run in continuous mode and stream data in real time. In this presentation we focus mainly on operational aspects of such a system. We discuss problems faced during operation, feasibility of the installation, and in general lessons learned for potential future applications.

  15. Monitoring the Galactic Center with ATCA

    NASA Astrophysics Data System (ADS)

    Borkar, A.; Eckart, A.; Straubmeier, C.; Kunneriath, D.; Jalali, B.; Sabha, N.; Shahzamanian, B.; García-Marín, M.; Valencia-S, M.; Sjouwerman, L.; Britzen, S.; Karas, V.; Dovčiak, M.; Donea, A.; Zensus, A.

    2016-02-01

    The supermassive black hole, Sagittarius A* (Sgr A*), at the centre of the Milky Way undergoes regular flaring activity which is thought to arise from the innermost region of the accretion flow. We performed monitoring observations of the Galactic Centre to study the flux-density variations at 3 mm using the Australia Telescope Compact Array (ATCA) between 2010 and 2014. We obtain the light curves of Sgr A* by subtracting the contributions from the extended emission around it, and the elevation- and time-dependent gains of the telescope. We perform structure function analysis and the Bayesian blocks representation to detect flare events. The observations detect six instances of significant variability in the flux density of Sgr A* in three observations, with variations between 0.5 and 1.0 Jy lasting for 1.5-3 hours. We use the adiabatically expanding plasmon model to explain the short time-scale variations in the flux density. We derive the physical quantities of the modelled flare emission, such as the source expansion speed vexp, source sizes, spectral indices, and the turnover frequency. These parameters imply that the expanding source components are either confined to the immediate vicinity of Sgr A* by contributing to the corona or the disc, or have a bulk motion greater than vexp. No exceptional flux density variation on short flare time-scales was observed during the approach and the flyby of the dusty S-cluster object (DSO/G2). This is consistent with its compactness and the absence of a large bow shock.
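
    To make the structure-function step concrete, the following is a generic sketch of a first-order structure function for a regularly sampled flux-density light curve; it is not the analysis pipeline used in the study, and the file name and sampling interval are assumptions.

      import numpy as np

      # Hypothetical, regularly sampled flux densities (Jy) of one observing run.
      flux = np.loadtxt("sgra_lightcurve_jy.txt")
      dt_minutes = 5.0                      # assumed sampling interval

      max_lag = len(flux) // 2
      lags = np.arange(1, max_lag)
      # SF(tau) = mean of squared flux differences at each lag tau.
      sf = np.array([np.mean((flux[lag:] - flux[:-lag]) ** 2) for lag in lags])

      # A flattening of SF versus lag hints at the characteristic variability time-scale.
      for lag, value in zip(lags[:5], sf[:5]):
          print(f"tau = {lag * dt_minutes:.0f} min, SF = {value:.4f}")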

  16. The Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) Internship Program

    NASA Astrophysics Data System (ADS)

    Perry, S.; Jordan, T.

    2006-12-01

    Our undergraduate research program, SCEC/UseIT, an NSF Research Experience for Undergraduates site, provides software for earthquake researchers and educators, movies for outreach, and ways to strengthen the technical career pipeline. SCEC/UseIT motivates diverse undergraduates towards science and engineering careers through team-based research in the exciting field of earthquake information technology. UseIT provides the cross-training in computer science/information technology (CS/IT) and geoscience needed to make fundamental progress in earthquake system science. Our high and increasing participation of women and minority students is crucial given the nation's precipitous enrollment declines in CS/IT undergraduate degree programs, especially among women. UseIT also casts a "wider, farther" recruitment net that targets scholars interested in creative work but not traditionally attracted to summer science internships. Since 2002, SCEC/UseIT has challenged 79 students in three dozen majors from as many schools with difficult, real-world problems that require collaborative, interdisciplinary solutions. Interns design and engineer open-source software, creating increasingly sophisticated visualization tools (see "SCEC-VDO," session IN11), which are employed by SCEC researchers, in new curricula at the University of Southern California, and by outreach specialists who make animated movies for the public and the media. SCEC-VDO would be a valuable tool for research-oriented professional development programs.

  17. Monitoring the mental well-being of caregivers during the Haiti-earthquake.

    PubMed Central

    Van der Auwera, Marcel; Debacker, Michel; Hubloue, Ives

    2012-01-01

    Introduction: During disaster relief, personnel safety is very important, and mental well-being is part of this safety issue. There is, however, a lack of objective mental well-being monitoring tools usable on scene during disaster relief. This study covers the use of validated tools for the detection of psychological distress and the monitoring of the mental well-being of disaster relief workers during the Belgian First Aid and Support Team deployment after the Haiti earthquake in 2010. Methodology: The study was conducted using a demographic questionnaire combined with validated measuring instruments: Belbin Team Role, Compassion Fatigue and Satisfaction Self-Test for Helpers, DMAT PsySTART, and K6+ Self Report. A baseline measurement was performed before departure on mission, and measurements were repeated at day 1 and day 7 of the mission, at the end of the mission, and 7 days, 30 days and 90 days post mission. Results: 23 of the 27 team members were included in the study. Using the Compassion Fatigue and Satisfaction Self-Test for Helpers as a monitoring tool, a stable condition was observed in 7 participants, a dip in 5 participants, an arousal in 10 participants and a double pattern in 1 participant. Conclusions: The study proved the ability to monitor mental well-being and detect psychological distress with self-administered validated tools during a real disaster relief mission. However, for practical reasons some tools should be adapted for use in the field. This study opens a whole new research area within the mental well-being monitoring field. Citation: Van der Auwera M, Debacker M, Hubloue I. Monitoring the mental well-being of caregivers during the Haiti earthquake. PLoS Currents Disasters. 2012 Jul 18 PMID:22953241

  18. Web Services and Data Enhancements at the Northern California Earthquake Data Center

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Zuzlewski, S.; Lombard, P. N.; Allen, R. M.

    2013-12-01

    The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, simple text, or MiniSEED depending on the service and selected output format. The NCEDC offers the following web services that are compliant with the International Federation of Digital Seismograph Networks (FDSN) web services specifications: (1) fdsn-dataselect: time series data delivered in MiniSEED format, (2) fdsn-station: station and channel metadata and time series availability delivered in StationXML format, (3) fdsn-event: earthquake event information delivered in QuakeML format. In addition, the NCEDC offers the following IRIS-compatible web services: (1) sacpz: provide channel gains, poles, and zeros in SAC format, (2) resp: provide channel response information in RESP format, (3) dataless: provide station and channel metadata in Dataless SEED format. The NCEDC is also developing a web service to deliver time series from pre-assembled event waveform gathers. The NCEDC has waveform gathers for ~750,000 northern and central California events from 1984 to the present, many of which were created by the USGS NCSN prior to the establishment of the joint NCSS (Northern California Seismic System). We are currently adding waveforms to these older event gathers with time series from the UCB networks and other networks with waveforms archived at the NCEDC, and ensuring that each channel in the event gathers has the highest-quality waveform from the archive.
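
    As an example of how the FDSN-compliant services listed above can be consumed, the following sketch uses the ObsPy FDSN client; it assumes that ObsPy's built-in "NCEDC" data-center key is available and that the chosen network, station, and channel codes exist in the archive for the requested time window.

      from obspy import UTCDateTime
      from obspy.clients.fdsn import Client

      client = Client("NCEDC")                    # NCEDC FDSN web services
      t0 = UTCDateTime("2014-08-24T10:20:44")     # approximate 2014 South Napa origin time

      # fdsn-event: earthquake information returned as a QuakeML-backed Catalog.
      events = client.get_events(starttime=t0 - 60, endtime=t0 + 600, minmagnitude=5.0)

      # fdsn-station: station and channel metadata (StationXML) for a BK station.
      inventory = client.get_stations(network="BK", station="CMB", level="channel")

      # fdsn-dataselect: MiniSEED time series for one broadband vertical channel.
      stream = client.get_waveforms("BK", "CMB", "*", "BHZ", t0, t0 + 300)
      print(events, inventory, stream)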

  19. Open Access to Decades of NCSN Waveforms at the Northern California Earthquake Data Center

    NASA Astrophysics Data System (ADS)

    Neuhauser, D.; Klein, F.; Zuzlewski, S.; Jensen, E. G.; Oppenheimer, D.; Gee, L.; Romanowicz, B.

    2003-12-01

    The USGS in Menlo Park has operated the Northern California Seismic Network (NCSN) since 1967 and has generated digital seismograms since 1984. Since its inception, the NCSN has recorded 2900 distinct channels at over 500 distinct sites. Although originally used only for earthquake location and coda magnitude, these seismograms are now of interest to seismologists for studying earth structure, precision relocations through cross correlation timing, and analysis of strong motion records. Until recently, the NCSN waveform data were available only through research accounts and special request methods due to incomplete instrument responses. Over the past 2 years, the USGS has assembled the necessary descriptions for both historic and current NCSN instrumentation. The NCEDC and USGS jointly developed a procedure to assemble the hardware attributes and instrument responses for the NCSN data channels using a combination of a simple spreadsheet that defines the attributes of each data channel, and a limited number of attribute files for classes of sensors and shared digitizers. These files are used by programs developed by the NCEDC to populate the NCEDC hardware tracking database tables and then to generate both the simple response and the full SEED instrument response database tables. As a result, the NCSN waveform data can now be distributed in SEED format with any of the NCEDC standard waveform request methods. The NCEDC provides access to waveform data through Web forms, email requests, and programming interfaces. The SeismiQuery Web interface provides information about data holdings. NetDC allows users to retrieve inventory information, instrument responses, and waveforms in SEED format. STP provides both a Web and programming interface to retrieve data in SEED or other user-friendly formats. Through the California Integrated Seismic Network, we are working with the SCEDC to provide unified access to California earthquake data. The NCEDC is a joint project of the UC Berkeley Seismological Laboratory and the USGS Menlo Park that provides a long-term archive and distribution center for geophysical data for northern California.

  20. Data and Visualizations in the Southern California Earthquake Center's Fault Information System

    NASA Astrophysics Data System (ADS)

    Perry, S.

    2003-12-01

    The Southern California Earthquake Center's Fault Information System (FIS) provides a single point of access to fault-related data and models from multiple databases and datasets. The FIS is built of computer code, metadata and Web interfaces based on Web services technology, which enables queries and data interchange irrespective of computer software or platform. Currently we have working prototypes of programmatic and browser-based access. The first generation FIS may be searched and downloaded live, by automated processes, as well as interactively, by humans using a browser. Users get ASCII data in plain text or encoded in XML. Via the Earthquake Information Technology (EIT) Interns (Juve and others, this meeting), we are also testing the effectiveness of querying multiple databases using a fault database ontology. For more than a decade, the California Geological Survey (CGS), SCEC, and the U. S. Geological Survey (USGS) have put considerable, shared resources into compiling and assessing published fault data, then providing the data on the Web. Several databases now exist, with different formats, datasets, purposes, and users, in various stages of completion. When fault databases were first envisioned, the full power of today's internet was not yet recognized, and the databases became the Web equivalents of review papers, where one could read an overview summation of a fault, then copy and paste pertinent data. Today, numerous researchers also require rapid queries and downloads of data. Consequently, the first components of the FIS are MySQL databases that deliver numeric values from earlier, text-based databases. Another essential service provided by the FIS is visualizations of fault representations such as those in SCEC's Community Fault Model. The long-term goal is to provide a standardized, open-source, platform-independent visualization technique. Currently, the FIS makes available fault model viewing software for users with access to Matlab or Java3D. The latter is the interactive LA3D software of the SCEC EIT intern team, which will be demonstrated at this session.

  1. USGS contributions to earthquake and tsunami monitoring in the Caribbean Region

    NASA Astrophysics Data System (ADS)

    McNamara, D.; Caribbean Project Team, U.; Partners, C.

    2007-05-01

    USGS Caribbean Project Team: Lind Gee, Gary Gyure, John Derr, Jack Odum, John McMillan, David Carver, Jim Allen, Susan Rhea, Don Anderson, Harley Benz. Caribbean Partners: Christa von Hillebrandt-Andrade - PRSN, Juan Payero - ISU-UASD, DR, Eduardo Camacho - UPAN, Panama, Lloyd Lynch - SRU, Gonzalo Cruz - UNAH, Honduras, Margaret Wiggins-Grandison - Jamaica, Judy Thomas - CERO Barbados, Sylvan McIntyre - NADMA Grenada, E. Bermingham - STRI. The magnitude-9 Sumatra-Andaman Islands earthquake of December 26, 2004, increased global awareness of the destructive hazard posed by earthquakes and tsunamis. In response to this tragedy, the US government undertook a collaborative project to improve earthquake and tsunami monitoring along a major portion of vulnerable coastal regions in the Caribbean Sea, the Gulf of Mexico, and the Atlantic Ocean. Seismically active areas of the Caribbean Sea region pose a tsunami risk for Caribbean islands, coastal areas along the Gulf of Mexico, and the Atlantic seaboard of North America. Nearly 100 tsunamis have been reported for the Caribbean region in the past 500 years, including 14 tsunamis reported in Puerto Rico and the U.S. Virgin Islands. Partners in this project include the United States Geological Survey (USGS), the Smithsonian Institution, the National Oceanic and Atmospheric Administration (NOAA), and several partner institutions in the Caribbean region. This presentation focuses on the deployment of nine broadband seismic stations to monitor earthquake activity in the Caribbean region that are affiliated with the Global Seismograph Network (GSN). By the end of 2006, five stations were transmitting data to the USGS National Earthquake Information Service (NEIS) and to regional partners through the Puerto Rico Seismic Network (PRSN) Earthworm systems. The following stations are currently operating: SDDR - Sabaneta Dam, Dominican Republic; BBGH - Gun Hill, Barbados; GRGR - Grenville, Grenada; BCIP - Barro Colorado, Panama; TGUH - Tegucigalpa, Honduras. These stations complement the existing GSN stations SJG - San Juan, Puerto Rico; SDV - Santo Domingo, Venezuela; TEIG - Tepich, Yucatan, Mexico; and JTS - Costa Rica. 2007 will see the construction of two additional stations in Guantanamo Bay, Cuba, and Barbuda. Planned stations in Jamaica and Grand Turk are awaiting local approval. In this presentation we examine noise conditions at the five operating sites and assess the capabilities of the current seismic network using three measures of capability: 1) minimum Mw detection threshold; 2) response time of the automatic processing system; and 3) theoretical earthquake location errors. The new seismic stations are part of a larger effort to monitor and mitigate tsunami hazard in the region. Destructive earthquakes and tsunamis are known to be a threat in various parts of the Caribbean. We demonstrate that considerable improvement in network magnitude threshold, response time, and earthquake location error has been achieved.

  2. Southern California Earthquake Center - SCEC1: Final Report Summary Alternative Earthquake Source Characterization for the Los Angeles Region

    SciTech Connect

    Foxall, B

    2003-02-26

    The objective of my research has been to synthesize current understanding of the tectonics and faults of the Los Angeles Basin and surrounding region to quantify uncertainty in the characterization of earthquake sources used for geologically- and geodetically-based regional earthquake likelihood models. This work has focused on capturing epistemic uncertainty, i.e., uncertainty stemming from ignorance of the true characteristics of the active faults in the region and of the tectonic forces that drive them. In the present context, epistemic uncertainty has two components: first, the uncertainty in source geometrical and occurrence-rate parameters deduced from the limited geological, geophysical and geodetic observations available; and second, uncertainties that result from fundamentally different interpretations of regional tectonic deformation and faulting. Characterization of the large number of active and potentially active faults that need to be included in estimating earthquake occurrence likelihoods for the Los Angeles region requires synthesis and evaluation of large amounts of data and numerous interpretations. This was accomplished primarily through a series of carefully facilitated workshops, smaller meetings involving key researchers, and email groups. The workshops and meetings were made possible by the unique logistical and financial resources available through SCEC, and proved to be extremely effective forums for the exchange and critical debate of data and interpretations that are essential in constructing fully representative source models. The main product of this work is a complete source model that characterizes all known or potentially active faults in the greater Los Angeles region, which includes the continental borderland as far south as San Diego, the Ventura Basin, and the Santa Barbara Channel. The model comprises a series of maps and representative cross-sections that define alternative fault geometries, a table containing fault geometrical and slip-rate parameters, including full uncertainty distributions, and a set of logic trees that define alternative source characterizations, particularly for sets of fault systems having inter-dependent geometries and kinematics resulting from potential intersection and interaction in the sub-surface. All of these products exist in a form suitable for input to earthquake likelihood and seismic hazard analyses. In addition, moment-balanced Poissonian earthquake rates for the alternative multi-segment characterizations of each fault system have been estimated. Finally, this work has served an important integrative function in that the exchange and debate of data, results and ideas that it has engendered has helped to focus SCEC research over the past six years on key issues in tectonic deformation and faulting.

  3. Near-Real time, High Resolution Reservoir Monitoring and Modeling with Micro-earthquake Data

    NASA Astrophysics Data System (ADS)

    Hutchings, L. J.; Jarpe, S.; Boyle, K. L.; Bonner, B. P.; Viegas, G.; Philson, H.; Statz-Boyer, P.; Majer, E.

    2011-12-01

    We present a micro-earthquake recording and automated processing system, along with a methodology, to provide near-real time, high resolution reservoir monitoring and modeling. An interactive program for testing micro-earthquake network designs helps identify configurations for optimum accuracy and resolution. We select the Northwest Geysers, California geothermal field to showcase the usefulness of the system. The system's inexpensive recorders require very little time or expertise to install, and the automated processing requires merely placing flash memory chips (or telemetry) into a computer. Together these make the deployment of large numbers of sensors feasible, and thus rapid, high resolution results possible. Data are arranged into input files for tomography of Vp, Vs, Qp and Qs, and their combinations, to provide for interpretation in terms of rock properties. Micro-earthquake source parameters include seismic moments, full moment tensor solutions, stress drops, source durations, radiated energy, and hypocentral locations. The methodology for interpretation is to use visualization with GUI analysis to cross-compare tomography and source property results, along with borehole or other independent information and rock physics, to identify reservoir properties. The system can potentially provide information heretofore unattainable or unaffordable for many small companies, organizations, and countries around the world.

  4. Role of WEGENER (World Earthquake GEodesy Network for Environmental Hazard Research) in monitoring natural hazards (Invited)

    NASA Astrophysics Data System (ADS)

    Ozener, H.; Zerbini, S.; Bastos, M. L.; Becker, M. H.; Meghraoui, M.; Reilinger, R. E.

    2013-12-01

    WEGENER was originally the acronym for Working Group of European Geoscientists for the Establishment of Networks for Earth-science Research. It was founded in March 1981 in response to an appeal delivered at the Journées Luxembourgeoises de Geodynamique in December 1980 to respond with a coordinated European proposal to a NASA Announcement of Opportunity inviting participation in the Crustal Dynamics and Earthquake Research Program. During the past 33 years, WEGENER has kept close contact with the agencies and institutions responsible for the development and maintenance of the global space geodetic networks, with the aim of making them aware of the scientific needs and outcomes of the project which might have an influence on general science policy trends. WEGENER served as Inter-commission Project 3.2, between Commission 1 and Commission 3, of the International Association of Geodesy (IAG) until 2012. Since then, the WEGENER project has become Sub-commission 3.5 of IAG Commission 3, namely Tectonics and Earthquake Geodesy. In this presentation, we briefly review the accomplishments of WEGENER as originally conceived and outline and justify the new focus of the WEGENER consortium. The remarkable and rapid evolution of global geodetic monitoring in regard to the precision of positioning capabilities (and hence deformation) and global coverage, the development of InSAR for monitoring strain with unprecedented spatial resolution, and continuing and planned data from highly precise satellite gravity and altimetry missions encourage us to shift principal attention from monitoring capabilities by a combination of space and terrestrial geodetic techniques to applying existing observational methodologies to the critical geophysical phenomena that threaten our planet and society. Our new focus includes developing an improved physical basis to mitigate earthquake, tsunami, and volcanic risks, and the effects of natural and anthropogenic climate change (sea level, ice degradation). In addition, expanded applications of space geodesy to atmospheric studies will remain a major focus, with emphasis on ionospheric and tropospheric monitoring to support forecasting of extreme events. Towards these ends, we will encourage and foster interdisciplinary, integrated initiatives to develop a range of case studies for these critical problems. Geological studies are needed to extend geodetic deformation studies to geologic time scales, and new modeling approaches will facilitate full exploitation of expanding geodetic databases. In light of this new focus, the WEGENER acronym now represents 'World Earthquake GEodesy Network for Environmental Hazard Research'.

  5. Postseismic Deformation after the 1964 Great Alaskan Earthquake: Collaborative Research with Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Freymueller, Jeffrey T.

    1999-01-01

    The purpose of this project was to carry out GPS observations on the Kenai Peninsula, southern Alaska, in order to study the postseismic and contemporary deformation following the 1964 Alaska earthquake. All of the research supported in this grant was carried out in collaboration with Dr. Steven Cohen of Goddard Space Flight Center. The research funding from this grant primarily supported GPS fieldwork, along with the acquisition of computer equipment to allow analysis and modeling of the GPS data. A minor amount of salary support was provided by the PI, but the great majority of the salary support was provided by the Geophysical Institute. After the expiration of this grant, additional funding was obtained from the National Science Foundation to continue the work. This grant supported GPS field campaigns in August 1995, June 1996, May-June and September 1997, and May-June 1998. We initially began the work by surveying leveling benchmarks on the Kenai Peninsula that had been surveyed after the 1964 earthquake. Changes in height from the 1964 leveling data to the 1995+ GPS data, corrected for the geoid-ellipsoid separation, give the total elevation change since the earthquake. Beginning in 1995, we also identified or established sites that were suitable for long-term surveying using GPS. In the subsequent annual GPS campaigns, we made regular measurements at these GPS marks, and steadily enhanced our set of points for which cumulative postseismic uplift data were available. From 4 years of Global Positioning System (GPS) measurements, we find significant spatial variations in present-day deformation between the eastern and western Kenai Peninsula, Alaska. Sites in the eastern Kenai Peninsula and Prince William Sound move to the NNW relative to North America, in the direction of Pacific-North America relative plate motion. Velocities decrease in magnitude from nearly the full plate rate in southern Prince William Sound to about 30 mm/yr at Seward and to about 5 mm/yr near Anchorage. In contrast, sites in the western Kenai Peninsula move to the SW, in a nearly trenchward direction, with a velocity of about 20 mm/yr. The data are consistent with the shallow plate interface offshore and beneath the eastern Kenai and Prince William Sound being completely locked or nearly so, with elastic strain accumulation resulting in rapid motion in the direction of relative plate motion of sites in the overriding plate. The velocities of sites in the western Kenai, along strike to the southwest, are opposite in sign to those predicted from elastic strain accumulation. These data are incompatible with a significant locked region in this segment of the plate boundary. Trenchward velocities are also found for some sites in the Anchorage area. We interpret the trenchward velocities as being caused by a continuing postseismic transient from the 1964 great Alaska earthquake.

  6. Change of permeability caused by 2011 Tohoku earthquake detected from pore pressure monitoring

    NASA Astrophysics Data System (ADS)

    Kinoshita, C.; Kano, Y.; Ito, H.

    2013-12-01

    Earthquake-induced groundwater changes, both pre- and co-seismic, have long been reported (e.g., Roeloffs, 1996). For example, at the time of the 1995 Kobe earthquake, water inflow into an observation tunnel at Rokko changed (Fujimori et al., 1995), and groundwater levels fluctuated at the times of the 1964 Alaska earthquake (M8.6) (Coble, 1967) and the 1999 Taiwan Chi-Chi earthquake (M7.6) (Chia et al., 2001). Shaking by seismic waves and crack formation by crustal deformation have been proposed as causes, but the mechanism remains controversial. We have been monitoring pore pressure since 2005 to measure stress changes at the Kamioka mine, Gifu Prefecture, central Japan. Barometric pressure and strain are observed to correct the pore pressure data. In general, pore pressure changes are associated with meteorological effects, Earth tides, and crustal deformation. Increases in pore pressure depend on precipitation that infiltrates the ground. In particular, snow effects are larger than those of ordinary rainfall because our observation site receives heavy snow in winter; snowmelt infiltrates the ground and pore pressure increases from March to April every year. When the 2011 Tohoku earthquake (M9.0) occurred, pore pressure decreased markedly because permeability in the Kamioka region was increased by crustal deformation. Thus, we estimated the hydraulic diffusivity before and after the earthquake from the pore pressure response to crustal deformation. We performed separate analyses in three frequency bands. The first is the high-frequency band, in particular the seismic response. The second is the response to Earth tides. The third is the band of the barometric response, which is lower than the other two. In the high-frequency band, we confirmed that the deformation occurred under undrained conditions and estimated the bulk modulus from the pore pressure and strain data. Next, the tidal response was extracted from the pore pressure using three-month windows of pore pressure, barometric pressure, and strain data, with the window shifted by one day at a time. As a result, the amplitudes of the O1 and M2 constituents decreased after the Tohoku earthquake: the M2 and O1 amplitudes were 0.575 hPa and 0.277 hPa before the earthquake, and decreased to 0.554 hPa and 0.184 hPa, respectively, after the earthquake. The phase between pore pressure and strain changed after the event and soon recovered. We estimated the hydraulic diffusivity from the change in the ratio of tidal responses. Because strain data were unavailable due to an instrument problem, we used synthetic strain. From the one-dimensional diffusion equation and poroelastic constitutive relations, we approximated the relation between pore pressure and strain by an exponential curve. The estimated hydraulic diffusivity is 8.0 m^2/s for the preseismic period and 19 m^2/s for the postseismic period, consistent with the observed pore pressure decrease. For the barometric pressure response, we performed spectral analysis and estimated the hydraulic diffusivity. The results from the three frequency bands were integrated to show how the hydraulic diffusivity depends on frequency.

  7. The academic health center in complex humanitarian emergencies: lessons learned from the 2010 Haiti earthquake.

    PubMed

    Babcock, Christine; Theodosis, Christian; Bills, Corey; Kim, Jimin; Kinet, Melodie; Turner, Madeleine; Millis, Michael; Olopade, Olufunmilayo; Olopade, Christopher

    2012-11-01

    On January 12, 2010, a 7.0-magnitude earthquake struck Haiti. The event disrupted infrastructure and was marked by extreme morbidity and mortality. The global response to the disaster was rapid and immense, comprising multiple actors, including academic health centers (AHCs), that provided assistance in the field and from home. The authors retrospectively examine the multidisciplinary approach that the University of Chicago Medicine (UCM) applied to postearthquake Haiti, which included the application of institutional structure and strategy, systematic deployment of teams tailored to evolving needs, and the actual response and recovery. The university mobilized significant human and material resources for deployment within 48 hours and sustained the effort for over four months. In partnership with international and local nongovernmental organizations as well as other AHCs, the UCM operated one of the largest and most efficient acute field hospitals in the country. The UCM's efforts in postearthquake Haiti provide insight into the role AHCs can play, including their strengths and limitations, in complex disasters. AHCs can provide necessary intellectual and material resources as well as technical expertise, but the cost and speed required for responding to an emergency, and ongoing domestic responsibilities, may limit the response of a large university and hospital system. The authors describe the strong institutional backing, the detailed predeployment planning and logistical support UCM provided, the engagement of faculty and staff who had previous experience in complex humanitarian emergencies, and the help of volunteers fluent in the local language which, together, made UCM's mission in postearthquake Haiti successful. PMID:23018336

  8. Real-Time Seismic Monitoring of the New Cape Girardeau (MO) Bridge and Recorded Earthquake Response

    NASA Astrophysics Data System (ADS)

    Çelebi, Mehmet

    This paper introduces the state-of-the-art, real-time and broadband seismic monitoring network implemented for the 1206 m [3956 ft] long, cable-stayed Bill Emerson Memorial Bridge in Cape Girardeau (MO), a new Mississippi River crossing, approximately 80 km from the epicentral region of the 1811-1812 New Madrid earthquakes. Design of the bridge accounted for the possibility of a strong earthquake (magnitude 7.5 or greater) during the design life of the bridge. The monitoring network consists of a superstructure and two free-field arrays and comprises a total of 84 channels of accelerometers deployed on the superstructure, pier foundations and free-field in the vicinity of the bridge. The paper also introduces the high quality response data obtained from the network. Such data are intended to be used by the owner, researchers and engineers to (1) assess the performance of the bridge, (2) check design parameters, including the comparison of dynamic characteristics with actual response, and (3) better design future similar bridges. Preliminary analyses of low-amplitude ambient vibration data and data from a small earthquake reveal specific response characteristics of this new bridge and the free-field in its proximity. There is coherent tower-cable-deck interaction that sometimes results in amplified ambient motions. Also, while the motions at the lowest (tri-axial) downhole accelerometers on both the MO and IL sides are practically free from any feedback from the bridge, the motions at the middle downhole and surface accelerometers are significantly influenced by amplified ambient motions of the bridge.

  9. Analogue models of subduction megathrust earthquakes: improving rheology and monitoring technique

    NASA Astrophysics Data System (ADS)

    Brizzi, Silvia; Corbi, Fabio; Funiciello, Francesca; Moroni, Monica

    2015-04-01

    Most of the world's great earthquakes (Mw > 8.5, usually known as mega-earthquakes) occur at shallow depths along the subduction thrust fault (STF), i.e., the frictional interface between the subducting and overriding plates. The spatiotemporal occurrence of mega-earthquakes and the physics governing them remain ambiguous, as tragically demonstrated by the underestimation of recent megathrust events (e.g., 2011 Tohoku). To help unravel the seismic cycle at STFs, analogue modelling has become a key tool. The first properly scaled analogue models with realistic geometries (i.e., wedge-shaped) suitable for studying interplate seismicity were realized using granular elasto-plastic [e.g., Rosenau et al., 2009] and viscoelastic materials [i.e., Corbi et al., 2013]. In particular, viscoelastic laboratory experiments realized with type A gelatin at 2.5 wt% simulate, in a simplified yet robust way, the basic physics governing the subduction seismic cycle and the related rupture process. Despite the strength of this approach, analogue earthquakes are not perfectly comparable to their natural prototype. In this work, we try to improve subduction seismic cycle analogue models by modifying the rheological properties of the analogue material and adopting a new image analysis technique (i.e., PEP - ParticlE and Prediction velocity). We test the influence of lithosphere elasticity by using type A gelatin at a greater concentration (i.e., 6 wt%). Results show that gelatin elasticity plays an important role in controlling the seismogenic behaviour of the STF, tuning the mean and the maximum magnitude of analogue earthquakes. In particular, by increasing gelatin elasticity, we observe a decreasing mean magnitude, while the maximum magnitude remains the same. Experimental results therefore suggest that lithosphere elasticity could be one of the parameters that tune the seismogenic behaviour of the STF. Increasing gelatin elasticity also improves similarity with the natural prototype in terms of coseismic duration and rupture width. Experimental monitoring has been performed by means of both PEP and PIV (i.e., Particle Image Velocimetry) algorithms. PEP differs from classic cross-correlation techniques (i.e., PIV) in its ability to provide sparse velocity vectors at points coincident with particle barycentre positions, allowing a Lagrangian description of the velocity field and a better spatial resolution (i.e., ≈ 0.03 mm^2) with respect to PIV. Results show that the PEP algorithm is able to identify a greater number of analogue earthquakes (i.e., ≈ 20% more than the PIV algorithm), decreasing the minimum detectable magnitude from 6.6 to 4.5. Furthermore, earthquake source parameters (e.g., hypocentre position, rupture limits and slip distribution) are more accurately defined. The PEP algorithm is thus suitable for potentially gaining new insights into the seismogenic process of STFs, by extending the analysable magnitude range of analogue earthquakes, with implications for the applicability of scaling relationships, such as the Gutenberg-Richter law, to experimental results.

  10. Application of multimode airborne digital camera system in Wenchuan earthquake disaster monitoring

    NASA Astrophysics Data System (ADS)

    Liu, Xue; Li, Qingting; Fang, Junyong; Tong, Qingxi; Zheng, Lanfen

    2009-06-01

    Remote sensing, especially airborne remote sensing, can be an invaluable technique for quick response to natural disasters. Images acquired in a timely manner by airborne remote sensing can provide very important information for headquarters and decision makers to assess the disaster situation and make effective relief arrangements. The image acquisition and processing of the Multi-mode Airborne Digital Camera System (MADC) and its application in Wenchuan earthquake disaster monitoring are presented in this paper. The MADC system is a novel airborne digital camera developed by the Institute of Remote Sensing Applications, Chinese Academy of Sciences. This camera system can acquire high quality images in three modes, namely wide field, multi-spectral (hyper-spectral) and stereo conformation. The basic components and technical parameters of the MADC are also presented in this paper. The MADC system played a very important role in the disaster monitoring of the Wenchuan earthquake. In particular, a map of dammed lakes in the Jianjiang River area was produced and provided to the front-line headquarters. Analytical methods and information extraction techniques of the MADC are introduced, and some typical analytical and imaging results are given. Suggestions for the design and configuration of airborne sensors are discussed at the end of this paper.

  11. Real-time earthquake monitoring for tsunami warning in the Indian Ocean and beyond

    NASA Astrophysics Data System (ADS)

    Hanka, W.; Saul, J.; Weber, B.; Becker, J.; Harjadi, P.; Fauzi; Gitews Seismology Group

    2010-12-01

    The Mw = 9.3 Sumatra earthquake of 26 December 2004 generated a tsunami that affected the entire Indian Ocean region and caused approximately 230,000 fatalities. In response to this tragedy the German government funded the German Indonesian Tsunami Early Warning System (GITEWS) Project. The task of the GEOFON group of GFZ Potsdam was to develop and implement the seismological component. In this paper we describe the concept of the GITEWS earthquake monitoring system and report on its present status. The major challenge for earthquake monitoring within a tsunami warning system is to deliver rapid information about location, depth, size and possibly other source parameters. This is particularly true for coastlines adjacent to the potential source areas, such as the Sunda trench, where these parameters are required within a few minutes after the event in order to be able to warn the population before the potential tsunami hits the neighbouring coastal areas. Therefore, the key to a seismic monitoring system with short warning times adequate for Indonesia is a dense real-time seismic network across Indonesia with densifications close to the Sunda trench. A substantial number of supplementary stations in other Indian Ocean rim countries are added to strengthen the teleseismic monitoring capabilities. The installation of the new GITEWS seismic network, consisting of 31 combined broadband and strong-motion stations, 21 of them in Indonesia, is almost complete. The real-time data collection uses a private VSAT communication system with hubs in Jakarta and Vienna. In addition, all available seismic real-time data from the other seismic networks in Indonesia and other Indian Ocean rim countries are also acquired directly by VSAT or over the Internet at the Indonesian Tsunami Warning Centre in Jakarta, and the resulting "virtual" network of more than 230 stations can jointly be used for seismic data processing. The seismological processing software, part of the GITEWS tsunami control centre, is an enhanced version of the widely used SeisComP software and the well established GEOFON earthquake information system operated at GFZ in Potsdam (http://geofon.gfz-potsdam.de/db/eqinfo.php). This recently developed software package (SeisComP3) is reliable, fast and can provide fully automatic earthquake location and magnitude estimates. It uses innovative visualization tools, offers the possibility for manual correction and re-calculation, flexible configuration, support for distributed processing, and data and parameter exchange with external monitoring systems. SeisComP3 is used not only for tsunami warning in Indonesia but also in most other tsunami warning centres in the Indian Ocean and Euro-Med regions and in many seismic services worldwide.

  12. Earthquake Monitoring: SeisComp3 at the Swiss National Seismic Network

    NASA Astrophysics Data System (ADS)

    Clinton, J. F.; Diehl, T.; Cauzzi, C.; Kaestli, P.

    2011-12-01

    The Swiss Seismological Service (SED) has an ongoing responsibility to improve the seismicity monitoring capability for Switzerland. This is a crucial issue for a country with low background seismicity but where a large M6+ earthquake is expected in the next decades. With over 30 stations at a spacing of ~25 km, the SED operates one of the densest broadband networks in the world, which is complemented by ~50 real-time strong-motion stations. The strong-motion network is expected to grow by an additional ~80 stations over the next few years. Furthermore, the backbone of the network is complemented by broadband data from surrounding countries and temporary sub-networks for local monitoring of microseismicity (e.g. at geothermal sites). The variety of seismic monitoring responsibilities as well as the anticipated densification of our network demands highly flexible processing software. We are transitioning all software to the SeisComP3 (SC3) framework. SC3 is a fully featured automated real-time earthquake monitoring software package developed by GeoForschungsZentrum Potsdam in collaboration with its commercial partner, gempa GmbH. It is open source at its core and is becoming a community standard software for earthquake detection and waveform processing for regional and global networks across the globe. SC3 was originally developed for regional and global rapid monitoring of potentially tsunamigenic earthquakes. In order to fulfill the requirements of a local network recording moderate seismicity, SED has tuned configurations and added several modules. In this contribution, we present our SC3 implementation strategy, focusing on the detection and identification of seismicity on different scales. We operate several parallel processing "pipelines" to detect and locate local, regional and global seismicity. Additional pipelines with lower detection thresholds can be defined to monitor seismicity within dense subnets of the network. To be consistent with existing processing procedures, the NonLinLoc algorithm was implemented for manual and automatic locations using 1D and 3D velocity models; plugins for improved automatic phase picking and Ml computation were developed; and the graphical user interface for manual review was extended (including pick uncertainty definition; first-motion focal mechanisms; interactive review of station magnitude waveforms; full inclusion of strong-motion data). SC3 locations are fully compatible with those derived from the existing in-house processing tools and are stored in a database derived from the QuakeML data model. The database is shared with the SED alerting software, which merges origins from both SC3 and external sources in real time and handles the alerting procedure. With the monitoring software being transitioned to SeisComP3, acquisition, archival and dissemination of SED waveform data now conform to the SeedLink and ArcLink protocols, and continuous archives can be accessed via the SED and all EIDA (European Integrated Data Archives) web sites. Further, an SC3 module for waveform parameterisation has been developed, allowing rapid computation of peak values of ground motion and other engineering parameters within minutes of a new event. An output of this module is USGS ShakeMap XML.
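
    Since SC3 origins are stored in a data model derived from QuakeML, downstream tools can consume exported event files with standard libraries. The sketch below is illustrative only: it assumes ObsPy is installed and that a QuakeML file has been exported from the system (the file name events.xml is hypothetical); it is not the SED's own code.

```python
"""Minimal sketch of reading an exported QuakeML file with ObsPy."""
from obspy import read_events

catalog = read_events("events.xml")   # parse QuakeML into a Catalog object
for event in catalog:
    # fall back to the first origin/magnitude if no preferred one is set
    origin = event.preferred_origin() or event.origins[0]
    magnitude = event.preferred_magnitude() or event.magnitudes[0]
    depth_km = origin.depth / 1000.0 if origin.depth is not None else float("nan")
    print(f"{origin.time}  M{magnitude.mag:.1f}  "
          f"lat={origin.latitude:.3f} lon={origin.longitude:.3f} "
          f"depth={depth_km:.1f} km")
```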

  13. Application for temperature and humidity monitoring of data center environment

    NASA Astrophysics Data System (ADS)

    Albert, Ş.; Truşcǎ, M. R. C.; Soran, M. L.

    2015-12-01

    Technology and computer science have developed rapidly in recent years. Most systems that use high technologies require special working conditions, so monitoring and control are very important. Temperature and humidity are important parameters in the operation of computer, industrial, and research systems, and maintaining them within certain ranges is essential to ensure proper functioning. Usually, the temperature is maintained in the established range using an air conditioning system, but the humidity is affected. In the present work we developed an application for temperature and humidity monitoring in the Data Center of INCDTIM, based on a board with its own firmware called "AVR_NET_IO" that uses an ATmega32 microcontroller. Temperature sensors were connected to this board to measure the temperature at different points of the Data Center and outside it. Humidity monitoring is performed using data from the integrated sensors of the air conditioning system, thus achieving a correlation between humidity and temperature variation. A software application (CM-1) was developed together with the hardware; it allows temperature monitoring and logging inside the Data Center and triggers an alarm when variations exceed the established temperature limits by more than 3°C.
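
    The alarm behaviour described for CM-1 can be summarized as a simple polling loop. The sketch below is illustrative only: the sensor names, the read_sensor() stand-in, the set-point, and the polling interval are hypothetical, while the 3°C tolerance follows the abstract.

```python
"""Illustrative polling-and-alarm loop for data center temperature monitoring."""
import random
import time

SETPOINT_C = 22.0     # assumed target room temperature
TOLERANCE_C = 3.0     # alarm when deviation exceeds 3 degC (value from the abstract)
SENSORS = ["rack-1", "rack-2", "cold-aisle", "outside"]   # hypothetical sensor IDs

def read_sensor(sensor_id: str) -> float:
    """Stand-in for a query to the monitoring board; returns a simulated value."""
    return random.gauss(SETPOINT_C, 2.0)

def check_once(history: list) -> None:
    """Read every sensor, register the value, and raise an alarm if needed."""
    for sensor_id in SENSORS:
        temp_c = read_sensor(sensor_id)
        history.append((time.time(), sensor_id, temp_c))
        if abs(temp_c - SETPOINT_C) > TOLERANCE_C:
            print(f"ALARM: {sensor_id} at {temp_c:.1f} degC is outside the "
                  f"+/-{TOLERANCE_C} degC band around {SETPOINT_C} degC")

if __name__ == "__main__":
    history = []
    for _ in range(3):        # three polling cycles for illustration
        check_once(history)
        time.sleep(1)         # a real deployment would poll less frequently
```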

  14. Non-intrusive human fatigue monitoring in command centers

    NASA Astrophysics Data System (ADS)

    Alsamman, A.; Ratecki, T.

    2011-04-01

    An inexpensive, non-intrusive, vision-based, active fatigue monitoring system is presented. The system employs a single consumer webcam that is modified to operate in the near-IR range. An active IR LED system is developed to facilitate the quick localization of the eye pupils. Imaging software tracks the eye features by analyzing intensity areas and their changes in the vicinity of the localization. To quantify the level of fatigue, the algorithm measures the opening of the eyelid and computes PERCLOS (the percentage of eye closure over time). The software runs on the workstation and is designed to draw limited computational power so as not to interfere with the user's task. To overcome the low frame rate and improve real-time monitoring, a two-phase detection and tracking algorithm is implemented. The results presented show that the system successfully monitors the level of fatigue at a low frame rate of 8 fps. The system is well suited to monitoring users in command centers, flight control centers, airport traffic dispatch, military operation and command centers, etc., but the work can be extended to wearable devices and other environments.
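
    PERCLOS is essentially the fraction of recent frames in which the eyelid is nearly closed. The sketch below illustrates that computation; the window length, closure criterion, and baseline value are assumptions rather than parameters reported in the paper, and only the 8 fps frame rate is taken from the abstract.

```python
"""Sketch of a sliding-window PERCLOS computation from per-frame eyelid openings."""
from collections import deque

FPS = 8                   # frame rate reported in the abstract
WINDOW_S = 60             # assumed evaluation window, in seconds
CLOSED_FRACTION = 0.2     # eyelid opening below 20% of baseline counts as "closed"

class PerclosMonitor:
    def __init__(self, baseline_opening: float):
        self.baseline = baseline_opening            # fully-open eyelid distance (pixels)
        self.frames = deque(maxlen=FPS * WINDOW_S)  # rolling record of closed/open flags

    def update(self, eyelid_opening: float) -> float:
        """Add one frame's eyelid opening and return the current PERCLOS (0..1)."""
        self.frames.append(eyelid_opening < CLOSED_FRACTION * self.baseline)
        return sum(self.frames) / len(self.frames)

# usage: feed per-frame eyelid openings from the tracker (toy values here)
monitor = PerclosMonitor(baseline_opening=12.0)     # hypothetical baseline in pixels
for opening in [12.0, 11.5, 2.0, 1.5, 10.0]:
    perclos = monitor.update(opening)
print(f"PERCLOS over window: {perclos:.2f}")
```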

  15. Federal Radiological Monitoring and Assessment Center Overview of FRMAC Operations

    SciTech Connect

    1998-03-01

    In the event of a major radiological emergency, 17 federal agencies with various statutory responsibilities have agreed to coordinate their efforts at the emergency scene under the umbrella of the Federal Radiological Emergency Response Plan. This cooperative effort will ensure that all federal radiological assistance fully supports their efforts to protect the public. The mandated federal cooperation ensures that each agency can obtain the data critical to its specific responsibilities. This Overview of the Federal Radiological Monitoring and Assessment Center (FRMAC) describes the FRMAC response activities during a major radiological emergency. It also describes the federal assets and subsequent operational activities which provide federal radiological monitoring and assessment of the off-site areas.

  16. Federal Radiological Monitoring and Assessment Center Monitoring Manual Volume 2, Radiation Monitoring and Sampling

    SciTech Connect

    NSTec Aerial Measurement Systems

    2012-07-31

    The FRMAC Monitoring and Sampling Manual, Volume 2 provides standard operating procedures (SOPs) for field radiation monitoring and sample collection activities that are performed by the Monitoring group during a FRMAC response to a radiological emergency.

  17. Student-centered Experiments on Earthquake Occurrence Using the Seismic/Eruption Program

    NASA Astrophysics Data System (ADS)

    Barker, J. S.; Jones, A. L.; Hubenthal, M.

    2005-12-01

    Seismic/Eruption is a free Windows program that plots the locations of earthquakes and volcanic eruptions through time on maps of the world or various geographical areas. The hypocenter database can be updated via the Internet to include the NEIC catalog from 1960 to the present. Many teaching activities based on this program (e.g. Braile and Braile, 2001) can help students draw conclusions about the distribution and rate of occurrence of earthquakes. In this activity students, individually or in small groups, select a seismically active region of interest and make their own map. They select a time window, perhaps 20 years. By changing the minimum magnitude setting in Seismic/Eruption and replaying the plots, they observe first-hand that large earthquakes occur less often than smaller earthquakes. The total number of earthquakes plotted is easily read from a counter on the screen. Students compile a table of the number of earthquakes per year with magnitude greater than or equal to a certain value, using a range of magnitude thresholds. These are then plotted on semi-log paper in the form of a Gutenberg-Richter plot. Connecting the points on the plot allows students to see a linear trend, and to think about why there may be departures from that linear trend for very small and very large magnitudes. If they assume earthquake occurrence is equally distributed in time, they can predict how often an earthquake of a given magnitude is likely to occur in their chosen region. They can also replay Seismic/Eruption to see whether that assumption is valid. Allowing students to interrogate the most accurate, complete and up-to-date earthquake catalog for a region of their own choosing provides ownership of the experiment. Students may choose the area of a recent newsworthy earthquake (e.g. Sumatra), or their family's ancestral region, or an area they are studying in another class. Students should be encouraged to pose questions and hypotheses about earthquake occurrence, knowing that they have the data and a display tool at hand to answer those questions.
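
    The table-building step of the activity amounts to counting events at or above each magnitude threshold and taking the logarithm of the count. The short sketch below reproduces that bookkeeping with a toy magnitude list; the numbers are invented for illustration and are not the NEIC catalog used by Seismic/Eruption.

```python
"""Toy Gutenberg-Richter tabulation: cumulative counts above magnitude thresholds."""
import math

# toy catalog: magnitudes of events in the chosen region and time window
magnitudes = [3.1, 3.4, 3.3, 4.0, 4.2, 3.8, 5.1, 3.2, 4.6, 3.0,
              3.7, 4.1, 3.5, 5.6, 3.9, 4.4, 3.6, 3.3, 4.8, 6.2]
years = 20.0   # length of the time window, as in the activity

thresholds = [3.0, 3.5, 4.0, 4.5, 5.0, 5.5, 6.0]
print("M >=    N   N/yr  log10(N)")
for m in thresholds:
    n = sum(1 for mag in magnitudes if mag >= m)   # cumulative count at or above m
    if n:
        print(f"{m:4.1f} {n:4d} {n / years:6.2f} {math.log10(n):9.2f}")
```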

  18. Robust Satellite Techniques (RST) for monitoring earthquake prone areas by satellite TIR observations: The case of 1999 Chi-Chi earthquake (Taiwan)

    NASA Astrophysics Data System (ADS)

    Genzano, N.; Filizzola, C.; Paciello, R.; Pergola, N.; Tramutoli, V.

    2015-12-01

    For more than 13 years a multi-temporal data-analysis method named Robust Satellite Techniques (RST) has been applied to satellite Thermal InfraRed (TIR) monitoring of seismically active regions. It gives a clear definition of a TIR anomaly within a validation/confutation scheme devoted to verifying whether detected anomalies can be associated or not with the time and location of the occurrence of major earthquakes. In this scheme, the confutation part (i.e. verifying that similar anomalies do not occur in the absence of significant seismic activity) assumes a role even more important than the usual validation component devoted to verifying the presence of anomalous signal transients before (or in association with) specific seismic events. Since 2001, the RST approach has been used to study tens of earthquakes with a wide range of magnitudes (from 4.0 to 7.9) that occurred on different continents and in various geo-tectonic settings. In this paper such long-term experience is exploited in order to give a quantitative definition of a significant sequence of TIR anomalies (SSTA) in terms of the required space-time continuity constraints (persistence), also identifying the different typologies of known spurious sequences of TIR anomalies that have to be excluded from the following validation steps. On the same basis, also taking into account the physical models proposed to justify the existence of a correlation between TIR anomalies and earthquake occurrence, specific validation rules (in line with the ones used by the Collaboratory for the Study of Earthquake Predictability - CSEP - Project) have been defined to drive the validation process. In this work, such an approach is applied for the first time to a long-term dataset of night-time GMS-5/VISSR (Geostationary Meteorological Satellite/Visible and Infrared Spin-Scan Radiometer) TIR measurements, comparing SSTAs and earthquakes with M > 4 which occurred in a wide area around Taiwan in the month of September of different years (from 1995 to 2002). In this dataset the Chi-Chi earthquake (MW = 7.6), which occurred on September 20, 1999, represents the largest, but not the only, event. The analysis shows that all identified SSTAs occur in the pre-fixed space-time window around (in terms of time and location) earthquakes with M > 4. The false positive rate remains zero even if only earthquakes with M > 4.5 are considered. In the case of the Chi-Chi earthquake, 3 SSTAs were identified (all within the established space-time correlation window), one of them appearing about 2 weeks before and very close to the epicentre of the earthquake, just along the associated tectonic lineaments. The wide space-time window considered, together with the high seismicity of the area, surely conditioned the achieved results positively, so further analyses should be carried out using longer datasets and different geographic areas. However, also considering the coincidence with other (possible) precursor phenomena independently reported (particularly within the iSTEP project) at the time of the Chi-Chi earthquake, the achieved results already seem sufficient (at least) to qualify TIR anomalies (identified by RST) among the parameters to be considered in the framework of a multi-parametric approach to a time-Dependent Assessment of Seismic Hazard (t-DASH).
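
    RST-style analyses rest on comparing the current TIR signal with a pixel-wise reference field built from many years of co-located imagery for the same calendar period. The sketch below shows that kind of normalized anomaly index on synthetic data; the specific index formulation, reference-field construction, and threshold are assumptions and are not taken from the abstract.

```python
"""Sketch of a normalized TIR anomaly index against a multi-year reference field."""
import numpy as np

def tir_anomaly_index(current, reference_stack):
    """current: 2-D TIR image; reference_stack: 3-D stack of historical images
    for the same calendar period (years x rows x cols)."""
    mu = reference_stack.mean(axis=0)
    sigma = reference_stack.std(axis=0)
    sigma = np.where(sigma > 0, sigma, np.nan)   # avoid division by zero
    return (current - mu) / sigma                # standard-score anomaly per pixel

def significant_anomalies(index, threshold=2.0):
    """Flag pixels whose anomaly exceeds the (assumed) threshold."""
    return index > threshold

# toy example with random data standing in for GMS-5/VISSR brightness temperatures
rng = np.random.default_rng(0)
history = rng.normal(290.0, 2.0, size=(8, 50, 50))   # 8 reference "years"
today = history.mean(axis=0) + rng.normal(0.0, 2.0, size=(50, 50))
today[20:23, 30:33] += 8.0                            # injected warm patch
mask = significant_anomalies(tir_anomaly_index(today, history))
print("anomalous pixels:", int(mask.sum()))
```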

  19. The effects of educational program on health volunteers’ knowledge regarding their approach to earthquake in health centers in Tehran

    PubMed Central

    JOUHARI, ZAHRA; PIRASTEH, AFSHAR; GHASSEMI, GHOLAM REZA; BAZRAFKAN, LEILA

    2015-01-01

    Introduction People's mental, intellectual and physical unpreparedness to confront an earthquake may result in disastrous outcomes. This research aimed to study the effects of a training intervention on health connectors' knowledge regarding their approach to earthquakes in health-training centers in the east of Tehran. Methods This semi-experimental study was designed and executed in 2011, using a questionnaire with items based on information from the Crisis Management Organization. After a pilot study and establishing the questionnaire's validity and reliability, we determined the sample size. Then, the questionnaires were completed before and after the training program by 82 health connectors at health-treatment centers in the east of Tehran. Finally, the collected data were analyzed with SPSS 14, using the paired-sample t-test and Pearson's correlation coefficient. Results The health connectors were women with a mean age of 43.43±8.51 years. The mean score of the connectors' knowledge before and after the training was 35.15±4.3 and 43.73±2.91 out of 48, respectively; the difference was statistically significant (p=0.001). The classes were the most important source of information for the health connectors. Conclusion People's knowledge of how to confront an earthquake can be increased by holding training courses and workshops. Such training courses and workshops have an important role in knowledge transfer and the readiness of health connectors. PMID:25927068
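
    The comparison reported above is a paired-sample t-test on before/after scores. The sketch below illustrates that analysis on synthetic scores generated to resemble the reported sample size, means, and standard deviations; it uses SciPy and does not reproduce the study data.

```python
"""Illustrative paired-sample t-test on synthetic before/after knowledge scores."""
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 82                                             # sample size from the abstract
before = rng.normal(35.15, 4.3, size=n)            # synthetic pre-training scores
after = before + rng.normal(8.6, 2.0, size=n)      # synthetic post-training gain

t_stat, p_value = stats.ttest_rel(after, before)   # paired t-test
print(f"mean before = {before.mean():.2f}, mean after = {after.mean():.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3g}")
```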

  20. Earthquake prediction

    NASA Technical Reports Server (NTRS)

    Turcotte, Donald L.

    1991-01-01

    The state of the art in earthquake prediction is discussed. Short-term prediction based on seismic precursors, changes in the ratio of compressional velocity to shear velocity, tilt and strain precursors, electromagnetic precursors, hydrologic phenomena, chemical monitors, and animal behavior is examined. Seismic hazard assessment is addressed, and the applications of dynamical systems to earthquake prediction are discussed.

  1. Advanced earthquake monitoring system for U.S. Department of Veterans Affairs medical buildings--instrumentation

    USGS Publications Warehouse

    Kalkan, Erol; Banga, Krishna; Ulusoy, Hasan S.; Fletcher, Jon Peter B.; Leith, William S.; Reza, Shahneam; Cheng, Timothy

    2012-01-01

    In collaboration with the U.S. Department of Veterans Affairs (VA), the National Strong Motion Project (NSMP; http://nsmp.wr.usgs.gov/) of the U.S. Geological Survey has been installing sophisticated seismic systems that will monitor the structural integrity of 28 VA hospital buildings located in seismically active regions of the conterminous United States, Alaska, and Puerto Rico during earthquake shaking. These advanced monitoring systems, which combine the use of sensitive accelerometers and real-time computer calculations, are designed to determine the structural health of each hospital building rapidly after an event, helping the VA to ensure the safety of patients and staff. This report presents the instrumentation component of this project by providing details of each hospital building, including a summary of its structural, geotechnical, and seismic hazard information, as well as instrumentation objectives and design. The structural-health monitoring component of the project, including data retrieval and processing, damage detection and localization, automated alerting system, and finally data dissemination, will be presented in a separate report.

  2. Monitoring of the stress state variations of the Southern California for the purpose of earthquake prediction

    NASA Astrophysics Data System (ADS)

    Gokhberg, M.; Garagash, I.; Bondur, V.; Steblov, G. M.

    2014-12-01

    A three-dimensional geomechanical model of Southern California was developed, including mountain relief, fault tectonics, and characteristic internal features such as the roof of the consolidated crust and the Moho surface. The initial stress state of the model is governed by gravitational forces and by horizontal tectonic motions estimated from GPS observations. The analysis shows that three-dimensional geomechanical models allow monitoring of changes in the stress state during the seismic process in order to constrain the locations of future increases in seismic activity. This investigation demonstrates one possible approach to monitoring upcoming seismicity over periods of days to weeks to months. Continuous analysis of the stress state was carried out during 2009-2014. Each new earthquake with M~1 and above from the USGS catalog was treated as a new defect of the Earth's crust with a definite size that causes redistribution of the stress state. The overall calculation technique was based on a single Earth-crust damage function, recalculated every half month. As a result, every half month we identified, in the upper crustal layers and partly in the middle layers, the locations of the maximum values of the stress-state parameters: elastic energy density, shear stress, and the proximity of the crustal layers to their strength limit. All these parameters exhibit similar spatial and temporal distributions. As follows from the observations, all four of the strongest events (M ~ 5.5-7.2) that occurred in Southern California during the analyzed period were preceded by anomalies in these parameters, with lead times of weeks to months, within 10-50 km of the upcoming earthquake. After each event the stress-state source disappeared. The figure shows the migration of the maxima of the stress-state variation gradients (parameter D) in the vicinity of the epicenter of the earthquake of 04.04.2010 with M=7.2 in the period 01.01.2010-01.05.2010. Grey lines show the major faults. In the table the values are sampled every 2 weeks; "-" indicates time before the event, "+" indicates time after the event.

  3. Seismic ACROSS Transmitter Installed at Morimachi above the Subducting Philippine Sea Plate for the Test Monitoring of the Seismogenic Zone of Tokai Earthquake not yet to Occur

    NASA Astrophysics Data System (ADS)

    Kunitomo, T.; Kumazawa, M.; Masuda, T.; Morita, N.; Torii, T.; Ishikawa, Y.; Yoshikawa, S.; Katsumata, A.; Yoshida, Y.

    2008-12-01

    Here we report the first seismic monitoring system in active and constant operation for the wave propagation characteristics of the tectonic region just above the subducting plate that will drive the coming catastrophic earthquakes. Development of such a system (ACROSS; acronym for Accurately Controlled, Routinely Operated Signal System) started in 1994 at Nagoya University and, since 1996, also at the Tono Geoscience Center (TGC) of JAEA, prompted by the Hyogoken Nanbu earthquake (17 January 1995, Mj=7.3). ACROSS is a technology system, including a theory of signal and data processing, based on a new concept for measuring the Green's function between a signal source and an observation site. The work done on the first-generation system is reported at IWAM04 and in a JAEA report (Kumazawa et al., 2007). The Meteorological Research Institute of JMA started a project for test monitoring of the Tokai area in 2004, in cooperation with Shizuoka University, to realize the practical use of the seismic ACROSS for earthquake prediction research. The first target was set to the Tokai Earthquake, which has not yet taken place. The seismic ACROSS transmitter was designed to be appropriate for sensitive monitoring of the deep active fault zone on the basis of the technology elements accumulated so far. The ground coupler (antenna) is a large steel-reinforced concrete block (over 20 m^3) installed in the basement rock in order to preserve stability. The eccentric moment of the rotary transmitter is 82 kgm at maximum, 10 times larger than that of the first generation. The carrier frequency of the FM signal for practical use can be from 3.5 to 15 Hz, and the signal phase is accurately controlled by a motor with a vector inverter synchronized with a GPS clock to a precision of 10^-4 radian or better. By referring to the existing structure model of this area (Iidaka et al., 2003), the site of the transmitting station was chosen at Morimachi so as to be appropriate for detecting the wave reflected from an anticipated fault plane of the Tokai Earthquake, the boundary between the Eurasian lithosphere and the subducting Philippine Sea Plate. Furthermore, several trials of a new transmission protocol and of remote control are being made for the next-generation transmitter network. The whole system appears to be working well, as reported by Yoshida et al. (2008, this meeting).

  4. Federal Radiological Monitoring and Assessment Center: Phase I Response

    SciTech Connect

    C. Riland; D. R. Bowman; R. Lambert; R. Tighe

    1999-09-30

    A Federal Radiological Monitoring and Assessment Center (FRMAC) is established in response to a Lead Federal Agency (LFA) or State request when a radiological emergency is anticipated or has occurred. The FRMAC coordinates the off-site monitoring, assessment, and analysis activities during such an emergency. The FRMAC response is divided into three phases. FRMAC Phase 1 is a rapid, initial-response capability that can interface with Federal or State officials and is designed for a quick response time and rapid radiological data collection and assessment. FRMAC Phase 1 products provide an initial characterization of the radiological situation and information on early health effects to officials responsible for making and implementing protective action decisions.

  5. Self-Powered WSN for Distributed Data Center Monitoring.

    PubMed

    Brunelli, Davide; Passerone, Roberto; Rizzon, Luca; Rossi, Maurizio; Sartori, Davide

    2016-01-01

    Monitoring environmental parameters in data centers is attracting increasing attention from industry, driven by the need for high energy efficiency in cloud services. We present the design and characterization of an energy-neutral embedded wireless system, prototyped to perpetually monitor environmental parameters in servers and racks. It is powered by an energy harvesting module based on Thermoelectric Generators, which converts the heat dissipated by the servers. Starting from the empirical characterization of the energy harvester, we present a power conditioning circuit optimized for the specific application. The whole system has been enhanced with several sensors. An ultra-low-power microcontroller stacked over the energy harvesting module provides efficient power management. Performance has been assessed and compared with the analytical model for validation. PMID:26729135
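    To make the harvesting claim above concrete, here is a back-of-envelope sketch of the maximum power a thermoelectric module can deliver into a matched load, P = (S dT)^2 / (4 R_int). The Seebeck coefficient, internal resistance, and temperature difference in the code are illustrative assumptions, not values from the paper.

```python
# Back-of-envelope sketch (not from the paper): maximum power a TEG can
# deliver into a matched load, P = (S * dT)^2 / (4 * R_int).
# All three quantities below are illustrative assumptions, not measured values.
seebeck_v_per_k = 0.05      # effective module Seebeck coefficient, V/K (assumed)
r_internal_ohm = 5.0        # module internal resistance, ohm (assumed)
delta_t_k = 10.0            # temperature difference across the module, K (assumed)

v_open = seebeck_v_per_k * delta_t_k
p_max_w = v_open ** 2 / (4.0 * r_internal_ohm)
print(f"open-circuit voltage: {v_open:.2f} V, matched-load power: {p_max_w * 1e3:.1f} mW")
```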

  6. Self-Powered WSN for Distributed Data Center Monitoring

    PubMed Central

    Brunelli, Davide; Passerone, Roberto; Rizzon, Luca; Rossi, Maurizio; Sartori, Davide

    2016-01-01

    Monitoring environmental parameters in data centers is attracting increasing attention from industry, driven by the need for high energy efficiency in cloud services. We present the design and characterization of an energy-neutral embedded wireless system, prototyped to perpetually monitor environmental parameters in servers and racks. It is powered by an energy harvesting module based on Thermoelectric Generators, which converts the heat dissipated by the servers. Starting from the empirical characterization of the energy harvester, we present a power conditioning circuit optimized for the specific application. The whole system has been enhanced with several sensors. An ultra-low-power microcontroller stacked over the energy harvesting module provides efficient power management. Performance has been assessed and compared with the analytical model for validation. PMID:26729135

  7. Monitoring of fluid-rock interaction in active fault zones: a new method of earthquake prediction/forecasting?

    NASA Astrophysics Data System (ADS)

    Claesson, L.; Skelton, A.; Graham, C.; Dietl, C.; Morth, M.; Torssander, P.

    2003-12-01

    We propose a new method for earthquake forecasting based on the "prediction in hindsight" of an Mw 5.8 earthquake in Iceland on September 16, 2002. The "prediction in hindsight" is based on geochemical monitoring of geothermal water at site HU-01, located within the Tjörnes Fracture Zone, northern Iceland, before and after the earthquake. During the 4 weeks before the earthquake, exponential (<800%) increases in the concentrations of Cu, Zn, and Fe in the fluid were measured, together with a linear increase in Na/Ca and a slight increase in δ18O. We relate the hydrogeochemical changes before the earthquake to the influx of fluid that had interacted with the host rock at higher temperatures, and suggest that fluid flow was facilitated by stress-induced modification of rock permeability, which enabled more rapid fluid-rock interaction. Stepwise increases (13-35%) in the concentrations of Ba, Ca, K, Li, Na, Rb, S, Si, Sr, Cl, Br, and SO4, and negative shifts in δ18O and δD, were detected in the fluid immediately after the earthquake; we relate these to seismically induced source switching and the consequent influx of older (or purer) ice-age meteoric waters. The newly tapped source reservoir has a chemically and isotopically distinct ice-age meteoric water signature, which is the result of a longer residence time in the crust. The immediacy of these changes is consistent with experimentally derived timescales of fault sealing in response to coupled deformation and fluid flow, interpreted as source switching. These precursory changes may be used to "predict" the earthquake up to 2 weeks before it occurs.

  8. Statistical monitoring of aftershock sequences: a case study of the 2015 Mw7.8 Gorkha, Nepal, earthquake

    NASA Astrophysics Data System (ADS)

    Ogata, Yosihiko; Tsuruoka, Hiroshi

    2016-03-01

    Early forecasting of aftershocks has become realistic and practical because of real-time detection of hypocenters. This study illustrates a statistical procedure for monitoring aftershock sequences to detect anomalies and thereby increase the probability gain of a significantly large aftershock, or even an earthquake larger than the main shock. In particular, a significant lowering (relative quiescence) of aftershock activity below the level predicted by the Omori-Utsu formula or the epidemic-type aftershock sequence (ETAS) model is sometimes followed by a large earthquake in a neighboring region. As an example, we detected significant lowering relative to the modeled rate beginning approximately 1.7 days after the main shock in the aftershock sequence of the Mw7.8 Gorkha, Nepal, earthquake of April 25, 2015. The relative quiescence lasted until the May 12, 2015, M7.3 Kodari earthquake, which occurred at the eastern end of the primary aftershock zone. Space-time plots, including those in transformed time, can indicate the local places where aftershock activity lowers (the seismicity shadow). Thus, the relative quiescence can be hypothesized to be related to stress shadowing caused by probable slow slip. In addition, the aftershock productivity of the M7.3 Kodari earthquake is approximately twice as large as that of the M7.8 main shock.
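    For orientation, the Omori-Utsu formula referred to above gives the expected aftershock rate n(t) = K / (t + c)^p at time t after the main shock. The sketch below compares an observed count in a time window against the count predicted by this formula, one crude way to flag the kind of relative quiescence described; the parameter values and the observed count are illustrative assumptions, not the values fitted for the Gorkha sequence.

```python
import numpy as np

def expected_count(t1, t2, K, c, p):
    """Expected number of aftershocks between t1 and t2 days after the
    main shock, from the Omori-Utsu rate n(t) = K / (t + c)**p (p != 1)."""
    return K * ((t1 + c) ** (1.0 - p) - (t2 + c) ** (1.0 - p)) / (p - 1.0)

# Illustrative parameters and a synthetic observation window (assumed values,
# not the parameters fitted for the Gorkha sequence).
K, c, p = 150.0, 0.05, 1.1
t_start, t_end = 1.7, 3.0          # days after the main shock
observed = 30                      # aftershocks actually counted in the window (assumed)

expected = expected_count(t_start, t_end, K, c, p)
sigma = np.sqrt(expected)          # Poisson standard deviation
if observed < expected - 2.0 * sigma:
    print(f"relative quiescence: observed {observed}, expected {expected:.1f}")
else:
    print(f"consistent with the model: observed {observed}, expected {expected:.1f}")
```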

  9. The Evolution of the Federal Monitoring and Assessment Center

    SciTech Connect

    NSTec Aerial Measurement System

    2012-07-31

    The Federal Radiological Monitoring and Assessment Center (FRMAC) is a federal emergency response asset whose assistance may be requested by the Department of Homeland Security (DHS), the Department of Defense (DoD), the Environmental Protection Agency (EPA), the Nuclear Regulatory Commission (NRC), and state and local agencies to respond to a nuclear or radiological incident. It is an interagency organization with representation from the Department of Energy's National Nuclear Security Administration (DOE/NNSA), the Department of Defense (DoD), the Environmental Protection Agency (EPA), the Department of Health and Human Services (HHS), the Federal Bureau of Investigation (FBI), and other federal agencies. FRMAC, in its present form, was created in 1987 when the radiological support mission was assigned to the DOE's Nevada Operations Office by DOE Headquarters. The FRMAC asset, including its predecessor entities, was created, grew, and evolved to function as a response to radiological incidents. Radiological emergency response exercises showed the need for a coordinated approach to managing federal emergency monitoring and assessment activities. The mission of FRMAC is to coordinate and manage all federal radiological environmental monitoring and assessment activities during a nuclear or radiological incident within the United States in support of state, local, and tribal governments, DHS, and the federal coordinating agency. Radiological emergency response professionals with the DOE's national laboratories support the Radiological Assistance Program (RAP), the National Atmospheric Release Advisory Center (NARAC), the Aerial Measuring System (AMS), and the Radiation Emergency Assistance Center/Training Site (REAC/TS). These teams support the FRMAC to provide atmospheric transport modeling, radiation monitoring, radiological analysis and data assessments, and medical advice for radiation injuries. In support of field operations, the FRMAC provides geographic information systems, communications, mechanical, electrical, logistics, and administrative support. The size of the FRMAC is tailored to the incident, and it comprises emergency response professionals drawn from across the federal government. State and local emergency response teams may also integrate their operations with FRMAC, but are not required to.

  10. Early Results of Three-Year Monitoring of Red Wood Ants' Behavioral Changes and Their Possible Correlation with Earthquake Events.

    PubMed

    Berberich, Gabriele; Berberich, Martin; Grumpe, Arne; Wöhler, Christian; Schreiber, Ulrich

    2013-01-01

    Short-term earthquake predictions with an advance warning of several hours or days are currently not possible due to both incomplete understanding of the complex tectonic processes and inadequate observations. Abnormal animal behaviors before earthquakes have been reported previously, but create problems in monitoring and reliability. The situation is different with red wood ants (RWA; Formica rufa-group (Hymenoptera: Formicidae)). They have stationary mounds on tectonically active, gas-bearing fault systems. These faults may be potential earthquake areas. For three years (2009-2012), two red wood ant mounds (Formica rufa-group), located at the seismically active Neuwied Basin (Eifel, Germany), have been monitored 24/7 by high-resolution cameras with both a color and an infrared sensor. Early results show that ants have a well-identifiable standard daily routine. Correlation with local seismic events suggests changes in the ants' behavior hours before the earthquake: the nocturnal rest phase and daily activity are suppressed, and standard daily routine does not resume until the next day. At present, an automated image evaluation routine is being applied to the more than 45,000 hours of video streams. Based on this automated approach, a statistical analysis of the ants' behavior will be carried out. In addition, other parameters (climate, geotectonic and biological), which may influence behavior, will be included in the analysis. PMID:26487310

  11. Results of seismological monitoring in the Cascade Range 1962-1989: earthquakes, eruptions, avalanches and other curiosities

    USGS Publications Warehouse

    Weaver, C.S.; Norris, R.D.; Jonientz-Trisler, C.

    1990-01-01

    Modern monitoring of seismic activity at Cascade Range volcanoes began at Longmire on Mount Rainier in 1958. Since then, there has been an expansion of the regional seismic networks in Washington, northern Oregon and northern California. Now, the Cascade Range from Lassen Peak to Mount Shasta in the south and Newberry Volcano to Mount Baker in the north is being monitored for earthquakes as small as magnitude 2.0, and many of the stratovolcanoes are monitored for non-earthquake seismic activity. This monitoring has yielded three major observations. First, tectonic earthquakes are concentrated in two segments of the Cascade Range between Mount Rainier and Mount Hood and between Mount Shasta and Lassen Peak, whereas little seismicity occurs between Mount Hood and Mount Shasta. Second, the volcanic activity and associated phenomena at Mount St. Helens have produced intense and widely varied seismicity. And third, at the northern stratovolcanoes, signals generated by surficial events such as debris flows, icequakes, steam emissions, rockfalls and icefalls are seismically recorded. Such records have been used to alert authorities of dangerous events in progress. -Authors

  12. Magma Ascent to Submarine Volcanoes: Real-Time Monitoring by Means of Teleseismic Observations of Earthquake Swarms

    NASA Astrophysics Data System (ADS)

    Spicak, A.; Vanek, J.; Kuna, V. M.

    2013-12-01

    The occurrence of earthquake swarms is a reliable indicator of magmatic activity in the Earth's crust. Their occurrence beneath the submarine portions of volcanic arcs provides valuable information on the plumbing systems of this insufficiently understood environment and reveals recently active submarine volcanoes. Teleseismically recorded data (NEIC, GCMT Project) make it possible to observe magmatic activity in almost real time. We analysed the seismicity pattern in two areas: the Andaman-Nicobar region in April 2012 and the southern Ryukyu in April 2013. In both regions, the swarms are situated 80-100 km above the Wadati-Benioff zone of the subducting slab. Foci of the swarm earthquakes delimit a seismogenic layer at depths between 9 and 35 km, which should be formed by a brittle, fractured rock environment. The repeated occurrence of earthquakes clustered in swarms excludes large accumulations of melted rock in this layer; magma reservoirs should be situated at depths greater than 35 km. Upward magma migration from deeper magma reservoirs to shallow magma chambers or to the seafloor induces earthquake swarms by increasing tectonic stress and/or decreasing friction on faults. The frequency of earthquake swarm occurrence in the investigated areas makes a volcanic eruption at the seafloor probable. Moreover, the epicentral zones of the swarms often coincide with distinct elevations of the seafloor - seamounts and seamount ranges. The high accuracy of global seismological data also made it possible to observe the migration of earthquakes during individual swarms (Fig. 1), probably reflecting dike and/or sill propagation. Triggering of earthquake swarms by distant strong earthquakes was repeatedly observed in the Andaman-Nicobar region. The study documents the high accuracy of the hypocentral determinations published by the above-mentioned data centers and the usefulness of the EHB relocation procedure. Fig. 1: Epicentral map of the October 2002 earthquake swarm in southern Ryukyu showing E-W migration of events during the swarm. The swarm occurred during 29 hours on October 23-25 in the magnitude range 4.0-5.2. Open circles - epicenters of all 54 events of the swarm; red circles - epicenters of events that occurred in a particular time interval of the swarm development: (a) the first 3 hours; (b) the following 4 hours; (c) the final 22 hours.

  13. Emergency radiological monitoring and analysis United States Federal Radiological Monitoring and Assessment Center

    SciTech Connect

    Thome, D.J.

    1994-09-01

    The United States Federal Radiological Emergency Response Plan (FRERP) provides the framework for integrating the various Federal agencies responding to a major radiological emergency. Following a major radiological incident the FRERP authorizes the creation of the Federal Radiological Monitoring and Assessment Center (FRMAC). The FRMAC is established to coordinate all Federal agencies involved in the monitoring and assessment of the off-site radiological conditions in support of the impacted states and the Lead Federal Agency (LFA). Within the FRMAC, the Monitoring and Analysis Division is responsible for coordinating all FRMAC assets involved in conducting a comprehensive program of environmental monitoring, sampling, radioanalysis and quality assurance. This program includes: (1) Aerial Radiological Monitoring - Fixed Wing and Helicopter, (2) Field Monitoring and Sampling, (3) Radioanalysis - Mobile and Fixed Laboratories, (4) Radiation Detection Instrumentation - Calibration and Maintenance, (5) Environmental Dosimetry, and (6) An integrated program of Quality Assurance. To assure consistency, completeness and the quality of the data produced, a methodology and procedures handbook is being developed. This paper discusses the structure, assets and operations of FRMAC monitoring and analysis and the content and preparation of this handbook.

  14. Federal Radiological Monitoring and Assessment Center Monitoring Manual Volume 1, Operations

    SciTech Connect

    NSTec Aerial Measurement Systems

    2012-07-31

    The Monitoring division is primarily responsible for the coordination and direction of: aerial measurements to delineate the footprint of radioactive contaminants that have been released into the environment; monitoring of radiation levels in the environment; sampling to determine the extent of contaminant deposition in soil, water, air, and on vegetation; preliminary field analyses to quantify soil concentrations or depositions; and environmental and personal dosimetry for FRMAC field personnel during a Consequence Management Response Team (CMRT) and Federal Radiological Monitoring and Assessment Center (FRMAC) response. Monitoring and sampling techniques used during CM/FRMAC operations are specifically selected for use during radiological emergencies where large numbers of measurements and samples must be acquired, analyzed, and interpreted in the shortest amount of time possible. In addition, techniques and procedures are flexible so that they can be used during a variety of different scenarios, e.g., accidents involving releases from nuclear reactors, contamination by nuclear waste, nuclear weapon accidents, space vehicle reentries, or contamination from a radiological dispersal device. The Monitoring division also provides technicians to support specific Health and Safety Division activities, including: the operation of the Hotline; FRMAC facility surveys; assistance with health and safety at checkpoints; and assistance at population assembly areas which require support from the FRMAC. This volume covers deployment activities, initial FRMAC activities, development and implementation of the monitoring and assessment plan, the briefing of field teams, and the transfer of FRMAC to the EPA.

  15. Products and Services Available from the Southern California Earthquake Data Center (SCEDC) and the Southern California Seismic Network (SCSN)

    NASA Astrophysics Data System (ADS)

    Yu, E.; Bhaskaran, A.; Chen, S.; Chowdhury, F. R.; Meisenhelter, S.; Hutton, K.; Given, D.; Hauksson, E.; Clayton, R. W.

    2010-12-01

    Currently the SCEDC archives continuous and triggered data from nearly 5000 data channels from 425 SCSN recorded stations, processing and archiving an average of 12,000 earthquakes each year. The SCEDC provides public access to these earthquake parametric and waveform data through its website www.data.scec.org and through client applications such as STP and DHI. This poster will describe the most significant developments at the SCEDC in the past year.
    Updated hardware: The SCEDC has more than doubled its waveform file storage capacity by migrating to 2 TB disks.
    New data holdings: (1) Waveform data: beginning January 1, 2010, the SCEDC began continuously archiving all high-sample-rate strong-motion channels; all seismic channels recorded by the SCSN are now continuously archived and available at the SCEDC. (2) Portable data from the El Mayor-Cucapah 7.2 sequence: seismic waveforms from portable stations installed by researchers (contributed by Elizabeth Cochran, Jamie Steidl, and Octavio Lazaro-Mancilla) have been added to the archive and are accessible through STP, either as continuous data or associated with events in the SCEDC earthquake catalog; these additional data will help SCSN analysts and researchers improve event locations from the sequence. (3) Real-time GPS solutions from the El Mayor-Cucapah 7.2 event: three-component 1 Hz seismograms of California Real Time Network (CRTN) GPS stations from the April 4, 2010, magnitude 7.2 El Mayor-Cucapah earthquake are available in SAC format at the SCEDC. These time series were created by Brendan Crowell, Yehuda Bock (the project PI), and Mindy Squibb at SOPAC using data from the CRTN. The El Mayor-Cucapah earthquake demonstrated definitively the power of real-time high-rate GPS data: they measure dynamic displacements directly, they do not clip, and they are also able to detect the permanent (coseismic) surface deformation. (4) Triggered data from the Quake Catcher Network (QCN) and Community Seismic Network (CSN): the SCEDC, in cooperation with QCN and CSN, is exploring ways to archive and distribute data from high-density, low-cost networks; as a starting point the SCEDC will store a dataset from QCN and CSN and distribute it through a separate STP client.
    New archival methods: The SCEDC is exploring the feasibility of archiving and distributing waveform data using cloud computing such as Google Apps. A month of continuous data from the SCEDC archive will be stored in Google Apps and a client will be developed to access it in a manner similar to STP.
    XML formats: The SCEDC is now distributing earthquake parameter data through web services in QuakeML format. In addition, the SCEDC, in collaboration with the Northern California Earthquake Data Center (NCEDC) and USGS Golden, has reviewed and revised the StationXML format to produce version 2.0. The new version includes rules on extending the schema, the use of named complex types, and greater consistency in naming conventions. Based on this work we plan to develop readers and writers of the StationXML format.
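    Since QuakeML is plain XML, basic origin parameters can be pulled out of a web-service response with the standard library alone. The sketch below is namespace-agnostic and reads a hypothetical file named scedc_events.xml; the file name is an assumption, and the sketch is not an official SCEDC client.

```python
import xml.etree.ElementTree as ET

def local(tag):
    """Strip the XML namespace from a tag name."""
    return tag.rsplit('}', 1)[-1]

def read_quakeml_origins(path):
    """Minimal, namespace-agnostic extraction of origin time, latitude,
    longitude, and depth from a QuakeML file (sketch only)."""
    events = []
    root = ET.parse(path).getroot()
    for ev in root.iter():
        if local(ev.tag) != 'event':
            continue
        origin = next((el for el in ev.iter() if local(el.tag) == 'origin'), None)
        if origin is None:
            continue
        record = {}
        for field in ('time', 'latitude', 'longitude', 'depth'):
            node = next((el for el in origin.iter() if local(el.tag) == field), None)
            if node is not None:
                value = next((el for el in node.iter() if local(el.tag) == 'value'), None)
                record[field] = value.text if value is not None else None
        events.append(record)
    return events

# 'scedc_events.xml' is a hypothetical file saved from a QuakeML web-service
# response; substitute the actual response file.
if __name__ == '__main__':
    for e in read_quakeml_origins('scedc_events.xml'):
        print(e)
```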

  16. Logic-centered architecture for ubiquitous health monitoring.

    PubMed

    Lewandowski, Jacek; Arochena, Hisbel E; Naguib, Raouf N G; Chao, Kuo-Ming; Garcia-Perez, Alexeis

    2014-09-01

    One of the key points in maintaining and boosting research and development in the area of smart wearable systems (SWS) is the development of integrated architectures for intelligent services, as well as wearable systems and devices for health and wellness management. This paper presents such a generic architecture for multiparametric, intelligent, and ubiquitous wireless sensing platforms. It is a transparent, smartphone-based sensing framework with customizable wireless interfaces and plug'n'play capability to easily interconnect third-party sensor devices. It caters to wireless body, personal, and near-me area networks. A pivotal part of the platform is the integrated inference engine/runtime environment that allows the mobile device to serve as a user-adaptable personal health assistant. The novelty of this system lies in its rapid visual development and remote deployment model. The complementary visual Inference Engine Editor that comes with the package enables artificial intelligence specialists, alongside medical experts, to build data processing models by assembling different components and instantly deploying them (remotely) on patient mobile devices. In this paper, the new logic-centered software architecture for ubiquitous health monitoring applications is described, followed by a discussion of how it helps to shift focus from software and hardware development to the medical and health process-centered design of new SWS applications. PMID:25192566

  17. Combination of High Rate, Real-Time GNSS and Accelerometer Observations and Rapid Seismic Event Notification for Earthquake Early Warning and Volcano Monitoring with a Focus on the Pacific Rim.

    NASA Astrophysics Data System (ADS)

    Zimakov, L. G.; Passmore, P.; Raczka, J.; Alvarez, M.; Jackson, M.

    2014-12-01

    Scientific GNSS networks are moving towards a model of real-time data acquisition, epoch-by-epoch storage integrity, and on-board real-time position and displacement calculations. This new paradigm allows the integration of real-time, high-rate GNSS displacement information with acceleration and velocity data to create very high-rate displacement records. The mating of these two instruments allows the creation of a new, very high-rate (200 sps) displacement observable that has the full-scale displacement characteristics of GNSS and high-precision dynamic motions of seismic technologies. It is envisioned that these new observables can be used for earthquake early warning studies, volcano monitoring, and critical infrastructure monitoring applications. Our presentation will focus on the characteristics of GNSS, seismic, and strong motion sensors in high dynamic environments, including historic earthquakes in Southern California and the Pacific Rim, replicated on a shake table, over a range of displacements and frequencies. We will explore the optimum integration of these sensors from a filtering perspective including simple harmonic impulses over varying frequencies and amplitudes and under the dynamic conditions of various earthquake scenarios. In addition we will discuss implementation of a Rapid Seismic Event Notification System that provides quick delivery of digital data from seismic stations to the acquisition and processing center and a full data integrity model for real-time earthquake notification that provides warning prior to significant ground shaking.
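    One common way to merge the two sensor types described above (not necessarily the implementation used by the authors) is a complementary filter: low-pass the GNSS displacement, double-integrate and high-pass the acceleration, and sum the two branches at a crossover frequency. The sketch below applies this to synthetic data; the sampling rates, the 0.2 Hz crossover, and the noise levels are all illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def complementary_displacement(gnss_disp, accel, fs_gnss=1.0, fs_acc=200.0, fc=0.2):
    """Sketch of a complementary filter that merges GNSS displacement (low
    frequencies) with double-integrated acceleration (high frequencies) at a
    crossover frequency fc (Hz). All defaults are illustrative assumptions."""
    # Upsample GNSS displacement to the accelerometer rate by linear interpolation.
    t_acc = np.arange(accel.size) / fs_acc
    t_gnss = np.arange(gnss_disp.size) / fs_gnss
    gnss_up = np.interp(t_acc, t_gnss, gnss_disp)

    # Double-integrate acceleration to displacement (simple cumulative sums).
    dt = 1.0 / fs_acc
    disp_acc = np.cumsum(np.cumsum(accel) * dt) * dt

    # Low-pass the GNSS branch, high-pass the accelerometer branch, and sum.
    b_lo, a_lo = butter(2, fc, btype='low', fs=fs_acc)
    b_hi, a_hi = butter(2, fc, btype='high', fs=fs_acc)
    return filtfilt(b_lo, a_lo, gnss_up) + filtfilt(b_hi, a_hi, disp_acc)

# Synthetic demo: a 1 Hz oscillation riding on a slow ramp.
fs_acc, fs_gnss, dur = 200.0, 1.0, 60.0
t = np.arange(0, dur, 1 / fs_acc)
true_disp = 0.01 * t + 0.005 * np.sin(2 * np.pi * 1.0 * t)
accel = np.gradient(np.gradient(true_disp, t), t)
gnss = true_disp[:: int(fs_acc / fs_gnss)] + np.random.normal(0, 0.002, int(dur * fs_gnss))
merged = complementary_displacement(gnss, accel, fs_gnss, fs_acc)
print(f"rms misfit: {np.sqrt(np.mean((merged - true_disp) ** 2)):.4f} m")
```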

  18. Re-centering variable friction device for vibration control of structures subjected to near-field earthquakes

    NASA Astrophysics Data System (ADS)

    Ozbulut, Osman E.; Hurlebaus, Stefan

    2011-11-01

    This paper proposes a re-centering variable friction device (RVFD) for control of civil structures subjected to near-field earthquakes. The proposed hybrid device has two sub-components. The first sub-component of this hybrid device consists of shape memory alloy (SMA) wires that exhibit a unique hysteretic behavior and full recovery following post-transformation deformations. The second sub-component of the hybrid device consists of variable friction damper (VFD) that can be intelligently controlled for adaptive semi-active behavior via modulation of its voltage level. In general, installed SMA devices have the ability to re-center structures at the end of the motion and VFDs can increase the energy dissipation capacity of structures. The full realization of these devices into a singular, hybrid form which complements the performance of each device is investigated in this study. A neuro-fuzzy model is used to capture rate- and temperature-dependent nonlinear behavior of the SMA components of the hybrid device. An optimal fuzzy logic controller (FLC) is developed to modulate voltage level of VFDs for favorable performance in a RVFD hybrid application. To obtain optimal controllers for concurrent mitigation of displacement and acceleration responses, tuning of governing fuzzy rules is conducted by a multi-objective heuristic optimization. Then, numerical simulation of a multi-story building is conducted to evaluate the performance of the hybrid device. Results show that a re-centering variable friction device modulated with a fuzzy logic control strategy can effectively reduce structural deformations without increasing acceleration response during near-field earthquakes.
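    The semi-active idea behind modulating the VFD voltage can be illustrated with a far simpler rule than the paper's optimized neuro-fuzzy controller: command high voltage (more friction) while the structure moves away from its rest position, and low voltage while it moves back, so the SMA wires can re-center the frame. The sketch below encodes only this illustrative rule; the voltage levels and sample states are assumptions.

```python
# Minimal sketch of a semi-active rule for a variable friction damper
# (this is NOT the paper's optimized fuzzy logic controller). The rule:
# high voltage (more friction) while the structure moves away from rest,
# low voltage while it moves back toward rest, so the SMA wires can
# re-center the frame. All numbers are illustrative assumptions.

V_MAX, V_MIN = 5.0, 0.0          # damper command voltages (assumed)

def friction_voltage(displacement, velocity):
    """Return the commanded damper voltage for one time step."""
    moving_away = displacement * velocity > 0.0
    return V_MAX if moving_away else V_MIN

# Example: a few instants sampled over one cycle of motion.
for x, v in [(0.02, 0.10), (0.05, 0.01), (0.05, -0.05), (0.01, -0.08)]:
    print(f"x={x:+.2f} m, v={v:+.2f} m/s -> V={friction_voltage(x, v):.1f} V")
```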

  19. Federal Radiological Monitoring and Assessment Center Analytical Response

    SciTech Connect

    E.C. Nielsen

    2003-04-01

    The Federal Radiological Monitoring and Assessment Center (FRMAC) is authorized by the Federal Radiological Emergency Response Plan to coordinate all off-site radiological response assistance to state and local governments in the event of a major radiological emergency in the United States. The FRMAC is established by the U.S. Department of Energy, National Nuclear Security Administration, to coordinate all Federal assets involved in conducting a comprehensive program of radiological environmental monitoring, sampling, radioanalysis, quality assurance, and dose assessment. During an emergency response, the initial analytical data are provided by portable field instrumentation. As incident responders scale up their response based on the seriousness of the incident, local analytical assets and mobile laboratories add additional capability and capacity. During the intermediate phase of the response, data quality objectives and measurement quality objectives are more rigorous. These higher objectives will require the use of larger laboratories, with greater capacity and enhanced capabilities. These labs may be geographically distant from the incident, which will increase sample management challenges. This paper addresses emergency radioanalytical capability and capacity and its utilization during FRMAC operations.

  20. The Savannah River Technology Center environmental monitoring field test platform

    SciTech Connect

    Rossabi, J.

    1993-03-05

    Nearly all industrial facilities have been responsible for introducing synthetic chemicals into the environment. The Savannah River Site is no exception. Several areas at the site have been contaminated by chlorinated volatile organic chemicals. Because of the persistence and refractory nature of these contaminants, a complete clean up of the site will take many years. A major focus of the mission of the Environmental Sciences Section of the Savannah River Technology Center is to develop better, faster, and less expensive methods for characterizing, monitoring, and remediating the subsurface. These new methods can then be applied directly at the Savannah River Site and at other contaminated areas in the United States and throughout the world. The Environmental Sciences Section has hosted field testing of many different monitoring technologies over the past two years primarily as a result of the Integrated Demonstration Program sponsored by the Department of Energy's Office of Technology Development. This paper provides an overview of some of the technologies that have been demonstrated at the site and briefly discusses the applicability of these techniques.

  1. Earthquakes for Kids

    MedlinePlus


  2. Restoration of accelerator facilities damaged by Great East Japan Earthquake at Cyclotron and Radioisotope Center, Tohoku University.

    PubMed

    Wakui, Takashi; Itoh, Masatoshi; Shimada, Kenzi; Yoshida, Hidetomo P; Shinozuka, Tsutomu; Sakemi, Yasuhiro

    2014-01-01

    The Cyclotron and Radioisotope Center (CYRIC) of Tohoku University is a joint-use institution for education and research in a wide variety of fields ranging from physics to medicine. Accelerator facilities at the CYRIC provide opportunities for implementing a broad research program, including medical research using positron emission tomography (PET), with accelerated ions and radioisotopes. During the Great East Japan Earthquake on March 11, 2011, no injuries occurred at the CYRIC and evacuation proceeded smoothly, thanks to anti-earthquake measures such as the renovation of the cyclotron building in 2009 (mainly for seismic strengthening), the fixation of shelves to prevent objects from falling, and the securing of an adequately wide evacuation route. The preparation of an emergency response manual was also helpful. However, the accelerator facilities were damaged by the strong shaking, which continued for a few minutes. For example, two columns on which the 930 cyclotron was placed were damaged, and the 930 cyclotron consequently tilted. All elements of the beam transport lines deviated from the beam axis. Some peripheral devices of the HM12 cyclotron were broken. Two shielding doors fell from their carriages onto the floor and blocked the entrances to the rooms. Repair work on the accelerator facilities started at the end of July 2011. During the repair work, joint use of the accelerator facilities was suspended. After the repair work was completed, joint use restarted in October 2012, one and a half years after the earthquake. PMID:25030295

  3. The response of academic medical centers to the 2010 Haiti earthquake: the Mount Sinai School of Medicine experience.

    PubMed

    Ripp, Jonathan A; Bork, Jacqueline; Koncicki, Holly; Asgary, Ramin

    2012-01-01

    On January 12, 2010, Haiti was struck by a 7.0 earthquake which left the country in a state of devastation. In the aftermath, there was an enormous relief effort in which academic medical centers (AMC) played an important role. We offer a retrospective on the AMC response through the Mount Sinai School of Medicine (MSSM) experience. Over the course of the year that followed the Earthquake, MSSM conducted five service trips in conjunction with two well-established groups which have provided service to the Haitian people for over 15 years. MSSM volunteer personnel included nurses, resident and attending physicians, and specialty fellows who provided expertise in critical care, emergency medicine, wound care, infectious diseases and chronic disease management of adults and children. Challenges faced included stressful and potentially hazardous working conditions, provision of care with limited resources and cultural and language barriers. The success of the MSSM response was due largely to the strength of its human resources and the relationship forged with effective relief organizations. These service missions fulfilled the institution's commitment to social responsibility and provided a valuable training opportunity in advocacy. For other AMCs seeking to respond in future emergencies, we suggest early identification of a partner with field experience, recruitment of administrative and faculty support across the institution, significant pre-departure orientation and utilization of volunteers to fundraise and advocate. Through this process, AMCs can play an important role in disaster response. PMID:22232447

  4. The Response of Academic Medical Centers to the 2010 Haiti Earthquake: The Mount Sinai School of Medicine Experience

    PubMed Central

    Ripp, Jonathan A.; Bork, Jacqueline; Koncicki, Holly; Asgary, Ramin

    2012-01-01

    On January 12, 2010, Haiti was struck by a 7.0 earthquake which left the country in a state of devastation. In the aftermath, there was an enormous relief effort in which academic medical centers (AMC) played an important role. We offer a retrospective on the AMC response through the Mount Sinai School of Medicine (MSSM) experience. Over the course of the year that followed the Earthquake, MSSM conducted five service trips in conjunction with two well-established groups which have provided service to the Haitian people for over 15 years. MSSM volunteer personnel included nurses, resident and attending physicians, and specialty fellows who provided expertise in critical care, emergency medicine, wound care, infectious diseases and chronic disease management of adults and children. Challenges faced included stressful and potentially hazardous working conditions, provision of care with limited resources and cultural and language barriers. The success of the MSSM response was due largely to the strength of its human resources and the relationship forged with effective relief organizations. These service missions fulfilled the institution's commitment to social responsibility and provided a valuable training opportunity in advocacy. For other AMCs seeking to respond in future emergencies, we suggest early identification of a partner with field experience, recruitment of administrative and faculty support across the institution, significant pre-departure orientation and utilization of volunteers to fundraise and advocate. Through this process, AMCs can play an important role in disaster response. PMID:22232447

  5. Earthquake and ambient vibration monitoring of the steel-frame UCLA factor building

    USGS Publications Warehouse

    Kohler, M.D.; Davis, P.M.; Safak, E.

    2005-01-01

    Dynamic property measurements of the moment-resisting steel-frame University of California, Los Angeles, Factor building are being made to assess how forces are distributed over the building. Fourier amplitude spectra have been calculated from several intervals of ambient vibrations, a 24-hour period of strong winds, and from the 28 March 2003 Encino, California (ML = 2.9), the 3 September 2002 Yorba Linda, California (ML = 4.7), and the 3 November 2002 Central Alaska (Mw = 7.9) earthquakes. Measurements made from the ambient vibration records show that the first-mode frequency of horizontal vibration is between 0.55 and 0.6 Hz. The second horizontal mode has a frequency between 1.6 and 1.9 Hz. In contrast, the first-mode frequencies measured from earthquake data are about 0.05 to 0.1 Hz lower than those corresponding to ambient vibration recordings, indicating softening of the soil-structure system as amplitudes become larger. The frequencies revert to pre-earthquake levels within five minutes of the Yorba Linda earthquake. Shaking due to strong winds that occurred during the Encino earthquake dominates the frequency decrease, which correlates in time with the duration of the strong winds. The first shear wave recorded from the Encino and Yorba Linda earthquakes takes about 0.4 sec to travel up the 17-story building. © 2005, Earthquake Engineering Research Institute.
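    Modal frequencies of the kind quoted above (a first mode near 0.55-0.6 Hz) are typically read off as peaks in the Fourier amplitude spectrum of an ambient-vibration record. The sketch below demonstrates the idea on synthetic data; the sampling rate, mode frequencies, and search band are illustrative assumptions, not Factor building values.

```python
import numpy as np

# Minimal sketch (synthetic data, not Factor building records): estimate the
# fundamental frequency of a structure from the Fourier amplitude spectrum
# of an ambient-vibration record.
fs = 100.0                               # sampling rate, Hz (assumed)
t = np.arange(0, 600.0, 1.0 / fs)        # 10 minutes of record

# Synthetic "ambient vibration": weak 0.57 Hz and 1.8 Hz modes buried in noise.
rng = np.random.default_rng(0)
record = (0.5 * np.sin(2 * np.pi * 0.57 * t)
          + 0.2 * np.sin(2 * np.pi * 1.8 * t)
          + rng.normal(0, 1.0, t.size))

# Fourier amplitude spectrum and a peak pick below 1 Hz (first-mode band).
spec = np.abs(np.fft.rfft(record * np.hanning(record.size)))
freqs = np.fft.rfftfreq(record.size, d=1.0 / fs)
band = (freqs > 0.2) & (freqs < 1.0)
f1 = freqs[band][np.argmax(spec[band])]
print(f"estimated first-mode frequency: {f1:.2f} Hz")
```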

  6. Utilizing Changes in Repeating Earthquakes to Monitor Evolving Processes and Structure Before and During Volcanic Eruptions

    NASA Astrophysics Data System (ADS)

    Hotovec-Ellis, Alicia

    Repeating earthquakes are two or more earthquakes that share the same source location and source mechanism, which results in the earthquakes having highly similar waveforms when recorded at a seismic instrument. Repeating earthquakes have been observed in a wide variety of environments: from fault systems (such as the San Andreas and the Cascadia subduction zone), to hydrothermal areas and volcanoes. Volcano seismologists are particularly concerned with repeating earthquakes, as they have been observed at volcanoes along the entire range of eruptive style and are often a prominent feature of eruption seismicity. The behavior of repeating earthquakes sometimes changes with time, which possibly reflects subtle changes in the mechanism creating the earthquakes. In Chapter 1, we document an example of repeating earthquakes during the 2009 eruption of Redoubt volcano that became increasingly frequent with time, until they blended into harmonic tremor prior to several explosions. We interpreted the source of the earthquakes as stick-slip on a fault near the conduit that slipped increasingly often as the explosion neared, in response to the build-up of pressure in the system. The waveforms of repeating earthquakes may also change, even if the behavior does not. We can quantify changes in waveform using the technique of coda wave interferometry to differentiate between changes in source and medium. In Chapters 2 and 3, we document subtle changes in the coda of repeating earthquakes related to small changes in the near-surface velocity structure at Mount St. Helens before and during its eruption in 2004. Velocity changes have been observed prior to several volcanic eruptions, and are thought to occur in response to volumetric strain and the opening or closing of cracks in the subsurface. We compared continuous records of velocity change against other geophysical data, and found that velocities at Mount St. Helens change in response to snow loading, fluid saturation, shaking from large distant earthquakes, shallow pressurization, and possibly lava extrusion. Velocity changes at Mount St. Helens are a complex mix of many different effects, and other complementary data are required to interpret the signal.
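    Repeating earthquakes of the kind discussed above are typically identified by a high normalized cross-correlation coefficient between waveforms recorded at the same station. The sketch below applies that test to synthetic traces; the 0.9 threshold and all waveform parameters are illustrative assumptions.

```python
import numpy as np

def max_normalized_cc(a, b):
    """Maximum normalized cross-correlation between two equal-length
    waveforms, searching over small lags."""
    a = (a - a.mean()) / (a.std() * a.size)
    b = (b - b.mean()) / b.std()
    return np.max(np.correlate(a, b, mode='full'))

# Synthetic demo: the second trace is a noisier, slightly shifted copy of the
# template; the third is unrelated noise. The 0.9 threshold used to declare
# two events "repeating" is an assumption here.
rng = np.random.default_rng(1)
n = 500
template = np.sin(2 * np.pi * 5 * np.arange(n) / 100.0) * np.exp(-np.arange(n) / 200.0)
repeat = np.roll(template, 3) + rng.normal(0, 0.05, n)
unrelated = rng.normal(0, 0.5, n)

for name, trace in [('repeat', repeat), ('unrelated', unrelated)]:
    cc = max_normalized_cc(template, trace)
    tag = 'repeating' if cc >= 0.9 else 'not repeating'
    print(f"{name}: cc={cc:.2f} -> {tag}")
```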

  7. Comprehensive Nuclear-Test-Ban Treaty seismic monitoring: 2012 USNAS report and recent explosions, earthquakes, and other seismic sources

    SciTech Connect

    Richards, Paul G.

    2014-05-09

    A comprehensive ban on nuclear explosive testing is briefly characterized as an arms control initiative related to the Non-Proliferation Treaty. The work of monitoring for nuclear explosions uses several technologies, of which the most important is seismology, a physics discipline that draws upon extensive and ever-growing assets to monitor for earthquakes and other ground-motion phenomena as well as for explosions. This paper outlines the basic methods of seismic monitoring within that wider context, and lists web-based and other resources for learning details. It also summarizes the main conclusions, concerning capability to monitor for test-ban treaty compliance, contained in a major study published in March 2012 by the US National Academy of Sciences.

  8. Detection and monitoring of earthquake precursors: TwinSat, a Russia-UK satellite project

    NASA Astrophysics Data System (ADS)

    Chmyrev, Vitaly; Smith, Alan; Kataria, Dhiren; Nesterov, Boris; Owen, Christopher; Sammonds, Peter; Sorokin, Valery; Vallianatos, Filippos

    2013-09-01

    There is now a body of evidence to indicate that coupling occurs between the lithosphere, atmosphere, and ionosphere prior to earthquake events. Nevertheless, the physics of these phenomena and the possibilities of their use as part of an earthquake early warning system remain poorly understood. Proposed here is a programme to create a much greater understanding in this area through the deployment of a dedicated space asset along with coordinated ground stations, modelling, and the creation of a highly accessible database. The space element would comprise two co-orbiting spacecraft (TwinSat), a microsatellite and a nanosatellite, each carrying a suite of science instruments appropriate to this study. Over a mission duration of 3 years, ∼400 earthquakes in the range 6-6.9 on the Richter scale would be ‘observed’. Such a programme is a prerequisite for an effective earthquake early warning system.

  9. Ambient noise-based monitoring of seismic velocity changes associated with the 2014 Mw 6.0 South Napa earthquake

    NASA Astrophysics Data System (ADS)

    Taira, Taka'aki; Brenguier, Florent; Kong, Qingkai

    2015-09-01

    We perform ambient noise-based monitoring to explore temporal variations of crustal seismic velocities before, during, and after the 24 August 2014 Mw 6.0 South Napa earthquake. A velocity drop of about 0.08% is observed immediately after the South Napa earthquake. The spatial variability of the velocity reduction correlates most strongly with the pattern of peak ground velocity of the South Napa mainshock, which suggests that fracture damage in rocks induced by the dynamic strain is likely responsible for the coseismic velocity change. About 50% of the velocity reduction is recovered within the first 50 days following the South Napa mainshock. This postseismic velocity recovery may suggest a healing process of the damaged rocks.
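    Fractional velocity changes of this size are commonly measured with the "stretching" technique: the current coda or noise correlation is stretched or compressed by trial factors, and the factor that maximizes the correlation with a reference trace gives dv/v. The sketch below demonstrates the idea on synthetic waveforms; it is not the authors' processing code, and the sampling rate, trace length, and trial grid are assumptions.

```python
import numpy as np

def stretching_dvv(reference, current, t, trial_dvv):
    """Stretching-method sketch: for each trial dv/v, resample the current
    trace onto stretched times t * (1 + dv/v) and correlate it with the
    reference; return the dv/v giving the highest correlation."""
    best_dvv, best_cc = 0.0, -np.inf
    for dvv in trial_dvv:
        stretched = np.interp(t, t * (1.0 + dvv), current)
        cc = np.corrcoef(reference, stretched)[0, 1]
        if cc > best_cc:
            best_dvv, best_cc = dvv, cc
    return best_dvv, best_cc

# Synthetic demo: the "current" trace carries a -0.08% velocity change, i.e.
# its phases arrive slightly later than in the reference. Values are illustrative.
fs = 50.0
t = np.arange(0, 60.0, 1.0 / fs)
reference = np.sin(2 * np.pi * 1.0 * t) * np.exp(-t / 30.0)
true_dvv = -0.0008
current = np.sin(2 * np.pi * 1.0 * t * (1.0 + true_dvv)) * np.exp(-t / 30.0)

trials = np.linspace(-0.002, 0.002, 401)
dvv, cc = stretching_dvv(reference, current, t, trials)
print(f"recovered dv/v = {dvv * 100:.3f}% (cc = {cc:.3f})")
```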

  10. Early Results of Three-Year Monitoring of Red Wood Ants’ Behavioral Changes and Their Possible Correlation with Earthquake Events

    PubMed Central

    Berberich, Gabriele; Berberich, Martin; Grumpe, Arne; Wöhler, Christian; Schreiber, Ulrich

    2013-01-01

    Simple Summary For three years (2009–2012), two red wood ant mounds (Formica rufa-group), located at the seismically active Neuwied Basin (Eifel, Germany), have been monitored 24/7 by high-resolution cameras. Early results show that ants have a well-identifiable standard daily routine. Correlation with local seismic events suggests changes in the ants’ behavior hours before the earthquake: the nocturnal rest phase and daily activity are suppressed, and standard daily routine does not resume until the next day. At present, an automated image evaluation routine is being applied to the video streams. Based on this automated approach, a statistical analysis of the ant behavior will be carried out. Abstract Short-term earthquake predictions with an advance warning of several hours or days are currently not possible due to both incomplete understanding of the complex tectonic processes and inadequate observations. Abnormal animal behaviors before earthquakes have been reported previously, but create problems in monitoring and reliability. The situation is different with red wood ants (RWA; Formica rufa-group (Hymenoptera: Formicidae)). They have stationary mounds on tectonically active, gas-bearing fault systems. These faults may be potential earthquake areas. For three years (2009–2012), two red wood ant mounds (Formica rufa-group), located at the seismically active Neuwied Basin (Eifel, Germany), have been monitored 24/7 by high-resolution cameras with both a color and an infrared sensor. Early results show that ants have a well-identifiable standard daily routine. Correlation with local seismic events suggests changes in the ants’ behavior hours before the earthquake: the nocturnal rest phase and daily activity are suppressed, and standard daily routine does not resume until the next day. At present, an automated image evaluation routine is being applied to the more than 45,000 hours of video streams. Based on this automated approach, a statistical analysis of the ants’ behavior will be carried out. In addition, other parameters (climate, geotectonic and biological), which may influence behavior, will be included in the analysis. PMID:26487310

  11. The Development of Several Electromagnetic Monitoring Strategies and Algorithms for Validating Pre-Earthquake Electromagnetic Signals

    NASA Astrophysics Data System (ADS)

    Bleier, T. E.; Dunson, J. C.; Roth, S.; Mueller, S.; Lindholm, C.; Heraud, J. A.

    2012-12-01

    QuakeFinder, a private research group in California, reports on the development of a 100+ station network consisting of 3-axis induction magnetometers and air conductivity sensors to collect and characterize pre-seismic electromagnetic (EM) signals. These signals are combined with daily infrared (IR) data from the GOES weather satellite IR instrument, to compare and correlate with the ground EM signals, both from actual earthquakes and from boulder-stressing experiments. This presentation describes the efforts QuakeFinder has undertaken to automatically detect these pulse patterns using their historical data as a reference, and to develop other discriminative algorithms that can be used with the air conductivity sensors and the GOES IR instruments. The overall big-picture results of the QuakeFinder experiment are presented. In 2007, QuakeFinder discovered the occurrence of strong unipolar pulses in their magnetometer coil data that increased in tempo dramatically prior to the M5.1 earthquake at Alum Rock, California. Suggestions that these pulses might have been lightning or power-line arcing did not fit the data actually recorded, as reported in Bleier [2009]. A second earthquake then occurred near the same site on January 7, 2010, as reported in Dunson [2011], and the pattern of pulse-count increases before the earthquake was similar to the 2007 event. There were fewer pulses and their amplitudes were smaller, both consistent with the fact that the earthquake was smaller (M4.0 vs M5.4) and farther away (7 km vs 2 km). At the same time, similar effects were observed at the QuakeFinder Tacna, Peru, site before the May 5th, 2010 M6.2 earthquake and a cluster of several M4-5 earthquakes.
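    The automated pulse-pattern detection described above can be reduced, at its simplest, to counting large-amplitude excursions per unit time. The sketch below flags threshold crossings in a synthetic magnetometer record and merges crossings that fall closer together than a minimum separation; the threshold, separation, and sampling rate are illustrative assumptions, not QuakeFinder's actual detection parameters.

```python
import numpy as np

def count_unipolar_pulses(signal, fs, threshold, min_separation_s=1.0):
    """Count threshold-crossing pulses in a magnetometer record.
    A pulse is an excursion whose absolute amplitude exceeds `threshold`;
    crossings closer together than `min_separation_s` are merged.
    Threshold and separation are illustrative assumptions."""
    above = np.abs(signal) > threshold
    # Indices where the signal first rises above the threshold.
    onsets = np.flatnonzero(above & ~np.roll(above, 1))
    if onsets.size == 0:
        return 0
    keep = [onsets[0]]
    for idx in onsets[1:]:
        if (idx - keep[-1]) / fs >= min_separation_s:
            keep.append(idx)
    return len(keep)

# Synthetic day of 10 Hz data with a handful of injected pulses.
rng = np.random.default_rng(2)
fs = 10.0
data = rng.normal(0.0, 1.0, int(fs * 86400))
for sample in (5_000, 40_000, 40_004, 700_000):   # the two middle spikes merge into one pulse
    data[sample] += 12.0
print("pulses counted:", count_unipolar_pulses(data, fs, threshold=6.0))
```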

  12. First Results of 3 Year Monitoring of Red Wood Ants' Behavioural Changes and Their Possible Correlation with Earthquake Events

    NASA Astrophysics Data System (ADS)

    Berberich, Gabriele; Berberich, Martin; Grumpe, Arne; Wöhler, Christian; Schreiber, Ulrich

    2013-04-01

    Short-term earthquake predictions with an advance warning of several hours or days cannot currently be performed reliably and remain limited to only a few minutes before the event. Abnormal animal behaviours prior to earthquakes have been reported previously, but their detection creates problems of monitoring and reliability. A different situation is encountered for red wood ants (RWA; Formica rufa-group (Hymenoptera: Formicidae)). They have stationary nest sites on tectonically active, gas-bearing fault systems. These faults may be potential earthquake areas and are simultaneously information channels reaching deep into the crust. A particular advantage of monitoring RWA is their high sensitivity to environmental changes. Besides an evolutionarily developed, extremely strong temperature sensitivity of 0.25 K, they have chemoreceptors for the detection of CO2 concentrations and a sensitivity to electromagnetic fields. Changes in the electromagnetic field and short-lived "thermal anomalies" have been discussed as trigger mechanisms for bioanomalies preceding earthquakes. For 3 years, we have monitored two red wood ant mounds (Formica rufa-group), located at the seismically active Neuwied Basin (Eifel, Germany), 24/7 by high-resolution cameras equipped with a colour and an infrared sensor. In the Neuwied Basin, an average of about 100 earthquakes per year with magnitudes up to M 3.9 occur on different tectonic fault regimes (strike-slip faults and/or normal or thrust faults). The RWA mounds are located on two different fault regimes approximately 30 km apart. First results show that the ants have a well-identifiable standard daily routine. Correlation with local seismic events suggests changes in the ants' behaviour hours before the earthquake event: the nocturnal rest phase and daily activity are suppressed, and the standard daily routine does not resume until the next day. Additional parameters that might have an effect on the ants' daily routine (including climate data, earth tides, lunar phases and biological parameters) are recorded and correlated with the analysed daily activity. Additionally, nest air measurements (CO2, helium, radon, H2S and CH4) are performed at intervals. At present, an automated image analysis routine is being applied to the more than 45,000 hours of acquired video stream data. It is a valuable tool to objectively identify and classify the ants' activity on top of the mounds and to examine possible correlations with earthquakes. Based on this automated approach, a statistical analysis of the ants' behaviour is intended. The investigation and results presented here are a first step into a completely new research area. The key question is whether the ants' behavioural changes and their correlation with earthquake events are statistically significant and whether detection by an automated system is possible. Long-term studies have to show whether confounding factors and climatic influences can be clearly distinguished. Although the first results suggest that it is promising to consolidate and extend the research to determine a pattern for exceptional situations, there is still a long way to go to a usable automated earthquake warning system.
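    The automated image evaluation mentioned above can be approximated, in its simplest form, by a frame-differencing activity index: the mean absolute change between consecutive grayscale frames serves as a proxy for movement on the mound surface. The sketch below applies this to synthetic frames; the frame size, frame counts, and intensity ranges are purely illustrative assumptions.

```python
import numpy as np

def activity_index(frames):
    """Mean absolute difference between consecutive grayscale frames --
    a crude proxy for movement on the mound surface (sketch only)."""
    frames = np.asarray(frames, dtype=float)
    return np.abs(np.diff(frames, axis=0)).mean(axis=(1, 2))

# Synthetic "video": 100 x 100 pixel frames; pixel churn is high during the
# day and low at night. All numbers are illustrative.
rng = np.random.default_rng(3)
day = rng.integers(0, 40, size=(60, 100, 100))     # larger frame-to-frame changes
night = rng.integers(0, 5, size=(60, 100, 100))    # nearly static scene

day_activity = activity_index(day).mean()
night_activity = activity_index(night).mean()
print(f"mean activity index: day={day_activity:.1f}, night={night_activity:.1f}")
```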

  13. The Community Seismic Network and Quake-Catcher Network: Monitoring building response to earthquakes through community instrumentation

    NASA Astrophysics Data System (ADS)

    Cheng, M.; Kohler, M. D.; Heaton, T. H.; Clayton, R. W.; Chandy, M.; Cochran, E.; Lawrence, J. F.

    2013-12-01

    The Community Seismic Network (CSN) and Quake-Catcher Network (QCN) are dense networks of low-cost ($50) accelerometers that are deployed by community volunteers in their homes in California. In addition, many accelerometers are installed in public spaces associated with civic services, publicly operated utilities, university campuses, and high-rise buildings. Both CSN and QCN consist of observation-based structural monitoring which is carried out using records from one to tens of stations in a single building. We have deployed about 150 accelerometers in a number of buildings ranging between five and 23 stories in the Los Angeles region. In addition to a USB-connected device which connects to the host's computer, we have developed a stand-alone sensor-plug-computer device that directly connects to the internet via Ethernet or WiFi. In the case of CSN, the sensors report data to the Google App Engine cloud computing service consisting of data centers geographically distributed across the continent. This robust infrastructure provides parallelism and redundancy during times of disaster that could affect hardware. The QCN sensors, however, are connected to netbooks with continuous data streaming in real time via the Berkeley Open Infrastructure for Network Computing distributed computing software to a server at Stanford University. In both networks, continuous and triggered data streams use an STA/LTA scheme to determine the occurrence of significant ground accelerations. Waveform data, as well as derived parameters such as peak ground acceleration, are then sent to the associated archives. Visualization models of the instrumented buildings' dynamic linear response have been constructed using Google SketchUp and MATLAB. When data are available from a limited number of accelerometers installed in high rises, the buildings are represented as simple shear beam or prismatic Timoshenko beam models with soil-structure interaction. Small-magnitude earthquake records are used to identify the first two pairs of horizontal vibrational frequencies, which are then used to compute the response on every floor of the building, constrained by the observed data. The approach has been applied to a CSN-instrumented 12-story reinforced concrete building near downtown Los Angeles. The frequencies were identified directly from spectra of the 8 August 2012 M4.5 Yorba Linda, California earthquake acceleration time series. When the basic dimensions and the first two frequencies are input into a prismatic Timoshenko beam model of the building, the model yields mode shapes that have been shown to match well with densely recorded data. For the instrumented 12-story building, comparisons of the predictions of responses on other floors using only the record from the 9th floor with actual data from the other floors show this method to approximate the true response remarkably well.
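    The STA/LTA trigger mentioned above compares a short-term average of signal power with a long-term average and declares an event when the ratio exceeds a threshold. The sketch below implements a simple, non-causal version on a synthetic record; the window lengths and the trigger threshold of 5 are illustrative assumptions rather than CSN/QCN operational settings.

```python
import numpy as np

def sta_lta(signal, fs, sta_s=1.0, lta_s=30.0):
    """Classic STA/LTA characteristic function on squared amplitudes, using
    centered (non-causal) moving averages for simplicity; a real-time
    implementation would use trailing windows. Window lengths are assumptions."""
    power = signal.astype(float) ** 2
    sta_n, lta_n = int(sta_s * fs), int(lta_s * fs)
    sta = np.convolve(power, np.ones(sta_n) / sta_n, mode='same')
    lta = np.convolve(power, np.ones(lta_n) / lta_n, mode='same')
    lta[lta <= 0] = np.finfo(float).tiny          # avoid division by zero
    return sta / lta

# Synthetic record: background noise with a short burst of strong shaking.
rng = np.random.default_rng(4)
fs = 50.0
trace = rng.normal(0, 1.0, int(fs * 120))
trace[3000:3200] += 10.0 * np.sin(2 * np.pi * 3.0 * np.arange(200) / fs)

ratio = sta_lta(trace, fs)
threshold = 5.0                                    # trigger threshold (assumed)
trigger_times = np.flatnonzero(ratio > threshold) / fs
if trigger_times.size:
    print(f"first trigger at t = {trigger_times[0]:.1f} s")
else:
    print("no trigger")
```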

  14. Space Monitoring Data Center at Moscow State University

    NASA Astrophysics Data System (ADS)

    Kalegaev, Vladimir; Bobrovnikov, Sergey; Barinova, Vera; Myagkova, Irina; Shugay, Yulia; Barinov, Oleg; Dolenko, Sergey; Mukhametdinova, Ludmila; Shiroky, Vladimir

    The Space Monitoring Data Center of Moscow State University provides operational information on the radiation state of near-Earth space. The Internet portal http://swx.sinp.msu.ru/ gives access to current data characterizing the level of solar activity and the geomagnetic and radiation conditions in the magnetosphere and heliosphere in real time. Operational data coming from space missions (ACE, GOES, ELECTRO-L1, Meteor-M1) at L1, LEO, and GEO, and from the Earth's surface, are used to represent the geomagnetic and radiation state of the near-Earth environment. An online database of measurements is also maintained to allow quick comparison between current conditions and conditions experienced in the past. Space environment models running in autonomous mode are used to generalize the information obtained from observations to the whole magnetosphere. Interactive applications and operational forecasting services are built on these models. They automatically generate alerts on particle flux enhancements above threshold values, both for SEPs and for relativistic electrons, using data from LEO orbits. Special forecasting services give short-term forecasts of SEP penetration into the Earth's magnetosphere at low altitudes, as well as of relativistic electron fluxes at GEO. Velocities of recurrent high-speed solar wind streams at the Earth's orbit are predicted 3-4 days in advance on the basis of automatic estimation of the coronal hole areas detected in images of the Sun received from the SDO satellite. By means of a neural network approach, online forecasting of the Dst and Kp indices 0.5-1.5 hours ahead is carried out, based on the solar wind and interplanetary magnetic field measured by the ACE satellite. A visualization system allows experimental and model data to be represented in 2D and 3D.
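    The automated alerting described above reduces, in its simplest form, to flagging when a monitored flux first rises above a threshold. The sketch below does this for a synthetic hourly electron-flux series; the threshold and flux values are illustrative assumptions, not the operational thresholds used at the center.

```python
import numpy as np

# Minimal sketch of threshold-based alerting on a relativistic-electron flux
# time series. The alert threshold and the flux values are illustrative
# assumptions, not operational settings.
ALERT_THRESHOLD = 1.0e3      # particles / (cm^2 s sr), assumed

def flux_alerts(times_h, flux):
    """Return the start times (hours) of intervals where the flux first
    rises above the alert threshold."""
    above = flux > ALERT_THRESHOLD
    onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    return [int(times_h[i]) for i in onsets]

# Synthetic hourly flux with one enhancement.
hours = np.arange(72)
flux = 2.0e2 * np.ones(72)
flux[30:40] = 5.0e3
print("alerts issued at hours:", flux_alerts(hours, flux))
```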

  15. Managing Multi-center Flow Cytometry Data for Immune Monitoring

    PubMed Central

    White, Scott; Laske, Karoline; Welters, Marij JP; Bidmon, Nicole; van der Burg, Sjoerd H; Britten, Cedrik M; Enzor, Jennifer; Staats, Janet; Weinhold, Kent J; Gouttefangeas, Cécile; Chan, Cliburn

    2014-01-01

    With the recent results of promising cancer vaccines and immunotherapy [1-5], immune monitoring has become increasingly relevant for measuring treatment-induced effects on T cells, and an essential tool for shedding light on the mechanisms responsible for a successful treatment. Flow cytometry is the canonical multi-parameter assay for the fine characterization of single cells in solution, and is ubiquitously used in pre-clinical tumor immunology and in cancer immunotherapy trials. Current state-of-the-art polychromatic flow cytometry involves multi-step, multi-reagent assays followed by sample acquisition on sophisticated instruments capable of capturing up to 20 parameters per cell at a rate of tens of thousands of cells per second. Given the complexity of flow cytometry assays, reproducibility is a major concern, especially for multi-center studies. A promising approach for improving reproducibility is the use of automated analysis borrowing from statistics, machine learning and information visualization [21-23], as these methods directly address the subjectivity, operator dependence, labor intensiveness and low fidelity of manual analysis. However, it is quite time-consuming to investigate and test new automated analysis techniques on large data sets without some centralized information management system. For large-scale automated analysis to be practical, the presence of consistent and high-quality data linked to the raw FCS files is indispensable. In particular, the use of machine-readable standard vocabularies to characterize channel metadata is essential when constructing analytic pipelines to avoid errors in the processing, analysis and interpretation of results. For automation, this high-quality metadata needs to be programmatically accessible, implying the need for a consistent Application Programming Interface (API). In this manuscript, we propose that upfront time spent normalizing flow cytometry data to conform to carefully designed data models enables automated analysis, potentially saving time in the long run. The ReFlow informatics framework was developed to address these data management challenges. PMID:26085786
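
    The point about machine-readable channel metadata and a programmatic API can be illustrated with a short sketch; the classes and field names below are hypothetical and are not the actual ReFlow data model.

      from dataclasses import dataclass, field

      @dataclass
      class ChannelMetadata:
          """Hypothetical machine-readable description of one flow cytometry channel."""
          channel_number: int
          fluorochrome: str          # e.g. "FITC"; ideally a controlled-vocabulary term
          marker: str                # e.g. "CD8";  ideally a controlled-vocabulary term
          scatter: bool = False      # True for FSC/SSC channels

      @dataclass
      class SampleMetadata:
          """Hypothetical sample record linking an FCS file to its panel annotation."""
          fcs_filename: str
          site: str
          channels: list = field(default_factory=list)

          def channel_for_marker(self, marker):
              # An automated pipeline can look channels up by marker name instead of
              # relying on free-text labels typed differently at each center.
              return next(c for c in self.channels if c.marker == marker)

      sample = SampleMetadata(
          fcs_filename="site_A_patient_01.fcs",
          site="Site A",
          channels=[
              ChannelMetadata(1, "FSC", "", scatter=True),
              ChannelMetadata(3, "FITC", "CD8"),
              ChannelMetadata(4, "PE", "CD4"),
          ],
      )
      print(sample.channel_for_marker("CD8"))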

  16. Permeable Pavement Monitoring at the Edison Environmental Center Demonstration Site

    EPA Science Inventory

    The presentation covers the following monitoring objectives at the demonstration site at Edison, NJ: Hydrologic performance, water quality performance, urban heat island effects, maintenance effects and infiltration water parameters. There will be a side by side monitoring of ...

  17. Real Earthquakes, Real Learning

    ERIC Educational Resources Information Center

    Schomburg, Aaron

    2003-01-01

    One teacher took her class on a year long earthquake expedition. The goal was to monitor the occurrences of real earthquakes during the year and mark their locations with push pins on a wall-sized world map in the hallway outside the science room. The purpose of the project was to create a detailed picture of the earthquakes that occurred…

  18. Martin Marietta Energy Systems, Inc. comprehensive earthquake management plan: Emergency Operations Center training manual

    SciTech Connect

    Not Available

    1990-02-28

    The objectives of this training are to describe the responsibilities, resources, and goals of the Emergency Operations Center, and to evaluate and interpret this information in order to best direct and allocate emergency, plant, and other resources to protect life and the Paducah Gaseous Diffusion Plant.

  19. The Design, Implementation, and Evaluation of a Bilingual Placement and Monitoring Center.

    ERIC Educational Resources Information Center

    Golub, Lester S.

    In 1981, the Bilingual Placement and Monitoring Instructional Support System (BPMIS) Center in the School District of Lancaster (Pennsylvania) was established. Partially supported by Title VII funds, the Center provides comprehensive services to assure sound educational placement, continuous monitoring, successful transition into all…

  20. Passive seismic monitoring of natural and induced earthquakes: case studies, future directions and socio-economic relevance

    USGS Publications Warehouse

    Bohnhoff, Marco; Dresen, Georg; Ellsworth, William L.; Ito, Hisao

    2010-01-01

    An important discovery in crustal mechanics has been that the Earth's crust is commonly stressed close to failure, even in tectonically quiet areas. As a result, small natural or man-made perturbations to the local stress field may trigger earthquakes. To understand these processes, Passive Seismic Monitoring (PSM) with seismometer arrays is a widely used technique that has been successfully applied to study seismicity at different magnitude levels, ranging from acoustic emissions generated in the laboratory under controlled conditions, to seismicity induced by hydraulic stimulations in geological reservoirs, and up to great earthquakes occurring along plate boundaries. In all these environments, the appropriate deployment of seismic sensors, i.e., directly on the rock sample, at the Earth's surface, or in boreholes close to the seismic sources, allows for the detection and location of brittle failure processes at a sufficiently low magnitude-detection threshold and with adequate spatial resolution for further analysis. One principal aim is to develop an improved understanding of the physical processes occurring at the seismic source and their relationship to the host geologic environment. In this paper we review selected case studies and future directions of PSM efforts across a wide range of scales and environments. These include induced failure within small rock samples, hydrocarbon reservoirs, and natural seismicity at convergent and transform plate boundaries. Each example represents a milestone with regard to bridging the gap between laboratory-scale experiments under controlled boundary conditions and large-scale field studies. The common motivation for all of these studies is to refine the understanding of how earthquakes nucleate, how they proceed, and how they interact in space and time. This is of special relevance at the larger end of the magnitude scale, i.e., for large devastating earthquakes, because of their severe socio-economic impact.

  1. Synergistic combination of systems for structural health monitoring and earthquake early warning for structural health prognosis and diagnosis

    NASA Astrophysics Data System (ADS)

    Wu, Stephen; Beck, James L.

    2012-04-01

    Earthquake early warning (EEW) systems are currently operating nationwide in Japan and are in beta-testing in California. Such a system detects an earthquake initiation using online signals from a seismic sensor network and broadcasts a warning of the predicted location and magnitude a few seconds to a minute or so before an earthquake hits a site. Such a system can be used synergistically with installed structural health monitoring (SHM) systems to enhance pre-event prognosis and post-event diagnosis of structural health. For pre-event prognosis, the EEW system information can be used to make probabilistic predictions of the anticipated damage to a structure using seismic loss estimation methodologies from performance-based earthquake engineering. These predictions can support decision-making regarding the activation of appropriate mitigation systems, such as stopping traffic from entering a bridge that has a predicted high probability of damage. Since the time between the warning and the arrival of strong shaking is very short, the probabilistic predictions must be calculated rapidly and the decision-making for mitigation actions automated. For post-event diagnosis, the SHM sensor data can be used in Bayesian updating of the probabilistic damage predictions, with the EEW predictions as a prior. Appropriate Bayesian methods for SHM have been published. In this paper, we use pre-trained surrogate models (or emulators) based on machine learning methods to make fast damage and loss predictions that are then used in a cost-benefit decision framework for activation of a mitigation measure. A simple illustrative example of an infrastructure application is presented.
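
    The cost-benefit activation rule implied above can be sketched as a simple expected-loss comparison; the probabilities and costs below are illustrative assumptions, not values from the paper.

      def activate_mitigation(p_damage, loss_if_damaged, mitigation_cost):
          """Return True if a mitigation action (e.g. stopping traffic onto a bridge)
          is worthwhile: the expected loss avoided exceeds the cost of acting.

          p_damage        : damage probability predicted from the EEW message
          loss_if_damaged : loss (in arbitrary monetary units) if damage occurs
                            and no mitigation is taken
          mitigation_cost : cost of the mitigation action itself
          All inputs here are illustrative assumptions.
          """
          return p_damage * loss_if_damaged > mitigation_cost

      # Example: the EEW-based prognosis gives a 15% chance of damage costing 2.0M
      # units, while closing the bridge briefly costs 0.05M units.
      print(activate_mitigation(0.15, 2.0e6, 0.05e6))   # True -> activate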

  2. Catalog of Earthquake Hypocenters at Alaskan Volcanoes: January 1 through December 31, 2007

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.

    2008-01-01

    Between January 1 and December 31, 2007, AVO located 6,664 earthquakes, of which 5,660 occurred within 20 kilometers of the 33 volcanoes monitored by the Alaska Volcano Observatory. Monitoring highlights in 2007 included the eruption of Pavlof Volcano; volcanic-tectonic earthquake swarms at the Augustine, Iliamna, and Little Sitkin volcanic centers; and the cessation of episodes of unrest at Fourpeaked Mountain, Mount Veniaminof, and the northern Atka Island volcanoes (Mount Kliuchef and Korovin Volcano). This catalog includes descriptions of: (1) the locations of seismic instrumentation deployed during 2007; (2) the earthquake detection, recording, analysis, and data archival systems; (3) the seismic velocity models used for earthquake locations; (4) a summary of the earthquakes located in 2007; and (5) an accompanying UNIX tar file with a summary of earthquake origin times, hypocenters, magnitudes, phase arrival times, location quality statistics, daily station usage statistics, and all files used to determine the earthquake locations in 2007.

  3. (abstract) GPS Monitoring of Crustal Deformation and the Earthquake Cycle in Costa Rica

    NASA Technical Reports Server (NTRS)

    Lundgren, Paul R.

    1994-01-01

    This paper will discuss the objectives, approach, and anticipated results of a study of earthquakes in Costa Rica. GPS measurements will be taken and field surveys will be made. Assessments of seismic strain accumulation and post-seismic deformation will be made in an effort to understand the effect these processes have on regional tectonic models.

  4. Drilling Across Active Faults in Deep Mines in South Africa for Monitoring Earthquake Processes in the Near-Field

    NASA Astrophysics Data System (ADS)

    Reches, Z.; Jordan, T. H.; Johnston, M. J.; Zoback, M.

    2005-12-01

    Deep mines can provide three-dimensional access to fault zones that are likely to be activated by the mining operations. To take advantage of this opportunity, we ranked 12 active faults in South African mines according to their dimensions, internal structure, accessibility, and the likelihood of M>3.0 seismic events. The selected site is on the Pretorius fault, which is 10 km long with a throw of 30-60 m, and which is exposed at multiple levels in the TauTona and Mponeng gold mines, Western Deep Levels. The DAFSAM-NELSAM project (earthquakes.ou.edu) focuses on establishing a natural earthquake laboratory along the Pretorius fault at 3.5 km depth in TauTona mine. Work at the site started in January 2005 and has so far been devoted to site characterization, including 3D mapping and in-situ stress measurements, and to drilling short holes for accelerometers and seismometers. Cross-fault drilling will begin in September 2005 and will include four boreholes, each 40-60 m long, for the installation of creepmeters, strain meters, temperature sensors, acoustic emission transducers, and gas analyzers. The monitoring site is established where both sides of the Pretorius fault are accessible; experience in the deep mines and numerical modeling predict a profound increase in seismic activity at the site during the next 2-4 years. The associated increase of shear stresses on the fault is expected to generate a few earthquakes of M>3.0 along segments of the Pretorius fault. We will present the features of the monitoring system and the main current results on fault characteristics and the state of stress.

  5. The continuous automatic monitoring network installed in Tuscany (Italy) since late 2002, to study earthquake precursory phenomena

    NASA Astrophysics Data System (ADS)

    Pierotti, Lisa; Cioni, Roberto

    2010-05-01

    Beginning in late 2002, a continuous automatic monitoring network (CAMN) was designed, built and installed in Tuscany (Italy) in order to investigate and define the geochemical response of the aquifers to the local seismic activity. The purpose of the investigation was to identify possible earthquake precursors. The CAMN consists of two groups of five measurement stations each. The first group was installed in the Serchio and Magra grabens (the Garfagnana and Lunigiana valleys, northern Tuscany), and the second in the area of Mt. Amiata (southern Tuscany), an extinct volcano. The Garfagnana, Lunigiana and Mt. Amiata regions belong to the inner zone of the Northern Apennine fold-and-thrust belt. This zone has been involved in post-collision extensional tectonics since the Upper Miocene-Pliocene. Such tectonic activity has produced horst and graben structures oriented from N-S to NW-SE that are transferred by NE-SW systems. Both Garfagnana (Serchio graben) and Lunigiana (Magra graben) belong to the innermost sector of the belt, where the seismic sources responsible for the strongest earthquakes of the northern Apennines are located (e.g. the M=6.5 earthquake of September 1920). The extensional processes in southern Tuscany have been accompanied by magmatic activity since the Upper Miocene, producing effusive and intrusive products traditionally attributed to the so-called Tuscan Magmatic Province. Mt. Amiata, whose magmatic activity ceased about 0.3 My ago, belongs to the extensional Tyrrhenian sector, which is characterized by high heat flow and crustal thinning. The whole zone is characterized by widespread but moderate seismicity (the maximum recorded magnitude has been 5.1, with epicentre at Piancastagnaio in 1919). The extensional regime in both the Garfagnana-Lunigiana and Mt. Amiata areas is confirmed by the focal mechanisms of recent earthquakes. An essential phase of the monitoring activities has been the selection of suitable sites for the installation of the monitoring stations. This has been carried out on the basis of: i) hydrogeologic and structural studies, in order to assess the underground fluid circulation regime; ii) a detailed geochemical study of all the natural manifestations present in the selected territories, such as cold and hot springs and gas emission zones; and iii) logistical aspects. A detailed hydrogeochemical survey was therefore performed in 2002. A total of 150 water points were sampled and analysed in the Garfagnana/Lunigiana area (NW Tuscany). Based on the results of this multidisciplinary study, five water points suitable for the installation of the monitoring stations were selected: Bagni di Lucca (Bernabò spring), Gallicano (Capriz spring) and Pieve Fosciana (Prà di Lama spring) in Garfagnana, and Equi Terme (the main spring feeding the swimming pool of the thermal resort) and Villafranca (the well feeding the public swimming pool) in Lunigiana. In the Amiata area, 69 water points were sampled and analyzed in the preliminary campaign and five sites were selected: Piancastagnaio, Santa Fiora, Pian dei Renai and Bagnore, which are fed by the volcanic aquifer, and the Bagno Vignoni borehole, which is fed by the evaporite-carbonate aquifer. The installation and start-up of the monitoring systems began in November 2002 in the Garfagnana-Lunigiana area and in June 2003 in the Monte Amiata region.
    Since the day of installation, periodic water sampling and manual measurement of the main physical and physicochemical parameters have been carried out on a monthly basis. This activity has the dual function of cross-checking the monitoring instrumentation and providing additional chemical and isotopic analyses. The continuous automatic monitoring stations operate with flowing water (about 5 litres per minute) and record the following parameters: temperature (T), pH, electrical conductivity (EC), redox potential (ORP) and the content of CO2 and CH4 dissolved in the water. Data are acquired once per second; the average value, median value and variance of the samples collected over a period of 5 min are recorded in local removable non-volatile memory (a Compact Flash card). Data can be downloaded both on site and remotely, via a GSM/GPRS modem connected to the embedded PC. The results of seven years of continuous monitoring can be summarised as follows: i) the monitoring stations made it possible to detect even small variations of the measured parameters, compared with equivalent commercial devices; ii) the acquired data made it possible to identify the groundwater circulation patterns; iii) in most locations, the observed trend of the acquired parameters is consistent with the periodic manual sampling results and confirms the mixture of different water types determined by the hydrogeochemical model. The absence of seismic events of sufficient energy precluded the possibility of locating anomalies, with two exceptions: the Equi Terme and Bagno Vignoni sites. At the Equi Terme station, an anomalous increase in the dissolved CO2 content was observed twelve days before a M=3.7 earthquake that occurred at a distance of 3 km north of the monitoring station. At the Bagno Vignoni station, an anomalous decrease in the temperature and electrical conductivity signals was observed nine days before a M=3.3 earthquake that occurred at a distance of 12 km east of the monitoring station. The CAMN has proved to be a suitable tool for investigating anomalous variations of the physical, physicochemical and chemical parameters of aquifer systems as earthquake precursors.
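
    The once-per-second acquisition with 5-minute mean/median/variance aggregation described above can be sketched as follows; the station logic and storage are simplified assumptions.

      import random
      import statistics

      def aggregate_window(samples):
          """Reduce one 5-minute block of 1 Hz readings (300 samples) to the three
          statistics the stations record: mean, median and variance."""
          return {
              "mean": statistics.fmean(samples),
              "median": statistics.median(samples),
              "variance": statistics.pvariance(samples),
          }

      def aggregate_stream(readings, window_s=300):
          """Group a sequence of once-per-second readings into consecutive 5-minute blocks."""
          for start in range(0, len(readings) - window_s + 1, window_s):
              yield aggregate_window(readings[start:start + window_s])

      # Example: 15 minutes of synthetic dissolved-CO2 readings (arbitrary units).
      random.seed(0)
      co2 = [450 + random.gauss(0, 2) for _ in range(900)]
      for block in aggregate_stream(co2):
          print(block)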

  6. 88 hours: The U.S. Geological Survey National Earthquake Information Center response to the 11 March 2011 Mw 9.0 Tohoku earthquake

    USGS Publications Warehouse

    Hayes, G.P.; Earle, P.S.; Benz, H.M.; Wald, D.J.; Briggs, R.W.

    2011-01-01

    For the first time in a formal journal publication, this article presents a timeline of the NEIC response to a major global earthquake. We outline the key observations of the earthquake made by the NEIC and its partner agencies, discuss how these analyses evolved, and describe when and how this information was released to the public and to other internal and external parties. Our goal in presenting this material is to provide a detailed explanation of the issues faced in the response to a rare, giant earthquake. We envisage that the timeline format of this presentation can highlight technical and procedural successes and shortcomings, which may in turn help prompt research by our academic partners and further improvements to our future response efforts. We show how NEIC response efforts have improved significantly over the six years since the great 2004 Sumatra-Andaman earthquake. We are optimistic that the research spawned from this disaster, and the unparalleled dense and diverse data sets that have been recorded, can lead to similar, and necessary, improvements in the future.

  7. Activity remotely triggered in volcanic and geothermal centers in California and Washington by the 3 November 2002 Mw=7.9 Alaska earthquake

    NASA Astrophysics Data System (ADS)

    Hill, D. P.; Prejean, S.; Oppenheimer, D.; Pitt, A. M.; Malone, S. D.; Richards-Dinger, K.

    2002-12-01

    The M=7.9 Alaska earthquake of 3 November 2002 was followed by bursts of remotely triggered earthquakes at several volcanic and geothermal areas across the western United States at epicentral distances of 2,500 to 3,660 km. Husen et al. (this session) describe the triggered response for Yellowstone caldera, Wyoming. Here we highlight the triggered response for the Geysers geothermal field in northern California, Mammoth Mountain and Long Valley caldera in eastern California, the Coso geothermal field in southeastern California, and Mount Rainier in central Washington. The onset of triggered seismicity at each of these areas began 15 to 17 minutes after the Alaska earthquake, during the S-wave coda and the early phases of the Love and Rayleigh waves, with periods of 5 to 40 seconds and dynamic strains of a few microstrain. In each case, the seismicity was characterized by spasmodic bursts of small (M<2), brittle-failure earthquakes. The activity persisted for just a few minutes at Mount Rainier and Mammoth Mountain and for roughly 30 minutes at the Geysers and Coso geothermal fields. Many of the triggered earthquakes were too small for reliable locations (magnitudes M<1), although their small S-P times indicate hypocentral locations within a few kilometers of the nearest seismic station. Borehole dilatometers in the vicinity of Mammoth Mountain recorded strain offsets on the order of 0.1 microstrain coincident in time with the triggered seismicity (Johnston et al., this session), and the water level in the 3-km-deep LVEW well in the center of Long Valley caldera dropped by ~13 cm during passage of the seismic wave train from the Alaska earthquake, followed by a gradual recovery. The Geysers, Coso, and Mount Rainier have no continuous, high-resolution strain instrumentation. A larger earthquake swarm that began 23.5 hours later (21:38 UT on the 4th) in the south moat of Long Valley caldera, and that included nine M>2 earthquakes and one M=3.0 earthquake, may represent a delayed response to the Alaska earthquake.

  8. NCEER seminars on earthquakes

    USGS Publications Warehouse

    Pantelic, J.

    1987-01-01

    In May of 1986, the National Center for Earthquake Engineering Research (NCEER) in Buffalo, New York, held the first seminar in its new monthly forum called Seminars on Earthquakes. The Center's purpose in initiating the seminars was to educate the audience about earthquakes, to facilitate cooperation between the NCEER and visiting researchers, and to enable visiting speakers to learn more about the NCEER.

  9. The 2010 Mw 8.8 Maule megathrust earthquake of Central Chile, monitored by GPS.

    PubMed

    Vigny, C; Socquet, A; Peyrat, S; Ruegg, J-C; Métois, M; Madariaga, R; Morvan, S; Lancieri, M; Lacassin, R; Campos, J; Carrizo, D; Bejar-Pizarro, M; Barrientos, S; Armijo, R; Aranda, C; Valderas-Bermejo, M-C; Ortega, I; Bondoux, F; Baize, S; Lyon-Caen, H; Pavez, A; Vilotte, J P; Bevis, M; Brooks, B; Smalley, R; Parra, H; Baez, J-C; Blanco, M; Cimbaro, S; Kendrick, E

    2011-06-17

    Large earthquakes produce crustal deformation that can be quantified by geodetic measurements, allowing for the determination of the slip distribution on the fault. We used data from Global Positioning System (GPS) networks in Central Chile to infer the static deformation and the kinematics of the 2010 moment magnitude (M(w)) 8.8 Maule megathrust earthquake. From elastic modeling, we found a total rupture length of ~500 kilometers where slip (up to 15 meters) concentrated on two main asperities situated on both sides of the epicenter. We found that rupture reached shallow depths, probably extending up to the trench. Resolvable afterslip occurred in regions of low coseismic slip. The low-frequency hypocenter is relocated 40 kilometers southwest of initial estimates. Rupture propagated bilaterally at about 3.1 kilometers per second, with possible but not fully resolved velocity variations. PMID:21527673

  10. A summary of ground motion effects at SLAC (Stanford Linear Accelerator Center) resulting from the Oct 17th 1989 earthquake

    SciTech Connect

    Ruland, R.E.

    1990-08-01

    Ground motions resulting from the October 17th 1989 (Loma Prieta) earthquake are described and can be correlated with some geologic features of the SLAC site. Recent deformations of the linac are also related to slow motions observed over the past 20 years. Measured characteristics of the earthquake are listed. Some effects on machine components and detectors are noted. 18 refs., 16 figs.

  11. A framework for rapid post-earthquake assessment of bridges and restoration of transportation network functionality using structural health monitoring

    NASA Astrophysics Data System (ADS)

    Omenzetter, Piotr; Ramhormozian, Shahab; Mangabhai, Poonam; Singh, Ravikash; Orense, Rolando

    2013-04-01

    Quick and reliable assessment of the condition of bridges in a transportation network after an earthquake can greatly assist immediate post-disaster response and long-term recovery. However, experience shows that available resources, such as qualified inspectors and engineers, will typically be stretched for such tasks. Structural health monitoring (SHM) systems can therefore make a real difference in this context. SHM, however, needs to be deployed in a strategic manner and integrated into the overall disaster response plans and actions to maximize its benefits. The first part of this study presents a framework for how this can be achieved. Since it will not be feasible, or indeed necessary, to use SHM on every bridge, bridges within individual networks must be prioritized for SHM deployment. A methodology for such prioritization, based on the structural and geotechnical seismic risks affecting bridges and their importance within a network, is proposed in the second part. An example applying the methodology to selected bridges in the medium-sized transportation network of Wellington, New Zealand is provided. The third part of the paper is concerned with using monitoring data for quick assessment of bridge condition and damage after an earthquake. Depending on the bridge risk profile, it is envisaged that data will be obtained from either local or national seismic monitoring arrays or from SHM systems installed on bridges. A method using artificial neural networks is proposed for inferring key ground motion parameters at an arbitrary bridge site from seismic array data. The methodology is applied to seismic data collected in Christchurch, New Zealand. Finally, how such ground motion parameters can be used in bridge damage and condition assessment is outlined.
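
    The neural-network step in the third part can be illustrated with a minimal sketch that maps peak ground accelerations observed at array stations to a (toy) value at a bridge site; the data are synthetic and the network architecture is an assumption, not the one used in the paper.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(2)

      # Synthetic training set: peak ground acceleration (PGA) at three array
      # stations, and a "true" PGA at a bridge site taken to be a noisy weighted
      # combination of them (a stand-in for real attenuation behaviour).
      n_events = 500
      pga_stations = rng.lognormal(mean=-2.0, sigma=0.6, size=(n_events, 3))
      pga_bridge = (0.5 * pga_stations[:, 0] + 0.3 * pga_stations[:, 1]
                    + 0.2 * pga_stations[:, 2]) * rng.lognormal(0.0, 0.1, n_events)

      model = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=3000, random_state=0)
      model.fit(np.log(pga_stations[:-50]), np.log(pga_bridge[:-50]))

      # Inference for 50 held-out "events": predicted vs. synthetic PGA at the bridge.
      pred = np.exp(model.predict(np.log(pga_stations[-50:])))
      print(np.column_stack([pred[:5], pga_bridge[-50:][:5]]))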

  12. Catalog of earthquake hypocenters at Alaskan Volcanoes: January 1 through December 31, 2010

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.; Searcy, Cheryl K.

    2011-01-01

    Between January 1 and December 31, 2010, the Alaska Volcano Observatory (AVO) located 3,405 earthquakes, of which 2,846 occurred within 20 kilometers of the 33 volcanoes with seismograph subnetworks. There was no significant seismic activity in 2010 at these monitored volcanic centers. Seismograph subnetworks with severe outages in 2009 were repaired in 2010 resulting in three volcanic centers (Aniakchak, Korovin, and Veniaminof) being relisted in the formal list of monitored volcanoes. This catalog includes locations and statistics of the earthquakes located in 2010 with the station parameters, velocity models, and other files used to locate these earthquakes.

  13. Medical Relief Response by Miyako Public Health Center after the Great East Japan Earthquake and Tsunami, 2011.

    PubMed

    Yanagihara, Hiroki

    2016-01-01

    Objectives: To improve disaster preparedness, we investigated the medical relief activities managed by the Iwate Prefectural Miyako Public Health Center during the post-acute phase of the Great East Japan Earthquake and Tsunami of March 11, 2011. Methods: The study divided the post-disaster period into three approximate time segments: Period I (time of the disaster through late March), Period II (mid-April), and Period III (end of May in Miyako City, early July in Yamada Town). We reviewed records on medical relief activities conducted by medical assistance teams (MATs) in Miyako City and Yamada Town. Results: Miyako Public Health Center organized meetings to coordinate medical relief activities from Period I to Period III. According to the demand for medical services and the recovery of the local medical institutions (LMIs) in the affected area, MATs were deployed and active at evacuation centers in each assigned area. The number of patients examined by MATs in Miyako rose to approximately 250 people per day in Period I and decreased to 100 in Period III. In Yamada, however, the number surged to 700 in Period I, fell to 100 in Period II, and decreased to 50 in Period III. This difference can be partly explained as follows. In Miyako, most evacuees consulted LMIs that had restarted medical services after the disaster, and the number of restarted LMIs had already reached 29 (94% of the total) in Period I. In Yamada, most evacuees who had consulted MATs in Period I had largely moved to the LMIs that restarted in Period II. During the same time, roles were divided and the provision of medical services coordinated, with MATs mainly in charge of primary emergency triage, in response to the number of restarted LMIs, which reached 1 (20%) in Period I and 3 (60%) in Period II. After Period III, more than 80% of patients in Miyako had only minor complaints, such as a need for health guidance, and the number of people requiring emergency medical transport returned to pre-disaster levels in both locations. These results suggest that the demand for medical services among evacuees declined to a stable level early in Period III. On the basis of these findings, supporting the recovery of local medical institutions earlier might be justified, allowing medical relief activities to be concluded appropriately. Conclusion: This study offers useful perspectives on the conduct of medical relief activities during the post-acute phase after a disaster and on the importance of establishing information management systems that apply these perspectives. PMID:26971453

  14. Hatfield Marine Science Center Dynamic Revetment Project DSL permit #45455-FP, Monitoring Report February, 2013

    EPA Science Inventory

    A Dynamic Revetment (gravel beach) was installed in November, 2011 on the shoreline along the northeastern edge of the Hatfield Marine Science Center (HMSC) to mitigate erosion that threatened HMSC critical infrastructure. Shoreline topographic and biological monitoring was init...

  15. Monitoring of the Permeable Pavement Demonstration Site at the Edison Environmental Center (Poster)

    EPA Science Inventory

    This is a poster on the permeable pavement parking lot at the Edison Environmental Center. The monitoring scheme for the project is discussed in-depth with graphics explaining the instrumentation installed at the site.

  16. Hatfield Marine Science Center Dynamic Revetment Project DSL permit #45455-FP, Monitoring Report February 2012

    EPA Science Inventory

    A Dynamic Revetment (gravel beach) was installed in November, 2011 on the shoreline along the northeastern edge of the Hatfield Marine Science Center (HMSC) to mitigate erosion that threatened HMSC critical infrastructure. Shoreline topographic and biological monitoring was init...

  17. Reducing atmospheric noise in RST analysis of TIR satellite radiances for satellite monitoring of earthquake-prone areas

    NASA Astrophysics Data System (ADS)

    Lisi, Mariano; Filizzola, Carolina; Genzano, Nicola; Paciello, Rossana; Pergola, Nicola; Tramutoli, Valerio

    Space-time fluctuations of the Earth's emitted Thermal Infrared (TIR) radiation, observed from satellite from months to weeks before an earthquake, have been reported in several studies. Among others, a Robust Satellite data analysis Technique (RST) was proposed (and applied to different satellite sensors in various geo-tectonic contexts) to discriminate anomalous signal transients possibly associated with earthquake occurrence from normal TIR signal fluctuations due to other possible causes (e.g. the solar diurnal-annual cycle, meteorological conditions, changes in observational conditions, etc.). Variations in satellite view angle depending on satellite passes (for polar satellites) and atmospheric water vapour fluctuations were recognized in the past as the main factors affecting the residual signal variability, reducing the overall signal-to-noise (S/N) ratio and the potential of the RST-based approach in identifying seismically related thermal anomalies. In this paper we address both factors for the first time, applying the RST approach to geostationary satellites (which guarantee stable view angles) and using Land Surface Temperature (LST) data products (which are less affected by atmospheric water vapour variability) instead of just TIR radiances at the sensor. The first results, obtained in the case of the Abruzzo earthquake (6 April 2009, MW ∼ 6.3) by analyzing 6 years of SEVIRI (Spinning Enhanced Visible and Infrared Imager, on board the geostationary Meteosat Second Generation satellite) LST products provided by EUMETSAT, seem to confirm the greater sensitivity of the proposed approach in detecting perturbations of the Earth's thermal emission a few days before the main shock. The results achieved in terms of increased S/N ratio (in validation) and reduced "false alarm" rate (in confutation) are discussed by comparing results obtained by applying RST to LST products with those achieved by applying an identical RST analysis (using the same MSG-SEVIRI 2005-2010 data set) to the simple TIR radiances at the sensor.
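
    A minimal sketch of the kind of pixel-wise normalized anomaly index computed in RST-style analyses is given below; the actual RST index definitions used in the paper are more elaborate, so the code is only an assumption-laden illustration.

      import numpy as np

      def rst_style_anomaly(current_img, historical_imgs):
          """Pixel-wise normalized TIR/LST anomaly index.

          current_img     : 2-D array, the current LST (or TIR) field
          historical_imgs : 3-D array (years, rows, cols) of co-located fields taken
                            at the same calendar time/slot in previous years
          Returns how many standard deviations each pixel departs from its
          multi-year behaviour (a simplified stand-in for the RST index).
          """
          # Remove the scene-wide average at each time to reduce common-mode effects
          # such as large-scale warming or cooling of the whole scene.
          hist_excess = historical_imgs - historical_imgs.mean(axis=(1, 2), keepdims=True)
          curr_excess = current_img - current_img.mean()
          mu = hist_excess.mean(axis=0)
          sigma = hist_excess.std(axis=0)
          return (curr_excess - mu) / np.maximum(sigma, 1e-6)

      # Synthetic example: six "years" of 50x50 LST fields plus a current field
      # containing a small warm patch.
      rng = np.random.default_rng(3)
      hist = 290 + rng.normal(0, 1.5, size=(6, 50, 50))
      curr = 290 + rng.normal(0, 1.5, size=(50, 50))
      curr[20:25, 20:25] += 6.0
      index = rst_style_anomaly(curr, hist)
      print("pixels with index > 3:", int((index > 3).sum()))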

  18. Monitoring the Corniglio Landslide (Parma, Italy) before and after the M=5.4 earthquake of December 2008

    NASA Astrophysics Data System (ADS)

    Virdis, S.; Guastaldi, E.; Rindinella, A.; Disperati, L.; Ciulli, A.

    2009-04-01

    In this work we present the results of monitoring the Corniglio landslide (CL), a large landslide located in the Northern Apennines, by integrating traditional geomorphological and geological surveys, digital photogrammetry, GPS and geostatistics. The CL spreads over an area of about 3 km x 1 km, close to Corniglio village (Parma, Italy). We propose a new kinematic framework for the CL as a deep-seated gravitational slope deformation (DSGSD). Surveys were carried out in six periods: July and September 2006, March and August 2007, July 2008 (after a M=4 earthquake of 28 December 2007, 10 km from Corniglio), and finally January 2009 (after several earthquakes that occurred in the last days of December 2008, with magnitudes from 4 to 5.4 and epicentres located less than 30 km from Corniglio). Geological surveys, interpretation of orthophotographs from 1976, 1988, 1994, 1996, 1998 and 2005, and satellite imagery from 2003 were integrated to analyse the state of activity of the landslide from 1976 to 2009 and to quantify the ground displacement vectors. An RTK GPS survey was periodically carried out in order to locate the crown of the main landslide scarp and to identify reactivation of the CL after the earthquakes of the end of December 2008. Kriged multitemporal maps representing the azimuth and modulus of the ground displacement vectors were then built by evaluating the displacement over time of homologous ground targets on the multitemporal remotely sensed images. Ground deformation was measured on imagery covering the periods from December 1994 to July 1996 and from October to November 1996, as well as the recurrent activity from October 1998 to 2003. In some sectors of the main body of the landslide we estimated a total of 70 m of ground displacement. The fieldwork results and photogeological interpretation performed along the Bratica valley, to the east of the CL, suggest that the occurrence of rigid-behaviour lithotypes (the Mt. Caio calcareous flysch of Upper Campanian - Maastrichtian age and the Oligocene Arenarie del Bratica) over both the plastic, low-shear-strength chaotic deposits of brownish clays ("Melange di Lago" formation, upper Campanian - middle Eocene) and marly clays ("Argille e Calcari" formation, middle Lutetian) represents a critical setting for the stability of the area. Furthermore, relevant east-west trending lineaments seem to be involved in the slope movements. This evidence suggests that the CL may be part of a larger DSGSD that also includes the hill lying between the Bratica river, the CL main body and the Parma river. The earthquakes affecting this area periodically reactivate the main body of the landslide.

  19. GREENHOUSE GAS (GHG) MITIGATION AND MONITORING TECHNOLOGY PERFORMANCE: ACTIVITIES OF THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper discusses greenhouse gas (GHG) mitigation and monitoring technology performance activities of the GHG Technology Verification Center. The Center is a public/private partnership between Southern Research Institute and the U.S. EPA's Office of Research and Development. It...

  20. Seismic Monitoring and Post-Seismic Investigations following the 12 January 2010 Mw 7.0 Haiti Earthquake (Invited)

    NASA Astrophysics Data System (ADS)

    Altidor, J.; Dieuseul, A.; Ellsworth, W. L.; Given, D. D.; Hough, S. E.; Janvier, M. G.; Maharrey, J. Z.; Meremonte, M. E.; Mildor, B. S.; Prepetit, C.; Yong, A.

    2010-12-01

    We report on ongoing efforts to establish seismic monitoring in Haiti. Following the devastating M7.0 Haiti earthquake of 12 January 2010, the Bureau des Mines et de l’Energie worked with the U.S. Geological Survey and other scientific institutions to investigate the earthquake and to better assess hazard from future earthquakes. We deployed several types of portable instruments to record aftershocks: strong-motion instruments within Port-au-Prince to investigate the variability of shaking due to local geological conditions, and a combination of weak-motion, strong-motion, and broadband instruments around the Enriquillo-Plantain Garden fault (EPGF), primarily to improve aftershock locations and to lower the magnitude threshold of aftershock recording. A total of twenty instruments were deployed, including eight RefTek instruments and nine strong-motion (K2) accelerometers deployed in Port-au-Prince in collaboration with the USGS, and three additional broadband stations deployed in the epicentral region in collaboration with the University of Nice. Five K2s have remained in operation in Port-au-Prince since late June; in late June two instruments were installed in Cap-Haitien and Port de Paix in northern Haiti to provide monitoring of the Septentrional fault. A permanent strong-motion (NetQuakes) instrument was deployed in late June at the US Embassy. Five additional NetQuakes instruments will be deployed by the BME in late 2010/early 2011. Additionally, the BME has collaborated with other scientific institutions, including Columbia University, the Institut Géophysique du Globe, University of Nice, the University of Texas at Austin, and Purdue University, to conduct other types of investigations. These studies include, for example, sampling of uplifted corals to establish a chronology of prior events in the region of the Enriquillo-Plantain Garden fault, surveys of geotechnical properties to develop microzonation maps of metropolitan Port-au-Prince, surveys of damage to public buildings, and a continuation of GPS surveys to measure co- and post-seismic displacements in collaboration with researchers from Purdue University. Preliminary analysis of aftershock recordings and damage surveys reveals that local site effects contributed significantly to the damage in some neighborhoods of Port-au-Prince. However, in general, bad construction practices and high population density were the primary causes of the extent of the damage and the high number of fatalities.

  1. (Stanford Linear Accelerator Center) annual environmental monitoring report, January--December 1989

    SciTech Connect

    Not Available

    1990-05-01

    This progress report discusses environmental monitoring activities at the Stanford Linear Accelerator Center for 1989. Topics include climate, site geology, site water usage, land use, demography, unusual events or releases, radioactive and nonradioactive releases, compliance summary, environmental nonradiological program information, environmental radiological program information, groundwater protection monitoring and quality assurance. 5 figs., 7 tabs. (KJD)

  2. On the Potential Uses of Static Offsets Derived From Low-Cost Community Instruments and Crowd-Sourcing for Earthquake Monitoring and Rapid Response

    NASA Astrophysics Data System (ADS)

    Minson, S. E.; Brooks, B. A.; Murray, J. R.; Iannucci, R. A.

    2013-12-01

    We explore the efficacy of low-cost community instruments (LCCIs) and crowd-sourcing to produce rapid estimates of earthquake magnitude and rupture characteristics that can be used for earthquake loss reduction, such as issuing tsunami warnings and guiding rapid response efforts. Real-time high-rate GPS data are just beginning to be incorporated into earthquake early warning (EEW) systems. These data are showing promising utility, including producing moment magnitude estimates that do not saturate for the largest earthquakes and determining the geometry and slip distribution of the earthquake rupture in real time. However, building a network of scientific-quality real-time high-rate GPS stations requires substantial infrastructure investment, which is not practicable in many parts of the world. To expand the benefits of real-time geodetic monitoring globally, we consider the potential of pseudorange-based GPS locations, such as the real-time positioning done onboard cell phones or on LCCIs that could be distributed in the same way accelerometers are distributed as part of the Quake Catcher Network (QCN). While location information from LCCIs often has large uncertainties, their low cost means that large numbers of instruments can be deployed. A monitoring network that includes smartphones could collect data from potentially millions of instruments. These observations could be averaged together to substantially decrease the errors associated with estimated earthquake source parameters. While these data will be inferior to data recorded by scientific-grade seismometers and GPS instruments, there are features of community-based data collection (and possibly analysis) that are very attractive. This approach creates a system in which every user can host an instrument, or download an application to their smartphone, that both provides them with earthquake and tsunami warnings and contributes the data on which the warning system operates. This symbiosis encourages people both to become users of the warning system and to contribute data to it. Further, there is some potential to take advantage of the LCCI hosts' computing and communications resources to do some of the analysis required for the warning system. We will present examples of the type of data that might be observed by pseudorange-based positioning for both actual earthquakes and laboratory tests, as well as performance tests of potential earthquake source modeling derived from pseudorange data. A highlight of these performance tests is a case study of the 2011 Mw 9 Tohoku-oki earthquake.
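
    The error-averaging argument above can be made concrete with a short sketch; the static offset and single-receiver noise level below are hypothetical, chosen only to show the roughly 1/sqrt(N) scaling.

      import numpy as np

      rng = np.random.default_rng(4)

      true_offset_east = 0.80   # metres of coseismic static offset (illustrative)
      sigma_single = 3.0        # metres of position noise for one low-cost
                                # pseudorange-based receiver (an assumption)

      for n_instruments in (1, 100, 10_000):
          # Each instrument reports the true offset plus independent noise;
          # averaging N reports shrinks the error roughly as sigma / sqrt(N).
          reports = true_offset_east + rng.normal(0.0, sigma_single, n_instruments)
          print(f"N={n_instruments:>6}: estimate = {reports.mean():6.2f} m, "
                f"expected std ~ {sigma_single / np.sqrt(n_instruments):.2f} m")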

  3. Structural Health Monitoring Sensor Development at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Prosser, W. H.; Wu, M. C.; Allison, S. G.; DeHaven, S. L.; Ghoshal, A.

    2002-01-01

    NASA is applying considerable effort on the development of sensor technology for structural health monitoring (SHM). This research is targeted toward increasing the safety and reliability of aerospace vehicles, while reducing operating and maintenance costs. Research programs are focused on applications to both aircraft and space vehicles. Sensor technologies under development span a wide range including fiber-optic sensing, active and passive acoustic sensors, electromagnetic sensors, wireless sensing systems, MEMS, and nanosensors. Because of their numerous advantages for aerospace applications, fiber-optic sensors are one of the leading candidates and are the major focus of this presentation. In addition, recent advances in active and passive acoustic sensing will also be discussed.

  4. Earthquake Testing

    NASA Technical Reports Server (NTRS)

    1979-01-01

    During NASA's Apollo program, it was necessary to subject the mammoth Saturn V launch vehicle to extremely forceful vibrations to assure the moon booster's structural integrity in flight. Marshall Space Flight Center assigned vibration testing to a contractor, the Scientific Services and Systems Group of Wyle Laboratories, Norco, California. Wyle-3S, as the group is known, built a large facility at Huntsville, Alabama, and equipped it with an enormously forceful shock and vibration system to simulate the liftoff stresses the Saturn V would encounter. Saturn V is no longer in service, but Wyle-3S has found spinoff utility for its vibration facility. It is now being used to simulate earthquake effects on various kinds of equipment, principally equipment intended for use in nuclear power generation. Government regulations require that such equipment demonstrate its ability to survive earthquake conditions. In the upper left photo, Wyle-3S is preparing to conduct an earthquake test on a 25-ton diesel generator built by Atlas Polar Company, Ltd., Toronto, Canada, for emergency use in a Canadian nuclear power plant. Being readied for test in the lower left photo is a large circuit breaker to be used by Duke Power Company, Charlotte, North Carolina. Electro-hydraulic and electro-dynamic shakers in and around the pit simulate earthquake forces.

  5. A cost effective wireless structural health monitoring network for buildings in earthquake zones

    NASA Astrophysics Data System (ADS)

    Pentaris, F. P.; Stonham, J.; Makris, J. P.

    2014-10-01

    The design, programming and implementation of a cost-effective wireless structural health monitoring system (wSHMs) are presented; the system is able to monitor seismic and/or man-made accelerations in buildings. It operates as a sensor network exploiting internet connections that commonly already exist, aiming to monitor the structural health of the buildings in which it is installed. A key feature of wSHMs is that it can be implemented in wide area network mode to cover many remote structures and buildings on a metropolitan scale. Acceleration data can be sent in real time from dozens of buildings across a broad metropolitan area to a central database, where they are analyzed in order to detect possible structural damage or nonlinear characteristics and to issue alerts about the unsuitability of specific structures.

  6. Earthquake history of Texas

    USGS Publications Warehouse

    von Hake, C. A.

    1977-01-01

    Seventeen earthquakes, intensity V or greater, have centered in Texas since 1882, when the first shock was reported. The strongest earthquake, a maximum intensity VIII, was in western Texas in 1931 and was felt over 1,165,000 square kilometers. Three shocks in the Panhandle region in 1925, 1936, and 1943 were widely felt.

  7. Monitoring shallow resistivity changes prior to the 12 May 2008 M 8.0 Wenchuan earthquake on the Longmen Shan tectonic zone, China

    NASA Astrophysics Data System (ADS)

    Lu, Jun; Xie, Tao; Li, Mei; Wang, Yali; Ren, Yuexia; Gao, Shude; Wang, Lanwei; Zhao, Jialiu

    2016-04-01

    Active-source measurements of shallow resistivity using fixed-electrode quasi-Schlumberger arrays have been conducted at the Pixian, Jiangyou and Wudu stations on the Longmen Shan tectonic zone in western China, with the hope of detecting earthquake-associated changes. Over the duration of the monitoring experiment, a gradual decrease in apparent resistivity of up to 6.7%, beginning several years prior to the 12 May 2008 M 8.0 Wenchuan earthquake, was clearly recorded at Pixian station, approximately 35 km from the epicenter. The change in apparent resistivity was monitored with a fixed Schlumberger array with AB/MN spacings of 736 m/226 m oriented N57.5°E, giving precisions in the measured daily averages of 0.16% or less. A coseismic resistivity drop of up to 5.3% was observed at Jiangyou station, using a Schlumberger array with AB/MN spacings of 710 m/90 m oriented N10°E. No fluctuation of resistivity was detected at Wudu station at the time of the Wenchuan mainshock. While the focus of this paper is on monitoring and tracking resistivity variations prior to, during, and after the Wenchuan earthquake, we also compare the resistivity records of the Wenchuan earthquake with those of the M 7.8 Tangshan and M 7.2 Songpan earthquakes of 1976. Attempts to explain the observed resistivity variations have been made. The results show that the resistivity variations observed at all three stations are in approximate agreement with the resistivity-stress behavior deduced from in situ experiments, focal mechanisms, a simplified dynamical model, static stress analyses, and field investigations along the Longmen Shan fault zone.
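
    For context, the apparent resistivity measured with a symmetric Schlumberger array follows from the standard geometric factor of the electrode spacings; the sketch below uses that textbook expression with the Pixian array geometry, while the voltage, current, and resulting values are purely illustrative assumptions.

      import math

      def schlumberger_apparent_resistivity(ab, mn, delta_v, current):
          """Apparent resistivity (ohm-m) for a symmetric Schlumberger array.

          ab, mn  : full current-electrode (AB) and potential-electrode (MN)
                    spacings in metres
          delta_v : measured potential difference in volts
          current : injected current in amperes
          Uses the standard geometric factor K = pi*((AB/2)^2 - (MN/2)^2)/MN.
          """
          k = math.pi * ((ab / 2.0) ** 2 - (mn / 2.0) ** 2) / mn
          return k * delta_v / current

      # Pixian array geometry (AB/MN = 736 m / 226 m) with a hypothetical reading
      # of 0.12 V at 2 A; these electrical values are not from the paper.
      rho_a = schlumberger_apparent_resistivity(736.0, 226.0, 0.12, 2.0)
      print(f"apparent resistivity ~ {rho_a:.1f} ohm-m")
      # A 6.7% decrease, as reported before the Wenchuan earthquake, would appear as:
      print(f"after a 6.7% drop     ~ {rho_a * (1 - 0.067):.1f} ohm-m")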

  8. A model for earthquake acceleration monitoring with wireless sensor networks in a structure

    NASA Astrophysics Data System (ADS)

    Fujiwara, Takahiro; Nakamura, Yugo; Jinno, Kousei; Matsubara, Taku; Uehara, Hideyuki

    2014-03-01

    Wireless sensor network (WSN) technologies have attracted much attention for collecting damage information in a natural disaster. WSNs that monitor temperature or humidity usually collect data once every few seconds or minutes. Structural health monitoring (SHM), meanwhile, aims to diagnose the state of a structure based on detected acceleration, so WSNs are a promising technology for collecting acceleration data. One concern in employing WSNs for SHM is detecting phenomena at a high sampling rate under energy-aware conditions. In this paper, we describe a model for seismic acceleration monitoring configured with multi-layer networks: WSNs, a wireless distribution system (WDS) and a database server, where the WDS operates mainly over a wireless local area network (WLAN). Examining the performance of the monitoring system in a test bed, the results showed that the system was capable of collecting acceleration at a rate of 100 samples per second (sps) even under intermittent operation, and of storing the data in a database. We also suggest that intermittent operation with an appropriate sampling rate, chosen by considering the response motion of a structure, is effective in providing long-term operation of the system.
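
    The intermittent operation at 100 sps can be sketched as local buffering with burst transmission; the batch length and node logic below are assumptions, not the protocol used in the paper.

      from collections import deque

      SAMPLING_RATE = 100      # samples per second, as in the test bed
      BATCH_SECONDS = 10       # hypothetical: the radio wakes up every 10 s

      class SensorNode:
          """Toy model of a WSN node that samples continuously but transmits in bursts."""

          def __init__(self):
              self.buffer = deque()

          def on_sample(self, t, accel):
              # Called 100 times per second by the ADC; samples are only buffered.
              self.buffer.append((t, accel))

          def maybe_transmit(self, send):
              # Called periodically; empties the buffer in one burst so the radio
              # can sleep between bursts (energy-aware, intermittent operation).
              if len(self.buffer) >= SAMPLING_RATE * BATCH_SECONDS:
                  batch = [self.buffer.popleft() for _ in range(len(self.buffer))]
                  send(batch)

      def fake_send(batch):
          print(f"sent {len(batch)} samples covering {batch[0][0]:.2f}-{batch[-1][0]:.2f} s")

      node = SensorNode()
      for i in range(SAMPLING_RATE * 25):          # 25 s of synthetic operation
          node.on_sample(i / SAMPLING_RATE, 0.0)
          node.maybe_transmit(fake_send)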

  9. Earthquake history of Oregon

    USGS Publications Warehouse

    von Hake, C. A.

    1976-01-01

    Although situated between two States (California and Washington) that have had many violent earthquakes, Oregon is noticeably less active seismically. The greatest damage experienced resulted from a major shock near Olympia, Wash., in 1949. During the short historical record available (since 1841), 34 earthquakes of intensity V, Modified Mercalli Scale, or greater have centered within Oregon or near its borders. Only 13 of the earthquakes had an intensity above V, and many of the shocks were local. However, a 1936 earthquake in the eastern Oregon-Washington region caused extensive damage and was felt over an area of 272,000 square kilometers.

  10. Earthquake history of Vermont

    USGS Publications Warehouse

    von Hake, C. A.

    1977-01-01

    Seven earthquakes of intensity V or greater on the Modified Mercalli Scale (MM) are known to have originated within Vermont. Many additional shocks centered in other New England States and Canada have been strongly felt in Vermont. 

  11. Permeable Pavement Monitoring at the Edison Environmental Center Demonstration Site - presentation

    EPA Science Inventory

    The EPA's Urban Watershed Management Branch has been monitoring an instrumented 110-space pervious pavement parking lot. The lot is used by EPA personnel and visitors to the Edison Environmental Center. The design includes 28-space rows of three permeable pavement types: asphal...

  12. Permeable Pavement Monitoring at the Edison Environmental Center Demonstration Site - Abstract

    EPA Science Inventory

    The EPA's Urban Watershed Management Branch (UWMB) is monitoring an instrumented, working, 110-space pervious pavement parking lot at EPA's Edison Environmental Center (EEC). Permeable pavement systems are classified as stormwater best management practices (BMPs) which reduce runo...

  13. Permeable Pavement Monitoring at the Edison Environmental Center Demonstration Site - Abstract

    EPA Science Inventory

    The EPA’s Urban Watershed Management Branch (UWMB) is monitoring an instrumented, working, 110-space pervious pavement parking lot at EPA’s Edison Environmental Center (EEC). Permeable pavement systems are classified as stormwater best management practices (BMPs) which reduce runo...

  14. Permeable Pavement Monitoring at the Edison Environmental Center Demonstration Site - presentation

    EPA Science Inventory

    The EPA’s Urban Watershed Management Branch has been monitoring an instrumented 110-space pervious pavement parking lot. The lot is used by EPA personnel and visitors to the Edison Environmental Center. The design includes 28-space rows of three permeable pavement types: asphal...

  15. Earthquake Monitoring at 9 deg 50'N on the East Pacific Rise: Latest Results and Implications for Integrated Models

    NASA Astrophysics Data System (ADS)

    Doermann, L.; Waldhauser, F.; Tolstoy, M.

    2008-12-01

    Ocean bottom seismograph (OBS) data were recorded continuously between October 2003 and January 2007 at the Ridge 2000 Bull's Eye site at 9°50'N on the East Pacific Rise (EPR), using a 4 x 4 km array of up to 12 instruments with approximately annual turnaround. These data have provided exciting insights into fundamental processes at fast-spreading ridges, including volcanism and hydrothermal circulation. They also provide critical linkages for understanding the geological, chemical and biological data at this site. Results from the first OBS deployment have shown that we are able to monitor microseismicity on a fine enough scale to image the fundamental structure of a hydrothermal circulation cell, and we have identified an on-axis down-flow zone and a hydrothermal cracking front overlying the axial magma chamber (Tolstoy et al., 2008). Our results show that hydrothermal circulation at the EPR is dominantly along-axis, with narrowly focused down-flow at small kinks in the axial summit trough (AST). There appear to be two distinct circulation cells within the 9°49'N-9°51'N area, and these correlate well with temperature, chemical and biological observations. The rate of seismic events recorded at the array was ~2 orders of magnitude higher than anticipated based on prior results from this area (>320,000 events recorded versus ~4,500 anticipated), and therefore the processing task is considerable. In addition to hand-picking phase arrival times for periods of particular interest, we are also working on improved automatic detection tools to speed up processing of the data from the remaining years, and on the use of waveform cross-correlation to improve event locations. Preliminary results to date suggest that the basic structure imaged in the 2003-2004 earthquake data persists, with seismicity rates continuing to climb leading up to the January 2006 eruption. We will present the most recent earthquake locations and discuss how they fit into results from the 2003-2004 data, as well as the implications for integrated models at this site.
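
    The waveform cross-correlation step mentioned above can be illustrated with a minimal sketch that measures the differential arrival time between two similar waveforms; the signals are synthetic, not OBS data.

      import numpy as np

      def cross_correlation_delay(trace_a, trace_b, fs):
          """Delay (s) of trace_b relative to trace_a that maximizes their
          cross-correlation; positive means trace_b arrives later. Such
          differential times are used to refine relative event locations."""
          a = (trace_a - trace_a.mean()) / trace_a.std()
          b = (trace_b - trace_b.mean()) / trace_b.std()
          cc = np.correlate(a, b, mode="full")
          lag_samples = np.argmax(cc) - (len(b) - 1)
          return -lag_samples / fs

      # Synthetic example: the same wavelet recorded twice, 0.05 s apart, with noise.
      fs = 200.0
      t = np.arange(0.0, 2.0, 1.0 / fs)
      wavelet = np.exp(-((t - 1.0) ** 2) / 0.001) * np.sin(2 * np.pi * 15 * (t - 1.0))
      rng = np.random.default_rng(5)
      trace1 = wavelet + rng.normal(0, 0.02, t.size)
      trace2 = np.roll(wavelet, int(0.05 * fs)) + rng.normal(0, 0.02, t.size)
      print(f"measured delay: {cross_correlation_delay(trace1, trace2, fs):.3f} s")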

  16. Real-time monitoring of earthquake-prone areas by RST analysis of satellite TIR radiances: results of continuous monitoring over the Italy and Turkey regions.

    NASA Astrophysics Data System (ADS)

    Tramutoli, V.; Filizzola, C.; Genzano, N.; Lisi, M.; Paciello, R.; Pergola, N.

    2012-04-01

    Meteorological satellites, offering global coverage, continuity of observations and long-term time series (starting as much as 30 years ago), offer a unique possibility not only to learn from the past but also to guarantee continuous monitoring where other observation technologies are lacking because they are too expensive or (as in the case of earthquake precursor studies) considered useless by decision-makers. Space-time fluctuations of the Earth's emitted Thermal Infrared (TIR) radiation have been observed from satellite in the months to weeks before earthquake occurrence. The general RST approach has been proposed (since 2001) in order to discriminate normal TIR signal fluctuations (i.e. those related to changes in natural factors and/or observation conditions) from anomalous signal transients possibly associated with earthquake occurrence. Since then, several earthquakes that occurred in Europe, Africa and America have been studied by analyzing decades of satellite observations, always using a validation/confutation approach in order to verify the presence/absence of anomalous space-time TIR transients in the presence/absence of significant seismic activity. In the framework of the PRE-EARTHQUAKES EU-FP7 project (www.pre-earthquakes.org), starting in October 2010 (and still continuing), the RST approach has been applied to MSG/SEVIRI data to generate TIR anomaly maps over the Italian peninsula, continuously for all the midnight slots. In September 2011 the same monitoring activity (also still continuing) was started for the Turkey region. For the first time, a similar analysis has been performed in real time, systematically analyzing TIR anomaly maps in order to identify, day by day, possible significant (e.g. persistent in the space-time domain) thermal anomalies. During 2011, only in a very few cases (one in Italy in July and two in the Turkish region in September and November) did the day-by-day analysis highlight significant anomalies; in two cases these were communicated to the other PRE-EARTHQUAKES partners for their attention. In this paper, results of this analysis are presented, which seem to confirm results independently achieved (unfortunately without mutual awareness) by other authors applying a similar approach to EOS/MODIS data over the California region.

  17. Exploring Earthquakes in Real-Time

    NASA Astrophysics Data System (ADS)

    Bravo, T. K.; Kafka, A. L.; Coleman, B.; Taber, J. J.

    2013-12-01

    Earthquakes capture the attention of students and inspire them to explore the Earth. Adding the ability to view and explore recordings of significant and newsworthy earthquakes in real-time makes the subject even more compelling. To address this opportunity, the Incorporated Research Institutions for Seismology (IRIS), in collaboration with Moravian College, developed 'jAmaSeis', a cross-platform application that enables students to access real-time earthquake waveform data. Students can watch as the seismic waves are recorded on their computer, and can be among the first to analyze the data from an earthquake. jAmaSeis facilitates student-centered investigations of seismological concepts using either a low-cost educational seismograph or streamed data from other educational seismographs or from any seismic station that sends data to the IRIS Data Management System. After an earthquake, students can analyze the seismograms to determine characteristics of earthquakes such as time of occurrence, distance from the epicenter to the station, magnitude, and location. The software has been designed to provide graphical clues to guide students in the analysis and assist in their interpretations. Since jAmaSeis can simultaneously record up to three stations from anywhere on the planet, there are numerous opportunities for student-driven investigations. For example, students can explore differences in the seismograms from different distances from an earthquake and compare waveforms from different azimuthal directions. Students can simultaneously monitor seismicity at a tectonic plate boundary and in the middle of the plate regardless of their school location. This can help students discover for themselves the ideas underlying seismic wave propagation, regional earthquake hazards, magnitude-frequency relationships, and the details of plate tectonics. The real-time nature of the data keeps the investigations dynamic, and offers students countless opportunities to explore.
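
    One of the analyses described above, estimating the epicentral distance from the S-minus-P arrival-time difference, reduces to a one-line formula. The sketch below is a classroom-style approximation with assumed average crustal velocities, not the algorithm built into jAmaSeis.

      def epicentral_distance_km(sp_seconds, vp=6.0, vs=3.5):
          # Distance from the S-P time assuming constant P and S speeds (km/s):
          # d = (tS - tP) * vp * vs / (vp - vs), about 8.4 km per second of S-P.
          return sp_seconds * vp * vs / (vp - vs)

      print(epicentral_distance_km(30.0))   # a 30 s S-P time gives ~252 km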

  18. Lessons learned from the introduction of autonomous monitoring to the EUVE science operations center

    NASA Technical Reports Server (NTRS)

    Lewis, M.; Girouard, F.; Kronberg, F.; Ringrose, P.; Abedini, A.; Biroscak, D.; Morgan, T.; Malina, R. F.

    1995-01-01

    The University of California at Berkeley's (UCB) Center for Extreme Ultraviolet Astrophysics (CEA), in conjunction with NASA's Ames Research Center (ARC), has implemented an autonomous monitoring system in the Extreme Ultraviolet Explorer (EUVE) science operations center (ESOC). The implementation was driven by a need to reduce operations costs and has allowed the ESOC to move from continuous, three-shift, human-tended monitoring of the science payload to a one-shift operation in which the off shifts are monitored by an autonomous anomaly detection system. This system includes Eworks, an artificial intelligence (AI) payload telemetry monitoring package based on RTworks, and Epage, an automatic paging system to notify ESOC personnel of detected anomalies. In this age of shrinking NASA budgets, the lessons learned on the EUVE project are useful to other NASA missions looking for ways to reduce their operations budgets. The process of knowledge capture, from the payload controllers for implementation in an expert system, is directly applicable to any mission considering a transition to autonomous monitoring in their control center. The collaboration with ARC demonstrates how a project with limited programming resources can expand the breadth of its goals without incurring the high cost of hiring additional, dedicated programmers. This dispersal of expertise across NASA centers allows future missions to easily access experts for collaborative efforts of their own. Even the criterion used to choose an expert system has widespread impacts on the implementation, including the completion time and the final cost. In this paper we discuss, from inception to completion, the areas where our experiences in moving from three shifts to one shift may offer insights for other NASA missions.

  19. Environmental assessment of the Carlsbad Environmental Monitoring and Research Center Facility

    SciTech Connect

    1995-10-01

    This Environmental Assessment has been prepared to determine if the Carlsbad Environmental Monitoring and Research Center (the Center) or its alternatives would have significant environmental impacts that must be analyzed in an Environmental Impact Statement. DOE's proposed action is to continue funding the Center. While DOE is not funding construction of the planned Center facility, operation of that facility is dependent upon continued funding. To implement the proposed action, the Center would initially construct a facility of approximately 2,300 square meters (25,000 square feet). The Phase 1 laboratory facilities and parking lot would occupy approximately 1.2 hectares (3 acres) of the approximately 8.9 hectares (22 acres) of land which were donated to New Mexico State University (NMSU) for this purpose. The facility would contain laboratories to analyze chemical and radioactive materials typical of potential contaminants that could occur in the environment in the vicinity of the DOE Waste Isolation Pilot Plant (WIPP) site or other locations. The facility also would have bioassay facilities to measure radionuclide levels in the general population and in employees of the WIPP. Operation of the Center would meet the DOE requirement for independent monitoring and assessment of environmental impacts associated with the planned disposal of transuranic waste at the WIPP.

  20. Program Evaluation of Remote Heart Failure Monitoring: Healthcare Utilization Analysis in a Rural Regional Medical Center

    PubMed Central

    Keberlein, Pamela; Sorenson, Gigi; Mohler, Sailor; Tye, Blake; Ramirez, A. Susana; Carroll, Mark

    2015-01-01

    Abstract Background: Remote monitoring for heart failure (HF) has had mixed and heterogeneous effects across studies, necessitating further evaluation of remote monitoring systems within specific healthcare systems and their patient populations. “Care Beyond Walls and Wires,” a wireless remote monitoring program to facilitate patient and care team co-management of HF patients, served by a rural regional medical center, provided the opportunity to evaluate the effects of this program on healthcare utilization. Materials and Methods: Fifty HF patients admitted to Flagstaff Medical Center (Flagstaff, AZ) participated in the project. Many of these patients lived in underserved and rural communities, including Native American reservations. Enrolled patients received mobile, broadband-enabled remote monitoring devices. A matched cohort was identified for comparison. Results: HF patients enrolled in this program showed substantial and statistically significant reductions in healthcare utilization during the 6 months following enrollment, and these reductions were significantly greater compared with those who declined to participate but not when compared with a matched cohort. Conclusions: The findings from this project indicate that a remote HF monitoring program can be successfully implemented in a rural, underserved area. Reductions in healthcare utilization were observed among program participants, but reductions were also observed among a matched cohort, illustrating the need for rigorous assessment of the effects of HF remote monitoring programs in healthcare systems. PMID:25025239

  1. Evaluating the Imbalance Between Increasing Hemodialysis Patients and Medical Staff Shortage After the Great East Japan Earthquake: Report From a Hemodialysis Center Near the Fukushima Nuclear Power Plants.

    PubMed

    Koshiba, Takaaki; Nishiuchi, Takamitsu; Akaihata, Hidenori; Haga, Nobuhiro; Kojima, Yoshiyuki; Kubo, Hajime; Kasahara, Masato; Hayashi, Masayuki

    2016-04-01

    The Great East Japan Earthquake in 2011 caused an unprecedented imbalance between an increasing number of hemodialysis patients and a medical staff shortage in the Sousou area, the site of the Fukushima nuclear power plants. In 2014, the capacity of our hemodialysis center reached a critical limit due to such an imbalance. We attempted to evaluate the effort of medical staff to clarify to what extent their burden had increased post-disaster. The ratio of total dialysis sessions over total working days of medical staff was determined as an approximate indicator of effort per month. The mean value of each year was compared. Despite fluctuations of the ratio, the mean value did not differ from 2010 to 2013. However, the ratio steadily increased in 2014, and there was a significant increase in the mean value. This proposed indicator of the effort of medical staff appears to reflect what we experienced, although its validity must be carefully examined in future studies. PMID:26935477

  2. Products and Services Available from the Southern California Earthquake Data Center (SCEDC) and the Southern California Seismic Network (SCSN)

    NASA Astrophysics Data System (ADS)

    Chen, S. E.; Yu, E.; Bhaskaran, A.; Chowdhury, F. R.; Meisenhelter, S.; Hutton, K.; Given, D.; Hauksson, E.; Clayton, R. W.

    2011-12-01

    Currently, the SCEDC archives continuous and triggered data from nearly 8400 data channels from 425 SCSN recorded stations, processing and archiving an average of 6.4 TB of continuous waveforms and 12,000 earthquakes each year. The SCEDC provides public access to these earthquake parametric and waveform data through its website www.data.scec.org and through client applications such as STP and DHI. This poster will describe the most significant developments at the SCEDC during 2011. New website design: The SCEDC has revamped its website. The changes make it easier for users to search the archive and discover updates and new content, and they also improve our ability to manage and update the site. New data holdings: Post-processing on the El Mayor-Cucapah M7.2 sequence continues; to date, 11,847 events have been reviewed, and updates are available in the earthquake catalog immediately. A double-difference catalog (Hauksson et al., 2011) spanning 1981 to 6/30/11 will be available for download at www.data.scec.org and via STP. A focal mechanism catalog determined by Yang et al. (2011) is available for distribution at www.data.scec.org. Waveforms from Southern California NetQuake stations are now being stored in the SCEDC archive and are available via STP as event-associated waveforms; amplitudes from these stations are also being stored in the archive and used by ShakeMap. As part of a NASA/AIST project in collaboration with JPL and SIO, the SCEDC will receive real-time 1 sps streams of GPS displacement solutions from the California Real Time Network (http://sopac.ucsd.edu/projects/realtime; Genrich and Bock, 2006, J. Geophys. Res.); these channels will be archived at the SCEDC as miniSEED waveforms, which can then be distributed to the user community via applications such as STP. Improvements in the user tool STP: STP SAC output now includes picks from the SCSN. New archival methods: The SCEDC is exploring the feasibility of archiving and distributing waveform data using cloud computing such as Google Apps. A month of continuous data from the SCEDC archive will be stored in Google Apps and a client developed to access it in a manner similar to STP. The data are stored in miniSEED format with gzip compression; time gaps between time series are padded with null values, which substantially increases search efficiency by making the records uniform in length.
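
    The gap-padding idea behind the cloud-archival test, filling time gaps with null values so that every record spans a uniform window, can be sketched as follows. This is a hypothetical NumPy illustration; the SCEDC client and its actual null-value convention are not reproduced here.

      import numpy as np

      def pad_to_uniform(segments, n_total, fill=np.nan):
          # segments: list of (start_sample, samples) pairs for one channel-day;
          # returns a fixed-length array with gaps filled by a null value.
          out = np.full(n_total, fill, dtype=np.float32)
          for start, samples in segments:
              out[start:start + len(samples)] = samples
          return out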

  3. Environmental monitoring and research at the John F. Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Hall, C. R.; Hinkle, C. R.; Knott, W. M.; Summerfield, B. R.

    1992-01-01

    The Biomedical Operations and Research Office at the NASA John F. Kennedy Space Center has been supporting environmental monitoring and research since the mid-1970s. Program elements include monitoring of baseline conditions to document natural variability in the ecosystem, assessments of operations and construction of new facilities, and ecological research focusing on wildlife habitat associations. Information management is centered around development of a computerized geographic information system that incorporates remote sensing and digital image processing technologies along with traditional relational data base management capabilities. The proactive program is one in which the initiative is to anticipate potential environmental concerns before they occur and, by utilizing in-house expertise, develop impact minimization or mitigation strategies to reduce environmental risk.

  4. Earthquake history of Mississippi

    USGS Publications Warehouse

    von Hake, C. A.

    1974-01-01

    Since its admission into the Union in 1817, Mississippi has had only four earthquakes of intensity V or greater within its borders. Although the number of earthquakes known to have been centered within Mississippi's boundaries is small, the State has been affected by numerous shocks located in neighboring States. In 1811 and 1812, a series of great earthquakes near the New Madrid, Missouri, area was felt in Mississippi as far south as the gulf coast. The New Madrid series caused the banks of the Mississippi River to cave in as far as Vicksburg, more than 300 miles from the epicentral region. As a result of this great earthquake series, the northwest corner of Mississippi is in seismic risk zone 3, the highest risk zone. Except for the New Madrid series, effects in Mississippi from earthquakes located outside of the State have been less than intensity V.

  5. Earthquake history of Pennsylvania

    USGS Publications Warehouse

    von Hake, C. A.

    1976-01-01

    Records of early earthquakes in the Northeastern United States provide limited information on effects in Pennsylvania until 1737, 55 years after the first permanent settlement was established. A very severe earthquake that centered in the St. Lawrence River region in 1663 may have been felt in Pennsylvania, but historical accounts are not definite. Likewise, a damaging shock at Newbury, Mass., in 1727 probably affected towns in Pennsylvania. A strong earthquake on December 18, 1737, toppled chimneys at New York City and was reported felt at Boston, Mass., Philadelphia, Pa., and New Castle, Del. Other shocks with origins outside the State were felt in 1758, 1783, and 1791. Since 1800, when two earthquakes (March 17 and November 29) were reported as "severe" at Philadelphia, 16 tremors of intensity V or greater (Modified Mercalli Scale) have originated within the State. On November 11 and 14, 1840, severe earthquakes at Philadelphia were accompanied by a great and unusual swell on the Delaware River.

  6. Source Process of the Mw 5.0 Au Sable Forks, New York, Earthquake Sequence from Local Aftershock Monitoring Network Data

    NASA Astrophysics Data System (ADS)

    Kim, W.; Seeber, L.; Armbruster, J. G.

    2002-12-01

    On April 20, 2002, a Mw 5 earthquake occurred near the town of Au Sable Forks, northeastern Adirondacks, New York. The quake caused moderate damage (MMI VII) around the epicentral area, and it was well recorded by over 50 broadband stations at distances of 70 to 2000 km in eastern North America. Regional broadband waveform data are used to determine the source mechanism and focal depth using a moment tensor inversion technique. The source mechanism indicates predominantly thrust faulting along a 45°-dipping fault plane striking due south. The mainshock was followed by at least three strong aftershocks with local magnitude (ML) greater than 3, and about 70 aftershocks were detected and located in the first three months by a 12-station portable seismographic network. The aftershock distribution clearly delineates the mainshock rupture on the westerly dipping fault plane at a depth of 11 to 12 km. Preliminary analysis of the aftershock waveform data indicates that the orientation of the P-axis rotated 90° from that of the mainshock, suggesting a complex source process of the earthquake sequence. We achieved an important milestone in monitoring earthquakes and evaluating their hazards through rapid cross-border (Canada-US) and cross-regional (Central US-Northeastern US) collaborative efforts. Hence, staff at Instrument Software Technology, Inc. near the epicentral area joined Lamont-Doherty staff and deployed the first portable station in the epicentral area; CERI dispatched two of their technical staff to the epicentral area with four accelerometers and a broadband seismograph; the IRIS/PASSCAL facility shipped three digital seismographs and ancillary equipment within one day of the request; and the POLARIS Consortium, Canada, sent a field crew of three with a near real-time, satellite-telemetry-based earthquake monitoring system. The POLARIS station, KSVO, powered by a solar panel and batteries, was already transmitting data to the central hub in London, Ontario, Canada, within a day after the field crew arrived in the Au Sable Forks area. This collaboration allowed us to maximize the scarce resources available for monitoring this damaging earthquake and its aftershocks in the Northeastern U.S.

  7. Communication infrastructure in a contact center for home care monitoring of chronic disease patients.

    PubMed Central

    Maglaveras, N.; Gogou, G.; Chouvarda, I.; Koutkias, V.; Lekka, I.; Giaglis, G.; Adamidis, D.; Karvounis, C.; Louridas, G.; Goulis, D.; Avramidis, A.; Balas, E. A.

    2002-01-01

    The Citizen Health System (CHS) is a European Commission (EC) funded project in the field of IST for Health. Its main goal is to develop a generic contact center which in its pilot stage can be used in the monitoring, treatment and management of chronically ill patients at home in Greece, Spain and Germany. Such contact centers, which can use any type of communication technology, and can provide timely and preventive prompting to the patients are envisaged in the future to evolve into well-being contact centers providing services to all citizens. In this paper, we present the structure of such a generic contact center and in particular the telecommunication infrastructure, the communication protocols and procedures, and finally the educational modules that are integrated into this contact center. We discuss the procedures followed for two target groups of patients where two randomized control clinical trials are under way, namely diabetic patients with obesity problems, and congestive heart failure patients. We present examples of the communication means between the contact center medical personnel and these patients, and elaborate on the educational issues involved. PMID:12463870

  8. RAPID: Collaboration Results from Three NASA Centers in Commanding/Monitoring Lunar Assets

    NASA Technical Reports Server (NTRS)

    Torres, R. Jay; Allan, Mark; Hirsh, Robert; Wallick, Michael N.

    2009-01-01

    Three NASA centers are working together to address the challenge of operating robotic assets in support of human exploration of the Moon. This paper describes the combined work to date of the Ames Research Center (ARC), Jet Propulsion Laboratory (JPL) and Johnson Space Center (JSC) on a common support framework to control and monitor lunar robotic assets. We discuss how we have addressed specific challenges, including time-delayed operations and geographically distributed collaborative monitoring and control, to build an effective architecture for integrating a heterogeneous collection of robotic assets into a common framework. We describe the design of the Robot Application Programming Interface Delegate (RAPID) architecture that effectively addresses the problem of interfacing a family of robots including the JSC Chariot, ARC K-10 and JPL ATHLETE rovers. We report on lessons learned from the June 2008 field test in which RAPID was used to monitor and control all of these assets. We conclude by discussing some future directions to extend the RAPID architecture to add further support for NASA's lunar exploration program.

  9. The meteorological monitoring system for the Kennedy Space Center/Cape Canaveral Air Station

    NASA Technical Reports Server (NTRS)

    Dianic, Allan V.

    1994-01-01

    The Kennedy Space Center (KSC) and Cape Canaveral Air Station (CCAS) are involved in many weather-sensitive operations. Manned and unmanned vehicle launches, which occur several times each year, are obvious examples of operations whose success and safety are dependent upon favorable meteorological conditions. Other operations involving NASA, Air Force, and contractor personnel, including daily operations to maintain facilities, refurbish launch structures, prepare vehicles for launch, and handle hazardous materials, are less publicized but are no less weather-sensitive. The Meteorological Monitoring System (MMS) is a computer network which acquires, processes, disseminates, and monitors near real-time and forecast meteorological information to assist operational personnel and weather forecasters with the task of minimizing the risk to personnel, materials, and the surrounding population. CLIPS has been integrated into the MMS to provide quality control analysis and data monitoring. This paper describes aspects of the MMS relevant to CLIPS including requirements, actual implementation details, and results of performance testing.

  10. The in-the-ear recording concept: user-centered and wearable brain monitoring.

    PubMed

    Looney, David; Kidmose, Preben; Park, Cheolsoo; Ungstrup, Michael; Rank, Mike; Rosenkranz, Karin; Mandic, Danilo

    2012-01-01

    The integration of brain monitoring based on electroencephalography (EEG) into everyday life has been hindered by the limited portability and long setup time of current wearable systems as well as by the invasiveness of implanted systems (e.g. intracranial EEG). We explore the potential to record EEG in the ear canal, leading to a discreet, unobtrusive, and user-centered approach to brain monitoring. The in-the-ear EEG (Ear-EEG) recording concept is tested using several standard EEG paradigms, benchmarked against standard on-scalp EEG, and its feasibility proven. Such a system promises a number of advantages, including fixed electrode positions, user comfort, robustness to electromagnetic interference, feedback to the user, and ease of use. The Ear-EEG platform could also support additional biosensors, extending its reach beyond EEG to provide a powerful health-monitoring system for those applications that require long recording periods in a natural environment. PMID:23247157

  11. Acoustic monitoring of earthquakes along the Blanco Transform Fault zone and Gorda Plate and their tectonic implications

    NASA Astrophysics Data System (ADS)

    Dziak, Robert Paul

    Hydroacoustic tertiary (T-) waves are seismically generated acoustic waves that propagate over great distances in the ocean sound channel with little loss in signal strength. Hydrophone-recorded T-waves can provide a lower earthquake detection threshold and an improved epicenter location accuracy for oceanic earthquakes than land-based seismic networks. Thus detection and location of NE Pacific ocean earthquakes along the Blanco Transform Fault (BTFZ) and Gorda plate using the U.S. Navy's SOSUS (SOund SUrveillance System) hydrophone arrays afford greater insight into the current state of stress and crustal deformation mechanics than previously available. Acoustic earthquake information combined with bathymetry, submersible observations, earthquake source-parameter estimates, petrologic samples, and water-column chemistry renders a new tectonic view of the southern Juan de Fuca plate boundaries. Chapter 2 discusses development of seismo-acoustic analysis techniques using the well-documented April 1992 Cape Mendocino earthquake sequence. Findings include a hydrophone detection threshold estimate (M ~ 2.4), and T-wave propagation path modeling to approximate earthquake acoustic source energy. Empirical analyses indicate that acoustic energy provides a reasonable magnitude and seismic moment estimate of oceanic earthquakes not detected by seismic networks. Chapter 3 documents a probable volcanogenic T-wave event swarm along a pull-apart basin within the western BTFZ during January 1994. Response efforts yielded evidence of anomalous water-column 3He concentrations, pillow-lava volcanism, and the first discovery of active hydrothermal vents along an oceanic fracture zone. Chapter 4 discusses the detection of a NE-SW trending microearthquake band along the mid-Gorda plate which was active from initiation of SOSUS recording in August 1991 through July 1992, then abruptly ceased. It is proposed that the eventual termination of the Gorda plate seismicity band is due to strain reduction associated with the Cape Mendocino earthquake sequence. Chapter 5 combines bathymetric, hydro-acoustic, seismic, submersible, and gravity data to investigate the active tectonics of the transform-parallel Blanco Ridge (BR), along the eastern BTFZ. The BR formation mechanism preferred here is uplift through strike-slip motion (with a normal component) followed by formation and intrusion of mantle-derived serpentinized-peridotite into the shallow ocean crust. The conclusion considers a potential link between the deformation patterns observed along the BTFZ and Gorda plate regions.

  12. Earthquake history of Nebraska

    USGS Publications Warehouse

    von Hake, C. A.

    1974-01-01

    Nebraska is in a region of moderate seismicity occasionally punctuated by rather strong earthquakes. Most of the State is in seismic risk zone 1, with a small part in the southeast corner in risk zone 2. The first significant earthquake felt in Nebraska occurred in 1867, the year that statehood was achieved. The tremor occurred on April 24, 1867, and was apparently centered near Lawrence, Kansas. It affected an area estimated at 780,000 km2, including much of Nebraska. Since 1867, at least seven earthquakes of intensity V or greater have originated within Nebraska's boundaries. Several strong earthquakes centered in neighboring States have also been felt over limited portions of Nebraska. None of these caused damage.

  13. Earthquake history of Tennessee

    USGS Publications Warehouse

    von Hake, C. A.

    1977-01-01

    The western part of the State was shaken strongly by the New Madrid, Mo., earthquakes of 1811-12 and by earthquakes in 1843 and 1895. The area has also experienced minor shocks. Additional activity has occurred in the eastern part of the State, near the North Carolina border. Forty shocks of intensity V (Modified Mercalli scale) or greater have been cataloged as occurring within the State. Many other earthquakes centered in bordering States have affected points in Tennessee. The following summary covers only those shocks of intensity VI or greater.

  14. Earthquake history of Wyoming

    USGS Publications Warehouse

    von Hake, C. A.

    1978-01-01

    Forty-five earthquakes of moderate intensity (V or greater on the Modified Mercalli Intensity Scale (MM)) and extent have originated in Wyoming from 1894 to 1976. Many shocks have occurred in Yellowstone National Park, including an intensity VII event in June 1975. The 1959 Hebgen Lake, Mont., earthquake, centered just west of the park, was felt (MM VII) in northwestern Wyoming. Many aftershocks from this earthquake were reported in Yellowstone Park (MM V-VI) through December 1959, and numerous shocks of lesser intensities continued through 1963.

  15. The “NetBoard”: Network Monitoring Tools Integration for INFN Tier-1 Data Center

    NASA Astrophysics Data System (ADS)

    De Girolamo, D.; dell'Agnello, L.; Zani, S.

    2012-12-01

    The monitoring and alert system is fundamental for the management and the operation of the network in a large data center such as an LHC Tier-1. The network of the INFN Tier-1 at CNAF is a multi-vendor environment: for its management and monitoring, several tools have been adopted and different sensors have been developed. In this paper, after an overview of the different aspects to be monitored and the tools used for them (i.e. MRTG, Nagios, Arpwatch, NetFlow, Syslog, etc.), we will describe the “NetBoard”, a monitoring toolkit developed at the INFN Tier-1. NetBoard, developed for a multi-vendor network, is able to install and auto-configure all the tools needed for network monitoring, either via a network device discovery mechanism, via a configuration file, or via a wizard. In this way, we are also able to activate different types of sensors and Nagios checks according to the equipment vendor specifications. Moreover, when a new device is connected to the LAN, NetBoard can detect where it is plugged in. Finally, the NetBoard web interface provides the overall status of the entire network “at a glance”: both local and geographical (including the LHCOPN and the LHCONE) link utilization, the health status of network devices (with active alerts), and flow analysis.
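
    As a simplified illustration of the auto-configuration idea, the sketch below renders Nagios host definitions from a list of discovered devices. The device list, template fields and output layout are assumptions made for illustration; they do not reproduce NetBoard's actual discovery mechanism or checks.

      # Hypothetical discovery output: device name, management address, vendor.
      devices = [
          {"name": "sw-core-01", "address": "10.0.0.1", "vendor": "vendor-a"},
          {"name": "sw-edge-07", "address": "10.0.0.7", "vendor": "vendor-b"},
      ]

      HOST_TEMPLATE = """define host {{
          use        generic-host
          host_name  {name}
          alias      {vendor} device {name}
          address    {address}
      }}
      """

      def render_nagios_hosts(devices):
          # Emit one Nagios host object per discovered device.
          return "\n".join(HOST_TEMPLATE.format(**d) for d in devices)

      print(render_nagios_hosts(devices))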

  16. Estimated airborne release of plutonium from the 102 Building at the General Electric Vallecitos Nuclear Center, Vallecitos, California, as a result of damage from severe wind and earthquake hazard

    SciTech Connect

    Mishima, J.; Ayer, J.E.; Hays, I.D.

    1980-12-01

    This report estimates the potential airborne releases of plutonium as a consequence of various severities of earthquake and wind hazard postulated for the 102 Building at the General Electric Vallecitos Nuclear Center in California. The releases are based on damage scenarios developed by other specialists. The hazard severities presented range up to a nominal velocity of 230 mph for wind hazard and are in excess of 0.8 g linear acceleration for earthquakes. The consequences of thrust faulting are considered. The approaches and factors used to estimate the releases are discussed. Release estimates range from 0.003 to 3 g Pu.

  17. Wilson Corners SWMU 001 2014 Annual Long Term Monitoring Report Kennedy Space Center, Florida

    NASA Technical Reports Server (NTRS)

    Langenbach, James

    2015-01-01

    This document presents the findings of the 2014 Long Term Monitoring (LTM) that was completed at the Wilson Corners site, located at the National Aeronautics and Space Administration (NASA) John F. Kennedy Space Center (KSC), Florida. The goals of the 2014 annual LTM event were to evaluate the groundwater flow direction and gradient and to monitor the vertical and downgradient horizontal extent of the volatile organic compounds (VOCs) in groundwater at the site. The LTM activities consisted of an annual groundwater sampling event in December 2014, which included the collection of water levels from the LTM wells. During the annual groundwater sampling event, depth to groundwater was measured and VOC samples were collected using passive diffusion bags (PDBs) from 30 monitoring wells. In addition to the LTM sampling, additional assessment sampling was performed at the site using low-flow techniques based on previous LTM results and assessment activities. Assessment of monitoring well MW0052DD was performed by collecting VOC samples using low-flow techniques before and after purging 100 gallons from the well. Monitoring well MW0064 was sampled to supplement shallow VOC data north of Hot Spot 2 and east of Hot Spot 4. Monitoring well MW0089 was sampled due to its proximity to MW0090. MW0090 is screened in a deeper interval and had an unexpected detection of trichloroethene (TCE) during the 2013 LTM, which was corroborated during the March 2014 verification sampling. Monitoring well MW0130 was sampled to provide additional VOC data beneath the semi-confining clay layer in the Hot Spot 2 area.

  18. Ghana's experience in the establishment of a national data center

    NASA Astrophysics Data System (ADS)

    Ekua, Amponsah Paulina; Yaw, Serfor-Armah

    2012-08-01

    The government of Ghana, in a bilateral agreement with the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), has established a National Data Center in Ghana with the aim of monitoring the testing of nuclear explosions. Seismic, hydroacoustic, radionuclide and infrasound methods are used for the monitoring. The data center was commissioned on 3 February 2010 at the Ghana Atomic Energy Commission. At present Ghana does not have any operational, centralised data (seismic, hydroacoustic, radionuclide and infrasound) acquisition system with the capability of accessing data from other international stations. Hence the need to set up the National Data Center, which enables us to constantly monitor, manage and coordinate both natural and man-made seismic activities in the country and around the globe, to upload data to the International Data Center (IDC), and to receive and use International Monitoring System (IMS) data and IDC products for treaty verification and compliance. Apart from these, the center also accesses and analyzes seismic waveforms relevant to its needs from the International Data Center; makes data available to its stakeholder institutions for earthquake disaster mitigation; reports on all aspects of earthquake-related disasters to the relevant government agencies that deal with disasters; makes recommendations to the government of Ghana on earthquake safety measures; and provides information to assist government institutions to develop appropriate land and building policies. The center, in collaboration with stakeholder agencies, periodically organises public lectures on earthquake disaster risk mitigation.

  19. Response to the great East Japan earthquake of 2011 and the Fukushima nuclear crisis: the case of the Laboratory Animal Research Center at Fukushima Medical University.

    PubMed

    Katahira, Kiyoaki; Sekiguchi, Miho

    2013-01-01

    A magnitude 9.0 great earthquake, the 2011 off the Pacific coast of Tohoku Earthquake, occurred on March 11, 2011, and the subsequent Fukushima Daiichi Nuclear Power Station (Fukushima NPS) accidents raised ambient radiation levels around the campus of Fukushima Medical University (FMU). FMU is located in Fukushima City, 57 km to the northwest of Fukushima NPS. Due to temporary failure of the steam boilers, the air conditioning system for the animal rooms, all autoclaves, and a cage washer could not be used at the Laboratory Animal Research Center (LARC) of FMU. The outside air temperature dropped to zero overnight, and the temperature inside the animal rooms fell to 10°C for several hours. We placed sterilized nesting materials inside all cages to encourage the rodents to create nests. The main water supply was cut off for 8 days in all, while the supply of steam and hot water remained unavailable for 12 days. It took 20 days to restore the air conditioning system to normal operation at the facility. We measured radiation levels in the animal rooms to confirm the safety of care staff and researchers. On April 21, May 9, and June 17, the average radiation levels at a central work table in the animal rooms with HEPA filters were 46.5, 44.4, and 43.4 cpm, respectively, which is equal to the background level of the equipment. We sincerely hope our experiences will be a useful reference regarding crisis management for the many institutes that house laboratory animals. PMID:23615301

  20. Statistical analysis of the induced Basel 2006 earthquake sequence: introducing a probability-based monitoring approach for Enhanced Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Bachmann, C. E.; Wiemer, S.; Woessner, J.; Hainzl, S.

    2011-08-01

    Geothermal energy is becoming an important clean energy source; however, the stimulation of a reservoir for an Enhanced Geothermal System (EGS) is associated with seismic risk due to induced seismicity. Seismicity occurring due to the water injection at depth has to be well recorded and monitored. To mitigate the seismic risk of a damaging event, an appropriate alarm system needs to be in place for each individual experiment. In recent experiments, the so-called traffic-light alarm system, based on public response, local magnitude and peak ground velocity, was used. We aim to improve the pre-defined alarm system by introducing a probability-based approach; we retrospectively model the ongoing seismicity in real time with multiple statistical forecast models and then translate the forecast to seismic hazard in terms of probabilities of exceeding a ground motion intensity level. One class of models accounts for the water injection rate, the main parameter that can be controlled by the operators during an experiment. By translating the models into time-varying probabilities of exceeding various intensity levels, we provide tools which are well understood by the decision makers and can be used to determine non-exceedance thresholds during a reservoir stimulation; this, however, remains an entrepreneurial or political decision of the responsible project coordinators. We introduce forecast models based on the data set of an EGS experiment in the city of Basel. Between 2006 December 2 and 8, approximately 11,500 m3 of water was injected into a 5-km-deep well at high pressures. A six-sensor borehole array was installed by the company Geothermal Explorers Limited (GEL) at depths between 300 and 2700 m around the well to monitor the induced seismicity. The network recorded approximately 11,200 events during the injection phase, more than 3500 of which were located. With the traffic-light system, actions were implemented after an ML 2.7 event: the water injection was reduced and then stopped after another ML 2.5 event. A few hours later, an earthquake with ML 3.4, felt within the city, occurred, which led to bleed-off of the well. A risk study was later issued with the outcome that the experiment could not be resumed. We analyse the statistical features of the sequence and show that the sequence is well modelled with the Omori-Utsu law following the termination of water injection. Based on this model, the sequence will last 31 +29/-14 years to reach the background level. We introduce statistical models based on Reasenberg and Jones and Epidemic Type Aftershock Sequence (ETAS) models, commonly used to model aftershock sequences. We compare and test different model setups to simulate the sequences, varying the number of fixed and free parameters. For one class of the ETAS models, we account for the flow rate at the injection borehole. We test the models against the observed data with standard likelihood tests and find the ETAS model accounting for the flow rate to perform best. Such a model may in future serve as a valuable tool for designing probabilistic alarm systems for EGS experiments.
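
    The probability-based alarm idea can be illustrated with a minimal sketch that combines an Omori-Utsu rate decay with Gutenberg-Richter scaling to obtain the probability of at least one event above a target magnitude in a forecast window. The parameter values below are illustrative placeholders, not the fitted Basel values, and the sketch ignores the injection-rate dependence discussed in the abstract.

      import numpy as np

      def omori_expected_events(t1, t2, K, c, p):
          # Integral of the Omori-Utsu rate K * (t + c)**(-p) from t1 to t2 (days).
          if np.isclose(p, 1.0):
              return K * (np.log(t2 + c) - np.log(t1 + c))
          return K * ((t2 + c) ** (1 - p) - (t1 + c) ** (1 - p)) / (1 - p)

      def prob_exceeding_magnitude(t1, t2, K, c, p, b, m_ref, m_target):
          # Scale the expected count (>= m_ref) to >= m_target with Gutenberg-
          # Richter, then assume a Poisson process for the exceedance probability.
          n = omori_expected_events(t1, t2, K, c, p) * 10 ** (-b * (m_target - m_ref))
          return 1.0 - np.exp(-n)

      # Illustrative parameters only: probability of an event >= ML 3 on days 1-3.
      print(prob_exceeding_magnitude(1.0, 3.0, K=50.0, c=0.1, p=1.1,
                                     b=1.2, m_ref=1.0, m_target=3.0))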

  1. GPS Monitoring of Surface Change During and Following the Fortuitous Occurrence of the Mw = 7.3 Landers Earthquake in our Network

    NASA Technical Reports Server (NTRS)

    Miller, M. Meghan

    1998-01-01

    Accomplishments: (1) Continues GPS monitoring of surface change during and following the fortuitous occurrence of the Mw = 7.3 Landers earthquake in our network, in order to characterize earthquake dynamics and accelerated activity of related faults as far as hundreds of kilometers along strike. (2) Integrates the geodetic constraints into consistent kinematic descriptions of the deformation field that can in turn be used to characterize the processes that drive geodynamics, including seismic cycle dynamics. In 1991, we installed and occupied a high-precision GPS geodetic network to measure transform-related deformation that is partitioned from the Pacific - North America plate boundary northeastward through the Mojave Desert, via the Eastern California shear zone to the Walker Lane. The onset of the Mw = 7.3 June 28, 1992, Landers, California, earthquake sequence within this network poses unique opportunities for continued monitoring of regional surface deformation related to the culmination of a major seismic cycle, characterization of the dynamic behavior of continental lithosphere during the seismic sequence, and post-seismic transient deformation. During the last year, we have reprocessed all three previous epochs for which JPL fiducial-free point positioning products were available and are queued for the remaining needed products; completed two field campaigns monitoring approx. 20 sites (October 1995 and September 1996); begun modeling by developing a finite element mesh based on network station locations; and developed manuscripts dealing with both the Landers-related transient deformation at the latitude of Lone Pine and the velocity field of the whole experiment. We are currently deploying a 1997 observation campaign (June 1997). We use GPS geodetic studies to characterize deformation in the Mojave Desert region and related structural domains to the north, and geophysical modeling of lithospheric behavior. The modeling is constrained by our existing and continued GPS measurements, which will provide much-needed data on far-field strain accumulation across the region and on the deformational response of continental lithosphere during and following a large earthquake, forming the basis for kinematic and dynamic modeling of secular and seismic-cycle deformation. GPS geodesy affords both regional coverage and high precision that uniquely bear on these problems.

  2. Earthquake prediction

    SciTech Connect

    Ma, Z.; Fu, Z.; Zhang, Y.; Wang, C.; Zhang, G.; Liu, D.

    1989-01-01

    Mainland China is situated at the eastern edge of the Eurasian seismic system and is the largest intra-continental region of shallow strong earthquakes in the world. Based on nine earthquakes with magnitudes ranging between 7.0 and 7.9, the book provides observational data and discusses successes and failures of earthquake prediction. Observations of various phenomena and seismic activity occurring before and after these individual earthquakes led to the establishment of some general characteristics valid for earthquake prediction.

  3. An Evaluation of North Korea’s Nuclear Test by Belbasi Nuclear Tests Monitoring Center-KOERI

    NASA Astrophysics Data System (ADS)

    Necmioglu, O.; Meral Ozel, N.; Semin, K.

    2009-12-01

    Bogazici University's Kandilli Observatory and Earthquake Research Institute (KOERI) has been acting as the Turkish National Data Center (NDC) and has been responsible for the operation of the International Monitoring System (IMS) Primary Seismic Station (PS-43) under the Belbasi Nuclear Tests Monitoring Center for the verification of compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT) since February 2000. The NDC is responsible for operating two arrays which are part of the IMS, as well as for transmitting data from these stations to the International Data Centre (IDC) in Vienna. The Belbasi array was established in 1951 as a four-element (Benioff 1051) seismic array, part of the United States Atomic Energy Detection System (USAEDS). The Turkish General Staff (TGS) and the U.S. Air Force Technical Applications Center (AFTAC), under the Defense and Economic Cooperation Agreement (DECA), jointly operated this short-period array. The station was upgraded and several seismometers were added to the array between 1951 and 1994, and the station code was later changed from BSRS (Belbasi Seismic Research Station) to BRTR-PS43. PS-43 is composed of two sub-arrays (Ankara and Keskin): a medium-period array with a ~40 km radius located in Ankara and a short-period array with a ~3 km radius located in Keskin. Each array has a broadband element located at the middle of its circular geometry. The short-period instruments are installed at a depth of 30 meters, while the medium-period and broadband instruments are installed at a depth of 60 meters. On 25 May 2009, the Democratic People’s Republic of Korea (DPRK) claimed that it had conducted a nuclear test. The corresponding seismic event was recorded by the IMS, and the IDC released a first automatic estimate of the time (00:54:43 GMT), location (41.2896°N and 129.0480°E) and magnitude (4.52 mb) of the event in less than two hours (USGS: 00:54:43 GMT; 41.306°N, 129.029°E; 4.7 mb). During our preliminary analysis of the 25 May 2009 DPRK event, we saw a very clear P arrival at 01:05:47 (GMT) at the BRTR SP array. The result of the f-k analysis performed in the Geotool software, which was installed at the NDC facilities in 2008 and is currently in full use, also indicated that the arrival belongs to the DPRK event. When comparing our f-k results (calculated at 1-2 Hz) with the IDC-REB, however, we noticed that our calculation, and therefore the corresponding residuals (calculated with reference to the REB residuals), were much better than those of the REB. The reasons for this discrepancy have been explored, and for the first time a comprehensive seismological analysis of a nuclear test has been conducted in Turkey. The CTBT has an important role in the implementation of the non-proliferation of nuclear weapons and is a key element in the pursuit of nuclear disarmament. In this study, we present the technical and scientific aspects of the 25 May 2009 DPRK event analysis, together with our involvement in CTBT(O) affairs, which we believe brings new dimensions to Turkey, especially in the area of geophysics.
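
    The f-k analysis mentioned above estimates the back-azimuth and apparent velocity of an arrival by searching for the horizontal slowness that maximizes coherent beam power across the array. The time-domain delay-and-sum sketch below is a generic stand-in for the frequency-domain implementation in Geotool; the station coordinates, slowness grid and sign conventions are assumptions.

      import numpy as np

      def beam_power(traces, coords, dt, sx, sy):
          # traces: (n_sta, n_samp); coords: (n_sta, 2) east/north offsets in km;
          # (sx, sy): horizontal slowness of propagation in s/km. A plane wave
          # arrives at station r at t = s.r, so each trace is advanced by that lag.
          beam = np.zeros(traces.shape[1])
          for tr, (x, y) in zip(traces, coords):
              lag = int(round((sx * x + sy * y) / dt))
              beam += np.roll(tr, -lag)          # wrap-around ignored for brevity
          beam /= len(traces)
          return float(np.mean(beam ** 2))

      def fk_grid_search(traces, coords, dt, smax=0.08, ns=41):
          # Brute-force search over the slowness grid for the most coherent beam.
          best = (-np.inf, 0.0, 0.0)
          for sx in np.linspace(-smax, smax, ns):
              for sy in np.linspace(-smax, smax, ns):
                  power = beam_power(traces, coords, dt, sx, sy)
                  if power > best[0]:
                      best = (power, sx, sy)
          _, sx, sy = best
          baz = np.degrees(np.arctan2(-sx, -sy)) % 360.0   # azimuth back to source
          vapp = 1.0 / max(np.hypot(sx, sy), 1e-9)         # apparent velocity, km/s
          return baz, vapp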

  4. A real-time navigation monitoring expert system for the Space Shuttle Mission Control Center

    NASA Technical Reports Server (NTRS)

    Wang, Lui; Fletcher, Malise

    1993-01-01

    The ONAV (Onboard Navigation) Expert System has been developed as a real-time console assistant for use by ONAV flight controllers in the Mission Control Center at the Johnson Space Center. This knowledge-based expert system is used to monitor the Space Shuttle onboard navigation system, detect faults, and advise flight operations personnel. This application is the first knowledge-based system to use both telemetry and trajectory data from the Mission Operations Computer (MOC). To arrive at this stage, from a prototype to a real-world application, the ONAV project has had to deal not only with AI issues but also with operating environment issues. The AI issues included the maturity of AI languages and debugging tools, verification, and the availability, stability, and size of the expert pool. The environmental issues included real-time data acquisition, hardware suitability, and how to achieve acceptance by users and management.

  5. Federal Radiological Monitoring and Assessment Center (FRMAC) overview of FRMAC operations

    SciTech Connect

    1996-02-01

    In the event of a major radiological emergency, 17 federal agencies with various statutory responsibilities have agreed to coordinate their efforts at the emergency scene under the umbrella of the Federal Radiological Emergency Response plan (FRERP). This cooperative effort will assure the designated Lead Federal Agency (LFA) and the state(s) that all federal radiological assistance fully supports their efforts to protect the public. The mandated federal cooperation ensures that each agency can obtain the data critical to its specific responsibilities. This Overview of the Federal Radiological Monitoring and Assessment Center (FRMAC) Operations describes the FRMAC response activities to a major radiological emergency. It also describes the federal assets and subsequent operational activities which provide federal radiological monitoring and assessment of the off-site areas. These off-site areas may include one or more affected states.

  6. Launch Complex 39A, SWMU 008, Operations, Maintenance, and Monitoring Report, Kennedy Space Center, FL

    NASA Technical Reports Server (NTRS)

    Wilson, Deborah M.

    2016-01-01

    This Operations, Maintenance, and Monitoring Report (OMMR) presents the findings, observations, and results from Year 1 operation of the air sparging (AS) groundwater interim measure (IM) for High-Concentration Plumes (HCPs) and Low-Concentration Plumes (LCPs) within the perimeter fence line at Launch Complex 39A (LC39A) located at Kennedy Space Center (KSC), Florida. The objective of the LC39A groundwater IM is to actively decrease concentrations of trichloroethene (TCE), cis-1,2-dichloroethene (cDCE), and vinyl chloride (VC) in groundwater in the HCP and LCP within the pad perimeter fence line via AS to levels less than Florida Department of Environmental Protection (FDEP) Groundwater Cleanup Target Levels (GCTLs). The objective was developed because LC39A is currently being leased to Space Exploration Technologies (SpaceX), and the original IM for monitored natural attenuation (MNA) over an extended period of time was not suitable for future planned site use.

  7. Classification of Global Urban Centers Using ASTER Data: Preliminary Results From the Urban Environmental Monitoring Program

    NASA Astrophysics Data System (ADS)

    Stefanov, W. L.; Christensen, P. R.

    2001-05-01

    Land cover and land use changes associated with urbanization are important drivers of global ecologic and climatic change. Quantification and monitoring of these changes are part of the primary mission of the ASTER instrument, and comprise the fundamental research objective of the Urban Environmental Monitoring (UEM) Program. The UEM program will acquire day/night, visible through thermal infrared ASTER data twice per year for 100 global urban centers over the duration of the mission (6 years). Data are currently available for a number of these urban centers and allow for initial comparison of global city structure using spatial variance texture analysis of the 15 m/pixel visible to near infrared ASTER bands. Variance texture analysis highlights changes in pixel edge density as recorded by sharp transitions from bright to dark pixels. In human-dominated landscapes these brightness variations correlate well with urbanized vs. natural land cover and are useful for characterizing the geographic extent and internal structure of cities. Variance texture analysis was performed on twelve urban centers (Albuquerque, Baghdad, Baltimore, Chongqing, Istanbul, Johannesburg, Lisbon, Madrid, Phoenix, Puebla, Riyadh, Vancouver) for which cloud-free daytime ASTER data are available. Image transects through each urban center produce texture profiles that correspond to urban density. These profiles can be used to classify cities into centralized (ex. Baltimore), decentralized (ex. Phoenix), or intermediate (ex. Madrid) structural types. Image texture is one of the primary data inputs (with vegetation indices and visible to thermal infrared image spectra) to a knowledge-based land cover classifier currently under development for application to ASTER UEM data as it is acquired. Collaboration with local investigators is sought to both verify the accuracy of the knowledge-based system and to develop more sophisticated classification models.
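
    Spatial variance texture of the kind described above can be computed with a moving window over a single VNIR band, using the identity var = E[x^2] - E[x]^2 with box filters. The sketch below is a generic illustration; the 7-pixel window and the choice of input band are assumptions, not the UEM classifier itself.

      import numpy as np
      from scipy.ndimage import uniform_filter

      def variance_texture(band, window=7):
          # Local variance of a 2-D image over a square moving window: high values
          # mark dense edges (built-up areas), low values mark smooth land cover.
          x = band.astype(np.float64)
          mean = uniform_filter(x, size=window)
          mean_sq = uniform_filter(x ** 2, size=window)
          return np.clip(mean_sq - mean ** 2, 0.0, None)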

  8. Temporal Evolution of Effective Upper Mantle Viscosity from Postseismic Response to the 2006-2007 Great Kuril Earthquakes: Four Years of GPS Monitoring

    NASA Astrophysics Data System (ADS)

    Kogan, M. G.; Vasilenko, N. F.; Frolov, D. I.; Freymueller, J. T.; Prytkov, A. S.

    2012-12-01

    Transient surface deformation was still observed by GPS 40 years after two giant (M ~9) megathrust earthquakes in the 20th century: the 1960 Chile and the 1964 Alaska events [Hu et al., 2004; Suito and Freymueller, 2009]. The postseismic signal was attributed to viscoelastic relaxation in the Maxwell mantle wedge with constant viscosity on the order of 10^19 Pa s. In contrast, postseismic deformation for 3-4 years after the 2002 M 7.9 Denali and the 1997 M 7.6 Manyi, Tibet, earthquakes requires a much lower Maxwell viscosity on the order of 10^17 - 10^18 Pa s [Freed et al., 2006; Ryder et al., 2007; Biggs et al., 2009]. Also, these early postseismic GPS and InSAR time series suggest an increase in viscosity with time, which would be inconsistent with a uniform Maxwell viscosity. Here we analyze surface deformation following the doublet of the 2006-2007 M > 8 Kuril megathrust earthquakes using 4 years of postseismic continuous GPS time series on the Kuril GPS Array. We split the time series into four annual intervals starting at epoch 2007.5, i.e., about 7 months after the 2006 earthquake, and search for the best-fitting Maxwell viscosity year by year, after accounting for afterslip and the background interseismic strain signal. Earlier we showed that the contribution of afterslip to the Kuril postseismic displacement is small since about epoch 2007.5 [Kogan et al., 2011]. The background interseismic strain signal was not measured on the central Kurils at the stations showing the largest postseismic motion because observations started several months after the earthquakes. From analysis of trench-parallel gravity anomalies, Song and Simons [2003] proposed weak interseismic locking at the subduction interface in the central Kurils. If this hypothesis holds, we can expect small interseismic velocities at the sites affected by postseismic deformation. We tested three simple variants of corrections for interseismic motion of these sites, ranging from 0 to the mean velocity at the SW and NE arc segments. Regardless of the variant used to correct for interseismic motion, the best-fitting viscosity evolves from 2 × 10^17 Pa s in Year 1 to 1 × 10^18 Pa s in Year 4. The increase with time in the effective Maxwell viscosity of the asthenosphere suggests that the actual physical mechanism is dislocation creep with the power-law dependence of strain rate on stress. For such nonlinear rheology, the effective viscosity several decades following the Kuril doublet may be much higher than what we inferred from 4 years of GPS monitoring.
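
    The closing argument, that an apparent increase of Maxwell viscosity with time points to power-law dislocation creep, follows from the stress dependence of the effective viscosity. The sketch below uses the standard textbook relations (strain rate = A * stress**n, effective viscosity = stress / (2 * strain rate)); the values of A and n are illustrative, not inferred from the Kuril data.

      def effective_viscosity(stress_pa, A, n):
          # For dislocation creep, strain_rate = A * stress**n, so
          # eta_eff = stress / (2 * strain_rate) = stress**(1 - n) / (2 * A):
          # as the postseismic stress decays, eta_eff grows for n > 1.
          return stress_pa ** (1.0 - n) / (2.0 * A)

      # Illustrative: with n = 3.5, a tenfold stress drop raises eta_eff ~300-fold.
      A, n = 1.0e-34, 3.5                       # placeholder creep parameters
      print(effective_viscosity(1.0e5, A, n) / effective_viscosity(1.0e6, A, n))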

  9. Appraising the Early-est earthquake monitoring system for tsunami alerting at the Italian candidate Tsunami Service Provider

    NASA Astrophysics Data System (ADS)

    Bernardi, F.; Lomax, A.; Michelini, A.; Lauciani, V.; Piatanesi, A.; Lorito, S.

    2015-04-01

    In this paper we present the procedure for earthquake location and characterization implemented at the Italian candidate Tsunami Service Provider at INGV in Rome. Following the ICG/NEAMTWS guidelines, the first tsunami warning messages are based only on seismic information, i.e. epicenter location, hypocenter depth and magnitude, which are automatically computed by the software Early-est. Early-est is a package for rapid location and seismic/tsunamigenic characterization of earthquakes. The Early-est software package operates on offline-event or continuous-realtime seismic waveform data to perform trace processing and picking and, at a regular report interval, phase association, event detection, hypocenter location, and event characterization. In this paper we present the earthquake parameters computed by Early-est from the beginning of 2012 until the end of December 2014 at global scale for events with magnitude M ≥ 5.5, and the detection timeline. The earthquake parameters computed automatically by Early-est are compared with reference manually revised/verified catalogs. From our analysis, the epicenter location and hypocenter depth parameters do not differ significantly from the values in the reference catalogs. The epicenter coordinates generally differ from the reference epicenter coordinates by less than 20 ± 20 km; focal depths are less well constrained and generally differ by 0 ± 30 km. Early-est also provides mb, Mwp and Mwpd magnitude estimations. mb magnitudes are preferred for events with Mwp ≲ 5.8, while Mwpd is valid for events with Mwp ≳ 7.2. The magnitude mb shows wide differences with respect to the reference catalogs; we thus apply a linear correction mbcorr = mb · 0.52 + 2.46, which results in an uncertainty of δmb ≈ 0.0 ± 0.2 with respect to the reference catalogs. As expected, Mwp shows a distance dependency. Mwp values at stations with epicentral distance Δ ≲ 30° are significantly overestimated with respect to the CMT-global solutions, whereas Mwp values at stations with epicentral distance Δ ≳ 90° are slightly underestimated. We thus apply a 3rd-degree polynomial distance correction. After applying the distance correction, the Mwp provided by Early-est differs from the CMT-global catalog values by about δMwp ≈ 0.0 ± 0.2. Early-est continuously acquires time series data and updates the earthquake source parameters. Our analysis shows that the epicenter coordinates and the magnitude values converge rather quickly toward the final values. Generally, we can provide robust and reliable earthquake source parameters for compiling tsunami warning messages within less than about 15 min after the event origin time.
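
    The magnitude corrections quoted above are straightforward to reproduce: the linear mb adjustment uses the coefficients given in the text, while the 3rd-degree polynomial Mwp distance correction is fitted to station residuals (its coefficients are not reported in the abstract, so the fit below is schematic).

      import numpy as np

      def correct_mb(mb_raw):
          # Linear correction quoted above: mb_corr = 0.52 * mb + 2.46.
          return 0.52 * np.asarray(mb_raw) + 2.46

      def fit_mwp_distance_correction(delta_deg, mwp_residual):
          # Fit a 3rd-degree polynomial to station residuals (Mwp - reference Mw)
          # as a function of epicentral distance; subtract it to correct Mwp.
          return np.poly1d(np.polyfit(delta_deg, mwp_residual, 3))

      # Usage sketch with hypothetical station arrays:
      # corr = fit_mwp_distance_correction(delta_deg, mwp_station - mw_reference)
      # mwp_corrected = mwp_station - corr(delta_deg)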

  10. Hidden Earthquakes.

    ERIC Educational Resources Information Center

    Stein, Ross S.; Yeats, Robert S.

    1989-01-01

    Points out that large earthquakes can take place not only on faults that cut the earth's surface but also on blind faults under folded terrain. Describes four examples of fold earthquakes. Discusses the fold earthquakes using several diagrams and pictures. (YP)

  11. Lecture Demonstrations on Earthquakes for K-12 Teachers and Students

    NASA Astrophysics Data System (ADS)

    Dry, M. D.; Patterson, G. L.

    2005-12-01

    Since 1975, the Center for Earthquake Research and Information (CERI) at The University of Memphis has strived to satisfy its information transfer directives through diverse education and outreach efforts, providing technical and non-technical earthquake information to the general public, K-16 teachers and students, professional organizations, and state and federal organizations via all forms of written and electronic communication. Through these education and outreach efforts, CERI tries to increase earthquake hazard awareness to help limit future losses. In the past three years, education programs have reached over 20,000 K-16 students and teachers through in-service training workshops for teachers and earthquake/earth science lecture demonstrations for students. The presentations include an hour-long lecture demonstration featuring graphics and an informal question and answer format. Graphics used include seismic hazard maps, damage photos, plate tectonic maps, layers of the Earth, and more, all adapted for the audience. Throughout this presentation, manipulatives such as a Slinky, Silly Putty, a foam Earth with depth and temperature features, and Popsicle sticks are used to demonstrate seismic waves, the elasticity of the Earth, the Earth's layers and their features, and the brittleness of the crust. Toward the end, a demonstration featuring a portable shake table with a dollhouse mounted on it is used to illustrate earthquake-shaking effects. This presentation is also taken to schools when they are unable to visit CERI. Following this presentation, groups are then taken to the Public Earthquake Resource Center at CERI, a space featuring nine displays, seven of which are interactive. The interactive displays include a shake table and building blocks, a trench with paleoliquefaction features, computers with web access to seismology sites, a liquefaction model, an oscilloscope and attached geophone, a touch-screen monitor, and various manipulatives. CERI is also developing suitcase kits and activities for teachers to borrow and use in their classrooms. The suitcase kits include activities based on state learning standards, such as layers of the Earth and plate tectonics. Items included in the suitcase modules include a shake table and dollhouse, an oscilloscope and geophone, a resonance model, a Slinky, Silly Putty, Popsicle sticks, and other items. Almost all of the activities feature a lecture demonstration component. These projects would not be possible without leveraged funding from the Mid-America Earthquake Center (MAEC) and the Center for Earthquake Research and Information, with additional funding from the National Earthquake Hazards Reduction Program (NEHRP).

  12. Earthquake history of South Dakota

    USGS Publications Warehouse

    von Hake, C. A.

    1977-01-01

    Twelve earthquakes of intensity V or greater (Modified Mercalli scale) have centered within the borders of South Dakota. All the shocks were rather localized, except that of 1911 which was felt over an area of approximately 100,000 km2. Some earthquakes from neighboring States were felt strongly in South Dakota. 

  13. Quake centers funded

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    The National Science Foundation (NSF) has awarded $30 million in contracts to three research centers to study earthquakes and develop technologies to reduce hazards and future losses.The centers—which hope to match agency funding with outside sources—are forming consortiums of public and private institutions and will focus on separate earthquake issues. The Pacific Earthquake Engineering Research Center at the University of California at Berkeley will develop technologies to reduce urban earthquake losses. The Mid-America Earthquake Center at the University of Illinois' Urbana-Champaign campus will study how to reduce potential losses from low-frequency seismic events in the central and eastern United States.

  14. Photovoltaic Performance and Reliability Database: A Gateway to Experimental Data Monitoring Projects for PV at the Florida Solar Energy Center

    DOE Data Explorer

    This site is the gateway to experimental data monitoring projects for photovoltaics (PV) at the Florida Solar Energy Center. The website and the database were designed to facilitate and standardize the processes for archiving, analyzing and accessing data collected from dozens of operational PV systems and test facilities monitored by FSEC's Photovoltaics and Distributed Generation Division. [copied from http://www.fsec.ucf.edu/en/research/photovoltaics/data_monitoring/index.htm]

  15. Novel Diagnostic and Monitoring Tools in Stroke: an Individualized Patient-Centered Precision Medicine Approach.

    PubMed

    de Villiers, Sulette; Swanepoel, Albe; Bester, Janette; Pretorius, Etheresia

    2016-05-01

    Central to the pathogenesis of ischaemic stroke are the normally protective processes of platelet adhesion and activation. Experimental evidence has shown that the ligand-receptor interactions in ischaemic stroke represent a thrombo-inflammatory cascade, which presents research opportunities into new treatment. However, as anti-platelet drugs have the potential to cause severe side effects in ischaemic stroke patients (as well as other vascular disease patients), it is important to carefully monitor the risk of bleeding and risk of thrombus in patients receiving treatment. Because thrombo-embolic ischaemic stroke is a major health issue, we suggest that the answer to adequate treatment is based on an individualized patient-centered approach, in line with the latest NIH precision medicine approach. A combination of viscoelastic methodologies may be used in a personalized patient-centered regime, including thromboelastography (TEG®) and the lesser used scanning electron microscopy approach (SEM). Thromboelastography provides a dynamic measure of clot formation, strength, and lysis, whereas SEM is a visual structural tool to study patient fibrin structure in great detail. Therefore, we consider the evidence for TEG® and SEM as unique means to confirm stroke diagnosis, screen at-risk patients, and monitor treatment efficacy. Here we argue that the current approach to stroke treatment needs to be restructured and new innovative thought patterns need to be applied, as even approved therapies require close patient monitoring to determine efficacy, match treatment regimens to each patient's individual needs, and assess the risk of dangerous adverse effects. TEG® and SEM have the potential to be useful tools and could potentially alter the clinical approach to managing ischaemic stroke. As envisaged in the NIH precision medicine approach, this will involve a number of role players and innovative new research ideas, with benefits that will ultimately only be realized in a few years. Therefore, with this ultimate goal in mind, we suggest that an individualized patient-orientated approach is now available and therefore already within our ability to use. PMID:26686739

  16. Data Management Coordinators Monitor STS-78 Mission at the Huntsville Operations Support Center

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Launched on June 20, 1996, the STS-78 mission's primary payload was the Life and Microgravity Spacelab (LMS), which was managed by the Marshall Space Flight Center (MSFC). During the 17 day space flight, the crew conducted a diverse slate of experiments divided into a mix of life science and microgravity investigations. In a manner very similar to future International Space Station operations, LMS researchers from the United States and their European counterparts shared resources such as crew time and equipment. Five space agencies (NASA/USA, European Space Agency/Europe (ESA), French Space Agency/France, Canadian Space Agency /Canada, and Italian Space Agency/Italy) along with research scientists from 10 countries worked together on the design, development and construction of the LMS. This photo represents Data Management Coordinators monitoring the progress of the mission at the Huntsville Operations Support Center (HOSC) Spacelab Payload Operations Control Center (SL POCC) at MSFC. Pictured are assistant mission scientist Dr. Dalle Kornfeld, Rick McConnel, and Ann Bathew.

  17. Appraising the Early-est earthquake monitoring system for tsunami alerting at the Italian Candidate Tsunami Service Provider

    NASA Astrophysics Data System (ADS)

    Bernardi, F.; Lomax, A.; Michelini, A.; Lauciani, V.; Piatanesi, A.; Lorito, S.

    2015-09-01

    In this paper we present and discuss the performance of the procedure for earthquake location and characterization implemented in the Italian Candidate Tsunami Service Provider at the Istituto Nazionale di Geofisica e Vulcanologia (INGV) in Rome. Following the ICG/NEAMTWS guidelines, the first tsunami warning messages are based only on seismic information, i.e., epicenter location, hypocenter depth, and magnitude, which are automatically computed by the software Early-est. Early-est is a package for rapid location and seismic/tsunamigenic characterization of earthquakes. The Early-est software package operates using offline-event or continuous-real-time seismic waveform data to perform trace processing and picking, and, at a regular report interval, phase association, event detection, hypocenter location, and event characterization. Early-est also provides mb, Mwp, and Mwpd magnitude estimations. mb magnitudes are preferred for events with Mwp ≲ 5.8, while Mwpd estimations are valid for events with Mwp ≳ 7.2. In this paper we present the earthquake parameters computed by Early-est between the beginning of March 2012 and the end of December 2014 on a global scale for events with magnitude M ≥ 5.5, and we also present the detection timeline. We compare the earthquake parameters automatically computed by Early-est with the same parameters listed in reference catalogs. Such reference catalogs are manually revised/verified by scientists. The goal of this work is to test the accuracy and reliability of the fully automatic locations provided by Early-est. In our analysis, the epicenter location, hypocenter depth and magnitude parameters do not differ significantly from the values in the reference catalogs. Both mb and Mwp magnitudes show differences from the reference catalogs. We thus derived correction functions in order to minimize the differences and correct biases between our values and the ones from the reference catalogs. Correction of the Mwp distance dependency is particularly relevant, since this magnitude refers to the larger and probably tsunamigenic earthquakes. Mwp values at stations with epicentral distance Δ ≲ 30° are significantly overestimated with respect to the CMT-global solutions, whereas Mwp values at stations with epicentral distance Δ ≳ 90° are slightly underestimated. After applying such a distance correction, the Mwp provided by Early-est differs from the CMT-global catalog values by about δMwp ≈ 0.0 ± 0.2. Early-est continuously acquires time-series data and updates the earthquake source parameters. Our analysis shows that the epicenter coordinates and the magnitude values converge within less than 10 min (5 min in the Mediterranean region) toward the stable values. Our analysis shows that we can compute Mwp magnitudes that do not display short epicentral distance dependency overestimation, and we can provide robust and reliable earthquake source parameters to compile tsunami warning messages within less than 15 min after the event origin time.
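    The abstract mentions a 3rd-degree polynomial distance correction for Mwp but does not list its coefficients, so the sketch below simply fits such a cubic to synthetic station residuals (Mwp minus a reference Mw) as a function of epicentral distance and removes the fitted trend. The distances, residuals and resulting coefficients are illustrative only.

```python
# Fit and apply a cubic distance correction to synthetic Mwp residuals.
import numpy as np

rng = np.random.default_rng(1)
dist_deg = rng.uniform(5.0, 100.0, 300)            # station distances, degrees
# Synthetic bias: overestimation at short distances, mild underestimation far away.
true_bias = 0.5 * np.exp(-dist_deg / 20.0) - 0.001 * (dist_deg - 30.0)
resid = true_bias + 0.1 * rng.standard_normal(dist_deg.size)   # Mwp - Mw(reference)

coeffs = np.polyfit(dist_deg, resid, deg=3)        # 3rd-degree polynomial in distance
corrected = resid - np.polyval(coeffs, dist_deg)

print("fitted cubic coefficients:", np.round(coeffs, 6))
print(f"std of residuals before/after correction: {resid.std():.2f} / {corrected.std():.2f}")
```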

  18. The Deep Impact Network Experiment Operations Center Monitor and Control System

    NASA Technical Reports Server (NTRS)

    Wang, Shin-Ywan (Cindy); Torgerson, J. Leigh; Schoolcraft, Joshua; Brenman, Yan

    2009-01-01

    The Interplanetary Overlay Network (ION) software at JPL is an implementation of Delay/Disruption Tolerant Networking (DTN) which has been proposed as an interplanetary protocol to support space communication. The JPL Deep Impact Network (DINET) is a technology development experiment intended to increase the technical readiness of the JPL implemented ION suite. The DINET Experiment Operations Center (EOC) developed by JPL's Protocol Technology Lab (PTL) was critical in accomplishing the experiment. EOC, containing all end nodes of simulated spaces and one administrative node, exercised publish and subscribe functions for payload data among all end nodes to verify the effectiveness of data exchange over ION protocol stacks. A Monitor and Control System was created and installed on the administrative node as a multi-tiered internet-based Web application to support the Deep Impact Network Experiment by allowing monitoring and analysis of the data delivery and statistics from ION. This Monitor and Control System includes the capability of receiving protocol status messages, classifying and storing status messages into a database from the ION simulation network, and providing web interfaces for viewing the live results in addition to interactive database queries.
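    The receive-classify-store pattern described above can be sketched generically. The snippet below ingests text status messages into a SQLite table and tallies them by type; the message strings, type names and schema are invented for illustration and are not the actual ION/DINET message formats.

```python
# Generic receive/classify/store sketch with SQLite as a stand-in data store.
import sqlite3
from datetime import datetime, timezone

def init_db(path=":memory:"):
    con = sqlite3.connect(path)
    con.execute("""CREATE TABLE IF NOT EXISTS status_messages (
                       received_utc TEXT, node TEXT, msg_type TEXT, payload TEXT)""")
    return con

def classify(raw):
    """Crude classifier: the first whitespace-separated token names the message type."""
    msg_type, _, payload = raw.partition(" ")
    return msg_type.lower(), payload

def ingest(con, node, raw):
    msg_type, payload = classify(raw)
    con.execute("INSERT INTO status_messages VALUES (?, ?, ?, ?)",
                (datetime.now(timezone.utc).isoformat(), node, msg_type, payload))
    con.commit()

con = init_db()
ingest(con, "node-3", "BUNDLE_DELIVERED custody=1 bytes=2048")   # hypothetical message
ingest(con, "node-3", "LINK_STATE up rate=9600")                 # hypothetical message
for row in con.execute("SELECT msg_type, COUNT(*) FROM status_messages GROUP BY msg_type"):
    print(row)
```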

  19. Earthquake history of Rhode Island

    USGS Publications Warehouse

    von Hake, C. A.

    1976-01-01

    Only three shocks (intensity V or greater, Modified Mercalli scale) have centered in Rhode Island, although several earthquakes in New England and the St. Lawrence Valley have been felt in the State.

  20. Test data from the chloride-monitor well at Sun City Center, Hillsborough County, Florida

    USGS Publications Warehouse

    Sinclair, William C.

    1979-01-01

    A test well drilled for Southwest Florida Water Management District at Sun City Center in Hillsborough County, will serve to monitor the interface between freshwater in the aquifer and the underlying chloride water. The sulfate content of the water in the aquifer at the well site exceeds 250 mg/L below a depth of about 700 feet. Wells for domestic and public supply in the area bottom at less than 500 feet and are separated from the sulfate water by about 100 feet of poorly-permeable limestone. The freshwater-chloride water interface is quite sharp and occurs at a depth of 1,410 feet. The chloride water is similar in composition to seawater but nearly twice as saline. (Woodard-USGS).

  1. Activation and implementation of a Federal Radiological Monitoring and Assessment Center

    SciTech Connect

    Doyle, J.F. III

    1989-01-01

    The Nevada Operations Office of the U.S. Department of Energy (DOE/NV) has been assigned the primary responsibility for responding to a major radiological emergency. The initial response to any radiological emergency, however, will probably be conducted under the DOE regional radiological assistance plan (RAP). If the dimensions of the crisis demand federal assistance, the following sequence of events may be anticipated: (1) DOE regional RAP response, (2) activation of the Federal Radiological Monitoring and Assessment Center (FRMAC) requested, (3) aerial measuring systems and DOE/NV advance party respond, (4) FRMAC activated, (5) FRMAC responds to state(s) and cognizant federal agency (CFA), and (6) management of FRMAC transferred to the Environmental Protection Agency (EPA). The paper discusses activation channels, authorization, notification, deployment, and interfaces.

  2. Pain Reduction and Financial Incentives to Improve Glucose Monitoring Adherence in a Community Health Center

    PubMed Central

    Huntsman, Mary Ann H.; Olivares, Faith J.; Tran, Christina P.; Billimek, John; Hui, Elliot E.

    2014-01-01

    Self-monitoring of blood glucose is a critical component of diabetes management. However, patients often do not maintain the testing schedule recommended by their healthcare provider. Many barriers to testing have been cited, including cost and pain. We present a small pilot study to explore whether the use of financial incentives and pain-free lancets could improve adherence to glucose testing in a community health center patient population consisting largely of non-English speaking ethnic minorities with low health literacy. The proportion of patients lost to follow-up was 17%, suggesting that a larger scale study is feasible in this type of setting, but we found no preliminary evidence suggesting a positive effect on adherence by either financial incentives or pain-free lancets. Results from this pilot study will guide the design of larger-scale studies to evaluate approaches to overcome the variety of barriers to glucose testing that are present in disadvantaged patient populations. PMID:25486531

  3. Pain reduction and financial incentives to improve glucose monitoring adherence in a community health center.

    PubMed

    Huntsman, Mary Ann H; Olivares, Faith J; Tran, Christina P; Billimek, John; Hui, Elliot E

    2014-01-01

    Self-monitoring of blood glucose is a critical component of diabetes management. However, patients often do not maintain the testing schedule recommended by their healthcare provider. Many barriers to testing have been cited, including cost and pain. We present a small pilot study to explore whether the use of financial incentives and pain-free lancets could improve adherence to glucose testing in a community health center patient population consisting largely of non-English speaking ethnic minorities with low health literacy. The proportion of patients lost to follow-up was 17%, suggesting that a larger scale study is feasible in this type of setting, but we found no preliminary evidence suggesting a positive effect on adherence by either financial incentives or pain-free lancets. Results from this pilot study will guide the design of larger-scale studies to evaluate approaches to overcome the variety of barriers to glucose testing that are present in disadvantaged patient populations. PMID:25486531

  4. Atmospheric monitoring of a perfluorocarbon tracer at the 2009 ZERT Center experiment

    NASA Astrophysics Data System (ADS)

    Pekney, Natalie; Wells, Arthur; Rodney Diehl, J.; McNeil, Matthew; Lesko, Natalie; Armstrong, James; Ference, Robert

    2012-02-01

    Field experiments at Montana State University are conducted for the U.S. Department of Energy as part of the Zero Emissions Research and Technology Center (ZERT) to test and verify monitoring techniques for carbon capture and storage (CCS). A controlled release of CO2 with an added perfluorocarbon tracer was conducted in July 2009 in a multi-laboratory study of atmospheric transport and detection technologies. Tracer plume dispersion was measured under various meteorological conditions using a tethered balloon system with Multi-Tube Remote Samplers (MTRS) at elevations of 10 m, 20 m, and 40 m above ground level (AGL), as well as a ground-based portable tower with monitors containing sorbent material to collect the tracer at 1 m, 2 m, 3 m, and 4 m AGL. Researchers designed a horizontal grid of sampling locations centered at the tracer plume source, with the tower positioned at 10 m and 30 m in both upwind and downwind directions, and the MTRS spaced at 50 m and 90 m downwind and 90 m upwind. Tracer was consistently detected at elevated concentrations at downwind sampling locations. With very few exceptions, higher tracer concentrations correlated with lower elevations. Researchers observed no statistical difference between sampling at 50 m and 90 m downwind at the same elevation. The US EPA AERMOD model applied using site-specific information predicted transport and dispersion of the tracer. Model results are compared to experimental data from the 2009 ZERT experiment. Successful characterization of the tracer plume simulated by the ZERT experiment is considered a step toward demonstrating the feasibility of remote sampling with unmanned aerial systems (UAS's) at future sequestration sites.

  5. Incorporating Fundamentals of Climate Monitoring into Climate Indicators at the National Climatic Data Center

    NASA Astrophysics Data System (ADS)

    Arndt, D. S.

    2014-12-01

    In recent years, much attention has been dedicated to the development, testing and implementation of climate indicators. Several Federal agencies and academic groups have commissioned suites of indicators drawing upon and aggregating information available across the spectrum of climate data stewards and providers. As a long-time participant in the applied climatology discipline, NOAA's National Climatic Data Center (NCDC) has generated climate indicators for several decades. Traditionally, these indicators were developed for sectors with long-standing relationships with, and needs of, the applied climatology field. These have recently been adopted and adapted to meet the needs of sectors that have newfound sensitivities to climate and needs for climate data. Information and indices from NOAA's National Climatic Data Center have been prominent components of these indicator suites, and in some cases have been drafted in toto by these aggregators, often with improvements to the communicability and aesthetics of the indicators themselves. Across this history of supporting needs for indicators, NCDC climatologists developed a handful of practical approaches and philosophies that inform a successful climate monitoring product. This manuscript and presentation will demonstrate the utility of this set of practical applications that translate raw data into useful information.

  6. The Swift X-ray monitoring campaign of the center of the Milky Way

    NASA Astrophysics Data System (ADS)

    Degenaar, N.; Wijnands, R.; Miller, J. M.; Reynolds, M. T.; Kennea, J.; Gehrels, N.

    2015-09-01

    In 2006 February, shortly after its launch, Swift began monitoring the center of the Milky Way with the on-board X-Ray Telescope using short 1-ks exposures performed every 1-4 days. Between 2006 and 2014 over 1200 observations have been obtained, accumulating to ~1.3 Ms of exposure time. This has yielded a wealth of information about the long-term X-ray behavior of the supermassive black hole Sgr A*, and numerous transient X-ray binaries that are located within the 25' × 25' region covered by the campaign. In this review we highlight the discoveries made during these first nine years, which include 1) the detection of seven bright X-ray flares from Sgr A*, 2) the discovery of the magnetar SGR J1745-29, 3) the first systematic analysis of the outburst light curves and energetics of the peculiar class of very-faint X-ray binaries, 4) the discovery of three new transient X-ray sources, 5) the exposure of low-level accretion in otherwise bright X-ray binaries, and 6) the identification of a candidate X-ray binary/millisecond radio pulsar transitional object. We also reflect on future science to be done by continuing this Swift legacy campaign, such as high-cadence monitoring to study how the interaction between the gaseous object 'G2' and Sgr A* plays out in the future.

  7. Hidden earthquakes

    SciTech Connect

    Stein, R.S.; Yeats, R.S.

    1989-06-01

    Seismologists generally look for earthquakes to happen along visible fault lines, e.g., the San Andreas fault. The authors maintain that another source of dangerous quakes has been overlooked: the release of stress along a fault that is hidden under a fold in the earth's crust. The paper describes the differences between an earthquake which occurs on a visible fault and one which occurs under an anticline and warns that Los Angeles' greatest earthquake threat may come from a small quake originating under downtown Los Angeles, rather than from a larger earthquake occurring 50 miles away on the San Andreas fault.

  8. Satellite Geodetic Constraints On Earthquake Processes: Implications of the 1999 Turkish Earthquakes for Fault Mechanics and Seismic Hazards on the San Andreas Fault

    NASA Technical Reports Server (NTRS)

    Reilinger, Robert

    2005-01-01

    Our principal activities during the initial phase of this project include: 1) Continued monitoring of postseismic deformation for the 1999 Izmit and Duzce, Turkey earthquakes from repeated GPS survey measurements and expansion of the Marmara Continuous GPS Network (MAGNET), 2) Establishing three North Anatolian fault crossing profiles (10 sites/profile) at locations that experienced major surface-fault earthquakes at different times in the past to examine strain accumulation as a function of time in the earthquake cycle (2004), 3) Repeat observations of selected sites in the fault-crossing profiles (2005), 4) Repeat surveys of the Marmara GPS network to continue to monitor postseismic deformation, 5) Refining block models for the Marmara Sea seismic gap area to better understand earthquake hazards in the Greater Istanbul area, 6) Continuing development of models for afterslip and distributed viscoelastic deformation for the earthquake cycle. We are keeping close contact with MIT colleagues (Brad Hager and Eric Hetland) who are developing models for S. California and for the earthquake cycle in general (Hetland, 2006). In addition, our Turkish partners at the Marmara Research Center have undertaken repeat micro-gravity measurements at the MAGNET sites and have provided us estimates of gravity change during the period 2003 - 2005.

  9. Earthquakes and the urban environment. Volume I

    SciTech Connect

    Berlin, G.L.

    1980-01-01

    Because of the complex nature of earthquake effects, current investigations encompass many disciplines, including those of both the physical and social sciences. Research activities center on such diversified topics as earthquake mechanics, earthquake prediction and control, the prompt and accurate detection of tsunamis (seismic sea waves), earthquake-resistant construction, seismic building code improvements, land use zoning, earthquake risk and hazard perception, disaster preparedness, plus the study of the concerns and fears of people who have experienced the effects of an earthquake. This monograph attempts to amalgamate recent research input comprising the vivifying components of urban seismology at a level useful to those having an interest in the earthquake and its effects upon an urban environment. Volume 1 contains chapters on earthquake parameters and hazards.

  10. Earthquakes and the urban environment. Volume II

    SciTech Connect

    Berlin, G.L.

    1980-01-01

    Because of the complex nature of earthquake effects, current investigations encompass many disciplines, including those of both the physical and social sciences. Research activities center on such diversified topics as earthquake mechanics, earthquake prediction and control, the prompt and accurate detection of tsunamis (seismic sea waves), earthquake-resistant construction, seismic building code improvements, land use zoning, earthquake risk and hazard perception, disaster preparedness, plus the study of the concerns and fears of people who have experienced the effects of an earthquake. This monograph attempts to amalgamate recent research input comprising the vivifying components of urban seismology at a level useful to those having an interest in the earthquake and its effects upon an urban environment. Volume 2 contains chapters on earthquake prediction, control, building design and building response.

  11. Seismological investigation of earthquakes in the New Madrid Seismic Zone. Final report, September 1986--December 1992

    SciTech Connect

    Herrmann, R.B.; Nguyen, B.

    1993-08-01

    Earthquake activity in the New Madrid Seismic Zone has been monitored by regional seismic networks since 1975. During this time period, over 3,700 earthquakes have been located within the region bounded by latitudes 35°-39°N and longitudes 87°-92°W. Most of these earthquakes occur within a 1.5° × 2° zone centered on the Missouri Bootheel. Source parameters of larger earthquakes in the zone and in eastern North America are determined using surface-wave spectral amplitudes and broadband waveforms for the purpose of determining the focal mechanism, source depth and seismic moment. Waveform modeling of broadband data is shown to be a powerful tool in defining these source parameters when used in combination with regional seismic network data, and, in addition, in verifying the correctness of previously published focal mechanism solutions.

  12. Earthquake Hazards.

    ERIC Educational Resources Information Center

    Donovan, Neville

    1979-01-01

    Provides a survey and a review of earthquake activity and global tectonics from the advancement of the theory of continental drift to the present. Topics include: an identification of the major seismic regions of the earth, seismic measurement techniques, seismic design criteria for buildings, and the prediction of earthquakes. (BT)

  13. The Terminator Time in subionospheric VLF/LF diurnal variation as recorded by the Romanian VLF/LF radio monitoring system related to earthquake occurrence and volcano eruptions

    NASA Astrophysics Data System (ADS)

    Moldovan, I. A.; Moldovan, A. S.; Biagi, P. F.; Ionescu, C.; Schwingenschuh, K.; Boudjada, M. Y.

    2012-04-01

    The Romanian VLF/LF monitoring system, consisting of a radio receiver and the infrastructure necessary to record and transmit the collected data, is part of the European international network named INFREP. Information on the intensities of the electromagnetic fields created by transmitters at a receiving site indicates the quality of propagation along the paths between receivers and transmitters. Studying the ionosphere's influence on electromagnetic wave propagation along a given path is a way to reveal possible modifications of its lower structure and composition that may serve as earthquake precursors. The VLF/LF receiver installed in Romania was put into operation in February 2009 and has already accumulated 3 years of testing and operation, proving its utility in the forecast of some earthquakes and volcanic eruptions. At the same site as the VLF/LF receiver we simultaneously monitor the vertical atmospheric electric field and other meteorological parameters such as temperature, pressure and rainfall. Global geomagnetic conditions are characterized using the daily geomagnetic index Kp. At a basic level, the adopted analysis consists of a simple statistical evaluation of the signals, comparing instantaneous values to the trend of the signal. In this paper we pay attention to the terminator times in the subionospheric VLF/LF diurnal variation, defined as the times of minimum amplitude (or phase) around sunrise and sunset. These terminator times are found to shift significantly just around an earthquake. In the case of the Kobe earthquake, significant shifts were found in both the morning and evening terminator times, which were interpreted in terms of a lowering of the lower ionosphere using full-wave mode theory. A LabVIEW application that accesses the VLF/LF receiver through the internet was developed. This program opens the receiver's web page and automatically retrieves the list of data files to synchronize the user-side data with the receiver's data. Missing zipped files are also downloaded automatically. The application appends daily files into monthly and annual files and produces 3D colour-coded maps of VLF and LF signal intensities versus the minute-of-the-day and the day-of-the-month, facilitating near real-time observation of VLF and LF electromagnetic wave propagation. This type of representation highlights the modification of the terminator time with the length of the solar day, improves the user's ability to detect possible propagation anomalies due to ionospheric conditions, and allows a quick visual inspection of unexpected behavior of transmission channels at different frequencies and paths. A notable result was observed in the recordings made on the propagation path to Iceland (NRK, 37.5 kHz). Recordings are made once a minute, over a period of 303 days. Propagation anomalies on the Icelandic channel, present in the range of 40-90 days, are considered precursory phenomena associated with the Eyjafjallajokull (Iceland) volcanic eruption of April-May 2010.
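    The colour-coded map of signal intensity versus minute-of-the-day and day-of-the-month described above can be sketched with a few lines of plotting code. The snippet below uses synthetic once-per-minute amplitudes in place of the real VLF/LF recordings.

```python
# Synthetic "minute-of-day vs day-of-month" intensity map, as a stand-in for
# the colour-coded maps produced from the real VLF/LF recordings.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
days, minutes = 31, 1440
diurnal = 60 + 10 * np.sin(2 * np.pi * np.arange(minutes) / minutes)   # dB, synthetic
grid = diurnal[None, :] + 3 * rng.standard_normal((days, minutes))

plt.imshow(grid, aspect="auto", origin="lower",
           extent=[0, minutes, 1, days], cmap="viridis")
plt.xlabel("minute of day")
plt.ylabel("day of month")
plt.colorbar(label="signal intensity (dB, synthetic)")
plt.title("VLF channel intensity (illustrative)")
plt.show()
```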

  14. Analog earthquakes

    SciTech Connect

    Hofmann, R.B.

    1995-09-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.

  15. Interoperable Access to Near Real Time Ocean Observations with the Observing System Monitoring Center

    NASA Astrophysics Data System (ADS)

    O'Brien, K.; Hankin, S.; Mendelssohn, R.; Simons, R.; Smith, B.; Kern, K. J.

    2013-12-01

    The Observing System Monitoring Center (OSMC), a project funded by the National Oceanic and Atmospheric Administration's Climate Observations Division (COD), exists to join the discrete 'networks' of in situ ocean observing platforms -- ships, surface floats, profiling floats, tide gauges, etc. -- into a single, integrated system. The OSMC is addressing this goal through capabilities in three areas focusing on the needs of specific user groups: 1) it provides real time monitoring of the integrated observing system assets to assist management in optimizing the cost-effectiveness of the system for the assessment of climate variables; 2) it makes the stream of real time data coming from the observing system available to scientific end users in an easy-to-use form; and 3) in the future, it will unify the delayed-mode data from platform-focused data assembly centers into a standards-based distributed system that is readily accessible to interested users from the science and education communities. In this presentation, we will be focusing on the efforts of the OSMC to provide interoperable access to the near real time data stream that is available via the Global Telecommunications System (GTS). This is a very rich data source, and includes data from nearly all of the oceanographic platforms that are actively observing. We will discuss how the data is being served out using a number of widely used 'web services' (including OPeNDAP and SOS) and downloadable file formats (KML, csv, xls, netCDF), so that it can be accessed in web browsers and popular desktop analysis tools. We will also be discussing our use of the Environmental Research Division's Data Access Program (ERDDAP), available from NOAA/NMFS, which has allowed us to achieve our goals of serving the near real time data. From an interoperability perspective, it's important to note that access to this stream of data is not just for humans, but also for machine-to-machine requests. We'll also delve into how we configured access to the near real time ocean observations in accordance with the Climate and Forecast (CF) metadata conventions describing the various 'feature types' associated with particular in situ observation types, or discrete sampling geometries (DSG). Wrapping up, we'll discuss some of the ways this data source is already being used.
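    Since ERDDAP exposes tabledap datasets as CSV through URL queries, a minimal client-side sketch is shown below using pandas. The URL pattern follows the usual ERDDAP convention, but the server host, dataset ID, variable names and time window are placeholders rather than a verified OSMC endpoint.

```python
# Hedged sketch of reading an ERDDAP tabledap CSV response into pandas.
# Host, dataset ID and variable names are placeholders, not a real endpoint.
import pandas as pd

SERVER = "https://example-erddap-server/erddap/tabledap"   # placeholder host
DATASET = "osmc_realtime_example"                          # hypothetical dataset ID
QUERY = ("?platform_code,time,latitude,longitude,sst"      # hypothetical variables
         "&time>=2013-01-01T00:00:00Z&time<=2013-01-02T00:00:00Z")

url = f"{SERVER}/{DATASET}.csv{QUERY}"
try:
    df = pd.read_csv(url, skiprows=[1])   # ERDDAP CSV puts units in the second row
    print(df.head())
except Exception as exc:                  # the placeholder host will not resolve
    print("request failed (expected with the placeholder URL):", exc)
```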

  16. Monitoring Stellar Orbits Around the Massive Black Hole in the Galactic Center

    NASA Astrophysics Data System (ADS)

    Gillessen, S.; Eisenhauer, F.; Trippe, S.; Alexander, T.; Genzel, R.; Martins, F.; Ott, T.

    2009-02-01

    We present the results of 16 years of monitoring stellar orbits around the massive black hole in the center of the Milky Way, using high-resolution near-infrared techniques. This work refines our previous analysis mainly by greatly improving the definition of the coordinate system, which reaches a long-term astrometric accuracy of ≈300 μas, and by investigating in detail the individual systematic error contributions. The combination of a long-time baseline and the excellent astrometric accuracy of adaptive optics data allows us to determine orbits of 28 stars, including the star S2, which has completed a full revolution since our monitoring began. Our main results are: all stellar orbits are fit extremely well by a single-point-mass potential to within the astrometric uncertainties, which are now ≈6× better than in previous studies. The central object mass is (4.31 ± 0.06|_stat ± 0.36|_R0) × 10^6 M_⊙, where the fractional statistical error of 1.5% is nearly independent of R_0, and the main uncertainty is due to the uncertainty in R_0. Our current best estimate for the distance to the Galactic center is R_0 = 8.33 ± 0.35 kpc. The dominant errors in this value are systematic. The mass scales with distance as (3.95 ± 0.06) × 10^6 (R_0/8 kpc)^2.19 M_⊙. The orientations of orbital angular momenta for stars in the central arcsecond are random. We identify six of the stars with orbital solutions as late-type stars, and six early-type stars as members of the clockwise-rotating disk system, as was previously proposed. We constrain the extended dark mass enclosed between the pericenter and apocenter of S2 at less than 0.066, at the 99% confidence level, of the mass of Sgr A*. This is two orders of magnitude larger than what one would expect from other theoretical and observational estimates.
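    As a quick arithmetic check, plugging the best-fit distance R_0 = 8.33 kpc into the quoted scaling relation reproduces the central mass reported above:

```python
# Evaluate the quoted scaling M(R0) = 3.95e6 * (R0 / 8 kpc)**2.19 solar masses
# at the best-fit distance R0 = 8.33 kpc.
R0 = 8.33                                   # kpc
mass = 3.95e6 * (R0 / 8.0) ** 2.19          # solar masses
print(f"M(Sgr A*) at R0 = {R0} kpc: {mass:.2e} Msun")   # ~4.3e6, matching the quoted mass
```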

  17. 1964 Great Alaska Earthquake: a photographic tour of Anchorage, Alaska

    USGS Publications Warehouse

    Thoms, Evan E.; Haeussler, Peter J.; Anderson, Rebecca D.; McGimsey, Robert G.

    2014-01-01

    On March 27, 1964, at 5:36 p.m., a magnitude 9.2 earthquake, the largest recorded earthquake in U.S. history, struck southcentral Alaska (fig. 1). The Great Alaska Earthquake (also known as the Good Friday Earthquake) occurred at a pivotal time in the history of earth science, and helped lead to the acceptance of plate tectonic theory (Cox, 1973; Brocher and others, 2014). All large subduction zone earthquakes are understood through insights learned from the 1964 event, and observations and interpretations of the earthquake have influenced the design of infrastructure and seismic monitoring systems now in place. The earthquake caused extensive damage across the State, and triggered local tsunamis that devastated the Alaskan towns of Whittier, Valdez, and Seward. In Anchorage, the main cause of damage was ground shaking, which lasted approximately 4.5 minutes. Many buildings could not withstand this motion and were damaged or collapsed even though their foundations remained intact. More significantly, ground shaking triggered a number of landslides along coastal and drainage valley bluffs underlain by the Bootlegger Cove Formation, a composite of facies containing variably mixed gravel, sand, silt, and clay which were deposited over much of upper Cook Inlet during the Late Pleistocene (Ulery and others, 1983). Cyclic (or strain) softening of the more sensitive clay facies caused overlying blocks of soil to slide sideways along surfaces dipping by only a few degrees. This guide is the document version of an interactive web map that was created as part of the commemoration events for the 50th anniversary of the 1964 Great Alaska Earthquake. It is accessible at the U.S. Geological Survey (USGS) Alaska Science Center website: http://alaska.usgs.gov/announcements/news/1964Earthquake/. The website features a map display with suggested tour stops in Anchorage, historical photographs taken shortly after the earthquake, repeat photography of selected sites, scanned documents, and small-scale maps, as well as links to slideshows of additional photographs and Google Street View™ scenes. Buildings in Anchorage that were severely damaged, sites of major landslides, and locations of post-earthquake engineering responses are highlighted. The web map can be used online as a virtual tour or in a physical self-guided tour using a web-enabled Global Positioning System (GPS) device. This publication serves the purpose of committing most of the content of the web map to a single distributable document. As such, some of the content differs from the online version.

  18. VERY LARGE ARRAY MONITORING OF 1720 MHz OH MASERS TOWARD THE GALACTIC CENTER

    SciTech Connect

    Pihlstroem, Y. M.; Mesler, R. A.; Sjouwerman, L. O.

    2011-10-20

    We present the first variability study of the 1720 MHz OH masers located in the Galactic center. Most of these masers are associated with the interaction between the supernova remnant Sgr A East and the interstellar medium, but a few masers are associated with the circumnuclear disk (CND). The monitoring program covered five epochs and a timescale of 20-195 days, during which no masers disappeared and no new masers appeared. All masers have previously been detected in a single-epoch observation about one year prior to the start of the monitoring experiment, implying relatively stable conditions for the 1720 MHz OH masers. No extreme variability was detected. The masers associated with the northeastern interaction region between the supernova remnant and the +50 km s^-1 molecular cloud show the highest level of variability. This can be explained with the +50 km s^-1 molecular cloud being located behind the supernova remnant and with a region of high OH absorbing column density along the line of sight. Possibly, the supernova remnant provides additional turbulence to the gas in this region, through which the maser emission must travel. The masers in the southern interaction region are located on the outermost edge of Sgr A East, the line of sight of which is not covered by either absorbing OH gas or a supernova remnant, in agreement with the much lower variability level observed. Similarly, the masers associated with the CND show little variability, consistent with those arising through collisions between relatively large clumps of gas in the CND and no significant amount of turbulent gas along the line of sight.

  19. Multi-Year Combination of Tide Gauge Benchmark Monitoring (TIGA) Analysis Center Products

    NASA Astrophysics Data System (ADS)

    Hunegnaw, A.; Teferle, F. N.

    2014-12-01

    In 2013 the International GNSS Service (IGS) Tide Gauge Benchmark Monitoring (TIGA) Working Group (WG) started their reprocessing campaign, which proposes to re-analyze all relevant Global Positioning System (GPS) observations from 1994 to 2013. This re-processed dataset will provide high quality estimates of land motions, enabling regional and global high-precision geophysical/geodetic studies. Several of the individual TIGA Analysis Centers (TACs) have completed processing the full history of GPS observations recorded by the IGS global network, as well as many other GPS stations at or close to tide gauges, which are available from the TIGA data centre at the University of La Rochelle (www.sonel.org). The TAC solutions contain a total of over 700 stations. Following the recent improvements in processing models and strategies, this is the first complete reprocessing attempt by the TIGA WG to provide homogeneous position time series. The TIGA Combination Centre (TCC) at the University of Luxembourg (UL) has computed a first multi-year weekly combined solution using two independent combination software packages: CATREF and GLOBK. These combinations allow an evaluation of any effects from the combination software and of the individual TAC contributions and their influences on the combined solution. In this study we will present the first UL TIGA multi-year combination results and discuss these in terms of geocentric sea level changes.

  20. Rapid and robust characterization of the earthquake source for tsunami early-warning

    NASA Astrophysics Data System (ADS)

    Lomax, Anthony; Michelini, Alberto; Bernardi, Fabrizio; Lauciani, Valentino

    2015-04-01

    Effective tsunami early-warning after an earthquake is difficult when the distances and tsunami travel-times between earthquake/tsunami source regions and coast lines at risk are small, especially since the density of seismic and other monitoring stations is very low in most regions of risk. For tsunami warning worldwide, seismic monitoring and analysis currently provide the majority of information available within the first tens of minutes after an earthquake. This information is used for direct tsunami hazard assessment, and as basic input to real-time, tsunami hazard modeling. It is thus crucial that key earthquake parameters are determined as rapidly and reliably as possible, in a probabilistic, time-evolving manner, along with full uncertainties. Early-est (EArthquake Rapid Location sYstem with EStimation of Tsunamigenesis) is the module for rapid earthquake detection, location and analysis at the INGV tsunami alert center (CAT, "Centro di Allerta Tsunami"), part of the Italian, candidate Tsunami Watch Provider. Here we present the information produced by Early-est within the first 10 min after an earthquake to characterize the location, depth, magnitude, mechanism and tsunami potential of an earthquake. We discuss key algorithms in Early-est that produce fully automatic, robust results and their uncertainties in the shortest possible time using sparse observations. For example, a broadband picker and a robust, probabilistic, global-search detector/associator/locator component of Early-est can detect and locate a seismic event with as few as 4 to 5 P onset observations. We also discuss how these algorithms may be further evolved to provide even earlier and more robust results. Finally, we illustrate how the real-time, evolutionary and probabilistic earthquake information produced by Early-est, along with prior and non-seismic information and later seismic information (e.g., full-waveform moment-tensors), may be used within time-evolving, decision and modeling systems for tsunami early warning.
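    To make the idea of a global-search location from a handful of P onsets concrete, the toy sketch below grid-searches an epicentre and origin time that minimize the RMS of P arrival-time residuals, assuming a constant apparent velocity on a flat 2-D geometry. This is only an illustration of the general approach, not the Early-est algorithm, which uses proper travel-time tables, probabilistic misfit functions and full uncertainty estimates.

```python
# Toy epicentre location by exhaustive grid search over P arrival times.
# Constant-velocity, flat-geometry illustration only; not the Early-est method.
import numpy as np

VP = 8.0                                                  # km/s, assumed apparent P velocity
stations = np.array([[0.0, 0.0], [120.0, 10.0], [40.0, 150.0],
                     [-80.0, 90.0], [60.0, -70.0]])       # km, hypothetical station layout
true_src, true_t0 = np.array([30.0, 40.0]), 5.0           # "unknown" source used to fake picks
t_obs = true_t0 + np.linalg.norm(stations - true_src, axis=1) / VP
t_obs += 0.1 * np.random.default_rng(3).standard_normal(t_obs.size)   # picking noise

xs = ys = np.arange(-100.0, 201.0, 2.0)                   # search grid, km
best, best_rms = None, np.inf
for x in xs:
    for y in ys:
        tt = np.linalg.norm(stations - np.array([x, y]), axis=1) / VP
        t0 = np.mean(t_obs - tt)                          # origin time best fitting this node
        rms = np.sqrt(np.mean((t_obs - tt - t0) ** 2))
        if rms < best_rms:
            best, best_rms = (x, y, t0), rms

print(f"best grid node: x={best[0]:.0f} km, y={best[1]:.0f} km, t0={best[2]:.2f} s, "
      f"rms={best_rms:.2f} s (true: 30, 40, 5.0)")
```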

  1. Using graphics and expert system technologies to support satellite monitoring at the NASA Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Hughes, Peter M.; Shirah, Gregory W.; Luczak, Edward C.

    1994-01-01

    At NASA's Goddard Space Flight Center, fault-isolation expert systems have been developed to support data monitoring and fault detection tasks in satellite control centers. Based on the lessons learned during these efforts in expert system automation, a new domain-specific expert system development tool named the Generic Spacecraft Analysts Assistant (GenSAA), was developed to facilitate the rapid development and reuse of real-time expert systems to serve as fault-isolation assistants for spacecraft analysts. This paper describes GenSAA's capabilities and how it is supporting monitoring functions of current and future NASA missions for a variety of satellite monitoring applications ranging from subsystem health and safety to spacecraft attitude. Finally, this paper addresses efforts to generalize GenSAA's data interface for more widespread usage throughout the space and commercial industry.

  2. Using graphics and expert system technologies to support satellite monitoring at the NASA Goddard Space Flight Center

    NASA Astrophysics Data System (ADS)

    Hughes, Peter M.; Shirah, Gregory W.; Luczak, Edward C.

    1994-11-01

    At NASA's Goddard Space Flight Center, fault-isolation expert systems have been developed to support data monitoring and fault detection tasks in satellite control centers. Based on the lessons learned during these efforts in expert system automation, a new domain-specific expert system development tool named the Generic Spacecraft Analysts Assistant (GenSAA), was developed to facilitate the rapid development and reuse of real-time expert systems to serve as fault-isolation assistants for spacecraft analysts. This paper describes GenSAA's capabilities and how it is supporting monitoring functions of current and future NASA missions for a variety of satellite monitoring applications ranging from subsystem health and safety to spacecraft attitude. Finally, this paper addresses efforts to generalize GenSAA's data interface for more widespread usage throughout the space and commercial industry.

  3. Broadband characteristics of earthquakes recorded during a dome-building eruption at Mount St. Helens, Washington, between October 2004 and May 2005: Chapter 5 in A volcano rekindled: the renewed eruption of Mount St. Helens, 2004-2006

    USGS Publications Warehouse

    Horton, Stephen P.; Norris, Robert D.; Moran, Seth C.

    2008-01-01

    From October 2004 to May 2005, the Center for Earthquake Research and Information of the University of Memphis operated two to six broadband seismometers within 5 to 20 km of Mount St. Helens to help monitor recent seismic and volcanic activity. Approximately 57,000 earthquakes identified during the 7-month deployment had a normal magnitude distribution with a mean magnitude of 1.78 and a standard deviation of 0.24 magnitude units. Both the mode and range of earthquake magnitude and the rate of activity varied during the deployment. We examined the time domain and spectral characteristics of two classes of events seen during dome building. These include volcano-tectonic earthquakes and lower-frequency events. Lower-frequency events are further classified into hybrid earthquakes, low-frequency earthquakes, and long-duration volcanic tremor. Hybrid and low-frequency earthquakes showed a continuum of characteristics that varied systematically with time. A progressive loss of high-frequency seismic energy occurred in earthquakes as magma approached and eventually reached the surface. The spectral shape of large and small earthquakes occurring within days of each other did not vary with magnitude. Volcanic tremor events and lower-frequency earthquakes displayed consistent spectral peaks, although higher frequencies were more favorably excited during tremor than earthquakes.

  4. Earthquake history of South Carolina

    USGS Publications Warehouse

    von Hake, C. A.

    1976-01-01

    An estimated $23 million in damage was caused by one of the great earthquakes in United States history in 1886. Charleston, S.C., and nearby cities suffered most of the damage, although points as far as 160 km away were strongly shaken. Many of the 20 earthquakes of intensity V or greater (Modified Mercalli scale) that centered within South Carolina occurred near Charleston. A 1924 shock in the western part of the State was felt over 145,000 km2. Several earthquakes outside the State borders were felt strongly in South Carolina.

  5. Earthquake Precursors in Thermal Infrared Data

    NASA Astrophysics Data System (ADS)

    Alqassim, S. S.; Vanderbilt, V. C.

    2010-12-01

    As part of an agreement between NASA and the Arab Youth Venture Foundation (AYVF), three engineering students from the United Arab Emirates (UAE) participated in a 10-week experiential learning program this summer. This educational program is managed by the NASA Ames Research Center Office of Education and Public Outreach and is administered by the Education Associates Program (EAP). One of the research projects under this program tested the hypothesis that signals emitted by the Earth’s surface prior to the occurrence of an earthquake, including thermal infrared (TIR) emissions, can be detected through appropriate analysis of data collected by the Moderate-resolution Imaging Spectroradiometer (MODIS) satellite sensors. After applying a set of preprocessing algorithms to the satellite data, we analyzed MODIS images showing the TIR emitted by a ground area in the days prior to an eventual earthquake. We used computing tools and software, such as MATLAB and ENVI, to isolate these pre-seismic signals from the background noise. The development of a technique to monitor pre-seismic signals holds promise in finding a method to predict earthquakes.

  6. Anthropogenic Triggering of Large Earthquakes

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Bizzarri, Andrea

    2014-08-01

    The physical mechanism of the anthropogenic triggering of large earthquakes on active faults is studied on the basis of experimental phenomenology, i.e., that earthquakes occur on active tectonic faults, that crustal stress values are those measured in situ and, on active faults, comply to the values of the stress drop measured for real earthquakes, that the static friction coefficients are those inferred on faults, and that the effective triggering stresses are those inferred for real earthquakes. Deriving the conditions for earthquake nucleation as a time-dependent solution of the Tresca-Von Mises criterion applied in the framework of poroelasticity yields that active faults can be triggered by fluid overpressures < 0.1 MPa. Comparing this with the deviatoric stresses at the depth of crustal hypocenters, which are of the order of 1-10 MPa, we find that injecting in the subsoil fluids at the pressures typical of oil and gas production and storage may trigger destructive earthquakes on active faults at a few tens of kilometers. Fluid pressure propagates as slow stress waves along geometric paths operating in a drained condition and can advance the natural occurrence of earthquakes by a substantial amount of time. Furthermore, it is illusory to control earthquake triggering by close monitoring of minor ``foreshocks'', since the induction may occur with a delay up to several years.

  7. Anthropogenic Triggering of Large Earthquakes

    PubMed Central

    Mulargia, Francesco; Bizzarri, Andrea

    2014-01-01

    The physical mechanism of the anthropogenic triggering of large earthquakes on active faults is studied on the basis of experimental phenomenology, i.e., that earthquakes occur on active tectonic faults, that crustal stress values are those measured in situ and, on active faults, comply to the values of the stress drop measured for real earthquakes, that the static friction coefficients are those inferred on faults, and that the effective triggering stresses are those inferred for real earthquakes. Deriving the conditions for earthquake nucleation as a time-dependent solution of the Tresca-Von Mises criterion applied in the framework of poroelasticity yields that active faults can be triggered by fluid overpressures < 0.1 MPa. Comparing this with the deviatoric stresses at the depth of crustal hypocenters, which are of the order of 1–10 MPa, we find that injecting in the subsoil fluids at the pressures typical of oil and gas production and storage may trigger destructive earthquakes on active faults at a few tens of kilometers. Fluid pressure propagates as slow stress waves along geometric paths operating in a drained condition and can advance the natural occurrence of earthquakes by a substantial amount of time. Furthermore, it is illusory to control earthquake triggering by close monitoring of minor “foreshocks”, since the induction may occur with a delay up to several years. PMID:25156190

  8. Earthquake Analysis.

    ERIC Educational Resources Information Center

    Espinoza, Fernando

    2000-01-01

    Indicates the importance of the development of students' measurement and estimation skills. Analyzes earthquake data recorded at seismograph stations and explains how to read and modify the graphs. Presents an activity for student evaluation. (YDS)

  9. Earthquake Facts

    MedlinePlus

    ... and have smaller magnitudes than earthquakes on the Earth. It appears they are related to the tidal stresses associated with the varying distance between the Earth and Moon. They also occur at great depth, ...

  10. Earthquake watch

    USGS Publications Warehouse

    Hill, M.

    1976-01-01

    When the time comes that earthquakes can be predicted accurately, what shall we do with the knowledge? This was the theme of a November 1975 conference on earthquake warning and response, held in San Francisco and convened by Assistant Secretary of the Interior Jack W. Carlson. Invited were officials of State and local governments from Alaska, California, Hawaii, Idaho, Montana, Nevada, Utah, Washington, and Wyoming, and representatives of the news media.

  11. Rapid characterization of the 2015 Mw 7.8 Gorkha, Nepal, earthquake sequence and its seismotectonic context

    USGS Publications Warehouse

    Hayes, Gavin; Briggs, Richard; Barnhart, William D.; Yeck, William; McNamara, Daniel E.; Wald, David J.; Nealy, Jennifer; Benz, Harley M.; Gold, Ryan D.; Jaiswal, Kishor S.; Marano, Kristin; Earle, Paul; Hearne, Mike; Smoczyk, Gregory M.; Wald, Lisa A.; Samsonov, Sergey

    2015-01-01

    Earthquake response and related information products are important for placing recent seismic events into context and particularly for understanding the impact earthquakes can have on the regional community and its infrastructure. These tools are even more useful if they are available quickly, ahead of detailed information from the areas affected by such earthquakes. Here we provide an overview of the response activities and related information products generated and provided by the U.S. Geological Survey National Earthquake Information Center in association with the 2015 M 7.8 Gorkha, Nepal, earthquake. This group monitors global earthquakes 24 hrs/day and 7 days/week to provide rapid information on the location and size of recent events and to characterize the source properties, tectonic setting, and potential fatalities and economic losses associated with significant earthquakes. We present the timeline over which these products became available, discuss what they tell us about the seismotectonics of the Gorkha earthquake and its aftershocks, and examine how their information is used today, and might be used in the future, to help mitigate the impact of such natural disasters.

  12. Impending ionospheric anomaly preceding the Iquique Mw8.2 earthquake in Chile on 2014 April 1

    NASA Astrophysics Data System (ADS)

    Guo, Jinyun; Li, Wang; Yu, Hongjuan; Liu, Zhimin; Zhao, Chunmei; Kong, Qiaoli

    2015-12-01

    To investigate the coupling relationship between great earthquakes and the ionosphere, GPS-derived total electron contents (TECs) from the Center for Orbit Determination in Europe and foF2 data from the Space Weather Prediction Center were used to analyse impending ionospheric anomalies before the Iquique Mw8.2 earthquake in Chile on 2014 April 1. After eliminating the effects of solar and geomagnetic activity on the ionosphere by means of a sliding interquartile range with a 27-day window, the TEC analysis shows that negative anomalies occurred on the 15th day prior to the earthquake and positive anomalies appeared on the 5th day before the earthquake. The foF2 analysis for the ionosonde stations Jicamarca, Concepcion and Ramey shows that foF2 increased by 40, 50 and 45 per cent, respectively, on the 5th day before the earthquake. The TEC anomaly distribution indicates a widespread TEC decrement over the epicentre, with a duration of 6 hr, on the 15th day before the earthquake. On the 5th day before the earthquake, the TEC over the epicentre increased with an amplitude of 15 TECu, and the duration exceeded 6 hr. The anomalies occurred on the side away from the equator. All TEC anomalies in these days were within the bounds of the equatorial anomaly zone, which should therefore be the focal area for monitoring ionospheric anomalies before strong earthquakes. The relationship between the ionospheric anomalies and geomagnetic activity was examined by cross wavelet analysis, which implied that foF2 was not affected by magnetic activity on the 15th and 5th days prior to the earthquake, but that the TECs were partially affected by anomalous magnetic activity during some periods of the 5th day prior to the earthquake.
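
    The 27-day sliding interquartile range test described above is straightforward to prototype. The sketch below is a minimal, hypothetical implementation (the window length, the 1.5*IQR bound and the synthetic series are illustrative assumptions, not values or data from the paper) that flags TEC values falling outside the sliding upper and lower bounds.

        # Minimal sketch of a sliding-interquartile-range anomaly test for a daily
        # TEC time series. Window length and 1.5*IQR bound are illustrative choices.
        import numpy as np
        import pandas as pd

        def tec_anomalies(tec: pd.Series, window: int = 27, k: float = 1.5) -> pd.DataFrame:
            q1 = tec.rolling(window, center=True, min_periods=window).quantile(0.25)
            q3 = tec.rolling(window, center=True, min_periods=window).quantile(0.75)
            iqr = q3 - q1
            return pd.DataFrame({
                "tec": tec,
                "positive_anomaly": tec > q3 + k * iqr,   # possible enhancement
                "negative_anomaly": tec < q1 - k * iqr,   # possible depletion
            })

        # Hypothetical usage with synthetic data standing in for real TEC values:
        days = pd.date_range("2014-02-01", "2014-04-01", freq="D")
        tec = pd.Series(20 + np.random.randn(len(days)), index=days)
        flags = tec_anomalies(tec)
        print(flags[flags.positive_anomaly | flags.negative_anomaly])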

  13. Earthquake history of Nevada

    USGS Publications Warehouse

    von Hake, C. A.

    1974-01-01

    Since 1852, more than 30 shocks of intensity VI or greater (Modified Mercalli scale) have occurred in western Nevada. At least three of these were classified as intensity X. In addition, seven earthquakes (intensity VI or greater) were centered in the eastern part of the State. Almost 2,000 other shocks have been cataloged in the nearly 125-year period of historical records. Thus, Nevada ranks among the most seismically active States. 

  14. Earthquakes and the urban environment. Volume III

    SciTech Connect

    Berlin, G.L.

    1980-01-01

    Because of the complex nature of earthquake effects, current investigations encompass many disciplines, including those of both the physical and social sciences. Research activities center on such diversified topics as earthquake mechanics, earthquake prediction and control, the prompt and accurate detection of tsunamis (seismic sea waves), earthquake-resistant construction, seismic building code improvements, land use zoning, earthquake risk and hazard perception, disaster preparedness, plus the study of the concerns and fears of people who have experienced the effects of an earthquake. This monograph attempts to amalgamate recent research input comprising the vivifying components of urban seismology at a level useful to those having an interest in the earthquake and its effects upon an urban environment. Volume 3 contains chapters on seismic planning, social aspects and future prospects.

  15. Satellite relay telemetry of seismic data in earthquake prediction and control

    USGS Publications Warehouse

    Jackson, Wayne H.; Eaton, Jerry P.

    1971-01-01

    The Satellite Telemetry Earthquake Monitoring Program was started in FY 1968 to evaluate the applicability of satellite relay telemetry in the collection of seismic data from a large number of dense seismograph clusters laid out along the major fault systems of western North America. Prototype clusters utilizing phone-line telemetry were then being installed by the National Center for Earthquake Research (NCER) in 3 regions along the San Andreas fault in central California; and the experience of installing and operating the clusters and in reducing and analyzing the seismic data from them was to provide the raw materials for evaluation in the satellite relay telemetry project.

  16. Groundwater monitoring program plan and conceptual site model for the Al-Tuwaitha Nuclear Research Center in Iraq.

    SciTech Connect

    Copland, John Robin; Cochran, John Russell

    2013-07-01

    The Radiation Protection Center of the Iraqi Ministry of Environment is developing a groundwater monitoring program (GMP) for the Al-Tuwaitha Nuclear Research Center located near Baghdad, Iraq. The Al-Tuwaitha Nuclear Research Center was established in about 1960 and is currently being cleaned up and decommissioned by Iraq's Ministry of Science and Technology. This Groundwater Monitoring Program Plan (GMPP) and Conceptual Site Model (CSM) support the Radiation Protection Center by providing: a CSM describing the hydrogeologic regime and contaminant issues, recommendations for future groundwater characterization activities, and descriptions of the organizational elements of a groundwater monitoring program. The Conceptual Site Model identifies a number of potential sources of groundwater contamination at Al-Tuwaitha. The model also identifies two water-bearing zones (a shallow groundwater zone and a regional aquifer). The depth to the shallow groundwater zone varies from approximately 7 to 10 meters (m) across the facility. The shallow groundwater zone is composed of a layer of silty sand and fine sand that does not extend laterally across the entire facility. An approximately 4-m thick layer of clay underlies the shallow groundwater zone. The depth to the regional aquifer varies from approximately 14 to 17 m across the facility. The regional aquifer is composed of interfingering layers of silty sand, fine-grained sand, and medium-grained sand. Based on the limited analyses described in this report, there is no severe contamination of the groundwater at Al-Tuwaitha with radioactive constituents. However, significant data gaps exist and this plan recommends the installation of additional groundwater monitoring wells and conducting additional types of radiological and chemical analyses.

  17. Cooperative Monitoring Center Occasional Paper/11: Cooperative Environmental Monitoring in the Coastal Regions of India and Pakistan

    SciTech Connect

    Rajen, Gauray

    1999-06-01

    The cessation of hostilities between India and Pakistan is an immediate need and of global concern, as these countries have tested nuclear devices, and have the capability to deploy nuclear weapons and long-range ballistic missiles. Cooperative monitoring projects among neighboring countries in South Asia could build regional confidence, and, through gradual improvements in relations, reduce the threat of war and the proliferation of weapons of mass destruction. This paper discusses monitoring the trans-border movement of flow and sediment in the Indian and Pakistani coastal areas. Through such a project, India and Pakistan could initiate greater cooperation, and engender movement towards the resolution of the Sir Creek territorial dispute in their coastal region. The Joint Working Groups dialogue being conducted by India and Pakistan provides a mechanism for promoting such a project. The proposed project also falls within a regional framework of cooperation agreed to by several South Asian countries. This framework has been codified in the South Asian Seas Action Plan, developed by Bangladesh, India, Maldives, Pakistan and Sri Lanka. This framework provides a useful starting point for Indian and Pakistani cooperative monitoring in their trans-border coastal area. The project discussed in this paper involves computer modeling, the placement of in situ sensors for remote data acquisition, and the development of joint reports. Preliminary computer modeling studies are presented in the paper. These results illustrate the cross-flow connections between Indian and Pakistani coastal regions and strengthen the argument for cooperation. Technologies and actions similar to those suggested for the coastal project are likely to be applied in future arms control and treaty verification agreements. The project, therefore, serves as a demonstration of cooperative monitoring technologies. The project will also increase people-to-people contacts among Indian and Pakistani policy makers and scientists. In the perceptions of the general public, the project will crystallize the idea that the two countries share ecosystems and natural resources, and have a vested interest in increased collaboration.

  18. A National Tracking Center for Monitoring Shipments of HEU, MOX, and Spent Nuclear Fuel: How do we implement?

    SciTech Connect

    Mark Schanfein

    2009-07-01

    Nuclear material safeguards specialists and instrument developers at US Department of Energy (USDOE) National Laboratories in the United States, sponsored by the National Nuclear Security Administration (NNSA) Office of NA-24, have been developing devices to monitor shipments of UF6 cylinders and other radioactive materials. Tracking devices are being developed that are capable of monitoring shipments of valuable radioactive materials in real time, using the Global Positioning System (GPS). We envision that such devices will be extremely useful, if not essential, for monitoring the shipment of these important cargoes of nuclear material, including highly-enriched uranium (HEU), mixed plutonium/uranium oxide (MOX), spent nuclear fuel, and, potentially, other large radioactive sources. To ensure nuclear material security and safeguards, it is extremely important to track these materials because they contain so-called “direct-use material”, which is material that if diverted and processed could potentially be used to develop clandestine nuclear weapons. Large sources could be used for a dirty bomb, also known as a radioactive dispersal device (RDD). For that matter, any interdiction by an adversary regardless of intent demands a rapid response. To make the fullest use of such tracking devices, we propose a National Tracking Center. This paper describes what the attributes of such a center would be and how it could ultimately be the prototype for an International Tracking Center, possibly to be based in Vienna, at the International Atomic Energy Agency (IAEA).

  19. CTEPP-OH DATA COLLECTED ON FORM 05: CHILD DAY CARE CENTER PRE-MONITORING QUESTIONNAIRE

    EPA Science Inventory

    This data set contains data for CTEPP-OH concerning the potential sources of pollutants at the day care center including the chemicals that have been applied in the past at the day care center by staff members or by commercial contractors. The day care teacher was asked questions...

  20. CTEPP NC DATA COLLECTED ON FORM 05: CHILD DAY CARE CENTER PRE-MONITORING QUESTIONNAIRE

    EPA Science Inventory

    This data set contains data concerning the potential sources of pollutants at the day care center including the chemicals that have been applied in the past at the day care center by staff members or by commercial contractors. The day care teacher was asked questions related to t...

  1. Deep earthquakes

    SciTech Connect

    Frohlich, C.

    1989-01-01

    Earthquakes are often recorded at depths as great as 650 kilometers or more. These deep events mark regions where plates of the earth's surface are consumed in the mantle. But the earthquakes themselves present a conundrum: the high pressures and temperatures at such depths should keep rock from fracturing suddenly and generating a tremor. This paper reviews the research on this problem. Almost all deep earthquakes conform to the pattern described by Wadati, namely, they generally occur at the edge of a deep ocean and define an inclined zone extending from near the surface to a depth of 600 kilometers or more, known as the Wadati-Benioff zone. Several scenarios are described that were proposed to explain the fracturing and slipping of rocks at this depth.

  2. Ground-water-level monitoring for earthquake prediction; a progress report based on data collected in Southern California, 1976-79

    USGS Publications Warehouse

    Moyle, W.R., Jr.

    1980-01-01

    The U.S. Geological Survey is conducting a research program to determine if groundwater-level measurements can be used for earthquake prediction. Earlier studies suggest that water levels in wells may be responsive to small strains on the order of 10 to the minus 8th power to 10 to the minus 10th power (dimensionless). Water-level data being collected in the area of the southern California uplift show response to earthquakes and other natural and manmade effects. The data are presently (1979) being made ready for computer analysis. The completed analysis may indicate the presence of precursory earthquake information. (USGS)

  3. Earthquake Education and Outreach in Haiti

    Following the devastating 2010 Haiti earthquake, the USGS has been helping with earthquake awareness and monitoring in the country, with continued support from the U.S. Agency for International Development (USAID). This assistance has helped the Bureau des Mines et de l'Energie (BME) in Port-au-Prin...

  4. Earthquakes in Arkansas and vicinity 1699-2010

    USGS Publications Warehouse

    Dart, Richard L.; Ausbrooks, Scott M.

    2011-01-01

    This map summarizes approximately 300 years of earthquake activity in Arkansas. It is one in a series of similar State earthquake history maps. Work on the Arkansas map was done in collaboration with the Arkansas Geological Survey. The earthquake data plotted on the map are from several sources: the Arkansas Geological Survey, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Mississippi Department of Environmental Quality. In addition to earthquake locations, other materials presented include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Arkansas and parts of adjacent states. Arkansas has undergone a number of significant felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931, and a magnitude 4.3 earthquake in 1967. The map shows all historical and instrumentally located earthquakes in Arkansas and vicinity between 1811 and 2010. The largest historic earthquake in the vicinity of the State was an intensity XI event, on December 16, 1811; the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region.

  5. Clinical observation of atrial threshold monitoring algorithm: a single center experience

    PubMed Central

    She, Jianqing; Zhou, Jing; Hu, Zhan; Xia, Yulong

    2015-01-01

    Objective: To observe the atrial capture management in an atrial threshold monitoring algorithm. By calculating the enabling rate of the atrial threshold monitoring algorithm and comparing atrial thresholds measured automatically and manually, we evaluate its safety, reliability and applicability in clinical practice. Methods and results: Data were collected at implant, at the start of atrial threshold monitoring, at visits scheduled 1 month, 2 months and 4 months thereafter, and upon notification of adverse events. The atrial threshold monitoring algorithm was enabled in 94 patients and could not be enabled in 38, indicating an enabling rate of 71.2%. Causes of the unsuccessful attempts to enable automatic atrial threshold measurement were tachycardia (2, 5.3%) and an unmet atrial safety margin (36, 94.7%). A total of 88 pairs of atrial thresholds measured automatically and manually were obtained. The auto threshold was 0.528 ± 0.270 V, and the manual threshold was 0.580 ± 0.223 V. There was a strong correlation between the automatic measurements and those performed manually by the physician (P < 0.05). No significant differences were observed during the 1-month, 2-month and 4-month follow-up. Conclusion: The atrial threshold monitoring algorithm is safe, reliable and applicable over time. The atrial threshold measured by the monitoring algorithm was demonstrated to be clinically equivalent to the manually tested atrial threshold. The addition of atrial threshold monitoring will benefit patients by reducing energy cost and enhancing pacemaker safety. PMID:26131207
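
    As a sanity check on the kind of comparison reported above (an enabling rate plus automatic-versus-manual agreement), the sketch below computes an enabling rate and runs a correlation and a paired test with SciPy; the threshold values generated here are synthetic placeholders, not the study's data.

        # Sketch of the two summary computations in the abstract: the enabling rate
        # and a paired comparison of automatic vs. manual atrial thresholds.
        # All numbers below are synthetic placeholders, not the study's data.
        import numpy as np
        from scipy import stats

        enabled, not_enabled = 94, 38
        enabling_rate = enabled / (enabled + not_enabled)
        print(f"enabling rate: {enabling_rate:.1%}")            # about 71.2%

        rng = np.random.default_rng(0)
        auto = rng.normal(0.53, 0.27, size=88)                  # automatic thresholds (V)
        manual = auto + rng.normal(0.05, 0.10, size=88)         # manual thresholds (V)

        r, p_corr = stats.pearsonr(auto, manual)                # agreement between methods
        t, p_diff = stats.ttest_rel(auto, manual)               # paired test for a mean difference
        print(f"Pearson r = {r:.2f} (p = {p_corr:.3g}); paired t-test p = {p_diff:.3g}")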

  6. 77 FR 53225 - National Earthquake Prediction Evaluation Council (NEPEC)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-31

    ... Geological Survey National Earthquake Prediction Evaluation Council (NEPEC) AGENCY: Department of the... National Earthquake Prediction Evaluation Council (NEPEC) will hold a 1-1/2 day meeting on September 17 and 18, 2012, at the U.S. Geological Survey National Earthquake Information Center (NEIC),...

  7. Cooperative Monitoring Center Occasional Paper/4: Missile Control in South Asia and the Role of Cooperative Monitoring Technology

    SciTech Connect

    Kamal, N.; Sawhney, P.

    1998-10-01

    The succession of nuclear tests by India and Pakistan in May 1998 has changed the nature of their missile rivalry, which is only one of numerous manifestations of their relationship as hardened adversaries, deeply sensitive to each other's existing and evolving defense capabilities. The political context surrounding this costly rivalry remains unmediated by arms control measures or by any nascent prospect of detente. As a parallel development, sensible voices in both countries will continue to talk of building mutual confidence through openness to avert accidents, misjudgments, and misinterpretations. To facilitate a future peace process, this paper offers possible suggestions for stabilization that could be applied to India's and Pakistan's missile situation. Appendices include descriptions of existing missile agreements that have contributed to better relations for other countries as well as a list of the cooperative monitoring technologies available to provide information useful in implementing subcontinent missile regimes.

  8. Earthquake Shaking - Finding the "Hot Spots"

    USGS Publications Warehouse

    Field, Ned; Jones, Lucile; Jordan, Tom; Benthien, Mark; Wald, Lisa

    2001-01-01

    A new Southern California Earthquake Center study has quantified how local geologic conditions affect the shaking experienced in an earthquake. The important geologic factors at a site are softness of the rock or soil near the surface and thickness of the sediments above hard bedrock. Even when these 'site effects' are taken into account, however, each earthquake exhibits unique 'hotspots' of anomalously strong shaking. Better predictions of strong ground shaking will therefore require additional geologic data and more comprehensive computer simulations of individual earthquakes.

  9. Cooperative Monitoring Center Occasional Paper/7: A Generic Model for Cooperative Border Security

    SciTech Connect

    Netzer, Colonel Gideon

    1999-03-01

    This paper presents a generic model for dealing with security problems along borders between countries. It presents descriptions and characteristics of various borders and identifies the threats to border security, while emphasizing cooperative monitoring solutions.

  10. Analysis of Instrumentation to Monitor the Hydrologic Performance of Green Infrastructure at the Edison Environmental Center

    EPA Science Inventory

    Infiltration is one of the primary functional mechanisms of green infrastructure stormwater controls, so this study explored selection and placement of embedded soil moisture and water level sensors to monitor surface infiltration and infiltration into the underlying soil for per...

  11. Aftershocks series monitoring of the September 18, 2004 M = 4.6 earthquake at the western Pyrenees: A case of reservoir-triggered seismicity?

    NASA Astrophysics Data System (ADS)

    Ruiz, M.; Gaspà, O.; Gallart, J.; Díaz, J.; Pulgar, J. A.; García-Sansegundo, J.; López-Fernández, C.; González-Cortina, J. M.

    2006-10-01

    On September 18, 2004, a 4.6 mbLg earthquake was widely felt in the region around Pamplona, at the western Pyrenees. Preliminary locations reported an epicenter less than 20 km ESE of Pamplona and close to the Itoiz reservoir, which started impounding in January 2004. The area apparently lacks significant seismic activity in recent times. After the main shock, which was preceded by a series of foreshocks reaching magnitudes of 3.3 mbLg, a dense temporal network of 13 seismic stations was deployed there to monitor the aftershock series and to constrain the hypocentral pattern. Aftershock determinations obtained with a double-difference algorithm define a narrow epicentral zone of less than 10 km², ESE-WNW oriented. The events are mainly concentrated between 3 and 9 km depth. Focal solutions were computed for the main event and 12 aftershocks, including the largest secondary one of 3.8 mbLg. They show mainly normal faulting with some strike-slip component and one of the nodal planes oriented NW-SE and dipping to the NE. Cross-correlation techniques applied to detect and associate events with similar waveforms provided up to 33 families relating 67% of the 326 relocated aftershocks. Families show event clusters grouped by periods and migrating from NW to SE. Interestingly, the narrow epicentral zone inferred here is located less than 4 km away from the 111-m high Itoiz dam. These hypocentral results, and the correlation observed between fluctuations of the reservoir water level and the seismic activity, favour the explanation of this foreshock-aftershock series as a rapid-response case of reservoir-triggered seismicity, brought on by the first impoundment of the Itoiz reservoir. The region is folded and affected by shallow dipping thrusts, and the Itoiz reservoir is located on the hangingwall of a low-angle southward-verging thrust, which might be sensitive to water level fluctuations. However, continued seismic monitoring in the coming years is mandatory in this area to infer more reliable seismotectonic and hazard assessments.
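
    The family-grouping step described above (cross-correlating waveforms and clustering events whose traces are similar) can be sketched as follows; the 0.8 correlation threshold and the random traces are assumptions for illustration, not the values or data used in the study.

        # Minimal sketch of grouping events into "families" by waveform similarity:
        # normalized cross-correlation between every pair of traces, then a simple
        # single-linkage grouping above a correlation threshold (0.8 here, an
        # illustrative choice only).
        import numpy as np

        def max_norm_xcorr(a: np.ndarray, b: np.ndarray) -> float:
            a = (a - a.mean()) / (a.std() * len(a))
            b = (b - b.mean()) / b.std()
            return float(np.max(np.correlate(a, b, mode="full")))

        def build_families(traces: list[np.ndarray], threshold: float = 0.8) -> list[set[int]]:
            n = len(traces)
            family_of = list(range(n))                     # start with every event alone
            for i in range(n):
                for j in range(i + 1, n):
                    if max_norm_xcorr(traces[i], traces[j]) >= threshold:
                        old, new = family_of[j], family_of[i]
                        family_of = [new if f == old else f for f in family_of]
            groups = {}
            for idx, fam in enumerate(family_of):
                groups.setdefault(fam, set()).add(idx)
            return list(groups.values())

        # Hypothetical usage with random traces standing in for aftershock waveforms:
        rng = np.random.default_rng(1)
        traces = [rng.standard_normal(500) for _ in range(10)]
        print(build_families(traces))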

  12. The World Trade Center Attack: Similarities to the 1988 earthquake in Armenia: time to teach the public life-supporting first aid?

    PubMed Central

    Crippen, David

    2001-01-01

    On 7 December 1988, a severe earthquake hit Armenia, a former republic of the Soviet Union (USSR); on 11 September 2001, a manmade attack of similar impact hit New York City. These events share similar implications for the role of the uninjured survivor. With basic training, uninjured survivors could save lives without tools or resuscitation equipment. This article makes the case for teaching life-supporting first aid to the public in the hope that one day, should another such incident occur, they would be able to sustain injured victims until formal rescue occurs. PMID:11737915

  13. Earthquake tectonics

    SciTech Connect

    Steward, R.F.

    1991-02-01

    Earthquakes release a tremendous amount of energy into the subsurface in the form of seismic waves. The seismic wave energy of the San Francisco 1906 (M = 8.2) earthquake was equivalent to over 8 billion tons of TNT (3.3 × 10^19 joules). Four basic wave types are propagated from seismic sources, two non-rotational and two rotational. As opposed to the non-rotational R and SH waves, the rotational compressional (RC) and rotational shear (RS) waves carry the bulk of the energy from a seismic source. RC wavefronts propagate in the subsurface and refract similarly to P waves, but are considerably slower. RC waves are critically refracted beneath the air-surface interface at velocities less than the velocity of sound in air because they refract at the velocity of sound in air minus the retrograde particle velocity at the top of the wave. They propagate as tsunami waves in the open ocean, and produce loud sounds on land that are heard by humans and animals during earthquakes. The energy of the RS wave dwarfs that of the P, SH, and even the RC wave. The RS wave is the same as what is currently called the S wave in earthquake seismology, and produces both folding and strike-slip faulting at considerable distances from the epicenter. RC and RS waves, propagated during earthquakes from the Santa Ynez fault and a right-slip fault on trend with the Red Mountain fault, produced the Santa Ynez Mountains in California beginning in the middle Pliocene and continuing until the present.
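
    The TNT equivalence quoted above can be checked with the standard convention of 4.184 × 10^9 J per ton of TNT (a conventional definition, not a figure taken from the article):

        \frac{3.3\times10^{19}\ \text{J}}{4.184\times10^{9}\ \text{J/ton TNT}}
          \approx 7.9\times10^{9}\ \text{tons} \approx 8\ \text{billion tons of TNT}.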

  14. The EM Earthquake Precursor

    NASA Astrophysics Data System (ADS)

    Jones, K. B., II; Saxton, P. T.

    2013-12-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and warn the public in turn. Presently, the animal kingdom leads the precursor list, alluding to a transmission-related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the direction of EM investigation. After the 1989 Loma Prieta Earthquake, American earthquake investigators predetermined magnetometer use and a minimum earthquake magnitude necessary for EM detection. This action was set in motion due to the extensive damage incurred and public outrage concerning earthquake forecasting; however, the magnetometers employed, grounded or buried, are completely subject to static and electric fields and have yet to correlate to an identifiable precursor. Secondly, there is neither a networked array for finding any epicentral locations, nor have there been any attempts to find even one. This methodology needs dismissal, because it is overly complicated, subject to continuous change, and provides no response time. As for the minimum magnitude threshold, which was set at M5, this is simply higher than what modern technological advances have gained. Detection can now be achieved at approximately M1, which greatly improves forecasting chances. A propagating precursor has now been detected in both the field and laboratory. Field antenna testing conducted outside the NE Texas town of Timpson in February, 2013, detected three strong EM sources along with numerous weaker signals. The antenna had mobility, and observations were noted for recurrence, duration, and frequency response. Next, two directional techniques were employed, resulting in three mapped, potential epicenters. The remaining, weaker signals presented similar directionality results to more epicentral locations. In addition, the directional results of the Timpson field tests led to the design and construction of a third prototype antenna. In a laboratory setting, experiments were created to fail igneous rock types within a custom-designed Faraday Cage. An antenna emplaced within the cage detected EM emissions, which were both reproducible and distinct, and the laboratory results paralleled field results. With a viable system and continuous monitoring, a fracture cycle could be established and observed in real-time. Sequentially, field data would be reviewed quickly for assessment; thus, leading to a much improved earthquake forecasting capability. The EM precursor determined by this method may surpass all prior precursor claims, and the general public will finally receive long overdue forecasting.

  15. 4 Earthquake: Major offshore earthquakes recall the Aztec myth

    USGS Publications Warehouse

    United States Department of Commerce

    1970-01-01

    Long before the sun clears the eastern mountains on April 29, 1970, the savanna highlands of Chiapas tremble from a magnitude 6.7 earthquake centered off the Pacific coast near Mexico’s southern border. Then, for a few hours, the Isthmus of Tehuantepec is quiet.

  16. 2010 Chile Earthquake Aftershock Response

    NASA Astrophysics Data System (ADS)

    Barientos, Sergio

    2010-05-01

    The Mw=8.8 earthquake off the coast of Chile on 27 February 2010 is the 5th largest megathrust earthquake ever to be recorded and provides an unprecedented opportunity to advance our understanding of megathrust earthquakes and associated phenomena. The 2010 Chile earthquake ruptured the Concepcion-Constitucion segment of the Nazca/South America plate boundary, south of the Central Chile region, and triggered a tsunami along the coast. Following the 2010 earthquake, a very energetic aftershock sequence is being observed in an area that extends 600 km along strike, from Valparaiso to 150 km south of Concepcion. Within the first three weeks there were over 260 aftershocks with magnitude 5.0 or greater and 18 with magnitude 6.0 or greater (NEIC, USGS). The Concepcion-Constitucion segment lies immediately north of the rupture zone associated with the great magnitude 9.5 Chile earthquake of 1960, and south of the 1906 and the 1985 Valparaiso earthquakes. The last great subduction earthquake in the region dates back to the February 1835 event described by Darwin (1871). Since 1835, part of the region was affected in the north by the Talca earthquake in December 1928, interpreted as a shallow dipping thrust event, and by the Chillan earthquake (Mw 7.9, January 1939), a slab-pull intermediate depth earthquake. For the last 30 years, geodetic studies in this area were consistent with a fully coupled elastic loading of the subduction interface at depth; this led to identifying the area as a mature seismic gap with potential for an earthquake of magnitude of the order of 8.5 or several earthquakes of lesser magnitude. What was less expected was the partial rupturing of the 1985 segment toward the north. Today, the 2010 earthquake raises some disturbing questions: Why and how did the rupture terminate where it did at the northern end? How did the 2010 earthquake load the adjacent segment to the north, and did the 1985 earthquake only partially rupture the plate interface, leaving asperities loaded since 1906? Since the number of M>7.0 aftershocks has been low, does the distribution of large-magnitude aftershocks differ from previous events of this size? What is the origin of the extensional-type aftershocks at shallow depths within the upper plate? The international seismological community (France, Germany, U.K., U.S.A.), in collaboration with the Chilean seismological community, responded with a total of 140 portable seismic stations to deploy in order to record aftershocks. This, combined with the Chilean permanent seismic network in the area, results in 180 stations now in operation recording continuously at 100 cps. The seismic equipment is a mix of accelerometers and short-period and broadband seismic sensors deployed along the entire length of the aftershock zone that will record the aftershock sequence for three to six months. The collected seismic data will be merged and archived to produce an international data set open to the entire seismological community immediately after archiving. Each international group will submit their data as soon as possible in standard (miniSEED) format with accompanying metadata to the IRIS DMC, where the data will be merged into a combined data set and made available to individuals and other data centers. This will be by far the best-recorded aftershock sequence of a large megathrust earthquake. This outstanding international collaboration will provide an open data set for this important earthquake as well as provide a model for future aftershock deployments around the world.
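
    For the merged, openly archived data set described above, a minimal ObsPy sketch of the read-and-merge step, and of requesting archived waveforms from the IRIS DMC, might look like the following; the file pattern, network/station codes and time window are hypothetical placeholders, not details from the deployment.

        # Sketch of reading locally recorded miniSEED files, merging overlapping
        # segments, and requesting archived waveforms from the IRIS DMC with ObsPy.
        # File names, network/station codes and times are hypothetical placeholders.
        from obspy import read, UTCDateTime
        from obspy.clients.fdsn import Client

        # Merge continuous records from one portable station into a single stream.
        st = read("station_XX.*.mseed")       # hypothetical file pattern
        st.merge(method=1, fill_value="interpolate")
        print(st)

        # Request the same interval from the IRIS DMC once the data are archived.
        client = Client("IRIS")
        t0 = UTCDateTime("2010-03-01T00:00:00")
        st_dmc = client.get_waveforms(network="XX", station="STA1", location="*",
                                      channel="HH?", starttime=t0, endtime=t0 + 3600)
        print(st_dmc)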

  17. Patient-Centered Technological Assessment and Monitoring of Depression for Low-Income Patients

    PubMed Central

    Wu, Shinyi; Vidyanti, Irene; Liu, Pai; Hawkins, Caitlin; Ramirez, Magaly; Guterman, Jeffrey; Gross-Schulman, Sandra; Sklaroff, Laura Myerchin; Ell, Kathleen

    2014-01-01

    Depression is a significant challenge for ambulatory care because it worsens health status and outcomes, increases health care utilization and costs, and elevates suicide risk. An automatic telephonic assessment (ATA) system that links with tasks and alerts to providers may improve the quality of depression care and increase provider productivity. We used the ATA system in a trial to assess and monitor depressive symptoms of 444 safety-net primary care patients with diabetes. We assessed system properties, evaluated preliminary clinical outcomes, and estimated cost savings. The ATA system is feasible, reliable, valid, safe, and likely cost-effective for depression screening and monitoring in a low-income primary care population. PMID:24525531

  18. Patient-centered technological assessment and monitoring of depression for low-income patients.

    PubMed

    Wu, Shinyi; Vidyanti, Irene; Liu, Pai; Hawkins, Caitlin; Ramirez, Magaly; Guterman, Jeffrey; Gross-Schulman, Sandra; Sklaroff, Laura Myerchin; Ell, Kathleen

    2014-01-01

    Depression is a significant challenge for ambulatory care because it worsens health status and outcomes, increases health care utilization and costs, and elevates suicide risk. An automatic telephonic assessment (ATA) system that links with tasks and alerts to providers may improve the quality of depression care and increase provider productivity. We used the ATA system in a trial to assess and monitor depressive symptoms of 444 safety-net primary care patients with diabetes. We assessed system properties, evaluated preliminary clinical outcomes, and estimated cost savings. The ATA system is feasible, reliable, valid, safe, and likely cost-effective for depression screening and monitoring in a low-income primary care population. PMID:24525531

  19. Investigations of Anomalous Earthquakes at Active Volcanoes

    NASA Astrophysics Data System (ADS)

    Shuler, Ashley Elizabeth

    This dissertation investigates the link between volcanic unrest and the occurrence of moderate-to-large earthquakes with a specific type of focal mechanism. Vertical compensated-linear-vector-dipole (vertical-CLVD) earthquakes have vertical pressure or tension axes and seismic radiation patterns that are inconsistent with the double-couple model of slip on a planar fault. Prior to this work, moderate-to-large vertical-CLVD earthquakes were known to be geographically associated with volcanic centers, and vertical-CLVD earthquakes were linked to a tsunami in the Izu-Bonin volcanic arc and a subglacial fissure eruption in Iceland. Vertical-CLVD earthquakes are some of the largest and most anomalous earthquakes to occur in volcanic systems, yet their physical mechanisms remain controversial largely due to the small number of observations. Five vertical-CLVD earthquakes with vertical pressure axes are identified near Nyiragongo volcano in the Democratic Republic of the Congo. Three earthquakes occur within days of a fissure eruption at Nyiragongo, and two occur several years later in association with the refilling of the lava lake in the summit crater of the volcano. Detailed study of these events shows that the earthquakes have slower source processes than tectonic earthquakes with similar magnitudes and locations. All five earthquakes are interpreted as resulting from slip on inward-dipping ring-fault structures located above deflating shallow magma chambers. The Nyiragongo study supports the interpretation that vertical-CLVD earthquakes may be causally related to dynamic physical processes occurring inside the edifices or magmatic plumbing systems of active volcanoes. Two seismicity catalogs from the Global Centroid Moment Tensor (CMT) Project are used to search for further examples of shallow earthquakes with robust vertical-CLVD focal mechanisms. CMT solutions for approximately 400 target earthquakes are calculated and 86 vertical-CLVD earthquakes are identified near active volcanoes. Together with the Nyiragongo study, this work increases the number of well-studied vertical-CLVD earthquakes from 14 to 101. Vertical-CLVD earthquakes have focal depths in the upper ˜10 km of the Earth's crust, and ˜80% have centroid locations within 30 km of an active volcanic center. Vertical-CLVD earthquakes are observed near several different types of volcanoes in a variety of geographic and tectonic settings, but most vertical-CLVD earthquakes are observed near basaltic-to-andesitic stratovolcanoes and submarine volcanoes in subduction zones. Vertical-CLVD earthquakes are linked to tsunamis, volcanic earthquake swarms, effusive and explosive eruptions, and caldera collapse, and approximately 70% are associated with documented volcanic eruptions or episodes of volcanic unrest. Those events with vertical pressure axes typically occur after volcanic eruptions initiate, whereas events with vertical tension axes commonly occur before the start of volcanic unrest. Both types of vertical-CLVD earthquakes have longer source durations than tectonic earthquakes of the same magnitude. The isotropic and pure vertical-CLVD components of the moment tensor cannot be independently resolved using our long-period seismic dataset. As a result, several physical mechanisms can explain the retrieved deviatoric vertical-CLVD moment tensors, including dip-slip motion on ring faults, volume exchange between two reservoirs, the opening and closing of tensile cracks, and volumetric sources. 
An evaluation of these mechanisms is performed using constraints obtained from detailed studies of individual vertical-CLVD earthquakes. Although no single physical mechanism can explain all of the characteristics of vertical-CLVD earthquakes, a ring-faulting model consisting of slip on inward- or outward-dipping ring faults triggered by the inflation or deflation of a shallow magma chamber can account for their seismic radiation patterns and source durations, as well as their temporal relationships with volcanic unrest. The observation that most vertical-CLVD earthquakes are associated with volcanoes with caldera structures supports this interpretation.
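
    One way to see the "vertical-CLVD" character discussed in this abstract is to split a moment tensor into its isotropic and deviatoric parts and compute the usual non-double-couple parameter epsilon from the deviatoric eigenvalues. The sketch below uses one common convention (sign conventions vary in the literature) and an arbitrary example tensor; it is an illustration, not the dissertation's own processing.

        # Sketch: split a symmetric moment tensor into isotropic + deviatoric parts
        # and compute a non-double-couple parameter epsilon from the deviatoric
        # eigenvalues (epsilon = 0 for a pure double couple, |epsilon| = 0.5 for a
        # pure CLVD). Convention and example tensor are illustrative only.
        import numpy as np

        def clvd_parameter(m: np.ndarray) -> float:
            m = 0.5 * (m + m.T)                        # enforce symmetry
            iso = np.trace(m) / 3.0
            dev = m - iso * np.eye(3)                  # deviatoric part
            lam = np.sort(np.linalg.eigvalsh(dev))     # ascending deviatoric eigenvalues
            return lam[1] / max(abs(lam[0]), abs(lam[2]))   # one common epsilon convention

        # Example: an (arbitrary) tensor whose deviatoric part is a pure CLVD
        # with a vertical pressure axis.
        m_example = np.array([[ 1.0, 0.0, 0.0],
                              [ 0.0, 1.0, 0.0],
                              [ 0.0, 0.0,-2.0]])
        print(f"epsilon = {clvd_parameter(m_example):+.2f}")   # prints +0.50 here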

  20. Upgrading the Digital Electronics of the PEP-II Bunch Current Monitors at the Stanford Linear Accelerator Center

    SciTech Connect

    Kline, Josh; /SLAC

    2006-08-28

    The testing of the upgrade prototype for the bunch current monitors (BCMs) in the PEP-II storage rings at the Stanford Linear Accelerator Center (SLAC) is the topic of this paper. Bunch current monitors are used to measure the charge in the electron/positron bunches traveling in particle storage rings. The BCMs in the PEP-II storage rings need to be upgraded because components of the current system have failed and are known to be failure prone with age, and several of the integrated chips are no longer produced making repairs difficult if not impossible. The main upgrade is replacing twelve old (1995) field programmable gate arrays (FPGAs) with a single Virtex II FPGA. The prototype was tested using computer synthesis tools, a commercial signal generator, and a fast pulse generator.

  1. Earthquake history of New Mexico

    USGS Publications Warehouse

    von Hake, C. A.

    1975-01-01

    Most of New Mexico's historical seismicity has been concentrated in the Rio Grande Valley between Socorro and Albuquerque. About half of the earthquakes of intensity V or greater (Modified Mercalli intensity) that occurred in the State between 1868 and 1973 were centered in this region.

  2. User-centered development and testing of a monitoring system that provides feedback regarding physical functioning to elderly people

    PubMed Central

    Vermeulen, Joan; Neyens, Jacques CL; Spreeuwenberg, Marieke D; van Rossum, Erik; Sipers, Walther; Habets, Herbert; Hewson, David J; de Witte, Luc P

    2013-01-01

    Purpose To involve elderly people during the development of a mobile interface of a monitoring system that provides feedback to them regarding changes in physical functioning and to test the system in a pilot study. Methods and participants The iterative user-centered development process consisted of the following phases: (1) selection of user representatives; (2) analysis of users and their context; (3) identification of user requirements; (4) development of the interface; and (5) evaluation of the interface in the lab. Subsequently, the monitoring and feedback system was tested in a pilot study by five patients who were recruited via a geriatric outpatient clinic. Participants used a bathroom scale to monitor weight and balance, and a mobile phone to monitor physical activity on a daily basis for six weeks. Personalized feedback was provided via the interface of the mobile phone. Usability was evaluated on a scale from 1 to 7 using a modified version of the Post-Study System Usability Questionnaire (PSSUQ); higher scores indicated better usability. Interviews were conducted to gain insight into the experiences of the participants with the system. Results The developed interface uses colors, emoticons, and written and/or spoken text messages to provide daily feedback regarding (changes in) weight, balance, and physical activity. The participants rated the usability of the monitoring and feedback system with a mean score of 5.2 (standard deviation 0.90) on the modified PSSUQ. The interviews revealed that most participants liked using the system and appreciated that it signaled changes in their physical functioning. However, usability was negatively influenced by a few technical errors. Conclusion Involvement of elderly users during the development process resulted in an interface with good usability. However, the technical functioning of the monitoring system needs to be optimized before it can be used to support elderly people in their self-management. PMID:24039407

  3. Great Sumatra Earthquake registers on electrostatic sensor

    NASA Astrophysics Data System (ADS)

    Röder, Helmut; Schuhmann, Wolfram; Büttner, Ralf; Zimanowski, Bernard; Braun, Thomas; Boschi, Enzo

    Strong electrical signals that correspond to the Mw = 9.3 earthquake of 26 December 2004, which occurred at 0058:50.7 UTC off the west coast of northern Sumatra, Indonesia, were recorded by an electrostatic sensor (a device that detects short-term variations in Earth's electrostatic field) at a seismic station in Italy, which had been installed to study the influence of local earthquakes on a new landslide monitoring system. Electrical signals arrived at the station practically instantaneously and were detected up to several hours before the onset of the Sumatra earthquake (Figure 1) as well as before local quakes. The corresponding seismic signals (p-waves) arrived 740 seconds after the start of the earthquake. Because the electrical signals travel at the speed of light, electrical monitoring for the global detection of very strong earthquakes could be an important tool in significantly increasing the hazard alert window.
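
    The roughly 740 s gap between the effectively instantaneous electrical signal and the first P-wave arrival is consistent with the teleseismic distance from Sumatra to Italy. The sketch below estimates that travel time with ObsPy's TauP interface; the coordinates and the 30 km source depth are approximate assumptions for illustration.

        # Sketch: estimate the P-wave travel time from the 2004 Sumatra epicentre to a
        # station in central Italy, to compare with the ~740 s delay quoted above.
        # Coordinates and source depth are approximate assumptions.
        from obspy.geodetics import locations2degrees
        from obspy.taup import TauPyModel

        epi_lat, epi_lon = 3.3, 95.8          # approx. 2004 Sumatra epicentre
        sta_lat, sta_lon = 43.0, 12.0         # approx. central Italy

        dist_deg = locations2degrees(epi_lat, epi_lon, sta_lat, sta_lon)
        model = TauPyModel(model="iasp91")
        arrivals = model.get_travel_times(source_depth_in_km=30.0,
                                          distance_in_degree=dist_deg,
                                          phase_list=["P"])
        print(f"distance: {dist_deg:.1f} deg, first P after ~{arrivals[0].time:.0f} s")
        # An electromagnetic signal covers the same path in a small fraction of a
        # second, which is the basis of the alert-window argument in the abstract.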

  4. Earthquakes in Mississippi and vicinity 1811-2010

    USGS Publications Warehouse

    Dart, Richard L.; Bograd, Michael B.E.

    2011-01-01

    This map summarizes two centuries of earthquake activity in Mississippi. Work on the Mississippi map was done in collaboration with the Mississippi Department of Environmental Quality, Office of Geology. The earthquake data plotted on the map are from several sources: the Mississippi Department of Environmental Quality, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Arkansas Geological Survey. In addition to earthquake locations, other materials include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Mississippi and parts of adjacent States. Mississippi has undergone a number of felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931, and a magnitude 4.3 earthquake in 1967. The map shows all historical and instrumentally located earthquakes in Mississippi and vicinity between 1811 and 2010. The largest historic earthquake in the vicinity of the State was an intensity XI event, on December 16, 1811; the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region.

  5. MONITORING TOXIC ORGANIC GASES AND PARTICLES NEAR THE WORLD TRADE CENTER AFTER SEPTEMBER 11, 2001

    EPA Science Inventory

    The September 11, 2001 attack on the World Trade Center (WTC) resulted in an intense fire and the subsequent, complete collapse of the two main structures and adjacent buildings, as well as significant damage to many surrounding buildings within and around the WTC complex. Thi...

  6. Hatfield Marine Science Center Dynamic Revetment Project DSL Permit # 45455-FP. Monitoring Report. February, 2016

    EPA Science Inventory

    Stabilization of the Yaquina Bay shoreline along the northeastern edge of the Hatfield Marine Science Center (HMSC) campus became necessary to halt erosion that threatened both HMSC critical infrastructure (seawater storage tank) and public access to the HMSC Nature Trail. A Dyn...

  7. Hatfield Marine Science Center Dynamic Revetment Project DSL Permit # 45455-FP. Monitoring Report. February, 2014.

    EPA Science Inventory

    Stabilization of the Yaquina Bay shoreline along the northeastern edge of the Hatfield Marine Science Center (HMSC) campus became necessary to halt erosion that threatened both HMSC critical infrastructure (seawater storage tank) and public access to the HMSC Nature Trail. A Dyn...

  8. Hatfield Marine Science Center Dynamic Revetment Project DSL permit # 45455-FP, Monitoring Report February, 2015

    EPA Science Inventory

    Stabilization of the Yaquina Bay shoreline along the northeastern edge of the Hatfield Marine Science Center (HMSC) campus became necessary to halt erosion that threatened both HMSC critical infrastructure (seawater storage tank) and public access to the HMSC Nature Trail. A Dyn...

  9. CTEPP DATA COLLECTION FORM 05: CHILD DAY CARE CENTER PRE-MONITORING QUESTIONNAIRE

    EPA Science Inventory

    This data collection form is used to identify the potential sources of pollutants at the day care center. The day care teacher is asked questions related to the age of their day care building; age and frequency of cleaning carpets or rugs; types of heating and air conditioning de...

  10. America's faulty earthquake plans

    SciTech Connect

    Rosen, J

    1989-10-01

    In this article, the author discusses the likelihood of major earthquakes in both the western and eastern United States as well as the level of preparedness of each region of the U.S. for a major earthquake. Current technology in both earthquake-resistant design and earthquake detection is described. Governmental programs for earthquake hazard reduction are outlined and critiqued.

  11. EVALUATION OF ENVIROSCAN CAPACITANCE PROBES FOR MONITORING SOIL MOISTURE IN CENTER PIVOT IRRIGATED POTATOES

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Careful irrigation scheduling is the key to providing adequate water to minimize potential leaching losses below the rootzone, while supplying adequate water to minimize negative effects of water stress. Capacitance probes were used for real-time continuous monitoring of soil moisture content at va...

  12. CTEPP-OH DATA COLLECTED ON FORM 07: CHILD DAY CARE CENTER POST-MONITORING

    EPA Science Inventory

    This data set contains data concerning the child’s daily activities and potential exposures to pollutants at their homes for CTEPP-OH. It included questions on chemicals applied and cigarettes smoked at the home over the 48-h monitoring period. It also collected information on th...

  13. CTEPP DATA COLLECTION FORM 07: CHILD DAY CARE CENTER POST-MONITORING

    EPA Science Inventory

    This data collection form is used to provide information on the child's daily activities and potential exposures to pollutants at their homes. It includes questions on chemicals applied and cigarettes smoked at the home over the 48-hr monitoring period. It also collects informati...

  14. Monitoring of the permeable pavement demonstration site at Edison Environmental Center

    EPA Science Inventory

    The EPA's Urban Watershed Management Branch has installed an instrumented, working full-scale 110-space pervious pavement parking lot and has been monitoring several environmental stressors and runoff. This parking lot demonstration site has allowed the investigation of differenc...

  15. CTEPP NC DATA COLLECTED ON FORM 07: CHILD DAY CARE CENTER POST-MONITORING

    EPA Science Inventory

    This data set contains data concerning the child’s daily activities and potential exposures to pollutants at their homes. It included questions on chemicals applied and cigarettes smoked at the home over the 48-h monitoring period. It also collected information on the child’s han...

  16. CTEPP-OH DATA COLLECTED ON FORM 07: CHILD DAY CARE CENTER POST-MONITORING

    EPA Science Inventory

    This data set contains data concerning the child's daily activities and potential exposures to pollutants at their homes for CTEPP-OH. It included questions on chemicals applied and cigarettes smoked at the home over the 48-h monitoring period. It also collected information on th...

  17. CTEPP NC DATA COLLECTED ON FORM 07: CHILD DAY CARE CENTER POST-MONITORING

    EPA Science Inventory

    This data set contains data concerning the child's daily activities and potential exposures to pollutants at their homes. It included questions on chemicals applied and cigarettes smoked at the home over the 48-h monitoring period. It also collected information on the child's han...

  18. Monitoring of the permeable pavement demonstration site at Edison Environmental Center

    EPA Science Inventory

    The EPA’s Urban Watershed Management Branch has installed an instrumented, working full-scale 110-space pervious pavement parking lot and has been monitoring several environmental stressors and runoff. This parking lot demonstration site has allowed the investigation of differenc...

  19. Transaxillary gasless robotic thyroid surgery with nerve monitoring: initial two experience in a North American center.

    PubMed

    Kandil, Emad; Winters, Ryan; Aslam, Rizwan; Friedlander, Paul; Bellows, Charles

    2012-03-01

    Minimally invasive thyroid surgery using various techniques is well described. The present study reviews our initial experience with the technique with added intraoperative monitoring to assess its safety and feasibility. The study group consisted of ten consecutive patients with suspicious thyroid nodules who were candidates for thyroid lobectomy from September to December 2009. All patients underwent intraoperative nerve integrity monitoring and postoperative direct laryngoscopy. The patients' demographic information, operative times, learning curve, complications, and postoperative hospital stay were evaluated. All procedures were successfully completed with intraoperative nerve monitoring. No cases were converted to an open procedure. The median age was 38.5 years (σ = 13.5) and nine of the ten patients were females. The mean operating time was 131 minutes (range 101-203 minutes) and the mean operating time with the da Vinci system was 55 minutes. All patients were discharged home after an overnight stay. One patient developed transient radial nerve neuropathy that resolved spontaneously. There were no other postoperative complications. None of the patients complained of postoperative neck pain. Postoperative laryngoscopy showed intact and mobile vocal cords in all patients. Robotic endoscopic thyroid surgery with gasless transaxillary approach is feasible and safe in the treatment of suspicious thyroid nodules. Monitoring of the RLN during this approach is feasible. PMID:21395464

  20. Research on Earthquake Precursor in E-TEC: A Study on Land Surface Thermal Anomalies Using MODIS LST Product in Taiwan

    NASA Astrophysics Data System (ADS)

    Chang, W. Y.; Wu, M. C.

    2014-12-01

    Taiwan has been known as an excellent natural laboratory characterized by rapid active tectonic rates and high-density seismicity. The Eastern Taiwan Earthquake Research Center (E-TEC) was established on 2013/09/24 at National Dong Hwa University and collaborates with the Central Weather Bureau (CWB), the National Center for Research on Earthquake Engineering (NCREE), the National Science and Technology Center for Disaster Reduction (NCDR), the Institute of Earth Science of Academia Sinica (IES, AS) and other institutions (NCU, NTU, CCU). It aims to provide an integrated platform for researchers to pursue new advances in earthquake precursors and early warning for seismic disaster prevention in eastern Taiwan, as frequent temblors are most common in the East Taiwan rift valley. E-TEC intends to integrate multi-disciplinary observations and is equipped with stations to monitor a wide array of potential quake precursors, including seismicity, GPS, strain-meter, groundwater, geochemistry, gravity, electromagnetics, ionospheric density, thermal infrared remote sensing, gamma radiation, etc., and will maximize the value of these data for research, with the range of monitoring equipment enabling prediction of where and when the next devastating earthquake will strike Taiwan and the development of reliable earthquake prediction models. A preliminary study on an earthquake precursor using monthly Moderate Resolution Imaging Spectroradiometer (MODIS) Land Surface Temperature (LST) data before the 2013/03/27 Mw6.2 Nantou earthquake in Taiwan is presented. The statistical analysis shows that the peak of the anomalous LST, exceeding one standard deviation of the LST, appeared on 2013/03/09, with fewer or no anomalies observed on 2013/03/16 before the main shock, which is consistent with the phenomenon observed by other researchers. This preliminary experimental result shows that the thermal anomalies reveal the possibility of associating surface thermal phenomena with the occurrence of strong earthquakes.
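
    The "exceeds a standard deviation" test used in the preliminary LST study above can be prototyped as a simple standardized-anomaly check; the reference climatology, the one-standard-deviation threshold and the synthetic data below are assumptions for illustration, not details of the study.

        # Minimal sketch of a land-surface-temperature (LST) anomaly test: compare
        # each observation against a reference mean and flag values whose departure
        # exceeds one standard deviation. Reference period, threshold and data are
        # illustrative assumptions only.
        import numpy as np

        def lst_anomalies(lst: np.ndarray, reference: np.ndarray, n_sigma: float = 1.0) -> np.ndarray:
            mu, sigma = reference.mean(), reference.std(ddof=1)
            z = (lst - mu) / sigma
            return z > n_sigma                 # True where LST is anomalously warm

        rng = np.random.default_rng(2)
        reference = rng.normal(22.0, 1.5, size=365)    # background LST (deg C), synthetic
        recent = np.array([22.4, 23.1, 25.9, 22.8])    # hypothetical pre-event values
        print(lst_anomalies(recent, reference))        # flags the 25.9 deg C value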

  1. Stalking the next Parkfield earthquake

    SciTech Connect

    Kerr, R.A.

    1984-01-06

    The 30-kilometer section of the San Andreas fault midway between San Francisco and Los Angeles is the most well understood and most intensely monitored fault in the world. The geology of the area, its rock mechanics, the study of its past earthquakes, and prediction efforts for the next quake are described.

  2. Advances in Earthquake Prediction Research and the June 2000 Earthquakes in Iceland

    NASA Astrophysics Data System (ADS)

    Stefansson, R.

    2006-12-01

    In June 2000, two earthquakes with magnitude 6.6 (Ms) occurred in the central part of the South Iceland seismic zone (SISZ). According to historical information, earthquakes in this region have in some cases caused the collapse of the majority of houses over areas of 1,000 square kilometers in this relatively densely populated farming region. Because large earthquakes were expected to occur soon, much attention was given to preparedness in the region, and for the last two decades it has been the subject of multi-national, mainly European, co-operation in earthquake prediction research and in the development of a high-level micro-earthquake system: the SIL system. Despite intensive surface fissuring caused by the earthquakes and measured accelerations reaching 0.8 g, the earthquakes in 2000 caused no serious injuries and no structural collapse. The relatively minor destruction led to more optimism regarding the safety of living in the area. It also led to some optimism about the significance of earthquake prediction research. Both earthquakes were preceded by a long-term prediction, and the second of the two had a short-term warning about place, size and immediacy. In this presentation, I will describe the warnings that were given ahead of the earthquakes. I will also reconsider these warnings in light of new results from multi-national earthquake prediction research in Iceland. This modeling work explains several observable patterns caused by crustal processes ahead of large earthquakes. Micro-seismic observations and modeling show that, in conditions prevailing in the Icelandic crust, fluids can be carried upward from the brittle-ductile boundary in response to strain, bringing high, near-lithostatic pore pressures into the brittle crust and preparing a region for the release of a large earthquake; monitoring this process will enable long- and short-term earthquake warnings.

  3. Establishment of Antakya Basin Strong Ground Motion Monitoring System

    NASA Astrophysics Data System (ADS)

    Durukal, E.; Özel, O.; Bikce, M.; Geneş, M. C.; Kacın, S.; Erdik, M.; Safak, E.; Över, S.

    2009-04-01

    Turkey is located in one of the most active earthquake zones of the world. The cities located along the North Anatolian Fault (NAF) and the East Anatolian Fault (EAF) are exposed to significant earthquake hazard. The Hatay province near the southern terminus of the EAF has always experienced significant seismic activity, since it lies at the intersection of the northernmost segment of the Dead Sea Fault Zone, coming from the south, with the Cyprean Arc approaching from the south-west. Historical records extending over the last 2000 years indicate that Antakya, founded in the 3rd century B.C., is affected by intensity IX-X earthquakes every 150 years. The last destructive earthquake in the region occurred in 1872; destructive earthquakes similar to those of the past should be expected in the near future. The strong response of sedimentary basins to seismic waves was largely responsible for the damage produced by the devastating 1985 Michoacan earthquake, which severely damaged parts of Mexico City, and the 1988 Spitak earthquake, which destroyed most of Leninakan, Armenia. Much of this devastating response was explained by the conversion of seismic body waves to surface waves at the sediment/rock contacts of sedimentary basins. The "Antakya Basin Strong Ground Motion Monitoring System" was set up with the aim of monitoring the earthquake response of the Antakya Basin, contributing to our understanding of basin response, contributing to earthquake risk assessment of Antakya, monitoring regional earthquakes, and determining the effects of local and regional earthquakes on the urban environment of Antakya. The soil properties beneath the strong motion stations (S-wave velocity structure and dominant soil frequency) are determined by array measurements involving broad-band seismometers. The strong motion monitoring system consists of six instruments installed in small buildings. The stations form a straight line along the short axis of the Antakya basin, passing through the city center. They are equipped with acceleration sensors, GPS and communication units and operate in continuous recording mode. For on-line data transmission, the EDGE mode of available GSM systems is employed. In the array measurements for the determination of soil properties beneath the stations, two 4-seismometer sets have been utilized. The system is the first monitoring installation in Turkey dedicated to understanding basin effects. The records obtained will allow for the visualization of the propagation of long-period ground motion in the basin and show the refraction of surface waves at the basin edge. The records will also serve to enhance our capacity to realistically synthesize strong ground motion in basin-type environments.

  4. Earthquake and tsunami forecasts: Relation of slow slip events to subsequent earthquake rupture

    PubMed Central

    Dixon, Timothy H.; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

    2014-01-01

    The 5 September 2012 Mw 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr–Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential. PMID:25404327
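
    The triggering argument above rests on the size of the Coulomb failure stress change imparted by the precursory slow slip. For orientation only, that quantity is commonly written as ΔCFS = Δτ + μ′Δσn; the friction coefficient and stress values in the sketch below are hypothetical, and this is not the authors' calculation.

        def coulomb_stress_change(d_tau, d_sigma_n, mu_eff=0.4):
            """Coulomb failure stress change on a receiver fault.

            d_tau     : shear stress change in the slip direction (MPa)
            d_sigma_n : normal stress change, positive = unclamping (MPa)
            mu_eff    : effective friction coefficient (hypothetical value)
            Positive values move the receiver fault toward failure.
            """
            return d_tau + mu_eff * d_sigma_n

        # Hypothetical numbers of a few hundredths of a MPa, i.e. below the
        # ~0.1 MPa level often cited as a significant triggering stress.
        print(coulomb_stress_change(0.02, 0.01))   # 0.024 MPa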

  5. Earthquake Information System

    NASA Technical Reports Server (NTRS)

    1991-01-01

    IAEMIS (Integrated Automated Emergency Management Information System) is the principal tool of an earthquake preparedness program developed by Martin Marietta and the Mid-America Remote Sensing Center (MARC). It is a two-component set of software, data and procedures to provide information enabling management personnel to make informed decisions in disaster situations. The NASA-developed program ELAS, originally used to analyze Landsat data, provides MARC with a spatially-oriented information management system. Additional MARC projects include land resources management, and development of socioeconomic data.

  6. Earthquakes and plate tectonics.

    USGS Publications Warehouse

    Spall, H.

    1982-01-01

    Earthquakes occur at three kinds of plate boundary: ocean ridges where the plates are pulled apart, margins where the plates scrape past one another, and margins where one plate is thrust under the other. Thus, we can predict the general regions on the earth's surface where we can expect large earthquakes in the future. We know that each year about 140 earthquakes of magnitude 6 or greater will occur within this area, which is 10% of the earth's surface. But on a worldwide basis we cannot say with much accuracy when these events will occur. The reason is that the processes in plate tectonics have been going on for millions of years. Averaged over this interval, plate motions amount to several mm per year. But at any instant in geologic time, for example the year 1982, we do not know exactly where we are in the worldwide cycle of strain build-up and strain release. Only by monitoring the stress and strain in small areas, for instance the San Andreas fault, in great detail can we hope to predict when renewed activity in that part of the plate tectonics arena is likely to take place. -from Author

  7. Defeating Earthquakes

    NASA Astrophysics Data System (ADS)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the 2010 M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question, how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake. GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by our actions. Using these global datasets will help to make the model as uniform as possible. The model must be built by scientists in the affected countries with GEM's support, augmented by their insights and data. The model will launch in 2014; to succeed it must be open, international, independent, and continuously tested. But the mission of GEM is not just the likelihood of ground shaking, but also gauging the economic and social consequences of earthquakes, which greatly amplify the losses. For example, should the municipality of Istanbul retrofit schools, or increase its insurance reserves and recovery capacity? Should a homeowner in a high-risk area move or strengthen her building? This is why GEM is a public-private partnership. GEM's fourteen public sponsors and eight non-governmental organization members are standing for the developing world. To extend GEM into the financial world, we draw upon the expertise of companies. GEM's ten private sponsors have endorsed the acquisition of public knowledge over private gain. In a competitive world, this is a courageous act. GEM is but one link in a chain of preparedness: from earth science and engineering research, through groups like GEM, to mitigation, retrofit or relocate decisions, building codes and insurance, and finally to prepared hospitals, schools, and homes. But it is a link that our community can make strong.

  8. Cooperative Monitoring Center Occasional Paper/9: De-Alerting Strategic Ballistic Missiles

    SciTech Connect

    Connell, Leonard W.; Edenburn, Michael W.; Fraley, Stanley K.; Trost, Lawrence C.

    1999-03-01

    This paper presents a framework for evaluating the technical merits of strategic ballistic missile de-alerting measures, and it uses the framework to evaluate a variety of possible measures for silo-based, land-mobile, and submarine-based missiles. De-alerting measures are defined for the purpose of this paper as reversible actions taken to increase the time or effort required to launch a strategic ballistic missile. The paper does not assess the desirability of pursuing a de-alerting program. Such an assessment is highly context dependent. The paper postulates that if de-alerting is desirable and is used as an arms control mechanism, de-alerting measures should satisfy specific criteria relating to force security, practicality, effectiveness, significant delay, and verifiability. Silo-launched missiles lend themselves most readily to de-alerting verification, because communications necessary for monitoring do not increase the vulnerability of the weapons by a significant amount. Land-mobile missile de-alerting measures would be more challenging to verify, because monitoring measures that disclose the launcher's location would potentially increase their vulnerability. Submarine-launched missile de-alerting measures would be extremely challenging if not impossible to monitor without increasing the submarine's vulnerability.

  9. Space weather monitoring by ground-based means carried out in Polar Geophysical Center at Arctic and Antarctic Research Institute

    NASA Astrophysics Data System (ADS)

    Janzhura, Alexander

    Real-time information on geophysical processes in the polar regions is very important for Space Weather monitoring by ground-based means. Modern communication systems and computer technology make it possible to collect and process data from remote sites without significant delays. New acquisition equipment based on microprocessor modules and reliable in harsh climatic conditions has been deployed at the Roshydromet networks of geophysical observations in the Arctic and is being deployed at observatories in the Antarctic. A contemporary system for on-line collection and transmission of geophysical data from the Arctic and Antarctic stations to AARI has been realized, and the Polar Geophysical Center (PGC) arranged at AARI ensures near-real-time processing and analysis of geophysical information from 11 stations in the Arctic and 5 stations in the Antarctic. Space weather monitoring by ground-based means is one of the main tasks of the Polar Geophysical Center. As studies by Troshichev and Janzhura [2012] showed, the PC index characterizing polar cap magnetic activity is an adequate indicator of the solar wind energy that enters the magnetosphere and of the energy accumulating in the magnetosphere. A great advantage of the PC index over other methods based on satellite data is the permanent on-line availability of information about magnetic activity in both the northern and southern polar caps. A special procedure agreed between the Arctic and Antarctic Research Institute (AARI) and the Space Institute of the Danish Technical University (DTUSpace) ensures calculation of the unified PC index in quasi-real time from magnetic data of the Thule and Vostok stations (see the public site: http://pc-index.org). A method for estimating the AL and Dst indices (as indicators of the state of the disturbed magnetosphere) from foregoing PC indices has been elaborated and tested in the Polar Geophysical Center. It is demonstrated that the PC index can be successfully used to monitor the state of the magnetosphere (space weather monitoring) and the readiness of the magnetosphere to produce a substorm or storm (space weather nowcasting).

  10. Time-Clustering Behavior of Spreading-Center Seismicity Between 15-35 N on the Mid-Atlantic Ridge: Observations from Hydroacoustic Monitoring

    NASA Astrophysics Data System (ADS)

    Bohnenstiehl, D. R.; Tolstoy, M.; Smith, D. K.; Fox, C. G.; Dziak, R. P.

    2002-12-01

    An earthquake catalog derived from the detection of seismically generated Tertiary (T) waves is used to study the time-clustering behavior of moderate-size (> 3.0 M) earthquakes along the north-central Mid-Atlantic Ridge. Because T-waves propagate efficiently within the ocean's sound channel, these data represent a significant improvement relative to the detection capabilities of land-based seismic stations. In addition, hydroacoustic monitoring overcomes many of the spatial and temporal limitations associated with ocean-bottom seismometer data, with the existing array deployed continuously between 15-35 degrees N during the period February 1999-February 2001. Within this region, the distribution of inter-event times is consistent with a non-random clustered process, with a coefficient of variation greater than 1.0. The clustered behavior is power-law in nature, with temporal fluctuations characterized by a power spectral density that decays as 1/f^α. Using Allan Factor analysis, α is found to range from 0.12-0.55 for different regions of the spreading axis. This scaling is negligible at time scales less than 3.5 × 10^3 s, and earthquake occurrence becomes less clustered (smaller α) as increasing size thresholds are applied to the catalog. The highest degrees of clustering are associated temporally with large mainshock-aftershock sequences; however, some swarm-like activity is also evident. The distribution of acoustic magnitudes, or source levels, is consistent with a power-law size-frequency scaling for earthquakes. Although such behavior has been linked closely to the fractal nature of the underlying fault population in other environments, power-law fault size distributions have not been widely observed in the mid-ocean ridge setting.
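
    For readers unfamiliar with the clustering statistic quoted above, the coefficient of variation of inter-event times can be computed as follows; the catalog in this sketch is hypothetical, and the Allan Factor analysis itself is not reproduced here.

        import numpy as np

        def coefficient_of_variation(event_times):
            """Coefficient of variation (CV) of inter-event times.

            event_times : 1-D array of event origin times in seconds.
            CV = 1 for a Poisson (random) process, CV > 1 indicates
            temporal clustering, CV < 1 quasi-periodic behavior.
            """
            dt = np.diff(np.sort(np.asarray(event_times, dtype=float)))
            return float(np.std(dt) / np.mean(dt))

        # Hypothetical catalog: a steady background rate plus one burst
        times = np.concatenate([np.arange(0, 1e6, 5e4),
                                1.2e6 + np.cumsum(np.full(10, 600.0))])
        print(coefficient_of_variation(times))   # > 1, i.e. clustered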

  11. Computer systems for automatic earthquake detection

    USGS Publications Warehouse

    Stewart, S.W.

    1974-01-01

    U.S. Geological Survey seismologists in Menlo Park, California, are utilizing the speed, reliability, and efficiency of minicomputers to monitor seismograph stations and to automatically detect earthquakes. An earthquake detection computer system, believed to be the only one of its kind in operation, automatically reports about 90 percent of all local earthquakes recorded by a network of over 100 central California seismograph stations. The system also monitors the stations for signs of malfunction or abnormal operation. Before the automatic system was put in operation, all of the earthquakes recorded had to be detected by manually searching the records, a time-consuming process. With the automatic detection system, the stations are monitored continuously and efficiently.
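
    The abstract does not describe the detection algorithm itself. As an illustration of the kind of trigger logic such systems commonly use, here is a minimal short-term-average/long-term-average (STA/LTA) sketch for a single channel; the window lengths, threshold and synthetic trace are hypothetical, and this is not the USGS system's actual code.

        import numpy as np

        def sta_lta_trigger(signal, sta_len, lta_len, threshold):
            """Indices where the STA/LTA ratio of the squared signal exceeds
            `threshold`. Both averages use trailing windows, a classic
            single-channel trigger heuristic.
            """
            x = np.asarray(signal, dtype=float) ** 2
            csum = np.cumsum(np.insert(x, 0, 0.0))
            idx = np.arange(lta_len, len(x))    # samples with a full LTA window behind them
            sta = (csum[idx + 1] - csum[idx + 1 - sta_len]) / sta_len
            lta = (csum[idx + 1] - csum[idx + 1 - lta_len]) / lta_len
            return idx[sta / np.maximum(lta, 1e-12) > threshold]

        # Hypothetical trace: background noise with a stronger burst at sample 1200
        rng = np.random.default_rng(0)
        trace = rng.normal(0.0, 1.0, 2000)
        trace[1200:1300] += rng.normal(0.0, 8.0, 100)
        hits = sta_lta_trigger(trace, sta_len=20, lta_len=400, threshold=4.0)
        print(hits[0] if hits.size else "no trigger")   # triggers shortly after sample 1200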

  12. Long Period Earthquakes Beneath California's Young and Restless Volcanoes

    NASA Astrophysics Data System (ADS)

    Pitt, A. M.; Dawson, P. B.; Shelly, D. R.; Hill, D. P.; Mangan, M.

    2013-12-01

    The newly established USGS California Volcano Observatory has the broad responsibility of monitoring and assessing hazards at California's potentially threatening volcanoes, most notably Mount Shasta, Medicine Lake, Clear Lake Volcanic Field, and Lassen Volcanic Center in northern California; and Long Valley Caldera, Mammoth Mountain, and Mono-Inyo Craters in east-central California. Volcanic eruptions occur in California about as frequently as the largest San Andreas Fault Zone earthquakes-more than ten eruptions have occurred in the last 1,000 years, most recently at Lassen Peak (1666 C.E. and 1914-1917 C.E.) and Mono-Inyo Craters (c. 1700 C.E.). The Long Valley region (Long Valley caldera and Mammoth Mountain) underwent several episodes of heightened unrest over the last three decades, including intense swarms of volcano-tectonic (VT) earthquakes, rapid caldera uplift, and hazardous CO2 emissions. Both Medicine Lake and Lassen are subsiding at appreciable rates, and along with Clear Lake, Long Valley Caldera, and Mammoth Mountain, sporadically experience long period (LP) earthquakes related to migration of magmatic or hydrothermal fluids. Worldwide, the last two decades have shown the importance of tracking LP earthquakes beneath young volcanic systems, as they often provide indication of impending unrest or eruption. Herein we document the occurrence of LP earthquakes at several of California's young volcanoes, updating a previous study published in Pitt et al., 2002, SRL. All events were detected and located using data from stations within the Northern California Seismic Network (NCSN). Event detection was spatially and temporally uneven across the NCSN in the 1980s and 1990s, but additional stations, adoption of the Earthworm processing system, and heightened vigilance by seismologists have improved the catalog over the last decade. LP earthquakes are now relatively well-recorded under Lassen (~150 events since 2000), Clear Lake (~60 events), Mammoth Mountain (~320 events), and Long Valley Caldera (~40 events). LP earthquakes are notably absent under Mount Shasta. With the exception of Long Valley Caldera where LP earthquakes occur at depths of ≤5 km, hypocenters are generally between 15-25 km. The rates of LP occurrence over the last decade have been relatively steady within the study areas, except at Mammoth Mountain, where years of gradually declining LP activity abruptly increased after a swarm of unusually deep (20 km) VT earthquakes in October 2012. Epicenter locations relative to the sites of most recent volcanism vary across volcanic centers, but most LP earthquakes fall within 10 km of young vents. Source models for LP earthquakes often involve the resonance of fluid-filled cracks or nonlinear flow of fluids along irregular cracks (reviewed in Chouet and Matoza, 2013, JVGR). At mid-crustal depths the relevant fluids are likely to be low-viscosity basaltic melt and/or exsolved CO2-rich volatiles (Lassen, Clear Lake, Mammoth Mountain). In the shallow crust, however, hydrothermal waters/gases are likely involved in the generation of LP seismicity (Long Valley Caldera).

  13. Cooperative Monitoring Center Occasional Paper/8: Cooperative Border Security for Jordan: Assessment and Options

    SciTech Connect

    Qojas, M.

    1999-03-01

    This document is an analysis of options for unilateral and cooperative action to improve the security of Jordan's borders. Sections describe the current political, economic, and social interactions along Jordan's borders. Next, the document discusses border security strategy for cooperation among neighboring countries and the adoption of confidence-building measures. A practical cooperative monitoring system would consist of hardware for early warning, command and control, communications, and transportation. Technical solutions can expand opportunities for the detection and identification of intruders. Sensors (such as seismic, break-wire, pressure-sensing, etc.) can warn border security forces of intrusion and contribute to the identification of the intrusion and help formulate the response. This document describes conceptual options for cooperation, offering three scenarios that relate to three hypothetical levels (low, medium, and high) of cooperation. Potential cooperative efforts under a low cooperation scenario could include information exchanges on military equipment and schedules to prevent misunderstandings and the establishment of protocols for handling emergency situations or unusual circumstances. Measures under a medium cooperation scenario could include establishing joint monitoring groups for better communications, with hot lines and scheduled meetings. The high cooperation scenario describes coordinated responses, joint border patrols, and sharing border intrusion information. Finally, the document lists recommendations for organizational, technical, and operational initiatives that could be applicable to the current situation.

  14. Monitoring UT1 from astro-geodetic techniques at the EOP Center of the IERS

    NASA Astrophysics Data System (ADS)

    Gambis, D.; Bizouard, C.

    2010-11-01

    Monitoring the Earth's rotation is essential in various domains linked to reference frames: firstly, for applications in orbit determination, space geodesy and astronomy; secondly, for geophysical studies involving mass motions within the different external fluid layers, atmosphere, hydrosphere, core and mantle of the Earth, on time scales ranging from a few hours to decades. The Earth Orientation Centre of the IERS continuously monitors Earth orientation variations from results derived from the various astro-geodetic techniques. It has in particular the task of deriving an optimal combined series of UT1, which is now based mainly on Very Long Baseline Interferometry (VLBI) with some contribution of LOD derived from GPS. We give here a brief summary of the contribution of the various techniques to UT1 and, in particular, of how the use of LOD derived from GPS can improve the combination. More details are available in Gambis (2004) and Bizouard and Gambis (2009) and at the website http://hpiers.obspm.fr/eop-pc/

  15. Tohoku earthquake shook the ionosphere

    NASA Astrophysics Data System (ADS)

    Balcerak, Ernie

    2011-08-01

    The giant 11 March 2011 magnitude 9 Tohoku earthquake not only shook the Earth and caused devastating tsunamis but also rattled the ionosphere, according to a new study. The surface seismic waves and tsunamis triggered waves in the atmosphere. These atmospheric waves propagated upward into the ionosphere, creating ripples in ionized gas nearly 350 kilometers above the Earth. Liu et al. measured these disturbances, called seismotraveling ionospheric disturbances (STID), using GPS receivers in Japan. The first disturbance appeared as a disk-shaped increase in electron density in the ionosphere about 7 minutes after the earthquake. Sequences of concentric waves of increased electron density then traveled from the STID center. Similar ionospheric disturbances have been observed following other earthquakes, but these were the largest ever seen, the authors report. (Journal of Geophysical Research-Space Physics, doi:10.1029/2011JA016761, 2011)

  16. Earthquake Prognosis With Applied Microseism.

    NASA Astrophysics Data System (ADS)

    Ahmedov, N.; Nagiyev, A.

    Earthquakes are the most dangerous natural catastrophe in terms of casualties, amount of damage, areal coverage and the difficulty of providing security measures. The inability to forecast these events makes the situation worse: their buried focuses are invisible in the subsurface, they occur as suddenly as thunder, and some tens of seconds later they leave devastated areas and casualties numbering tens of thousands of people. Currently, earthquake forecasting is essentially ineffective. Microseism application is one of the possible ways to forecast earthquakes. These small, up-going, low-amplitude, irregular oscillations observed on seismograms are referred to as microseisms. Unlike earthquakes themselves, they are continuous, that is, they have no origin coordinate on the time axis. Their occurrence is associated with breakers observed along shorelines, strong wind and hurricane patterns, and so on. J. J. Linch discovered a new tool to monitor hurricane motion trends over the seas using microseisms recorded at ad hoc stations. Similarly to these observations, it became possible to monitor the formation of earthquake focuses based on the correlation between the low-frequency horizontal channels' N-S and E-W components. Microseism field data and monitoring of preceding anomalous variations derived from "Cherepaha" 3M and 6/12 devices enable some systematic trends to be drawn out in the amplitude/frequency domain. This relationship, observed in a certain frequency range, made it possible to identify the generation of earthquake focuses with regard to the monitoring station. This variation trend was observed during the Turkish and Iranian events of 1990, 1992 and 1997. It is suggested that it would be useful to verify these effects in other regions in order to further correlate available data and work out common forecasting criteria.
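
    The correlation between horizontal components mentioned above is not specified in detail. As a purely illustrative stand-in, a zero-lag normalized correlation between two channels could be computed as follows; the records and noise model are hypothetical, and this is not the estimator used by the authors.

        import numpy as np

        def normalized_correlation(ns, ew):
            """Zero-lag normalized correlation between the N-S and E-W
            horizontal components of a record (illustrative only).
            """
            ns = np.asarray(ns, dtype=float) - np.mean(ns)
            ew = np.asarray(ew, dtype=float) - np.mean(ew)
            return float(np.dot(ns, ew) / (np.linalg.norm(ns) * np.linalg.norm(ew) + 1e-12))

        # Hypothetical records: partially coherent horizontal noise
        rng = np.random.default_rng(1)
        common = rng.normal(0.0, 1.0, 5000)
        ns = common + rng.normal(0.0, 0.5, 5000)
        ew = common + rng.normal(0.0, 0.5, 5000)
        print(normalized_correlation(ns, ew))   # close to 0.8 for this mix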

  17. A smartphone application for earthquakes that matter!

    NASA Astrophysics Data System (ADS)

    Bossu, Rémy; Etivant, Caroline; Roussel, Fréderic; Mazet-Roux, Gilles; Steed, Robert

    2014-05-01

    Smartphone applications have swiftly become one of the most popular tools for rapid reception of earthquake information by the public, some of them having been downloaded more than 1 million times! The advantages are obvious: wherever someone's own location is, they can be automatically informed when an earthquake has struck. Just by setting a magnitude threshold and an area of interest, there is no longer any need to browse the internet, as the information reaches you automatically and instantaneously! One question remains: are the provided earthquake notifications always relevant for the public? Which earthquakes really matter to laypeople? One clue may be derived from newspaper reports showing that, a while after damaging earthquakes, many eyewitnesses scrap the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of the notifications are related to unfelt earthquakes, thereby only increasing anxiety among the population at each new update. Felt and damaging earthquakes are the ones that matter the most to the public (and authorities). They are the ones of societal importance, even when of small magnitude. A smartphone application developed by EMSC (Euro-Med Seismological Centre) with the financial support of the Fondation MAIF aims at providing suitable notifications for earthquakes by collating different information threads covering tsunamigenic, potentially damaging and felt earthquakes. Tsunamigenic earthquakes are considered here to be those that are the subject of alert or information messages from the PTWC (Pacific Tsunami Warning Centre), while potentially damaging earthquakes are identified through an automated system called EQIA (Earthquake Qualitative Impact Assessment), developed and operated at EMSC, which rapidly assesses earthquake impact by comparing the population exposed to each expected level of shaking intensity with empirical models of fatality losses calibrated on past earthquakes in each country. Non-seismic detections and macroseismic questionnaires collected online are combined to identify as many felt earthquakes as possible, regardless of their magnitude. Non-seismic detections include Twitter earthquake detections, developed by the US Geological Survey, in which the number of tweets containing the keyword "earthquake" is monitored in real time, and flashsourcing, developed by the EMSC, which detects traffic surges on its rapid earthquake information website caused by the natural convergence of eyewitnesses who rush to the Internet to investigate the cause of the shaking that they have just felt. Altogether, we estimate the number of detected felt earthquakes to be around 1,000 per year, compared with the 35,000 earthquakes annually reported by the EMSC! Felt events are already the subject of the web page "Latest significant earthquakes" on the EMSC website (http://www.emsc-csem.org/Earthquake/significant_earthquakes.php) and of a dedicated Twitter service @LastQuake. We will present the identification process of the earthquakes that matter, the smartphone application itself (to be released in May) and its future evolutions.
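
    EMSC's flashsourcing implementation is not described here. As a crude illustrative stand-in, a per-minute hit counter compared against a recent baseline might look like the following sketch; the window length, threshold and hit counts are all hypothetical.

        from collections import deque

        def traffic_surge(counts, baseline_len=30, factor=5.0):
            """Flag minutes whose visit count exceeds `factor` times the
            median of the preceding `baseline_len` minutes (a simplified
            stand-in for flashsourcing-style surge detection).

            counts : iterable of per-minute website hit counts.
            Returns the indices of flagged minutes.
            """
            window, flags = deque(maxlen=baseline_len), []
            for i, c in enumerate(counts):
                if len(window) == baseline_len:
                    baseline = sorted(window)[baseline_len // 2]   # median of recent minutes
                    if c > factor * max(baseline, 1):
                        flags.append(i)
                window.append(c)
            return flags

        # Hypothetical hit series: steady background, then a burst of visits
        hits = [40] * 60 + [400, 900, 700] + [60] * 10
        print(traffic_surge(hits))   # flags the three burst minutes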

  18. Retrofitting Laboratory Fume Hoods With Face Velocity Monitors at NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Wagner, Ingrid E.; Bold, Margaret D.; Diamond, David B.; Kall, Phillip M.

    1997-01-01

    Extensive use of and reliance on laboratory fume hoods exist at LeRC for the control of chemical hazards (nearly 175 fume hoods). Flow-measuring devices are necessary to continually monitor hood performance. The flow-measuring device should be tied into an energy management control system to detect problems at a central location without relying on the users to convey information about a problem. Compatibility concerns and limitations should always be considered when choosing the most effective flow-measuring device for a particular situation. Good practice in initial hood design and placement will provide a system in which a flow-measuring device may be used to its full potential and effectiveness.

  19. Identified EM Earthquake Precursors

    NASA Astrophysics Data System (ADS)

    Jones, Kenneth, II; Saxton, Patrick

    2014-05-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and warn the public in turn. Presently, the animal kingdom leads the precursor list alluding to a transmission related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous, high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After a number of custom rock experiments, two hypotheses were formed which could answer the EM wave model. The first hypothesis concerned a sufficient and continuous electron movement either by surface or penetrative flow, and the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate in creating strong EM fields, because rock has a very high electrical resistance making it a high quality insulator. Penetrative flow could not be corroborated as well, because it was discovered that rock was absorbing and confining electrons to a very thin skin depth. Radio wave transmission and detection worked with every single test administered. This hypothesis was reviewed for propagating, long-wave generation with sufficient amplitude, and the capability of penetrating solid rock. Additionally, fracture spaces, either air or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals have been detected in the field occurring with associated phases using custom-built loop antennae. Field testing was conducted in Southern California from 2006-2011, and outside the NE Texas town of Timpson in February, 2013. The antennae have mobility and observations were noted for recurrence, duration, and frequency response. At the Southern California field sites, one loop antenna was positioned for omni-directional reception and also detected a strong First Schumann Resonance; however, additional Schumann Resonances were absent. At the Timpson, TX field sites, loop antennae were positioned for directional reception, due to earthquake-induced, hydraulic fracturing activity currently conducted by the oil and gas industry. Two strong signals, one moderately strong signal, and approximately 6-8 weaker signals were detected in the immediate vicinity. The three stronger signals were mapped by a biangulation technique, followed by a triangulation technique for confirmation. This was the first antenna mapping technique ever performed for determining possible earthquake epicenters. Six and a half months later, Timpson experienced two M4 (M4.1 and M4.3) earthquakes on September 2, 2013 followed by a M2.4 earthquake three days later, all occurring at a depth of five kilometers. The Timpson earthquake activity now has a cyclical rate and a forecast was given to the proper authorities. As a result, the Southern California and Timpson, TX field results led to an improved design and construction of a third prototype antenna. With a loop antenna array, a viable communication system, and continuous monitoring, a full fracture cycle can be established and observed in real-time. In addition, field data could be reviewed quickly for assessment and lead to a much more improved earthquake forecasting capability. 
The EM precursors determined by this method appear to surpass all prior precursor claims, and the general public will finally receive long overdue forecasting.
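
    The bi/triangulation of candidate epicenters from directional antennas amounts to intersecting bearings from known station positions. The sketch below is purely geometric and illustrative; the station coordinates and bearings are hypothetical, and this is not the authors' field procedure.

        import numpy as np

        def bearing_intersection(p1, b1, p2, b2):
            """Intersect two direction-finding bearings on a local flat map.

            p1, p2 : (x, y) station coordinates in km.
            b1, b2 : bearings in degrees clockwise from north (the +y axis).
            Returns the (x, y) intersection point, or None if the bearings
            are parallel.
            """
            d1 = np.array([np.sin(np.radians(b1)), np.cos(np.radians(b1))])
            d2 = np.array([np.sin(np.radians(b2)), np.cos(np.radians(b2))])
            A = np.column_stack([d1, -d2])
            if abs(np.linalg.det(A)) < 1e-9:
                return None
            t, _ = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
            return np.asarray(p1, float) + t * d1

        # Hypothetical: two antennas 10 km apart, both bearing toward (5, 8)
        print(np.round(bearing_intersection((0, 0), 32.0, (10, 0), -32.0), 2))   # ~[5. 8.]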

  20. The Electronic Encyclopedia of Earthquakes

    NASA Astrophysics Data System (ADS)

    Benthien, M.; Marquis, J.; Jordan, T.

    2003-12-01

    The Electronic Encyclopedia of Earthquakes is a collaborative project of the Southern California Earthquake Center (SCEC), the Consortia of Universities for Research in Earthquake Engineering (CUREE) and the Incorporated Research Institutions for Seismology (IRIS). This digital library organizes earthquake information online as a partner with the NSF-funded National Science, Technology, Engineering and Mathematics (STEM) Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth Science, engineering, physics and mathematics. Although conceived primarily as an educational resource, the Encyclopedia is also a valuable portal to anyone seeking up-to-date earthquake information and authoritative technical sources. "E3" is a unique collaboration among earthquake scientists and engineers to articulate and document a common knowledge base with a shared terminology and conceptual framework. It is a platform for cross-training scientists and engineers in these complementary fields and will provide a basis for sustained communication and resource-building between major education and outreach activities. For example, the E3 collaborating organizations have leadership roles in the two largest earthquake engineering and earth science projects ever sponsored by NSF: the George E. Brown Network for Earthquake Engineering Simulation (CUREE) and the EarthScope Project (IRIS and SCEC). The E3 vocabulary and definitions are also being connected to a formal ontology under development by the SCEC/ITR project for knowledge management within the SCEC Collaboratory. The E3 development system is now fully operational, 165 entries are in the pipeline, and the development teams are capable of producing 20 new, fully reviewed encyclopedia entries each month. Over the next two years teams will complete 450 entries, which will populate the E3 collection to a level that fully spans earthquake science and engineering. Scientists, engineers, and educators who have suggestions for content to be included in the Encyclopedia can visit www.earthquake.info now to complete the "Suggest a Web Page" form.

  1. Evidence for Ancient Mesoamerican Earthquakes

    NASA Astrophysics Data System (ADS)

    Kovach, R. L.; Garcia, B.

    2001-12-01

    Evidence for past earthquake damage at Mesoamerican ruins is often overlooked because of the invasive effects of tropical vegetation and is usually not considered as a causal factor when restoration and reconstruction of many archaeological sites are undertaken. Yet the proximity of many ruins to zones of seismic activity would argue otherwise. Clues as to the types of damage that should be sought were offered in September 1999 when the M = 7.5 Oaxaca earthquake struck the ruins of Monte Alban, Mexico, where archaeological renovations were underway. More than 20 structures were damaged, 5 of them seriously. Damage features noted were walls out of plumb; fractures in walls, floors, basal platforms and tableros; toppling of columns; and deformation, settling and tumbling of walls. A Modified Mercalli Intensity of VII (ground accelerations 18-34% g) occurred at the site. Within the diffuse landward extension of the Caribbean plate boundary zone, M = 7+ earthquakes occur with repeat times of hundreds of years, arguing that many Maya sites were subjected to earthquakes. Damage to re-erected and reinforced stelae, walls, and buildings was witnessed at Quirigua, Guatemala, during an expedition underway when the 1976 M = 7.5 Guatemala earthquake on the Motagua fault struck. Excavations also revealed evidence (domestic pottery vessels and the skeleton of a child crushed under fallen walls) of an ancient earthquake occurring about the time of the demise and abandonment of Quirigua in the late 9th century. Striking evidence for sudden earthquake building collapse at the end of the Maya Classic Period ~A.D. 889 was found at Benque Viejo (Xunantunich), Belize, located 210 km north of Quirigua. It is argued that a M = 7.5 to 7.9 earthquake at the end of the Maya Classic Period, centered in the vicinity of the Chixoy-Polochic and Motagua fault zones, could have produced the contemporaneous earthquake damage at the above sites. As a consequence, this earthquake may have accelerated the collapse of hierarchical authority at these locations and may have contributed to the end of the Classic culture at other nearby sites in proximity to the Caribbean plate boundary zone.

  2. Using of Remote Sensing Techniques for Monitoring the Earthquakes Activities Along the Northern Part of the Syrian Rift System (LEFT-LATERAL),SYRIA

    NASA Astrophysics Data System (ADS)

    Dalati, Moutaz

    Earthquake mitigation can be achieved with a better knowledge of a region's infra- and substructures. High resolution Remote Sensing data can play a significant role in geological mapping and are essential for learning about the tectonic setting of a region. Remote Sensing is an effective method for identifying active faults from different data sources and for comparing the capability of various satellite sensors in active fault surveys. In this paper, a few digital image processing approaches used for enhancement and feature extraction related to faults are discussed. These methods include band ratios, filtering and texture statistics. The experimental results show that multi-spectral images have great potential for large-scale active fault investigation, and satisfactory results have also been obtained for concealed faults. Active faults have distinct features in satellite images: usually there are obvious straight lines, circular structures and other distinct patterns along fault locations. Remotely sensed Landsat ETM and SPOT XS/PAN imagery are often used in active fault mapping. Moderate and high resolution satellite images are the best choice, because in low resolution images the fault features may not be visible in most cases. The area under study is located in northwestern Syria, which is part of one of the most active deformation belts on the Earth today. This area and the western part of Syria lie along the great rift system (Left-Lateral, or African-Syrian Rift System). These areas are tectonically active and have produced many seismic events. The AL-Ghab graben complex is situated within this wide area of Cenozoic deformation. The system formed, initially, as a result of the break-up of the Arabian plate from the African plate, which indicates that these sites are active and in continual movement. In addition, the statistical analysis of Thematic Mapper data and the features from a digital elevation model (DEM) produced from SAR interferometry show the existence of spectral structures at the same sites. The Arabian plate is moving in a NNW direction, whereas the African plate is moving to the north; the left-lateral motion along the Dead Sea Fault accommodates the difference in movement rate between the two plates. The analysis of TM space imagery and digital image processing of spectral data show that the lineaments along the AL-Ghab graben may be considered as linear conjunctions accompanied by a complex fracturing system. This complex is affected by distant stresses accompanied by intensive forces. The digital image processing of radar imagery shows the presence of active and fresh faulting zones along the AL-Ghab graben. TM and SAR-DTM data also showed gradual color tones and interruptions of linear-elliptical shapes, reflecting the presence of discontinuity contours along the fault zone extension. These features point to an abundance of surface morphological indicators of fresh faults. Recent faulting is expressed as freshly exposed soil within the colluvial apron, visible by its light tone color. These indicators have been confirmed by field checks. Furthermore, the statistical digital analysis of the spectral data shows a distribution of spectral plumes. These plumes decrease in intensity and color contrast from the center of the site toward its edges.
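
    Of the enhancement approaches named above, the band ratio is the simplest to state: a pixel-wise ratio of two co-registered bands. The sketch below uses hypothetical 3x3 image chips and is not tied to the specific Landsat ETM or SPOT scenes of the study.

        import numpy as np

        def band_ratio(band_a, band_b, eps=1e-6):
            """Pixel-wise ratio of two co-registered image bands.

            Ratios suppress topographic shading common to both bands and can
            emphasize lithological or moisture contrasts along lineaments.
            Inputs are 2-D arrays of identical shape.
            """
            a = np.asarray(band_a, dtype=float)
            b = np.asarray(band_b, dtype=float)
            return a / (b + eps)

        # Hypothetical 3x3 chips standing in for two co-registered bands
        red = np.array([[52, 54, 90], [50, 55, 88], [51, 53, 92]], dtype=float)
        nir = np.array([[40, 41, 30], [39, 42, 31], [40, 40, 29]], dtype=float)
        print(np.round(band_ratio(red, nir), 2))   # the third column stands out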

  3. Radioanalytical Data Quality Objectives and Measurement Quality Objectives during a Federal Radiological Monitoring and Assessment Center Response

    SciTech Connect

    E. C. Nielsen

    2006-01-01

    During the early and intermediate phases of a nuclear or radiological incident, the Federal Radiological Monitoring and Assessment Center (FRMAC) collects environmental samples that are analyzed by organizations with radioanalytical capability. Resources dedicated to quality assurance (QA) activities must be sufficient to assure that appropriate radioanalytical measurement quality objectives (MQOs) and assessment data quality objectives (DQOs) are met. As the emergency stabilizes, QA activities will evolve commensurate with the need to reach appropriate DQOs. The MQOs represent a compromise between precise analytical determinations and the timeliness necessary for emergency response activities. Minimum detectable concentration (MDC), lower limit of detection, and critical level tests can all serve as measurements reflecting the MQOs. The relationship among protective action guides (PAGs), derived response levels (DRLs), and laboratory detection limits is described. The rationale used to determine the appropriate laboratory detection limit is described.

  4. Launch Complex 39 Observation Gantry Area (SWMU# 107) Annual Long-Term Monitoring Report (Year 1) Kennedy Space Center, Florida

    NASA Technical Reports Server (NTRS)

    Johnson, Jill W.; Towns, Crystal

    2015-01-01

    This document has been prepared by Geosyntec Consultants, Inc. (Geosyntec) to present and discuss the findings of the 2014 and 2015 Long-Term Monitoring (LTM) activities that were completed at the Launch Complex 39 (LC39) Observation Gantry Area (OGA) located at the John F. Kennedy Space Center (KSC), Florida (Site). The remainder of this report includes: (i) a description of the Site location; (ii) summary of Site background and previous investigations; (iii) description of field activities completed as part of the annual LTM program at the Site; (iv) groundwater flow evaluation; (v) presentation and discussion of field and analytical results; and (vi) conclusions and recommendations. Applicable KSC Remediation Team (KSCRT) Meeting minutes are included in Attachment A. This Annual LTM Letter Report was prepared by Geosyntec Consultants (Geosyntec) for NASA under contract number NNK12CA13B, Delivery Order NNK13CA39T project number PCN ENV2188.

  5. Earthquake location determination using data from DOMERAPI and BMKG seismic networks: A preliminary result of DOMERAPI project

    SciTech Connect

    Ramdhan, Mohamad; Nugraha, Andri Dian; Widiyantoro, Sri; Métaxian, Jean-Philippe; Valencia, Ayunda Aulia

    2015-04-24

    The DOMERAPI project has been conducted to comprehensively study the internal structure of Merapi volcano, especially the deep structural features beneath the volcano. The DOMERAPI earthquake monitoring network consists of 46 broad-band seismometers installed around the Merapi volcano. Earthquake hypocenter determination is a very important step for further studies, such as hypocenter relocation and seismic tomographic imaging. Ray paths from earthquake events occurring outside the Merapi region can be utilized to delineate the deep magma structure. Earthquakes occurring outside the DOMERAPI seismic network will produce an azimuthal gap greater than 180°. Owing to this situation, the stations of the BMKG seismic network can be used jointly to minimize the azimuthal gap. We identified earthquake events manually and carefully, and then picked arrival times of P and S waves. The data from the DOMERAPI seismic network were combined with the BMKG data catalogue to determine earthquake events outside the Merapi region. For future work, we will also use the BPPTKG (Center for Research and Development of Geological Disaster Technology) data catalogue in order to study shallow structures beneath the Merapi volcano. The application of all data catalogues will provide good information as input for further advanced studies and volcano hazard mitigation.
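
    The azimuthal gap referred to above is simply the largest angular gap between station azimuths as seen from the epicenter. A minimal sketch follows; the station azimuths are hypothetical and this is not the authors' location code.

        import numpy as np

        def azimuthal_gap(azimuths_deg):
            """Largest gap, in degrees, between consecutive station azimuths
            measured from an epicenter. Gaps above 180 degrees indicate a
            poorly constrained location.
            """
            az = np.sort(np.mod(np.asarray(azimuths_deg, dtype=float), 360.0))
            gaps = np.diff(np.append(az, az[0] + 360.0))   # wrap around north
            return float(np.max(gaps))

        # Hypothetical case: all stations lie to the south of the epicenter
        print(azimuthal_gap([150, 165, 180, 200, 230]))   # 280 degrees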

  6. The ICOS Ecosystem network and Thematic Center: an infrastructure to monitor and better understand the ecosystem GHGs exchanges

    NASA Astrophysics Data System (ADS)

    Papale, D.; Ceulemans, R.; Janssens, I.; Loustau, D.; Valentini, R.

    2012-12-01

    The ICOS Ecosystem network is part of the ICOS European Research Infrastructure (www.icos-infrastructure.eu), together with the Atmospheric and Ocean networks. The ICOS Ecosystem network includes highly standardized monitoring sites based on commercially available instruments embedded into an integrated system that is coordinated by the ICOS Ecosystem Thematic Center (ETC), which is responsible for methodological advancement, data processing and data distribution. The ecosystem monitoring activity will involve human intervention in field activities, and for this reason rigorously standardized protocols for field ecosystem measurements are in preparation, also in coordination with other related international activities. The core measurements at the ICOS Ecosystem sites are the main GHG fluxes, namely CO2, H2O, CH4 and N2O, obtained using the eddy covariance method and chambers for the soil effluxes. To better interpret and understand the GHG exchanges, a full series of meteorological data (including spectral reflectance measurements and the full radiation and water balance) is also collected, and the sites are characterized in terms of carbon stocks, nutrient availability, and management and disturbance history. Centralized raw data processing, QA/QC and uncertainty estimation, testing and development of new methodologies and techniques, assistance to the network, chemical analysis, and long-term storage of vegetation and soil samples are the main activities for which the ETC is responsible. The ETC, based in Italy with sections in Belgium and France, is under construction and will be operative in 2013. The Ecosystem network, including the variables collected, the protocols under preparation and the data access and data use policies, will be presented together with the Ecosystem Thematic Center's role and development strategy. The aim is to identify and discuss integration and collaboration with other similar initiatives, also thanks to the support of the COOPEUS European project that will facilitate coordination between US and EU networks, and to receive feedback from potential users of the infrastructure.

  7. The ICOS Ecosystem network and Thematic Center: an infrastructure to monitor and better understand the ecosystem GHGs exchanges

    NASA Astrophysics Data System (ADS)

    Janssens, Ivan; Papale, Dario; Ceulemans, Reinhart; Gielen, Bert; Loustau, Denis; de Beeck, Maarten Op; Valentini, Riccardo

    2013-04-01

    The ICOS Ecosystem network is part of the ICOS European Research Infrastructure (www.icos-infrastructure.eu), together with the Atmospheric and Ocean networks. The ecosystem network includes highly standardized monitoring sites based on commercially available instruments embedded into an integrated system that is coordinated by the ICOS Ecosystem Thematic Center (ETC), which is also responsible for methodological advancement, data processing and data distribution. The ecosystem monitoring activity will involve human intervention in field activities, and for this reason rigorously standardized protocols for field ecosystem measurements are in preparation, also in coordination with other related international activities. The core measurements at the ICOS Ecosystem sites are the main GHG fluxes, namely CO2, H2O, CH4 and N2O, obtained using the eddy covariance method and chambers for the soil effluxes. To better interpret and understand the GHG exchanges, a full series of meteorological data (including spectral reflectance measurements and the full radiation and water balance) is also collected, and the sites are characterized in terms of carbon stocks, nutrient availability, and management and disturbance history. Centralized raw data processing, QA/QC and uncertainty estimation, testing and development of new methodologies and techniques, assistance to the network, chemical analysis, and long-term storage of vegetation and soil samples are the main activities for which the ETC is responsible. The ETC, based in Italy with sections in Belgium and France, is under construction and will be operative in 2013. We present the current status of the Ecosystem network, including the variables collected, the protocols under preparation, and the data access and data use policies, as well as the Ecosystem Thematic Center's role and development strategy, with special emphasis on the approaches followed to reach a high level of standardization together with uncertainty quantification.

  8. Earthquake occurrence and effects.

    PubMed

    Adams, R D

    1990-01-01

    Although earthquakes are mainly concentrated in zones close to boundaries of tectonic plates of the Earth's lithosphere, infrequent events away from the main seismic regions can cause major disasters. The major cause of damage and injury following earthquakes is elastic vibration, rather than fault displacement. This vibration at a particular site will depend not only on the size and distance of the earthquake but also on the local soil conditions. Earthquake prediction is not yet generally fruitful in avoiding earthquake disasters, but much useful planning to reduce earthquake effects can be done by studying the general earthquake hazard in an area, and taking some simple precautions. PMID:2347628

  9. Biological monitoring of mercury exposure in individuals referred to a toxicological center in Venezuela.

    PubMed

    Rojas, Maritza; Seijas, David; Agreda, Olga; Rodríguez, Maritza

    2006-02-01

    People in developing countries are often considered at greater risk of mercury (Hg) poisoning due to a variety of factors, including a lack of awareness regarding their occupational risks. Individuals requiring urine mercury (U-Hg) analysis at the Center for Toxicological Investigations of the University of Carabobo (CITUC) between 1998 and 2002 were studied to identify demographic characteristics associated with U-Hg levels. The studied population included individuals with a history of exposure (or related exposures) to Hg processes and comprised 1159 individuals (65 children, 1094 adults) aged 0.58-79 years, mean 36.63+/-12.4. The children's geometric mean U-Hg level was 2.73 microg/g creatinine (Ct), and the adults' was 2.55 microg/g Ct. The most frequent adult occupations were shipyard workers (35.47%), dentists (23.5%), lab technicians (11.43%), dental employees (10.42%) and miners (10.2%). Chemical laboratory technicians had the highest mean U-Hg (4.46 microg/g Ct). Mean U-Hg levels in female adults (3.45 microg/g Ct) were statistically higher than levels in male adults (2.15 microg/g Ct). Two of the 172 women of reproductive age had U-Hg levels higher than 78 microg/g Ct. Individuals from Falcon State were found to have the highest mean U-Hg (4.53 microg/g Ct). U-Hg levels higher than permissible limits were found in only 2 states (Carabobo and Bolivar), with a total of 24 cases. Although the results of this investigation were highly variable, the findings can be used to examine circumstances which influence mercury toxicity trends, and possibly used in future studies working to identify Hg exposures. PMID:16399001

  10. Data Management and Site-Visit Monitoring of the Multi-Center Registry in the Korean Neonatal Network

    PubMed Central

    Choi, Chang Won

    2015-01-01

    The Korean Neonatal Network (KNN), a nationwide prospective registry of very-low-birth-weight (VLBW, < 1,500 g at birth) infants, was launched in April 2013. Data management (DM) and site-visit monitoring (SVM) were crucial in ensuring the quality of the data collected from 55 participating hospitals across the country on 116 clinical variables. We describe the processes and results of DM and SVM performed during the establishment stage of the registry. The DM procedure included automated proof checks, electronic data validation, query creation, query resolution, and revalidation of the corrected data. SVM included SVM team organization, identification of unregistered cases, source document verification, and post-visit report production. By March 31, 2015, 4,063 VLBW infants were registered and 1,693 queries were produced. Of these, 1,629 queries were resolved and 64 queries remain unresolved. By November 28, 2014, 52 participating hospitals were visited, with 136 site-visits completed since April 2013. Each participating hospital was visited biannually. DM and SVM were performed to ensure the quality of the data collected for the KNN registry. Our experience with DM and SVM can be applied for similar multi-center registries with large numbers of participating centers. PMID:26566353

  12. Nonextensive characteristics of earthquakes magnitude distribution in Javakheti region, Georgia

    NASA Astrophysics Data System (ADS)

    Chelidze, Tamaz; Matcharashvili, Teimuraz; Jorjiashvili, Nato; Javakhishvili, Zurab

    2010-05-01

    For the last several years, nonextensive statistical mechanics has been increasingly used to study a wide range of complex phenomena exhibiting scale-free behavior in different domains. It is assumed that nonextensivity concepts may provide a suitable framework to shed new light on features of the spatiotemporal and energetic behavior of seismic processes which are not yet fully understood. In the present research we studied the cumulative distribution of earthquake magnitudes in the Caucasus from both common and nonextensive statistical mechanics points of view. Data sets of earthquake magnitudes from 1960 to 1991 were compiled from the databases of the Seismic Monitoring Center at Ilia State University in Georgia. The Javakheti region in southern Georgia was selected based on its geological structure and high seismic activity; the exact time interval was specified because of increased seismic activity in the Caucasus during that period. Together with common seismic characteristics such as the a and b values of the Gutenberg-Richter relationship, we evaluated nonextensive characteristics in the framework of the earthquake fragment-asperity interaction model; namely, the nonextensive parameter q and the energy density value a were calculated. All these characteristics were assessed for the whole observation period as well as for consecutive 10-year overlapping sliding windows. The calculated nonextensive characteristics, both for the whole catalogue and for the sliding windows (q = 1.6-1.83), are close to the range found earlier for other regions. At the same time, both a and q values vary over the investigated period for consecutive sliding windows. These changes are statistically significant and are evidently related to the earthquake generation process of the Javakheti region. Indeed, the nonextensivity parameter increases with local seismic activity, which may point to an increase of the functional relationship between the above parameters prior to and during earthquake generation, while the energy density value a, which is assumed to be related to the spatial distribution, decreases after the strongest event of the considered time period. These results point to increased long-range correlations of the seismic process in the energetic and spatial domains prior to and during the strongest regional earthquakes. The results of the nonextensive analysis are in good accordance with the b value analysis: after the strongest event the b value increases and a decreases, which is consistent with the physical meaning of these parameters. The results of our research support the assumption that nonextensive statistics can provide a promising new approach to earthquake distribution features in different domains.
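
    As an illustration of the conventional side of such an analysis, the Gutenberg-Richter b value can be estimated from a magnitude catalogue with the Aki/Utsu maximum-likelihood formula, b = log10(e) / (mean(M) - Mc + dM/2), where Mc is the completeness magnitude and dM the magnitude bin width. A minimal sketch on synthetic magnitudes (not the Ilia State University catalogue):

    ```python
    import numpy as np

    def b_value_mle(mags, m_c, dm=0.0):
        """Aki/Utsu maximum-likelihood b value for magnitudes >= m_c.
        For a catalogue binned to width dm, pass dm to apply the half-bin correction."""
        m = np.asarray(mags, dtype=float)
        m = m[m >= m_c]
        return np.log10(np.e) / (m.mean() - m_c + dm / 2.0)

    # Quick check on a synthetic Gutenberg-Richter catalogue with b = 1.0
    rng = np.random.default_rng(0)
    mags = 2.0 + rng.exponential(scale=1.0 / np.log(10.0), size=20000)  # continuous M >= 2
    print(round(b_value_mle(mags, m_c=2.0), 2))   # ~1.0
    ```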

  13. Investigation on the Possible Relationship between Magnetic Pulsations and Earthquakes

    NASA Astrophysics Data System (ADS)

    Jusoh, M.; Liu, H.; Yumoto, K.; Uozumi, T.; Takla, E. M.; Yousif Suliman, M. E.; Kawano, H.; Yoshikawa, A.; Asillam, M.; Hashim, M.

    2012-12-01

    The sun is the main source of energy to the solar system, and it plays a major role in affecting the ionosphere, the atmosphere and the Earth's surface. The connection between the solar wind and ground magnetic pulsations has been demonstrated empirically by several researchers (H. J. Singer et al., 1977; E. W. Greenstadt, 1979; I. A. Ansari, 2006, to name a few). In our preliminary statistical analysis of the relationship between solar and seismic activities (Jusoh and Yumoto, 2011; Jusoh et al., 2012), we observed a high possibility of solar-terrestrial coupling: earthquakes tend to occur during the lower phases of solar cycles, in significant relation to solar wind parameters (i.e. solar wind dynamic pressure, speed and input energy). However, a clear coupling mechanism has not yet been established. To connect the solar impact to seismicity, we investigate ground magnetic pulsations as one possible connecting agent. In our analysis, the recorded ground magnetic pulsations are analyzed in different ultra-low-frequency ranges, Pc3 (22-100 mHz), Pc4 (6.7-22 mHz) and Pc5 (1.7-6.7 mHz), together with the occurrence of local earthquake events in certain time periods. The analysis focuses on two major seismic regions: north Japan (mid latitude) and north Sumatera, Indonesia (low latitude). Solar wind parameters were obtained from the Goddard Space Flight Center, NASA, via the OMNIWeb Data Explorer and the Space Physics Data Facility. Earthquake events were extracted from the Advanced National Seismic System (ANSS) database. The localized Pc3-Pc5 magnetic pulsation data were extracted from Magnetic Data Acquisition System (MAGDAS)/Circum Pan Magnetic Network (CPMN) stations at Ashibetsu (Japan), for earthquakes monitored in north Japan, and Langkawi (Malaysia), for earthquakes observed in north Sumatera. These magnetometer arrays were established by the International Center for Space Weather Science and Education, Kyushu University, Japan. From the results, we observed significant correlations between ground magnetic pulsations and solar wind speed for different earthquake epicenter depths. The details of the analysis will be discussed in the presentation.
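
    The Pc3-Pc5 separation described above is essentially a band-pass filtering step. A minimal sketch using zero-phase Butterworth filters (assumes a 1 Hz-sampled magnetometer trace; variable names are illustrative, not MAGDAS/CPMN processing code):

    ```python
    import numpy as np
    from scipy import signal

    # ULF bands in Hz (Pc3: 22-100 mHz, Pc4: 6.7-22 mHz, Pc5: 1.7-6.7 mHz)
    BANDS = {"Pc3": (0.022, 0.100), "Pc4": (0.0067, 0.022), "Pc5": (0.0017, 0.0067)}

    def split_pc_bands(b_field, fs=1.0):
        """Return Pc3/Pc4/Pc5 band-passed versions of a magnetometer trace."""
        out = {}
        for name, (lo, hi) in BANDS.items():
            sos = signal.butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
            out[name] = signal.sosfiltfilt(sos, b_field)   # zero-phase filtering
        return out

    # Example with a synthetic trace: 6 hours of 1 Hz data
    t = np.arange(6 * 3600)
    trace = np.sin(2 * np.pi * 0.05 * t) + 0.5 * np.sin(2 * np.pi * 0.003 * t)
    bands = split_pc_bands(trace)
    ```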

  14. Anomalous Schumann resonance observed in China, possibly associated with Honshu, Japan Earthquake

    NASA Astrophysics Data System (ADS)

    Ouyang, X. Y.; Zhang, X. M.; Shen, X. H.; Miao, Y. Q.

    2012-04-01

    Schumann resonance (hereafter SR) occurs in the cavity between the Earth and the ionosphere and originates from global lightning activity [1]. Some recent publications have shown that anomalous SR phenomena may occur before major earthquakes [2-4]. Considering the good prospects for applying SR to earthquake monitoring, we have established four observatories in Yunnan province, a region with frequent seismicity in the southwest of China. Our instruments provide three components of the magnetic field in 0-30 Hz: BNS (north-south component), BEW (east-west component) and BV (vertical component); the sampling frequency is 100 Hz. In this research we use high-quality data recorded at the Yongsheng observatory (geographic coordinates: 26.7°N, 100.77°E) to analyze SR phenomena and to identify anomalous effects possibly related to the Ms 9.0 earthquake (epicenter: 38.297°N, 142.372°E) near the east coast of Honshu, Japan, on 11 March 2011. We selected data from 15 days before and after the earthquake. SR in BNS and SR in BEW differ in their background characteristics. The frequencies of the four SR modes in BNS are generally higher than those in BEW. The amplitude of SR in BNS is strong at around 05:00 LT, 15:00 LT and 23:00 LT, while the amplitude of SR in BEW is intense only around 16:00 LT, corresponding to about 08:00 UT. Because the American, African and Asian thunderstorm centers play their dominant roles in the intervals 21:00 UT±1 h, 15:00 UT±1 h and 08:00 UT±1 h respectively [1, 3], SR in BEW is most sensitive to signals from the Asian center, whereas SR in BNS responds well to all three centers. SR in BNS and SR in BEW also show different features with regard to anomalous effects related to earthquakes. The BEW component gives a clear picture of anomalous SR phenomena, characterized by an increase in the amplitude of the four SR modes and an increase in the frequency of the first SR mode several days before the earthquake. The amplitude of the four SR modes began to increase four days before the Honshu earthquake (7 March), continued through the day of the earthquake (11 March), and then fell back to the usual intensity after the earthquake (12 March). The frequency of the first SR mode in BEW unconventionally exceeded the first-mode frequency in BNS, with an enhancement of 0.7 Hz, on 8 and 9 March. We did not find similar anomalous effects in BNS. The anomalous effects in BEW may be caused by interference between the direct path from the Asian center to the observatory and a disturbed path scattered by the perturbation in the ionosphere over Honshu. More detailed analysis is ongoing. 1. Nickolaenko A P and Hayakawa M, Resonances in the Earth-ionosphere cavity. 2002: Kluwer Academic Pub. 2. Hayakawa M, Ohta K, Nickolaenko A P, et al. Anomalous effect in Schumann resonance phenomena observed in Japan, possibly associated with the Chi-chi earthquake in Taiwan. Annales Geophysicae, 2005. pp. 1335-1346. 3. Hayakawa M, Nickolaenko A P, Sekiguchi M, et al., Anomalous ELF phenomena in the Schumann resonance band as observed at Moshiri (Japan) in possible association with an earthquake in Taiwan. Nat. Hazards Earth Syst. Sci, 2008. 8(6): p. 1309-1316. 4. Ohta K, Izutsu J, and Hayakawa M, Anomalous excitation of Schumann resonances and additional anomalous resonances before the 2004 Mid-Niigata prefecture earthquake and the 2007 Noto Hantou Earthquake. Physics and Chemistry of the Earth, Parts A/B/C, 2009. 34(6-7): p. 441-448.
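
    Extracting SR mode amplitudes and frequencies from such records is typically done from the power spectral density of each magnetic component. A minimal sketch with Welch's method (synthetic 100 Hz data; the observatory's actual processing chain is not described in the abstract):

    ```python
    import numpy as np
    from scipy import signal

    fs = 100.0                      # sampling frequency, Hz
    t = np.arange(0, 600, 1 / fs)   # 10 minutes of data
    rng = np.random.default_rng(1)
    # Synthetic trace: first four SR modes (~7.8, 14, 20, 26 Hz) plus noise
    trace = sum(a * np.sin(2 * np.pi * f * t)
                for a, f in [(1.0, 7.8), (0.6, 14.1), (0.4, 20.3), (0.3, 26.4)])
    trace += 0.5 * rng.standard_normal(t.size)

    f, pxx = signal.welch(trace, fs=fs, nperseg=4096)   # ~0.024 Hz resolution
    band = (f > 5) & (f < 30)
    peak_freq = f[band][np.argmax(pxx[band])]           # dominant SR mode frequency
    print(round(peak_freq, 2))
    ```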

  15. Napa Earthquake impact on water systems

    NASA Astrophysics Data System (ADS)

    Wang, J.

    2014-12-01

    The South Napa earthquake occurred in Napa, California, on August 24 at 3 a.m. local time, with a magnitude of 6.0. It was the largest earthquake in the San Francisco Bay Area since the 1989 Loma Prieta earthquake, and economic losses topped $1 billion. Wine makers were cleaning up and estimating the damage to tourism; around 15,000 cases of cabernet poured out into the garden at the Hess Collection. Earthquakes can raise water pollution risks and could cause a water crisis. California has suffered water shortages in recent years, so understanding how to prevent groundwater and surface water pollution from earthquakes could be helpful. This research gives a clear view of the drinking water system in California and of pollution of river systems, as well as an estimation of earthquake impacts on water supply. The Sacramento-San Joaquin River Delta (close to Napa) is the center of the state's water distribution system, delivering fresh water to more than 25 million residents and 3 million acres of farmland. Delta water conveyed through a network of levees is crucial to Southern California. The drought has significantly curtailed water export, and salt water intrusion has reduced fresh water outflows. Strong shaking from a nearby earthquake can cause liquefaction of saturated, loose, sandy soils and could potentially damage major Delta levee systems near Napa. The Napa earthquake is a wake-up call for Southern California: a future event could potentially damage the freshwater supply system.

  16. Monitors.

    ERIC Educational Resources Information Center

    Powell, David

    1984-01-01

    Provides guidelines for selecting a monitor to suit specific applications, explains the process by which graphics images are produced on a CRT monitor, and describes four types of flat-panel displays being used in the newest lap-sized portable computers. A comparison chart provides prices and specifications for over 80 monitors. (MBR)

  17. Multiple asperity model for earthquake prediction

    USGS Publications Warehouse

    Wyss, M.; Johnston, A.C.; Klein, F.W.

    1981-01-01

    Large earthquakes often occur as multiple ruptures reflecting strong variations of stress level along faults. Dense instrument networks with which the volcano Kilauea is monitored provided detailed data on changes of seismic velocity, strain accumulation and earthquake occurrence rate before the 1975 Hawaii 7.2-mag earthquake. During the ~4 yr of preparation time the mainshock source volume had separated into crustal volumes of high stress levels embedded in a larger low-stress volume, showing respectively high- and low-stress precursory anomalies. © 1981 Nature Publishing Group.

  18. Present Status of the Tsukuba Magnet Laboratory. A Report on the Aftereffects of the March 11, 2011 Earthquake

    NASA Astrophysics Data System (ADS)

    Nimori, Shigeki

    2014-10-01

    The Tsukuba Magnet Laboratory (TML) is located 324 km from the seismic center of the first 9.0 magnitude earthquake that struck Japan on Friday, March 11, 2011. TML suffered peak ground acceleration of 372 Gal. The large 930 and 1030 MHz nuclear magnetic resonance (NMR) magnets of TML were severely affected by the earthquake. The hybrid magnet and its control system were not significantly damaged. After the earthquake, serious electricity shortages occurred and our awareness of the importance of energy conservation increased. A control system for a hybrid magnet has been in development for several years. The system has sophisticated monitoring capability, detailed and rapid data recording, and is now nearing completion. The newly developed system provides detailed data; our ability to interpret this data and identify difficulties in the acquisition of critical data is improving. We are now beginning to optimize operations to reduce electricity consumption and achieve higher efficiency magnet operations.

  19. Post-Sumatra Enhancements at the Pacific Tsunami Warning Center

    NASA Astrophysics Data System (ADS)

    McCreery, C.; Weinstein, S.; Becker, N.; Cessaro, R.; Hirshorn, B.; Fryer, G.; Hsu, V.; Sardina, V.; Koyanagi, S.; Shiro, B.; Wang, D.; Walsh, D.

    2007-12-01

    Following the tragic Indian Ocean Tsunami of 2004, the Richard Hagemeyer Pacific Tsunami Warning Center (PTWC) has dramatically enhanced its capabilities. With improved communications, PTWC now ingests seismic data from almost all broadband stations of the Global Seismographic Network and will soon add many stations from the International Monitoring System. As data sources increase, PTWC's response time to any earthquake declines; for most earthquakes the center now gets out an initial message in about 12 minutes. With 24-hour staffing, that performance is maintained around the clock. Direct measurement of tsunamis has been improved through communications upgrades to coastal tide gauges by NOAA and other collaborators in the Pacific Tsunami Warning System, and by the NOAA deployment of DART instruments throughout the world's oceans. In addition to providing warnings for the Pacific (with the exception of Alaska and the west coasts of the U.S. and Canada, which are the responsibility of the West Coast and Alaska Tsunami Warning Center), PTWC also operates as an interim warning center for the Indian Ocean (a task performed in collaboration with the Japan Meteorological Agency) and the Caribbean. PTWC also operates as a local warning center for the State of Hawaii. In Hawaii, the installation of new seismometers again means a continuous reduction in PTWC's response times. Initial assessments of local earthquakes are routinely accomplished in less than five minutes, and the first message for the Kiholo Bay Earthquake of 2006 was issued in only three minutes. With the development of the Hawaii Integrated Seismographic Network, in collaboration with the U.S. Geological Survey, the goal is to reduce the time for tsunami warnings to under two minutes for any earthquake in the Hawaiian Islands.

  20. EARTHQUAKE TRIGGERING AND SPATIAL-TEMPORAL RELATIONS IN THE VICINITY OF YUCCA MOUNTAIN, NEVADA

    SciTech Connect

    na

    2001-02-08

    It is well accepted that the 1992 M 5.6 Little Skull Mountain earthquake, the largest historical event to have occurred within 25 km of Yucca Mountain, Nevada, was triggered by the M 7.2 Landers earthquake that occurred the day before. On the premise that earthquakes can be triggered by applied stresses, we have examined the earthquake catalog from the Southern Great Basin Digital Seismic Network (SGBDSN) for other evidence of triggering by external and internal stresses. This catalog now comprises over 12,000 events, encompassing five years of consistent monitoring, and has a low threshold of completeness, varying from M 0 in the center of the network to M 1 at the fringes. We examined the SGBDSN catalog response to external stresses such as large signals propagating from teleseismic and regional earthquakes, microseismic storms, and earth tides. Results are generally negative. We also examined the interplay of earthquakes within the SGBDSN. The number of "foreshocks", as judged by most criteria, is significantly higher than expected from the background seismicity rate. In order to establish this, we first removed aftershocks from the catalog with widely used methodology. The existence of SGBDSN foreshocks is supported by comparing actual statistics to those of a simulated catalog with uniform-distributed locations and Poisson-distributed times of occurrence. The probabilities of a given SGBDSN earthquake being followed by one having a higher magnitude within a short time frame and within a close distance are at least as high as those found with regional catalogs. These catalogs have completeness thresholds two to three units higher in magnitude than the SGBDSN catalog used here. The largest earthquake in the SGBDSN catalog, the M 4.7 event in Frenchman Flat on 01/27/1999, was preceded by a definite foreshock sequence. The largest event within 75 km of Yucca Mountain in historical time, the M 5.7 Scotty's Junction event of 08/01/1999, was also preceded by foreshocks. The monitoring area of the SGBDSN has been in a long period of very low moment release rate since February of 1999. The seismicity catalog to date suggests that the next significant (M > 4) earthquake within the SGBDSN will be preceded by foreshocks.
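
    The comparison described above, between the real catalog and a synthetic one with uniformly distributed locations and Poisson-distributed occurrence times, can be sketched as follows (illustrative parameters and window sizes only; not the SGBDSN analysis code):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def synthetic_catalog(n_events, duration_days, region_km=100.0, b=1.0, m_min=0.0):
        """Poisson-process times (uniform in a fixed window), uniform epicenters, G-R magnitudes."""
        times = np.sort(rng.uniform(0.0, duration_days, n_events))
        x, y = rng.uniform(0.0, region_km, (2, n_events))
        mags = m_min + rng.exponential(1.0 / (b * np.log(10.0)), n_events)
        return times, x, y, mags

    def count_foreshock_pairs(times, x, y, mags, dt_days=3.0, dr_km=10.0):
        """Count events followed by a larger event within dt_days and dr_km."""
        count = 0
        for i in range(len(times)):
            later = (times > times[i]) & (times <= times[i] + dt_days)
            near = np.hypot(x - x[i], y - y[i]) <= dr_km
            count += int(np.any(later & near & (mags > mags[i])))
        return count

    t, x, y, m = synthetic_catalog(12000, duration_days=5 * 365)
    print(count_foreshock_pairs(t, x, y, m))   # baseline count expected by chance
    ```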

  1. The Mw=8.8 Maule earthquake aftershock sequence, event catalog and locations

    NASA Astrophysics Data System (ADS)

    Meltzer, A.; Benz, H.; Brown, L.; Russo, R. M.; Beck, S. L.; Roecker, S. W.

    2011-12-01

    The aftershock sequence of the Mw=8.8 Maule earthquake off the coast of Chile in February 2010 is one of the best-recorded aftershock sequences from a great megathrust earthquake. Immediately following the Maule earthquake, teams of geophysicists from Chile, France, Germany, Great Britain and the United States coordinated resources to capture aftershocks and other seismic signals associated with this significant earthquake. In total, 91 broadband, 48 short-period, and 25 accelerometer stations were deployed above the rupture zone of the main shock from 33-38.5°S and from the coast to the Andean range front. In order to integrate these data into a unified catalog, the USGS National Earthquake Information Center developed procedures to use its real-time seismic monitoring system (Bulletin Hydra) to detect, associate, locate, and compute earthquake source parameters from these stations. As a first step in the process, the USGS has built a seismic catalog of all M3.5 or larger earthquakes for the time period of the main aftershock deployment, March 2010 to October 2010. The catalog includes earthquake locations, magnitudes (Ml, Mb, Mb_BB, Ms, Ms_BB, Ms_VX, Mc), associated phase readings and regional moment tensor solutions for most of the M4 or larger events. Also included in the catalog are teleseismic phases and amplitude measures and body-wave MT and CMT solutions for the larger events, typically M5.5 and larger. Tuning of automated detection and association parameters should allow a complete catalog of events to approximately M2.5 or larger for that dataset of more than 164 stations. We characterize the aftershock sequence in terms of magnitude, frequency, and location over time. Using the catalog locations and travel times as a starting point, we use double-difference techniques to investigate relative locations and earthquake clustering. In addition, phase data from candidate ground-truth events and modeling of surface waves can be used to calibrate the velocity structure of central Chile to improve real-time monitoring.

  2. Interpretations on the Geologic Setting of Yogyakarta Earthquakes 2006 (Central Java, Indonesia) Based on Integration of Aftershock Monitoring and Existing Geologic, Geophysical and Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Setijadji, L. D.; Watanabe, K.; Fukuoka, K.; Ehara, S.; Setiadji, Y.; Rahardjo, W.; Susilo, A.; Barianto, D. H.; Harijoko, A.; Sudarno, I.; Pramumijoyo, S.; Hendrayana, H.; Akmalludin, A.; Nishijima, J.; Itaya, T.

    2007-05-01

    The unprecedented 26 May 2006 Yogyakarta earthquake (central Java, Indonesia), which claimed 5,700 lives, is generally accepted to have had a depth of about 10 km and a moment magnitude of 6.4. However, the location of the causative active fault is still under debate, as the epicenter of the mainshock was reported quite differently by several institutions. Many researchers believe that the Opak fault, located at the eastern boundary between the Yogyakarta lowland area (or Yogyakarta Basin) and the highland region of the Southern Mountains, was the source of the 2006 earthquakes. However, our aftershock observations suggest that the ruptured zone was not located along the Opak fault but along an unknown fault located about 10 km to its east, within the Southern Mountains domain. Unfortunately, surface geologic manifestations are scarce, as this area is now largely covered by limestone. Therefore the suspected active fault system must be studied through interpretations of the subsurface geology and evaluation of the Cenozoic geo-history of the region, utilizing existing geologic, geophysical and remote sensing data. This work suggests that the Yogyakarta Basin is a volcano-tectonic depression formed gradually since the early Tertiary period (Oligo-Miocene or older). Geological and geophysical evidence suggests that structural trends changed from the Oligocene NE-SW towards the Oligo-Miocene NNE-SSW and the Plio-Pleistocene NW-SE and E-W directions. The ruptured "X" fault of the 2006 Yogyakarta earthquakes is likely to be a NNE-SSW trending fault parallel to the Opak fault; both were first active in the Oligo-Miocene as sinistral strike-slip faults. However, while the Opak fault changed into a normal fault after the Pliocene, evidence from Kali Ngalang and Kali Widoro suggests that the "X" fault system was still reactivated as a strike-slip fault during the Plio-Pleistocene orogeny. As this new interpretation of the active fault introduces a spatial discrepancy between the locations of earthquake epicenters and the most heavily damaged regions, other geo-engineering factors must be considerably important in determining the final scale of seismic hazards. The most vulnerable areas for seismic hazards are those located nearest to the ruptured fault and underlain by thick Quaternary unconsolidated deposits. In regions along the fault line, seismic hazards seem to reach more distant areas, as in the case of the Gantiwarno region, because seismic waves can travel more easily along the fault line.

  3. Long-term monitoring of creep rate along the Hayward fault and evidence for a lasting creep response to 1989 Loma Prieta earthquake

    USGS Publications Warehouse

    Lienkaemper, J.J.; Galehouse, J.S.; Simpson, R.W.

    2001-01-01

    We present results from over 30 yr of precise surveys of creep along the Hayward fault. Along most of the fault, spatial variability in long-term creep rates is well determined by these data and can help constrain 3D-models of the depth of the creeping zone. However, creep at the south end of the fault stopped completely for more than 6 years after the M7 1989 Loma Prieta Earthquake (LPEQ), perhaps delayed by stress drop imposed by this event. With a decade of detailed data before LPEQ and a decade after it, we report that creep response to that event does indeed indicate the expected deficit in creep.
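
    Long-term creep rates of the kind reported here are commonly obtained by fitting a line to repeated measurements of surface offset versus time at each survey site. A minimal least-squares sketch (synthetic offsets; not the actual Hayward fault data):

    ```python
    import numpy as np

    def creep_rate(dates_yr, offsets_mm):
        """Least-squares slope (mm/yr) of fault-parallel offset vs. time."""
        slope, intercept = np.polyfit(dates_yr, offsets_mm, 1)
        return slope

    # Synthetic site creeping at ~4.5 mm/yr with measurement noise
    rng = np.random.default_rng(3)
    years = np.arange(1970.0, 2001.0, 0.5)
    offsets = 4.5 * (years - years[0]) + rng.normal(0.0, 1.0, years.size)
    print(round(creep_rate(years, offsets), 2))   # ~4.5 mm/yr
    ```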

  4. Design and characterization of the beam monitor detectors of the Italian National Center of Oncological Hadron-therapy (CNAO)

    NASA Astrophysics Data System (ADS)

    Giordanengo, S.; Donetti, M.; Garella, M. A.; Marchetto, F.; Alampi, G.; Ansarinejad, A.; Monaco, V.; Mucchi, M.; Pecka, I. A.; Peroni, C.; Sacchi, R.; Scalise, M.; Tomba, C.; Cirio, R.

    2013-01-01

    A new hadron-therapy facility implementing an active beam scanning technique has been developed at the Italian National Center of Oncological Hadron-therapy (CNAO). This paper presents the design and the characterization of the beam monitor detectors developed for the on-line monitoring and control of the dose delivered during a treatment at CNAO. The detectors are based on five parallel-plate transmission ionization chambers with either a single large electrode or electrodes segmented into 128 strips (strip chambers) or 32×32 pixels (pixel chamber). The detectors are arranged in two independent boxes with an active area larger than 200×200 mm2 and a total water-equivalent thickness along the beam path of about 0.9 mm. A custom 64-channel front-end chip converts the integrated ionization charge without dead time. The detectors were tested at the clinical proton beam facility of the Paul Scherrer Institut (PSI), which implements a spot scanning technique, each spot being characterized by a predefined number of protons delivered with a pencil beam at a specified point of the irradiation field. The short-term instability was measured by delivering several identical spots within a time interval of a few tenths of a second and is found to be lower than 0.3%. The non-uniformity, measured by delivering sequences of spots at different points of the detector surface, is found to be lower than 1% in the single-electrode chambers and lower than 1.5% in the strip and pixel chambers, reducing to less than 0.5% and 1%, respectively, in the restricted 100×100 mm2 central area of the detector.
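
    Figures of merit such as the short-term instability and non-uniformity quoted above are essentially relative spreads of repeated charge readings. A minimal sketch of how they might be computed (hypothetical counts, not CNAO data or software):

    ```python
    import numpy as np

    def relative_spread_percent(readings):
        """Standard deviation as a percentage of the mean (instability / non-uniformity)."""
        r = np.asarray(readings, dtype=float)
        return 100.0 * r.std(ddof=1) / r.mean()

    # Repeated identical spots at one position -> short-term instability
    same_spot = [10012, 10001, 9995, 10020, 9990, 10008]
    # One spot per position across the detector surface -> non-uniformity
    per_position = [10050, 9980, 10010, 9930, 10070, 9990]

    print(round(relative_spread_percent(same_spot), 2), "%")
    print(round(relative_spread_percent(per_position), 2), "%")
    ```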

  5. The Seminole Serpent Warrior At Miramar, FL, Shows Settlement Locations Enabled Environmental Monitoring Reminiscent Of the Four-corners Kokopelli-like EMF Phenomena, and Related to Earthquakes, Tornados and Hurricanes.

    NASA Astrophysics Data System (ADS)

    Balam Matagamon, Chan; Pawa Matagamon, Sagamo

    2004-03-01

    Certain Native Americans of the past seem to have correctly deduced that significant survival information for their tradition-respecting cultures resided in EMF-based phenomena that they were monitoring. This is based upon their myths and the place or cult-hero names they bequeathed us. The sites we have located in FL have been detectable by us visually, usually by faint blue light, or by the elicitation of pin-like prickings, by somewhat intense nervous-system response, by EMF interactions with aural electrochemical systems that can elicit tinnitus, and in other ways. In the northeast, Cautantowit served as a harbinger of Indian summer, and appears to be another alter ego of the EMF. The Miami, FL Tequesta site along the river clearly correlates with tornado, earthquake and hurricane locations. Sites like the Mojave Desert's giant man may have had similar significance.

  6. Greenhouse gas (GHG) mitigation and monitoring technology performance: Activities of the GHG Technology Verification Center. Report for January 1998--January 1999

    SciTech Connect

    Masemore, S.; Kirchgessner, D.A.

    1999-05-01

    The paper discusses greenhouse gas (GHG) mitigation and monitoring technology performance activities of the GHG Technology Verification Center. The Center is a public/private partnership between Southern Research Institute and the US EPA's Office of Research and Development. The Center is part of EPA's Environmental Technology Verification (ETV) Program, which has established 12 verification centers to evaluate a wide range of technologies in various environmental media and technology areas. The Center has published the results of its first verification: use of a phosphoric acid fuel cell to produce electricity from landfill gas. It has also initiated three new field verifications, two on technologies that reduce methane emissions from natural gas transmission compressors, and one on a new microturbine electricity production technology.

  7. Center for Integration of Natural Disaster Information

    USGS Publications Warehouse

    U.S. Geological Survey

    2001-01-01

    The U.S. Geological Survey's Center for Integration of Natural Disaster Information (CINDI) is a research and operational facility that explores methods for collecting, integrating, and communicating information about the risks posed by natural hazards and the effects of natural disasters. The U.S. Geological Survey (USGS) is mandated by the Robert Stafford Act to warn citizens of impending landslides, volcanic eruptions, and earthquakes. The USGS also coordinates with other Federal, State, and local disaster agencies to monitor threats to communities from floods, coastal storms, wildfires, geomagnetic storms, drought, and outbreaks of disease in wildlife populations.

  8. Tracking Earthquake Cascades

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2011-12-01

    In assessing their risk to society, earthquakes are best characterized as cascades that can propagate from the natural environment into the socio-economic (built) environment. Strong earthquakes rarely occur as isolated events; they usually cluster in foreshock-mainshock-aftershock sequences, seismic swarms, and extended sequences of large earthquakes that propagate along major fault systems. These cascades are regulated by stress-mediated interactions among faults driven by tectonic loading. Within these cascades, each large event can itself cause a chain reaction in which the primary effects of faulting and ground shaking induce secondary effects, including tsunamis, landslides, and liquefaction, and set off destructive processes within the built environment, such as fires and radiation leakage from nuclear plants. Recent earthquakes have demonstrated how the socio-economic effects of large earthquakes can reverberate for many years. To reduce earthquake risk and improve the resiliency of communities to earthquake damage, society depends on five geotechnologies for tracking earthquake cascades: long-term probabilistic seismic hazard analysis (PSHA), short-term (operational) earthquake forecasting, earthquake early warning, tsunami warning, and the rapid production of post-event information for response and recovery (see figure). In this presentation, I describe how recent advances in earthquake system science are leading to improvements in this geotechnology pipeline. In particular, I will highlight the role of earthquake simulations in predicting strong ground motions and their secondary effects before and during earthquake cascades.

  9. Earthquake Education in Prime Time

    NASA Astrophysics Data System (ADS)

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

    Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in the spring season 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and The Department of Homeland Security Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU) on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs. We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and hazard response to create a program that is both educational and provides a public service. Seismic Sleuths and Written in Stone are the harbingers of a new genre of earthquake programs that are the antithesis of the 1974 film Earthquake and the 2004 miniseries 10.5. Film producers and those in the earthquake education community are demonstrating that it is possible to tell an exciting story, inspire awareness, and encourage empowerment without sensationalism.

  10. Increased Seismicity in the Tsaoling Reservoir Region After the 1999 Chi-Chi, Taiwan, Earthquake

    NASA Astrophysics Data System (ADS)

    Chang, K.; Chi, W.

    2006-12-01

    The 1999 Mw7.6 Chi-Chi, Taiwan, earthquake triggered several large landslides. Among them, the Tsaoling landslide blocked the flow of the Ching-sui River. The stream water backed up behind the landslide deposit, forming a 4.6 million cubic-meter reservoir about 5 km long and 50 m deep. This reservoir was then filled by sediments about 4 years later. As a result, it provides a rare opportunity to monitor possible reservoir-induced seismicity. From the earthquake catalog derived from the dense seismic network of the Central Weather Bureau of Taiwan, we selected 1666 earthquakes that occurred between the years 1997 and 2004 within a 10 km by 10 km rectangular region centered at 23.584763N and 120.661160E. We compared this catalog with another published catalog that relocated only the earthquakes in 1999 using the double-difference method. The double-difference catalog has fewer events, possibly due to the stricter criteria used in the relocation inversion. However, we found the overall results from these two catalogs are similar, suggesting that the catalog used for this study is of high quality. On average, only 0.6 earthquakes occurred per month prior to the Chi-Chi mainshock. However, high seismicity, with an average rate of 27.8 events/month, occurred right after the Chi-Chi mainshock until the reservoir was filled by sediments in 2003, after which the seismicity almost ceased for 3.5 weeks. Following that, the rainy season started, and the seismicity increased again at a rate of 23.7 events/month. It is unusual for an Mw7.6 earthquake like the Chi-Chi earthquake to have aftershocks continuing for more than 4 years; thus we interpret that some of these earthquakes were induced by the reservoir. There are 44 earthquakes shallower than 5 km located sparsely in this small region, with one cluster of about 16 earthquakes located south of the reservoir. Field mapping between 1999 and 2004 found that the river channel has cut through a shale unit that overlies a south-dipping sandstone unit, providing a conduit for reservoir water to migrate to the south and possibly inducing this cluster. Due to the complicated geologic structures in this region, some other vertical fluid conduits might have been formed by the strong ground shaking of the Chi-Chi mainshock. In addition, we found that the shallow earthquakes usually occur during the rainy season in this region for the five years following the Chi-Chi earthquake. In sum, the Tsaoling region shows increased seismicity after the Chi-Chi earthquake, and these earthquakes correlate spatially with the landslide-induced reservoir and temporally with precipitation. We interpret that part of these earthquakes were triggered by fluid-related processes.
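
    Rate comparisons like the 0.6 versus 27.8 events/month quoted above reduce to counting catalog events in time windows. A minimal sketch (hypothetical event times in decimal years; not the Central Weather Bureau catalog):

    ```python
    import numpy as np

    def monthly_rate(event_times_yr, t_start_yr, t_end_yr):
        """Average number of events per month in [t_start_yr, t_end_yr)."""
        times = np.asarray(event_times_yr, dtype=float)
        n = int(np.sum((times >= t_start_yr) & (times < t_end_yr)))
        months = (t_end_yr - t_start_yr) * 12.0
        return n / months

    # Hypothetical catalog: sparse before the 1999.72 mainshock, dense after
    rng = np.random.default_rng(7)
    pre = rng.uniform(1997.0, 1999.72, 20)
    post = rng.uniform(1999.72, 2003.0, 1100)
    catalog = np.concatenate([pre, post])

    print(round(monthly_rate(catalog, 1997.0, 1999.72), 2))   # ~0.6 events/month
    print(round(monthly_rate(catalog, 1999.72, 2003.0), 1))   # ~28 events/month
    ```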

  11. Intracranial Pressure Monitoring in Severe Traumatic Brain Injury in Latin America: Process and Methods for a Multi-Center Randomized Controlled Trial

    PubMed Central

    Lujan, Silvia; Dikmen, Sureyya; Temkin, Nancy; Petroni, Gustavo; Pridgeon, Jim; Barber, Jason; Machamer, Joan; Cherner, Mariana; Chaddock, Kelley; Hendrix, Terence; Rondina, Carlos; Videtta, Walter; Celix, Juanita M.; Chesnut, Randall

    2012-01-01

    Abstract In patients with severe traumatic brain injury (TBI), the influence on important outcomes of the use of information from intracranial pressure (ICP) monitoring to direct treatment has never been tested in a randomized controlled trial (RCT). We are conducting an RCT in six trauma centers in Latin America to test this question. We hypothesize that patients randomized to ICP monitoring will have lower mortality and better outcomes at 6-months post-trauma than patients treated without ICP monitoring. We selected three centers in Bolivia to participate in the trial, based on (1) the absence of ICP monitoring, (2) adequate patient accession and data collection during the pilot phase, (3) preliminary institutional review board approval, and (4) the presence of equipoise about the value of ICP monitoring. We conducted extensive training of site personnel, and initiated the trial on September 1, 2008. Subsequently, we included three additional centers. A total of 176 patients were entered into the trial as of August 31, 2010. Current enrollment is 81% of that expected. The trial is expected to reach its enrollment goal of 324 patients by September of 2011. We are conducting a high-quality RCT to answer a question that is important globally. In addition, we are establishing the capacity to conduct strong research in Latin America, where TBI is a serious epidemic. Finally, we are demonstrating the feasibility and utility of international collaborations that share resources and unique patient populations to conduct strong research about global public health concerns. PMID:22435793

  12. Speeding earthquake disaster relief

    USGS Publications Warehouse

    Mortensen, Carl; Donlin, Carolyn; Page, Robert A.; Ward, Peter

    1995-01-01

    In coping with recent multibillion-dollar earthquake disasters, scientists and emergency managers have found new ways to speed and improve relief efforts. This progress is founded on the rapid availability of earthquake information from seismograph networks.

  13. Earthquakes: hydrogeochemical precursors

    USGS Publications Warehouse

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  14. Development of a telecare system based on ZigBee mesh network for monitoring blood pressure of patients with hemodialysis in health care centers.

    PubMed

    Du, Yi-Chun; Lee, You-Yun; Lu, Yun-Yuan; Lin, Chia-Hung; Wu, Ming-Jei; Chen, Chung-Lin; Chen, Tainsong

    2011-10-01

    In Taiwan, the number of patients needing dialysis has increased rapidly in recent years. Because every hemodialysis session carries risk, monitoring of physiological status, such as blood pressure measurements every 30 min to 1 h, is needed during the roughly 4 h hemodialysis process. Therefore, assisted blood pressure measurement is needed in dialysis care centers. Telecare systems (TCS) are regarded as an important technique in medical care. In this study, we utilized ZigBee wireless technology to establish a mesh network for monitoring blood pressure automatically and storing the data in a medical record system for display and further analysis. Moreover, when the blood pressure exceeds the normal range, the system immediately sends a warning signal to remind or inform the relatives and clinicians in the health care center through the personal handy-phone system (PHS). The proposed system provides an assisted device for monitoring patients' blood pressure during the hemodialysis process, saving medical manpower. PMID:20703683
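
    The alerting step described above is, at its core, a threshold check on each incoming reading. A minimal, hypothetical sketch of that logic (the limits and the notification path are placeholders, not the authors' system):

    ```python
    from dataclasses import dataclass

    @dataclass
    class BPLimits:
        systolic_low: int = 90
        systolic_high: int = 180
        diastolic_low: int = 50
        diastolic_high: int = 110

    def check_reading(systolic, diastolic, limits=BPLimits()):
        """Return a list of alert strings for an out-of-range blood pressure reading."""
        alerts = []
        if not (limits.systolic_low <= systolic <= limits.systolic_high):
            alerts.append(f"systolic out of range: {systolic} mmHg")
        if not (limits.diastolic_low <= diastolic <= limits.diastolic_high):
            alerts.append(f"diastolic out of range: {diastolic} mmHg")
        return alerts

    # Example: a hypothetical reading relayed over the mesh network
    for alert in check_reading(systolic=188, diastolic=95):
        print("ALERT:", alert)   # a real system would notify staff here (e.g. via PHS)
    ```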

  15. Real-Time Earthquake Analysis for Disaster Mitigation (READI) Network

    NASA Astrophysics Data System (ADS)

    Bock, Y.

    2014-12-01

    Real-time GNSS networks are making a significant impact on our ability to forecast, assess, and mitigate the effects of geological hazards. I describe the activities of the Real-time Earthquake Analysis for Disaster Mitigation (READI) working group. The group leverages 600+ real-time GPS stations in western North America operated by UNAVCO (PBO network), Central Washington University (PANGA), US Geological Survey & Scripps Institution of Oceanography (SCIGN project), UC Berkeley & US Geological Survey (BARD network), and the Pacific Geosciences Centre (WCDA project). Our goal is to demonstrate an earthquake and tsunami early warning system for western North America. Rapid response is particularly important for those coastal communities that are in the near-source region of large earthquakes and may have only minutes of warning time, and who today are not adequately covered by existing seismic and basin-wide ocean-buoy monitoring systems. The READI working group is performing comparisons of independent real time analyses of 1 Hz GPS data for station displacements and is participating in government-sponsored earthquake and tsunami exercises in the Western U.S. I describe a prototype seismogeodetic system using a cluster of southern California stations that includes GNSS tracking and collocation with MEMS accelerometers for real-time estimation of seismic velocity and displacement waveforms, which has advantages for improved earthquake early warning and tsunami forecasts compared to seismic-only or GPS-only methods. The READI working group's ultimate goal is to participate in an Indo-Pacific Tsunami early warning system that utilizes GNSS real-time displacements and ionospheric measurements along with seismic, near-shore buoys and ocean-bottom pressure sensors, where available, to rapidly estimate magnitude and finite fault slip models for large earthquakes, and then forecast tsunami source, energy scale, geographic extent, inundation and runup. This will require cooperation with other real-time efforts around the Pacific Rim in terms of sharing, analysis centers, and advisory bulletins to the responsible government agencies. The IAG's Global Geodetic Observing System (GGOS), in particular its natural hazards theme, provides a natural umbrella for achieving this objective.
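
    The seismogeodetic combination mentioned above pairs GPS displacements (accurate at low frequencies, noisy at high frequencies) with collocated accelerometer records (the reverse). The READI group uses its own real-time estimators, which are not described here; the complementary-filter sketch below only illustrates the idea, under the assumption that both streams have been resampled to a common rate:

    ```python
    import numpy as np
    from scipy import signal, integrate

    def seismogeodetic_displacement(gps_disp, accel, fs, fc=0.1):
        """Blend GPS displacement (low-pass) with twice-integrated acceleration (high-pass).

        gps_disp, accel : collocated records resampled to the same rate fs (Hz)
        fc              : crossover frequency (Hz) between the two sensors
        Zero-phase filtering is used for offline illustration; a real-time system
        would need causal filtering or a recursive estimator.
        """
        vel = integrate.cumulative_trapezoid(accel, dx=1.0 / fs, initial=0.0)
        disp_acc = integrate.cumulative_trapezoid(vel, dx=1.0 / fs, initial=0.0)
        sos_lp = signal.butter(4, fc, btype="lowpass", fs=fs, output="sos")
        sos_hp = signal.butter(4, fc, btype="highpass", fs=fs, output="sos")
        return (signal.sosfiltfilt(sos_lp, gps_disp) +
                signal.sosfiltfilt(sos_hp, disp_acc))

    # Tiny synthetic demo: 1-minute records on a common 10 Hz time base
    fs = 10.0
    t = np.arange(0, 60, 1 / fs)
    true_disp = 0.05 * np.sin(2 * np.pi * 0.03 * t) + 0.002 * np.sin(2 * np.pi * 1.0 * t)
    gps = true_disp + 0.005 * np.random.default_rng(2).standard_normal(t.size)  # noisy GPS
    accel = np.gradient(np.gradient(true_disp, 1 / fs), 1 / fs)                 # ideal accel
    combined = seismogeodetic_displacement(gps, accel, fs)
    ```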

  16. Hastings Center

    MedlinePlus

  17. Redefining Earthquakes and the Earthquake Machine

    ERIC Educational Resources Information Center

    Hubenthal, Michael; Braile, Larry; Taber, John

    2008-01-01

    The Earthquake Machine (EML), a mechanical model of stick-slip fault systems, can increase student engagement and facilitate opportunities to participate in the scientific process. This article introduces the EML model and an activity that challenges ninth-grade students' misconceptions about earthquakes. The activity emphasizes the role of models…

  19. Earthquake and Schools. [Videotape].

    ERIC Educational Resources Information Center

    Federal Emergency Management Agency, Washington, DC.

    Designing schools to make them more earthquake resistant and protect children from the catastrophic collapse of the school building is discussed in this videotape. It reveals that 44 of the 50 U.S. states are vulnerable to earthquake, but most schools are structurally unprepared to take on the stresses that earthquakes exert. The cost to the…

  20. Children's Ideas about Earthquakes

    ERIC Educational Resources Information Center

    Simsek, Canan Lacin

    2007-01-01

    Earthquakes, a natural disaster, are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced to that extent. In particular, good training regarding earthquakes received in primary schools is considered…

  1. Earthquake Prediction and Forecasting

    NASA Astrophysics Data System (ADS)

    Jackson, David D.

    Prospects for earthquake prediction and forecasting, and even their definitions, are actively debated. Here, "forecasting" means estimating the future earthquake rate as a function of location, time, and magnitude. Forecasting becomes "prediction" when we identify special conditions that make the immediate probability much higher than usual and high enough to justify exceptional action. Proposed precursors run from aeronomy to zoology, but no identified phenomenon consistently precedes earthquakes. The reported prediction of the 1975 Haicheng, China earthquake is often proclaimed as the most successful, but the success is questionable. An earthquake predicted to occur near Parkfield, California in 1988±5 years has not happened. Why is prediction so hard? Earthquakes start in a tiny volume deep within an opaque medium; we do not know their boundary conditions, initial conditions, or material properties well; and earthquake precursors, if any, hide amongst unrelated anomalies. Earthquakes cluster in space and time, and following a quake, earthquake probability spikes. Aftershocks illustrate this clustering, and later earthquakes may even surpass earlier ones in size. However, the main shock in a cluster usually comes first and causes the most damage. Specific models help reveal the physics and allow intelligent disaster response. Modeling stresses from past earthquakes may improve forecasts, but this approach has not yet been validated prospectively. Reliable prediction of individual quakes is not realistic in the foreseeable future, but probabilistic forecasting provides valuable information for reducing risk. Recent studies are also leading to exciting discoveries about earthquakes.
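
    The clustering statement above, that earthquake probability spikes after a quake and then decays, is commonly summarized by the modified Omori law, n(t) = K / (c + t)^p. A minimal sketch with illustrative parameter values (not tied to any specific sequence):

    ```python
    def omori_rate(t_days, K=100.0, c=0.05, p=1.1):
        """Modified Omori law: aftershock rate (events/day) at time t after a mainshock."""
        return K / (c + t_days) ** p

    def expected_aftershocks(t1, t2, K=100.0, c=0.05, p=1.1):
        """Expected aftershock count between t1 and t2 days (analytic integral, p != 1)."""
        return K * ((c + t2) ** (1 - p) - (c + t1) ** (1 - p)) / (1 - p)

    print(round(expected_aftershocks(0.0, 1.0), 1))    # first day after the mainshock
    print(round(expected_aftershocks(9.0, 10.0), 1))   # tenth day: far fewer events
    ```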

  2. Operational earthquake forecasting can enhance earthquake preparedness

    USGS Publications Warehouse

    Jordan, T.H.; Marzocchi, W.; Michael, A.J.; Gerstenberger, M.C.

    2014-01-01

    We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time‐dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard—ground‐motion exceedance probabilities as well as short‐term rupture probabilities—in concert with the long‐term forecasts of probabilistic seismic‐hazard analysis (PSHA).
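
    Time-dependent probabilities of the kind OEF disseminates are often expressed as the chance of at least one event in a forecast window given a rate; under a Poisson model this is P = 1 - exp(-lambda*dt). A minimal illustration (made-up rates, not an official forecast):

    ```python
    import math

    def prob_at_least_one(rate_per_day, window_days):
        """Poisson probability of >= 1 event in the window, given a daily rate."""
        return 1.0 - math.exp(-rate_per_day * window_days)

    background = 1.0e-4           # hypothetical daily rate of a damaging event
    elevated = 50 * background    # hypothetical rate during an aftershock sequence

    print(f"{prob_at_least_one(background, 7):.4%} in a quiet week")
    print(f"{prob_at_least_one(elevated, 7):.2%} in a week of elevated activity")
    ```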

  3. Earthquake Warning Performance in Vallejo for the South Napa Earthquake

    NASA Astrophysics Data System (ADS)

    Wurman, G.; Price, M.

    2014-12-01

    In 2002 and 2003, Seismic Warning Systems, Inc. installed first-generation QuakeGuardTM earthquake warning devices at all eight fire stations in Vallejo, CA. These devices are designed to detect the P-wave of an earthquake and initiate predetermined protective actions if the impending shaking is estimated at approximately Modified Mercalli Intensity V or greater. At the Vallejo fire stations the devices were set up to sound an audio alert over the public address system and to command the equipment bay doors to open. In August 2014, after more than 11 years of operating in the fire stations with no false alarms, the five units that were still in use triggered correctly on the Mw 6.0 South Napa earthquake, less than 16 km away. The audio alert sounded in all five stations, providing fire fighters with 1.5 to 2.5 seconds of warning before the arrival of the S-wave, and the equipment bay doors opened in three of the stations. In one station the doors were disconnected from the QuakeGuard device, and another station lost power before the doors opened completely. These problems highlight just a small portion of the complexity associated with realizing actionable earthquake warnings. The issues experienced in this earthquake have already been addressed in subsequent QuakeGuard product generations, with downstream connection monitoring and backup power for critical systems. The fact that the fire fighters in Vallejo were afforded even two seconds of warning at these epicentral distances results from the design of the QuakeGuard devices, which focuses on rapid false positive rejection and ground motion estimates. We discuss the performance of the ground motion estimation algorithms, with an emphasis on the accuracy and timeliness of the estimates at close epicentral distances.
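
    The one-to-two-second warnings reported here follow directly from P- and S-wave travel times: the available warning time is roughly the S-wave arrival minus the P-wave arrival minus the detection/processing delay. A rough back-of-the-envelope sketch with typical crustal velocities (not the QuakeGuard algorithm):

    ```python
    def warning_time_s(epicentral_km, depth_km=10.0, vp=6.0, vs=3.5, processing_s=0.5):
        """Approximate warning time after P-wave detection, before the S-wave arrives."""
        hypo_km = (epicentral_km ** 2 + depth_km ** 2) ** 0.5
        t_p = hypo_km / vp          # P-wave arrival (s)
        t_s = hypo_km / vs          # S-wave arrival (s)
        return max(0.0, t_s - t_p - processing_s)

    print(round(warning_time_s(16.0), 1))   # ~16 km epicentral distance, as in Vallejo
    ```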

  4. Precursory signals around epicenters and local active faults prior to inland or coastal earthquakes

    NASA Astrophysics Data System (ADS)

    Valizadeh Alvan, Habibeh

    Although earthquakes are still considered unpredictable phenomena, scientific efforts during the past decade have revealed pronounced changes in the quality and quantity of some materials and natural phenomena on and above the Earth's surface taking place before strong shaking. Pre-earthquake physical and chemical interactions in the ground may cause anomalies in temperature, surface latent heat flux (SLHF), relative humidity, upwelling index and chlorophyll-a (Chl-a) concentration at the ground or sea surface. Earthquakes are triggered when the energy accumulated in rocks is released, causing ruptures along faults. The main purpose of this study is to explore and demonstrate the possibility of changes in surface temperature or latent heat flux before, during and after earthquakes. We expect that variations in these factors are accompanied by an increase of Chl-a concentration at the sea surface and by upwelling events prior to coastal earthquake events. For monitoring changes in surface temperature we used NOAA-AVHRR data and data from microwave radiometers such as AMSR-E/Aqua. SLHF data and upwelling indices are provided by the National Centers for Environmental Prediction (NCEP) Reanalysis Project and the Pacific Fisheries Environmental Laboratory (PFEL), respectively; Chl-a concentration is available from the MODIS website. Our detailed analyses show a significant increase of SLHF and of upwelling of nutrient-rich water prior to the main events, which is associated with the rise in surface temperature and Chl-a concentration at that time. Meaningful increases in temperature, relative humidity and SLHF variations are revealed from weeks before the earthquakes in epicentral areas and along local active faults. In addition, considerable anomalies in Chl-a concentration are attributed to the rise in the upwelling index.
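
    Anomalies in quantities such as SLHF or surface temperature are typically flagged by comparing each day's value with a multi-year climatology for the same calendar day, e.g. as a z-score exceeding some threshold. A minimal sketch (synthetic daily series; not the NCEP/PFEL/MODIS processing):

    ```python
    import numpy as np

    def anomaly_zscores(test_year, baseline_years):
        """Z-score of each day in test_year against the same calendar day in baseline years."""
        base = np.asarray(baseline_years, dtype=float)      # shape (n_years, 365)
        clim_mean = base.mean(axis=0)
        clim_std = base.std(axis=0, ddof=1)
        return (np.asarray(test_year, dtype=float) - clim_mean) / clim_std

    rng = np.random.default_rng(5)
    baseline = rng.normal(100.0, 10.0, (9, 365))   # nine years of synthetic daily SLHF
    test = rng.normal(100.0, 10.0, 365)
    test[200] += 45.0                              # injected pre-earthquake-like anomaly
    z = anomaly_zscores(test, baseline)
    print(np.flatnonzero(z > 3.0))                 # day indices flagged as anomalous
    ```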

  5. Supercomputing meets seismology in earthquake exhibit

    ScienceCinema

    Blackwell, Matt; Rodger, Arthur; Kennedy, Tom

    2014-07-22

    When the California Academy of Sciences created the "Earthquake: Evidence of a Restless Planet" exhibit, they called on Lawrence Livermore to help combine seismic research with the latest data-driven visualization techniques. The outcome is a series of striking visualizations of earthquakes, tsunamis and tectonic plate evolution. Seismic-wave research is a core competency at Livermore. While most often associated with earthquakes, the research has many other applications of national interest, such as nuclear explosion monitoring, explosion forensics, energy exploration, and seismic acoustics. For the Academy effort, Livermore researchers simulated the San Andreas and Hayward fault events at high resolutions. Such calculations require significant computational resources. To simulate the 1906 earthquake, for instance, visualizing 125 seconds of ground motion required over 1 billion grid points, 10,000 time steps, and 7.5 hours of processor time on 2,048 cores of Livermore's Sierra machine.

  7. Crustal earthquake triggering by modern great earthquakes on subduction zone thrusts

    NASA Astrophysics Data System (ADS)

    Gomberg, Joan; Sherrod, Brian

    2014-02-01

    Among the many questions raised by the recent abundance of great (M > 8.0) subduction thrust earthquakes is their potential to trigger damaging earthquakes on crustal faults within the overriding plate and beneath many of the world's densely populated urban centers. We take advantage of the coincident abundance of great earthquakes globally and instrumental observations since 1960 to assess this triggering potential by analyzing centroids and focal mechanisms from the centroid moment tensor catalog for events starting in 1976 and published reports about the M9.5 1960 Chile and M9.2 1964 Alaska earthquake sequences. We find clear increases in the rates of crustal earthquakes in the overriding plate within days following all subduction thrust earthquakes of M > 8.6, within about ±10° of the triggering event centroid latitude and longitude. This result is consistent with the dynamic triggering of shallow seismicity rate increases observed at distances beyond ±10°, suggesting that dynamic triggering may be important within the near field too. Crustal earthquake rate increases may also follow smaller M > 7.5 subduction thrust events, but because activity typically occurs offshore in the immediate vicinity of the triggering rupture plane, it cannot be unambiguously attributed to sources within the overriding plate. These observations are easily explained in the context of existing earthquake scaling laws.

  8. Retrospective stress-forecasting of earthquakes

    NASA Astrophysics Data System (ADS)

    Gao, Yuan; Crampin, Stuart

    2015-04-01

    Observations of changes in azimuthally varying shear-wave splitting (SWS) above swarms of small earthquakes monitor stress-induced changes to the stress-aligned vertical microcracks pervading the upper crust, lower crust, and uppermost ~400 km of the mantle. (The microcracks are intergranular films of hydrolysed melt in the mantle.) Earthquakes release stress, and an appropriate amount of stress for the relevant magnitude must accumulate before each event. Iceland is on an extension of the Mid-Atlantic Ridge, where, uniquely, two transform zones run onshore. These onshore transform zones provide semi-continuous swarms of small earthquakes, making them the only places worldwide where SWS can be routinely monitored. Elsewhere SWS must be monitored above temporally active, occasional swarms of small earthquakes, or in infrequent SKS and other teleseismic reflections from the mantle. Observations of changes in SWS time-delays are attributed to stress-induced changes in crack aspect-ratios, allowing stress-accumulation and stress-relaxation to be identified. While monitoring SWS in SW Iceland in 1988, stress-accumulation before an impending earthquake was recognised and emails were exchanged between the University of Edinburgh (EU) and the Iceland Meteorological Office (IMO). On 10th November 1988, EU emailed IMO that a M5 earthquake could occur soon on a seismically-active fault plane where seismicity was still continuing following a M5.1 earthquake six months earlier. Three days later, IMO emailed EU that a M5 earthquake had just occurred on the specified fault-plane. We suggest this is a successful earthquake stress-forecast, where we refer to the procedure as stress-forecasting earthquakes, as opposed to predicting or forecasting, to emphasise the different formalism. Lack of funds has prevented us from monitoring SWS on Iceland seismograms; however, we have identified similar characteristic behaviour of SWS time-delays above swarms of small earthquakes, which has enabled us to retrospectively stress-forecast ~17 earthquakes ranging in magnitude from a M1.7 swarm event in N Iceland, to the 1999 M7.7 Chi-Chi Earthquake in Taiwan, and the 2004 Mw9.2 Sumatra-Andaman Earthquake (SAE). Before SAE, the changes in SWS were observed at seismic stations in Iceland at a distance of ~10,500 km (the width of the Eurasian Plate) from Indonesia, demonstrating the 'butterfly wings' sensitivity of the New Geophysics of a critically microcracked Earth. At that time, the sensitivity of the phenomena had not been recognised, and the SAE was not stress-forecast. These results have been published at various times in various formats in various journals. This presentation displays all the results in a normalised format that allows the similarities to be recognised, confirming that observations of SWS time-delays can stress-forecast the times, magnitudes, and in some circumstances fault-breaks, of impending earthquakes. Papers referring to these developments can be found in geos.ed.ac.uk/home/scrampin/opinion. Also see abstracts in EGU2015 Sessions: Crampin & Gao (SM1.1), Liu & Crampin (NH2.5), and Crampin & Gao (GD.1).

  9. Engaging Students in Earthquake Science

    NASA Astrophysics Data System (ADS)

    Cooper, I. E.; Benthien, M.

    2004-12-01

    The Southern California Earthquake Center Communication, Education, and Outreach program (SCEC CEO) has been collaborating with the University of Southern California (USC) Joint Education Project (JEP) and the Education Consortium of Central Los Angeles (ECCLA) to work directly with the teachers and schools in the local community around USC. The community surrounding USC is 57% Hispanic (US Census, 2000) and 21% African American (US Census, 2000). Through the partnership with ECCLA, SCEC has created a three-week enrichment intersession program, targeting disadvantaged students at the fourth/fifth grade level, dedicated entirely to earthquakes. SCEC builds partnerships with the intersession teachers, working together to actively engage the students in learning about earthquakes. SCEC provides a support system for the teachers, supplying them with the necessary content background as well as classroom manipulatives. SCEC goes into the classrooms with guest speakers and takes the students out of the classroom on two field trips. There are four intersession programs each year. SCEC is also working with USC's Joint Education Project program. The JEP program has been recognized as one of the "oldest and best organized" Service-Learning programs in the country (TIME Magazine and the Princeton Review, 2000). Through this partnership SCEC is providing USC students with the necessary tools to go out to the local schools and teach students of all grade levels about earthquakes. SCEC works with the USC students to design engaging lesson plans that effectively convey content regarding earthquakes. USC students can check out hands-on/interactive materials to use in the classrooms from the SCEC Resource Library. In both these endeavors SCEC has expanded its outreach to the local community. SCEC is reaching over 200 minority children each year through these partnerships, and this number will increase as the programs grow.

  10. Crowdsourced earthquake early warning

    PubMed Central

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing. PMID:26601167

  11. Crowdsourced earthquake early warning.

    PubMed

    Minson, Sarah E; Brooks, Benjamin A; Glennie, Craig L; Murray, Jessica R; Langbein, John O; Owen, Susan E; Heaton, Thomas H; Iannucci, Robert A; Hauser, Darren L

    2015-04-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California's Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing. PMID:26601167

  12. ElarmS Earthquake Early Warning System Enhancements and Performance Evaluation

    NASA Astrophysics Data System (ADS)

    Henson, I. H.; Neuhauser, D. S.; Allen, R. M.

    2013-12-01

    ElarmS is an earthquake early warning system that contributes alerts to CISN ShakeAlert, a prototype end-to-end earthquake early warning system being developed and tested by the California Integrated Seismic Network (CISN). ElarmS is one of several systems based on independent methodologies that contribute to CISN ShakeAlert. The UC Berkeley ElarmS system consists of multiple continuous-waveform processors and trigger-association processors running at three geographical locations and communicating via the Apache ActiveMQ Messaging system. Recent enhancements to the ElarmS system include reductions in trigger report times, reductions in trigger association and event alert times, and the development and testing of redundant processing and communication architectures. To enable redundant processing, ElarmS trigger associators handle duplicate trigger information arriving from duplicate waveform processors via different transmission paths. We have developed performance monitoring tools that report system component latencies and earthquake hypocenter parameter accuracy. Statistics for hypocenter and origin-time accuracy and alert-time latencies can be computed for different time periods, magnitude ranges and geographic regions. Individual earthquake processing histories can be examined, including details of the transmission and processing latencies associated with individual P-wave triggers, from arrival detection through several stages of processing to the association with an individual earthquake alert. Detailed event information includes latencies associated with the transmission of individual waveform packets from station to processing centers, waveform processing queues, trigger message queues, trigger message transmissions, trigger association and hypocenter location CPU times. Changes to the ElarmS algorithm and system architecture are frequently tested by running multiple versions of ElarmS simultaneously. A web browser interface to the performance monitoring tools includes tabular, mapping, and statistical analysis graphical components (generated by the R-Statistics System) that make it easy to compare different development versions of ElarmS.
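
    The performance summaries described above amount to grouping per-event latencies and location errors by magnitude (or time period or region). A minimal sketch of that bookkeeping is given below; the event table and its column names are hypothetical, not ElarmS's actual schema.

    # Sketch of a performance summary: alert latency and location-error
    # statistics broken out by magnitude range. Columns are hypothetical.
    import pandas as pd

    def summarize_alerts(events: pd.DataFrame) -> pd.DataFrame:
        ev = events.copy()
        ev["latency_s"] = (ev["alert_time"] - ev["origin_time"]).dt.total_seconds()
        ev["mag_bin"] = pd.cut(ev["mag"], [0, 3, 4, 5, 10])
        return ev.groupby("mag_bin", observed=True)[["latency_s", "loc_err_km"]].agg(
            ["count", "median", "mean"]
        )

    # Example with two toy events.
    events = pd.DataFrame({
        "origin_time": pd.to_datetime(["2013-01-01 00:00:00", "2013-02-01 00:00:10"]),
        "alert_time": pd.to_datetime(["2013-01-01 00:00:08", "2013-02-01 00:00:16"]),
        "mag": [3.4, 4.6],
        "loc_err_km": [2.1, 4.8],
    })
    print(summarize_alerts(events))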

  13. Combining Real-time Seismic and Geodetic Data to Improve Rapid Earthquake Information

    NASA Astrophysics Data System (ADS)

    Murray, M. H.; Neuhauser, D. S.; Gee, L. S.; Dreger, D. S.; Basset, A.; Romanowicz, B.

    2002-12-01

    The Berkeley Seismological Laboratory operates seismic and geodetic stations in the San Francisco Bay area and northern California for earthquake and deformation monitoring. The seismic systems, part of the Berkeley Digital Seismic Network (BDSN), include strong motion and broadband sensors, and 24-bit dataloggers. The data from 20 GPS stations, part of the Bay Area Regional Deformation (BARD) network of more than 70 stations in northern California, are acquired in real-time. We have developed methods to acquire GPS data at 12 stations that are collocated with the seismic systems using the seismic dataloggers, which have large on-site data buffer and storage capabilities, merge it with the seismic data stream in MiniSeed format, and continuously stream both data types using reliable frame relay and/or radio modem telemetry. Currently, the seismic data are incorporated into the Rapid Earthquake Data Integration (REDI) project to provide notification of earthquake magnitude, location, moment tensor, and strong motion information for hazard mitigation and emergency response activities. The geodetic measurements can provide complementary constraints on earthquake faulting, including the location and extent of the rupture plane, unambiguous resolution of the nodal plane, and distribution of slip on the fault plane, which can be used, for example, to refine strong motion shake maps. We are developing methods to rapidly process the geodetic data to monitor transient deformation, such as coseismic station displacements, and for combining this information with the seismic observations to improve finite-fault characterization of large earthquakes. The GPS data are currently processed at hourly intervals with 2-cm precision in horizontal position, and we are beginning a pilot project in the Bay Area in collaboration with the California Spatial Reference Center to do epoch-by-epoch processing with greater precision.

  14. Associating an ionospheric parameter with major earthquake occurrence throughout the world

    NASA Astrophysics Data System (ADS)

    Ghosh, D.; Midya, S. K.

    2014-02-01

    Over time, ionospheric variation analysis has been gaining ground over lithospheric monitoring as a source of precursors for earthquake forecasting. The current paper highlights the association of major (Ms ≥ 6.0) and medium (4.0 ≤ Ms < 6.0) earthquake occurrences throughout the world with different ranges of the Ionospheric Earthquake Parameter (IEP), where `Ms' is earthquake magnitude on the Richter scale. From statistical and graphical analyses, it is concluded that the probability of earthquake occurrence is maximum when the defined parameter lies within the range of 0-75 (lower range). In the higher ranges, earthquake occurrence probability gradually decreases. A probable explanation is also suggested.
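
    The range-wise association described above can be sketched as a simple binning exercise: count how often earthquakes fall into each IEP range and convert the counts into occurrence fractions. The IEP values and bin edges below are synthetic placeholders, not the authors' data.

    # Count earthquakes per IEP range and report the fraction in each range.
    import numpy as np

    def occurrence_fraction(iep_values, bin_edges):
        counts, _ = np.histogram(iep_values, bins=bin_edges)
        return counts / counts.sum()

    bin_edges = [0, 75, 150, 225, 300]        # "lower" range 0-75, as in the paper
    iep_at_quakes = np.array([12, 40, 60, 70, 90, 130, 20, 55, 210, 33])  # synthetic
    fractions = occurrence_fraction(iep_at_quakes, bin_edges).round(2)
    labels = [f"{a}-{b}" for a, b in zip(bin_edges[:-1], bin_edges[1:])]
    print(dict(zip(labels, fractions)))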

  15. Earthquake forecasting and warning

    SciTech Connect

    Rikitake, T.

    1983-01-01

    This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States are updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings of precursory phenomena are included. A summary of circumstances surrounding the 1975 Haicheng earthquake, the 1978 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all magnitudes = 7.0) in China and the 1978 Izu-Oshima earthquake in Japan is presented. This book fails to comprehensively summarize recent advances in earthquake prediction research.

  16. Continental dynamics and continental earthquakes

    NASA Astrophysics Data System (ADS)

    Zhang, Dong-Ning; Zhang, Guo-Min; Zhang, Pei-Zhen

    2003-09-01

    Two key research projects in the geoscience field in China since the IUGG meeting in Birmingham in 1999, "East Asian Continental Geodynamics" and "Mechanism and Prediction of Strong Continental Earthquakes", are introduced in this paper. Some details of the two projects, such as their sub-projects and some initial published research results, are also given. Because of the large magnitude of the November 14, 2001 Kunlun Mountain Pass MS = 8.1 earthquake, the third part of this paper reviews initial research results on aftershock monitoring and the multi-disciplinary field survey, and the impact of this earthquake on the construction site of the Qinghai-Xizang (Tibet) railway and other infrastructure.

  17. Monitoring

    DOEpatents

    Orr, Christopher Henry; Luff, Craig Janson; Dockray, Thomas; Macarthur, Duncan Whittemore

    2004-11-23

    The invention provides apparatus and methods which facilitate movement of an instrument relative to an item or location being monitored and/or the item or location relative to the instrument, whilst successfully excluding extraneous ions from the detection location. Thus, ions generated by emissions from the item or location can successfully be monitored during movement. The technique employs sealing to exclude such ions, for instance, through an electro-field which attracts and discharges the ions prior to their entering the detecting location and/or using a magnetic field configured to repel the ions away from the detecting location.

  18. Research on earthquake prediction from infrared cloud images

    NASA Astrophysics Data System (ADS)

    Fan, Jing; Chen, Zhong; Yan, Liang; Gong, Jing; Wang, Dong

    2015-12-01

    In recent years, large earthquakes have occurred frequently all over the world. In the face of these inevitable natural disasters, earthquake prediction is particularly important to avoid further loss of life and property. Many achievements in the field of predicting earthquakes from remote sensing images have been obtained in the last few decades. However, the traditional prediction methods have the limitation that they cannot forecast the epicenter location accurately and automatically. To solve this problem, a new earthquake prediction method based on extracting the texture and emergence frequency of earthquake clouds is proposed in this paper. First, the infrared cloud images are enhanced. Second, the texture feature vector of each pixel is extracted. Then, the pixels are classified and converted into several small suspected areas. Finally, the suspected areas are tracked and the possible location is estimated. An inversion experiment on the Ludian earthquake shows that this approach can forecast the seismic center feasibly and accurately.
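
    The pipeline described above (enhance image, per-pixel texture feature, classify, group into suspected regions) can be sketched in Python. Local standard deviation stands in for the paper's texture vector and a simple threshold stands in for the classifier; both are assumptions made only for illustration.

    # Minimal texture-and-clustering sketch on a synthetic infrared image.
    import numpy as np
    from scipy import ndimage

    def suspected_regions(ir_image, win=5, thresh=2.0):
        img = (ir_image - ir_image.mean()) / ir_image.std()        # normalize ("strengthen")
        mean = ndimage.uniform_filter(img, size=win)
        sq_mean = ndimage.uniform_filter(img**2, size=win)
        local_std = np.sqrt(np.clip(sq_mean - mean**2, 0, None))   # per-pixel texture feature
        mask = local_std > thresh                                  # crude "classification"
        labels, n = ndimage.label(mask)                            # connected suspected areas
        centers = ndimage.center_of_mass(mask, labels, range(1, n + 1))
        return labels, centers                                     # candidate location pixels

    rng = np.random.default_rng(1)
    img = rng.normal(280, 1.0, (128, 128))              # synthetic brightness temperatures (K)
    img[40:60, 70:90] += rng.normal(0, 6, (20, 20))     # patch of anomalous cloud texture
    _, centers = suspected_regions(img)
    print("suspected-area centroids (row, col):",
          [tuple(np.round(c, 1)) for c in centers])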

  19. Seismic Monitoring in Haiti

    Following the devastating 2010 Haiti earthquake, the USGS has been helping with earthquake awareness and monitoring in the country, with continued support from the U.S. Agency for International Development (USAID). This assistance has helped the Bureau des Mines et de l'Energie (BME) in Port-au-Prin...

  20. Seismotectonic constraints at the western edge of the Pyrenees: aftershock series monitoring of the 2002 February 21, 4.1 Lg earthquake

    NASA Astrophysics Data System (ADS)

    Ruiz, M.; Díaz, J.; Gallart, J.; Pulgar, J. A.; González-Cortina, J. M.; López, C.

    2006-07-01

    Seismic data recorded from a temporary network deployed at the western edge of the Pyrenees are used to study the aftershock series following a magnitude 4.1 earthquake that took place on 2002 February 21, to the NW of Pamplona city. Aftershock determinations showed events distributed between 1 and 4 km depth in a small, E-W oriented active area of about 4 km2 delineating the southern sector of the Aralar thrust unit. This seismogenic feature is supported by focal solutions showing a consistent E-W nodal plane with normal faulting following the main strike-slip rupture. The Aralar structure with its shallow activity may be interpreted as a conjugate system of the nearby NE-SW deep-seated Pamplona active fault. Cross-correlation techniques and relative location of event clusters further constrained the epicentral domain to 2 km long and 1 km wide. Statistical relations and parameters established indicate a rather low b-value of 0.8 for the Gutenberg-Richter distribution, denoting a region of concentrated seismicity, and a p-value of 0.9 for the Omori law, corresponding to a slow decay of the aftershock activity in this area. More than 100 aftershocks were accurately located in this high-resolution experiment, whereas only 13 of them could be catalogued by the permanent agencies in the same period, due to their much sparser station distribution. The results enhance the importance of using dense temporary networks to infer relevant seismotectonic and hazard constraints.
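
    The b-value and Omori p quoted above are standard aftershock statistics. A common way to estimate b is Aki's (1965) maximum-likelihood formula, sketched below on a synthetic catalog; the completeness magnitude and the Omori parameters in the example are assumed values, not results from this study.

    # Maximum-likelihood b-value and modified Omori rate on synthetic data.
    import numpy as np

    def b_value_mle(mags, mc):
        """Aki (1965) maximum-likelihood b-value for continuous magnitudes >= mc."""
        m = np.asarray(mags)
        m = m[m >= mc]
        return np.log10(np.e) / (m.mean() - mc)

    def omori_rate(t, k, c, p):
        """Modified Omori law: aftershock rate n(t) = k / (t + c)**p."""
        return k / (t + c) ** p

    rng = np.random.default_rng(2)
    mags = 1.0 + rng.exponential(scale=1.0 / (0.8 * np.log(10)), size=500)  # b ~ 0.8
    print("estimated b-value:", round(b_value_mle(mags, mc=1.0), 2))
    print("rate 10 days after the mainshock:", omori_rate(10.0, k=50.0, c=0.1, p=0.9))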

  1. Earthquake Hazard in the Heart of the Homeland

    USGS Publications Warehouse

    Gomberg, Joan; Schweig, Eugene

    2007-01-01

    Evidence that earthquakes threaten the Mississippi, Ohio, and Wabash River valleys of the Central United States abounds. In fact, several of the largest historical earthquakes to strike the continental United States occurred in the winter of 1811-1812 along the New Madrid seismic zone, which stretches from just west of Memphis, Tenn., into southern Illinois. Several times in the past century, moderate earthquakes have been widely felt in the Wabash Valley seismic zone along the southern border of Illinois and Indiana. Throughout the region, between 150 and 200 earthquakes are recorded annually by a network of monitoring instruments, although most are too small to be felt by people. Geologic evidence for prehistoric earthquakes throughout the region has been mounting since the late 1970s. But how significant is the threat? How likely are large earthquakes and, more importantly, what is the chance that the shaking they cause will be damaging?

  2. Retrospective Evaluation of Earthquake Forecasts during the 2010-12 Canterbury, New Zealand, Earthquake Sequence

    NASA Astrophysics Data System (ADS)

    Werner, M. J.; Marzocchi, W.; Taroni, M.; Zechar, J. D.; Gerstenberger, M.; Liukis, M.; Rhoades, D. A.; Cattania, C.; Christophersen, A.; Hainzl, S.; Helmstetter, A.; Jimenez, A.; Steacy, S.; Jordan, T. H.

    2014-12-01

    The M7.1 Darfield, New Zealand (NZ), earthquake triggered a complex earthquake cascade that provides a wealth of new scientific data to study earthquake triggering and the predictive skill of statistical and physics-based forecasting models. To this end, the Collaboratory for the Study of Earthquake Predictability (CSEP) is conducting a retrospective evaluation of over a dozen short-term forecasting models that were developed by groups in New Zealand, Europe and the US. The statistical model group includes variants of the Epidemic-Type Aftershock Sequence (ETAS) model, non-parametric kernel smoothing models, and the Short-Term Earthquake Probabilities (STEP) model. The physics-based model group includes variants of the Coulomb stress triggering hypothesis, which are embedded either in Dieterich's (1994) rate-state formulation or in statistical Omori-Utsu clustering formulations (hybrid models). The goals of the CSEP evaluation are to improve our understanding of the physical mechanisms governing earthquake triggering, to improve short-term earthquake forecasting models and time-dependent hazard assessment for the Canterbury area, and to understand the influence of poor-quality, real-time data on the skill of operational (real-time) forecasts. To assess the latter, we use the earthquake catalog data that the NZ CSEP Testing Center archived in near real-time during the earthquake sequence and compare the predictive skill of models using the archived data as input with the skill attained using the best available data today. We present results of the retrospective model comparison and discuss implications for operational earthquake forecasting.
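
    A minimal sketch of the kind of comparison CSEP performs is to score gridded rate forecasts against observed bin counts with a Poisson log-likelihood. The actual CSEP evaluations (N-, L-, T-tests and related metrics) are considerably more elaborate; the forecasts and counts below are synthetic placeholders.

    # Score two toy gridded forecasts against observed counts per bin.
    import numpy as np
    from scipy.stats import poisson

    def poisson_log_likelihood(forecast_rates, observed_counts):
        """Joint log-likelihood of observed bin counts under a Poisson forecast."""
        return poisson.logpmf(observed_counts, forecast_rates).sum()

    observed = np.array([0, 2, 1, 0, 5, 1])              # events per bin (synthetic)
    model_a = np.array([0.2, 1.5, 0.8, 0.1, 4.0, 1.2])   # expected events per bin
    model_b = np.array([0.5, 0.5, 0.5, 0.5, 0.5, 0.5])   # uninformative reference model

    for name, rates in [("model A", model_a), ("model B", model_b)]:
        print(name, "log-likelihood:", round(poisson_log_likelihood(rates, observed), 2))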

  3. A continuation of base-line studies for environmentally monitoring Space Transportation Systems at John F. Kennedy Space Center. Volume 2: Chemical studies of rainfall and soil analysis

    NASA Technical Reports Server (NTRS)

    Madsen, B. C.

    1980-01-01

    The results of a study which was designed to monitor, characterize, and evaluate the chemical composition of precipitation (rain) which fell at the Kennedy Space Center, Florida (KSC) during the period July 1977 to March 1979 are reported. Results which were obtained from a soil sampling and associated chemical analysis are discussed. The purpose of these studies was to determine the environmental perturbations which might be caused by NASA space activities.

  4. Post-earthquake building safety assessments for the Canterbury Earthquakes

    USGS Publications Warehouse

    Marshall, J.; Barnes, J.; Gould, N.; Jaiswal, K.; Lizundia, B.; Swanson, David A.; Turner, F.

    2012-01-01

    This paper explores the post-earthquake building assessment program that was utilized in Christchurch, New Zealand following the Canterbury Sequence of earthquakes beginning with the Magnitude (Mw.) 7.1 Darfield event in September 2010. The aftershocks or triggered events, two of which exceeded Mw 6.0, continued with events in February and June 2011 causing the greatest amount of damage. More than 70,000 building safety assessments were completed following the February event. The timeline and assessment procedures will be discussed including the use of rapid response teams, selection of indicator buildings to monitor damage following aftershocks, risk assessments for demolition of red-tagged buildings, the use of task forces to address management of the heavily damaged downtown area and the process of demolition. Through the post-event safety assessment program that occurred throughout the Canterbury Sequence of earthquakes, many important lessons can be learned that will benefit future response to natural hazards that have potential to damage structures.

  5. Earthquake detection through computationally efficient similarity search.

    PubMed

    Yoon, Clara E; O'Reilly, Ossian; Bergen, Karianne J; Beroza, Gregory C

    2015-12-01

    Seismology is experiencing rapid growth in the quantity of data, which has outpaced the development of processing algorithms. Earthquake detection-identification of seismic events in continuous data-is a fundamental operation for observational seismology. We developed an efficient method to detect earthquakes using waveform similarity that overcomes the disadvantages of existing detection methods. Our method, called Fingerprint And Similarity Thresholding (FAST), can analyze a week of continuous seismic waveform data in less than 2 hours, or 140 times faster than autocorrelation. FAST adapts a data mining algorithm, originally designed to identify similar audio clips within large databases; it first creates compact "fingerprints" of waveforms by extracting key discriminative features, then groups similar fingerprints together within a database to facilitate fast, scalable search for similar fingerprint pairs, and finally generates a list of earthquake detections. FAST detected most (21 of 24) cataloged earthquakes and 68 uncataloged earthquakes in 1 week of continuous data from a station located near the Calaveras Fault in central California, achieving detection performance comparable to that of autocorrelation, with some additional false detections. FAST is expected to realize its full potential when applied to extremely long duration data sets over a distributed network of seismic stations. The widespread application of FAST has the potential to aid in the discovery of unexpected seismic signals, improve seismic monitoring, and promote a greater understanding of a variety of earthquake processes. PMID:26665176
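
    A toy version of the FAST workflow described above (compact fingerprints plus hash-based grouping instead of all-pairs correlation) can be sketched as follows. The real method uses wavelet-based fingerprints with min-hash/locality-sensitive hashing so that near-identical fingerprints also match; this exact-match simplification on a high signal-to-noise synthetic trace is only meant to show the data flow.

    # Group waveform windows whose binary "fingerprints" collide in a hash table.
    import numpy as np
    from collections import defaultdict

    def fingerprint(window, n_bands=16):
        spec = np.abs(np.fft.rfft(window))[:n_bands]
        return (spec > spec.mean()).tobytes()              # compact binary signature

    def similar_window_groups(trace, win=200, step=100):
        buckets = defaultdict(list)
        for start in range(0, len(trace) - win + 1, step):
            buckets[fingerprint(trace[start:start + win])].append(start)
        return [g for g in buckets.values() if len(g) > 1]

    rng = np.random.default_rng(3)
    trace = rng.normal(0.0, 0.05, 5000)                        # background noise
    t = np.linspace(0.0, 1.0, 200)
    event = 5.0 * np.exp(-5.0 * t) * np.sin(16 * np.pi * t)    # decaying wavelet
    trace[1000:1200] += event                                  # two injected repeats
    trace[3200:3400] += event
    print("window-start groups with matching fingerprints:",
          similar_window_groups(trace))                        # 1000 and 3200 should group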

  6. Earthquake detection through computationally efficient similarity search

    PubMed Central

    Yoon, Clara E.; O’Reilly, Ossian; Bergen, Karianne J.; Beroza, Gregory C.

    2015-01-01

    Seismology is experiencing rapid growth in the quantity of data, which has outpaced the development of processing algorithms. Earthquake detection—identification of seismic events in continuous data—is a fundamental operation for observational seismology. We developed an efficient method to detect earthquakes using waveform similarity that overcomes the disadvantages of existing detection methods. Our method, called Fingerprint And Similarity Thresholding (FAST), can analyze a week of continuous seismic waveform data in less than 2 hours, or 140 times faster than autocorrelation. FAST adapts a data mining algorithm, originally designed to identify similar audio clips within large databases; it first creates compact “fingerprints” of waveforms by extracting key discriminative features, then groups similar fingerprints together within a database to facilitate fast, scalable search for similar fingerprint pairs, and finally generates a list of earthquake detections. FAST detected most (21 of 24) cataloged earthquakes and 68 uncataloged earthquakes in 1 week of continuous data from a station located near the Calaveras Fault in central California, achieving detection performance comparable to that of autocorrelation, with some additional false detections. FAST is expected to realize its full potential when applied to extremely long duration data sets over a distributed network of seismic stations. The widespread application of FAST has the potential to aid in the discovery of unexpected seismic signals, improve seismic monitoring, and promote a greater understanding of a variety of earthquake processes. PMID:26665176

  7. Gravity drives Great Earthquakes

    NASA Astrophysics Data System (ADS)

    Lister, Gordon; Forster, Marnie

    2010-05-01

    The most violent of Great Earthquakes are driven by ruptures on giant megathrusts adjacent to actively forming mountain belts. Current theory suggests that the seismic rupture harvests (and thus releases) elastic energy that has been previously stored in locked segments of the megathrust. The general belief, however, is that this energy was accumulated as the result of relative motion of the adjacent stiff elastic tectonic plates. This mechanism fails to explain many first order aspects of large earthquakes, however. The energy source for strain accumulation must also include gravitational collapse of orogenic crust and/or the foundering (or roll-back) of an adjacent subducting lithospheric slab. Therefore we have conducted an analysis of the geometry of aftershocks, and report that this allows distinction of two types of failure on giant megathrusts. Mode I failure involves horizontal shortening, and is consistent with the classic view that megathrusts fail in compression, with motion analogous to that expected if accretion takes place against a rigid (or elastic) backstop. Mode II failure involves horizontal extension, and requires the over-riding plate to stretch during an earthquake. This process is likely to continue during the subsequent period of afterslip, and therefore will again be evident in aftershock patterns. Mode I behaviour may well have applied to the southern segment of the Sumatran megathrust, from whence emanated the rupture that drove the 2004 Great Earthquake. Mode II behaviour appears to apply to the northern segment of the same rupture, however. The geometry of aftershocks beneath the Andaman Sea suggests that the crust above the initial rupture failed in an extensional mode. The edge of the Indian plate is foundering, with slab-hinge roll-back in a direction orthogonal to its motion vector. The only possible cause for this extension therefore is westward roll-back of the subducting Indian plate, and the consequent gravity-driven movement of the over-riding crust and mantle. This is possible because the crust and mantle above major subduction zones are mechanically weakened by the flux of heat and water associated with subduction zone processes. In consequence the lithosphere of the over-riding orogens can act more like a fluid than a rigid plate. Such fluid-like behaviour has been noted for the Himalaya and for the crust of the uplifted adjacent Tibetan Plateau, which appear to be collapsing. Similar conclusions as to the fluid-like behaviour of an orogen can also be reached for the crust and mantle of Myanmar and Indonesia, since here again, there is evidence for arc-normal motion adjacent to rolling-back subduction zones. Prior to the Great Sumatran Earthquake of 2004 we had postulated such movements on geological time-scales, describing them as ‘surges’ driven by the gravitational potential energy of the adjacent orogen. But we considered time-scales that were very different to those that apply in the lead-up, or during and subsequent to a catastrophic seismic event. The Great Sumatran Earthquake taught us quite differently. Data from satellites support the hypothesis that extension took place in a discrete increment, which we interpret to be the result of a gravitationally driven surge of the Indonesian crust westward over the weakened rupture during and after the earthquake.
Mode II megathrusts are tsunamigenic for one very simple reason: the crust has been attenuated as the result of ongoing extension, so they can be overlain by large tracts of water, and they have a long rupture run time, allowing a succession of stress accumulations to be harvested. The after-slip beneath the Andaman Sea was also significant (in terms of moment) although non-seismogenic in its character. Operation of a Mode II megathrust prior to catastrophic failure may involve relatively quiescent motion with a mixture of normal faults and reverse faults, much like south of Java today. Ductile yield may produce steadily increasing (and accelerating) subsidence (on decadal time scales) as roll-back deepens the trench and adjacent fore-arc basins. This suggests a relatively simple (and cost effective) strategy that would allow precursor motions on Mode II megathrusts to be precisely monitored.

  8. Unexpectedly frequent occurrence of very small repeating earthquakes (-5.1 ≤ Mw ≤ -3.6) in a South African gold mine: Implications for monitoring intraplate faults

    NASA Astrophysics Data System (ADS)

    Naoi, Makoto; Nakatani, Masao; Igarashi, Toshihiro; Otsuki, Kenshiro; Yabe, Yasuo; Kgarume, Thabang; Murakami, Osamu; Masakale, Thabang; Ribeiro, Luiz; Ward, Anthony; Moriya, Hirokazu; Kawakata, Hironori; Nakao, Shigeru; Durrheim, Raymond; Ogasawara, Hiroshi

    2015-12-01

    We observed very small repeating earthquakes with -5.1 ≤ Mw ≤ -3.6 on a geological fault at 1 km depth in a gold mine in South Africa. Of the 851 acoustic emissions that occurred on the fault during the 2 month analysis period, 45% were identified as repeaters on the basis of waveform similarity and relative locations. They occurred steadily at the same location with similar magnitudes, analogous to repeaters at plate boundaries, suggesting that they are repeat ruptures of the same asperity loaded by the surrounding aseismic slip (background creep). Application of the Nadeau and Johnson (1998) empirical formula (NJ formula), which relates the amount of background creep and repeater activity and is well established for plate boundary faults, to the present case yielded an impossibly large estimate of the background creep. This means that the presently studied repeaters were produced more efficiently, for a given amount of background creep, than expected from the NJ formula. When combined with an independently estimated average stress drop of 16 MPa, which is not particularly high, it suggests that the small asperities of the presently studied repeaters had a high seismic coupling (almost unity), in contrast to one physical interpretation of the plate boundary repeaters. The productivity of such repeaters, per unit background creep, is expected to increase strongly as smaller repeaters are considered (∝ Mo^-1/3 as opposed to Mo^-1/6 of the NJ formula), which may be usable to estimate very slow creep that may occur on intraplate faults.
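
    The contrast between those two exponents follows from how slip per event scales with seismic moment. A back-of-envelope version, assuming a circular asperity of radius r with rigidity \mu, constant stress drop \Delta\sigma, and full seismic coupling (assumptions made here only for illustration), is

        M_0 = \mu A d, \qquad d \propto \frac{\Delta\sigma}{\mu}\, r, \qquad M_0 \propto \Delta\sigma\, r^3 \;\Rightarrow\; d \propto M_0^{1/3},

    so the repeat rate needed to keep pace with a background creep rate V is

        N \propto \frac{V}{d} \propto M_0^{-1/3},

    whereas the empirical NJ formula corresponds to slip per event d \propto M_0^{1/6} and hence a repeater productivity \propto M_0^{-1/6}.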

  9. Oscillating brittle and viscous behavior through the earthquake cycle in the Red River Shear Zone: Monitoring flips between reaction and textural softening and hardening

    NASA Astrophysics Data System (ADS)

    Wintsch, Robert P.; Yeh, Meng-Wan

    2013-03-01

    Microstructures associated with cataclasites and mylonites in the Red River shear zone in the Diancang Shan block, Yunnan Province, China show evidence for both reaction hardening and softening at lower greenschist facies metamorphic conditions. The earliest fault-rocks derived from Triassic porphyritic orthogneiss protoliths are cataclasites. Brittle fractures and crushed grains are cemented by newly precipitated quartz. These cataclasites are subsequently overprinted by mylonitic fabrics. Truncations and embayments of relic feldspars and biotites show that these protolith minerals have been dissolved and incompletely replaced by muscovite, chlorite, and quartz. Both K-feldspar and plagioclase porphyroclasts are truncated by muscovite alone, suggesting locally metasomatic reactions of the form: 3K-feldspar + 2H+ = muscovite + 6SiO2(aq) + 2K+. Such reactions produce muscovite folia and fish, and quartz bands and ribbons. Muscovite and quartz are much weaker than the reactant feldspars and these reactions result in reaction softening. Moreover, the muscovite tends to align in contiguous bands that constitute textural softening. These mineral and textural modifications occurred at constant temperature and drove the transition from brittle to viscous deformation and the shift in deformation mechanism from cataclasis to dissolution-precipitation and reaction creep. These mylonitic rocks so produced are cut by K-feldspar veins that interrupt the mylonitic fabric. The veins add K-feldspar to the assemblage and these structures constitute both reaction and textural hardening. Finally these veins are boudinaged by continued viscous deformation in the mylonitic matrix, thus defining a late ductile strain event. Together these overprinting textures and microstructures demonstrate several oscillations between brittle and viscous deformation, all at lower greenschist facies conditions where only frictional behavior is predicted by experiments. The overlap of the depths of greenschist facies conditions with the base of the crustal seismic zone suggests that the implied oscillations in strain rate may have been related to the earthquake cycle.

  10. Uplift and Subsidence Associated with the Great Aceh-Andaman Earthquake of 2004

    NASA Technical Reports Server (NTRS)

    2006-01-01

    The magnitude 9.2 Indian Ocean earthquake of December 26, 2004, produced broad regions of uplift and subsidence. In order to define the lateral extent and the downdip limit of rupture, scientists from Caltech, Pasadena, Calif.; NASA's Jet Propulsion Laboratory, Pasadena, Calif.; Scripps Institution of Oceanography, La Jolla, Calif.; the U.S. Geological Survey, Pasadena, Calif.; and the Research Center for Geotechnology, Indonesian Institute of Sciences, Bandung, Indonesia; first needed to define the pivot line separating those regions. Interpretation of satellite imagery and a tidal model were among the key tools used to do this.

    These pre-Sumatra earthquake (a) and post-Sumatra earthquake (b) images of North Sentinel Island in the Indian Ocean, acquired from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) instrument on NASA's Terra spacecraft, show emergence of the coral reef surrounding the island following the earthquake. The tide was 30 plus or minus 14 centimeters lower in the pre-earthquake image (acquired November 21, 2000) than in the post-earthquake image (acquired February 20, 2005), requiring a minimum of 30 centimeters of uplift at this locality. Observations from an Indian Coast Guard helicopter on the northwest coast of the island suggest that the actual uplift is on the order of 1 to 2 meters at this site.

    In figures (c) and (d), pre-earthquake and post-earthquake ASTER images of a small island off the northwest coast of Rutland Island, 38 kilometers east of North Sentinel Island, show submergence of the coral reef surrounding the island. The tide was higher in the pre-earthquake image (acquired January 1, 2004) than in the post-earthquake image (acquired February 4, 2005), requiring subsidence at this locality. The pivot line must run between North Sentinel and Rutland islands. Note that the scale for the North Sentinel Island images differs from that for the Rutland Island images.

    The tidal model used for this study was based on data from JPL's Topex/Poseidon satellite. The model was used to determine the relative sea surface height at each location at the time each image was acquired, a critical component used to quantify the deformation.

    The scientists' method of using satellite imagery to recognize changes in elevation relative to sea surface height and of using a tidal model to place quantitative bounds on coseismic uplift or subsidence is a novel approach that can be adapted to other forms of remote sensing and can be applied to other subduction zones in tropical regions.

    ASTER is one of five Earth-observing instruments launched December 18, 1999, on NASA's Terra satellite. The instrument was built by Japan's Ministry of Economy, Trade and Industry. A joint U.S./Japan science team is responsible for validation and calibration of the instrument and the data products.

    The broad spectral coverage and high spectral resolution of ASTER provides scientists in numerous disciplines with critical information for surface mapping, and monitoring of dynamic conditions and temporal change. Example applications are: monitoring glacial advances and retreats; monitoring potentially active volcanoes; identifying crop stress; determining cloud morphology and physical properties; wetlands evaluation; thermal pollution monitoring; coral reef degradation; surface temperature mapping of soils and geology; and measuring surface heat balance.

    The U.S. science team is located at NASA's Jet Propulsion Laboratory, Pasadena, Calif. The Terra mission is part of NASA's Science Mission Directorate.

  11. The Performance of Trauma Research Centers of Iran during the Past 10 Years; A Science Monitor Survey

    PubMed Central

    Yadollahi, Mahnaz; Shamsedini, Narges; Shayan, Leila; Rezaianzadeh, Abbas; Bolandparvaz, Shahram

    2014-01-01

    Objective: To compare and evaluate the scores of the trauma research center of Shiraz University of Medical Sciences with those of other trauma research centers in Iran. Methods: The assessment scores of each center were gathered from the Iran medical research and Ministry of Health and Medical Education websites. Each score is recorded by Iranian calendar year, which runs from the 21st of March of one year until the 20th of March of the next. The centers are ranked and scored by knowledge production, capacity development, and research projects. Results: The total evaluation scores of the trauma research centers of Iran's Universities of Medical Sciences have increased since their establishment. The highest increase in assessment scores was for the Tehran Trauma Research Center. An upward trend was observed in the total indicators of the knowledge production index of all the trauma research centers from 2001/2002 to 2011/2012. An ascending trend was shown in the published-articles scores of the Shiraz and Kashan Trauma Research Centers in recent years. Conclusion: The increasing trend in the scores of trauma research centers in Iran indicates a significant role in knowledge production, but barriers to research need to be identified and interventional projects undertaken to promote trauma care and prevention.

  12. 76 FR 61115 - Migrant and Seasonal Farmworkers (MSFWs) Monitoring Report and One-Stop Career Center Complaint...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-03

    ...-Stop Career Center Complaint/Referral Record: Comments Agency: Employment and Training Administration... 8429, One-Stop Career Center Complaint/ Referral Record, to March 1, 2015. The changes incorporated to... description of complaint, (2) in Part II, of items 2 and 3, added the word ``Job Service,'' and (3) Part...

  13. Benefits of Earthquake Early Warning to Large Municipalities (Invited)

    NASA Astrophysics Data System (ADS)

    Featherstone, J.

    2013-12-01

    The City of Los Angeles has been involved in the testing of the Cal Tech Shake Alert Earthquake Early Warning (EQEW) system since February 2012. This system accesses a network of seismic monitors installed throughout California. The system analyzes and processes seismic information, and transmits a warning (audible and visual) when an earthquake occurs. In late 2011, the City of Los Angeles Emergency Management Department (EMD) was approached by Cal Tech regarding EQEW, and immediately recognized the value of the system. Simultaneously, EMD was in the process of finalizing a report by a multi-discipline team that visited Japan in December 2011, which spoke to the effectiveness of EQEW for the March 11, 2011 earthquake that struck that country. Information collected by the team confirmed that the EQEW systems proved to be very effective in alerting the population of the impending earthquake. The EQEW in Japan is also tied to mechanical safeguards, such as the stopping of high-speed trains. For a city the size and complexity of Los Angeles, the implementation of a reliable EQEW system will save lives, reduce loss, ensure effective and rapid emergency response, and will greatly enhance the ability of the region to recover from a damaging earthquake. The current Shake Alert system is being tested at several governmental organizations and private businesses in the region. EMD, in cooperation with Cal Tech, identified several locations internal to the City where the system would have an immediate benefit. These include the staff offices within EMD, the Los Angeles Police Department's Real Time Analysis and Critical Response Division (24 hour crime center), and the Los Angeles Fire Department's Metropolitan Fire Communications (911 Dispatch). All three of these agencies routinely manage the collaboration and coordination of citywide emergency information and response during times of crisis. Having these three key public safety offices connected and included in the early testing of an EQEW system will help shape the EQEW policy which will determine the seismic safety of millions of Californians in the years to come.

  14. Earthquake at 40 feet

    USGS Publications Warehouse

    Miller, G. J.

    1976-01-01

    The earthquake that struck the island of Guam on November 1, 1975, at 11:17 a.m. had many unique aspects, not the least of which was the experience of an earthquake of 6.25 Richter magnitude while at 40 feet. My wife Bonnie, a fellow diver, Greg Guzman, and I were diving at Gabgab Beach in the outer harbor of Apra Harbor, engaged in underwater photography when the earthquake struck.

  15. Are Earthquake Magnitudes Clustered?

    SciTech Connect

    Davidsen, Joern; Green, Adam

    2011-03-11

    The question of earthquake predictability is a long-standing and important challenge. Recent results [Phys. Rev. Lett. 98, 098501 (2007); ibid.100, 038501 (2008)] have suggested that earthquake magnitudes are clustered, thus indicating that they are not independent in contrast to what is typically assumed. Here, we present evidence that the observed magnitude correlations are to a large extent, if not entirely, an artifact due to the incompleteness of earthquake catalogs and the well-known modified Omori law. The latter leads to variations in the frequency-magnitude distribution if the distribution is constrained to those earthquakes that are close in space and time to the directly following event.

  16. Istanbul Earthquake Early Warning and Rapid Response System

    NASA Astrophysics Data System (ADS)

    Erdik, M. O.; Fahjan, Y.; Ozel, O.; Alcik, H.; Aydin, M.; Gul, M.

    2003-12-01

    As part of the preparations for a future earthquake in Istanbul, a Rapid Response and Early Warning system is in operation in the metropolitan area. For the Early Warning system, ten strong-motion stations were installed as close as possible to the fault zone. Continuous on-line data from these stations via digital radio modem provide early warning for potentially disastrous earthquakes. Considering the complexity of fault rupture and the short fault distances involved, a simple and robust Early Warning algorithm, based on the exceedance of specified threshold time-domain amplitude levels, is implemented. The band-pass filtered accelerations and the cumulative absolute velocity (CAV) are compared with specified threshold levels. When any acceleration or CAV (on any channel) at a given station exceeds specific threshold values, it is considered a vote. Whenever two station votes occur within a selectable time interval after the first vote, the first alarm is declared. In order to specify the appropriate threshold levels, a data set of near-field strong ground motion records from Turkey and around the world has been analyzed. Correlations among these thresholds in terms of epicentral distance and earthquake magnitude have been studied. The encrypted early warning signals will be communicated to the respective end users by UHF systems through a "service provider" company. The users of the early warning signal will be power and gas companies, nuclear research facilities, critical chemical factories, the subway system and several high-rise buildings. Depending on the location of the earthquake (initiation of fault rupture) and the recipient facility, the alarm time can be as long as about 8 s. For the rapid response system, one hundred 18-bit-resolution strong-motion accelerometers were placed at quasi-free-field locations (basements of small buildings) in the populated areas of the city, within an area of approximately 50x30 km, to constitute a network that will enable early damage assessment and rapid response information after a damaging earthquake. Early response information is achieved through fast acquisition and analysis of processed data obtained from the network. The stations are interrogated on a regular basis by the main data center. Once triggered by an earthquake, each station processes the streaming strong-motion data to yield the spectral accelerations at specific periods and 12Hz filtered PGA and PGV, and sends these parameters in the form of SMS messages every 20 s directly to the main data center through a designated GSM network and through a microwave system. A shake map and damage distribution map (using aggregate building inventories and fragility curves) will be automatically generated using the algorithm developed for this purpose. Loss assessment studies are complemented by a large citywide digital database on the topography, geology, soil conditions, building, infrastructure and lifeline inventory. The shake and damage maps will be conveyed to the governor's and mayor's offices, fire, police and army headquarters within 3 minutes using radio modem and GPRS communication. An additional forty strong-motion recorders were placed on important structures in several interconnected clusters to monitor the health of these structures after a damaging earthquake.
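
    The threshold-and-vote logic described above can be sketched in a few lines: a station "votes" when its band-pass-filtered acceleration or CAV exceeds a threshold, and the first alarm is declared when a second station votes within a short window of the first. The filter band, thresholds, and coincidence window below are illustrative assumptions, not the system's operational settings.

    # Threshold exceedance per station plus a two-station coincidence alarm.
    import numpy as np
    from scipy.signal import butter, filtfilt

    def station_vote_time(acc, dt, acc_thresh=0.2, cav_thresh=0.3, band=(1.0, 10.0)):
        """Earliest time (s) at which either criterion is exceeded, else None."""
        b, a = butter(4, band, btype="bandpass", fs=1.0 / dt)
        filt = filtfilt(b, a, acc)
        cav = np.cumsum(np.abs(filt)) * dt                 # cumulative absolute velocity
        exceed = (np.abs(filt) > acc_thresh) | (cav > cav_thresh)
        idx = np.flatnonzero(exceed)
        return idx[0] * dt if idx.size else None

    def first_alarm(vote_times, coincidence_window=5.0):
        """Alarm when a second station votes within the window after the first vote."""
        votes = sorted(t for t in vote_times if t is not None)
        for t1, t2 in zip(votes, votes[1:]):
            if t2 - t1 <= coincidence_window:
                return t2
        return None

    dt = 0.01
    t = np.arange(0, 30, dt)
    quiet = 0.005 * np.random.default_rng(4).standard_normal(t.size)
    shaking = quiet + 0.5 * np.sin(2 * np.pi * 3 * t) * (t > 12)   # strong motion after 12 s
    votes = [station_vote_time(a, dt) for a in (shaking, shaking, quiet)]
    print("station vote times:", votes, "-> alarm at", first_alarm(votes), "s")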

  17. Cooperative Monitoring Center Occasional Paper/13: Cooperative monitoring for confidence building: A case study of the Sino-Indian border areas

    SciTech Connect

    SIDHU,WAHEGURU PAL SINGH; YUAN,JING-DONG; BIRINGER,KENT L.

    1999-08-01

    This occasional paper identifies applicable cooperative monitoring techniques and develops models for possible application in the context of the border between China and India. The 1993 and 1996 Sino-Indian agreements on maintaining peace and tranquility along the Line of Actual Control (LAC) and establishing certain confidence building measures (CBMs), including force reductions and limitation on military exercises along their common border, are used to examine the application of technically based cooperative monitoring in both strengthening the existing terms of the agreements and also enhancing trust. The paper also aims to further the understanding of how and under what conditions technology-based tools can assist in implementing existing agreements on arms control and confidence building. The authors explore how cooperative monitoring techniques can facilitate effective implementation of arms control agreements and CBMS between states and contribute to greater security and stability in bilateral, regional, and global contexts.

  18. Where was the 1898 Mare Island Earthquake? Insights from the 2014 South Napa Earthquake

    NASA Astrophysics Data System (ADS)

    Hough, S. E.

    2014-12-01

    The 2014 South Napa earthquake provides an opportunity to reconsider the Mare Island earthquake of 31 March 1898, which caused severe damage to buildings at a Navy yard on the island. Revisiting archival accounts of the 1898 earthquake, I estimate a lower intensity magnitude, 5.8, than the value in the current Uniform California Earthquake Rupture Forecast (UCERF) catalog (6.4). However, I note that intensity magnitude can differ from Mw by upwards of half a unit depending on stress drop, which for a historical earthquake is unknowable. In the aftermath of the 2014 earthquake, there has been speculation that the apparently severe effects on Mare Island in 1898 were due to the vulnerability of local structures. No surface rupture has ever been identified from the 1898 event, which is commonly associated with the Hayward-Rodgers Creek fault system, some 10 km west of Mare Island (e.g., Parsons et al., 2003). Reconsideration of detailed archival accounts of the 1898 earthquake, together with a comparison of the intensity distributions for the two earthquakes, points to genuinely severe, likely near-field ground motions on Mare Island. The 2014 earthquake did cause significant damage to older brick buildings on Mare Island, but the level of damage does not match the severity of documented damage in 1898. The high-intensity fields for the two earthquakes are, moreover, spatially shifted, with the centroid of the 2014 distribution near the town of Napa and that of the 1898 distribution near Mare Island, east of the Hayward-Rodgers Creek system. I conclude that the 1898 Mare Island earthquake was centered on or near Mare Island, possibly involving rupture of one or both strands of the Franklin fault, a low-slip-rate fault sub-parallel to the Rodgers Creek fault to the west and the West Napa fault to the east. I estimate Mw5.8 assuming an average stress drop; data are also consistent with Mw6.4 if stress drop was a factor of ≈3 lower than average for California earthquakes. I present intensity observations from the 2014 South Napa earthquake that suggest that it may have been a low stress drop event.

  19. OMG Earthquake! Can Twitter improve earthquake response?

    NASA Astrophysics Data System (ADS)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 to 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information such as epicenter, magnitude, and strong-motion recordings. Without quantitative data, prioritization of response measures, including building and infrastructure inspection, are not possible. The main advantage of Twitter is speed, especially in sparsely instrumented areas. A Twitter based system potentially could provide a quick notification that there was a possible event and that seismographically derived information will follow. If you are interested in learning more, follow @USGSted on Twitter.
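
    The detection idea described above reduces to spotting a jump in the per-minute count of earthquake-related tweets relative to the background rate. A minimal sketch follows, with synthetic timestamps and an assumed spike threshold.

    # Flag the first minute whose tweet count far exceeds the background rate.
    from collections import Counter
    from datetime import datetime, timedelta

    def detect_spike(tweet_times, background_per_min=1.0, factor=20.0):
        """Return (minute, count) for the first minute exceeding factor * background."""
        per_minute = Counter(t.replace(second=0, microsecond=0) for t in tweet_times)
        for minute in sorted(per_minute):
            if per_minute[minute] > factor * background_per_min:
                return minute, per_minute[minute]
        return None

    t0 = datetime(2009, 3, 30, 10, 40)
    background = [t0 + timedelta(minutes=i * 17) for i in range(5)]         # sparse chatter
    burst = [t0 + timedelta(hours=2, seconds=s) for s in range(0, 120)]     # 60 tweets/min
    print(detect_spike(background + burst))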

  20. Waveform Cross-Correlation for Improved North Texas Earthquake Locations

    NASA Astrophysics Data System (ADS)

    Phillips, M.; DeShon, H. R.; Oldham, H. R.; Hayward, C.

    2014-12-01

    In November 2013, a sequence of earthquakes began in Reno and Azle, TX, two communities located northwest of Fort Worth in an area of active oil and gas extraction. Only one felt earthquake had been reported within the area before the occurrence of probable injection-induced earthquakes at the Dallas-Fort Worth airport in 2008. The USGS National Earthquake Information Center (NEIC) has reported 27 felt earthquakes in the Reno-Azle area through January 28, 2014. A temporary seismic network was installed beginning in December 2013 to acquire data to improve location and magnitude estimates and characterize the earthquake sequence. Here, we present high-resolution relative earthquake locations derived using differential time data from waveform cross-correlation. Cross-correlation is computed using the GISMO software suite, and event relocation is done using double-difference relocation techniques. Waveform cross-correlation of the local data indicates high (>70%) similarity between 4 major swarms of events lasting between 18 and 24 hours. These swarms are temporal zones of high event frequency; 1.4% of the time series data accounts for 42.1% of the identified local earthquakes. Local earthquakes are occurring along the Newark East Fault System, a NE-SW-striking normal fault system previously thought to be inactive, at depths between 2 and 8 km in the Ellenburger limestone formation and underlying Precambrian basement. Data analysis is ongoing, and continued characterization of the associated fault will provide improved location estimates.
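
    For orientation, a bare-bones sketch of the differential-time measurement that underlies this kind of relocation is given below; the actual study uses the GISMO suite and double-difference relocation, so the pure-numpy implementation, sample interval, and waveforms here are placeholders.

      import numpy as np

      def differential_time(trace_a, trace_b, dt):
          """Cross-correlate two event waveforms recorded at the same station and
          return (lag in seconds, peak normalized correlation coefficient).
          trace_a and trace_b are 1-D numpy arrays sampled at interval dt."""
          a = (trace_a - trace_a.mean()) / (np.std(trace_a) * len(trace_a))
          b = (trace_b - trace_b.mean()) / np.std(trace_b)
          cc = np.correlate(a, b, mode="full")
          peak = int(np.argmax(cc))
          lag_samples = peak - (len(trace_b) - 1)   # 0 when the traces align
          return lag_samples * dt, cc[peak]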

  1. The evaluation of the earthquake hazard using the exponential distribution method for different seismic source regions in and around Aǧrı

    NASA Astrophysics Data System (ADS)

    Bayrak, Yusuf; Türker, Tuǧba

    2016-04-01

    The aim of this study was to determine the earthquake hazard for different seismic source regions in and around Aǧrı using the exponential distribution method. A homogeneous earthquake catalog covering the instrumental period (1900-2015) and containing 456 earthquakes was compiled for Aǧrı and its vicinity. The catalog was assembled from several sources, including the Bogazici University Kandilli Observatory and Earthquake Research Institute (KOERI), the National Earthquake Monitoring Center (NEMC), TUBITAK, TURKNET, the International Seismological Center (ISC), and the Incorporated Research Institutions for Seismology (IRIS). Aǧrı and its vicinity were divided into 7 seismic source regions on the basis of the epicenter distribution of instrumental-period earthquakes, focal mechanism solutions, and the existing tectonic structures. Average magnitude values were calculated for specified magnitude ranges in each of the 7 seismic source regions. For each region, the largest difference between the observed and expected cumulative probabilities was determined for the specified magnitude classes. Recurrence periods and annual numbers of earthquake occurrences were then estimated for earthquakes in Aǧrı and its vicinity. As a result, occurrence probabilities were determined for each of the 7 seismic source regions for earthquakes greater than magnitude 6.7 (Region 1), 4.7 (Region 2), 5.2 (Region 3), 6.2 (Region 4), 5.7 (Region 5), 7.2 (Region 6), and 6.2 (Region 7). The highest observed magnitude among the 7 seismic source regions is 7, in Region 6. For Region 6, the estimated recurrence intervals for future earthquakes are 158 years for magnitude 7.2, 70 years for 6.7, 31 years for 6.2, 13 years for 5.7, and 6 years for 5.2.
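
    The recurrence estimates above follow from fitting an exponential (Gutenberg-Richter type) magnitude distribution to the regional catalog. A minimal sketch of that calculation is shown below; the maximum-likelihood estimator and return-period formula are standard, but the completeness magnitude, catalog values, and the synthetic example are placeholders rather than the study's data.

      import numpy as np

      def recurrence_period(magnitudes, years_spanned, m_min, m_target):
          """Return period (years) of events with M >= m_target, assuming an
          exponential magnitude distribution fitted to a catalog that is
          complete above m_min."""
          mags = np.asarray([m for m in magnitudes if m >= m_min])
          beta = 1.0 / (mags.mean() - m_min)                 # maximum-likelihood estimate
          rate_above_mmin = len(mags) / years_spanned        # events per year, M >= m_min
          rate_above_target = rate_above_mmin * np.exp(-beta * (m_target - m_min))
          return 1.0 / rate_above_target

      # Made-up synthetic catalog, only to show the call signature.
      rng = np.random.default_rng(0)
      catalog = 4.0 + rng.exponential(scale=0.5, size=456)
      print(recurrence_period(catalog, years_spanned=115.0, m_min=4.0, m_target=6.7))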

  2. Earthquake history of Oklahoma

    USGS Publications Warehouse

    von Hake, C. A.

    1976-01-01

    The strongest and most widely felt earthquake in Oklahoma occurred on April 9, 1952. The intensity VII (Modified Mercalli Scale) tremor was felt over 362,000 square kilometres. A second intensity VII earthquake, felt over a very small area, occurred in October 1956. In addition, 15 other shocks, intensity V or VI, have originated within Oklahoma.

  3. Children's Beliefs about Earthquakes.

    ERIC Educational Resources Information Center

    Ross, Katharyn E. K.; Shuell, Thomas J.

    1993-01-01

    Summarizes the results of three related studies whose overall purpose was to determine elementary students' conceptions about earthquakes at two widely separated locations in the United States. Certain topics, such as the cause of earthquakes, seemed to cause difficulty for students. New definitional responses emerged in the studies that took…

  4. Earthquakes and Schools

    ERIC Educational Resources Information Center

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  5. Catalog of Earthquake Hypocenters at Alaskan Volcanoes: January 1 through December 31, 2006

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.; Searcy, Cheryl

    2008-01-01

    Between January 1 and December 31, 2006, AVO located 8,666 earthquakes of which 7,783 occurred on or near the 33 volcanoes monitored within Alaska. Monitoring highlights in 2006 include: an eruption of Augustine Volcano, a volcanic-tectonic earthquake swarm at Mount Martin, elevated seismicity and volcanic unrest at Fourpeaked Mountain, and elevated seismicity and low-level tremor at Mount Veniaminof and Korovin Volcano. A new seismic subnetwork was installed on Fourpeaked Mountain. This catalog includes: (1) descriptions and locations of seismic instrumentation deployed in the field during 2006, (2) a description of earthquake detection, recording, analysis, and data archival systems, (3) a description of seismic velocity models used for earthquake locations, (4) a summary of earthquakes located in 2006, and (5) an accompanying UNIX tar-file with a summary of earthquake origin times, hypocenters, magnitudes, phase arrival times, location quality statistics, daily station usage statistics, and all files used to determine the earthquake locations in 2006.

  6. Demand surge following earthquakes

    USGS Publications Warehouse

    Olsen, Anna H.

    2012-01-01

    Demand surge is understood to be a socio-economic phenomenon where repair costs for the same damage are higher after large- versus small-scale natural disasters. It has reportedly increased monetary losses by 20 to 50%. In previous work, a model for the increased costs of reconstruction labor and materials was developed for hurricanes in the Southeast United States. The model showed that labor cost increases, rather than the material component, drove the total repair cost increases, and this finding could be extended to earthquakes. A study of past large-scale disasters suggested that there may be additional explanations for demand surge. Two such explanations specific to earthquakes are the exclusion of insurance coverage for earthquake damage and possible concurrent causation of damage from an earthquake followed by fire or tsunami. Additional research into these aspects might provide a better explanation for increased monetary losses after large- vs. small-scale earthquakes.

  7. Observing the Greatest Earthquakes

    NASA Astrophysics Data System (ADS)

    Atwater, Brian; Barrientos, Sergio; Cifuentes, Inés; Cisternas, Marco; Wang, Kelin

    2010-11-01

    AGU Chapman Conference on Giant Earthquakes and Their Tsunamis; Viña del Mar and Valparaíso, Chile, 16-20 May 2010 ; An AGU Chapman Conference commemorated the fiftieth anniversary of the 1960 M 9.5 Chile earthquake. Participants reexamined this earthquake, the largest ever recorded instrumentally, and compared it with Chile's February 2010 M 8.8 earthquake. They also addressed the giant earthquake potential of subduction zones worldwide and strategies for reducing losses due to tsunamis. The conference drew 96 participants from 18 countries, and it reached out to public audiences in Chile. Its program and abstracts are posted at http://www.agu.org/meetings/chapman/2010/acall/pdf/Scientific_Program.pdf.

  8. Modeling earthquake dynamics

    NASA Astrophysics Data System (ADS)

    Charpentier, Arthur; Durand, Marilou

    2015-07-01

    In this paper, we investigate questions arising in Parsons and Geist (Bull Seismol Soc Am 102:1-11, 2012). Pseudo-causal models connecting magnitudes and waiting times are considered, through generalized regression. We use conditional models (magnitude given previous waiting time, and conversely) as an extension of the joint distribution model described in Nikoloulopoulos and Karlis (Environmetrics 19:251-269, 2008). On the one hand, we fit a Pareto distribution for earthquake magnitudes, where the tail index is a function of the waiting time following the previous earthquake; on the other hand, waiting times are modeled using a Gamma or a Weibull distribution, where the parameters are functions of the magnitude of the previous earthquake. We use those two models, alternately, to generate the dynamics of earthquake occurrence and to estimate the probability of occurrence of several earthquakes within a year or a decade.
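
    A minimal sketch of the alternating simulation scheme described above is shown below. The link functions and parameter values are illustrative placeholders, not the paper's fitted models.

      import numpy as np

      rng = np.random.default_rng(42)

      def tail_index(waiting_time):
          # Placeholder link: Pareto tail index as a function of the previous waiting time.
          return 1.2 + 0.1 * np.log1p(waiting_time)

      def gamma_params(magnitude):
          # Placeholder link: Gamma shape and scale for the next waiting time (days)
          # as functions of the previous magnitude.
          return 1.0, 30.0 * np.exp(-0.5 * (magnitude - 5.0))

      def simulate(n_events, m_min=5.0, m0=5.5, w0=30.0):
          """Alternately draw waiting times (Gamma) and magnitudes (Pareto),
          each conditioned on the previous value of the other variable."""
          mags, waits = [m0], [w0]
          for _ in range(n_events - 1):
              shape, scale = gamma_params(mags[-1])
              w = rng.gamma(shape, scale)
              alpha = tail_index(w)
              m = m_min * (1.0 - rng.random()) ** (-1.0 / alpha)   # Pareto(m_min, alpha)
              waits.append(w)
              mags.append(m)
          return np.cumsum(waits), np.array(mags)              # event times (days), magnitudes

      def prob_event_within(years, m_thresh=6.0, n_sims=500):
          """Monte Carlo estimate of P(at least one M >= m_thresh within `years`)."""
          hits = 0
          for _ in range(n_sims):
              t, m = simulate(300)
              hits += np.any((t <= 365.0 * years) & (m >= m_thresh))
          return hits / n_sims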

  9. Psychological distress following urban earthquakes in California.

    PubMed

    Bourque, Linda B; Siegel, Judith M; Shoaf, Kimberley I

    2002-01-01

    During and following a disaster caused by a natural event, human populations are thought to be at greater risk of psychological morbidity and mortality directly attributable to increased, disaster-induced stress. Drawing both on the research of others and on that conducted at the Center for Public Health and Disaster Relief of the University of California-Los Angeles (UCLA) following California earthquakes, this paper examines the extent to which research evidence supports these assumptions. Following a brief history of disaster research in the United States, the response of persons at the time of an earthquake was examined, with particular attention to psychological morbidity; the number of deaths that can be attributed to cardiovascular events and suicides; and the extent to which, and by whom, health services are used following an earthquake. The implications of research findings for practitioners in the field are discussed. PMID:12500731

  10. Disruption of groundwater systems by earthquakes

    NASA Astrophysics Data System (ADS)

    Liao, Xin; Wang, Chi-Yuen; Liu, Chun-Ping

    2015-11-01

    Earthquakes are known to enhance permeability at great distances, and this phenomenon may also disrupt groundwater systems by breaching the barrier between different reservoirs. Here we analyze the tidal response of water level in a deep (~4 km) well before and after the 2008 M7.9 Wenchuan earthquake to show that the earthquake not only changed the permeability but also altered the poroelastic properties of the groundwater system. Based on lithologic well logs and experimental data for rock properties, we interpret the change to reflect a coseismic breaching of aquitards bounding the aquifer, due perhaps to clearing of preexisting cracks and creation of new cracks, to depths of several kilometers. This may cause mixing of groundwater from previously isolated reservoirs and impact the safety of groundwater supplies and underground waste repositories. The method demonstrated here may hold promise for monitoring aquitard breaching by both natural and anthropogenic processes.
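
    The tidal-response analysis described above amounts to estimating the amplitude and phase of the water-level oscillation at a known tidal frequency (for example, the semidiurnal M2 constituent) before and after the earthquake. A least-squares sketch is given below, with synthetic data standing in for the well record; the noise level and record length are arbitrary.

      import numpy as np

      def tidal_amplitude_phase(t_hours, water_level, period_hours=12.4206):
          """Least-squares fit of A*cos(wt) + B*sin(wt) + C at the M2 tidal period;
          returns (amplitude, phase in radians)."""
          w = 2.0 * np.pi / period_hours
          G = np.column_stack([np.cos(w * t_hours), np.sin(w * t_hours),
                               np.ones_like(t_hours)])
          (a, b, _), *_ = np.linalg.lstsq(G, water_level, rcond=None)
          return np.hypot(a, b), np.arctan2(b, a)

      # Synthetic example: 30 days of hourly water-level samples with a small M2 signal.
      t = np.arange(0.0, 30 * 24.0, 1.0)
      level = 0.02 * np.cos(2 * np.pi / 12.4206 * t - 0.3) + 0.001 * np.random.randn(t.size)
      print(tidal_amplitude_phase(t, level))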

  11. Crowd-Sourced Global Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Minson, S. E.; Brooks, B. A.; Glennie, C. L.; Murray, J. R.; Langbein, J. O.; Owen, S. E.; Iannucci, B. A.; Hauser, D. L.

    2014-12-01

    Although earthquake early warning (EEW) has shown great promise for reducing loss of life and property, it has only been implemented in a few regions due, in part, to the prohibitive cost of building the required dense seismic and geodetic networks. However, many cars and consumer smartphones, tablets, laptops, and similar devices contain low-cost versions of the same sensors used for earthquake monitoring. If a workable EEW system could be implemented based on either crowd-sourced observations from consumer devices or very inexpensive networks of instruments built from consumer-quality sensors, EEW coverage could potentially be expanded worldwide. Controlled tests of several accelerometers and global navigation satellite system (GNSS) receivers typically found in consumer devices show that, while they are significantly noisier than scientific-grade instruments, they are still accurate enough to capture displacements from moderate and large magnitude earthquakes. The accuracy of these sensors varies greatly depending on the type of data collected. Raw coarse acquisition (C/A) code GPS data are relatively noisy. These observations have a surface displacement detection threshold approaching ~1 m and would thus only be useful in large Mw 8+ earthquakes. However, incorporating either satellite-based differential corrections or using a Kalman filter to combine the raw GNSS data with low-cost acceleration data (such as from a smartphone) decreases the noise dramatically. These approaches allow detection thresholds as low as 5 cm, potentially enabling accurate warnings for earthquakes as small as Mw 6.5. Simulated performance tests show that, with data contributed from only a very small fraction of the population, a crowd-sourced EEW system would be capable of warning San Francisco and San Jose of a Mw 7 rupture on California's Hayward fault and could have accurately issued both earthquake and tsunami warnings for the 2011 Mw 9 Tohoku-oki, Japan earthquake.
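
    A minimal one-dimensional sketch of the Kalman-filter combination mentioned above is shown below: accelerometer samples drive the state prediction and noisy GNSS displacements provide the updates. The noise levels, sample interval, and GNSS cadence are illustrative assumptions, not values from the study.

      import numpy as np

      def fuse_gnss_accel(accel, gnss, dt=0.01, gnss_every=100,
                          acc_noise=0.05, gnss_noise=0.05):
          """1-D Kalman filter with state x = [displacement, velocity].
          accel is sampled every dt seconds; a GNSS displacement arrives
          every gnss_every accelerometer samples."""
          F = np.array([[1.0, dt], [0.0, 1.0]])        # state transition
          B = np.array([0.5 * dt**2, dt])              # acceleration (control) input
          H = np.array([[1.0, 0.0]])                   # GNSS observes displacement
          Q = np.outer(B, B) * acc_noise**2            # process noise from accel noise
          R = np.array([[gnss_noise**2]])
          x = np.zeros(2)
          P = np.eye(2)
          out = []
          for k, a in enumerate(accel):
              x = F @ x + B * a                        # predict using acceleration
              P = F @ P @ F.T + Q
              if k % gnss_every == 0 and k // gnss_every < len(gnss):
                  z = gnss[k // gnss_every]
                  y = z - H @ x                        # innovation
                  S = H @ P @ H.T + R
                  K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
                  x = x + K @ y
                  P = (np.eye(2) - K @ H) @ P
              out.append(x[0])
          return np.array(out)                         # filtered displacement series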

  12. Patient experiences with self-monitoring renal function after renal transplantation: results from a single-center prospective pilot study

    PubMed Central

    van Lint, Céline L; van der Boog, Paul JM; Wang, Wenxin; Brinkman, Willem-Paul; Rövekamp, Ton JM; Neerincx, Mark A; Rabelink, Ton J; van Dijk, Sandra

    2015-01-01

    Background After a kidney transplantation, patients have to visit the hospital often to monitor for early signs of graft rejection. Self-monitoring of creatinine in addition to blood pressure at home could alleviate the burden of frequent outpatient visits, but only if patients are willing to self-monitor and if they adhere to the self-monitoring measurement regimen. A prospective pilot study was conducted to assess patients’ experiences and satisfaction. Materials and methods For 3 months after transplantation, 30 patients registered self-measured creatinine and blood pressure values in an online record to which their physician had access to. Patients completed a questionnaire at baseline and follow-up to assess satisfaction, attitude, self-efficacy regarding self-monitoring, worries, and physician support. Adherence was studied by comparing the number of registered with the number of requested measurements. Results Patients were highly motivated to self-monitor kidney function, and reported high levels of general satisfaction. Level of satisfaction was positively related to perceived support from physicians (P<0.01), level of self-efficacy (P<0.01), and amount of trust in the accuracy of the creatinine meter (P<0.01). The use of both the creatinine and blood pressure meter was considered pleasant and useful, despite the level of trust in the accuracy of the creatinine device being relatively low. Trust in the accuracy of the creatinine device appeared to be related to level of variation in subsequent measurement results, with more variation being related to lower levels of trust. Protocol adherence was generally very high, although the range of adherence levels was large and increased over time. Conclusion Patients’ high levels of satisfaction suggest that at-home monitoring of creatinine and blood pressure after transplantation offers a promising strategy. Important prerequisites for safe implementation in transplant care seem to be support from physicians and patients’ confidence in both their own self-monitoring skills and the accuracy of the devices used. PMID:26673985

  13. An appraisal of aftershocks behavior for large earthquakes in Persia

    NASA Astrophysics Data System (ADS)

    Nemati, Majid

    2014-01-01

    This study focuses on the distribution of aftershocks in both location and magnitude for recent earthquakes in Iran. Forty-three earthquakes are investigated, using data from the global International Seismological Center (ISC) seismic catalogue for 1961-2006 and from the regional earthquake catalogue of the Institute of Geophysics, University of Tehran (IGUT) for 2006-2012. We only consider earthquakes with magnitude greater than 5.0. The majority of these events are intracontinental, occurring over four seismotectonic provinces across Iran. Processing the aftershock sequences reported by both catalogues, with a cut-off magnitude of 2.5 and a sequence duration of 70 days, leads us to define the spatial horizontal area (A) occupied by the aftershocks as a function of mainshock magnitude (M) for Persian earthquakes: ISC: Log10(A) = 0.45MS + 0.23; IGUT: Log10(A) = 0.25MN + 1.7.
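
    The two regressions quoted above translate directly into a small helper; the choice of catalog determines which magnitude scale (MS or MN) and coefficients apply, and the area unit (km² is assumed here) follows the study's convention.

      def aftershock_area(mainshock_magnitude, catalog="ISC"):
          """Aftershock-zone area from the regressions in the abstract:
          ISC:  log10(A) = 0.45 * MS + 0.23
          IGUT: log10(A) = 0.25 * MN + 1.7
          Returns A (assumed km^2)."""
          if catalog == "ISC":
              log_a = 0.45 * mainshock_magnitude + 0.23
          elif catalog == "IGUT":
              log_a = 0.25 * mainshock_magnitude + 1.7
          else:
              raise ValueError("catalog must be 'ISC' or 'IGUT'")
          return 10.0 ** log_a

      print(aftershock_area(6.5, "ISC"), aftershock_area(6.5, "IGUT"))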

  14. The mass balance of earthquakes and earthquake sequences

    NASA Astrophysics Data System (ADS)

    Marc, O.; Hovius, N.; Meunier, P.

    2016-04-01

    Large, compressional earthquakes cause surface uplift as well as widespread mass wasting. Knowledge of their trade-off is fragmentary. Combining a seismologically consistent model of earthquake-triggered landsliding and an analytical solution of coseismic surface displacement, we assess how the mass balance of single earthquakes and earthquake sequences depends on fault size and other geophysical parameters. We find that intermediate-size earthquakes (Mw 6-7.3) may cause more erosion than uplift, controlled primarily by seismic source depth and landscape steepness, and less so by fault dip and rake. Such earthquakes can limit topographic growth, but our model indicates that both smaller and larger earthquakes (Mw < 6, Mw > 7.3) systematically cause mountain building. Earthquake sequences with a Gutenberg-Richter distribution have a greater tendency to lead to predominant erosion than repeating earthquakes of the same magnitude, unless a fault can produce earthquakes of Mw 8 or greater.

  15. Regional characterization of mine blasts, earthquakes, mine tremors, and nuclear explosions using the intelligent seismic event identification system. Final report, 1 April 1992-1 July 1993

    SciTech Connect

    Baumgardt, D.R.

    1993-07-31

    This report describes the results of a study of the Intelligent Seismic Event Identification System (ISEIS), which was installed at the Center for Seismic Studies and applied to regional events in the Intelligent Monitoring System (IMS) database. A subset of IMS data was collected for known events in a database called the Ground Truth Database (GTD), and these events were processed by ISEIS. This has shown that the regional high-frequency P/S ratio discriminates between explosions and earthquakes in the Vogtland region recorded at the GERESS array. Mine tremors in the Lubin and Upper Silesia regions resemble earthquakes. The Lg spectral ratio was found to separate explosions and earthquakes in the Vogtland region, but the Lubin and Upper Silesia mine tremors showed large scatter. An evaluation was made of the discrimination rules in the ISEIS expert system for the events in four regions (Vogtland, Lubin, Upper Silesia, and Steigen) in the GTD. This report also describes the results of the analysis of the December 31, 1992 event which occurred near the Russian test site on Novaya Zemlya. Analysis of Pn/Sn ratios at NORESS indicated that these ratios were comparable to those measured for Kola Peninsula mine blasts, although the propagation paths were different. The ratios were only slightly greater than those observed for earthquakes in the Greenland Sea. The August 1, 1986 event was also re-analyzed and found to resemble mine blasts. However, other discriminants indicate that the event was probably an earthquake.
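
    For illustration only, the sketch below computes the kind of P/S amplitude ratio evaluated as a discriminant in the report. Band-pass filtering to the high-frequency band is omitted for brevity, and the window length and decision threshold are assumptions, not ISEIS parameters.

      import numpy as np

      def p_s_ratio(waveform, dt, p_onset, s_onset, window=5.0):
          """RMS amplitude ratio of a P-wave window to an S-wave window;
          onsets are in seconds from the start of the trace."""
          def rms(t0):
              i0 = int(t0 / dt)
              seg = waveform[i0:i0 + int(window / dt)]
              return np.sqrt(np.mean(seg**2))
          return rms(p_onset) / rms(s_onset)

      def classify(ratio, threshold=1.0):
          # Illustrative rule only: explosions tend to show enhanced P relative to S.
          return "explosion-like" if ratio > threshold else "earthquake-like"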

  16. Campi Flegrei Structure From Earthquake Tomography

    NASA Astrophysics Data System (ADS)

    de Luca, G.; de Natale, G.; Benz, H.; Troise, C.; Capuano, P.

    Campi Flegrei caldera is an active volcanic area of southern Italy characterised, together with the neighbouring Mt. Vesuvius, by the highest volcanic risk in the world. Recent episodes of rapid uplift at spectacular rates (1 m per year), accompanied by intense seismicity of low to moderate magnitude (up to ML=4.2), have allowed fundamental information about caldera unrest to be collected. Local earthquake arrival times collected by surveillance and temporary seismic networks from 1970 to the present formed the basis for a selected data set consisting of about 450 earthquakes, recorded at a total of 77 seismic station locations (not all operating contemporaneously). Tomographic inversion, carried out with different ray tracing methods, reveals a minimum in Vp, Vs, and Vp/Vs located at the center of the caldera. This low-velocity anomaly, better resolved here than in previous work, mainly because of a much larger sampling of earthquakes and seismic station locations, is likely to mark the lighter pyroclastic rocks filling the innermost caldera collapse. The intensity of the anomalies, and the high value of the Vp/Vs ratio in the central zone, are coherent with a strongly fractured, fluid-filled medium. The detailed knowledge of the velocity structure also allows more precise earthquake locations to be obtained. The relocation in the 3D model of over 1000 local earthquakes gives the most detailed picture of Campi Flegrei seismicity so far obtained. The use of a Bayesian algorithm for earthquake location shows a rather sharp picture of the main seismogenic structures of the area.

  17. Business Activity Monitoring: Real-Time Group Goals and Feedback Using an Overhead Scoreboard in a Distribution Center

    ERIC Educational Resources Information Center

    Goomas, David T.; Smith, Stuart M.; Ludwig, Timothy D.

    2011-01-01

    Companies operating large industrial settings often find delivering timely and accurate feedback to employees to be one of the toughest challenges they face in implementing performance management programs. In this report, an overhead scoreboard at a retailer's distribution center informed teams of order selectors as to how many tasks were…

  19. Earthquake-induced water-level fluctuations at Yucca Mountain, Nevada, June 1992

    SciTech Connect

    O'Brien, G.M.

    1993-07-01

    This report presents earthquake-induced water-level and fluid-pressure data for wells in the Yucca Mountain area, Nevada, during June 1992. Three earthquakes occurred which caused significant water-level and fluid-pressure responses in wells. Wells USW H-5 and USW H-6 are continuously monitored to detect short-term responses caused by earthquakes. Two wells, monitored hourly, had significant, longer-term responses in water level following the earthquakes. On June 28, 1992, a 7.5-magnitude earthquake occurred near Landers, California, causing an estimated maximum water-level change of 90 centimeters in well USW H-5. Three hours later a 6.6-magnitude earthquake occurred near Big Bear Lake, California; the maximum water-level fluctuation was 20 centimeters in well USW H-5. A 5.6-magnitude earthquake occurred at Little Skull Mountain, Nevada, on June 29, approximately 23 kilometers from Yucca Mountain. The maximum estimated short-term water-level fluctuation from the Little Skull Mountain earthquake was 40 centimeters in well USW H-5. The water level in well UE-25p #1, monitored hourly, decreased approximately 50 centimeters over 3 days following the Little Skull Mountain earthquake. The water level in UE-25p #1 returned to pre-earthquake levels in approximately 6 months. The water level in the lower interval of well USW H-3 increased 28 centimeters following the Little Skull Mountain earthquake. The Landers and Little Skull Mountain earthquakes caused responses in 17 intervals of 14 hourly monitored wells; however, most responses were small and of short duration. For several days following the major earthquakes, many smaller-magnitude aftershocks occurred, causing measurable responses in the continuously monitored wells.

  20. The USGS Earthquake Notification Service (ENS): Customizable notifications of earthquakes around the globe

    USGS Publications Warehouse

    Wald, Lisa A.; Wald, David J.; Schwarz, Stan; Presgrave, Bruce; Earle, Paul; Martinez, Eric; Oppenheimer, David

    2008-01-01

    At the beginning of 2006, the U.S. Geological Survey (USGS) Earthquake Hazards Program (EHP) introduced a new automated Earthquake Notification Service (ENS) to take the place of the National Earthquake Information Center (NEIC) "Bigquake" system and the various other individual EHP e-mail list-servers for separate regions in the United States. These included northern California, southern California, and the central and eastern United States. ENS is a "one-stop shopping" system that allows Internet users to subscribe to flexible and customizable notifications for earthquakes anywhere in the world. The customization capability allows users to define the what (magnitude threshold), the when (day and night thresholds), and the where (specific regions) for their notifications. Customization is achieved by employing a per-user based request profile, allowing the notifications to be tailored for each individual's requirements. Such earthquake-parameter-specific custom delivery was not possible with simple e-mail list-servers. Now that event and user profiles are in a structured query language (SQL) database, additional flexibility is possible. At the time of this writing, ENS had more than 114,000 subscribers, with more than 200,000 separate user profiles. On a typical day, more than 188,000 messages get sent to a variety of widely distributed users for a wide range of earthquake locations and magnitudes. The purpose of this article is to describe how ENS works, highlight the features it offers, and summarize plans for future developments.
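
    To make the per-user customization concrete, the sketch below matches an event against a hypothetical notification profile with "what", "when", and "where" fields. The field names, thresholds, and region format are assumptions for illustration, not the actual ENS schema.

      from dataclasses import dataclass

      @dataclass
      class Profile:
          # Hypothetical per-user notification profile.
          min_magnitude: float          # the "what": magnitude threshold
          night_min_magnitude: float    # the "when": separate nighttime threshold
          region: tuple                 # the "where": (lat_min, lat_max, lon_min, lon_max)

      def should_notify(profile, magnitude, lat, lon, is_night):
          """Return True if an event matches the user's profile."""
          lat_min, lat_max, lon_min, lon_max = profile.region
          in_region = lat_min <= lat <= lat_max and lon_min <= lon <= lon_max
          threshold = profile.night_min_magnitude if is_night else profile.min_magnitude
          return in_region and magnitude >= threshold

      # Example: notify for M>=5.0 by day, M>=6.0 at night, within a California-like box.
      p = Profile(5.0, 6.0, (32.0, 42.0, -125.0, -114.0))
      print(should_notify(p, 6.1, 37.8, -122.4, is_night=True))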

  1. AMBIENT AIR MONITORING AT GROUND ZERO AND LOWER MANHATTAN FOLLOWING THE COLLAPSE OF THE WORLD TRADE CENTER

    EPA Science Inventory

    The U.S. EPA National Exposure Research Laboratory (NERL) collaborated with EPA's Regional offices to establish a monitoring network to characterize ambient air concentrations of particulate matter (PM) and air toxics in lower Manhattan following the collapse of the World Trade...

  2. A search for paleoliquefaction and evidence bearing on the recurrence behavior of the great 1811-12 New Madrid earthquakes

    USGS Publications Warehouse

    Wesnousky, S.G.; Leffler, L.M.

    1994-01-01

    In the winter of 1811-12, three of the largest historic earthquakes in the United States occurred near New Madrid, Missouri. Seismicity continues to the present day throughout a tightly clustered pattern of epicenters centered on the bootheel of Missouri, including parts of northeastern Arkansas, northwestern Tennessee, western Kentucky, and southern Illinois. In 1990, the New Madrid seismic zone/central United States became the first seismically active region east of the Rocky Mountains to be designated a priority research area within the National Earthquake Hazards Reduction Program (NEHRP). This professional paper is a collection of papers, some published separately, presenting results of the newly intensified research program in this area. Major components of this research program include tectonic framework studies, seismicity and deformation monitoring and modeling, improved seismic hazard and risk assessments, and cooperative hazard mitigation studies.

  3. Harnessing the Collective Power of Eyewitnesses for Improved Earthquake Information

    NASA Astrophysics Data System (ADS)

    Bossu, R.; Lefebvre, S.; Mazet-Roux, G.; Steed, R.

    2013-12-01

    The Euro-Med Seismological Centre (EMSC) operates the second most visited global earthquake information website (www.emsc-csem.org), which attracts 2 million visits a month from about 200 different countries. We collect information about earthquakes' effects from eyewitnesses, such as online questionnaires and geolocated pictures, to rapidly constrain impact scenarios. At the beginning, this collection was purely intended to address a scientific issue: the rapid evaluation of an earthquake's impact. However, it rapidly appeared that understanding eyewitnesses' expectations and motivations in the immediate aftermath of an earthquake was essential to optimise this data collection. Crowdsourcing information on an earthquake's effects does not apply to a pre-existing community: by definition, eyewitnesses only exist once the earthquake has struck. We developed a strategy on social networks (Facebook, Google+, Twitter...) to interface with spontaneously emerging online communities of eyewitnesses. The basic idea is to create a positive feedback loop: attract eyewitnesses and engage with them by providing the earthquake information and services they expect, collect their observations, and collate them into improved earthquake information services that attract more witnesses. We will present recent examples to illustrate how important the use of social networks is for engaging with eyewitnesses, especially in regions of low seismic activity where people are unaware of existing Internet resources dealing with earthquakes. A second type of information collated in our information services is derived from the real-time analysis of the traffic on our website in the first minutes following an earthquake occurrence, an approach named flashsourcing. We show, using the example of the Mineral, Virginia, earthquake, that the arrival times of eyewitnesses at our website follow the propagation of the generated seismic waves and, hence, that eyewitnesses can be considered ground-motion sensors. Flashsourcing discriminates felt earthquakes within, on average, 90 s of their occurrence and can, in certain cases, map the damaged areas. Thanks to the flashsourced and crowdsourced information, we developed an innovative Twitter earthquake information service (currently under test, to be opened by November) which intends to offer notifications only for earthquakes that matter to the public. It provides timely information for felt and damaging earthquakes regardless of their magnitude, and a heads-up for seismologists. In conclusion, the experience developed at the EMSC demonstrates the benefit of involving eyewitnesses in earthquake surveillance. The data collected directly and indirectly from eyewitnesses complement information derived from monitoring networks and contribute to improved services. By increasing interaction between science and society, it opens new opportunities for raising awareness of seismic hazard.
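
    One way to picture the "eyewitnesses as ground-motion sensors" observation is to compare the arrival time of each visitor at the website against a simple wave-propagation prediction from the epicenter, as sketched below. The wave speed and reaction delay are rough assumptions for illustration, not EMSC parameters.

      import numpy as np

      def expected_visit_delay(dist_km, wave_speed_kms=3.5, reaction_s=20.0):
          """Predicted delay between origin time and a visit from an eyewitness at
          epicentral distance dist_km: an S-wave travel time plus a nominal human
          reaction/typing delay (both values are assumptions)."""
          return np.asarray(dist_km) / wave_speed_kms + reaction_s

      def compare(visit_delays_s, distances_km):
          """Correlation between observed visit delays and predicted delays; a high
          correlation supports the wave-propagation interpretation."""
          predicted = expected_visit_delay(distances_km)
          return np.corrcoef(np.asarray(visit_delays_s), predicted)[0, 1]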

  4. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    SciTech Connect

    Saragoni, G. Rodolfo

    2008-07-08

    The recent commemoration of the centennial of the 1906 San Francisco and Valparaiso earthquakes has given the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus the 1908 Messina-Reggio Calabria earthquake, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of some still-existing buildings that survived centennial earthquakes represents a challenge to better understand the limitations of the earthquake design methods in use today. Of the three centennial earthquakes considered, only the 1906 Valparaiso earthquake has been repeated, as the 1985 Central Chile Ms = 7.8 earthquake. In this paper, a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is carried out in the neighborhood of Valparaiso harbor. In this study, the only three centennial 3-story buildings that survived both earthquakes almost undamaged were identified. Since accelerograms for the 1985 earthquake were recorded both at El Almendral soil conditions and on rock, the vulnerability analysis of these buildings is done considering instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. It is therefore recommended that more suitable instrumental parameters, such as the destructiveness potential factor, be used in the future to describe earthquake demand.

  6. Magnitude and location of historical earthquakes in Japan and implications for the 1855 Ansei Edo earthquake

    USGS Publications Warehouse

    Bakun, W.H.

    2005-01-01

    Japan Meteorological Agency (JMA) intensity assignments IJMA are used to derive intensity attenuation models suitable for estimating the location and an intensity magnitude Mjma for historical earthquakes in Japan. The intensity for shallow crustal earthquakes on Honshu is equal to -1.89 + 1.42 MJMA - 0.00887 Δh - 1.66 log Δh, where MJMA is the JMA magnitude, Δh = (Δ² + h²)^(1/2), and Δ and h are epicentral distance and focal depth (km), respectively. Four earthquakes located near the Japan Trench were used to develop a subducting plate intensity attenuation model where intensity is equal to -8.33 + 2.19 MJMA - 0.00550 Δh - 1.14 log Δh. The IJMA assignments for the MJMA 7.9 great 1923 Kanto earthquake on the Philippine Sea-Eurasian plate interface are consistent with the subducting plate model. Using the subducting plate model and 226 IJMA IV-VI assignments, the location of the intensity center is 25 km north of the epicenter, Mjma is 7.7, and MJMA is 7.3-8.0 at the 1σ confidence level. Intensity assignments and reported aftershock activity for the enigmatic 11 November 1855 Ansei Edo earthquake are consistent with an MJMA 7.2 Philippine Sea-Eurasian interplate source or Philippine Sea intraslab source at about 30 km depth. If the 1855 earthquake was a Philippine Sea-Eurasian interplate event, the intensity center was adjacent to and downdip of the rupture area of the great 1923 Kanto earthquake, suggesting that the 1855 and 1923 events ruptured adjoining sections of the Philippine Sea-Eurasian plate interface.
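
    The two attenuation relations reduce to a short helper, sketched below with the coefficients taken from the abstract; Δh = sqrt(Δ² + h²) in kilometres, and the logarithm is assumed to be base 10.

      import numpy as np

      def predicted_intensity(m_jma, delta_km, depth_km, model="crustal"):
          """JMA intensity predicted from the attenuation models quoted above.
          model = 'crustal' (shallow Honshu events) or 'slab' (subducting plate)."""
          dh = np.sqrt(delta_km**2 + depth_km**2)
          if model == "crustal":
              return -1.89 + 1.42 * m_jma - 0.00887 * dh - 1.66 * np.log10(dh)
          if model == "slab":
              return -8.33 + 2.19 * m_jma - 0.00550 * dh - 1.14 * np.log10(dh)
          raise ValueError("model must be 'crustal' or 'slab'")

      # Example: intensity ~30 km from an MJMA 7.2 event at 30 km depth on the slab interface.
      print(predicted_intensity(7.2, 30.0, 30.0, model="slab"))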

  7. Implications for earthquake risk reduction in the United States from the Kocaeli, Turkey, earthquake of August 17, 1999

    USGS Publications Warehouse

    U.S. Geological Survey

    2000-01-01

    This report documents implications for earthquake risk reduction in the U.S. The magnitude 7.4 earthquake caused 17,127 deaths, 43,953 injuries, and displaced more than 250,000 people from their homes. The report warns that similar disasters are possible in the United States where earthquakes of comparable size strike the heart of American urban areas. Another concern described in the report is the delayed emergency response that was caused by the inadequate seismic monitoring system in Turkey, a problem that contrasts sharply with rapid assessment and response to the September Chi-Chi earthquake in Taiwan. Additionally, the experience in Turkey suggests that techniques for forecasting earthquakes may be improving.

  8. Simulating Earthquakes for Science and Society: New Earthquake Visualizations Ideal for Use in Science Communication

    NASA Astrophysics Data System (ADS)

    de Groot, R. M.; Benthien, M. L.

    2006-12-01

    The Southern California Earthquake Center (SCEC) has been developing groundbreaking computer modeling capabilities for studying earthquakes. These visualizations were initially shared within the scientific community but have recently gained visibility via television news coverage in Southern California. These types of visualizations are becoming pervasive in the teaching and learning of concepts related to earth science. Computers have opened up a whole new world for scientists working with large data sets, and students can benefit from the same opportunities (Libarkin & Brick, 2002). Earthquakes are ideal candidates for visualization products: they cannot be predicted, are completed in a matter of seconds, occur deep in the earth, and the time between events can be on a geologic time scale. For example, the southern part of the San Andreas fault has not seen a major earthquake since about 1690, setting the stage for an earthquake as large as magnitude 7.7 -- the "big one." Since no one has experienced such an earthquake, visualizations can help people understand the scale of such an event. Accordingly, SCEC has developed a revolutionary simulation of this earthquake, with breathtaking visualizations that are now being distributed. According to Gordin and Pea (1995), visualization should, in theory, make science accessible, provide means for authentic inquiry, and lay the groundwork to understand and critique scientific issues. This presentation will discuss how the new SCEC visualizations and other earthquake imagery achieve these results, how they fit within the context of major themes and study areas in science communication, and how the efficacy of these tools can be improved.

  9. Landslides caused by earthquakes.

    USGS Publications Warehouse

    Keefer, D.K.

    1984-01-01

    Data from 40 historical world-wide earthquakes were studied to determine the characteristics, geologic environments, and hazards of landslides caused by seismic events. This sample was supplemented with intensity data from several hundred US earthquakes to study relations between landslide distribution and seismic parameters. Correlations between magnitude (M) and landslide distribution show that the maximum area likely to be affected by landslides in a seismic event increases from approximately 0 at M = 4.0 to 500 000 km2 at M = 9.2. Each type of earthquake-induced landslide occurs in a particular suite of geologic environments. -from Author

  10. Earthquake engineering in Peru

    USGS Publications Warehouse

    Vargas, N.J

    1983-01-01

    During the last decade, earthquake engineering research in Peru has been carried out at the Catholic University of Peru and at the Universidad Nacional de Ingeniería (UNI). The Geophysical Institute (IGP), under the auspices of the Organization of American States (OAS), has initiated in Peru other efforts in regional seismic hazard assessment programs with direct impact on the earthquake engineering program. Further details on these programs have been reported by L. Ocola in the Earthquake Information Bulletin, January-February 1982, vol. 14, no. 1, pp. 33-38.

  11. Earthquakes and emergence

    NASA Astrophysics Data System (ADS)

    Earthquakes and emerging infections may not have a direct cause and effect relationship like tax evasion and jail, but new evidence suggests that there may be a link between the two human health hazards. Various media accounts have cited a massive 1993 earthquake in Maharashtra as a potential catalyst of the recent outbreak of plague in India that has claimed more than 50 lives and alarmed the world. The hypothesis is that the earthquake may have uprooted underground rat populations that carry the fleas infected with the bacterium that causes bubonic plague and can lead to the pneumonic form of the disease that is spread through the air.

  12. An Atlas of ShakeMaps for Selected Global Earthquakes

    USGS Publications Warehouse

    Allen, Trevor I.; Wald, David J.; Hotovec, Alicia J.; Lin, Kuo-Wan; Earle, Paul; Marano, Kristin D.

    2008-01-01

    An atlas of maps of peak ground motions and intensity 'ShakeMaps' has been developed for almost 5,000 recent and historical global earthquakes. These maps are produced using established ShakeMap methodology (Wald and others, 1999c; Wald and others, 2005) and constraints from macroseismic intensity data, instrumental ground motions, regional topographically-based site amplifications, and published earthquake-rupture models. Applying the ShakeMap methodology allows a consistent approach to combine point observations with ground-motion predictions to produce descriptions of peak ground motions and intensity for each event. We also calculate an estimated ground-motion uncertainty grid for each earthquake. The Atlas of ShakeMaps provides a consistent and quantitative description of the distribution and intensity of shaking for recent global earthquakes (1973-2007) as well as selected historic events. As such, the Atlas was developed specifically for calibrating global earthquake loss estimation methodologies to be used in the U.S. Geological Survey Prompt Assessment of Global Earthquakes for Response (PAGER) Project. PAGER will employ these loss models to rapidly estimate the impact of global earthquakes as part of the USGS National Earthquake Information Center's earthquake-response protocol. The development of the Atlas of ShakeMaps has also led to several key improvements to the Global ShakeMap system. The key upgrades include: addition of uncertainties in the ground motion mapping, introduction of modern ground-motion prediction equations, improved estimates of global seismic-site conditions (VS30), and improved definition of stable continental region polygons. Finally, we have