Science.gov

Sample records for earthquake monitoring center

  1. Earthquake Monitoring in Haiti

    USGS Multimedia Gallery

    Following the devastating 2010 Haiti earthquake, the USGS has been helping with earthquake awareness and monitoring in the country, with continued support from the U.S. Agency for International Development (USAID). This assistance has helped the Bureau des Mines et de l'Energie (BME) in Port-au-Prin...

  2. Earthquake Observation through Groundwater Monitoring in South Korea

    NASA Astrophysics Data System (ADS)

    Piao, J.; Woo, N. C.

    2014-12-01

    According to previous research, the influence of some earthquakes can be detected through groundwater monitoring, and in some countries groundwater monitoring is used as an important tool for identifying earthquake precursors and supporting prediction efforts. In this study we therefore attempt to capture the anomalous changes in groundwater produced by earthquakes that occurred in Korea, using the National Groundwater Monitoring Network (NGMN). To observe earthquake impacts on groundwater more effectively, we selected from the NGMN 28 stations located in the five earthquake-prone zones of South Korea, and searched for responses to eight earthquakes with M ≥ 2.5 that occurred in the vicinity of these zones in 2012. So far, we have examined the groundwater monitoring data (water level, temperature, and electrical conductivity), which had only been corrected for barometric pressure changes, and found 29 anomalous changes, confirming that groundwater monitoring data can provide valuable information on earthquake effects. To isolate the earthquake effect from the mixed water-level signals, the other signals must be separated from the original data. Periodic signals will be separated from the original data using the Fast Fourier Transform (FFT); we will then attempt to remove the precipitation effect and determine whether the anomalies were generated by earthquakes.
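
    To illustrate the FFT-based separation step described above, the following is a minimal sketch (not the authors' code), assuming an hourly water-level series whose dominant periodic components are the diurnal and semidiurnal tides; the notch width, the periods, and the synthetic example are assumptions.

    import numpy as np

    def remove_periodic(levels, dt_hours=1.0, periods_hours=(24.0, 12.42, 12.0), width=0.02):
        # Notch out narrow frequency bands around the given tidal periods and invert the FFT.
        n = len(levels)
        mean = np.mean(levels)
        spectrum = np.fft.rfft(levels - mean)
        freqs = np.fft.rfftfreq(n, d=dt_hours)            # cycles per hour
        for p in periods_hours:
            f0 = 1.0 / p
            spectrum[np.abs(freqs - f0) < width * f0] = 0.0
        return np.fft.irfft(spectrum, n) + mean

    # Synthetic example: semidiurnal tide + slow trend + a step-like anomaly at sample 800
    t = np.arange(24 * 60)                                # 60 days of hourly samples
    levels = 0.05 * np.sin(2 * np.pi * t / 12.42) + 0.001 * t
    levels = levels + np.where(t >= 800, 0.10, 0.0)       # hypothetical coseismic offset
    residual = remove_periodic(levels)                    # tide removed; offset and trend remain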

  3. An overview of the National Earthquake Information Center acquisition software system, Edge/Continuous Waveform Buffer

    USGS Publications Warehouse

    Patton, John M.; Ketchum, David C.; Guy, Michelle R.

    2015-01-01

    This document provides an overview of the capabilities, design, and use cases of the data acquisition and archiving subsystem at the U.S. Geological Survey National Earthquake Information Center. The Edge and Continuous Waveform Buffer software supports the National Earthquake Information Center’s worldwide earthquake monitoring mission in direct station data acquisition, data import, short- and long-term data archiving, data distribution, query services, and playback, among other capabilities. The software design and architecture can be configured to support acquisition and (or) archiving use cases. The software continues to be developed in order to expand the acquisition, storage, and distribution capabilities.

  4. Twitter earthquake detection: Earthquake monitoring in a social world

    USGS Publications Warehouse

    Earle, Paul; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
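
    As a rough illustration of the short-term-average / long-term-average detection idea described above (a sketch, not the USGS implementation), the code below applies an STA/LTA ratio to a per-minute count of tweets containing the word "earthquake"; the window lengths, threshold, and synthetic data are assumptions.

    import numpy as np

    def sta_lta_detect(counts, sta_len=2, lta_len=60, threshold=8.0):
        # Return the minutes at which the STA/LTA ratio of the tweet-count series exceeds the threshold.
        counts = np.asarray(counts, dtype=float)
        detections = []
        for i in range(lta_len, len(counts)):
            sta = counts[i - sta_len + 1 : i + 1].mean()      # short window: recent burst
            lta = counts[i - lta_len + 1 : i + 1].mean()      # long window: background chatter
            if lta > 0 and sta / lta > threshold:
                detections.append(i)
        return detections

    # Example: Poisson background chatter with a burst starting at minute 200
    rng = np.random.default_rng(0)
    counts = rng.poisson(5, size=300)
    counts[200:205] += np.array([200, 150, 100, 60, 30])
    print(sta_lta_detect(counts))                             # minutes flagged as a possible event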

  5. Remote sensing hazard monitoring and assessment in Yushu earthquake disaster

    NASA Astrophysics Data System (ADS)

    Wen, Qi; Xu, Feng; Chen, Shirong

    2011-12-01

    The 2010 Yushu earthquake (magnitude 7.1 on the Richter scale) caused a huge loss of life and property in China. The National Disaster Reduction Center of China carried out disaster assessment using remote sensing images and field investigation. A preliminary judgment of the disaster scope and damage extent was obtained by change detection. The built-up area of the hard-hit town of Jiegu was partitioned into a three-level grid in airborne remote sensing images by street, type of use, and structure, and about 685 grid cells were numbered. Hazard assessment expert groups were sent to carry out field investigations for each grid cell. The scope of housing damage and the extent of loss were then refined by integrating the field investigation data with information reported by local governments. Although remote sensing technology has played an important role in monitoring and assessing major disasters, the automation of the disaster information extraction workflow, the three-dimensional disaster monitoring mode, and the bidirectional feedback mechanism between products and services still need to be further improved.

  6. Real-time earthquake monitoring: Early warning and rapid response

    NASA Technical Reports Server (NTRS)

    1991-01-01

    A panel was established to investigate the subject of real-time earthquake monitoring (RTEM) and suggest recommendations on the feasibility of using a real-time earthquake warning system to mitigate earthquake damage in regions of the United States. The findings of the investigation and the related recommendations are described in this report. A brief review of existing real-time seismic systems is presented with particular emphasis given to the current California seismic networks. Specific applications of a real-time monitoring system are discussed along with issues related to system deployment and technical feasibility. In addition, several non-technical considerations are addressed including cost-benefit analysis, public perceptions, safety, and liability.

  7. Monitoring seismic velocity changes associated with the 2014 Mw 6.0 South Napa earthquake

    NASA Astrophysics Data System (ADS)

    Taira, T.; Brenguier, F.; Kong, Q.

    2014-12-01

    We analyze the ambient seismic noise wavefield to explore temporal variations in seismic velocity associated with the 24 August 2014 Mw 6.0 South Napa earthquake. We estimate relative velocity changes (dv/v) with MSNoise [Lecocq et al., 2014, SRL] by analyzing continuous waveforms collected at 10 seismic stations located near the epicenter of the 2014 South Napa earthquake. Following Brenguier et al. [2008, Science], our preliminary analysis focuses on the vertical component waveforms in a frequency range of 0.1-0.9 Hz. We determine the reference Green's function (GF) for each station pair as the average of 1-day stacks of GFs obtained in the time interval January through July 2014. We estimate the time history of dv/v by measuring delay times between 10-day stacks of GF and the reference GF. We find about 0.07% velocity reduction immediately after the 2014 South Napa earthquake by measuring the delay times between stacked and reference GFs. Our preliminary result also reveals a post-seismic relaxation process: the velocity reduction decreases to about 0.04% roughly 20 days after the 2014 South Napa earthquake. We have implemented an automated system to monitor the time history of dv/v (http://earthquakes.berkeley.edu/~taira/SNapa/SNapa_Noise.html) by using waveforms archived at the Northern California Earthquake Data Center. We will characterize the detailed temporal evolution of the velocity change associated with the 2014 South Napa earthquake.
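
    MSNoise itself measures the delay times with a moving-window cross-spectral technique; the sketch below (not the authors' code) illustrates the same underlying idea with the simpler trace-stretching method, grid-searching the stretch factor that best maps the current correlation function onto the reference. The parameters and the synthetic test are assumptions.

    import numpy as np

    def stretching_dvv(reference, current, dt, trials=np.linspace(-0.01, 0.01, 401)):
        # Grid-search dv/v; a positive value means a velocity increase (coda arrives earlier).
        t = np.arange(len(reference)) * dt
        best_dvv, best_cc = 0.0, -1.0
        for dvv in trials:
            stretched = np.interp(t * (1.0 - dvv), t, current)   # map current trace back to reference time
            cc = np.corrcoef(reference, stretched)[0, 1]
            if cc > best_cc:
                best_dvv, best_cc = dvv, cc
        return best_dvv, best_cc

    # Synthetic check: a 0.07% velocity reduction (dv/v = -0.0007) should be recovered
    dt = 0.05
    t = np.arange(0, 120, dt)
    ref = np.sin(2 * np.pi * 0.5 * t) * np.exp(-t / 60.0)        # decaying "coda"
    cur = np.interp(t * (1.0 - 0.0007), t, ref)                  # same coda, slightly delayed
    print(stretching_dvv(ref, cur, dt))                          # approximately (-0.0007, ~1.0)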

  8. USGS Hires Students to Help Improve Earthquake Monitoring

    USGS Multimedia Gallery

    A USGS student employee and sophomore at the Colorado School of Mines, was among the first hired by USGS using Recovery Act funding to upgrade the seismic stations of the Advanced National Seismic System (ANSS) Backbone. The USGS is using Recovery Act funding to upgrade its earthquake monitoring net...

  9. Overview of the Multidisciplinary Center for Earthquake Engineering Research (MCEER)

    E-print Network

    Bruneau, Michel

    ... operational after an earthquake--namely hospitals, lifeline systems (water and power distribution networks) ... Research, University at Buffalo, 105 Red Jacket Quad, Buffalo, NY 14261, USA ... INTRODUCTION ... at the University at Buffalo, the Center was established in 1986 by the National Science Foundation (NSF) ...

  10. The USGS National Earthquake Information Center's Response to the Wenchuan, China Earthquake

    NASA Astrophysics Data System (ADS)

    Earle, P. S.; Wald, D. J.; Benz, H.; Sipkin, S.; Dewey, J.; Allen, T.; Jaiswal, K.; Buland, R.; Choy, G.; Hayes, G.; Hutko, A.

    2008-12-01

    Immediately after detecting the May 12th, 2008 Mw 7.9 Wenchuan Earthquake, the USGS National Earthquake Information Center (NEIC) began a coordinated effort to understand and communicate the earthquake's seismological characteristics, tectonic context, and humanitarian impact. NEIC's initial estimates of magnitude and location were distributed within 30 minutes of the quake by e-mail and text message to 70,000 users via the Earthquake Notification System. The release of these basic parameters automatically triggered the generation of more sophisticated derivative products that were used by relief and government agencies to plan their humanitarian response to the disaster. Body-wave and centroid moment tensors identified the earthquake's mechanism. Predictive ShakeMaps provided the first estimates of the geographic extent and amplitude of shaking. The initial automated population exposure estimate generated and distributed by the Prompt Assessment of Global Earthquakes for Response (PAGER) system stated that 1.2 million people were exposed to severe-to-extreme shaking (Modified Mercalli Intensity VIII or greater), indicating a large-scale disaster had occurred. NEIC's modeling of the mainshock and aftershocks was continuously refined and expanded. The length and orientation of the fault were determined from aftershocks, finite-fault models, and back-projection source imaging. Firsthand accounts of shaking intensity were collected and mapped by the "Did You Feel It" system. These results were used to refine our ShakeMaps and PAGER exposure estimates providing a more accurate assessment of the extent and enormity of the disaster. The products were organized and distributed in an event-specific summary poster and via the USGS Earthquake Program web pages where they were viewed by millions and reproduced by major media outlets (over 1/2 billion hits were served that month). Rather than just a point showing magnitude and epicenter, several of the media's schematic maps showed both intensity distribution and population exposure, achieving a significant communication goal of the PAGER project and the Advanced National Seismic System (ANSS).

  11. Towards a Better Earthquake and Tsunami Monitoring System: Indian Effort

    NASA Astrophysics Data System (ADS)

    Bansal, B.; Gupta, G.

    2005-12-01

    The December 26, 2004 earthquake (Mw 9.3) in the Andaman-Sumatra subduction zone was unprecedented in its size, rupture extent, and tsunamigenic capacity. Knowledge that this event had no known predecessor was part of the reason for the apparent lack of anticipation and preparedness. Clearly, this event has changed the perception of earthquake and tsunami hazard along the Andaman and Nicobar Islands as well as in regions along the southwest coast of India, far removed from the source earthquake. The Government of India is embarking on a major programme to study earthquake processes in the Andaman and Nicobar region, part of the subduction zone stretching about 1000 km, most of which was affected by the earthquake. These efforts include expansion and modernization of the existing seismic network, continuous and campaign-mode GPS surveys, geological and geophysical investigations, and inundation mapping. Research programmes funded by the DST aim at an improved understanding of the seismic sources, their past behavior, rupture characteristics, the physical processes related to earthquakes in this subduction zone, and the style of deformation measured with geodetic techniques. A network of more than 100 seismological stations currently operates in India, most of them run by the India Meteorological Department, the nodal agency for seismological studies. Linking and modernization of these observatories, and the addition of new ones, are underway. The station at Port Blair has been upgraded to broadband, and a good network of portable stations is now operational. Added to these are the GPS campaign-mode surveys being carried out along the entire arc. Establishment of a multiparametric geophysical observatory to monitor physical processes prior to large earthquakes is another planned experiment. The proposed structure of the Tsunami Warning System also involves the establishment of more tide gauges and pressure sensors at strategic locations. It is expected that the data generated through these various research initiatives will provide the necessary scientific basis for the proposed warning system.

  12. Recent Progress and Development on Multi-parameters Remote Sensing Application in Earthquake Monitoring in China

    NASA Astrophysics Data System (ADS)

    Shen, Xuhui; Zhang, Xuemin; Hong, Shunying; Jing, Feng; Zhao, Shufan

    2014-05-01

    In the last ten years, several national research plans and scientific projects on remote sensing applications in earthquake monitoring have been implemented in China. With the aim of advancing earthquake monitoring capability and searching for a path towards earthquake prediction, satellite electromagnetic, satellite infrared, and D-InSAR technologies were developed systematically, and remarkable progress was achieved through statistical studies of historical earthquakes and an initial summary of space-based precursory characteristics, laying the foundation for gradually promoting practical use. On the basis of this work, the argumentation for the first space-based platform of China's earthquake stereoscopic observation system has been completed, and an integrated earthquake remote sensing application system has been comprehensively designed. Developing a space-based earthquake observation system has become a major trend of technological development in earthquake monitoring and prediction. We shall place more emphasis on the construction of the space segment of China's earthquake stereoscopic observation system and on imminent major scientific projects, including an earthquake deformation observation system and application research combining InSAR, satellite gravity, and GNSS, aimed at medium- and long-term earthquake monitoring and forecasting; an infrared observation and technical system and application research, aimed at medium- and short-term earthquake monitoring and forecasting; and a satellite-based electromagnetic observation and technical system and application system, aimed at short-term and imminent earthquake monitoring.

  13. Enhanced Earthquake Monitoring in the European Arctic

    NASA Astrophysics Data System (ADS)

    Antonovskaya, Galina; Konechnaya, Yana; Kremenetskaya, Elena O.; Asming, Vladimir; Kværna, Tormod; Schweitzer, Johannes; Ringdal, Frode

    2015-03-01

    This paper presents preliminary results from a cooperative initiative between the Norwegian Seismic Array (NORSAR) institution in Norway and seismological institutions in NW Russia (Arkhangelsk and Apatity). We show that the joint processing of data from the combined seismic networks of all these institutions leads to a considerable increase in the number of located seismic events in the European Arctic compared to standard seismic bulletins such as the NORSAR reviewed regional seismic bulletin and the Reviewed Event Bulletin (REB) issued by the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) organization. The increase is particularly pronounced along the Gakkel Ridge to the north of the Svalbard and Franz-Josef Land archipelagos. We also note that the vast majority of the events along the Gakkel Ridge have been located slightly to the south of the ridge. We interpret this as an effect of the lack of recording stations closer to and north of the Gakkel Ridge, and of the use of a one-dimensional velocity model which is not fully representative of travel times along the observed propagation paths. We conclude that while the characteristics of earthquake activity in the European Arctic are currently poorly known, the knowledge can be expected to be significantly improved by establishing the appropriate cooperative seismic recording infrastructures.

  14. Helping safeguard Veterans Affairs' hospital buildings by advanced earthquake monitoring

    USGS Publications Warehouse

    Kalkan, Erol; Banga, Krishna; Ulusoy, Hasan S.; Fletcher, Jon Peter B.; Leith, William S.; Blair, James L.

    2012-01-01

    In collaboration with the U.S. Department of Veterans Affairs (VA), the National Strong Motion Project of the U.S. Geological Survey has recently installed sophisticated seismic systems that will monitor the structural integrity of hospital buildings during earthquake shaking. The new systems have been installed at more than 20 VA medical campuses across the country. These monitoring systems, which combine sensitive accelerometers and real-time computer calculations, are capable of determining the structural health of each structure rapidly after an event, helping to ensure the safety of patients and staff.

  15. Earthquake Monitoring at Different Scales with Seiscomp3

    NASA Astrophysics Data System (ADS)

    Grunberg, M.; Engels, F.

    2013-12-01

    In the last few years, the French National Network of Seismic Survey (BCSF-RENASS) had to modernize its old and aging earthquake monitoring system, which came from an in-house development. After trying and intensively testing several real-time frameworks such as EarthWorm and Seiscomp3, we finally adopted Seiscomp3 in 2012. Our current system runs two pipelines in parallel: the first is tuned at a global scale to monitor world seismicity (for events with magnitude > 5.5), and the second is tuned at a national scale for monitoring metropolitan France. The seismological stations used for the "world" pipeline come mainly from the Global Seismographic Network (GSN), whereas the "national" pipeline uses stations from the RENASS short-period network and from the RESIF broadband network. More recently we have started to tune Seiscomp3 at a smaller scale to monitor in real time a geothermal project (an R&D program in deep geothermal energy) in the northeastern part of France. Besides using the real-time monitoring capabilities of Seiscomp3, we have also used a very handy feature to play back a four-month dataset at a local scale for the Rambervillers earthquake (22/02/2003, Ml=5.4), leading to roughly 2000 aftershock detections and locations.

  16. Earthquakes

    MedlinePLUS


  17. Web Services and Other Enhancements at the Northern California Earthquake Data Center

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Zuzlewski, S.; Allen, R. M.

    2012-12-01

    The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, or MiniSEED depending on the service, and are compatible with the equivalent IRIS DMC web services. The NCEDC is currently providing the following Web Services: (1) Station inventory and channel response information delivered in StationXML format, (2) Channel response information delivered in RESP format, (3) Time series availability delivered in text and XML formats, (4) Single channel and bulk data request delivered in MiniSEED format. The NCEDC is also developing a rich Earthquake Catalog Web Service to allow users to query earthquake catalogs based on selection parameters such as time, location or geographic region, magnitude, depth, azimuthal gap, and rms. It will return (in QuakeML format) user-specified results that can include simple earthquake parameters, as well as observations such as phase arrivals, codas, amplitudes, and computed parameters such as first motion mechanisms, moment tensors, and rupture length. The NCEDC will work with both IRIS and the International Federation of Digital Seismograph Networks (FDSN) to define a uniform set of web service specifications that can be implemented by multiple data centers to provide users with a common data interface across data centers. The NCEDC now hosts earthquake catalogs and waveforms from the US Department of Energy (DOE) Enhanced Geothermal Systems (EGS) monitoring networks. These data can be accessed through the above web services and through special NCEDC web pages.
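
    For orientation, the sketch below shows how FDSN-style web services of this kind are commonly queried from ObsPy; the "NCEDC" client shortcut, the station codes, and the origin time used here are assumptions that should be checked against the data center's own documentation.

    from obspy import UTCDateTime
    from obspy.clients.fdsn import Client

    client = Client("NCEDC")                         # Northern California Earthquake Data Center
    t0 = UTCDateTime("2014-08-24T10:20:44")          # approximate 2014 South Napa origin time

    # Station metadata (served as StationXML)
    inventory = client.get_stations(network="BK", station="BKS", level="response")

    # Continuous waveforms (served as miniSEED)
    stream = client.get_waveforms("BK", "BKS", "*", "BHZ", t0, t0 + 300)

    # Earthquake catalog query (served as QuakeML)
    catalog = client.get_events(starttime=t0 - 60, endtime=t0 + 3600, minmagnitude=3.0)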

  18. Lessons Learned from Creating the Public Earthquake Resource Center at CERI

    NASA Astrophysics Data System (ADS)

    Patterson, G. L.; Michelle, D.; Johnston, A.

    2004-12-01

    The Center for Earthquake Research and Information (CERI) at the University of Memphis opened the Public Earthquake Resource Center (PERC) in May 2004. The PERC is an interactive display area that was designed to increase awareness of seismology, Earth Science, earthquake hazards, and earthquake engineering among the general public and K-12 teachers and students. Funding for the PERC is provided by the US Geological Survey, the NSF-funded Mid America Earthquake Center, and the University of Memphis, with input from the Incorporated Research Institutions for Seismology. Additional space at the facility houses local offices of the US Geological Survey. PERC exhibits are housed in a remodeled residential structure at CERI that was donated by the University of Memphis and the State of Tennessee. Exhibits were designed and built by CERI and US Geological Survey staff and faculty with the help of experienced museum display subcontractors. The 600 square foot display area interactively introduces the basic concepts of seismology, real-time seismic information, seismic network operations, paleoseismology, building response, and historical earthquakes. Display components include three 22" flat screen monitors, a touch sensitive monitor, 3 helicorder elements, oscilloscope, AS-1 seismometer, life-sized liquefaction trench, liquefaction shake table, and building response shake table. All displays include custom graphics, text, and handouts. The PERC website at www.ceri.memphis.edu/perc also provides useful information such as tour scheduling, ask a geologist, links to other institutions, and will soon include a virtual tour of the facility. Special consideration was given to address State science standards for teaching and learning in the design of the displays and handouts. We feel this consideration is pivotal to the success of any grassroots Earth Science education and outreach program and represents a valuable lesson that has been learned at CERI over the last several years. Another critical lesson that has been learned is to employ K-12 education professionals and utilize undergraduate and graduate student workers in the University's Department of Education. Such staff members are keenly aware of the pressures and needs in diverse communities such as Shelby County, Tennessee, and are uniquely suited to design and implement new and innovative programs that provide substantive short-term user benefits and promote long-term relationships with the K-12 teachers, students, and teachers' organizations.

  19. Quantifying 10 years of improvements in earthquake monitoring in the Caribbean region

    USGS Publications Warehouse

    McNamara, Daniel E.; Hillebrandt-Andrade, Christa; Saurel, Jean-Marie; Huerfano-Moreno, V.; Lynch, Lloyd

    2015-01-01

    Over 75 tsunamis have been documented in the Caribbean and adjacent regions during the past 500 years. Since 1500, at least 4484 people are reported to have perished in these killer waves. Hundreds of thousands are currently threatened along the Caribbean coastlines. Were a great tsunamigenic earthquake to occur in the Caribbean region today, the effects would potentially be catastrophic due to an increasingly vulnerable region that has seen significant population increases in the past 40–50 years and currently hosts an estimated 500,000 daily beach visitors from North America and Europe, a majority of whom are not likely aware of tsunami and earthquake hazards. Following the magnitude 9.1 Sumatra–Andaman Islands earthquake of 26 December 2004, the United Nations Educational, Scientific and Cultural Organization (UNESCO) Intergovernmental Coordination Group (ICG) for the Tsunami and other Coastal Hazards Early Warning System for the Caribbean and Adjacent Regions (CARIBE-EWS) was established and developed minimum performance standards for the detection and analysis of earthquakes. In this study, we model earthquake-magnitude detection threshold and P-wave detection time and demonstrate that the requirements established by the UNESCO ICG CARIBE-EWS are met with 100% of the network operating. We demonstrate that earthquake-monitoring performance in the Caribbean Sea region has improved significantly in the past decade as the number of real-time seismic stations available to the National Oceanic and Atmospheric Administration tsunami warning centers has increased. We also identify weaknesses in the current international network and provide guidance for selecting the optimal distribution of seismic stations contributed from existing real-time broadband national networks in the region.
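
    As a toy illustration of one ingredient of such performance modeling (not the authors' method), the sketch below estimates how long after origin time a required number of stations have received the P wave, assuming a uniform P velocity, surface paths, and purely hypothetical station coordinates.

    import math

    def p_detection_time(epicenter, stations, n_required=4, vp_km_s=6.5):
        # Return the delay (s) until n_required stations have received the P wave.
        lat0, lon0 = epicenter
        delays = []
        for lat, lon in stations:
            # Small-angle great-circle distance in km (adequate for a sketch)
            dlat = math.radians(lat - lat0)
            dlon = math.radians(lon - lon0) * math.cos(math.radians(lat0))
            dist_km = 6371.0 * math.hypot(dlat, dlon)
            delays.append(dist_km / vp_km_s)
        return sorted(delays)[n_required - 1]

    # Hypothetical epicenter northeast of Puerto Rico and four illustrative stations
    stations = [(18.1, -66.2), (18.5, -67.1), (19.7, -70.6), (17.0, -61.8)]
    print(round(p_detection_time((19.0, -65.0), stations), 1), "s until the 4th P arrival")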

  20. Data sources: Alaska earthquake data from the Alaska Earthquake Information Center (www.aeic.alaska.edu)

    E-print Network

    West, Michael

    Data sources: Alaska earthquake data from the Alaska Earthquake Information Center (www.aeic.alaska.edu); Lower 48 earthquake data drawn from the ANSS composite catalog (http://www.ncedc.org/cnss/catalog-search.html). Earthquake occurrence rate in Alaska, 1960 ...

  1. Earthquakes.

    ERIC Educational Resources Information Center

    Pakiser, Louis C.

    One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake ...

  2. Earthquake!

    ERIC Educational Resources Information Center

    Markle, Sandra

    1987-01-01

    A learning unit about earthquakes includes activities for primary grade students, including making inferences and defining operationally. Task cards are included for independent study on earthquake maps and earthquake measuring. (CB)

  3. Earthquakes

    MedlinePLUS

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  4. Monitoring of ULF (ultra-low-frequency) Geomagnetic Variations Associated with Earthquakes

    PubMed Central

    Hayakawa, Masashi; Hattori, Katsumi; Ohta, Kenji

    2007-01-01

    ULF (ultra-low-frequency) electromagnetic emission has recently been recognized as one of the most promising candidates for short-term earthquake prediction. This paper reviews previous convincing evidence of the presence of ULF emissions before a few large earthquakes. Then, we present our network of ULF monitoring stations in the Tokyo area by describing our ULF magnetic sensors, and we finally present a few of the latest results on seismogenic electromagnetic emissions for recent large earthquakes obtained with the use of sophisticated signal processing.

  5. Real-time earthquake monitoring using a search engine method

    PubMed Central

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-01-01

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake’s parameters in <1 s after receiving the long-period surface wave data. PMID:25472861

  6. Earthquakes

    ERIC Educational Resources Information Center

    Roper, Paul J.; Roper, Jere Gerard

    1974-01-01

    Describes the causes and effects of earthquakes, defines the meaning of magnitude (measured on the Richter Magnitude Scale) and intensity (measured on a modified Mercalli Intensity Scale) and discusses earthquake prediction and control. (JR)

  7. The Global Seismographic Network

    E-print Network

    ... and to assess earthquake hazards and risks, particularly because earthquakes outside the boundaries ... The Global Seismographic Network: The U.S. Geological Survey's National Earthquake Information Center reports on more than 30,000 earthquakes a year worldwide, automatically detecting, locating ...

  8. Real-time earthquake monitoring using a search engine method.

    PubMed

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-01-01

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake's parameters in <1 s after receiving the long-period surface wave data. PMID:25472861
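
    A minimal sketch of the underlying idea (matching an observed waveform against a pre-computed database of synthetic seismograms tagged with source parameters) is given below; the database here is tiny and random, the brute-force scoring loop stands in for the fast indexed search the authors describe, and all names and values are hypothetical.

    import numpy as np

    rng = np.random.default_rng(1)

    def normalize(w):
        w = w - w.mean()
        return w / (np.linalg.norm(w) + 1e-12)

    # Hypothetical database: (normalized synthetic waveform, source parameters) pairs
    database = [
        (normalize(rng.standard_normal(512)),
         {"strike": rng.uniform(0, 360), "dip": rng.uniform(0, 90), "Mw": rng.uniform(5, 7)})
        for _ in range(1000)
    ]

    def search(observed):
        # Return the source parameters of the most similar database waveform and its score.
        obs = normalize(observed)
        scores = [float(np.dot(obs, wf)) for wf, _ in database]   # correlation coefficient
        best = int(np.argmax(scores))
        return database[best][1], scores[best]

    # Pretend entry 300 is the "true" event, observed with added noise
    observed = database[300][0] + 0.1 * rng.standard_normal(512)
    print(search(observed))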

  9. Advanced Real-time Monitoring System and Simulation Researches for Earthquakes and Tsunamis in Japan -Towards Disaster Mitigation on Earthquakes and Tsunamis-

    NASA Astrophysics Data System (ADS)

    Hyodo, M.; Kaneda, Y.; Takahashi, N.; Baba, T.; Hori, T.; Kawaguchi, K.; Araki, E.; Matsumoto, H.; Nakamura, T.; Kamiya, S.; Ariyoshi, K.; Nakano, M.; Choi, J. K.; Nishida, S.

    2014-12-01

    How can damage from earthquakes and tsunamis be mitigated and reduced? This is a very important and unavoidable question for Japan and other countries with high seismicity. Based on lessons learned from the 2004 Sumatra earthquake and tsunami and the 2011 East Japan earthquake and tsunami, we recognized the importance of real-time monitoring of these natural hazards. As a real-time monitoring system, DONET1 (Dense Ocean floor Network for Earthquakes and Tsunamis) was deployed, and DONET2 is being developed, around the Nankai trough in southwestern Japan for seismology and earthquake/tsunami early warning. Based on simulation research, DONET1 and DONET2, with multiple kinds of sensors such as broadband seismometers and precise pressure gauges, are expected to monitor slow events such as low-frequency tremors and slow earthquakes for estimating the seismic stage, that is, whether the region is in the inter-seismic or pre-seismic stage. In advanced simulation research, such as modeling the recurrence cycle of megathrust earthquakes, data assimilation is a very powerful tool for improving reliability. Furthermore, simulations of tsunami inundation, of the seismic response of buildings and cities, and agent-based simulations are very important for future disaster mitigation programs and related measures. Finally, real-time monitoring data and advanced simulations will be integrated for precise earthquake/tsunami early warning and for estimating damage in future compound earthquake and tsunami disasters. We will introduce the present progress of this advanced research and the future scope of disaster mitigation research on earthquakes and tsunamis.

  10. Simplifying Construction of Complex Workflows for Non-Expert Users of the Southern California Earthquake Center Community Modeling Environment

    E-print Network

    Kim, Jihie

    ... (3) Southern California Earthquake Center, USC, Los Angeles CA, 90089, USA (Corresponding Author) ... of the Community Modeling Environment developed by the Southern California Earthquake Center, these tools ... the Southern California Earthquake Center (SCEC) created the Community Modeling Environment (SCEC/CME) [14]. The CME ...

  11. Twitter Seismology: Earthquake Monitoring and Response in a Social World

    NASA Astrophysics Data System (ADS)

    Bowden, D. C.; Earle, P. S.; Guy, M.; Smoczyk, G.

    2011-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment USGS earthquake response products and the delivery of hazard information. The potential uses of Twitter for earthquake response include broadcasting earthquake alerts, rapidly detecting widely felt events, qualitatively assessing earthquake damage effects, communicating with the public, and participating in post-event collaboration. Several seismic networks and agencies are currently distributing Twitter earthquake alerts including the European-Mediterranean Seismological Centre (@LastQuake), Natural Resources Canada (@CANADAquakes), and the Indonesian meteorological agency (@infogempabmg); the USGS will soon distribute alerts via the @USGSted and @USGSbigquakes Twitter accounts. Beyond broadcasting alerts, the USGS is investigating how to use tweets that originate near the epicenter to detect and characterize shaking events. This is possible because people begin tweeting immediately after feeling an earthquake, and their short narratives and exclamations are available for analysis within tens of seconds of the origin time. Using five months of tweets that contain the word "earthquake" and its equivalent in other languages, we generate a tweet-frequency time series. The time series clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a simple Short-Term-Average / Long-Term-Average algorithm similar to that commonly used to detect seismic phases. As with most auto-detection algorithms, the parameters can be tuned to catch more or fewer events at the cost of more or fewer false triggers. When tuned to a moderate sensitivity, the detector found 48 globally-distributed, confirmed seismic events with only 2 false triggers. A space-shuttle landing and "The Great California ShakeOut" caused the false triggers. This number of detections is very small compared to the 5,175 earthquakes in the USGS PDE global earthquake catalog for the same five month time period, and no accurate location or magnitude can be assigned based on Tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 80% occurred within 2 minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided (very) short first-impression narratives from people who experienced the shaking. The USGS will continue investigating how to use Twitter and other forms of social media to augment its current suite of seismographically derived products.

  12. Validation of a wireless modular monitoring ... (The John A. Blume Earthquake Engineering Center, Stanford University, Stanford, CA 94305)

    E-print Network

    Lynch, Jerome P.

    *jplynch@stanford.edu; phone 1-650-723-6213; fax 1-650-725-9755; The John A. Blume Earthquake Engineering Center, Stanford University; Stanford, CA 94305 ... Validation of a wireless modular monitoring ... of Electrical Engineering, Stanford University. ABSTRACT: A wireless sensing unit for use in a Wireless Modular ...

  13. Recent Experiences Operating a Large, International Network of Electromagnetic Earthquake Monitors

    NASA Astrophysics Data System (ADS)

    Bleier, T.; Dunson, J. C.; Lemon, J.

    2014-12-01

    Leading a 5-nation international collaboration, QuakeFinder currently has a network of 168 instruments along with a Data Center that processes the 10 GB of data each day, 7 days a week. Each instrument includes 3-axis induction magnetometers, positive and negative ion sensors, and a geophone. These ground instruments are augmented with GOES weather satellite infrared monitoring of California (and in the future—other countries). The nature of the signals we are trying to detect and identify to enable forecasts for significant earthquakes (>M5) involves refining algorithms that both identify quake-related signals at some distance and remove a myriad of natural and anthropogenic noise. Maximum detection range was further investigated this year. An initial estimated maximum detection distance of 10 miles (16 km) was challenged with the onset of a M8.2 quake near Iquique, Chile on April 1, 2014. We will discuss the different strategies used to push the limits of detection for this quake which was 93 miles (149 km) from the instrument that had just been installed 2 months before the quake. Identifying and masking natural and man-made noise to reduce the number of misses and false alarms, and to increase the number of "hits" in a limited earthquake data set continues to be a top priority. Several novel approaches were tried, and the resulting progress will be discussed.

  14. Everyday Earthquakes.

    ERIC Educational Resources Information Center

    Svec, Michael

    1996-01-01

    Describes methods to access current earthquake information from the National Earthquake Information Center. Enables students to build genuine learning experiences using real data from earthquakes that have recently occurred. (JRH)

  15. Romanian Data Center: A modern way for seismic monitoring

    NASA Astrophysics Data System (ADS)

    Neagoe, Cristian; Marius Manea, Liviu; Ionescu, Constantin

    2014-05-01

    The main seismic survey of Romania is performed by the National Institute for Earth Physics (NIEP), which operates a real-time digital seismic network. The NIEP real-time network currently consists of 102 stations and two seismic arrays equipped with different high-quality digitizers (Kinemetrics K2, Quanterra Q330, Quanterra Q330HR, PS6-26, Basalt), broadband and short-period seismometers (CMG3ESP, CMG40T, KS2000, KS54000, CMG3T, STS2, SH-1, S13, Mark L4C, Ranger, GS21, Mark L22) and acceleration sensors (Kinemetrics Episensor). The data are transmitted to the National Data Center (NDC) and to the Eforie Nord (EFOR) Seismic Observatory. EFOR is the back-up for the NDC and also a monitoring center for Black Sea tsunami events. NIEP is a data acquisition node for the seismic network of Moldova (FDSN code MD), composed of five seismic stations. NIEP has also installed eight seismic stations equipped with broadband sensors and Episensors in the northern part of Bulgaria, and nine accelerometers (Episensors) in nine districts along the Danube River. All the data are acquired at NIEP for the Early Warning System and for the primary estimation of earthquake parameters. Real-time acquisition and data exchange are done with Antelope software and SeedLink (from Seiscomp3). Real-time data communication is ensured by different types of transmission: GPRS, satellite, radio, Internet, and a dedicated line provided by a governmental network. For data processing and analysis at the two data centers, Antelope 5.2 is used, running on three workstations: one on a CentOS platform and two on MacOS. A Seiscomp3 server also stands as back-up for Antelope 5.2. Both the acquisition and the analysis systems produce information about local and global parameters of earthquakes. In addition, Antelope is used for manual processing (event association, magnitude calculation, database creation, distribution of seismic bulletins, calculation of PGA and PGV, etc.), for generating ShakeMap products, and for interaction with global data centers. The National Data Center has developed tools to centralize data from software such as Antelope and Seiscomp3. These tools allow rapid distribution to the public of information about damage observed after an earthquake. Another feature of the developed application is the alerting of designated persons, via email and SMS, based on the earthquake parameters. In parallel, Seiscomp3 sends automatic notifications (emails) with the earthquake parameters. The real-time seismic network and the acquisition and data-processing software used at the National Data Center have increased the number of events detected locally and globally, improved the quality of the parameters obtained by data processing, and potentially increased national and international visibility.
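
    As a concrete illustration of the kind of real-time SeedLink acquisition mentioned above, here is a minimal sketch using ObsPy's easySeedLink client; the server address, station codes, and access policy are assumptions that would need to be confirmed with the network operator.

    from obspy.clients.seedlink.easyseedlink import create_client

    def on_data(trace):
        # Called for every miniSEED record received; a real system would feed a
        # picker/associator here instead of printing.
        print(trace.id, trace.stats.starttime, trace.stats.npts, "samples")

    client = create_client("example-seedlink.server.org:18000", on_data=on_data)
    client.select_stream("RO", "MLR", "HHZ")   # hypothetical network/station/channel selection
    client.run()                               # blocks, delivering traces as they arrive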

  16. Onsite infectious agents and toxins monitoring in 12 May Sichuan earthquake affected areas.

    PubMed

    Yao, Maosheng; Zhu, Tong; Li, Kejun; Dong, Shuofei; Wu, Yan; Qiu, Xinghua; Jiang, Bo; Chen, Liansheng; Zhen, Shiqi

    2009-11-01

    At 14:28 on 12 May 2008, Sichuan Province of China suffered a devastating earthquake measuring 8.0 on the Richter scale, with more than 80 000 human lives lost and millions displaced. With inadequate shelter, poor access to health services, and disrupted ecology, the survivors were at enormous risk of infectious disease outbreaks. This work, believed to be unprecedented, was carried out to contain a possible outbreak through onsite monitoring of airborne biological agents in the high-risk areas. In such a mission, a mobile laboratory was developed using a customized vehicle along with state-of-the-art bioaerosol and molecular equipment and tools, and deployed to Sichuan 11 days after the earthquake. Using a high volume bioaerosol sampler (RCS High Flow) and a Button Inhalable Aerosol Sampler equipped with gelatin filters, a total of 55 air samples, 28 of which were filter samples, were collected from rubble, medical centers, and camps of refugees, troops and rescue workers between 23 May and 9 June, 2008. After pre-treatment of the air samples, quantitative polymerase chain reaction (qPCR), gel electrophoresis, limulus amebocyte lysate (LAL) assay and enzyme-linked immunosorbent assay (ELISA) were applied to detect infectious agents and to quantify environmental toxins and allergens. The results revealed that, while high levels of endotoxin (approximately 180 to 975 ng/m3) and (1,3)-beta-d-glucans (approximately 11 to 100 ng/m3) were observed, infectious agents such as Bacillus anthracis, Bordetella pertussis, Neisseria meningitidis, Mycobacterium tuberculosis, influenza A virus, bird flu virus (H5N1), enteric viruses, and Meningococcal meningitis were found below their detection limits. The total bacterial concentrations were found to range from 250 to 2.5 x 10^5 DNA copies/L. Aspergillus fumigatus (Asp f 1) and dust mite allergens (Der p 1 and Der f 1) were also found below their detection limits. PMID:19890556

  17. Current Development at the Southern California Earthquake Data Center (SCEDC)

    NASA Astrophysics Data System (ADS)

    Appel, V. L.; Clayton, R. W.

    2005-12-01

    Over the past year, the SCEDC completed or is near completion of three featured projects: Station Information System (SIS) Development: The SIS will provide users with an interface into complete and accurate station metadata for all current and historic data at the SCEDC. The goal of this project is to develop a system that can interact with a single database source to enter, update and retrieve station metadata easily and efficiently. The system will provide accurate station/channel information for active stations to the SCSN real-time processing system, as well as station/channel information for stations that have parametric data at the SCEDC, i.e., for users retrieving data via STP. Additionally, the SIS will supply information required to generate dataless SEED and COSMOS V0 volumes and allow stations to be added to the system with a minimal, but incomplete, set of information using predefined defaults that can be easily updated as more information becomes available. Finally, the system will facilitate statewide metadata exchange for both real-time processing and provide a common approach to CISN historic station metadata. Moment Tensor Solutions: The SCEDC is currently archiving and delivering Moment Magnitudes and Moment Tensor Solutions (MTS) produced by the SCSN in real-time and post-processing solutions for events spanning back to 1999. The automatic MTS runs on all local events with magnitudes > 3.0, and all regional events > 3.5. The distributed solution automatically creates links from all USGS Simpson Maps to a text e-mail summary solution, creates a .gif image of the solution, and updates the moment tensor database tables at the SCEDC. Searchable Scanned Waveforms Site: The Caltech Seismological Lab has made available 12,223 scanned images of pre-digital analog recordings of major earthquakes recorded in Southern California between 1962 and 1992 at http://www.data.scec.org/research/scans/. The SCEDC has developed a searchable web interface that allows users to search the available files, select multiple files for download and then retrieve a zipped file containing the results. Scanned images of paper records for M>3.5 southern California earthquakes and several significant teleseisms are available for download via the SCEDC through this search tool.

  18. Application of collocated GPS and seismic sensors to earthquake monitoring and early warning.

    PubMed

    Li, Xingxing; Zhang, Xiaohong; Guo, Bofeng

    2013-01-01

    We explore the use of collocated GPS and seismic sensors for earthquake monitoring and early warning. The GPS and seismic data collected during the 2011 Tohoku-Oki (Japan) and the 2010 El Mayor-Cucapah (Mexico) earthquakes are analyzed by using a tightly-coupled integration. The performance of the integrated results is validated by both time and frequency domain analysis. We detect the P-wave arrival and observe small-scale features of the movement from the integrated results and locate the epicenter. Meanwhile, permanent offsets are extracted from the integrated displacements highly accurately and used for reliable fault slip inversion and magnitude estimation. PMID:24284765

  19. Application of Collocated GPS and Seismic Sensors to Earthquake Monitoring and Early Warning

    PubMed Central

    Li, Xingxing; Zhang, Xiaohong; Guo, Bofeng

    2013-01-01

    We explore the use of collocated GPS and seismic sensors for earthquake monitoring and early warning. The GPS and seismic data collected during the 2011 Tohoku-Oki (Japan) and the 2010 El Mayor-Cucapah (Mexico) earthquakes are analyzed by using a tightly-coupled integration. The performance of the integrated results is validated by both time and frequency domain analysis. We detect the P-wave arrival and observe small-scale features of the movement from the integrated results and locate the epicenter. Meanwhile, permanent offsets are extracted from the integrated displacements highly accurately and used for reliable fault slip inversion and magnitude estimation. PMID:24284765
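
    For orientation, the sketch below shows a loosely coupled Kalman-filter fusion of accelerometer data with sparser GPS displacements for a single component; the paper's tightly coupled approach instead combines the raw GPS observations and seismic data at the measurement level, so this is only an illustrative simplification with assumed noise parameters.

    import numpy as np

    def fuse(acc, acc_dt, gps_disp, gps_every, q=1e-4, r=1e-4):
        # State x = [displacement, velocity]; the accelerometer drives the prediction,
        # and each available GPS displacement provides a measurement update.
        x = np.zeros(2)
        P = np.eye(2)
        F = np.array([[1.0, acc_dt], [0.0, 1.0]])    # constant-velocity transition
        B = np.array([0.5 * acc_dt**2, acc_dt])      # acceleration input mapping
        H = np.array([1.0, 0.0])                     # GPS observes displacement only
        Q = q * np.eye(2)
        out = np.zeros(len(acc))
        for k, a in enumerate(acc):
            x = F @ x + B * a                        # predict using the measured acceleration
            P = F @ P @ F.T + Q
            if k % gps_every == 0 and k // gps_every < len(gps_disp):
                y = gps_disp[k // gps_every] - H @ x          # innovation
                S = H @ P @ H + r                             # innovation variance (scalar)
                K = (P @ H) / S                               # Kalman gain, shape (2,)
                x = x + K * y
                P = P - np.outer(K, H @ P)
            out[k] = x[0]
        return out

    # Example rates (assumed): a 100 Hz accelerometer stream corrected by 1 Hz GPS (gps_every=100);
    # the double integration supplies the high-frequency motion while GPS bounds the drift.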

  20. Hiding earthquakes from scrupulous monitoring eyes of dense local seismic networks

    NASA Astrophysics Data System (ADS)

    Bogiatzis, P.; Ishii, M.; Kiser, E.

    2012-12-01

    Accurate and complete cataloguing of aftershocks is essential for a variety of purposes, including the estimation of the mainshock rupture area, the identification of seismic gaps, and seismic hazard assessment. However, immediately following large earthquakes, the seismograms recorded by local networks are noisy, with energy arriving from hundreds of aftershocks, in addition to different seismic phases interfering with one another. This causes deterioration in the performance of detection and location of earthquakes using conventional methods such as the S-P approach. This is demonstrated by results of back-projection analysis of teleseismic data showing that a significant number of events are undetected by the Japan Meteorological Agency within the first twenty-four hours after the Mw9.0 Tohoku-oki, Japan earthquake. The spatial distribution of the hidden events is not arbitrary. Most of these earthquakes are located close to the trench, while some are located at the outer rise. Furthermore, there is a relatively sharp trench-parallel boundary separating the detected and undetected events. We investigate the cause of these hidden earthquakes using forward modeling. The calculation of raypaths for various source locations and takeoff angles with the "shooting" method suggests that this phenomenon is a consequence of the complexities associated with the subducting slab. Laterally varying velocity structure defocuses the seismic energy from shallow earthquakes located near the trench and makes the observation of P and S arrivals difficult at stations situated on mainland Japan. Full waveform simulations confirm these results. Our forward calculations also show that the probability of detection is sensitive to the depth of the event. Shallower events near the trench are more difficult to detect than deeper earthquakes that are located inside the subducting plate, for which the shadow-zone effect diminishes. The modeling effort is expanded to include three-dimensional structure in velocity and intrinsic attenuation to evaluate possible laterally varying patterns. Our study suggests that the phenomenon of hidden earthquakes could be present in other regions around the world with active subduction. Considering that many of these subduction zones are not as well monitored as Japan, the number of missed events, especially after large earthquakes, could be significant. The results of this work can help to identify "blind spots" of present seismic networks, and can contribute to improving monitoring activities.

  1. The Development of an Earthquake Preparedness Plan for a Child Care Center in a Geologically Hazardous Region.

    ERIC Educational Resources Information Center

    Wokurka, Linda

    The director of a child care center at a community college in California developed an earthquake preparedness plan for the center which met state and local requirements for earthquake preparedness at schools. The plan consisted of: (1) the identification and reduction of nonstructural hazards in classrooms, office, and staff rooms; (2) storage of…

  2. 88 hours: the U.S. Geological Survey National Earthquake Information Center response to the March 11, 2011 Mw 9.0 Tohoku earthquake

    USGS Publications Warehouse

    Wald, David J.; Hayes, Gavin P.; Benz, Harley M.; Earle, Paul; Briggs, Richard W.

    2011-01-01

    The M 9.0 11 March 2011 Tohoku, Japan, earthquake and associated tsunami near the east coast of the island of Honshu caused tens of thousands of deaths and potentially over one trillion dollars in damage, resulting in one of the worst natural disasters ever recorded. The U.S. Geological Survey National Earthquake Information Center (USGS NEIC), through its responsibility to respond to all significant global earthquakes as part of the National Earthquake Hazards Reduction Program, quickly produced and distributed a suite of earthquake information products to inform emergency responders, the public, the media, and the academic community of the earthquake's potential impact and to provide scientific background for the interpretation of the event's tectonic context and potential for future hazard. Here we present a timeline of the NEIC response to this devastating earthquake in the context of rapidly evolving information emanating from the global earthquake-response community. The timeline includes both internal and publicly distributed products, the relative timing of which highlights the inherent tradeoffs between the requirement to provide timely alerts and the necessity for accurate, authoritative information. The timeline also documents the iterative and evolutionary nature of the standard products produced by the NEIC and includes a behind-the-scenes look at the decisions, data, and analysis tools that drive our rapid product distribution.

  3. Potential utilization of the NASA/George C. Marshall Space Flight Center in earthquake engineering research

    NASA Technical Reports Server (NTRS)

    Scholl, R. E. (editor)

    1979-01-01

    Earthquake engineering research capabilities of the National Aeronautics and Space Administration (NASA) facilities at George C. Marshall Space Flight Center (MSFC), Alabama, were evaluated. The results indicate that the NASA/MSFC facilities and supporting capabilities offer unique opportunities for conducting earthquake engineering research. Specific features that are particularly attractive for large scale static and dynamic testing of natural and man-made structures include the following: large physical dimensions of buildings and test bays; high loading capacity; wide range and large number of test equipment and instrumentation devices; multichannel data acquisition and processing systems; technical expertise for conducting large-scale static and dynamic testing; sophisticated techniques for systems dynamics analysis, simulation, and control; and capability for managing large-size and technologically complex programs. Potential uses of the facilities for near and long term test programs to supplement current earthquake research activities are suggested.

  4. Correlation of major eastern earthquake centers with mafic/ultramafic basement masses

    USGS Publications Warehouse

    Kane, Martin Francis

    1977-01-01

    Extensive gravity highs and associated magnetic anomalies are present in or near seven major eastern North American earthquake areas as defined by Hadley and Devine (1974). The seven include the five largest of the eastern North American earthquake centers. The immediate localities of the gravity anomalies are, however, relatively free of seismicity, particularly the largest events. The anomalies are presumably caused by extensive mafic or ultramafic masses embedded in the crystalline basement. Laboratory experiments show that serpentinized gabbro and dunite fail under stress in a creep mode rather than in a stick-slip mode. A possible explanation of the correlation between the earthquake patterns and the anomalies is that the mafic/ultramafic masses are serpentinized and can only sustain low stress fields, thereby acting to concentrate regional stress outside their boundaries. The proposed model is analogous to the hole-in-plate problem of mechanics, whereby stresses around a hole in a stressed plate may reach values several times the average.
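
    The "hole-in-plate" analogy refers to the classical Kirsch solution for a circular hole in an elastic plate under remote uniaxial tension. The statement below is a standard textbook form of that result, added only for illustration; the symbols (remote stress sigma_0, hole radius a, azimuth theta from the loading axis) are generic and are not taken from Kane (1977).

```latex
% Kirsch solution: hoop stress on the boundary of a circular hole (radius a)
% in a plate under remote uniaxial tension \sigma_0, with \theta measured
% from the loading direction:
\sigma_{\theta\theta}(r=a,\theta) = \sigma_0\,\bigl(1 - 2\cos 2\theta\bigr),
\qquad
\max_{\theta}\,\sigma_{\theta\theta} = 3\,\sigma_0 \quad (\theta = \pm 90^{\circ}).
```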

  5. Earthquake.

    PubMed

    Cowen, A R; Denney, J P

    1994-04-01

    On January 25, 1 week after the most devastating earthquake in Los Angeles history, the Southern California Hospital Council released the following status report: 928 patients evacuated from damaged hospitals. 805 beds available (136 critical, 669 noncritical). 7,757 patients treated/released from EDs. 1,496 patients treated/admitted to hospitals. 61 dead. 9,309 casualties. Where do we go from here? We are still waiting for the "big one." We'll do our best to be ready when Mother Nature shakes, rattles and rolls. The efforts of Los Angeles City Fire Chief Donald O. Manning cannot be overstated. He maintained department command of this major disaster and is directly responsible for implementing the fire department's Disaster Preparedness Division in 1987. Through the chief's leadership and ability to forecast consequences, the city of Los Angeles was better prepared than ever to cope with this horrendous earthquake. We also pay tribute to the men and women who are out there each day, where "the rubber meets the road." PMID:10133439

  6. Novel Algorithms Enabling Rapid, Real-Time Earthquake Monitoring and Tsunami Early Warning Worldwide

    NASA Astrophysics Data System (ADS)

    Lomax, A.; Michelini, A.

    2012-12-01

    We have recently introduced new methods to rapidly determine the tsunami potential and magnitude of large earthquakes (e.g., Lomax and Michelini, 2009ab, 2011, 2012). To validate these methods we have implemented them along with other new algorithms within the Early-est earthquake monitor at INGV-Rome (http://early-est.rm.ingv.it, http://early-est.alomax.net). Early-est is a lightweight software package for real-time earthquake monitoring (including phase picking, phase association and event detection, location, magnitude determination, first-motion mechanism determination, ...), and for tsunami early warning based on discriminants for earthquake tsunami potential. In a simulation using archived broadband seismograms for the devastating M9, 2011 Tohoku earthquake and tsunami, Early-est determines: the epicenter within 3 min after the event origin time, discriminants showing very high tsunami potential within 5-7 min, and magnitude Mwpd(RT) 9.0-9.2 and a correct shallow-thrusting mechanism within 8 min. Real-time monitoring with Early-est gives similar results for most large earthquakes using currently available, real-time seismogram data. Here we summarize some of the key algorithms within Early-est that enable rapid, real-time earthquake monitoring and tsunami early warning worldwide: >>> FilterPicker - a general purpose, broad-band, phase detector and picker (http://alomax.net/FilterPicker); >>> Robust, simultaneous association and location using a probabilistic, global-search; >>> Period-duration discriminants TdT0 and TdT50Ex for tsunami potential available within 5 min; >>> Mwpd(RT) magnitude for very large earthquakes available within 10 min; >>> Waveform P polarities determined on broad-band displacement traces, focal mechanisms obtained with the HASH program (Hardebeck and Shearer, 2002); >>> SeisGramWeb - a portable-device ready seismogram viewer using web-services in a browser (http://alomax.net/webtools/sgweb/info.html). References (see also: http://alomax.net/pub_list.html): Lomax, A. and A. Michelini (2012), Tsunami early warning within 5 minutes, Pure and Applied Geophysics, 169, nnn-nnn, doi: 10.1007/s00024-012-0512-6. Lomax, A. and A. Michelini (2011), Tsunami early warning using earthquake rupture duration and P-wave dominant period: the importance of length and depth of faulting, Geophys. J. Int., 185, 283-291, doi: 10.1111/j.1365-246X.2010.04916.x. Lomax, A. and A. Michelini (2009b), Tsunami early warning using earthquake rupture duration, Geophys. Res. Lett., 36, L09306, doi:10.1029/2009GL037223. Lomax, A. and A. Michelini (2009a), Mwpd: A Duration-Amplitude Procedure for Rapid Determination of Earthquake Magnitude and Tsunamigenic Potential from P Waveforms, Geophys. J. Int.,176, 200-214, doi:10.1111/j.1365-246X.2008.03974.x
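
    For readers unfamiliar with how real-time monitors of this kind trigger on incoming waveforms, the sketch below shows a minimal, generic STA/LTA-style detector in Python. It is not the FilterPicker algorithm referenced above; the window lengths, threshold, and synthetic data are illustrative assumptions only.

```python
import numpy as np

def sta_lta_trigger(trace, dt, sta_win=1.0, lta_win=30.0, threshold=4.0):
    """Return sample indices where the STA/LTA ratio of the squared trace
    exceeds `threshold`. A generic trigger sketch, not FilterPicker."""
    energy = trace.astype(float) ** 2
    n_sta = max(1, int(sta_win / dt))            # short-term window, samples
    n_lta = max(n_sta + 1, int(lta_win / dt))    # long-term window, samples

    # Running means via cumulative sums (simple batch version, not streaming).
    csum = np.cumsum(np.insert(energy, 0, 0.0))
    sta = (csum[n_sta:] - csum[:-n_sta]) / n_sta
    lta = (csum[n_lta:] - csum[:-n_lta]) / n_lta

    # Align both series on the same window end sample and form the ratio.
    sta = sta[n_lta - n_sta:]
    ratio = sta / np.maximum(lta, 1e-20)

    return np.flatnonzero(ratio > threshold) + (n_lta - 1)

# Example: synthetic noise trace with an amplitude step ("event") at t = 60 s.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dt = 0.01                               # 100 samples per second
    noise = rng.normal(0.0, 1.0, 12000)
    noise[6000:] *= 8.0
    picks = sta_lta_trigger(noise, dt)
    print("first trigger at t =", picks[0] * dt if picks.size else None, "s")
```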

  7. EQInfo - earthquakes world-wide

    NASA Astrophysics Data System (ADS)

    Weber, Bernd; Herrnkind, Stephan

    2014-05-01

    EQInfo is a free Android app providing recent earthquake information from various earthquake monitoring centers such as GFZ, EMSC, USGS and others. It allows filtering by agency, region and magnitude, as well as control of the update interval, institute priority and alarm types. Used by more than 25k active users and being in the top ten list of Google Play, EQInfo is one of the most popular apps for earthquake information.

  8. New approach for earthquake/tsunami monitoring using dense GPS networks

    PubMed Central

    Li, Xingxing; Ge, Maorong; Zhang, Yong; Wang, Rongjiang; Xu, Peiliang; Wickert, Jens; Schuh, Harald

    2013-01-01

    In recent times, increasing numbers of high-rate GPS stations have been installed around the world and set up to provide data in real time. These networks provide a great opportunity to quickly capture surface displacements, which makes them important as potential constituents of earthquake/tsunami monitoring and warning systems. Appropriate real-time GPS data analysis with sufficient accuracy for this purpose is a main focus of current GPS research. In this paper we propose an augmented point positioning method for GPS-based hazard monitoring, which can achieve fast or even instantaneous precise positioning without relying on data from a specific reference station. The proposed method overcomes the limitations of the currently most used GPS processing approaches of relative positioning and global precise point positioning. The advantages of the proposed approach are demonstrated using GPS data recorded during the 2011 Tohoku-Oki earthquake in Japan. PMID:24045328

  9. New approach for earthquake/tsunami monitoring using dense GPS networks.

    PubMed

    Li, Xingxing; Ge, Maorong; Zhang, Yong; Wang, Rongjiang; Xu, Peiliang; Wickert, Jens; Schuh, Harald

    2013-01-01

    In recent times, increasing numbers of high-rate GPS stations have been installed around the world and set up to provide data in real time. These networks provide a great opportunity to quickly capture surface displacements, which makes them important as potential constituents of earthquake/tsunami monitoring and warning systems. Appropriate real-time GPS data analysis with sufficient accuracy for this purpose is a main focus of current GPS research. In this paper we propose an augmented point positioning method for GPS-based hazard monitoring, which can achieve fast or even instantaneous precise positioning without relying on data from a specific reference station. The proposed method overcomes the limitations of the currently most used GPS processing approaches of relative positioning and global precise point positioning. The advantages of the proposed approach are demonstrated using GPS data recorded during the 2011 Tohoku-Oki earthquake in Japan. PMID:24045328
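
    The augmented point positioning method itself is not reproduced here. As a generic illustration of the kind of displacement estimate that real-time GPS processing feeds into warning systems, the sketch below simply differences mean positions before and after a hypothetical event time in a 1-Hz position time series; the function name, window lengths, and synthetic numbers are assumptions, not the authors' algorithm.

```python
import numpy as np

def coseismic_offset(times, east, north, up, t_event, pre_win=120.0, post_win=120.0):
    """Estimate a static coseismic offset (metres) from a GPS position time
    series by differencing mean positions before and after t_event.
    A generic illustration, not the augmented point positioning method."""
    pre = (times >= t_event - pre_win) & (times < t_event)
    post = (times > t_event) & (times <= t_event + post_win)
    if not pre.any() or not post.any():
        raise ValueError("not enough samples around the event time")
    return np.array([
        east[post].mean() - east[pre].mean(),
        north[post].mean() - north[pre].mean(),
        up[post].mean() - up[pre].mean(),
    ])  # [dE, dN, dU] in metres

# Synthetic 1-Hz example with a 2.4 m eastward step at t = 300 s.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    t = np.arange(0.0, 600.0, 1.0)
    east = rng.normal(0.0, 0.03, t.size) + np.where(t > 300.0, 2.4, 0.0)
    north = rng.normal(0.0, 0.03, t.size)
    up = rng.normal(0.0, 0.05, t.size)
    print("offset [E, N, U] =", coseismic_offset(t, east, north, up, 300.0))
```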

  10. Quantifying 10 years of Improvements in Earthquake and Tsunami Monitoring in the Caribbean and Adjacent Regions

    NASA Astrophysics Data System (ADS)

    von Hillebrandt-Andrade, C.; Huerfano Moreno, V. A.; McNamara, D. E.; Saurel, J. M.

    2014-12-01

    The magnitude-9.3 Sumatra-Andaman Islands earthquake of December 26, 2004, increased global awareness of the destructive hazard posed by earthquakes and tsunamis. Post-event assessments of global coastline vulnerability highlighted the Caribbean as a region of high hazard and risk that was poorly monitored. Nearly 100 tsunamis have been reported for the Caribbean and adjacent regions in the past 500 years, and they continue to pose a threat to the region's nations, to coastal areas along the Gulf of Mexico, and to the Atlantic seaboard of North and South America. Significant efforts to improve monitoring capabilities have been undertaken since that time, including an expansion of the United States Geological Survey (USGS) Global Seismographic Network (GSN) (McNamara et al., 2006) and establishment of the United Nations Educational, Scientific and Cultural Organization (UNESCO) Intergovernmental Coordination Group (ICG) for the Tsunami and other Coastal Hazards Warning System for the Caribbean and Adjacent Regions (CARIBE EWS). The minimum performance standards the ICG recommended for initial earthquake locations are: 1) earthquake detection within 1 minute, 2) a minimum magnitude threshold of M4.5, and 3) an initial hypocenter error of <30 km. In this study, we assess current compliance with these performance standards and model improvements in earthquake and tsunami monitoring capabilities in the Caribbean region since the first meeting of the UNESCO ICG-Caribe EWS in 2006. The three measures of network capability modeled in this study are: 1) minimum Mw detection threshold; 2) P-wave detection time of an automatic processing system; and 3) theoretical earthquake location uncertainty. By modeling these three measures of seismic network capability, we can optimize the distribution of ICG-Caribe EWS seismic stations and select an international network contributed from existing real-time broadband national networks in the region. Sea level monitoring improvements both offshore and along the coast are also addressed. With the support of Member States and other countries and organizations, it has been possible to significantly expand the sea level network, thus reducing the time it now takes to verify tsunamis.
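
    As a sketch of how compliance with the three recommended performance standards quoted above (detection within 1 minute, an M4.5 magnitude threshold, hypocenter error under 30 km) might be tabulated per event, the snippet below checks a hypothetical list of events; the event records and field names are invented for illustration and are not data from the study.

```python
# Hypothetical compliance check against the performance standards quoted in
# the abstract: detection <= 60 s, magnitude threshold M4.5, initial
# hypocenter error < 30 km. The event records below are invented.
EVENTS = [
    {"id": "ev1", "mag": 5.1, "detect_s": 42.0, "loc_err_km": 18.0},
    {"id": "ev2", "mag": 4.6, "detect_s": 75.0, "loc_err_km": 12.0},
    {"id": "ev3", "mag": 4.5, "detect_s": 55.0, "loc_err_km": 41.0},
]

def compliant(ev, max_detect_s=60.0, min_mag=4.5, max_loc_err_km=30.0):
    """An event meets the standard if it is at or above the magnitude
    threshold, detected quickly enough, and located accurately enough."""
    return (ev["mag"] >= min_mag
            and ev["detect_s"] <= max_detect_s
            and ev["loc_err_km"] < max_loc_err_km)

for ev in EVENTS:
    print(ev["id"], "compliant" if compliant(ev) else "non-compliant")
```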

  11. Data Sets and Data Services at the Northern California Earthquake Data Center

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Zuzlewski, S.; Allen, R. M.

    2014-12-01

    The Northern California Earthquake Data Center (NCEDC) houses a unique and comprehensive data archive and provides real-time services for a variety of seismological and geophysical data sets that encompass northern and central California. We have over 80 terabytes of continuous and event-based time series data from broadband, short-period, strong motion, and strain sensors as well as continuous and campaign GPS data at both standard and high sample rates in both raw and RINEX format. The Northern California Seismic System (NCSS), operated by UC Berkeley and USGS Menlo Park, has recorded over 890,000 events from 1984 to the present, and the NCEDC provides catalog, parametric information, moment tensors and first motion mechanisms, and time series data for these events. We also host and provide event catalogs, parametric information, and event waveforms for DOE enhanced geothermal system monitoring in northern California and Nevada. The NCEDC provides a variety of ways for users to access these data. The most recent developments are web services, which provide interactive, command-line, or program-based workflow access to data. Web services use well-established server and client protocols and RESTful software architecture that allow users to easily submit queries and receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, simple text, or MiniSEED depending on the service and selected output format. The NCEDC supports all FDSN-defined web services as well as a number of IRIS-defined and NCEDC-defined services. We also continue to support older email-based and browser-based access to data. NCEDC data and web services can be found at http://www.ncedc.org and http://service.ncedc.org.
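
    Since the abstract states that the NCEDC supports all FDSN-defined web services, a minimal event-catalog query through the standard fdsnws-event interface is sketched below. The query parameters follow the FDSN web service specification; the exact service paths and output formats actually supported should be verified against http://service.ncedc.org rather than taken from this example, and the time window and region chosen here are arbitrary.

```python
from urllib.parse import urlencode
from urllib.request import urlopen

# Sketch of a standard FDSN event-service query (fdsnws-event). Base URL and
# parameter names follow the FDSN specification; check http://service.ncedc.org
# for the formats and services the NCEDC actually exposes.
BASE = "http://service.ncedc.org/fdsnws/event/1/query"

params = {
    "starttime": "2014-01-01T00:00:00",
    "endtime": "2014-02-01T00:00:00",
    "minmagnitude": 3.5,
    "minlatitude": 36.0, "maxlatitude": 42.0,
    "minlongitude": -125.0, "maxlongitude": -118.0,
}

url = BASE + "?" + urlencode(params)
with urlopen(url, timeout=30) as resp:
    quakeml = resp.read().decode("utf-8")   # default FDSN event output is QuakeML
print(quakeml[:500])                        # show the start of the response
```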

  12. Real-Time Data Processing Systems and Products at the Alaska Earthquake Information Center

    NASA Astrophysics Data System (ADS)

    Ruppert, N. A.; Hansen, R. A.

    2007-05-01

    The Alaska Earthquake Information Center (AEIC) receives data from over 400 seismic sites located within the state boundaries and the surrounding regions and serves as a regional data center. In 2007, the AEIC reported ~20,000 seismic events, the largest being an M6.6 event in the Andreanof Islands. The real-time earthquake detection and data processing systems at AEIC are based on the Antelope system from BRTT, Inc. This modular and extensible processing platform allows an integrated system complete from data acquisition to catalog production. Multiple additional modules constructed with the Antelope toolbox have been developed to fit particular needs of the AEIC. Real-time earthquake locations and magnitudes are determined within 2-5 minutes of event occurrence. AEIC maintains a 24/7 seismologist-on-duty schedule. Earthquake alarms are based on the real-time earthquake detections. Significant events are reviewed by the seismologist on duty within 30 minutes of the occurrence, with information releases issued for significant events. This information is disseminated immediately via the AEIC website, via the ANSS website through QDDS submissions, and through e-mail, cell phone and pager notifications, fax broadcasts and recorded voice-mail messages. In addition, automatic regional moment tensors are determined for events with M>=4.0. This information is posted on the public website. ShakeMaps are calculated in real time, with the information currently accessible via a password-protected website. AEIC is designing an alarm system targeted at critical lifeline operations in Alaska. AEIC maintains an extensive computer network to provide adequate support for data processing and archival. For real-time processing, AEIC operates two identical, interoperable computer systems in parallel.

  13. Earthquake Education and Public Information Centers: A Collaboration Between the Earthquake Country Alliance and Free-Choice Learning Institutions in California

    NASA Astrophysics Data System (ADS)

    Degroot, R. M.; Springer, K.; Brooks, C. J.; Schuman, L.; Dalton, D.; Benthien, M. L.

    2009-12-01

    In 1999 the Southern California Earthquake Center initiated an effort to expand its reach to multiple target audiences through the development of an interpretive trail on the San Andreas fault at Wallace Creek and an earthquake exhibit at Fingerprints Youth Museum in Hemet. These projects and involvement with the San Bernardino County Museum in Redlands beginning in 2007 led to the creation of Earthquake Education and Public Information Centers (EPIcenters) in 2008. The impetus for the development of the network was to broaden participation in The Great Southern California ShakeOut. By 2009 it had grown more comprehensive in scope, including its evolution into a statewide network. EPIcenters constitute a variety of free-choice learning institutions, representing museums, science centers, libraries, universities, parks, and other places visited by a variety of audiences including families, seniors, and school groups. They share a commitment to demonstrating and encouraging earthquake preparedness. EPIcenters coordinate Earthquake Country Alliance activities in their county or region, lead presentations or organize events in their communities, or in other ways demonstrate leadership in earthquake education and risk reduction. The San Bernardino County Museum (Southern California) and The Tech Museum of Innovation (Northern California) serve as EPIcenter regional coordinating institutions. They interact with over thirty institutional partners who have implemented a variety of activities from displays and talks to earthquake exhibitions. While many activities are focused on the time leading up to and just after the ShakeOut, most EPIcenter members conduct activities year round. Network members at Kidspace Museum in Pasadena and San Diego Natural History Museum have formed EPIcenter focus groups on early childhood education and safety and security. This presentation highlights the development of the EPIcenter network, synergistic activities resulting from this collaboration, and lessons learned from interacting with free-choice learning institutions.

  14. Two Decades of Seismic Monitoring by WEBNET: Disclosing a Lifecycle of an Earthquake Swarm Zone

    NASA Astrophysics Data System (ADS)

    Fischer, T.; Horalek, J.; Cermakova, H.; Michalek, J.; Doubravova, J.; Bouskova, A.; Bachura, M.

    2014-12-01

    The area of West Bohemia/Vogtland in the western Eger Rift is typified by earthquake swarm activity with maximum magnitudes not exceeding ML 5. The seismicity is dominated by the area near Novy Kostel, where earthquakes cluster along a narrow and steeply dipping focal zone of 8 km length that strikes about N-S in the depth range 7-11 km. Detailed seismic monitoring has been carried out by the WEBNET seismic network since 1992. During that period, earthquake swarms with several mainshocks exceeding magnitude level ML 3 took place in 2000, 2008 and 2011. These swarms were characterized by an episodic nature in which the activity of individual episodes overlapped in time and space. Interestingly, the rate of activity increased with each subsequent swarm, the 2000 swarm being the slowest and the 2011 swarm the most rapid. In 2014 the character of seismicity changed from swarm-like activity to mainshock-aftershock activity. Three mainshocks have already occurred since May 2014: the ML 3.6 event of May 24, the ML 4.5 event of May 31 and the ML 3.5 event of August 3. All these events were followed by a short aftershock sequence of one to four days duration. All three events exceeded the following aftershocks by more than one magnitude unit, and none of these mainshocks were preceded by foreshocks, which differentiates this activity from the preceding swarm seismicity. Interestingly, the hypocenters of the mentioned earthquake swarms and mainshock-aftershock sequences share a common fault zone and overlap significantly. We present a detailed analysis of precise hypocenter locations and statistical characteristics of the activity in order to find the origin of the different behavior of the seismic activity, which results in either earthquake swarms or mainshock-aftershock activity.

  15. Cloud-based systems for monitoring earthquakes and other environmental quantities

    NASA Astrophysics Data System (ADS)

    Clayton, R. W.; Olson, M.; Liu, A.; Chandy, M.; Bunn, J.; Guy, R.

    2013-12-01

    There are many advantages to using a cloud-based system to record and analyze environmental quantities such as earthquakes, radiation, various gases, dust and meteorological parameters. These advantages include robustness and dynamic scalability, and also reduced costs. In this paper, we present our experiences over the last three years in developing a cloud-based earthquake monitoring system (the Community Seismic Network). This network consists of over 600 sensors (accelerometers) in the S. California region that send data directly to the Google App Engine where they are analyzed. The system is capable of handling many other types of sensor data and generating a situation-awareness analysis as a product. Other advantages of the cloud-based system are integration with other peer networks, and being able to deploy anywhere in the world without having to build additional computing infrastructure.

  16. A Correction to the article "Geo-center movement caused by huge earthquakes" by Wenke Sun and Jie Dong

    NASA Astrophysics Data System (ADS)

    Zhou, Jiangcun; Sun, Wenke; Dong, Jie

    2015-07-01

    Sun and Dong (2014) studied the co-seismic geo-center movement using dislocation theory for a spherical earth model. However, they incorrectly considered the maximum vertical co-seismic displacement as the rigid geo-center motion (i.e., they did not separate the rigid shift and elastic deformation). In this paper, we correct Sun and Dong (2014) by using a new approach. We now define the geo-center motion as a shift of the center of figure of the Earth relative to the center of mass of the Earth. Furthermore, we derive new formulas to compute the co-seismic geo-center and inner core's center movements caused by huge earthquakes. The 2004 Sumatra earthquake and the 2011 Tohoku-Oki earthquake changed the geo-center by 1-4 mm and about 2 mm, respectively, and caused the inner core's center to displace by about 0.05 mm and 0.025 mm, respectively.
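
    For orientation only, the schematic expressions below illustrate the kind of quantity involved: the shift of the centre of figure (CF) of the deformed surface, written as an area average of the co-seismic displacement field, relative to the centre of mass (CM). This is a generic, first-order textbook-style definition supplied for illustration, not the formulas derived by Zhou et al. (2015); the symbols u, A, and S are assumptions of this sketch.

```latex
% Schematic illustration (not the authors' derivation): shift of the centre of
% figure (CF) as the area average of the co-seismic surface displacement field
% u over the Earth's surface S of area A, and the geo-center motion expressed
% relative to the centre of mass (CM).
\Delta \mathbf{r}_{\mathrm{CF}}
  = \frac{1}{A}\oint_{S} \mathbf{u}(\theta,\phi)\,\mathrm{d}A ,
\qquad
\Delta \mathbf{r}_{\mathrm{CF/CM}}
  = \Delta \mathbf{r}_{\mathrm{CF}} - \Delta \mathbf{r}_{\mathrm{CM}} .
```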

  17. Implications of the World Trade Center Health Program (WTCHP) for the Public Health Response to the Great East Japan Earthquake

    PubMed Central

    CRANE, Michael A.; CHO, Hyunje G.; LANDRIGAN, Phillip J.

    2013-01-01

    The attacks on the World Trade Center (WTC) on September 11, 2001 resulted in a serious burden of physical and mental illness for the 50,000 rescue workers that responded to 9/11 as well as the 400,000 residents and workers in the surrounding areas of New York City. The Zadroga Act of 2010 established the WTC Health Program (WTCHP) to provide monitoring and treatment of WTC exposure-related conditions and health surveillance for the responder and survivor populations. Several reports have highlighted the applicability of insights gained from the WTCHP to the public health response to the Great East Japan Earthquake. Optimal exposure monitoring processes and attention to the welfare of vulnerable exposed sub-groups are critical aspects of the response to both incidents. The ongoing mental health care concerns of 9/11 patients accentuate the need for accessible and appropriately skilled mental health care in Fukushima. Active efforts to demonstrate transparency and to promote community involvement in the public health response will be highly important in establishing successful long-term monitoring and treatment programs for the exposed populations in Fukushima. PMID:24317449

  18. UNLV’s environmentally friendly Science and Engineering Building is monitored for earthquake shaking

    USGS Publications Warehouse

    Kalkan, Erol; Savage, Woody; Reza, Shahneam; Knight, Eric; Tian, Ying

    2013-01-01

    The University of Nevada Las Vegas’ (UNLV) Science and Engineering Building is at the cutting edge of environmentally friendly design. As the result of a recent effort by the U.S. Geological Survey’s National Strong Motion Project in cooperation with UNLV, the building is now also in the forefront of buildings installed with structural monitoring systems to measure response during earthquakes. This is particularly important because this is the first such building in Las Vegas. The seismic instrumentation will provide essential data to better understand the structural performance of buildings, especially in this seismically active region.

  19. Role of WEGENER (World Earthquake GEodesy Network for Environmental Hazard Research) in monitoring natural hazards (Invited)

    NASA Astrophysics Data System (ADS)

    Ozener, H.; Zerbini, S.; Bastos, M. L.; Becker, M. H.; Meghraoui, M.; Reilinger, R. E.

    2013-12-01

    WEGENER was originally the acronym for Working Group of European Geoscientists for the Establishment of Networks for Earth-science Research. It was founded in March 1981 in response to an appeal delivered at the Journées Luxembourgeoises de Geodynamique in December 1980 to respond with a coordinated European proposal to a NASA Announcement of Opportunity inviting participation in the Crustal Dynamics and Earthquake Research Program. During the past 33 years, WEGENER has always kept close contact with the agencies and institutions responsible for the development and maintenance of the global space geodetic networks, with the aim of making them aware of the scientific needs and outcomes of the project that might influence general science policy trends. WEGENER served as Inter-commission Project 3.2, between Commission 1 and Commission 3, of the International Association of Geodesy (IAG) until 2012. Since then, the WEGENER project has become Sub-commission 3.5 of IAG Commission 3, namely Tectonics and Earthquake Geodesy. In this presentation, we briefly review the accomplishments of WEGENER as originally conceived and outline and justify the new focus of the WEGENER consortium. The remarkable and rapid evolution of the present state of global geodetic monitoring in regard to the precision of positioning capabilities (and hence deformation) and global coverage, the development of InSAR for monitoring strain with unprecedented spatial resolution, and continuing and planned data from highly precise satellite gravity and altimetry missions, encourage us to shift principal attention from mainly monitoring capabilities by a combination of space and terrestrial geodetic techniques to applying existing observational methodologies to the critical geophysical phenomena that threaten our planet and society. Our new focus includes developing an improved physical basis to mitigate earthquake, tsunami, and volcanic risks, and the effects of natural and anthropogenic climate change (sea level, ice degradation). In addition, expanded applications of space geodesy to atmospheric studies will remain a major focus, with emphasis on ionospheric and tropospheric monitoring to support forecasting extreme events. Towards these ends, we will encourage and foster interdisciplinary, integrated initiatives to develop a range of case studies for these critical problems. Geological studies are needed to extend geodetic deformation studies to geologic time scales, and new modeling approaches will facilitate full exploitation of expanding geodetic databases. In light of this new focus, the WEGENER acronym now represents 'World Earthquake GEodesy Network for Environmental Hazard Research'.

  20. Space Radiation Monitoring Center at SINP MSU

    NASA Astrophysics Data System (ADS)

    Kalegaev, Vladimir; Barinova, Wera; Barinov, Oleg; Bobrovnikov, Sergey; Dolenko, Sergey; Mukhametdinova, Ludmila; Myagkova, Irina; Nguen, Minh; Panasyuk, Mikhail; Shiroky, Vladimir; Shugay, Julia

    2015-04-01

    Data on energetic particle fluxes from Russian satellites have been collected at the Space Monitoring Data Center at Moscow State University in near real-time mode. The web portal http://smdc.sinp.msu.ru/ provides operational information on the radiation state of near-Earth space. Operational data come from the ELECTRO-L1 and Meteor-M2 space missions. High-resolution data on energetic electron fluxes from MSU's VERNOV satellite, with the RELEC instrumentation on board, are also available. Specific tools allow visual representation of the satellite orbit in 3D space simultaneously with particle flux variations. Concurrent operational data from other spacecraft (ACE, GOES, SDO) and from the Earth's surface (geomagnetic indices) are used to represent the geomagnetic and radiation state of the near-Earth environment. The internet portal http://swx.sinp.msu.ru provides real-time access to data characterizing the level of solar activity and the geomagnetic and radiation conditions in the heliosphere and the Earth's magnetosphere. Operational forecasting services automatically generate alerts on particle flux enhancements above threshold values, both for SEP and relativistic electrons, using data from LEO and GEO orbits. Models of the space environment working in autonomous mode are used to generalize the information obtained from different missions to the whole magnetosphere. Online applications based on these models provide short-term forecasting of SEP and relativistic electron fluxes at GEO and LEO, and online forecasting of the Dst and Kp indices up to 1.5 hours ahead. Velocities of high-speed solar wind streams at the Earth's orbit are estimated 3-4 days in advance. A visualization system provides representation of experimental and modeling data in 2D and 3D.

  1. Change of permeability caused by 2011 Tohoku earthquake detected from pore pressure monitoring

    NASA Astrophysics Data System (ADS)

    Kinoshita, C.; Kano, Y.; Ito, H.

    2013-12-01

    Earthquake-induced groundwater changes, both pre- and co-seismic, have long been reported (e.g. Roeloffs, 1996). For example, water inflow into an observation tunnel at Rokko changed at the time of the 1995 Kobe earthquake (Fujimori et al., 1995), and groundwater levels fluctuated at the times of the 1964 Alaska earthquake (M8.6) (Coble, 1967) and the 1999 Taiwan Chi-Chi earthquake (M7.6) (Chia et al., 2001). Shaking by seismic waves and crack formation by crustal deformation have been proposed as causes, but the mechanism remains controversial. We have been monitoring pore pressure since 2005 to measure stress changes at the Kamioka mine, Gifu prefecture, central Japan. Barometric pressure and strain are observed to correct the pore pressure data. In general, pore pressure changes are associated with meteorological effects, Earth tides and crustal deformation. Increases in pore pressure depend on precipitation that infiltrates the ground. Snow effects in particular are larger than those of ordinary rainfall, because our observation site receives heavy snow in winter. Meltwater infiltrates the ground and pore pressure increases from March to April every year. When the 2011 Tohoku earthquake (M9.0) occurred, pore pressure decreased markedly because permeability in the Kamioka region increased as a result of crustal deformation. We therefore estimated the hydraulic diffusivity before and after the earthquake from the pore pressure response to crustal deformation. We made separate analyses in three frequency bands. The first is the high-frequency band, in particular the seismic response. The second is the response to Earth tides. The third is the band of the barometric response, which is lower than the other two. In the high-frequency band, we confirmed that the deformation occurred under undrained conditions and estimated the bulk modulus from the pore pressure and strain data. Next, the tidal response was extracted from the pore pressure using three-month windows of pore pressure, barometric pressure and strain data, with the time window shifted every day. As a result, the amplitudes of the O1 and M2 constituents decreased after the Tohoku earthquake. The M2 and O1 amplitudes were 0.575 hPa and 0.277 hPa before the earthquake and decreased to 0.554 hPa and 0.184 hPa after the earthquake, respectively. The phase between pore pressure and strain changed after the event and soon recovered. We estimated the hydraulic diffusivity from the change in the ratio of the tidal response. Because strain data were unavailable due to an instrument problem, we used synthetic strain. From the one-dimensional diffusion equation and poroelastic constitutive relations, we could approximate the relation between pore pressure and strain by an exponential curve. The estimated hydraulic diffusivity is 8.0 m2/s for the preseismic period and 19 m2/s for the postseismic period, and these results are consistent with the pore pressure decrease. For the barometric pressure response, we performed a spectral analysis and estimated the hydraulic diffusivity. The results from the three frequency bands were integrated to show how the hydraulic diffusivity depends on frequency.
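
    The diffusion relation mentioned in the abstract is not written out there; as generic poroelastic background, a standard form of the one-dimensional pore-pressure diffusion equation and its well-known attenuation of a harmonic boundary forcing are sketched below. This is a textbook-style illustration supplied for context, not the authors' specific derivation, and the symbols p, c, z and omega are assumptions of this sketch.

```latex
% Standard 1-D pore-pressure diffusion with hydraulic diffusivity c:
\frac{\partial p}{\partial t} = c\,\frac{\partial^{2} p}{\partial z^{2}} .
% For a harmonic forcing of angular frequency \omega applied at a drained
% boundary, the response amplitude in a diffusive half-space decays with
% distance z as
p(z) \;\propto\; \exp\!\left(-z\sqrt{\frac{\omega}{2c}}\right),
% which is why a change in the amplitude ratio of tidal constituents
% (e.g. M2, O1) can be related to a change in c.
```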

  2. Modeling and Monitoring for Predictive Simulation of Earthquake Generation in the Japan Region

    NASA Astrophysics Data System (ADS)

    Matsu'Ura, M.; Noda, A.; Terakawa, T.; Hashimoto, C.; Fukuyama, E.

    2008-12-01

    We can regard earthquakes as releases of tectonically accumulated elastic strain energy through dynamic fault ruptures. Given this, the entire earthquake generation process generally consists of tectonic loading due to relative plate motion, quasi-static rupture nucleation, dynamic rupture propagation and stop, and fault strength recovery. Earthquake generation physics made great progress in the 1990s, and so we can now quantitatively describe the entire earthquake generation process with coupled nonlinear equations, consisting of a slip-response function that relates fault slip to shear stress change, a fault constitutive law that prescribes shear strength change with fault slip and contact time, and relative plate motion as the driving force. Recently, we completed a physics-based simulation system for the entire earthquake generation process in and around Japan, where the Pacific, North American, Philippine Sea and Eurasian plates interact with each other. The total system consists of three basic simulation models for quasi-static stress accumulation, dynamic rupture propagation and seismic wave propagation, developed on a realistic 3-D structure model. Then, given past slip histories and present stress states, we can now predict the next step of seismic/aseismic fault-slip motion through computation with the combined simulation system. We show two examples of the combined simulation for the 1968 Tokachi-oki earthquake (Mw=8.2) and the 2003 Tokachi-oki earthquake (Mw=8.1). The first example demonstrates that when the stress state is close to a critical level, dynamic rupture develops into a large earthquake, but when the stress state is much lower than the critical level, the initiated rupture does not accelerate. The second example demonstrates that we can quantitatively evaluate the strong ground motions produced by potential interplate earthquakes through computer simulation, if the realistic plate-interface geometry, fault constitutive parameters and crustal structure are given. Thus, our problem is how to extract useful information to estimate the past slip history and the present stress state from observed seismic and geodetic data. To address this problem we developed two inversion methods using Akaike's Bayesian Information Criterion (ABIC), one of which estimates the spatiotemporal variation of interplate coupling from geodetic data, while the other estimates tectonic stress fields from CMT data of seismic events. From the inversion analysis of GPS data we revealed the slip-deficit rate distribution on the North American-Pacific plate interface off northeast Japan, which shows good correlation with the source regions of past large interplate events along the Kuril-Japan trench. From the inversion analysis of CMT data we revealed 3-D tectonic stress fields in and around Japan, which explain the complex tectonics of Japan very well. Furthermore, we are now developing another inversion method to estimate 3-D elastic/inelastic strain fields from GPS data. Combining these inversion methods with the computer simulation of tectonic loading, we will be able to monitor the spatiotemporal variation of interplate coupling and seismogenic stress fields in the Japan region.
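
    As a purely schematic rendering of the coupled system described above (a slip-response function relating slip to stress change, a slip- and time-dependent fault constitutive law, and relative plate motion as the driving term), one common way such quasi-static loading relations are written is sketched below. The notation is generic and should not be read as the authors' exact formulation.

```latex
% Schematic quasi-static loading on a fault surface \Sigma (generic notation):
% shear stress at point x driven by plate rate v_pl and reduced by slip u,
\tau(\mathbf{x},t) = \tau_{0}(\mathbf{x})
  + \int_{\Sigma} K(\mathbf{x},\boldsymbol{\xi})
      \,\bigl[v_{\mathrm{pl}}\,t - u(\boldsymbol{\xi},t)\bigr]\,
      \mathrm{d}\Sigma(\boldsymbol{\xi}) ,
% together with a fault constitutive law in which shear strength depends on
% slip u and contact (healing) time t_c,
\tau_{\mathrm{s}} = \tau_{\mathrm{s}}\bigl(u,\,t_{c}\bigr),
% and rupture nucleates where \tau reaches \tau_{\mathrm{s}}.
```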

  3. The Northern California Earthquake Data Center: Seismic and Geophysical Data for Northern California and Beyond

    NASA Astrophysics Data System (ADS)

    Neuhauser, D.; Klein, F.; Zuzlewski, S.; Gee, L.; Oppenheimer, D.; Romanowicz, B.

    2004-12-01

    The Northern California Earthquake Data Center (NCEDC) is an archive and distribution center for geophysical data for networks in northern and central California. The NCEDC provides timeseries data from seismic, strain, electro-magnetic, a variety of creep, tilt, and environmental sensors, and continuous and campaign GPS data in raw and RINEX formats. The NCEDC has a wide variety of interfaces for data retrieval. Timeseries data are available via a web interface and standard queued request methods such as NetDC (developed in collaboration with the IRIS DMC and other international data centers), BREQ_FAST, and EVT_FAST. Interactive data retrieval methods include STP, developed by the SCEDC, and FISSURES DHI (Data Handling Interface), an object-oriented interface developed by IRIS. The Sandia MATSEIS system is being adapted to use the FISSURES DHI interface to provide an enhanced GUI-based seismic analysis system for MATLAB. Northern California and prototype ANSS worldwide earthquake catalogs are searchable from web interfaces, and supporting phase and amplitude data can be retrieved when available. Future data sets planned for the NCEDC are seismic and strain data from the EarthScope Plate Boundary Observatory (PBO) and SAFOD. The NCEDC is a joint project of the UC Berkeley Seismological Laboratory and USGS Menlo Park.

  4. Analogue models of subduction megathrust earthquakes: improving rheology and monitoring technique

    NASA Astrophysics Data System (ADS)

    Brizzi, Silvia; Corbi, Fabio; Funiciello, Francesca; Moroni, Monica

    2015-04-01

    Most of the world's great earthquakes (Mw > 8.5, usually known as mega-earthquakes) occur at shallow depths along the subduction thrust fault (STF), i.e., the frictional interface between the subducting and overriding plates. The spatiotemporal occurrence of mega-earthquakes and the physics governing them remain ambiguous, as tragically demonstrated by the underestimation of recent megathrust events (i.e., 2011 Tohoku). To help unravel the seismic cycle at STFs, analogue modelling has become a key tool. The first properly scaled analogue models with realistic geometries (i.e., wedge-shaped) suitable for studying interplate seismicity have been realized using granular elasto-plastic [e.g., Rosenau et al., 2009] and viscoelastic materials [i.e., Corbi et al., 2013]. In particular, viscoelastic laboratory experiments realized with type A gelatin at 2.5 wt% simulate, in a simplified yet robust way, the basic physics governing the subduction seismic cycle and the related rupture process. Despite the strength of this approach, analogue earthquakes are not perfectly comparable to their natural prototype. In this work, we try to improve analogue models of the subduction seismic cycle by modifying the rheological properties of the analogue material and adopting a new image analysis technique (i.e., PEP - ParticlE and Prediction velocity). We test the influence of lithosphere elasticity by using type A gelatin at a greater concentration (i.e., 6 wt%). Results show that gelatin elasticity plays an important role in controlling the seismogenic behaviour of the STF, tuning the mean and maximum magnitude of analogue earthquakes. In particular, by increasing gelatin elasticity, we observe a decreasing mean magnitude, while the maximum magnitude remains the same. Experimental results therefore suggest that lithosphere elasticity could be one of the parameters that tune the seismogenic behaviour of STFs. Increasing gelatin elasticity also improves similarity to the natural prototype in terms of coseismic duration and rupture width. Experimental monitoring has been performed by means of both PEP and PIV (i.e., Particle Image Velocimetry) algorithms. PEP differs from classic cross-correlation techniques (i.e., PIV) in its ability to provide sparse velocity vectors at points coincident with particle barycentre positions, allowing a Lagrangian description of the velocity field and a better spatial resolution (approximately 0.03 mm2) with respect to PIV. Results show that the PEP algorithm is able to identify a greater number of analogue earthquakes (approximately 20% more than the PIV algorithm), decreasing the minimum detectable magnitude from 6.6 to 4.5. Furthermore, earthquake source parameters (e.g., hypocentre position, rupture limits and slip distribution) are more accurately defined. The PEP algorithm is thus suitable for potentially gaining new insights into the seismogenic process of STFs, by extending the analysable magnitude range of analogue earthquakes, with implications for the applicability of scaling relationships, such as the Gutenberg-Richter law, to experimental results.

  5. Basin-centered asperities in great subduction zone earthquakes: A link between slip, subsidence, and subduction erosion?

    USGS Publications Warehouse

    Wells, R.E.; Blakely, R.J.; Sugiyama, Y.; Scholl, D. W.; Dinterman, P.A.

    2003-01-01

    Published areas of high coseismic slip, or asperities, for 29 of the largest Circum-Pacific megathrust earthquakes are compared to forearc structure revealed by satellite free-air gravity, bathymetry, and seismic profiling. On average, 71% of an earthquake's seismic moment and 79% of its asperity area occur beneath the prominent gravity low outlining the deep-sea terrace; 57% of an earthquake's asperity area, on average, occurs beneath the forearc basins that lie within the deep-sea terrace. In SW Japan, slip in the 1923, 1944, 1946, and 1968 earthquakes was largely centered beneath five forearc basins whose landward edge overlies the 350°C isotherm on the plate boundary, the inferred downdip limit of the locked zone. Basin-centered coseismic slip also occurred along the Aleutian, Mexico, Peru, and Chile subduction zones but was ambiguous for the great 1964 Alaska earthquake. Beneath intrabasin structural highs, seismic slip tends to be lower, possibly due to higher temperatures and fluid pressures. Kilometers of late Cenozoic subsidence and crustal thinning above some of the source zones are indicated by seismic profiling and drilling and are thought to be caused by basal subduction erosion. The deep-sea terraces and basins may evolve not just by growth of the outer arc high but also by interseismic subsidence not recovered during earthquakes. Basin-centered asperities could indicate a link between subsidence, subduction erosion, and seismogenesis. Whatever the cause, forearc basins may be useful indicators of long-term seismic moment release. The source zone for Cascadia's 1700 A.D. earthquake contains five large, basin-centered gravity lows that may indicate potential asperities at depth. The gravity gradient marking the inferred downdip limit to large coseismic slip lies offshore, except in northwestern Washington, where the low extends landward beneath the coast. Transverse gravity highs between the basins suggest that the margin is seismically segmented and could produce a variety of large earthquakes. Published in 2003 by the American Geophysical Union.

  6. Noise reduction in radon monitoring data using Kalman filter and application of results in earthquake precursory process research

    NASA Astrophysics Data System (ADS)

    Namvaran, Mojtaba; Negarestani, Ali

    2014-06-01

    Monitoring the concentration of radon gas is an established method for geophysical analyses and research, particularly in earthquake studies. A continuous radon monitoring station was implemented in Jooshan hotspring, Kerman province, south east Iran. The location was carefully chosen as a widely reported earthquake-prone zone. A common issue during monitoring of radon gas concentration is the possibility of noise disturbance by different environmental and instrumental parameters. A systematic mathematical analysis aiming at reducing such noises from data is reported here; for the first time, the Kalman filter (KF) has been used for radon gas concentration monitoring. The filtering is incorporated based on several seismic parameters of the area under study. A novel anomaly defined as "radon concentration spike crossing" is also introduced and successfully used in the study. Furthermore, for the first time, a mathematical pattern of a relationship between the radius of potential precursory phenomena and the distance between epicenter and the monitoring station is reported and statistically analyzed.
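
    To show the basic mechanics of the filtering step the abstract describes, here is a minimal scalar (random-walk) Kalman filter applied to a synthetic concentration series. The process and measurement variances, and the synthetic data, are illustrative assumptions only and are not the authors' parameterisation or dataset.

```python
import numpy as np

def kalman_smooth(measurements, process_var=1.0, meas_var=25.0):
    """Scalar Kalman filter with a random-walk state model.
    `process_var` (Q) and `meas_var` (R) are illustrative values only."""
    x = measurements[0]      # initial state estimate
    p = 1.0                  # initial state variance
    filtered = []
    for z in measurements:
        # Predict: random-walk model, state unchanged, uncertainty grows.
        p = p + process_var
        # Update: blend the prediction with the new measurement.
        k = p / (p + meas_var)          # Kalman gain
        x = x + k * (z - x)
        p = (1.0 - k) * p
        filtered.append(x)
    return np.array(filtered)

# Synthetic radon-like series: a slow trend plus measurement noise.
if __name__ == "__main__":
    rng = np.random.default_rng(2)
    truth = 800.0 + 50.0 * np.sin(np.linspace(0.0, 6.0, 500))  # arbitrary units
    noisy = truth + rng.normal(0.0, 40.0, truth.size)
    smooth = kalman_smooth(noisy, process_var=0.5, meas_var=40.0**2)
    print("rms error raw     :", np.sqrt(np.mean((noisy - truth) ** 2)))
    print("rms error filtered:", np.sqrt(np.mean((smooth - truth) ** 2)))
```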

  7. Noise Reduction in Radon Monitoring Data Using Kalman Filter and Application of Results in Earthquake Precursory Process Research

    NASA Astrophysics Data System (ADS)

    Namvaran, Mojtaba; Negarestani, Ali

    2015-04-01

    Monitoring the concentration of radon gas is an established method for geophysical analyses and research, particularly in earthquake studies. A continuous radon monitoring station was implemented in Jooshan hotspring, Kerman province, south east Iran. The location was carefully chosen as a widely reported earthquake-prone zone. A common issue during monitoring of radon gas concentration is the possibility of noise disturbance by different environmental and instrumental parameters. A systematic mathematical analysis aiming at reducing such noises from data is reported here; for the first time, the Kalman filter (KF) has been used for radon gas concentration monitoring. The filtering is incorporated based on several seismic parameters of the area under study. A novel anomaly defined as "radon concentration spike crossing" is also introduced and successfully used in the study. Furthermore, for the first time, a mathematical pattern of a relationship between the radius of potential precursory phenomena and the distance between epicenter and the monitoring station is reported and statistically analyzed.

  8. Disasters; the 2010 Haitian earthquake and the evacuation of burn victims to US burn centers.

    PubMed

    Kearns, Randy D; Holmes, James H; Skarote, Mary Beth; Cairns, Charles B; Strickland, Samantha Cooksey; Smith, Howard G; Cairns, Bruce A

    2014-09-01

    Response to the 2010 Haitian earthquake included an array of diverse yet critical actions. This paper will briefly review the evacuation of a small group of patients with burns to burn centers in the southeastern United States (US). This particular evacuation brought together for the first time plans, groups, and organizations that had previously only exercised this process. The response to the Haitian earthquake was a glimpse at what the international community working together can do to help others, and relieve suffering following a catastrophic disaster. The international response was substantial. This paper will trace one evacuation, one day for one unique group of patients with burns to burn centers in the US and review the lessons learned from this process. The patient population with burns being evacuated from Haiti was very small compared to the overall operation. Nevertheless, the outcomes included a better understanding of how a larger event could challenge the limited resources for all involved. This paper includes aspects of the patient movement, the logistics needed, and briefly discusses reimbursement for the care provided. PMID:24411582

  9. Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT): Towards the Next Generation of Internship

    NASA Astrophysics Data System (ADS)

    Perry, S.; Benthien, M.; Jordan, T. H.

    2005-12-01

    The SCEC/UseIT internship program is training the next generation of earthquake scientists, with methods that can be adapted to other disciplines. UseIT interns work collaboratively, in multi-disciplinary teams, conducting computer science research that is needed by earthquake scientists. Since 2002, the UseIT program has welcomed 64 students, in some two dozen majors, at all class levels, from schools around the nation. Each summer's work is posed as a "Grand Challenge." The students then organize themselves into project teams, decide how to proceed, and pool their diverse talents and backgrounds. They have traditional mentors, who provide advice and encouragement, but they also mentor one another, and this has proved to be a powerful relationship. Most begin with fear that their Grand Challenge is impossible, and end with excitement and pride about what they have accomplished. The 22 UseIT interns in summer 2005 were primarily computer science and engineering majors, with others in geology, mathematics, English, digital media design, physics, history, and cinema. The 2005 Grand Challenge was to "build an earthquake monitoring system" to aid scientists who must visualize rapidly evolving earthquake sequences and convey information to emergency personnel and the public. Most UseIT interns were engaged in software engineering, bringing new datasets and functionality to SCEC-VDO (Virtual Display of Objects), a 3D visualization software that was prototyped by interns last year, using Java3D and an extensible, plug-in architecture based on the Eclipse Integrated Development Environment. Other UseIT interns used SCEC-VDO to make animated movies, and experimented with imagery in order to communicate concepts and events in earthquake science. One movie-making project included the creation of an assessment to test the effectiveness of the movie's educational message. Finally, one intern created an interactive, multimedia presentation of the UseIT program.

  10. Towards real-time regional earthquake simulation I: real-time moment tensor monitoring (RMT) for regional events in Taiwan

    NASA Astrophysics Data System (ADS)

    Lee, Shiann-Jong; Liang, Wen-Tzong; Cheng, Hui-Wen; Tu, Feng-Shan; Ma, Kuo-Fong; Tsuruoka, Hiroshi; Kawakatsu, Hitoshi; Huang, Bor-Shouh; Liu, Chun-Chi

    2014-01-01

    We have developed a real-time moment tensor monitoring system (RMT) which takes advantage of a grid-based moment tensor inversion technique and real-time broad-band seismic recordings to automatically monitor earthquake activity in the vicinity of Taiwan. The centroid moment tensor (CMT) inversion technique and a grid search scheme are applied to obtain earthquake source parameters, including the event origin time, hypocentral location, moment magnitude and focal mechanism. All of these source parameters can be determined simultaneously within 117 s after the occurrence of an earthquake. The monitoring area involves the entire Taiwan Island and the offshore region, covering 119.3°E to 123.0°E and 21.0°N to 26.0°N, with depths from 6 to 136 km. A 3-D grid system is implemented in the monitoring area with a uniform horizontal interval of 0.1° and a vertical interval of 10 km. The inversion procedure is based on a 1-D Green's function database calculated by the frequency-wavenumber (fk) method. We compare our results with the Central Weather Bureau (CWB) catalogue data for earthquakes that occurred between 2010 and 2012. The average differences in event origin time and hypocentral location are less than 2 s and 10 km, respectively. The focal mechanisms determined by RMT are also comparable with the Broadband Array in Taiwan for Seismology (BATS) CMT solutions. These results indicate that the RMT system is practical and efficient for monitoring local seismic activity. In addition, the time needed to obtain all the point source parameters is reduced substantially compared to routine earthquake reports. By connecting RMT with a real-time online earthquake simulation (ROS) system, all the source parameters will be forwarded to the ROS to make real-time earthquake simulation feasible. The RMT operated offline during 2010-2011 and has operated online from January 2012 to the present at the Institute of Earth Sciences (IES), Academia Sinica (http://rmt.earth.sinica.edu.tw). The long-term goal of this system is to provide real-time source information for rapid seismic hazard assessment during large earthquakes.
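
    The core of a grid-based CMT search of the kind described above is a linear least-squares fit of the moment-tensor components at each trial source location and origin time, using precomputed Green's functions; the point with the highest variance reduction wins. The sketch below shows that single inner step with synthetic arrays; the Green's function matrix here is random and purely illustrative, not derived from the frequency-wavenumber database mentioned in the abstract, and the grid layout is a toy example.

```python
import numpy as np

def fit_moment_tensor(green, data):
    """Least-squares fit of moment-tensor components m at one trial grid point.
    `green`: (n_samples, 6) Green's function matrix for the 6 independent
    moment-tensor components; `data`: (n_samples,) observed waveform vector.
    Returns the components, the variance reduction, and the prediction."""
    m, *_ = np.linalg.lstsq(green, data, rcond=None)
    pred = green @ m
    vr = 1.0 - np.sum((data - pred) ** 2) / np.sum(data ** 2)
    return m, vr, pred

# Toy grid search over a few trial points: the best point maximises variance
# reduction. Green's functions and data are synthetic, for illustration only.
if __name__ == "__main__":
    rng = np.random.default_rng(3)
    n = 2000
    grids = [rng.normal(size=(n, 6)) for _ in range(5)]   # 5 trial grid points
    true_m = np.array([1.0, -0.5, -0.5, 0.2, 0.0, 0.3])   # arbitrary tensor
    data = grids[2] @ true_m + rng.normal(0.0, 0.05, n)   # built from point 2
    scores = [fit_moment_tensor(g, data)[1] for g in grids]
    print("variance reduction per grid point:", np.round(scores, 3))
    print("best grid point:", int(np.argmax(scores)))
```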

  11. Hydrochemical monitoring results in relation to the vogtland-nw bohemian earthquake swarm period 2000

    NASA Astrophysics Data System (ADS)

    Kämpf, H.; Bräuer, K.; Dulski, P.; Faber, E.; Koch, U.; Mrlina, J.; Strauch, G.; Weise, S. M.

    2003-04-01

    The Vogtland-NW Bohemian earthquake swarm area in Central Europe is characterised by carbon dioxide-rich mineral springs and mofettes. The August-December 2000 earthquake period was the strongest, compared with the December 1985/86 swarms, to have occurred in the area of Novy Kostel, Czech Republic. Here, we present first results of long-term hydrochemical monitoring studies before, during and after the 2000 swarm period. The 2000 swarm lasted from August 28 until December 26 and consisted of nine sub-swarm episodes altogether, each of them lasting for several days. At the mineral spring Wettinquelle, Bad Brambach/Germany, the water chemistry and isotope (D, 18O) composition have been monitored weekly and two-weekly, respectively, since May 2000. The mineral spring Wettinquelle is located at a distance of about 10 km from the epicentral area of Novy Kostel. The aim of our investigation was to look for seismically induced or seismically coupled changes in the chemical and isotope composition of the mineral water. We had to separate seismohydrological effects from seasonally and hydrologically caused changes. Seasonally caused shifts were found for water temperature and alkaline elements (Li, Na, K, Rb and Cs) as well as for discharge, conductivity, hydrogencarbonate concentration, and the concentration of the alkaline earths (Ca, Mg, Sr). Strain-related anomalies that could influence the hydrogeochemistry of the mineral water appear to be visible in the iron concentration of the spring water and in the methane concentration of the free gas component, and probably caused changes in the groundwater level of the well H3, located about 5 km SE of the Wettinquelle at Skalna.

  12. Advanced earthquake monitoring system for U.S. Department of Veterans Affairs medical buildings--instrumentation

    USGS Publications Warehouse

    Kalkan, Erol; Banga, Krishna; Ulusoy, Hasan S.; Fletcher, Jon Peter B.; Leith, William S.; Reza, Shahneam; Cheng, Timothy

    2012-01-01

    In collaboration with the U.S. Department of Veterans Affairs (VA), the National Strong Motion Project (NSMP; http://nsmp.wr.usgs.gov/) of the U.S. Geological Survey has been installing sophisticated seismic systems that will monitor the structural integrity of 28 VA hospital buildings located in seismically active regions of the conterminous United States, Alaska, and Puerto Rico during earthquake shaking. These advanced monitoring systems, which combine the use of sensitive accelerometers and real-time computer calculations, are designed to determine the structural health of each hospital building rapidly after an event, helping the VA to ensure the safety of patients and staff. This report presents the instrumentation component of this project by providing details of each hospital building, including a summary of its structural, geotechnical, and seismic hazard information, as well as instrumentation objectives and design. The structural-health monitoring component of the project, including data retrieval and processing, damage detection and localization, automated alerting system, and finally data dissemination, will be presented in a separate report.

  13. The Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) Internship Program

    NASA Astrophysics Data System (ADS)

    Perry, S.; Jordan, T.

    2006-12-01

    Our undergraduate research program, SCEC/UseIT, an NSF Research Experience for Undergraduates site, provides software for earthquake researchers and educators, movies for outreach, and ways to strengthen the technical career pipeline. SCEC/UseIT motivates diverse undergraduates towards science and engineering careers through team-based research in the exciting field of earthquake information technology. UseIT provides the cross-training in computer science/information technology (CS/IT) and geoscience needed to make fundamental progress in earthquake system science. Our high and increasing participation of women and minority students is crucial given the nation's precipitous enrollment declines in CS/IT undergraduate degree programs, especially among women. UseIT also casts a "wider, farther" recruitment net that targets scholars interested in creative work but not traditionally attracted to summer science internships. Since 2002, SCEC/UseIT has challenged 79 students in three dozen majors from as many schools with difficult, real-world problems that require collaborative, interdisciplinary solutions. Interns design and engineer open-source software, creating increasingly sophisticated visualization tools (see "SCEC-VDO," session IN11), which are employed by SCEC researchers, in new curricula at the University of Southern California, and by outreach specialists who make animated movies for the public and the media. SCEC-VDO would be a valuable tool for research-oriented professional development programs.

  14. Earthquake prediction

    NASA Technical Reports Server (NTRS)

    Turcotte, Donald L.

    1991-01-01

    The state of the art in earthquake prediction is discussed. Short-term prediction based on seismic precursors, changes in the ratio of compressional velocity to shear velocity, tilt and strain precursors, electromagnetic precursors, hydrologic phenomena, chemical monitors, and animal behavior is examined. Seismic hazard assessment is addressed, and the applications of dynamical systems to earthquake prediction are discussed.

  15. Monitoring of the stress state variations of the Southern California for the purpose of earthquake prediction

    NASA Astrophysics Data System (ADS)

    Gokhberg, M.; Garagash, I.; Bondur, V.; Steblov, G. M.

    2014-12-01

    The three-dimensional geomechanical model of Southern California was developed, including mountain relief, fault tectonics and characteristic internal features such as the roof of the consolidated crust and the Moho surface. The initial stress state of the model is governed by the gravitational forces and horizontal tectonic motions estimated from GPS observations. The analysis shows that the three-dimensional geomechanical model allows monitoring of changes in the stress state during the seismic process in order to constrain the locations of future increases in seismic activity. This investigation demonstrates one possible approach to monitoring upcoming seismicity over periods of days to weeks to months. Continuous analysis of the stress state was carried out during 2009-2014. Each new earthquake with M~1 and above from the USGS catalog was considered as a new defect of the Earth's crust, which has some definite size and causes redistribution of the stress state. The overall calculation technique was based on a single damage function of the Earth's crust, recalculated every half month. As a result, every half month we revealed, in the upper crustal layers and partially in the middle layers, the locations of the maximal values of the stress state parameters: elastic energy density, shear stress, and the proximity of the crustal layers to their strength limit. All these parameters exhibit similar spatial and temporal distributions. As follows from the observations, all four of the strongest events (M ~ 5.5-7.2) that occurred in Southern California during the analyzed period were preceded by anomalies in these parameters, with lead times of weeks to months, within 10-50 km of the upcoming earthquake. After the event the stress state source disappeared. The figure shows the migration of the maxima of the gradients of the stress state variations (parameter D) in the vicinity of the epicenter of the 04.04.2010 M=7.2 earthquake during the period 01.01.2010-01.05.2010. Grey lines show the major faults. In the table, the values are sampled every 2 weeks; "-" indicates time before the event and "+" indicates time after the event.

  16. Southern California Earthquake Center - SCEC1: Final Report Summary Alternative Earthquake Source Characterization for the Los Angeles Region

    SciTech Connect

    Foxall, B

    2003-02-26

    The objective of my research has been to synthesize current understanding of the tectonics and faults of the Los Angeles Basin and surrounding region to quantify uncertainty in the characterization of earthquake sources used for geologically- and geodetically-based regional earthquake likelihood models. This work has focused on capturing epistemic uncertainty, i.e. uncertainty stemming from ignorance of the true characteristics of the active faults in the region and of the tectonic forces that drive them. In the present context, epistemic uncertainty has two components: first, the uncertainty in source geometrical and occurrence rate parameters deduced from the limited geological, geophysical and geodetic observations available; and second, uncertainties that result from fundamentally different interpretations of regional tectonic deformation and faulting. Characterization of the large number of active and potentially active faults that need to be included in estimating earthquake occurrence likelihoods for the Los Angeles region requires synthesis and evaluation of large amounts of data and numerous interpretations. This was accomplished primarily through a series of carefully facilitated workshops, smaller meetings involving key researchers, and email groups. The workshops and meetings were made possible by the unique logistical and financial resources available through SCEC, and proved to be extremely effective forums for the exchange and critical debate of data and interpretations that are essential in constructing fully representative source models. The main product of this work is a complete source model that characterizes all known or potentially active faults in the greater Los Angeles region, which includes the continental borderland as far south as San Diego, the Ventura Basin, and the Santa Barbara Channel. The model constitutes a series of maps and representative cross-sections that define alternative fault geometries, a table containing fault geometrical and slip-rate parameters, including full uncertainty distributions, and a set of logic trees that define alternative source characterizations, particularly for sets of fault systems having inter-dependent geometries and kinematics resulting from potential intersection and interaction in the sub-surface. All of these products exist in a form suitable for input to earthquake likelihood and seismic hazard analyses. In addition, moment-balanced Poissonian earthquake rates for the alternative multi-segment characterizations of each fault system have been estimated. Finally, this work has served an important integrative function in that the exchange and debate of data, results and ideas that it has engendered has helped to focus SCEC research over the past six years on key issues in tectonic deformation and faulting.

  17. Grand Canyon Monitoring and Research Center

    USGS Publications Warehouse

    Hamill, John F.

    2009-01-01

    The Grand Canyon of the Colorado River, one of the world's most spectacular gorges, is a premier U.S. National Park and a World Heritage Site. The canyon supports a diverse array of distinctive plants and animals and contains cultural resources significant to the region's Native Americans. About 15 miles upstream of Grand Canyon National Park sits Glen Canyon Dam, completed in 1963, which created Lake Powell. The dam provides hydroelectric power for 200 wholesale customers in six western States, but it has also altered the Colorado River's flow, temperature, and sediment-carrying capacity. Over time this has resulted in beach erosion, invasion and expansion of nonnative species, and losses of native fish. Public concern about the effects of Glen Canyon Dam operations prompted the passage of the Grand Canyon Protection Act of 1992, which directs the Secretary of the Interior to operate the dam 'to protect, mitigate adverse impacts to, and improve values for which Grand Canyon National Park and Glen Canyon National Recreation Area were established...' This legislation also required the creation of a long-term monitoring and research program to provide information that could inform decisions related to dam operations and protection of downstream resources.

  18. The IPOC Creepmeter Array in N-Chile: Monitoring Slip Accumulation Triggered By Local or Remote Earthquakes

    NASA Astrophysics Data System (ADS)

    Victor, P.; Schurr, B.; Oncken, O.; Sobiesiak, M.; Gonzalez, G.

    2014-12-01

    The Atacama Fault System (AFS) is an active trench-parallel fault system located above the down-dip end of coupling of the north Chilean subduction zone. About three M=7 earthquakes in the past 10 ky have been documented in the paleoseismological record, demonstrating the potential for large events in the future. To investigate the current surface creep rate and to deduce the mode of strain accumulation, we deployed an array of 11 creepmeters along four branches of the AFS. This array monitors the interaction of earthquake activity on the subduction zone and a trench-parallel fault in the overriding forearc. The displacement across the fault is continuously monitored at 2 samples/min with a resolution of 1 µm. Collocated seismometers record the seismicity at two of the creepmeters, whereas control of the regional seismicity is provided by the IPOC Seismological Networks. Continuous time series of the creepmeter stations since 2009 show that the shallow segments of the fault do not creep permanently. Instead, the accumulation of permanent deformation occurs by triggered slip, recorded as well-defined steps caused by local or remote earthquakes. The 2014 Mw=8.2 Pisagua earthquake, located close to the creepmeter array, triggered large displacement events at all stations. Another event recorded at all stations was the 2010 Mw=8.8 Maule earthquake, located 1500 km south of the array. All of the stations showed a triggered displacement event 6-8 min after the origin time of the main shock, at the same time as the arrival of the surface waves recorded at nearby IPOC stations. This points to a dynamic triggering process caused by transient stresses during passage of the surface waves. Investigation of seismic events with magnitudes <6 shows displacement events triggered during P and S wave passage, pointing to static as well as dynamic stress changes for proximal events. Analyzing the causative earthquakes, we find that the most effective way to trigger displacement events on the AFS is deep (>100 km) earthquakes on the subduction zone interface up to 300 km east of the array. Earthquakes located to the west of the AFS on the locked part of the subduction zone interface rarely trigger displacement events on the AFS. Only when such events are as large as the Pisagua earthquake or its Mw=7.6 aftershock do they trigger large displacement events.
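
    Triggered slip of the kind described above appears in a creepmeter record as an abrupt offset superimposed on a slowly varying baseline. The following is a minimal sketch of how such steps could be flagged automatically; the smoothing window, the 5-micrometre threshold and the synthetic record are illustrative assumptions, not part of the IPOC processing.

        # Minimal sketch (hypothetical, not the IPOC processing chain): flag abrupt
        # offsets ("steps") in a creepmeter displacement series sampled at 2/min.
        import numpy as np
        from scipy.ndimage import median_filter

        def detect_steps(displacement_um, threshold_um=5.0, smooth_samples=31):
            """Return sample indices where the smoothed series jumps by > threshold_um."""
            smooth = median_filter(displacement_um, size=smooth_samples)  # suppress noise
            jumps = np.diff(smooth)                                       # sample-to-sample change
            return np.flatnonzero(np.abs(jumps) > threshold_um)

        # Hypothetical usage: one day of data with a single 20-micrometre triggered step
        n = 2880                                    # one day at 2 samples/min
        disp = 0.2 * np.random.randn(n)             # instrument noise, micrometres
        disp[1500:] += 20.0                         # triggered slip step
        print(detect_steps(disp))                   # indices near sample 1500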

  19. Web Services and Data Enhancements at the Northern California Earthquake Data Center

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Zuzlewski, S.; Lombard, P. N.; Allen, R. M.

    2013-12-01

    The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, simple text, or MiniSEED depending on the service and selected output format. The NCEDC offers the following web services that are compliant with the International Federation of Digital Seismograph Networks (FDSN) web services specifications: (1) fdsn-dataselect: time series data delivered in MiniSEED format, (2) fdsn-station: station and channel metadata and time series availability delivered in StationXML format, (3) fdsn-event: earthquake event information delivered in QuakeML format. In addition, the NCEDC offers the following IRIS-compatible web services: (1) sacpz: provide channel gains, poles, and zeros in SAC format, (2) resp: provide channel response information in RESP format, (3) dataless: provide station and channel metadata in Dataless SEED format. The NCEDC is also developing a web service to deliver time series from pre-assembled event waveform gathers. The NCEDC has waveform gathers for ~750,000 northern and central California events from 1984 to the present, many of which were created by the USGS NCSN prior to the establishment of the joint NCSS (Northern California Seismic System). We are currently adding waveforms to these older event gathers with time series from the UCB networks and other networks with waveforms archived at the NCEDC, and ensuring that each channel in the event gathers has the highest quality waveform available from the archive.
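
    As a rough illustration of how a client program might call FDSN-style services like those listed above, the sketch below uses ObsPy's generic FDSN client. The network/station codes, the example time window and the availability of an "NCEDC" shortcut in the client are assumptions made for illustration; consult the data center documentation for the authoritative endpoints.

        # Minimal sketch (illustrative assumptions only): query FDSN-style
        # dataselect, station and event services with ObsPy's FDSN client.
        from obspy import UTCDateTime
        from obspy.clients.fdsn import Client

        client = Client("NCEDC")                     # FDSN-compliant data center

        t0 = UTCDateTime("2014-08-24T10:20:44")      # example time (hypothetical choice)
        # fdsn-dataselect: MiniSEED time series for one broadband channel
        st = client.get_waveforms(network="BK", station="BKS", location="*",
                                  channel="BHZ", starttime=t0, endtime=t0 + 300)
        # fdsn-station: StationXML metadata including instrument response
        inv = client.get_stations(network="BK", station="BKS", level="response")
        # fdsn-event: QuakeML catalog for a one-day window around t0
        cat = client.get_events(starttime=t0 - 43200, endtime=t0 + 43200,
                                minmagnitude=3.0)

        st.remove_response(inventory=inv, output="VEL")   # counts -> ground velocity
        print(st)
        print(cat)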

  20. Results of seismological monitoring in the Cascade Range 1962-1989: earthquakes, eruptions, avalanches and other curiosities

    USGS Publications Warehouse

    Weaver, C.S.; Norris, R.D.; Jonientz-Trisler, C.

    1990-01-01

    Modern monitoring of seismic activity at Cascade Range volcanoes began at Longmire on Mount Rainier in 1958. Since then, there has been an expansion of the regional seismic networks in Washington, northern Oregon and northern California. Now, the Cascade Range from Lassen Peak to Mount Shasta in the south and Newberry Volcano to Mount Baker in the north is being monitored for earthquakes as small as magnitude 2.0, and many of the stratovolcanoes are monitored for non-earthquake seismic activity. This monitoring has yielded three major observations. First, tectonic earthquakes are concentrated in two segments of the Cascade Range between Mount Rainier and Mount Hood and between Mount Shasta and Lassen Peak, whereas little seismicity occurs between Mount Hood and Mount Shasta. Second, the volcanic activity and associated phenomena at Mount St. Helens have produced intense and widely varied seismicity. And third, at the northern stratovolcanoes, signals generated by surficial events such as debris flows, icequakes, steam emissions, rockfalls and icefalls are seismically recorded. Such records have been used to alert authorities of dangerous events in progress. -Authors

  1. Early Results of Three-Year Monitoring of Red Wood Ants' Behavioral Changes and Their Possible Correlation with Earthquake Events.

    PubMed

    Berberich, Gabriele; Berberich, Martin; Grumpe, Arne; Wöhler, Christian; Schreiber, Ulrich

    2013-01-01

    Short-term earthquake predictions with an advance warning of several hours or days are currently not possible due to both incomplete understanding of the complex tectonic processes and inadequate observations. Abnormal animal behaviors before earthquakes have been reported previously, but create problems in monitoring and reliability. The situation is different with red wood ants (RWA; Formica rufa-group (Hymenoptera: Formicidae)). They have stationary mounds on tectonically active, gas-bearing fault systems. These faults may be potential earthquake areas. For three years (2009-2012), two red wood ant mounds (Formica rufa-group), located at the seismically active Neuwied Basin (Eifel, Germany), have been monitored 24/7 by high-resolution cameras with both a color and an infrared sensor. Early results show that ants have a well-identifiable standard daily routine. Correlation with local seismic events suggests changes in the ants' behavior hours before the earthquake: the nocturnal rest phase and daily activity are suppressed, and standard daily routine does not resume until the next day. At present, an automated image evaluation routine is being applied to the more than 45,000 hours of video streams. Based on this automated approach, a statistical analysis of the ants' behavior will be carried out. In addition, other parameters (climate, geotectonic and biological), which may influence behavior, will be included in the analysis. PMID:26487310

  2. Federal Radiological Monitoring and Assessment Center Monitoring Manual Volume 2, Radiation Monitoring and Sampling

    SciTech Connect

    NSTec Aerial Measurement Systems

    2012-07-31

    The FRMAC Monitoring and Sampling Manual, Volume 2 provides standard operating procedures (SOPs) for field radiation monitoring and sample collection activities that are performed by the Monitoring group during a FRMAC response to a radiological emergency.

  3. Report of Research Center for Urban Safety and Security, Kobe University Surface fault associated with the 2010 Darfield earthquake and disasters by the 2011 Christchurch earthquake,

    E-print Network

    Takiguchi, Tetsuya

    Contents fragment: contributions by Akihiko Hokugo; a study on global use of integrated earthquake simulation (Muneo Hori); the surface fault associated with the 2010 Darfield earthquake and disasters caused by the 2011 Christchurch earthquake; and earthquakes around the Bungo Channel area and their implication for seismic hazard assessment (Hitoshi Hirose).

  4. First Results of 3 Year Monitoring of Red Wood Ants' Behavioural Changes and Their Possible Correlation with Earthquake Events

    NASA Astrophysics Data System (ADS)

    Berberich, Gabriele; Berberich, Martin; Grumpe, Arne; Wöhler, Christian; Schreiber, Ulrich

    2013-04-01

    Short-term earthquake predictions with an advance warning of several hours or days cannot currently be made reliably and remain limited to, at best, a few minutes before the event. Abnormal animal behaviours prior to earthquakes have been reported previously, but their detection creates problems in monitoring and reliability. A different situation is encountered for red wood ants (RWA; Formica rufa-group (Hymenoptera: Formicidae)). They have stationary nest sites on tectonically active, gas-bearing fault systems. These faults may be potential earthquake areas and are simultaneously information channels reaching deep into the crust. A particular advantage of monitoring RWA is their high sensitivity to environmental changes. Besides an evolutionarily developed, extremely strong temperature sensitivity of 0.25 K, they have chemoreceptors for the detection of CO2 concentrations and a sensitivity to electromagnetic fields. Changes of the electromagnetic field and short-lived "thermal anomalies" have been discussed as trigger mechanisms for bioanomalies preceding impending earthquakes. For 3 years, we have monitored two red wood ant mounds (Formica rufa-group), located at the seismically active Neuwied Basin (Eifel, Germany), 24/7 by high-resolution cameras equipped with a colour and an infrared sensor. In the Neuwied Basin, an average of about 100 earthquakes per year with magnitudes up to M 3.9 occur, located on different tectonic fault regimes (strike-slip faults and/or normal or thrust faults). The RWA mounds are located on two different fault regimes approximately 30 km apart. First results show that the ants have a well-identifiable standard daily routine. Correlation with local seismic events suggests changes in the ants' behaviour hours before the earthquake event: the nocturnal rest phase and daily activity are suppressed, and the standard daily routine does not resume until the next day. Additional parameters that might have an effect on the ants' daily routine (including climate data, earth tides, lunar phases and biological parameters) are recorded and correlated with the analysed daily activity. Additionally, nest air measurements (CO2, helium, radon, H2S and CH4) are performed at intervals. At present, an automated image analysis routine is being applied to the acquired more than 45,000 hours of video stream data. It is a valuable tool to objectively identify and classify the ants' activity on top of the mounds and to examine possible correlations with earthquakes. Based on this automated approach, a statistical analysis of the ants' behaviour is intended. The investigation and results presented here are a first step into a completely new research area. The key question is whether the ants' behavioural changes and their correlation with earthquake events are statistically significant and whether detection by an automated system is possible. Long-term studies have to show whether confounding factors and climatic influences can be clearly distinguished. Although the first results suggest that it is promising to consolidate and extend the research to determine a pattern for exceptional situations, there is still a long way to go towards a usable automated earthquake warning system. References Berberich G (2010): Identifikation junger gasführender Störungszonen in der West- und Hocheifel mit Hilfe von Bioindikatoren. Dissertation. Essen, 293 S.
Berberich G, Klimetzek D, Wöhler C, and Grumpe A (2012): Statistical Correlation between Red Wood Ant Sites and Neotectonic Strike-Slip Faults. Geophysical Research Abstracts Vol. 14, EGU2012-3518. Berberich G, Berberich M, Grumpe A, Wöhler C, and Schreiber U (2012): First Results of 3 Year Monitoring of Red Wood Ants' Behavioural Changes and Their Possible Correlation with Earthquake Events. Animals, ISSN 2076-2615, Special Issue "Biological Anomalies Prior to Earthquakes" (in prep.). Dologlou E. (2010): Recent aspects on possible interrelation between precursory electric signals and anomalous bioeffects. Nat. Hazards Earth Syst. Sci., 10, 1951-1955. Kir

  5. Postseismic Deformation after the 1964 Great Alaskan Earthquake: Collaborative Research with Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Freymueller, Jeffrey T.

    1999-01-01

    The purpose of this project was to carry out GPS observations on the Kenai Peninsula, southern Alaska, in order to study the postseismic and contemporary deformation following the 1964 Alaska earthquake. All of the research supported in this grant was carried out in collaboration with Dr. Steven Cohen of Goddard Space Flight Center. The research funding from this grant primarily supported GPS fieldwork, along with the acquisition of computer equipment to allow analysis and modeling of the GPS data. A minor amount of salary support was provided by the PI, but the great majority of the salary support was provided by the Geophysical Institute. After the expiration of this grant, additional funding was obtained from the National Science Foundation to continue the work. This grant supported GPS field campaigns in August 1995, June 1996, May-June and September 1997, and May-June 1998. We began the work by surveying leveling benchmarks on the Kenai Peninsula that had been surveyed after the 1964 earthquake. Changes in height from the 1964 leveling data to the 1995+ GPS data, corrected for the geoid-ellipsoid separation, give the total elevation change since the earthquake. Beginning in 1995, we also identified or established sites that were suitable for long-term surveying using GPS. In the subsequent annual GPS campaigns, we made regular measurements at these GPS marks, and steadily enhanced our set of points for which cumulative postseismic uplift data were available. From 4 years of Global Positioning System (GPS) measurements, we find significant spatial variations in present-day deformation between the eastern and western Kenai Peninsula, Alaska. Sites in the eastern Kenai Peninsula and Prince William Sound move to the NNW relative to North America, in the direction of Pacific-North America relative plate motion. Velocities decrease in magnitude from nearly the full plate rate in southern Prince William Sound to about 30 mm/yr at Seward and to about 5 mm/yr near Anchorage. In contrast, sites in the western Kenai Peninsula move to the SW, in a nearly trenchward direction, with a velocity of about 20 mm/yr. The data are consistent with the shallow plate interface offshore and beneath the eastern Kenai and Prince William Sound being completely locked or nearly so, with elastic strain accumulation resulting in rapid motion in the direction of relative plate motion of sites in the overriding plate. The velocities of sites in the western Kenai, along strike to the southwest, are opposite in sign to those predicted from elastic strain accumulation. These data are incompatible with a significant locked region in this segment of the plate boundary. Trenchward velocities are found also for some sites in the Anchorage area. We interpret the trenchward velocities as being caused by a continuing postseismic transient from the 1964 great Alaska earthquake.

  6. New Continuous Timeseries Data at the Northern California Earthquake Data Center

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Dietz, L.; Zuzlewski, S.; Kohler, W.; Gee, L.; Oppenheimer, D.; Romanowicz, B.

    2005-12-01

    The Northern California Earthquake Data Center (NCEDC) is an archive and distribution center for geophysical data for networks in northern and central California. Recent discovery of non-volcanic tremors in northern and central California has sparked user interest in access to a wider range of continuous seismic data in the region. The NCEDC has responded by expanding its archiving and distribution to all new available continuous data from northern California seismic networks (the USGS NCSN, the UC Berkeley BDSN, the Parkfield HRSN borehole network, and local USArray stations) at all available sample rates, to provide access to all recent real-time timeseries data, and to restore from tape and archive all NCSN continuous data from 2001-present. All new continuous timeseries data will also be available in near-real-time from the NCEDC via the DART (Data Available in Real Time) system, which allows users to directly download daily Telemetry MiniSEED files or to extract and retrieve the timeseries of their selection. The NCEDC will continue to create and distribute event waveform collections for all events detected by the Northern California Seismic System (NCSS), the northern California component of the California Integrated Seismic Network (CISN). All new continuous and event timeseries will be archived in daily intervals and are accessible via the same data request tools (NetDC, BREQ_FAST, EVT_FAST, FISSURES/DHI, STP) as previously archived waveform data. The NCEDC is a joint project of the UC Berkeley Seismological Laboratory and USGS Menlo Park.

  7. An Architecture for Continuous Data Quality Monitoring in Medical Centers.

    PubMed

    Endler, Gregor; Schwab, Peter K; Wahl, Andreas M; Tenschert, Johannes; Lenz, Richard

    2015-01-01

    In the medical domain, data quality is very important. Since requirements and data change frequently, continuous and sustainable monitoring and improvement of data quality is necessary. Working together with managers of medical centers, we developed an architecture for a data quality monitoring system. The architecture enables domain experts to adapt the system during runtime to match their specifications using a built-in rule system. It also allows arbitrarily complex analyses to be integrated into the monitoring cycle. We evaluate our architecture by matching its components to the well-known data quality methodology TDQM. PMID:26262172

  8. The academic health center in complex humanitarian emergencies: lessons learned from the 2010 Haiti earthquake.

    PubMed

    Babcock, Christine; Theodosis, Christian; Bills, Corey; Kim, Jimin; Kinet, Melodie; Turner, Madeleine; Millis, Michael; Olopade, Olufunmilayo; Olopade, Christopher

    2012-11-01

    On January 12, 2010, a 7.0-magnitude earthquake struck Haiti. The event disrupted infrastructure and was marked by extreme morbidity and mortality. The global response to the disaster was rapid and immense, comprising multiple actors, including academic health centers (AHCs), that provided assistance in the field and from home. The authors retrospectively examine the multidisciplinary approach that the University of Chicago Medicine (UCM) applied to postearthquake Haiti, which included the application of institutional structure and strategy, systematic deployment of teams tailored to evolving needs, and the actual response and recovery. The university mobilized significant human and material resources for deployment within 48 hours and sustained the effort for over four months. In partnership with international and local nongovernmental organizations as well as other AHCs, the UCM operated one of the largest and most efficient acute field hospitals in the country. The UCM's efforts in postearthquake Haiti provide insight into the role AHCs can play, including their strengths and limitations, in complex disasters. AHCs can provide necessary intellectual and material resources as well as technical expertise, but the cost and speed required for responding to an emergency, and ongoing domestic responsibilities, may limit the response of a large university and hospital system. The authors describe the strong institutional backing, the detailed predeployment planning and logistical support UCM provided, the engagement of faculty and staff who had previous experience in complex humanitarian emergencies, and the help of volunteers fluent in the local language which, together, made UCM's mission in postearthquake Haiti successful. PMID:23018336

  9. Comprehensive Nuclear-Test-Ban Treaty seismic monitoring: 2012 USNAS report and recent explosions, earthquakes, and other seismic sources

    SciTech Connect

    Richards, Paul G.

    2014-05-09

    A comprehensive ban on nuclear explosive testing is briefly characterized as an arms control initiative related to the Non-Proliferation Treaty. The work of monitoring for nuclear explosions uses several technologies, of which the most important is seismology, a physics discipline that draws upon extensive and ever-growing assets to monitor for earthquakes and other ground-motion phenomena as well as for explosions. This paper outlines the basic methods of seismic monitoring within that wider context, and lists web-based and other resources for learning details. It also summarizes the main conclusions, concerning capability to monitor for test-ban treaty compliance, contained in a major study published in March 2012 by the US National Academy of Sciences.

  10. Earthquake and ambient vibration monitoring of the steel-frame UCLA factor building

    USGS Publications Warehouse

    Kohler, M.D.; Davis, P.M.; Safak, E.

    2005-01-01

    Dynamic property measurements of the moment-resisting steel-frame University of California, Los Angeles, Factor building are being made to assess how forces are distributed over the building. Fourier amplitude spectra have been calculated from several intervals of ambient vibrations, a 24-hour period of strong winds, and from the 28 March 2003 Encino, California (ML = 2.9), the 3 September 2002 Yorba Linda, California (ML = 4.7), and the 3 November 2002 Central Alaska (Mw = 7.9) earthquakes. Measurements made from the ambient vibration records show that the first-mode frequency of horizontal vibration is between 0.55 and 0.6 Hz. The second horizontal mode has a frequency between 1.6 and 1.9 Hz. In contrast, the first-mode frequencies measured from earthquake data are about 0.05 to 0.1 Hz lower than those corresponding to ambient vibration recordings, indicating softening of the soil-structure system as amplitudes become larger. The frequencies revert to pre-earthquake levels within five minutes of the Yorba Linda earthquake. Shaking due to strong winds that occurred during the Encino earthquake dominates the frequency decrease, which correlates in time with the duration of the strong winds. The first shear wave recorded from the Encino and Yorba Linda earthquakes takes about 0.4 sec to travel up the 17-story building. © 2005, Earthquake Engineering Research Institute.
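
    The modal frequencies quoted above come from Fourier amplitude spectra of the building recordings. A minimal sketch of that kind of calculation is given below; the sampling rate, window length and synthetic test signal are assumptions for illustration, not the parameters used in the study.

        # Minimal sketch (illustrative assumptions only): averaged Fourier amplitude
        # spectrum of an acceleration record, with simple peak picking to estimate
        # modal frequencies such as a ~0.55-0.6 Hz first horizontal mode.
        import numpy as np

        def amplitude_spectrum(accel, fs, win_sec=200.0):
            """Average |FFT| over non-overlapping Hann-tapered windows."""
            nwin = int(win_sec * fs)
            nseg = len(accel) // nwin
            segs = accel[:nseg * nwin].reshape(nseg, nwin) * np.hanning(nwin)
            spec = np.abs(np.fft.rfft(segs, axis=1)).mean(axis=0)
            freqs = np.fft.rfftfreq(nwin, d=1.0 / fs)
            return freqs, spec

        def pick_peaks(freqs, spec, fmin=0.2, fmax=3.0):
            """Frequencies of local spectral maxima in [fmin, fmax], strongest first."""
            sel = (freqs >= fmin) & (freqs <= fmax)
            f, s = freqs[sel], spec[sel]
            idx = np.flatnonzero((s[1:-1] > s[:-2]) & (s[1:-1] > s[2:])) + 1
            return f[idx[np.argsort(s[idx])[::-1]]]

        # Hypothetical usage: one hour of synthetic "ambient vibration" with a 0.58 Hz mode
        fs = 100.0
        t = np.arange(0.0, 3600.0, 1.0 / fs)
        acc = np.sin(2 * np.pi * 0.58 * t) + 0.5 * np.random.randn(t.size)
        freqs, spec = amplitude_spectrum(acc, fs)
        print(pick_peaks(freqs, spec)[:3])           # strongest peak should be ~0.58 Hz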

  11. Federal Radiological Monitoring and Assessment Center Overview of FRMAC Operations

    SciTech Connect

    1998-03-01

    In the event of a major radiological emergency, 17 federal agencies with various statutory responsibilities have agreed to coordinate their efforts at the emergency scene under the umbrella of the Federal Radiological Emergency Response Plan. This cooperative effort will ensure that all federal radiological assistance fully supports their efforts to protect the public. The mandated federal cooperation ensures that each agency can obtain the data critical to its specific responsibilities. This overview of the Federal Radiological Monitoring and Assessment Center (FRMAC) describes the FRMAC response activities to a major radiological emergency. It also describes the federal assets and subsequent operational activities which provide federal radiological monitoring and assessment of the off-site areas.

  12. Ambient noise-based monitoring of seismic velocity changes associated with the 2014 Mw 6.0 South Napa earthquake

    NASA Astrophysics Data System (ADS)

    Taira, Taka'aki; Brenguier, Florent; Kong, Qingkai

    2015-09-01

    We perform ambient noise-based monitoring to explore temporal variations of crustal seismic velocities before, during, and after the 24 August 2014 Mw 6.0 South Napa earthquake. A velocity drop of about 0.08% is observed immediately after the South Napa earthquake. The spatial variability of the velocity reduction is most correlated with the pattern of the peak ground velocity of the South Napa mainshock, which suggests that fracture damage in rocks induced by the dynamic strain is likely responsible for the coseismic velocity change. About 50% of the velocity reduction is recovered within the first 50 days following the South Napa mainshock. This postseismic velocity recovery may suggest a healing process of damaged rocks.
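
    Velocity changes of the size reported above (a fraction of a percent) are commonly measured by comparing current noise cross-correlation functions against a long-term reference with the stretching technique. The sketch below is a minimal, hypothetical version of that idea, not the authors' processing; the synthetic correlation functions and the search range are assumptions for demonstration.

        # Minimal sketch (illustrative, not the authors' processing): estimate dv/v
        # with the stretching method, using the first-order relation
        # current(t) ~ reference(t * (1 + dv/v)) for a homogeneous velocity change.
        import numpy as np

        def stretch_dvv(t, reference, current, max_dvv=0.005, n_trials=401):
            """Grid-search the dv/v whose stretched reference best matches the current CCF."""
            best_cc, best_dvv = -2.0, 0.0
            for dvv in np.linspace(-max_dvv, max_dvv, n_trials):
                stretched = np.interp(t * (1.0 + dvv), t, reference)
                cc = np.corrcoef(stretched, current)[0, 1]
                if cc > best_cc:
                    best_cc, best_dvv = cc, dvv
            return best_dvv, best_cc

        # Hypothetical usage: synthetic coda with an imposed 0.08% velocity drop
        t = np.linspace(0.0, 50.0, 5001)                  # lag time (s)
        ref = np.exp(-t / 20.0) * np.sin(2 * np.pi * 1.0 * t)
        cur = np.interp(t * (1.0 - 0.0008), t, ref)       # simulated dv/v = -0.0008
        print(stretch_dvv(t, ref, cur))                   # expected ~(-0.0008, ~1.0)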

  13. Utilizing Changes in Repeating Earthquakes to Monitor Evolving Processes and Structure Before and During Volcanic Eruptions

    NASA Astrophysics Data System (ADS)

    Hotovec-Ellis, Alicia

    Repeating earthquakes are two or more earthquakes that share the same source location and source mechanism, which results in the earthquakes having highly similar waveforms when recorded at a seismic instrument. Repeating earthquakes have been observed in a wide variety of environments: from fault systems (such as the San Andreas and Cascadia subduction zone), to hydrothermal areas and volcanoes. Volcano seismologists are particularly concerned with repeating earthquakes, as they have been observed at volcanoes along the entire range of eruptive style and are often a prominent feature of eruption seismicity. The behavior of repeating earthquakes sometimes changes with time, which possibly reflects subtle changes in the mechanism creating the earthquakes. In Chapter 1, we document an example of repeating earthquakes during the 2009 eruption of Redoubt volcano that became increasingly frequent with time, until they blended into harmonic tremor prior to several explosions. We interpreted the source of the earthquakes as stick-slip on a fault near the conduit that slipped increasingly often as the explosion neared in response to the build-up of pressure in the system. The waveforms of repeating earthquakes may also change, even if the behavior does not. We can quantify changes in waveform using the technique of coda wave interferometry to differentiate between changes in source and medium. In Chapters 2 and 3, we document subtle changes in the coda of repeating earthquakes related to small changes in the near-surface velocity structure at Mount St. Helens before and during its eruption in 2004. Velocity changes have been observed prior to several volcanic eruptions and are thought to occur in response to volumetric strain and the opening or closing of cracks in the subsurface. We compared continuous records of velocity change against other geophysical data, and found that velocities at Mount St. Helens change in response to snow loading, fluid saturation, shaking from large distant earthquakes, shallow pressurization, and possibly lava extrusion. Velocity changes at Mount St. Helens are a complex mix of many different effects, and other complementary data are required to interpret the signal.
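
    Repeating earthquakes of the kind discussed above are usually identified by the very high waveform similarity of co-located events recorded at a common station. Below is a minimal, hypothetical sketch of that screening step; the 0.95 similarity threshold and the synthetic traces are assumptions for illustration, not the author's procedure.

        # Minimal sketch (illustrative assumptions only): flag candidate repeating
        # earthquakes by the peak normalized cross-correlation of two waveforms.
        import numpy as np

        def max_norm_xcorr(a, b):
            """Peak normalized cross-correlation of two equal-length traces."""
            a = (a - a.mean()) / (a.std() * len(a))
            b = (b - b.mean()) / b.std()
            return float(np.max(np.correlate(a, b, mode="full")))

        def is_repeater(trace1, trace2, threshold=0.95):
            """Call a pair a candidate repeater if waveform similarity exceeds threshold."""
            return max_norm_xcorr(trace1, trace2) >= threshold

        # Hypothetical usage: two near-identical synthetic events and one unrelated event
        rng = np.random.default_rng(0)
        template = np.convolve(rng.standard_normal(200), np.hanning(21), mode="same")
        event_a = template + 0.05 * rng.standard_normal(200)
        event_b = template + 0.05 * rng.standard_normal(200)
        event_c = np.convolve(rng.standard_normal(200), np.hanning(21), mode="same")
        print(is_repeater(event_a, event_b))   # expected True
        print(is_repeater(event_a, event_c))   # expected False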

  14. Detection and monitoring of earthquake precursors: TwinSat, a Russia-UK satellite project

    NASA Astrophysics Data System (ADS)

    Chmyrev, Vitaly; Smith, Alan; Kataria, Dhiren; Nesterov, Boris; Owen, Christopher; Sammonds, Peter; Sorokin, Valery; Vallianatos, Filippos

    2013-09-01

    There is now a body of evidence to indicate that coupling occurs between the lithosphere, atmosphere and ionosphere prior to earthquake events. Nevertheless the physics of these phenomena and the possibilities of their use as part of an earthquake early warning system remain poorly understood. Proposed here is a programme to create a much greater understanding in this area through the deployment of a dedicated space asset along with coordinated ground stations, modelling and the creation of a highly accessible database. The space element would comprise two co-orbiting spacecraft (TwinSat) involving a microsatellite and a nanosatellite, each including a suite of science instruments appropriate to this study. Over a mission duration of 3 years, ~400 earthquakes in the range 6-6.9 on the Richter scale would be ‘observed’. Such a programme is a prerequisite for an effective earthquake early warning system.

  15. Early Results of Three-Year Monitoring of Red Wood Ants’ Behavioral Changes and Their Possible Correlation with Earthquake Events

    PubMed Central

    Berberich, Gabriele; Berberich, Martin; Grumpe, Arne; Wöhler, Christian; Schreiber, Ulrich

    2013-01-01

    Simple Summary For three years (2009–2012), two red wood ant mounds (Formica rufa-group), located at the seismically active Neuwied Basin (Eifel, Germany), have been monitored 24/7 by high-resolution cameras. Early results show that ants have a well-identifiable standard daily routine. Correlation with local seismic events suggests changes in the ants’ behavior hours before the earthquake: the nocturnal rest phase and daily activity are suppressed, and standard daily routine does not resume until the next day. At present, an automated image evaluation routine is being applied to the video streams. Based on this automated approach, a statistical analysis of the ant behavior will be carried out. Abstract Short-term earthquake predictions with an advance warning of several hours or days are currently not possible due to both incomplete understanding of the complex tectonic processes and inadequate observations. Abnormal animal behaviors before earthquakes have been reported previously, but create problems in monitoring and reliability. The situation is different with red wood ants (RWA; Formica rufa-group (Hymenoptera: Formicidae)). They have stationary mounds on tectonically active, gas-bearing fault systems. These faults may be potential earthquake areas. For three years (2009–2012), two red wood ant mounds (Formica rufa-group), located at the seismically active Neuwied Basin (Eifel, Germany), have been monitored 24/7 by high-resolution cameras with both a color and an infrared sensor. Early results show that ants have a well-identifiable standard daily routine. Correlation with local seismic events suggests changes in the ants’ behavior hours before the earthquake: the nocturnal rest phase and daily activity are suppressed, and standard daily routine does not resume until the next day. At present, an automated image evaluation routine is being applied to the more than 45,000 hours of video streams. Based on this automated approach, a statistical analysis of the ants’ behavior will be carried out. In addition, other parameters (climate, geotectonic and biological), which may influence behavior, will be included in the analysis. PMID:26487310

  16. The effects of educational program on health volunteers’ knowledge regarding their approach to earthquake in health centers in Tehran

    PubMed Central

    JOUHARI, ZAHRA; PIRASTEH, AFSHAR; GHASSEMI, GHOLAM REZA; BAZRAFKAN, LEILA

    2015-01-01

    Introduction People's mental, intellectual and physical unreadiness to confront an earthquake may result in disastrous outcomes. This research aimed to study the effects of a training intervention on health connectors' knowledge regarding their approach to earthquakes in health-training centers in the east of Tehran. Methods This research, a semi-experimental study, was designed and executed in 2011 using a questionnaire with items based on information from the Crisis Management Organization. After a pilot study to establish the validity and reliability of the questionnaire, we determined the sample size. Then, the questionnaires were completed before and after the training program by 82 health connectors at health-treatment centers in the east of Tehran. Finally, the collected data were analyzed with SPSS 14, using the paired-sample t-test and Pearson's correlation coefficient. Results The health connectors were women with a mean age of 43.43±8.51 years. In this research, the mean score of the connectors' knowledge before and after the training was 35.15±4.3 and 43.73±2.91 out of 48, respectively. The difference was statistically significant (p=0.001). The classes were the most important source of information for the health connectors. Conclusion People's knowledge of how to confront an earthquake can be increased by holding training courses and workshops. Such courses and workshops play an important role in knowledge transfer and the readiness of health connectors. PMID:25927068

  17. Real Earthquakes, Real Learning

    ERIC Educational Resources Information Center

    Schomburg, Aaron

    2003-01-01

    One teacher took her class on a year long earthquake expedition. The goal was to monitor the occurrences of real earthquakes during the year and mark their locations with push pins on a wall-sized world map in the hallway outside the science room. The purpose of the project was to create a detailed picture of the earthquakes that occurred…

  18. ENGLEKIRK STRUCTURAL ENGINEERING CENTER UCSD Jacobs School When it comes to earthquake safety,

    E-print Network

    Fainman, Yeshaiahu

    Fragmentary excerpt: construction of high-rise office buildings, hospitals and apartment towers using cost-effective precast concrete; San Francisco's skyline incorporates a new framing system tested at the Powell Labs; flexible frames for high-rise construction; an earthquake in Caracas, Venezuela killed 240 and caused $50 million in property damage.

  19. The Irpinia Seismic Network: An Advanced Monitoring Infrastructure For Earthquake Early Warning in The Campania Region (Southern Italy)

    NASA Astrophysics Data System (ADS)

    Iannaccone, G.; Zollo, A.; Bobbio, A.; Cantore, L.; Convertito, V.; Elia, L.; Festa, G.; Lancieri, M.; Martino, C.; Romeo, A.; Satriano, C.; Vassallo, M.

    2007-12-01

    A new seismic network (ISNet, Irpinia Seismic Network) is now operating in Southern Italy. It is conceived as the core infrastructure for an Earthquake Early Warning System (EEWS) under development in Southern Italy. It is primarily aimed at providing an alert for moderate to large earthquakes (M>4) to selected target sites in the Campania Region, and it also provides data for rapid computation of regional ground-shaking maps. ISNet is deployed over an area of about 100×70 km2 covering the Apenninic active seismic zone where most of the large earthquakes of the last centuries occurred, including the Ms=6.9, 1980 Irpinia earthquake. ISNet is composed of 29 seismic stations equipped with three-component accelerometers and velocimeters, aggregated into six smaller sub-nets. The sub-net stations are connected by real-time communications to a central data-collector site (LCC, Local Control Center). The different LCCs are linked among themselves and to a Network Control Center (NCC), located in the city of Naples 100 km away from the network center, with different types of transmission systems chosen according to their robustness and reliability. The network is designed to provide estimates of the location and size of a potentially destructive earthquake within a few seconds of the earthquake detection, through an evolutionary and fully probabilistic approach. For the real-time location we developed a methodology which extends and generalizes the one of Horiuchi et al. (2005) by a) starting the location procedure after only one station has triggered, b) using the Equal Differential Time (EDT) approach to incorporate both the triggered arrivals and the not-yet-triggered stations, c) estimating the hypocenter probabilistically as a pdf instead of as a point, and d) applying a full, non-linearized, global search for each update of the location estimate. Following an evolutionary approach, the method evaluates, at each time step, the EDT equations considering not only each pair of triggered stations, but also those pairs where only one station has triggered. The size of the earthquake is also evaluated by a real-time, evolutionary algorithm based on a magnitude predictive model and a Bayesian formulation. It is aimed at evaluating the conditional probability density function of magnitude as a function of ground motion quantities measured on the early part of the acquired signals. The predictive models are empirical relationships which correlate the final event magnitude with the P-displacement amplitudes measured on the first 2-4 seconds of record after the first P arrival. The methods described above for rapidly estimating the event's location and magnitude are used to perform a real-time seismic hazard analysis, allowing computation of the probabilistic distribution, or hazard curve, of ground motion intensity measures (IM), i.e. the peak ground acceleration (PGA) or the spectral acceleration (Sa), at selected sites of the Campania Region. We show the performance of the earthquake early warning system through applications to simulated large events and recorded low-magnitude earthquakes.
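
    The Equal Differential Time (EDT) idea used in the location method described above can be sketched with a simple grid search: for every pair of triggered stations, the difference of predicted travel times at a candidate hypocenter is compared with the difference of observed arrival times, which cancels the unknown origin time. The code below is a minimal, hypothetical illustration and not the ISNet implementation; the constant P velocity, the pick uncertainty and the station geometry are assumptions for demonstration.

        # Minimal sketch (not the ISNet implementation): EDT-style grid-search
        # location using pairs of triggered stations and a constant P velocity.
        import numpy as np

        VP = 6.0  # assumed constant P-wave velocity, km/s

        def edt_pdf(stations_km, picks_s, grid_km, sigma_s=0.2):
            """Normalized location probability over grid_km (m x 3 candidate points)."""
            dist = np.linalg.norm(grid_km[:, None, :] - stations_km[None, :, :], axis=2)
            tt = dist / VP                                    # predicted travel times (m x n)
            logp = np.zeros(len(grid_km))
            n = len(picks_s)
            for i in range(n):
                for j in range(i + 1, n):
                    # EDT residual for station pair (i, j); origin time cancels out
                    resid = (tt[:, i] - tt[:, j]) - (picks_s[i] - picks_s[j])
                    logp += -0.5 * (resid / sigma_s) ** 2
            pdf = np.exp(logp - logp.max())
            return pdf / pdf.sum()

        # Hypothetical usage: 4 stations, synthetic picks from a source at (10, 5, 8) km
        sta = np.array([[0, 0, 0], [30, 0, 0], [0, 30, 0], [30, 30, 0]], dtype=float)
        src = np.array([10.0, 5.0, 8.0])
        picks = np.linalg.norm(sta - src, axis=1) / VP + 2.0   # +2 s unknown origin time
        x, y = np.meshgrid(np.arange(0.0, 31.0), np.arange(0.0, 31.0))
        grid = np.column_stack([x.ravel(), y.ravel(), np.full(x.size, 8.0)])
        pdf = edt_pdf(sta, picks, grid)
        print("most probable grid node:", grid[np.argmax(pdf)])   # near (10, 5, 8)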

  20. earthquake warning earthquake strikes

    E-print Network

    Fragmentary excerpt from a student guide on emergency earthquake warnings: what to do when an earthquake strikes and immediately after the earthquake; tables explaining the JMA (Japan Meteorological Agency) seismic intensity scale with respect to wooden houses, ground conditions, lifelines and slopes; and preparation for a large earthquake.

  1. earthquake warning earthquake strikes

    E-print Network

    Fragmentary excerpt from a student guide (Kanagawa 226-8503) on emergency earthquake warnings: what to do when an earthquake strikes and immediately after the earthquake; according to tables explaining the JMA (Japan Meteorological Agency) scale, this manual summarizes what you should do for an earthquake with a seismic intensity of 5-Lower or greater.

  2. Emergency radiological monitoring and analysis: Federal Radiological Monitoring and Assessment Center

    SciTech Connect

    Thome, D.J.

    1995-10-01

    The US Federal Radiological Emergency Response Plan (FRERP) provides the framework for integrating the various Federal agencies responding to a major radiological emergency. The FRERP authorizes the creation of the Federal Radiological Monitoring and Assessment Center (FRMAC), which is established to coordinate all Federal agencies involved in the monitoring and assessment of the off-site radiological conditions in support of the impacted State(s) and the Lead Federal Agency (LFA). Within the FRMAC, the Monitoring and Analysis Division (M&A) is responsible for coordinating all FRMAC assets involved in conducting a comprehensive program of environmental monitoring, sampling, radioanalysis, and quality assurance. To assure consistency, completeness, and the quality of the data produced, a methodology and procedures manual is being developed. This paper discusses the structure, assets, and operations of the FRMAC M&A and the content and preparation of the manual.

  3. Passive seismic monitoring of natural and induced earthquakes: case studies, future directions and socio-economic relevance

    USGS Publications Warehouse

    Bohnhoff, Marco; Dresen, Georg; Ellsworth, William L.; Ito, Hisao

    2010-01-01

    An important discovery in crustal mechanics has been that the Earth’s crust is commonly stressed close to failure, even in tectonically quiet areas. As a result, small natural or man-made perturbations to the local stress field may trigger earthquakes. To understand these processes, Passive Seismic Monitoring (PSM) with seismometer arrays is a widely used technique that has been successfully applied to study seismicity at different magnitude levels ranging from acoustic emissions generated in the laboratory under controlled conditions, to seismicity induced by hydraulic stimulations in geological reservoirs, and up to great earthquakes occurring along plate boundaries. In all these environments the appropriate deployment of seismic sensors, i.e., directly on the rock sample, at the earth’s surface or in boreholes close to the seismic sources allows for the detection and location of brittle failure processes at sufficiently low magnitude-detection threshold and with adequate spatial resolution for further analysis. One principal aim is to develop an improved understanding of the physical processes occurring at the seismic source and their relationship to the host geologic environment. In this paper we review selected case studies and future directions of PSM efforts across a wide range of scales and environments. These include induced failure within small rock samples, hydrocarbon reservoirs, and natural seismicity at convergent and transform plate boundaries. Each example represents a milestone with regard to bridging the gap between laboratory-scale experiments under controlled boundary conditions and large-scale field studies. The common motivation for all studies is to refine the understanding of how earthquakes nucleate, how they proceed and how they interact in space and time. This is of special relevance at the larger end of the magnitude scale, i.e., for large devastating earthquakes due to their severe socio-economic impact.

  4. Emergency radiological monitoring and analysis United States Federal Radiological Monitoring and Assessment Center

    SciTech Connect

    Thome, D.J.

    1994-09-01

    The United States Federal Radiological Emergency Response Plan (FRERP) provides the framework for integrating the various Federal agencies responding to a major radiological emergency. Following a major radiological incident the FRERP authorizes the creation of the Federal Radiological Monitoring and Assessment Center (FRMAC). The FRMAC is established to coordinate all Federal agencies involved in the monitoring and assessment of the off-site radiological conditions in support of the impacted states and the Lead Federal Agency (LFA). Within the FRMAC, the Monitoring and Analysis Division is responsible for coordinating all FRMAC assets involved in conducting a comprehensive program of environmental monitoring, sampling, radioanalysis and quality assurance. This program includes: (1) Aerial Radiological Monitoring - Fixed Wing and Helicopter, (2) Field Monitoring and Sampling, (3) Radioanalysis - Mobile and Fixed Laboratories, (4) Radiation Detection Instrumentation - Calibration and Maintenance, (5) Environmental Dosimetry, and (6) An integrated program of Quality Assurance. To assure consistency, completeness and the quality of the data produced, a methodology and procedures handbook is being developed. This paper discusses the structure, assets and operations of FRMAC monitoring and analysis and the content and preparation of this handbook.

  5. Self-Powered WSN for Distributed Data Center Monitoring.

    PubMed

    Brunelli, Davide; Passerone, Roberto; Rizzon, Luca; Rossi, Maurizio; Sartori, Davide

    2016-01-01

    Monitoring environmental parameters in data centers is nowadays gathering increasing attention from industry, due to the need for high energy efficiency of cloud services. We present the design and the characterization of an energy-neutral embedded wireless system, prototyped to perpetually monitor environmental parameters in servers and racks. It is powered by an energy harvesting module based on Thermoelectric Generators, which converts the heat dissipated by the servers. Starting from the empirical characterization of the energy harvester, we present a power conditioning circuit optimized for the specific application. The whole system has been enhanced with several sensors. An ultra-low-power micro-controller stacked over the energy harvester provides efficient power management. Performance has been assessed and compared with the analytical model for validation. PMID:26729135

  6. Federal Radiological Monitoring and Assessment Center Monitoring Manual Volume 1, Operations

    SciTech Connect

    NSTec Aerial Measurement Systems

    2012-07-31

    The Monitoring division is primarily responsible for the coordination and direction of: aerial measurements to delineate the footprint of radioactive contaminants that have been released into the environment; monitoring of radiation levels in the environment; sampling to determine the extent of contaminant deposition in soil, water, air and on vegetation; preliminary field analyses to quantify soil concentrations or depositions; and environmental and personal dosimetry for FRMAC field personnel, during a Consequence Management Response Team (CMRT) and Federal Radiological Monitoring and Assessment Center (FRMAC) response. Monitoring and sampling techniques used during CM/FRMAC operations are specifically selected for use during radiological emergencies where large numbers of measurements and samples must be acquired, analyzed, and interpreted in the shortest amount of time possible. In addition, techniques and procedures are flexible so that they can be used during a variety of different scenarios, e.g., accidents involving releases from nuclear reactors, contamination by nuclear waste, nuclear weapon accidents, space vehicle reentries, or contamination from a radiological dispersal device. The Monitoring division also provides technicians to support specific Health and Safety Division activities including: the operation of the Hotline; FRMAC facility surveys; assistance with Health and Safety at Check Points; and assistance at population assembly areas which require support from the FRMAC. This volume covers deployment activities, initial FRMAC activities, development and implementation of the monitoring and assessment plan, the briefing of field teams, and the transfer of FRMAC to the EPA.

  7. Migration of seismicity and earthquake interactions monitored by GPS in SE Asia triple junction: Sulawesi, Indonesia

    E-print Network

    McCaffrey, Robert

    Authors: Christophe Vigny, Hugo Perfettini, Andrea Walpersdorf, Anne Lemoine, Wim Simons, et al. Global Positioning System (GPS) measurements made in Sulawesi, Indonesia, from 1992 to 1999 detected ... Keywords: fault, fluids, seismotectonics, earthquake, Indonesia. Citation: Vigny, C., et al., Migration of seismicity and earthquake interactions monitored by GPS in SE Asia triple junction: Sulawesi, Indonesia.

  8. Migration of seismicity and earthquake interactions monitored by GPS in SE Asia triple junction: Sulawesi, Indonesia

    E-print Network

    Vigny, Christophe

    Authors: Christophe Vigny, Hugo Perfettini, Andrea Walpersdorf, Anne Lemoine, Wim Simons, et al. Global Positioning System (GPS) measurements made in Sulawesi, Indonesia, from 1992 to 1999 detected ... Keywords: fault, fluids, seismotectonics, earthquake, Indonesia. 1. Introduction: The Eurasian, Philippine Sea ...

  9. The Community Seismic Network and Quake-Catcher Network: Monitoring building response to earthquakes through community instrumentation

    NASA Astrophysics Data System (ADS)

    Cheng, M.; Kohler, M. D.; Heaton, T. H.; Clayton, R. W.; Chandy, M.; Cochran, E.; Lawrence, J. F.

    2013-12-01

    The Community Seismic Network (CSN) and Quake-Catcher Network (QCN) are dense networks of low-cost ($50) accelerometers that are deployed by community volunteers in their homes in California. In addition, many accelerometers are installed in public spaces associated with civic services, publicly-operated utilities, university campuses, and high-rise buildings. Both CSN and QCN consist of observation-based structural monitoring which is carried out using records from one to tens of stations in a single building. We have deployed about 150 accelerometers in a number of buildings ranging between five and 23 stories in the Los Angeles region. In addition to a USB-connected device which connects to the host's computer, we have developed a stand-alone sensor-plug-computer device that directly connects to the internet via Ethernet or WiFi. In the case of CSN, the sensors report data to the Google App Engine cloud computing service consisting of data centers geographically distributed across the continent. This robust infrastructure provides parallelism and redundancy during times of disaster that could affect hardware. The QCN sensors, however, are connected to netbooks with continuous data streaming in real time via the Berkeley Open Infrastructure for Network Computing distributed computing software to a server at Stanford University. In both networks, continuous and triggered data streams use a STA/LTA scheme to determine the occurrence of significant ground accelerations. Waveform data, as well as derived parameters such as peak ground acceleration, are then sent to the associated archives. Visualization models of the instrumented buildings' dynamic linear response have been constructed using Google SketchUp and MATLAB. When data are available from a limited number of accelerometers installed in high rises, the buildings are represented as simple shear beam or prismatic Timoshenko beam models with soil-structure interaction. Small-magnitude earthquake records are used to identify the first two pairs of horizontal vibrational frequencies, which are then used to compute the response on every floor of the building, constrained by the observed data. The approach has been applied to a CSN-instrumented 12-story reinforced concrete building near downtown Los Angeles. The frequencies were identified directly from spectra of the 8 August 2012 M4.5 Yorba Linda, California earthquake acceleration time series. When the basic dimensions and the first two frequencies are input into a prismatic Timoshenko beam model of the building, the model yields mode shapes that have been shown to match well with densely recorded data. For the instrumented 12-story building, comparisons of the predicted responses on other floors, using only the record from the 9th floor, with actual data from those floors show this method to approximate the true response remarkably well.
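
    The STA/LTA scheme mentioned above can be sketched in a few lines. The example below is a generic short-term-average over long-term-average detector applied to the squared signal; the window lengths, threshold and synthetic trace are illustrative assumptions, not the CSN or QCN settings.

        # Minimal sketch (window lengths and threshold are illustrative, not the
        # CSN/QCN settings): classic STA/LTA detection on an acceleration trace.
        import numpy as np

        def sta_lta(trace, fs, sta_sec=1.0, lta_sec=30.0):
            """Ratio of short-term to long-term average of the squared signal."""
            energy = np.asarray(trace, dtype=float) ** 2
            csum = np.concatenate(([0.0], np.cumsum(energy)))
            n_sta, n_lta = int(sta_sec * fs), int(lta_sec * fs)
            ratio = np.zeros(len(trace))
            for i in range(n_lta, len(trace)):
                sta = (csum[i + 1] - csum[i + 1 - n_sta]) / n_sta
                lta = (csum[i + 1] - csum[i + 1 - n_lta]) / n_lta
                ratio[i] = sta / lta if lta > 0 else 0.0
            return ratio

        def trigger_onsets(ratio, threshold=5.0):
            """Sample indices where the ratio first rises above the threshold."""
            above = ratio >= threshold
            return np.flatnonzero(above[1:] & ~above[:-1]) + 1

        # Hypothetical usage: background noise with strong shaking starting at t = 60 s
        fs = 50.0
        trace = 0.01 * np.random.randn(int(120 * fs))
        trace[int(60 * fs):int(70 * fs)] += 0.2 * np.random.randn(int(10 * fs))
        onsets = trigger_onsets(sta_lta(trace, fs))
        print("first trigger at t =", onsets[0] / fs, "s")   # expected near 60 s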

  10. Monitoring of earthquake precursors by multi-parameter stations in Eskisehir region (Turkey)

    NASA Astrophysics Data System (ADS)

    Yuce, G.; Ugurluoglu, D. Y.; Adar, N.; Yalcin, T.; Yaltirak, C.; Streil, T.; Oeserd, V. O.

    2010-04-01

    The objective of this study was to investigate the geochemical and hydrogeological effects of earthquakes on fluids in aquifers, particularly in a seismically active area such as Eskisehir (Turkey), where the Thrace-Eskisehir Fault Zone crosses the region. The study area is also close to the North Anatolian Fault Zone, which generates devastating earthquakes such as those experienced in 1999 that reactivated the Thrace-Eskisehir Fault. In the studied area, Rn and CO2 gas concentrations, redox potential, electrical conductivity, pH, water level, water temperature, and climatic parameters were continuously measured at five stations for about a year. Based on the data gathered from the stations, some ambiguous anomalies in geochemical parameters and in the Rn concentration of groundwater were observed as precursors several days prior to an earthquake. According to the mid-term observations of this study, well-water level changes were found to be a good indicator for seismic estimations in the area, as they provide naturally filtered anomalies reflecting only the changes due to earthquakes. The results also suggest that changes in well-water level and gas-water chemistry need to be interpreted together for more accurate estimations. For the studied area, shallow earthquakes with epicentral distances of <30 km from the observation stations have the greatest influence on the hydrochemical parameters of groundwater and on well-water level changes. Although some hydrochemical anomalies were observed in the area, further observations are required before they can be identified as precursors.
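    As an illustration of the kind of screening such continuous multi-parameter records invite, the sketch below flags readings that depart from a trailing baseline by more than a few standard deviations; the hourly sampling, window length, and threshold are assumptions for illustration and are not values used at the Eskisehir stations.

```python
import numpy as np

def flag_anomalies(series, baseline_win=7 * 24, n_sigma=3.0):
    """Flag hourly readings (e.g., well-water level or dissolved Rn) that
    deviate from the trailing-window mean by more than n_sigma standard
    deviations.  Returns a boolean array aligned with the input."""
    series = np.asarray(series, dtype=float)
    flags = np.zeros(series.size, dtype=bool)
    for i in range(baseline_win, series.size):
        window = series[i - baseline_win:i]
        mu, sigma = window.mean(), window.std()
        if sigma > 0 and abs(series[i] - mu) > n_sigma * sigma:
            flags[i] = True
    return flags

# Example: one month of synthetic hourly water levels with a step-like excursion.
rng = np.random.default_rng(7)
level = 12.0 + rng.normal(0.0, 0.02, 30 * 24)
level[600:650] += 0.3
print("flagged hours:", int(flag_anomalies(level).sum()))
```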

  11. Monitoring Local and Teleseismic Earthquakes Offshore San Diego (California) During an OBSIP Test Deployment

    NASA Astrophysics Data System (ADS)

    Laske, G.; Babcock, J.; Hollinshead, C.; Georgieff, P.; Allmann, B.; Orcutt, J.

    2004-12-01

    The Scripps OBS (Ocean Bottom Seismometer) team is one of three groups that provide instrumentation for the US National OBS Instrument Pool (OBSIP). The compact active-source LC2000 instruments are being used successfully in numerous experiments, with excellent data quality and return rates. A set of five new passive seismic instruments was test-deployed from November 6th, 2003 through January 8th, 2004 in the San Diego Trough, about 1 km below the sea surface and about 40 km offshore San Diego, California. These instruments are equipped with a Nanometrics Trillium 40s three-component seismometer and a Cox-Webb differential pressure gauge. We recorded more than 30 teleseismic earthquakes suitable for a long-period surface wave study. The vertical-component seismometer recordings are of excellent quality and are often superior to those from similar sensors on land (Guralp CMG-40T). The signal-to-noise ratio on the DPGs depends strongly on the water depth and was expected to be low for the test deployment. Nevertheless, the December 22, 2003 San Simeon, California earthquake was recorded with high fidelity, and non-seismogenic signals are extremely coherent down to very long periods. We also recorded numerous local earthquakes. Many of these occurred offshore, and the OBSs were the closest stations by many tens of kilometers. For example, a magnitude 3.0 earthquake on the Coronado Banks Fault was recorded at station SOL in La Jolla at about 30 km distance, with a signal-to-noise ratio too poor to pick the first arrival. The next closest stations were 60 km and 80 km away, while one of the OBSs was only 20 km away. The co-deployment of DPGs allowed us to observe the first P arrival very clearly. We also recorded numerous events that were not recorded on land. About six months later, on June 15, 2004, the greater San Diego area was struck by a magnitude 5.2 earthquake on the San Clemente Fault, about 40 km southwest of the OBS test deployment. Though no structural damage was reported, intensity 4 shaking occurred throughout the city, which prompted Amtrak and Sea World to shut down operations for inspections. These events are continuous reminders that significant seismic hazard is caused by activity along the poorly understood offshore faults in the California Borderland. Real-time seismic monitoring using cabled or moored seismic observatories is clearly needed.

  12. The Evolution of the Federal Monitoring and Assessment Center

    SciTech Connect

    NSTec Aerial Measurement System

    2012-07-31

    The Federal Radiological Monitoring and Assessment Center (FRMAC) is a federal emergency response asset whose assistance may be requested by the Department of Homeland Security (DHS), the Department of Defense (DoD), the Environmental Protection Agency (EPA), the Nuclear Regulatory Commission (NRC), and state and local agencies to respond to a nuclear or radiological incident. It is an interagency organization with representation from the Department of Energy's National Nuclear Security Administration (DOE/NNSA), the Department of Defense (DoD), the Environmental Protection Agency (EPA), the Department of Health and Human Services (HHS), the Federal Bureau of Investigation (FBI), and other federal agencies. FRMAC, in its present form, was created in 1987 when the radiological support mission was assigned to the DOE's Nevada Operations Office by DOE Headquarters. The FRMAC asset, including its predecessor entities, was created, grew, and evolved to function as a response to radiological incidents. Radiological emergency response exercises showed the need for a coordinated approach to managing federal emergency monitoring and assessment activities. The mission of FRMAC is to coordinate and manage all federal radiological environmental monitoring and assessment activities during a nuclear or radiological incident within the United States in support of state, local, and tribal governments, DHS, and the federal coordinating agency. Radiological emergency response professionals with the DOE's national laboratories support the Radiological Assistance Program (RAP), the National Atmospheric Release Advisory Center (NARAC), the Aerial Measuring System (AMS), and the Radiation Emergency Assistance Center/Training Site (REAC/TS). These teams support the FRMAC to provide atmospheric transport modeling, radiation monitoring, radiological analysis and data assessments, and medical advice for radiation injuries. In support of field operations, the FRMAC provides geographic information systems, communications, mechanical, electrical, logistics, and administrative support. The size of the FRMAC is tailored to the incident, and the center is composed of emergency response professionals drawn from across the federal government. State and local emergency response teams may also integrate their operations with FRMAC, but are not required to.

  13. Catalog of Earthquake Hypocenters at Alaskan Volcanoes: January 1 through December 31, 2007

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.

    2008-01-01

    Between January 1 and December 31, 2007, AVO located 6,664 earthquakes, of which 5,660 occurred within 20 kilometers of the 33 volcanoes monitored by the Alaska Volcano Observatory. Monitoring highlights in 2007 include the eruption of Pavlof Volcano, volcanic-tectonic earthquake swarms at the Augustine, Iliamna, and Little Sitkin volcanic centers, and the cessation of episodes of unrest at Fourpeaked Mountain, Mount Veniaminof, and the northern Atka Island volcanoes (Mount Kliuchef and Korovin Volcano). This catalog includes descriptions of: (1) locations of seismic instrumentation deployed during 2007; (2) earthquake detection, recording, analysis, and data archival systems; (3) seismic velocity models used for earthquake locations; (4) a summary of earthquakes located in 2007; and (5) an accompanying UNIX tar file with a summary of earthquake origin times, hypocenters, magnitudes, phase arrival times, location quality statistics, daily station usage statistics, and all files used to determine the earthquake locations in 2007.

  14. The Savannah River Technology Center environmental monitoring field test platform

    SciTech Connect

    Rossabi, J.

    1993-03-05

    Nearly all industrial facilities have been responsible for introducing synthetic chemicals into the environment. The Savannah River Site is no exception. Several areas at the site have been contaminated by chlorinated volatile organic chemicals. Because of the persistence and refractory nature of these contaminants, a complete cleanup of the site will take many years. A major focus of the mission of the Environmental Sciences Section of the Savannah River Technology Center is to develop better, faster, and less expensive methods for characterizing, monitoring, and remediating the subsurface. These new methods can then be applied directly at the Savannah River Site and at other contaminated areas in the United States and throughout the world. The Environmental Sciences Section has hosted field testing of many different monitoring technologies over the past two years, primarily as a result of the Integrated Demonstration Program sponsored by the Department of Energy's Office of Technology Development. This paper provides an overview of some of the technologies that have been demonstrated at the site and briefly discusses the applicability of these techniques.

  15. Federal Radiological Monitoring and Assessment Center Analytical Response

    SciTech Connect

    E.C. Nielsen

    2003-04-01

    The Federal Radiological Monitoring and Assessment Center (FRMAC) is authorized by the Federal Radiological Emergency Response Plan to coordinate all off-site radiological response assistance to state and local governments in the event of a major radiological emergency in the United States. The FRMAC is established by the U.S. Department of Energy, National Nuclear Security Administration, to coordinate all federal assets involved in conducting a comprehensive program of radiological environmental monitoring, sampling, radioanalysis, quality assurance, and dose assessment. During an emergency response, the initial analytical data are provided by portable field instrumentation. As incident responders scale up their response based on the seriousness of the incident, local analytical assets and mobile laboratories add additional capability and capacity. During the intermediate phase of the response, data quality objectives and measurement quality objectives are more rigorous. These higher objectives will require the use of larger laboratories with greater capacity and enhanced capabilities. These labs may be geographically distant from the incident, which will increase sample management challenges. This paper addresses emergency radioanalytical capability and capacity and its utilization during FRMAC operations.

  16. The response of academic medical centers to the 2010 Haiti earthquake: the Mount Sinai School of Medicine experience.

    PubMed

    Ripp, Jonathan A; Bork, Jacqueline; Koncicki, Holly; Asgary, Ramin

    2012-01-01

    On January 12, 2010, Haiti was struck by a magnitude 7.0 earthquake which left the country in a state of devastation. In the aftermath, there was an enormous relief effort in which academic medical centers (AMCs) played an important role. We offer a retrospective on the AMC response through the Mount Sinai School of Medicine (MSSM) experience. Over the course of the year that followed the earthquake, MSSM conducted five service trips in conjunction with two well-established groups that have provided service to the Haitian people for over 15 years. MSSM volunteer personnel included nurses, resident and attending physicians, and specialty fellows who provided expertise in critical care, emergency medicine, wound care, infectious diseases, and chronic disease management of adults and children. Challenges faced included stressful and potentially hazardous working conditions, provision of care with limited resources, and cultural and language barriers. The success of the MSSM response was due largely to the strength of its human resources and the relationships forged with effective relief organizations. These service missions fulfilled the institution's commitment to social responsibility and provided a valuable training opportunity in advocacy. For other AMCs seeking to respond in future emergencies, we suggest early identification of a partner with field experience, recruitment of administrative and faculty support across the institution, significant pre-departure orientation, and utilization of volunteers to fundraise and advocate. Through this process, AMCs can play an important role in disaster response. PMID:22232447

  18. Re-centering variable friction device for vibration control of structures subjected to near-field earthquakes

    NASA Astrophysics Data System (ADS)

    Ozbulut, Osman E.; Hurlebaus, Stefan

    2011-11-01

    This paper proposes a re-centering variable friction device (RVFD) for control of civil structures subjected to near-field earthquakes. The proposed hybrid device has two sub-components. The first sub-component consists of shape memory alloy (SMA) wires that exhibit a unique hysteretic behavior and full recovery following post-transformation deformations. The second sub-component consists of a variable friction damper (VFD) that can be intelligently controlled for adaptive semi-active behavior via modulation of its voltage level. In general, installed SMA devices have the ability to re-center structures at the end of the motion, and VFDs can increase the energy dissipation capacity of structures. The full realization of these devices in a singular, hybrid form that complements the performance of each device is investigated in this study. A neuro-fuzzy model is used to capture the rate- and temperature-dependent nonlinear behavior of the SMA components of the hybrid device. An optimal fuzzy logic controller (FLC) is developed to modulate the voltage level of the VFDs for favorable performance in an RVFD hybrid application. To obtain optimal controllers for concurrent mitigation of displacement and acceleration responses, tuning of the governing fuzzy rules is conducted by a multi-objective heuristic optimization. Then, numerical simulation of a multi-story building is conducted to evaluate the performance of the hybrid device. Results show that a re-centering variable friction device modulated with a fuzzy logic control strategy can effectively reduce structural deformations without increasing the acceleration response during near-field earthquakes.
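    The control idea, commanding a higher damper voltage as the structural response grows, can be sketched with a toy fuzzy rule base; the membership functions, rules, and voltage range below are illustrative assumptions and not the optimized controller developed in the paper.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function that peaks at b and is zero outside [a, c]."""
    return float(np.maximum(0.0, np.minimum((x - a) / (b - a), (c - x) / (c - b))))

def fuzzy_voltage(drift, velocity, v_max=12.0):
    """Toy Mamdani-style rule base: command a higher damper voltage when the
    normalized story drift and velocity are large.  Inputs are assumed to be
    pre-normalized to roughly [0, 1]."""
    d, v = abs(drift), abs(velocity)
    d_small, d_large = tri(d, -1.0, 0.0, 1.0), tri(d, 0.0, 1.0, 2.0)
    v_small, v_large = tri(v, -1.0, 0.0, 1.0), tri(v, 0.0, 1.0, 2.0)
    # Rule strengths (min = AND): voltage stays low only if both inputs are small.
    low = min(d_small, v_small)
    high = max(min(d_large, v_large), min(d_large, v_small), min(d_small, v_large))
    # Weighted average of singleton outputs at 0 V and v_max.
    return v_max * high / (low + high) if (low + high) > 0 else 0.0

print(fuzzy_voltage(0.2, 0.8))  # moderate drift, high velocity -> higher voltage
```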

  19. Logic-centered architecture for ubiquitous health monitoring.

    PubMed

    Lewandowski, Jacek; Arochena, Hisbel E; Naguib, Raouf N G; Chao, Kuo-Ming; Garcia-Perez, Alexeis

    2014-09-01

    One of the key points to maintain and boost research and development in the area of smart wearable systems (SWS) is the development of integrated architectures for intelligent services, as well as wearable systems and devices for health and wellness management. This paper presents such a generic architecture for multiparametric, intelligent and ubiquitous wireless sensing platforms. It is a transparent, smartphone-based sensing framework with customizable wireless interfaces and plug'n'play capability to easily interconnect third-party sensor devices. It caters to wireless body, personal, and near-me area networks. A pivotal part of the platform is the integrated inference engine/runtime environment that allows the mobile device to serve as a user-adaptable personal health assistant. The novelty of this system lies in its rapid visual development and remote deployment model. The complementary visual Inference Engine Editor that comes with the package enables artificial intelligence specialists, alongside medical experts, to build data processing models by assembling different components and instantly deploying them (remotely) on patient mobile devices. In this paper, the new logic-centered software architecture for ubiquitous health monitoring applications is described, followed by a discussion of how it helps to shift the focus from software and hardware development to medical and health process-centered design of new SWS applications. PMID:25192566
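    As a loose illustration of the kind of data-processing model such an inference engine might execute on the phone, the sketch below evaluates simple threshold rules against one multiparametric reading; the rule structure and thresholds are hypothetical and are not taken from the described platform.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Rule:
    """One inference rule: emit a message when the condition holds for a reading."""
    name: str
    condition: Callable[[Dict[str, float]], bool]
    message: str

def evaluate(rules: List[Rule], reading: Dict[str, float]) -> List[str]:
    """Run all rules against a single multiparametric sensor reading."""
    return [r.message for r in rules if r.condition(reading)]

# Hypothetical rules of the kind a clinician might assemble in a visual editor.
rules = [
    Rule("tachycardia", lambda r: r.get("heart_rate", 0.0) > 120, "Heart rate above 120 bpm"),
    Rule("low_spo2", lambda r: r.get("spo2", 100.0) < 90, "Oxygen saturation below 90%"),
]

print(evaluate(rules, {"heart_rate": 130.0, "spo2": 95.0}))
```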

  20. On the Potential Uses of Static Offsets Derived From Low-Cost Community Instruments and Crowd-Sourcing for Earthquake Monitoring and Rapid Response

    NASA Astrophysics Data System (ADS)

    Minson, S. E.; Brooks, B. A.; Murray, J. R.; Iannucci, R. A.

    2013-12-01

    We explore the efficacy of low-cost community instruments (LCCIs) and crowd-sourcing to produce rapid estimates of earthquake magnitude and rupture characteristics which can be used for earthquake loss reduction, such as issuing tsunami warnings and guiding rapid response efforts. Real-time high-rate GPS data are just beginning to be incorporated into earthquake early warning (EEW) systems. These data are showing promising utility, including producing moment magnitude estimates which do not saturate for the largest earthquakes and determining the geometry and slip distribution of the earthquake rupture in real time. However, building a network of scientific-quality real-time high-rate GPS stations requires substantial infrastructure investment which is not practicable in many parts of the world. To expand the benefits of real-time geodetic monitoring globally, we consider the potential of pseudorange-based GPS locations such as the real-time positioning done onboard cell phones or on LCCIs that could be distributed in the same way accelerometers are distributed as part of the Quake Catcher Network (QCN). While location information from LCCIs often has large uncertainties, their low cost means that large numbers of instruments can be deployed. A monitoring network that includes smartphones could collect data from potentially millions of instruments. These observations could be averaged together to substantially decrease errors associated with estimated earthquake source parameters. While these data will be inferior to data recorded by scientific-grade seismometers and GPS instruments, there are features of community-based data collection (and possibly analysis) that are very attractive. This approach creates a system where every user can host an instrument or download an application to their smartphone that provides them with earthquake and tsunami warnings while also contributing the data on which the warning system operates. This symbiosis helps to encourage people both to become users of the warning system and to contribute data to the system. Further, there is some potential to take advantage of the LCCI hosts' computing and communications resources to do some of the analysis required for the warning system. We will present examples of the type of data which might be observed by pseudorange-based positioning for both actual earthquakes and laboratory tests, as well as performance tests of potential earthquake source modeling derived from pseudorange data. A highlight of these performance tests is a case study of the 2011 Mw 9 Tohoku-oki earthquake.
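    The core statistical argument, that many coarse positions can stand in for a few precise ones, can be sketched as follows; the offset, per-device noise, and device counts are hypothetical numbers chosen only to illustrate how the error of the averaged static offset shrinks roughly as 1/sqrt(N).

```python
import numpy as np

rng = np.random.default_rng(42)

true_offset = 0.8     # hypothetical coseismic static offset, meters
sigma_device = 3.0    # assumed pseudorange positioning noise per device, meters

for n_devices in (1, 100, 10_000):
    # Each device reports (mean post-event position) - (mean pre-event position),
    # so its noise is roughly sqrt(2) times the single-position noise.
    offsets = true_offset + rng.normal(0.0, sigma_device * np.sqrt(2), n_devices)
    estimate = offsets.mean()
    expected_stderr = sigma_device * np.sqrt(2) / np.sqrt(n_devices)
    print(f"N={n_devices:6d}  estimated offset={estimate:6.2f} m"
          f"  expected std. error={expected_stderr:5.2f} m")
```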

  1. Seismic Monitoring and Post-Seismic Investigations following the 12 January 2010 Mw 7.0 Haiti Earthquake (Invited)

    NASA Astrophysics Data System (ADS)

    Altidor, J.; Dieuseul, A.; Ellsworth, W. L.; Given, D. D.; Hough, S. E.; Janvier, M. G.; Maharrey, J. Z.; Meremonte, M. E.; Mildor, B. S.; Prepetit, C.; Yong, A.

    2010-12-01

    We report on ongoing efforts to establish seismic monitoring in Haiti. Following the devastating M7.0 Haiti earthquake of 12 January 2010, the Bureau des Mines et de l'Energie worked with the U.S. Geological Survey and other scientific institutions to investigate the earthquake and to better assess hazard from future earthquakes. We deployed several types of portable instruments to record aftershocks: strong-motion instruments within Port-au-Prince to investigate the variability of shaking due to local geological conditions, and a combination of weak-motion, strong-motion, and broadband instruments around the Enriquillo-Plantain Garden fault (EPGF), primarily to improve aftershock locations and to lower the magnitude threshold of aftershock recording. A total of twenty instruments were deployed, including eight RefTek instruments and nine strong-motion (K2) accelerometers deployed in Port-au-Prince in collaboration with the USGS, and three additional broadband stations deployed in the epicentral region in collaboration with the University of Nice. Five K2s have remained in operation in Port-au-Prince since late June; in late June two instruments were installed in Cap-Haitien and Port de Paix in northern Haiti to provide monitoring of the Septentrional fault. A permanent strong-motion (NetQuakes) instrument was deployed in late June at the US Embassy. Five additional NetQuakes instruments will be deployed by the BME in late 2010/early 2011. Additionally, the BME has collaborated with other scientific institutions, including Columbia University, the Institut Géophysique du Globe, the University of Nice, the University of Texas at Austin, and Purdue University, to conduct other types of investigations. These studies include, for example, sampling of uplifted corals to establish a chronology of prior events in the region of the Enriquillo-Plantain Garden fault, surveys of geotechnical properties to develop microzonation maps of metropolitan Port-au-Prince, surveys of damage to public buildings, and a continuation of GPS surveys to measure co- and post-seismic displacements in collaboration with researchers from Purdue University. Preliminary analysis of aftershock recordings and damage surveys reveals that local site effects contributed significantly to the damage in some neighborhoods of Port-au-Prince. However, in general, bad construction practices and high population density were the primary causes of the extent of the damage and the high number of fatalities.

  2. 0327 Taiwan Nantou County earthquake

    E-print Network

    Chen, Sheng-Wei

    Quicklook slides on the 27 March 2013 (0327) Taiwan Nantou County earthquake: source parameters, intensity distribution (I-VII), shake map of peak ground motion from the National Science and Technology Center for Disaster Reduction (NCDR), and material on earthquake response and evacuation for students.

  3. Permeable Pavement Monitoring at the Edison Environmental Center Demonstration Site

    EPA Science Inventory

    The presentation covers the following monitoring objectives at the demonstration site at Edison, NJ: Hydrologic performance, water quality performance, urban heat island effects, maintenance effects and infiltration water parameters. There will be a side by side monitoring of ...

  4. Earthquakes & Volcanoes, Volume 21, Number 1, 1989: Featuring the U.S. Geological Survey's National Earthquake Information Center in Golden, Colorado, USA

    USGS Publications Warehouse

    U.S. Geological Survey; Spall, Henry, (Edited By); Schnabel, Diane C.

    1989-01-01

    Earthquakes and Volcanoes is published bimonthly by the U.S. Geological Survey to provide current information on earthquakes and seismology, volcanoes, and related natural hazards of interest to both generalized and specialized readers. The Secretary of the Interior has determined that the publication of this periodical is necessary in the transaction of the public business required by law of this Department. Use of funds for printing this periodical has been approved by the Office of Management and Budget through June 30, 1989. Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government.

  5. Continuous Video Electroencephalographic (EEG) Monitoring for Electrographic Seizure Diagnosis in Neonates: A Single-Center Study.

    E-print Network

    Wietstock, SO; Bonifacio, SL; Sullivan, JE; Nash, KB; Glass, HC

    2015-01-01

    Continuous Video Electroencephalographic (EEG) Monitoring for Electrographic Seizure Diagnosis in Neonates: A Single-Center Study (Journal of Child Neurology). The study examines the yield of continuous video EEG; video EEG affected clinical management in more than half of the monitored children.

  6. 88 hours: The U.S. Geological Survey National Earthquake Information Center response to the 11 March 2011 Mw 9.0 Tohoku earthquake

    USGS Publications Warehouse

    Hayes, G.P.; Earle, P.S.; Benz, H.M.; Wald, D.J.; Briggs, R.W.

    2011-01-01

    For the first time in a formal journal publication, this article presents a timeline of the NEIC response to a major global earthquake. We outline the key observations of the earthquake made by the NEIC and its partner agencies, discuss how these analyses evolved, and outline when and how this information was released to the public and to other internal and external parties. Our goal in the presentation of this material is to provide a detailed explanation of the issues faced in the response to a rare, giant earthquake. We envisage that the timeline format of this presentation can highlight technical and procedural successes and shortcomings, which may in turn help prompt research by our academic partners and further improvements to our future response efforts. We have shown how NEIC response efforts have significantly improved over the past six years since the great 2004 Sumatra-Andaman earthquake. We are optimistic that the research spawned from this disaster, and the unparalleled dense and diverse data sets that have been recorded, can lead to similar, and necessary, improvements in the future.

  7. GONAF - A deep Geophysical Observatory at the North Anatolian Fault: Permanent downhole monitoring of a pending major earthquake

    NASA Astrophysics Data System (ADS)

    Bulut, Fatih; Bohnhoff, Marco; Dresen, Georg; Raub, Christina; Kilic, Tugbay; Kartal, Recai F.; Tuba Kadirioglu, F.; Nurlu, Murat; Ito, Hisao; Malin, Peter E.

    2014-05-01

    The North Anatolian Fault Zone (hereafter NAFZ) is a right-lateral transform plate boundary between the Anatolian plate and Eurasia accommodating a relative plate motion of ~25 mm/yr. Almost the entire fault zone has failed during the last century as a westward-migrating sequence of destructive earthquakes, leaving a very high probability of a forthcoming large event on the Sea of Marmara segments. This area has not hosted an M>7 earthquake since 1766. Therefore, monitoring the Sea of Marmara segments at a very low detection threshold is required to address how brittle deformation develops along a critically stressed fault segment prior to a potential failure. The GONAF-ICDP project was developed to build a downhole seismic network surrounding the Sea of Marmara segments of the NAFZ, deploying 300-m-deep boreholes equipped with a chain of sensitive seismographs. Natural and city-induced noise is attenuated in the unconsolidated near-surface formations, which provides ideal conditions for seismic monitoring within the intact rock at greater depths. A typical GONAF borehole consists of a 1 Hz vertical sensor at every 75 m depth increment and a combination of 1 Hz, 2 Hz, and 15 Hz three-component sensors at 300 m depth. To date, three boreholes have been successfully completed in the Tuzla and Yalova-Çınarcık regions. The plan is to complete four more GONAF boreholes in 2014. Our preliminary results show that GONAF waveform recordings will broaden the magnitude range down to ~M -1 in the target area, providing better characterization of seismically active features in time and space.

  8. The continuous automatic monitoring network installed in Tuscany (Italy) since late 2002, to study earthquake precursory phenomena

    NASA Astrophysics Data System (ADS)

    Pierotti, Lisa; Cioni, Roberto

    2010-05-01

    Since late 2002, a continuous automatic monitoring network (CAMN) has been designed, built, and installed in Tuscany (Italy) in order to investigate and define the geochemical response of the aquifers to local seismic activity. The purpose of the investigation was to identify possible earthquake precursors. The CAMN consists of two groups of five measurement stations each. The first group was installed in the Serchio and Magra grabens (Garfagnana and Lunigiana valleys, northern Tuscany), and the second in the area of Mt. Amiata (southern Tuscany), an extinct volcano. The Garfagnana, Lunigiana, and Mt. Amiata regions belong to the inner zone of the Northern Apennine fold-and-thrust belt. This zone has been involved in post-collision extensional tectonics since the Upper Miocene-Pliocene. Such tectonic activity has produced horst and graben structures oriented from N-S to NW-SE that are linked by NE-SW transfer systems. Both Garfagnana (Serchio graben) and Lunigiana (Magra graben) belong to the innermost sector of the belt, where the seismic sources responsible for the strongest earthquakes of the northern Apennines are located (e.g., the M=6.5 earthquake of September 1920). The extensional processes in southern Tuscany have been accompanied by magmatic activity since the Upper Miocene, developing effusive and intrusive products traditionally attributed to the so-called Tuscan Magmatic Province. Mt. Amiata, whose magmatic activity ceased about 0.3 M.y. ago, belongs to the extensional Tyrrhenian sector, which is characterized by high heat flow and crustal thinning. The whole zone is characterized by widespread but moderate seismicity (the maximum recorded magnitude is 5.1, with epicentre at Piancastagnaio in 1919). The extensional regime in both the Garfagnana-Lunigiana and Mt. Amiata areas is confirmed by the focal mechanisms of recent earthquakes. An essential phase of the monitoring activities has been the selection of suitable sites for the installation of monitoring stations. This has been carried out on the basis of: i) hydrogeologic and structural studies in order to assess the underground fluid circulation regime; ii) a detailed geochemical study of all the natural manifestations present in the selected territories, such as cold and hot springs and gas emission zones; and iii) logistical aspects. A detailed hydrogeochemical study was therefore performed in 2002. A total of 150 water points were sampled and analysed in the Garfagnana/Lunigiana area (NW Tuscany). Based on the results of this multidisciplinary study, five water points suitable for the installation of the monitoring stations were selected. They are: Bagni di Lucca (Bernabò spring), Gallicano (Capriz spring), and Pieve Fosciana (Prà di Lama spring) in Garfagnana, and Equi Terme (main spring feeding the swimming pool of the thermal resort) and Villafranca (well feeding the public swimming pool) in Lunigiana. In the Amiata area, 69 water points were sampled and analyzed in the preliminary campaign and five sites were selected. They are Piancastagnaio, Santa Fiora, Pian dei Renai, and Bagnore, which are fed by the volcanic aquifer, and the Bagno Vignoni borehole, which is fed by the evaporite carbonate aquifer. The installation and start-up of the monitoring systems began in November 2002 in the Garfagnana-Lunigiana area and in June 2003 in the Monte Amiata region.
    Since the day of installation, periodic water sampling and manual measurement of the main physical and physicochemical parameters have been carried out on a monthly basis. This activity serves the double function of cross-checking the monitoring instrumentation and providing additional chemical and isotopic analyses. The continuous automatic monitoring stations operate with flowing water (about 5 litres per minute) and record the following parameters: temperature (T), pH, electrical conductivity (EC), redox potential (ORP), and the content of CO2 and CH4 dissolved in water. Data are acquired once per second; the average value, median value, and variance of the collected samples are recorded.

  9. Source spectra, moment, and energy for recent eastern mediterranean earthquakes: calibration of international monitoring system stations

    SciTech Connect

    Mayeda, K M; Hofstetter, A; Rodgers, A J; Walter, W R

    2000-07-26

    In the past several years there have been several large (Mw > 7.0) earthquakes in the eastern Mediterranean region (Gulf of Aqaba, Racha, Adana, etc.), many of which have had aftershock deployments by local seismological organizations. In addition to providing ground truth data (GT < 5 km) that are used in regional location calibration and validation, the waveform data can be used to aid in calibrating regional magnitudes, seismic discriminants, and velocity structure. For small regional events (mb < 4.5), a stable, accurate magnitude is essential in the development of realistic detection threshold curves, proper magnitude and distance amplitude correction processing, formation of an Ms:mb discriminant, and accurate yield determination of clandestine nuclear explosions. Our approach provides stable source spectra from which Mw and mb can be obtained without regional magnitude biases. Once calibration corrections are obtained for earthquakes, the coda-derived source spectra exhibit strong depth-dependent spectral peaking when the same corrections are applied to explosions at the Nevada Test Site (Mayeda and Walter, 1996), chemical explosions in the recent "Depth of Burial" experiment in Kazakhstan (Myers et al., 1999), and the recent nuclear test in India. For events in the western U.S. we found that total seismic energy, E, scales as Mo^1.25, resulting in more radiated energy than would be expected under the assumptions of constant stress-drop scaling. Preliminary results for events in the Middle East region also show this behavior, which appears to be the result of intermediate spectral fall-off (f^1.5) for frequencies ranging between ~0.1 and 0.8 Hz for the larger events. We developed a Seismic Analysis Code (SAC) coda processing command that reads in an ASCII flat file containing calibration information specific to a station and surrounding region, then outputs a coda-derived source spectrum, moment estimate, and energy estimate.
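    For reference, the sketch below shows the standard conversion from seismic moment to moment magnitude, together with an apparent-stress calculation that makes the scaling argument concrete; it is a generic illustration (with an assumed shear modulus), not the coda calibration procedure itself.

```python
import numpy as np

def moment_magnitude(m0_newton_meters):
    """Moment magnitude from seismic moment in N*m: Mw = (2/3) * (log10(M0) - 9.1)."""
    return (2.0 / 3.0) * (np.log10(m0_newton_meters) - 9.1)

def apparent_stress(energy_joules, m0_newton_meters, mu=3.0e10):
    """Apparent stress (Pa) = mu * E / M0, with an assumed crustal shear modulus mu.
    Constant apparent stress corresponds to E scaling linearly with M0; radiated
    energy growing faster than M0 implies apparent stress increasing with moment."""
    return mu * energy_joules / m0_newton_meters

print(round(moment_magnitude(1.0e19), 2))   # about 6.6
print(apparent_stress(5.0e14, 1.0e19))      # Pa, for illustrative E and M0 values
```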

  10. Managing Multi-center Flow Cytometry Data for Immune Monitoring

    PubMed Central

    White, Scott; Laske, Karoline; Welters, Marij JP; Bidmon, Nicole; van der Burg, Sjoerd H; Britten, Cedrik M; Enzor, Jennifer; Staats, Janet; Weinhold, Kent J; Gouttefangeas, Cécile; Chan, Cliburn

    2014-01-01

    With the recent results of promising cancer vaccines and immunotherapy, immune monitoring has become increasingly relevant for measuring treatment-induced effects on T cells, and an essential tool for shedding light on the mechanisms responsible for a successful treatment. Flow cytometry is the canonical multi-parameter assay for the fine characterization of single cells in solution, and is ubiquitously used in pre-clinical tumor immunology and in cancer immunotherapy trials. Current state-of-the-art polychromatic flow cytometry involves multi-step, multi-reagent assays followed by sample acquisition on sophisticated instruments capable of capturing up to 20 parameters per cell at a rate of tens of thousands of cells per second. Given the complexity of flow cytometry assays, reproducibility is a major concern, especially for multi-center studies. A promising approach for improving reproducibility is the use of automated analysis borrowing from statistics, machine learning, and information visualization, as these methods directly address the subjectivity, operator dependence, labor intensiveness, and low fidelity of manual analysis. However, it is quite time-consuming to investigate and test new automated analysis techniques on large data sets without some centralized information management system. For large-scale automated analysis to be practical, the presence of consistent and high-quality data linked to the raw FCS files is indispensable. In particular, the use of machine-readable standard vocabularies to characterize channel metadata is essential when constructing analytic pipelines to avoid errors in processing, analysis, and interpretation of results. For automation, this high-quality metadata needs to be programmatically accessible, implying the need for a consistent Application Programming Interface (API). In this manuscript, we propose that upfront time spent normalizing flow cytometry data to conform to carefully designed data models enables automated analysis, potentially saving time in the long run. The ReFlow informatics framework was developed to address these data management challenges. PMID:26085786
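    As a small illustration of the machine-readable vocabulary the manuscript argues for, the sketch below maps raw FCS channel labels to standardized marker names and flags anything outside the vocabulary; the specific labels and mappings are hypothetical.

```python
# Hypothetical controlled vocabulary mapping instrument channel labels to
# standardized marker names, so automated pipelines see consistent metadata
# regardless of which center produced the FCS file.
CHANNEL_VOCABULARY = {
    "FITC-A": "CD4",
    "PerCP-Cy5-5-A": "CD3",
    "APC-A": "CD8",
}

def normalize_channels(raw_channels):
    """Map channel labels to standard marker names; unknown labels are
    reported rather than silently passed through."""
    normalized, unknown = {}, []
    for channel in raw_channels:
        if channel in CHANNEL_VOCABULARY:
            normalized[channel] = CHANNEL_VOCABULARY[channel]
        else:
            unknown.append(channel)
    return normalized, unknown

print(normalize_channels(["FITC-A", "APC-A", "SSC-A"]))
```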

  11. Space Monitoring Data Center at Moscow State University

    NASA Astrophysics Data System (ADS)

    Kalegaev, Vladimir; Bobrovnikov, Sergey; Barinova, Vera; Myagkova, Irina; Shugay, Yulia; Barinov, Oleg; Dolenko, Sergey; Mukhametdinova, Ludmila; Shiroky, Vladimir

    The space monitoring data center of Moscow State University provides operational information on the radiation state of near-Earth space. The internet portal http://swx.sinp.msu.ru/ gives access to current data characterizing the level of solar activity and the geomagnetic and radiation conditions in the magnetosphere and heliosphere in real time. Operational data coming from space missions (ACE, GOES, ELECTRO-L1, Meteor-M1) at L1, LEO, and GEO and from the Earth's surface are used to represent the geomagnetic and radiation state of the near-Earth environment. An online database of measurements is also maintained to allow quick comparison between current conditions and conditions experienced in the past. Models of the space environment running in autonomous mode are used to generalize the information obtained from observations to the whole magnetosphere. Interactive applications and operational forecasting services are built on the basis of these models. They automatically generate alerts on particle flux enhancements above threshold values, both for SEP and relativistic electrons, using data from LEO orbits. Special forecasting services give short-term forecasts of SEP penetration into the Earth's magnetosphere at low altitudes, as well as of relativistic electron fluxes at GEO. Velocities of recurrent high-speed solar wind streams at the Earth's orbit are predicted 3-4 days in advance on the basis of automatic estimation of the coronal hole areas detected in images of the Sun received from the SDO satellite. By means of a neural network approach, online forecasting of the Dst and Kp indices 0.5-1.5 hours ahead is carried out, based on the solar wind and interplanetary magnetic field measured by the ACE satellite. A visualization system allows experimental and modeling data to be represented in 2D and 3D.
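    A minimal sketch of a neural-network regression of the kind used for such index forecasting is given below; the synthetic inputs, network size, and training settings are assumptions for illustration, and a real forecast would be trained on historical solar wind, interplanetary magnetic field, and index data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for hourly inputs (e.g., solar wind speed, density, IMF Bz)
# and a Dst-like target one hour ahead; real training data would come from archives.
X = rng.normal(size=(2000, 3))
y = -20.0 * np.tanh(X[:, 2]) + 5.0 * X[:, 0] + rng.normal(0.0, 2.0, 2000)

# One-hidden-layer network trained by full-batch gradient descent on MSE loss.
n_hidden, lr = 16, 0.01
W1 = rng.normal(0.0, 0.5, (3, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.5, (n_hidden, 1)); b2 = np.zeros(1)

for _ in range(3000):
    h = np.tanh(X @ W1 + b1)            # hidden activations
    pred = (h @ W2 + b2).ravel()        # predicted index, nT
    err = pred - y
    # Backpropagation of the mean-squared-error gradient.
    g_pred = (2.0 / len(y)) * err[:, None]
    gW2 = h.T @ g_pred; gb2 = g_pred.sum(axis=0)
    g_h = (g_pred @ W2.T) * (1.0 - h ** 2)
    gW1 = X.T @ g_h; gb1 = g_h.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

print("RMSE on training data (nT):", round(float(np.sqrt(np.mean(err ** 2))), 2))
```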

  12. Catalog of earthquake hypocenters at Alaskan Volcanoes: January 1 through December 31, 2010

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.; Searcy, Cheryl K.

    2011-01-01

    Between January 1 and December 31, 2010, the Alaska Volcano Observatory (AVO) located 3,405 earthquakes, of which 2,846 occurred within 20 kilometers of the 33 volcanoes with seismograph subnetworks. There was no significant seismic activity in 2010 at these monitored volcanic centers. Seismograph subnetworks with severe outages in 2009 were repaired in 2010 resulting in three volcanic centers (Aniakchak, Korovin, and Veniaminof) being relisted in the formal list of monitored volcanoes. This catalog includes locations and statistics of the earthquakes located in 2010 with the station parameters, velocity models, and other files used to locate these earthquakes.

  13. A summary of ground motion effects at SLAC (Stanford Linear Accelerator Center) resulting from the Oct 17th 1989 earthquake

    SciTech Connect

    Ruland, R.E.

    1990-08-01

    Ground motions resulting from the October 17th 1989 (Loma Prieta) earthquake are described and can be correlated with some geologic features of the SLAC site. Recent deformations of the linac are also related to slow motions observed over the past 20 years. Measured characteristics of the earthquake are listed. Some effects on machine components and detectors are noted. 18 refs., 16 figs.

  14. Martin Marietta Energy Systems, Inc. comprehensive earthquake management plan: Emergency Operations Center training manual

    SciTech Connect

    Not Available

    1990-02-28

    The objective of this training is to describe the responsibilities, resources, and goals of the Emergency Operations Center and to enable participants to evaluate and interpret this information in order to best direct and allocate emergency, plant, and other resources to protect life and the Paducah Gaseous Diffusion Plant.

  15. ESTABLISHMENT OF THE WESTERN REGIONAL CENTER FOR BIOLOGICAL MONITORING AND ASSESSMENT OF FRESHWATER ECOSYSTEMS:

    EPA Science Inventory

    Initial Center Objectives 1. Coordinate the establishment of the Advisory Board for the newly formed Western Regional Center for Biological Monitoring and Assessment of Freshwater Ecosystems. The responsibility of the Advisory Board will be to set research, education, and outr...

  16. Monitoring of movement of potential earthquake areas with precise distance measuring and leveling systems

    NASA Astrophysics Data System (ADS)

    Staples, Jack E.

    1986-11-01

    Whether monitoring crustal movements in localized volcanic areas along known fault lines, or over large crustal-movement areas, the geodesist has been restricted by the measurement accuracy of the instruments used, the accumulation of errors, the lack of reliable air refraction information, and the problem of finding proper measurement procedures and mathematical solutions to assure that the inherent errors of the measurement-mathematical procedures do not exceed any conceivable ground movement. Recent technological advances have placed new instruments and systems at the disposal of the geodesist, so that it is now feasible to measure and analyze these micro and macro crustal movements within the accuracies required. The paper describes three such systems: (1) the Wild Electronic Theodolite T-2000 with a highly precise distance-measurement instrument, the DI-4S, together with a data collector, the GRE-3, which are connected to a computer and a plotter to measure and analyze both micro and macro crustal movements; (2) the Wild NAK-2 level with an antimagnetic compensator, which increases the accuracy of height/velocity monitoring of vertical crustal movements by virtually eliminating the influence of natural or man-made magnetic fields on the automatic level; and (3) the use of analytical photogrammetry employing both terrestrial and aerial photography to monitor crustal movements. By taking advantage of these new instruments and systems, the scientist's capability to provide crustal movement data for use in the analysis and prediction of micro or macro crustal movement is greatly enhanced.

  17. Real-time monitoring of earthquake-prone areas by RST analysis of satellite TIR radiances: results of continuous monitoring over the Italy and Turkey regions.

    NASA Astrophysics Data System (ADS)

    Tramutoli, V.; Filizzola, C.; Genzano, N.; Lisi, M.; Paciello, R.; Pergola, N.

    2012-04-01

    Meteorological satellites, offering global coverage, continuity of observations, and long-term time series (starting as much as 30 years ago), provide a unique possibility not only to learn from the past but also to guarantee continuous monitoring where other observation technologies are lacking because they are too expensive or (as in the case of earthquake precursor studies) considered useless by decision-makers. Space-time fluctuations of Earth's emitted thermal infrared (TIR) radiation have been observed from satellite months to weeks before earthquake occurrence. The general RST approach has been proposed (since 2001) in order to discriminate normal TIR signal fluctuations (i.e., those related to changes in natural factors and/or observation conditions) from anomalous signal transients possibly associated with earthquake occurrence. Since then, several earthquakes that occurred in Europe, Africa, and America have been studied by analyzing decades of satellite observations, always using a validation/confutation approach in order to verify the presence/absence of anomalous space-time TIR transients in the presence/absence of significant seismic activity. In the framework of the PRE-EARTHQUAKES EU-FP7 Project (www.pre-earthquakes.org), starting from October 2010 (and still continuing), the RST approach has been applied to MSG/SEVIRI data to generate TIR anomaly maps over the Italian peninsula, continuously for all the midnight slots. Since September 2011 the same monitoring activity (still continuing) has been under way for the Turkey region. For the first time, a similar analysis has been performed in real time, systematically analyzing TIR anomaly maps in order to identify, day by day, possible significant (e.g., persistent in the space-time domain) thermal anomalies. During 2011, only in very few cases (one in Italy in July and two in the Turkish region in September and November) did the day-by-day analysis reveal significant anomalies; in two cases these were communicated to the other PRE-EARTHQUAKES partners for their attention. In this paper, results of this analysis are presented, which seem to confirm results independently achieved (unfortunately without mutual knowledge) by other authors applying a similar approach to EOS/MODIS data over the California region.
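    The normalization at the heart of such robust satellite techniques can be sketched as a pixel-wise standardized anomaly against a multi-year stack of homologous scenes (same month and time slot); the array shapes, the flagging threshold, and the omitted persistence check are illustrative assumptions.

```python
import numpy as np

def tir_anomaly_index(current_scene, historical_scenes):
    """Pixel-wise normalized TIR anomaly in the spirit of the RST approach:
    (T - historical mean) / historical standard deviation, where the stack
    holds co-located scenes from the same month and time slot over many years."""
    hist = np.asarray(historical_scenes, dtype=float)   # shape (n_years, ny, nx)
    mu = np.nanmean(hist, axis=0)
    sigma = np.nanstd(hist, axis=0)
    safe_sigma = np.where(sigma > 0, sigma, np.nan)
    return (np.asarray(current_scene, dtype=float) - mu) / safe_sigma

# Synthetic demo: a 10-year stack of brightness temperatures (K) and a current
# scene with a small warm patch; persistence checking in space-time is omitted.
stack = np.random.default_rng(3).normal(290.0, 2.0, (10, 50, 50))
scene = stack.mean(axis=0)
scene[20:25, 20:25] += 8.0
print(int((tir_anomaly_index(scene, stack) > 2.0).sum()), "pixels above index 2")
```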

  18. Earthquake Testing

    NASA Technical Reports Server (NTRS)

    1979-01-01

    During NASA's Apollo program, it was necessary to subject the mammoth Saturn V launch vehicle to extremely forceful vibrations to assure the moon booster's structural integrity in flight. Marshall Space Flight Center assigned vibration testing to a contractor, the Scientific Services and Systems Group of Wyle Laboratories, Norco, California. Wyle-3S, as the group is known, built a large facility at Huntsville, Alabama, and equipped it with an enormously forceful shock and vibration system to simulate the liftoff stresses the Saturn V would encounter. Saturn V is no longer in service, but Wyle-3S has found spinoff utility for its vibration facility. It is now being used to simulate earthquake effects on various kinds of equipment, principally equipment intended for use in nuclear power generation. Government regulations require that such equipment demonstrate its ability to survive earthquake conditions. In the upper left photo, Wyle-3S is preparing to conduct an earthquake test on a 25-ton diesel generator built by Atlas Polar Company, Ltd., Toronto, Canada, for emergency use in a Canadian nuclear power plant. Being readied for test in the lower left photo is a large circuit breaker to be used by Duke Power Company, Charlotte, North Carolina. Electro-hydraulic and electro-dynamic shakers in and around the pit simulate earthquake forces.

  19. THE KASHMIR EARTHQUAKE OF OCTOBER 8, 2005 A QUICKLOOK REPORT

    E-print Network

    Masud, Arif

    The Kashmir Earthquake of October 8, 2005: A Quicklook Report. Ahmad Jan Durrani, Amr Salah Elnashai, Youssef M. A. Hashash, Sung Jig Kim, Arif Masud. Mid-America Earthquake Center, University of Illinois at Urbana-Champaign.

  20. Structural Health Monitoring Sensor Development at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Prosser, W. H.; Wu, M. C.; Allison, S. G.; DeHaven, S. L.; Ghoshal, A.

    2002-01-01

    NASA is devoting considerable effort to the development of sensor technology for structural health monitoring (SHM). This research is targeted toward increasing the safety and reliability of aerospace vehicles, while reducing operating and maintenance costs. Research programs are focused on applications to both aircraft and space vehicles. Sensor technologies under development span a wide range, including fiber-optic sensing, active and passive acoustic sensors, electromagnetic sensors, wireless sensing systems, MEMS, and nanosensors. Because of their numerous advantages for aerospace applications, fiber-optic sensors are one of the leading candidates and are the major focus of this presentation. In addition, recent advances in active and passive acoustic sensing will also be discussed.

  1. Source Process of the Mw 5.0 Au Sable Forks, New York, Earthquake Sequence from Local Aftershock Monitoring Network Data

    NASA Astrophysics Data System (ADS)

    Kim, W.; Seeber, L.; Armbruster, J. G.

    2002-12-01

    On April 20, 2002, an Mw 5.0 earthquake occurred near the town of Au Sable Forks, northeastern Adirondacks, New York. The quake caused moderate damage (MMI VII) around the epicentral area and was well recorded by over 50 broadband stations at distances of 70 to 2000 km in eastern North America. Regional broadband waveform data are used to determine the source mechanism and focal depth using a moment tensor inversion technique. The source mechanism indicates predominantly thrust faulting along a 45°-dipping fault plane striking due south. The mainshock was followed by at least three strong aftershocks with local magnitude (ML) greater than 3, and about 70 aftershocks were detected and located in the first three months by a 12-station portable seismographic network. The aftershock distribution clearly delineates the mainshock rupture on the westerly dipping fault plane at a depth of 11 to 12 km. Preliminary analysis of the aftershock waveform data indicates that the orientation of the P-axis rotated 90° from that of the mainshock, suggesting a complex source process for the earthquake sequence. We achieved an important milestone in monitoring earthquakes and evaluating their hazards through rapid cross-border (Canada-US) and cross-regional (Central US-Northeastern US) collaborative efforts. Staff at Instrument Software Technology, Inc. near the epicentral area joined Lamont-Doherty staff and deployed the first portable station in the epicentral area; CERI dispatched two of their technical staff to the epicentral area with four accelerometers and a broadband seismograph; the IRIS/PASSCAL facility shipped three digital seismographs and ancillary equipment within one day of the request; and the POLARIS Consortium, Canada, sent a field crew of three with a near real-time, satellite-telemetry-based earthquake monitoring system. The POLARIS station, KSVO, powered by a solar panel and batteries, was already transmitting data to the central hub in London, Ontario, Canada within a day after the field crew arrived in the Au Sable Forks area. This collaboration allowed us to maximize the scarce resources available for monitoring this damaging earthquake and its aftershocks in the northeastern U.S.

  2. A survey conducted immediately after the 2011 Great East Japan Earthquake: evaluation of infectious risks associated with sanitary conditions in evacuation centers.

    PubMed

    Tokuda, Koichi; Kunishima, Hiroyuki; Gu, Yoshiaki; Endo, Shiro; Hatta, Masumitsu; Kanamori, Hajime; Aoyagi, Tetsuji; Ishibashi, Noriomi; Inomata, Shinya; Yano, Hisakazu; Kitagawa, Miho; Kaku, Mitsuo

    2014-08-01

    In cooperation with the Miyagi prefectural government, we conducted a survey of the management of sanitation at evacuation centers and the health of the evacuees by visiting 324 evacuation centers two weeks after the 2011 Great East Japan Earthquake. The facilities often used as evacuation centers were community centers (36%), schools (32.7%), and nursing homes (10.2%). It was more difficult to maintain a distance of at least 1 m between evacuees at the evacuation centers with a larger number of residents. At evacuation centers where the water supply was not restored, hygienic handling of food and the hand hygiene of the cooks were less than adequate. Among evacuation centers with ≥50 evacuees, there was a significant difference in the prevalence rate of digestive symptoms between the centers with and without persons in charge of health matters (0.3% vs. 2.1%, respectively, p < 0.001). The following three factors had an important influence on the level of sanitation at evacuation centers and the health of evacuees: 1) the size of the evacuation center, 2) the status of the water supply, and 3) the allocation of persons in charge of health matters. Given that adjusting the number of evacuees to fit the size of the evacuation center and prompt restoration of the water supply are difficult to achieve immediately after an earthquake, promptly placing persons in charge of health matters at evacuation centers is a practicable and effective measure, and allocation of at least one such person per 50 evacuees is desirable. PMID:24861538

  3. (Stanford Linear Accelerator Center) annual environmental monitoring report, January--December 1989

    SciTech Connect

    Not Available

    1990-05-01

    This progress report discusses environmental monitoring activities at the Stanford Linear Accelerator Center for 1989. Topics include climate, site geology, site water usage, land use, demography, unusual events or releases, radioactive and nonradioactive releases, compliance summary, environmental nonradiological program information, environmental radiological program information, groundwater protection monitoring, and quality assurance. 5 figs., 7 tabs. (KJD)

  4. Monitoring of the Permeable Pavement Demonstration Site at the Edison Environmental Center (Poster)

    EPA Science Inventory

    This is a poster on the permeable pavement parking lot at the Edison Environmental Center. The monitoring scheme for the project is discussed in-depth with graphics explaining the instrumentation installed at the site.

  5. Hatfield Marine Science Center Dynamic Revetment Project DSL permit #45455-FP, Monitoring Report February, 2013

    EPA Science Inventory

    A Dynamic Revetment (gravel beach) was installed in November, 2011 on the shoreline along the northeastern edge of the Hatfield Marine Science Center (HMSC) to mitigate erosion that threatened HMSC critical infrastructure. Shoreline topographic and biological monitoring was init...

  6. Hatfield Marine Science Center Dynamic Revetment Project DSL permit #45455-FP, Monitoring Report February 2012

    EPA Science Inventory

    A Dynamic Revetment (gravel beach) was installed in November, 2011 on the shoreline along the northeastern edge of the Hatfield Marine Science Center (HMSC) to mitigate erosion that threatened HMSC critical infrastructure. Shoreline topographic and biological monitoring was init...

  7. GPS Monitoring of Surface Change During and Following the Fortuitous Occurrence of the M(sub w) = 7.3 Landers Earthquake in our Network

    NASA Technical Reports Server (NTRS)

    Miller, M. Meghan

    1998-01-01

    Accomplishments: (1) Continues GPS monitoring of surface change during and following the fortuitous occurrence of the M(sub w) = 7.3 Landers earthquake in our network, in order to characterize earthquake dynamics and accelerated activity of related faults as far as hundreds of kilometers along strike. (2) Integrates the geodetic constraints into consistent kinematic descriptions of the deformation field that can in turn be used to characterize the processes that drive geodynamics, including seismic cycle dynamics. In 1991, we installed and occupied a high precision GPS geodetic network to measure transform-related deformation that is partitioned from the Pacific - North America plate boundary northeastward through the Mojave Desert, via the Eastern California shear zone to the Walker Lane. The onset of the M(sub w) = 7.3 June 28, 1992, Landers, California, earthquake sequence within this network poses unique opportunities for continued monitoring of regional surface deformation related to the culmination of a major seismic cycle, characterization of the dynamic behavior of continental lithosphere during the seismic sequence, and post-seismic transient deformation. During the last year, we have reprocessed all three previous epochs for which JPL fiducial-free point positioning products are available and are queued for the remaining needed products, completed two field campaigns monitoring approx. 20 sites (October 1995 and September 1996), begun modeling by development of a finite element mesh based on network station locations, and developed manuscripts dealing with both the Landers-related transient deformation at the latitude of Lone Pine and the velocity field of the whole experiment. We are currently deploying a 1997 observation campaign (June 1997). We use GPS geodetic studies to characterize deformation in the Mojave Desert region and related structural domains to the north, and geophysical modeling of lithospheric behavior. The modeling is constrained by our existing and continued GPS measurements, which will provide much needed data on far-field strain accumulation across the region and on the deformational response of continental lithosphere during and following a large earthquake, forming the basis for kinematic and dynamic modeling of secular and seismic-cycle deformation. GPS geodesy affords both regional coverage and high precision that uniquely bear on these problems.

  8. Biometric Monitoring as a Persuasive Technology: Ensuring Patients Visit Health Centers in India's Slums

    E-print Network

    Toronto, University of

    Biometric Monitoring as a Persuasive Technology: Ensuring Patients Visit Health Centers in India. This problem is especially acute for tuberculosis patients, who in India are required to visit a center over 40 of observation, uses biometric fingerprint scanning to ensure that tuberculosis patients receive and take medi

  9. GREENHOUSE GAS (GHG) MITIGATION AND MONITORING TECHNOLOGY PERFORMANCE: ACTIVITIES OF THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper discusses greenhouse gas (GHG) mitigation and monitoring technology performance activities of the GHG Technology Verification Center. The Center is a public/private partnership between Southern Research Institute and the U.S. EPA's Office of Research and Development. It...

  10. Statistical analysis of the induced Basel 2006 earthquake sequence: introducing a probability-based monitoring approach for Enhanced Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Bachmann, C. E.; Wiemer, S.; Woessner, J.; Hainzl, S.

    2011-08-01

    Geothermal energy is becoming an important clean energy source; however, the stimulation of a reservoir for an Enhanced Geothermal System (EGS) is associated with seismic risk due to induced seismicity. Seismicity occurring due to the water injection at depth has to be well recorded and monitored. To mitigate the seismic risk of a damaging event, an appropriate alarm system needs to be in place for each individual experiment. In recent experiments, the so-called traffic-light alarm system, based on public response, local magnitude and peak ground velocity, was used. We aim to improve the pre-defined alarm system by introducing a probability-based approach; we retrospectively model the ongoing seismicity in real time with multiple statistical forecast models and then translate the forecast to seismic hazard in terms of probabilities of exceeding a ground motion intensity level. One class of models accounts for the water injection rate, the main parameter that can be controlled by the operators during an experiment. By translating the models into time-varying probabilities of exceeding various intensity levels, we provide tools which are well understood by the decision makers and can be used to determine thresholds of non-exceedance during a reservoir stimulation; this, however, remains an entrepreneurial or political decision of the responsible project coordinators. We introduce forecast models based on the data set of an EGS experiment in the city of Basel. Between 2006 December 2 and 8, approximately 11 500 m3 of water was injected into a 5-km-deep well at high pressures. A six-sensor borehole array was installed by the company Geothermal Explorers Limited (GEL) at depths between 300 and 2700 m around the well to monitor the induced seismicity. The network recorded approximately 11 200 events during the injection phase, more than 3500 of which were located. With the traffic-light system, actions were implemented after an ML 2.7 event: the water injection was reduced and then stopped after another ML 2.5 event. A few hours later, an earthquake with ML 3.4, felt within the city, occurred, which led to bleed-off of the well. A risk study was later issued with the outcome that the experiment could not be resumed. We analyse the statistical features of the sequence and show that the sequence is well modelled with the Omori-Utsu law following the termination of water injection. Based on this model, the sequence will last 31+29/-14 years to reach the background level. We introduce statistical models based on Reasenberg and Jones and Epidemic Type Aftershock Sequence (ETAS) models, commonly used to model aftershock sequences. We compare and test different model setups to simulate the sequences, varying the number of fixed and free parameters. For one class of the ETAS models, we account for the flow rate at the injection borehole. We test the models against the observed data with standard likelihood tests and find the ETAS model accounting for the flow rate to perform best. Such a model may in future serve as a valuable tool for designing probabilistic alarm systems for EGS experiments.
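
    The duration quoted above follows from the modified Omori-Utsu decay of the post-shut-in rate. As a minimal sketch of that reasoning (in Python, with illustrative parameter values rather than the ones fitted for the Basel sequence), the time for the modelled rate to fall back to an assumed background rate can be computed directly:

      import numpy as np

      def omori_rate(t, k, c, p):
          """Modified Omori-Utsu rate (events/day) at time t (days) after shut-in."""
          return k / (t + c) ** p

      def time_to_background(k, c, p, r_bg):
          """Time (days) at which the Omori-Utsu rate equals the background rate,
          solving k / (t + c)**p = r_bg for t."""
          return (k / r_bg) ** (1.0 / p) - c

      # Illustrative values only -- not the parameters estimated in the study.
      k, c, p = 150.0, 0.5, 1.1        # productivity, time offset, decay exponent
      r_bg = 0.01                      # assumed background rate, events/day

      t_end = time_to_background(k, c, p, r_bg)
      print(f"modelled sequence reaches background after ~{t_end / 365.25:.0f} years")

      # Expected number of events still to come between day 30 and that time.
      t_grid = np.linspace(30.0, t_end, 20000)
      n_left = np.sum(omori_rate(t_grid, k, c, p)) * (t_grid[1] - t_grid[0])
      print(f"expected remaining events above completeness: {n_left:.0f}")

    A probabilistic alarm of the kind described would then convert such rate forecasts into probabilities of exceeding ground-motion intensity levels rather than reporting the rate itself.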

  11. Near real-time model to monitor SST anomalies related to undersea earthquakes and SW monsoon phenomena from TRMM-AQUA satellite data

    NASA Astrophysics Data System (ADS)

    Chakravarty, Subhas

    A near real-time interactive computer model has been developed to extract daily mean global Sea Surface Temperature (SST) values on a 1440 x 720 pixel grid, each pixel covering a 0.25° x 0.25° lat-long area, together with SST anomalies relative to longer period means for any required oceanic grid of interest. The core MATLAB code uses the daily binary files (3-day aggregate values) of global SST data (derived from the TRMM/TMI-AQUA/AMSRE satellite sensors) available on a near real-time basis through the REMSS/NASA website and converts these SSTs into global/regional maps and displays as well as digitised text data tables for further analysis. As demonstrated applications of the model, the SST data for the period 2003-2009 have been utilised to study (a) SST anomalies before, during and after the occurrence of the two great under-sea earthquakes of 26 December 2004 and 28 March 2005 near the western coast of Sumatra and (b) the variation of pixel numbers with SSTs between 27-31°C within (i) the Nino 4 region and (ii) a broader western Pacific region (say Nino-BP) affected by ENSO events before (January-May) and during (June-October) monsoon onset/progress. Preliminary results of these studies have been published (Chakravarty, The Open Oceanography Journal, 2009 and Chakravarty, IEEE Xplore, 2009). The results of the SST-earthquake analysis indicate a small but consistent warming of 0.2-0.3°C in the 2° x 2° grid area near the earthquake epicentre, starting a week earlier and persisting to a week later, for the event of 26 December 2004. A change in SST for the second earthquake is also indicated but with less clarity, owing to the mixing of land and ocean surfaces and hence the smaller number of SST pixels available within the 2° x 2° grid area near the corresponding epicentre. A similar analysis for the same period of non-earthquake years did not show any such SST anomalies. These results have far-reaching implications for using SST as a possible parameter to be monitored for signalling the occurrence of impending under-sea earthquakes, some of which lead to tsunamis. The results of the analysis of the ENSO-monsoon rainfall relation show that a time series of SST distribution within the Nino 4 or Nino-BP regions with a larger number of pixels with SSTs between 27-28°C is generally a favourable condition for normal rainfall. While both Nino 4 and Nino-BP provide similar results, the Nino-BP region is found to be a more sensitive region for such assessment of the trend of SW monsoon rainfall over India. This result has the potential to be used in the prognosis of the overall rainfall pattern of the monsoon season at weekly intervals, which may serve as vital information for Indian agricultural production. While simple geophysical models are able to explain the above correlations, more detailed modelling of the plate tectonics and heat fluxes (for undersea earthquakes) and ocean-cloud interaction/dynamics (for ENSO and monsoon rainfall patterns) would need to be undertaken.
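
    The anomaly extraction itself is simple once the daily grids are read: average the SST over a small box around the epicentre and subtract the corresponding long-period mean. The original model is written in MATLAB; the sketch below uses Python and synthetic arrays in place of the REMSS binary files, with only the grid dimensions (1440 x 720 pixels of 0.25°) taken from the abstract.

      import numpy as np

      NLON, NLAT = 1440, 720            # global 0.25-degree grid as described

      def box_indices(lat0, lon0, half_width_deg=1.0):
          """Row/column slices of the 2 x 2 degree box centred on (lat0, lon0)."""
          lat_idx = int((lat0 + 90.0) / 0.25)
          lon_idx = int((lon0 % 360.0) / 0.25)
          h = int(half_width_deg / 0.25)
          return (slice(max(lat_idx - h, 0), lat_idx + h),
                  slice(max(lon_idx - h, 0), lon_idx + h))

      def box_mean_sst(sst_grid, lat0, lon0):
          """Mean SST in the box, ignoring land/missing pixels flagged as NaN."""
          rows, cols = box_indices(lat0, lon0)
          return np.nanmean(sst_grid[rows, cols])

      # Synthetic stand-ins for the daily TMI/AMSR-E grids (degrees C); the real
      # inputs are binary files downloaded from the REMSS website.
      rng = np.random.default_rng(0)
      climatology = 28.0 + 0.2 * rng.standard_normal((NLAT, NLON))   # long-period mean
      today = climatology + 0.05 * rng.standard_normal((NLAT, NLON))
      # Impose a hypothetical +0.3 C warming near a Sumatra-like epicentre (3.3N, 95.9E).
      rows, cols = box_indices(3.3, 95.9)
      today[rows, cols] += 0.3

      anomaly = box_mean_sst(today, 3.3, 95.9) - box_mean_sst(climatology, 3.3, 95.9)
      print(f"2x2 degree box SST anomaly near epicentre: {anomaly:+.2f} C")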

  12. Testing hypotheses of earthquake occurrence

    NASA Astrophysics Data System (ADS)

    Kagan, Y. Y.; Jackson, D. D.; Schorlemmer, D.; Gerstenberger, M.

    2003-12-01

    We present a relatively straightforward likelihood method for testing those earthquake hypotheses that can be stated as vectors of earthquake rate density in defined bins of area, magnitude, and time. We illustrate the method as it will be applied to the Regional Earthquake Likelihood Models (RELM) project of the Southern California Earthquake Center (SCEC). Several earthquake forecast models are being developed as part of this project, and additional contributed forecasts are welcome. Various models are based on fault geometry and slip rates, seismicity, geodetic strain, and stress interactions. We would test models in pairs, requiring that both forecasts in a pair be defined over the same set of bins. Thus we offer a standard "menu" of bins and ground rules to encourage standardization. One menu category includes five-year forecasts of magnitude 5.0 and larger. Forecasts would be in the form of a vector of yearly earthquake rates on a 0.05 degree grid at the beginning of the test. Focal mechanism forecasts, when available, would also be archived and used in the tests. The five-year forecast category may be appropriate for testing hypotheses of stress shadows from large earthquakes. Interim progress will be evaluated yearly, but final conclusions would be made on the basis of cumulative five-year performance. The second category includes forecasts of earthquakes above magnitude 4.0 on a 0.05 degree grid, evaluated and renewed daily. Final evaluation would be based on cumulative performance over five years. Other types of forecasts with different magnitude, space, and time sampling are welcome and will be tested against other models with shared characteristics. All earthquakes would be counted, and no attempt made to separate foreshocks, main shocks, and aftershocks. Earthquakes would be considered as point sources located at the hypocenter. For each pair of forecasts, we plan to compute alpha, the probability that the first would be wrongly rejected in favor of the second, and beta, the probability that the second would be wrongly rejected in favor of the first. Computing alpha and beta requires knowing the theoretical distribution of likelihood scores under each hypothesis, which we will estimate by simulations. Each forecast is given equal status; there is no "null hypothesis" which would be accepted by default. Forecasts and test results would be archived and posted on the RELM web site. The same methods can be applied to any region with adequate monitoring and sufficient earthquakes. If fewer than ten events are forecast, the likelihood tests may not give definitive results. The tests do force certain requirements on the forecast models. Because the tests are based on absolute rates, stress models must be explicit about how stress increments affect past seismicity rates. Aftershocks of triggered events must be accounted for. Furthermore, the tests are sensitive to magnitude, so forecast models must specify the magnitude distribution of triggered events. Models should account for probable errors in magnitude and location by appropriate smoothing of the probabilities, as the tests will be "cold-hearted": near misses won't count.
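
    A stripped-down sketch of the pairwise comparison (Python, toy forecasts): score each forecast with a Poisson log-likelihood over the bins, then simulate catalogs under each hypothesis to estimate how often one model out-scores the other. Here "rejection" is simplified to whichever model attains the higher likelihood; the actual tests derive thresholds from the full simulated score distributions.

      import numpy as np
      from scipy.stats import poisson

      rng = np.random.default_rng(42)

      def log_likelihood(rates, counts):
          """Poisson log-likelihood of observed bin counts given forecast rates."""
          return np.sum(poisson.logpmf(counts, rates))

      def rejection_probability(gen_rates, rates_a, rates_b, n_sim=2000):
          """Fraction of simulated catalogs (drawn from gen_rates) in which model B
          attains a higher likelihood than model A."""
          rejected = 0
          for _ in range(n_sim):
              counts = rng.poisson(gen_rates)
              if log_likelihood(rates_b, counts) > log_likelihood(rates_a, counts):
                  rejected += 1
          return rejected / n_sim

      # Two toy forecasts over 100 space-magnitude bins (expected counts per bin).
      rates_a = np.full(100, 0.05)                        # uniform forecast
      rates_b = rates_a * rng.lognormal(0.0, 0.5, 100)    # spatially variable forecast

      alpha = rejection_probability(rates_a, rates_a, rates_b)   # A true, A rejected
      beta = rejection_probability(rates_b, rates_b, rates_a)    # B true, B rejected
      print(f"alpha ~ {alpha:.2f}, beta ~ {beta:.2f}")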

  13. The Canadian National Calibration Reference Center for Bioassay and in-vivo Monitoring: A program summary

    SciTech Connect

    Kramer, G.H.; Zamora, M.L.

    1994-08-01

    The Canadian National Calibration Reference Center for Bioassay and in-vivo Monitoring is part of the Radiation Protection Bureau, Department of Health. The Reference Center operates a variety of different intercomparison programs that are designed to confirm that workplace monitoring results are accurate and provide the necessary external verification required by the Canadian regulators. The programs administered by the Reference Center currently include urinalysis intercomparisons for tritium, natural uranium, and {sup 14}C, and in-vivo programs for whole-body, thorax, and thyroid monitoring. The benefits of the intercomparison programs to the participants are discussed by example. Future programs that are planned include dual-spiked urine samples which contain both tritium and {sup 14}C and the in-vivo measurement of {sup 99m}Tc. 18 refs., 1 fig., 2 tabs.

  14. Appraising the Early-est earthquake monitoring system for tsunami alerting at the Italian candidate Tsunami Service Provider

    NASA Astrophysics Data System (ADS)

    Bernardi, F.; Lomax, A.; Michelini, A.; Lauciani, V.; Piatanesi, A.; Lorito, S.

    2015-04-01

    In this paper we present the procedure for earthquake location and characterization implemented in the Italian candidate Tsunami Service Provider at INGV in Rome. Following the ICG/NEAMTWS guidelines, the first tsunami warning messages are based only on seismic information, i.e. epicenter location, hypocenter depth and magnitude, which are automatically computed by the software Early-est. Early-est is a package for rapid location and seismic/tsunamigenic characterization of earthquakes. The Early-est software package operates on offline-event or continuous-realtime seismic waveform data to perform trace processing and picking, and, at a regular report interval, phase association, event detection, hypocenter location, and event characterization. In this paper we present the earthquake parameters computed by Early-est from the beginning of 2012 till the end of December 2014 at global scale for events with magnitude M ≥ 5.5, and the detection timeline. The earthquake parameters computed automatically by Early-est are compared with reference manually revised/verified catalogs. From our analysis the epicenter location and hypocenter depth parameters do not differ significantly from the values in the reference catalogs. The epicenter coordinates generally differ by less than 20 ± 20 km from the reference epicenter coordinates; focal depths are less well constrained and generally differ by less than 0 ± 30 km. Early-est also provides mb, Mwp and Mwpd magnitude estimations. mb magnitudes are preferred for events with Mwp ≲ 5.8, while Mwpd estimations are valid for events with Mwp ≳ 7.2. The mb magnitudes show wide differences with respect to the reference catalogs; we thus apply a linear correction mbcorr = mb · 0.52 + 2.46, which results in a residual of Δmb ≈ 0.0 ± 0.2 with respect to the reference catalogs. As expected, Mwp shows a distance dependency. Mwp values at stations with epicentral distance Δ ≲ 30° are significantly overestimated with respect to the CMT-global solutions, whereas Mwp values at stations with epicentral distance Δ ≳ 90° are slightly underestimated. We thus apply a 3rd degree polynomial distance correction. After applying the distance correction, the Mwp provided by Early-est differs from the CMT-global catalog values by about ΔMwp ≈ 0.0 ± 0.2. Early-est continuously acquires time series data and updates the earthquake source parameters. Our analysis shows that the epicenter coordinates and the magnitude values converge rather quickly toward the final values. Generally we can provide robust and reliable earthquake source parameters to compile tsunami warning messages within less than about 15 min after the event origin time.
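
    The two magnitude corrections lend themselves to a short sketch. The linear mb correction below uses the coefficients quoted in the abstract; the 3rd-degree polynomial coefficients for the Mwp distance correction are not given in this record, so they appear only as placeholders (a hypothetical illustration, not the fitted values).

      import numpy as np

      def correct_mb(mb_raw):
          """Linear mb correction quoted in the abstract: mbcorr = 0.52*mb + 2.46."""
          return 0.52 * mb_raw + 2.46

      def correct_mwp(mwp_raw, distance_deg, coeffs=(0.0, 0.0, 0.0, 0.0)):
          """Subtract a 3rd-degree polynomial distance term from a station Mwp.
          `coeffs` (c0..c3) are placeholders; the fitted values are not given here."""
          c0, c1, c2, c3 = coeffs
          corr = c0 + c1 * distance_deg + c2 * distance_deg**2 + c3 * distance_deg**3
          return mwp_raw - corr

      # Station-level Mwp estimates as (magnitude, epicentral distance in degrees).
      station_mwp = np.array([[7.9, 20.0], [7.5, 55.0], [7.3, 95.0]])
      corrected = [correct_mwp(m, d) for m, d in station_mwp]
      # A median across stations is one plausible network statistic.
      print("network Mwp:", np.median(corrected))
      print("corrected mb for a raw mb of 4.1:", correct_mb(4.1))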

  15. EARTHQUAKES POWEROUTAGES

    E-print Network

    Tsunami, hurricanes, earthquakes, power outages, fire: prepare for emergencies. Sign up in the event of a widespread disaster, such as a hurricane or earthquake. If a large area is affected, phone

  16. Parkfield, California, earthquake prediction experiment

    SciTech Connect

    Bakun, W.H.; Lindh, A.G.

    1985-08-16

    Five moderate (magnitude 6) earthquakes with similar features have occurred on the Parkfield section of the San Andreas fault in central California since 1857. The next moderate Parkfield earthquake is expected to occur before 1993. The Parkfield prediction experiment is designed to monitor the details of the final stages of the earthquake preparation process; observations and reports of seismicity and aseismic slip associated with the last moderate Parkfield earthquake in 1966 constitute much of the basis of the design of the experiment. 46 references, 4 figures.

  17. Program Evaluation of Remote Heart Failure Monitoring: Healthcare Utilization Analysis in a Rural Regional Medical Center

    PubMed Central

    Keberlein, Pamela; Sorenson, Gigi; Mohler, Sailor; Tye, Blake; Ramirez, A. Susana; Carroll, Mark

    2015-01-01

    Abstract Background: Remote monitoring for heart failure (HF) has had mixed and heterogeneous effects across studies, necessitating further evaluation of remote monitoring systems within specific healthcare systems and their patient populations. “Care Beyond Walls and Wires,” a wireless remote monitoring program to facilitate patient and care team co-management of HF patients, served by a rural regional medical center, provided the opportunity to evaluate the effects of this program on healthcare utilization. Materials and Methods: Fifty HF patients admitted to Flagstaff Medical Center (Flagstaff, AZ) participated in the project. Many of these patients lived in underserved and rural communities, including Native American reservations. Enrolled patients received mobile, broadband-enabled remote monitoring devices. A matched cohort was identified for comparison. Results: HF patients enrolled in this program showed substantial and statistically significant reductions in healthcare utilization during the 6 months following enrollment, and these reductions were significantly greater compared with those who declined to participate but not when compared with a matched cohort. Conclusions: The findings from this project indicate that a remote HF monitoring program can be successfully implemented in a rural, underserved area. Reductions in healthcare utilization were observed among program participants, but reductions were also observed among a matched cohort, illustrating the need for rigorous assessment of the effects of HF remote monitoring programs in healthcare systems. PMID:25025239

  18. Forecasting Earthquakes

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This video includes scenes of damage from the Northridge Earthquake and interviews with Dr. Andrea Donnellan, a geophysicist at JPL, and Dr. Jim Dolan, an earthquake geologist from Caltech. The interviews discuss earthquake forecasting by tracking changes in the Earth's crust using antennas that receive signals from a series of satellites called the Global Positioning System (GPS).

  19. Real-time prediction of earthquake ground motion using real-time monitoring, and improvement strategy of JMA EEW based on the lessons from M9 Tohoku Earthquake (Invited)

    NASA Astrophysics Data System (ADS)

    Hoshiba, M.

    2013-12-01

    In this presentation, a new approach to the real-time prediction of seismic ground motion for Earthquake Early Warning (EEW) is explained, in which real-time monitoring is used but the hypocentral location and magnitude are not required. The improvement strategy of the Japan Meteorological Agency (JMA) is also explained, based on the lessons learned from the 2011 Tohoku Earthquake (Mw9.0). During the Tohoku Earthquake, the EEW system of JMA issued warnings before the S-wave arrival and more than 15 s earlier than the strong ground motion in the Tohoku district, so it worked as rapidly as designed. However, it under-predicted the seismic intensity for the Kanto district due to the very large extent of the fault rupture, and it issued some false alarms due to multiple simultaneous aftershocks. To address these problems, a new method of time-evolutional prediction is proposed that uses real-time monitoring of seismic wave propagation. This method makes it possible to predict ground motion without a hypocenter and magnitude. The effects of rupture directivity, source extent and simultaneous multiple events are substantially included in this method. In the time-evolutional prediction, the future wavefield is predicted from the wavefield at a certain time, that is, u(x, t+Δt) = P(u(x, t)), where u is the wave motion at location x at lapse time t, and P is the prediction operator. The determination of the detailed distribution of the current wavefield is key, so a dense seismic observation network is required. Here, the current wavefield, u(x, t), observed by real-time monitoring is used as the initial condition, and then the wave propagation is predicted with a time-evolutional approach. The method is based on the following three techniques. To enhance the estimation of the current wavefield, data assimilation is applied. Data assimilation is a technique to produce an artificially denser network, which is widely used in numerical weather forecasting and oceanography. Propagation is predicted by applying P to the distribution of current wave motion, u(x, t), estimated with the data assimilation technique. For P, a finite difference technique or a boundary integral equation method, such as the Kirchhoff integral, is used. The Kirchhoff integral is qualitatively approximated by the Huygens principle. Site amplification is an important factor in determining the seismic ground motion, in addition to the source and propagation factors. The site factor is usually frequency-dependent, and should be corrected in a real-time manner for EEW. The frequency dependence is reproduced using a causal filter in the time domain, applying bilinear transform and pre-warping techniques. Our final goal is the time-evolutional prediction of seismic waveforms. Instead of the waveforms, prediction of the seismic intensity is applied in a preliminary version of this method, in which real-time observation of seismic intensities is used. JMA intends to introduce the preliminary version into its system within a couple of years, and to integrate it with the current method, which is based on the hypocenter and magnitude.
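
    A minimal 1-D illustration of the idea u(x, t+Δt) = P(u(x, t)), assuming a one-way propagation model in place of the full finite-difference or Kirchhoff-integral operator and a crude Gaussian-weighted interpolation standing in for proper data assimilation (Python, all numbers illustrative):

      import numpy as np

      # Dense "virtual" grid along a 200 km line of stations.
      x = np.linspace(0.0, 200.0, 401)
      dx = x[1] - x[0]
      c = 3.5                      # assumed propagation speed, km/s
      dt = 0.9 * dx / c            # stable time step for the upwind scheme

      def assimilate(station_x, station_amp):
          """Crude stand-in for data assimilation: Gaussian-weighted interpolation
          of sparse station amplitudes onto the dense grid."""
          w = np.exp(-0.5 * ((x[:, None] - station_x[None, :]) / 10.0) ** 2)
          return (w @ station_amp) / np.maximum(w.sum(axis=1), 1e-12)

      def step(u):
          """One upwind finite-difference step of a one-way wave (advection)
          equation, propagating the field toward increasing x."""
          u_next = u.copy()
          u_next[1:] = u[1:] - (c * dt / dx) * (u[1:] - u[:-1])
          return u_next

      # Sparse observations of a wave packet currently centred near x = 60 km.
      station_x = np.arange(0.0, 201.0, 20.0)
      station_amp = np.exp(-0.5 * ((station_x - 60.0) / 8.0) ** 2)

      u_now = assimilate(station_x, station_amp)    # estimate of current wavefield
      for _ in range(200):                          # predict ~26 s into the future
          u_now = step(u_now)
      print("predicted peak location (km):", x[np.argmax(np.abs(u_now))])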

  20. Appraising the Early-est earthquake monitoring system for tsunami alerting at the Italian Candidate Tsunami Service Provider

    NASA Astrophysics Data System (ADS)

    Bernardi, F.; Lomax, A.; Michelini, A.; Lauciani, V.; Piatanesi, A.; Lorito, S.

    2015-09-01

    In this paper we present and discuss the performance of the procedure for earthquake location and characterization implemented in the Italian Candidate Tsunami Service Provider at the Istituto Nazionale di Geofisica e Vulcanologia (INGV) in Rome. Following the ICG/NEAMTWS guidelines, the first tsunami warning messages are based only on seismic information, i.e., epicenter location, hypocenter depth, and magnitude, which are automatically computed by the software Early-est. Early-est is a package for rapid location and seismic/tsunamigenic characterization of earthquakes. The Early-est software package operates using offline-event or continuous-real-time seismic waveform data to perform trace processing and picking, and, at a regular report interval, phase association, event detection, hypocenter location, and event characterization. Early-est also provides mb, Mwp, and Mwpd magnitude estimations. mb magnitudes are preferred for events with Mwp ≲ 5.8, while Mwpd estimations are valid for events with Mwp ≳ 7.2. In this paper we present the earthquake parameters computed by Early-est between the beginning of March 2012 and the end of December 2014 on a global scale for events with magnitude M ≥ 5.5, and we also present the detection timeline. We compare the earthquake parameters automatically computed by Early-est with the same parameters listed in reference catalogs. Such reference catalogs are manually revised/verified by scientists. The goal of this work is to test the accuracy and reliability of the fully automatic locations provided by Early-est. In our analysis, the epicenter location, hypocenter depth and magnitude parameters do not differ significantly from the values in the reference catalogs. Both mb and Mwp magnitudes show differences relative to the reference catalogs. We thus derived correction functions in order to minimize the differences and correct biases between our values and the ones from the reference catalogs. Correction of the Mwp distance dependency is particularly relevant, since this magnitude refers to the larger and probably tsunamigenic earthquakes. Mwp values at stations with epicentral distance Δ ≲ 30° are significantly overestimated with respect to the CMT-global solutions, whereas Mwp values at stations with epicentral distance Δ ≳ 90° are slightly underestimated. After applying such a distance correction, the Mwp provided by Early-est differs from the CMT-global catalog values by about ΔMwp ≈ 0.0 ± 0.2. Early-est continuously acquires time-series data and updates the earthquake source parameters. Our analysis shows that the epicenter coordinates and the magnitude values converge within less than 10 min (5 min in the Mediterranean region) toward the stable values. Our analysis shows that we can compute Mwp magnitudes that do not display the short-epicentral-distance overestimation, and we can provide robust and reliable earthquake source parameters to compile tsunami warning messages within less than 15 min after the event origin time.

  1. Permeable Pavement Monitoring at the Edison Environmental Center Demonstration Site - presentation

    EPA Science Inventory

    The EPA’s Urban Watershed Management Branch has been monitoring an instrumented 110-space pervious pavement parking lot. The lot is used by EPA personnel and visitors to the Edison Environmental Center. The design includes 28-space rows of three permeable pavement types: asphal...

  2. Permeable Pavement Monitoring at the Edison Environmental Center Demonstration Site - Abstract

    EPA Science Inventory

    The EPA’s Urban Watershed Management Branch (UWMB) is monitoring an instrumented, working, 110-space pervious pavement parking at EPA’s Edison Environmental Center (EEC). Permeable pavement systems are classified as stormwater best management practices (BMPs) which reduce runo...

  3. January, 2015 Fisheries Monitoring and Analysis Division, Alaska Fisheries Science Center

    E-print Network

    Fisheries Monitoring and Analysis Division, Alaska Fisheries Science Center, National Marine Fisheries Service, 7600 Sand Point Way NE, Seattle, WA 98115. National Marine Fisheries Service, Alaska Regional Office. 2.2.1 Management data under a catch share program (near-real time

  4. Seismotectonics and Seismic Structure of the Alboran Sea, Western Mediterranean - Constraints from Local Earthquake Monitoring and Seismic Refraction and Wide-Angle Profiling

    NASA Astrophysics Data System (ADS)

    Leuchters, W.; Grevemeyer, I.; Ranero, C. R.; Villasenor, A.; Booth-Rea, G.; Gallart, J.

    2011-12-01

    The Alboran Basin is located in the westernmost Mediterranean Sea and is surrounded by the Gibraltar-Betic and Rif orogenic arc. Geological evidence suggests that the most important phase of formation started in the early-to-mid-Miocene. Currently two conflicting models are discussed for its formation: one model proposes contractive tectonics producing strike-slip faults and folds, with sedimentation occurring in synclinal basins and in regions of subsidiary extension in transtensional fault segments. A second model proposes slab roll-back that caused contraction at the front of the arc and coeval overriding-plate bending and extension with associated arc magmatism. However, this phase has been partially masked by late Miocene to present contractive structures, caused by the convergence of Africa and Iberia. Two German/Spanish collaborative research projects provided excellent new seismological and seismic data. Onshore/offshore earthquake monitoring yielded a wealth of local earthquake data for the study of seismotectonics, as well as the average 1D velocity structure of the Alboran/Betics/Rif domain. In the Alboran Basin most earthquakes occur below 20 km along a diffuse fault zone crossing the Alboran Sea from the Moroccan to the Spanish coast. Further, earthquakes along the northern portion of the Alboran Ridge show thrust mechanisms and compression roughly normal to the vector of plate convergence between Africa and Iberia. A 250 km long seismic refraction and wide-angle profile was acquired coincident with the existing multi-channel seismic (MCS) ESCI-Alb2 line using the German research vessel Meteor. Shots fired with a 64-litre airgun array were recorded on 24 ocean-bottom seismometer (OBS) and ocean-bottom hydrophone (OBH) stations. The profile runs roughly along the axis of the basin, circa 65 km off the coast of Morocco, north of the Alboran Ridge. It continues in an ENE direction to end north of the Algerian coast. Using seismic tomography we mapped the crustal and upper mantle structure of the eastern Alboran Sea and the westernmost Algero-Balearic basin. The easternmost part of the profile indicates crust on the order of 5-5.5 km thick, possibly created by back-arc spreading. Towards the west, the crust thickens to 11-13 km, and crustal velocities tend to be lower than in the eastern domain, falling into the range of continental crust. However, a number of intrusive bodies could be identified, favouring the interpretation that the crust was strongly modified by arc magmatism in the mid-Miocene.

  5. Cost-effective monitoring of ground motion related to earthquakes, landslides, or volcanic activity by joint use of a single-frequency GPS and a MEMS accelerometer

    NASA Astrophysics Data System (ADS)

    Tu, R.; Wang, R.; Ge, M.; Walter, T. R.; Ramatschi, M.; Milkereit, C.; Bindi, D.; Dahm, T.

    2013-08-01

    Detection and precise estimation of strong ground motion are crucial for rapid assessment and early warning of geohazards such as earthquakes, landslides, and volcanic activity. This challenging task can be accomplished by combining GPS and accelerometer measurements because of their complementary capabilities to resolve broadband ground motion signals. However, for implementing an operational monitoring network of such joint measurement systems, cost-effective techniques need to be developed and rigorously tested. We propose a new approach for joint processing of single-frequency GPS and MEMS (microelectromechanical systems) accelerometer data in real time. To demonstrate the performance of our method, we describe results from outdoor experiments under controlled conditions. For validation, we analyzed dual-frequency GPS data and images recorded by a video camera. The results of the different sensors agree very well, suggesting that real-time broadband information on ground motion can be provided by using single-frequency GPS and MEMS accelerometers.
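
    One common way to combine the two sensors (a generic illustration, not necessarily the authors' algorithm, which is not spelled out in this record) is a small Kalman filter: the high-rate accelerometer drives a displacement/velocity state, and the lower-rate GPS displacement corrects the drift. A self-contained Python sketch on synthetic data:

      import numpy as np

      rng = np.random.default_rng(1)
      dt_acc = 0.01                  # 100 Hz MEMS accelerometer
      gps_every = 100                # 1 Hz GPS displacement (every 100th sample)
      n = 3000
      t = np.arange(n) * dt_acc

      # Synthetic "true" motion: a 0.5 Hz pulse plus a small static offset.
      disp_true = 0.05 * np.sin(2 * np.pi * 0.5 * t) * np.exp(-((t - 10) / 4) ** 2) + 0.02 * (t > 10)
      vel_true = np.gradient(disp_true, dt_acc)
      acc_true = np.gradient(vel_true, dt_acc)
      acc_obs = acc_true + 0.02 * rng.standard_normal(n)    # noisy MEMS record
      gps_obs = disp_true + 0.01 * rng.standard_normal(n)   # noisy GPS displacement

      # Kalman filter: state = [displacement, velocity]; accelerometer is the input,
      # GPS displacement is the sparse measurement.
      F = np.array([[1.0, dt_acc], [0.0, 1.0]])
      B = np.array([0.5 * dt_acc**2, dt_acc])
      H = np.array([[1.0, 0.0]])
      Q = np.diag([1e-8, 1e-6])      # process noise (accelerometer errors)
      R = np.array([[1e-4]])         # GPS noise (1 cm standard deviation)

      xk, P = np.zeros(2), np.eye(2)
      disp_est = np.empty(n)
      for k in range(n):
          xk = F @ xk + B * acc_obs[k]             # predict with the accelerometer
          P = F @ P @ F.T + Q
          if k % gps_every == 0:                   # update when a GPS epoch arrives
              y = gps_obs[k] - H @ xk
              S = H @ P @ H.T + R
              K = P @ H.T @ np.linalg.inv(S)
              xk = xk + (K @ y).ravel()
              P = (np.eye(2) - K @ H) @ P
          disp_est[k] = xk[0]

      print("RMS displacement error (m):", np.sqrt(np.mean((disp_est - disp_true) ** 2)))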

  6. Lessons learned from the introduction of autonomous monitoring to the EUVE science operations center

    NASA Technical Reports Server (NTRS)

    Lewis, M.; Girouard, F.; Kronberg, F.; Ringrose, P.; Abedini, A.; Biroscak, D.; Morgan, T.; Malina, R. F.

    1995-01-01

    The University of California at Berkeley's (UCB) Center for Extreme Ultraviolet Astrophysics (CEA), in conjunction with NASA's Ames Research Center (ARC), has implemented an autonomous monitoring system in the Extreme Ultraviolet Explorer (EUVE) science operations center (ESOC). The implementation was driven by a need to reduce operations costs and has allowed the ESOC to move from continuous, three-shift, human-tended monitoring of the science payload to a one-shift operation in which the off shifts are monitored by an autonomous anomaly detection system. This system includes Eworks, an artificial intelligence (AI) payload telemetry monitoring package based on RTworks, and Epage, an automatic paging system to notify ESOC personnel of detected anomalies. In this age of shrinking NASA budgets, the lessons learned on the EUVE project are useful to other NASA missions looking for ways to reduce their operations budgets. The process of knowledge capture, from the payload controllers for implementation in an expert system, is directly applicable to any mission considering a transition to autonomous monitoring in their control center. The collaboration with ARC demonstrates how a project with limited programming resources can expand the breadth of its goals without incurring the high cost of hiring additional, dedicated programmers. This dispersal of expertise across NASA centers allows future missions to easily access experts for collaborative efforts of their own. Even the criterion used to choose an expert system has widespread impacts on the implementation, including the completion time and the final cost. In this paper we discuss, from inception to completion, the areas where our experiences in moving from three shifts to one shift may offer insights for other NASA missions.

  7. Response to the great East Japan earthquake of 2011 and the Fukushima nuclear crisis: the case of the Laboratory Animal Research Center at Fukushima Medical University.

    PubMed

    Katahira, Kiyoaki; Sekiguchi, Miho

    2013-01-01

    A magnitude 9.0 great earthquake, the 2011 off the Pacific coast of Tohoku Earthquake, occurred on March 11, 2011, and the subsequent Fukushima Daiichi Nuclear Power Station (Fukushima NPS) accidents raised radiation levels around the campus of Fukushima Medical University (FMU). FMU is located in Fukushima City, 57 km to the northwest of Fukushima NPS. Due to temporary failure of the steam boilers, the air conditioning system for the animal rooms, all autoclaves, and a cage washer could not be used at the Laboratory Animal Research Center (LARC) of FMU. The outside air temperature dropped to 0°C overnight, and the temperature inside the animal rooms fell to 10°C for several hours. We placed sterilized nesting materials inside all cages to encourage the rodents to create nests. The main water supply was cut off for 8 days in all, while the supply of steam and hot water remained unavailable for 12 days. It took 20 days to restore the air conditioning system to normal operation at the facility. We measured radiation levels in the animal rooms to confirm the safety of care staff and researchers. On April 21, May 9, and June 17, the average radiation levels at a central work table in the animal rooms with HEPA filters were 46.5, 44.4, and 43.4 cpm, respectively, which is equal to the background level of the equipment. We sincerely hope our experiences will be a useful reference regarding crisis management for the many institutes housing laboratory animals. PMID:23615301

  8. The earth's absolute gravitation potential function in the prospect 'gravitational potential metering' of geological objects and earthquake centers

    E-print Network

    Aleksandr Fridrikson; Marina Kasatochkina

    2009-04-08

    The direct problem of detecting the maximum value of the Earth's absolute gravitation potential (MGP) was solved. The inverse problem, finding the Earth's gravitation maximum (where the gravitational field intensity is at a maximum and the potential function has a 'bending point') with the help of the MGP, was solved as well. The obtained results show that the revealed maximum of the Earth's gravitation coincides quite closely with the seismic D'' layer on the border of the inner and outer (liquid) core. The validity of the method of absolute gravitation potential detection from the equal-potential velocity, termed 'gravitation potential measurement' or the 'Vs-gravity method', was demonstrated. The prospects of this method for detecting low-power or distant geological objects with anomalous density, and possible earthquake centers with low density, are shown.

  9. Environmental assessment of the Carlsbad Environmental Monitoring and Research Center Facility

    SciTech Connect

    1995-10-01

    This Environmental Assessment has been prepared to determine if the Carlsbad Environmental Monitoring and Research Center (the Center), or its alternatives, would have significant environmental impacts that must be analyzed in an Environmental Impact Statement. DOE's proposed action is to continue funding the Center. While DOE is not funding construction of the planned Center facility, operation of that facility is dependent upon continued funding. To implement the proposed action, the Center would initially construct a facility of approximately 2,300 square meters (25,000 square feet). The Phase 1 laboratory facilities and parking lot will occupy approximately 1.2 hectares (3 acres) of approximately 8.9 hectares (22 acres) of land which were donated to New Mexico State University (NMSU) for this purpose. The facility would contain laboratories to analyze chemical and radioactive materials typical of potential contaminants that could occur in the environment in the vicinity of the DOE Waste Isolation Pilot Plant (WIPP) site or other locations. The facility also would have bioassay facilities to measure radionuclide levels in the general population and in employees of the WIPP. Operation of the Center would meet the DOE requirement for independent monitoring and assessment of environmental impacts associated with the planned disposal of transuranic waste at the WIPP.

  10. The “NetBoard”: Network Monitoring Tools Integration for INFN Tier-1 Data Center

    NASA Astrophysics Data System (ADS)

    De Girolamo, D.; dell'Agnello and, L.; Zani, S.

    2012-12-01

    The monitoring and alert system is fundamental for the management and operation of the network in a large data center such as an LHC Tier-1. The network of the INFN Tier-1 at CNAF is a multi-vendor environment: for its management and monitoring several tools have been adopted and different sensors have been developed. In this paper, after an overview of the different aspects to be monitored and the tools used for them (e.g. MRTG, Nagios, Arpwatch, NetFlow, Syslog, etc.), we describe the "NetBoard", a monitoring toolkit developed at the INFN Tier-1. NetBoard, developed for a multi-vendor network, is able to install and auto-configure all the tools needed for its monitoring, either via a network-device discovery mechanism, via a configuration file, or via a wizard. In this way, we are also able to activate different types of sensors and Nagios checks according to the equipment vendor specifications. Moreover, when a new device is connected to the LAN, NetBoard can detect where it is plugged in. Finally, the NetBoard web interface shows the overall status of the entire network "at a glance": local and geographical link utilization (including the LHCOPN and the LHCONE), the health status of network devices (with active alerts) and flow analysis.
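
    The record does not detail NetBoard's internals, so the following is only a generic sketch (Python, with hypothetical host names and vendor check lists) of the auto-configuration idea it describes: take an inventory of discovered devices and emit Nagios host and service definitions for each one, varying the checks by vendor.

      # Hypothetical device inventory; in a real toolkit this would come from
      # network discovery (e.g. ARP/SNMP scans) or a configuration file.
      devices = [
          {"name": "core-switch-1", "ip": "192.0.2.10", "vendor": "vendorA"},
          {"name": "border-router-1", "ip": "192.0.2.1", "vendor": "vendorB"},
      ]

      # Per-vendor service checks; plugin names are purely illustrative.
      VENDOR_CHECKS = {
          "vendorA": ["check_ping", "check_snmp_ifstatus"],
          "vendorB": ["check_ping", "check_snmp_bgp"],
      }

      def nagios_stanzas(device):
          """Render Nagios object definitions for one discovered device."""
          yield ("define host {\n"
                 f"    host_name  {device['name']}\n"
                 f"    address    {device['ip']}\n"
                 "    use        generic-host\n"
                 "}\n")
          for check in VENDOR_CHECKS.get(device["vendor"], ["check_ping"]):
              yield ("define service {\n"
                     f"    host_name            {device['name']}\n"
                     f"    service_description  {check}\n"
                     f"    check_command        {check}\n"
                     "    use                  generic-service\n"
                     "}\n")

      with open("netboard_autogen.cfg", "w") as fh:
          for dev in devices:
              fh.writelines(nagios_stanzas(dev))
      print("wrote Nagios definitions for", len(devices), "devices")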

  11. Wilson Corners SWMU 001 2014 Annual Long Term Monitoring Report Kennedy Space Center, Florida

    NASA Technical Reports Server (NTRS)

    Langenbach, James

    2015-01-01

    This document presents the findings of the 2014 Long Term Monitoring (LTM) that was completed at the Wilson Corners site, located at the National Aeronautics and Space Administration (NASA) John F. Kennedy Space Center (KSC), Florida. The goals of the 2014 annual LTM event were to evaluate the groundwater flow direction and gradient and to monitor the vertical and downgradient horizontal extent of the volatile organic compounds (VOCs) in groundwater at the site. The LTM activities consisted of an annual groundwater sampling event in December 2014, which included the collection of water levels from the LTM wells. During the annual groundwater sampling event, depth to groundwater was measured and VOC samples were collected using passive diffusion bags (PDBs) from 30 monitoring wells. In addition to the LTM sampling, additional assessment sampling was performed at the site using low-flow techniques based on previous LTM results and assessment activities. Assessment of monitoring well MW0052DD was performed by collecting VOC samples using low-flow techniques before and after purging 100 gallons from the well. Monitoring well MW0064 was sampled to supplement shallow VOC data north of Hot Spot 2 and east of Hot Spot 4. Monitoring well MW0089 was sampled due to its proximity to MW0090. MW0090 is screened in a deeper interval and had an unexpected detection of trichloroethene (TCE) during the 2013 LTM, which was corroborated during the March 2014 verification sampling. Monitoring well MW0130 was sampled to provide additional VOC data beneath the semi-confining clay layer in the Hot Spot 2 area.

  12. Expectation of ground motion in Earthquake Early Warning using real-time monitoring of wavefield: a method based on Kirchhoff-Fresnel integral without information of hypocenter and magnitude

    NASA Astrophysics Data System (ADS)

    Hoshiba, M.

    2011-12-01

    In this presentation, I propose a new method for the expectation of ground motion in Earthquake Early Warning (EEW), based on the Kirchhoff-Fresnel integral and real-time monitoring of the seismic wavefield. The EEW system of the Japan Meteorological Agency (JMA) basically adopts a network method, in which the hypocenter and magnitude (source parameters) are determined quickly, then the ground motions are expected, and warnings are issued depending on the strength of the expected ground motion. In this network method, though we can expect ground motions at any point using a few parameters (location of hypocenter, magnitude, site factors), it is necessary to determine the hypocenter and magnitude first, any error in the source parameters leads directly to error in the expectation, and it is not easy to take the effects of rupture directivity and source extent into account. For the 2011 off the Pacific coast of Tohoku earthquake (Mw9.0), JMA EEW was earlier than the S-wave arrival and more than 15 s earlier than the strong ground motion everywhere in the Tohoku district. However, in the Tokyo region (approximately 400 km from the epicenter), the expected intensity was smaller than the actual observation. The underestimation can be attributed to the large extent of the later fault rupture. For several weeks after the mainshock, when earthquakes sometimes occurred simultaneously over the wide source region, the system became confused and did not always determine the location and magnitude correctly, which led to many false alarms. To solve the above problems, I propose a method for the expectation of ground motion based on the Kirchhoff integral method (representation theorem) or the Kirchhoff (Huygens) Fresnel integral method as a high-frequency approximation. The ground motion is expected from real observations of ground motion at stations lying in the direction from which the waves arrive. In this method, real-time monitoring of the wavefield and of the propagation direction of the waves is important, but the source parameters (hypocenter and magnitude) are not necessarily required. The key to this method is a dense network of observation stations that send waveform data in a real-time manner. Because the source parameters are not required, it is possible to expect ground motion even when the hypocenter and magnitude have not yet been determined, and the preciseness of the expectation is not affected by errors in the source parameters. The effects of rupture directivity, source extent and simultaneous multiple events are automatically included in this method. In the presentation, I will show examples applying this method to real data. One is a typical case in which the station in the direction of the incoming waves is very close to the target station, namely a borehole case (depth of 3,500 m). The other example is the case of the Tokyo region during the Mw9.0 earthquake. In both cases, the preciseness of the expectation is better than that of the method based on hypocenter and magnitude.
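
    As a rough numerical illustration of the Huygens-style idea (a simplified stand-in, not the author's Kirchhoff-Fresnel formulation): the amplitude already recorded at an upstream station, in the direction from which the waves arrive, is delayed by the station-to-target travel time and scaled by a geometrical-spreading factor to give a prediction at the target site. All values below are synthetic.

      import numpy as np

      V_S = 3.5          # assumed propagation speed, km/s

      def predict_at_target(station_records, target_xy, t_now, horizon=10.0, dt=0.5):
          """Predict amplitude at target_xy for times t_now..t_now+horizon by delaying
          each upstream record by the station-to-target travel time and applying a
          1/sqrt(r) (surface-wave-like) geometrical-spreading correction."""
          times = np.arange(t_now, t_now + horizon, dt)
          prediction = np.zeros_like(times)
          n_used = 0
          for (sx, sy), (rec_t, rec_a), r_src in station_records:
              r_ts = np.hypot(target_xy[0] - sx, target_xy[1] - sy)   # station->target, km
              delay = r_ts / V_S
              spreading = np.sqrt(r_src / (r_src + r_ts))             # relative decay
              prediction += spreading * np.interp(times - delay, rec_t, rec_a,
                                                  left=0.0, right=0.0)
              n_used += 1
          return times, prediction / max(n_used, 1)

      # Synthetic example: one upstream station 40 km closer to the source than the target.
      rec_t = np.arange(0.0, 60.0, 0.5)
      rec_a = np.exp(-0.5 * ((rec_t - 20.0) / 3.0) ** 2)              # packet seen at station
      station_records = [((0.0, 0.0), (rec_t, rec_a), 80.0)]          # (xy, record, source dist)
      times, pred = predict_at_target(station_records, target_xy=(40.0, 0.0), t_now=25.0)
      print("predicted peak time at target: %.1f s" % times[np.argmax(pred)])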

  13. Advancing Research Methodology for Measuring & Monitoring Patient-centered Communication in Cancer Care

    Cancer.gov

    A critical step in facilitating the delivery of patient-centered communication (PCC) as part of routine cancer care delivery is creating a measurement and monitoring system that will allow for the ongoing assessment, tracking, and improvement of these six functions of patient-centered communication. To build the foundation of such a system and to advance research methodology in this area, the ORB has collaborated with the Agency for Healthcare Research and Quality (AHRQ) on a research project conducted within AHRQ's DEcIDE network.

  14. Rapid Estimates of Rupture Extent for Large Earthquakes Using Aftershocks

    NASA Astrophysics Data System (ADS)

    Polet, J.; Thio, H. K.; Kremer, M.

    2009-12-01

    The spatial distribution of aftershocks is closely linked to the rupture extent of the mainshock that preceded them, and a rapid analysis of aftershock patterns therefore has potential for use in near real-time estimates of earthquake impact. The correlation between aftershocks and slip distribution has frequently been used to estimate the fault dimensions of large historic earthquakes for which no, or insufficient, waveform data are available. With the advent of earthquake inversions that use seismic waveforms and geodetic data to constrain the slip distribution, the study of aftershocks has recently been largely focused on enhancing our understanding of the underlying mechanisms in a broader earthquake mechanics/dynamics framework. However, in a near real-time earthquake monitoring environment, in which aftershocks of large earthquakes are routinely detected and located, these data may also be effective in determining a fast estimate of the mainshock rupture area, which would aid in the rapid assessment of the impact of the earthquake. We have analyzed a considerable number of large recent earthquakes and their aftershock sequences and have developed an effective algorithm that determines the rupture extent of a mainshock from its aftershock distribution in a fully automatic manner. The algorithm automatically removes outliers by spatial binning, and subsequently determines the best-fitting "strike" of the rupture and its length by projecting the aftershock epicenters onto a set of lines that cross the mainshock epicenter with incremental azimuths. For strike-slip or large dip-slip events, for which the surface projection of the rupture is rectilinear, the calculated strike correlates well with the strike of the fault, and the corresponding length, determined from the distribution of aftershocks projected onto the line, agrees well with the rupture length. In the case of a smaller dip-slip rupture with an aspect ratio closer to 1, the procedure gives a measure of the rupture extent and dimensions, but not necessarily the strike. We found that, using standard earthquake catalogs such as the National Earthquake Information Center catalog, we can constrain the rupture extent, rupture direction, and in many cases the type of faulting of the mainshock with the aftershocks that occur within the first hour after the mainshock. However, these data may not currently be available in near real-time. Since our results show that these early aftershock locations may be used to estimate first-order rupture parameters for large global earthquakes, the near real-time availability of these data would be useful for fast earthquake damage assessment.
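
    A compact version of the azimuth-scan idea can be written down directly (Python; the spatial-binning outlier removal is replaced here by a simple percentile trim, and a synthetic aftershock cloud stands in for catalog data):

      import numpy as np

      def rupture_from_aftershocks(lat0, lon0, after_lat, after_lon, trim_pct=5.0):
          """Estimate strike (deg from north) and rupture length (km) by projecting
          aftershock epicentres onto lines through the mainshock epicentre at
          1-degree azimuth increments and picking the azimuth with the largest spread."""
          # Local Cartesian coordinates (km): x east, y north; fine for a few hundred km.
          x = (after_lon - lon0) * 111.32 * np.cos(np.radians(lat0))
          y = (after_lat - lat0) * 110.57
          best = (None, -1.0)
          for az in range(0, 180):
              u = np.array([np.sin(np.radians(az)), np.cos(np.radians(az))])
              proj = x * u[0] + y * u[1]                  # signed distance along the line
              lo, hi = np.percentile(proj, [trim_pct, 100.0 - trim_pct])  # crude trim
              if hi - lo > best[1]:
                  best = (az, hi - lo)
          return {"strike_deg": best[0], "length_km": best[1]}

      # Synthetic aftershock cloud elongated along a 45-degree strike, ~100 km long.
      rng = np.random.default_rng(3)
      along = rng.uniform(-50.0, 50.0, 300)
      across = rng.normal(0.0, 5.0, 300)
      lat0, lon0 = 35.0, -118.0
      a45 = np.radians(45.0)
      after_lat = lat0 + (along * np.cos(a45) + across * np.sin(a45)) / 110.57
      after_lon = lon0 + (along * np.sin(a45) - across * np.cos(a45)) / (111.32 * np.cos(np.radians(lat0)))

      print(rupture_from_aftershocks(lat0, lon0, after_lat, after_lon))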

  15. RAPID: Collaboration Results from Three NASA Centers in Commanding/Monitoring Lunar Assets

    NASA Technical Reports Server (NTRS)

    Torres, R. Jay; Allan, Mark; Hirsh, Robert; Wallick, Michael N.

    2009-01-01

    Three NASA centers are working together to address the challenge of operating robotic assets in support of human exploration of the Moon. This paper describes the combined work to date of the Ames Research Center (ARC), Jet Propulsion Laboratory (JPL) and Johnson Space Center (JSC) on a common support framework to control and monitor lunar robotic assets. We discuss how we have addressed specific challenges, including time-delayed operations and geographically distributed collaborative monitoring and control, to build an effective architecture for integrating a heterogeneous collection of robotic assets into a common framework. We describe the design of the Robot Application Programming Interface Delegate (RAPID) architecture that effectively addresses the problem of interfacing a family of robots including the JSC Chariot, ARC K-10 and JPL ATHLETE rovers. We report on lessons learned from the June 2008 field test in which RAPID was used to monitor and control all of these assets. We conclude by discussing some future directions to extend the RAPID architecture to add further support for NASA's lunar exploration program.

  16. Estimated airborne release of plutonium from the 102 Building at the General Electric Vallecitos Nuclear Center, Vallecitos, California, as a result of damage from severe wind and earthquake hazard

    SciTech Connect

    Mishima, J.; Ayer, J.E.; Hays, I.D.

    1980-12-01

    This report estimates the potential airborne releases of plutonium as a consequence of various severities of earthquake and wind hazard postulated for the 102 Building at the General Electric Vallecitos Nuclear Center in California. The releases are based on damage scenarios developed by other specialists. The hazard severities presented range up to a nominal velocity of 230 mph for wind hazard and are in excess of 0.8 g linear acceleration for earthquakes. The consequences of thrust faulting are considered. The approaches and factors used to estimate the releases are discussed. Release estimates range from 0.003 to 3 g Pu.

  17. Earthquake Hazards.

    ERIC Educational Resources Information Center

    Donovan, Neville

    1979-01-01

    Provides a survey and a review of earthquake activity and global tectonics from the advancement of the theory of continental drift to the present. Topics include: an identification of the major seismic regions of the earth, seismic measurement techniques, seismic design criteria for buildings, and the prediction of earthquakes. (BT)

  18. The Observing System Monitoring Center: an Emerging Source for Integrated In-Situ Ocean Data

    NASA Astrophysics Data System (ADS)

    Hankin, S.; Habermann, T.; Kern, K.; Little, M.; Mendelssohn, R.; Neufeld, D.; O'Brien, K.; Simons, B.

    2011-12-01

    The Observing System Monitoring Center (OSMC) was originally conceived to serve as a tool to assist managers in monitoring the performance of the integrated global in-situ ocean observing system. For much of the past decade, the OSMC has been storing real time data and metadata from ocean observation sources such as the Global Telecommunications System (GTS), IOC sea level monitoring center, and others. The goal of the OSMC has been to maintain a record of all of the observations that represent the global climate data record. Though the initial purpose of the OSMC was mainly to track platform and observing subsystem performance, it has become clear that the data represented in the OSMC would be a valuable source for anyone interested in ocean processes. This presentation will discuss the implementation details involved in making the OSMC data available to the general public. We'll also discuss how we leveraged the NOAA-led Unified Access Framework (UAF), which defines a framework built upon community-accepted standards and conventions, in order to assist in the creation of the data services. By adhering to these well known and widely used standards and conventions, we ensure that the OSMC data will be available to users through many popular tools, including both web-based services and desktop clients. Additionally, we will also be discussing the modernized OSMC suite of user interfaces which intends to provide access to both ocean data and platform metrics for people ranging from ocean novices to scientific experts.

  19. Satellite Geodetic Constraints On Earthquake Processes: Implications of the 1999 Turkish Earthquakes for Fault Mechanics and Seismic Hazards on the San Andreas Fault

    NASA Technical Reports Server (NTRS)

    Reilinger, Robert

    2005-01-01

    Our principal activities during the initial phase of this project include: 1) Continued monitoring of postseismic deformation for the 1999 Izmit and Duzce, Turkey earthquakes from repeated GPS survey measurements and expansion of the Marmara Continuous GPS Network (MAGNET), 2) Establishing three North Anatolian fault crossing profiles (10 sites/profile) at locations that experienced major surface-fault earthquakes at different times in the past to examine strain accumulation as a function of time in the earthquake cycle (2004), 3) Repeat observations of selected sites in the fault-crossing profiles (2005), 4) Repeat surveys of the Marmara GPS network to continue to monitor postseismic deformation, 5) Refining block models for the Marmara Sea seismic gap area to better understand earthquake hazards in the Greater Istanbul area, 6) Continuing development of models for afterslip and distributed viscoelastic deformation for the earthquake cycle. We are keeping close contact with MIT colleagues (Brad Hager and Eric Hetland) who are developing models for S. California and for the earthquake cycle in general (Hetland, 2006). In addition, our Turkish partners at the Marmara Research Center have undertaken repeat micro-gravity measurements at the MAGNET sites and have provided us estimates of gravity change during the period 2003-2005.

  20. An Evaluation of North Korea’s Nuclear Test by Belbasi Nuclear Tests Monitoring Center-KOERI

    NASA Astrophysics Data System (ADS)

    Necmioglu, O.; Meral Ozel, N.; Semin, K.

    2009-12-01

    Bogazici University Kandilli Observatory and Earthquake Research Institute (KOERI) is acting as the Turkish National Data Center (NDC) and has been responsible for the operation of the International Monitoring System (IMS) Primary Seismic Station (PS-43) under the Belbasi Nuclear Tests Monitoring Center, for the verification of compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), since February 2000. The NDC is responsible for operating two arrays which are part of the IMS, as well as for transmitting data from these stations to the International Data Centre (IDC) in Vienna. The Belbasi array was established in 1951 as a four-element (Benioff 1051) seismic array, part of the United States Atomic Energy Detection System (USAEDS). The Turkish General Staff (TGS) and the U.S. Air Force Technical Applications Center (AFTAC) jointly operated this short-period array under the Defense and Economic Cooperation Agreement (DECA). The station was upgraded and several seismometers were added to the array between 1951 and 1994, and the station code was later changed from BSRS (Belbasi Seismic Research Station) to BRTR-PS43. PS-43 is composed of two sub-arrays (Ankara and Keskin): the medium-period array with a ~40 km radius located in Ankara and the short-period array with a ~3 km radius located in Keskin. Each array has a broadband element located at the middle of the circular geometry. Short-period instruments are installed at a depth of 30 meters from the surface, while medium-period and broadband instruments are installed at a depth of 60 meters. On 25 May 2009, the Democratic People's Republic of Korea (DPRK) claimed that it had conducted a nuclear test. The corresponding seismic event was recorded by the IMS, and the IDC released a first automatic estimate of the time (00:54:43 GMT), location (41.2896°N and 129.0480°E) and magnitude (4.52 mb) of the event in less than two hours (USGS: 00:54:43 GMT; 41.306°N, 129.029°E; 4.7 mb). During our preliminary analysis of the 25 May 2009 DPRK event, we saw a very clear P arrival at 01:05:47 (GMT) at the BRTR SP array. The result of the f-k analysis performed in the Geotool software, installed at the NDC facilities in 2008 and currently in full use, also indicated that the arrival belongs to the DPRK event. When comparing our f-k results (calculated at 1-2 Hz) with the IDC-REB, however, we noticed that our calculation, and therefore the corresponding residuals (calculated with reference to the REB residuals), are considerably better than those of the REB. The reasons for this discrepancy have been explored, and for the first time a comprehensive seismological analysis of a nuclear test has been conducted in Turkey. The CTBT has an important role in the non-proliferation of nuclear weapons and is a key element in the pursuit of nuclear disarmament. In this study we present the technical and scientific aspects of the 25 May 2009 DPRK event analysis, together with our involvement in CTBT(O) affairs, which we believe brings new dimensions to Turkey, especially in the area of geophysics.
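
    The essence of the f-k analysis mentioned above is a grid search over horizontal slowness that maximizes the power of the delayed-and-summed array beam, from which back azimuth and apparent slowness follow. A self-contained Python sketch on a synthetic plane wave (the array geometry and all numbers are illustrative, not the BRTR configuration):

      import numpy as np

      rng = np.random.default_rng(4)
      dt = 0.05                                  # 20 samples/s
      t = np.arange(0.0, 30.0, dt)

      # Small-aperture array coordinates in km (east, north).
      stations = np.array([[0.0, 0.0], [2.0, 0.5], [-1.5, 1.8], [0.8, -2.2], [-2.1, -1.0]])

      def wavelet(tau):
          return np.exp(-0.5 * ((tau - 10.0) / 0.4) ** 2) * np.cos(2 * np.pi * 1.5 * (tau - 10.0))

      # Synthetic P arrival: plane wave with back azimuth 60 deg, slowness 0.06 s/km.
      baz_true, s_true = 60.0, 0.06
      p_true = -s_true * np.array([np.sin(np.radians(baz_true)), np.cos(np.radians(baz_true))])
      traces = np.array([wavelet(t - stations[i] @ p_true) for i in range(len(stations))])
      traces += 0.05 * rng.standard_normal(traces.shape)

      # Grid search over horizontal slowness: maximize delay-and-sum beam power.
      best = (0.0, 0.0, -1.0)
      for sx in np.arange(-0.1, 0.1001, 0.002):
          for sy in np.arange(-0.1, 0.1001, 0.002):
              beam = np.zeros_like(t)
              for i, (xi, yi) in enumerate(stations):
                  beam += np.interp(t + xi * sx + yi * sy, t, traces[i])
              power = np.sum(beam ** 2)
              if power > best[2]:
                  best = (sx, sy, power)

      sx, sy, _ = best
      baz = np.degrees(np.arctan2(-sx, -sy)) % 360.0
      print(f"estimated back azimuth: {baz:.0f} deg, slowness: {np.hypot(sx, sy):.3f} s/km")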

  1. Federal Radiological Monitoring and Assessment Center (FRMAC) overview of FRMAC operations

    SciTech Connect

    1996-02-01

    In the event of a major radiological emergency, 17 federal agencies with various statutory responsibilities have agreed to coordinate their efforts at the emergency scene under the umbrella of the Federal Radiological Emergency Response plan (FRERP). This cooperative effort will assure the designated Lead Federal Agency (LFA) and the state(s) that all federal radiological assistance fully supports their efforts to protect the public. The mandated federal cooperation ensures that each agency can obtain the data critical to its specific responsibilities. This Overview of the Federal Radiological Monitoring and Assessment Center (FRMAC) Operations describes the FRMAC response activities to a major radiological emergency. It also describes the federal assets and subsequent operational activities which provide federal radiological monitoring and assessment of the off-site areas. These off-site areas may include one or more affected states.

  2. U.S. Geological Survey and The National Academies; USGS OF-2007-1047, Extended Abstract 011 Hydroacoustic monitoring of the Bransfield Strait and Drake Passage,

    E-print Network

    Bohnenstiehl, Delwayne

    Hydroacoustic monitoring of the Bransfield Strait and Drake Passage, Antarctica: a first analysis of seafloor seismicity, cryogenic acoustic sources, and cetacean vocalizations. The monitoring recorded hundreds of earthquakes from the seafloor spreading centers and submarine volcanoes within the region.

  3. Anthropogenic Triggering of Large Earthquakes

    PubMed Central

    Mulargia, Francesco; Bizzarri, Andrea

    2014-01-01

    The physical mechanism of the anthropogenic triggering of large earthquakes on active faults is studied on the basis of experimental phenomenology, i.e., that earthquakes occur on active tectonic faults, that crustal stress values are those measured in situ and, on active faults, comply to the values of the stress drop measured for real earthquakes, that the static friction coefficients are those inferred on faults, and that the effective triggering stresses are those inferred for real earthquakes. Deriving the conditions for earthquake nucleation as a time-dependent solution of the Tresca-Von Mises criterion applied in the framework of poroelasticity yields that active faults can be triggered by fluid overpressures < 0.1 MPa. Comparing this with the deviatoric stresses at the depth of crustal hypocenters, which are of the order of 1–10 MPa, we find that injecting in the subsoil fluids at the pressures typical of oil and gas production and storage may trigger destructive earthquakes on active faults at a few tens of kilometers. Fluid pressure propagates as slow stress waves along geometric paths operating in a drained condition and can advance the natural occurrence of earthquakes by a substantial amount of time. Furthermore, it is illusory to control earthquake triggering by close monitoring of minor “foreshocks”, since the induction may occur with a delay up to several years. PMID:25156190

  4. Anthropogenic Triggering of Large Earthquakes

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Bizzarri, Andrea

    2014-08-01

    The physical mechanism of the anthropogenic triggering of large earthquakes on active faults is studied on the basis of experimental phenomenology, i.e., that earthquakes occur on active tectonic faults, that crustal stress values are those measured in situ and, on active faults, comply to the values of the stress drop measured for real earthquakes, that the static friction coefficients are those inferred on faults, and that the effective triggering stresses are those inferred for real earthquakes. Deriving the conditions for earthquake nucleation as a time-dependent solution of the Tresca-Von Mises criterion applied in the framework of poroelasticity yields that active faults can be triggered by fluid overpressures < 0.1 MPa. Comparing this with the deviatoric stresses at the depth of crustal hypocenters, which are of the order of 1-10 MPa, we find that injecting in the subsoil fluids at the pressures typical of oil and gas production and storage may trigger destructive earthquakes on active faults at a few tens of kilometers. Fluid pressure propagates as slow stress waves along geometric paths operating in a drained condition and can advance the natural occurrence of earthquakes by a substantial amount of time. Furthermore, it is illusory to control earthquake triggering by close monitoring of minor ``foreshocks'', since the induction may occur with a delay up to several years.
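    The order-of-magnitude argument above can be illustrated with the standard Coulomb failure criterion including pore pressure, used here as a simpler stand-in for the paper's time-dependent Tresca-Von Mises poroelastic solution: a near-critically stressed fault needs only a small fluid overpressure to reach failure. The friction coefficient and stress values below are assumptions chosen for illustration.

```python
# Illustrative Coulomb-failure arithmetic (not the paper's full poroelastic
# solution): slip initiates when the shear stress reaches friction times the
# effective normal stress, i.e. tau >= mu * (sigma_n - p).
mu = 0.6            # assumed static friction coefficient
sigma_n = 50.0e6    # assumed fault-normal stress at hypocentral depth (Pa)
tau = 29.96e6       # assumed shear stress (Pa): fault already close to failure

p_needed = sigma_n - tau / mu   # pore-pressure increase that brings the fault to failure
print(f"required fluid overpressure: {p_needed / 1e6:.2f} MPa")
# For a near-critically stressed fault this comes out well below 0.1 MPa,
# the same order of magnitude as the triggering threshold quoted above.
```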

  5. 1964 Great Alaska Earthquake: a photographic tour of Anchorage, Alaska

    USGS Publications Warehouse

    Thoms, Evan E.; Haeussler, Peter J.; Anderson, Rebecca D.; McGimsey, Robert G.

    2014-01-01

    On March 27, 1964, at 5:36 p.m., a magnitude 9.2 earthquake, the largest recorded earthquake in U.S. history, struck southcentral Alaska (fig. 1). The Great Alaska Earthquake (also known as the Good Friday Earthquake) occurred at a pivotal time in the history of earth science, and helped lead to the acceptance of plate tectonic theory (Cox, 1973; Brocher and others, 2014). All large subduction zone earthquakes are understood through insights learned from the 1964 event, and observations and interpretations of the earthquake have influenced the design of infrastructure and seismic monitoring systems now in place. The earthquake caused extensive damage across the State, and triggered local tsunamis that devastated the Alaskan towns of Whittier, Valdez, and Seward. In Anchorage, the main cause of damage was ground shaking, which lasted approximately 4.5 minutes. Many buildings could not withstand this motion and were damaged or collapsed even though their foundations remained intact. More significantly, ground shaking triggered a number of landslides along coastal and drainage valley bluffs underlain by the Bootlegger Cove Formation, a composite of facies containing variably mixed gravel, sand, silt, and clay which were deposited over much of upper Cook Inlet during the Late Pleistocene (Ulery and others, 1983). Cyclic (or strain) softening of the more sensitive clay facies caused overlying blocks of soil to slide sideways along surfaces dipping by only a few degrees. This guide is the document version of an interactive web map that was created as part of the commemoration events for the 50th anniversary of the 1964 Great Alaska Earthquake. It is accessible at the U.S. Geological Survey (USGS) Alaska Science Center website: http://alaska.usgs.gov/announcements/news/1964Earthquake/. The website features a map display with suggested tour stops in Anchorage, historical photographs taken shortly after the earthquake, repeat photography of selected sites, scanned documents, and small-scale maps, as well as links to slideshows of additional photographs and Google Street View™ scenes. Buildings in Anchorage that were severely damaged, sites of major landslides, and locations of post-earthquake engineering responses are highlighted. The web map can be used online as a virtual tour or in a physical self-guided tour using a web-enabled Global Positioning System (GPS) device. This publication serves the purpose of committing most of the content of the web map to a single distributable document. As such, some of the content differs from the online version.

  6. 76 FR 61115 - Migrant and Seasonal Farmworkers (MSFWs) Monitoring Report and One-Stop Career Center Complaint...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-03

    ...MSFWs) Monitoring Report and One-Stop Career Center Complaint/Referral Record: Comments...revision for ETA Form 8429, One-Stop Career Center Complaint/ Referral Record, to...MSFWs. The ETA Form 8429, One-Stop Career Center Complaint/Referral Record,...

  7. Earthquakes and the urban environment. Volume I

    SciTech Connect

    Berlin, G.L.

    1980-01-01

    Because of the complex nature of earthquake effects, current investigations encompass many disciplines, including those of both the physical and social sciences. Research activities center on such diversified topics as earthquake mechanics, earthquake prediction and control, the prompt and accurate detection of tsunamis (seismic sea waves), earthquake-resistant construction, seismic building code improvements, land use zoning, earthquake risk and hazard perception, disaster preparedness, plus the study of the concerns and fears of people who have experienced the effects of an earthquake. This monograph attempts to amalgamate recent research input comprising the vivifying components of urban seismology at a level useful to those having an interest in the earthquake and its effects upon an urban environment. Volume 1 contains chapters on earthquake parameters and hazards.

  8. Earthquakes and the urban environment. Volume II

    SciTech Connect

    Berlin, G.L.

    1980-01-01

    Because of the complex nature of earthquake effects, current investigations encompass many disciplines, including those of both the physical and social sciences. Research activities center on such diversified topics as earthquake mechanics, earthquake prediction and control, the prompt and accurate detection of tsunamis (seismic sea waves), earthquake-resistant construction, seismic building code improvements, land use zoning, earthquake risk and hazard perception, disaster preparedness, plus the study of the concerns and fears of people who have experienced the effects of an earthquake. This monograph attempts to amalgamate recent research input comprising the vivifying components of urban seismology at a level useful to those having an interest in the earthquake and its effects upon an urban environment. Volume 2 contains chapters on earthquake prediction, control, building design and building response.

  9. Photovoltaic Performance and Reliability Database: A Gateway to Experimental Data Monitoring Projects for PV at the Florida Solar Energy Center

    DOE Data Explorer

    This site is the gateway to experimental data monitoring projects for photovoltaics (PV) at the Florida Solar Energy Center. The website and the database were designed to facilitate and standardize the processes for archiving, analyzing, and accessing data collected from dozens of operational PV systems and test facilities monitored by FSEC's Photovoltaics and Distributed Generation Division. [copied from http://www.fsec.ucf.edu/en/research/photovoltaics/data_monitoring/index.htm]

  10. Seismological investigation of earthquakes in the New Madrid Seismic Zone. Final report, September 1986--December 1992

    SciTech Connect

    Herrmann, R.B.; Nguyen, B.

    1993-08-01

    Earthquake activity in the New Madrid Seismic Zone has been monitored by regional seismic networks since 1975. During this time period, over 3,700 earthquakes have been located within the region bounded by latitudes 35°–39°N and longitudes 87°–92°W. Most of these earthquakes occur within a 1.5° x 2° zone centered on the Missouri Bootheel. Source parameters of larger earthquakes in the zone and in eastern North America are determined using surface-wave spectral amplitudes and broadband waveforms to constrain the focal mechanism, source depth, and seismic moment. Waveform modeling of broadband data is shown to be a powerful tool for defining these source parameters when used in a complementary fashion with regional seismic network data and, in addition, for verifying the correctness of previously published focal mechanism solutions.

  11. The Deep Impact Network Experiment Operations Center Monitor and Control System

    NASA Technical Reports Server (NTRS)

    Wang, Shin-Ywan (Cindy); Torgerson, J. Leigh; Schoolcraft, Joshua; Brenman, Yan

    2009-01-01

    The Interplanetary Overlay Network (ION) software at JPL is an implementation of Delay/Disruption Tolerant Networking (DTN) which has been proposed as an interplanetary protocol to support space communication. The JPL Deep Impact Network (DINET) is a technology development experiment intended to increase the technical readiness of the JPL implemented ION suite. The DINET Experiment Operations Center (EOC) developed by JPL's Protocol Technology Lab (PTL) was critical in accomplishing the experiment. EOC, containing all end nodes of simulated spaces and one administrative node, exercised publish and subscribe functions for payload data among all end nodes to verify the effectiveness of data exchange over ION protocol stacks. A Monitor and Control System was created and installed on the administrative node as a multi-tiered internet-based Web application to support the Deep Impact Network Experiment by allowing monitoring and analysis of the data delivery and statistics from ION. This Monitor and Control System includes the capability of receiving protocol status messages, classifying and storing status messages into a database from the ION simulation network, and providing web interfaces for viewing the live results in addition to interactive database queries.
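    As a rough illustration of the "classify and store status messages into a database" step described above, the sketch below parses hypothetical status lines and inserts them into a SQLite table. The message format, field names, and schema are invented for illustration; the actual ION status messages and the EOC database design are not described in this abstract.

```python
import sqlite3

# Hypothetical message format and schema, for illustration only.
conn = sqlite3.connect("dinet_status.db")
conn.execute("""CREATE TABLE IF NOT EXISTS status_msgs (
                    received_at TEXT, node TEXT, msg_type TEXT, payload TEXT)""")

def store_status(line):
    """Parse one status line of the assumed form '<timestamp> <node> <type> <details>'
    and append it to the status_msgs table for later web-based queries."""
    ts, node, msg_type, payload = line.split(" ", 3)
    conn.execute("INSERT INTO status_msgs VALUES (?, ?, ?, ?)",
                 (ts, node, msg_type, payload))
    conn.commit()

store_status("2009-01-15T12:00:00 node7 BUNDLE_FORWARDED size=1024 dest=node3")
```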

  12. Earthquake Analysis.

    ERIC Educational Resources Information Center

    Espinoza, Fernando

    2000-01-01

    Indicates the importance of the development of students' measurement and estimation skills. Analyzes earthquake data recorded at seismograph stations and explains how to read and modify the graphs. Presents an activity for student evaluation. (YDS)

  13. Broadband characteristics of earthquakes recorded during a dome-building eruption at Mount St. Helens, Washington, between October 2004 and May 2005: Chapter 5 in A volcano rekindled: the renewed eruption of Mount St. Helens, 2004-2006

    USGS Publications Warehouse

    Horton, Stephen P.; Norris, Robert D.; Moran, Seth C.

    2008-01-01

    From October 2004 to May 2005, the Center for Earthquake Research and Information of the University of Memphis operated two to six broadband seismometers within 5 to 20 km of Mount St. Helens to help monitor recent seismic and volcanic activity. Approximately 57,000 earthquakes identified during the 7-month deployment had a normal magnitude distribution with a mean magnitude of 1.78 and a standard deviation of 0.24 magnitude units. Both the mode and range of earthquake magnitude and the rate of activity varied during the deployment. We examined the time domain and spectral characteristics of two classes of events seen during dome building. These include volcano-tectonic earthquakes and lower-frequency events. Lower-frequency events are further classified into hybrid earthquakes, low-frequency earthquakes, and long-duration volcanic tremor. Hybrid and low-frequency earthquakes showed a continuum of characteristics that varied systematically with time. A progressive loss of high-frequency seismic energy occurred in earthquakes as magma approached and eventually reached the surface. The spectral shape of large and small earthquakes occurring within days of each other did not vary with magnitude. Volcanic tremor events and lower-frequency earthquakes displayed consistent spectral peaks, although higher frequencies were more favorably excited during tremor than earthquakes.

  14. Southeast Indian Ocean-Ridge earthquake sequences from cross-correlation analysis of hydroacoustic data

    NASA Astrophysics Data System (ADS)

    Yun, Sukyoung; Ni, Sidao; Park, Minkyu; Lee, Won Sang

    2009-10-01

    Parameters of earthquake sequences, for instance the location and timing of foreshocks and aftershocks, are critical for understanding the dynamics of mid-ocean ridges and transform faults. Whole sequences, including small earthquakes in the ocean, cannot be well recorded by land-based seismometers because of large epicentral distances. Recent hydroacoustic studies have demonstrated that T waves are very effective in detecting small submarine earthquakes because of the small energy loss during propagation in the Sound Fixing and Ranging (SOFAR) channel. For example, a transform-fault earthquake (2006 March 6, 40.11°S/78.49°E) occurred at the Southeast Indian Ocean Ridge, but the National Earthquake Information Center reported only three aftershocks in the following week. We applied a cross-correlation method to hydroacoustic data from the International Monitoring System arrays in the Indian Ocean to examine the whole earthquake sequence. We detected 14 aftershocks and no foreshocks for this earthquake, and the locations of the aftershocks show an irregular pattern. From this observation, we suggest that the pattern could be caused by complicated transcurrent plate-boundary dynamics between two overlapping spreading ridges, possibly explained by the bookshelf faulting model.
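    The cross-correlation detection step described above can be illustrated with simple template matching: a T-wave waveform from the mainshock (or a known aftershock) is slid along the continuous hydrophone record, and windows whose normalized cross-correlation exceeds a threshold are flagged as candidate events. This is a minimal stand-in for the authors' processing; the threshold and window handling are assumptions, and production codes normally use FFT-based correlation for speed.

```python
import numpy as np

def xcorr_detect(data, template, threshold=0.7):
    """Slide a template along a continuous record and return the sample offsets
    where the normalized cross-correlation exceeds the threshold."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    cc = np.zeros(len(data) - n + 1)
    for i in range(len(cc)):
        w = data[i:i + n]
        sd = w.std()
        if sd > 0:
            cc[i] = np.dot(t, (w - w.mean()) / sd)
    return np.where(cc >= threshold)[0], cc

# Example with synthetic data: a noisy record containing two copies of the template.
rng = np.random.default_rng(1)
tmpl = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 200))
rec = rng.normal(0, 0.2, 5000)
rec[1000:1200] += tmpl
rec[3500:3700] += tmpl
picks, _ = xcorr_detect(rec, tmpl, threshold=0.7)
print("detections near samples:", picks[:5], "...")
```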

  15. Predicting Iceland's earthquakes

    NASA Astrophysics Data System (ADS)

    Bovarsson, Reynir; Einarsson, Pall

    A research project involving seismologists from the Nordic countries (Denmark, Finland, Iceland, Norway, and Sweden) started recently. The long-term project goal is to reduce earthquake losses and strengthen the physical basis for earthquake prediction. The project originated in the Council of Europe. In a resolution passed in 1980, the council stressed the importance of concentrating earthquake prediction research in five test areas in Europe, one of which was the South Iceland Seismic Zone (see Figure 1). The resolution led to discussion among Nordic seismologists on cooperative work in the Icelandic test area. They agreed that an appropriate first step would be to establish a digital data acquisition system around which geophysical monitoring and other experiments could be designed. A proposal for a 5- to 6-year project, which was submitted to the Nordic Council and the Council of Nordic Ministers, was approved in 1987. The participants are planning a data acquisition system and geophysical experiments that can give information about changes in physical properties of the crust before large earthquakes.

  16. Earthquake Education and Outreach in Haiti

    USGS Multimedia Gallery

    Following the devastating 2010 Haiti earthquake, the USGS has been helping with earthquake awareness and monitoring in the country, with continued support from the U.S. Agency for International Development (USAID). This assistance has helped the Bureau des Mines et de l'Energie (BME) in Port-au-Prin...

  17. Rapid characterization of the 2015 Mw 7.8 Gorkha, Nepal, earthquake sequence and its seismotectonic context

    USGS Publications Warehouse

    Hayes, Gavin; Briggs, Richard; Barnhart, William D.; Yeck, William; McNamara, Daniel E.; Wald, David J.; Nealy, Jennifer; Benz, Harley M.; Gold, Ryan D.; Jaiswal, Kishor S.; Marano, Kristin; Earle, Paul; Hearne, Mike; Smoczyk, Gregory M.; Wald, Lisa A.; Samsonov, Sergey

    2015-01-01

    Earthquake response and related information products are important for placing recent seismic events into context and particularly for understanding the impact earthquakes can have on the regional community and its infrastructure. These tools are even more useful if they are available quickly, ahead of detailed information from the areas affected by such earthquakes. Here we provide an overview of the response activities and related information products generated and provided by the U.S. Geological Survey National Earthquake Information Center in association with the 2015 M 7.8 Gorkha, Nepal, earthquake. This group monitors global earthquakes 24 hrs/day, 7 days/week to provide rapid information on the location and size of recent events and to characterize the source properties, tectonic setting, and potential fatalities and economic losses associated with significant earthquakes. We present the timeline over which these products became available, discuss what they tell us about the seismotectonics of the Gorkha earthquake and its aftershocks, and examine how their information is used today, and might be used in the future, to help mitigate the impact of such natural disasters.

  18. Data Management Coordinators Monitor STS-78 Mission at the Huntsville Operations Support Center

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Launched on June 20, 1996, the STS-78 mission's primary payload was the Life and Microgravity Spacelab (LMS), which was managed by the Marshall Space Flight Center (MSFC). During the 17 day space flight, the crew conducted a diverse slate of experiments divided into a mix of life science and microgravity investigations. In a manner very similar to future International Space Station operations, LMS researchers from the United States and their European counterparts shared resources such as crew time and equipment. Five space agencies (NASA/USA, European Space Agency/Europe (ESA), French Space Agency/France, Canadian Space Agency /Canada, and Italian Space Agency/Italy) along with research scientists from 10 countries worked together on the design, development and construction of the LMS. This photo represents Data Management Coordinators monitoring the progress of the mission at the Huntsville Operations Support Center (HOSC) Spacelab Payload Operations Control Center (SL POCC) at MSFC. Pictured are assistant mission scientist Dr. Dalle Kornfeld, Rick McConnel, and Ann Bathew.

  19. The Swift X-ray monitoring campaign of the center of the Milky Way

    NASA Astrophysics Data System (ADS)

    Degenaar, N.; Wijnands, R.; Miller, J. M.; Reynolds, M. T.; Kennea, J.; Gehrels, N.

    2015-09-01

    In 2006 February, shortly after its launch, Swift began monitoring the center of the Milky Way with the onboard X-Ray Telescope using short 1-ks exposures performed every 1-4 days. Between 2006 and 2014, over 1200 observations have been obtained, accumulating to ~1.3 Ms of exposure time. This has yielded a wealth of information about the long-term X-ray behavior of the supermassive black hole Sgr A*, and numerous transient X-ray binaries that are located within the 25′ × 25′ region covered by the campaign. In this review we highlight the discoveries made during these first nine years, which include 1) the detection of seven bright X-ray flares from Sgr A*, 2) the discovery of the magnetar SGR J1745-29, 3) the first systematic analysis of the outburst light curves and energetics of the peculiar class of very-faint X-ray binaries, 4) the discovery of three new transient X-ray sources, 5) the exposure of low-level accretion in otherwise bright X-ray binaries, and 6) the identification of a candidate X-ray binary/millisecond radio pulsar transitional object. We also reflect on future science to be done by continuing this Swift legacy campaign, such as high-cadence monitoring to study how the interaction between the gaseous object 'G2' and Sgr A* plays out in the future.

  20. Activation and implementation of a Federal Radiological Monitoring and Assessment Center

    SciTech Connect

    Doyle, J.F. III

    1989-01-01

    The Nevada Operations Office of the U.S. Department of Energy (DOE/NV) has been assigned the primary responsibility for responding to a major radiological emergency. The initial response to any radiological emergency, however, will probably be conducted under the DOE regional radiological assistance plan (RAP). If the dimensions of the crisis demand federal assistance, the following sequence of events may be anticipated: (1) DOE regional RAP response, (2) activation of the Federal Radiological Monitoring and Assessment Center (FRMAC) requested, (3) aerial measuring systems and DOE/NV advance party respond, (4) FRMAC activated, (5) FRMAC responds to state(s) and cognizant federal agency (CFA), and (6) management of FRMAC transferred to the Environmental Protection Agency (EPA). The paper discusses activation channels, authorization, notification, deployment, and interfaces.

  1. The parkfield, california, earthquake prediction experiment.

    PubMed

    Bakun, W H; Lindh, A G

    1985-08-16

    Five moderate (magnitude 6) earthquakes with similar features have occurred on the Parkfield section of the San Andreas fault in central California since 1857. The next moderate Parkfield earthquake is expected to occur before 1993. The Parkfield prediction experiment is designed to monitor the details of the final stages of the earthquake preparation process; observations and reports of seismicity and aseismic slip associated with the last moderate Parkfield earthquake in 1966 constitute much of the basis of the design of the experiment. PMID:17739363

  2. Impending ionospheric anomaly preceding the Iquique Mw8.2 earthquake in Chile on 2014 April 1

    NASA Astrophysics Data System (ADS)

    Guo, Jinyun; Li, Wang; Yu, Hongjuan; Liu, Zhimin; Zhao, Chunmei; Kong, Qiaoli

    2015-12-01

    To investigate the coupling between great earthquakes and the ionosphere, GPS-derived total electron contents (TECs) from the Center for Orbit Determination in Europe and foF2 data from the Space Weather Prediction Center were used to analyse impending ionospheric anomalies before the Iquique Mw 8.2 earthquake in Chile on 2014 April 1. After eliminating the effects of solar and geomagnetic activity on the ionosphere with a sliding interquartile range using a 27-day window, the TEC analysis shows that negative anomalies occurred on the 15th day prior to the earthquake and positive anomalies appeared on the 5th day before the earthquake. The foF2 analysis for the ionosonde stations Jicamarca, Concepcion, and Ramey shows that foF2 increased by 40, 50, and 45 per cent, respectively, on the 5th day before the earthquake. The spatial distribution of TEC anomalies indicates a widespread TEC decrement over the epicentre, lasting 6 hr, on the 15th day before the earthquake. On the 5th day before the earthquake, the TEC over the epicentre increased with an amplitude of 15 TECu, and the duration exceeded 6 hr. The anomalies occurred on the side away from the equator. All TEC anomalies during these days were within the bounds of the equatorial anomaly zone, which should therefore be the focal area for monitoring ionospheric anomalies before strong earthquakes. The relationship between ionospheric anomalies and geomagnetic activity was examined by cross wavelet analysis, which implied that foF2 was not affected by magnetic activity on the 15th and 5th days prior to the earthquake, but the TECs were partially affected by anomalous magnetic activity during some periods of the 5th day prior to the earthquake.
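    The anomaly-screening step described above (a sliding interquartile range with a 27-day window) can be sketched as follows. The bound of median ± 1.5 × IQR is a common choice in TEC precursor studies and is assumed here; the paper's exact bounds and window handling may differ.

```python
import numpy as np

def tec_anomalies(tec, window=27, k=1.5):
    """Flag TEC values falling outside median +/- k*IQR computed over the
    preceding `window` days (27 days spans roughly one solar rotation, which
    helps suppress solar-activity effects). Returns +1 for positive anomalies,
    -1 for negative anomalies, 0 otherwise."""
    tec = np.asarray(tec, dtype=float)
    flags = np.zeros(len(tec), dtype=int)
    for i in range(window, len(tec)):
        past = tec[i - window:i]
        q1, med, q3 = np.percentile(past, [25, 50, 75])
        iqr = q3 - q1
        if tec[i] > med + k * iqr:
            flags[i] = 1
        elif tec[i] < med - k * iqr:
            flags[i] = -1
    return flags

# Example: flag the last value of a synthetic 60-day TEC series.
rng = np.random.default_rng(2)
series = 20 + rng.normal(0, 1.5, 60)
series[-1] += 8.0                      # inject a positive excursion
print(tec_anomalies(series)[-1])       # expected: 1 (positive anomaly)
```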

  3. Earthquakes and the urban environment. Volume III

    SciTech Connect

    Berlin, G.L.

    1980-01-01

    Because of the complex nature of earthquake effects, current investigations encompass many disciplines, including those of both the physical and social sciences. Research activities center on such diversified topics as earthquake mechanics, earthquake prediction and control, the prompt and accurate detection of tsunamis (seismic sea waves), earthquake-resistant construction, seismic building code improvements, land use zoning, earthquake risk and hazard perception, disaster preparedness, plus the study of the concerns and fears of people who have experienced the effects of an earthquake. This monograph attempts to amalgamate recent research input comprising the vivifying components of urban seismology at a level useful to those having an interest in the earthquake and its effects upon an urban environment. Volume 3 contains chapters on seismic planning, social aspects and future prospects.

  4. Incorporating Fundamentals of Climate Monitoring into Climate Indicators at the National Climatic Data Center

    NASA Astrophysics Data System (ADS)

    Arndt, D. S.

    2014-12-01

    In recent years, much attention has been dedicated to the development, testing, and implementation of climate indicators. Several Federal agencies and academic groups have commissioned suites of indicators drawing upon and aggregating information available across the spectrum of climate data stewards and providers. As a long-time participant in the applied climatology discipline, NOAA's National Climatic Data Center (NCDC) has generated climate indicators for several decades. Traditionally, these indicators were developed for sectors with long-standing relationships with, and needs of, the applied climatology field. These have recently been adopted and adapted to meet the needs of sectors that have newfound sensitivities to climate and needs for climate data. Information and indices from NOAA's National Climatic Data Center have been prominent components of these indicator suites, and in some cases have been drafted in toto by these aggregators, often with improvements to the communicability and aesthetics of the indicators themselves. Across this history of supporting needs for indicators, NCDC climatologists have developed a handful of practical approaches and philosophies that inform a successful climate monitoring product. This manuscript and presentation will demonstrate the utility of this set of practical approaches for translating raw data into useful information.

  5. Earthquake physics from small to global scales Eric G. Daub

    E-print Network

    Daub,Eric G.

    Earthquake physics from small to global scales. Eric G. Daub, Geophysics Group / Center for Nonlinear Studies. Global scales: megaquakes. Goal: improve our understanding of the basic physics of earthquakes, an interdisciplinary problem.

  6. Satellite relay telemetry of seismic data in earthquake prediction and control

    USGS Publications Warehouse

    Jackson, Wayne H.; Eaton, Jerry P.

    1971-01-01

    The Satellite Telemetry Earthquake Monitoring Program was started in FY 1968 to evaluate the applicability of satellite relay telemetry in the collection of seismic data from a large number of dense seismograph clusters laid out along the major fault systems of western North America. Prototype clusters utilizing phone-line telemetry were then being installed by the National Center for Earthquake Research (NCER) in 3 regions along the San Andreas fault in central California; the experience of installing and operating the clusters, and of reducing and analyzing the seismic data from them, was to provide the raw materials for evaluation in the satellite relay telemetry project.

  7. Data and Safety Monitoring Plan The University of Chicago Comprehensive Cancer Center

    E-print Network

    Sherman, S. Murray

    Fragment of the plan's table of contents: Data Quality Control - Audit Program; Clinical Trials Review Committee - Risk Determination; Scientific and Accrual Monitoring; Phase I and II Conferences; Other Safety Monitoring Conferences; High Risk Protocols.

  8. Interoperable Access to Near Real Time Ocean Observations with the Observing System Monitoring Center

    NASA Astrophysics Data System (ADS)

    O'Brien, K.; Hankin, S.; Mendelssohn, R.; Simons, R.; Smith, B.; Kern, K. J.

    2013-12-01

    The Observing System Monitoring Center (OSMC), a project funded by the National Oceanic and Atmospheric Administration's Climate Observations Division (COD), exists to join the discrete 'networks' of in situ ocean observing platforms -- ships, surface floats, profiling floats, tide gauges, etc. -- into a single, integrated system. The OSMC is addressing this goal through capabilities in three areas focusing on the needs of specific user groups: 1) it provides real time monitoring of the integrated observing system assets to assist management in optimizing the cost-effectiveness of the system for the assessment of climate variables; 2) it makes the stream of real time data coming from the observing system available to scientific end users in an easy-to-use form; and 3) in the future, it will unify the delayed-mode data from platform-focused data assembly centers into a standards-based distributed system that is readily accessible to interested users from the science and education communities. In this presentation, we will be focusing on the efforts of the OSMC to provide interoperable access to the near real time data stream that is available via the Global Telecommunications System (GTS). This is a very rich data source, and includes data from nearly all of the oceanographic platforms that are actively observing. We will discuss how the data is being served out using a number of widely used 'web services' (including OPeNDAP and SOS) and downloadable file formats (KML, csv, xls, netCDF), so that it can be accessed in web browsers and popular desktop analysis tools. We will also be discussing our use of the Environmental Research Division's Data Access Program (ERDDAP), available from NOAA/NMFS, which has allowed us to achieve our goals of serving the near real time data. From an interoperability perspective, it's important to note that access to this stream of data is not just for humans, but also for machine-to-machine requests. We'll also delve into how we configured access to the near real time ocean observations in accordance with the Climate and Forecast (CF) metadata conventions describing the various 'feature types' associated with particular in situ observation types, or discrete sampling geometries (DSG). Wrapping up, we'll discuss some of the ways this data source is already being used.
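    As an illustration of the kind of machine-to-machine access described above, the sketch below requests a day of observations from an ERDDAP tabledap endpoint as CSV and loads it with pandas. The server URL, dataset ID, and variable names are placeholders, not the actual OSMC endpoint.

```python
import pandas as pd

# Hypothetical ERDDAP server and dataset ID; the real OSMC endpoint,
# dataset name, and variables may differ.
base = "https://example-erddap-server.gov/erddap/tabledap"
dataset = "osmc_realtime"
query = (
    "?platform_code,time,latitude,longitude,sst"
    "&time>=2013-09-01T00:00:00Z&time<2013-09-02T00:00:00Z"
)
url = f"{base}/{dataset}.csv{query}"

# ERDDAP CSV responses include a units row after the header, hence skiprows=[1].
df = pd.read_csv(url, skiprows=[1])
print(df.head())
```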

  9. Earthquake and tsunami forecasts: Relation of slow slip events to subsequent earthquake rupture

    PubMed Central

    Dixon, Timothy H.; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

    2014-01-01

    The 5 September 2012 Mw 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr–Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential. PMID:25404327

  10. America's faulty earthquake plans

    SciTech Connect

    Rosen, J

    1989-10-01

    In this article, the author discusses the likelihood of major earthquakes in both the western and eastern United States, as well as the level of preparedness of each region of the U.S. for a major earthquake. Current technology in both earthquake-resistant design and earthquake detection is described. Governmental programs for earthquake hazard reduction are outlined and critiqued.

  11. Chlorine dioxide: a new agent for dialysis monitor disinfection in a pediatric center.

    PubMed

    Palo, T D; Atti, M; Bellantuono, R; Giordano, M; Caringella, D A

    1997-01-01

    In order to evaluate the bacterial and endotoxin contamination in the dialysis fluids of our pediatric center and the effectiveness of chlorine dioxide (CD) compared with a conventional method, (1) deionized water, (2) dialysate fluid, (3) basic concentrate, and (4) acid concentrate were tested in 4 dialysis machines. Monitor sterilization was performed using CD in protocol A and sodium hypochlorite/acetic acid in protocol B. Once every 2 weeks the deionized water distribution set was routinely disinfected with peracetic acid. Each protocol lasted 1 month, and the samples were taken, under aseptic conditions, on the 15th, 22nd and 27th day. All samples, at all stages of the study, showed an endotoxin concentration below the limits recommended by the Canadian Standard Association. Fifty-nine out of 72 samples in A and 62 out of 72 samples in B showed a bacterial count within the range recommended by the Association for the Advancement of Medical Instrumentation. The data show that both protocols produced the same results. However, protocol A is to be preferred for its simultaneous disinfecting-cleaning and descaling activity, which proves time-saving. PMID:9262845

  12. Multi-Year Combination of Tide Gauge Benchmark Monitoring (TIGA) Analysis Center Products

    NASA Astrophysics Data System (ADS)

    Hunegnaw, A.; Teferle, F. N.

    2014-12-01

    In 2013 the International GNSS Service (IGS) Tide Gauge Benchmark Monitoring (TIGA) Working Group (WG) started their reprocessing campaign, which proposes to re-analyze all relevant Global Positioning System (GPS) observations from 1994 to 2013. This re-processed dataset will provide high quality estimates of land motions, enabling regional and global high-precision geophysical/geodetic studies. Several of the individual TIGA Analysis Centers (TACs) have completed processing the full history of GPS observations recorded by the IGS global network, as well as many other GPS stations at or close to tide gauges, which are available from the TIGA data centre at the University of La Rochelle (www.sonel.org). The TAC solutions contain a total of over 700 stations. Following the recent improvements in processing models and strategies, this is the first complete reprocessing attempt by the TIGA WG to provide homogeneous position time series. The TIGA Combination Centre (TCC) at the University of Luxembourg (UL) has computed a first multi-year weekly combined solution using two independent combination software packages: CATREF and GLOBK. These combinations allow an evaluation of any effects from the combination software and of the individual TAC contributions and their influences on the combined solution. In this study we will present the first UL TIGA multi-year combination results and discuss these in terms of geocentric sea level changes.
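    The core idea of combining several analysis-centre estimates of the same quantity can be sketched with a simple inverse-variance weighting. Real combinations such as those produced with CATREF or GLOBK also estimate and remove Helmert transformations (translations, rotations, scale) between solutions and handle full covariance information; the values below are invented for illustration.

```python
import numpy as np

def combine_solutions(positions, sigmas):
    """Inverse-variance weighted combination of one station coordinate from
    several analysis-centre solutions for a single week."""
    w = 1.0 / np.asarray(sigmas) ** 2
    combined = np.sum(w * np.asarray(positions)) / np.sum(w)
    combined_sigma = np.sqrt(1.0 / np.sum(w))
    return combined, combined_sigma

# Example: three hypothetical TAC estimates of a station height (metres).
est, sig = combine_solutions([48.312, 48.309, 48.315], [0.004, 0.003, 0.006])
print(f"combined height: {est:.4f} m +/- {sig:.4f} m")
```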

  13. The Swift X-ray monitoring campaign of the center of the Milky Way

    E-print Network

    Degenaar, N; Miller, J M; Reynolds, M T; Kennea, J; Gehrels, N

    2015-01-01

    In 2006 February, shortly after its launch, Swift began monitoring the center of the Milky Way with the onboard X-Ray Telescope using short 1-ks exposures performed every 1-4 days. Between 2006 and 2014, over 1200 observations have been obtained, amounting to ~1.2 Ms of exposure time. This has yielded a wealth of information about the long-term X-ray behavior of the supermassive black hole Sgr A*, and numerous transient X-ray binaries that are located within the 25'x25' region covered by the campaign. In this review we highlight the discoveries made during these first nine years, which includes 1) the detection of seven bright X-ray flares from Sgr A*, 2) the discovery of the magnetar SGR J1745-29, 3) the first systematic analysis of the outburst light curves and energetics of the peculiar class of very-faint X-ray binaries, 4) the discovery of three new transient X-ray sources, 5) exposing low-level accretion in otherwise bright X-ray binaries, and 6) the identification of a candidate X-ray binary/millisecon...

  14. Discrimination of quarry blasts and earthquakes in the vicinity of Istanbul using soft computing techniques

    NASA Astrophysics Data System (ADS)

    Y?ld?r?m, Eray; Gülba?, Ali; Horasan, Gündüz; Do?an, Emrah

    2011-09-01

    The purpose of this article is to demonstrate the use of feedforward neural networks (FFNNs), adaptive neural fuzzy inference systems (ANFIS), and probabilistic neural networks (PNNs) to discriminate between earthquakes and quarry blasts in Istanbul and vicinity (the Marmara region). The tectonically active Marmara region is affected by the Thrace-Eski?ehir fault zone and especially the North Anatolian fault zone (NAFZ). Local MARNET stations, which were established in 1976 and are operated by the Kandilli Observatory and Earthquake Research Institute (KOERI), record not only earthquakes that occur in the region, but also quarry blasts. There are a few quarry-blasting areas in the Gaziosmanpa?a, Çatalca, Ömerli, and Hereke regions. Analytical methods were applied to a set of 175 seismic events (2001-2004) recorded by the stations of the local seismic network (ISK, HRT, and CTT stations) operated by the KOERI National Earthquake Monitoring Center (NEMC). Out of a total of 175 records, 148 are related to quarry blasts and 27 to earthquakes. The data sets were divided into training and testing sets for each region. In all the models developed, the input vectors consist of the peak amplitude ratio (S/P ratio) and the complexity value, and the output is a determination of either earthquake or quarry blast. The success of the developed models on regional test data varies between 97.67% and 100%.
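    A minimal sketch of the FFNN-style discrimination described above is given below, using the same two input features (S/P peak amplitude ratio and complexity) and a small feedforward network. The synthetic feature values and class separations are invented for illustration and do not reproduce the study's 175 Marmara-region records or its reported accuracies.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(42)

# Synthetic stand-in for the two features (S/P amplitude ratio, complexity);
# the cluster locations are arbitrary and not the real measurements.
n = 150
blasts = np.column_stack([rng.normal(1.0, 0.3, n), rng.normal(0.8, 0.3, n)])
quakes = np.column_stack([rng.normal(2.5, 0.5, n), rng.normal(2.0, 0.5, n)])
X = np.vstack([blasts, quakes])
y = np.concatenate([np.zeros(n), np.ones(n)])   # 0 = quarry blast, 1 = earthquake

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.2%}")
```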

  15. Cooperative Monitoring Center Occasional Paper/11: Cooperative Environmental Monitoring in the Coastal Regions of India and Pakistan

    SciTech Connect

    Rajen, Gauray

    1999-06-01

    The cessation of hostilities between India and Pakistan is an immediate need and of global concern, as these countries have tested nuclear devices, and have the capability to deploy nuclear weapons and long-range ballistic missiles. Cooperative monitoring projects among neighboring countries in South Asia could build regional confidence, and, through gradual improvements in relations, reduce the threat of war and the proliferation of weapons of mass destruction. This paper discusses monitoring the trans-border movement of flow and sediment in the Indian and Pakistani coastal areas. Through such a project, India and Pakistan could initiate greater cooperation, and engender movement towards the resolution of the Sir Creek territorial dispute in their coastal region. The Joint Working Groups dialogue being conducted by India and Pakistan provides a mechanism for promoting such a project. The proposed project also falls within a regional framework of cooperation agreed to by several South Asian countries. This framework has been codified in the South Asian Seas Action Plan, developed by Bangladesh, India, Maldives, Pakistan and Sri Lanka. This framework provides a useful starting point for Indian and Pakistani cooperative monitoring in their trans-border coastal area. The project discussed in this paper involves computer modeling, the placement of in situ sensors for remote data acquisition, and the development of joint reports. Preliminary computer modeling studies are presented in the paper. These results illustrate the cross-flow connections between Indian and Pakistani coastal regions and strengthen the argument for cooperation. Technologies and actions similar to those suggested for the coastal project are likely to be applied in future arms control and treaty verification agreements. The project, therefore, serves as a demonstration of cooperative monitoring technologies. The project will also increase people-to-people contacts among Indian and Pakistani policy makers and scientists. In the perceptions of the general public, the project will crystallize the idea that the two countries share ecosystems and natural resources, and have a vested interest in increased collaboration.

  16. Earthquakes and plate tectonics.

    USGS Publications Warehouse

    Spall, H.

    1982-01-01

    Earthquakes occur at the following three kinds of plate boundary: ocean ridges where the plates are pulled apart, margins where the plates scrape past one another, and margins where one plate is thrust under the other. Thus, we can predict the general regions on the earth's surface where we can expect large earthquakes in the future. We know that each year about 140 earthquakes of magnitude 6 or greater will occur within this area, which is 10% of the earth's surface. But on a worldwide basis we cannot say with much accuracy when these events will occur. The reason is that the processes in plate tectonics have been going on for millions of years. Averaged over this interval, plate motions amount to several mm per year. But at any instant in geologic time, for example the year 1982, we do not know exactly where we are in the worldwide cycle of strain build-up and strain release. Only by monitoring the stress and strain in small areas, for instance the San Andreas fault, in great detail can we hope to predict when renewed activity in that part of the plate tectonics arena is likely to take place. -from Author

  17. Earthquakes in Arkansas and vicinity 1699-2010

    USGS Publications Warehouse

    Dart, Richard L.; Ausbrooks, Scott M.

    2011-01-01

    This map summarizes approximately 300 years of earthquake activity in Arkansas. It is one in a series of similar State earthquake history maps. Work on the Arkansas map was done in collaboration with the Arkansas Geological Survey. The earthquake data plotted on the map are from several sources: the Arkansas Geological Survey, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Mississippi Department of Environmental Quality. In addition to earthquake locations, other materials presented include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Arkansas and parts of adjacent states. Arkansas has undergone a number of significant felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931 and a magnitude 4.3 earthquake in 1967. The map shows all historical and instrumentally located earthquakes in Arkansas and vicinity between 1811 and 2010. The largest historic earthquake in the vicinity of the State was an intensity XI event on December 16, 1811, the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region.

  18. Using graphics and expert system technologies to support satellite monitoring at the NASA Goddard Space Flight Center

    NASA Astrophysics Data System (ADS)

    Hughes, Peter M.; Shirah, Gregory W.; Luczak, Edward C.

    1994-11-01

    At NASA's Goddard Space Flight Center, fault-isolation expert systems have been developed to support data monitoring and fault detection tasks in satellite control centers. Based on the lessons learned during these efforts in expert system automation, a new domain-specific expert system development tool named the Generic Spacecraft Analysts Assistant (GenSAA), was developed to facilitate the rapid development and reuse of real-time expert systems to serve as fault-isolation assistants for spacecraft analysts. This paper describes GenSAA's capabilities and how it is supporting monitoring functions of current and future NASA missions for a variety of satellite monitoring applications ranging from subsystem health and safety to spacecraft attitude. Finally, this paper addresses efforts to generalize GenSAA's data interface for more widespread usage throughout the space and commercial industry.

  19. Using graphics and expert system technologies to support satellite monitoring at the NASA Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Hughes, Peter M.; Shirah, Gregory W.; Luczak, Edward C.

    1994-01-01

    At NASA's Goddard Space Flight Center, fault-isolation expert systems have been developed to support data monitoring and fault detection tasks in satellite control centers. Based on the lessons learned during these efforts in expert system automation, a new domain-specific expert system development tool named the Generic Spacecraft Analysts Assistant (GenSAA), was developed to facilitate the rapid development and reuse of real-time expert systems to serve as fault-isolation assistants for spacecraft analysts. This paper describes GenSAA's capabilities and how it is supporting monitoring functions of current and future NASA missions for a variety of satellite monitoring applications ranging from subsystem health and safety to spacecraft attitude. Finally, this paper addresses efforts to generalize GenSAA's data interface for more widespread usage throughout the space and commercial industry.

  20. Improvements of the RST (Robust Satellite Techniques) approach for the thermal monitoring of the earthquake prone areas: an analysis on Italian peninsula in the period 2004-2012

    NASA Astrophysics Data System (ADS)

    Genzano, N.; Paciello, R.; Pergola, N.; Tramutoli, V.

    2013-12-01

    In the past, a Robust Satellite data analysis Technique (RST) was proposed to investigate possible relations between earthquake occurrence and space-time fluctuations of the Earth's emitted TIR radiation observed from satellite. Based on a statistical definition of 'TIR anomalies', it allows their identification even under very different natural (e.g. related to the atmosphere and/or surface) and observational (e.g. related to time/season, but also to solar and satellite zenithal angles) conditions. The RST approach has been implemented on different polar and geostationary satellite systems (e.g. MSG/SEVIRI, GOES/IMAGER, EOS/MODIS, NOAA/AVHRR, etc.) and applied to earthquakes with a wide range of magnitudes (from 4.0 to 7.9) that occurred in different tectonic contexts around the world. In this paper, in order to further reduce false positives due to particular meteorological conditions, a refined RST approach is presented and validated on a long time series (9 years) of TIR satellite records collected by the geostationary sensor MSG/SEVIRI over the Italian peninsula. The space-time persistence analysis performed on the TIR anomaly maps shows a significant reduction of false positives and several sequences of TIR anomalies in a significant space-time relation with earthquakes of M > 4. The relations among particular features of the TIR anomalies (e.g. space-time extent and intensity) and of the earthquakes (e.g. magnitude, depth, focal mechanism) are also discussed.
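    The statistical definition of a 'TIR anomaly' at the heart of RST can be sketched as a per-pixel standardized index computed against a multi-year reference built from homogeneous observations (same month, same acquisition time). The sketch below is a simplification: the exact signal definition (e.g. spatially differenced brightness temperature) and the anomaly threshold follow the RST literature and are only assumed here.

```python
import numpy as np

def rst_index(tir_stack, current_image):
    """Per-pixel standardized index: (current - historical mean) / historical
    standard deviation, where the historical statistics come from a multi-year
    stack of homogeneous observations. Pixels exceeding a chosen threshold
    (often 2-4 sigma in the RST literature) would be flagged as TIR anomalies."""
    mean = np.nanmean(tir_stack, axis=0)
    std = np.nanstd(tir_stack, axis=0)
    with np.errstate(invalid="ignore", divide="ignore"):
        return (current_image - mean) / std

# Example with a synthetic 9-year stack of 50x50 brightness-temperature fields.
rng = np.random.default_rng(3)
stack = rng.normal(280.0, 2.0, size=(9, 50, 50))
today = rng.normal(280.0, 2.0, size=(50, 50))
today[10, 10] += 12.0                        # inject a localized warm excursion
index = rst_index(stack, today)
print(f"index at injected pixel: {index[10, 10]:.1f} sigma")
```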

  1. Defeating Earthquakes

    NASA Astrophysics Data System (ADS)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question: how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake. GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by our actions. Using these global datasets will help to make the model as uniform as possible. The model must be built by scientists in the affected countries with GEM's support, augmented by their insights and data. The model will launch in 2014; to succeed it must be open, international, independent, and continuously tested. But the mission of GEM is not just to estimate the likelihood of ground shaking, but also to gauge the economic and social consequences of earthquakes, which greatly amplify the losses. For example, should the municipality of Istanbul retrofit schools, or increase its insurance reserves and recovery capacity? Should a homeowner in a high-risk area move or strengthen her building? This is why GEM is a public-private partnership. GEM's fourteen public sponsors and eight non-governmental organization members are standing for the developing world. To extend GEM into the financial world, we draw upon the expertise of companies. GEM's ten private sponsors have endorsed the acquisition of public knowledge over private gain. In a competitive world, this is a courageous act. GEM is but one link in a chain of preparedness: from earth science and engineering research, through groups like GEM, to mitigation, retrofit or relocate decisions, building codes and insurance, and finally to prepared hospitals, schools, and homes. But it is a link that our community can make strong.

  2. RST (Robust Satellite Techniques) analysis for monitoring earth emitted radiation at the time of the Hector Mine 16th October 1999 earthquake

    NASA Astrophysics Data System (ADS)

    Lisi, M.; Filizzola, C.; Genzano, N.; Mazzeo, G.; Pergola, N.; Tramutoli, V.

    2009-12-01

    In past years, several studies have reported the appearance of space-time anomalies in TIR satellite imagery, from weeks to days, before severe earthquakes. Different authors, in order to explain the appearance of anomalously high TIR records near the place and the time of earthquake occurrence, attributed their appearance to an increase in green-house gas (such as CO2, CH4, etc.) emission rates, to a modification of the groundwater regime, and/or to an increase in convective heat flux. Among the others, a Robust Satellite data analysis Technique (RST), based on the RAT - Robust AVHRR (Advanced Very High Resolution Radiometer) Techniques - approach, was proposed to investigate possible relations between earthquake occurrence and space-time fluctuations of the Earth’s emitted TIR radiation observed from satellite. The RST analysis is based on a statistical definition of “TIR anomalies” allowing their identification even under very different natural (e.g. related to the atmosphere and/or surface) and observational (e.g. related to time/season, but also to solar and satellite zenithal angles) conditions. The correlation analysis (in the space-time domain) with earthquake occurrence is always carried out using a validation/confutation approach, in order to verify the presence/absence of anomalous space-time TIR transients in the presence/absence of significant seismic activity. The RST approach was already tested in the case of tens of earthquakes that occurred on different continents (Europe, Asia, America and Africa), in various geo-tectonic settings (compressive, extensional and transcurrent) and with a wide range of magnitudes (from 4.0 to 7.9). In this paper, the results of RST analysis performed over 7 years of TIR satellite records collected over the western part of the United States of America at the time of the Hector Mine earthquake (16th October 1999, M 7.1) are presented and compared with an identical analysis (confutation) performed in different years (characterized by the absence of earthquakes of similar magnitude over the same area), in order to verify the presence/absence of anomalous space-time TIR transients in both cases.

  3. A smartphone application for earthquakes that matter!

    NASA Astrophysics Data System (ADS)

    Bossu, Rémy; Etivant, Caroline; Roussel, Fréderic; Mazet-Roux, Gilles; Steed, Robert

    2014-05-01

    Smartphone applications have swiftly become one of the most popular tools for rapid reception of earthquake information by the public, some of them having been downloaded more than 1 million times! The advantages are obvious: wherever they are located, users can be automatically informed when an earthquake has struck. Just by setting a magnitude threshold and an area of interest, there is no longer any need to browse the internet, as the information reaches you automatically and instantaneously! One question remains: are the provided earthquake notifications always relevant for the public? Which earthquakes really matter to laypeople? One clue may be derived from newspaper reports showing that, a while after damaging earthquakes, many eyewitnesses delete the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of the notifications are related to unfelt earthquakes, thereby only increasing anxiety among the population at each new update. Felt and damaging earthquakes are the ones that matter most to the public (and authorities). They are the ones of societal importance, even when of small magnitude. A smartphone application developed by EMSC (Euro-Med Seismological Centre) with the financial support of the Fondation MAIF aims at providing suitable notifications for earthquakes by collating different information threads covering tsunamigenic, potentially damaging, and felt earthquakes. Tsunamigenic earthquakes are considered here to be those that are the subject of alert or information messages from the PTWC (Pacific Tsunami Warning Centre). Potentially damaging earthquakes are identified through an automated system called EQIA (Earthquake Qualitative Impact Assessment), developed and operated at EMSC, which rapidly assesses earthquake impact by comparing the population exposed to each expected level of shaking intensity with empirical models of fatality losses calibrated on past earthquakes in each country. Non-seismic detections and macroseismic questionnaires collected online are combined to identify as many felt earthquakes as possible, regardless of their magnitude. Non-seismic detections include Twitter earthquake detections, developed by the US Geological Survey, where the number of tweets containing the keyword "earthquake" is monitored in real time, and flashsourcing, developed by the EMSC, which detects traffic surges on its rapid earthquake information website caused by the natural convergence of eyewitnesses who rush to the Internet to investigate the cause of the shaking they have just felt. Altogether, we estimate the number of detected felt earthquakes at around 1,000 per year, compared with the 35,000 earthquakes annually reported by the EMSC! Felt events are already the subject of the web page "Latest significant earthquakes" on the EMSC website (http://www.emsc-csem.org/Earthquake/significant_earthquakes.php) and of a dedicated Twitter service @LastQuake. We will present the identification process for the earthquakes that matter, the smartphone application itself (to be released in May), and its future evolutions.
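
    The collation of tsunamigenic, potentially damaging, and felt threads described above can be pictured as a simple relevance filter. The sketch below is hypothetical and is not the EMSC application logic; the field names and the felt-report threshold are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class QuakeReport:
    magnitude: float
    tsunami_message: bool = False     # e.g., flagged by a tsunami warning centre (assumed field)
    expected_fatality_level: int = 0  # e.g., output of an impact-assessment model (assumed field)
    felt_reports: int = 0             # e.g., online questionnaires, traffic surges, tweets

def is_notifiable(q: QuakeReport, felt_threshold: int = 20) -> bool:
    """Collate three threads: tsunamigenic, potentially damaging, and felt."""
    if q.tsunami_message:
        return True
    if q.expected_fatality_level > 0:
        return True
    return q.felt_reports >= felt_threshold

events = [
    QuakeReport(magnitude=4.1, felt_reports=350),      # small but widely felt -> notify
    QuakeReport(magnitude=5.8, felt_reports=2),        # offshore, unfelt -> skip
    QuakeReport(magnitude=7.4, tsunami_message=True),  # tsunamigenic -> notify
]
print([is_notifiable(e) for e in events])  # [True, False, True]
```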

  4. 77 FR 53225 - National Earthquake Prediction Evaluation Council (NEPEC)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-31

    ... Geological Survey National Earthquake Prediction Evaluation Council (NEPEC) AGENCY: Department of the... National Earthquake Prediction Evaluation Council (NEPEC) will hold a 1\\1/2\\ day meeting on September 17 and 18, 2012, at the U.S. Geological Survey National Earthquake Information Center (NEIC),...

  5. A Parallel Visualization Pipeline for Terascale Earthquake Simulations

    E-print Network

    Ma, Kwan-Liu

    A Parallel Visualization Pipeline for Terascale Earthquake Simulations. Hongfeng Yu, Kwan-Liu Ma ... at the Pittsburgh Supercomputing Center (PSC) for studying the largest earthquake simulation ever performed ... visualization, volume rendering ... 1. INTRODUCTION Large-scale computer modeling of the earthquake-induced ground

  6. 4th International Conference on Earthquake Engineering Taipei, Taiwan

    E-print Network

    Bruneau, Michel

    4th International Conference on Earthquake Engineering Taipei, Taiwan October 12-13, 2006 Paper No Center for Research on Earthquake Engineering. The paper focuses on the design procedures, experimental accelerations, which were recorded in the 1999 Chi-Chi earthquake and scaled up to represent seismic hazards

  7. 77 FR 53225 - National Earthquake Prediction Evaluation Council (NEPEC)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-31

    ...USGS-GX12GG00995NP00] National Earthquake Prediction Evaluation Council (NEPEC...to Public Law 96-472, the National Earthquake Prediction Evaluation Council (NEPEC...the U.S. Geological Survey National Earthquake Information Center (NEIC), 1711...

  8. Cooperative Monitoring Center Occasional Paper/4: Missile Control in South Asia and the Role of Cooperative Monitoring Technology

    SciTech Connect

    Kamal, N.; Sawhney, P.

    1998-10-01

    The succession of nuclear tests by India and Pakistan in May 1998 has changed the nature of their missile rivalry, which is only one of numerous manifestations of their relationship as hardened adversaries, deeply sensitive to each other's existing and evolving defense capabilities. The political context surrounding this costly rivalry remains unmediated by arms control measures or by any nascent prospect of detente. As a parallel development, sensible voices in both countries will continue to talk of building mutual confidence through openness to avert accidents, misjudgments, and misinterpretations. To facilitate a future peace process, this paper offers possible suggestions for stabilization that could be applied to India's and Pakistan's missile situation. Appendices include descriptions of existing missile agreements that have contributed to better relations for other countries as well as a list of the cooperative monitoring technologies available to provide information useful in implementing subcontinent missile regimes.

  9. Identified EM Earthquake Precursors

    NASA Astrophysics Data System (ADS)

    Jones, Kenneth, II; Saxton, Patrick

    2014-05-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and, in turn, to warn the public. Presently, the animal kingdom leads the precursor list, alluding to a transmission-related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides in what exactly is forecastable and in the direction the EM investigation should take. After a number of custom rock experiments, two hypotheses were formed that could answer the EM wave model. The first hypothesis concerned a sufficient and continuous electron movement, either by surface or penetrative flow, and the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate for creating strong EM fields, because rock has a very high electrical resistance, making it a high-quality insulator. Penetrative flow could not be corroborated either, because it was discovered that rock was absorbing and confining electrons to a very thin skin depth. Radio wave transmission and detection worked with every single test administered. This hypothesis was reviewed for propagating, long-wave generation with sufficient amplitude and the capability of penetrating solid rock. Additionally, fracture spaces, either air- or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals have been detected in the field, occurring with associated phases, using custom-built loop antennae. Field testing was conducted in Southern California from 2006-2011 and outside the NE Texas town of Timpson in February 2013. The antennae have mobility, and observations were noted for recurrence, duration, and frequency response. At the Southern California field sites, one loop antenna was positioned for omni-directional reception and also detected a strong First Schumann Resonance; however, additional Schumann Resonances were absent. At the Timpson, TX field sites, loop antennae were positioned for directional reception, owing to earthquake activity induced by hydraulic fracturing currently conducted by the oil and gas industry. Two strong signals, one moderately strong signal, and approximately 6-8 weaker signals were detected in the immediate vicinity. The three stronger signals were mapped by a biangulation technique, followed by a triangulation technique for confirmation. This was the first antenna mapping technique ever performed for determining possible earthquake epicenters. Six and a half months later, Timpson experienced two M4 (M4.1 and M4.3) earthquakes on September 2, 2013, followed by a M2.4 earthquake three days later, all occurring at a depth of five kilometers. The Timpson earthquake activity now has a cyclical rate, and a forecast was given to the proper authorities. The Southern California and Timpson, TX field results led to an improved design and construction of a third prototype antenna. With a loop antenna array, a viable communication system, and continuous monitoring, a full fracture cycle can be established and observed in real time. In addition, field data could be reviewed quickly for assessment, leading to a much improved earthquake forecasting capability. The EM precursors determined by this method appear to surpass all prior precursor claims, and the general public will finally receive long overdue forecasting.
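
    The biangulation/triangulation mapping mentioned above amounts to intersecting bearings measured by directional antennae. The sketch below intersects two bearing lines on a local flat-Earth plane; the station coordinates and bearings are hypothetical, and the method is a generic geometric illustration rather than the authors' procedure.

```python
import math

def bearing_intersection(p1, brg1_deg, p2, brg2_deg):
    """Intersect two bearing lines on a local flat-Earth approximation.

    p1, p2             : (x, y) station positions in kilometres (local easting/northing).
    brg1_deg, brg2_deg : bearings in degrees clockwise from north.
    Returns the (x, y) intersection point, or None if the bearings are parallel.
    """
    # Direction vectors: bearing measured clockwise from north (the +y axis).
    d1 = (math.sin(math.radians(brg1_deg)), math.cos(math.radians(brg1_deg)))
    d2 = (math.sin(math.radians(brg2_deg)), math.cos(math.radians(brg2_deg)))
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        return None                      # parallel bearings never intersect
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t = (dx * d2[1] - dy * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Hypothetical stations 10 km apart, both pointing toward a source to the northeast.
print(bearing_intersection((0.0, 0.0), 45.0, (10.0, 0.0), 315.0))  # ~(5.0, 5.0)
```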

  10. Earthquakes in Mississippi and vicinity 1811-2010

    USGS Publications Warehouse

    Dart, Richard L.; Bograd, Michael B.E.

    2011-01-01

    This map summarizes two centuries of earthquake activity in Mississippi. Work on the Mississippi map was done in collaboration with the Mississippi Department of Environmental Quality, Office of Geology. The earthquake data plotted on the map are from several sources: the Mississippi Department of Environmental Quality, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Arkansas Geological Survey. In addition to earthquake locations, other materials include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Mississippi and parts of adjacent States. Mississippi has undergone a number of felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931 and a magnitude 4.3 earthquake in 1967. The map shows all historical and instrumentally located earthquakes in Mississippi and vicinity between 1811 and 2010. The largest historic earthquake in the vicinity of the State was an intensity XI event on December 16, 1811, the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region.

  11. Cooperative Monitoring Center Occasional Paper/7: A Generic Model for Cooperative Border Security

    SciTech Connect

    Netzer, Colonel Gideon

    1999-03-01

    This paper presents a generic model for dealing with security problems along borders between countries. It presents descriptions and characteristics of various borders and identifies the threats to border security, while emphasizing cooperative monitoring solutions.

  12. The development of a remote monitoring system for the Nuclear Science Center reactor 

    E-print Network

    Jiltchenkov, Dmitri Victorovich

    2002-01-01

    With funding provided by Nuclear Energy Research Initiative (NERI), design of Secure, Transportable, Autonomous Reactors (STAR) to aid countries with insufficient energy supplies is underway. The development of a new monitoring system that allows...

  13. Analysis of Instrumentation to Monitor the Hydrologic Performance of Green Infrastructure at the Edison Environmental Center

    EPA Science Inventory

    Infiltration is one of the primary functional mechanisms of green infrastructure stormwater controls, so this study explored selection and placement of embedded soil moisture and water level sensors to monitor surface infiltration and infiltration into the underlying soil for per...

  14. Research on Earthquake Precursor in E-TEC: A Study on Land Surface Thermal Anomalies Using MODIS LST Product in Taiwan

    NASA Astrophysics Data System (ADS)

    Chang, W. Y.; Wu, M. C.

    2014-12-01

    Taiwan has been known as an excellent natural laboratory characterized by rapid active tectonic rates and dense seismicity. The Eastern Taiwan Earthquake Research Center (E-TEC) was established on 2013/09/24 at National Dong Hwa University and collaborates with the Central Weather Bureau (CWB), the National Center for Research on Earthquake Engineering (NCREE), the National Science and Technology Center for Disaster Reduction (NCDR), the Institute of Earth Sciences of Academia Sinica (IES, AS), and other institutions (NCU, NTU, CCU). It aims to provide an integrated platform for researchers to pursue new advances in earthquake precursors and early warning for seismic disaster prevention in eastern Taiwan, where frequent temblors are most common in the East Taiwan rift valley. E-TEC intends to integrate multi-disciplinary observations and is equipped with stations to monitor a wide array of potential quake precursors, including seismicity, GPS, strainmeters, groundwater, geochemistry, gravity, electromagnetics, ionospheric density, thermal infrared remote sensing, gamma radiation, etc. It will maximize the value of these data for research, with a range of monitoring equipment that enables predicting where and when the next devastating earthquake will strike Taiwan and developing reliable earthquake prediction models. A preliminary study on earthquake precursors using monthly Moderate Resolution Imaging Spectroradiometer (MODIS) Land Surface Temperature (LST) data before the 2013/03/27 Mw 6.2 Nantou earthquake in Taiwan is presented. Using statistical analysis, the result shows that the peak of the anomalous LST, exceeding one standard deviation of the LST, appeared on 2013/03/09, and few or no anomalies were observed on 2013/03/16 before the main shock, which is consistent with the phenomenon observed by other researchers. This preliminary experimental result suggests that anomalous surface thermal phenomena may be associated with strong earthquakes before they occur.
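
    The screening step described above (flagging LST values that exceed a standard deviation of the long-term record) can be sketched as a per-calendar-month z-score test. The code below is a hedged illustration on synthetic data; the baseline construction and the threshold used in the example call are assumptions, not the E-TEC processing chain.

```python
import numpy as np

def lst_anomaly_flags(monthly_lst, n_std=1.0):
    """Flag months whose LST exceeds the long-term mean for that calendar month
    by more than n_std standard deviations (a simple proxy for the screening
    described in the abstract; baseline and threshold choices are assumptions).

    monthly_lst : 2-D array (n_years, 12) of monthly mean LST for one pixel/region.
    Returns a boolean array of the same shape marking anomalously warm months.
    """
    clim_mean = monthly_lst.mean(axis=0)         # per-calendar-month baseline
    clim_std = monthly_lst.std(axis=0)
    z = (monthly_lst - clim_mean) / clim_std     # standardized departure
    return z > n_std

# Synthetic 10-year record with a seasonal cycle and one implanted warm anomaly.
rng = np.random.default_rng(1)
series = 295.0 + 5.0 * np.cos(np.linspace(0, 2 * np.pi, 12)) + rng.normal(0, 0.8, (10, 12))
series[-1, 2] += 3.0                             # make March of the last year unusually warm
print(np.argwhere(lst_anomaly_flags(series, n_std=2.0)))  # (year, month) indices flagged
```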

  15. Efficacy of wildlife rehabilitation centers in surveillance and monitoring of pathogen activity: a case study with West Nile virus.

    PubMed

    Randall, Natalie J; Blitvich, Bradley J; Blanchong, Julie A

    2012-07-01

    Surveillance is critical for identifying and monitoring pathogen activity in wildlife populations, but often is cost- and time-prohibitive and logistically challenging. We tested the hypothesis that wildlife rehabilitation centers are useful for monitoring pathogen activity using West Nile virus (WNV) as a case study. We hypothesized that birds submitted to wildlife rehabilitation centers would have a similar prevalence of antibody to WNV as free-ranging birds. From 2008 to 2010, we collected sera from peridomestic birds submitted to the Wildlife Care Clinic (WCC), a wildlife rehabilitation center in central Iowa, and tested them for antibodies to WNV. We also collected and tested sera from free-ranging peridomestic birds in the area from which approximately 50% of WCC submissions historically originated. Prevalences of WNV antibodies in free-ranging birds and in peridomestic WCC birds were 2.3% (44/1,936) and 2.8% (2/72), respectively. However, none of the birds submitted to the WCC from the area where we captured free-ranging birds had antibodies (0/29). Our results indicate that rehabilitation facilities are not likely to be useful for monitoring WNV activity at small spatial scales or over short-time periods due to the low endemic prevalence of WNV, and low and variable submission rates. However, at larger spatial scales (ca. nine Iowa counties) WNV antibody prevalence in peridomestic birds submitted to the WCC was similar to that of free-ranging birds. Although limitations to using rehabilitation birds to monitor WNV must be considered, testing these birds could be useful for monitoring WNV activity regionally, especially with many states limiting surveillance due to budgetary constraints. PMID:22740530

  16. EARTHQUAKE PREPAREDNESS FOR LABORATORIES

    E-print Network

    Polly, David

    EARTHQUAKE PREPAREDNESS FOR LABORATORIES By: Christopher E. Kohler (Environmental Health and Safety) and Walter E. Gray (Indiana Geological Survey) Earthquakes occur with little or no warning, and so planning of an earthquake. While most historical earthquakes were minor, Indiana's proximity to two seismic zones

  17. 2011 TOHOKU-CHIHO TAIHEIYOU-OKI EARTHQUAKE

    E-print Network

    Guillas, Serge

    2011 Tohoku-Chiho Taiheiyou-Oki Earthquake. M. Hori, Earthquake Research Institute, University of Tokyo. Seminar on the Honshu Earthquake & Tsunami, UCL Institute for Risk & Disaster Reduction, March 24, 2011. Earthquake details: magnitude in Richter scale 9.0; moment magnitude 9.0; location 38.03N, 143.15E; depth

  18. Long Period Earthquakes Beneath California's Young and Restless Volcanoes

    NASA Astrophysics Data System (ADS)

    Pitt, A. M.; Dawson, P. B.; Shelly, D. R.; Hill, D. P.; Mangan, M.

    2013-12-01

    The newly established USGS California Volcano Observatory has the broad responsibility of monitoring and assessing hazards at California's potentially threatening volcanoes, most notably Mount Shasta, Medicine Lake, Clear Lake Volcanic Field, and Lassen Volcanic Center in northern California; and Long Valley Caldera, Mammoth Mountain, and Mono-Inyo Craters in east-central California. Volcanic eruptions occur in California about as frequently as the largest San Andreas Fault Zone earthquakes: more than ten eruptions have occurred in the last 1,000 years, most recently at Lassen Peak (1666 C.E. and 1914-1917 C.E.) and Mono-Inyo Craters (c. 1700 C.E.). The Long Valley region (Long Valley caldera and Mammoth Mountain) underwent several episodes of heightened unrest over the last three decades, including intense swarms of volcano-tectonic (VT) earthquakes, rapid caldera uplift, and hazardous CO2 emissions. Both Medicine Lake and Lassen are subsiding at appreciable rates, and along with Clear Lake, Long Valley Caldera, and Mammoth Mountain, sporadically experience long period (LP) earthquakes related to migration of magmatic or hydrothermal fluids. Worldwide, the last two decades have shown the importance of tracking LP earthquakes beneath young volcanic systems, as they often provide an indication of impending unrest or eruption. Herein we document the occurrence of LP earthquakes at several of California's young volcanoes, updating a previous study published in Pitt et al., 2002, SRL. All events were detected and located using data from stations within the Northern California Seismic Network (NCSN). Event detection was spatially and temporally uneven across the NCSN in the 1980s and 1990s, but additional stations, adoption of the Earthworm processing system, and heightened vigilance by seismologists have improved the catalog over the last decade. LP earthquakes are now relatively well recorded under Lassen (~150 events since 2000), Clear Lake (~60 events), Mammoth Mountain (~320 events), and Long Valley Caldera (~40 events). LP earthquakes are notably absent under Mount Shasta. With the exception of Long Valley Caldera, where LP earthquakes occur at depths of ≤5 km, hypocenters are generally between 15-25 km. The rates of LP occurrence over the last decade have been relatively steady within the study areas, except at Mammoth Mountain, where years of gradually declining LP activity abruptly increased after a swarm of unusually deep (20 km) VT earthquakes in October 2012. Epicenter locations relative to the sites of most recent volcanism vary across volcanic centers, but most LP earthquakes fall within 10 km of young vents. Source models for LP earthquakes often involve the resonance of fluid-filled cracks or nonlinear flow of fluids along irregular cracks (reviewed in Chouet and Matoza, 2013, JVGR). At mid-crustal depths the relevant fluids are likely to be low-viscosity basaltic melt and/or exsolved CO2-rich volatiles (Lassen, Clear Lake, Mammoth Mountain). In the shallow crust, however, hydrothermal waters/gases are likely involved in the generation of LP seismicity (Long Valley Caldera).

  19. Groundwater monitoring program plan and conceptual site model for the Al-Tuwaitha Nuclear Research Center in Iraq.

    SciTech Connect

    Copland, John Robin; Cochran, John Russell

    2013-07-01

    The Radiation Protection Center of the Iraqi Ministry of Environment is developing a groundwater monitoring program (GMP) for the Al-Tuwaitha Nuclear Research Center located near Baghdad, Iraq. The Al-Tuwaitha Nuclear Research Center was established in about 1960 and is currently being cleaned-up and decommissioned by Iraq's Ministry of Science and Technology. This Groundwater Monitoring Program Plan (GMPP) and Conceptual Site Model (CSM) support the Radiation Protection Center by providing: a CSM describing the hydrogeologic regime and contaminant issues, recommendations for future groundwater characterization activities, and descriptions of the organizational elements of a groundwater monitoring program. The Conceptual Site Model identifies a number of potential sources of groundwater contamination at Al-Tuwaitha. The model also identifies two water-bearing zones (a shallow groundwater zone and a regional aquifer). The depth to the shallow groundwater zone varies from approximately 7 to 10 meters (m) across the facility. The shallow groundwater zone is composed of a layer of silty sand and fine sand that does not extend laterally across the entire facility. An approximately 4-m thick layer of clay underlies the shallow groundwater zone. The depth to the regional aquifer varies from approximately 14 to 17 m across the facility. The regional aquifer is composed of interfingering layers of silty sand, fine-grained sand, and medium-grained sand. Based on the limited analyses described in this report, there is no severe contamination of the groundwater at Al-Tuwaitha with radioactive constituents. However, significant data gaps exist and this plan recommends the installation of additional groundwater monitoring wells and conducting additional types of radiological and chemical analyses.

  20. A National Tracking Center for Monitoring Shipments of HEU, MOX, and Spent Nuclear Fuel: How do we implement?

    SciTech Connect

    Mark Schanfein

    2009-07-01

    Nuclear material safeguards specialists and instrument developers at US Department of Energy (USDOE) National Laboratories in the United States, sponsored by the National Nuclear Security Administration (NNSA) Office of NA-24, have been developing devices to monitor shipments of UF6 cylinders and other radioactive materials. Tracking devices are being developed that are capable of monitoring shipments of valuable radioactive materials in real time, using the Global Positioning System (GPS). We envision that such devices will be extremely useful, if not essential, for monitoring the shipment of these important cargoes of nuclear material, including highly enriched uranium (HEU), mixed plutonium/uranium oxide (MOX), spent nuclear fuel, and, potentially, other large radioactive sources. To ensure nuclear material security and safeguards, it is extremely important to track these materials because they contain so-called "direct-use material", which, if diverted and processed, could potentially be used to develop clandestine nuclear weapons. Large sources could also be used for a dirty bomb, also known as a radioactive dispersal device (RDD). For that matter, any interdiction by an adversary, regardless of intent, demands a rapid response. To make the fullest use of such tracking devices, we propose a National Tracking Center. This paper describes what the attributes of such a center would be and how it could ultimately be the prototype for an International Tracking Center, possibly to be based in Vienna, at the International Atomic Energy Agency (IAEA).

  1. Using of Remote Sensing Techniques for Monitoring the Earthquakes Activities Along the Northern Part of the Syrian Rift System (LEFT-LATERAL),SYRIA

    NASA Astrophysics Data System (ADS)

    Dalati, Moutaz

    Earthquake mitigation can be achieved with a better knowledge of a region's infrastructure and substructure. High-resolution Remote Sensing data can play a significant role in geological mapping and are essential for learning about the tectonic setting of a region. Remote Sensing is an effective method for identifying active faults from different data sources and for comparing the capability of satellite sensors in active fault surveys. In this paper, a few digital image processing approaches used for enhancement and extraction of fault-related features are discussed. These methods include band ratios, filtering, and texture statistics. The experimental results show that multi-spectral images have great potential in large-scale active fault investigation, and satisfactory results were also obtained for faults with no clear surface expression. Active faults have distinct features in satellite images: usually obvious straight lines, circular structures, and other distinct patterns along fault locations. Remotely sensed Landsat ETM and SPOT XS/PAN imagery are often used in active fault mapping. Moderate- and high-resolution satellite images are the best choice, because in low-resolution images the fault features may not be visible in most cases. The area under study is located in northwestern Syria, part of one of the most active deformation belts on Earth today. This area and the western part of Syria lie along the great rift system (Left-Lateral or African-Syrian Rift System). These areas are tectonically active and have produced many seismic events. The AL-Ghab graben complex is situated within this wide area of Cenozoic deformation. The system formed initially as a result of the break-up of the Arabian plate from the African plate, indicating that these sites are active and in continual movement. In addition, the statistical analysis of Thematic Mapper data and features from a digital elevation model (DEM) produced by SAR interferometry show the existence of spectral structures at the same sites. The Arabian plate is moving in a NNW direction, whereas the African plate is moving to the north; the left-lateral motion along the Dead Sea Fault accommodates the difference in movement rate between the two plates. The analysis of TM space imagery and digital image processing of spectral data show that the lineaments along the AL-Ghab graben may be considered linear conjunctions accompanied by a complex fracturing system, affected by distant stresses and intensive forces. The digital image processing of radar imagery shows the presence of active and fresh faulting zones along the AL-Ghab graben. TM and SAR-DTM data also showed gradual color tones and interruptions of linear-ellipse shapes, reflecting the presence of discontinuity contours along the fault zone extension. These features point to an abundance of surface morphological indicators of fresh faults. Recent faulting is expressed as freshly exposed soil within the colluvial apron, visible by its light tone. These indicators have been confirmed by field checks. Furthermore, the statistical digital analysis of the spectral data shows a distribution of spectral plumes that decrease in intensity and color contrast from the center of the site toward its edges.
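
    The band ratio and filtering steps mentioned above can be illustrated with a minimal sketch: a ratio of two bands followed by a Sobel gradient to highlight linear features. The band choices, smoothing, and synthetic scene below are assumptions for illustration, not the processing applied in the study.

```python
import numpy as np
from scipy import ndimage

def band_ratio(band_a, band_b, eps=1e-6):
    """Simple band ratio used to suppress illumination effects and enhance
    spectral contrasts (the band pairing is a scene-dependent assumption)."""
    return band_a.astype(float) / (band_b.astype(float) + eps)

def lineament_strength(image):
    """Gradient magnitude from orthogonal Sobel filters; elongated zones of
    high gradient are candidate lineaments to be checked against field data."""
    gx = ndimage.sobel(image, axis=1)
    gy = ndimage.sobel(image, axis=0)
    return np.hypot(gx, gy)

# Hypothetical two-band scene with a diagonal linear feature standing in for a fault trace.
rng = np.random.default_rng(2)
b5 = rng.normal(100.0, 5.0, (200, 200))
b7 = rng.normal(80.0, 5.0, (200, 200))
rows, cols = np.indices(b5.shape)
b5[np.abs(rows - cols) < 2] *= 1.2            # brighten the diagonal "fault trace"
ratio = band_ratio(b5, b7)
edges = lineament_strength(ndimage.gaussian_filter(ratio, sigma=1.0))
print("max edge response:", round(float(edges.max()), 2))
```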

  2. Patient-Centered Technological Assessment and Monitoring of Depression for Low-Income Patients

    PubMed Central

    Wu, Shinyi; Vidyanti, Irene; Liu, Pai; Hawkins, Caitlin; Ramirez, Magaly; Guterman, Jeffrey; Gross-Schulman, Sandra; Sklaroff, Laura Myerchin; Ell, Kathleen

    2014-01-01

    Depression is a significant challenge for ambulatory care because it worsens health status and outcomes, increases health care utilizations and costs, and elevates suicide risk. An automatic telephonic assessment (ATA) system that links with tasks and alerts to providers may improve quality of depression care and increase provider productivity. We used ATA system in a trial to assess and monitor depressive symptoms of 444 safety-net primary care patients with diabetes. We assessed system properties, evaluated preliminary clinical outcomes, and estimated cost savings. The ATA system is feasible, reliable, valid, safe, and likely cost-effective for depression screening and monitoring for low-income primary care population. PMID:24525531

  3. Earthquake Information System

    NASA Technical Reports Server (NTRS)

    1991-01-01

    IAEMIS (Integrated Automated Emergency Management Information System) is the principal tool of an earthquake preparedness program developed by Martin Marietta and the Mid-America Remote Sensing Center (MARC). It is a two-component set of software, data and procedures to provide information enabling management personnel to make informed decisions in disaster situations. The NASA-developed program ELAS, originally used to analyze Landsat data, provides MARC with a spatially-oriented information management system. Additional MARC projects include land resources management, and development of socioeconomic data.

  4. Time-Clustering Behavior of Spreading-Center Seismicity Between 15-35 N on the Mid-Atlantic Ridge: Observations from Hydroacoustic Monitoring

    NASA Astrophysics Data System (ADS)

    Bohnenstiehl, D. R.; Tolstoy, M.; Smith, D. K.; Fox, C. G.; Dziak, R. P.

    2002-12-01

    An earthquake catalog derived from the detection of seismically generated tertiary (T) waves is used to study the time-clustering behavior of moderate-size (> 3.0 M) earthquakes along the north-central Mid-Atlantic Ridge. Because T-waves propagate efficiently within the ocean's sound channel, these data represent a significant improvement relative to the detection capabilities of land-based seismic stations. In addition, hydroacoustic monitoring overcomes many of the spatial and temporal limitations associated with ocean-bottom seismometer data, with the existing array having been deployed continuously between 15-35 degrees N during the period February 1999-February 2001. Within this region, the distribution of inter-event times is consistent with a non-random clustered process, with a coefficient of variation greater than 1.0. The clustered behavior is power-law in nature, with temporal fluctuations characterized by a power spectral density that decays as 1/f^alpha. Using Allan Factor analysis, alpha is found to range from 0.12 to 0.55 for different regions of the spreading axis. This scaling is negligible at time scales less than 3.5 x 10^3 s, and earthquake occurrence becomes less clustered (smaller alpha) as increasing size thresholds are applied to the catalog. The highest degrees of clustering are associated temporally with large mainshock-aftershock sequences; however, some swarm-like activity is also evident. The distribution of acoustic magnitudes, or source levels, is consistent with a power-law size-frequency scaling for earthquakes. Although such behavior has been linked closely to the fractal nature of the underlying fault population in other environments, power-law fault size distributions have not been widely observed in the mid-ocean ridge setting.
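
    Two of the statistics used above, the coefficient of variation of inter-event times and the Allan Factor, can be sketched directly. The code below compares a synthetic Poisson catalog with a crudely clustered one; the window length and synthetic rates are illustrative assumptions, not the study's parameters.

```python
import numpy as np

def coefficient_of_variation(event_times):
    """COV of inter-event times: ~1 for a Poisson process, >1 for clustered occurrence."""
    iet = np.diff(np.sort(event_times))
    return iet.std() / iet.mean()

def allan_factor(event_times, window):
    """Allan Factor AF(T) = <(N_{k+1} - N_k)^2> / (2 <N_k>), where N_k counts
    events in successive windows of length T (a standard clustering measure)."""
    t = np.sort(event_times)
    edges = np.arange(t[0], t[-1], window)
    counts, _ = np.histogram(t, bins=edges)
    diffs = np.diff(counts)
    return np.mean(diffs**2) / (2.0 * np.mean(counts))

# Compare a Poisson catalog with a crudely clustered one (synthetic, for illustration).
rng = np.random.default_rng(3)
poisson_times = np.cumsum(rng.exponential(1000.0, size=5000))            # seconds
cluster_starts = np.cumsum(rng.exponential(20000.0, size=250))
clustered_times = np.concatenate([s + rng.exponential(300.0, size=20) for s in cluster_starts])
for name, times in [("poisson", poisson_times), ("clustered", clustered_times)]:
    print(name, round(coefficient_of_variation(times), 2),
          round(allan_factor(times, window=3600.0), 2))
```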

  5. CTEPP NC DATA COLLECTED ON FORM 07: CHILD DAY CARE CENTER POST-MONITORING

    EPA Science Inventory

    This data set contains data concerning the child’s daily activities and potential exposures to pollutants at their homes. It included questions on chemicals applied and cigarettes smoked at the home over the 48-h monitoring period. It also collected information on the child’s han...

  6. CTEPP-OH DATA COLLECTED ON FORM 07: CHILD DAY CARE CENTER POST-MONITORING

    EPA Science Inventory

    This data set contains data concerning the child’s daily activities and potential exposures to pollutants at their homes for CTEPP-OH. It included questions on chemicals applied and cigarettes smoked at the home over the 48-h monitoring period. It also collected information on th...

  7. EVALUATION OF ENVIROSCAN CAPACITANCE PROBES FOR MONITORING SOIL MOISTURE IN CENTER PIVOT IRRIGATED POTATOES

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Careful irrigation scheduling is the key to providing adequate water to minimize potential leaching losses below the rootzone, while supplying adequate water to minimize negative effects of water stress. Capacitance probes were used for real-time continuous monitoring of soil moisture content at va...

  8. Monitoring of the permeable pavement demonstration site at Edison Environmental Center

    EPA Science Inventory

    The EPA’s Urban Watershed Management Branch has installed an instrumented, working full-scale 110-space pervious pavement parking lot and has been monitoring several environmental stressors and runoff. This parking lot demonstration site has allowed the investigation of differenc...

  9. CTEPP DATA COLLECTION FORM 07: CHILD DAY CARE CENTER POST-MONITORING

    EPA Science Inventory

    This data collection form is used to provide information on the child's daily activities and potential exposures to pollutants at their homes. It includes questions on chemicals applied and cigarettes smoked at the home over the 48-hr monitoring period. It also collects informati...

  10. User-centered development and testing of a monitoring system that provides feedback regarding physical functioning to elderly people

    PubMed Central

    Vermeulen, Joan; Neyens, Jacques CL; Spreeuwenberg, Marieke D; van Rossum, Erik; Sipers, Walther; Habets, Herbert; Hewson, David J; de Witte, Luc P

    2013-01-01

    Purpose To involve elderly people during the development of a mobile interface of a monitoring system that provides feedback to them regarding changes in physical functioning and to test the system in a pilot study. Methods and participants The iterative user-centered development process consisted of the following phases: (1) selection of user representatives; (2) analysis of users and their context; (3) identification of user requirements; (4) development of the interface; and (5) evaluation of the interface in the lab. Subsequently, the monitoring and feedback system was tested in a pilot study by five patients who were recruited via a geriatric outpatient clinic. Participants used a bathroom scale to monitor weight and balance, and a mobile phone to monitor physical activity on a daily basis for six weeks. Personalized feedback was provided via the interface of the mobile phone. Usability was evaluated on a scale from 1 to 7 using a modified version of the Post-Study System Usability Questionnaire (PSSUQ); higher scores indicated better usability. Interviews were conducted to gain insight into the experiences of the participants with the system. Results The developed interface uses colors, emoticons, and written and/or spoken text messages to provide daily feedback regarding (changes in) weight, balance, and physical activity. The participants rated the usability of the monitoring and feedback system with a mean score of 5.2 (standard deviation 0.90) on the modified PSSUQ. The interviews revealed that most participants liked using the system and appreciated that it signaled changes in their physical functioning. However, usability was negatively influenced by a few technical errors. Conclusion Involvement of elderly users during the development process resulted in an interface with good usability. However, the technical functioning of the monitoring system needs to be optimized before it can be used to support elderly people in their self-management. PMID:24039407

  11. Report of Research Center for Urban Safety and Security, Kobe University A southeasterly-dipping static fault model of the 2007 Niigata-ken Chuetsu-oki, Japan, earthquake based on

    E-print Network

    Takiguchi, Tetsuya

    A southeasterly-dipping static fault model of the 2007 Niigata-ken Chuetsu-oki, Japan, earthquake based on crustal ... (Ishibashi); On strong motions in the vicinity of large earthquakes (Toru Ouchi); On a simple way of simulating earthquake fires - the realities and the evaluation of fire extinguishing activities by residents (Mariko ...)

  12. Long-term monitoring of creep rate along the Hayward fault and evidence for a lasting creep response to 1989 Loma Prieta earthquake

    USGS Publications Warehouse

    Lienkaemper, J.J.; Galehouse, J.S.; Simpson, R.W.

    2001-01-01

    We present results from over 30 yr of precise surveys of creep along the Hayward fault. Along most of the fault, spatial variability in long-term creep rates is well determined by these data and can help constrain 3D-models of the depth of the creeping zone. However, creep at the south end of the fault stopped completely for more than 6 years after the M7 1989 Loma Prieta Earthquake (LPEQ), perhaps delayed by stress drop imposed by this event. With a decade of detailed data before LPEQ and a decade after it, we report that creep response to that event does indeed indicate the expected deficit in creep.

  13. Interpretations on the Geologic Setting of Yogyakarta Earthquakes 2006 (Central Java, Indonesia) Based on Integration of Aftershock Monitoring and Existing Geologic, Geophysical and Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Setijadji, L. D.; Watanabe, K.; Fukuoka, K.; Ehara, S.; Setiadji, Y.; Rahardjo, W.; Susilo, A.; Barianto, D. H.; Harijoko, A.; Sudarno, I.; Pramumijoyo, S.; Hendrayana, H.; Akmalludin, A.; Nishijima, J.; Itaya, T.

    2007-05-01

    The unprecedented 26 May 2006 Yogyakarta earthquake (central Java, Indonesia), which claimed 5,700 lives, is generally accepted to have had a depth of about 10 km and a moment magnitude of 6.4. However, the location of the causative active fault is still under debate, as the epicenter of the mainshock was reported quite differently by several institutions. Many researchers believe that the Opak fault, located at the eastern boundary between the Yogyakarta lowland area (or Yogyakarta Basin) and the highland region of the Southern Mountains, was the source of the 2006 earthquakes. However, our aftershock observations suggest that the ruptured zone was not located along the Opak fault but on an unknown fault located about 10 km to the east of it, within the Southern Mountains domain. Unfortunately, surface geologic manifestations are scarce, as this area is now largely covered by limestone. Therefore the suspected active fault system must be studied through interpretations of the subsurface geology and evaluation of the Cenozoic geo-history of the region, utilizing existing geologic, geophysical, and remote sensing data. This work suggests that the Yogyakarta Basin is a volcano-tectonic depression formed gradually since the early Tertiary period (Oligo-Miocene or older). Geological and geophysical evidence suggests that structural trends changed from the Oligocene NE-SW towards the Oligo-Miocene NNE-SSW and the Plio-Pleistocene NW-SE and E-W directions. The ruptured "X" fault of the 2006 Yogyakarta earthquakes is likely a NNE-SSW trending fault parallel to the Opak fault; both were first active in the Oligo-Miocene as sinistral strike-slip faults. However, while the Opak fault changed to normal faulting after the Pliocene, the evidence from Kali Ngalang and Kali Widoro suggests that the "X" fault system was still reactivated as a strike-slip fault during the Plio-Pleistocene orogeny. As this new interpretation of the active fault creates a spatial discrepancy between the locations of earthquake epicenters and the most heavily damaged regions, other geo-engineering factors must be considered important in determining the final scale of seismic hazards. The most vulnerable areas for seismic hazards are those located nearest to the ruptured fault and underlain by thick Quaternary unconsolidated deposits. For regions along the fault line, seismic hazards seem to reach more distant areas, as in the case of the Gantiwarno region, because seismic waves can travel more easily along the fault line.

  14. Earthquake location determination using data from DOMERAPI and BMKG seismic networks: A preliminary result of DOMERAPI project

    SciTech Connect

    Ramdhan, Mohamad; Nugraha, Andri Dian; Widiyantoro, Sri; Métaxian, Jean-Philippe; Valencia, Ayunda Aulia

    2015-04-24

    The DOMERAPI project has been conducted to comprehensively study the internal structure of Merapi volcano, especially the deep structural features beneath the volcano. The DOMERAPI earthquake monitoring network consists of 46 broad-band seismometers installed around Merapi volcano. Earthquake hypocenter determination is a very important step for further studies, such as hypocenter relocation and seismic tomographic imaging. Ray paths from earthquake events occurring outside the Merapi region can be utilized to delineate the deep magma structure. Earthquakes occurring outside the DOMERAPI seismic network will produce an azimuthal gap greater than 180°. Owing to this situation, stations from the BMKG seismic network can be used jointly to minimize the azimuthal gap. We identified earthquake events manually and carefully, and then picked arrival times of P and S waves. The data from the DOMERAPI seismic network were combined with the BMKG data catalogue to determine earthquake events outside the Merapi region. For future work, we will also use the BPPTKG (Center for Research and Development of Geological Disaster Technology) data catalogue in order to study shallow structures beneath Merapi volcano. The application of all data catalogues will provide good information as input for further advanced studies and volcano hazard mitigation.
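
    The azimuthal-gap criterion mentioned above is easy to compute from station azimuths around a trial epicenter. The sketch below uses a standard great-circle azimuth formula; the epicenter and station coordinates are hypothetical and only illustrate why a one-sided network yields a gap larger than 180°.

```python
import math

def azimuth_deg(epi_lat, epi_lon, sta_lat, sta_lon):
    """Great-circle azimuth from the epicenter to a station, degrees clockwise from north."""
    phi1, phi2 = math.radians(epi_lat), math.radians(sta_lat)
    dlon = math.radians(sta_lon - epi_lon)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def azimuthal_gap(epi_lat, epi_lon, stations):
    """Largest angular gap between azimuths of recording stations (> 180 deg
    usually indicates a poorly constrained epicenter)."""
    az = sorted(azimuth_deg(epi_lat, epi_lon, lat, lon) for lat, lon in stations)
    gaps = [az[i + 1] - az[i] for i in range(len(az) - 1)]
    gaps.append(360.0 - az[-1] + az[0])           # wrap-around gap
    return max(gaps)

# Hypothetical epicenter near Merapi with stations only to the south and west.
epi = (-7.54, 110.44)
stations = [(-7.8, 110.3), (-7.9, 110.5), (-7.6, 110.1), (-7.7, 110.2)]
print(round(azimuthal_gap(*epi, stations), 1))    # large gap, well above 180 deg
```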

  15. CTEPP-OH DATA COLLECTED ON FORM 05: CHILD DAY CARE CENTER PRE-MONITORING QUESTIONNAIRE

    EPA Science Inventory

    This data set contains data for CTEPP-OH concerning the potential sources of pollutants at the day care center including the chemicals that have been applied in the past at the day care center by staff members or by commercial contractors. The day care teacher was asked questions...

  16. CTEPP NC DATA COLLECTED ON FORM 05: CHILD DAY CARE CENTER PRE-MONITORING QUESTIONNAIRE

    EPA Science Inventory

    This data set contains data concerning the potential sources of pollutants at the day care center including the chemicals that have been applied in the past at the day care center by staff members or by commercial contractors. The day care teacher was asked questions related to t...

  17. X-ray Weekly Monitoring of the Galactic Center Sgr A* with Suzaku

    NASA Astrophysics Data System (ADS)

    Maeda, Yoshitomo; Nobukawa, Masayoshi; Hayashi, Takayuki; Iizuka, Ryo; Saitoh, Takayuki; Murakami, Hiroshi

    A small gas cloud, G2, is on an orbit taking it almost straight into the supermassive black hole Sgr A* by spring 2014. This event gives us a rare opportunity to test mass feeding onto the black hole by a gas cloud. To catch a possible rise of the mass accretion from the cloud, we have been performing biweekly monitoring of Sgr A* in the autumn and spring of the 2013 fiscal year. The key feature of Suzaku is high-sensitivity, wide-band X-ray spectroscopy in a single observatory. It is characterized by a large effective area combined with low background and good energy resolution, in particular a good line spread function in the low-energy range. Since the flare events expected from the G2 approach are transient, the large effective area is a critical and powerful tool for hunting them. The first monitoring in autumn 2013 was successfully made. The X-rays from Sgr A* and its nearby emission were clearly resolved from the bright transient source AX J1745.6-2901. No very large flare from Sgr A* was found during the monitoring. We may also report the X-ray properties of two serendipitous sources, the neutron star binary AX J1745.6-2901 and the magnetar SGR J1745-29.

  18. Cooperative Monitoring Center Occasional Paper/9: De-Alerting Strategic Ballistic Missiles

    SciTech Connect

    Connell, Leonard W.; Edenburn, Michael W.; Fraley, Stanley K.; Trost, Lawrence C.

    1999-03-01

    This paper presents a framework for evaluating the technical merits of strategic ballistic missile de-alerting measures, and it uses the framework to evaluate a variety of possible measures for silo-based, land-mobile, and submarine-based missiles. De-alerting measures are defined for the purpose of this paper as reversible actions taken to increase the time or effort required to launch a strategic ballistic missile. The paper does not assess the desirability of pursuing a de-alerting program; such an assessment is highly context dependent. The paper postulates that if de-alerting is desirable and is used as an arms control mechanism, de-alerting measures should satisfy specific criteria relating to force security, practicality, effectiveness, significant delay, and verifiability. Silo-launched missiles lend themselves most readily to de-alerting verification, because communications necessary for monitoring do not increase the vulnerability of the weapons by a significant amount. Land-mobile missile de-alerting measures would be more challenging to verify, because monitoring measures that disclose the launcher's location would potentially increase their vulnerability. Submarine-launched missile de-alerting measures would be extremely challenging if not impossible to monitor without increasing the submarine's vulnerability.

  19. Monitoring and Modeling of Ground Deformation at Three Sisters volcanic center, central Oregon Cascade Range, 1997-2009 (Invited)

    NASA Astrophysics Data System (ADS)

    Dzurisin, D.; Lisowski, M.; Wicks, C. W.

    2009-12-01

    Modeling of InSAR, GPS, and leveling data indicates that uplift of a broad area centered ~6 km west of the summit of South Sister volcano started in 1997 and is continuing at a declining rate. Surface displacements were measured every summer when possible since August 1992 with InSAR, annually since August 2001 using GPS and leveling surveys, and since May 2001 using continuous GPS. Our best-fit model to the deformation data is a vertical, prolate, spheroidal point-pressure source located 4.9-5.4 km below the surface. A more complicated source of this type that includes dip as a free model parameter does not improve the fit to data significantly, and other source types including tabular bodies (dike or sill) produce decidedly poorer results. The source inflation rate decreased exponentially during 2001-2006 with a 1/e decay time of 5.3 ± 1.1 years. The net increase in source volume from September 1997 to August 2006 was 36-42 x 10^6 m^3. A swarm of ~300 small (maximum magnitude 1.9) earthquakes occurred beneath the deforming area in March 2004; no other unusual seismicity has been noted. We attribute surface deformation to intrusion of magma, perhaps at the brittle-ductile transition in hot, thermally altered crust beneath the active Three Sisters volcanic center. Elastic models like those we investigated cannot distinguish between ongoing intrusion at a declining rate and viscoelastic response of the overlying crust and hydrothermal system to an intrusion that might have ended some time ago. Repeated gravity surveys that began in 2002 might help to resolve this ambiguity; gravity results through summer 2009 will be presented separately at this meeting. Similar deformation episodes in the past probably would have gone unnoticed if, as we suspect, most are caused by small intrusions that do not culminate in eruptions.
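
    As a simplified stand-in for the prolate-spheroid point-pressure source used above, the classic isotropic (Mogi) point source shows how a volume change at ~5 km depth maps into broad surface uplift. The depth and volume values below are taken loosely from the abstract; the formula and Poisson's ratio are standard textbook assumptions, not the authors' model.

```python
import numpy as np

def mogi_uplift(r_km, depth_km, dvol_m3, poisson=0.25):
    """Surface uplift (m) above an isotropic point pressure (Mogi) source.

    A simplified stand-in for the vertical prolate spheroid used in the study;
    it only illustrates how a volume change at depth maps to broad uplift.
    r_km : radial distance from the point above the source, in km.
    """
    r = np.asarray(r_km) * 1e3
    d = depth_km * 1e3
    return (1.0 - poisson) / np.pi * dvol_m3 * d / (r**2 + d**2) ** 1.5

# Roughly the values reported in the abstract: ~5 km depth, ~40e6 m^3 volume change.
r = np.array([0.0, 2.0, 5.0, 10.0, 20.0])
print(np.round(mogi_uplift(r, depth_km=5.0, dvol_m3=40e6) * 100.0, 1))  # uplift in cm
```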

  20. EARTHQUAKE TRIGGERING AND SPATIAL-TEMPORAL RELATIONS IN THE VICINITY OF YUCCA MOUNTAIN, NEVADA

    SciTech Connect

    na

    2001-02-08

    It is well accepted that the 1992 M 5.6 Little Skull Mountain earthquake, the largest historical event to have occurred within 25 km of Yucca Mountain, Nevada, was triggered by the M 7.2 Landers earthquake that occurred the day before. On the premise that earthquakes can be triggered by applied stresses, we have examined the earthquake catalog from the Southern Great Basin Digital Seismic Network (SGBDSN) for other evidence of triggering by external and internal stresses. This catalog now comprises over 12,000 events, encompassing five years of consistent monitoring, and has a low threshold of completeness, varying from M 0 in the center of the network to M 1 at the fringes. We examined the SGBDSN catalog response to external stresses such as large signals propagating from teleseismic and regional earthquakes, microseismic storms, and earth tides. Results are generally negative. We also examined the interplay of earthquakes within the SGBDSN. The number of "foreshocks", as judged by most criteria, is significantly higher than expected from the background seismicity rate. In order to establish this, we first removed aftershocks from the catalog with widely used methodology. The existence of SGBDSN foreshocks is supported by comparing actual statistics to those of a simulated catalog with uniformly distributed locations and Poisson-distributed times of occurrence. The probabilities of a given SGBDSN earthquake being followed by one having a higher magnitude within a short time frame and within a close distance are at least as high as those found with regional catalogs. These catalogs have completeness thresholds two to three units higher in magnitude than the SGBDSN catalog used here. The largest earthquake in the SGBDSN catalog, the M 4.7 event in Frenchman Flat on 01/27/1999, was preceded by a definite foreshock sequence. The largest event within 75 km of Yucca Mountain in historical time, the M 5.7 Scotty's Junction event of 08/01/1999, was also preceded by foreshocks. The monitoring area of the SGBDSN has been in a long period of very low moment release rate since February of 1999. The seismicity catalog to date suggests that the next significant (M > 4) earthquake within the SGBDSN will be preceded by foreshocks.
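
    The null-model comparison described above (a catalog with uniform locations and Poisson occurrence times) can be sketched as follows. The time and distance windows, the region, and the magnitude distribution below are illustrative assumptions, not the study's actual foreshock definition.

```python
import numpy as np

def foreshock_fraction(times, lats, lons, mags, dt_days=3.0, dist_deg=0.2):
    """Fraction of events followed, within dt and dist, by a larger event
    (a crude 'foreshock' count; window sizes are illustrative assumptions)."""
    n = len(times)
    count = 0
    for i in range(n):
        later = (times > times[i]) & (times <= times[i] + dt_days)
        near = (np.abs(lats - lats[i]) < dist_deg) & (np.abs(lons - lons[i]) < dist_deg)
        if np.any(later & near & (mags > mags[i])):
            count += 1
    return count / n

# Null model: uniform locations, Poisson times, Gutenberg-Richter-like magnitudes.
rng = np.random.default_rng(4)
n_ev = 2000
times = np.sort(rng.uniform(0.0, 5 * 365.0, n_ev))        # days over five years
lats = rng.uniform(36.0, 37.5, n_ev)                      # a ~1.5 deg square region
lons = rng.uniform(-117.0, -115.5, n_ev)
mags = rng.exponential(1.0 / np.log(10), n_ev)            # b-value ~ 1 above completeness
print("simulated 'foreshock' fraction:", round(foreshock_fraction(times, lats, lons, mags), 3))
```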

  1. Upgrading the Digital Electronics of the PEP-II Bunch Current Monitors at the Stanford Linear Accelerator Center

    SciTech Connect

    Kline, Josh; /SLAC

    2006-08-28

    The testing of the upgrade prototype for the bunch current monitors (BCMs) in the PEP-II storage rings at the Stanford Linear Accelerator Center (SLAC) is the topic of this paper. Bunch current monitors are used to measure the charge in the electron/positron bunches traveling in particle storage rings. The BCMs in the PEP-II storage rings need to be upgraded because components of the current system have failed and are known to be failure-prone with age, and several of the integrated circuits are no longer produced, making repairs difficult if not impossible. The main upgrade is replacing twelve old (1995) field programmable gate arrays (FPGAs) with a single Virtex II FPGA. The prototype was tested using computer synthesis tools, a commercial signal generator, and a fast pulse generator.

  2. Napa Earthquake impact on water systems

    NASA Astrophysics Data System (ADS)

    Wang, J.

    2014-12-01

    The South Napa earthquake occurred in Napa, California, on August 24 at 3 am local time, with a magnitude of 6.0. The earthquake was the largest in the SF Bay Area since the 1989 Loma Prieta earthquake. Economic losses topped $1 billion. Wine makers were cleaning up, and damage to tourism was being estimated; around 15,000 cases of cabernet poured into the garden at the Hess Collection. Earthquakes potentially raise water pollution risks and could cause a water crisis. California has suffered water shortages in recent years, and lessons from this event could help prevent groundwater and surface-water pollution from earthquakes. This research gives a clear view of the drinking water system in California and of pollution in river systems, as well as an estimate of earthquake impacts on water supply. The Sacramento-San Joaquin River delta (close to Napa) is the center of the state's water distribution system, delivering fresh water to more than 25 million residents and 3 million acres of farmland. Delta water conveyed through a network of levees is crucial to Southern California. The drought has significantly curtailed water exports, and saltwater intrusion has reduced freshwater outflows. Strong shaking from a nearby earthquake can cause liquefaction of saturated, loose, sandy soils and could potentially damage major delta levee systems near Napa. The Napa earthquake is a wake-up call for Southern California: a future event could potentially damage the freshwater supply system.

  3. Seismic Monitoring in Haiti

    USGS Multimedia Gallery

    Following the devastating 2010 Haiti earthquake, the USGS has been helping with earthquake awareness and monitoring in the country, with continued support from the U.S. Agency for International Development (USAID). This assistance has helped the Bureau des Mines et de l'Energie (BME) in Port-au-Prin...

  4. Monitoring

    DOEpatents

    Orr, Christopher Henry (Calderbridge, GB); Luff, Craig Janson (Calderbridge, GB); Dockray, Thomas (Calderbridge, GB); Macarthur, Duncan Whittemore (Los Alamos, NM)

    2004-11-23

    The invention provides apparatus and methods which facilitate movement of an instrument relative to an item or location being monitored and/or the item or location relative to the instrument, whilst successfully excluding extraneous ions from the detection location. Thus, ions generated by emissions from the item or location can successfully be monitored during movement. The technique employs sealing to exclude such ions, for instance, through an electro-field which attracts and discharges the ions prior to their entering the detecting location and/or using a magnetic field configured to repel the ions away from the detecting location.

  5. Investigation on the Possible Relationship between Magnetic Pulsations and Earthquakes

    NASA Astrophysics Data System (ADS)

    Jusoh, M.; Liu, H.; Yumoto, K.; Uozumi, T.; Takla, E. M.; Yousif Suliman, M. E.; Kawano, H.; Yoshikawa, A.; Asillam, M.; Hashim, M.

    2012-12-01

    The Sun is the main source of energy to the solar system, and it plays a major role in affecting the ionosphere, the atmosphere, and the Earth's surface. The connection between the solar wind and ground magnetic pulsations has been demonstrated empirically by several researchers (H. J. Singer et al., 1977; E. W. Greenstadt, 1979; I. A. Ansari, 2006, to name a few). In our preliminary statistical analysis of the relationship between solar and seismic activities (Jusoh and Yumoto, 2011; Jusoh et al., 2012), we observed a high possibility of solar-terrestrial coupling. We observed a high tendency for earthquakes to occur during the lower phases of solar cycles, significantly related to solar wind parameters (i.e., solar wind dynamic pressure, speed, and input energy). However, a clear coupling mechanism has not yet been established. To connect the solar impact to seismicity, we investigate ground magnetic pulsations as one possible connecting agent. In our analysis, the recorded ground magnetic pulsations are analyzed in different ranges of ultra-low frequency, Pc3 (22-100 mHz), Pc4 (6.7-22 mHz), and Pc5 (1.7-6.7 mHz), against the occurrence of local earthquake events in certain time periods. This analysis focuses on two major seismic regions: northern Japan (mid latitude) and northern Sumatra, Indonesia (low latitude). Solar wind parameters were obtained from the Goddard Space Flight Center, NASA, via the OMNIWeb Data Explorer and the Space Physics Data Facility. Earthquake events were extracted from the Advanced National Seismic System (ANSS) database. The localized Pc3-Pc5 magnetic pulsation data were extracted from Magnetic Data Acquisition System (MAGDAS)/Circum Pan Magnetic Network (CPMN) stations located at Ashibetsu (Japan), for earthquakes monitored in northern Japan, and Langkawi (Malaysia), for earthquakes observed in northern Sumatra. This magnetometer array was established by the International Center for Space Weather Science and Education, Kyushu University, Japan. From the results, we observed significant correlations between ground magnetic pulsations and solar wind speed at different earthquake epicenter depths. The details of the analysis will be discussed in the presentation.
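
    Separating a magnetometer record into the Pc3, Pc4, and Pc5 bands listed above is a straightforward band-pass operation. The sketch below assumes 1-s sampling and uses zero-phase Butterworth filters on a synthetic trace; it is an illustration, not the MAGDAS/CPMN processing.

```python
import numpy as np
from scipy import signal

PC_BANDS_HZ = {"Pc3": (0.022, 0.100), "Pc4": (0.0067, 0.022), "Pc5": (0.0017, 0.0067)}

def split_pc_bands(b_field, fs=1.0, order=4):
    """Split a ground magnetometer component into Pc3/Pc4/Pc5 ULF bands using
    zero-phase Butterworth bandpass filters (1-s sampling assumed)."""
    out = {}
    for name, (lo, hi) in PC_BANDS_HZ.items():
        sos = signal.butter(order, [lo, hi], btype="bandpass", fs=fs, output="sos")
        out[name] = signal.sosfiltfilt(sos, b_field)
    return out

# Synthetic H-component: a 50 mHz (Pc3) wave packet buried in slower variations and noise.
rng = np.random.default_rng(5)
t = np.arange(0, 6 * 3600.0, 1.0)                       # six hours of 1-s samples
trace = 20.0 * np.sin(2 * np.pi * t / 86400.0) + rng.normal(0, 0.5, t.size)
trace += 2.0 * np.sin(2 * np.pi * 0.05 * t) * np.exp(-((t - 10800) / 1800.0) ** 2)
bands = split_pc_bands(trace)
print({k: round(float(np.std(v)), 3) for k, v in bands.items()})  # Pc3 should dominate
```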

  6. Seismological Research Letters, 74, 3, May/June 2003, 271-273 Speculations on Earthquake Forecasting

    E-print Network

    Seismological Research Letters, 74(3), May/June 2003, 271-273. OPINION: Speculations on Earthquake Forecasting. ...networks of stress-monitoring sites could lead to earthquake forecasting analogous to the way networks... ...to be a major advance that offers more hope for forecasting earthquakes than was envisioned in the survey Living...

  7. Supercomputing meets seismology in earthquake exhibit

    SciTech Connect

    Blackwell, Matt; Rodger, Arthur; Kennedy, Tom

    2013-10-03

    When the California Academy of Sciences created the "Earthquake: Evidence of a Restless Planet" exhibit, they called on Lawrence Livermore to help combine seismic research with the latest data-driven visualization techniques. The outcome is a series of striking visualizations of earthquakes, tsunamis and tectonic plate evolution. Seismic-wave research is a core competency at Livermore. While most often associated with earthquakes, the research has many other applications of national interest, such as nuclear explosion monitoring, explosion forensics, energy exploration, and seismic acoustics. For the Academy effort, Livermore researchers simulated the San Andreas and Hayward fault events at high resolutions. Such calculations require significant computational resources. To simulate the 1906 earthquake, for instance, visualizing 125 seconds of ground motion required over 1 billion grid points, 10,000 time steps, and 7.5 hours of processor time on 2,048 cores of Livermore's Sierra machine.

  8. Supercomputing meets seismology in earthquake exhibit

    ScienceCinema

    Blackwell, Matt; Rodger, Arthur; Kennedy, Tom

    2014-07-22

    When the California Academy of Sciences created the "Earthquake: Evidence of a Restless Planet" exhibit, they called on Lawrence Livermore to help combine seismic research with the latest data-driven visualization techniques. The outcome is a series of striking visualizations of earthquakes, tsunamis and tectonic plate evolution. Seismic-wave research is a core competency at Livermore. While most often associated with earthquakes, the research has many other applications of national interest, such as nuclear explosion monitoring, explosion forensics, energy exploration, and seismic acoustics. For the Academy effort, Livermore researchers simulated the San Andreas and Hayward fault events at high resolutions. Such calculations require significant computational resources. To simulate the 1906 earthquake, for instance, visualizing 125 seconds of ground motion required over 1 billion grid points, 10,000 time steps, and 7.5 hours of processor time on 2,048 cores of Livermore's Sierra machine.

  9. Cooperative Monitoring Center Occasional Paper/8: Cooperative Border Security for Jordan: Assessment and Options

    SciTech Connect

    Qojas, M.

    1999-03-01

    This document is an analysis of options for unilateral and cooperative action to improve the security of Jordan's borders. Sections describe the current political, economic, and social interactions along Jordan's borders. Next, the document discusses border security strategy for cooperation among neighboring countries and the adoption of confidence-building measures. A practical cooperative monitoring system would consist of hardware for early warning, command and control, communications, and transportation. Technical solutions can expand opportunities for the detection and identification of intruders. Sensors (such as seismic, break-wire, pressure-sensing, etc.) can warn border security forces of intrusion and contribute to the identification of the intrusion and help formulate the response. This document describes conceptual options for cooperation, offering three scenarios that relate to three hypothetical levels (low, medium, and high) of cooperation. Potential cooperative efforts under a low cooperation scenario could include information exchanges on military equipment and schedules to prevent misunderstandings and the establishment of protocols for handling emergency situations or unusual circumstances. Measures under a medium cooperation scenario could include establishing joint monitoring groups for better communications, with hot lines and scheduled meetings. The high cooperation scenario describes coordinated responses, joint border patrols, and sharing border intrusion information. Finally, the document lists recommendations for organizational, technical, and operational initiatives that could be applicable to the current situation.

  10. Streamflow, groundwater, and water-quality monitoring by USGS Nevada Water Science Center

    USGS Publications Warehouse

    Gipson, Marsha L.; Schmidt, Kurtiss

    2013-01-01

    The U.S. Geological Survey (USGS) has monitored and assessed the quantity and quality of our Nation's streams and aquifers since its inception in 1879. Today, the USGS provides hydrologic information to aid in the evaluation of the availability and suitability of water for public and domestic supply, agriculture, aquatic ecosystems, mining, and energy development. Although the USGS has no responsibility for the regulation of water resources, the USGS hydrologic data complement much of the data collected by state, county, and municipal agencies, tribal nations, U.S. District Court Water Masters, and other federal agencies such as the Environmental Protection Agency, which focuses on monitoring for regulatory compliance. The USGS continues its mission to provide timely and relevant water-resources data and information that are available to water-resource managers, non-profit organizations, industry, academia, and the public. Data collected by the USGS provide the science needed for informed decision-making related to resource management and restoration, assessment of flood and drought hazards, ecosystem health, and effects on water resources from land-use changes.

  11. Retrospective stress-forecasting of earthquakes

    NASA Astrophysics Data System (ADS)

    Gao, Yuan; Crampin, Stuart

    2015-04-01

    Observations of changes in azimuthally varying shear-wave splitting (SWS) above swarms of small earthquakes monitor stress-induced changes to the stress-aligned vertical microcracks pervading the upper crust, the lower crust, and the uppermost ~400 km of the mantle. (In the mantle, the microcracks are intergranular films of hydrolysed melt.) Earthquakes release stress, and an amount of stress appropriate to the magnitude must accumulate before each event. Iceland lies on an extension of the Mid-Atlantic Ridge where, uniquely, two transform zones run onshore. These onshore transform zones provide semi-continuous swarms of small earthquakes and are the only place worldwide where SWS can be routinely monitored. Elsewhere, SWS must be monitored above occasional, temporarily active swarms of small earthquakes, or in infrequent SKS and other teleseismic reflections from the mantle. Observed changes in SWS time-delays are attributed to stress-induced changes in crack aspect-ratios, allowing stress-accumulation and stress-relaxation to be identified. While monitoring SWS in SW Iceland in 1988, stress-accumulation before an impending earthquake was recognised, and emails were exchanged between the University of Edinburgh (EU) and the Iceland Meteorological Office (IMO). On 10th November 1988, EU emailed IMO that a M5 earthquake could occur soon on a seismically-active fault plane where seismicity was still continuing following a M5.1 earthquake six months earlier. Three days later, IMO emailed EU that a M5 earthquake had just occurred on the specified fault plane. We suggest this was a successful earthquake stress-forecast; we refer to the procedure as stress-forecasting, as opposed to predicting or forecasting, to emphasise the different formalism. Lack of funds has prevented us from monitoring SWS on Iceland seismograms; however, we have identified similar characteristic behaviour of SWS time-delays above swarms of small earthquakes, which has enabled us to retrospectively stress-forecast ~17 earthquakes ranging in magnitude from a M1.7 swarm event in N Iceland to the 1999 M7.7 Chi-Chi earthquake in Taiwan and the 2004 Mw9.2 Sumatra-Andaman Earthquake (SAE). Before the SAE, changes in SWS were observed at seismic stations in Iceland, ~10,500 km (the width of the Eurasian Plate) from Indonesia, demonstrating the 'butterfly wings' sensitivity of the New Geophysics of a critically microcracked Earth. At that time, the sensitivity of the phenomenon had not been recognised, and the SAE was not stress-forecast. These results have been published at various times, in various formats, in various journals. This presentation displays all the results in a normalised format that allows the similarities to be recognised, confirming that observations of SWS time-delays can stress-forecast the times, magnitudes, and in some circumstances fault-breaks, of impending earthquakes. Papers referring to these developments can be found at geos.ed.ac.uk/home/scrampin/opinion. Also see abstracts in EGU2015 Sessions: Crampin & Gao (SM1.1), Liu & Crampin (NH2.5), and Crampin & Gao (GD.1).
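
    As a rough illustration of how stress-accumulation could be read from such data, the sketch below fits a straight line to normalized SWS time-delays observed above a swarm; the times and delay values are hypothetical placeholders, not measurements from this study, and a real analysis would need to treat scatter and abrupt stress-relaxation drops explicitly.

```python
# Minimal sketch: detecting a stress-accumulation trend in shear-wave
# splitting (SWS) time-delays above a swarm of small earthquakes.
# The event times and normalized time-delays are hypothetical.
import numpy as np

def sws_trend(event_days, delays_ms_per_km):
    """Least-squares slope of normalized SWS time-delays versus time.

    A sustained positive slope is read as stress accumulation; a sharp
    drop after a rise is read as stress relaxation preceding an event.
    """
    slope, intercept = np.polyfit(event_days, delays_ms_per_km, 1)
    return slope, intercept

# Hypothetical swarm observations: days since monitoring began and
# time-delays normalized by path length (ms/km).
days = np.array([0, 12, 25, 40, 61, 77, 90, 104])
delays = np.array([3.1, 3.3, 3.6, 3.9, 4.4, 4.8, 5.1, 5.6])

slope, _ = sws_trend(days, delays)
print(f"trend: {slope:.3f} ms/km per day ->",
      "accumulating" if slope > 0 else "relaxing")
```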

  12. Anomalous Schumann resonance observed in China, possibly associated with Honshu, Japan Earthquake

    NASA Astrophysics Data System (ADS)

    Ouyang, X. Y.; Zhang, X. M.; Shen, X. H.; Miao, Y. Q.

    2012-04-01

    Schumann resonance (hereafter SR) occurs in the cavity between the Earth and the ionosphere and is generated by global lightning activity [1]. Some recent publications have shown that anomalous SR phenomena may occur before major earthquakes [2-4]. Considering the good prospects for applying SR to earthquake monitoring, we have established four observatories in Yunnan province, a region of frequent seismicity in southwestern China. Our instruments provide three components of the magnetic field in the 0-30 Hz band, BNS (north-south component), BEW (east-west component) and BV (vertical component), sampled at 100 Hz. In this study, we use high-quality data recorded at the Yongsheng observatory (geographic coordinates: 26.7°N, 100.77°E) to search for anomalous SR effects possibly related to the Ms 9.0 earthquake near the east coast of Honshu, Japan (epicenter: 38.297°N, 142.372°E) on 11 March 2011. We selected data from 15 days before and after the earthquake. SR in BNS and SR in BEW differ in their background characteristics. The frequencies of the four SR modes in BNS are generally higher than those in BEW. The amplitude of SR in BNS is strong at around 05:00 LT, 15:00 LT and 23:00 LT, while the amplitude of SR in BEW is intense only around 16:00 LT, corresponding to about 08:00 UT. Because the American, African and Asian thunderstorm centers play their dominant roles in the intervals 21:00 UT±1 h, 15:00 UT±1 h and 08:00 UT±1 h respectively [1, 3], SR in BEW is most sensitive to signals from the Asian center, while SR in BNS responds well to all three centers. SR in BNS and SR in BEW also present different anomalous effects related to earthquakes. The BEW component gives a clear picture of anomalous SR phenomena, characterized by an increase in the amplitude of the four SR modes and an increase in the frequency of the first SR mode several days before the earthquake. The amplitude of the four SR modes began to increase four days before the Honshu earthquake (7th March), continued through the day of the earthquake (11th March), and then fell back to the usual intensity after the earthquake (12th March). The frequency of the first SR mode in BEW unconventionally exceeded the first-mode frequency in BNS, with an enhancement of 0.7 Hz, on 8th and 9th March. We did not find similar anomalous effects in BNS. The anomalous effects in BEW may be caused by interference between the direct path from the Asian center to the observatory and a path scattered by a perturbation in the ionosphere over Honshu. More detailed analysis is ongoing. 1. Nickolaenko A P and Hayakawa M, Resonances in the Earth-ionosphere cavity. 2002: Kluwer Academic Pub. 2. Hayakawa M, Ohta K, Nickolaenko A P, et al., Anomalous effect in Schumann resonance phenomena observed in Japan, possibly associated with the Chi-chi earthquake in Taiwan. Annales Geophysicae, 2005. pp. 1335-1346. 3. Hayakawa M, Nickolaenko A P, Sekiguchi M, et al., Anomalous ELF phenomena in the Schumann resonance band as observed at Moshiri (Japan) in possible association with an earthquake in Taiwan. Nat. Hazards Earth Syst. Sci., 2008. 8(6): p. 1309-1316. 4. Ohta K, Izutsu J, and Hayakawa M, Anomalous excitation of Schumann resonances and additional anomalous resonances before the 2004 Mid-Niigata prefecture earthquake and the 2007 Noto Hantou earthquake. Physics and Chemistry of the Earth, Parts A/B/C, 2009. 34(6-7): p. 441-448.
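
    A minimal sketch of how the SR mode amplitudes and frequencies discussed above could be extracted from a 100 Hz magnetic-field channel with a Welch power spectrum; the synthetic signal and the nominal mode search bands are assumptions for illustration, not Yongsheng observatory data.

```python
# Sketch: estimate Schumann-resonance mode frequencies and amplitudes
# from a 100 Hz ULF/ELF magnetometer channel (e.g. the EW component).
# The synthetic signal and nominal mode bands are illustrative only.
import numpy as np
from scipy.signal import welch

fs = 100.0                          # sample rate (Hz)
t = np.arange(0, 600, 1 / fs)       # 10 minutes of synthetic data
rng = np.random.default_rng(0)
trace = sum(a * np.sin(2 * np.pi * f * t)
            for f, a in [(7.8, 1.0), (14.1, 0.6), (20.3, 0.4), (26.4, 0.3)])
trace += 0.5 * rng.standard_normal(t.size)

freqs, psd = welch(trace, fs=fs, nperseg=4096)

# Nominal search bands around the first four SR modes (Hz).
for lo, hi in [(6, 9), (13, 16), (19, 22), (25, 28)]:
    band = (freqs >= lo) & (freqs <= hi)
    peak = np.argmax(psd[band])
    print(f"mode {lo}-{hi} Hz: f = {freqs[band][peak]:.2f} Hz, "
          f"power = {psd[band][peak]:.3f}")
```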

  13. Speeding earthquake disaster relief

    USGS Publications Warehouse

    Mortensen, Carl; Donlin, Carolyn; Page, Robert A.; Ward, Peter

    1995-01-01

    In coping with recent multibillion-dollar earthquake disasters, scientists and emergency managers have found new ways to speed and improve relief efforts. This progress is founded on the rapid availability of earthquake information from seismograph networks.

  14. Associating an ionospheric parameter with major earthquake occurrence throughout the world

    NASA Astrophysics Data System (ADS)

    Ghosh, D.; Midya, S. K.

    2014-02-01

    With time, ionospheric variation analysis is gaining ground over lithospheric monitoring as a source of precursors for earthquake forecasting. The current paper highlights the association of major (Ms ≥ 6.0) and medium (4.0 ≤ Ms < 6.0) earthquake occurrences throughout the world with different ranges of the Ionospheric Earthquake Parameter (IEP), where Ms is the earthquake magnitude on the Richter scale. From statistical and graphical analyses, it is concluded that the probability of earthquake occurrence is maximum when the defined parameter lies within the range 0-75 (the lower range). In the higher ranges, the earthquake occurrence probability gradually decreases. A probable explanation is also suggested.

  15. Earthquake Warning Performance in Vallejo for the South Napa Earthquake

    NASA Astrophysics Data System (ADS)

    Wurman, G.; Price, M.

    2014-12-01

    In 2002 and 2003, Seismic Warning Systems, Inc. installed first-generation QuakeGuardTM earthquake warning devices at all eight fire stations in Vallejo, CA. These devices are designed to detect the P-wave of an earthquake and initiate predetermined protective actions if the impending shaking is estimated at approximately Modified Mercalli Intensity V or greater. At the Vallejo fire stations the devices were set up to sound an audio alert over the public address system and to command the equipment bay doors to open. In August 2014, after more than 11 years of operating in the fire stations with no false alarms, the five units that were still in use triggered correctly on the MW 6.0 South Napa earthquake, less than 16 km away. The audio alert sounded in all five stations, providing firefighters with 1.5 to 2.5 seconds of warning before the arrival of the S-wave, and the equipment bay doors opened in three of the stations. In one station the doors were disconnected from the QuakeGuard device, and another station lost power before the doors opened completely. These problems highlight just a small portion of the complexity associated with realizing actionable earthquake warnings. The issues experienced in this earthquake have already been addressed in subsequent QuakeGuard product generations, with downstream connection monitoring and backup power for critical systems. The fact that the firefighters in Vallejo were afforded even two seconds of warning at these epicentral distances results from the design of the QuakeGuard devices, which focuses on rapid false positive rejection and ground motion estimates. We discuss the performance of the ground motion estimation algorithms, with an emphasis on the accuracy and timeliness of the estimates at close epicentral distances.
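
    The decision logic described above can be illustrated with a short sketch: estimate the impending intensity from early P-wave amplitudes and trigger the predetermined actions only at roughly MMI V or greater. The intensity estimator, threshold and function names below are hypothetical placeholders, not the vendor's proprietary algorithm.

```python
# Sketch of the decision logic described above: estimate impending
# shaking from early P-wave amplitudes and trigger protective actions
# above roughly MMI V. The intensity mapping is a crude placeholder.
MMI_THRESHOLD = 5.0

def estimate_mmi_from_p_wave(peak_p_accel_cm_s2):
    """Placeholder mapping from early P-wave acceleration to intensity."""
    # Illustrative only; a real system would use calibrated ground-motion
    # estimates and explicit false-positive rejection.
    return 1.0 + 2.0 * max(peak_p_accel_cm_s2, 0.01) ** 0.3

def on_p_wave_detected(peak_p_accel_cm_s2, open_bay_doors, sound_alert):
    mmi = estimate_mmi_from_p_wave(peak_p_accel_cm_s2)
    if mmi >= MMI_THRESHOLD:
        sound_alert()        # public-address audio alert
        open_bay_doors()     # command the equipment bay doors to open
    return mmi

mmi = on_p_wave_detected(30.0,
                         open_bay_doors=lambda: print("doors opening"),
                         sound_alert=lambda: print("alert sounding"))
print(f"estimated intensity: MMI {mmi:.1f}")
```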

  16. MONITORING TOXIC ORGANIC GASES AND PARTICLES NEAR THE WORLD TRADE CENTER AFTER SEPTEMBER 11, 2001

    EPA Science Inventory

    The September 11, 2001 attack on the World Trade Center (WTC) resulted in an intense fire and the subsequent, complete collapse of the two main structures and adjacent buildings, as well as significant damage to many surrounding buildings within and around the WTC complex. Thi...

  17. Hatfield Marine Science Center Dynamic Revetment Project DSL permit # 45455-FP, Monitoring Report February, 2015

    EPA Science Inventory

    Stabilization of the Yaquina Bay shoreline along the northeastern edge of the Hatfield Marine Science Center (HMSC) campus became necessary to halt erosion that threatened both HMSC critical infrastructure (seawater storage tank) and public access to the HMSC Nature Trail. A Dyn...

  18. Hatfield Marine Science Center Dynamic Revetment Project DSL Permit # 45455-FP. Monitoring Report. February, 2014.

    EPA Science Inventory

    Stabilization of the Yaquina Bay shoreline along the northeastern edge of the Hatfield Marine Science Center (HMSC) campus became necessary to halt erosion that threatened both HMSC critical infrastructure (seawater storage tank) and public access to the HMSC Nature Trail. A Dyn...

  19. CTEPP DATA COLLECTION FORM 05: CHILD DAY CARE CENTER PRE-MONITORING QUESTIONNAIRE

    EPA Science Inventory

    This data collection form is used to identify the potential sources of pollutants at the day care center. The day care teacher is asked questions related to the age of their day care building; age and frequency of cleaning carpets or rugs; types of heating and air conditioning de...

  20. Redefining Earthquakes and the Earthquake Machine

    ERIC Educational Resources Information Center

    Hubenthal, Michael; Braile, Larry; Taber, John

    2008-01-01

    The Earthquake Machine (EML), a mechanical model of stick-slip fault systems, can increase student engagement and facilitate opportunities to participate in the scientific process. This article introduces the EML model and an activity that challenges ninth-grade students' misconceptions about earthquakes. The activity emphasizes the role of models…

  1. Children's Ideas about Earthquakes

    ERIC Educational Resources Information Center

    Simsek, Canan Lacin

    2007-01-01

    Earthquakes, a type of natural disaster, are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced correspondingly. In particular, good training about earthquakes received in primary schools is considered…

  2. Earthquake and Schools. [Videotape].

    ERIC Educational Resources Information Center

    Federal Emergency Management Agency, Washington, DC.

    Designing schools to make them more earthquake resistant and protect children from the catastrophic collapse of the school building is discussed in this videotape. It reveals that 44 of the 50 U.S. states are vulnerable to earthquakes, but most schools are structurally unprepared to take on the stresses that earthquakes exert. The cost to the…

  3. Operational earthquake forecasting can enhance earthquake preparedness

    USGS Publications Warehouse

    Jordan, T.H.; Marzocchi, W.; Michael, A.J.; Gerstenberger, M.C.

    2014-01-01

    We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time-dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard—ground-motion exceedance probabilities as well as short-term rupture probabilities—in concert with the long-term forecasts of probabilistic seismic-hazard analysis (PSHA).
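
    A minimal worked example of the time-dependent probabilities that OEF communicates: converting a short-term earthquake rate, which rises and falls with nearby seismicity, into the probability of at least one event in a forecast window under a Poisson occurrence assumption. The rates below are hypothetical illustrations, not OEF products.

```python
# Worked example: turn a time-varying short-term earthquake rate into
# the probability of at least one event in a forecast window, assuming
# Poisson occurrence. Rates are hypothetical illustrations.
import math

def prob_at_least_one(rate_per_day, window_days):
    """P(N >= 1) in the window for a Poisson process with the given rate."""
    return 1.0 - math.exp(-rate_per_day * window_days)

background_rate = 1e-4        # events/day, quiet period (hypothetical)
clustered_rate = 5e-3         # events/day, during an aftershock sequence

for label, rate in [("background", background_rate), ("clustered", clustered_rate)]:
    print(f"{label}: 7-day probability = {prob_at_least_one(rate, 7):.3%}")
```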

  4. Magma supply and storage in volcanic systems: Shallow crustal emplacement processes and causes of the large axial high along the western Galapagos Spreading Center, and, Relation of earthquakes to tectonic and magmatic features near Lassen Peak, northern California

    NASA Astrophysics Data System (ADS)

    Blacic, Tanya Marie

    Magma storage and supply are investigated in two different tectonic environments: the Galapagos Spreading Center (GSC), a plume-influenced mid-ocean ridge, and Lassen Peak, a subduction-related volcano. Along the GSC, multi-channel seismic reflection data are used to infer crustal accretion processes and forward modeling is used to investigate causes of the axial high. At Lassen Peak, catalog earthquakes are relocated using the double-difference method and the resulting locations are examined. Moving westward away from the hotspot along the GSC, the magma lens deepens, layer 2A thickens, and the axial high rapidly disappears near 92.7°W. Increasing layer 2A thickness and magma lens depth support the interpretation of layer 2A as the extrusive volcanic layer, with thickness controlled by pressure on the magma lens and its ability to push magma to the surface. Off-axis thickening of layer 2A east of 94.0°W suggests that narrower magma lenses focus diking close to the ridge axis, such that lava flowing away from the axis blankets older flows, thickening the extrusive crust off-axis. Causes of the GSC axial high are investigated using a model that determines the flexural response of the lithosphere to loads resulting from the thermal and magmatic structure. Results reveal that the large axial high requires either that the crust below the magma lens contains a large melt fraction (≥35%), or that melt extends into the mantle in a narrow region beneath the axis. Less melt is required for a profile to the west where the axial high is smaller (like the East Pacific Rise). Earthquake relocation at Lassen Peak shows focusing of events into three clusters 4-6 km beneath the south flank of the volcano. These clusters may be related to movement of magmatic and hydrothermal fluids and may mark the top of a region of hot crust overlying a small magma chamber. Just north of Manzanita Creek (~14 km northwest of Lassen Peak) is a linear set of earthquakes that does not correspond to any mapped faults. A single basaltic vent at the eastern end of this feature indicates magma may have used this weak zone in the crust to make its way to the surface.

  5. Earthquake Education in Prime Time

    NASA Astrophysics Data System (ADS)

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

    Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in the spring season 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and The Department of Homeland Security Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU) on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs. We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and hazard response to create a program that is both educational and provides a public service. Seismic Sleuths and Written in Stone are the harbingers of a new genre of earthquake programs that are the antithesis of the 1974 film Earthquake and the 2004 miniseries 10.5. Film producers and those in the earthquake education community are demonstrating that it is possible to tell an exciting story, inspire awareness, and encourage empowerment without sensationalism.

  6. Earthquake Hazard in the Heart of the Homeland

    USGS Publications Warehouse

    Gomberg, Joan; Schweig, Eugene

    2007-01-01

    Evidence that earthquakes threaten the Mississippi, Ohio, and Wabash River valleys of the Central United States abounds. In fact, several of the largest historical earthquakes to strike the continental United States occurred in the winter of 1811-1812 along the New Madrid seismic zone, which stretches from just west of Memphis, Tenn., into southern Illinois. Several times in the past century, moderate earthquakes have been widely felt in the Wabash Valley seismic zone along the southern border of Illinois and Indiana. Throughout the region, between 150 and 200 earthquakes are recorded annually by a network of monitoring instruments, although most are too small to be felt by people. Geologic evidence for prehistoric earthquakes throughout the region has been mounting since the late 1970s. But how significant is the threat? How likely are large earthquakes and, more importantly, what is the chance that the shaking they cause will be damaging?

  7. The Cooperative Monitoring Center: Achieving cooperative security objectives through technical collaborations

    SciTech Connect

    Pregenzer, A.

    1996-08-01

    The post cold war security environment poses both difficult challenges and encouraging opportunities. Some of the most difficult challenges are related to regional conflict and the proliferation of weapons of mass destruction. New and innovative approaches to prevent the proliferation of weapons of mass destruction are essential. More effort must be focused on underlying factors that motivate countries to seek weapons of mass destruction. Historically the emphasis has been on denial: denying information, denying technology, and denying materials necessary to build such weapons. Though still important, those efforts are increasingly perceived to be insufficient, and initiatives that address underlying motivational factors are needed. On the opportunity side, efforts to establish regional dialogue and confidence-building measures are increasing in many areas. Such efforts can result in cooperative agreements on security issues such as border control, demilitarized zones, weapons delivery systems, weapons of mass destruction free zones, environmental agreements, and resource sharing. In some cases, implementing such cooperative agreements will mean acquiring, analyzing, and sharing large quantities of data and sensitive information. These arrangements for "cooperative monitoring" are becoming increasingly important to the security of individual countries, regions, and international institutions. However, many countries lack sufficient technical and institutional infrastructure to take full advantage of these opportunities. Constructing a peaceful twenty-first century will require that technology is brought to bear in the most productive and innovative ways to meet the challenges of proliferation and to maximize the opportunities for cooperation.

  8. Crowdsourced earthquake early warning

    PubMed Central

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing. PMID:26601167

  9. Crowdsourced earthquake early warning.

    PubMed

    Minson, Sarah E; Brooks, Benjamin A; Glennie, Craig L; Murray, Jessica R; Langbein, John O; Owen, Susan E; Heaton, Thomas H; Iannucci, Robert A; Hauser, Darren L

    2015-04-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California's Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing. PMID:26601167

  10. The Geology of Earthquakes

    NASA Astrophysics Data System (ADS)

    Wallace, Robert E.

    The Geology of Earthquakes is a major contribution that brings together under one cover the many and complex elements of geology that are fundamental to earthquakes and seismology. Here are described and analyzed the basic causes of earthquakes, the resulting effects of earthquakes and faulting on the surface of the Earth, techniques of analyzing these effects, and engineering and public policy considerations for earthquake hazard mitigation. The three authors have played major roles in developing the fundamentals in both scientific and policy matters; thus they speak with an authority that few others could.

  11. Real-Time Earthquake Analysis for Disaster Mitigation (READI) Network

    NASA Astrophysics Data System (ADS)

    Bock, Y.

    2014-12-01

    Real-time GNSS networks are making a significant impact on our ability to forecast, assess, and mitigate the effects of geological hazards. I describe the activities of the Real-time Earthquake Analysis for Disaster Mitigation (READI) working group. The group leverages 600+ real-time GPS stations in western North America operated by UNAVCO (PBO network), Central Washington University (PANGA), US Geological Survey & Scripps Institution of Oceanography (SCIGN project), UC Berkeley & US Geological Survey (BARD network), and the Pacific Geosciences Centre (WCDA project). Our goal is to demonstrate an earthquake and tsunami early warning system for western North America. Rapid response is particularly important for those coastal communities that are in the near-source region of large earthquakes and may have only minutes of warning time, and who today are not adequately covered by existing seismic and basin-wide ocean-buoy monitoring systems. The READI working group is performing comparisons of independent real time analyses of 1 Hz GPS data for station displacements and is participating in government-sponsored earthquake and tsunami exercises in the Western U.S. I describe a prototype seismogeodetic system using a cluster of southern California stations that includes GNSS tracking and collocation with MEMS accelerometers for real-time estimation of seismic velocity and displacement waveforms, which has advantages for improved earthquake early warning and tsunami forecasts compared to seismic-only or GPS-only methods. The READI working group's ultimate goal is to participate in an Indo-Pacific Tsunami early warning system that utilizes GNSS real-time displacements and ionospheric measurements along with seismic, near-shore buoys and ocean-bottom pressure sensors, where available, to rapidly estimate magnitude and finite fault slip models for large earthquakes, and then forecast tsunami source, energy scale, geographic extent, inundation and runup. This will require cooperation with other real-time efforts around the Pacific Rim in terms of sharing, analysis centers, and advisory bulletins to the responsible government agencies. The IAG's Global Geodetic Observing System (GGOS), in particular its natural hazards theme, provides a natural umbrella for achieving this objective.
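
    One simple way to illustrate the seismogeodetic combination mentioned above is a complementary filter that trusts GPS displacement at low frequency and twice-integrated acceleration at high frequency. This is an illustrative sketch under simple assumptions (a common sample rate, collocated sensors), not the READI working group's production algorithm.

```python
# Sketch of a seismogeodetic combination: blend collocated GPS
# displacement (reliable at low frequency) with twice-integrated
# acceleration (reliable at high frequency) using a complementary
# filter. Illustrative assumptions only.
import numpy as np
from scipy.signal import butter, filtfilt
from scipy.integrate import cumulative_trapezoid

def seismogeodetic_displacement(gps_disp, accel, fs, crossover_hz=0.05):
    """Combine GPS displacement and acceleration sampled at fs (Hz)."""
    # Twice-integrate acceleration to displacement (drifts at low frequency).
    vel = cumulative_trapezoid(accel, dx=1 / fs, initial=0.0)
    disp_acc = cumulative_trapezoid(vel, dx=1 / fs, initial=0.0)

    # Complementary pair of zero-phase Butterworth filters.
    b_lo, a_lo = butter(2, crossover_hz, btype="low", fs=fs)
    b_hi, a_hi = butter(2, crossover_hz, btype="high", fs=fs)
    return filtfilt(b_lo, a_lo, gps_disp) + filtfilt(b_hi, a_hi, disp_acc)

# Tiny synthetic demo at a common 10 Hz sample rate.
fs = 10.0
t = np.arange(0.0, 60.0, 1 / fs)
true = 0.05 * np.sin(2 * np.pi * 0.01 * t) + 0.005 * np.sin(2 * np.pi * 1.0 * t)
accel = np.gradient(np.gradient(true, 1 / fs), 1 / fs)          # "accelerometer"
gps = true + 0.002 * np.random.default_rng(1).standard_normal(t.size)
est = seismogeodetic_displacement(gps, accel, fs)
print(f"rms error: {np.sqrt(np.mean((est - true) ** 2)):.4f} m")
```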

  12. Commensurability of earthquake occurrence

    NASA Astrophysics Data System (ADS)

    Hu, Hui; Han, Yanben; Su, Youjin; Wang, Rui

    2013-07-01

    In recent years, huge earthquakes have occurred frequently, striking many places around the globe without warning. These frequent, exceptionally strong earthquake disasters remind us that we must strengthen research on the formation, mechanism, prediction and forecasting of earthquakes, in order to advance Earth science and mitigate seismic disasters. In this paper, the commensurability of earthquake occurrence is studied by means of the commensurability revealed by the Titius-Bode law. The results show that earthquakes essentially occur at the commensurable points of their respective time axes. They also show that the occurrence of earthquakes is not accidental but exhibits certain patterns and inevitability, and that the commensurable value differs for earthquakes occurring in different areas.

  13. Earthquake Engineering Mitigation of Blast Loading

    E-print Network

    Fainman, Yeshaiahu

    Earthquake Engineering; Mitigation of Blast Loading; Health Monitoring & Condition Assessment; ...structural building components, and visual sensing for dynamic testing. Associate Professor Hyonny Kim comes to UC... ...fluidics and protective/energy-absorbing materials provides synergy with the Department's ongoing work in blast mitigation.

  14. Developing stress-monitoring sites using cross-hole seismology to stress-forecast the times and magnitudes of future earthquakes

    E-print Network

    Developing stress-monitoring sites using cross-hole seismology to stress-forecast the times and magnitudes of future earthquakes (2000). Abstract: A new understanding of rockmass deformation suggests that changing stress in the crust modifies the stress-aligned microcracks pervading almost all rocks in the crust. These stress-aligned microcracks cause the widely observed shear-wave splitting...

  15. Earthquake detection through computationally efficient similarity search.

    PubMed

    Yoon, Clara E; O'Reilly, Ossian; Bergen, Karianne J; Beroza, Gregory C

    2015-12-01

    Seismology is experiencing rapid growth in the quantity of data, which has outpaced the development of processing algorithms. Earthquake detection-identification of seismic events in continuous data-is a fundamental operation for observational seismology. We developed an efficient method to detect earthquakes using waveform similarity that overcomes the disadvantages of existing detection methods. Our method, called Fingerprint And Similarity Thresholding (FAST), can analyze a week of continuous seismic waveform data in less than 2 hours, or 140 times faster than autocorrelation. FAST adapts a data mining algorithm, originally designed to identify similar audio clips within large databases; it first creates compact "fingerprints" of waveforms by extracting key discriminative features, then groups similar fingerprints together within a database to facilitate fast, scalable search for similar fingerprint pairs, and finally generates a list of earthquake detections. FAST detected most (21 of 24) cataloged earthquakes and 68 uncataloged earthquakes in 1 week of continuous data from a station located near the Calaveras Fault in central California, achieving detection performance comparable to that of autocorrelation, with some additional false detections. FAST is expected to realize its full potential when applied to extremely long duration data sets over a distributed network of seismic stations. The widespread application of FAST has the potential to aid in the discovery of unexpected seismic signals, improve seismic monitoring, and promote a greater understanding of a variety of earthquake processes. PMID:26665176
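
    A highly simplified sketch of the fingerprint-and-similarity idea: reduce overlapping waveform windows to compact binary fingerprints (here, spectrogram bins above their median power) and compare fingerprints with a Jaccard similarity. The published FAST method uses locality-sensitive hashing to make the pair search scalable; the brute-force version and synthetic data below are for illustration only.

```python
# Toy fingerprint-and-similarity search, loosely in the spirit of FAST.
# Brute force and synthetic; not the published implementation.
import numpy as np
from scipy.signal import spectrogram

def fingerprint(window, fs):
    """Binary fingerprint: spectrogram bins above their median power."""
    _, _, sxx = spectrogram(window, fs=fs, nperseg=64, noverlap=32)
    return (sxx > np.median(sxx)).ravel()

def jaccard(a, b):
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 0.0

def most_similar_pair(trace, fs, win_s=10.0, step_s=5.0):
    """Brute-force search for the most similar pair of non-adjacent windows."""
    n_win, n_step = int(win_s * fs), int(step_s * fs)
    starts = list(range(0, len(trace) - n_win + 1, n_step))
    prints = [fingerprint(trace[s:s + n_win], fs) for s in starts]
    return max(
        ((jaccard(prints[i], prints[j]), starts[i] / fs, starts[j] / fs)
         for i in range(len(prints)) for j in range(i + 2, len(prints))),
        key=lambda p: p[0])

# Synthetic demo: an identical broadband burst repeats at 20 s and 75 s.
fs = 100.0
rng = np.random.default_rng(0)
trace = 0.3 * rng.standard_normal(int(120 * fs))
burst = 2.0 * rng.standard_normal(int(6 * fs)) * np.hanning(int(6 * fs))
for t0 in (20.0, 75.0):
    i = int(t0 * fs)
    trace[i:i + burst.size] += burst

score, t_a, t_b = most_similar_pair(trace, fs)
print(f"most similar windows start at {t_a:.0f} s and {t_b:.0f} s "
      f"(Jaccard {score:.2f})")
```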

  16. Earthquake detection through computationally efficient similarity search

    PubMed Central

    Yoon, Clara E.; O’Reilly, Ossian; Bergen, Karianne J.; Beroza, Gregory C.

    2015-01-01

    Seismology is experiencing rapid growth in the quantity of data, which has outpaced the development of processing algorithms. Earthquake detection—identification of seismic events in continuous data—is a fundamental operation for observational seismology. We developed an efficient method to detect earthquakes using waveform similarity that overcomes the disadvantages of existing detection methods. Our method, called Fingerprint And Similarity Thresholding (FAST), can analyze a week of continuous seismic waveform data in less than 2 hours, or 140 times faster than autocorrelation. FAST adapts a data mining algorithm, originally designed to identify similar audio clips within large databases; it first creates compact “fingerprints” of waveforms by extracting key discriminative features, then groups similar fingerprints together within a database to facilitate fast, scalable search for similar fingerprint pairs, and finally generates a list of earthquake detections. FAST detected most (21 of 24) cataloged earthquakes and 68 uncataloged earthquakes in 1 week of continuous data from a station located near the Calaveras Fault in central California, achieving detection performance comparable to that of autocorrelation, with some additional false detections. FAST is expected to realize its full potential when applied to extremely long duration data sets over a distributed network of seismic stations. The widespread application of FAST has the potential to aid in the discovery of unexpected seismic signals, improve seismic monitoring, and promote a greater understanding of a variety of earthquake processes. PMID:26665176

  17. Launch Complex 39 Observation Gantry Area (SWMU# 107) Annual Long-Term Monitoring Report (Year 1) Kennedy Space Center, Florida

    NASA Technical Reports Server (NTRS)

    Johnson, Jill W.; Towns, Crystal

    2015-01-01

    This document has been prepared by Geosyntec Consultants, Inc. (Geosyntec) to present and discuss the findings of the 2014 and 2015 Long-Term Monitoring (LTM) activities that were completed at the Launch Complex 39 (LC39) Observation Gantry Area (OGA) located at the John F. Kennedy Space Center (KSC), Florida (Site). The remainder of this report includes: (i) a description of the Site location; (ii) summary of Site background and previous investigations; (iii) description of field activities completed as part of the annual LTM program at the Site; (iv) groundwater flow evaluation; (v) presentation and discussion of field and analytical results; and (vi) conclusions and recommendations. Applicable KSC Remediation Team (KSCRT) Meeting minutes are included in Attachment A. This Annual LTM Letter Report was prepared by Geosyntec Consultants (Geosyntec) for NASA under contract number NNK12CA13B, Delivery Order NNK13CA39T project number PCN ENV2188.

  18. Radioanalytical Data Quality Objectives and Measurement Quality Objectives during a Federal Radiological Monitoring and Assessment Center Response

    SciTech Connect

    E. C. Nielsen

    2006-01-01

    During the early and intermediate phases of a nuclear or radiological incident, the Federal Radiological Monitoring and Assessment Center (FRMAC) collects environmental samples that are analyzed by organizations with radioanalytical capability. Resources dedicated to quality assurance (QA) activities must be sufficient to assure that appropriate radioanalytical measurement quality objectives (MQOs) and assessment data quality objectives (DQOs) are met. As the emergency stabilizes, QA activities will evolve commensurate with the need to reach appropriate DQOs. The MQOs represent a compromise between precise analytical determinations and the timeliness necessary for emergency response activities. Minimum detectable concentration (MDC), lower limit of detection, and critical level tests can all serve as measurements reflecting the MQOs. The relationship among protective action guides (PAGs), derived response levels (DRLs), and laboratory detection limits is described. The rationale used to determine the appropriate laboratory detection limit is described.
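
    A minimal worked example of the detection quantities mentioned above, using the standard Currie approximations for a paired blank (critical level Lc = 2.33*sqrt(B) counts, detection limit Ld = 2.71 + 4.65*sqrt(B) counts) converted to a minimum detectable concentration. The counting parameters are hypothetical illustrations, not FRMAC values.

```python
# Worked example: Currie critical level and MDC for a paired blank.
# Counting parameters are hypothetical.
import math

def mdc(background_counts, count_time_s, efficiency, yield_frac, sample_size):
    """MDC in activity per unit sample size (e.g. Bq per kg)."""
    ld_counts = 2.71 + 4.65 * math.sqrt(background_counts)
    return ld_counts / (count_time_s * efficiency * yield_frac * sample_size)

B, t = 400.0, 3600.0          # background counts and count time (s)
lc = 2.33 * math.sqrt(B)
print(f"critical level: {lc:.1f} counts")
print(f"MDC: {mdc(B, t, efficiency=0.3, yield_frac=0.9, sample_size=0.5):.4f} Bq/kg")
```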

  19. Oscillating brittle and viscous behavior through the earthquake cycle in the Red River Shear Zone: Monitoring flips between reaction and textural softening and hardening

    NASA Astrophysics Data System (ADS)

    Wintsch, Robert P.; Yeh, Meng-Wan

    2013-03-01

    Microstructures associated with cataclasites and mylonites in the Red River shear zone in the Diancang Shan block, Yunnan Province, China show evidence for both reaction hardening and softening at lower greenschist facies metamorphic conditions. The earliest fault-rocks derived from Triassic porphyritic orthogneiss protoliths are cataclasites. Brittle fractures and crushed grains are cemented by newly precipitated quartz. These cataclasites are subsequently overprinted by mylonitic fabrics. Truncations and embayments of relic feldspars and biotites show that these protolith minerals have been dissolved and incompletely replaced by muscovite, chlorite, and quartz. Both K-feldspar and plagioclase porphyroclasts are truncated by muscovite alone, suggesting locally metasomatic reactions of the form: 3K-feldspar + 2H+ = muscovite + 6SiO2(aq) + 2K+. Such reactions produce muscovite folia and fish, and quartz bands and ribbons. Muscovite and quartz are much weaker than the reactant feldspars and these reactions result in reaction softening. Moreover, the muscovite tends to align in contiguous bands that constitute textural softening. These mineral and textural modifications occurred at constant temperature and drove the transition from brittle to viscous deformation and the shift in deformation mechanism from cataclasis to dissolution-precipitation and reaction creep. These mylonitic rocks so produced are cut by K-feldspar veins that interrupt the mylonitic fabric. The veins add K-feldspar to the assemblage and these structures constitute both reaction and textural hardening. Finally these veins are boudinaged by continued viscous deformation in the mylonitic matrix, thus defining a late ductile strain event. Together these overprinting textures and microstructures demonstrate several oscillations between brittle and viscous deformation, all at lower greenschist facies conditions where only frictional behavior is predicted by experiments. The overlap of the depths of greenschist facies conditions with the base of the crustal seismic zone suggests that the implied oscillations in strain rate may have been related to the earthquake cycle.

  20. The ICOS Ecosystem network and Thematic Center: an infrastructure to monitor and better understand the ecosystem GHGs exchanges

    NASA Astrophysics Data System (ADS)

    Papale, D.; Ceulemans, R.; Janssens, I.; Loustau, D.; Valentini, R.

    2012-12-01

    The ICOS Ecosystem network is part of the ICOS European Research Infrastructure (www.icos-infrastructure.eu), together with the Atmospheric and Ocean networks. The ICOS Ecosystem network includes highly standardized monitoring sites based on commercially available instruments embedded into an integrated system coordinated by the ICOS Ecosystem Thematic Center (ETC), which is responsible for methodological advancement, data processing and data distribution. Because the ecosystem monitoring activity involves human intervention in field activities, rigorously standardized protocols for field ecosystem measurements are in preparation, in coordination with other related international activities. The core measurements at the ICOS Ecosystem sites are the main GHG fluxes, including CO2, H2O, CH4 and N2O, obtained with the eddy covariance method and with chambers for the soil effluxes. To better interpret and understand the GHG exchanges, a full set of meteorological data (including spectral reflectance measurements and the full radiation and water balance) is also collected, and the sites are characterized in terms of carbon stocks, nutrient availability, and management and disturbance history. The main activities for which the ETC is responsible are centralized raw data processing, QA/QC and uncertainty estimation, testing and development of new methodologies and techniques, assistance to the network, and chemical analysis and long-term storage of the vegetation and soil samples. The ETC, based in Italy with sections in Belgium and France, is under construction and will be operative in 2013. The Ecosystem network, including the variables collected, the protocols under preparation and the data access and data use policies, will be presented together with the Ecosystem Thematic Center's role and development strategy. The aim is to identify and discuss integration and collaboration with other similar initiatives, partly through the support of the COOPEUS European project, which will facilitate coordination between US and EU networks, and to receive feedback from potential users of the infrastructure.

  1. Predicting Earthquake Response of Civil Structures from Ambient Noise

    NASA Astrophysics Data System (ADS)

    Prieto, G.; Lawrence, J. F.; Chung, A. I.; Kohler, M. D.

    2009-12-01

    Increased monitoring of civil structures for response to earthquake motions is fundamental for reducing seismic hazard. Seismic monitoring is difficult because typically only a few useful, intermediate to large earthquakes occur per decade near instrumented structures. Here we demonstrate that the impulse response function (IRF) of a multi-story building can be generated from ambient noise. Estimated shear-wave velocity, attenuation values, and resonance frequencies from the IRFs agree with previous estimates for the instrumented UCLA Factor building. The accuracy of the approach is demonstrated by predicting the Factor building's response to an M4.2 earthquake. The methodology described here allows for rapid non-invasive determination of structural parameters from the IRFs within days and could be used as a new tool for state-of-health monitoring of civil structures (buildings, bridges, etc.) before and/or after major earthquakes.
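
    A minimal sketch of the general idea: estimate a building impulse response function by deconvolving upper-floor motion by base motion and stacking over many ambient-noise windows so that incoherent noise averages down. The synthetic damped resonance, the water-level regularization and all parameters are illustrative assumptions, not the Factor-building processing chain.

```python
# Sketch: IRF estimation from ambient noise via water-level
# deconvolution and stacking. Synthetic and illustrative only.
import numpy as np

def deconvolve(top, base, water_level=0.05):
    """Water-level deconvolution of one window; returns the IRF spectrum."""
    top_f, base_f = np.fft.rfft(top), np.fft.rfft(base)
    power = np.abs(base_f) ** 2
    power = np.maximum(power, water_level * power.max())
    return top_f * np.conj(base_f) / power

fs = 50.0
n_win, n_samples = 50, 3000                    # 50 one-minute windows
rng = np.random.default_rng(2)

# Hypothetical building response: a damped 1.7 Hz resonance.
t_irf = np.arange(0, 4, 1 / fs)
resonance = np.exp(-1.5 * t_irf) * np.sin(2 * np.pi * 1.7 * t_irf)

stack = np.zeros(n_samples // 2 + 1, dtype=complex)
for _ in range(n_win):
    base = rng.standard_normal(n_samples)                 # ambient noise at base
    top = np.convolve(base, resonance)[:n_samples]        # propagated to roof
    top += 0.1 * rng.standard_normal(n_samples)           # sensor noise
    stack += deconvolve(top, base) / n_win

freqs = np.fft.rfftfreq(n_samples, 1 / fs)
print(f"estimated fundamental frequency ~ {freqs[np.argmax(np.abs(stack))]:.2f} Hz "
      "(input resonance 1.7 Hz)")
```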

  2. Engaging Students in Earthquake Science

    NASA Astrophysics Data System (ADS)

    Cooper, I. E.; Benthien, M.

    2004-12-01

    The Southern California Earthquake Center Communication, Education, and Outreach program (SCEC CEO) has been collaborating with the University of Southern California (USC) Joint Education Project (JEP) and the Education Consortium of Central Los Angeles (ECCLA) to work directly with the teachers and schools in the local community around USC. The community surrounding USC is 57% Hispanic (US Census, 2000) and 21% African American (US Census, 2000). Through the partnership with ECCLA, SCEC has created a three-week enrichment intersession program, targeting disadvantaged students at the fourth/fifth grade level, dedicated entirely to earthquakes. SCEC builds partnerships with the intersession teachers, working together to actively engage the students in learning about earthquakes. SCEC provides a support system for the teachers, supplying them with the necessary content background as well as classroom manipulatives. SCEC goes into the classrooms with guest speakers and takes the students out of the classroom on two field trips. There are four intersession programs each year. SCEC is also working with USC's Joint Education Project program. The JEP program has been recognized as one of the "oldest and best organized" Service-Learning programs in the country (TIME Magazine and the Princeton Review, 2000). Through this partnership SCEC is providing USC students with the necessary tools to go out to the local schools and teach students of all grade levels about earthquakes. SCEC works with the USC students to design engaging lesson plans that effectively convey content regarding earthquakes. USC students can check out hands-on/interactive materials to use in the classrooms from the SCEC Resource Library. In both these endeavors SCEC has expanded its outreach to the local community. SCEC is reaching over 200 minority children each year through these partnerships, and this number will increase as the programs grow.

  3. Catalog of Earthquake Hypocenters at Alaskan Volcanoes: January 1 through December 31, 2006

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.; Searcy, Cheryl

    2008-01-01

    Between January 1 and December 31, 2006, AVO located 8,666 earthquakes of which 7,783 occurred on or near the 33 volcanoes monitored within Alaska. Monitoring highlights in 2006 include: an eruption of Augustine Volcano, a volcanic-tectonic earthquake swarm at Mount Martin, elevated seismicity and volcanic unrest at Fourpeaked Mountain, and elevated seismicity and low-level tremor at Mount Veniaminof and Korovin Volcano. A new seismic subnetwork was installed on Fourpeaked Mountain. This catalog includes: (1) descriptions and locations of seismic instrumentation deployed in the field during 2006, (2) a description of earthquake detection, recording, analysis, and data archival systems, (3) a description of seismic velocity models used for earthquake locations, (4) a summary of earthquakes located in 2006, and (5) an accompanying UNIX tar-file with a summary of earthquake origin times, hypocenters, magnitudes, phase arrival times, location quality statistics, daily station usage statistics, and all files used to determine the earthquake locations in 2006.

  4. Uplift and Subsidence Associated with the Great Aceh-Andaman Earthquake of 2004

    NASA Technical Reports Server (NTRS)

    2006-01-01

    The magnitude 9.2 Indian Ocean earthquake of December 26, 2004, produced broad regions of uplift and subsidence. In order to define the lateral extent and the downdip limit of rupture, scientists from Caltech, Pasadena, Calif.; NASA's Jet Propulsion Laboratory, Pasadena, Calif.; Scripps Institution of Oceanography, La Jolla, Calif.; the U.S. Geological Survey, Pasadena, Calif.; and the Research Center for Geotechnology, Indonesian Institute of Sciences, Bandung, Indonesia, first needed to define the pivot line separating those regions. Interpretation of satellite imagery and a tidal model were among the key tools used to do this.

    These pre-Sumatra earthquake (a) and post-Sumatra earthquake (b) images of North Sentinel Island in the Indian Ocean, acquired from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) instrument on NASA's Terra spacecraft, show emergence of the coral reef surrounding the island following the earthquake. The tide was 30 plus or minus 14 centimeters lower in the pre-earthquake image (acquired November 21, 2000) than in the post-earthquake image (acquired February 20, 2005), requiring a minimum of 30 centimeters of uplift at this locality. Observations from an Indian Coast Guard helicopter on the northwest coast of the island suggest that the actual uplift is on the order of 1 to 2 meters at this site.

    In figures (c) and (d), pre-earthquake and post-earthquake ASTER images of a small island off the northwest coast of Rutland Island, 38 kilometers east of North Sentinel Island, show submergence of the coral reef surrounding the island. The tide was higher in the pre-earthquake image (acquired January 1, 2004) than in the post-earthquake image (acquired February 4, 2005), requiring subsidence at this locality. The pivot line must run between North Sentinel and Rutland islands. Note that the scale for the North Sentinel Island images differs from that for the Rutland Island images.

    The tidal model used for this study was based on data from JPL's Topex/Poseidon satellite. The model was used to determine the relative sea surface height at each location at the time each image was acquired, a critical component used to quantify the deformation.

    The scientists' method of using satellite imagery to recognize changes in elevation relative to sea surface height and of using a tidal model to place quantitative bounds on coseismic uplift or subsidence is a novel approach that can be adapted to other forms of remote sensing and can be applied to other subduction zones in tropical regions.

    ASTER is one of five Earth-observing instruments launched December 18, 1999, on NASA's Terra satellite. The instrument was built by Japan's Ministry of Economy, Trade and Industry. A joint U.S./Japan science team is responsible for validation and calibration of the instrument and the data products.

    The broad spectral coverage and high spectral resolution of ASTER provides scientists in numerous disciplines with critical information for surface mapping, and monitoring of dynamic conditions and temporal change. Example applications are: monitoring glacial advances and retreats; monitoring potentially active volcanoes; identifying crop stress; determining cloud morphology and physical properties; wetlands evaluation; thermal pollution monitoring; coral reef degradation; surface temperature mapping of soils and geology; and measuring surface heat balance.

    The U.S. science team is located at NASA's Jet Propulsion Laboratory, Pasadena, Calif. The Terra mission is part of NASA's Science Mission Directorate.

  5. Data Management and Site-Visit Monitoring of the Multi-Center Registry in the Korean Neonatal Network

    PubMed Central

    Choi, Chang Won

    2015-01-01

    The Korean Neonatal Network (KNN), a nationwide prospective registry of very-low-birth-weight (VLBW, < 1,500 g at birth) infants, was launched in April 2013. Data management (DM) and site-visit monitoring (SVM) were crucial in ensuring the quality of the data collected from 55 participating hospitals across the country on 116 clinical variables. We describe the processes and results of DM and SVM performed during the establishment stage of the registry. The DM procedure included automated proof checks, electronic data validation, query creation, query resolution, and revalidation of the corrected data. SVM included SVM team organization, identification of unregistered cases, source document verification, and post-visit report production. By March 31, 2015, 4,063 VLBW infants were registered and 1,693 queries were produced. Of these, 1,629 queries were resolved and 64 queries remain unresolved. By November 28, 2014, 52 participating hospitals were visited, with 136 site-visits completed since April 2013. Each participating hospital was visited biannually. DM and SVM were performed to ensure the quality of the data collected for the KNN registry. Our experience with DM and SVM can be applied for similar multi-center registries with large numbers of participating centers. PMID:26566353
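
    A minimal sketch of the data-validation and query-creation step described above: range-check a few registry fields and emit a query for each value that fails, to be resolved by the site and then revalidated. The field names and ranges are hypothetical, not the KNN case report form.

```python
# Sketch: automated range checks producing data-clarification queries.
# Field names and ranges are hypothetical placeholders.
RANGES = {"birth_weight_g": (200, 1500), "gestational_age_wk": (20, 37)}

def validate(record):
    """Return a list of queries (field, value, reason) for one record."""
    queries = []
    for field, (lo, hi) in RANGES.items():
        value = record.get(field)
        if value is None:
            queries.append((field, value, "missing"))
        elif not lo <= value <= hi:
            queries.append((field, value, f"outside {lo}-{hi}"))
    return queries

print(validate({"birth_weight_g": 1800, "gestational_age_wk": 29}))
# -> [('birth_weight_g', 1800, 'outside 200-1500')]
```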

  6. Data Management and Site-Visit Monitoring of the Multi-Center Registry in the Korean Neonatal Network.

    PubMed

    Choi, Chang Won; Park, Moon Sung

    2015-10-01

    The Korean Neonatal Network (KNN), a nationwide prospective registry of very-low-birth-weight (VLBW, < 1,500 g at birth) infants, was launched in April 2013. Data management (DM) and site-visit monitoring (SVM) were crucial in ensuring the quality of the data collected from 55 participating hospitals across the country on 116 clinical variables. We describe the processes and results of DM and SVM performed during the establishment stage of the registry. The DM procedure included automated proof checks, electronic data validation, query creation, query resolution, and revalidation of the corrected data. SVM included SVM team organization, identification of unregistered cases, source document verification, and post-visit report production. By March 31, 2015, 4,063 VLBW infants were registered and 1,693 queries were produced. Of these, 1,629 queries were resolved and 64 queries remain unresolved. By November 28, 2014, 52 participating hospitals were visited, with 136 site-visits completed since April 2013. Each participating hospital was visited biannually. DM and SVM were performed to ensure the quality of the data collected for the KNN registry. Our experience with DM and SVM can be applied for similar multi-center registries with large numbers of participating centers. PMID:26566353

  7. Levy Flights and Earthquakes

    E-print Network

    O. Sotolongo-Costa; J. C. Antoranz; A. Posadas; F. Vidal; A. Vazquez

    2002-05-27

    A Levy flight representation is proposed to describe earthquake characteristics such as the distribution of waiting times and the positions of hypocenters in a seismic region. Over 7,500 microearthquakes and earthquakes from 1985 to 1994 were analyzed to test whether their spatial and temporal distributions can be described by a Levy flight with anomalous diffusion (in this case in a subdiffusive regime). As the results show, earthquake behavior is well described by Levy flights and Levy distribution functions.
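
    As a rough illustration of the heavy-tailed waiting times a Levy-flight description is meant to capture (not the authors' code; the tail exponent and sample size below are arbitrary assumptions):

```python
# Sketch: draw heavy-tailed (Pareto/Levy-like) waiting times between events
# and compare their spread with exponential waiting times of the same mean.
# The tail exponent and sample size are illustrative choices only.
import numpy as np

rng = np.random.default_rng(0)
alpha = 1.5                                   # assumed tail exponent, 1 < alpha < 2
n = 7500                                      # roughly the number of events analyzed
levy_like = rng.pareto(alpha, n) + 1.0        # heavy-tailed waiting times
poisson_like = rng.exponential(levy_like.mean(), n)   # memoryless reference

for name, w in [("heavy-tailed", levy_like), ("exponential", poisson_like)]:
    print(f"{name:12s} mean={w.mean():8.2f}  99th pct={np.percentile(w, 99):8.2f}")
# The heavy-tailed sample shows a far larger 99th percentile relative to its
# mean, the kind of behavior a Levy-flight description is intended to capture.
```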

  8. Project Title: Earthquake Documentary Interviews

    E-print Network

    Hickman, Mark

    Project Title: Earthquake Documentary Interviews (Bachelor of Arts Internship). Company: Chris Thomson. Academic Adviser: Mary Wiles. Project Reference Number: S112/CEISMIC/29/NP - Earthquake ... collected on the earthquake, its survivors and their stories. This project is unique ...

  9. Earthquakes of the Holocene.

    USGS Publications Warehouse

    Schwartz, D.P.

    1987-01-01

    Areas in which significant new data and insights have been obtained are: 1) fault slip rates; 2) earthquake recurrence models; 3) fault segmentation; 4) dating past earthquakes; 5) paleoseismicity in the eastern and central US; 6) folds and earthquakes; and 7) future earthquake behavior. Summarizes important trends in each of these research areas based on information published between June 1982 and June 1986 and preprints of papers in press. The bibliography for this period contains mainly refereed publications in journals and books.-from Author

  10. Earthquakes and Plate Boundaries

    ERIC Educational Resources Information Center

    Lowman, Paul; And Others

    1978-01-01

    Contains the contents of the Student Investigation booklet of a Crustal Evolution Education Project (CEEP) instructional modules on earthquakes. Includes objectives, procedures, illustrations, worksheets, and summary questions. (MA)

  11. Are Earthquake Magnitudes Clustered?

    SciTech Connect

    Davidsen, Joern; Green, Adam

    2011-03-11

    The question of earthquake predictability is a long-standing and important challenge. Recent results [Phys. Rev. Lett. 98, 098501 (2007); ibid.100, 038501 (2008)] have suggested that earthquake magnitudes are clustered, thus indicating that they are not independent in contrast to what is typically assumed. Here, we present evidence that the observed magnitude correlations are to a large extent, if not entirely, an artifact due to the incompleteness of earthquake catalogs and the well-known modified Omori law. The latter leads to variations in the frequency-magnitude distribution if the distribution is constrained to those earthquakes that are close in space and time to the directly following event.
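
    The mechanism described, short-term catalog incompleteness after larger shocks producing apparent correlation between successive magnitudes, can be illustrated with a toy simulation (all thresholds and the b-value below are assumptions for illustration, not the authors' analysis):

```python
# Toy illustration: magnitudes drawn independently from a Gutenberg-Richter
# distribution, but events directly following a large shock are only cataloged
# above a raised completeness threshold (mimicking post-mainshock incompleteness).
# The b-value and threshold rule are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
b = 1.0
mags = 2.0 + rng.exponential(1.0 / (b * np.log(10)), 100_000)   # G-R magnitudes

kept = [mags[0]]
for m in mags[1:]:
    mc = 2.0 + 0.8 * max(0.0, kept[-1] - 3.0)   # raised threshold after big events
    if m >= mc:
        kept.append(m)
kept = np.array(kept)

full_corr = np.corrcoef(mags[:-1], mags[1:])[0, 1]
kept_corr = np.corrcoef(kept[:-1], kept[1:])[0, 1]
print(f"correlation of successive magnitudes, full catalog:       {full_corr:+.3f}")
print(f"correlation of successive magnitudes, incomplete catalog: {kept_corr:+.3f}")
```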

  12. Monitoring of fungal loads in seabird rehabilitation centers with comparisons to natural seabird environments in northern California.

    PubMed

    Burco, Julia D; Massey, J Gregory; Byrne, Barbara A; Tell, Lisa; Clemons, Karl V; Ziccardi, Michael H

    2014-03-01

    Aspergillosis remains a major cause of mortality in captive and rehabilitated seabirds. To date, there has been poor documentation of fungal (particularly Aspergillus spp.) burdens in natural seabird loafing and roosting sites compared with fungal numbers in rehabilitation or captive settings and the various microenvironments that seabirds are exposed to during the rehabilitation process. This study compares fungal, particularly Aspergillus spp., burdens potentially encountered by seabirds in natural and rehabilitation environments. Differences among the various microenvironments in the rehabilitation facility were evaluated to determine the risk of infection when seabirds are experiencing high stress and poor immune function. Aspergillus spp. counts were quantified in three wildlife rehabilitation centers and five natural seabird loafing and roosting sites in northern California using a handheld impact air sampler and a water filtration system. Wildlife rehabilitation centers demonstrated an increase in numbers of conidia of Aspergillus spp. and Aspergillus fumigatus in air and water samples from select aquatic bird rehabilitation centers compared with natural seabird environments in northern California. Various microenvironments in the rehabilitation facility were identified as having higher numbers of conidia of Aspergillus spp. These results suggest that periodic monitoring of multiple local areas, where the birds spend time in a rehabilitation facility, should be done to identify "high risk" sites, where birds should spend minimal time, or sites that should be cleaned more frequently or have improved air flow to reduce exposure to fungal conidia. Overall, these results suggest that seabirds may be more likely to encounter Aspergillus spp. in various microenvironments in captivity, compared with their native habitats, which could increase their risk of developing disease when in a debilitated state. PMID:24712159

  13. The loma prieta, california, earthquake: an anticipated event.

    PubMed

    1990-01-19

    The first major earthquake on the San Andreas fault since 1906 fulfilled a long-term forecast for its rupture in the southern Santa Cruz Mountains. Severe damage occurred at distances of up to 100 kilometers from the epicenter in areas underlain by ground known to be hazardous in strong earthquakes. Stronger earthquakes will someday strike closer to urban centers in the United States, most of which also contain hazardous ground. The Loma Prieta earthquake demonstrated that meaningful predictions can be made of potential damage patterns and that, at least in well-studied areas, long-term forecasts can be made of future earthquake locations and magnitudes. Such forecasts can serve as a basis for action to reduce the threat major earthquakes pose to the United States. PMID:17735847

  14. Earthquake-induced water-level fluctuations at Yucca Mountain, Nevada, June 1992

    SciTech Connect

    O'Brien, G.M.

    1993-07-01

    This report presents earthquake-induced water-level and fluid-pressure data for wells in the Yucca Mountain area, Nevada, during June 1992. Three earthquakes occurred which caused significant water-level and fluid-pressure responses in wells. Wells USW H-5 and USW H-6 are continuously monitored to detect short-term responses caused by earthquakes. Two wells, monitored hourly, had significant, longer-term responses in water level following the earthquakes. On June 28, 1992, a 7.5-magnitude earthquake occurred near Landers, California, causing an estimated maximum water-level change of 90 centimeters in well USW H-5. Three hours later a 6.6-magnitude earthquake occurred near Big Bear Lake, California; the maximum water-level fluctuation was 20 centimeters in well USW H-5. A 5.6-magnitude earthquake occurred at Little Skull Mountain, Nevada, on June 29, approximately 23 kilometers from Yucca Mountain. The maximum estimated short-term water-level fluctuation from the Little Skull Mountain earthquake was 40 centimeters in well USW H-5. The water level in well UE-25p #1, monitored hourly, decreased approximately 50 centimeters over 3 days following the Little Skull Mountain earthquake. The water level in UE-25p #1 returned to pre-earthquake levels in approximately 6 months. The water level in the lower interval of well USW H-3 increased 28 centimeters following the Little Skull Mountain earthquake. The Landers and Little Skull Mountain earthquakes caused responses in 17 intervals of 14 hourly monitored wells; however, most responses were small and of short duration. For several days following the major earthquakes, many smaller magnitude aftershocks occurred, causing measurable responses in the continuously monitored wells.
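
    One simple way to flag step-like, earthquake-related offsets in hourly water-level records is to compare short windows before and after each sample. The sketch below uses synthetic data and an arbitrary 10 cm threshold; it is not the report's method:

```python
# Sketch: flag a step-like offset in an hourly water-level series by comparing
# medians in windows before and after each sample. The synthetic series and
# the 10 cm threshold are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
hours = np.arange(24 * 30)                      # one month of hourly samples
level_cm = rng.normal(0.0, 2.0, hours.size)     # background noise (cm)
level_cm[400:] -= 50.0                          # simulated 50 cm co-seismic drop

def step_offsets(series, half_window=24, threshold_cm=10.0):
    hits = []
    for i in range(half_window, series.size - half_window):
        before = np.median(series[i - half_window:i])
        after = np.median(series[i:i + half_window])
        if abs(after - before) >= threshold_cm:
            hits.append((i, after - before))
    return hits

hits = step_offsets(level_cm)
print(f"first flagged hour: {hits[0][0]}, offset ~ {hits[0][1]:.1f} cm")
```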

  15. 4th International Conference on Earthquake Engineering Taipei, Taiwan

    E-print Network

    Lynch, Jerome P.

    Center for Research on Earthquake Engineering (NCREE) in Taiwan. Emphasis is placed on using simulation ... 4th International Conference on Earthquake Engineering, Taipei, Taiwan, October 12-13, 2006, Paper No. 175: EXPERIMENTS AND SIMULATION OF REINFORCED CONCRETE BUILDINGS SUBJECTED TO REVERSED CYCLIC ...

  16. 76 FR 61115 - Migrant and Seasonal Farmworkers (MSFWs) Monitoring Report and One-Stop Career Center Complaint...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-03

    ... and Training Administration Migrant and Seasonal Farmworkers (MSFWs) Monitoring Report and One-Stop... minimized, collection instruments are clearly understood, and the impact of collection requirements on... record keeping requirements to allow for the efficient and effective monitoring of State...

  17. Historical earthquakes in Libya

    NASA Astrophysics Data System (ADS)

    Suleiman, A. S.

    2003-04-01

    As a result of the relative motion of the African and European plates, Libya, located at the north-central margin of the African continent, has experienced considerable intraplate tectonism, particularly in its northern coastal regions. In this study I present a reevaluation of the seismicity of Libya with special focus on the historical seismicity. Data on historical seismicity are of crucial importance for seismic hazard assessment in Libya. The earliest records of earthquakes in Libya date back to the Roman period, when two large earthquakes (262 A.D. and 365 A.D.) destroyed most of the temples and public buildings of Cyrene. A number of earthquakes affected Libya in the Middle Ages, including the 704 A.D. earthquake of Sabha (southern Libya), which reportedly destroyed several towns and villages. In 1183 A.D., a powerful earthquake destroyed Tripoli, killing 20,000 people. Mild tremors were felt in Tripoli in 1803, 1811 and 1903 A.D. The Hun Graben area has been the site of several earthquakes through history; on April 19, 1935, a great earthquake (mb=7.1) hit this area, followed by a very large number of aftershocks, including two of magnitudes 6.0 and 6.5 on the Richter scale. In 1941 a major earthquake of magnitude 5.6 hit the Hun Graben area. In 1939 an earthquake of magnitude 5.6 occurred in the Gulf of Sirt area, followed by a number of aftershocks. Reinterpretation and improvement of the source quality for selected earthquakes will be presented. The present study aims to focus on investigating the original sources of information and on developing a historical earthquake database.

  18. A Nevadan's guide to preparing for, surviving, and recovering from an earthquake

    E-print Network

    A Nevadan's guide to preparing for, surviving, and recovering from an earthquake. Put the odds ... Reno: Nevada Bureau of Mines and Geology. Nevada Earthquake Safety Council; Nevada Bureau of Mines ...; ... Management Agency; United States Geological Survey; Southern California Earthquake Center. Science Background ...

  19. Benefits of Earthquake Early Warning to Large Municipalities (Invited)

    NASA Astrophysics Data System (ADS)

    Featherstone, J.

    2013-12-01

    The City of Los Angeles has been involved in the testing of the Cal Tech Shake Alert, Earthquake Early Warning (EQEW) system, since February 2012. This system accesses a network of seismic monitors installed throughout California. The system analyzes and processes seismic information, and transmits a warning (audible and visual) when an earthquake occurs. In late 2011, the City of Los Angeles Emergency Management Department (EMD) was approached by Cal Tech regarding EQEW, and immediately recognized the value of the system. Simultaneously, EMD was in the process of finalizing a report by a multi-discipline team that visited Japan in December 2011, which spoke to the effectiveness of EQEW for the March 11, 2011 earthquake that struck that country. Information collected by the team confirmed that the EQEW systems proved to be very effective in alerting the population of the impending earthquake. The EQEW in Japan is also tied to mechanical safeguards, such as the stopping of high-speed trains. For a city the size and complexity of Los Angeles, the implementation of a reliable EQEW system will save lives, reduce loss, ensure effective and rapid emergency response, and will greatly enhance the ability of the region to recover from a damaging earthquake. The current Shake Alert system is being tested at several governmental organizations and private businesses in the region. EMD, in cooperation with Cal Tech, identified several locations internal to the City where the system would have an immediate benefit. These include the staff offices within EMD, the Los Angeles Police Department's Real Time Analysis and Critical Response Division (24 hour crime center), and the Los Angeles Fire Department's Metropolitan Fire Communications (911 Dispatch). All three of these agencies routinely manage the collaboration and coordination of citywide emergency information and response during times of crisis. Having these three key public safety offices connected and included in the early testing of an EQEW system will help shape the EQEW policy which will determine the seismic safety of millions of Californians in the years to come.

  20. Crowd-Sourced Global Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Minson, S. E.; Brooks, B. A.; Glennie, C. L.; Murray, J. R.; Langbein, J. O.; Owen, S. E.; Iannucci, B. A.; Hauser, D. L.

    2014-12-01

    Although earthquake early warning (EEW) has shown great promise for reducing loss of life and property, it has only been implemented in a few regions due, in part, to the prohibitive cost of building the required dense seismic and geodetic networks. However, many cars and consumer smartphones, tablets, laptops, and similar devices contain low-cost versions of the same sensors used for earthquake monitoring. If a workable EEW system could be implemented based on either crowd-sourced observations from consumer devices or very inexpensive networks of instruments built from consumer-quality sensors, EEW coverage could potentially be expanded worldwide. Controlled tests of several accelerometers and global navigation satellite system (GNSS) receivers typically found in consumer devices show that, while they are significantly noisier than scientific-grade instruments, they are still accurate enough to capture displacements from moderate and large magnitude earthquakes. The accuracy of these sensors varies greatly depending on the type of data collected. Raw coarse acquisition (C/A) code GPS data are relatively noisy. These observations have a surface displacement detection threshold approaching ~1 m and would thus only be useful in large Mw 8+ earthquakes. However, incorporating either satellite-based differential corrections or using a Kalman filter to combine the raw GNSS data with low-cost acceleration data (such as from a smartphone) decreases the noise dramatically. These approaches allow detection thresholds as low as 5 cm, potentially enabling accurate warnings for earthquakes as small as Mw 6.5. Simulated performance tests show that, with data contributed from only a very small fraction of the population, a crowd-sourced EEW system would be capable of warning San Francisco and San Jose of a Mw 7 rupture on California's Hayward fault and could have accurately issued both earthquake and tsunami warnings for the 2011 Mw 9 Tohoku-oki, Japan earthquake.
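
    The sensor-fusion idea in the abstract, combining noisy consumer GNSS displacements with accelerometer data, can be illustrated with a one-dimensional constant-acceleration Kalman filter. All noise levels and the sample interval below are assumptions for illustration, not calibrated values from the study:

```python
# Sketch of the fusion idea: a 1-D Kalman filter with state [displacement,
# velocity], driven by a (noisy) accelerometer input and corrected by a
# (noisier) GNSS displacement observation. Noise levels are assumed values.
import numpy as np

dt = 0.1                                      # sample interval (s), assumed
F = np.array([[1.0, dt], [0.0, 1.0]])         # state transition
B = np.array([[0.5 * dt ** 2], [dt]])         # acceleration input matrix
H = np.array([[1.0, 0.0]])                    # GNSS observes displacement only
Q = 0.05 * B @ B.T                            # process noise from accel errors
R = np.array([[0.25]])                        # GNSS variance, (0.5 m)^2 assumed

x = np.zeros((2, 1))                          # [displacement (m), velocity (m/s)]
P = np.eye(2)

def kalman_step(x, P, accel, gnss_disp):
    # Predict with the accelerometer, then correct with the GNSS position.
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    y = np.array([[gnss_disp]]) - H @ x       # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Example: ~0.2 g of shaking for one second while GNSS reads about 0.1 m.
for _ in range(10):
    x, P = kalman_step(x, P, accel=2.0, gnss_disp=0.1)
print(f"fused displacement estimate: {x[0, 0]:.3f} m")
```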

  1. Earthquakes and Schools

    ERIC Educational Resources Information Center

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  2. Earthquake sound perception

    NASA Astrophysics Data System (ADS)

    Tosi, Patrizia; Sbarra, Paola; De Rubeis, Valerio

    2012-12-01

    Sound is an effect produced by almost all earthquakes. Using a web-based questionnaire on earthquake effects that included questions relating to seismic sound, we collected 77,000 responses for recent shallow Italian earthquakes. An analysis of audibility attenuation indicated that the decrease in the percentage of respondents hearing the sound was proportional to the logarithm of the epicentral distance and linearly dependent on earthquake magnitude, in accordance with the behavior of ground displacement. Even though this result was based on Italian data, qualitative agreement with the results of theoretical displacement and with a similar study based on French seismicity suggests wider validity. We also found that, for a given earthquake magnitude, audibility increased together with the observed macroseismic intensity, leading to the possibility of accounting for sound audibility in intensity assessment. Magnitude influenced this behavior, making small events easier to recognize, as suggested by their frequency content.
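
    The reported attenuation behavior can be written, in hedged form, as a simple relation between the percentage of respondents hearing sound, log distance, and magnitude. The coefficients below are invented placeholders, not the paper's fitted values:

```python
# Illustrative (not fitted) form of the audibility relation described above:
# the percentage of respondents hearing sound decreases with log10(distance)
# and increases with magnitude. a, b, c are placeholder coefficients.
import math

def audibility_percent(epicentral_distance_km, magnitude, a=40.0, b=35.0, c=12.0):
    """Percentage of respondents reporting earthquake sound (clipped to 0-100)."""
    p = a - b * math.log10(max(epicentral_distance_km, 1.0)) + c * (magnitude - 4.0)
    return min(100.0, max(0.0, p))

for dist in (5, 20, 80):
    print(dist, "km ->", round(audibility_percent(dist, magnitude=4.5), 1), "%")
```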

  3. Where was the 1898 Mare Island Earthquake? Insights from the 2014 South Napa Earthquake

    NASA Astrophysics Data System (ADS)

    Hough, S. E.

    2014-12-01

    The 2014 South Napa earthquake provides an opportunity to reconsider the Mare Island earthquake of 31 March 1898, which caused severe damage to buildings at a Navy yard on the island. Revisiting archival accounts of the 1898 earthquake, I estimate a lower intensity magnitude, 5.8, than the value in the current Uniform California Earthquake Rupture Forecast (UCERF) catalog (6.4). However, I note that intensity magnitude can differ from Mw by upwards of half a unit depending on stress drop, which for a historical earthquake is unknowable. In the aftermath of the 2014 earthquake, there has been speculation that the apparently severe effects on Mare Island in 1898 were due to the vulnerability of local structures. No surface rupture has ever been identified from the 1898 event, which is commonly associated with the Hayward-Rodgers Creek fault system, some 10 km west of Mare Island (e.g., Parsons et al., 2003). Reconsideration of detailed archival accounts of the 1898 earthquake, together with a comparison of the intensity distributions for the two earthquakes, points to genuinely severe, likely near-field ground motions on Mare Island. The 2014 earthquake did cause significant damage to older brick buildings on Mare Island, but the level of damage does not match the severity of documented damage in 1898. The high-intensity fields for the two earthquakes are, moreover, spatially shifted, with the centroid of the 2014 distribution near the town of Napa and that of the 1898 distribution near Mare Island, east of the Hayward-Rodgers Creek system. I conclude that the 1898 Mare Island earthquake was centered on or near Mare Island, possibly involving rupture of one or both strands of the Franklin fault, a low-slip-rate fault sub-parallel to the Rodgers Creek fault to the west and the West Napa fault to the east. I estimate Mw 5.8 assuming an average stress drop; the data are also consistent with Mw 6.4 if the stress drop was a factor of ~3 lower than average for California earthquakes. I present intensity observations from the 2014 South Napa earthquake that suggest that it may have been a low stress drop event.

  4. Implications for earthquake risk reduction in the United States from the Kocaeli, Turkey, earthquake of August 17, 1999

    USGS Publications Warehouse

    U.S. Geological Survey

    2000-01-01

    This report documents implications for earthquake risk reduction in the U.S. The magnitude 7.4 earthquake caused 17,127 deaths, 43,953 injuries, and displaced more than 250,000 people from their homes. The report warns that similar disasters are possible in the United States where earthquakes of comparable size strike the heart of American urban areas. Another concern described in the report is the delayed emergency response that was caused by the inadequate seismic monitoring system in Turkey, a problem that contrasts sharply with rapid assessment and response to the September Chi-Chi earthquake in Taiwan. Additionally, the experience in Turkey suggests that techniques for forecasting earthquakes may be improving.

  5. 1 INTRODUCTION Korea has a long history of earthquakes. Earthquake

    E-print Network

    Spencer Jr., Billie F.

    1 INTRODUCTION Korea has a long history of earthquakes. Earthquake events are well documented by those historic and recent earthquakes was not very high, and it is believed that Korea belongs to a low to moderate seismicity zone. However, after the Northridge and Kobe earthquakes, there was a growing concern

  6. Harnessing the Collective Power of Eyewitnesses for Improved Earthquake Information

    NASA Astrophysics Data System (ADS)

    Bossu, R.; Lefebvre, S.; Mazet-Roux, G.; Steed, R.

    2013-12-01

    The Euro-Med Seismological Centre (EMSC) operates the second global earthquake information website (www.emsc-csem.org), which attracts 2 million visits a month from about 200 different countries. We collect information about earthquakes' effects from eyewitnesses, such as online questionnaires and geolocated pictures, to rapidly constrain the impact scenario. At the beginning, the collection was purely intended to address a scientific issue: the rapid evaluation of an earthquake's impact. However, it rapidly appeared that understanding eyewitnesses' expectations and motivations in the immediate aftermath of an earthquake was essential to optimise this data collection. Crowdsourcing information on an earthquake's effects does not apply to a pre-existing community. By definition, eyewitnesses only exist once the earthquake has struck. We developed a strategy on social networks (Facebook, Google+, Twitter...) to interface with spontaneously emerging online communities of eyewitnesses. The basic idea is to create a positive feedback loop: attract eyewitnesses and engage with them by providing expected earthquake information and services, collect their observations, and collate them for improved earthquake information services to attract more witnesses. We will present recent examples to illustrate how important the use of social networks is to engage with eyewitnesses, especially in regions of low seismic activity where people are unaware of existing Internet resources dealing with earthquakes. A second type of information collated in our information services is derived from the real time analysis of the traffic on our website in the first minutes following an earthquake occurrence, an approach named flashsourcing. We show, using the example of the Mineral, Virginia earthquake, that the arrival times of eyewitnesses on our website follow the propagation of the generated seismic waves and, therefore, that eyewitnesses can be considered ground motion sensors. Flashsourcing discriminates felt earthquakes within, on average, 90 s of their occurrence and can map, in certain cases, the damaged areas. Thanks to the flashsourced and crowdsourced information, we developed an innovative Twitter earthquake information service (currently under test and to be opened by November) which intends to offer notifications only for earthquakes that matter to the public. It provides timely information for felt and damaging earthquakes regardless of their magnitude, and a heads-up for seismologists. In conclusion, the experience developed at the EMSC demonstrates the benefit of involving eyewitnesses in earthquake surveillance. The data collected directly and indirectly from eyewitnesses complement information derived from monitoring networks and contribute to improved services. By increasing interaction between science and society, it opens new opportunities for raising awareness about seismic hazard.
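
    Flashsourcing, as described, amounts to detecting a sudden surge of visits to an earthquake-information website. A minimal sketch of such a surge detector follows (synthetic hit counts and an arbitrary factor-of-5 threshold; this is not EMSC's production code):

```python
# Sketch of "flashsourcing": flag a felt event when per-second website hits
# jump well above the recent background rate. Counts and thresholds are
# illustrative assumptions.
from collections import deque

def detect_surge(hit_counts, background_window=300, factor=5.0, min_hits=50):
    """Yield (second, hits, background) where traffic surges above background."""
    recent = deque(maxlen=background_window)
    for t, hits in enumerate(hit_counts):
        background = (sum(recent) / len(recent)) if recent else 0.0
        if recent and hits > max(min_hits, factor * background):
            yield t, hits, background
        recent.append(hits)

# Synthetic example: ~10 hits/s background, then a jump to ~200 hits/s.
counts = [10] * 600 + [200] * 30
for t, hits, bg in detect_surge(counts):
    print(f"surge at t={t}s: {hits} hits/s vs background ~{bg:.0f}")
    break
```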

  7. Istanbul Earthquake Early Warning and Rapid Response System

    NASA Astrophysics Data System (ADS)

    Erdik, M. O.; Fahjan, Y.; Ozel, O.; Alcik, H.; Aydin, M.; Gul, M.

    2003-12-01

    As part of the preparations for the future earthquake in Istanbul, a Rapid Response and Early Warning system in the metropolitan area is in operation. For the Early Warning system, ten strong motion stations were installed as close as possible to the fault zone. Continuous on-line data from these stations via digital radio modem provide early warning for potentially disastrous earthquakes. Considering the complexity of fault rupture and the short fault distances involved, a simple and robust Early Warning algorithm, based on the exceedance of specified threshold time-domain amplitude levels, is implemented. The band-pass filtered accelerations and the cumulative absolute velocity (CAV) are compared with specified threshold levels. When any acceleration or CAV (on any channel) at a given station exceeds specific threshold values, it is considered a vote. Whenever there are two station votes within a selectable time interval after the first vote, the first alarm is declared. In order to specify the appropriate threshold levels, a data set of near-field strong ground motion records from Turkey and the world has been analyzed. Correlations among these thresholds in terms of the epicenter distance and the magnitude of the earthquake have been studied. The encrypted early warning signals will be communicated to the respective end users by UHF systems through a "service provider" company. The users of the early warning signal will be power and gas companies, nuclear research facilities, critical chemical factories, the subway system and several high-rise buildings. Depending on the location of the earthquake (initiation of fault rupture) and the recipient facility, the alarm time can be as high as about 8 s. For the rapid response system, one hundred 18-bit-resolution strong motion accelerometers were placed in quasi-free field locations (basements of small buildings) in the populated areas of the city, within an area of approximately 50x30 km, to constitute a network that will enable early damage assessment and rapid response information after a damaging earthquake. Early response information is achieved through fast acquisition and analysis of processed data obtained from the network. The stations are routinely interrogated on a regular basis by the main data center. After being triggered by an earthquake, each station processes the streaming strong motion data to yield the spectral accelerations at specific periods, 12 Hz filtered PGA and PGV, and sends these parameters in the form of SMS messages every 20 s directly to the main data center through a designated GSM network and through a microwave system. A shake map and damage distribution map (using aggregate building inventories and fragility curves) will be automatically generated using the algorithm developed for this purpose. Loss assessment studies are complemented by a large citywide digital database on the topography, geology, soil conditions, building, infrastructure and lifeline inventory. The shake and damage maps will be conveyed to the governor's and mayor's offices, and to fire, police and army headquarters, within 3 minutes using radio modem and GPRS communication. An additional forty strong motion recorders were placed on important structures in several interconnected clusters to monitor the health of these structures after a damaging earthquake.
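
    The voting logic described (a station votes when filtered acceleration or CAV exceeds a threshold, and an alarm is declared when two stations vote within a short interval) can be sketched as follows. The threshold values, sample rate and synthetic station data are placeholders, not the Istanbul system's operational settings:

```python
# Sketch of the described voting logic: a station "votes" when acceleration or
# cumulative absolute velocity (CAV) exceeds a threshold; an alarm is declared
# when two stations vote within a short interval. All numbers are assumptions.
import numpy as np

DT = 0.01                       # sample interval (s), assumed 100 Hz
ACC_THRESHOLD = 0.05 * 9.81     # m/s^2, placeholder
CAV_THRESHOLD = 0.10            # m/s, placeholder
VOTE_WINDOW = 5.0               # s within which two votes trigger an alarm

def station_vote_time(acc):
    """Return the earliest time (s) at which this station votes, or None."""
    cav = np.cumsum(np.abs(acc)) * DT
    exceed = (np.abs(acc) > ACC_THRESHOLD) | (cav > CAV_THRESHOLD)
    return float(np.argmax(exceed)) * DT if exceed.any() else None

def alarm_time(station_records):
    votes = sorted(v for v in (station_vote_time(a) for a in station_records) if v is not None)
    for first, second in zip(votes, votes[1:]):
        if second - first <= VOTE_WINDOW:
            return second                      # alarm on the second vote
    return None

# Synthetic example: two stations see strong shaking ~1 s apart, one stays quiet.
t = np.arange(0, 20, DT)
quiet = np.random.default_rng(3).normal(0, 0.002, t.size)
strong1 = quiet + 1.0 * np.exp(-((t - 6.0) ** 2)) * np.sin(2 * np.pi * 2 * t)
strong2 = quiet + 1.0 * np.exp(-((t - 7.0) ** 2)) * np.sin(2 * np.pi * 2 * t)
print("alarm declared at t =", alarm_time([quiet, strong1, strong2]), "s")
```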

  8. Development of a telecare system based on ZigBee mesh network for monitoring blood pressure of patients with hemodialysis in health care centers.

    PubMed

    Du, Yi-Chun; Lee, You-Yun; Lu, Yun-Yuan; Lin, Chia-Hung; Wu, Ming-Jei; Chen, Chung-Lin; Chen, Tainsong

    2011-10-01

    In Taiwan, the number of patients needing dialysis has increased rapidly in recent years. Because there is risk in every hemodialysis session, monitoring of physiological status, such as a blood pressure measurement every 30 min to 1 h, is needed during the roughly 4 h hemodialysis process. Therefore, assisted measurement of blood pressure is needed in dialysis care centers. The telecare system (TCS) is regarded as an important technique in medical care. In this study, we utilized the ZigBee wireless technique to establish a mesh network for monitoring blood pressure automatically, with data storage in a medical record system for display and further analysis. Moreover, when the blood pressure exceeds the normal range, the system can send a warning signal to remind, or inform, the relatives and clinicians in the health care center through the personal handy-phone system (PHS) immediately. The proposed system provides an assisted device for monitoring patients' blood pressure during the hemodialysis process and saves medical manpower. PMID:20703683
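
    A minimal, hypothetical sketch of the alerting rule described (flag a measurement outside a normal range and notify care staff); the ranges and the notification hook below are assumptions, not values from the paper:

```python
# Hypothetical sketch of the telecare alert rule: if an automatic blood
# pressure reading falls outside an allowed range, raise a warning that a
# real system would forward (e.g., over PHS) to clinicians. Ranges assumed.
NORMAL_SYSTOLIC = (90, 180)     # mmHg, placeholder limits
NORMAL_DIASTOLIC = (50, 100)    # mmHg, placeholder limits

def check_reading(systolic, diastolic, notify=print):
    out_of_range = not (NORMAL_SYSTOLIC[0] <= systolic <= NORMAL_SYSTOLIC[1]) or \
                   not (NORMAL_DIASTOLIC[0] <= diastolic <= NORMAL_DIASTOLIC[1])
    if out_of_range:
        notify(f"WARNING: blood pressure {systolic}/{diastolic} mmHg outside normal range")
    return out_of_range

check_reading(82, 48)   # would trigger a warning in this sketch
```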

  9. Can earthquakes be predicted?

    E-print Network

    Felzer, Karen

    Can earthquakes be predicted? Karen Felzer, U.S. Geological Survey. Earthquake predictions that most seismologists agree with: long-term earthquake probabilities. These kinds of predictions ... to duck and cover! >99% chance that a M 6.7 earthquake will occur in CA within 30 years. 2008 Working ...

  10. Post-Sumatra Enhancements at the Pacific Tsunami Warning Center

    NASA Astrophysics Data System (ADS)

    McCreery, C.; Weinstein, S.; Becker, N.; Cessaro, R.; Hirshorn, B.; Fryer, G.; Hsu, V.; Sardina, V.; Koyanagi, S.; Shiro, B.; Wang, D.; Walsh, D.

    2007-12-01

    Following the tragic Indian Ocean Tsunami of 2004, the Richard Hagemeyer Pacific Tsunami Warning Center (PTWC) has dramatically enhanced its capabilities. With improved communications PTWC now ingests seismic data from almost all broadband stations of the Global Seismographic Network and will soon add many stations from the International Monitoring System. As data sources are increased PTWC's response time to any earthquake declines; for most earthquakes the center now gets out an initial message in about 12 minutes. With 24-hour staffing, that performance is maintained around the clock. Direct measurement of tsunamis has been improved through communications upgrades to coastal tide gauges by NOAA and other collaborators in the Pacific Tsunami Warning System, and by the NOAA deployment of DART instruments throughout the world's oceans. In addition to providing warnings for the Pacific (with the exception of Alaska and the west coasts of the U.S, and Canada, which are the responsibility of the West Coast and Alaska Tsunami Warning Center), PTWC also operates as an interim warning center for the Indian Ocean (a task performed in collaboration with the Japan Meteorological Agency) and the Caribbean. PTWC also operates as a local warning center for the State of Hawaii. In Hawaii, the installation of new seismometers again means a continuous reduction in PTWC's response times. Initial assessments of local earthquakes are routinely accomplished in less than five minutes, and the first message for the Kiholo Bay Earthquake of 2006 was issued in only three minutes. With the development of the Hawaii Integrated Seismographic Network, in collaboration with the U.S. Geological Survey, the goal is to reduce the time for tsunami warnings to under two minutes for any earthquake in the Hawaiian Islands.

  11. Waveform Cross-Correlation for Improved North Texas Earthquake Locations

    NASA Astrophysics Data System (ADS)

    Phillips, M.; DeShon, H. R.; Oldham, H. R.; Hayward, C.

    2014-12-01

    In November 2013, a sequence of earthquakes began in Reno and Azle, TX, two communities located northwest of Fort Worth in an area of active oil and gas extraction. Only one felt earthquake had been reported within the area before the occurrence of probable injection-induced earthquakes at the Dallas-Fort Worth airport in 2008. The USGS National Earthquake Information Center (NEIC) has reported 27 felt earthquakes in the Reno-Azle area through January 28, 2014. A temporary seismic network was installed beginning in December 2013 to acquire data to improve location and magnitude estimates and characterize the earthquake sequence. Here, we present high-resolution relative earthquake locations derived using differential time data from waveform cross-correlation. Cross-correlation is computed using the GISMO software suite and event relocation is done using double-difference relocation techniques. Waveform cross-correlation of the local data indicates high (>70%) similarity between 4 major swarms of events lasting between 18 and 24 hours. These swarms are temporal zones of high event frequency; 1.4% of the time series data accounts for 42.1% of the identified local earthquakes. Local earthquakes are occurring along the Newark East Fault System, a NE-SW striking normal fault system previously thought inactive at depths between 2 and 8 km in the Ellenburger limestone formation and underlying Precambrian basement. Data analysis is ongoing and continued characterization of the associated fault will provide improved location estimates.
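
    The differential-time measurement underlying the relocation can be illustrated with a plain NumPy cross-correlation between two similar waveforms. This is a generic sketch with synthetic wavelets and an assumed 100 Hz sample rate, not the GISMO or double-difference codes themselves:

```python
# Generic sketch of measuring a differential arrival time between two similar
# event waveforms by cross-correlation (sub-sample refinement ignored).
import numpy as np

FS = 100.0                                     # samples per second (assumed)
t = np.arange(0, 5, 1.0 / FS)

def wavelet(onset):
    return np.exp(-((t - onset) ** 2) / 0.01) * np.sin(2 * np.pi * 10 * (t - onset))

trace_a = wavelet(2.00) + np.random.default_rng(4).normal(0, 0.02, t.size)
trace_b = wavelet(2.13) + np.random.default_rng(5).normal(0, 0.02, t.size)

a = trace_a - trace_a.mean()
b = trace_b - trace_b.mean()
xc = np.correlate(b, a, mode="full")
lag_samples = np.argmax(xc) - (a.size - 1)
cc_max = xc.max() / (np.linalg.norm(a) * np.linalg.norm(b))
print(f"differential time ~ {lag_samples / FS:.2f} s, correlation ~ {cc_max:.2f}")
```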

  12. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    SciTech Connect

    Saragoni, G. Rodolfo

    2008-07-08

    The recent commemoration of the centennial of the 1906 San Francisco and Valparaiso earthquakes has given the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus Messina-Reggio Calabria 1908, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of some buildings still standing today, which survived centennial earthquakes, represents a challenge to better understand the limitations of the earthquake design methods now in use. Of the three centennial earthquakes considered, only the Valparaiso 1906 earthquake has been repeated, as the Central Chile, 1985, Ms = 7.8 earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is carried out in the neighborhood of Valparaiso harbor. In this study the only three centennial buildings of 3 stories that survived both earthquakes almost undamaged were identified. Since, for the 1985 earthquake, accelerograms were recorded at El Almendral soil conditions as well as on rock, the vulnerability analysis of these buildings is done considering instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. Therefore, it is recommended that more suitable instrumental parameters, such as the destructiveness potential factor, be used in the future to describe earthquake demand.

  13. Center for Integration of Natural Disaster Information

    USGS Publications Warehouse

    U.S. Geological Survey

    2001-01-01

    The U.S. Geological Survey's Center for Integration of Natural Disaster Information (CINDI) is a research and operational facility that explores methods for collecting, integrating, and communicating information about the risks posed by natural hazards and the effects of natural disasters. The U.S. Geological Survey (USGS) is mandated by the Robert Stafford Act to warn citizens of impending landslides, volcanic eruptions, and earthquakes. The USGS also coordinates with other Federal, State, and local disaster agencies to monitor threats to communities from floods, coastal storms, wildfires, geomagnetic storms, drought, and outbreaks of disease in wildlife populations.

  14. Fusion of Multi Precursors Earthquake Parameters to Estimate the Date, Magnitude and Affected Area of the Forthcoming Powerful Earthquakes

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.; Saradjian, M. R.

    2012-07-01

    Since no individual precursor can be used as an accurate stand-alone means of earthquake prediction, it is necessary to integrate different kinds of precursors. The precursors selected for analysis in this study include electron and ion density, electron temperature, total electron content (TEC), electric and magnetic fields and land surface temperature (LST) several days before three strong earthquakes which happened in the Samoa Islands, Sichuan (China) and Borujerd (Iran). The precursors' variations were monitored using data obtained from experiments onboard the DEMETER (IAP, ISL, ICE and IMSC) and Aqua-MODIS satellites. Regarding the ionospheric precursors, the geomagnetic indices Dst and Kp were used to distinguish pre-earthquake disturbed states from other anomalies related to geomagnetic activity. The inter-quartile range of the data was utilized to construct upper and lower bounds and to detect disturbed states outside the bounds which might be associated with impending earthquakes. When a disturbed state associated with an impending earthquake is detected, the number of days relative to the earthquake day is estimated based on the type of precursor. Then, based on the deviation of the precursor from the undisturbed state, the magnitude of the impending earthquake is estimated. The radius of the affected area is calculated using the estimated magnitude and the Dobrovolsky formula. In order to assess the final earthquake parameters (date, magnitude and radius of the affected area) for each case study, the approximate bounds of the final parameters are defined using the median and inter-quartile range of the parameters obtained from the different precursors. For each studied case, a good agreement was found between the estimated and registered earthquake parameters.
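
    Two of the quantitative steps described, flagging values outside inter-quartile-range bounds and estimating the radius of the affected (preparation) area from magnitude with the Dobrovolsky relation R = 10^(0.43M) km, can be sketched as follows. The bound multiplier and the synthetic series are assumptions:

```python
# Sketch of two steps mentioned above: (1) IQR-based upper/lower bounds for
# flagging anomalous precursor values, and (2) the Dobrovolsky strain-radius
# estimate R = 10**(0.43*M) km. The k=1.5 multiplier and the synthetic
# TEC-like series are illustrative assumptions.
import numpy as np

def iqr_bounds(values, k=1.5):
    q1, q3 = np.percentile(values, [25, 75])
    iqr = q3 - q1
    return q1 - k * iqr, q3 + k * iqr

def dobrovolsky_radius_km(magnitude):
    return 10.0 ** (0.43 * magnitude)

rng = np.random.default_rng(6)
tec_like = rng.normal(20.0, 2.0, 200)     # quiet-time values (arbitrary units)
tec_like[150] = 35.0                      # one disturbed value
lo, hi = iqr_bounds(tec_like)
anomalies = np.where((tec_like < lo) | (tec_like > hi))[0]
print("anomalous samples:", anomalies)
print(f"Dobrovolsky radius for M6.5: {dobrovolsky_radius_km(6.5):.0f} km")
```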

  15. Cooperative Monitoring Center Occasional Paper/13: Cooperative monitoring for confidence building: A case study of the Sino-Indian border areas

    SciTech Connect

    SIDHU,WAHEGURU PAL SINGH; YUAN,JING-DONG; BIRINGER,KENT L.

    1999-08-01

    This occasional paper identifies applicable cooperative monitoring techniques and develops models for possible application in the context of the border between China and India. The 1993 and 1996 Sino-Indian agreements on maintaining peace and tranquility along the Line of Actual Control (LAC) and establishing certain confidence building measures (CBMs), including force reductions and limitation on military exercises along their common border, are used to examine the application of technically based cooperative monitoring in both strengthening the existing terms of the agreements and also enhancing trust. The paper also aims to further the understanding of how and under what conditions technology-based tools can assist in implementing existing agreements on arms control and confidence building. The authors explore how cooperative monitoring techniques can facilitate effective implementation of arms control agreements and CBMS between states and contribute to greater security and stability in bilateral, regional, and global contexts.

  16. Earthquakes and emergence

    NASA Astrophysics Data System (ADS)

    Earthquakes and emerging infections may not have a direct cause and effect relationship like tax evasion and jail, but new evidence suggests that there may be a link between the two human health hazards. Various media accounts have cited a massive 1993 earthquake in Maharashtra as a potential catalyst of the recent outbreak of plague in India that has claimed more than 50 lives and alarmed the world. The hypothesis is that the earthquake may have uprooted underground rat populations that carry the fleas infected with the bacterium that causes bubonic plague and can lead to the pneumonic form of the disease that is spread through the air.

  17. Earthquake Science ISSN 1674-4519

    E-print Network

    Liu, Mian

    ... assessments. Keywords: Aftershock · Earthquake · Intraplate seismicity · Earthquake hazard. 1 Introduction ... different implications for earthquake hazard assessment in this region, the so-called "Capital Circle" ...

  18. To capture an earthquake

    SciTech Connect

    Ellsworth, W.L. )

    1990-11-01

    An earthquake model based on the theory of plate tectonics is presented. It is assumed that the plates behave elastically in response to slow, steady motions and the strains concentrate within the boundary zone between the plates. When the accumulated stresses exceed the bearing capacity of the rocks, the rocks break, producing an earthquake and releasing the accumulated stresses. As the steady movement of the plates continues, strain begins to reaccumulate. The cycle of strain accumulation and release is modeled using the motion of a block, pulled across a rough surface by a spring. A model earthquake can be predicted by taking into account a precursory event or the peak spring force prior to slip as measured in previous cycles. The model can be applied to faults, e.g., the San Andreas fault, if the past earthquake history of the fault and the rate of strain accumulation are known.
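
    The block-and-spring analogy described can be put into a few lines of code: a block loaded through a spring whose far end moves steadily, slipping whenever the spring force exceeds a static friction threshold. All parameter values below are illustrative, not from the article:

```python
# Sketch of the spring-block earthquake analogy: the load point moves at a
# constant rate, spring force builds, and the block slips (an "earthquake")
# whenever the force exceeds static friction, dropping to a dynamic level.
# Stiffness, friction levels and rates are arbitrary illustrative values.
k = 1.0            # spring stiffness
v_plate = 1.0      # steady loading rate (per step)
f_static = 10.0    # force needed to start slip
f_dynamic = 4.0    # force remaining after slip

load_point = 0.0
block = 0.0
events = []
for step in range(100):
    load_point += v_plate
    force = k * (load_point - block)
    if force >= f_static:
        slip = (force - f_dynamic) / k      # slip until force drops to f_dynamic
        block += slip
        events.append((step, slip))

for step, slip in events[:5]:
    print(f"event at step {step}: slip = {slip:.1f}")
# With constant parameters the cycle repeats regularly, which is why the model
# suggests that measurements from previous cycles can anticipate the next slip.
```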

  19. Earthquakes for Kids

    MedlinePLUS


  20. Nonlinear processes in earthquakes

    SciTech Connect

    Jones, E.M.; Frohlich, C.

    1998-12-31

    This is the final report of a one-year, Laboratory Directed Research and Development (LDRD) project at Los Alamos National Laboratory (LANL). Three-dimensional, elastic-wave-propagation calculations were performed to define the effects of near-source geologic structure on the degree to which seismic signals produced by earthquakes resemble "non-double-couple" sources. Signals from sources embedded in a subducting slab showed significant phase and amplitude differences compared with a "no-slab" case. Modifications to the LANL elastic-wave propagation code enabled improved simulations of path effects on earthquake and explosion signals. These simulations demonstrate that near-source, shallow, low-velocity basins can introduce earthquake-like features into explosion signatures through conversion of compressive (P-wave) energy to shear (S- and R-wave) modes. Earthquake sources simulated to date do not show significant modifications.

  1. Northridge, CA Earthquake Damage

    USGS Multimedia Gallery

    The person in this image was a USGS employee at the time this was taken. Collection of USGS still images taken after the January 17, 1994 Northridge earthquake highlighting the damage to buildings and infrastructure....

  2. Helium soil-gas variations associated with recent central California earthquakes: precursor or coincidence?

    USGS Publications Warehouse

    Reimer, G.M.

    1981-01-01

    Decreases in the helium concentration of soil-gas have been observed to precede six of eight recent central California earthquakes. Ten monitoring stations were established near Hollister, California, and along the San Andreas Fault to permit gas collection. The data showed decreases occurring a few weeks before the earthquakes, and concentrations returned to prequake levels either shortly before or after the earthquakes.-Author

  3. Account of an Earthquake

    E-print Network

    Snying dkar skyid

    2009-11-17

    China/Snying dkar skyid. Tape No. / Track / Item No.: Stag rig 013.WAV. Length of track: 00:03:25. Title of track: Account of an Earthquake. Description (to be used in archive entry): Glu mo mtsho describes her experience of being in a strong earthquake. Genre or type: Personal account. Date of recording: 17 November 2009...

  4. Quantitative Earthquake Prediction on Global and Regional Scales

    SciTech Connect

    Kossobokov, Vladimir G.

    2006-03-23

    The Earth is a hierarchy of volumes of different size. Driven by planetary convection, these volumes are involved in joint and relative movement. The movement is controlled by a wide variety of processes on and around the fractal mesh of boundary zones, and does produce earthquakes. This hierarchy of movable volumes composes a large non-linear dynamical system. Prediction of such a system in the sense of extrapolating its trajectory into the future is futile. However, upon coarse-graining, integral empirical regularities emerge, opening possibilities of prediction in the sense of the commonly accepted consensus definition worked out in 1976 by the US National Research Council. Understanding the hierarchical nature of the lithosphere and its dynamics, based on systematic monitoring and evidence of its unified space-energy similarity at different scales, helps avoid basic errors in earthquake prediction claims. It suggests rules and recipes for adequate earthquake prediction classification, comparison and optimization. The approach has already led to the design of a reproducible intermediate-term middle-range earthquake prediction technique. Its real-time testing, aimed at prediction of the largest earthquakes worldwide, has proved beyond any reasonable doubt the effectiveness of practical earthquake forecasting. To a first approximation, the accuracy is about 1-5 years and 5-10 times the anticipated source dimension. Further analysis allows reducing the spatial uncertainty down to 1-3 source dimensions, although at a cost of additional failures-to-predict. Despite the limited accuracy, considerable damage could be prevented by timely, knowledgeable use of the existing predictions and earthquake prediction strategies. The December 26, 2004 Indian Ocean Disaster seems to be the first indication that the methodology, designed for prediction of M8.0+ earthquakes, can be rescaled for prediction of both smaller magnitude earthquakes (e.g., down to M5.5+ in Italy) and mega-earthquakes of M9.0+. The monitoring at regional scales may require application of a recently proposed scheme for the spatial stabilization of the intermediate-term middle-range predictions. The scheme guarantees a more objective and reliable diagnosis of times of increased probability and is less restrictive to input seismic data. It makes feasible the reestablishment of seismic monitoring aimed at prediction of large-magnitude earthquakes in the Caucasus and Central Asia, which, to our regret, was discontinued in 1991. The first results of the monitoring (1986-1990) were encouraging, at least for M6.5+.

  5. Campi Flegrei Structure From Earthquake Tomography

    NASA Astrophysics Data System (ADS)

    de Luca, G.; de Natale, G.; Benz, H.; Troise, C.; Capuano, P.

    Campi Flegrei caldera is an active volcanic area of Southern Italy characterised, with the neighbouring Mt. Vesuvius, by the highest volcanic risk in the world. Recent episodes of rapid uplift at spectacular rates (1 m per year), accompanied by intense seismicity of low to moderate magnitude (up to ML=4.2), allowed fundamental information about caldera unrest to be collected. Local earthquake arrival times collected by surveillance and temporary seismic networks from 1970 to the present formed the basis for a selected data set consisting of about 450 earthquakes, recorded at a total of 77 seismic station locations, not all operating at the same time. Tomographic inversion, carried out with different ray tracing methods, put in evidence a minimum of Vp, Vs and Vp/Vs located at the center of the caldera. This low velocity anomaly, better resolved here with respect to previous work, mainly because of a much larger sampling of earthquakes and seismic station locations, is likely to mark the lighter pyroclastic rocks filling the innermost caldera collapse. The intensity of the anomalies, and the high value of the Vp/Vs ratio in the central zone, are coherent with a strongly fractured, fluid-filled medium. The detailed knowledge of the velocity structure also allows more precise earthquake locations to be obtained. The relocation in the 3D model of over 1000 local earthquakes gives the most detailed picture of Campi Flegrei seismicity so far obtained. The use of a Bayesian algorithm for earthquake location shows a rather sharp picture of the main seismogenic structures of the area.

  6. Injection-induced earthquakes.

    PubMed

    Ellsworth, William L

    2013-07-12

    Earthquakes in unusual locations have become an important topic of discussion in both North America and Europe, owing to the concern that industrial activity could cause damaging earthquakes. It has long been understood that earthquakes can be induced by impoundment of reservoirs, surface and underground mining, withdrawal of fluids and gas from the subsurface, and injection of fluids into underground formations. Injection-induced earthquakes have, in particular, become a focus of discussion as the application of hydraulic fracturing to tight shale formations is enabling the production of oil and gas from previously unproductive formations. Earthquakes can be induced as part of the process to stimulate the production from tight shale formations, or by disposal of wastewater associated with stimulation and production. Here, I review recent seismic activity that may be associated with industrial activity, with a focus on the disposal of wastewater by injection in deep wells; assess the scientific understanding of induced earthquakes; and discuss the key scientific challenges to be met for assessing this hazard. PMID:23846903

  7. Simulating Earthquakes for Science and Society: Earthquake Visualizations Ideal for use in Science Communication and Education

    NASA Astrophysics Data System (ADS)

    de Groot, R.

    2008-12-01

    The Southern California Earthquake Center (SCEC) has been developing groundbreaking computer modeling capabilities for studying earthquakes. These visualizations were initially shared within the scientific community but have recently gained visibility via television news coverage in Southern California. Computers have opened up a whole new world for scientists working with large data sets, and students can benefit from the same opportunities (Libarkin & Brick, 2002). For example, The Great Southern California ShakeOut was based on a potential magnitude 7.8 earthquake on the southern San Andreas fault. The visualization created for the ShakeOut was a key scientific and communication tool for the earthquake drill. This presentation will also feature SCEC Virtual Display of Objects visualization software developed by SCEC Undergraduate Studies in Earthquake Information Technology interns. According to Gordin and Pea (1995), theoretically visualization should make science accessible, provide means for authentic inquiry, and lay the groundwork to understand and critique scientific issues. This presentation will discuss how the new SCEC visualizations and other earthquake imagery achieve these results, how they fit within the context of major themes and study areas in science communication, and how the efficacy of these tools can be improved.

  8. Charles Darwin's earthquake reports

    NASA Astrophysics Data System (ADS)

    Galiev, Shamil

    2010-05-01

    As it is the 200th anniversary of Darwin's birth, 2009 has also been marked as 170 years since the publication of his book Journal of Researches. During the voyage Darwin landed at Valdivia and Concepcion, Chile, just before, during, and after a great earthquake, which demolished hundreds of buildings, killing and injuring many people. Land was waved, lifted, and cracked, volcanoes awoke and giant ocean waves attacked the coast. Darwin was the first geologist to observe and describe the effects of a great earthquake during and immediately after it. These effects have sometimes been repeated during severe earthquakes; but great earthquakes, like Chile 1835, and giant earthquakes, like Chile 1960, are rare and remain completely unpredictable. This is one of the few areas of science where experts remain largely in the dark. Darwin suggested that the effects were a result of '…the rending of strata, at a point not very deep below the surface of the earth…' and '…when the crust yields to the tension, caused by its gradual elevation, there is a jar at the moment of rupture, and a greater movement...'. Darwin formulated big ideas about the earth's evolution and its dynamics. These ideas set the tone for the tectonic plate theory to come. However, plate tectonics does not completely explain why earthquakes occur within plates. Darwin emphasised that there are different kinds of earthquakes: '...I confine the foregoing observations to the earthquakes on the coast of South America, or to similar ones, which seem generally to have been accompanied by elevation of the land. But, as we know that subsidence has gone on in other quarters of the world, fissures must there have been formed, and therefore earthquakes...' (we cite Darwin's sentences following researchspace.auckland.ac.nz/handle/2292/4474). These thoughts agree with the results of recent publications (see Nature 461, 870-872; 636-639 and 462, 42-43; 87-89). About 200 years ago Darwin thus anticipated problems that have begun to be discussed only recently. Earthquakes often precede volcanic eruptions. According to Darwin, the earthquake-induced shock may be a common mechanism of the simultaneous eruptions of volcanoes separated by long distances. In particular, Darwin wrote that '…the elevation of many hundred square miles of territory near Concepcion is part of the same phenomenon, with that splashing up, if I may so call it, of volcanic matter through the orifices in the Cordillera at the moment of the shock;…'. According to Darwin the crust is a system where fractured zones, and zones of seismic and volcanic activity, interact. Darwin thus formulated the task of considering together the processes now studied as seismology and volcanology. However, the difficulties are such that the study of interactions between earthquakes and volcanoes began only recently, and his work on this had relatively little impact on the development of the geosciences. In this report, we discuss how the latest data on seismic and volcanic events support Darwin's observations and ideas about the 1835 Chilean earthquake. The material from researchspace.auckland.ac.nz/handle/2292/4474 is used. We show how modern mechanical tests from impact engineering and simple experiments with weakly-cohesive materials also support his observations and ideas. On the other hand, we have developed a mathematical theory of earthquake-induced catastrophic wave phenomena. This theory allows us to explain the most important aspects of Darwin's earthquake reports.
This is achieved by simplifying the fundamental governing equations of the problems considered into strongly nonlinear wave equations. Solutions of these equations are constructed with the help of analytic and numerical techniques. The solutions can model different strongly nonlinear wave phenomena that arise in a variety of physical contexts. A comparison with relevant experimental observations is also presented.

  9. An appraisal of aftershocks behavior for large earthquakes in Persia

    NASA Astrophysics Data System (ADS)

    Nemati, Majid

    2014-01-01

    This study focuses on the distribution of aftershocks in both location and magnitude for recent earthquakes in Iran. Forty-three earthquakes are investigated, using data from the global International Seismological Center (ISC) seismic catalogue and from the regional earthquake catalogue of the Institute of Geophysics, University of Tehran (IGUT) for 1961-2006 and 2006-2012, respectively. We only consider earthquakes with magnitude greater than 5.0. The majority of these events are intracontinental, occurring over four seismotectonic provinces across Iran. Processing the aftershock sequences reported by both catalogues, with a cut-off magnitude of 2.5 and a sequence duration of 70 days, leads us to define a spatial horizontal area (A) occupied by the aftershocks as a function of mainshock magnitude (M) for Persian earthquakes: ISC: Log10(A) = 0.45 Ms + 0.23; IGUT: Log10(A) = 0.25 Mn + 1.7.
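
    For reference, the two fitted relations can be evaluated directly; this small sketch simply plugs a magnitude into the regressions quoted above (the abstract does not state the area units, so the results are in whatever units the paper uses):

```python
# Evaluating the aftershock-area relations quoted above for a given mainshock
# magnitude. The area unit is the one used in the paper (not stated in the
# abstract), so results are reported in those same, unspecified units.
def aftershock_area_isc(ms):
    """Log10(A) = 0.45*Ms + 0.23 (ISC catalogue relation)."""
    return 10.0 ** (0.45 * ms + 0.23)

def aftershock_area_igut(mn):
    """Log10(A) = 0.25*Mn + 1.7 (IGUT catalogue relation)."""
    return 10.0 ** (0.25 * mn + 1.7)

m = 6.5
print(f"ISC relation:  A ~ {aftershock_area_isc(m):.0f}")
print(f"IGUT relation: A ~ {aftershock_area_igut(m):.0f}")
```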

  10. Simulating Earthquakes for Science and Society: New Earthquake Visualizations Ideal for Use in Science Communication

    NASA Astrophysics Data System (ADS)

    de Groot, R. M.; Benthien, M. L.

    2006-12-01

    The Southern California Earthquake Center (SCEC) has been developing groundbreaking computer modeling capabilities for studying earthquakes. These visualizations were initially shared within the scientific community but have recently gained visibility via television news coverage in Southern California. These types of visualizations are becoming pervasive in the teaching and learning of concepts related to earth science. Computers have opened up a whole new world for scientists working with large data sets, and students can benefit from the same opportunities (Libarkin & Brick, 2002). Earthquakes are ideal candidates for visualization products: they cannot be predicted, are completed in a matter of seconds, occur deep in the earth, and the time between events can be on a geologic time scale. For example, the southern part of the San Andreas fault has not seen a major earthquake since about 1690, setting the stage for an earthquake as large as magnitude 7.7 -- the "big one." Since no one has experienced such an earthquake, visualizations can help people understand the scale of such an event. Accordingly, SCEC has developed a revolutionary simulation of this earthquake, with breathtaking visualizations that are now being distributed. According to Gordin and Pea (1995), theoretically visualization should make science accessible, provide means for authentic inquiry, and lay the groundwork to understand and critique scientific issues. This presentation will discuss how the new SCEC visualizations and other earthquake imagery achieve these results, how they fit within the context of major themes and study areas in science communication, and how the efficacy of these tools can be improved.

  11. An Atlas of ShakeMaps for Selected Global Earthquakes

    USGS Publications Warehouse

    Allen, Trevor I.; Wald, David J.; Hotovec, Alicia J.; Lin, Kuo-Wan; Earle, Paul; Marano, Kristin D.

    2008-01-01

    An atlas of maps of peak ground motions and intensity 'ShakeMaps' has been developed for almost 5,000 recent and historical global earthquakes. These maps are produced using established ShakeMap methodology (Wald and others, 1999c; Wald and others, 2005) and constraints from macroseismic intensity data, instrumental ground motions, regional topographically-based site amplifications, and published earthquake-rupture models. Applying the ShakeMap methodology allows a consistent approach to combine point observations with ground-motion predictions to produce descriptions of peak ground motions and intensity for each event. We also calculate an estimated ground-motion uncertainty grid for each earthquake. The Atlas of ShakeMaps provides a consistent and quantitative description of the distribution and intensity of shaking for recent global earthquakes (1973-2007) as well as selected historic events. As such, the Atlas was developed specifically for calibrating global earthquake loss estimation methodologies to be used in the U.S. Geological Survey Prompt Assessment of Global Earthquakes for Response (PAGER) Project. PAGER will employ these loss models to rapidly estimate the impact of global earthquakes as part of the USGS National Earthquake Information Center's earthquake-response protocol. The development of the Atlas of ShakeMaps has also led to several key improvements to the Global ShakeMap system. The key upgrades include: addition of uncertainties in the ground motion mapping, introduction of modern ground-motion prediction equations, improved estimates of global seismic-site conditions (VS30), and improved definition of stable continental region polygons. Finally, we have merged all of the ShakeMaps in the Atlas to provide a global perspective of earthquake ground shaking for the past 35 years, allowing comparison with probabilistic hazard maps. The online Atlas and supporting databases can be found at http://earthquake.usgs.gov/eqcenter/shakemap/atlas.php/.

  12. The 2007 Mentawai earthquake sequence on the Sumatra megathrust

    NASA Astrophysics Data System (ADS)

    Konca, A.; Avouac, J.; Sladen, A.; Meltzner, A. J.; Kositsky, A. P.; Sieh, K.; Fang, P.; Li, Z.; Galetzka, J.; Genrich, J.; Chlieh, M.; Natawidjaja, D. H.; Bock, Y.; Fielding, E. J.; Helmberger, D. V.

    2008-12-01

    The Sumatra Megathrust has recently produced a flurry of large interplate earthquakes starting with the giant Mw 9.15 Aceh earthquake of 2004. All of these earthquakes occurred within the area monitored by the Sumatra Geodetic Array (SuGAr), which provided exceptional records of near-field co-seismic and postseismic ground displacements. The most recent of these major events, an Mw 8.4 earthquake followed twelve hours later by an Mw 7.9 earthquake, occurred in the Mentawai islands area where devastating historical earthquakes had happened in 1797 and 1833. The 2007 earthquake sequence provides an exceptional opportunity to understand the variability of earthquakes along megathrusts and their relation to interseismic coupling. The InSAR, GPS and teleseismic modeling shows that the 2007 earthquakes ruptured a fraction of the strongly coupled Mentawai patch of the megathrust, which is itself only a fraction of the 1833 rupture area. They also released a much smaller moment than the one released in 1833, or than the deficit of moment that has accumulated since. Both earthquakes of 2007 consist of two sub-events which are 50 to 100 km apart from each other. On the other hand, the northernmost slip patch of the Mw 8.4 earthquake and the southern slip patch of the Mw 7.9 earthquake abut each other, yet they ruptured 12 hours apart. Sunda megathrust earthquakes of recent years include a rupture of a strongly coupled patch that closely mimics a prior rupture of that patch and which is well correlated with the interseismic coupling pattern (Nias-Simeulue section), as well as a rupture sequence of a strongly coupled patch that differs substantially in its details from its most recent predecessors (Mentawai section). We conclude that (1) seismic asperities are probably persistent features which arise from heterogeneous strain build-up in the interseismic period; and (2) the same portion of a megathrust can rupture in different ways depending on whether asperities break as isolated events or cooperate to produce a larger rupture. The stress distribution on the portion of the Sunda megathrust that had ruptured in 1833 and 1797 was probably not adequate for the development of a single major earthquake of the coupled zone in 2007. Since the moment released in 2007 amounts to only a fraction of the deficit of moment that had accumulated as a result of interseismic strain since 1833, the potential for a large megathrust event in the Mentawai area remains large.

  13. Google Mapplets for Earthquakes and Volcanic Activity

    NASA Astrophysics Data System (ADS)

    Haefner, S. A.; Venezky, D. Y.

    2007-12-01

    The USGS Earthquake and Volcano Hazards Programs monitor, assess, and issue warnings of natural hazards. Users can access our hazards information through our web pages, RSS feeds, and now through USGS Mapplets. Mapplets allow third party data layers to be added on top of Google Maps (http://maps.google.com - My Maps tab). Mapplets are created by parsing a GeoRSS feed, which involves searching through an XML file for location data and plotting the associated information on a map. The new Mapplets allow users to view both real-time earthquakes and current volcanic activity on the same map for the first time. In addition, the USGS Mapplets have been added to Google's extensive collection of Mapplets, allowing users to add the types of information they want to see on their own customized maps. The Earthquake Mapplet plots the past week of earthquakes around the world, showing the location, time and magnitude. The Volcano Mapplet displays the latest U.S. volcano updates, including the current level of both ground-based and aviation hazards. Join us to discuss how Mapplets are made and how they can be used to create your own customized map.
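
    A minimal sketch of the kind of GeoRSS parsing described above, using only the Python standard library; the namespaces and entry layout below are common GeoRSS/Atom conventions, not necessarily the exact structure of the USGS feeds:

        # Pull point locations and titles out of a GeoRSS/Atom feed. The feed URL
        # is supplied by the caller; the element layout is an illustrative assumption.
        import urllib.request
        import xml.etree.ElementTree as ET

        NS = {"atom": "http://www.w3.org/2005/Atom",
              "georss": "http://www.georss.org/georss"}

        def read_georss_points(url):
            with urllib.request.urlopen(url) as resp:
                root = ET.fromstring(resp.read())
            events = []
            for entry in root.findall("atom:entry", NS):
                title = entry.findtext("atom:title", default="", namespaces=NS)
                point = entry.findtext("georss:point", default="", namespaces=NS)
                if point:  # "lat lon" pair, e.g. "38.3 142.4"
                    lat, lon = (float(v) for v in point.split())
                    events.append({"title": title, "lat": lat, "lon": lon})
            return events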

  14. A search for paleoliquefaction and evidence bearing on the recurrence behavior of the great 1811-12 New Madrid earthquakes

    USGS Publications Warehouse

    Wesnousky, S.G.; Leffler, L.M.

    1994-01-01

    In the winter of 1811-12, three of the largest historic earthquakes in the United States occurred near New Madrid, Missouri. Seismicity continues to the present day throughout a tightly clustered pattern of epicenters centered on the bootheel of Missouri, including parts of northeastern Arkansas, northwestern Tennessee, western Kentucky, and southern Illinois. In 1990, the New Madrid seismic zone/central United States became the first seismically active region east of the Rocky Mountains to be designated a priority research area within the National Earthquake Hazards Reduction Program (NEHRP). This professional paper is a collection of papers, some published separately, presenting results of the newly intensified research program in this area. Major components of this research program include tectonic framework studies, seismicity and deformation monitoring and modeling, improved seismic hazard and risk assessments, and cooperative hazard mitigation studies.

  15. Assessment of the earthquake forecasting approaches (case study: IRAN)

    NASA Astrophysics Data System (ADS)

    Hajizadeh, A.; Vaezi, N. F.

    2010-05-01

    An earthquake is the result of a sudden release of energy in the Earth's crust that creates seismic waves; in other words, an earthquake is a sudden shaking of the ground. This is caused when rocks beneath the Earth's surface move and break. Scientists attempt to predict earthquakes by means of forecasting, using techniques such as GPS, InSAR, geology, knowledge of past earthquake patterns, gravimetry, etc. Earthquake forecasts declare that a temblor has a certain probability of occurring within a given time. These warnings help governments, communities, industries and private companies to prepare for large earthquakes and to conduct rescue operations and recovery efforts in the aftermath of destructive shocks. In this article we assess the forecasting approaches and compare their precision and other factors. Predictions of the time of earthquake occurrence and a case study are the results of this investigation. Since forewarned communities could take protective steps, many of the injuries and deaths that would otherwise occur could be avoided if the government were to implement this proposal. We have chosen Iran as the focus of this investigation, because Iran is one of the most seismically active countries. Key words: earthquake, forecasting, geodesy approaches, IRAN, precise, earth's crust

  16. Post-earthquake ignition vulnerability assessment of Küçükçekmece District

    NASA Astrophysics Data System (ADS)

    Yildiz, S. S.; Karaman, H.

    2013-12-01

    In this study, a geographic information system (GIS)-based model was developed to calculate the post-earthquake ignition probability of a building, considering damage to the building's interior gas and electrical distribution system and the overturning of appliances. In order to make our model more reliable and realistic, a weighting factor was used to define the possible existence of each appliance or other contents in the given occupancy. A questionnaire was prepared to weigh the relevance of the different components of post-earthquake ignitions using the analytical hierarchy process (AHP). The questionnaire was evaluated by researchers who were experienced in earthquake engineering and post-earthquake fires. The developed model was implemented in HAZTURK (Hazards Turkey), the earthquake loss assessment software developed by the Mid-America Earthquake Center with the help of Istanbul Technical University. The developed post-earthquake ignition tool was applied to Küçükçekmece, Istanbul, in Turkey. The results were evaluated according to structure types, occupancy types, the number of storeys, building codes and specified districts. The evaluated results support the theory that post-earthquake ignition probability is inversely proportional to the number of storeys and the construction year, depending upon the building code.
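
    For illustration, the core of an AHP weighting step can be sketched as the standard principal-eigenvector method applied to a reciprocal pairwise-comparison matrix; the matrix below is hypothetical and does not represent the questionnaire data from this study:

        import numpy as np

        def ahp_weights(pairwise):
            """AHP priority weights: normalized principal eigenvector of a
            reciprocal pairwise-comparison matrix."""
            eigvals, eigvecs = np.linalg.eig(pairwise)
            k = np.argmax(eigvals.real)           # principal eigenvalue index
            w = np.abs(eigvecs[:, k].real)
            return w / w.sum()

        def consistency_ratio(pairwise):
            """Saaty consistency ratio; values below ~0.1 are usually acceptable."""
            n = pairwise.shape[0]
            lam_max = np.max(np.linalg.eigvals(pairwise).real)
            ci = (lam_max - n) / (n - 1)
            ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)  # random index (partial table)
            return ci / ri

        # Hypothetical comparisons: gas system vs. electrical system vs. appliance overturning
        A = np.array([[1.0, 2.0, 3.0],
                      [0.5, 1.0, 2.0],
                      [1/3, 0.5, 1.0]])
        print(ahp_weights(A), consistency_ratio(A))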

  17. Post-earthquake ignition vulnerability assessment of Küçükçekmece District

    NASA Astrophysics Data System (ADS)

    Yildiz, S. S.; Karaman, H.

    2013-05-01

    In this study, a Geographic Information System (GIS)-based model was developed to calculate the post-earthquake ignition probability of a building, considering damage to the building's interior gas and electrical distribution system and the overturning of appliances. In order to make our model more reliable and realistic, a weighting factor was used to define the possible existence of each appliance or other contents in the given occupancy. A questionnaire was prepared to weigh the relevance of the different components of post-earthquake ignitions using the Analytical Hierarchy Process (AHP). The questionnaire was evaluated by researchers who were experienced in earthquake engineering and post-earthquake fires. The developed model was implemented in HAZTURK (Hazards Turkey), the earthquake loss assessment software developed by the Mid-America Earthquake Center with the help of Istanbul Technical University. The developed post-earthquake ignition tool was applied to Küçükçekmece, Istanbul, in Turkey. The results were evaluated according to structure types, occupancy types, the number of storeys, building codes and specified districts. The evaluated results support the theory that post-earthquake ignition probability is inversely proportional to the number of storeys and the construction year, depending upon the building code.

  18. Magnitude and location of historical earthquakes in Japan and implications for the 1855 Ansei Edo earthquake

    USGS Publications Warehouse

    Bakun, W.H.

    2005-01-01

    Japan Meteorological Agency (JMA) intensity assignments IJMA are used to derive intensity attenuation models suitable for estimating the location and an intensity magnitude Mjma for historical earthquakes in Japan. The intensity for shallow crustal earthquakes on Honshu is equal to -1.89 + 1.42MJMA - 0.00887Δh - 1.66 log Δh, where MJMA is the JMA magnitude, Δh = (Δ² + h²)^(1/2), and Δ and h are epicentral distance and focal depth (km), respectively. Four earthquakes located near the Japan Trench were used to develop a subducting plate intensity attenuation model where intensity is equal to -8.33 + 2.19MJMA - 0.00550Δh - 1.14 log Δh. The IJMA assignments for the MJMA 7.9 great 1923 Kanto earthquake on the Philippine Sea-Eurasian plate interface are consistent with the subducting plate model. Using the subducting plate model and 226 IJMA IV-VI assignments, the location of the intensity center is 25 km north of the epicenter, Mjma is 7.7, and MJMA is 7.3-8.0 at the 1σ confidence level. Intensity assignments and reported aftershock activity for the enigmatic 11 November 1855 Ansei Edo earthquake are consistent with an MJMA 7.2 Philippine Sea-Eurasian interplate source or Philippine Sea intraslab source at about 30 km depth. If the 1855 earthquake was a Philippine Sea-Eurasian interplate event, the intensity center was adjacent to and downdip of the rupture area of the great 1923 Kanto earthquake, suggesting that the 1855 and 1923 events ruptured adjoining sections of the Philippine Sea-Eurasian plate interface.
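
    A small Python sketch evaluating the two attenuation relations quoted in the abstract; the helper and variable names are ours:

        import math

        def jma_intensity(mjma, epicentral_km, depth_km, model="crustal"):
            """Predicted JMA intensity from the relations quoted above.
            model='crustal': I = -1.89 + 1.42*M - 0.00887*Dh - 1.66*log10(Dh)
            model='slab'   : I = -8.33 + 2.19*M - 0.00550*Dh - 1.14*log10(Dh)
            where Dh = sqrt(Delta^2 + h^2) is the slant distance in km."""
            dh = math.hypot(epicentral_km, depth_km)
            if model == "crustal":
                return -1.89 + 1.42 * mjma - 0.00887 * dh - 1.66 * math.log10(dh)
            if model == "slab":
                return -8.33 + 2.19 * mjma - 0.00550 * dh - 1.14 * math.log10(dh)
            raise ValueError("model must be 'crustal' or 'slab'")

        # Example: an MJMA 7.9 subduction event felt at 100 km epicentral distance, 30 km depth
        print(round(jma_intensity(7.9, 100.0, 30.0, model="slab"), 1))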

  19. Satellite Relay Telemetry of Seismic Data in Earthquake Prediction and Control

    NASA Technical Reports Server (NTRS)

    Jackson, W. H.; Eaton, J. P.

    1971-01-01

    The Satellite Telemetry Earthquake Monitoring Program was started to evaluate the applicability of satellite relay telemetry in the collection of seismic data from a large number of dense seismograph clusters laid out along the major fault systems of western North America. Prototype clusters utilizing phone-line telemetry were then being installed by the National Center for Earthquake Research in 3 regions along the San Andreas fault in central California; and the experience of installing and operating the clusters and in reducing and analyzing the seismic data from them was to provide the raw materials for evaluation in the satellite relay telemetry project. The principal advantages of the satellite relay system over commercial telephone or microwave systems were: (1) it could be made less prone to massive failure during a major earthquake; (2) it could be extended readily into undeveloped regions; and (3) it could provide flexible, uniform communications over large sections of major global tectonic zones. Fundamental characteristics of a communications system to cope with the large volume of raw data collected by a short-period seismograph network are discussed.

  20. GEM - The Global Earthquake Model

    NASA Astrophysics Data System (ADS)

    Smolka, A.

    2009-04-01

    Over 500,000 people died in the last decade due to earthquakes and tsunamis, mostly in the developing world, where the risk is increasing due to rapid population growth. In many seismic regions, no hazard and risk models exist, and even where models do exist, they are intelligible only by experts, or available only for commercial purposes. The Global Earthquake Model (GEM) answers the need for an openly accessible risk management tool. GEM is an internationally sanctioned public-private partnership initiated by the Organisation for Economic Cooperation and Development (OECD) which will establish an authoritative standard for calculating and communicating earthquake hazard and risk, and will be designed to serve as the critical instrument to support decisions and actions that reduce earthquake losses worldwide. GEM will integrate developments on the forefront of scientific and engineering knowledge of earthquakes, at global, regional and local scale. The work is organized in three modules: hazard, risk, and socio-economic impact. The hazard module calculates probabilities of earthquake occurrence and resulting shaking at any given location. The risk module calculates fatalities, injuries, and damage based on expected shaking, building vulnerability, and the distribution of population and of exposed values and facilities. The socio-economic impact module delivers tools for making educated decisions to mitigate and manage risk. GEM will be a versatile online tool, with open source code and a map-based graphical interface. The underlying data will be open wherever possible, and its modular input and output will be adapted to multiple user groups: scientists and engineers, risk managers and decision makers in the public and private sectors, and the public at large. GEM will be the first global model for seismic risk assessment at a national and regional scale, and aims to achieve broad scientific participation and independence. Its development will occur in a coordinated global network of regional centers, with a high degree of interaction among the centers and the central secretariat. Broad acceptance of the models will be ensured by including local knowledge in all aspects of hazard and risk assessment and securing participation of local experts throughout development. All GEM efforts will be carried out using a common global software infrastructure and consensus standards. In accordance with principles of open-source development, and to ensure comprehensive global representation, contributions are welcomed and encouraged from a broad group of participants. To ensure uniformity and conformance with the highest scientific standards, all contributions, including models, tools, and data, will be rigorously vetted and independently tested. Recently the EUCENTRE in Pavia, Italy, has been selected as the host institution of the GEM secretariat. The project will formally launch in early 2009 by creating the non-profit GEM foundation. While GEM serves a humanitarian imperative, it is also considered to offer a key to long-term economic development. GEM will enhance risk awareness at global, national and local scales. Greater risk awareness is a precondition for motivating public and private parties to invest in risk reduction and loss prevention, and for promoting greater use of financial risk transfer instruments.

  1. A continuation of base-line studies for environmentally monitoring Space Transportation Systems at John F. Kennedy Space Center. Volume 2: Chemical studies of rainfall and soil analysis

    NASA Technical Reports Server (NTRS)

    Madsen, B. C.

    1980-01-01

    The results of a study which was designed to monitor, characterize, and evaluate the chemical composition of precipitation (rain) which fell at the Kennedy Space Center, Florida (KSC) during the period July 1977 to March 1979 are reported. Results which were obtained from a soil sampling and associated chemical analysis are discussed. The purpose of these studies was to determine the environmental perturbations which might be caused by NASA space activities.

  2. Sun-earth environment study to understand earthquake prediction

    NASA Astrophysics Data System (ADS)

    Mukherjee, S.

    2007-05-01

    Earthquake prediction is possible by looking at the location of active sunspots before they direct energy towards the Earth. Earth is a restless planet, and the restlessness occasionally turns deadly. Of all natural hazards, earthquakes are the most feared. For centuries scientists working in seismically active regions have noted premonitory signals. Changes in the thermosphere, ionosphere, atmosphere and hydrosphere are noted before changes in the geosphere. The historical records talk of changes of the water level in wells, of strange weather, of ground-hugging fog, of unusual behaviour of animals (due to change in the magnetic field of the earth) that seem to feel the approach of a major earthquake. With the advent of modern science and technology, the understanding of these pre-earthquake signals has become strong enough to develop a methodology of earthquake prediction. A correlation of Earth-directed coronal mass ejections (CME) from active sunspots has been developed as an earthquake precursor. Occasional changes in the local magnetic field and planetary indices (Kp values) in the lower atmosphere are accompanied by the formation of haze and a reduction of moisture in the air. Large patches, often tens to hundreds of thousands of square kilometres in size, are seen in night-time infrared satellite images where the land surface temperature seems to fluctuate rapidly. Perturbations in the ionosphere at 90-120 km altitude have been observed before the occurrence of earthquakes. These changes affect the transmission of radio waves, and a radio blackout has been observed due to CME. Another heliophysical parameter, electron flux (Eflux), has been monitored before the occurrence of earthquakes. Hundreds of case studies show that the atmospheric temperature increases and then suddenly drops before the occurrence of earthquakes. These changes are being monitored using Solar and Heliospheric Observatory (SOHO) satellite data. Whatever the manifestations in the environment of the atmosphere or geosphere may be, there is a positive correlation of CMEs with changes in the magnetic field, followed by aurora borealis or a sudden spark of light from the sky before an earthquake. Any change in geomorphology at the pixel level, changes in groundwater level, geochemical anomalies of soils surrounding active faults, and vegetation anomalies should be monitored at the mirror-image position of sunspots on the Earth-facing side, in reference to CMEs from the Sun.

  3. Real-time neural network earthquake profile predictor

    DOEpatents

    Leach, Richard R. (Castro Valley, CA); Dowla, Farid U. (Castro Valley, CA)

    1996-01-01

    A neural network has been developed that uses first-arrival energy to predict the characteristics of impending earthquake seismograph signals. The propagation of ground motion energy through the earth is a highly nonlinear function. This is due to different forms of ground motion as well as to changes in the elastic properties of the media throughout the propagation path. The neural network is trained using seismogram data from earthquakes. Presented with a previously unseen earthquake, the neural network produces a profile of the complete earthquake signal using data from the first seconds of the signal. This offers a significant advance in the real-time monitoring, warning, and subsequent hazard minimization of catastrophic ground motion.
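
    Purely as an illustration of the idea, and not the patented network itself, a small regressor can be trained to map features from the first seconds of a record to a coarse envelope of the full signal; the synthetic data, features and network size below are arbitrary assumptions:

        # Toy illustration of "first-arrival features -> full-signal profile" regression.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)

        def synthetic_event():
            """Fake 'earthquake': an envelope with random rise and decay times.
            The input is the first 10 envelope samples; the target is the full profile."""
            rise, decay = rng.uniform(1, 5), rng.uniform(5, 30)
            t = np.arange(100, dtype=float)
            env = (1 - np.exp(-t / rise)) * np.exp(-t / decay)
            env += 0.02 * rng.standard_normal(env.size)
            return env[:10], env

        X, Y = zip(*(synthetic_event() for _ in range(500)))
        model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
        model.fit(np.array(X), np.array(Y))

        x_new, y_true = synthetic_event()
        y_pred = model.predict(x_new.reshape(1, -1))[0]
        print("max abs error on a held-out synthetic event:", float(np.abs(y_pred - y_true).max()))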

  4. Real-time neural network earthquake profile predictor

    DOEpatents

    Leach, R.R.; Dowla, F.U.

    1996-02-06

    A neural network has been developed that uses first-arrival energy to predict the characteristics of impending earthquake seismograph signals. The propagation of ground motion energy through the earth is a highly nonlinear function. This is due to different forms of ground motion as well as to changes in the elastic properties of the media throughout the propagation path. The neural network is trained using seismogram data from earthquakes. Presented with a previously unseen earthquake, the neural network produces a profile of the complete earthquake signal using data from the first seconds of the signal. This offers a significant advance in the real-time monitoring, warning, and subsequent hazard minimization of catastrophic ground motion. 17 figs.

  5. US earthquake observatories: recommendations for a new national network

    SciTech Connect

    Not Available

    1980-01-01

    This report is the first attempt by the seismological community to rationalize and optimize the distribution of earthquake observatories across the United States. The main aim is to increase significantly our knowledge of earthquakes and the earth's dynamics by providing access to scientifically more valuable data. Other objectives are to provide a more efficient and cost-effective system of recording and distributing earthquake data and to make as uniform as possible the recording of earthquakes in all states. The central recommendation of the Panel is that the guiding concept be established of a rationalized and integrated seismograph system consisting of regional seismograph networks run for crucial regional research and monitoring purposes in tandem with a carefully designed, but sparser, nationwide network of technologically advanced observatories. Such a national system must be thought of not only in terms of instrumentation but equally in terms of data storage, computer processing, and record availability.

  6. The ARIA project: Advanced Rapid Imaging and Analysis for Natural Hazard Monitoring and Response

    NASA Astrophysics Data System (ADS)

    Owen, S. E.; Webb, F.; Simons, M.; Rosen, P. A.; Cruz, J.; Yun, S.; Fielding, E. J.; Moore, A. W.; Hua, H.; Agram, P.; Lundgren, P.

    2012-12-01

    ARIA is a joint JPL/Caltech coordinated effort to automate geodetic imaging capabilities for hazard response and societal benefit. Over the past decade, space-based geodetic measurements such as InSAR and GPS have provided new assessment capabilities and situational awareness on the size and location of earthquakes following seismic disasters and on volcanic eruptions following magmatic events. Geodetic imaging's unique ability to capture surface deformation at high spatial and temporal resolution allows us to resolve the fault geometry and distribution of slip associated with any given earthquake in correspondingly high spatial and temporal detail. In addition, remote sensing with radar provides change detection and damage assessment capabilities for earthquakes, floods and other disasters that can image even at night or through clouds. These data sets are still essentially hand-crafted, and thus are not generated rapidly and reliably enough for informing decision-making agencies and the public following an earthquake. We are building an end-to-end prototype geodetic imaging data system that would form the foundation for an envisioned operational hazard response center integrating InSAR, GPS, seismology, and modeling to deliver monitoring, actionable science, and situational awareness products. This prototype exploits state-of-the-art analysis algorithms from technologists and scientists. These algorithms enable the delivery of actionable products from larger data sets with enhanced modeling and interpretation, and the development of next-generation techniques. We are collaborating with USGS scientists in both the earthquake and volcano science programs for our initial data product infusion. We present our progress to date on the development of the prototype data system and demonstration data products, and example responses we have run, such as generating products for the 2011 M9.0 Tohoku-oki and M6.3 Christchurch earthquakes, the 2011 M7.1 Van earthquake, and several simulated earthquake response exercises.

  7. Earthquake impact scale

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Bausch, D.

    2011-01-01

    With the advent of the USGS prompt assessment of global earthquakes for response (PAGER) system, which rapidly assesses earthquake impacts, U.S. and international earthquake responders are reconsidering their automatic alert and activation levels and response procedures. To help facilitate rapid and appropriate earthquake response, an Earthquake Impact Scale (EIS) is proposed on the basis of two complementary criteria. One, based on the estimated cost of damage, is most suitable for domestic events; the other, based on estimated ranges of fatalities, is generally more appropriate for global events, particularly in developing countries. Simple thresholds, derived from the systematic analysis of past earthquake impact and associated response levels, are quite effective in communicating predicted impact and the response needed after an event through alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (international response). Corresponding fatality thresholds for yellow, orange, and red alert levels are 1, 100, and 1,000, respectively. For damage impact, yellow, orange, and red thresholds are triggered by estimated losses reaching $1M, $100M, and $1B, respectively. The rationale for a dual approach to earthquake alerting stems from the recognition that relatively high fatalities, injuries, and homelessness predominate in countries in which local building practices typically lend themselves to high collapse and casualty rates, and these impacts lend themselves to prioritization for international response. In contrast, financial and overall societal impacts often trigger the level of response in regions or countries in which prevalent earthquake-resistant construction practices greatly reduce building collapse and resulting fatalities. Any newly devised alert, whether economic- or casualty-based, should be intuitive and consistent with established lexicons and procedures. Useful alerts should also be both specific (although allowably uncertain) and actionable. In this analysis, an attempt is made at both simple and intuitive color-coded alerting criteria; yet the necessary uncertainty measures by which one can gauge the likelihood for the alert to be over- or underestimated are preserved. The essence of the proposed impact scale and alerting is that actionable loss information is now available in the immediate aftermath of significant earthquakes worldwide on the basis of quantifiable loss estimates. Utilizing EIS, PAGER's rapid loss estimates can adequately recommend alert levels and suggest appropriate response protocols, despite the uncertainties; demanding or awaiting observations or loss estimates with a high level of accuracy may increase the losses. © 2011 American Society of Civil Engineers.
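
    The alerting logic can be illustrated with a short sketch that maps estimated fatalities or economic losses to the color-coded levels quoted above; combining the two criteria by taking the more severe alert is our own simplification of the proposed dual scale:

        def eis_alert(estimated_fatalities=None, estimated_loss_usd=None):
            """Map loss estimates to the alert colors described in the abstract:
            fatality thresholds 1 / 100 / 1,000 and economic thresholds
            $1M / $100M / $1B for yellow / orange / red (green otherwise)."""
            def level(value, thresholds):
                if value is None:
                    return 0
                return sum(value >= t for t in thresholds)

            fatality_level = level(estimated_fatalities, (1, 100, 1000))
            economic_level = level(estimated_loss_usd, (1e6, 1e8, 1e9))
            colors = ["green", "yellow", "orange", "red"]
            # Our simplification: take the more severe of the two criteria.
            return colors[max(fatality_level, economic_level)]

        print(eis_alert(estimated_fatalities=12))    # yellow
        print(eis_alert(estimated_loss_usd=2.5e8))   # orange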

  8. Meeting the Challenge of Earthquake Risk Globalisation: Towards the Global Earthquake Model GEM (Sergey Soloviev Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Zschau, J.

    2009-04-01

    Earthquake risk, like natural risks in general, has become a highly dynamic and globally interdependent phenomenon. Due to the "urban explosion" in the Third World, an increasingly complex cross-linking of critical infrastructure and lifelines in the industrial nations, and a growing globalisation of the world's economies, we are presently facing a dramatic increase in our society's vulnerability to earthquakes in practically all seismic regions on our globe. Such fast and global changes cannot be captured with conventional earthquake risk models anymore. The sciences in this field are, therefore, asked to come up with new solutions that are no longer exclusively aiming at the best possible quantification of present risks but also keep an eye on their changes with time and allow these changes to be projected into the future. This applies not only to the vulnerability component of earthquake risk, but also to its hazard component, which has been realized to be time-dependent, too. The challenges of earthquake risk dynamics and globalisation have recently been accepted by the Global Science Forum of the Organisation for Economic Co-operation and Development (OECD-GSF), which initiated the "Global Earthquake Model (GEM)", a public-private partnership for establishing an independent standard to calculate, monitor and communicate earthquake risk globally, raise awareness and promote mitigation.

  9. Earthquakes and Earthquake Engineering. LC Science Tracer Bullet.

    ERIC Educational Resources Information Center

    Buydos, John F., Comp.

    An earthquake is a shaking of the ground resulting from a disturbance in the earth's interior. Seismology is the (1) study of earthquakes; (2) origin, propagation, and energy of seismic phenomena; (3) prediction of these phenomena; and (4) investigation of the structure of the earth. Earthquake engineering or engineering seismology includes the…

  10. Patient experiences with self-monitoring renal function after renal transplantation: results from a single-center prospective pilot study

    PubMed Central

    van Lint, Céline L; van der Boog, Paul JM; Wang, Wenxin; Brinkman, Willem-Paul; Rövekamp, Ton JM; Neerincx, Mark A; Rabelink, Ton J; van Dijk, Sandra

    2015-01-01

    Background After a kidney transplantation, patients have to visit the hospital often to monitor for early signs of graft rejection. Self-monitoring of creatinine in addition to blood pressure at home could alleviate the burden of frequent outpatient visits, but only if patients are willing to self-monitor and if they adhere to the self-monitoring measurement regimen. A prospective pilot study was conducted to assess patients' experiences and satisfaction. Materials and methods For 3 months after transplantation, 30 patients registered self-measured creatinine and blood pressure values in an online record to which their physician had access. Patients completed a questionnaire at baseline and follow-up to assess satisfaction, attitude, self-efficacy regarding self-monitoring, worries, and physician support. Adherence was studied by comparing the number of registered with the number of requested measurements. Results Patients were highly motivated to self-monitor kidney function, and reported high levels of general satisfaction. Level of satisfaction was positively related to perceived support from physicians (P<0.01), level of self-efficacy (P<0.01), and amount of trust in the accuracy of the creatinine meter (P<0.01). The use of both the creatinine and blood pressure meter was considered pleasant and useful, despite the level of trust in the accuracy of the creatinine device being relatively low. Trust in the accuracy of the creatinine device appeared to be related to the level of variation in subsequent measurement results, with more variation being related to lower levels of trust. Protocol adherence was generally very high, although the range of adherence levels was large and increased over time. Conclusion Patients' high levels of satisfaction suggest that at-home monitoring of creatinine and blood pressure after transplantation offers a promising strategy. Important prerequisites for safe implementation in transplant care seem to be support from physicians and patients' confidence in both their own self-monitoring skills and the accuracy of the devices used. PMID:26673985

  11. Earthquake and Geothermal Energy

    E-print Network

    Kapoor, Surya Prakash

    2013-01-01

    The origin of earthquakes has long been recognized as resulting from strike-slip instability of plate tectonics along fault lines. Several earthquake events around the globe have happened which cannot be explained by this theory. In this work we investigated the earthquake data along with other observed facts, like heat flow profiles etc., of the Indian subcontinent. In our studies we found a high-quality correlation between the earthquake events, seismically prone zones, heat flow regions and the geothermal hot springs. As a consequence, we propose a hypothesis which can adequately explain all earthquake events around the globe as well as the overall geodynamics. It is basically the geothermal power which makes the plates stand still, strike and slip over. The plates are merely a working solid, while the driving force is the geothermal energy. The violent flow and enormous pressure of this power shake the earth along the plate boundaries and also trigger intra-plate seismicity. In the light o...

  12. TEC enhancement immediately before M9 mega-thrust earthquakes

    NASA Astrophysics Data System (ADS)

    Heki, Kosuke

    2012-07-01

    Earthquakes are often preceded by electromagnetic precursors, e.g. electric currents in the ground and propagation anomalies of radio waves. By monitoring the differences of the L1 and L2 carrier phases from GPS satellites, we can infer ionospheric Total Electron Content (TEC). Here I report that positive anomalies of ionospheric TEC appeared immediately before the 2011 Tohoku-Oki (Mw9.0), 2010 Chile (Maule) (Mw8.8), 2007 Bengkulu (Mw8.6), and 2004 Sumatra-Andaman (Mw9.2) earthquakes. Coseismic vertical movements of the surface excite acoustic and internal gravity waves, causing coseismic ionospheric disturbances (CID), and GPS-TEC data showed that they occurred about ten minutes after these earthquakes. In addition, positive TEC anomalies were found to start 60-40 minutes before these earthquakes above the focal regions, and to last until the onsets of the CID. In the Tohoku-Oki case, the anomaly reached about one tenth of the background TEC immediately before the earthquake. TEC enhancements often occur irrespective of earthquakes, for example a sudden increase of TEC due to solar flares and large-scale traveling ionospheric disturbances (LSTID) propagating from the auroral oval to mid-latitude regions. These disturbances can be distinguished by carefully observing their spatial extents and movements. Geomagnetic activities were relatively high in the 2004 Sumatra-Andaman and 2011 Tohoku-Oki events, but were low in the 2007 Bengkulu and 2010 Chile events. For the Tohoku-Oki and the Bengkulu earthquakes, we analyzed the TEC time series of the same satellite and receiver pair over 120 days before and after the earthquakes, and confirmed that the precursory anomalies of these earthquakes were the largest in these periods. We also investigated three M8 class earthquakes, the 1994 Hokkaido-Toho-Oki (Mw8.3), 2006 Kuril (Mw8.2), and 2003 Tokachi-Oki (Mw8.0) earthquakes. However, only weak precursory TEC anomalies were seen in the 1994 event, and none in the 2003 event. Only M9 class earthquakes are considered to be immediately preceded by such positive TEC anomalies. Because the raw GPS data files are available on the web, one can easily reproduce the results reported here and apply the method to other (including future) earthquakes. The physical mechanism of the preseismic TEC anomalies is not clear, but concentration of positive electric charges on the ground is a possibility.
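
    As a hedged illustration of the dual-frequency principle mentioned above (not the author's processing chain), slant TEC can be estimated from the geometry-free combination of L1/L2 observables; differential code biases, cycle slips and the mapping to vertical TEC are ignored in this sketch:

        # Slant TEC from the dual-frequency pseudorange difference (illustrative only).
        F1 = 1575.42e6   # GPS L1 frequency, Hz
        F2 = 1227.60e6   # GPS L2 frequency, Hz
        K  = 40.3        # ionospheric refraction constant, m^3 s^-2

        def slant_tec_from_pseudoranges(p1_m, p2_m):
            """Slant TEC in TEC units (1 TECU = 1e16 electrons/m^2). The ionospheric
            delay scales as K*TEC/f^2, so the P2 - P1 difference isolates TEC
            (receiver/satellite biases and multipath are neglected here)."""
            stec = (p2_m - p1_m) * (F1**2 * F2**2) / (K * (F1**2 - F2**2))
            return stec / 1e16

        # Example: a 3.5 m P2-P1 delay difference corresponds to roughly 33 TECU.
        print(round(slant_tec_from_pseudoranges(20_000_000.0, 20_000_003.5), 1))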

  13. AMBIENT AIR MONITORING AT GROUND ZERO AND LOWER MANHATTAN FOLLOWING THE COLLAPSE OF THE WORLD TRADE CENTER

    EPA Science Inventory

    The U.S. EPA National Exposure Research Laboratory (NERL) collaborated with EPA's Regional offices to establish a monitoring network to characterize ambient air concentrations of particulate matter (PM) and air toxics in lower Manhattan following the collapse of the World Trade...

  14. Earthquakes and faults in southern California (1970-2010)

    USGS Publications Warehouse

    Sleeter, Benjamin M.; Calzia, James P.; Walter, Stephen R.

    2012-01-01

    The map depicts both active and inactive faults and earthquakes magnitude 1.5 to 7.3 in southern California (1970–2010). The bathymetry was generated from digital files from the California Department of Fish And Game, Marine Region, Coastal Bathymetry Project. Elevation data are from the U.S. Geological Survey National Elevation Database. Landsat satellite image is from fourteen Landsat 5 Thematic Mapper scenes collected between 2009 and 2010. Fault data are reproduced with permission from 2006 California Geological Survey and U.S. Geological Survey data. The earthquake data are from the U.S. Geological Survey National Earthquake Information Center.

  15. Sand boils without earthquakes

    USGS Publications Warehouse

    Holzer, T.L.; Clark, M.M.

    1993-01-01

    Sedimentary deformation caused by liquefaction has become a popular means for inferring prehistoric strong earthquakes. This report describes a new mechanism for generating such features in the absence of earthquakes. Sand boils and a 180-m-long sand dike formed in Fremont Valley, California, when sediment-laden surface runoff was intercepted along the upslope part of a 500-m-long preexisting ground crack, flowed subhorizontally in the crack, and then flowed upward in the downslope part of the crack, where it discharged as sand boils on the land surface. If the sand boils and their feeder dike were stratigraphically preserved, they could be misinterpreted as evidence for earthquake-induced liquefaction. -Authors

  16. An application of earthquake prediction algorithm M8 in eastern Anatolia at the approach of the 2011 Van earthquake

    NASA Astrophysics Data System (ADS)

    Mojarab, Masoud; Kossobokov, Vladimir; Memarian, Hossein; Zare, Mehdi

    2015-07-01

    On 23rd October 2011, an M7.3 earthquake near the Turkish city of Van killed more than 600 people, injured over 4000, and left about 60,000 homeless. It demolished hundreds of buildings and caused great damage to thousands of others in Van, Ercis, Muradiye, and Çaldıran. The earthquake's epicenter is located about 70 km from a preceding M7.3 earthquake that occurred in November 1976, destroyed several villages near the Turkey-Iran border, and killed thousands of people. This study, by means of retrospective application of the M8 algorithm, checks to see if the 2011 Van earthquake could have been predicted. The algorithm is based on pattern recognition of Times of Increased Probability (TIP) of a target earthquake from the transient seismic sequence at lower magnitude ranges in a Circle of Investigation (CI). Specifically, we applied a modified M8 algorithm adjusted to a rather low level of earthquake detection in the region, following three different approaches to determine seismic transients. In the first approach, CI centers are distributed on intersections of morphostructural lineaments recognized as prone to magnitude 7+ earthquakes. In the second approach, centers of CIs are distributed on local extremes of the seismic density distribution, and in the third approach, CI centers were distributed uniformly on the nodes of a 1°×1° grid. According to the results of the M8 algorithm application, the 2011 Van earthquake could have been predicted in any of the three approaches. We noted that it is possible to consider the intersection of TIPs instead of their union to improve the certainty of the prediction results. Our study confirms the applicability of a modified version of the M8 algorithm for predicting earthquakes at the Iranian-Turkish plateau, as well as for mitigation of damages in seismic events in which pattern recognition algorithms may play an important role.
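
    One ingredient of M8-style processing, counting events above a cut-off magnitude inside a circle of investigation over a time window, can be sketched as follows; this is only a toy illustration of a single functional, not the full set of normalized measures used by the algorithm:

        import math

        def events_in_circle(catalog, center, radius_km, mag_cutoff, t_start, t_end):
            """Count catalog events with magnitude >= mag_cutoff inside a circle of
            investigation during [t_start, t_end). Each event is a dict with keys
            'lat', 'lon', 'mag', 'time' (time in decimal years). Distances use a
            small-angle equirectangular approximation, adequate for CI radii of a
            few hundred km."""
            lat0, lon0 = center
            count = 0
            for ev in catalog:
                if not (t_start <= ev["time"] < t_end and ev["mag"] >= mag_cutoff):
                    continue
                dlat = math.radians(ev["lat"] - lat0)
                dlon = math.radians(ev["lon"] - lon0) * math.cos(math.radians(lat0))
                if 6371.0 * math.hypot(dlat, dlon) <= radius_km:
                    count += 1
            return count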

  17. Slow earthquakes triggered by typhoons.

    PubMed

    Liu, ChiChing; Linde, Alan T; Sacks, I Selwyn

    2009-06-11

    The first reports on a slow earthquake were for an event in the Izu peninsula, Japan, on an intraplate, seismically active fault. Since then, many slow earthquakes have been detected. It has been suggested that the slow events may trigger ordinary earthquakes (in a context supported by numerical modelling), but their broader significance in terms of earthquake occurrence remains unclear. Triggering of earthquakes has received much attention: strain diffusion from large regional earthquakes has been shown to influence large earthquake activity, and earthquakes may be triggered during the passage of teleseismic waves, a phenomenon now recognized as being common. Here we show that, in eastern Taiwan, slow earthquakes can be triggered by typhoons. We model the largest of these earthquakes as repeated episodes of slow slip on a reverse fault just under land and dipping to the west; the characteristics of all events are sufficiently similar that they can be modelled with minor variations of the model parameters. Lower pressure results in a very small unclamping of the fault that must be close to the failure condition for the typhoon to act as a trigger. This area experiences very high compressional deformation but has a paucity of large earthquakes; repeating slow events may be segmenting the stressed area and thus inhibiting large earthquakes, which require a long, continuous seismic rupture. PMID:19516339

  18. Earthquake Safety Guide for Homeowners

    E-print Network

    Oklahoma, University of

    Earthquake Safety Guide for Homeowners. FEMA 530 / September 2005. The Homeowner's Guide to Earthquake Safety was originally developed and published by the California Seismic Safety Commission. [Cover photo caption: a house that slid 2 feet off its foundation as a result of the 6.5 San Simeon Earthquake.]

  19. Turkish Children's Ideas about Earthquakes

    ERIC Educational Resources Information Center

    Simsek, Canan Lacin

    2007-01-01

    Earthquake, a natural disaster, is among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced to that extent. In particular, good training regarding earthquakes received in primary schools is considered…

  20. Earthquake Resistant Cathedral in Chile

    USGS Multimedia Gallery

    A cathedral in the central square of Chillán, Chile replaces the ancient cathedral that collapsed during the strong earthquake of 1939. This modern structure was constructed with earthquake resistance as the primary consideration. The only damage caused by the M 8.8 earthquake on Feb. 27, 2010 was b...

  1. Earthquakes in Afghanistan Nicholas Ambraseys

    E-print Network

    Bilham, Roger

    Dept. of Civil Engineering, Imperial College. We summarize the written history of earthquakes in Afghanistan from 734 AD to the present in the form of a new catalog of more than 1300 earthquakes, and narrative accounts of damage sustained during

  2. PAGER--Rapid assessment of an earthquake's impact

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.; Marano, K.D.; Bausch, D.; Hearne, M.

    2010-01-01

    PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that produces content concerning the impact of significant earthquakes around the world, informing emergency responders, government and aid agencies, and the media of the scope of the potential disaster. PAGER rapidly assesses earthquake impacts by comparing the population exposed to each level of shaking intensity with models of economic and fatality losses based on past earthquakes in each country or region of the world. Earthquake alerts--which were formerly sent based only on event magnitude and location, or population exposure to shaking--now will also be generated based on the estimated range of fatalities and economic losses.

  3. Magnitude 8.1 Earthquake off the Solomon Islands

    NASA Technical Reports Server (NTRS)

    2007-01-01

    On April 1, 2007, a magnitude 8.1 earthquake rattled the Solomon Islands, 2,145 kilometers (1,330 miles) northeast of Brisbane, Australia. Centered less than ten kilometers beneath the Earth's surface, the earthquake displaced enough water in the ocean above to trigger a small tsunami. Though officials were still assessing damage to remote island communities on April 3, Reuters reported that the earthquake and the tsunami killed an estimated 22 people and left as many as 5,409 homeless. The most serious damage occurred on the island of Gizo, northwest of the earthquake epicenter, where the tsunami damaged the hospital, schools, and hundreds of houses, said Reuters. This image, captured by the Landsat-7 satellite, shows the location of the earthquake epicenter in relation to the nearest islands in the Solomon Island group. Gizo is beyond the left edge of the image, but its triangular fringing coral reefs are shown in the upper left corner. Though dense rain forest hides volcanic features from view, the very shape of the islands testifies to the geologic activity of the region. The circular Kolombangara Island is the tip of a dormant volcano, and other circular volcanic peaks are visible in the image. The image also shows that the Solomon Islands run on a northwest-southeast axis parallel to the edge of the Pacific plate, the section of the Earth's crust that carries the Pacific Ocean and its islands. The earthquake occurred along the plate boundary, where the Australia/Woodlark/Solomon Sea plates slide beneath the denser Pacific plate. Friction between the sinking (subducting) plates and the overriding Pacific plate led to the large earthquake on April 1, said the United States Geological Survey (USGS) summary of the earthquake. Large earthquakes are common in the region, though the section of the plate that produced the April 1 earthquake had not caused any quakes of magnitude 7 or larger since the early 20th century, said the USGS.

  4. California earthquake history

    USGS Publications Warehouse

    Toppozada, T.; Branum, D.

    2004-01-01

    This paper presents an overview of the advancement in our knowledge of California's earthquake history since ~1800, and especially during the last 30 years. We first review the basic statewide research on earthquake occurrences that was published from 1928 through 2002, to show how the current catalogs and their levels of completeness have evolved with time. Then we review some of the significant new results in specific regions of California, and some of what remains to be done. Since 1850, 167 potentially damaging earthquakes of M ~ 6 or larger have been identified in California and its border regions, indicating an average rate of 1.1 such events per year. Table I lists the earthquakes of M ~ 6 to 6.5 that were also destructive since 1812 in California and its border regions, indicating an average rate of one such event every ~5 years. Many of these occurred before 1932, when epicenters and magnitudes started to be determined routinely using seismographs in California. The number of these early earthquakes is probably incomplete in sparsely populated remote parts of California before ~1870. For example, 6 of the 7 pre-1873 events in table I are of M ≥ 7, suggesting that other earthquakes of M 6.5 to 6.9 occurred but were not properly identified, or were not destructive. The epicenters and magnitudes (M) of the pre-instrumental earthquakes were determined from isoseismal maps that were based on the Modified Mercalli Intensity of shaking (MMI) at the communities that reported feeling the earthquakes. The epicenters were estimated to be in the regions of most intense shaking, and values of M were estimated from the extent of the areas shaken at various MMI levels. MMI VII or greater shaking is the threshold of damage to weak buildings. Certain areas in the regions of Los Angeles, San Francisco, and Eureka were each shaken repeatedly at MMI VII or greater at least six times since ~1812, as depicted by Toppozada and Branum (2002, fig. 19).

  5. Real-time processing of earthquake information in Iceland.

    NASA Astrophysics Data System (ADS)

    Kjartansson, E.; Vogfjord, K. S.; Hjaltadottir, S.; Sveinbjornsson, H.; Armannsdottir, S.; Gudmundsson, G. B.

    2009-04-01

    Tools for real-time analysis have been implemented at seismic stations in the SIL system in Iceland, as a part of the Icelandic Meteorological Office's participation in the SAFER and TRANSFER projects. These tools include processes to support alert maps and Shake Maps, first steps towards fast magnitude determination based on dominant frequency, and the development of procedures to map faults in near-real-time. Data for alert maps and Shake Maps are obtained using a real-time process that monitors both ground velocity and acceleration in 4 separate frequency bands at each station: 4-50 Hz, 1-10 Hz, 0.25-2.5 Hz and 0.05-0.5 Hz. A reference level is maintained for horizontal and vertical components in each frequency band, such that it is exceeded a few times per hour. When signals exceed this level by more than 50%, a report is sent to the processing center. When 5 or more stations send reports within a time interval of 20 seconds, alert maps are generated. The alert maps show observed values for each station, including peak ground velocity and arrival times for peaks in ground motion and first break. An attempt is also made to solve for the location of the event. The location solution is based on the assumption that the time when the vertical component first exceeds the reference level by a certain amount indicates the arrival of the P wave from an earthquake. Before searching for a location solution, the arrival times for different stations are compared and stations are dropped so that no time differences are greater than the time that it takes a P wave to travel from one station to another. The location calculation uses a fixed depth (4 km) and a parametric travel time curve that is based on observations from South Iceland. All possible combinations of 3 stations are used to compute potential solutions; the location that yields the lowest sum of absolute residuals is then found. Once the location has been determined, a conventional magnitude can be calculated, using recently refined magnitude-distance relations for Icelandic earthquakes. When a good fit is obtained for at least 5 stations, for both arrival times and amplitudes, and the magnitude indicated is greater than 2.0, a Shake Map is generated and placed online automatically. The Shake Maps are usually ready within 2 minutes of the earthquake. The maps can be accessed at http://hraun.vedur.is/ja/alert. This real-time analysis has been operational on over 40 stations in the SIL system since early September 2008. These tools have yielded accurate magnitude estimations for nearly all earthquakes that have been felt in Iceland during this period. In order to extend coverage to surrounding ocean areas, we are working on having access to real-time data from a few seismic stations around the North Atlantic. This should enable early warning for large offshore earthquakes. Mapping of faults in near-real-time is performed using double-difference relocation of automatically located microearthquakes, relative to a library of events already located with high precision, thus taking advantage of the tens of thousands of earthquakes in South Iceland that have been relatively located. Automation of the relocation process is under development. When completed, the process will enable near-real-time delineation of activated faults by the distribution of microearthquakes.
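
    A toy sketch of the three-station grid search described above; the constant P velocity and flat-earth distance below are simplifying assumptions, not the parametric travel-time curve fitted to South Iceland data:

        import itertools
        import math

        VP_KM_S = 6.5      # assumed crustal P velocity (illustrative)
        DEPTH_KM = 4.0     # fixed focal depth, as in the SIL procedure

        def travel_time(epi_km):
            return math.hypot(epi_km, DEPTH_KM) / VP_KM_S

        def dist_km(a, b):
            dlat = math.radians(b[0] - a[0])
            dlon = math.radians(b[1] - a[1]) * math.cos(math.radians(a[0]))
            return 6371.0 * math.hypot(dlat, dlon)

        def locate(stations, picks, grid):
            """stations: {name: (lat, lon)}; picks: {name: P arrival time, s};
            grid: list of candidate (lat, lon). Returns the candidate with the
            smallest sum of absolute residuals over all 3-station combinations."""
            best = (math.inf, None)
            for trio in itertools.combinations(picks, 3):
                for cand in grid:
                    # Origin-time estimates per station; the median absorbs one outlier
                    t0_est = [picks[s] - travel_time(dist_km(stations[s], cand)) for s in trio]
                    t0 = sorted(t0_est)[1]
                    misfit = sum(abs(t - t0) for t in t0_est)
                    if misfit < best[0]:
                        best = (misfit, cand)
            return best[1]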

  6. Retrospective Evaluation of the Five-Year and Ten-Year CSEP-Italy Earthquake Forecasts

    E-print Network

    Werner, M J; Marzocchi, W; Wiemer, S

    2010-01-01

    On 1 August 2009, the global Collaboratory for the Study of Earthquake Predictability (CSEP) launched a prospective and comparative earthquake predictability experiment in Italy. The goal of the CSEP-Italy experiment is to test earthquake occurrence hypotheses that have been formalized as probabilistic earthquake forecasts over temporal scales that range from days to years. In the first round of forecast submissions, members of the CSEP-Italy Working Group presented eighteen five-year and ten-year earthquake forecasts to the European CSEP Testing Center at ETH Zurich. We considered the twelve time-independent earthquake forecasts among this set and evaluated them with respect to past seismicity data from two Italian earthquake catalogs. In this article, we present the results of tests that measure the consistency of the forecasts with the past observations. Besides being an evaluation of the submitted time-independent forecasts, this exercise provided insight into a number of important issues in predictabilit...
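
    One of the standard CSEP consistency checks, the number (N) test, can be sketched as follows; the counts in the example are invented, and the real evaluation also includes spatial and likelihood tests:

        import math

        def poisson_cdf(k, lam):
            """P(X <= k) for a Poisson variable with mean lam (k < 0 gives 0)."""
            return sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k + 1))

        def n_test(n_observed, n_expected, alpha=0.05):
            """Two quantile scores of the N-test: delta1 = P(X >= n_obs),
            delta2 = P(X <= n_obs); the forecast is rejected if either is small."""
            delta1 = 1.0 - poisson_cdf(n_observed - 1, n_expected)
            delta2 = poisson_cdf(n_observed, n_expected)
            consistent = min(delta1, delta2) >= alpha / 2.0
            return delta1, delta2, consistent

        # Hypothetical example: 7 target events observed, 12.4 expected by the forecast
        print(n_test(n_observed=7, n_expected=12.4))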

  7. Development of Diagnostic Reference Levels Using a Real-Time Radiation Dose Monitoring System at a Cardiovascular Center in Korea.

    PubMed

    Kim, Jungsu; Seo, Deoknam; Choi, Inseok; Nam, Sora; Yoon, Yongsu; Kim, Hyunji; Her, Jae; Han, Seonggyu; Kwon, Soonmu; Park, Hunsik; Yang, Dongheon; Kim, Jungmin

    2015-12-01

    Digital cardiovascular angiography accounts for a major portion of the radiation dose among the examinations performed at cardiovascular centres. However, dose-related information is neither monitored nor recorded systematically. This report concerns the construction of a radiation dose monitoring system based on digital imaging and communications in medicine (DICOM) data and its use at the cardiovascular centre of the University Hospitals in Korea. The dose information was analysed according to DICOM standards for a series of procedures, and the formulation of diagnostic reference levels (DRLs) at our cardiovascular centre represents the first of its kind in Korea. We determined a dose area product (DAP) DRL for coronary angiography of 75.6 Gy·cm² and a fluoroscopic time DRL of 318.0 s. The DAP DRL for percutaneous transluminal coronary intervention was 213.3 Gy·cm², and the DRL for fluoroscopic time was 1207.5 s. PMID:25700616

  8. Center of Excellence for Applied Mathematical and Statistical Research in support of development of multicrop production monitoring capability

    NASA Technical Reports Server (NTRS)

    Woodward, W. A.; Gray, H. L.

    1983-01-01

    Efforts in support of the development of multicrop production monitoring capability are reported. In particular, segment level proportion estimation techniques based upon a mixture model were investigated. Efforts have dealt primarily with evaluation of current techniques and development of alternative ones. A comparison of techniques is provided on both simulated and LANDSAT data along with an analysis of the quality of profile variables obtained from LANDSAT data.
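    Segment-level crop proportion estimation from a mixture model can be illustrated with a two-component Gaussian mixture fit to a spectral feature, where the fitted mixing weights estimate the class proportions. This is a generic sketch using scikit-learn (an assumed dependency), not the specific estimators evaluated in the report, and the data are simulated.

```python
import numpy as np
from sklearn.mixture import GaussianMixture  # assumed available

# Simulated 1-D spectral feature for one segment: 30% "crop" pixels, 70% "other".
rng = np.random.default_rng(0)
feature = np.concatenate([rng.normal(0.6, 0.05, 300),
                          rng.normal(0.3, 0.05, 700)]).reshape(-1, 1)

gmm = GaussianMixture(n_components=2, random_state=0).fit(feature)
print(gmm.weights_)  # estimated class proportions for the segment
```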

  9. Earthquake prediction comes of age

    SciTech Connect

    Lindh, A. (Office of Earthquakes, Volcanoes, and Engineering)

    1990-02-01

    In the last decade, scientists have begun to estimate the long-term probability of major earthquakes along the San Andreas fault. In 1985, the U.S. Geological Survey (USGS) issued the first official U.S. government earthquake prediction, based on research along a heavily instrumented 25-kilometer section of the fault in sparsely populated central California. Known as the Parkfield segment, this section of the San Andreas had experienced its last big earthquake, a magnitude 6, in 1966. Estimated probabilities of major quakes along the entire San Andreas by a working group of California earthquake experts, using new geologic data and careful analysis of past earthquakes, are reported.

  10. TOWARD END-TO-END MODELING FOR NUCLEAR EXPLOSION MONITORING: SIMULATION OF UNDERGROUND NUCLEAR EXPLOSIONS AND EARTHQUAKES USING HYDRODYNAMIC AND ANELASTIC SIMULATIONS, HIGH-PERFORMANCE COMPUTING AND THREE-DIMENSIONAL EARTH MODELS

    SciTech Connect

    Rodgers, A; Vorobiev, O; Petersson, A; Sjogreen, B

    2009-07-06

    This paper describes new research being performed to improve understanding of seismic waves generated by underground nuclear explosions (UNE) by using full waveform simulation, high-performance computing and three-dimensional (3D) earth models. The goal of this effort is to develop an end-to-end modeling capability to cover the range of wave propagation required for nuclear explosion monitoring (NEM) from the buried nuclear device to the seismic sensor. The goal of this work is to improve understanding of the physical basis and prediction capabilities of seismic observables for NEM including source and path-propagation effects. We are pursuing research along three main thrusts. Firstly, we are modeling the non-linear hydrodynamic response of geologic materials to underground explosions in order to better understand how source emplacement conditions impact the seismic waves that emerge from the source region and are ultimately observed hundreds or thousands of kilometers away. Empirical evidence shows that the amplitudes and frequency content of seismic waves at all distances are strongly impacted by the physical properties of the source region (e.g. density, strength, porosity). To model the near-source shock-wave motions of a UNE, we use GEODYN, an Eulerian Godunov (finite volume) code incorporating thermodynamically consistent non-linear constitutive relations, including cavity formation, yielding, porous compaction, tensile failure, bulking and damage. In order to propagate motions to seismic distances we are developing a one-way coupling method to pass motions to WPP (a Cartesian anelastic finite difference code). Preliminary investigations of UNEs in canonical materials (granite, tuff and alluvium) confirm that emplacement conditions have a strong effect on seismic amplitudes and the generation of shear waves. Specifically, we find that motions from an explosion in high-strength, low-porosity granite have high compressional wave amplitudes and weak shear waves, while an explosion in low strength, high-porosity alluvium results in much weaker compressional waves and low-frequency compressional and shear waves of nearly equal amplitude. Further work will attempt to model available near-field seismic data from explosions conducted at NTS, where we have accurate characterization of the sub-surface from the wealth of geological and geophysical data from the former nuclear test program. Secondly, we are modeling seismic wave propagation with free-surface topography in WPP. We have modeled the October 9, 2006 and May 25, 2009 North Korean nuclear tests to investigate the impact of rugged topography on seismic waves. Preliminary results indicate that the topographic relief causes complexity in the direct P-waves that leads to azimuthally dependent behavior, and that the topographic gradient to the northeast, east and southeast of the presumed test locations generates stronger shear waves, although each test gives a different pattern. Thirdly, we are modeling intermediate period motions (10-50 seconds) from earthquakes and explosions at regional distances. For these simulations we run SPECFEM3D_GLOBE (a spherical geometry spectral element code). We modeled broadband waveforms from well-characterized and well-observed events in the Middle East and central Asia, as well as the North Korean nuclear tests.
For the recent North Korean test we found that the one-dimensional iasp91 model predicts the observed waveforms quite well in the band 20-50 seconds, while waveform fits for available 3D earth models are generally poor, with some exceptions. Interestingly 3D models can predict energy on the transverse component for an isotropic source presumably due to surface wave mode conversion and/or multipathing.

  11. Fractal dynamics of earthquakes

    SciTech Connect

    Bak, P.; Chen, K.

    1995-05-01

    Many objects in nature, from mountain landscapes to electrical breakdown and turbulence, have a self-similar fractal spatial structure. It seems obvious that to understand the origin of self-similar structures, one must understand the nature of the dynamical processes that created them: temporal and spatial properties must necessarily be completely interwoven. This is particularly true for earthquakes, which have a variety of fractal aspects. The distribution of energy released during earthquakes is given by the Gutenberg-Richter power law. The distribution of epicenters appears to be fractal with dimension D of approximately 1-1.3. The number of aftershocks decays as a function of time according to the Omori power law. There have been several attempts to explain the Gutenberg-Richter law by starting from a fractal distribution of faults or stresses. But this is a hen-and-egg approach: to explain the Gutenberg-Richter law, one assumes the existence of another power law--the fractal distribution. The authors present results of a simple stick-slip model of earthquakes, which evolves to a self-organized critical state. Emphasis is on demonstrating that empirical power laws for earthquakes indicate that the Earth's crust is at the critical state, with no typical time, space, or energy scale. Of course the model is tremendously oversimplified; however in analogy with equilibrium phenomena they do not expect criticality to depend on details of the model (universality).
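    The Gutenberg-Richter law referred to above, log10 N(>=M) = a - bM, can be checked on a catalog with the maximum-likelihood b-value estimator (Aki's formula). The sketch below uses a synthetic catalog in place of real data and is only an illustration of the power-law statistics discussed, not part of the authors' stick-slip model.

```python
import numpy as np

def b_value_mle(magnitudes, completeness_mag):
    """Aki's maximum-likelihood estimate: b = log10(e) / (mean(M) - Mc),
    using only events at or above the completeness magnitude Mc."""
    m = np.asarray(magnitudes)
    m = m[m >= completeness_mag]
    return np.log10(np.e) / (m.mean() - completeness_mag)

# Synthetic catalog drawn from an exponential magnitude distribution with b = 1.
rng = np.random.default_rng(1)
mags = 2.0 + rng.exponential(scale=1.0 / np.log(10), size=5000)
print(b_value_mle(mags, completeness_mag=2.0))  # should come out close to 1.0
```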

  12. Infrasonic observation of earthquakes

    SciTech Connect

    Mutschlecner, J.P.; Whitaker, R.W.

    1998-12-31

    Infrasound signals generated by earthquakes have been detected at arrays operated by the Los Alamos National Laboratory. Three modes of propagation are possible and all have been observed by the authors. The observations suggest that regions remote from the epicenters are excited and may serve as secondary source regions. A relation is found between the normalized peak amplitudes and the seismic magnitudes.

  13. Road Damage Following Earthquake

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Ground shaking triggered liquefaction in a subsurface layer of water-saturated sand, producing differential lateral and vertical movement in an overlying carapace of unliquefied sand and silt, which moved from right to left towards the Pajaro River. This mode of ground failure, termed lateral spreading, is a principal cause of liquefaction-related earthquake damage caused by the Oct. 17, 1989, Loma Prieta earthquake. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: S.D. Ellen, U.S. Geological Survey

  14. Earthquake damage to schools

    USGS Publications Warehouse

    McCullough, Heather

    1994-01-01

    These unusual slides show earthquake damage to school and university buildings around the world. They graphically illustrate the potential danger to our schools, and to the welfare of our children, that results from major earthquakes. The slides range from Algeria, where a collapsed school roof is held up only by students' desks; to Anchorage, Alaska, where an elementary school structure has split in half; to California and other areas, where school buildings have sustained damage to walls, roofs, and chimneys. Interestingly, all the United States earthquakes depicted in this set of slides occurred either on a holiday or before or after school hours, except the 1935 tremor in Helena, Montana, which occurred at 11:35 am. It undoubtedly would have caused casualties had the schools not been closed days earlier by Helena city officials because of a damaging foreshock. Students in Algeria, the People's Republic of China, Armenia, and other stricken countries were not so fortunate. This set of slides represents 17 destructive earthquakes that occurred in 9 countries, and covers more than a century--from 1886 to 1988. Two of the tremors, both of which occurred in the United States, were magnitude 8+ on the Richter Scale, and four were magnitude 7-7.9. The events represented by the slides (see table below) claimed more than a quarter of a million lives.

  15. Fragments, Combustion and Earthquakes

    E-print Network

    Oscar Sotolongo-Costa; Antonio Posadas

    2005-03-16

    This paper is devoted to showing the advantages of introducing a geometric viewpoint and a nonextensive formulation in the description of apparently unrelated phenomena: combustion and earthquakes. Here, it is shown how the introduction of a fragmentation analysis based on that formulation leads to a common framework for the description of these phenomena

  16. Analysis of rupture area of aftershocks caused by twin earthquakes (Case study: 11 April 2012 earthquakes of Aceh-North Sumatra)

    SciTech Connect

    Diansari, Angga Vertika; Purwana, Ibnu; Subakti, Hendri

    2015-04-24

    The 11 April 2012 earthquakes offshore Aceh-North Sumatra are unique events in the history of Indonesian earthquakes. They are unique because they have similar magnitudes, 8.5 Mw and 8.1 Mw; close epicentral distances; similar strike-slip focal mechanisms; and occurred in the outer rise area. The purposes of this research are: (1) comparing the rupture areas of the earthquakes based on models with those obtained by calculation, (2) fitting the shape and the area of the earthquake rupture zones, (3) analyzing the relationship between rupture area and magnitude of the earthquakes. Rupture areas of the earthquake faults are determined by using 4 different formulas, i.e. Utsu and Seki (1954), Wells and Coppersmith (1994), Ellsworth (2003), and Christophersen and Smith (2000). The earthquake aftershock parameters are taken from PGN (Pusat Gempabumi Nasional, or National Earthquake Information Center) of BMKG (the Indonesian Agency for Meteorology, Climatology and Geophysics). The aftershock epicenters are plotted with the GMT software. After that, ellipse and rectangular models of the aftershock spreading are made. The results show that: (1) the rupture areas calculated using the magnitude relationships are larger than those of the aftershock distribution models, (2) the best-fitting model for the earthquake aftershock distribution is the rectangle associated with the Utsu and Seki (1954) formula, (3) the larger the magnitude of the earthquake, the larger the area of the fault.
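    For context, the magnitude-to-rupture-area step can be sketched with the commonly quoted Wells and Coppersmith (1994) all-slip-type regression, log10 A = -3.49 + 0.91 Mw with A in km^2. The coefficients below are the values usually cited and are quoted from memory here; they should be verified against the original paper before any real use.

```python
def rupture_area_wells_coppersmith(mw, a=-3.49, b=0.91):
    """Rupture area (km^2) from moment magnitude using the Wells & Coppersmith (1994)
    all-slip-type regression log10(A) = a + b*Mw (coefficients assumed, verify)."""
    return 10 ** (a + b * mw)

for mw in (8.1, 8.5):  # the 11 April 2012 Aceh-North Sumatra twin events
    print(mw, round(rupture_area_wells_coppersmith(mw)), "km^2")
```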

  17. Long term (2004-2013) correlation analysis among SSTAs (Significant Sequences of TIR Anomalies) and Earthquakes (M>4) occurrence over Greece: examples of application within a multi-parametric system for continuous seismic hazard monitoring.

    NASA Astrophysics Data System (ADS)

    Tramutoli, Valerio; Coviello, Irina; Eleftheriou, Alexander; Filizzola, Carolina; Genzano, Nicola; Lacava, Teodosio; Lisi, Mariano; Makris, John P.; Paciello, Rossana; Pergola, Nicola; Satriano, Valeria; Vallianatos, Filippos

    2015-04-01

    Real-time integration of multi-parametric observations is expected to significantly contribute to the development of operational systems for time-Dependent Assessment of Seismic Hazard (t-DASH) and short-term (from days to weeks) earthquake forecasting. However, a very preliminary step in this direction is the identification of those parameters (chemical, physical, biological, etc.) whose anomalous variations can be, to some extent, associated with the complex process of preparation of major earthquakes. In this paper one of these parameters (the Earth's emitted radiation in the Thermal Infra-Red spectral region) is considered for its possible correlation with M≥4 earthquakes that occurred in Greece between 2004 and 2013. The RST (Robust Satellite Technique) data analysis approach and RETIRA (Robust Estimator of TIR Anomalies) index were used to preliminarily define, and then to identify, Significant Sequences of TIR Anomalies (SSTAs) in 10 years (2004-2013) of daily TIR images acquired by the Spinning Enhanced Visible and Infrared Imager (SEVIRI) on board the Meteosat Second Generation (MSG) satellite. Taking into account physical models proposed for justifying the existence of a correlation between TIR anomalies and earthquake occurrence, specific validation rules (in line with the ones used by the Collaboratory for the Study of Earthquake Predictability - CSEP - Project) have been defined to drive the correlation analysis process. The analysis shows that more than 93% of all identified SSTAs occur in the pre-fixed space-time window around the times and locations of occurrence of (M≥4) earthquakes, with a false positive rate smaller than 7%. The achieved results, and particularly the very low rate of false positives registered over such a long testing period, seem already sufficient (at least) to qualify TIR anomalies (identified by the RST approach and RETIRA index) among the parameters to be considered in the framework of a multi-parametric approach to time-Dependent Assessment of Seismic Hazard (t-DASH). The added value of real-time integration of such observations with others, independently performed from ground and satellite sensors, is also shown in the case of recent events that occurred in Greece.
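    In essence, the RETIRA index is a standardized anomaly of the TIR signal relative to its multi-year statistics at each pixel. The sketch below computes such an index from a stack of co-located reference images; the full RST preprocessing (cloud screening, removal of the scene spatial average, homogeneity of month and overpass time) is omitted, the data are random placeholders, and the threshold is purely illustrative.

```python
import numpy as np

def tir_anomaly_index(historical_stack, current_image):
    """Standardized per-pixel TIR anomaly: (current - mean) / std over a stack of
    homogeneous reference images. A simplified stand-in for the RST/RETIRA chain."""
    mean = np.nanmean(historical_stack, axis=0)
    std = np.nanstd(historical_stack, axis=0)
    return (current_image - mean) / std

# Placeholder data: 120 reference brightness-temperature images (K) on a 50x50 grid.
rng = np.random.default_rng(2)
stack = rng.normal(290.0, 2.0, size=(120, 50, 50))
current = rng.normal(290.0, 2.0, size=(50, 50))
index = tir_anomaly_index(stack, current)
print((index > 3.0).sum(), "pixels above an illustrative +3 sigma threshold")
```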

  18. Cruise report for 01-99-SC: southern California earthquake hazards project

    USGS Publications Warehouse

    Normark, William R.; Reid, Jane A.; Sliter, Ray W.; Holton, David; Gutmacher, Christina E.; Fisher, Michael A.; Childs, Jonathan R.

    1999-01-01

    The focus of the Southern California Earthquake Hazards project is to identify the landslide and earthquake hazards and related ground-deformation processes occurring in the offshore areas that have significant potential to impact the inhabitants of the Southern California coastal region. The project activity is supported through the Coastal and Marine Geology Program of the Geologic Division of the U. S. Geological Survey (USGS) and is a component of the Geologic Division's Science Strategy under Goal 1—Conduct Geologic Hazard Assessments for Mitigation Planning (Bohlen et al., 1998). The project research is specifically stated under Activity 1.1.2 of the Science Strategy: Earthquake Hazard Assessments and Loss Reduction Products in Urban Regions. This activity involves "research, seismic and geodetic monitoring, field studies, geologic mapping, and analyses needed to provide seismic hazard assessments of major urban centers in earthquake-prone regions including adjoining coastal and offshore areas." The southern California urban areas, which form the most populated urban corridor along the U.S. Pacific margin, are among a few specifically designated for special emphasis under the Division's science strategy (Bohlen et al., 1998). The primary objective of the project is to help mitigate the earthquake hazards for the Southern California region by improving our understanding of how deformation is distributed (spatially and temporally) in the offshore with respect to the onshore region. To meet this objective, we are conducting field investigations to observe the distribution, character, and relative intensity of active (i.e., primarily Holocene) deformation within the basins and along the shelf adjacent to the most highly populated areas (Fig. 1). In addition, acoustic imaging should help determine the subsurface dimensions of the faults and identify the size and frequency of submarine landslides, both of which are necessary for evaluating the potential for generating destructive tsunamis in the southern California offshore. In order to evaluate the strain associated with the offshore structures, the initial results from the field mapping under this project will be used to identify possible sites for deployment of acoustic geodetic instruments to monitor strain in the offshore region. A major goal of mapping under this project is to provide detailed geologic and geophysical information in GIS data bases that build on the earlier studies and use the new data to precisely locate active faults and to map recent submarine landslide deposits.

  19. World Conference on Earthquake Engineering Vancouver, B.C., Canada

    E-print Network

    Todorovska, Maria I.

    This paper, by Maria I. Todorovska and Tzong-Ying Hao, explores the advantages of data mining approaches for mining large data sets of ground and structural response vibration monitoring data recorded under earthquake excitation, for possible application to the monitoring of ground or structural response.

  20. Earthquake precursory events around epicenters and local active faults

    NASA Astrophysics Data System (ADS)

    Valizadeh Alvan, H.; Mansor, S. B.; Haydari Azad, F.

    2013-05-01

    The chain of underground events which are triggered by seismic activities and physical/chemical interactions prior to a shake in the earth's crust may produce surface and above-surface phenomena. During the past decades many researchers have been drawn to seek the possibility of short-term earthquake prediction using remote sensing data. Currently, there are several theories about the preparation stages of earthquakes, most of which stress rises in heat and seismic activity as the main signs of an impending earthquake. Their differences only lie in the secondary phenomena which are triggered by these events. In any case, with the recent advances in remote sensing sensors and techniques we are now able to provide wider, more accurate monitoring of land, ocean and atmosphere. Among all theoretical factors, changes in Surface Latent Heat Flux (SLHF), Sea & Land Surface Temperature (SST & LST) and surface chlorophyll-a are easier to record from earth observing satellites. SLHF is the amount of energy exchange in the form of water vapor between the earth's surface and atmosphere. Abnormal variations in this factor have been frequently reported as an earthquake precursor during the past years. The accumulated stress in the earth's crust during the preparation phase of earthquakes is said to be the main cause of temperature anomalies weeks to days before the main event and subsequent shakes. Chemical and physical interactions in the presence of underground water lead to higher water evaporation prior to inland earthquakes. In the case of oceanic earthquakes, higher temperatures at the ocean bed may lead to higher amounts of Chl-a on the sea surface. On the other hand, it has also been said that the leak of radon gas, which occurs as rocks break during earthquake preparation, causes the formation of airborne ions and higher Air Temperature (AT). We have chosen to perform a statistical, long-term, and short-term approach by considering the recurrence intervals of past shakes, mapping foreshocks and aftershocks, and following changes in the above-mentioned precursors prior to past earthquake instances all over the globe. Our analyses also encompass the geographical location and extents of local and regional faults, which are considered important factors during earthquakes. The co-analysis of direct and indirect observations for precursory events is considered a promising method for possible future successful earthquake predictions. With proper and thorough knowledge about the geological setting, atmospheric factors and geodynamics of the earthquake-prone regions we will be able to identify anomalies due to seismic activity in the earth's crust.

  1. System of Earthquakes Alert (SEA) on the territory of Bulgaria developed as a result of DACEA project

    NASA Astrophysics Data System (ADS)

    Solakov, Dimcho; Dimitrova, Liliya; Simeonova, Stela; Aleksandrova, Irena; Stoyanov, Stoyan; Metodiev, Metodi

    2013-04-01

    The prevention of natural disasters and the management of reactions to crises are common problems for many countries. The Romania-Bulgaria border region is significantly affected by earthquakes occurring in both territories: on the one hand, the Vrancea seismic source, with intermediate-depth events, and on the other hand, crustal seismicity recorded in the northern part of Bulgaria (Shabla, Dulovo, Gorna Orjahovitza). The general objective of the DACEA project (2010-2013) is to develop a system of earthquake alert in order to prevent the natural disasters caused by earthquakes in the cross-border area, taking into account the nuclear power plants and other chemical plants located along the Danube on the territories of Romania and Bulgaria. An integrated warning system is designed and implemented in the cross-border area. A seismic detection network is put in operation in order to warn the bodies in charge of emergency situation management in case of seismic danger. The main purposes of this network are: • monitoring of the four seismogenic areas relevant for the cross-border area, in order to detect dangerous earthquakes • sending the seismic warning signals within several seconds to the local public authorities in the cross-border area. On the territory of Bulgaria the seismic network belonging to SEA consists of: • 8 seismic stations equipped with a Basalt digitizer, an Epi-sensor accelerometer and a KS2000 BB seismometer. • 8 seismic stations equipped with a Basalt digitizer, an Epi-sensor accelerometer, and warning and visual monitoring equipment. The stations are distributed all over northern Bulgaria. The sites were thoroughly examined and the most important requirement was a low level of noise or vibrations. SEA centers were established both in Sofia (at the National Institute of Geophysics, Geodesy and Geography - NIGGG) and Bucharest (at the National Institute of Research and Development for Earth Physics). Both centers are equipped with servers for data analysis and storage. Specialized software for the elaboration of seismic hazard scenarios is designed and implemented. The reaction of buildings, roads, bridges, land etc. to earthquakes is graphically shown on the monitor. The high-risk areas are highlighted in order for the emergency units to be prepared for intervention. This software is designed on the basis of a comprehensive relational database of historical and contemporary seismicity in the cross-border region. The output shake maps and scenarios are to be used by the emergency intervention units and local public authorities, and for general public awareness.

  2. The 1868 Hayward Earthquake Alliance: A Case Study - Using an Earthquake Anniversary to Promote Earthquake Preparedness

    NASA Astrophysics Data System (ADS)

    Brocher, T. M.; Garcia, S.; Aagaard, B. T.; Boatwright, J. J.; Dawson, T.; Hellweg, M.; Knudsen, K. L.; Perkins, J.; Schwartz, D. P.; Stoffer, P. W.; Zoback, M.

    2008-12-01

    Last October 21st marked the 140th anniversary of the M6.8 1868 Hayward Earthquake, the last damaging earthquake on the southern Hayward Fault. This anniversary was used to help publicize the seismic hazards associated with the fault because: (1) the past five such earthquakes on the Hayward Fault occurred about 140 years apart on average, and (2) the Hayward-Rodgers Creek Fault system is the most likely (with a 31 percent probability) fault in the Bay Area to produce a M6.7 or greater earthquake in the next 30 years. To promote earthquake awareness and preparedness, over 140 public and private agencies and companies and many individuals joined the public-private nonprofit 1868 Hayward Earthquake Alliance (1868alliance.org). The Alliance sponsored many activities including a public commemoration at Mission San Jose in Fremont, which survived the 1868 earthquake. This event was followed by an earthquake drill at Bay Area schools involving more than 70,000 students. The anniversary prompted the Silver Sentinel, an earthquake response exercise based on the scenario of an earthquake on the Hayward Fault conducted by Bay Area County Offices of Emergency Services. 60 other public and private agencies also participated in this exercise. The California Seismic Safety Commission and KPIX (CBS affiliate) produced professional videos designed for school classrooms promoting Drop, Cover, and Hold On. Starting in October 2007, the Alliance and the U.S. Geological Survey held a sequence of press conferences to announce the release of new research on the Hayward Fault as well as new loss estimates for a Hayward Fault earthquake. These included: (1) a ShakeMap for the 1868 Hayward earthquake, (2) a report by the U. S. Bureau of Labor Statistics forecasting the number of employees, employers, and wages predicted to be within areas most strongly shaken by a Hayward Fault earthquake, (3) new estimates of the losses associated with a Hayward Fault earthquake, (4) new ground motion simulations of a Hayward Fault earthquake, (5) a new USGS Fact Sheet about the earthquake and the Hayward Fault, (6) a virtual tour of the 1868 earthquake, and (7) a new online field trip guide to the Hayward Fault using locations accessible by car and public transit. Finally, the California Geological Survey and many other Alliance members sponsored the Third Conference on Earthquake Hazards in the East Bay at CSU East Bay in Hayward for the three days following the 140th anniversary. The 1868 Alliance hopes to commemorate the anniversary of the 1868 Hayward Earthquake every year to maintain and increase public awareness of this fault, the hazards it and other East Bay Faults pose, and the ongoing need for earthquake preparedness and mitigation.

  3. Earthquake Swarm in Armutlu Peninsula, Eastern Marmara Region, Turkey

    NASA Astrophysics Data System (ADS)

    Yavuz, Evrim; Çaka, Deniz; Tunç, Berna; Serkan Irmak, T.; Woith, Heiko; Cesca, Simone; Lühr, Birger-Gottfried; Barış, Şerif

    2015-04-01

    The most active fault system of Turkey is the North Anatolian Fault Zone, which caused two large earthquakes in 1999. These two earthquakes affected the eastern Marmara region destructively. An unbroken part of the North Anatolian Fault Zone crosses north of the Armutlu Peninsula in an east-west direction. This branch is also located quite close to Istanbul, a megacity in terms of its population and its economic and social importance. A new cluster of microseismic activity occurred in the direct vicinity southeast of the Yalova Termal area. Activity started on August 2, 2014 with a series of micro events, and then on August 3, 2014 an event of local magnitude 4.1 occurred; more than 1000 events followed until August 31, 2014. Thus we tentatively call this a swarm-like activity. Therefore, investigation of the micro-earthquake activity of the Armutlu Peninsula has become important to understand the relationship between the occurrence of micro-earthquakes and the tectonic structure of the region. For these reasons, the Armutlu Network (ARNET), installed at the end of 2005 and currently equipped with 27 active seismic stations operated by the Kocaeli University Earth and Space Sciences Research Center (ESSRC) and the Helmholtz-Zentrum Potsdam Deutsches GeoForschungsZentrum (GFZ), is a very dense network tool able to record even micro-earthquakes in this region. In the 30-day period of August 2 to 31, 2014, the Kandilli Observatory and Earthquake Research Institute (KOERI) announced 120 local earthquakes with magnitudes ranging between 0.7 and 4.1, but ARNET provided more than 1000 earthquakes for analysis over the same time period. In this study, earthquakes of the swarm area and vicinity determined by ARNET were investigated. The focal mechanism of the August 3, 2014 22:22:42 (GMT) earthquake with local magnitude (Ml) 4.0 is obtained from the moment tensor solution. According to the solution, it indicates normal faulting with a dextral component. The obtained focal mechanism solution is consistent with the features of local faults in the region. The spatial vicinity of the earthquake swarm and the Yalova geothermal area may suggest a physical link between the ongoing exploitation of the reservoir and the earthquake activity. Keywords: Earthquake swarm, Armutlu Peninsula, ARNET, geothermal activity

  4. Converter Compressor Building, SWMU 089, Hot Spot Areas 1, 2, and 5 Operations, Maintenance, and Monitoring Report, Kennedy Space Center, Florida

    NASA Technical Reports Server (NTRS)

    Wilson, Deborah M.

    2015-01-01

    This Operations, Maintenance, and Monitoring Report (OMMR) presents the findings, observations, and results from operation of the air sparging (AS) interim measure (IM) for Hot Spot (HS) Areas 1, 2, and 5 at the Converter Compressor Building (CCB) located at Kennedy Space Center (KSC), Florida. The objective of the IM at CCB HS Areas 1, 2, and 5 is to decrease concentrations of volatile organic compounds (VOCs) in groundwater in the treatment zones via AS to levels that will enable a transition to a monitored natural attenuation (MNA) phase. This OMMR presents system operations and maintenance (O&M) information and performance monitoring results since full-scale O&M began in June 2014 (2 months after initial system startup in April 2014), including quarterly performance monitoring events in July and October 2014 and January and May 2015. Based on the results to date, the AS system is operating as designed and is meeting the performance criteria and IM objective. The performance monitoring network is adequately constructed for assessment of IM performance at CCB HS Areas 1, 2, and 5. At the March 2014 KSC Remediation Team (KSCRT) Meeting, team consensus was reached for the design prepared for expansion of the system to treat the HS 4 area, and at the November 2014 KSCRT Meeting, team consensus was reached that HS 3 was adequately delineated horizontally and vertically and for selection of AS as the remedial approach for HS 3. At the July 2015 KSCRT meeting, team consensus was reached to continue IM operations in all zones until HSs 3 and 4 are operational; once the HS 3 and 4 zones are operational, operations will be discontinued in the HS 1, 2, and 5 zones where concentrations are less than GCTLs, to observe whether rebound occurs. Team consensus was also reached to continue quarterly performance monitoring to determine whether operational zones achieve GCTLs and to continue annual IGWM of CCB-MW0012, CCB-MW0013, and CCB-MW0056, located south of the treatment area. The next performance monitoring event is scheduled for July 2015.

  5. Cooperative Monitoring Center Occasional Paper/16: The Potential of Technology for the Control of Small Weapons: Applications in Developing Countries

    SciTech Connect

    ALTMANN, JURGEN

    2000-07-01

    For improving the control of small arms, technology provides many possibilities. Present and future technical means are described in several areas. With the help of sensors deployed on the ground or on board aircraft, larger areas can be monitored. Using tags, seals, and locks, important objects and installations can be safeguarded better. With modern data processing and communication systems, more information can be available, and it can be more speedily processed. Together with navigation and transport equipment, action can be taken faster and at greater range. Particular considerations are presented for cargo control at roads, seaports, and airports, for monitoring designated lines, and for the control of legal arms. By starting at a modest level, costs can be kept low, which would aid developing countries. From the menu of technologies available, systems need to be designed for the intended application and with an understanding of the local conditions. It is recommended that states start with short-term steps, such as acquiring more and better radio transceivers, vehicles, small aircraft, and personal computers. For the medium term, states should begin with experiments and field testing of technologies such as tags, sensors, and digital communication equipment.

  6. Catalog of earthquake hypocenters at Alaskan Volcanoes: January 1 through December 31, 2011

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.; Searcy, Cheryl K.

    2012-01-01

    Between January 1 and December 31, 2011, the Alaska Volcano Observatory (AVO) located 4,364 earthquakes, of which 3,651 occurred within 20 kilometers of the 33 volcanoes with seismograph subnetworks. There was no significant seismic activity above background levels in 2011 at these instrumented volcanic centers. This catalog includes locations, magnitudes, and statistics of the earthquakes located in 2011 with the station parameters, velocity models, and other files used to locate these earthquakes.

  7. Bladder Monitor

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Diagnostic Ultrasound Corporation's Bladder Scan Monitor continuously records and monitors bladder fullness and alerts the wearer or caretaker when voiding is required. The sensor is held against the lower abdomen by a belt and connected to the monitor by a cable. The sensor obtains bladder volume data from sound waves reflecting off the bladder wall. The device was developed by Langley Research Center, the Ames Research Center and the NASA Technology Applications Team. It utilizes Langley's advanced ultrasound technology. It is licensed to the ARC for medical applications, and sublicensed to Diagnostics Ultrasound. Central monitoring systems are planned for the future.

  8. Rapid earthquake characterization using MEMS accelerometers and volunteer hosts following the M 7.2 Darfield, New Zealand, Earthquake

    USGS Publications Warehouse

    Lawrence, J. F.; Cochran, E.S.; Chung, A.; Kaiser, A.; Christensen, C. M.; Allen, R.; Baker, J.W.; Fry, B.; Heaton, T.; Kilb, Debi; Kohler, M.D.; Taufer, M.

    2014-01-01

    We test the feasibility of rapidly detecting and characterizing earthquakes with the Quake-Catcher Network (QCN) that connects low-cost microelectromechanical systems accelerometers to a network of volunteer-owned, Internet-connected computers. Following the 3 September 2010 M 7.2 Darfield, New Zealand, earthquake we installed over 180 QCN sensors in the Christchurch region to record the aftershock sequence. The sensors are monitored continuously by the host computer and send trigger reports to the central server. The central server correlates incoming triggers to detect when an earthquake has occurred. The location and magnitude are then rapidly estimated from a minimal set of received ground-motion parameters. Full seismic time series are typically not retrieved for tens of minutes or even hours after an event. We benchmark the QCN real-time detection performance against the GNS Science GeoNet earthquake catalog. Under normal network operations, QCN detects and characterizes earthquakes within 9.1 s of the earthquake rupture and determines the magnitude within 1 magnitude unit of that reported in the GNS catalog for 90% of the detections.
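    The central-server step described above (correlating incoming triggers to declare an event) can be illustrated with a simple time clustering of trigger reports. The data structure, window length and station threshold below are illustrative assumptions, not the QCN production association algorithm.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Trigger:
    station_id: str
    time_s: float   # trigger time (epoch seconds)
    lat: float
    lon: float

def associate(triggers: List[Trigger], window_s: float = 10.0,
              min_stations: int = 4) -> List[List[Trigger]]:
    """Group triggers into candidate events: any set of at least min_stations
    distinct stations whose trigger times fall within window_s seconds is
    declared a detection. Illustrative only."""
    events = []
    pool = sorted(triggers, key=lambda t: t.time_s)
    i = 0
    while i < len(pool):
        cluster = [t for t in pool if 0 <= t.time_s - pool[i].time_s <= window_s]
        if len({t.station_id for t in cluster}) >= min_stations:
            events.append(cluster)
            i += len(cluster)   # skip past the triggers already associated
        else:
            i += 1
    return events
```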

  9. Earthquake precursory events around epicenters and local active faults; the cases of two inland earthquakes in Iran

    NASA Astrophysics Data System (ADS)

    Valizadeh Alvan, H.; Mansor, S.; Haydari Azad, F.

    2012-12-01

    The possibility of earthquake prediction in the time frame of several days to a few minutes before occurrence has recently stirred interest among researchers. Scientists believe that the new theories and explanations of the mechanism of this natural phenomenon are trustworthy and can be the basis of future prediction efforts. During the last thirty years, experimental research has identified some pre-earthquake events which are now recognized as confirmed warning signs (precursors) of past known earthquakes. With the advances in in-situ measurement devices and data analysis capabilities and the emergence of satellite-based data collectors, monitoring the earth's surface is now routine work. Data providers are supplying researchers from all over the world with high-quality, validated imagery and non-imagery data. Surface Latent Heat Flux (SLHF), or the amount of energy exchange in the form of water vapor between the earth's surface and atmosphere, has been frequently reported as an earthquake precursor during the past years. The accumulated stress in the earth's crust during the preparation phase of earthquakes is said to be the main cause of temperature anomalies weeks to days before the main event and subsequent shakes. Chemical and physical interactions in the presence of underground water lead to higher water evaporation prior to inland earthquakes. On the other hand, the leak of radon gas that occurs as rocks break during earthquake preparation causes the formation of airborne ions and higher Air Temperature (AT) prior to the main event. Although co-analysis of direct and indirect observations for precursory events is considered a promising method for future successful earthquake prediction, without proper and thorough knowledge about the geological setting, atmospheric factors and geodynamics of the earthquake-prone regions we will not be able to identify anomalies due to seismic activity in the earth's crust. Active faulting is a key factor in identification of the source and propagation of seismic waves. In many cases, active faults are capable of buildup and sudden release of tectonic stress. Hence, monitoring the active fault systems near epicentral regions of past earthquakes would be a necessity. In this paper, we try to detect possible anomalies in SLHF and AT during two moderate earthquakes of M 6 - 6.5 in Iran and explain the relationships between the seismic activities prior to these earthquakes and active faulting in the area. Our analysis shows abnormal SLHF 5~10 days before these earthquakes. Meaningful anomalous concentrations usually occurred in the epicentral area. On the other hand, the spatial distributions of these variations were in accordance with the local active faults. It is concluded that the anomalous increase in SLHF shows great potential in providing early warning of a disastrous earthquake, provided that there is a better understanding of the background noise due to the seasonal effects and climatic factors involved. Changes in near-surface air temperature along nearby active faults, one or two weeks before the earthquakes, although not as significant as the SLHF changes, can be considered as another earthquake indicator.

  10. Turkish Compulsory Earthquake Insurance (TCIP)

    NASA Astrophysics Data System (ADS)

    Erdik, M.; Durukal, E.; Sesetyan, K.

    2009-04-01

    Through a World Bank project a government-sponsored Turkish Catastrophic Insurance Pool (TCIP) was created in 2000 with the essential aim of transferring the government's financial burden of replacing earthquake-damaged housing to international reinsurance and capital markets. Providing coverage to about 2.9 million homeowners, TCIP is the largest insurance program in the country with about 0.5 billion USD in its own reserves and about 2.3 billion USD in total claims paying capacity. The total payment for earthquake damage since 2000 (mostly small, 226 earthquakes) amounts to about 13 million USD. The country-wide penetration rate is about 22%, highest in the Marmara region (30%) and lowest in south-east Turkey (9%). TCIP is the sole-source provider of earthquake loss coverage up to 90,000 USD per house. The annual premium, categorized on the basis of earthquake zone and type of structure, is about US$90 for a 100 square meter reinforced concrete building in the most hazardous zone with a 2% deductible. The earthquake-engineering-related shortcomings of the TCIP are exemplified by the fact that the average rate of 0.13% (for reinforced concrete buildings) with only a 2% deductible is rather low compared to countries with similar earthquake exposure. From an earthquake engineering point of view the risk underwriting (typification of housing units to be insured, earthquake intensity zonation and the sum insured) of the TCIP needs to be overhauled. Especially for large cities, models can be developed where the expected earthquake performance of a housing unit (and consequently the insurance premium) can be assessed on the basis of the location of the unit (microzoned earthquake hazard) and basic structural attributes (earthquake vulnerability relationships). With such an approach, in the future the TCIP can contribute to the control of construction through differentiation of premia on the basis of earthquake vulnerability.

  11. Earthquake Fast Facts Below is a list of earthquake fast facts. Did you know...?

    E-print Network

    Oklahoma, University of

    Earthquake Fast Facts Below is a list of earthquake fast facts. Did you know...? Earthquakes can strike at any time of the day or night. Smaller earthquakes often follow the main shock. An earthquake is caused by the breaking and shifting of rock beneath the earth's surface. Ground shaking from earthquakes can collapse buildings

  12. The Global Earthquake Explorer: A Versatile Tool for Educational Seismology

    NASA Astrophysics Data System (ADS)

    Owens, T. J.; Crotwell, P.

    2004-12-01

    User-friendly access, suitable for an educational environment, to the vast IRIS seismological data holdings has been a stated goal of the Education & Outreach community for some time. The Global Earthquake Explorer (GEE) utilizes advanced data access technology hidden by an intuitive map-based interface to provide educational users with full access to data from the IRIS Data Management Center. Within minutes of a significant earthquake anywhere in the world, seismograms of that earthquake are transmitted to center recording facilities for analysis. Designed with education in mind, GEE can access these same data sources used by professional seismologists through a clickable map interface that allows users to easily select the earthquake and seismograph stations of interest and then receive the seismograms over the Internet with a single click of a mouse. With GEE, users can then view and analyze these seismograms on their local computer. GEE is also a teaching tool. It offers teachers a simple and fun way to introduce their students to earthquakes, earth structure, and wave properties. GEE includes several structured Learning Modules that help develop an elementary understanding of physical principles behind earthquakes and seismology.
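    GEE's point-and-click retrieval sits on top of the same openly available IRIS DMC services that can also be scripted directly. A minimal sketch using ObsPy's FDSN client is shown below; the network, station, channel, and time window are placeholders, and this is not the GEE code itself, only an illustration of the kind of data access it wraps.

```python
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

# Fetch one hour of broadband vertical-component data from the IRIS DMC.
client = Client("IRIS")
t0 = UTCDateTime("2004-12-26T01:00:00")   # placeholder start time
stream = client.get_waveforms(network="IU", station="ANMO", location="00",
                              channel="BHZ", starttime=t0, endtime=t0 + 3600)
stream.plot()  # quick-look seismogram, similar in spirit to GEE's display
```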

  13. Deformation processes in great subduction zone earthquake cycles

    NASA Astrophysics Data System (ADS)

    Hu, Y.; Wang, K.; He, J.

    2011-12-01

    Crustal deformation associated with great subduction zone earthquakes yields important information on mantle rheology and slip evolution of the megathrust. We have used three-dimensional viscoelastic finite element models to study the contemporary crustal deformation of three margins, Sumatra, Chile, and Cascadia, that are presently at different stages of their great earthquake cycles. At Sumatra, where an Mw 9.2 earthquake occurred in 2004, all the GPS stations are moving seaward. At Chile, where an Mw 9.5 earthquake occurred in 1960, coastal GPS stations are moving landward, obviously due to the re-locking of the fault, while the inland stations are still moving seaward. At Cascadia, where an Mw 9.0 earthquake occurred in 1700, all the GPS stations are moving landward. The earthquake cycle deformation at Alaska, where an Mw 9.2 earthquake occurred in 1964, is similar to that of Chile, and the deformation at NE Japan, where an Mw 9.0 earthquake occurred in 2011, is similar to that of Sumatra. Model results indicate that the earthquake cycle deformation of different margins is governed by a common physical process. A great earthquake causes the upper plate to move towards the trench and induces shear stresses in the upper mantle. After the earthquake, the fault is re-locked, causing the upper plate to move landward. However, portions of the fault undergo aseismic afterslip for a short duration, causing the overriding areas to move seaward. At the same time, the viscoelastic stress relaxation of the upper mantle causes prolonged seaward motion in inland areas including the forearc and the back arc. After a long time, when the earthquake-induced stresses have mostly relaxed, the upper plate moves landward due to the re-locking of the fault. The model of the 2004 Sumatra earthquake indicates that the afterslip must be at work immediately after the earthquake, and the characteristic time of the afterslip is ~1 yr. With the incorporation of the transient (biviscous) rheology, the model well explains the near-field and far-field postseismic deformation within a few years after the 2004 Sumatra event. For all the margins modeled, the steady-state (Maxwell) viscosity of the continental upper mantle is determined to be ~10^19 Pa s, two orders of magnitude lower than the global value obtained through global postglacial rebound analyses. Based on the model for the 2004 Sumatra earthquake, the transient (Kelvin) viscosity of the continental mantle is one to two orders of magnitude lower than the steady-state viscosity. Long-term postseismic deformation is controlled mainly by the steady-state viscosity of the mantle and is relatively better understood. For the short-term postseismic deformation, the interaction of the afterslip of the fault and the transient deformation of the mantle is still poorly understood. Geodetic monitoring following the 2010 Mw 8.8 Maule earthquake and the 2011 Mw 9.0 Tohoku earthquake is expected to greatly improve our understanding of the short-term deformation over the next few years.
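    The ~10^19 Pa s steady-state viscosity quoted above implies a Maxwell relaxation time of only a few years, which is why the viscoelastic seaward motion persists well beyond the afterslip phase. The shear modulus used below is an assumed typical upper-mantle value, not one stated in the abstract, so the result is a rough order-of-magnitude check.

```python
SECONDS_PER_YEAR = 3.156e7

def maxwell_relaxation_time_years(viscosity_pa_s, shear_modulus_pa=6.4e10):
    """Maxwell time tau = eta / mu, converted to years.
    The shear modulus is an assumed typical upper-mantle value (~64 GPa)."""
    return viscosity_pa_s / shear_modulus_pa / SECONDS_PER_YEAR

print(maxwell_relaxation_time_years(1e19))   # roughly 5 years
print(maxwell_relaxation_time_years(1e21))   # ~500 years for the global rebound value
```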

  14. A possible resonance mechanism of earthquakes

    NASA Astrophysics Data System (ADS)

    Flambaum, V. V.; Pavlov, B. S.

    2015-10-01

    It had been observed by Linkov et al. (Doklady Akademii Nauk, Physics of Earth, 313, 23-25, 1992) that there exist periodic 4-6 h pulses of ~200 μHz seismogravitational oscillations (SGO) before 95% of powerful earthquakes. We explain this by beating between an oscillation eigenmode of a whole tectonic plate and a local eigenmode of an active zone. The beating transfers the oscillation energy from the remote zone of the tectonic plate to the active zone, triggering the earthquake. Oscillation frequencies of the plate and those of the active zone are tuned to a resonance by an additional compression applied to the active zone due to collision of neighboring plates or the magma flow in the liquid underlay of the asthenosphere (the upper mantle). In the case when there are three or more SGO with incommensurable difference frequencies ω_m - ω_n, the SGO beating pattern looks quasi-random, thus masking the non-random nature of the beating process. Nevertheless, we are able to discuss the possibility of short-term earthquake prediction based on accurate monitoring of the beating dynamics.

  15. Analysis of the seismicity in the region of Mirovo salt mine after 8 years monitoring

    NASA Astrophysics Data System (ADS)

    Dimitrova, Liliya; Solakov, Dimcho; Simeonova, Stela; Aleksandrova, Irena; Georgieva, Gergana

    2015-04-01

    The Mirovo salt deposit is situated in the NE part of Bulgaria, about 5 kilometers from the town of Provadiya. The mine has been in operation since 1956. The salt is produced by dilution and extraction of the brine to the surface. A system of chambers and pillars is formed within the salt body as a result of the applied technology. The mine is situated in a seismically quiet part of the country. The region is characterized by a complex geological structure and several faults. During the last 3 decades a large number of small and moderate earthquakes (M<4.5) have occurred in the close vicinity of the salt deposit. A local seismological network (LSN) is deployed in the region to monitor the local seismicity. It consists of 6 three-component digital stations. Real-time data transfer from the LSN stations to the National Data Center (in Sofia) is implemented using the VPN and MAN networks of the Bulgarian Telecommunication Company. Common processing and interpretation of the data from the LSN and the national seismic network is performed. Real-time and interactive data processing are performed by the Seismic Network Data Processor (SNDP) software package. More than 700 earthquakes were registered by the LSN within a 30 km region around the mine during the 8 years of monitoring. First we processed the data and compiled a catalogue of the earthquakes occurring within the studied region (30 km around the salt mine). The spatial pattern of seismicity is analyzed. A large number of the seismic events occurred within the northern and north-western part of the salt body. Several earthquakes occurred in the close vicinity of the mine. Considering that the earthquakes could be tectonic and/or induced, an attempt is made to find criteria to distinguish natural from induced seismicity. To characterize and distinguish the main processes active in the area, we also performed waveform and spectral analyses of a number of earthquakes.

  16. The Earthquake That Tweeted

    NASA Astrophysics Data System (ADS)

    Petersen, D.

    2011-12-01

    Advances in mobile technology and social networking are enabling new behaviors that were not possible even a few short years ago. When people experience a tiny earthquake, it's more likely they're going to reach for their phones and tell their friends about it than actually take cover under a desk. With 175 million Twitter accounts, 750 million Facebook users and more than five billion mobile phones in the world today, people are generating terrific amounts of data simply by going about their everyday lives. Given the right tools and guidance these connected individuals can act as the world's largest sensor network, doing everything from reporting on earthquakes to anticipating global crises. Drawing on the author's experience as a user researcher and experience designer, this presentation will discuss these trends in crowdsourcing the collection and analysis of data, and consider their implications for how the public encounters the earth sciences in their everyday lives.

  17. Monitoring-well construction and ground-water quality analysis at the U. S. Army Reserve Center Complex and Training area, 84th Division, Milwaukee, Wisconsin

    SciTech Connect

    Not Available

    1989-03-01

    The purpose of this field investigation was to determine geologic and hydrogeologic characteristics at the U.S. Army Reserve Center Complex and Training Center (U.S.A.R.C.) in Milwaukee, Wisconsin and to assess current ground water quality at this site. These objectives were accomplished by (1) reviewing existing monitoring data; (2) installing additional ground water monitoring wells; (3) collecting bimonthly water elevation data; and (4) performing two monthly ground water sampling events. This report presents information pertaining to ground water quality and documentation of well construction methods and ground water sampling protocols employed. Hydrogeologic and water chemistry data has been compiled to determine ground water contamination. Previous studies have shown that ground water has been impacted by chloride, arsenic, cadmium, iron, and volatile organic compounds. Enforcement standards have been exceeded for sulfate, dissolved iron, and volatile organic compounds. Groundwater elevation data obtained during this study has indicated that generally ground water flow within the shallow unconfined water table system is directed towards the south and southwest.

  18. Deployment and Earthquake Scenarios for the QCN in Mexico

    NASA Astrophysics Data System (ADS)

    Dominguez, L. A.; Husker, A. L.; Lawrence, J. F.; Cruz-Atienza, V. M.; Valdes-Gonzales, C. M.; Cochran, E. S.

    2012-12-01

    The Quake Catcher Network (QCN) is a seismic experiment designed to improve earthquake monitoring and seismic coverage around the world using newly developed solid-state accelerometers. Unlike traditional seismic arrays, the QCN takes advantage of low-cost MEMS sensors and Internet connectivity for rapid installation and fast data recovery. The efficiency of this experiment has been proved by the rapid aftershock mobilizations after the 2010 M7.2 Darfield, New Zealand, earthquake and the 2010 Mw 8.8 Maule earthquake. Here, we report the recent progress of the deployment of the QCN in Mexico. The network is gradually growing along the Pacific coast, with more than 15 sensors installed from Acapulco to Salina Cruz. The recent Mw 7.4 Oaxaca earthquake along the Mexican subduction zone was the first test of the system and its algorithms for a large earthquake. Within a minute of the initiation of the earthquake the QCN system reported a magnitude M7.6 event, whose epicenter (16.6621°, -98.1879°) was located 33 km from the epicenter reported by the Mexican National Seismological Service (16.42°, -98.36°). The biggest delay in the measurement was the 45-second S-wave travel time from the epicenter to Acapulco, where the majority of the sensors that detected the earthquake were installed. The first magnitude measurements from the USGS and the Mexican National Seismological Service were M7.9 and M7.8 respectively, and both took several minutes to calculate. To further test the detection algorithms, we compute synthetic earthquakes. Synthetic earthquakes allow for any arrangement of QCN sensors as well as a wide range of earthquake sizes and rupture distributions. Various location and magnitude determination tests were performed for a synthetic Mw 8 earthquake near Acapulco. The rapid detection algorithms are able to determine the epicenter to within 18-35 km of the actual epicenter, with magnitudes ranging from M7.5 to M8.4 depending on the station configuration used.
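    The 33 km separation between the QCN epicenter and the Mexican National Seismological Service epicenter quoted above can be reproduced directly from the coordinates with a great-circle (haversine) calculation, sketched below as a simple worked check.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r_earth_km=6371.0):
    """Great-circle distance between two points, in kilometers."""
    p1, p2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlmb / 2) ** 2
    return 2 * r_earth_km * asin(sqrt(a))

# QCN epicenter vs. the Mexican National Seismological Service epicenter.
print(round(haversine_km(16.6621, -98.1879, 16.42, -98.36)))  # ~33 km
```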

  19. Do Earthquakes Shake Stock Markets?

    PubMed Central

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan. PMID:26197482

  20. The BEYOND center of excellence for the effective exploitation of satellite time series towards natural disasters monitoring and assessment

    NASA Astrophysics Data System (ADS)

    Kontoes, Charalampos; Papoutsis, Ioannis; Amiridis, Vassilis; Balasis, George; Keramitsoglou, Iphigenia; Herekakis, Themistocles; Christia, Eleni

    2014-05-01

    The BEYOND project (2013-2016, 2.3 M€), funded under the FP7-REGPOT scheme, is an initiative that aims to build a Centre of Excellence for Earth Observation (EO) based monitoring of natural disasters in south-eastern Europe (http://beyond-eocenter.eu/), established at the National Observatory of Athens (NOA). The project focuses on capacity building on top of the existing infrastructure, aiming at unlocking the institute's potential through systematic interaction with high-profile partners across Europe, and at consolidating state-of-the-art equipment and technological know-how that will allow sustainable cutting-edge interdisciplinary research to take place with an impact on regional and European socioeconomic welfare. The vision is to set up innovative integrated observational solutions that allow a multitude of space-borne and ground-based monitoring networks to operate in a complementary and cooperative manner, create archives and databases of long series of observations and higher-level products, and make these available for exploitation with the involvement of stakeholders. In BEYOND, critical infrastructural components are being procured to foster access, use, retrieval, and analysis of long EO data series and products. In this framework NOA has initiated activities for the development, installation, and operation of important acquisition facilities and hardware modules, including space-based observational infrastructure such as the X-/L-band acquisition station for receiving EOS Aqua/Terra, NPP, JPSS, NOAA, Metop, and Feng Yun data in real time, the setting up of an ESA Mirror Site for the Sentinel missions, to be operable from 2014 onwards, an advanced portable Raman lidar station, a spectrometer facility, and several ground magnetometer stations. All of these are expected to work in synergy with the existing capacity resources and observational networks, including the MSG/SEVIRI acquisition station and the nationwide seismographic, GPS, meteorological, and atmospheric networks. The analysis of satellite time series from this diverse EO-based monitoring network established at NOA covers a broad spectrum of research activities. Indicatively, using Landsat TM/ETM+ imagery we have developed algorithms for the automatic diachronic mapping of burnt areas over Greece since 1984, and we have been using MSG/SEVIRI data to detect forest wildfires in Greece since 2007, analyze their temporal and geographical signatures, and store these events for further analysis in relation to auxiliary geo-information layers for risk assessment applications. In the field of geophysics we have been employing sophisticated radar interferometry techniques using SAR sensor diversity with multi-frequency, multi-resolution, and multi-temporal datasets (e.g. ERS1/ERS2, ENVISAT, TerraSAR-X, COSMO-SkyMed) to map diachronic surface deformation associated with volcanic activity, tectonic stress accumulation, and urban subsidence. In the field of atmospheric research, we have developed a three-dimensional global climatology of aerosol and cloud distributions using the CALIPSO dataset. The database, called LIVAS, will continue utilizing CALIPSO observations but also datasets from the upcoming ADM-Aeolus and EarthCARE ESA missions in order to provide a unique historical dataset of global aerosol and cloud vertical distributions, as well as respective trends in cloud cover, aerosol/cloud amount, and variability of the natural and anthropogenic aerosol components. Additionally, our team is involved in the Swarm magnetic field constellation, a new Earth Explorer mission in ESA's Living Planet Programme launched on November 22, 2013, as a member of the mission's validation team. Finally, assessment of heat wave risk and hazards is carried out systematically using MODIS satellite data.

  1. Sand Volcano Following Earthquake

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Sand boil or sand volcano measuring 2 m (6.6 ft.) in length erupted in the median of Interstate Highway 80 west of the Bay Bridge toll plaza when ground shaking transformed a loose, water-saturated deposit of subsurface sand into a sand-water slurry (liquefaction) in the October 17, 1989, Loma Prieta earthquake. Vented sand contains marine-shell fragments. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. (Credit: J.C. Tinsley, U.S. Geological Survey)

  2. Housing Damage Following Earthquake

    NASA Technical Reports Server (NTRS)

    1989-01-01

    An automobile lies crushed under the third story of this apartment building in the Marina District after the Oct. 17, 1989, Loma Prieta earthquake. The ground levels are no longer visible because of structural failure and sinking due to liquefaction. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: J.K. Nakata, U.S. Geological Survey.

  3. Foreshocks of strong earthquakes

    NASA Astrophysics Data System (ADS)

    Guglielmi, A. V.; Sobisevich, L. E.; Sobisevich, A. L.; Lavrov, I. P.

    2014-07-01

    The specific enhancement of ultra-low-frequency (ULF) electromagnetic oscillations a few hours prior to strong earthquakes, previously mentioned in the literature, motivated us to search for distinctive features of the mechanical (foreshock) activity of the Earth's crust in the epicentral zones of future earthquakes. Activation of foreshocks three hours before the main shock is revealed, roughly coincident with the enhancement of the specific electromagnetic ULF emission. It is hypothesized that the round-the-world seismic echo signals from the earthquakes, which form a peak of energy release 2 h 50 min before the main events, act as triggers of the main shocks through the cumulative action of the surface waves converging on the epicenter. It is established that the frequency of the fluctuations in foreshock activity decreases at the final stages of preparation of the main shocks, which probably testifies to the so-called mode softening on approach to the failure point, in accordance with catastrophe theory.
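    The 2 h 50 min delay attributed to the round-the-world echo can be checked with a back-of-the-envelope calculation, assuming a surface-wave group velocity of roughly 3.9 km/s (a value assumed here, not taken from the paper):

        # Back-of-the-envelope check (assumed values, not from the paper): time for a
        # surface wave to circle the Earth and return to the epicentral region.
        earth_circumference_km = 40030.0   # mean circumference
        group_velocity_km_s = 3.9          # assumed Rayleigh-wave group velocity

        t = earth_circumference_km / group_velocity_km_s
        print(f"{t:.0f} s  =  {t/3600:.2f} h")   # roughly 2.85 h, i.e. about 2 h 51 min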

  4. Space geodesy and earthquake prediction

    NASA Technical Reports Server (NTRS)

    Bilham, Roger

    1987-01-01

    Earthquake prediction is discussed from the point of view of a new development in geodesy known as space geodesy, which involves the use of extraterrestrial sources or reflectors to measure earth-based distances. Space geodesy is explained, and its relation to terrestrial geodesy is examined. The characteristics of earthquakes are reviewed, and the ways that they can be exploited by space geodesy to predict earthquakes are demonstrated.

  5. Precursory changes in ionosphere immediately before mega-thrust earthquakes

    NASA Astrophysics Data System (ADS)

    Heki, K.; Cahyadi, M. N.

    2012-12-01

    Heki [2011] reported that positive anomalies of ionospheric Total Electron Content (TEC) appeared about 1 hour before the 2011 Tohoku-Oki earthquake (Mw 9.0) in data from the dense Japanese GPS array. Here we show that similar anomalies commonly precede mega-thrust earthquakes, i.e. the 2004 Sumatra-Andaman (Mw 9.2), 2010 Maule (Mw 8.8), 2012 Off-Northern Sumatra (Mw 8.6), 2007 Bengkulu (Mw 8.5), and 1994 Hokkaido-Toho-Oki earthquakes (Mw 8.3). So far, the 2005 Nias earthquake (Mw 8.6) is the only over-Mw 8.5 earthquake without clear preseismic TEC changes (the TEC data at that time were disrupted by severe plasma bubble signatures). The anomalies started 90 to 40 minutes before the earthquakes. The main anomalies are positive, and smaller negative anomalies often accompany them. The centers of the positive anomalies sometimes shift southward (northward) from the ruptured faults in the northern (southern) hemispheres. The attached figure shows the slant TEC changes observed by Chilean GPS stations over a 2.5-hour period encompassing the 2010 Maule earthquake. Clear onsets of the TEC anomalies can be seen about 40 minutes prior to the mainshock. TEC increases may occur irrespective of earthquakes. We studied geomagnetic activities before and after these mega-thrust events; the 2010 Maule and the 2007 Bengkulu earthquakes occurred during geomagnetic quiescence, and the others occurred during more or less disturbed periods. We analyzed the TEC time series of the same satellite and station pairs over 120 days before and after the 2011 Tohoku-oki and 2007 Bengkulu earthquakes. Relatively large TEC changes with spatial and temporal scales similar to the preseismic anomalies occur from time to time, many of them due to large-scale traveling ionospheric disturbances from the auroral oval. In short, such changes are not rare, but they are infrequent enough that we can rule out with confidence the possibility that the TEC anomalies before all six of these earthquakes are fortuitous. We also review observables other than GPS-TEC showing similar preseismic changes, and suggest that f0Es at Kokubunji and the geomagnetic field in NE Japan showed interesting behavior immediately before the 2011 Tohoku-oki earthquake.

  6. Parameterization of 18th January 2011 earthquake in Dalbadin Region, Southwest Pakistan

    NASA Astrophysics Data System (ADS)

    Shafiq-Ur-Rehman; Azeem, Tahir; Abd el-aal, Abd el-aziz Khairy; Nasir, Asma

    2013-12-01

    An earthquake of magnitude Mw 7.3 occurred on 18 January 2011 in southwestern Pakistan, Baluchistan province (Dalbadin region). The area has complex tectonics due to the interaction of the Indian, Eurasian, and Arabian plates. Both thrust and strike-slip earthquakes are dominant in this region, with minor, localized normal-faulting events. The depth and focal parameters of the earthquake considered here (the Dalbadin earthquake) were poorly constrained owing to the lack of data from the Pakistan, Iran, and Afghanistan region. A normal-faulting mechanism has been proposed for this earthquake by many researchers. In the present study the earthquake was relocated using the technique of travel-time residuals. The relocated coordinates and depth were used to calculate the focal mechanism solution, yielding a dominant strike-slip mechanism, contrary to normal faulting. The relocated coordinates and resulting mechanism are more reliable than those of many reporting agencies because the evaluation in this study is augmented by data from the local seismic monitoring network of Pakistan. The tectonics of the area is governed by active subduction along the Makran Subduction Zone. This particular earthquake has a strike-slip mechanism due to breaking of the subducting oceanic plate; it is located where oceanic lithosphere is subducting and where there are relative movements between the Lut and Helmand blocks. The magnitude of this event (Mw = 7.3), the re-evaluated depth, and a previous study of an earthquake mechanism in the same region (Shafiq et al., 2011) also support strike-slip movement.

  7. Assessment of Interplate and Intraplate Earthquakes 

    E-print Network

    Bellam, Srigiri Shankar

    2012-10-19

    Two types of earthquakes are observed in the surface plates, interplate and intraplate earthquakes, which are classified based on whether the origin of an earthquake lies between two plates or within a plate, respectively. Limited work has been completed...

  8. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...2013-01-01 2013-01-01 false Earthquake hazards. 120.174 Section 120...Other Laws and Orders § 120.174 Earthquake hazards. When loan proceeds are...construction must conform with the “National Earthquake Hazards Reduction Program...

  9. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...2010-01-01 2010-01-01 false Earthquake hazards. 120.174 Section 120...Other Laws and Orders § 120.174 Earthquake hazards. When loan proceeds are...construction must conform with the “National Earthquake Hazards Reduction Program...

  10. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...2011-01-01 2011-01-01 false Earthquake hazards. 120.174 Section 120...Other Laws and Orders § 120.174 Earthquake hazards. When loan proceeds are...construction must conform with the “National Earthquake Hazards Reduction Program...

  11. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...2012-01-01 2012-01-01 false Earthquake hazards. 120.174 Section 120...Other Laws and Orders § 120.174 Earthquake hazards. When loan proceeds are...construction must conform with the “National Earthquake Hazards Reduction Program...

  12. 13 CFR 120.174 - Earthquake hazards.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...2014-01-01 2014-01-01 false Earthquake hazards. 120.174 Section 120...Other Laws and Orders § 120.174 Earthquake hazards. When loan proceeds are...construction must conform with the “National Earthquake Hazards Reduction Program...

  13. Estimating Temperature Retrieval Accuracy Associated With Thermal Band Spatial Resolution Requirements for Center Pivot Irrigation Monitoring and Management

    NASA Technical Reports Server (NTRS)

    Ryan, Robert E.; Irons, James; Spruce, Joseph P.; Underwood, Lauren W.; Pagnutti, Mary

    2006-01-01

    This study explores the use of synthetic thermal center pivot irrigation scenes to estimate temperature retrieval accuracy for remotely sensed thermal data, such as data acquired from current and proposed Landsat-like thermal systems. Center pivot irrigation is a common practice in the western United States and in other parts of the world where water resources are scarce. Wide-area ET (evapotranspiration) estimates and reliable water management decisions depend on accurate temperature information retrieval from remotely sensed data. Spatial resolution, sensor noise, and the temperature step between a field and its surrounding area impose limits on the ability to retrieve temperature information. Spatial resolution is an interrelationship between GSD (ground sample distance) and a measure of image sharpness, such as edge response or edge slope. Edge response and edge slope are intuitive, direct measures of spatial resolution that are easier to visualize and estimate than the more common Modulation Transfer Function or Point Spread Function. For these reasons, recent data specifications, such as those for the LDCM (Landsat Data Continuity Mission), have used GSD and edge response to specify spatial resolution. For this study, we have defined a 400-800 m diameter center pivot irrigation area with a large 25 K temperature step associated with a 300 K well-watered field surrounded by an infinite 325 K dry area. In this context, we defined the benchmark problem as an easily modeled, highly common stressing case. By parametrically varying GSD (30-240 m) and edge slope, we determined the number of pixels and field area fraction that meet a given temperature accuracy estimate for 400-m, 600-m, and 800-m diameter field sizes. Results of this project will help assess the utility of proposed specifications for the LDCM and other future thermal remote sensing missions and for water resource management.
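    A minimal synthetic-scene sketch of the benchmark described above is given below: a 300 K circular field in a 325 K background is blurred to mimic a finite edge response, sampled at a chosen GSD, and the pixels that remain within a 1 K tolerance of the field temperature are counted. The grid size, blur width, and tolerance are illustrative assumptions, not the study's actual simulation parameters.

        # Minimal synthetic center-pivot scene (illustrative parameters, not the study's
        # actual simulation): a 300 K circular field in a 325 K background, blurred to
        # mimic a finite edge response, then sampled at a given GSD. We count how many
        # pixels stay within a 1 K tolerance of the true field temperature.
        import numpy as np
        from scipy.ndimage import gaussian_filter

        def retrieval_fraction(diameter_m=600, gsd_m=60, blur_m=45, tol_k=1.0):
            fine = 10.0                                    # fine simulation grid (m)
            n = int(2 * diameter_m / fine)
            y, x = np.mgrid[0:n, 0:n] * fine
            c = n * fine / 2
            scene = np.where((x - c) ** 2 + (y - c) ** 2 <= (diameter_m / 2) ** 2,
                             300.0, 325.0)
            scene = gaussian_filter(scene, sigma=blur_m / fine)  # crude sensor blur
            step = int(gsd_m / fine)
            pixels = scene[::step, ::step]                 # sample at the sensor GSD
            field_pix = pixels[np.abs(pixels - 300.0) <= tol_k]
            field_area = np.pi * (diameter_m / 2) ** 2
            n_field_pix_total = field_area / gsd_m ** 2
            return len(field_pix), len(field_pix) / n_field_pix_total

        for gsd in (30, 60, 120, 240):
            n_ok, frac = retrieval_fraction(gsd_m=gsd)
            print(f"GSD {gsd:>3} m: {n_ok:4d} pixels within 1 K ({frac:.0%} of field area)")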

  14. Salt Kinematics after Earthquake

    NASA Astrophysics Data System (ADS)

    Aftabi, P.; Roustaei, M.

    2007-12-01

    Salt extrusions are simple natural models. The shapes of salt extrusions are complex gauges of the forces extruding them [9]. Uplift rates in part of the Namakdan diapir range from 1-3 mm/yr at the rim to 3-6 mm/yr in the interior [5]. The salt glacier flows plastically during the brief annual rainy season [6]. The salt flows faster when temperature rises and more slowly when temperature falls [3]. Displacements of >10 cm/yr and >50 cm/yr suggest that the salt is extruded at a rate of 82 mm/yr [9] but spreads centimeters to meters per year [8]. A recent InSAR study near Namakdan offered no explanation for the high activity of the salt after the earthquake [10]. The coseismic vertical displacements suggest reactivation of a blind thrust [11]. Our recent field measurements reported here suggest that fast flow in the salt may be related to mild to strong earthquakes and may be caused by diapiric reactivation. The earthquake of 27 November 2005, with Mw ~ 6, occurred on Qeshm Island at a distance of 65 km from Bandar Abbas. Our measurements follow the approach illustrated by Aftabi [1,2] and suggested by Bailly [4]. Two wooden stakes, about 50 cm in length and less than 2 x 3 cm in section, were hammered vertically into the surficial marly salt along a line on the SW slopes of the thin southern namakier of Namakdan. Two others were hammered into the walls of the cave in the southwestern part of the Namakdan diapir. The distance between stakes was measured (to within a mm to cm) using a meter scale, and the azimuths between them were measured (to within a degree) by compass. Between readings, the meter scale was stored and carried in an ice chest to minimize its thermal expansion or contraction. Readings were made immediately after installation and again one and two days later at the same times of day. We expected a repeat of the main shock as a mild earthquake one year later (earthquake cycling); we therefore returned and measured the distances between the stakes one year after the main shock, again at the same times. IIEES reported mild earthquakes with Ml 3.3 on the 26th and 27th of 2006 [12]. The distances between stakes both lengthened between measurements, demonstrating local extensional strains in the tiny southern namakier of Namakdan, inside the cave and out in the marly cover. We interpreted some strong signals in our 2005 InSAR image as brine movements at the rim of the salt. Some of the distances between stakes exhibited complete elastic recovery within one day [1], others time-dependent elastic-plastic recovery [1], while in the Namakdan diapir the stakes extended rapidly, on the scale of 1 cm after 24 hours, indicating permanent non-recoverable plastic strains [1] at a high rate after the earthquake. With special thanks to M. Madani, C.J. Talbot and E. Fielding. REF: [1] Aftabi, P., 2000; [2] Aftabi, P. et al., 2005; [3] Aftabi, P., 2006; [4] Bailly, E.B., 1931; [5] Bruthans, J. et al., in press; [6] Talbot, C.J. & Rogers, E.A., 1980; [7] Talbot, C.J., 1998; [8] Talbot, C.J. et al., 2000; [9] Talbot, C.J. & Aftabi, P., 2004; [10] Nilforoushan, F. et al., 2005; [11] Niessen et al., in press; [12] www.iiees.ac.ir
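    As a worked illustration of how such stake measurements translate into strain, the sketch below converts a roughly 1 cm lengthening over 24 hours into an extensional strain and strain rate; the assumed baseline length between stakes is hypothetical.

        # Illustrative conversion (assumed numbers) of a stake-separation change into
        # extensional strain and strain rate, as used for the namakier measurements.
        baseline_m = 1.5          # assumed initial distance between two stakes
        delta_m = 0.01            # ~1 cm lengthening reported after 24 hours
        hours = 24.0

        strain = delta_m / baseline_m                 # dimensionless extension
        strain_rate_per_s = strain / (hours * 3600)   # per second
        print(f"strain = {strain:.2e}, strain rate = {strain_rate_per_s:.2e} 1/s")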

  15. Volcanic strain change prior to an earthquake swarm observed by groundwater level sensors in Meakan-dake, Hokkaido, Japan

    NASA Astrophysics Data System (ADS)

    Takahashi, Hiroaki; Shibata, Tomo; Yamaguchi, Teruhiro; Ikeda, Ryuji; Okazaki, Noritoshi; Akita, Fujio

    2012-02-01

    We installed and operated a low-cost groundwater level observation system at intermittent hot spring wells in order to monitor volcanic strain signals from the active Meakan-dake volcano in eastern Hokkaido, Japan. Data are sampled at 1 Hz and are transmitted to the data center in real time. Evaluation of the water level time series with theoretical predictive tidal strain and coseismic static strain changes has suggested that the wells penetrate to the artesian aquifer and act as a volumetric strain sensor. An active earthquake swarm with more than 400 events occurred at the shallower part of the volcano from January 9 to 11, 2008. Three independent wells recorded pre- to co-swarm groundwater drops simultaneously, which represented a decrease in volumetric strain. The total volumetric strain change during the three active days was estimated to be from 6 to 7 × 10⁻⁷. The observed data, including changes in volumetric strain, absence of deformation in the GPS coordinates, and activation of deep low-frequency earthquakes, might imply possible deflation of a source deeper than 10 km, and these preceding deeper activities might induce an earthquake swarm in a shallower part of the Meakan-dake volcano.
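    A minimal sketch of the underlying conversion is shown below: treating the well as a volumetric strainmeter, an observed water-level drop maps to a strain change through a sensitivity factor calibrated against tidal strain. The sensitivity and level drop used here are illustrative assumptions, chosen only to reproduce the order of magnitude quoted above.

        # Illustrative sketch (assumed sensitivity): converting an observed groundwater
        # level drop into a volumetric strain change, treating the well as a volumetric
        # strainmeter calibrated against theoretical tidal strain.
        sensitivity_m_per_strain = 1.0e6     # assumed: 1 mm of water level per 1e-9 strain
        level_drop_m = 0.65                  # hypothetical observed pre- to co-swarm drop

        volumetric_strain = level_drop_m / sensitivity_m_per_strain
        print(f"volumetric strain change ~ {volumetric_strain:.1e}")   # ~ 6.5e-7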

  16. Geodynamic monitoring in real times

    NASA Astrophysics Data System (ADS)

    Outkin, V.; Yurkov, A.; Klimshin, A.; Kozlova, I.

    2011-12-01

    For short-term and intermediate-term forecasting of tectonic earthquakes, a technique conditionally named geodynamic monitoring (GDM) is proposed that does not use seismic monitoring data for the operational forecasting problem. GDM studies the stress-strain state of an individual rock block through changes in the activity of natural radioactive gas (radon), using specially designed radon monitors placed in the selected rock mass that record its variation in time (VAR). The radon monitor (radon detector), as the basic measuring device located in the rock block, possesses enormous strain sensitivity to the relative strain state of the mass. Depending on the applied stress, three characteristic points are distinguished: 1) 30-35 % of the "background" VAR - the beginning of accumulation of inelastic energy; 2) 50 % of the background VAR - stabilization of the elastic state of the mass; 3) 70-75 % of the background VAR - critical stress in the rock block, with the possibility of release of elastic energy either spontaneously or under the action of external "triggering" forces. If the accumulated energy is close to critical, its release requires only energy at the level of the variations of the Earth's rotation. Such significant energy causes a large number of earthquakes over the whole planet simultaneously. This fact supports the possibility of short-term forecasting of strong (destructive) earthquakes: the release of elastic stress in this case occurs 25-30 hours after the passage of the rotation variations, which allows the population to be notified of an approaching earthquake. External forcing functions (mechanical, electromagnetic, etc.) influencing the preparation and occurrence of tectonic earthquakes are divided into two broad classes: 1) "forecasting" functions - processes functionally connected with the accumulation of elastic stress and its release in rather small doses; 2) external mechanical actions that initiate the release of the accumulated elastic stress - "triggering" functions promoting the stress release that results in an earthquake. The short-term forecast of especially large earthquakes is based entirely on monitoring the Earth's rotation: essential triggering functions (variations of the Earth's rotation) release the accumulated stress over the whole surface of the Earth, thereby causing large earthquakes. Therefore the warning of large earthquakes should be formed on the basis of monitoring variations in the irregularity of the Earth's rotation, which usually precedes the release of elastic stress by 25-30 hours.
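    A toy sketch of the three characteristic stages described above is given below; the threshold percentages come from the abstract, while the function name and example values are assumptions.

        # Toy classification of a radon-variation reading relative to its "background"
        # level, using the three characteristic points described in the abstract.
        # Everything besides the threshold percentages is an illustrative assumption.
        def gdm_stage(var_value, background):
            ratio = var_value / background
            if ratio >= 0.70:
                return "critical stress: spontaneous or triggered release possible"
            if ratio >= 0.50:
                return "elastic state stabilizing"
            if ratio >= 0.30:
                return "inelastic energy accumulation beginning"
            return "below characteristic levels"

        for v in (0.2, 0.33, 0.55, 0.76):
            print(f"VAR/background = {v:.2f}: {gdm_stage(v, 1.0)}")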

  17. Earthquake Drill using the Earthquake Early Warning System at an Elementary School

    NASA Astrophysics Data System (ADS)

    Oki, Satoko; Yazaki, Yoshiaki; Koketsu, Kazuki

    2010-05-01

    Japan frequently suffers from many kinds of disasters such as earthquakes, typhoons, floods, volcanic eruptions, and landslides. On average, we have lost about 120 people a year to natural hazards in this decade. Above all, earthquakes are noteworthy, since they may kill thousands of people in a moment, as in Kobe in 1995. People know that we may have "a big one" some day as long as we live on this land, and they know what to do: retrofit houses, anchor heavy furniture to walls, add latches to kitchen cabinets, and prepare emergency packs. Yet most of them do not take action, and this results in the loss of many lives. Only the victims learn something from an earthquake, and their experience has never become the shared lore of the nation. One of the most essential ways to reduce the damage is to educate the general public so that they can make sound decisions about what to do at the moment an earthquake hits. This requires knowledge of the background of the ongoing phenomenon. The Ministry of Education, Culture, Sports, Science and Technology (MEXT) therefore issued a public call to choose several model areas in which to bring scientific education to local elementary schools. This presentation reports on a year and a half of courses that we held at the model elementary school in the Tokyo Metropolitan Area. The tectonic setting of this area is very complicated; the Pacific and Philippine Sea plates subduct beneath the North America and Eurasia plates. The subduction of the Philippine Sea plate causes mega-thrust earthquakes such as the 1923 Kanto earthquake (M 7.9), which caused 105,000 fatalities. A magnitude 7 or greater earthquake beneath this area has recently been evaluated to occur with a probability of 70 % in 30 years. This is of immediate concern for the devastating loss of life and property, because the Tokyo urban region now has a population of 42 million and is the center of approximately 40 % of the nation's activities, so such an event could have great global economic repercussions. We provide the school kids with the "World Seismicity Map" to let them realize that earthquake disasters are unequally distributed. Then we let the kids jump in front of the seismometer while projecting the real-time data onto the wall. Grouped kids compete for the largest amplitude by carefully considering how to jump high yet nail the landing with their teammates. Their jumps are printed out via a portable printer and compared with a real earthquake, which may have occurred as far as 600 km away but is still huge when printed at the same scale. In fact, a magnitude 7 earthquake recorded 600 km away needs an A0 sheet when scaled against a jump of 10 kids printed on an A4 sheet. They come to understand what to do so as not to be killed by such enormous energy. We also offer earthquake drills using the Earthquake Early Warning System (EEW System). The EEW System was officially introduced in 2007 by JMA (Japan Meteorological Agency) to issue prompt alerts several to several tens of seconds before the S-wave arrives. On hearing the alarm, school kids must think fast to find a place to protect themselves. The alarm does not always come when they are in their classrooms; they may be in the chemistry lab, in the music room, which has no desks to protect them, or in PE class. Then, in the science class, we demonstrate how the EEW System works. An 8 m long wave propagation device made of spindles connected by springs is used to visualize the P- and S-waves. In the presentation, we will show the paper materials and a number of movies.
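    The warning time offered by such an alert follows from simple arithmetic: the S-wave travel time to a site minus the delay between the earthquake origin and alert issuance. The sketch below uses an assumed wave speed and alert delay, not JMA's actual system parameters.

        # Rough warning-time arithmetic for an earthquake early warning alert
        # (all numbers are illustrative assumptions, not JMA's actual system).
        def warning_time_s(distance_to_user_km, vs=4.0, alert_delay_s=8.0):
            """Lead time = S-wave travel time to the user minus the delay between
            earthquake origin and alert issuance (P-wave detection + processing)."""
            return distance_to_user_km / vs - alert_delay_s

        for d in (30, 80, 150, 300):
            print(f"{d:>3} km: ~{warning_time_s(d):5.1f} s of warning")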

  18. At which distance from previous earthquakes will the next one occur?

    NASA Astrophysics Data System (ADS)

    González, Álvaro

    2014-05-01

    Earthquakes are not distributed uniformly in space. Instead, they tend to originate at, or close to, the locations of past earthquakes, reflecting the underlying geometry of the fault system which generates them. This presentation will describe a method for calculating which fraction of future earthquakes is expected to occur at given distances from past ones. Being very simple, the procedure is based on the empirical distribution of distances between past earthquakes. The results provide a detailed probabilistic answer to the talk's title and can be expressed in maps of spatial probabilities. The skill of these forecast maps is confirmed by four years of daily, real-time tests for California, the Western Pacific and global seismicity, performed at the Southern California Earthquake Center, within the Collaboratory for the Study of Earthquake Predictability (www.cseptesting.org). Such a method could be particularly useful for developing more realistic, physics-based, smoothed seismicity models for probabilistic seismic hazard maps.
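    A minimal sketch of the core idea, on a synthetic catalog in planar coordinates, is given below: the empirical distribution of each past event's distance to the other past events yields the forecast fraction of future earthquakes expected within a given radius of previous ones. The synthetic geometry and all parameters are illustrative assumptions, not the author's catalog or code.

        # Minimal sketch of the idea (synthetic catalog, planar coordinates): estimate
        # the fraction of future earthquakes expected within distance r of past ones
        # from the empirical distribution of distances between past events.
        import numpy as np

        rng = np.random.default_rng(1)
        # Synthetic "fault system": past epicenters clustered along two lines (km).
        past = np.vstack([
            np.column_stack([np.linspace(0, 100, 200), rng.normal(0, 2, 200)]),
            np.column_stack([rng.normal(60, 2, 100), np.linspace(-40, 40, 100)]),
        ])

        def nearest_past_distance(events, catalog):
            """Distance from each event to its nearest neighbour in the catalog."""
            d = np.linalg.norm(events[:, None, :] - catalog[None, :, :], axis=2)
            return d.min(axis=1)

        # Empirical distribution of each past event's distance to the *other* past
        # events (leave-one-out) gives the forecast fraction within radius r.
        d_loo = []
        for i in range(len(past)):
            others = np.delete(past, i, axis=0)
            d_loo.append(nearest_past_distance(past[i:i+1], others)[0])
        d_loo = np.array(d_loo)

        for r in (1, 2, 5, 10):
            frac = (d_loo <= r).mean()
            print(f"expected fraction of future events within {r:>2} km of a past one: {frac:.2f}")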

  19. Did the September 2010 (Darfield) earthquake trigger the February 2011 (Christchurch) event?

    PubMed Central

    Stramondo, Salvatore; Kyriakopoulos, Christodoulos; Bignami, Christian; Chini, Marco; Melini, Daniele; Moro, Marco; Picchiani, Matteo; Saroli, Michele; Boschi, Enzo

    2011-01-01

    We have investigated the possible cause-and-effect relationship due to stress transfer between two earthquakes that occurred near Christchurch, New Zealand, in September 2010 and in February 2011. The Mw 7.1 Darfield (Canterbury) event took place along a previously unrecognized fault. The Mw 6.3 Christchurch earthquake, generated by a thrust fault, occurred approximately five months later, 6 km south-east of Christchurch's city center. We have first measured the surface displacement field to retrieve the geometries of the two seismic sources and the slip distribution. In order to assess whether the first earthquake increased the likelihood of occurrence of a second earthquake, we compute the Coulomb Failure Function (CFF). We find that the maximum CFF increase over the second fault plane is reached exactly around the hypocenter of the second earthquake. In this respect, we may conclude that the Darfield earthquake contributed to promote the rupture of the Christchurch fault. PMID:22355616
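    The quantity computed in the study is the change in the Coulomb Failure Function on the receiver fault; a minimal sketch of the standard formula is shown below, with illustrative stress values (the study itself resolves these from an elastic dislocation model of the Darfield slip distribution).

        # Minimal sketch of the Coulomb Failure Function change on a receiver fault:
        #   dCFF = d_tau + mu_eff * d_sigma_n
        # where d_tau is the shear-stress change in the slip direction and d_sigma_n
        # is the normal-stress change (positive = unclamping). Values are illustrative.
        def coulomb_failure_change(d_tau_mpa, d_sigma_n_mpa, mu_eff=0.4):
            return d_tau_mpa + mu_eff * d_sigma_n_mpa

        d_cff = coulomb_failure_change(d_tau_mpa=0.15, d_sigma_n_mpa=0.10)
        print(f"dCFF = {d_cff:.2f} MPa ({'promotes' if d_cff > 0 else 'inhibits'} failure)")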

  20. Earthquake spatial distribution: the correlation dimension

    E-print Network

    Kagan, Yan Y

    2007-01-01

    [Fig. 3: simulation vs. earthquake spatial pattern.] We test these formulae by simulation. We consider lacunarity or intermittency of spatial earthquake ...
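    The correlation dimension referred to in the title is commonly estimated from the correlation integral C(r), the fraction of epicenter pairs separated by less than r, whose power-law slope gives D2. The sketch below applies this construction to a synthetic point set; it is not the paper's catalog or code.

        # Sketch of a correlation-dimension estimate for an epicenter set (synthetic
        # data; not the paper's catalog or code). D2 is the slope of log C(r) vs log r,
        # where C(r) is the fraction of point pairs closer than r.
        import numpy as np

        rng = np.random.default_rng(2)
        # Synthetic epicenters scattered around a line to mimic a fault-controlled pattern.
        x = rng.uniform(0, 100, 800)
        pts = np.column_stack([x, 0.3 * x + rng.normal(0, 1.5, 800)])

        d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
        pair_d = d[np.triu_indices(len(pts), k=1)]          # unique pair distances

        radii = np.logspace(0, 1.5, 12)                     # 1 to ~32 km
        c_r = np.array([(pair_d < r).mean() for r in radii])
        slope, _ = np.polyfit(np.log(radii), np.log(c_r), 1)
        print(f"estimated correlation dimension D2 ~ {slope:.2f}")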

  1. Review of variations in Mw < 7 earthquake motions on position and tec (Mw = 6.5 aegean sea earthquake sample)

    NASA Astrophysics Data System (ADS)

    Yildirim, O.; Inyurt, S.; Mekik, C.

    2015-10-01

    Turkey is a country located in the mid-latitude zone, where tectonic activity is intense. Most recently, an earthquake of magnitude Mw 6.5 occurred offshore in the Aegean Sea on 24 May 2014 at 12:25 UTC and lasted approximately 40 s. The earthquake was also felt in Greece, Romania, and Bulgaria, in addition to Turkey. In recent years, studies have sought ionospheric anomalies of seismic origin using TEC (Total Electron Content) derived from GNSS (Global Navigation Satellite System) signals, and the findings have been reported. In this study, TEC and positional variations were examined separately for the earthquake that occurred in the Aegean Sea, and the correlation of the ionospheric variation with the positional variation was then investigated. For this purpose, a total of fifteen stations were used: four CORS-TR stations in the seismic zone (AYVL, CANA, IPSA, YENC) together with IGS and EUREF stations. The ionospheric and positional variations of the AYVL, CANA, IPSA, and YENC stations were examined with the Bernese v5.0 software. Examination of the PPP-TEC values produced by the analysis shows that, at the four stations located in Turkey, three days before the earthquake at 08:00 and 10:00 UTC, the TEC values were approximately 4 TECU above the upper-limit TEC value. At the same stations, one day before the earthquake at 06:00, 08:00, and 10:00 UTC, the TEC values were approximately 5 TECU below the lower-limit TEC value. The GIM-TEC values published by the CODE center were also examined. At all stations, three days before the earthquake the TEC values at 08:00 and 10:00 UTC were approximately 2 TECU above the upper limit, while one day before the earthquake, at 06:00, 08:00, and 10:00 UTC, the TEC values were approximately 4 TECU below the lower limit. Using the same fifteen stations, positional variations before and after the earthquake were also investigated for the AYVL, CANA, IPSA, and YENC stations. The analysis showed positional displacements before and after the earthquake at the CANA station, the station nearest to the earthquake's center: about 10 cm three days before and about 3 cm one day before the earthquake.
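    The upper- and lower-limit comparison described above is, in many TEC-precursor studies, built from a sliding median and inter-quartile range; the sketch below assumes that construction (the paper does not spell out its exact bound definition) and runs it on synthetic data.

        # Sketch of a sliding-window upper/lower bound test for TEC anomalies, a common
        # construction in TEC-precursor studies (assumed here, not taken from this paper).
        # Data are synthetic.
        import numpy as np

        rng = np.random.default_rng(3)
        tec = 20 + 2 * np.sin(np.arange(240) * 2 * np.pi / 12) + rng.normal(0, 0.8, 240)
        tec[200] += 8.0          # inject a positive "anomaly" for illustration

        window = 24              # preceding samples (e.g., hours) used for the bounds
        k = 1.5                  # inter-quartile multiplier

        for t in range(window, len(tec)):
            past = tec[t - window:t]
            q1, med, q3 = np.percentile(past, [25, 50, 75])
            upper = med + k * (q3 - q1)
            lower = med - k * (q3 - q1)
            if tec[t] > upper or tec[t] < lower:
                print(f"sample {t}: TEC {tec[t]:.1f} TECU outside [{lower:.1f}, {upper:.1f}]")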

  2. Earthquakes' impact on hydrothermal systems may be far-reaching

    NASA Astrophysics Data System (ADS)

    Johnson, H. Paul; Dziak, Robert P.; Fisher, Charles R.; Fox, Christopher G.; Pruis, Matthew J.

    Recent work has linked earthquake activity with changes in flow and temperature due to hydrothermal venting at mid-ocean ridges. These intriguing relationships are important motivation for modeling marine hydrothermal systems. However, a re-examination of some earlier vent monitoring data from the Juan de Fuca Ridge, combined with analysis of recently reprocessed SOSUS (SOund Surveillance System) hydrophone data (Figure 1), suggests that such activity may be linked over considerable distances of greater than 200 km and reaction intervals of over a month. The available observational data are sparse, so the direct association between earthquakes and changes in crustal fluid circulation is difficult to verify. However, the response times and distance scales are consistent with other observations, including earthquakes in land-based settings [Hill et al., 1993] and modeling of flow in porous media [Pruis et al., 2000]. If true, these associations imply that marine hydrothermal systems are extremely complex and may be sensitive to very subtle environmental changes.

  3. Moored systems designed to sense deep ocean earthquakes

    SciTech Connect

    Hartman, P.J.

    1982-09-01

    The ability to predict earthquakes and tsunamis is becoming increasingly important as world population continues to grow in high-density coastal metropolitan areas. Earthquakes which occur in and near undersea subduction zones where the earth's crust slides under continental masses generate highly destructive tsunamis. Deep ocean buoy systems and sensor implantation techniques are being developed to obtain seismic data from the earth's crust in water depths of 6000 m. For the first time, deep-sea drilling, high-resolution seismic sensors, and long-term, deep-ocean mooring technology are being combined to provide systems which continuously monitor earthquake activity in the deep ocean. Such systems provide vital seismic research information to the scientific community.

  4. ISET Journal of Earthquake Technology, Paper No. 500, Vol. 46, No. 1, March 2009, pp. 1-17, SMOOTH SPECTRA OF HORIZONTAL AND VERTICAL GROUND

    E-print Network

    Gupta, Vinay Kumar

    ISET Journal of Earthquake Technology, Paper No. 500, Vol. 46, No. 1, March 2009, pp. 1-17. Author affiliations: Seismology Research Center, International Institute of Earthquake Engineering and Seismology, Tehran, Iran; Earthquake Research Institute, University of Tokyo, Tokyo, Japan; Iran Strong Motion Network, Building ...

  5. Bulletin of the Seismological Society of America, Vol. 96, No. 3, pp. 1140-1158, June 2006, doi: 10.1785/0120040239 Fault Parameter Constraints Using Relocated Earthquakes: A Validation

    E-print Network

    California at San Diego, University of

    Fault Parameter Constraints Using Relocated Earthquakes: A Validation ... California Earthquake Data Center (NCEDC) catalog, calculated using the FPFIT algorithm (Reasenberg ...) ... to seismic velocity model variations and earthquake location (Hardebeck and Shearer, 2002). We assume any ...

  6. Fault-Zone Maturity Defines Maximum Earthquake Magnitude

    NASA Astrophysics Data System (ADS)

    Bohnhoff, M.; Bulut, F.; Stierle, E.; Ben-Zion, Y.

    2014-12-01

    Estimating the maximum likely magnitude of future earthquakes on transform faults near large metropolitan areas has fundamental consequences for the expected hazard. Here we show that the maximum earthquakes on different sections of the North Anatolian Fault Zone (NAFZ) scale with the duration of fault zone activity, cumulative offset, and length of individual fault segments. The findings are based on a compiled catalogue of historical earthquakes in the region, using the extensive literary sources that exist due to the long civilization record. We find that the largest earthquakes (M~8) are exclusively observed along the well-developed part of the fault zone in the east. In contrast, the western part is still in a juvenile or transitional stage, with historical earthquakes not exceeding M=7.4. This limits the current seismic hazard to NW Turkey and its largest regional population and economic center, Istanbul. Our findings for the NAFZ are consistent with data from the two other major transform faults, the San Andreas fault in California and the Dead Sea Transform in the Middle East. The results indicate that maximum earthquake magnitudes generally scale with fault-zone evolution.

  7. Rupture propagation behavior and the largest possible earthquake induced by fluid injection into deep reservoirs

    NASA Astrophysics Data System (ADS)

    Gischig, Valentin S.

    2015-09-01

    Earthquakes caused by fluid injection into deep underground reservoirs constitute an increasingly recognized risk to populations and infrastructure. Quantitative assessment of induced seismic hazard, however, requires estimating the maximum possible magnitude earthquake that may be induced during fluid injection. Here I seek constraints on an upper limit for the largest possible earthquake using source-physics simulations that consider rate-and-state friction and hydromechanical interaction along a straight homogeneous fault. Depending on the orientation of the pressurized fault in the ambient stress field, different rupture behaviors can occur: (1) uncontrolled rupture-front propagation beyond the pressure front or (2) rupture-front propagation arresting at the pressure front. In the first case, fault properties determine the earthquake magnitude, and the upper magnitude limit may be similar to natural earthquakes. In the second case, the maximum magnitude can be controlled by carefully designing and monitoring injection and thus restricting the pressurized fault area.
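    For the second regime, in which rupture arrest at the pressure front controls the event size, a rough upper bound follows from the seismic moment of a rupture confined to the pressurized patch. The sketch below assumes a circular crack and a fixed stress drop; it is a simplification, not a substitute for the paper's rate-and-state simulations.

        # Rough upper-bound sketch for an injection-induced event whose rupture is
        # confined to the pressurized fault patch (assumed circular crack and assumed
        # stress drop; simplified relative to the paper's rate-and-state modeling).
        import math

        def max_magnitude(pressurized_radius_m, stress_drop_pa=3e6):
            """Moment of a circular crack: M0 = (16/7) * stress_drop * R^3,
            then Mw = (2/3) * (log10(M0) - 9.1)  [M0 in N*m]."""
            m0 = (16.0 / 7.0) * stress_drop_pa * pressurized_radius_m ** 3
            return (2.0 / 3.0) * (math.log10(m0) - 9.1)

        for r in (200, 500, 1000, 2000):
            print(f"pressurized radius {r:>4} m -> Mw ~ {max_magnitude(r):.1f}")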

  8. Prediction Capabilities of VLF/LF Emission as the Main Precursor of Earthquake

    E-print Network

    Kachakhidze, Manana

    2013-01-01

    Recent satellite and ground-based observations have shown that, during the earthquake preparation period, VLF/LF and ULF electromagnetic emissions occur in the seismogenic area. In the opinion of the authors of the present paper, this phenomenon is more universal and reliable than other earthquake indicators. Hypothetically, given adequate methodological grounds, earth VLF/LF electromagnetic emission might in the near future be declared the main earthquake precursor. In particular, permanent monitoring of the frequency spectrum of earth electromagnetic emission generated in the earthquake preparation period might prove very useful for the prediction of large (M 5) inland earthquakes. The present paper offers a methodological scheme by which this hypothesis can be tested. To demonstrate the prediction capabilities of earth electromagnetic emission we have used an avalanche-like unstable model of fault formation and an analogous model of ele...

  9. Groundwater-strain coupling before the 1999 Mw 7.6 Taiwan Chi-Chi earthquake

    NASA Astrophysics Data System (ADS)

    Chen, Chieh-Hung; Tang, Chi-Chia; Cheng, Kai-Chien; Wang, Chung-Ho; Wen, Strong; Lin, Cheng-Horng; Wen, Yi-Ying; Meng, Guojie; Yeh, Ta-Kang; Jan, Jyh Cherng; Yen, Horng-Yuan; Liu, Jann-Yenq

    2015-05-01

    Coupling of pre-earthquake anomalous phenomena between long-term groundwater levels recorded at 42 monitoring stations and time-varying surface strain derived from 16 GPS stations was found in the Choshuichi Alluvial Fan before the 1999 Mw 7.6 Chi-Chi earthquake in Taiwan. The noise-free groundwater-level anomalies consistently comprise a sequence of decrease, rise, and flat phases, which agrees very well with changes in strain rates computed from the GPS stations. These coupling agreements show that, in addition to compression, tension can also be generated before a thrust earthquake. This case demonstrates that short-term surface deformation signals and the reliability of pre-earthquake anomalous phenomena can be examined simultaneously using a multiple-parameter crosscheck, significantly reducing the uncertainty of earthquake precursor evaluation.

  10. Assessment of Readiness for Clinical Decision Support to Aid Laboratory Monitoring of Immunosuppressive Care at U.S. Liver Transplant Centers

    PubMed Central

    Weir, C.; Evans, R. S.; Staes, C.

    2014-01-01

    Background: Following liver transplantation, patients require lifelong immunosuppressive care and monitoring. Computerized clinical decision support (CDS) has been shown to improve post-transplant immunosuppressive care processes and outcomes. The readiness of transplant information systems to implement computerized CDS to support post-transplant care is unknown. Objectives: (a) Describe the current clinical information system functionality and manual and automated processes for laboratory monitoring of immunosuppressive care; (b) describe the use of guidelines that may be used to produce computable logic and the use of computerized alerts to support guideline adherence; and (c) explore barriers to implementation of CDS in U.S. liver transplant centers. Methods: We developed a web-based survey using cognitive interviewing techniques. We surveyed 119 U.S. transplant programs that performed at least five liver transplantations per year during 2010–2012. Responses were summarized using descriptive analyses; barriers were identified using qualitative methods. Results: Respondents from 80 programs (67% response rate) completed the survey. While 98% of programs reported having an electronic health record (EHR), all programs used paper-based manual processes to receive or track immunosuppressive laboratory results. Most programs (85%) reported that 30% or more of their patients used external laboratories for routine testing. Few programs (19%) received most external laboratory results as discrete data via electronic interfaces, while most (80%) manually entered laboratory results into the EHR; less than half (42%) could integrate internal and external laboratory results. Nearly all programs had guidelines regarding pre-specified target ranges (92%) or testing schedules (97%) for managing immunosuppressive care. Few programs used computerized alerting to notify transplant coordinators of out-of-range (27%) or overdue laboratory results (20%). Conclusions: Use of EHRs is common, yet all liver transplant programs were largely dependent on manual paper-based processes to monitor immunosuppression for post-liver transplant patients. Similar immunosuppression guidelines provide opportunities for sharing CDS once integrated laboratory data are available. PMID:25589912

  11. The 2014 Greeley, Colorado Earthquakes: Science, Industry, Regulation, and Media

    NASA Astrophysics Data System (ADS)

    Yeck, W. L.; Sheehan, A. F.; Weingarten, M.; Nakai, J.; Ge, S.

    2014-12-01

    On June 1, 2014 (UTC), a magnitude 3.2 earthquake occurred east of the town of Greeley, Colorado. The earthquake was widely felt, with reports from Boulder and Golden, over 60 miles away from the epicenter. The location of the earthquake in a region long considered aseismic but now the locus of active oil and gas production prompted the question of whether this was a natural or induced earthquake. Several classic induced seismicity cases hail from Colorado, including the Rocky Mountain Arsenal earthquakes in the 1960s and the Paradox Valley earthquakes in western Colorado. In both cases the earthquakes were linked to wastewater injection. The Greeley earthquake epicenter was close to a Class II well that had been injecting waste fluid into the deepest sedimentary formation of the Denver Basin at rates as high as 350,000 barrels/month for less than a year. The closest seismometers to the June 1 event were more than 100 km away, necessitating deployment of a local seismic network for detailed study. IRIS provided six seismometers to the University of Colorado, which were deployed within 3 days of the mainshock. Telemetry at one site allowed real-time monitoring of the ongoing seismic sequence. Local media interest was extremely high, with speculation that the earthquake was linked to the oil and gas industry. The timetable of media demand for information posed some challenges given the time needed for data collection and analysis. We adopted a policy of open data and open communication with all interested parties, and made proactive attempts to provide information to industry and regulators. After 3 weeks of data collection and analysis, the proximity and timing of the mainshock and aftershocks relative to the C4A injection well, along with a sharp increase in seismicity culminating in an M 2.6 aftershock, led to a decision by the Colorado Oil and Gas Conservation Commission (COGCC) to recommend a temporary halt to injection at the C4A injection well. This was the first time that such action had been taken by the COGCC. This presentation provides an overview of the interactions among academic researchers, industry, media, and regulators during the period of rapid response to this earthquake sequence, and the role of seismology in informing those responses.

  12. Monitoring the Dusty S-Cluster Object (DSO/G2) on its Orbit towards the Galactic Center Black Hole

    E-print Network

    Valencia-S., M; Zajacek, M; Peissker, F; Parsa, M; Grosso, N; Mossoux, E; Porquet, D; Jalali, B; Karas, V; Yazici, S; Shahzamanian, B; Sabha, N; Saalfeld, R; Smajic, S; Grellmann, R; Moser, L; Horrobin, M; Borkar, A; Marin, M Garcia; Dovciak, M; Kunneriath, D; Karssen, G D; Bursa, M; Straubmeier, C; Bushouse, H

    2014-01-01

    We elaborate on our Astronomer's Telegram #6285 and report in detail new near-infrared (1.45 - 2.45 microns) observations of the Dusty S-cluster Object (DSO/G2) during its approach to the black hole at the center of the Galaxy that were carried out with ESO VLT/SINFONI between February and April 2014. We detect spatially compact Br-gamma and Pa-alpha line emission from the DSO at about 30-40 mas east of SgrA*. The velocity of the source, measured from the red-shifted emission, is (2700 ± 60) km/s. No blue-shifted emission above the noise level is detected at the position of SgrA* or upstream of the presumed orbit. The full width at half maximum of the red Br-gamma line is (50 ± 10) Angstroms, i.e. no significant line broadening with respect to last year is observed. This is a further indication of the compactness of the source. For the moment, the flaring activity of the black hole in the near-infrared regime has not shown any statistically significant increase. We conclude that the DSO source had not yet reached...

  13. Cooperative Monitoring Center Occasional Paper/12: ENTNEA: A Concept for Enhancing Nuclear Transparency for Confidence Building in Northeast Asia

    SciTech Connect

    Nam, Man-Kwon; Shin, Sung-Tack

    1999-06-01

    Nuclear energy continues to be a strong and growing component of economic development in Northeast Asia. A broad range of nuclear energy systems already exists across the region and vigorous growth is projected. Associated with these capabilities and plans are various concerns about operational safety, environmental protection, and accumulation of spent fuel and other nuclear materials. We consider cooperative measures that might address these concerns. The confidence building measures suggested here center on the sharing of information to lessen concerns about nuclear activities or to solve technical problems. These activities are encompassed by an Enhanced Nuclear Transparency in Northeast Asia (ENTNEA) concept that would be composed of near-term, information-sharing activities and an eventual regional institution. The near-term activities would address specific concerns and build a tradition of cooperation; examples include radiation measurements for public safety and emergency response, demonstration of safe operations at facilities and in transportation, and material security in the back end of the fuel cycle. Linkages to existing efforts and organizations would be sought to maximize the benefits of cooperation. In the longer term, the new cooperative tradition might evolve into an ENTNEA institution. In institutional form, ENTNEA could combine the near-term activities and new cooperative activities, which might require an institutional basis, for the mutual benefit and security of regional parties.

  14. The Los Alamos Seismic Network (LASN): Improved Network Instrumentation, Local Earthquake Catalog Updates, and Peculiar Types of Data

    NASA Astrophysics Data System (ADS)

    Roberts, P. M.; Ten Cate, J. A.; House, L. S.; Greene, M. K.; Morton, E.; Kelley, R. E.

    2013-12-01

    The Los Alamos Seismic Network (LASN) has operated for 41 years and has provided the data to locate more than 2,500 earthquakes in north-central New Mexico. The network was installed for seismic verification research, as well as to monitor and locate earthquakes near Los Alamos National Laboratory (LANL). LASN stations are the only monitoring stations in New Mexico north of Albuquerque. The original network once included 22 stations in northern New Mexico. With limited funding in the early 1980s, the network was downsized to 7 stations within an area of about 15 km (N-S) by 15 km (E-W), centered on Los Alamos. Over the last four years, eight additional stations have been installed, which have considerably expanded the spatial coverage of the network. Currently, 7 stations have broadband, three-component seismometers with digital telemetry, and the remaining 8 have traditional 1 Hz short-period seismometers with either analog telemetry or on-site digital recording. A vertical array of accelerometers was also installed in a wellbore on LANL property. This borehole array has 3-component digital strong-motion sensors. Recently we began upgrading the local strong-motion accelerometer (SMA) network as well, with the addition of high-resolution digitizers and high-sensitivity force-balance accelerometers (FBA). We will present an updated description of the current LASN station, instrumentation, and telemetry configurations, as well as the data acquisition and event-detection software structure used to record events in Earthworm. Although more than 2,000 earthquakes were detected and located in north-central New Mexico during the first 11 years of LASN's operation (1973 to 1984), currently only 1-2 earthquakes per month are detected and located within about 150 km of Los Alamos. Over 850 of these nearby earthquakes have been located from 1973 to present. We recently updated the LASN earthquake catalog for north-central New Mexico up through 2012 and most of 2013. Locations for these earthquakes are based on new, consistently picked arrival times, updated station locations, and the best available velocity model. Most have magnitudes less than 1.5 and are not contained in the catalogs of any other network. With 3 of the new broadband stations in and around the nearby Valles Caldera, LASN is now able to monitor even very small volcano-seismic events that may be associated with the caldera. The expanded station coverage and instrument sensitivity have also allowed detection of smaller, more distant events and new types of peculiar, non-earthquake signals we had not previously seen (e.g., train noise). These unusual signals have complicated our event discrimination efforts. We will show an updated map of north-central New Mexico seismicity based on these recent efforts, as well as examples of some of the new types of data LASN is now picking up. Although the network and data are generally not accessible to the public, requests for data can be granted on a case-by-case basis.

  15. Self-Organized Earthquakes

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Holliday, J. R.; Turcotte, D. L.; Klein, W.

    2011-12-01

    Self-Organized Criticality was proposed by Per Bak et al. [1] as a means of explaining scaling laws observed in driven natural systems, usually in (slowly) driven threshold systems. The example used by Bak was a simple cellular automaton model of a sandpile, in which grains of sand were slowly dropped (randomly) onto a flat plate. After a period of time, during which the 'critical state' was approached, a series of self-similar avalanches would begin. Scaling exponents for the frequency-area statistics of the sandpile avalanches were found to be approximately 1, a value that characterizes 'flicker noise' in natural systems. SOC is associated with a critical point in the phase diagram of the system, and it was found that the usual 2-scaling field theory applies. A model related to SOC is the Self-Organized Spinodal (SOS), or intermittent criticality model. Here a slow but persistent driving force leads to a quasi-periodic approach to, and retreat from, the classical limit of stability, or spinodal. Scaling exponents for this model can be related to Gutenberg-Richter and Omori exponents observed in earthquake systems. In contrast to SOC models, nucleation, of both classical and non-classical types, is possible in SOS systems. Tunneling or nucleation rates can be computed from Langer-Klein-Landau-Ginzburg theories for comparison to observations. Nucleating droplets play a role similar to characteristic earthquake events. Simulations of these systems reveal much of the phenomenology associated with earthquakes and other types of "burst" dynamics. Whereas SOC is characterized by the full scaling spectrum of avalanches, SOS is characterized both by system-size events above the nominal frequency-size scaling curve and by scaling of small events. Applications to other systems, including integrate-and-fire neural networks and financial crashes, will be discussed. [1] P. Bak, C. Tang and K. Wiesenfeld, Self-Organized Criticality, Phys. Rev. Lett., 59, 381 (1987).
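    The sandpile automaton described above is straightforward to reproduce; a minimal sketch of the Bak-Tang-Wiesenfeld rules (random grain drops, toppling of any cell holding four or more grains, grains lost off the open edges) is given below, recording avalanche sizes whose frequency-size statistics approach a power law. Grid size and number of drops are arbitrary choices.

        # Minimal Bak-Tang-Wiesenfeld sandpile sketch: drop grains at random cells;
        # any cell holding 4 or more grains topples, sending one grain to each
        # neighbour (grains fall off the open edges). Avalanche size = number of
        # toppling events triggered by one dropped grain.
        import numpy as np

        rng = np.random.default_rng(4)
        n = 30
        grid = np.zeros((n, n), dtype=int)
        sizes = []

        for _ in range(20000):
            i, j = rng.integers(0, n, 2)
            grid[i, j] += 1
            size = 0
            while True:
                over = grid >= 4                 # cells that topple this sweep
                if not over.any():
                    break
                size += over.sum()
                grid[over] -= 4
                # each toppling cell sends one grain to each of its 4 neighbours
                grid[1:, :][over[:-1, :]] += 1   # down
                grid[:-1, :][over[1:, :]] += 1   # up
                grid[:, 1:][over[:, :-1]] += 1   # right
                grid[:, :-1][over[:, 1:]] += 1   # left
            sizes.append(size)

        sizes = np.array(sizes)
        # Frequency-size statistics of the larger avalanches approximate a power law.
        for s in (1, 10, 100):
            print(f"avalanches of size >= {s:>3}: {(sizes >= s).sum()}")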

  16. Make an Earthquake: Ground Shaking!

    ERIC Educational Resources Information Center

    Savasci, Funda

    2011-01-01

    The main purposes of this activity are to help students explore possible factors affecting the extent of the damage of earthquakes and learn the ways to reduce earthquake damages. In these inquiry-based activities, students have opportunities to develop science process skills and to build an understanding of the relationship among science,…

  17. Earthquakes Threaten Many American Schools

    ERIC Educational Resources Information Center

    Bailey, Nancy E.

    2010-01-01

    Millions of U.S. children attend schools that are not safe from earthquakes, even though they are in earthquake-prone zones. Several cities and states have worked to identify and repair unsafe buildings, but many others have done little or nothing to fix the problem. The reasons for ignoring the problem include political and financial ones, but…

  18. Earthquake Preparedness Checklist for Schools.

    ERIC Educational Resources Information Center

    1999

    A brochure provides a checklist highlighting the important questions and activities that should be addressed and undertaken as part of a school safety and preparedness program for earthquakes. It reminds administrators and other interested parties on what not to forget in preparing schools for earthquakes, such as staff knowledge needs, evacuation…

  19. Implementation of the National Incident Management System (NIMS)/Incident Command System (ICS) in the Federal Radiological Monitoring and Assessment Center (FRMAC) - Emergency Phase

    SciTech Connect

    NSTec Environmental Restoration

    2007-04-01

    Homeland Security Presidential Directive HSPD-5 requires all federal departments and agencies to adopt a National Incident Management System (NIMS)/Incident Command System (ICS) and use it in their individual domestic incident management and emergency prevention, preparedness, response, recovery, and mitigation programs and activities, as well as in support of those actions taken to assist state and local entities. This system provides a consistent nationwide template to enable federal, state, local, and tribal governments, private-sector, and nongovernmental organizations to work together effectively and efficiently to prepare for, prevent, respond to, and recover from domestic incidents, regardless of cause, size, or complexity, including acts of catastrophic terrorism. This document identifies the operational concepts of the Federal Radiological Monitoring and Assessment Center's (FRMAC) implementation of the NIMS/ICS response structure under the National Response Plan (NRP). The construct identified here defines the basic response template to be tailored to the incident-specific response requirements. FRMAC's mission to facilitate interagency environmental data management, monitoring, sampling, analysis, and assessment and link this information to the planning and decision staff clearly places the FRMAC in the Planning Section. FRMAC is not a mitigating resource for radiological contamination but is present to conduct radiological impact assessment for public dose avoidance. Field monitoring is a fact-finding mission to support this effort directly. Decisions based on the assessed data will drive public protection and operational requirements. This organizational structure under NIMS is shaped by the mission responsibilities and interface requirements, on the premise of providing emergency responders with a flexible yet standardized structure for incident response activities. The coordination responsibilities outlined in the NRP are based on the NIMS/ICS construct and Unified Command (UC) for management of a domestic incident. The NRP Nuclear/Radiological Incident Annex (NUC) further provides requirements and protocols for coordinating federal government capabilities to respond to nuclear/radiological Incidents of National Significance (INS) and other radiological incidents. When a FRMAC is established, it operates under the parameters of NIMS as defined in the NRP. FRMAC and its operations have been modified to reflect NIMS/ICS concepts and principles and to facilitate working in a Unified Command structure. FRMAC is established at or near the scene of the incident to coordinate radiological monitoring and assessment and is established in coordination with the U.S. Department of Homeland Security (DHS); the coordinating agency; other federal agencies; and state, local, and tribal authorities. However, regardless of the coordinating agency designation, the U.S. Department of Energy (DOE) coordinates radiological monitoring and assessment activities for the initial phases of the offsite federal incident response through the Radiological Assistance Program (RAP) and FRMAC assets. Monitoring and assessment data are managed by FRMAC in an accountable, secure, and retrievable format. Monitoring data interpretations, including exposure rate contours, dose projections, and any requested radiological assessments, are to be provided to the DHS; to the coordinating agency; and to state, local, and tribal government agencies.

  20. TLC-Asthma: An Integrated Information System for Patient-centered Monitoring, Case Management, and Point-of-Care Decision Support

    PubMed Central

    Adams, William G.; Fuhlbrigge, Anne L.; Miller, Charles W.; Panek, Celeste G.; Gi, Yangsoon; Loane, Kathleen C.; Madden, Nancy E.; Plunkett, Anne M.; Friedman, Robert H.

    2003-01-01

    A great deal of successful work has been done in the area of EMR development, implementation, and evaluation. Less work has been done in the area of automated systems for patients. Efforts to link data at multiple levels (the patient, the case manager, and the clinician) have been rudimentary to date. In this paper we present a model information system that integrates patient health information across multiple domains to support the monitoring and care of children with persistent asthma. The system has been developed for use in a multi-specialty group practice and includes three primary components: 1) a patient-centered telephone-linked communication system; 2) a web-based alert reporting and nurse case-management system; and 3) EMR-based provider communication to support clinical decision making at the point of care. The system offers a model for a new level of connectivity for health information that supports customized monitoring, IT-enabled nurse case-managers, and the delivery of longitudinal data to clinicians to support the care of children with persistent asthma. Systems like the one described are well-suited, perhaps essential, technologies for the care of children and adults with chronic conditions such as asthma. PMID:14728122

  1. Monitoring the Dusty S-cluster object (DSO/G2) near the Galactic center black hole: model predictions for Br-gamma energy shift during the passage

    NASA Astrophysics Data System (ADS)

    Karas, V.; Zajacek, M.; Kunneriath, D.; Valencia-S., M.; Eckart, A.

    2015-07-01

    Dusty S-cluster Object (DSO/G2) has approached the supermassive black hole at the center of the Galaxy and its passage through the peribothron was monitored by the ESO VLT/SINFONI observations taken in the near-infrared K-band. The profile and the energy shift of Br-gamma spectral line can be employed to further constrain the nature of this event. We update and discuss the model predictions for different scenarios: a core-less cloud versus an enshrouded star with a partially disintegrating envelope, potentially forming a bow shock due to stellar outflow. A comparison of observations with model predictions shows that the DSO is a star rather than a core-less cloud.

  2. Earthquake Scaling, Simulation and Forecasting

    NASA Astrophysics Data System (ADS)

    Sachs, Michael Karl

    Earthquakes are among the most devastating natural events faced by society. In 2011, just two events, the magnitude 6.3 earthquake in Christchurch, New Zealand, on February 22, and the magnitude 9.0 Tohoku earthquake off the coast of Japan on March 11, caused a combined total of $226 billion in economic losses. Over the last decade, 791,721 deaths were caused by earthquakes. Yet, despite their impact, our ability to accurately predict when earthquakes will occur is limited. This is due, in large part, to the fact that the fault systems that produce earthquakes are non-linear: very small differences in the state of the system now lead to very large differences in the future, which makes forecasting difficult. In spite of this, there are patterns in earthquake data. These patterns often take the form of frequency-magnitude scaling relations that relate the number of smaller events observed to the number of larger events observed. In many cases these scaling relations show consistent behavior over a wide range of scales. This consistency forms the basis of most forecasting techniques. However, the utility of these scaling relations is limited by the size of earthquake catalogs which, especially in the case of large events, are fairly small and limited to a few hundred years of events. In this dissertation I discuss three areas of earthquake science. The first is an overview of scaling behavior in a variety of complex systems, both models and natural systems, with a focus on understanding how this scaling behavior breaks down. The second is a description of the development and testing of an earthquake simulator called Virtual California, designed to extend the observed catalog of earthquakes in California; this simulator uses novel techniques borrowed from statistical physics to enable the modeling of large fault systems over long periods of time. The third is an evaluation of existing earthquake forecasts, which focuses on the Regional Earthquake Likelihood Models (RELM) test: the first competitive test of earthquake forecasts in California.
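
    The frequency-magnitude scaling relations discussed here are commonly summarized by the Gutenberg-Richter b-value. A minimal sketch of the standard Aki/Utsu maximum-likelihood estimate is given below, assuming a catalog of magnitudes and a known completeness magnitude; the function and parameter names are ours, not those used in the dissertation.

        import numpy as np

        def gutenberg_richter_b(magnitudes, m_complete, bin_width=0.1):
            """Aki/Utsu maximum-likelihood estimate of the Gutenberg-Richter b-value.

            magnitudes -- catalog magnitudes (array-like)
            m_complete -- magnitude of completeness; smaller events are discarded
            bin_width  -- magnitude bin width of the catalog (0.1 is typical)
            """
            m = np.asarray(magnitudes, dtype=float)
            m = m[m >= m_complete]
            if m.size == 0:
                raise ValueError("no events at or above the completeness magnitude")
            # Utsu's half-bin correction accounts for the binning of magnitudes.
            return np.log10(np.e) / (m.mean() - (m_complete - bin_width / 2.0))

    For a catalog that follows Gutenberg-Richter scaling with b near 1, a call such as gutenberg_richter_b(mags, 2.5) returns a value close to 1; the uncertainty of the estimate grows quickly as the number of events above the completeness magnitude shrinks, which is the catalog-size limitation the abstract points to.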

  3. Education for Earthquake Disaster Prevention in the Tokyo Metropolitan Area

    NASA Astrophysics Data System (ADS)

    Oki, S.; Tsuji, H.; Koketsu, K.; Yazaki, Y.

    2008-12-01

    Japan frequently suffers from all types of disasters such as earthquakes, typhoons, floods, volcanic eruptions, and landslides. In the first half of this year alone, we have already had three big earthquakes and heavy rainfall that killed more than 30 people. This is not unique to Japan: Asia is the most disaster-afflicted region in the world, accounting for about 90% of all those affected by disasters and more than 50% of the total fatalities and economic losses. One of the most essential ways to reduce the damage from natural disasters is to educate the general public so that they understand what is happening during such disasters; this enables individuals to make sound decisions about what to do to prevent or reduce the damage. The Ministry of Education, Culture, Sports, Science and Technology (MEXT) therefore called for proposals to select several model areas in which to bring scientific education to local elementary schools, and ERI, the Earthquake Research Institute, was chosen to develop education for earthquake disaster prevention in the Tokyo metropolitan area. The tectonic setting of this area is very complicated: the Pacific and Philippine Sea plates are subducting beneath the North American and Eurasian plates. The subduction of the Philippine Sea plate causes mega-thrust earthquakes such as the 1703 Genroku earthquake (M 8.0) and the 1923 Kanto earthquake (M 7.9), which caused 105,000 fatalities. A magnitude 7 or greater earthquake beneath this area has recently been evaluated to have a 70% probability of occurring within 30 years. This is of immediate concern because the Tokyo urban region now has a population of 42 million and is the center of approximately 40% of the nation's activities, so such an earthquake could cause devastating loss of life and property and have major global economic repercussions. To better understand earthquakes in this region, the "Special Project for Earthquake Disaster Mitigation in Tokyo Metropolitan Area" has been conducted, mainly by ERI; it is a 4-year project to develop a high-density seismic network with 400 sites at local elementary schools. We start our education project with the real seismograms recorded in the schools' own schoolyards, putting emphasis on the reality and causes of earthquake disasters. In this presentation, we report some of the educational demonstrations and science experiments for the schoolchildren and their parents.

  4. Monitoring the Dusty S-cluster Object (DSO/G2) on its Orbit toward the Galactic Center Black Hole

    NASA Astrophysics Data System (ADS)

    Valencia-S., M.; Eckart, A.; Zajaček, M.; Peissker, F.; Parsa, M.; Grosso, N.; Mossoux, E.; Porquet, D.; Jalali, B.; Karas, V.; Yazici, S.; Shahzamanian, B.; Sabha, N.; Saalfeld, R.; Smajic, S.; Grellmann, R.; Moser, L.; Horrobin, M.; Borkar, A.; García-Marín, M.; Dovčiak, M.; Kunneriath, D.; Karssen, G. D.; Bursa, M.; Straubmeier, C.; Bushouse, H.

    2015-02-01

    We analyze and report in detail new near-infrared (1.45-2.45 μm) observations of the Dusty S-cluster Object (DSO/G2) during its approach to the black hole at the center of the Galaxy that were carried out with the ESO Very Large Telescope/SINFONI between 2014 February and September. Before 2014 May we detect spatially compact Brγ and Paα line emission from the DSO at about 40 mas east of Sgr A*. The velocity of the source, measured from the redshifted emission, is 2700 ± 60 km s-1. No blueshifted emission above the noise level is detected at the position of Sgr A* or upstream of the presumed orbit. After May we find spatially compact Brγ blueshifted line emission from the DSO at about 30 mas west of Sgr A* at a velocity of -3320 ± 60 km s-1 and no indication for significant redshifted emission. We do not detect any significant extension of the velocity gradient across the source. We find a Brγ line FWHM of 50 ± 10 Å before and 15 ± 10 Å after the peribothron transit, i.e., no significant line broadening with respect to last year is observed. Brγ line maps show that the bulk of the line emission originates from a region of less than 20 mas diameter. This is consistent with a very compact source on an elliptical orbit with a peribothron time passage in 2014.39 ± 0.14. For the moment, the flaring activity of the black hole in the near-infrared regime has not shown any statistically significant increment. Increased accretion activity of Sgr A* may still be upcoming. We discuss details of a source model according to which the DSO is a young accreting star rather than a coreless gas and dust cloud.

  5. Earthquake Simulator Finds Tremor Triggers

    SciTech Connect

    Johnson, Paul

    2015-03-27

    Using a novel device that simulates earthquakes in a laboratory setting, a Los Alamos researcher has found that seismic waves, the sounds radiated from earthquakes, can induce earthquake aftershocks, often long after a quake has subsided. The research provides insight into how earthquakes may be triggered and how they recur. Los Alamos researcher Paul Johnson and colleague Chris Marone at Penn State have discovered how wave energy can be stored in certain types of granular materials, like the type found along certain fault lines across the globe, and how this stored energy can suddenly be released as an earthquake when hit by relatively small seismic waves far beyond the traditional “aftershock zone” of a main quake. Perhaps most surprising, researchers have found that the release of energy can occur minutes, hours, or even days after the sound waves pass; the cause of the delay remains a tantalizing mystery.

  6. Are Earthquakes a Critical Phenomenon?

    NASA Astrophysics Data System (ADS)

    Ramos, O.

    2014-12-01

    Earthquakes, granular avalanches, superconducting vortices, solar flares, and even stock markets are known to evolve through power-law distributed events. For decades, the formalism of equilibrium phase transitions has labeled these phenomena as critical, which would imply that they are also unpredictable. This work revisits these ideas and uses earthquakes as the paradigm to demonstrate that slowly driven systems evolving through uncorrelated and power-law distributed avalanches (UPLA) are not necessarily critical systems, and therefore not necessarily unpredictable. By linking the correlation length to the probability density function of the avalanche distribution, and comparing it with the one obtained at a critical point, a condition of criticality is introduced. Simulations in the classical Olami-Feder-Christensen (OFC) earthquake model confirm the findings, showing that earthquakes are not a critical phenomenon. However, a single catastrophic earthquake may show critical properties and, paradoxically, the emergence of this temporal critical behaviour may eventually carry precursory signs of catastrophic events.
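
    A minimal version of the Olami-Feder-Christensen model referred to above can be sketched in Python as follows; the lattice size, conservation parameter alpha, driving scheme, and number of steps are illustrative choices, not those used in the study.

        import numpy as np

        def ofc_model(size=64, alpha=0.2, steps=10000, seed=0):
            """Olami-Feder-Christensen model on a square lattice with open boundaries.

            Each cell carries a stress in [0, 1). At each step the lattice is
            driven uniformly until the most loaded cell reaches the threshold of 1;
            unstable cells reset to zero and pass alpha times their stress to each
            of their four neighbors (stress leaving the grid is lost, so the model
            is non-conservative for alpha < 0.25). Returns the avalanche sizes."""
            rng = np.random.default_rng(seed)
            stress = rng.random((size, size))
            sizes = []

            for _ in range(steps):
                stress += 1.0 - stress.max()  # uniform drive to threshold
                avalanche = 0
                while True:
                    unstable = np.argwhere(stress >= 1.0)
                    if unstable.size == 0:
                        break
                    for x, y in unstable:
                        s = stress[x, y]
                        stress[x, y] = 0.0
                        avalanche += 1
                        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                            nx, ny = x + dx, y + dy
                            if 0 <= nx < size and 0 <= ny < size:
                                stress[nx, ny] += alpha * s
                sizes.append(avalanche)
            return sizes

    The returned avalanche sizes are the power-law distributed events whose correlations and size distributions can then be examined for the criticality condition described in the abstract.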

  7. Testing an earthquake prediction algorithm

    USGS Publications Warehouse

    Kossobokov, V.G.; Healy, J.H.; Dewey, J.W.

    1997-01-01

    A test to evaluate earthquake prediction algorithms is being applied to a Russian algorithm known as M8. The M8 algorithm makes intermediate-term predictions for earthquakes to occur in a large circle, based on integral counts of transient seismicity in the circle. In a retroactive prediction for the period January 1, 1985 to July 1, 1991, the algorithm as configured for the forward test would have predicted eight of ten strong earthquakes in the test area. A null hypothesis, based on random assignment of predictions, predicts eight earthquakes in 2.87% of the trials. The forward test began July 1, 1991 and will run through December 31, 1997. As of July 1, 1995, the algorithm had forward predicted five out of nine earthquakes in the test area, a success ratio that would have been achieved in 53% of random trials under the null hypothesis.
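
    The random-prediction null hypothesis quoted above can, in simplified form, be framed as a binomial tail probability: given the fraction of the space-time volume covered by alarms, what is the chance that randomly placed alarms catch at least the observed number of earthquakes? The actual M8 test counting is more elaborate, and the alarm coverage fraction is not given in the abstract, so the sketch below is purely illustrative (the function name and the example value are our assumptions).

        from math import comb

        def random_prediction_tail(n_events, n_hits, alarm_fraction):
            """Probability that random alarms covering a fixed fraction of the
            space-time volume would catch at least n_hits of n_events earthquakes,
            under a simple binomial null hypothesis.

            alarm_fraction is NOT given in the abstract; it must be supplied."""
            return sum(
                comb(n_events, k)
                * alarm_fraction ** k
                * (1.0 - alarm_fraction) ** (n_events - k)
                for k in range(n_hits, n_events + 1)
            )

        # Purely illustrative: with an assumed 30% alarm coverage, catching
        # 8 or more of 10 earthquakes by chance is very unlikely.
        print(random_prediction_tail(10, 8, 0.30))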

  8. Testing earthquake source inversion methodologies

    USGS Publications Warehouse

    Page, M.; Mai, P.M.; Schorlemmer, D.

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays, earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  9. Geodetic Constraints on Deformation and Earthquake Hazards in the E Mediterranean and Adjacent Regions (Invited)

    NASA Astrophysics Data System (ADS)

    Reilinger, R. E.; King, R. W.; Floyd, M.; Vernant, P.; McClusky, S.; Cakir, Z.; Ergintav, S.; Ozener, H.

    2013-12-01

    During the past ~20 years, GPS, and more recently InSAR, have had a profound impact on estimates of earthquake hazards by quantifying active deformation on a global scale. Fault slip rates have been determined to mm/yr and sub-mm/yr precision for the large majority of active continental faults, and many faults that have generated large historical earthquakes are being monitored continuously with dedicated GPS networks. In addition, earthquakes that have occurred within these networks are providing unprecedented information on the seismic deformation cycle (i.e., inter-, co-, and post-seismic), which allows improved estimates of rates of strain accumulation and hence earthquake repeat times. The identification of aseismic fault creep from GPS and InSAR observations has provided further constraints on fault mechanics, with direct implications for earthquake occurrence. These geodetic observations, in combination with instrumental, historical, and paleo-earthquake studies that allow estimation of seismic offsets and independent estimates of earthquake repeat times, provide an observational basis for estimating earthquake likelihood with quantitative uncertainties, a substantial and important step forward in seismic hazard estimation. Specific targets for geodetic studies of faulting in the 'greater' E Mediterranean region include the North Anatolian (NAF), Dead Sea, E Anatolian, Main Caucasus Thrust, Gulf of Corinth, Main Recent, and many smaller, but potentially very destructive, fault systems, as well as faulting along the Hellenic, Cyprus, and Calabrian trench systems, which carry the added hazards associated with tsunami generation. We use examples from the NAF and the Hellenic subduction system to illustrate the utility of geodetic observations for constraining fault behavior and consider the implications for seismic hazard estimation. While we emphasize the important progress made in recent years, continued monitoring and analysis are essential to develop sufficient case studies to better understand the mechanics of earthquake generation, and the range of behaviors between and within individual fault systems, in order to more precisely constrain earthquake hazard estimates.

  10. The CATDAT damaging earthquakes database

    NASA Astrophysics Data System (ADS)

    Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.

    2011-08-01

    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction, and fault rupture) database was developed to validate, remove discrepancies from, and greatly expand upon existing global databases, and to better understand the trends in vulnerability, exposure, and possible future impacts of historic earthquakes. In the authors' view, the lack of consistency and the errors in other frequently cited and widely used earthquake loss databases were a major shortcoming that needed to be addressed. Over 17,000 sources of information have been utilised, primarily in the last few years, to present data from over 12,200 damaging earthquakes historically, with over 7,000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, the slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. Comparing the 1923 Great Kanto earthquake (214 billion USD damage in 2011 HNDECI-adjusted dollars) with the 2011 Tohoku (>300 billion USD at the time of writing), 2008 Sichuan, and 1995 Kobe earthquakes shows the growing concern about economic loss in urban areas, a trend that should be expected to continue. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to allow comparisons. This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  11. Towards the Future "Earthquake" School in the Cloud: Near-real Time Earthquake Games Competition in Taiwan

    NASA Astrophysics Data System (ADS)

    Chen, K. H.; Liang, W. T.; Wu, Y. F.; Yen, E.

    2014-12-01

    To mitigate future threats from natural disasters, it is important to understand how past disasters happened, why lives were lost, and what lessons have been learned. In this way, society's attitude toward natural disasters can be transformed from training to learning. The citizen-seismologists-in-Taiwan project is designed to elevate the quality of earthquake science education by incorporating earthquake/tsunami stories and a near-real-time earthquake games competition into the traditional school curricula. Through pilot courses and professional development workshops, we have worked closely with teachers from elementary, junior high, and senior high schools to design workable teaching plans based on hands-on seismic monitoring at home or at school. We will show how 9-year-olds pick P- and S-waves and measure seismic intensity through an interactive learning platform, how scientists and school teachers work together, and how we create an environment that facilitates continuous learning (i.e., the near-real-time earthquake games competition) to make earthquake science fun.

  12. Water Quality, Fish Tissue, and Bed Sediment Monitoring in Waterbodies of Fort Chaffee Maneuver Training Center, Arkansas, 2002-2004

    USGS Publications Warehouse

    Justus, B.G.; Stanton, Gregory P.

    2005-01-01

    The Fort Chaffee Maneuver Training Center is a facility used to train as many as 50,000 Arkansas National Guardsmen each year. Due to the nature of ongoing training, and also to a poor understanding of environmental procedures that were practiced in the World War II era, areas within Fort Chaffee have the potential to be sources of a large number of contaminants. Because some streams flow onto Fort Chaffee, there is also the potential for sources that are off post to affect environmental conditions on post. This study evaluates constituent concentrations in water, fish tissue, and bed sediment collected from waterbodies on Fort Chaffee between September 2002 and July 2004. Constituent concentrations detected in the three media and measured at nine stream sites and four lake sites were compared to national and regional criteria when available. Two of the larger streams, Big and Vache Grasse Creeks, were sampled at multiple sites. All three sampled media were analyzed for insecticides, PCBs, explosives, and trace elements. Additionally, water samples were analyzed for nutrients and herbicides. The different constituents detected in the three sample media (water, fish tissue, and bed sediment) indicate that land-use activities both on and off post are influencing environmental conditions. Contaminants such as explosives that were sometimes detected in water samples have an obvious relation to military training; however, the occurrence and locations of some nutrients, insecticides, and trace elements suggest that land use both on and off post also could be influencing environmental conditions to some degree. Constituent concentrations at sites on Vache Grasse Creek, and particularly the most upstream site, which was located immediately downstream from an off-post wastewater-treatment facility, indicate that environmental conditions were being influenced by an off-post source. The most upstream site on Vache Grasse Creek had both the highest number of detections and the highest concentrations detected of all sites sampled. Event-mean storm concentrations and storm loads calculated from storm-flow samples at two sites each for Big and Vache Grasse Creeks indicate that storm loads were highest at the two Vache Grasse Creek sites for 24 of the 25 constituents detected. Further evaluation, normalizing storm loads at Big Creek to storm loads at Vache Grasse Creek by stream flow, indicates that event loads at Vache Grasse Creek were about two or more times higher than those on Big Creek for 15 of the 25 constituents measured. Low concentrations of arsenic and lead were detected in water samples, but all detections for the two trace elements occurred in samples collected at the upstream site on Vache Grasse Creek. The nickel concentration in fish livers collected from the upstream site on Vache Grasse Creek was 45 percent higher than the median of a national study of 145 sites. Mercury concentrations in edible fish tissue, which are a widespread concern in the United States, exceeded a USEPA criterion for methylmercury of 300 μg/kg in four of nine samples; however, concentrations are typical of mercury concentrations in fish tissues for the State of Arkansas. Constituent concentrations at some sites indicate that environmental conditions are being influenced by on-post activities. Of the 55 organic constituents (excluding total organic carbon) analyzed in water samples, only 10 were detected above the minimum detection limit, but four of those were explosives. Bed-sediment samples from one site located on Grayson Creek, and nearest the administrative and residential (cantonment) area, had detections for arsenic, copper, lead, manganese, nickel, and zinc that were above background concentrations, and concentrations for arsenic and nickel at this site exceeded lowest effect level criteria established by the U.S. Environmental Protection Agency. The site on Grayson Creek also had the only detections of DDT metabolites in bed sedi

  13. Tenth U.S. National Conference on Earthquake Engineering Frontiers of Earthquake Engineering