Science.gov

Sample records for earthquake monitoring center

  1. Earthquake Monitoring in Haiti

    USGS Multimedia Gallery

    Following the devastating 2010 Haiti earthquake, the USGS has been helping with earthquake awareness and monitoring in the country, with continued support from the U.S. Agency for International Development (USAID). This assistance has helped the Bureau des Mines et de l'Energie (BME) in Port-au-Prin...

  2. Earthquake engineering research center annual report, 1991-1992

    SciTech Connect

    Not Available

    1992-10-01

    The Earthquake Engineering Research Center exists to conduct research and develop technical information in all areas pertaining to earthquake engineering, including strong ground motion, response of natural and manmade structures to earthquakes, design of structures to resist earthquakes, development of new systems for earthquake protection, and development of architectural and public policy aspects of earthquake engineering. The purpose of the Center is achieved through three major functions. The first and primary function is academic research that is performed by graduate students, research engineers, and visiting postdoctoral scholars working with the Center's faculty participants. The research is funded by extramural grants awarded to individual faculty participants from private, state, and federal agencies.

  3. Earthquake Processing System at the Alaska Earthquake Information Center

    NASA Astrophysics Data System (ADS)

    Hansen, Roger; Staff, AEIC

    2010-05-01

    The Alaska Earthquake Information Center (AEIC) has the responsibility to record, locate, catalog, and alert Government entities and the public about the occurrence of earthquakes originating within the State of Alaska. Currently, we catalog about 25,000 events per year in and around the State of Alaska, utilizing a network of over 550 seismic stations. In order to handle this many stations recording such a large number of events, we have had to choose operating procedures that are both efficient and robust to be able to function with our staff of 12 people. After much evaluation of competing systems, we chose Antelope as the architecture that would allow us to best grow our capabilities in the proper directions. In this presentation we will illustrate many of our unique implementations of the Antelope tools, and the many additional modules constructed with the Antelope toolbox that have been developed to fit particular needs of AEIC. In addition to simply cataloging the many events in Alaska, we are responsible for rapid notification, ShakeMaps, several local, regional and teleseismic magnitudes (including regional moment tensors), early warning of critical structures such as the Trans-Alaska Oil Pipeline, and assistance with tsunami mitigation and warnings.

  4. An overview of the National Earthquake Information Center acquisition software system, Edge/Continuous Waveform Buffer

    USGS Publications Warehouse

    Patton, John M.; Ketchum, David C.; Guy, Michelle R.

    2015-01-01

    This document provides an overview of the capabilities, design, and use cases of the data acquisition and archiving subsystem at the U.S. Geological Survey National Earthquake Information Center. The Edge and Continuous Waveform Buffer software supports the National Earthquake Information Center’s worldwide earthquake monitoring mission in direct station data acquisition, data import, short- and long-term data archiving, data distribution, query services, and playback, among other capabilities. The software design and architecture can be configured to support acquisition and (or) archiving use cases. The software continues to be developed in order to expand the acquisition, storage, and distribution capabilities.

  5. Earthquake Observation through Groundwater Monitoring in South Korea

    NASA Astrophysics Data System (ADS)

    Piao, J.; Woo, N. C.

    2014-12-01

    According to previous research, the influence of some earthquakes can be detected through groundwater monitoring, and in some countries groundwater monitoring is used as a tool to identify earthquake precursors and as a prediction measure. In this study we attempt to capture the anomalous changes in groundwater produced by earthquakes that occurred in Korea, using the National Groundwater Monitoring Network (NGMN). To observe earthquake impacts on groundwater more effectively, we selected 28 NGMN stations located in the five earthquake-prone zones of South Korea, and we examined their responses to eight earthquakes with M ≥ 2.5 that occurred in the vicinity of these zones in 2012. So far, we have examined the groundwater monitoring data (water level, temperature, and electrical conductivity), treated only to remove barometric pressure changes, and found 29 anomalous changes, confirming that groundwater monitoring data can provide valuable information on earthquake effects. To isolate the earthquake effect from the mixed water-level signal, the other signals must be separated from the original data. Periodic signals will be separated using the Fast Fourier Transform (FFT); we will then attempt to remove the precipitation effect and determine whether the anomalies were generated by earthquakes.
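
    The FFT-based separation of periodic signals mentioned above can be sketched as follows. This is a minimal illustration on synthetic data; the sampling rate, M2 tidal period, and step size are assumptions for the example, not NGMN specifics:

```python
import numpy as np

def remove_periodic(x, fs, f_lo, f_hi):
    """Zero the spectral components between f_lo and f_hi (Hz) via the real FFT."""
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    X[(freqs >= f_lo) & (freqs <= f_hi)] = 0.0
    return np.fft.irfft(X, n=len(x))

# Synthetic hourly water-level record: an M2-like tide plus a coseismic step.
fs = 1.0 / 3600.0                                        # one sample per hour
t = np.arange(30 * 24) * 3600.0                          # 30 days, in seconds
tide = 0.05 * np.sin(2 * np.pi * t / (12.42 * 3600.0))   # 5 cm tidal signal
step = np.where(t > t[len(t) // 2], 0.10, 0.0)           # 10 cm coseismic offset
level = tide + step

f_m2 = 1.0 / (12.42 * 3600.0)
residual = remove_periodic(level, fs, 0.9 * f_m2, 1.1 * f_m2)
# The tide is suppressed while the step-like earthquake anomaly survives.
```

    The same idea extends to removing other narrowband components (e.g. diurnal effects) before looking for earthquake-related anomalies.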

  6. Twitter earthquake detection: Earthquake monitoring in a social world

    USGS Publications Warehouse

    Earle, Paul; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
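
    The short-term-average/long-term-average detector described above can be sketched on a synthetic tweet-frequency time series. The window lengths, threshold, and rates below are illustrative assumptions, not the USGS tuning:

```python
import numpy as np

def sta_lta(x, n_sta, n_lta):
    """Ratio of short-term to long-term running means, aligned at the same endpoint."""
    csum = np.cumsum(np.insert(np.asarray(x, dtype=float), 0, 0.0))
    sta = (csum[n_sta:] - csum[:-n_sta]) / n_sta   # STA ending at each sample
    lta = (csum[n_lta:] - csum[:-n_lta]) / n_lta   # LTA ending at each sample
    return sta[n_lta - n_sta:] / np.maximum(lta, 1e-9)

# Synthetic per-minute counts of tweets containing "earthquake":
# Poisson background plus a burst when a widely felt event occurs.
rng = np.random.default_rng(0)
counts = rng.poisson(2.0, 240).astype(float)
counts[120:125] += 80                       # felt event at minute 120

ratio = sta_lta(counts, n_sta=2, n_lta=60)  # ratio[i] corresponds to minute i + 59
triggers = np.flatnonzero(ratio > 8.0)
```

    With this tuning the first trigger falls within a minute or two of the burst onset; a real detector would additionally require the ratio to stay high for several samples to reject single-minute spikes.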

  7. Monitoring the Pollino Earthquake Swarm (Italy)

    NASA Astrophysics Data System (ADS)

    Roessler, D.; Passarelli, L.; Govoni, A.; Rivalta, E.

    2014-12-01

    The Mercure Basin (MB) and the Castrovillari Fault (CF) in the Pollino range (southern Apennines, Italy) represent one of the most prominent seismic gaps in the Italian seismic catalog, with no M>6 earthquakes during the last centuries. In recent times, the MB has repeatedly been affected by seismic swarms. The most energetic swarm started in 2010 and was still active in 2014; the seismicity culminated in autumn 2012 with a M=5 event on October 25. In contrast, the CF appears aseismic; only its northern part has experienced microseismicity. The range hosts a number of additional sub-parallel faults whose rheology is unclear. Current debates include the potential of the MB and the CF to host large earthquakes and the level and style of deformation. Understanding the seismicity and the behaviour of the faults is therefore necessary to assess the seismic hazard. The GFZ German Research Centre for Geosciences and INGV, Italy, have been jointly monitoring the ongoing seismicity using a small-aperture seismic array integrated into a temporary seismic network. Using the array, we automatically detect about ten times more earthquakes than are currently included in local catalogues, corresponding to a magnitude of completeness of about M~0.5. In the course of the swarm, seismicity has mainly migrated within the Mercure Basin; however, the eastward spread towards the northern tip of the CF in 2013 marks a phase with seismicity located outside the basin. The event locations indicate spatially distinct clusters with different mechanisms across the E-W trending Pollino Fault; the clusters differ in strike and dip. Calibration of the local magnitude scale confirms earlier studies further north in the Apennines. The station corrections show an N-S variation, indicating that the Pollino Fault forms an important structural boundary.

  8. Disaster monitoring for Japan Earthquake with satellites by JAXA

    NASA Astrophysics Data System (ADS)

    Takahashi, Masuo; Shimada, Masanobu; Miyagi, Yousuke; Ohki, Masato; Kawano, Noriyuki; Shiraishi, Tomohiro; Motohka, Takeshi

    2011-11-01

    The Japan Aerospace Exploration Agency (JAXA) performed disaster monitoring of the Great East Japan Earthquake in 2011. The Advanced Land Observing Satellite (ALOS), "Daichi," acquired 450 scenes for disaster monitoring of the earthquake. JAXA also received more than 5,000 scenes via the International Disaster Charter and Sentinel Asia. JAXA analyzed these images and provided the results to the Government of Japan as well as to the local governments.

  9. Earthquake monitoring for multi-temporal images of Ziyuan-3

    NASA Astrophysics Data System (ADS)

    Wang, Qing; Jiang, Yong-hua; Zhang, Guo; Sheng, Qing-hong

    2015-12-01

    With the frequent occurrence of earthquake disasters, earthquake monitoring has attracted increasing attention. Global observation by optical remote sensing is an emerging technology widely applied to monitoring temporal changes of topography caused by earthquakes, offering a large observation swath, fast data acquisition, and high timeliness. The technique relies on accurate registration of pre-seismic and post-seismic images to spot surface rupture zones; the spatial alignment accuracy of multi-temporal images is therefore a problem that hinders earthquake monitoring. Considering the adverse impact of different imaging angles, camera lens distortion, and other factors on image registration, a new high-accuracy registration approach based on constraining positioning consistency in the rational function model (RFM) is proposed. Ziyuan-3 images of Yutian County in Xinjiang are used to perform an earthquake monitoring experiment. With the proposed method, registration accuracy between pre-seismic and post-seismic images is better than 0.6 pixel, and surface rupture zones caused by the earthquake are identified promptly.
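
    As a generic illustration of the image-alignment step (not the RFM positioning-consistency method proposed in the abstract), an integer-pixel offset between two acquisitions can be estimated by phase correlation:

```python
import numpy as np

def phase_corr_shift(ref, mov):
    """Integer-pixel shift (dy, dx) such that mov ≈ np.roll(ref, (dy, dx), axis=(0, 1))."""
    cross = np.conj(np.fft.fft2(ref)) * np.fft.fft2(mov)
    corr = np.fft.ifft2(cross / np.maximum(np.abs(cross), 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    dy -= corr.shape[0] if dy > corr.shape[0] // 2 else 0   # wrap to signed shifts
    dx -= corr.shape[1] if dx > corr.shape[1] // 2 else 0
    return int(dy), int(dx)

# Simulate a pre-seismic image and a rigidly offset post-seismic image.
rng = np.random.default_rng(2)
pre = rng.random((64, 64))
post = np.roll(pre, (3, 5), axis=(0, 1))
shift = phase_corr_shift(pre, post)
```

    Sub-pixel registration, as required for the 0.6-pixel accuracy quoted above, would refine this estimate, e.g. by interpolating around the correlation peak.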

  10. Recent improvements in earthquake and tsunami monitoring in the Caribbean

    NASA Astrophysics Data System (ADS)

    Gee, L.; Green, D.; McNamara, D.; Whitmore, P.; Weaver, J.; Huang, P.; Benz, H.

    2007-12-01

    Following the catastrophic loss of life from the December 26, 2004, Sumatra-Andaman Islands earthquake and tsunami, the U.S. Government appropriated funds to improve monitoring along a major portion of vulnerable coastal regions in the Caribbean Sea, the Gulf of Mexico, and the Atlantic Ocean. Partners in this project include the United States Geological Survey (USGS), the National Oceanic and Atmospheric Administration (NOAA), the Puerto Rico Seismic Network (PRSN), the Seismic Research Unit of the University of the West Indies, and other collaborating institutions in the Caribbean region. As part of this effort, the USGS is coordinating with Caribbean host nations to design and deploy nine new broadband and strong-motion seismic stations. The instrumentation consists of an STS-2 seismometer, an Episensor accelerometer, and a Q330 high-resolution digitizer. Six stations are currently transmitting data to the USGS National Earthquake Information Center, where the data are redistributed to NOAA's Tsunami Warning Centers, regional monitoring partners, and the IRIS Data Management Center. Operating stations include: Isla Barro Colorado, Panama; Gun Hill, Barbados; Grenville, Grenada; Guantanamo Bay, Cuba; Sabaneta Dam, Dominican Republic; and Tegucigalpa, Honduras. Three additional stations in Barbuda, Grand Turk, and Jamaica will be completed during the fall of 2007. These nine stations are affiliates of the Global Seismographic Network (GSN) and complement existing GSN stations as well as regional stations. The new seismic stations improve azimuthal coverage, increase network density, and provide on-scale recording throughout the region. Complementary to this network, NOAA has placed Deep-ocean Assessment and Reporting of Tsunami (DART) stations at sites in regions with a history of generating destructive tsunamis.
    Recently, NOAA completed deployment of 7 DART stations off the coasts of Montauk Pt, NY; Charleston, SC; Miami, FL; San Juan, Puerto Rico; New Orleans, LA; and Bermuda as part of the U.S. tsunami warning system expansion. DART systems consist of an anchored seafloor bottom pressure recorder (BPR) and a companion moored surface buoy for real-time communications. The new stations are a second-generation design (DART II) equipped with two-way satellite communications that allow NOAA's Tsunami Warning Centers to set stations in event mode in anticipation of possible tsunamis or to retrieve the high-resolution (15-s interval) data in one-hour blocks for detailed analysis. Combined with the development of sophisticated wave propagation and site-specific inundation models, the DART data are being used to forecast wave heights for at-risk coastal communities. NOAA expects to deploy a total of 39 DART II buoy stations by 2008 (32 in the Pacific and 7 in the Atlantic, Caribbean, and Gulf regions). The seismic and DART networks are two components in a comprehensive and fully operational global observing system to detect and warn the public of earthquake and tsunami threats. NOAA and USGS are working together to make important strides in enhancing communication networks so residents and visitors can receive earthquake and tsunami watches and warnings around the clock.

  11. Southern California Earthquake Center (SCEC) Summer Internship Programs

    NASA Astrophysics Data System (ADS)

    Benthien, M. L.; Perry, S.; Jordan, T. H.

    2004-12-01

    For the eleventh consecutive year, the Southern California Earthquake Center (SCEC) coordinated undergraduate research experiences in summer 2004, allowing 35 students with a broad array of backgrounds and interests to work with the world's preeminent earthquake scientists and specialists. Students participate in interdisciplinary, system-level earthquake science and information technology research, and several group activities throughout the summer. Funding for student stipends and activities is made possible by the NSF Research Experiences for Undergraduates (REU) program. SCEC coordinates two intern programs: The SCEC Summer Undergraduate Research Experience (SCEC/SURE) and the SCEC Undergraduate Summer in Earthquake Information Technology (SCEC/USEIT). SCEC/SURE interns work one-on-one with SCEC scientists at their institutions on a variety of earthquake science research projects. The goals of the program are to expand student participation in the earth sciences and related disciplines, encourage students to consider careers in research and education, and to increase diversity of students and researchers in the earth sciences. 13 students participated in this program in 2004. SCEC/USEIT is an NSF REU site that brings undergraduate students from across the country to the University of Southern California each summer. SCEC/USEIT interns interact in a team-oriented research environment and are mentored by some of the nation's most distinguished geoscience and computer science researchers. The goals of the program are to allow undergraduates to use advanced tools of information technology to solve problems in earthquake research; close the gap between computer science and geoscience; and engage non-geoscience majors in the application of earth science to the practical problems of reducing earthquake risk. SCEC/USEIT summer research goals are structured around a grand challenge problem in earthquake information technology. 
    For the past three years the students have developed a new earthquake and fault visualization platform named "LA3D." 22 students participated in this program in 2004. SCEC interns come together several times during the summer, beginning with a Communication Workshop that develops the students' oral and written communication skills. In mid-summer, a one-day SCEC Intern Colloquium is held, where student researchers present status reports on their research, followed by a three-day field trip covering southern California geology and SCEC research locations. Finally, at the end of the summer each student presents a poster at the SCEC Annual Meeting.

  12. Monitoring seismic velocity changes associated with the 2014 Mw 6.0 South Napa earthquake

    NASA Astrophysics Data System (ADS)

    Taira, T.; Brenguier, F.; Kong, Q.

    2014-12-01

    We analyze the ambient seismic noise wavefield to explore temporal variations in seismic velocity associated with the 24 August 2014 Mw 6.0 South Napa earthquake. We estimate relative velocity changes (dv/v) with MSNoise [Lecocq et al., 2014, SRL] by analyzing continuous waveforms collected at 10 seismic stations located near the epicenter of the 2014 South Napa earthquake. Following Brenguier et al. [2008, Science], our preliminary analysis focuses on the vertical-component waveforms in a frequency range of 0.1-0.9 Hz. We determine the reference Green's function (GF) for each station pair as the average of 1-day stacks of GFs obtained in the time interval January through July 2014. We estimate the time history of dv/v by measuring delay times between 10-day stacks of GF and the reference GF. We find about a 0.07% velocity reduction immediately after the 2014 South Napa earthquake by measuring the delay times between stacked and reference GFs. Our preliminary result also reveals a post-seismic relaxation process: the velocity reduction decays to 0.04% about 20 days after the 2014 South Napa earthquake. We have implemented an automated system to monitor the time history of dv/v (http://earthquakes.berkeley.edu/~taira/SNapa/SNapa_Noise.html) by using waveforms archived at the Northern California Earthquake Data Center. We will characterize the detailed temporal evolution of velocity change associated with the 2014 South Napa earthquake.
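
    The delay-time measurement behind a dv/v estimate can be illustrated with the stretching method on a synthetic coda. MSNoise itself measures delays with a moving-window cross-spectral technique; the stretching variant below is only a compact stand-in, and the waveform parameters are invented:

```python
import numpy as np

def stretch_dvv(ref, cur, t, eps_grid):
    """Grid search for dv/v: a homogeneous change eps makes cur(t) ≈ ref(t * (1 + eps))."""
    best_eps, best_cc = 0.0, -np.inf
    for eps in eps_grid:
        warped = np.interp(t * (1.0 + eps), t, ref)   # ref evaluated at stretched times
        cc = np.corrcoef(warped, cur)[0, 1]
        if cc > best_cc:
            best_eps, best_cc = eps, cc
    return best_eps, best_cc

def coda(t):
    """Toy noise-correlation coda: a decaying 0.5 Hz oscillation."""
    return np.sin(2 * np.pi * 0.5 * t) * np.exp(-t / 40.0)

t = np.arange(0.0, 100.0, 0.05)
ref = coda(t)
eps_true = -0.0007                       # a 0.07 % velocity drop, as in the abstract
cur = coda(t * (1.0 + eps_true))         # slowed medium: later arrivals in the coda
eps_hat, cc = stretch_dvv(ref, cur, t, np.linspace(-0.002, 0.002, 81))
```

    Because a homogeneous velocity change delays late coda arrivals more than early ones, the grid search over stretching factors recovers dv/v directly from the waveform pair.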

  13. Towards an Earthquake Monitoring System for Indian Ocean Tsunami Early Warning

    NASA Astrophysics Data System (ADS)

    Kraft, T.; Hanka, W.; Saul, J.; Heinloo, A.; Reinhardt, J.; Weber, B.; Becker, J.; Thoms, H.; Pahlke, D.

    2006-12-01

    The Mw=9.3 Sumatra earthquake of December 26, 2004, generated a tsunami that affected the entire Indian Ocean region and caused approximately 230,000 fatalities. The German humanitarian aid program for the Indian Ocean region started immediately after the disaster with substantial funding of 45M Euro for the proposed German Indian Ocean Tsunami Early Warning System (GITEWS). In this presentation we describe the concept of the Earthquake Monitoring System and report on its present status. The major challenge for an Earthquake Monitoring System (EMS) is to deliver information about the location, size, source parameters, and possibly the rupture process as early as possible, before the potential tsunami hits the neighboring coastal areas. Tsunamigenic earthquakes are expected to occur in subduction zones close to coastlines. This is particularly true for the Sunda trench offshore Indonesia, but also for the Makran subduction zone offshore Iran. Key to an Indian Ocean monitoring system with short warning times is therefore a dense real-time seismic network in Indonesia, supplemented by a substantial number of stations in other countries and territories within and around the Indian Ocean. 40 new broadband and strong-motion stations will be installed during the GITEWS project by 2010. The EMS Control Center will be based on an enhanced version of the widely used SeisComP software and the GEOFON earthquake information system prototype presently operated at the GFZ-Potsdam (http://geofon.gfz-potsdam.de/db/eqinfo.php). However, the Control Center software currently under development will be more reliable, faster, and automatic, but with operator supervision. It will use sophisticated visualisation tools and offer the possibility of manual correction and re-calculation, flexible configuration, and support for distributed processing.
    Its large redundancy in algorithms, modules, and hardware assures easy integration into larger multi-sensor, multi-hazard control centers and decision support systems. A first prototype of the EMS Control Center software will be ready in mid-2007.

  14. Real-time earthquake monitoring: Early warning and rapid response

    NASA Technical Reports Server (NTRS)

    1991-01-01

    A panel was established to investigate the subject of real-time earthquake monitoring (RTEM) and suggest recommendations on the feasibility of using a real-time earthquake warning system to mitigate earthquake damage in regions of the United States. The findings of the investigation and the related recommendations are described in this report. A brief review of existing real-time seismic systems is presented with particular emphasis given to the current California seismic networks. Specific applications of a real-time monitoring system are discussed along with issues related to system deployment and technical feasibility. In addition, several non-technical considerations are addressed including cost-benefit analysis, public perceptions, safety, and liability.

  15. Web Services and Other Enhancements at the Northern California Earthquake Data Center

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Zuzlewski, S.; Allen, R. M.

    2012-12-01

    The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, or MiniSEED depending on the service, and are compatible with the equivalent IRIS DMC web services. The NCEDC is currently providing the following Web Services: (1) Station inventory and channel response information delivered in StationXML format, (2) Channel response information delivered in RESP format, (3) Time series availability delivered in text and XML formats, (4) Single channel and bulk data request delivered in MiniSEED format. The NCEDC is also developing a rich Earthquake Catalog Web Service to allow users to query earthquake catalogs based on selection parameters such as time, location or geographic region, magnitude, depth, azimuthal gap, and rms. It will return (in QuakeML format) user-specified results that can include simple earthquake parameters, as well as observations such as phase arrivals, codas, amplitudes, and computed parameters such as first motion mechanisms, moment tensors, and rupture length. The NCEDC will work with both IRIS and the International Federation of Digital Seismograph Networks (FDSN) to define a uniform set of web service specifications that can be implemented by multiple data centers to provide users with a common data interface across data centers. The NCEDC now hosts earthquake catalogs and waveforms from the US Department of Energy (DOE) Enhanced Geothermal Systems (EGS) monitoring networks. 
These data can be accessed through the above web services and through special NCEDC web pages.
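
    A client-side query against such a REST service can be as simple as building a parameterized URL. The host name and parameter set below follow the FDSN web-service conventions the abstract mentions, but they are illustrative assumptions rather than the NCEDC's documented interface:

```python
from urllib.parse import urlencode, urlparse, parse_qs

# Hypothetical FDSN-style station service endpoint (illustrative host and path).
BASE = "https://service.ncedc.org/fdsnws/station/1/query"

def station_query_url(network, station, level="response"):
    """Build a station-metadata query URL; StationXML is the assumed return format."""
    params = {"net": network, "sta": station, "level": level}
    return BASE + "?" + urlencode(params)

url = station_query_url("BK", "CMB")
# A browser or urllib.request.urlopen(url) would retrieve the StationXML document.
```

    The appeal of this interface style, as the abstract notes, is that a browser or a few lines of script replace batch or email-based requests, and the same query shape works across any data center implementing the common specification.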

  16. USGS Hires Students to Help Improve Earthquake Monitoring

    USGS Multimedia Gallery

    A USGS student employee, a sophomore at the Colorado School of Mines, was among the first hired by the USGS using Recovery Act funding to upgrade the seismic stations of the Advanced National Seismic System (ANSS) Backbone. The USGS is using Recovery Act funding to upgrade its earthquake monitoring net...

  17. Lessons Learned from Creating the Public Earthquake Resource Center at CERI

    NASA Astrophysics Data System (ADS)

    Patterson, G. L.; Michelle, D.; Johnston, A.

    2004-12-01

    The Center for Earthquake Research and Information (CERI) at the University of Memphis opened the Public Earthquake Resource Center (PERC) in May 2004. The PERC is an interactive display area that was designed to increase awareness of seismology, Earth Science, earthquake hazards, and earthquake engineering among the general public and K-12 teachers and students. Funding for the PERC is provided by the US Geological Survey, The NSF-funded Mid America Earthquake Center, and the University of Memphis, with input from the Incorporated Research Institutions for Seismology. Additional space at the facility houses local offices of the US Geological Survey. PERC exhibits are housed in a remodeled residential structure at CERI that was donated by the University of Memphis and the State of Tennessee. Exhibits were designed and built by CERI and US Geological Survey staff and faculty with the help of experienced museum display subcontractors. The 600 square foot display area interactively introduces the basic concepts of seismology, real-time seismic information, seismic network operations, paleoseismology, building response, and historical earthquakes. Display components include three 22" flat screen monitors, a touch sensitive monitor, 3 helicorder elements, oscilloscope, AS-1 seismometer, life-sized liquefaction trench, liquefaction shake table, and building response shake table. All displays include custom graphics, text, and handouts. The PERC website at www.ceri.memphis.edu/perc also provides useful information such as tour scheduling, ask a geologist, links to other institutions, and will soon include a virtual tour of the facility. Special consideration was given to address State science standards for teaching and learning in the design of the displays and handouts. 
    We feel this consideration is pivotal to the success of any grassroots Earth Science education and outreach program and represents a valuable lesson learned at CERI over the last several years. Another critical lesson is to employ K-12 education professionals and to utilize undergraduate and graduate student workers from the University's Department of Education. Such staff members are keenly aware of the pressures and needs in diverse communities such as Shelby County, Tennessee, and are uniquely suited to design and implement new and innovative programs that provide substantive short-term benefits to users and promote long-term relationships with K-12 teachers, students, and teachers' organizations.

  18. Towards a Better Earthquake and Tsunami Monitoring System: Indian Effort

    NASA Astrophysics Data System (ADS)

    Bansal, B.; Gupta, G.

    2005-12-01

    The December 26, 2004 earthquake (Mw 9.3) in the Andaman-Sumatra subduction zone was unprecedented in its size, rupture extent, and tsunamigenic capacity. The lack of a known predecessor to this event was part of the reason for the apparent lack of anticipation and preparedness. Clearly, this event has changed the perception of earthquake/tsunami hazard along the Andaman and Nicobar Islands as well as regions along the southwest coast of India, far removed from the source earthquake. The Government of India is embarking on a major programme to study earthquake processes in the Andaman and Nicobar region, part of a subduction zone stretching about 1000 km, most of which was affected by the earthquake. These efforts include expansion and modernization of the existing seismic network, continuous and campaign-mode GPS surveys, geological and geophysical investigations, and inundation mapping. Research programmes funded by the DST aim at improved understanding of the seismic sources, their past behaviour, rupture characteristics, the physical processes related to earthquakes in this subduction zone, and the style of deformation as measured by geodetic techniques. A network of more than 100 seismological stations currently operates in India, most of them run by the India Meteorological Department, the nodal agency for seismological studies. Linking and modernization of the network and the addition of more seismic observatories are underway. The station at Port Blair has been upgraded to broadband, and a good network of portable stations is now operational. Added to these are the GPS campaign-mode surveys being carried out along the entire arc. The establishment of a multiparametric geophysical observatory to monitor physical processes prior to large earthquakes is another planned experiment. The proposed structure of the Tsunami Warning System also involves establishing more tide gauges and pressure sensors at strategic locations.
It is expected that the data generated through various research initiatives will provide the necessary scientific basis for the proposed warning system.

  19. Artificial neural network model for earthquake prediction with radon monitoring.

    PubMed

    Külahci, Fatih; Inceöz, Murat; Doğru, Mahmut; Aksoy, Ercan; Baykara, Oktay

    2009-01-01

    Apart from linear monitoring studies of the relationship between radon and earthquakes, an artificial neural network (ANN) model approach is presented, starting from the non-linear changes of eight different parameters during earthquake occurrence. A three-layer feedforward network trained with the Levenberg-Marquardt learning algorithm is used to model the earthquake prediction process in the East Anatolian Fault System (EAFS). The proposed ANN system employs an individual training strategy with fixed-weight and supervised models leading to the estimations. The average relative error between the earthquake magnitudes estimated by the ANN and the measured data is about 2.3%, and the relative error between the test and earthquake data varies between 0% and 12%. In addition, factor analysis was applied to all data and to the model output values to examine the statistical variation; this analysis explained 80.18% of the total variance with four factors. Consequently, it can be concluded that the ANN approach is a potential alternative to other models involving complex mathematical operations. PMID:18789709
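
    The flavor of such a feedforward magnitude model can be sketched with a one-hidden-layer network on synthetic data. The abstract's network is trained with Levenberg-Marquardt; plain gradient descent is substituted here to keep the sketch short, and the eight "precursor" features and magnitude-like targets are randomly generated, not radon measurements:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in data: eight features mapped to a magnitude-like target.
X = rng.uniform(-1.0, 1.0, (200, 8))
y = 1.5 * np.tanh(X @ rng.normal(size=8)) + 4.0

# One hidden tanh layer with a linear output unit.
W1 = rng.normal(0.0, 0.5, (8, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, 16);      b2 = 0.0

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h @ W2 + b2, h

lr = 0.05
for _ in range(3000):
    pred, h = forward(X)
    err = pred - y                           # d(MSE/2)/d(pred)
    gW2 = h.T @ err / len(X); gb2 = err.mean()
    dh = np.outer(err, W2) * (1.0 - h**2)    # backprop through tanh
    gW1 = X.T @ dh / len(X);  gb1 = dh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

pred, _ = forward(X)
rel_err = np.mean(np.abs(pred - y) / y)      # average relative error, as reported
```

    Levenberg-Marquardt would replace the gradient step with a damped Gauss-Newton update of the same weights, typically converging in far fewer iterations on small networks like this.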

  20. Enhanced Earthquake Monitoring in the European Arctic

    NASA Astrophysics Data System (ADS)

    Antonovskaya, Galina; Konechnaya, Yana; Kremenetskaya, Elena O.; Asming, Vladimir; Kværna, Tormod; Schweitzer, Johannes; Ringdal, Frode

    2015-03-01

    This paper presents preliminary results from a cooperative initiative between the Norwegian Seismic Array (NORSAR) institution in Norway and seismological institutions in NW Russia (Arkhangelsk and Apatity). We show that the joint processing of data from the combined seismic networks of all these institutions leads to a considerable increase in the number of located seismic events in the European Arctic compared to standard seismic bulletins such as the NORSAR reviewed regional seismic bulletin and the Reviewed Event Bulletin (REB) issued by the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) organization. The increase is particularly pronounced along the Gakkel Ridge to the north of the Svalbard and Franz-Josef Land archipelagos. We also note that the vast majority of the events along the Gakkel Ridge have been located slightly to the south of the ridge. We interpret this as an effect of the lack of recording stations closer to and north of the Gakkel Ridge, and of the use of a one-dimensional velocity model which is not fully representative of travel times along the observed propagation paths. We conclude that while the characteristics of earthquake activity in the European Arctic are currently poorly known, this knowledge can be expected to improve significantly with the establishment of appropriate cooperative seismic recording infrastructure.

  1. Recent Progress and Development on Multi-parameters Remote Sensing Application in Earthquake Monitoring in China

    NASA Astrophysics Data System (ADS)

    Shen, Xuhui; Zhang, Xuemin; Hong, Shunying; Jing, Feng; Zhao, Shufan

    2014-05-01

    Over the last ten years, several national research plans and scientific projects on remote sensing applications in earthquake monitoring have been implemented in China. Focusing on advancing earthquake monitoring capability and searching for a path toward earthquake prediction, satellite electromagnetic, satellite infrared, and D-InSAR technologies were developed systematically. Remarkable progress was achieved through statistical research on historical earthquakes, and the spatial precursory characteristics were initially summarized, laying the foundation for gradually promoting practical use. On this basis, the argumentation for China's first space-based platform in the earthquake stereoscopic observation system has been completed, and an integrated earthquake remote sensing application system has been designed. Developing a space-based earthquake observation system has become a major trend of technological development in earthquake monitoring and prediction. More emphasis will be placed on constructing the space segment of the China earthquake stereoscopic observation system and on imminent major scientific projects: an earthquake deformation observation system and application research combining InSAR, satellite gravity, and GNSS for medium- and long-term earthquake monitoring and forecasting; an infrared observation and technical system and application research for medium- and short-term earthquake monitoring and forecasting; and a satellite-based electromagnetic observation and technical system and application system for short-term and imminent earthquake monitoring.

  2. Helping safeguard Veterans Affairs' hospital buildings by advanced earthquake monitoring

    USGS Publications Warehouse

    Kalkan, Erol; Banga, Krishna; Ulusoy, Hasan S.; Fletcher, Jon Peter B.; Leith, William S.; Blair, James L.

    2012-01-01

    In collaboration with the U.S. Department of Veterans Affairs (VA), the National Strong Motion Project of the U.S. Geological Survey has recently installed sophisticated seismic systems that will monitor the structural integrity of hospital buildings during earthquake shaking. The new systems have been installed at more than 20 VA medical campuses across the country. These monitoring systems, which combine sensitive accelerometers and real-time computer calculations, are capable of determining the structural health of each structure rapidly after an event, helping to ensure the safety of patients and staff.

  3. Earthquake Monitoring at Different Scales with Seiscomp3

    NASA Astrophysics Data System (ADS)

    Grunberg, M.; Engels, F.

    2013-12-01

    In the last few years, the French National Network of Seismic Survey (BCSF-RENASS) had to modernize its old and aging earthquake monitoring system, which came from an in-house development. After conducting intensive tests on several real-time frameworks such as EarthWorm and SeisComP3, we finally adopted SeisComP3 in 2012. Our current system runs two pipelines in parallel: the first is tuned at a global scale to monitor world seismicity (for events with magnitude > 5.5), and the second is tuned at a national scale for monitoring metropolitan France. The seismological stations used for the "world" pipeline come mainly from the Global Seismographic Network (GSN), whereas the stations for the "national" pipeline come from the RENASS short-period network and the RESIF broadband network. More recently, we have started to tune SeisComP3 at a smaller scale to monitor in real time a geothermal project (an R&D program in deep geothermal energy) in northeastern France. Besides using the real-time monitoring capabilities of SeisComP3, we have also used a very handy playback feature on a four-month dataset at a local scale for the Rambervillers earthquake (22/02/2003, Ml=5.4), yielding roughly 2000 aftershock detections and locations.

  4. Integrating geomatics and structural investigation in post-earthquake monitoring of ancient monumental Buildings

    NASA Astrophysics Data System (ADS)

    Dominici, Donatella; Galeota, Dante; Gregori, Amedeo; Rosciano, Elisa; Alicandro, Maria; Elaiopoulos, Michail

    2014-06-01

    The old city center of L’Aquila is rich in historical buildings of considerable merit. On April 6th 2009 a devastating earthquake caused significant structural damage, affecting especially historical and monumental masonry buildings. The results of a study carried out on a monumental building, former headquarters of the University of L’Aquila (the Camponeschi building, XVI century), are presented in this paper. The building is situated in the heart of the old city center and was seriously damaged by the earthquake. Preliminary visual damage analysis carried out immediately after the quake revealed the building’s complexity, raising the need for direct and indirect investigation of the structure. Several non-destructive test methods were then performed in situ to better characterize the masonry typology and the damage distribution. Subsequently, a number of representative control points were identified on the building’s facades to represent, by their motion over time, the evolution of the structural displacements and deformations. In particular, a surveying network consisting of 27 different points was established. A robotic total station mounted on top of a concrete pillar was used for periodically monitoring the surveying control network. Stability of the pillar was checked through a GNSS static survey repeated before each set of measurements. The present study demonstrates the potential of combining geomatics with structural investigation during post-earthquake monitoring of ancient monumental buildings.
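The displacement tracking described above amounts to comparing control-point coordinates between measurement epochs. A minimal sketch, with hypothetical coordinates invented for illustration (the actual network geometry and instrument outputs differ):

```python
import math

# Hypothetical local coordinates (metres) of one facade control point,
# as measured by the robotic total station in two epochs (illustrative).
epoch_1 = (102.4310, 57.2125, 18.6032)  # (E, N, height)
epoch_2 = (102.4330, 57.2110, 18.6017)

# 3-D displacement vector between the epochs and its magnitude in mm.
d = [b - a for a, b in zip(epoch_1, epoch_2)]
displacement_mm = 1000.0 * math.sqrt(sum(c * c for c in d))
print(round(displacement_mm, 1))  # 2.9 mm for these illustrative values
```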

  5. Building Capacity for Earthquake Monitoring: Linking Regional Networks with the Global Community

    NASA Astrophysics Data System (ADS)

    Willemann, R. J.; Lerner-Lam, A.

    2006-12-01

    Installing or upgrading a seismic monitoring network is often among the mitigation efforts after earthquake disasters, and this is happening in response to the events in Sumatra during December 2004 and in Pakistan during October 2005. These networks can yield improved hazard assessment, more resilient buildings where they are most needed, and emergency relief directed more quickly to the worst hit areas after the next large earthquake. Several commercial organizations are well prepared for the fleeting opportunity to provide the instruments that comprise a seismic network, including sensors, data loggers, telemetry stations, and the computers and software required for the network center. But seismic monitoring requires more than hardware and software, no matter how advanced. A well-trained staff is required to select appropriate and mutually compatible components, install and maintain telemetered stations, manage and archive data, and perform the analyses that actually yield the intended benefits. Monitoring is more effective when network operators cooperate with a larger community through free and open exchange of data, sharing of information about working practices, and international collaboration in research. As an academic consortium, a facility operator, and a founding member of the International Federation of Digital Seismographic Networks, IRIS has access to a broad range of expertise with the skills required to help design, install, and operate a seismic network and earthquake analysis center, and to provide the core training for the professional teams required to establish and maintain these facilities.
But delivering expertise quickly when and where it is unexpectedly in demand requires advance planning and coordination in order to respond to the needs of organizations that are building a seismic network, either with tight time constraints imposed by the budget cycles of aid agencies following a disastrous earthquake, or as part of more informed national programs for hazard assessment and mitigation.

  6. Quantifying 10 years of improvements in earthquake monitoring in the Caribbean region

    USGS Publications Warehouse

    McNamara, Daniel E.; Hillebrandt-Andrade, Christa; Saurel, Jean-Marie; Huerfano-Moreno, V.; Lynch, Lloyd

    2015-01-01

    Over 75 tsunamis have been documented in the Caribbean and adjacent regions during the past 500 years. Since 1500, at least 4484 people are reported to have perished in these killer waves. Hundreds of thousands are currently threatened along the Caribbean coastlines. Were a great tsunamigenic earthquake to occur in the Caribbean region today, the effects would potentially be catastrophic due to an increasingly vulnerable region that has seen significant population increases in the past 40–50 years and currently hosts an estimated 500,000 daily beach visitors from North America and Europe, a majority of whom are not likely aware of tsunami and earthquake hazards. Following the magnitude 9.1 Sumatra–Andaman Islands earthquake of 26 December 2004, the United Nations Educational, Scientific and Cultural Organization (UNESCO) Intergovernmental Coordination Group (ICG) for the Tsunami and other Coastal Hazards Early Warning System for the Caribbean and Adjacent Regions (CARIBE-EWS) was established and developed minimum performance standards for the detection and analysis of earthquakes. In this study, we model earthquake-magnitude detection threshold and P-wave detection time and demonstrate that the requirements established by the UNESCO ICG CARIBE-EWS are met with 100% of the network operating. We demonstrate that earthquake-monitoring performance in the Caribbean Sea region has improved significantly in the past decade as the number of real-time seismic stations available to the National Oceanic and Atmospheric Administration tsunami warning centers has increased. We also identify weaknesses in the current international network and provide guidance for selecting the optimal distribution of seismic stations contributed from existing real-time broadband national networks in the region.

  7. Array monitoring of swarm earthquakes in the Pollino range (Italy)

    NASA Astrophysics Data System (ADS)

    Roessler, Dirk; Passarelli, Luigi; Govoni, Aladino; Rivalta, Eleonora

    2014-05-01

    The Mercure Basin (MB) and the Castrovillari Fault (CF) in the Pollino range (southern Apennines, Italy) represent one of the most prominent seismic gaps in the Italian seismic catalog, with no M>6 earthquakes during the last centuries. In recent times, the MB has repeatedly been affected by seismic swarms, the most energetic of which started in 2010 and was still active in 2013. The seismic activity culminated in autumn 2012 with a M=5 event on October 25. In contrast, the CF appears aseismic; only its northern part has experienced microseismicity. The rheology of these faults is unclear. Current debates include the potential of the MB and the CF to host large earthquakes and the level and style of deformation. Understanding the seismicity and the behaviour of the faults is therefore necessary to assess the seismic hazard. We have been monitoring the ongoing seismicity using a small-aperture seismic array integrated in a temporary seismic network. The instruments are provided by the GFZ German Research Centre for Geosciences and INGV, Italy, and are operated in close collaboration between both institutes. Automated seismic array methods are applied to resolve the spatio-temporal evolution of the seismicity in great detail. Using the GFZ array, we detect about ten times more earthquakes than are currently included in automatic local catalogues, corresponding to an improvement in complete event detection down to M~0.5. Event locations and the magnitude-frequency distribution are analysed to characterise the swarm and investigate the possible role of fluids in earthquake triggering. In the course of the swarm, seismicity has mainly migrated within the Mercure Basin. However, the spread towards the northern end of the Castrovillari fault to the east in 2013 marks a swarm phase with seismicity located outside the Mercure Basin. These observations characterize the behaviour of the faults and their interconnection.
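For the magnitude-frequency analysis mentioned above, a standard tool (not necessarily the authors' exact method) is the Aki maximum-likelihood b-value estimator, b = log10(e) / (mean(M) - Mc). A sketch on synthetic magnitudes above the M~0.5 completeness level quoted in the abstract:

```python
import math
import random

# Illustrative catalogue: magnitudes drawn from a Gutenberg-Richter
# distribution with b = 1.0 above a completeness magnitude Mc = 0.5
# (matching the detection level quoted above). Synthetic data only.
random.seed(0)
Mc = 0.5
b_true = 1.0
mags = [Mc + random.expovariate(b_true * math.log(10)) for _ in range(5000)]

# Aki (1965) maximum-likelihood estimate of the b-value.
b_est = math.log10(math.e) / (sum(mags) / len(mags) - Mc)
print(round(b_est, 2))  # close to the true b = 1.0
```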

  8. Monitoring fault zone environments with correlations of earthquake waveforms

    NASA Astrophysics Data System (ADS)

    Roux, Philippe; Ben-Zion, Yehuda

    2014-02-01

    We develop a new technique for monitoring temporal changes in fault zone environments based on cross-correlation of earthquake waveforms recorded by pairs of stations. The method is applied to waveforms of ~10,000 earthquakes observed during 100 days around the 1999 M 7.1 Duzce mainshock by a station located in the core damage zone of the North Anatolian Fault and a nearby station. To overcome clock problems, the correlation functions are realigned on a dominant peak. Consequently, the analysis focuses on measurements of coherency rather than traveltimes, and is based on correlation coefficients of groups of events with a reference wavelet. Examination of coherency in different frequency bands reveals clear changes in a narrow band centred around 0.8 Hz. The results show a rapid drop of ~1–2 per cent in coherency at the time of the Duzce event, followed by gradual recovery with several prominent oscillations over 4 days. The observed changes likely reflect evolution of permeability and fluid motion in the core damage zone of the North Anatolian Fault. Compared to noise correlation processing, our analysis of earthquake waveform correlation (i) benefits from a high level of coherence in short-duration recorded signals, (ii) has considerably finer temporal sampling of fault dynamics after mainshocks than is possible with noise correlation, and (iii) uses the coherence level to track property variations, which may be more robust than traveltime fluctuations in the coda of noise correlations. Studies utilizing both earthquake and noise waveforms at multiple pairs of stations across fault damage zones can significantly improve the understanding of fault zone processes.
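The coherency measurement described above can be sketched as a normalized correlation coefficient between a reference wavelet and event waveforms. The traces below are synthetic stand-ins (a 0.8 Hz sinusoid plus a small perturbation), not Duzce data:

```python
import math

def norm_corr(x, y):
    """Zero-lag normalized correlation coefficient of two equal-length traces."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den

# Synthetic reference wavelet (0.8 Hz at 100 samples/s) and a slightly
# perturbed "event" trace -- illustrative only.
n = 200
ref = [math.sin(2 * math.pi * 0.8 * i / 100.0) for i in range(n)]
event = [r + 0.05 * math.sin(2 * math.pi * 7.0 * i / 100.0)
         for i, r in enumerate(ref)]

print(round(norm_corr(ref, ref), 6))    # 1.0: identical traces
print(round(norm_corr(ref, event), 3))  # slightly below 1: reduced coherency
```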

  9. Earthquakes

    MedlinePLUS

    Earthquakes are sudden rolling or shaking events caused ... at any time of the year. Before An Earthquake Look around places where you spend time. Identify ...

  10. Earthquakes

    MedlinePLUS

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  11. Archiving and Distributing Seismic Data at the Southern California Earthquake Data Center (SCEDC)

    NASA Astrophysics Data System (ADS)

    Appel, V. L.

    2002-12-01

    The Southern California Earthquake Data Center (SCEDC) archives and provides public access to earthquake parametric and waveform data gathered by the Southern California Seismic Network and since January 1, 2001, the TriNet seismic network, southern California's earthquake monitoring network. The parametric data in the archive includes earthquake locations, magnitudes, moment-tensor solutions and phase picks. The SCEDC waveform archive prior to TriNet consists primarily of short-period, 100-samples-per-second waveforms from the SCSN. The addition of the TriNet array added continuous recordings of 155 broadband stations (20 samples per second or less), and triggered seismograms from 200 accelerometers and 200 short-period instruments. Since the Data Center and TriNet use the same Oracle database system, new earthquake data are available to the seismological community in near real-time. Primary access to the database and waveforms is through the Seismogram Transfer Program (STP) interface. The interface enables users to search the database for earthquake information, phase picks, and continuous and triggered waveform data. Output is available in SAC, miniSEED, and other formats. Both the raw counts format (V0) and the gain-corrected format (V1) of COSMOS (Consortium of Organizations for Strong-Motion Observation Systems) are now supported by STP. EQQuest is an interface to prepackaged waveform data sets for select earthquakes in Southern California stored at the SCEDC. Waveform data for large-magnitude events have been prepared and new data sets will be available for download in near real-time following major events. The parametric data from 1981 to present has been loaded into the Oracle 9.2.0.1 database system and the waveforms for that time period have been converted to mSEED format and are accessible through the STP interface. 
The DISC optical-disk system (the "jukebox") that currently serves as the mass-storage for the SCEDC is in the process of being replaced with a series of inexpensive high-capacity (1.6 Tbyte) magnetic-disk RAIDs. These systems are built with PC-technology components, using 16 120-Gbyte IDE disks, hot-swappable disk trays, two RAID controllers, dual redundant power supplies and a Linux operating system. The system is configured over a private gigabit network that connects to the two Data Center servers and spans between the Seismological Lab and the USGS. To ensure data integrity, each RAID disk system constantly checks itself against its twin and verifies file integrity using 128-bit MD5 file checksums that are stored separate from the system. The final level of data protection is a Sony AIT-3 tape backup of the files. The primary advantage of the magnetic-disk approach is faster data access because magnetic disk drives have almost no latency. This means that the SCEDC can provide better "on-demand" interactive delivery of the seismograms in the archive.
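The MD5 file-integrity check described above can be sketched as follows; the file contents and paths are illustrative stand-ins for archived seismograms, and in practice the stored checksums live separate from the RAID system:

```python
import hashlib
import os
import tempfile

def md5sum(path, chunk_size=1 << 20):
    """Stream a file through MD5 so large waveform files never load whole."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Demonstrate with a temporary stand-in for an archived waveform file.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"example seismogram bytes")
    path = f.name

stored_checksum = md5sum(path)          # in practice stored away from the RAID
assert md5sum(path) == stored_checksum  # file is intact
os.remove(path)
print("integrity check passed")
```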

  12. Earthquakes.

    ERIC Educational Resources Information Center

    Pakiser, Louis C.

    One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake


  13. Earthquakes.

    ERIC Educational Resources Information Center

    Walter, Edward J.

    1977-01-01

    Presents an analysis of the causes of earthquakes. Topics discussed include (1) geological and seismological factors that determine the effect of a particular earthquake on a given structure; (2) description of some large earthquakes such as the San Francisco quake; and (3) prediction of earthquakes. (HM)

  14. Monitoring of influenza viruses in the aftermath of the Great East Japan earthquake.

    PubMed

    Tohma, Kentaro; Suzuki, Akira; Otani, Kanako; Okamoto, Michiko; Nukiwa, Nao; Kamigaki, Taro; Kawamura, Kazuhisa; Nakagawa, Hiroshi; Oshitani, Hitoshi

    2012-01-01

    Influenza has a significant impact on public health when a natural disaster occurs during the influenza season. However, the epidemiological characteristics of influenza following natural disasters have not been well documented due to the difficulty of implementing laboratory-based influenza surveillance in such situations. The Great East Japan Earthquake occurred on March 11, 2011, when influenza was already circulating. Since routine influenza surveillance was not performed in Miyagi Prefecture, we initiated an ad hoc laboratory-based monitoring system immediately after the earthquake. From March 15 to May 19, we tested 277 samples for influenza virus collected around Sendai City and from evacuation centers in Miyagi Prefecture. Influenza A (H3N2) was detected in 112 cases, influenza A (H1N1) 2009 in one case, and influenza B in 92 cases. The H3N2 virus was dominant until the 14th week. However, a sudden increase in the number of influenza B cases occurred after schools were reopened. According to phylogenetic analysis, a major clade switch of the H3N2 virus took place after the earthquake. The Yamagata lineage of influenza B was detected in one patient from western Japan, indicating the importing of viruses into the affected area. PMID:23183209

  15. Monitoring road losses for Lushan 7.0 earthquake disaster utilization multisource remote sensing images

    NASA Astrophysics Data System (ADS)

    Huang, He; Yang, Siquan; Li, Suju; He, Haixia; Liu, Ming; Xu, Feng; Lin, Yueguan

    2015-12-01

    Earthquakes are among the major natural disasters in the world. At 8:02 on 20 April 2013, a catastrophic earthquake with surface-wave magnitude Ms 7.0 occurred in Sichuan province, China. The epicenter was located in the administrative region of Lushan County, and the event was named the Lushan earthquake. It caused heavy casualties and property losses in Sichuan province. After the earthquake, various emergency relief supplies had to be transported to the affected areas, and the transportation network is the basis for their transport and allocation. Thus, road losses from the Lushan earthquake had to be monitored. Road-loss monitoring results for the Lushan earthquake disaster, obtained using multisource remote sensing images, are reported in this paper. The results indicate that 166 meters of national roads, 3707 meters of provincial roads, 3396 meters of county roads, 7254 meters of township roads, and 3943 meters of village roads were damaged during the Lushan earthquake disaster. The damaged roads were mainly located in Lushan County, Baoxing County, Tianquan County, Yucheng County, Mingshan County, and Qionglai County. The results can also serve as a decision-making information source for disaster management authorities in China.
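A quick tally of the damaged-road figures quoted above, by road class (simple arithmetic on the reported numbers):

```python
# Damaged road lengths (metres) by class, as reported in the abstract.
damaged_m = {
    "national": 166,
    "provincial": 3707,
    "county": 3396,
    "township": 7254,
    "village": 3943,
}

total_km = sum(damaged_m.values()) / 1000.0
print(total_km)  # 18.466 km of damaged roads in total
```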

  16. Southern California Earthquake Center (SCEC) Communication, Education and Outreach Program

    NASA Astrophysics Data System (ADS)

    Benthien, M. L.

    2003-12-01

    The SCEC Communication, Education, and Outreach Program (CEO) offers student research experiences, web-based education tools, classroom curricula, museum displays, public information brochures, online newsletters, and technical workshops and publications. This year, much progress has been made on the development of the Electronic Encyclopedia of Earthquakes (E3), a collaborative project with CUREE and IRIS. The E3 development system is now fully operational, and 165 entries are in the pipeline. When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth science, engineering, physics and mathematics. To coordinate activities for the 10-year anniversary of the Northridge Earthquake in 2004 (and beyond), the "Earthquake Country Alliance" is being organized by SCEC CEO to present common messages, to share or promote existing resources, and to develop new activities and products jointly (such as a new version of Putting Down Roots in Earthquake Country). The group includes earthquake science and engineering researchers and practicing professionals, preparedness experts, response and recovery officials, news media representatives, and education specialists. A web portal, http://www.earthquakecountry.info, has been established with links to web pages and descriptions of other resources and services that the Alliance members provide. Another ongoing strength of SCEC is the Summer Intern program, which now has a year-round counterpart with students working on IT projects at USC. Since Fall 2002, over 32 students have participated in the program, including 7 students working with scientists throughout SCEC, 17 students involved in the USC "Earthquake Information Technology" intern program, and 7 students involved in CEO projects. 
These and other activities of the SCEC CEO program will be presented, along with lessons learned during program design and implementation.

  17. Earthquakes

    ERIC Educational Resources Information Center

    Roper, Paul J.; Roper, Jere Gerard

    1974-01-01

    Describes the causes and effects of earthquakes, defines the meaning of magnitude (measured on the Richter Magnitude Scale) and intensity (measured on a modified Mercalli Intensity Scale) and discusses earthquake prediction and control. (JR)

  18. Earthquakes

    MedlinePLUS

    Earthquakes ... even building collapse) if you immediately: Before an Earthquake Being Prepared Emergency Supplies Home Hazards During an ...

  19. Remote monitoring of the earthquake cycle using satellite radar interferometry

    NASA Astrophysics Data System (ADS)

    Wright, Tim J.

    2002-12-01

    The earthquake cycle is poorly understood. Earthquakes continue to occur on previously unrecognized faults. Earthquake prediction seems impossible. These remain the facts despite nearly 100 years of intensive study since the earthquake cycle was first conceptualized. Using data acquired from satellites in orbit 800 km above the Earth, a new technique, radar interferometry (InSAR), has the potential to solve these problems. For the first time, detailed maps of the warping of the Earth's surface during the earthquake cycle can be obtained with a spatial resolution of a few tens of metres and a precision of a few millimetres. InSAR does not need equipment on the ground or expensive field campaigns, so it can gather crucial data on earthquakes and the seismic cycle from some of the remotest areas of the planet. In this article, I review some of the remarkable observations of the earthquake cycle already made using radar interferometry and speculate on breakthroughs that are tantalizingly close.
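The millimetre-level precision quoted above follows from the interferometric phase-to-displacement relation: one fringe (2π of unwrapped phase) corresponds to half a radar wavelength of line-of-sight motion. A sketch for a C-band radar such as those on the ERS satellites (wavelength ~5.6 cm); sign conventions vary between processors, so the sign here is illustrative:

```python
import math

wavelength_m = 0.056  # C-band radar wavelength, ~5.6 cm (ERS-class satellites)

def los_displacement_mm(phase_rad):
    """Line-of-sight displacement implied by an unwrapped phase change."""
    return 1000.0 * wavelength_m * phase_rad / (4.0 * math.pi)

# One full fringe of phase corresponds to half a wavelength of motion.
print(round(los_displacement_mm(2 * math.pi), 1))  # 28.0 mm per fringe
```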

  20. Real-time earthquake monitoring using a search engine method

    NASA Astrophysics Data System (ADS)

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-12-01

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake’s parameters in less than 1 s after receiving the long-period surface wave data.
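The search-engine idea can be illustrated with a toy database lookup: given input data, find the stored synthetic seismogram that best fits it. The actual engine replaces this exact linear scan with a fast search several thousand times quicker; all names, waveforms, and source labels below are invented for illustration:

```python
import math

def misfit(a, b):
    """L2 misfit between two equal-length waveforms."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Toy database: tiny "waveforms" pre-computed for candidate source types
# (hypothetical labels and values, not real synthetics).
database = {
    ("shallow", "strike-slip"): [0.0, 0.9, 0.3, -0.7, 0.1],
    ("shallow", "thrust"):      [0.0, 0.4, 1.0, -0.2, -0.5],
    ("deep", "normal"):         [0.1, -0.6, 0.8, 0.5, -0.3],
}

observed = [0.0, 0.85, 0.35, -0.65, 0.05]  # incoming data, invented

# Exact linear scan for the best-fitting stored waveform.
best = min(database, key=lambda k: misfit(database[k], observed))
print(best)  # ('shallow', 'strike-slip')
```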

  1. Monitoring of ULF (ultra-low-frequency) Geomagnetic Variations Associated with Earthquakes

    PubMed Central

    Hayakawa, Masashi; Hattori, Katsumi; Ohta, Kenji

    2007-01-01

    ULF (ultra-low-frequency) electromagnetic emission has recently been recognized as one of the most promising candidates for short-term earthquake prediction. This paper reviews convincing previous evidence for the presence of ULF emissions before a few large earthquakes. We then present our network of ULF monitoring in the Tokyo area, describing our ULF magnetic sensors, and finally present some of the latest results on seismogenic electromagnetic emissions for recent large earthquakes obtained with sophisticated signal processing.
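A first step in the kind of ULF monitoring described above is estimating the amplitude of a narrow ULF band in a magnetometer record. A crude sketch using a direct DFT bin; operational processing uses calibrated sensors and proper spectral estimation, and the sampling rate, frequencies, and signal below are invented:

```python
import cmath
import math

fs = 1.0      # sample rate: 1 Hz (illustrative)
n = 600       # 10 minutes of synthetic data
f_ulf = 0.05  # hypothetical 0.05 Hz ULF signal

# Synthetic record: a 0.05 Hz "ULF emission" plus unrelated 0.25 Hz content.
signal = [0.3 * math.sin(2 * math.pi * f_ulf * t / fs)
          + 0.1 * math.sin(2 * math.pi * 0.25 * t / fs)
          for t in range(n)]

def dft_amplitude(x, freq, fs):
    """Amplitude of the sinusoidal component at `freq` via one direct DFT bin."""
    s = sum(v * cmath.exp(-2j * math.pi * freq * i / fs) for i, v in enumerate(x))
    return 2.0 * abs(s) / len(x)

print(round(dft_amplitude(signal, f_ulf, fs), 2))  # 0.3: recovers the ULF amplitude
print(round(dft_amplitude(signal, 0.25, fs), 2))   # 0.1: the unrelated component
```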

  3. The Pacific Tsunami Warning Center response to the Mw8.1 Samoan earthquake of September 29, 2009

    NASA Astrophysics Data System (ADS)

    Hirshorn, B. F.; Becker, N.; Weinstein, S.

    2009-12-01

    Over 90% of tsunami-related casualties occur within a few hundred km of the causative event, usually an earthquake. The Mw 8.1 (GCMT) Samoan earthquake and tsunami of September 29, 2009, represents a best-case scenario for response and self-evacuation by a population near the epicenter of a tsunamigenic earthquake. The Samoan population felt over 60 seconds of strong ground shaking and saw the first tsunami wave motion as a recession rather than an onshore wave. Their observations coupled with effective public awareness saved many lives. Such phenomena do not precede all dangerous tsunamis, however, and Samoans may not receive these natural warnings for future local tsunamis. For example, a “tsunami earthquake” (Kanamori, 1972) can generate a destructive tsunami with little or no strong ground motion (cf. Nicaragua 1992 and Java 2006). Furthermore, if the Samoan earthquake had ruptured as a thrust mechanism more typical for the nearby subduction zone, then the first observed tsunami wave would have likely caused inundation, and thus the ocean would not have warned the population. The Pacific Tsunami Warning Center (PTWC) mitigates such hazards by monitoring earthquakes in real time and using semi-automated analysis to rapidly characterize seismic sources for their tsunami-generating potential in order to warn coastlines of any tsunami threats. As part of its mission PTWC also uses a dense local seismic network in order to produce local warnings for the State of Hawaii within 3 minutes of earthquake origin time. In this presentation we detail the analysis and response performed by the PTWC for the Samoan event. We highlight how the current sparse deployment of seismometers in the southwest Pacific Ocean resulted in PTWC issuing a warning 16 minutes after the earthquake's origin time, as compared to what can be done using a denser seismic network. 
Therefore, we advocate for a denser network of seismometers in the region that will allow the PTWC to halve the time needed to issue tsunami warnings after future earthquakes in the region that may not be as well suited for local response and self-evacuation as this recent event. Currently, there are new and developing seismic networks in Tonga, Fiji and Samoa. These data will be needed to reduce the time lapse between the earthquake and the tsunami warning.

  4. Romanian Data Center: A modern way for seismic monitoring

    NASA Astrophysics Data System (ADS)

    Neagoe, Cristian; Marius Manea, Liviu; Ionescu, Constantin

    2014-05-01

    The main seismic survey of Romania is performed by the National Institute for Earth Physics (NIEP), which operates a real-time digital seismic network. The NIEP real-time network currently consists of 102 stations and two seismic arrays equipped with various high-quality digitizers (Kinemetrics K2, Quanterra Q330, Quanterra Q330HR, PS6-26, Basalt), broadband and short-period seismometers (CMG3ESP, CMG40T, KS2000, KS54000, CMG3T, STS2, SH-1, S13, Mark L4C, Ranger, GS21, Mark L22) and acceleration sensors (Kinemetrics EpiSensor). The data are transmitted to the National Data Center (NDC) and the Eforie Nord (EFOR) Seismic Observatory. EFOR is the back-up for the NDC and also a monitoring center for Black Sea tsunami events. NIEP is a data acquisition node for the seismic network of Moldova (FDSN code MD), composed of five seismic stations. NIEP has installed eight seismic stations equipped with broadband sensors and EpiSensors in the northern part of Bulgaria, and nine accelerometers (EpiSensors) in nine districts along the Danube River. All the data are acquired at NIEP for the Early Warning System and for a first estimation of earthquake parameters. Real-time (RT) acquisition and data exchange are handled by Antelope software and SeedLink (from SeisComP3). Real-time data communication is ensured by several types of transmission: GPRS, satellite, radio, Internet and a dedicated line provided by a governmental network. For data processing and analysis at the two data centers, Antelope 5.2 runs on three workstations: one on a CentOS platform and two on Mac OS. A SeisComP3 server also stands as back-up for Antelope 5.2. Both the acquisition and the analysis systems produce information about the local and global parameters of earthquakes. 
    In addition, Antelope is used for manual processing (event association, magnitude calculation, database creation, distribution of seismic bulletins, calculation of PGA and PGV, etc.), for generating ShakeMap products, and for interaction with global data centers. The National Data Center has developed tools to centralize data from software such as Antelope and SeisComP3. These tools allow rapid distribution to the public of information about damage observed after an earthquake. Another feature of the application is the alerting of designated persons, via email and SMS, based on the earthquake parameters. In parallel, SeisComP3 sends automatic notifications (emails) with the earthquake parameters. The real-time seismic network and the acquisition and processing software used at the National Data Center have increased the number of events detected locally and globally, improved the quality of the parameters obtained by data processing, and raised the network's visibility both nationally and internationally.

  5. Real-time earthquake monitoring using a search engine method.

    PubMed

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-01-01

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake's parameters in <1 s after receiving the long-period surface wave data. PMID:25472861
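
    The abstract does not describe the indexing scheme itself, but the underlying task it accelerates, finding the stored waveform that best fits the input, can be sketched as a naive exact search over a seismogram database. This is an illustration of the search problem, not the authors' fast method, which they report is several thousand times faster than an exhaustive scan like this one:

```python
import numpy as np

def best_match(query, database):
    """Return (index, score) of the stored waveform most similar to the query.

    Similarity is the normalized cross-correlation at zero lag; a real
    search engine replaces this exhaustive scan with an indexed lookup.
    """
    q = query - query.mean()
    q /= np.linalg.norm(q) + 1e-12
    best_i, best_score = -1, -np.inf
    for i, w in enumerate(database):
        v = w - w.mean()
        v /= np.linalg.norm(v) + 1e-12
        score = float(np.dot(q, v))
        if score > best_score:
            best_i, best_score = i, score
    return best_i, best_score
```

    Each database entry would carry precomputed source parameters (location, magnitude, focal mechanism), so returning the best-fitting stored waveform immediately yields an estimate for the new event.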

  6. Earthquakes.

    PubMed

    Briggs, Susan M

    2006-06-01

    Major earthquakes have the potential to be one of the most catastrophic natural disasters affecting mankind. Earthquakes of significant size threaten lives and damage property by setting off a chain of events that disrupts all aspects of the environment and significantly impacts the public health and medical infrastructures of the affected region. This article provides an overview of basic earthquake facts and relief protocol for medical personnel. PMID:16781268

  7. Advanced Real-time Monitoring System and Simulation Researches for Earthquakes and Tsunamis in Japan -Towards Disaster Mitigation on Earthquakes and Tsunamis-

    NASA Astrophysics Data System (ADS)

    Hyodo, M.; Kaneda, Y.; Takahashi, N.; Baba, T.; Hori, T.; Kawaguchi, K.; Araki, E.; Matsumoto, H.; Nakamura, T.; Kamiya, S.; Ariyoshi, K.; Nakano, M.; Choi, J. K.; Nishida, S.

    2014-12-01

    How can we mitigate and reduce the damage caused by earthquakes and tsunamis? This is a very important and unavoidable problem for Japan and other countries with high seismicity. Based on lessons learned from the 2004 Sumatra earthquake/tsunami and the 2011 East Japan earthquake/tsunami, we recognized the importance of real-time monitoring of these natural hazards. As a real-time monitoring system, DONET1 (Dense Ocean floor Network for Earthquakes and Tsunamis) was deployed, and DONET2 is being developed, around the Nankai trough in southwestern Japan for seismology and earthquake/tsunami early warning. Based on simulation research, DONET1 and DONET2, with multiple kinds of sensors such as broadband seismometers and precise pressure gauges, are expected to monitor slow events such as low-frequency tremor and slow earthquakes in order to estimate the seismic stage, i.e., whether a fault is in the inter-seismic or pre-seismic stage. In advanced simulation research, such as on the recurrence cycle of megathrust earthquakes, data assimilation is a very powerful tool for improving reliability. Furthermore, simulations of tsunami inundation, of the seismic response of buildings and cities, and of agent-based evacuation are very important for future disaster mitigation programs and related measures. Finally, real-time monitoring data and advanced simulations will be integrated for precise earthquake/tsunami early warning and for estimating damage in future compound earthquake and tsunami disasters. We introduce the present progress of this advanced research and the future scope of disaster mitigation research on earthquakes and tsunamis.

  8. Real-time earthquake monitoring using a search engine method

    PubMed Central

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-01-01

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake’s parameters in <1 s after receiving the long-period surface wave data. PMID:25472861

  9. Remote monitoring of the earthquake cycle using satellite radar interferometry.

    PubMed

    Wright, Tim J

    2002-12-15

    The earthquake cycle is poorly understood. Earthquakes continue to occur on previously unrecognized faults. Earthquake prediction seems impossible. These remain the facts despite nearly 100 years of intensive study since the earthquake cycle was first conceptualized. Using data acquired from satellites in orbit 800 km above the Earth, a new technique, radar interferometry (InSAR), has the potential to solve these problems. For the first time, detailed maps of the warping of the Earth's surface during the earthquake cycle can be obtained with a spatial resolution of a few tens of metres and a precision of a few millimetres. InSAR does not need equipment on the ground or expensive field campaigns, so it can gather crucial data on earthquakes and the seismic cycle from some of the remotest areas of the planet. In this article, I review some of the remarkable observations of the earthquake cycle already made using radar interferometry and speculate on breakthroughs that are tantalizingly close. PMID:12626271
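
    The millimetre precision quoted above comes from measuring radar phase: one 2*pi fringe in an interferogram corresponds to half a radar wavelength of line-of-sight ground motion, about 28 mm for C-band sensors such as ERS. A minimal conversion sketch, assuming an already unwrapped phase and a C-band wavelength (the sign convention is an assumption; it varies between processors):

```python
import math

def los_displacement_mm(phase_rad, wavelength_m=0.056):
    """Convert unwrapped interferometric phase to line-of-sight
    displacement in millimetres.

    One 2*pi fringe corresponds to half a radar wavelength of motion
    along the line of sight (~28 mm for C-band, lambda ~ 5.6 cm).
    Positive here is taken as motion toward the satellite.
    """
    return phase_rad * wavelength_m / (4 * math.pi) * 1000.0
```

    The factor of 4*pi rather than 2*pi reflects the two-way travel path of the radar signal.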

  10. Recent Experiences Operating a Large, International Network of Electromagnetic Earthquake Monitors

    NASA Astrophysics Data System (ADS)

    Bleier, T.; Dunson, J. C.; Lemon, J.

    2014-12-01

    Leading a 5-nation international collaboration, QuakeFinder currently operates a network of 168 instruments along with a Data Center that processes 10 GB of data each day, 7 days a week. Each instrument includes 3-axis induction magnetometers, positive and negative ion sensors, and a geophone. These ground instruments are augmented with GOES weather satellite infrared monitoring of California (and, in the future, other countries). The nature of the signals we are trying to detect and identify to enable forecasts of significant earthquakes (>M5) requires refining algorithms that both identify quake-related signals at some distance and remove a myriad of natural and anthropogenic noise sources. Maximum detection range was further investigated this year. An initial estimated maximum detection distance of 10 miles (16 km) was challenged with the onset of the M8.2 quake near Iquique, Chile, on April 1, 2014. We will discuss the different strategies used to push the limits of detection for this quake, which was 93 miles (149 km) from an instrument that had been installed just 2 months before the quake. Identifying and masking natural and man-made noise to reduce the number of misses and false alarms, and to increase the number of "hits" in a limited earthquake data set, continues to be a top priority. Several novel approaches were tried, and the resulting progress will be discussed.

  11. Twitter Seismology: Earthquake Monitoring and Response in a Social World

    NASA Astrophysics Data System (ADS)

    Bowden, D. C.; Earle, P. S.; Guy, M.; Smoczyk, G.

    2011-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment USGS earthquake response products and the delivery of hazard information. The potential uses of Twitter for earthquake response include broadcasting earthquake alerts, rapidly detecting widely felt events, qualitatively assessing earthquake damage effects, communicating with the public, and participating in post-event collaboration. Several seismic networks and agencies are currently distributing Twitter earthquake alerts including the European-Mediterranean Seismological Centre (@LastQuake), Natural Resources Canada (@CANADAquakes), and the Indonesian meteorological agency (@infogempabmg); the USGS will soon distribute alerts via the @USGSted and @USGSbigquakes Twitter accounts. Beyond broadcasting alerts, the USGS is investigating how to use tweets that originate near the epicenter to detect and characterize shaking events. This is possible because people begin tweeting immediately after feeling an earthquake, and their short narratives and exclamations are available for analysis within tens of seconds of the origin time. Using five months of tweets that contain the word "earthquake" and its equivalent in other languages, we generate a tweet-frequency time series. The time series clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a simple Short-Term-Average / Long-Term-Average algorithm similar to that commonly used to detect seismic phases. As with most auto-detection algorithms, the parameters can be tuned to catch more or fewer events at the cost of more or fewer false triggers. When tuned to a moderate sensitivity, the detector found 48 globally distributed, confirmed seismic events with only 2 false triggers. A space-shuttle landing and "The Great California ShakeOut" caused the false triggers. 
    This number of detections is very small compared to the 5,175 earthquakes in the USGS PDE global earthquake catalog for the same five-month period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 80% occurred within 2 minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided (very) short first-impression narratives from people who experienced the shaking. The USGS will continue investigating how to use Twitter and other forms of social media to augment its current suite of seismographically derived products.
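
    The Short-Term-Average / Long-Term-Average detector described above can be sketched in a few lines. This is a minimal illustration on a tweets-per-minute series; the window lengths and threshold below are placeholders, not the USGS operational settings:

```python
import numpy as np

def sta_lta_triggers(counts, n_sta=3, n_lta=60, threshold=5.0):
    """Flag sample indices where the short-term average of `counts`
    exceeds `threshold` times the trailing long-term average.

    `counts` is tweets per minute; the same scheme is used on seismic
    amplitude envelopes to pick phase arrivals.
    """
    counts = np.asarray(counts, dtype=float)
    triggers = []
    for t in range(n_lta, len(counts)):
        sta = counts[t - n_sta + 1 : t + 1].mean()  # short window ending at t
        lta = counts[t - n_lta : t].mean()          # long window before t
        if lta > 0 and sta / lta >= threshold:
            triggers.append(t)
    return triggers
```

    Raising `threshold` or lengthening `n_sta` trades missed events against false triggers, the tuning tradeoff the abstract describes.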

  12. Real time monitoring systems and advanced simulation researches for Earthquakes/ Tsunami disaster mitigation (Invited)

    NASA Astrophysics Data System (ADS)

    Kaneda, Y.

    2013-12-01

    Research on Nankai trough megathrust earthquakes around southwestern Japan is very important for earthquake/tsunami disaster mitigation in Japan. In particular, offshore real-time monitoring systems are indispensable for the early warning of earthquakes and tsunamis. Lessons from the 2004 Sumatra earthquake/tsunami and the 2011 East Japan earthquake have accelerated the deployment of ocean-floor networks such as DONET1 and DONET2 around the Nankai trough and the inline cable systems off East Japan. DONET1 has already been deployed on the Tonankai earthquake seismogenic zone, and DONET2, on the Nankai earthquake seismogenic zone, is under development. In the Nankai trough seismogenic zones, megathrust earthquakes have recurred at intervals of 100-200 years. However, the recurrence patterns among the megathrust earthquakes of 1944/1946, 1854, 1707 and 1605 are quite different. Furthermore, because these seismogenic zones are located near the coasts of southwestern Japan, tsunamis arrive very quickly, so evacuation is a severe problem for coastal cities in southwestern Japan. Under these conditions, the Japanese government and research community recognized that real-time monitoring systems for earthquakes and tsunamis are very important for earthquake early warning (EEW) and prediction research. Furthermore, an ocean-floor network equipped with multiple kinds of sensors, such as seismometers and pressure gauges, is a very powerful and significant tool for monitoring broadband phenomena in seismogenic zones. In the Nankai trough, we constructed DONET1, a Dense Ocean floor Network for Earthquakes and Tsunamis, around the Tonankai seismogenic zone with 20 observatories. Each observatory is equipped with multiple kinds of sensors: an accelerometer, a broadband seismometer, a precise pressure gauge, a differential pressure gauge and a precise thermometer. We are now developing DONET2, with 31 observatories, around the Nankai seismogenic zone. 
    Furthermore, advanced earthquake/tsunami simulations for scenario research, hazard evaluation and evacuation planning are very important and significant. Finally, we will apply these real-time data and advanced simulations to early warning, prediction and hazard evaluation research for earthquake/tsunami disaster mitigation, and to understanding the seismic linkages among megathrust earthquakes around the Nankai trough.

  13. Earthquake!

    ERIC Educational Resources Information Center

    Hernandez, Hildo

    2000-01-01

    Examines the types of damage experienced by California State University at Northridge during the 1994 earthquake and discusses the lessons learned in handling this emergency. The problem of loose asbestos is addressed. (GR)

  14. Earthquakes

    USGS Publications Warehouse

    Shedlock, Kaye M.; Pakiser, Louis Charles

    1998-01-01

    One of the most frightening and destructive phenomena of nature is a severe earthquake and its terrible aftereffects. An earthquake is a sudden movement of the Earth, caused by the abrupt release of strain that has accumulated over a long time. For hundreds of millions of years, the forces of plate tectonics have shaped the Earth as the huge plates that form the Earth's surface slowly move over, under, and past each other. Sometimes the movement is gradual. At other times, the plates are locked together, unable to release the accumulating energy. When the accumulated energy grows strong enough, the plates break free. If the earthquake occurs in a populated area, it may cause many deaths and injuries and extensive property damage. Today we are challenging the assumption that earthquakes must present an uncontrollable and unpredictable hazard to life and property. Scientists have begun to estimate the locations and likelihoods of future damaging earthquakes. Sites of greatest hazard are being identified, and definite progress is being made in designing structures that will withstand the effects of earthquakes.

  15. Korea Integrated Seismic System tool(KISStool) for seismic monitoring and data sharing at the local data center

    NASA Astrophysics Data System (ADS)

    Park, J.; Chi, H. C.; Lim, I.; Jeong, B.

    2011-12-01

    The Korea Integrated Seismic System (KISS) is a backbone seismic network that distributes seismic data to different organizations in near-real time in Korea. The association of earthquake monitoring institutes has shared its seismic data through the KISS since 2003. Local data centers operating several remote stations are required to send their free-field seismic data to NEMA (National Emergency Management Agency) under the Korean law on countermeasures against earthquake hazards. An efficient tool is therefore very important for local data centers that want to rapidly detect local seismic intensity and to transfer seismic event information, including PGA, PGV, dominant frequency of the P-wave, raw data, etc., to the nationwide data center. We developed the KISStool (Korea Integrated Seismic System tool) for easy and convenient operation of a seismic network at a local data center. The KISStool can monitor real-time waveforms by clicking a station icon on the Google map, and the real-time variation of PGA, PGV and other data by opening the bar-type monitoring section. Using the KISStool, any local data center can transfer event information to NEMA, KMA (Korea Meteorological Administration) or other institutes through the KISS using UDP or TCP/IP protocols. The KISStool is one of the most efficient ways to monitor earthquakes and transfer event information at a local data center in Korea. KIGAM will provide the KISStool not only to members of the monitoring association but also to local governments.
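
    Transferring event information as a UDP datagram, as KISStool does, can be sketched as below. The JSON payload and its field names are hypothetical illustrations; the actual KISS packet format is not described in the abstract:

```python
import json
import socket

def send_event_udp(event, host, port):
    """Send one earthquake-event message as a JSON-encoded UDP datagram.

    `event` is a dict of parameters; the keys used here (station, pga, ...)
    are placeholders, not the real KISS wire format. Returns bytes sent.
    """
    payload = json.dumps(event).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))
    return len(payload)
```

    UDP keeps latency low, which matters for intensity alerting, at the cost of delivery guarantees; a TCP/IP path, which KISS also supports, trades latency for reliability.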

  16. Progress and development on multi-parameters remote sensing application in earthquake monitoring in China

    NASA Astrophysics Data System (ADS)

    Shen, Xuhui; Zhang, Xuemin; Hong, Shunying; Jing, Feng; Zhao, Shufan

    2013-12-01

    In this paper, progress and developments in remote sensing technology applied to earthquake monitoring research are summarized, including differential interferometric synthetic aperture radar (D-InSAR), infrared remote sensing, and seismo-ionospheric detection. Many new monitoring data in this domain have been used, and new data processing methods have been developed to obtain high-precision images of crustal deformation, outgoing longwave radiation (OLR), surface latent heat flux (SLHF), and ionospheric parameters. The development of monitoring technology and data processing techniques greatly enriches the information available for earthquake research and provides new tools for an earthquake stereoscopic monitoring system, especially its space segment. Finally, new development trends in this area are introduced, and some key problems for future work are pointed out.

  17. Correlation of major eastern earthquake centers with mafic/ultramafic basement masses

    USGS Publications Warehouse

    Kane, Martin Francis

    1977-01-01

    Extensive gravity highs and associated magnetic anomalies are present in or near seven major eastern North American earthquake areas as defined by Hadley and Devine (1974). The seven include the five largest of the eastern North American earthquake centers. The immediate localities of the gravity anomalies are, however, relatively free of seismicity, particularly of the largest events. The anomalies are presumably caused by extensive mafic or ultramafic masses embedded in the crystalline basement. Laboratory experiments show that serpentinized gabbro and dunite fail under stress in a creep mode rather than in a stick-slip mode. A possible explanation of the correlation between the earthquake patterns and the anomalies is that the mafic/ultramafic masses are serpentinized and can sustain only low stress fields, thereby acting to concentrate regional stress outside their boundaries. The proposed model is analogous to the hole-in-plate problem of mechanics, whereby stresses around a hole in a stressed plate may reach values several times the average.
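
    The hole-in-plate analogy can be made quantitative with the classical Kirsch solution for a circular hole in an infinite elastic plate under remote uniaxial tension, in which the hoop stress at the hole boundary reaches three times the remote stress. A minimal sketch of that textbook result, offered as an illustration of the analogy rather than part of Kane's analysis:

```python
import math

def kirsch_hoop_stress(sigma, a, r, theta):
    """Hoop stress around a circular hole of radius `a` in an infinite
    plate under remote uniaxial tension `sigma` (Kirsch solution).

    `theta` is measured from the loading axis and `r >= a`. At the hole
    boundary the stress is sigma * (1 - 2*cos(2*theta)), peaking at 3*sigma.
    """
    k = (a / r) ** 2
    return 0.5 * sigma * (1 + k) - 0.5 * sigma * (1 + 3 * k * k) * math.cos(2 * theta)
```

    The threefold concentration at the hole edge is the sense in which a weak serpentinized mass could focus regional stress, and hence seismicity, just outside its boundary.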

  18. Potential utilization of the NASA/George C. Marshall Space Flight Center in earthquake engineering research

    NASA Technical Reports Server (NTRS)

    Scholl, R. E. (Editor)

    1979-01-01

    Earthquake engineering research capabilities of the National Aeronautics and Space Administration (NASA) facilities at George C. Marshall Space Flight Center (MSFC), Alabama, were evaluated. The results indicate that the NASA/MSFC facilities and supporting capabilities offer unique opportunities for conducting earthquake engineering research. Specific features that are particularly attractive for large scale static and dynamic testing of natural and man-made structures include the following: large physical dimensions of buildings and test bays; high loading capacity; wide range and large number of test equipment and instrumentation devices; multichannel data acquisition and processing systems; technical expertise for conducting large-scale static and dynamic testing; sophisticated techniques for systems dynamics analysis, simulation, and control; and capability for managing large-size and technologically complex programs. Potential uses of the facilities for near and long term test programs to supplement current earthquake research activities are suggested.

  19. A Correction to the article "Geo-center movement caused by huge earthquakes" by Wenke Sun and Jie Dong

    NASA Astrophysics Data System (ADS)

    Zhou, Jiangcun; Sun, Wenke; Dong, Jie

    2015-07-01

    Sun and Dong (2014) studied the co-seismic geo-center movement using dislocation theory for a spherical earth model. However, they incorrectly considered the maximum vertical co-seismic displacement as the rigid geo-center motion (i.e., they did not separate the rigid shift and elastic deformation). In this paper, we correct Sun and Dong (2014) by using a new approach. We now define the geo-center motion as a shift of the center of figure of the Earth relative to the center of mass of the Earth. Furthermore, we derive new formulas to compute the co-seismic geo-center and inner core's center movements caused by huge earthquakes. The 2004 Sumatra earthquake and the 2011 Tohoku-Oki earthquake changed the geo-center by 1-4 mm and about 2 mm, respectively, and caused the inner core's center to displace by about 0.05 mm and 0.025 mm, respectively.

  20. 88 hours: the U.S. Geological Survey National Earthquake Information Center response to the March 11, 2011 Mw 9.0 Tohoku earthquake

    USGS Publications Warehouse

    Wald, David J.; Hayes, Gavin P.; Benz, Harley M.; Earle, Paul; Briggs, Richard W.

    2011-01-01

    The M 9.0 11 March 2011 Tohoku, Japan, earthquake and associated tsunami near the east coast of the island of Honshu caused tens of thousands of deaths and potentially over one trillion dollars in damage, resulting in one of the worst natural disasters ever recorded. The U.S. Geological Survey National Earthquake Information Center (USGS NEIC), through its responsibility to respond to all significant global earthquakes as part of the National Earthquake Hazards Reduction Program, quickly produced and distributed a suite of earthquake information products to inform emergency responders, the public, the media, and the academic community of the earthquake's potential impact and to provide scientific background for the interpretation of the event's tectonic context and potential for future hazard. Here we present a timeline of the NEIC response to this devastating earthquake in the context of rapidly evolving information emanating from the global earthquake-response community. The timeline includes both internal and publicly distributed products, the relative timing of which highlights the inherent tradeoffs between the requirement to provide timely alerts and the necessity for accurate, authoritative information. The timeline also documents the iterative and evolutionary nature of the standard products produced by the NEIC and includes a behind-the-scenes look at the decisions, data, and analysis tools that drive our rapid product distribution.

  1. Effects of a major earthquake on calls to regional poison control centers.

    PubMed Central

    Nathan, A. R.; Olson, K. R.; Everson, G. W.; Kearney, T. E.; Blanc, P. D.

    1992-01-01

    We retrospectively evaluated the effect of the Loma Prieta earthquake on calls to 2 designated regional poison control centers (San Francisco and Santa Clara) in the area. In the immediate 12 hours after the earthquake, there was an initial drop (31%) in call volume, related to telephone system overload and other technical problems. Calls from Bay Area counties outside of San Francisco and Santa Clara decreased more dramatically than those from within the host counties where the poison control centers are located. In the next 2 days, each poison control center then handled a 27% increase in call volume. Requests for information regarding safety of water supplies and other environmental concerns were significantly increased. The number of cases of actual poisoning exposure decreased, particularly poison and drug ingestions in children. Most calls directly related to the earthquake included spills and leaks of hazardous materials and questions about water and food safety. Regional poison control centers play an essential role in the emergency medical response to major disasters and are critically dependent on an operational telephone system. PMID:1595244

  2. Glacier quakes mimicking volcanic earthquakes: The challenge of monitoring ice-clad volcanoes and some solutions

    NASA Astrophysics Data System (ADS)

    Allstadt, K.; Carmichael, J. D.; Malone, S. D.; Bodin, P.; Vidale, J. E.; Moran, S. C.

    2012-12-01

    Swarms of repeating earthquakes at volcanoes are often a sign of volcanic unrest. However, glaciers also can generate repeating seismic signals, so detecting unrest at glacier-covered volcanoes can be a challenge. We have found that multi-day swarms of shallow, low-frequency, repeating earthquakes occur regularly at Mount Rainier, a heavily glaciated stratovolcano in Washington, but that most swarms had escaped recognition until recently. Typically such earthquakes were too small to be routinely detected by the seismic network and were often buried in the noise on visual records, making the few swarms that had been detected seem more unusual and significant at the time they were identified. Our comprehensive search for repeating earthquakes through the past 10 years of continuous seismic data uncovered more than 30 distinct swarms of low-frequency earthquakes at Rainier, each consisting of hundreds to thousands of events. We found that these swarms locate high on the glacier-covered edifice, occur almost exclusively between late fall and early spring, and that their onset coincides with heavy snowfalls. We interpret the correlation with snowfall to indicate a seismically observable glacial response to snow loading. Efforts are underway to confirm this by monitoring glacier motion before and after a major snowfall event using ground based radar interferometry. Clearly, if the earthquakes in these swarms reflect a glacial source, then they are not directly related to volcanic activity. However, from an operational perspective they make volcano monitoring difficult because they closely resemble earthquakes that often precede and accompany volcanic eruptions. Because we now have a better sense of the background level of such swarms and know that their occurrence is seasonal and correlated with snowfall, it will now be easier to recognize if future swarms at Rainier are unusual and possibly related to volcanic activity. 
To methodically monitor for such unusual activity, we are implementing an automatic detection algorithm to continuously search for repeating earthquakes at Mount Rainier, an algorithm that we eventually intend to apply to other Cascade volcanoes. We propose that a comprehensive routine that characterizes background levels of repeating earthquakes and the degree of correlation with weather and seasonal forcing, combined with real-time monitoring for repeating earthquakes, will provide a means to more rapidly discriminate between glacier seismicity and seismicity related to volcanic activity on monitored glacier-clad volcanoes.
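
    Detection of repeating earthquakes of the kind described above is commonly done by sliding a waveform template along continuous data and flagging windows with high normalized cross-correlation. A minimal single-channel sketch of that idea, not the authors' operational algorithm, which would add multi-channel stacking and noise-adaptive thresholds:

```python
import numpy as np

def template_detections(template, data, threshold=0.8):
    """Return (offset, cc) pairs where the normalized cross-correlation
    between `template` and a window of `data` meets `threshold`.

    Repeating earthquakes produce near-identical waveforms, so they score
    close to 1.0 against a template built from a previous event.
    """
    t = template - template.mean()
    t_norm = np.linalg.norm(t)
    n = len(template)
    hits = []
    for i in range(len(data) - n + 1):
        w = data[i : i + n] - data[i : i + n].mean()
        denom = t_norm * np.linalg.norm(w)
        if denom == 0:
            continue
        cc = float(np.dot(t, w) / denom)
        if cc >= threshold:
            hits.append((i, cc))
    return hits
```

    Running such detectors continuously against a library of known glacial templates is one way to build the background occurrence statistics the authors describe.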

  3. Conversion of Historic Seismic Data at the Southern California Earthquake Data Center (SCEDC)

    NASA Astrophysics Data System (ADS)

    Appel, V. L.; Clayton, R. W.

    2003-12-01

    The Southern California Earthquake Data Center (SCEDC) archives and provides public access to continuous and event-based earthquake parametric and waveform data gathered by the Southern California Seismic Network. The mission of the SCEDC is to maintain an easily-accessible, well-organized, high-quality, searchable archive of earthquake data for research in seismology and earthquake engineering. The SCEDC has compiled and converted all available historic seismic data to create a single source of southern California earthquake data from 1932-present. The 1932-1976 era of seismic data was key-punched from the original phase cards into CUSP-format on a VAX system. The data was then imported into the SCEDC Oracle database, so phase and epicenter data is available for direct retrieval by users via STP. A problematic four-year span of CEDAR data from 1977-1980 is currently not accessible, but has been converted and is being processed to include magnitude information. The parametric data from 1981 to present has been loaded into the Oracle 9i database system and the waveforms for that time period have been converted to mSEED format and are accessible through the STP interface. Quality control of 1981-2000 historic parametric and waveform data has progressed using a detailed reverse-chronological examination and verification of magnitudes. Current efforts at the SCEDC are focused on continuing to expand the available seismic datasets, enhancing and expanding distribution methods, and providing rapid access to all datasets, historic and modern. Through the California Integrated Seismic Network, the SCEDC is working with the NCEDC to provide unified access to California earthquake data.

  4. Earthquakes and submarine volcanism in the Northeast Pacific: Exploration in the time domain based on 21-years of hydroacoustic monitoring

    NASA Astrophysics Data System (ADS)

    Hammond, S. R.; Dziak, R. P.; Fox, C. G.

    2012-12-01

    Monitoring of regional seismic activity in the Northeast Pacific has been accomplished for the past 21 years using the US Navy's Sound Surveillance System (SOSUS) hydrophone arrays. Seafloor seismic activity in this region occurs along the spreading center and transform boundaries between the Juan de Fuca, Pacific, and North American plates. From 1991 through 2011, nearly 50,000 earthquakes were detected and located. The majority of these events were associated with these tectonic boundaries, but sections of several plate boundaries were largely aseismic during this period. While most of the earthquakes were associated with geological structures revealed in bathymetric maps of the region, there were also less easily explained intraplate events, including a swarm within the interior of the southern portion of the Juan de Fuca plate. The location and sequential timing of events on portions of the plate boundaries also suggest ordered patterns of stress release. Among the most scientifically significant outcomes of acoustic monitoring was the discovery that deep seafloor magmatic activity can be accompanied by intense (>1000 events/day) earthquake swarms. The first swarm detected by SOSUS, in 1993, was confirmed to have been associated with an extrusive volcanic eruption along a segment of the Juan de Fuca spreading center. Notably, this was the first deep spreading center eruption detected, located, and studied while it was active. Subsequently, two more swarms were confirmed to have been associated with volcanic eruptions, one on the Gorda spreading center in 1996 and the other at Axial volcano in 1998. One characteristic of these swarms is the migration of their earthquake locations tens of kilometers along the ridge axis, tracking the movement of magma down-rift.
    The most rapid magma propagation events have been shown to be associated with seafloor eruptions and dramatic, transient changes in hydrothermal circulation, as well as discharges of large volumes of hot water, i.e., megaplumes. Hydroacoustic monitoring using SOSUS, now augmented with hydrophones deployed on stationary moorings as well as mobile platforms (e.g., gliders), provides a unique means for gaining knowledge on a broad diversity of present-day topics of scientific importance, including the sources and fate of carbon in the deep ocean, deep-ocean micro- and macro-ecosystems, and changes in ocean ambient noise levels.

  5. Real-Time Data Processing Systems and Products at the Alaska Earthquake Information Center

    NASA Astrophysics Data System (ADS)

    Ruppert, N. A.; Hansen, R. A.

    2007-05-01

    The Alaska Earthquake Information Center (AEIC) receives data from over 400 seismic sites located within the state boundaries and the surrounding regions and serves as a regional data center. In 2007, the AEIC reported ~20,000 seismic events, the largest being an M6.6 event in the Andreanof Islands. The real-time earthquake detection and data processing systems at the AEIC are based on the Antelope system from BRTT, Inc. This modular and extensible processing platform provides an integrated system spanning everything from data acquisition to catalog production. Multiple additional modules constructed with the Antelope toolbox have been developed to fit the particular needs of the AEIC. Real-time earthquake locations and magnitudes are determined within 2-5 minutes of event occurrence. The AEIC maintains a 24/7 seismologist-on-duty schedule. Earthquake alarms are based on the real-time earthquake detections. Significant events are reviewed by the seismologist on duty within 30 minutes of occurrence, and information releases are issued for significant events. This information is disseminated immediately via the AEIC website, the ANSS website via QDDS submissions, e-mail, cell phone and pager notifications, fax broadcasts, and recorded voice-mail messages. In addition, automatic regional moment tensors are determined for events with M>=4.0 and posted on the public website. ShakeMaps are calculated in real time, with the information currently accessible via a password-protected website. The AEIC is designing an alarm system targeted at critical lifeline operations in Alaska. The AEIC maintains an extensive computer network to adequately support data processing and archival; for real-time processing, it operates two identical, interoperable computer systems in parallel.

  6. Onsite infectious agents and toxins monitoring in 12 May Sichuan earthquake affected areas.

    PubMed

    Yao, Maosheng; Zhu, Tong; Li, Kejun; Dong, Shuofei; Wu, Yan; Qiu, Xinghua; Jiang, Bo; Chen, Liansheng; Zhen, Shiqi

    2009-11-01

    At 14:28 on 12 May 2008, Sichuan Province of China suffered a devastating earthquake measuring 8.0 on the Richter scale, with more than 80,000 human lives lost and millions displaced. With inadequate shelter, poor access to health services, and disrupted ecology, the survivors were at enormous risk of infectious disease outbreaks. This work, believed to be unprecedented, was carried out to help contain a possible outbreak through onsite monitoring of airborne biological agents in the high-risk areas. For this mission, a mobile laboratory was developed using a customized vehicle along with state-of-the-art bioaerosol and molecular equipment and tools, and deployed to Sichuan 11 days after the earthquake. Using a high-volume bioaerosol sampler (RCS High Flow) and a Button Inhalable Aerosol Sampler equipped with gelatin filters, a total of 55 air samples, including 28 filter samples, were collected from rubble, medical centers, and camps of refugees, troops, and rescue workers between 23 May and 9 June 2008. After pre-treatment of the air samples, quantitative polymerase chain reaction (qPCR), gel electrophoresis, the limulus amebocyte lysate (LAL) assay, and enzyme-linked immunosorbent assay (ELISA) were applied to detect infectious agents and to quantify environmental toxins and allergens. The results revealed that, while high levels of endotoxin (180 to 975 ng/m3) and (1,3)-beta-D-glucans (11 to 100 ng/m3) were observed, infectious agents such as Bacillus anthracis, Bordetella pertussis, Neisseria meningitidis, Mycobacterium tuberculosis, influenza A virus, bird flu virus (H5N1), enteric viruses, and meningococcal meningitis were found below their detection limits. Total bacterial concentrations ranged from 250 to 2.5 x 10^5 DNA copies/L. The Aspergillus fumigatus allergen (Asp f 1) and dust mite allergens (Der p 1 and Der f 1) were also found below their detection limits. PMID:19890556

  7. Data Sets and Data Services at the Northern California Earthquake Data Center

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Zuzlewski, S.; Allen, R. M.

    2014-12-01

    The Northern California Earthquake Data Center (NCEDC) houses a unique and comprehensive data archive and provides real-time services for a variety of seismological and geophysical data sets that encompass northern and central California. We have over 80 terabytes of continuous and event-based time-series data from broadband, short-period, strong-motion, and strain sensors, as well as continuous and campaign GPS data at both standard and high sample rates in raw and RINEX formats. The Northern California Seismic System (NCSS), operated by UC Berkeley and USGS Menlo Park, has recorded over 890,000 events from 1984 to the present, and the NCEDC provides catalogs, parametric information, moment tensors and first-motion mechanisms, and time-series data for these events. We also host and provide event catalogs, parametric information, and event waveforms for DOE enhanced geothermal system monitoring in northern California and Nevada. The NCEDC provides a variety of ways for users to access these data. The most recent development is web services, which provide interactive, command-line, or program-based workflow access to data. Web services use well-established server and client protocols and a RESTful software architecture that allow users to easily submit queries and receive the requested data in real time rather than through batch or email-based requests. Data are returned to the user in an appropriate format such as XML, RESP, simple text, or MiniSEED, depending on the service and the selected output format. The NCEDC supports all FDSN-defined web services as well as a number of IRIS-defined and NCEDC-defined services. We also continue to support older email-based and browser-based access to data. NCEDC data and web services can be found at http://www.ncedc.org and http://service.ncedc.org.
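    Because the FDSN web services follow a published specification, a query is simply a URL. A sketch of composing an event query against the NCEDC endpoint (the parameter values are arbitrary examples; the URL is only built here, since fetching it requires network access):

```python
from urllib.parse import urlencode

# FDSN fdsnws-event query against the NCEDC service host mentioned in the abstract.
# The parameter names (starttime, minmagnitude, ...) come from the FDSN web
# service specification.
BASE = "http://service.ncedc.org/fdsnws/event/1/query"

params = {
    "starttime": "2014-01-01",
    "endtime": "2014-02-01",
    "minmagnitude": 3.5,
    "minlatitude": 36.0, "maxlatitude": 42.0,
    "minlongitude": -125.0, "maxlongitude": -118.0,
    "format": "text",
}
url = BASE + "?" + urlencode(params)
# The URL can then be fetched with urllib.request.urlopen(url) or any HTTP
# client; the "text" format returns one pipe-delimited line per event.
```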

  8. EQInfo - earthquakes world-wide

    NASA Astrophysics Data System (ADS)

    Weber, Bernd; Herrnkind, Stephan

    2014-05-01

    EQInfo is a free Android app providing recent earthquake information from various earthquake monitoring centers such as GFZ, EMSC, USGS, and others. It allows filtering by agency, region, and magnitude, as well as control of the update interval, institute priority, and alarm types. Used by more than 25k active users and appearing in Google Play's top-ten list, EQInfo is one of the most popular apps for earthquake information.

  9. Development of regional earthquake early warning and structural health monitoring system and real-time ground motion forecasting using front-site waveform data (Invited)

    NASA Astrophysics Data System (ADS)

    Motosaka, M.

    2009-12-01

    This paper presents, first, the development of an integrated regional earthquake early warning (EEW) system with an on-line structural health monitoring (SHM) function in Miyagi prefecture, Japan. The system makes it possible to provide more accurate, reliable, and immediate earthquake information to society by combining the national (JMA/NIED) EEW system with advanced real-time communication technology. The author plans to install the EEW/SHM system in public buildings around Sendai, a city of a million people in northeastern Japan. The system has so far been implemented in two buildings: one in Sendai, and the other in Oshika, a front site on the Pacific Ocean coast facing the anticipated Miyagi-ken Oki earthquake. The data from the front site and the on-site buildings are processed by the analysis system installed at the Disaster Control Research Center, Tohoku University. The real-time earthquake information from JMA is also received at the analysis center. The utilization of the integrated EEW/SHM system is addressed together with future perspectives. Examples of the obtained data are also described, including the amplitude-dependent dynamic characteristics of the building in Sendai before, during, and after the 2008/6/14 Iwate-Miyagi Nairiku earthquake, together with the historical change of its dynamic characteristics over 40 years. Second, this paper presents an advanced methodology based on artificial neural networks (ANN) for forward forecasting of ground motion parameters, not only PGA and PGV but also spectral information, before S-wave arrival, using the initial part of the P waveform at a front site. The estimated ground motion information can be used as a warning alarm for earthquake damage reduction. The Fourier amplitude spectra (FAS) estimated with high accuracy before strong shaking can be used for advanced engineering applications, e.g., feed-forward structural control of a building of interest.
    The validity and applicability of the method have been verified using observation data sets from K-NET sites for 39 earthquakes that occurred in the Miyagi-Oki area. The initial part of the P waveform at the Oshika site (MYG011) of K-NET was used as the front-site waveform data. The observation data for 35 of the 39 earthquakes, as well as positional and site information, were used as training data to construct the ANN structure. The data sets for the remaining 4 earthquakes were used as test data in a blind prediction of PGA and PGV at 4 sites, namely Sendai (MYG013), Taiwa (MYG009), Shiogama (MYG012), and Ishinomaki (MYG010).
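    The forecasting engine described here is an ANN trained on front-site P waveforms; as a much-simplified stand-in, the same idea (map a feature of the early P wave to a later ground-motion parameter) can be shown with a one-parameter regression on synthetic data. The Pd-to-PGA scaling coefficients below are invented for illustration and are not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic training set: log10 peak initial-P displacement (Pd) vs log10 PGA,
# generated from an assumed linear scaling relation with scatter.
log_pd = rng.uniform(-6.0, -2.0, 50)
log_pga = 0.9 * log_pd + 1.2 + rng.normal(0.0, 0.1, 50)

# Fit the scaling, then forecast PGA for a new event from its early P wave alone.
a, b = np.polyfit(log_pd, log_pga, 1)
log_pga_forecast = a * (-4.0) + b   # forecast for a hypothetical event with log10 Pd = -4
```

    The actual method replaces this regression with an ANN that also ingests positional and site information, which lets it forecast full spectral shapes rather than single parameters.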

  10. Earthquake Education and Public Information Centers: A Collaboration Between the Earthquake Country Alliance and Free-Choice Learning Institutions in California

    NASA Astrophysics Data System (ADS)

    Degroot, R. M.; Springer, K.; Brooks, C. J.; Schuman, L.; Dalton, D.; Benthien, M. L.

    2009-12-01

    In 1999 the Southern California Earthquake Center initiated an effort to expand its reach to multiple target audiences through the development of an interpretive trail on the San Andreas fault at Wallace Creek and an earthquake exhibit at the Fingerprints Youth Museum in Hemet. These projects, and involvement with the San Bernardino County Museum in Redlands beginning in 2007, led to the creation of Earthquake Education and Public Information Centers (EPIcenters) in 2008. The impetus for developing the network was to broaden participation in The Great Southern California ShakeOut. By 2009 it had grown more comprehensive in scope, evolving into a statewide network. EPIcenters constitute a variety of free-choice learning institutions, representing museums, science centers, libraries, universities, parks, and other places visited by a variety of audiences including families, seniors, and school groups. They share a commitment to demonstrating and encouraging earthquake preparedness. EPIcenters coordinate Earthquake Country Alliance activities in their county or region, lead presentations or organize events in their communities, or in other ways demonstrate leadership in earthquake education and risk reduction. The San Bernardino County Museum (Southern California) and The Tech Museum of Innovation (Northern California) serve as EPIcenter regional coordinating institutions. They interact with over thirty institutional partners who have implemented a variety of activities, from displays and talks to earthquake exhibitions. While many activities are focused on the time leading up to and just after the ShakeOut, most EPIcenter members conduct activities year round. Network members at the Kidspace Museum in Pasadena and the San Diego Natural History Museum have formed EPIcenter focus groups on early childhood education and on safety and security.
    This presentation highlights the development of the EPIcenter network, synergistic activities resulting from this collaboration, and lessons learned from interacting with free-choice learning institutions.

  11. Providing Seismotectonic Information to the Public Through Continuously Updated National Earthquake Information Center Products

    NASA Astrophysics Data System (ADS)

    Bernardino, M. J.; Hayes, G. P.; Dannemann, F.; Benz, H.

    2012-12-01

    One of the main missions of the United States Geological Survey (USGS) National Earthquake Information Center (NEIC) is the dissemination of information to national and international agencies, scientists, and the general public through various products such as ShakeMap and earthquake summary posters. During the summer of 2012, undergraduate and graduate student interns helped to update and improve our series of regional seismicity posters and regional tectonic summaries. The "Seismicity of the Earth (1900-2007)" poster placed over a century's worth of global seismicity data in the context of plate tectonics, highlighting regions that have experienced great (M8.0 or larger) earthquakes and the tectonic settings of those events. This endeavor became the basis for a series of more regionalized seismotectonic posters that focus on major subduction zones and their associated seismicity, including the Aleutian and Caribbean arcs. The first round of these posters was inclusive of events through 2007 and was made with the intent of being continually updated. Each poster includes a regional tectonic summary, a seismic hazard map, focal-depth cross-sections, and a main map illustrating the main subduction zone and other physiographic features, seismicity, and the rupture zones of historic great earthquakes. Many of the existing regional seismotectonic posters have been updated, and new posters highlighting regions of current seismological interest have been created, including the Sumatra and Java arcs, the Middle East region, and the Himalayas (all of which are currently in review). These new editions include updated lists of earthquakes, expanded tectonic summaries, updated relative plate motion vectors, and major crustal faults. These posters thus improve upon previous editions that included only brief tectonic discussions of the most prominent features and historic earthquakes, and that did not systematically represent non-plate-boundary faults.
    Regional tectonic summaries provide the public with immediate background information useful for teaching and media-related purposes and are an essential component of many NEIC products. As part of the NEIC's earthquake response, rapid earthquake summary posters are created in the hours following a significant global earthquake. These regional tectonic summaries are included in each earthquake summary poster along with a discussion of the event, written by research scientists at the NEIC, often with help from regional experts. Now, through the efforts of this and related studies, event webpages will automatically contain a regional tectonic summary immediately after an event has been posted. These new summaries include information about plate boundary interactions and other associated tectonic elements, trends in seismicity, and brief descriptions of significant earthquakes that have occurred in a region. The tectonic summaries for the following regions have been updated as part of this work: South America, the Caribbean, Alaska and the Aleutians, Kuril-Kamchatka, Japan and vicinity, and Central America, with newly created summaries for Sumatra and Java, the Mediterranean, the Middle East, and the Himalayas. The NEIC is currently planning to integrate concise stylized maps with each tectonic summary for display on the USGS website.

  12. Application of Collocated GPS and Seismic Sensors to Earthquake Monitoring and Early Warning

    PubMed Central

    Li, Xingxing; Zhang, Xiaohong; Guo, Bofeng

    2013-01-01

    We explore the use of collocated GPS and seismic sensors for earthquake monitoring and early warning. The GPS and seismic data collected during the 2011 Tohoku-Oki (Japan) and the 2010 El Mayor-Cucapah (Mexico) earthquakes are analyzed by using a tightly-coupled integration. The performance of the integrated results is validated by both time and frequency domain analysis. We detect the P-wave arrival and observe small-scale features of the movement from the integrated results and locate the epicenter. Meanwhile, permanent offsets are extracted from the integrated displacements highly accurately and used for reliable fault slip inversion and magnitude estimation. PMID:24284765
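    The GPS/seismic fusion described above combines low-rate, drift-free GPS displacements with high-rate accelerometer data. A heavily simplified single-component sketch using a textbook Kalman filter (the actual study works at the GPS observation level in a tightly-coupled formulation; the noise parameters and synthetic motion below are assumptions):

```python
import numpy as np

def fuse_gps_accel(gps_disp, accel, dt, q=1e-6, r=2.5e-5):
    """Kalman filter with state [displacement, velocity]: the accelerometer
    drives the prediction step, GPS displacement provides the measurement update."""
    F = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.array([0.5 * dt**2, dt])
    H = np.array([[1.0, 0.0]])
    Q = q * np.eye(2)                      # process noise (accelerometer errors)
    R = np.array([[r]])                    # GPS measurement noise variance
    x, P = np.zeros(2), np.eye(2)
    est = np.empty(len(gps_disp))
    for k, (z, a) in enumerate(zip(gps_disp, accel)):
        x = F @ x + B * a                  # predict with acceleration
        P = F @ P @ F.T + Q
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x + (K @ (np.atleast_1d(z) - H @ x)).ravel()   # correct with GPS
        P = (np.eye(2) - K @ H) @ P
        est[k] = x[0]
    return est

# Synthetic coseismic motion: a smooth ramp to a 10 cm permanent offset.
dt = 0.05
t = np.arange(0.0, 20.0, dt)
true_disp = 0.05 * (1.0 + np.tanh(t - 10.0))
accel = np.gradient(np.gradient(true_disp, dt), dt)
rng = np.random.default_rng(2)
gps = true_disp + rng.normal(0.0, 0.005, t.size)   # GPS with 5 mm noise

est = fuse_gps_accel(gps, accel, dt)
```

    The fused series is smoother than the raw GPS while preserving the permanent offset, which is the property exploited for fault-slip inversion and magnitude estimation.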

  13. Application of collocated GPS and seismic sensors to earthquake monitoring and early warning.

    PubMed

    Li, Xingxing; Zhang, Xiaohong; Guo, Bofeng

    2013-01-01

    We explore the use of collocated GPS and seismic sensors for earthquake monitoring and early warning. The GPS and seismic data collected during the 2011 Tohoku-Oki (Japan) and the 2010 El Mayor-Cucapah (Mexico) earthquakes are analyzed by using a tightly-coupled integration. The performance of the integrated results is validated by both time and frequency domain analysis. We detect the P-wave arrival and observe small-scale features of the movement from the integrated results and locate the epicenter. Meanwhile, permanent offsets are extracted from the integrated displacements highly accurately and used for reliable fault slip inversion and magnitude estimation. PMID:24284765

  14. Space Radiation Monitoring Center at SINP MSU

    NASA Astrophysics Data System (ADS)

    Kalegaev, Vladimir; Barinova, Wera; Barinov, Oleg; Bobrovnikov, Sergey; Dolenko, Sergey; Mukhametdinova, Ludmila; Myagkova, Irina; Nguen, Minh; Panasyuk, Mikhail; Shiroky, Vladimir; Shugay, Julia

    2015-04-01

    Data on energetic particle fluxes from Russian satellites are collected in near real time at the Space Monitoring Data Center at Moscow State University. The web portal http://smdc.sinp.msu.ru/ provides operational information on the radiation state of near-Earth space. Operational data come from the ELECTRO-L1 and Meteor-M2 space missions. High-resolution data on energetic electron fluxes from MSU's VERNOV satellite, with the RELEC instrumentation on board, are also available. Specific tools allow visual representation of the satellite orbit in 3D space simultaneously with particle flux variations. Concurrent operational data coming from other spacecraft (ACE, GOES, SDO) and from the Earth's surface (geomagnetic indices) are used to represent the geomagnetic and radiation state of the near-Earth environment. The internet portal http://swx.sinp.msu.ru provides real-time access to data characterizing the level of solar activity and the geomagnetic and radiation conditions in the heliosphere and the Earth's magnetosphere. Operational forecasting services automatically generate alerts on particle flux enhancements above threshold values, both for SEP and for relativistic electrons, using data from LEO and GEO orbits. Models of the space environment working in autonomous mode are used to generalize the information obtained from different missions to the whole magnetosphere. On-line applications built on these models provide short-term forecasting of SEP and relativistic electron fluxes at GEO and LEO, and online forecasting of the Dst and Kp indices up to 1.5 hours ahead. Velocities of high-speed solar wind streams at the Earth's orbit are estimated with a lead time of 3-4 days. A visualization system provides representation of experimental and modeling data in 2D and 3D.

  15. The Northern California Earthquake Data Center: Seismic and Geophysical Data for Northern California and Beyond

    NASA Astrophysics Data System (ADS)

    Neuhauser, D.; Klein, F.; Zuzlewski, S.; Gee, L.; Oppenheimer, D.; Romanowicz, B.

    2004-12-01

    The Northern California Earthquake Data Center (NCEDC) is an archive and distribution center for geophysical data from networks in northern and central California. The NCEDC provides time-series data from seismic, strain, electromagnetic, creep, tilt, and environmental sensors, and continuous and campaign GPS data in raw and RINEX formats. The NCEDC offers a wide variety of interfaces for data retrieval. Time-series data are available via a web interface and standard queued request methods such as NetDC (developed in collaboration with the IRIS DMC and other international data centers), BREQ_FAST, and EVT_FAST. Interactive data retrieval methods include STP, developed by the SCEDC, and the FISSURES DHI (Data Handling Interface), an object-oriented interface developed by IRIS. The Sandia MATSEIS system is being adapted to use the FISSURES DHI interface to provide an enhanced GUI-based seismic analysis system for MATLAB. Northern California and prototype ANSS worldwide earthquake catalogs are searchable from web interfaces, and supporting phase and amplitude data can be retrieved when available. Future data sets planned for the NCEDC include seismic and strain data from the EarthScope Plate Boundary Observatory (PBO) and SAFOD. The NCEDC is a joint project of the UC Berkeley Seismological Laboratory and USGS Menlo Park.

  16. Implications of the World Trade Center Health Program (WTCHP) for the Public Health Response to the Great East Japan Earthquake

    PubMed Central

    CRANE, Michael A.; CHO, Hyunje G.; LANDRIGAN, Phillip J.

    2013-01-01

    The attacks on the World Trade Center (WTC) on September 11, 2001 resulted in a serious burden of physical and mental illness for the 50,000 rescue workers that responded to 9/11 as well as the 400,000 residents and workers in the surrounding areas of New York City. The Zadroga Act of 2010 established the WTC Health Program (WTCHP) to provide monitoring and treatment of WTC exposure-related conditions and health surveillance for the responder and survivor populations. Several reports have highlighted the applicability of insights gained from the WTCHP to the public health response to the Great East Japan Earthquake. Optimal exposure monitoring processes and attention to the welfare of vulnerable exposed sub-groups are critical aspects of the response to both incidents. The ongoing mental health care concerns of 9/11 patients accentuate the need for accessible and appropriately skilled mental health care in Fukushima. Active efforts to demonstrate transparency and to promote community involvement in the public health response will be highly important in establishing successful long-term monitoring and treatment programs for the exposed populations in Fukushima. PMID:24317449

  17. Implications of the World Trade Center Health Program (WTCHP) for the public health response to the Great East Japan Earthquake.

    PubMed

    Crane, Michael A; Cho, Hyunje G; Landrigan, Phillip J

    2014-01-01

    The attacks on the World Trade Center (WTC) on September 11, 2001 resulted in a serious burden of physical and mental illness for the 50,000 rescue workers that responded to 9/11 as well as the 400,000 residents and workers in the surrounding areas of New York City. The Zadroga Act of 2010 established the WTC Health Program (WTCHP) to provide monitoring and treatment of WTC exposure-related conditions and health surveillance for the responder and survivor populations. Several reports have highlighted the applicability of insights gained from the WTCHP to the public health response to the Great East Japan Earthquake. Optimal exposure monitoring processes and attention to the welfare of vulnerable exposed sub-groups are critical aspects of the response to both incidents. The ongoing mental health care concerns of 9/11 patients accentuate the need for accessible and appropriately skilled mental health care in Fukushima. Active efforts to demonstrate transparency and to promote community involvement in the public health response will be highly important in establishing successful long-term monitoring and treatment programs for the exposed populations in Fukushima. PMID:24317449

  18. Monitoring of Ecological Restoration at the Central Quake-Hit Areas of Wenchuan Earthquake Using RS & GIS Remote Sensing

    NASA Astrophysics Data System (ADS)

    Zhang, Q.

    2014-12-01

    The 2008 Sichuan earthquake, which occurred on 12 May 2008 with a magnitude of 8.0 and an epicenter at Wenchuan (31.021°N, 103.367°E), not only caused a large number of human casualties and heavy property loss, but also severely damaged the ecological system in the surrounding 10 counties, threatening local ecological safety. As part of the post-disaster reconstruction services, a systematic monitoring of the ecological restoration at the central quake-hit areas has been made based on RS & GIS remote sensing. In this paper we selected the Dujiangyan area for analysis, because it lies about 40 km from the epicenter and, as a region in the subtropical monsoon climate zone, had a well-developed forest ecosystem in its northern part before the earthquake. The coverage of grassland in this region is relatively low. Since ecological restoration after the earthquake is a long-term process, the restoration of different vegetation types has different characteristics. From the analysis of the spatiotemporal change of land use and vegetation cover in the Dujiangyan area from 2008 (post-earthquake) to 2013, we found: (1) During the earthquake, the major vegetation type destroyed was woodland, which accounts for 99.34% of the destroyed area, followed by arable land and grassland. (2) Ecological restoration started from grassland and gradually transitioned to shrub. In the two years after the earthquake, the most significant increase in both coverage area and magnitude was in grassland; by 2013 the area of grassland had decreased slightly while the area of shrub increased, demonstrating a transition from grassland to shrub. (3) The vegetation-cover maps show that these changes occur mainly in the northern mountain area, while the changes in land use occurred mainly in the southern part of the city.
    These changes can be linked clearly with the earthquake disaster and with post-earthquake reconstruction human activities.
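    Vegetation-recovery monitoring of this kind is typically based on per-pixel spectral indices. A minimal sketch of NDVI differencing between two dates (tiny synthetic reflectance rasters stand in for the real imagery, and the 0.1 recovery threshold is an arbitrary illustration, not a value from the study):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red)."""
    nir, red = nir.astype(float), red.astype(float)
    return (nir - red) / (nir + red + 1e-9)   # small epsilon avoids 0/0

# Synthetic 2x2 reflectance rasters for two dates (values are made up):
nir_2008 = np.array([[0.20, 0.50], [0.30, 0.25]])
red_2008 = np.array([[0.15, 0.10], [0.20, 0.20]])
nir_2013 = np.array([[0.55, 0.52], [0.60, 0.26]])
red_2013 = np.array([[0.10, 0.10], [0.10, 0.20]])

change = ndvi(nir_2013, red_2013) - ndvi(nir_2008, red_2008)
recovered = change > 0.1    # pixels showing substantial vegetation recovery
```

    Applied to co-registered scenes over a time series, per-class statistics of such change maps yield the grassland-to-shrub transition trends the abstract describes.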

  19. A new Automatic Phase Picker for the National Earthquake Information Center

    NASA Astrophysics Data System (ADS)

    Earle, P. S.; Buland, R.

    2002-12-01

    The increasing need for rapid, accurate earthquake locations for timely notification and damage assessment has placed greater demands on automatic phase-picking technology. We are developing a new automatic phase picker for use by the National Earthquake Information Center (NEIC). Since the NEIC provides rapid notification for all felt earthquakes in the US and significant events worldwide, the picking algorithm must provide accurate arrival times for the wide range of waveforms generated by local, regional, and teleseismic events. The current picker applies a Short-Term-Average over Long-Term-Average (STA/LTA) algorithm to vertical-component records that have been narrow-band filtered into two data streams with peaks at 1.5 Hz and 3.0 Hz. The use of this relatively high-frequency narrow-band data provides accurate arrival-time estimates: the travel-time residuals for 10,000 teleseismic P-wave picks have a spread (scaled median average deviation) of 1.3 seconds, similar to the spread of human-made picks. Additionally, at these high frequencies teleseismic picks are generally limited to compressional waves, which aids identification of arrival type and therefore simplifies the association of picks to events. Although the current picker works well, plans to improve the accuracy, reliability, and detection threshold of automatic locations require the picking of secondary phases and analysis of a larger frequency band. Several previous studies have presented picking methods, but few published studies test them on numerous seismograms selected from a wide range of distances and magnitudes. Published techniques include STA/LTA, auto-regressive, cross-correlation, and neural-network methods. We will present comparisons of several methods and discuss their fitness for implementation on our real-time system. Preference will be given to methods that provide the most reliable and accurate earthquake locations, not necessarily those that best reproduce human picks.
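    The STA/LTA trigger at the core of the current picker is simple to state: the ratio of short-term to long-term average signal energy spikes when an impulsive arrival enters the short window. A minimal sketch on a synthetic trace (the window lengths and the trigger threshold of 4 are illustrative, not the NEIC's operational values):

```python
import numpy as np

def sta_lta(x, nsta, nlta):
    """STA/LTA ratio of signal energy, defined where both trailing windows are full."""
    e = np.asarray(x, dtype=float) ** 2
    c = np.concatenate(([0.0], np.cumsum(e)))        # cumulative energy for O(1) windows
    ratio = np.zeros(len(e))
    for i in range(nlta - 1, len(e)):
        sta = (c[i + 1] - c[i + 1 - nsta]) / nsta    # short-term average energy
        lta = (c[i + 1] - c[i + 1 - nlta]) / nlta    # long-term average energy
        ratio[i] = sta / lta
    return ratio

# Synthetic record: background noise with an impulsive arrival at sample 600.
rng = np.random.default_rng(3)
trace = rng.normal(0.0, 0.1, 1000)
trace[600:700] += np.sin(2 * np.pi * 0.1 * np.arange(100))  # 10-samples-per-cycle burst

ratio = sta_lta(trace, nsta=20, nlta=200)
trigger = int(np.argmax(ratio > 4.0))   # first sample where the trigger fires
```

    In practice this runs on the narrow-band filtered streams, and the pick time is refined around the trigger sample.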

  20. Geomorphic and seismic coupled monitoring of post-earthquake subsurface weakening.

    NASA Astrophysics Data System (ADS)

    Marc, Odin; Sawazaki, Kaoru; Sens-Schönfelder, Christoph; Hovius, Niels; Meunier, Patrick; Uchida, Taro

    2014-05-01

    We present integrated geomorphic data constraining elevated landslide rates following three shallow continental earthquakes: the Mw 6.9 Finisterre (1993), the Mw 7.6 Chi-Chi (1999), and the Mw 6.8 Iwate-Miyagi (2008) earthquakes. We have constrained the magnitude and decay time of the seismically enhanced landslide rates and investigated the mechanism behind this prolonged geomorphic response. We provide evidence ruling out aftershocks and rain forcing as possible mechanisms and identify substrate weakening as a likely cause. We have used ambient noise autocorrelation to monitor subsurface seismic velocity within the earthquake epicentral areas. Observed station response patterns are diverse, illustrating potential lithological or other local effects. However, some stations were strongly affected by the earthquake in relatively high frequency ranges (1-2 and 2-4 Hz), which may be related to shallow subsurface change. At several stations we have found a velocity drop followed by a recovery over several years, in fair agreement with the recovery time of landslide rates in the area. This prompts a search for common processes that alter the strength of the topmost layers of soil and rock in epicentral areas, simultaneously driving a landslide-rate increase and a seismic velocity drop. This search requires additional constraints on the interpretation of the seismic signal. It may yield a useful tool for post-earthquake risk management.
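    Velocity changes from noise correlations of this kind are often quantified with the stretching method: the current correlation function is resampled at stretched times, and the stretch factor that maximizes similarity with a reference estimates the relative velocity change. A self-contained sketch on a synthetic coda (the 1% velocity change and the search grid are illustrative assumptions):

```python
import numpy as np

def stretching_dvv(ref, cur, t, eps_grid):
    """Return the stretch factor eps (and its correlation) maximizing similarity
    between the reference trace and the current trace resampled at t*(1+eps).
    With this setup, a velocity drop delays the coda and yields positive eps."""
    best_eps, best_cc = 0.0, -np.inf
    for eps in eps_grid:
        stretched = np.interp(t * (1.0 + eps), t, cur)
        cc = np.corrcoef(ref, stretched)[0, 1]
        if cc > best_cc:
            best_eps, best_cc = eps, cc
    return best_eps, best_cc

# Synthetic coda: a decaying oscillation; the "current" trace is the same
# wavelet with all arrival times dilated by ~1% (mimicking a velocity drop).
t = np.linspace(0.0, 10.0, 2000)
wavelet = lambda tau: np.sin(2 * np.pi * 1.0 * tau) * np.exp(-0.2 * tau)
ref = wavelet(t)
cur = wavelet(t * 0.99)

eps, cc = stretching_dvv(ref, cur, t, np.linspace(-0.05, 0.05, 201))
```

    Tracking eps through time at a station yields the velocity-drop-and-recovery curves that the abstract compares against landslide-rate decay.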

  1. Basin-centered asperities in great subduction zone earthquakes: A link between slip, subsidence, and subduction erosion?

    USGS Publications Warehouse

    Wells, R.E.; Blakely, R.J.; Sugiyama, Y.; Scholl, D. W.; Dinterman, P.A.

    2003-01-01

    Published areas of high coseismic slip, or asperities, for 29 of the largest Circum-Pacific megathrust earthquakes are compared to forearc structure revealed by satellite free-air gravity, bathymetry, and seismic profiling. On average, 71% of an earthquake's seismic moment and 79% of its asperity area occur beneath the prominent gravity low outlining the deep-sea terrace; 57% of an earthquake's asperity area, on average, occurs beneath the forearc basins that lie within the deep-sea terrace. In SW Japan, slip in the 1923, 1944, 1946, and 1968 earthquakes was largely centered beneath five forearc basins whose landward edge overlies the 350 °C isotherm on the plate boundary, the inferred downdip limit of the locked zone. Basin-centered coseismic slip also occurred along the Aleutian, Mexico, Peru, and Chile subduction zones but was ambiguous for the great 1964 Alaska earthquake. Beneath intrabasin structural highs, seismic slip tends to be lower, possibly due to higher temperatures and fluid pressures. Kilometers of late Cenozoic subsidence and crustal thinning above some of the source zones are indicated by seismic profiling and drilling and are thought to be caused by basal subduction erosion. The deep-sea terraces and basins may evolve not just by growth of the outer arc high but also by interseismic subsidence not recovered during earthquakes. Basin-centered asperities could indicate a link between subsidence, subduction erosion, and seismogenesis. Whatever the cause, forearc basins may be useful indicators of long-term seismic moment release. The source zone for Cascadia's 1700 A.D. earthquake contains five large, basin-centered gravity lows that may indicate potential asperities at depth. The gravity gradient marking the inferred downdip limit to large coseismic slip lies offshore, except in northwestern Washington, where the low extends landward beneath the coast.
Transverse gravity highs between the basins suggest that the margin is seismically segmented and could produce a variety of large earthquakes. Published in 2003 by the American Geophysical Union.

  2. Infrasonic monitoring of UGTs and earthquakes for discrimination

    SciTech Connect

    Whitaker, R.W.; Noel, S.; Mutschlecner, J.P.; Davidson, M.

    1992-01-01

    Over the last several years, as part of Los Alamos verification activity, low frequency acoustics measurements have been made of underground nuclear tests (UGTs). The measurements have been made with arrays at St. George, Utah, and Los Alamos, New Mexico; both arrays operate continuously, and many earthquakes (EQs) have been recorded as well. The general frequency range is 0.1 to 10 Hz, in the infrasonic domain. In this domain the atmospheric signals are still longitudinal pressure waves. Propagation for frequencies around 1 Hz is excellent with little excess attenuation over simple geometric spreading. Measured peak-to-peak pressure levels range from 0.1 to 60 ”bars, where one bar is normal atmospheric pressure. We employ standard array processing techniques (beamforming) to derive the usual outputs of correlation coefficient, trace velocity, duration, power spectrum, and azimuth for sequential windows of data. Undesired signals can be subtracted from the beam, and frequency filtering can be used to improve signal-to-noise in desired passbands.
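
    The beamforming step that yields trace velocity and azimuth can be sketched as a delay-and-sum grid search (integer-sample shifts for simplicity; a production system would use fractional delays or frequency-domain steering):

```python
import numpy as np

def delay_and_sum(traces, coords, dt, azimuths_deg, speeds):
    """Find the plane-wave propagation azimuth (direction of travel,
    degrees clockwise from north) and trace velocity that maximize
    beam power. coords are sensor (east, north) offsets in meters."""
    best = (-np.inf, None, None)
    for az in azimuths_deg:
        a = np.radians(az)
        p = np.array([np.sin(a), np.cos(a)])       # propagation direction
        for v in speeds:
            delays = coords @ p / v                # arrival delay, seconds
            shifts = np.round(delays / dt).astype(int)
            beam = np.zeros(traces.shape[1])
            for tr, s in zip(traces, shifts):
                beam += np.roll(tr, -s)            # undo each sensor's delay
            power = np.mean(beam ** 2)
            if power > best[0]:
                best = (power, az, v)
    return best[1], best[2]
```

    The back-azimuth to the source is the propagation azimuth plus 180 degrees; correlation coefficient and the other window outputs mentioned above would be computed from the aligned traces.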

  3. Infrasonic monitoring of UGTs and earthquakes for discrimination

    SciTech Connect

    Whitaker, R.W.; Noel, S.; Mutschlecner, J.P.; Davidson, M.

    1992-08-01

    Over the last several years, as part of Los Alamos verification activity, low frequency acoustics measurements have been made of underground nuclear tests (UGTs). The measurements have been made with arrays at St. George, Utah, and Los Alamos, New Mexico; both arrays operate continuously, and many earthquakes (EQs) have been recorded as well. The general frequency range is 0.1 to 10 Hz, in the infrasonic domain. In this domain the atmospheric signals are still longitudinal pressure waves. Propagation for frequencies around 1 Hz is excellent with little excess attenuation over simple geometric spreading. Measured peak-to-peak pressure levels range from 0.1 to 60 ”bars, where one bar is normal atmospheric pressure. We employ standard array processing techniques (beamforming) to derive the usual outputs of correlation coefficient, trace velocity, duration, power spectrum, and azimuth for sequential windows of data. Undesired signals can be subtracted from the beam, and frequency filtering can be used to improve signal-to-noise in desired passbands.

  4. The German Task Force for Earthquakes - A temporary network aftershock monitoring

    NASA Astrophysics Data System (ADS)

    Sobiesiak, M.; Eggert, S.; Grosser, H.; Hainzl, S.; GĂŒnther, E.

    2009-04-01

    The German Task Force for Earthquakes (GTF) is an interdisciplinary group for immediate response to disastrous earthquakes, with the aim of monitoring post-seismic processes and assessing the impact of the seismic event on the disaster-stricken area. For accomplishing this task, 20 short-period seismic stations for aftershock monitoring and 10 strong-motion instruments for engineering seismology are available exclusively for the use of the GTF. Furthermore, the GTF is equipped with tools for hydro-geological investigations and has 6 GPS instruments at hand for studying post-seismic deformation. Geological, sociological, and remote sensing expertise is provided by a number of scientists at the GFZ or other national and international universities and organisations. We would like to present a variety of results achieved in the 20 missions we have been able to conduct to date. This will give an overview of the scientific opportunities which lie in collecting and investigating high-resolution local earthquake data. In future, online data transmission is envisaged to allow aftershock hazard assessment to benefit the work of rescue teams and local authorities in the area concerned. In general, decision making at the beginning of a Task Force activity requires a system of fast dissemination of earthquake information, which in our case is provided by GEOFON.

  5. Novel Algorithms Enabling Rapid, Real-Time Earthquake Monitoring and Tsunami Early Warning Worldwide

    NASA Astrophysics Data System (ADS)

    Lomax, A.; Michelini, A.

    2012-12-01

    We have recently introduced new methods to determine rapidly the tsunami potential and magnitude of large earthquakes (e.g., Lomax and Michelini, 2009ab, 2011, 2012). To validate these methods we have implemented them, along with other new algorithms, within the Early-est earthquake monitor at INGV-Rome (http://early-est.rm.ingv.it, http://early-est.alomax.net). Early-est is a lightweight software package for real-time earthquake monitoring (including phase picking, phase association and event detection, location, magnitude determination, first-motion mechanism determination, ...), and for tsunami early warning based on discriminants for earthquake tsunami potential. In a simulation using archived broadband seismograms for the devastating M9 2011 Tohoku earthquake and tsunami, Early-est determines: the epicenter within 3 min after the event origin time, discriminants showing very high tsunami potential within 5-7 min, and magnitude Mwpd(RT) 9.0-9.2 and a correct shallow-thrusting mechanism within 8 min. Real-time monitoring with Early-est gives similar results for most large earthquakes using currently available, real-time seismogram data. Here we summarize some of the key algorithms within Early-est that enable rapid, real-time earthquake monitoring and tsunami early warning worldwide: >>> FilterPicker - a general purpose, broad-band, phase detector and picker (http://alomax.net/FilterPicker); >>> Robust, simultaneous association and location using a probabilistic, global-search; >>> Period-duration discriminants TdT0 and TdT50Ex for tsunami potential available within 5 min; >>> Mwpd(RT) magnitude for very large earthquakes available within 10 min; >>> Waveform P polarities determined on broad-band displacement traces, focal mechanisms obtained with the HASH program (Hardebeck and Shearer, 2002); >>> SeisGramWeb - a portable-device ready seismogram viewer using web-services in a browser (http://alomax.net/webtools/sgweb/info.html).
References (see also: http://alomax.net/pub_list.html): Lomax, A. and A. Michelini (2012), Tsunami early warning within 5 minutes, Pure and Applied Geophysics, 169, nnn-nnn, doi: 10.1007/s00024-012-0512-6. Lomax, A. and A. Michelini (2011), Tsunami early warning using earthquake rupture duration and P-wave dominant period: the importance of length and depth of faulting, Geophys. J. Int., 185, 283-291, doi: 10.1111/j.1365-246X.2010.04916.x. Lomax, A. and A. Michelini (2009b), Tsunami early warning using earthquake rupture duration, Geophys. Res. Lett., 36, L09306, doi:10.1029/2009GL037223. Lomax, A. and A. Michelini (2009a), Mwpd: A Duration-Amplitude Procedure for Rapid Determination of Earthquake Magnitude and Tsunamigenic Potential from P Waveforms, Geophys. J. Int.,176, 200-214, doi:10.1111/j.1365-246X.2008.03974.x

  6. Disasters; the 2010 Haitian earthquake and the evacuation of burn victims to US burn centers.

    PubMed

    Kearns, Randy D; Holmes, James H; Skarote, Mary Beth; Cairns, Charles B; Strickland, Samantha Cooksey; Smith, Howard G; Cairns, Bruce A

    2014-09-01

    Response to the 2010 Haitian earthquake included an array of diverse yet critical actions. This paper briefly reviews the evacuation of a small group of patients with burns to burn centers in the southeastern United States (US). This particular evacuation brought together, for the first time, plans, groups, and organizations that had previously only exercised this process. The response to the Haitian earthquake was a glimpse of what the international community working together can do to help others and relieve suffering following a catastrophic disaster. The international response was substantial. This paper traces one evacuation, on one day, of one unique group of patients with burns to burn centers in the US and reviews the lessons learned from this process. The patient population with burns being evacuated from Haiti was very small compared to the overall operation. Nevertheless, the outcomes included a better understanding of how a larger event could challenge the limited resources of all involved. This paper includes aspects of the patient movement and the logistics needed, and briefly discusses reimbursement for the care provided. PMID:24411582

  7. Illnesses and injuries reported at Disaster Application Centers following the 1994 Northridge Earthquake.

    PubMed

    Teeter, D S

    1996-09-01

    The 1994 Northridge, California, earthquake caused extensive structural damage and disrupted lives for thousands of residents. Local resources treated those initially injured. Many victims were unable or unwilling to reenter their dwellings. Record numbers of victims spent many hours at Disaster Application Centers (DACs) applying for financial assistance and other services. This created a concern for the provision of primary health care services at these centers. Under the Federal Response Plan, registered nurses, nurse practitioners, and physician assistants from the Department of Veterans Affairs treated 17,883 patients at the DACs. This report documents the injuries and illnesses sustained by the public and response workers at the DACs. The findings demonstrate that this care eased the burden on the local health care system. This article illustrates applications for estimating health services needs and demands at similar mass gatherings that might be experienced in response to catastrophic events and in U.S. military operations involving humanitarian relief missions. PMID:8840792

  8. New approach for earthquake/tsunami monitoring using dense GPS networks

    PubMed Central

    Li, Xingxing; Ge, Maorong; Zhang, Yong; Wang, Rongjiang; Xu, Peiliang; Wickert, Jens; Schuh, Harald

    2013-01-01

    In recent times, increasing numbers of high-rate GPS stations have been installed around the world and set up to provide data in real time. These networks provide a great opportunity to quickly capture surface displacements, which makes them important as potential constituents of earthquake/tsunami monitoring and warning systems. Appropriate real-time GPS data analysis with sufficient accuracy for this purpose is a main focus of current GPS research. In this paper we propose an augmented point positioning method for GPS-based hazard monitoring, which can achieve fast or even instantaneous precise positioning without relying on data from a specific reference station. The proposed method overcomes the limitations of the most commonly used GPS processing approaches, relative positioning and global precise point positioning. The advantages of the proposed approach are demonstrated using GPS data recorded during the 2011 Tohoku-Oki earthquake in Japan. PMID:24045328

  9. Quantifying 10 years of Improvements in Earthquake and Tsunami Monitoring in the Caribbean and Adjacent Regions

    NASA Astrophysics Data System (ADS)

    von Hillebrandt-Andrade, C.; Huerfano Moreno, V. A.; McNamara, D. E.; Saurel, J. M.

    2014-12-01

    The magnitude-9.3 Sumatra-Andaman Islands earthquake of December 26, 2004, increased global awareness of the destructive hazard posed by earthquakes and tsunamis. Post-event assessments of global coastline vulnerability highlighted the Caribbean as a poorly monitored region of high hazard and risk. Nearly 100 tsunamis have been reported for the Caribbean and adjacent regions in the past 500 years, and they continue to pose a threat to its nations, coastal areas along the Gulf of Mexico, and the Atlantic seaboard of North and South America. Significant efforts to improve monitoring capabilities have been undertaken since then, including an expansion of the United States Geological Survey (USGS) Global Seismographic Network (GSN) (McNamara et al., 2006) and establishment of the United Nations Educational, Scientific and Cultural Organization (UNESCO) Intergovernmental Coordination Group (ICG) for the Tsunami and other Coastal Hazards Warning System for the Caribbean and Adjacent Regions (CARIBE EWS). The minimum performance standards it recommended for initial earthquake locations include: 1) earthquake detection within 1 minute, 2) a minimum magnitude threshold of M4.5, and 3) initial hypocenter error of <30 km. In this study, we assess current compliance with these performance standards and model improvements in earthquake and tsunami monitoring capabilities in the Caribbean region since the first meeting of the UNESCO ICG-Caribe EWS in 2006. The three measures of network capability modeled in this study are: 1) minimum Mw detection threshold; 2) P-wave detection time of an automatic processing system; and 3) theoretical earthquake location uncertainty. By modeling these three measures of seismic network capability, we can optimize the distribution of ICG-Caribe EWS seismic stations and select an international network to be contributed from existing real-time broadband national networks in the region.
Sea level monitoring improvements both offshore and along the coast will also be addressed. With the support of Member States and other countries and organizations it has been possible to significantly expand the sea level network thus reducing the amount of time it now takes to verify tsunamis.
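
    The first capability measure, a minimum-magnitude detection threshold map, is often computed by asking what magnitude is needed for a given number of stations to see the event above their noise. A toy sketch under stated assumptions (a Richter-style distance correction Ml = log10(A) + 2.56*log10(d) - 1.67, flat geometry, and a per-station noise floor; none of these are the CARIBE EWS model):

```python
import numpy as np

def min_detectable_ml(grid_pt, station_xy, station_noise_logA, nmin=4):
    """Smallest local magnitude detectable at a grid point, defined as the
    magnitude whose amplitude exceeds the noise floor at nmin stations.
    Coordinates and distances are in km; station_noise_logA is log10 of
    the minimum amplitude each station can pick out of its noise."""
    d = np.hypot(*(np.asarray(station_xy, float) - grid_pt).T)
    # magnitude each station requires before it can detect the event
    ml_required = station_noise_logA + 2.56 * np.log10(np.maximum(d, 1.0)) - 1.67
    return np.sort(ml_required)[nmin - 1]
```

    Mapping this function over a grid, and repeating with candidate station sets, is one way to compare network geometries against a target threshold such as M4.5.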

  10. Network of seismo-geochemical monitoring observatories for earthquake prediction research in India

    NASA Astrophysics Data System (ADS)

    Chaudhuri, Hirok; Barman, Chiranjib; Iyengar, A.; Ghose, Debasis; Sen, Prasanta; Sinha, Bikash

    2013-08-01

    This paper briefly reviews the research carried out to develop multi-parametric gas-geochemical monitoring facilities dedicated to earthquake prediction research in India, through the installation of a network of seismo-geochemical monitoring observatories in different regions of the country. In an attempt to detect earthquake precursors, the concentrations of helium, argon, nitrogen, methane, radon-222 (222Rn), polonium-218 (218Po), and polonium-214 (214Po) emanating from hydrothermal systems are monitored continuously, around the clock, at these observatories. In this paper, we make a cross-correlation study of a number of geochemical anomalies recorded at these observatories. With the data received from each of the observatories, we attempt a time series analysis relating anomaly magnitude and epicentral distance through statistical methods and empirical formulations that relate the area of influence to earthquake magnitude. Application of linear and nonlinear statistical techniques to the recorded geochemical data sets reveals a clear signature of long-range correlation in the data.
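
    At its simplest, a cross-correlation study of two anomaly time series reduces to finding the lag of peak normalized correlation. A minimal sketch (globally standardized series, sample-count lags):

```python
import numpy as np

def max_corr_lag(x, y, max_lag):
    """Lag (in samples) at which the normalized cross-correlation of two
    standardized series peaks; a positive lag means y follows x."""
    x = (np.asarray(x, float) - np.mean(x)) / np.std(x)
    y = (np.asarray(y, float) - np.mean(y)) / np.std(y)
    lags = np.arange(-max_lag, max_lag + 1)
    cc = [np.mean(x[max(0, -l):len(x) - max(0, l)] *
                  y[max(0, l):len(y) - max(0, -l)]) for l in lags]
    k = int(np.argmax(cc))
    return int(lags[k]), cc[k]
```

    For daily-averaged geochemical records, the lag in days between, say, a radon anomaly and a helium anomaly is one input to the kind of station-to-station comparison described above.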

  11. Earthquake monitoring of eastern Washington: Annual technical report, 1987

    SciTech Connect

    Malone, S.

    1987-10-01

    This report covers operations and research on the seismicity and structure of eastern Washington and northeastern Oregon for the year from July 1, 1986, to June 30, 1987. The report covers the operation of the regional seismograph network, including station maintenance and calibration, data processing, and operational problems. A detailed description of the current network, including telemetry routing through the BPA microwave system, is included. This section also details a new VCO system developed to improve the center-frequency stability and reliability of our field stations. The seismicity of the past year and a description of the catalog are also covered.

  12. Monitoring of Earthquake Disasters by Satellite Radio Tomography

    NASA Astrophysics Data System (ADS)

    Kunitsyn, V.; Andreeva, E.; Nesterov, I.; Rekenthaler, D. A.

    2011-12-01

    This work addresses lithospheric-ionospheric coupling during strong earthquakes (EQs). Particular interest is placed on the physical phenomena preceding EQs - the precursors. We discuss both the ionospheric implications of EQs and the ionospheric precursors to EQs. The requisite ionospheric sounding is carried out using satellite navigational system data; the data are analyzed using the methods of satellite radio tomography (RT). Signals from both low-orbiting beacons (Transit, Tsikada, etc.) and high-orbiting global navigational satellite systems (GNSS, including GPS and GLONASS) are used. The resulting 2D and 3D tomographic images and their time flow (4D RT) make it possible to study the spatiotemporal structure of ionospheric perturbations induced by EQs and EQ precursors, and to distinguish ionospheric responses to processes of EQ preparation against the effects of other factors. Low-orbital RT (LORT) provides almost "instantaneous" (with a time span of 5-8 min) 2D snapshots of the electron density over the seismically active region of interest. LORT allows 2D imaging of various anomalies, including wave structures such as ionospheric manifestations of acoustic-gravity waves (AGW), wave-like disturbances, and solitary waves, with gaps between images that depend on the number of operating satellites (currently 30-100 minutes). High-orbital RT (HORT) is capable of imaging 4D distributions of ionospheric plasma (resulting in 3D snapshots every 20-30 minutes). Using this approach, one can reconstruct RT images of ionospheric irregularities, wave structures, and perturbations such as solitary waves. In regions with a sufficient number of GNSS receivers (California, Japan), 4D RT images can be generated every 2-4 minutes. The spatial resolution of LORT and HORT systems is on the order of 20-40 km and 100 km, respectively. 
The combination of LORT and HORT systems has the potential for exploiting data provided by other experimental techniques, including radio occultation, ionosonde, and radar measurements, among others. Further integration of RT systems with other multi-instrumental observations of EQ-related phenomena is possible. We present the results of long-term RT studies of the ionosphere over California, Alaska, and Southeast Asia (the Taiwan region). We used the experimental data from the LORT systems in Alaska and Taiwan. At present, the LORT system in California is being put into operation. The input for HORT imaging was data from IGS, UNAVCO, and Japan GPS network stations. A variety of examples are given to illustrate the ionospheric perturbations associated with EQs and the EQ-related ionospheric precursors, including specific ionospheric disturbances, AGWs, and solitary-wave-like perturbations. Several dozen precursors were identified from the results of many years of RT studies in Alaska and the Taiwan region during the period 2006-2008. We discuss the results of a HORT analysis of a series of recent EQs, including San Simeon (2003), Parkfield (2004), Sumatra (2004), Sichuan (China, 2008), Haiti (2010), Chile (2010), Japan (Tohoku, 2011), and other events. We are grateful to Dr. L.-C. Tsai and Northwest Research Associates, Inc., for providing raw RT data for Taiwan and Alaska.

  13. Real-time seismic monitoring of the integrated cape girardeau bridge array and recorded earthquake response

    USGS Publications Warehouse

    Celebi, M.

    2006-01-01

    This paper introduces the state-of-the-art, real-time, broadband seismic monitoring network implemented for the 1206 m [3956 ft] long, cable-stayed Bill Emerson Memorial Bridge in Cape Girardeau (MO), a new Mississippi River crossing approximately 80 km from the epicentral region of the 1811-1812 New Madrid earthquakes. The bridge was designed for a strong earthquake (magnitude 7.5 or greater) during its design life. The monitoring network comprises a total of 84 channels of accelerometers deployed on the superstructure, pier foundations, and at surface and downhole free-field arrays of the bridge. The paper also presents the high-quality response data obtained from the network. These data are intended for use by the owner, researchers, and engineers to assess the performance of the bridge, to check design parameters, including the comparison of dynamic characteristics with actual response, and to better design future similar bridges. Preliminary analyses of ambient and low-amplitude small-earthquake data reveal specific response characteristics of the bridge and the free field. There is evidence of coherent tower, cable, and deck interaction that sometimes results in amplified ambient motions. Motions at the lowest triaxial downhole accelerometers on both the MO and IL sides are practically free from any feedback from the bridge. Motions at the mid-level and surface downhole accelerometers are influenced significantly by feedback due to amplified ambient motions of the bridge. Copyright ASCE 2006.

  14. Two Decades of Seismic Monitoring by WEBNET: Disclosing a Lifecycle of an Earthquake Swarm Zone

    NASA Astrophysics Data System (ADS)

    Fischer, T.; Horalek, J.; Cermakova, H.; Michalek, J.; Doubravova, J.; Bouskova, A.; Bachura, M.

    2014-12-01

    The area of West Bohemia/Vogtland in the western Eger Rift is typified by earthquake swarm activity with maximum magnitudes not exceeding ML 5. The seismicity is dominated by the area near Novy Kostel, where earthquakes cluster along a narrow, steeply dipping focal zone of 8 km length that strikes about N-S in the depth range 7-11 km. Detailed seismic monitoring has been carried out by the WEBNET seismic network since 1992. During that period, earthquake swarms with several mainshocks exceeding magnitude ML 3 took place in 2000, 2008, and 2011. These swarms were characterized by an episodic character, with the activity of individual episodes overlapping in time and space. Interestingly, the rate of activity increased with each subsequent swarm, the 2000 swarm being the slowest and the 2011 swarm the most rapid. In 2014 the character of seismicity changed from swarm-like activity to mainshock-aftershock activity. Three mainshocks have already occurred since May 2014: the ML 3.6 event of May 24, the ML 4.5 event of May 31, and the ML 3.5 event of August 3. All these events were followed by short aftershock sequences of one to four days' duration. All three events exceeded the following aftershocks by more than one magnitude unit, and none of these mainshocks were preceded by foreshocks, which differentiates this activity from the preceding swarm seismicity. Interestingly, the hypocenters of the mentioned earthquake swarms and mainshock-aftershock sequences share a common fault zone and overlap significantly. We present a detailed analysis of precise hypocenter locations and statistical characteristics of the activity in order to find the origin of the differing behavior of the seismic activity, which results in either earthquake swarms or mainshock-aftershock sequences.
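
    One standard statistical characteristic for comparing swarm and mainshock-aftershock episodes is the Gutenberg-Richter b-value. The Aki/Utsu maximum-likelihood estimator, with the usual half-bin correction for a magnitude-binned catalog, is only a few lines (the 0.1 binning width is a common catalog convention, assumed here for illustration):

```python
import numpy as np

def b_value(mags, mc, dm=0.1):
    """Aki/Utsu maximum-likelihood b-value for events with M >= mc,
    where mc is the completeness magnitude and dm the binning width."""
    m = np.asarray(mags, float)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))
```
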

  15. Cloud-based systems for monitoring earthquakes and other environmental quantities

    NASA Astrophysics Data System (ADS)

    Clayton, R. W.; Olson, M.; Liu, A.; Chandy, M.; Bunn, J.; Guy, R.

    2013-12-01

    There are many advantages to using a cloud-based system to record and analyze environmental quantities such as earthquakes, radiation, various gases, dust, and meteorological parameters. These advantages include robustness, dynamic scalability, and reduced costs. In this paper, we present our experiences over the last three years in developing a cloud-based earthquake monitoring system (the Community Seismic Network). This network consists of over 600 sensors (accelerometers) in the Southern California region that send data directly to the Google App Engine, where they are analyzed. The system is capable of handling many other types of sensor data and generating a situation-awareness analysis as a product. Other advantages of the cloud-based system are integration with peer networks and the ability to deploy anywhere in the world without having to build additional computing infrastructure.
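
    In architectures of this kind, each sensor host typically screens its own waveform and uploads only compact event reports, keeping per-sensor bandwidth low. A hypothetical sketch of such a client-side report (the threshold logic and all field names are illustrative, not the actual Community Seismic Network API):

```python
import json
import numpy as np

def pick_message(sensor_id, acc_g, t0, rate_hz, thresh_g=0.02):
    """If the record exceeds thresh_g, build a compact JSON 'pick' message
    for upload to a cloud ingest endpoint; return None otherwise."""
    acc = np.asarray(acc_g, float)
    i = int(np.argmax(np.abs(acc)))
    if abs(acc[i]) < thresh_g:
        return None                        # nothing worth reporting
    return json.dumps({
        "sensor": sensor_id,
        "pick_time": t0 + i / rate_hz,     # epoch seconds of the peak
        "peak_acc_g": round(float(acc[i]), 4),
    })
```

    The cloud side would then associate picks from many sensors into events, which is where the elasticity of the platform matters most during a large earthquake.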

  16. Search for earthquake precursors in multidisciplinary data monitoring of geophysical and biological parameters

    NASA Astrophysics Data System (ADS)

    Sidorin, A. Ya.

    Short-term variations in sets of geophysical and biological parameters that were monitored at the Garm research site over a long period are considered in relation to an earthquake of magnitude M = 5.3. Daily average data of electrical resistivity, electrotelluric field, electrochemical potential, and water conductivity, and hourly average data of the electrical activity of weakly electric fishes, were used. All the data, including bioindicator activity, were obtained by high-precision instrumental methods. Contemporaneous disturbances of all the geoelectrical parameters were discovered when observations were carried out directly in the epicentral zone of the impending earthquake. At distances of about 20-30 km from the epicenter, short-term precursors were not found.
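
    Screening daily-average series of this kind for short-term anomalies often starts with a simple trailing-window z-score test; a minimal sketch (window length and threshold are illustrative, not the values used at Garm):

```python
import numpy as np

def flag_anomalies(series, win=30, z_thresh=3.0):
    """Mark samples deviating more than z_thresh standard deviations
    from the mean of the preceding win samples -- a simple first screen
    for short-term anomalies in a slowly varying monitored parameter."""
    x = np.asarray(series, float)
    flags = np.zeros(len(x), dtype=bool)
    for i in range(win, len(x)):
        mu, sd = x[i - win:i].mean(), x[i - win:i].std()
        flags[i] = sd > 0 and abs(x[i] - mu) > z_thresh * sd
    return flags
```

    Real precursor searches then have to separate such flags from meteorological and instrumental effects, which is why the multi-parameter, multi-station comparison described above matters.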

  17. Recorded earthquake responses from the integrated seismic monitoring network of the Atwood Building, Anchorage, Alaska

    USGS Publications Warehouse

    Celebi, M.

    2006-01-01

    An integrated seismic monitoring system with a total of 53 channels of accelerometers is now operating in, and at the nearby free-field site of, the 20-story steel-framed Atwood Building in highly seismic Anchorage, Alaska. The building has a single-story basement and a reinforced concrete foundation without piles. The monitoring system comprises a 32-channel structural array and a 21-channel site array. Accelerometers are deployed on 10 levels of the building to assess translational, torsional, and rocking motions, interstory drift (displacement) between selected pairs of adjacent floors, and average drift between floors. The site array, located approximately a city block from the building, comprises seven triaxial accelerometers, one at the surface and six in boreholes ranging in depth from 15 to 200 feet (approximately 5-60 meters). The arrays have already recorded low-amplitude shaking responses of the building and the site caused by numerous earthquakes at distances ranging from tens to a couple of hundred kilometers. Data from an earthquake that occurred 186 km away trace the propagation of waves from the deepest borehole to the roof of the building in approximately 0.5 seconds. Fundamental structural frequencies [0.58 Hz (NS) and 0.47 Hz (EW)], low damping percentages (2-4%), mode coupling, and beating effects are identified. The fundamental site frequency of approximately 1.5 Hz is close to the second modal frequencies (1.83 Hz NS and 1.43 Hz EW) of the building, which may cause resonance of the building. Additional earthquakes demonstrate the repeatability of these characteristics; however, stronger shaking may alter these conclusions. © 2006, Earthquake Engineering Research Institute.
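
    Modal frequencies like the 0.58 Hz NS fundamental quoted above are routinely extracted from ambient-vibration records as the peak of the amplitude spectrum; a minimal sketch (peak-picking only, without the windowing and averaging a production analysis would add):

```python
import numpy as np

def dominant_frequency(record, dt):
    """Dominant frequency (Hz) of a record from the peak of its one-sided
    amplitude spectrum, with the mean removed and the DC bin excluded."""
    spec = np.abs(np.fft.rfft(record - np.mean(record)))
    freqs = np.fft.rfftfreq(len(record), dt)
    return freqs[1:][np.argmax(spec[1:])]
```
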

  18. Feasibility of Acoustic Monitoring of Strength Drop Precursory to Earthquake Occurrence

    NASA Astrophysics Data System (ADS)

    Kame, N.; Nagata, K.; Nakatani, M.; Kusakabe, T.

    2014-12-01

    Rate- and state-dependent friction law (RSF), proposed on the basis of laboratory experiments, has been extensively applied to modeling of earthquake stick-slip cycles. A simple spring-slider model obeying RSF predicts a significant decrease of the frictional strength Ί (the state of contact) that is localized within a few years preceding the earthquake occurrence. On the other hand, recent laboratory experiments successfully monitored the history of the strength by simultaneously measuring the P-wave transmissivity |T| across the frictional interface using a 1-MHz transducer. This suggests a possibility of earthquake forecast by monitoring the strength of a natural fault by acoustic methods. The present paper explores the feasibility of such monitoring in the field on the basis of the physics of RSF combined with the linear slip model (LSM) employed in the classical acoustic methodology for monitoring an imperfectly welded interface. The characteristic frequency fc, around which |T| (or reflectivity |R|) has a good sensitivity to the interface strength, is shown to be proportional to the strength and inversely proportional to the representative scale of real contacts. For natural faults, fc is estimated to be 1 to 100 Hz, which is practicable in the field. The changes of |T| and |R| depend on the ratio of the strength drop to the absolute strength level, the latter of which is not constrained by RSF simulations. Expected changes in wave amplitude in the preslip period would be several percent for strong faults and several tens of percent for weak faults, which may be detectable by acoustic methods such as seismic reflection surveys. [Reference] Kame, N., Nagata, K., Kusakabe, T., Nakatani, M., Earth, Planets and Space, Volume 66, Issue 1, 2014, doi:10.1186/1880-5981-66-41

  19. Feasibility of acoustic monitoring of strength drop precursory to earthquake occurrence

    NASA Astrophysics Data System (ADS)

    Kame, Nobuki; Nagata, Kohei; Nakatani, Masao; Kusakabe, Tetsuya

    2014-12-01

    Rate- and state-dependent friction law (RSF), proposed on the basis of laboratory experiments, has been extensively applied to modeling of earthquake stick-slip cycles. A simple spring-slider model obeying RSF predicts a significant decrease of the frictional strength Ί (the state of contact) that is localized within a few years preceding the earthquake occurrence. On the other hand, recent laboratory experiments successfully monitored the history of the strength by simultaneously measuring the P-wave transmissivity |T| across the frictional interface using a 1-MHz transducer. This suggests a possibility of earthquake forecast by monitoring the strength of a natural fault by acoustic methods. The present paper explores the feasibility of such monitoring in the field on the basis of the physics of RSF combined with the linear slip model (LSM) employed in the classical acoustic methodology for monitoring an imperfectly welded interface. The characteristic frequency fc, around which |T| (or reflectivity |R|) has a good sensitivity to the interface strength, is shown to be proportional to the strength and inversely proportional to the representative scale of real contacts. For natural faults, fc is estimated to be 1 to 100 Hz, which is practicable in the field. The changes of |T| and |R| depend on the ratio of the strength drop to the absolute strength level, the latter of which is not constrained by RSF simulations. Expected changes in wave amplitude in the preslip period would be several percent for strong faults and several tens of percent for weak faults, which may be detectable by acoustic methods such as seismic reflection surveys.
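
    For normal incidence across a linear slip interface of specific stiffness Îș between identical half-spaces of acoustic impedance Z, the LSM gives a standard closed form for |T| and |R|, with the crossover at fc = Îș/(πZ); a numeric sketch (the Îș and Z values in the test are illustrative, not the paper's fault parameters):

```python
import numpy as np

def lsm_coefficients(freq_hz, kappa, impedance):
    """Normal-incidence |T| and |R| across a linear slip interface of
    specific stiffness kappa [Pa/m] between identical half-spaces of
    acoustic impedance Z [Pa*s/m]; energy is conserved (|T|^2+|R|^2=1)."""
    x = np.pi * freq_hz * impedance / kappa   # = omega * Z / (2 * kappa)
    T = 1.0 / np.sqrt(1.0 + x ** 2)
    R = x / np.sqrt(1.0 + x ** 2)
    return T, R

def corner_frequency(kappa, impedance):
    """Frequency where |T| = |R| = 1/sqrt(2): fc = kappa / (pi * Z)."""
    return kappa / (np.pi * impedance)
```

    The limits behave as expected: a welded interface (large Îș) transmits everything, a free surface (Îș -> 0) reflects everything, and stiffness growing with fault strength pushes fc upward, which is the sensitivity the abstract exploits.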

  20. Continuous Monitoring of Potential Geochemical and Geomagnetic Earthquake Precursors: Lessons Learned

    NASA Astrophysics Data System (ADS)

    Burjanek, J.; Faeh, D.; Surbeck, H.; Balderer, W.; Kaestli, P.; Gassner, G.

    2014-12-01

    In the last decades, different studies have addressed short-term earthquake precursors by focusing on the observation of a wide variety of physical phenomena that can precede strong earthquakes. These include anomalous seismicity patterns, ground water level changes, gas emissions, geochemical changes in groundwater, seismo-electromagnetic phenomena, seismo-ionospheric coupling, surface deformations, etc. Some of the potential precursors observed in the past were later found to be artifacts of sensor-system malfunction. Therefore, such monitoring needs to be long-term, time-stamped and continuous, with a professional quality assurance procedure. Moreover, it is important to correlate data recorded with multi-sensor systems. We present a case study of running a multi-sensor system in Valais, Switzerland. The Valais is the area of highest seismic hazard in Switzerland and has experienced a magnitude 6 or larger event every 100 years on average. The system consists of seismic, geodetic (GPS), geochemical and geomagnetic instruments. Here we focus on the latter two. In particular, the observation of possible geochemical earthquake precursory signals is carried out with two instruments: 1) a field fluorometer for fluorescence spectral analysis of water, which can monitor 3 different wavelength bands, temperature and turbidity; 2) geochemical gas concentration sensors, which monitor radon, CO2, and CH4. The geomagnetic observations are performed by three-component coil magnetometers. All instruments are designed to run in continuous mode and stream data in real time. In this presentation we focus mainly on the operational aspects of such a system. We discuss problems faced during operation, the feasibility of the installation, and in general the lessons learned for potential future applications.

  1. Viscoelastic solutions to tectonic problems of extinct spreading centers, earthquake triggering, and subduction zone dynamics

    NASA Astrophysics Data System (ADS)

    Freed, Andrew Mark

    This dissertation uses a finite element technique to explore the role of viscoelastic behavior in a wide range of plate tectonic processes. We consider problems associated with spreading centers, earthquake triggering, and subduction zone dynamics. We simulated the evolution of a slow-spreading center upon cessation of active spreading in order to predict long-term changes in axial valley morphology. Results suggest that the axial valley created at a slow-spreading center persists because the crust is too strong to deform ductilely and because no effective mechanism exists to reverse the topography created by rift-bounding normal faults. These results suggest that the persistence of axial valleys at extinct spreading centers is consistent with a lithospheric stretching model based on dynamic forces for active slow-spreading ridges. In our study of earthquake triggering, results suggest that if a ductile lower crust or upper mantle flows viscously following a thrust event, relaxation may transfer stress to the upper crust. Under certain conditions this may lead to further increases and a lateral expansion of high Coulomb stresses along the base of the upper crust. Analysis of experimentally determined non-Newtonian flow laws suggests that wet granitic, quartz, and feldspar aggregates may yield a viscosity on the order of 10^19 Pa·s. The calculated rate of stress transfer from a viscous lower crust or upper mantle to the upper crust becomes faster with increasing values of the power-law exponent and in the presence of a regional compressive strain rate. In our study of subduction zone dynamics, we model the density and strength structures that drive the Nazca and South American plates. Results suggest that chemical buoyancy and phase changes associated with a cool subducting slab strongly influence the magnitude of driving forces, and that the downgoing slab is weaker than would be expected based solely on temperature. Additionally, results suggest that large stresses are produced on the western margin of South America by forces associated with asthenospheric corner flow. These forces may be responsible for the high topography of the South American Cordillera.

  2. Monitoring the mental well-being of caregivers during the Haiti-earthquake.

    PubMed Central

    Van der Auwera, Marcel; Debacker, Michel; Hubloue, Ives

    2012-01-01

    Introduction: During disaster relief, personnel safety is very important. Mental well-being is part of this safety issue. There is, however, a lack of objective tools for monitoring mental well-being on scene during disaster relief. This study covers the use of validated tools for the detection of psychological distress and the monitoring of mental well-being of disaster relief workers during the Belgian First Aid and Support Team deployment after the Haiti earthquake in 2010. Methodology: The study was conducted using a demographic questionnaire combined with validated measuring instruments: Belbin Team Role, Compassion Fatigue and Satisfaction Self-Test for Helpers, DMAT PsySTART, and K6+ Self Report. A baseline measurement was performed before departure on mission, and measurements were repeated at day 1 and day 7 of the mission, at the end of the mission, and 7 days, 30 days and 90 days post mission. Results: 23 of the 27 team members were included in the study. Using the Compassion Fatigue and Satisfaction Self-Test for Helpers as a monitoring tool, a stable condition was observed in 7 participants, a dip in 5 participants, an arousal in 10 participants and a double pattern in 1 participant. Conclusions: The study proved the ability to monitor mental well-being and detect psychological distress with self-administered validated tools during a real disaster relief mission. However, for practical reasons some tools should be adapted for use in the field. This study opens a whole new research area within the mental well-being monitoring field. Citation: Van der Auwera M, Debacker M, Hubloue I. Monitoring the mental well-being of caregivers during the Haiti earthquake. PLoS Currents Disasters. 2012 Jul 18. PMID:22953241

  3. Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT): Towards the Next Generation of Internship

    NASA Astrophysics Data System (ADS)

    Perry, S.; Benthien, M.; Jordan, T. H.

    2005-12-01

    The SCEC/UseIT internship program is training the next generation of earthquake scientists, with methods that can be adapted to other disciplines. UseIT interns work collaboratively, in multi-disciplinary teams, conducting computer science research that is needed by earthquake scientists. Since 2002, the UseIT program has welcomed 64 students, in some two dozen majors, at all class levels, from schools around the nation. Each summer's work is posed as a "Grand Challenge." The students then organize themselves into project teams, decide how to proceed, and pool their diverse talents and backgrounds. They have traditional mentors, who provide advice and encouragement, but they also mentor one another, and this has proved to be a powerful relationship. Most begin with fear that their Grand Challenge is impossible, and end with excitement and pride about what they have accomplished. The 22 UseIT interns in summer 2005 were primarily computer science and engineering majors, with others in geology, mathematics, English, digital media design, physics, history, and cinema. The 2005 Grand Challenge was to "build an earthquake monitoring system" to aid scientists who must visualize rapidly evolving earthquake sequences and convey information to emergency personnel and the public. Most UseIT interns were engaged in software engineering, bringing new datasets and functionality to SCEC-VDO (Virtual Display of Objects), a 3D visualization software package that was prototyped by interns the previous year, using Java3D and an extensible, plug-in architecture based on the Eclipse Integrated Development Environment. Other UseIT interns used SCEC-VDO to make animated movies, and experimented with imagery in order to communicate concepts and events in earthquake science. One movie-making project included the creation of an assessment to test the effectiveness of the movie's educational message. Finally, one intern created an interactive, multimedia presentation of the UseIT program.

  4. USGS contributions to earthquake and tsunami monitoring in the Caribbean Region

    NASA Astrophysics Data System (ADS)

    McNamara, D.; Caribbean Project Team, U.; Partners, C.

    2007-05-01

    USGS Caribbean Project Team: Lind Gee, Gary Gyure, John Derr, Jack Odum, John McMillan, David Carver, Jim Allen, Susan Rhea, Don Anderson, Harley Benz. Caribbean Partners: Christa von Hillebrandt-Andrade - PRSN; Juan Payero - ISU-UASD, Dominican Republic; Eduardo Camacho - UPAN, Panama; Lloyd Lynch - SRU; Gonzalo Cruz - UNAH, Honduras; Margaret Wiggins-Grandison - Jamaica; Judy Thomas - CERO, Barbados; Sylvan McIntyre - NADMA, Grenada; E. Bermingham - STRI. The magnitude-9 Sumatra-Andaman Islands earthquake of December 26, 2004, increased global awareness of the destructive hazard posed by earthquakes and tsunamis. In response to this tragedy, the US government undertook a collaborative project to improve earthquake and tsunami monitoring along a major portion of vulnerable coastal regions in the Caribbean Sea, the Gulf of Mexico, and the Atlantic Ocean. Seismically active areas of the Caribbean Sea region pose a tsunami risk for Caribbean islands, coastal areas along the Gulf of Mexico, and the Atlantic seaboard of North America. Nearly 100 tsunamis have been reported for the Caribbean region in the past 500 years, including 14 tsunamis reported in Puerto Rico and the U.S. Virgin Islands. Partners in this project include the United States Geological Survey (USGS), the Smithsonian Institution, the National Oceanic and Atmospheric Administration (NOAA), and several partner institutions in the Caribbean region. This presentation focuses on the deployment of nine broadband seismic stations, affiliated with the Global Seismograph Network (GSN), to monitor earthquake activity in the Caribbean region. By the end of 2006, five stations were transmitting data to the USGS National Earthquake Information Center (NEIC) and regional partners through Puerto Rico Seismograph Network (PRSN) Earthworm systems.
The following stations are currently operating: SDDR - Sabaneta Dam, Dominican Republic; BBGH - Gun Hill, Barbados; GRGR - Grenville, Grenada; BCIP - Barro Colorado, Panama; and TGUH - Tegucigalpa, Honduras. These stations complement the existing GSN stations SJG - San Juan, Puerto Rico; SDV - Santo Domingo, Venezuela; TEIG - Tepich, Yucatan, Mexico; and JTS - Costa Rica. 2007 will see the construction of two additional stations in Guantanamo Bay, Cuba and Barbuda. Planned stations in Jamaica and Grand Turk are awaiting local approval. In this presentation we examine noise conditions at the five operating sites and assess the capabilities of the current seismic network using three measures of capability: 1) minimum Mw detection threshold; 2) response time of the automatic processing system; and 3) theoretical earthquake location errors. The new seismic stations are part of a larger effort to monitor and mitigate tsunami hazard in the region. Destructive earthquakes and tsunamis are known to be a threat in various parts of the Caribbean. We demonstrate that considerable improvement in network magnitude threshold, response time, and earthquake location error has been achieved.

  5. The Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) Internship Program

    NASA Astrophysics Data System (ADS)

    Perry, S.; Jordan, T.

    2006-12-01

    Our undergraduate research program, SCEC/UseIT, an NSF Research Experience for Undergraduates site, provides software for earthquake researchers and educators, movies for outreach, and ways to strengthen the technical career pipeline. SCEC/UseIT motivates diverse undergraduates towards science and engineering careers through team-based research in the exciting field of earthquake information technology. UseIT provides the cross-training in computer science/information technology (CS/IT) and geoscience needed to make fundamental progress in earthquake system science. Our high and increasing participation of women and minority students is crucial given the nation's precipitous enrollment declines in CS/IT undergraduate degree programs, especially among women. UseIT also casts a "wider, farther" recruitment net that targets scholars interested in creative work but not traditionally attracted to summer science internships. Since 2002, SCEC/UseIT has challenged 79 students in three dozen majors from as many schools with difficult, real-world problems that require collaborative, interdisciplinary solutions. Interns design and engineer open-source software, creating increasingly sophisticated visualization tools (see "SCEC-VDO," session IN11), which are employed by SCEC researchers, in new curricula at the University of Southern California, and by outreach specialists who make animated movies for the public and the media. SCEC-VDO would be a valuable tool for research-oriented professional development programs.

  6. Grand Canyon Monitoring and Research Center

    USGS Publications Warehouse

    Hamill, John F.

    2009-01-01

    The Grand Canyon of the Colorado River, one of the world's most spectacular gorges, is a premier U.S. National Park and a World Heritage Site. The canyon supports a diverse array of distinctive plants and animals and contains cultural resources significant to the region's Native Americans. About 15 miles upstream of Grand Canyon National Park sits Glen Canyon Dam, completed in 1963, which created Lake Powell. The dam provides hydroelectric power for 200 wholesale customers in six western States, but it has also altered the Colorado River's flow, temperature, and sediment-carrying capacity. Over time this has resulted in beach erosion, invasion and expansion of nonnative species, and losses of native fish. Public concern about the effects of Glen Canyon Dam operations prompted the passage of the Grand Canyon Protection Act of 1992, which directs the Secretary of the Interior to operate the dam 'to protect, mitigate adverse impacts to, and improve values for which Grand Canyon National Park and Glen Canyon National Recreation Area were established...' This legislation also required the creation of a long-term monitoring and research program to provide information that could inform decisions related to dam operations and protection of downstream resources.

  7. New Continuous Timeseries Data at the Northern California Earthquake Data Center

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Dietz, L.; Zuzlewski, S.; Kohler, W.; Gee, L.; Oppenheimer, D.; Romanowicz, B.

    2005-12-01

    The Northern California Earthquake Data Center (NCEDC) is an archive and distribution center for geophysical data for networks in northern and central California. Recent discovery of non-volcanic tremors in northern and central California has sparked user interest in access to a wider range of continuous seismic data in the region. The NCEDC has responded by expanding its archiving and distribution to all new available continuous data from northern California seismic networks (the USGS NCSN, the UC Berkeley BDSN, the Parkfield HRSN borehole network, and local USArray stations) at all available sample rates, to provide access to all recent real-time timeseries data, and to restore from tape and archive all NCSN continuous data from 2001-present. All new continuous timeseries data will also be available in near-real-time from the NCEDC via the DART (Data Available in Real Time) system, which allows users to directly download daily Telemetry MiniSEED files or to extract and retrieve the timeseries of their selection. The NCEDC will continue to create and distribute event waveform collections for all events detected by the Northern California Seismic System (NCSS), the northern California component of the California Integrated Seismic Network (CISN). All new continuous and event timeseries will be archived in daily intervals and are accessible via the same data request tools (NetDC, BREQ_FAST, EVT_FAST, FISSURES/DHI, STP) as previously archived waveform data. The NCEDC is a joint project of the UC Berkeley Seismological Laboratory and USGS Menlo Park.

  8. Near-Real time, High Resolution Reservoir Monitoring and Modeling with Micro-earthquake Data

    NASA Astrophysics Data System (ADS)

    Hutchings, L. J.; Jarpe, S.; Boyle, K. L.; Bonner, B. P.; Viegas, G.; Philson, H.; Statz-Boyer, P.; Majer, E.

    2011-12-01

    We present a micro-earthquake recording and automated processing system, along with a methodology to provide near-real-time, high-resolution reservoir monitoring and modeling. An interactive program for testing micro-earthquake network designs helps identify configurations for optimum accuracy and resolution. We select the Northwest Geysers, California, geothermal field to showcase the usefulness of the system. The system's inexpensive recorders require very little time or expertise to install, and the automated processing requires merely placing flash memory chips (or telemetry) into a computer. Together these make the deployment of a large number of sensors feasible and thus rapid, high-resolution results possible. Data are arranged into input files for tomography of Vp, Vs, Qp and Qs, and their combinations, to provide for interpretation in terms of rock properties. Micro-earthquake source parameters include seismic moments, full moment tensor solutions, stress drops, source durations, radiated energy, and hypocentral locations. The methodology for interpretation is to use visualization with GUI analysis to cross-compare tomography and source-property results, along with borehole or other independent information and rock physics, to identify reservoir properties. The system can potentially provide information heretofore unattainable or unaffordable for many small companies, organizations, and countries around the world.
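Two of the source parameters listed, moment magnitude and stress drop, follow standard closed-form relations: the Hanks-Kanamori moment-magnitude scale and the Brune circular-crack stress drop. A short sketch of both (the numeric inputs below are illustrative, not values from this study):

```python
import math

def moment_magnitude(m0):
    """Moment magnitude from seismic moment M0 in N*m
    (Hanks & Kanamori, 1979): Mw = (2/3)(log10 M0 - 9.1)."""
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

def brune_stress_drop(m0, radius):
    """Brune (1970) stress drop (Pa) for a circular crack:
    delta_sigma = 7*M0 / (16*r^3), with M0 in N*m and radius r in m."""
    return 7.0 * m0 / (16.0 * radius ** 3)

# Illustrative micro-earthquake: M0 = 1e15 N*m, source radius 500 m.
mw = moment_magnitude(1e15)
dsigma = brune_stress_drop(1e15, 500.0)
```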

  9. Web Services and Data Enhancements at the Northern California Earthquake Data Center

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Zuzlewski, S.; Lombard, P. N.; Allen, R. M.

    2013-12-01

    The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format, such as XML, RESP, simple text, or MiniSEED, depending on the service and selected output format. The NCEDC offers the following web services that are compliant with the International Federation of Digital Seismograph Networks (FDSN) web services specifications: (1) fdsn-dataselect: time series data delivered in MiniSEED format, (2) fdsn-station: station and channel metadata and time series availability delivered in StationXML format, (3) fdsn-event: earthquake event information delivered in QuakeML format. In addition, the NCEDC offers the following IRIS-compatible web services: (1) sacpz: provides channel gains, poles, and zeros in SAC format, (2) resp: provides channel response information in RESP format, (3) dataless: provides station and channel metadata in Dataless SEED format. The NCEDC is also developing a web service to deliver timeseries from pre-assembled event waveform gathers. The NCEDC has waveform gathers for ~750,000 northern and central California events from 1984 to the present, many of which were created by the USGS NCSN prior to the establishment of the joint NCSS (Northern California Seismic System).
We are currently adding to these older event gathers time series from the UCB networks and other networks with waveforms archived at the NCEDC, and ensuring that each channel in the event gathers has the highest-quality waveform available from the archive.
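The FDSN services named above share a common query-URL layout, /fdsnws/<service>/1/query with URL-encoded parameters, per the FDSN web services specification. A minimal sketch of how a client might assemble such queries; the host name and the example network/station/channel codes are illustrative assumptions, not part of this abstract:

```python
from urllib.parse import urlencode

NCEDC = "https://service.ncedc.org"  # assumed web-service host

def fdsn_url(service, **params):
    """Build an FDSN-style query URL for the dataselect, station,
    or event service, following the /fdsnws/<service>/1/query layout."""
    query = urlencode(sorted(params.items()))  # sort for a stable URL
    return f"{NCEDC}/fdsnws/{service}/1/query?{query}"

# Hypothetical request: 10 minutes of BHZ data from station BK.CMB.
url = fdsn_url("dataselect", net="BK", sta="CMB", cha="BHZ",
               start="2013-01-01T00:00:00", end="2013-01-01T00:10:00")
```

The same helper covers the station and event services by swapping the service name and parameters (e.g. `fdsn_url("event", minmagnitude=3.0)`).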

  10. Role of WEGENER (World Earthquake GEodesy Network for Environmental Hazard Research) in monitoring natural hazards (Invited)

    NASA Astrophysics Data System (ADS)

    Ozener, H.; Zerbini, S.; Bastos, M. L.; Becker, M. H.; Meghraoui, M.; Reilinger, R. E.

    2013-12-01

    WEGENER was originally the acronym for Working Group of European Geoscientists for the Establishment of Networks for Earth-science Research. It was founded in March 1981 in response to an appeal delivered at the Journées Luxembourgeoises de Geodynamique in December 1980 to respond with a coordinated European proposal to a NASA Announcement of Opportunity inviting participation in the Crustal Dynamics and Earthquake Research Program. During the past 33 years, WEGENER has always kept close contact with the agencies and institutions responsible for the development and maintenance of the global space geodetic networks, with the aim of making them aware of the scientific needs and outcomes of the project which might have an influence on general science policy trends. WEGENER served as Inter-commission Project 3.2, between Commission 1 and Commission 3, of the International Association of Geodesy (IAG) until 2012. Since then, the WEGENER project has become Sub-commission 3.5 of IAG Commission 3, namely Tectonics and Earthquake Geodesy. In this presentation, we briefly review the accomplishments of WEGENER as originally conceived and outline and justify the new focus of the WEGENER consortium. The remarkable and rapid evolution of the present state of global geodetic monitoring in regard to the precision of positioning capabilities (and hence deformation) and global coverage, the development of InSAR for monitoring strain with unprecedented spatial resolution, and continuing and planned data from highly precise satellite gravity and altimetry missions, encourage us to shift principal attention from mainly monitoring capabilities by a combination of space and terrestrial geodetic techniques to applying existing observational methodologies to the critical geophysical phenomena that threaten our planet and society.
Our new focus includes developing an improved physical basis to mitigate earthquake, tsunami, and volcanic risks, and the effects of natural and anthropogenic climate change (sea level, ice degradation). In addition, expanded applications of space geodesy to atmospheric studies will remain a major focus, with emphasis on ionospheric and tropospheric monitoring to support forecasting extreme events. Towards these ends, we will encourage and foster interdisciplinary, integrated initiatives to develop a range of case studies for these critical problems. Geological studies are needed to extend geodetic deformation studies to geologic time scales, and new modeling approaches will facilitate full exploitation of expanding geodetic databases. In light of this new focus, the WEGENER acronym now represents 'World Earthquake GEodesy Network for Environmental Hazard Research.'

  11. UNLV’s environmentally friendly Science and Engineering Building is monitored for earthquake shaking

    USGS Publications Warehouse

    Kalkan, Erol; Savage, Woody; Reza, Shahneam; Knight, Eric; Tian, Ying

    2013-01-01

    The University of Nevada Las Vegas’ (UNLV) Science and Engineering Building is at the cutting edge of environmentally friendly design. As the result of a recent effort by the U.S. Geological Survey’s National Strong Motion Project in cooperation with UNLV, the building is now also in the forefront of buildings installed with structural monitoring systems to measure response during earthquakes. This is particularly important because this is the first such building in Las Vegas. The seismic instrumentation will provide essential data to better understand the structural performance of buildings, especially in this seismically active region.

  12. Monitoring the Galactic Center with ATCA

    NASA Astrophysics Data System (ADS)

    Borkar, A.; Eckart, A.; Straubmeier, C.; Kunneriath, D.; Jalali, B.; Sabha, N.; Shahzamanian, B.; García-Marín, M.; Valencia-S, M.; Sjouwerman, L.; Britzen, S.; Karas, V.; Dovčiak, M.; Donea, A.; Zensus, A.

    2016-02-01

    The supermassive black hole Sagittarius A* (Sgr A*) at the centre of the Milky Way undergoes regular flaring activity, which is thought to arise from the innermost region of the accretion flow. We performed monitoring observations of the Galactic Centre with the Australia Telescope Compact Array (ATCA) between 2010 and 2014 to study the flux-density variations at 3 mm. We obtain the light curves of Sgr A* by subtracting the contributions from the extended emission around it and the elevation- and time-dependent gains of the telescope. We perform structure function analysis and use the Bayesian blocks representation to detect flare events. The observations detect six instances of significant variability in the flux density of Sgr A* in three observations, with variations between 0.5 and 1.0 Jy lasting for 1.5-3 hours. We use the adiabatically expanding plasmon model to explain the short-time-scale variations in the flux density. We derive the physical quantities of the modelled flare emission, such as the source expansion speed vexp, source sizes, spectral indices, and the turnover frequency. These parameters imply that the expanding source components are either confined to the immediate vicinity of Sgr A*, contributing to the corona or the disc, or have a bulk motion greater than vexp. No exceptional flux-density variation on short flare time-scales was observed during the approach and flyby of the dusty S-cluster object (DSO/G2). This is consistent with its compactness and the absence of a large bow shock.
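The first-order structure function used in such variability analyses is the mean squared flux difference as a function of time lag. A minimal sketch for an evenly sampled light curve; the paper's actual treatment of gaps, binning, and measurement noise is not specified here:

```python
import numpy as np

def structure_function(flux, max_lag):
    """First-order structure function of an evenly sampled light curve:
    SF(k) = mean[(flux[i+k] - flux[i])^2] for lag k = 1..max_lag samples."""
    flux = np.asarray(flux, dtype=float)
    return np.array([np.mean((flux[k:] - flux[:-k]) ** 2)
                     for k in range(1, max_lag + 1)])
```

For a source varying on a characteristic time-scale, SF rises with lag and flattens once the lag exceeds that time-scale, which is how a flare duration leaves its imprint on the curve.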

  13. Modeling and Monitoring for Predictive Simulation of Earthquake Generation in the Japan Region

    NASA Astrophysics Data System (ADS)

    Matsu'Ura, M.; Noda, A.; Terakawa, T.; Hashimoto, C.; Fukuyama, E.

    2008-12-01

    We can regard earthquakes as releases of tectonically accumulated elastic strain energy through dynamic fault ruptures. Given this, the entire earthquake generation process consists of tectonic loading due to relative plate motion, quasi-static rupture nucleation, dynamic rupture propagation and arrest, and fault strength recovery. In the 1990s earthquake generation physics made great progress, and we can now quantitatively describe the entire earthquake generation process with coupled nonlinear equations, consisting of a slip-response function that relates fault slip to shear stress change, a fault constitutive law that prescribes shear strength change with fault slip and contact time, and relative plate motion as the driving force. Recently, we completed a physics-based simulation system for the entire earthquake generation process in and around Japan, where the Pacific, North American, Philippine Sea, and Eurasian plates interact with each other. The total system consists of three basic simulation models, for quasi-static stress accumulation, dynamic rupture propagation, and seismic wave propagation, developed on a realistic 3-D structure model. Given past slip histories and present stress states, we can then predict the next seismic/aseismic fault-slip motion through computation with the combined simulation system. We show two examples of the combined simulation, for the 1968 Tokachi-oki earthquake (Mw=8.2) and the 2003 Tokachi-oki earthquake (Mw=8.1). The first example demonstrates that when the stress state is close to a critical level, dynamic rupture develops into a large earthquake, but when the stress state is much lower than the critical level, an initiated rupture does not accelerate.
The second example demonstrates that we can quantitatively evaluate the strong ground motions produced by potential interplate earthquakes through computer simulation, if the realistic plate-interface geometry, fault constitutive parameters, and crustal structure are given. Thus, our problem is how to extract useful information for estimating the past slip history and the present stress state from observed seismic and geodetic data. To address this problem we developed two inversion methods using Akaike's Bayesian Information Criterion (ABIC): one estimates the spatiotemporal variation of interplate coupling from geodetic data, and the other estimates tectonic stress fields from CMT data of seismic events. From the inversion analysis of GPS data we revealed the slip-deficit rate distribution on the North American-Pacific plate interface off northeast Japan, which shows good correlation with the source regions of past large interplate events along the Kuril-Japan trench. From the inversion analysis of CMT data we revealed the 3-D tectonic stress fields in and around Japan, which explain the complex tectonics of Japan very well. Furthermore, we are now developing another inversion method to estimate 3-D elastic/inelastic strain fields from GPS data. Combining these inversion methods with the computer simulation of tectonic loading, we will be able to monitor the spatiotemporal variation of interplate coupling and seismogenic stress fields in the Japan region.
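The fault constitutive law referred to above is typically a rate- and state-dependent friction law. A minimal sketch of the aging-law state evolution and its steady-state response to an imposed velocity step; the parameter values are illustrative laboratory-scale numbers, not those used in the simulations described here:

```python
import math

# Illustrative rate-and-state parameters (a, b, Dc in m, reference mu0, v0 in m/s)
A, B, DC, MU0, V0 = 0.010, 0.015, 1e-5, 0.6, 1e-6

def friction(v, theta):
    """Rate-and-state friction coefficient at slip rate v and state theta."""
    return MU0 + A * math.log(v / V0) + B * math.log(V0 * theta / DC)

def evolve(theta, v, dt):
    """Aging law d(theta)/dt = 1 - v*theta/Dc, forward-Euler step."""
    return theta + dt * (1.0 - v * theta / DC)

# Start at steady state for v0 (theta = Dc/v0, friction = mu0),
# then impose a tenfold velocity step and let the state relax.
theta = DC / V0
v = 10.0 * V0
for _ in range(100000):          # 10 s of simulated time at dt = 1e-4 s
    theta = evolve(theta, v, 1e-4)
mu_new = friction(v, theta)
# With a < b the new steady-state friction is lower by (b - a)*ln(10):
# velocity weakening, the instability mechanism behind stick-slip cycles.
```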

  14. Postseismic Deformation after the 1964 Great Alaskan Earthquake: Collaborative Research with Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Freymueller, Jeffrey T.

    1999-01-01

    The purpose of this project was to carry out GPS observations on the Kenai Peninsula, southern Alaska, in order to study the postseismic and contemporary deformation following the 1964 Alaska earthquake. All of the research supported in this grant was carried out in collaboration with Dr. Steven Cohen of Goddard Space Flight Center. The research funding from this grant primarily supported GPS fieldwork, along with the acquisition of computer equipment to allow analysis and modeling of the GPS data. A minor amount of salary support was provided by the PI, but the great majority of the salary support was provided by the Geophysical Institute. After the expiration of this grant, additional funding was obtained from the National Science Foundation to continue the work. This grant supported GPS field campaigns in August 1995, June 1996, May-June and September 1997, and May-June 1998. We initially began the work by surveying leveling benchmarks on the Kenai peninsula that had been surveyed after the 1964 earthquake. Changes in height from the 1964 leveling data to the 1995+ GPS data, corrected for the geoid-ellipsoid separation, give the total elevation change since the earthquake. Beginning in 1995, we also identified or established sites that were suitable for long-term surveying using GPS. In the subsequent annual GPS campaigns, we made regular measurements at these GPS marks, and steadily enhanced our set of points for which cumulative postseismic uplift data were available. From 4 years of Global Positioning System (GPS) measurements, we find significant spatial variations in present-day deformation between the eastern and western Kenai peninsula, Alaska. Sites in the eastern Kenai peninsula and Prince William Sound move to the NNW relative to North America, in the direction of Pacific-North America relative plate motion. 
Velocities decrease in magnitude from nearly the full plate rate in southern Prince William Sound to about 30 mm/yr at Seward and about 5 mm/yr near Anchorage. In contrast, sites in the western Kenai Peninsula move to the SW, in a nearly trenchward direction, at about 20 mm/yr. The data are consistent with the shallow plate interface offshore and beneath the eastern Kenai and Prince William Sound being completely locked, or nearly so, with elastic strain accumulation resulting in rapid motion of sites in the overriding plate in the direction of relative plate motion. The velocities of sites in the western Kenai, along strike to the southwest, are opposite in sign to those predicted from elastic strain accumulation. These data are incompatible with a significant locked region in this segment of the plate boundary. Trenchward velocities are also found for some sites in the Anchorage area. We interpret the trenchward velocities as being caused by a continuing postseismic transient from the 1964 great Alaska earthquake.
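The height bookkeeping described above (a GPS ellipsoidal height corrected for the geoid-ellipsoid separation, differenced against the post-1964 leveled orthometric height) can be sketched in a few lines; the function name and benchmark values below are purely hypothetical:

```python
def postseismic_uplift(h_ellipsoidal_m, geoid_undulation_m, h_leveled_1964_m):
    """Cumulative elevation change since the 1964 earthquake.

    Converts a GPS ellipsoidal height to an orthometric height by
    removing the geoid-ellipsoid separation N, then differences it
    against the post-1964 leveled (orthometric) height.
    """
    h_orthometric_gps = h_ellipsoidal_m - geoid_undulation_m
    return h_orthometric_gps - h_leveled_1964_m

# Hypothetical benchmark: 35.20 m ellipsoidal height, 12.80 m geoid
# undulation, 22.10 m leveled height after 1964.
print(postseismic_uplift(35.20, 12.80, 22.10))  # about 0.30 m of uplift
```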

  15. Real-Time Seismic Monitoring of the New Cape Girardeau (MO) Bridge and Recorded Earthquake Response

    NASA Astrophysics Data System (ADS)

    Çelebi, Mehmet

    This paper introduces the state-of-the-art, real-time, broad-band seismic monitoring network implemented for the 1206 m [3956 ft] long, cable-stayed Bill Emerson Memorial Bridge in Cape Girardeau (MO), a new Mississippi River crossing approximately 80 km from the epicentral region of the 1811-1812 New Madrid earthquakes. Design of the bridge accounted for the possibility of a strong earthquake (magnitude 7.5 or greater) during its design life. The monitoring network consists of a superstructure array and two free-field arrays, comprising a total of 84 channels of accelerometers deployed on the superstructure, on the pier foundations, and in the free field in the vicinity of the bridge. The paper also introduces the high-quality response data obtained from the network. These data are intended to be used by the owner, researchers, and engineers to (1) assess the performance of the bridge, (2) check design parameters, including comparing the dynamic characteristics with the actual response, and (3) better design similar bridges in the future. Preliminary analyses of low-amplitude ambient vibration data and of data from a small earthquake reveal specific response characteristics of this new bridge and of the free field in its proximity. There is coherent tower-cable-deck interaction that sometimes results in amplified ambient motions. Also, while the motions at the lowest (triaxial) downhole accelerometers on both the MO and IL sides are practically free from any feedback from the bridge, the motions at the middle downhole and surface accelerometers are significantly influenced by amplified ambient motions of the bridge.

  16. Analogue models of subduction megathrust earthquakes: improving rheology and monitoring technique

    NASA Astrophysics Data System (ADS)

    Brizzi, Silvia; Corbi, Fabio; Funiciello, Francesca; Moroni, Monica

    2015-04-01

    Most of the world's great earthquakes (Mw > 8.5, usually known as mega-earthquakes) occur at shallow depths along the subduction thrust fault (STF), i.e., the frictional interface between the subducting and overriding plates. The spatiotemporal occurrence of mega-earthquakes and their governing physics remain ambiguous, as tragically demonstrated by the underestimation of recent megathrust events (e.g., 2011 Tohoku). To help unravel the seismic cycle at STFs, analogue modelling has become a key tool. The first properly scaled analogue models with realistic (i.e., wedge-shaped) geometries suitable for studying interplate seismicity were realized using granular elasto-plastic [e.g., Rosenau et al., 2009] and viscoelastic materials [e.g., Corbi et al., 2013]. In particular, viscoelastic laboratory experiments realized with type A gelatin at 2.5 wt% simulate, in a simplified yet robust way, the basic physics governing the subduction seismic cycle and the related rupture process. Despite the strength of this approach, analogue earthquakes are not perfectly comparable to their natural prototypes. In this work, we try to improve subduction seismic cycle analogue models by modifying the rheological properties of the analogue material and adopting a new image analysis technique (PEP - ParticlE and Prediction velocity). We test the influence of lithosphere elasticity by using type A gelatin at a greater concentration (6 wt%). Results show that gelatin elasticity plays an important role in controlling the seismogenic behaviour of the STF, tuning the mean and maximum magnitudes of analogue earthquakes. In particular, with increasing gelatin elasticity we observe a decreasing mean magnitude, while the maximum magnitude remains the same. The experimental results therefore suggest that lithosphere elasticity could be one of the parameters that tune the seismogenic behaviour of the STF. 
Increasing gelatin elasticity also improves similarity with the natural prototype in terms of coseismic duration and rupture width. Experimental monitoring has been performed by means of both the PEP and PIV (Particle Image Velocimetry) algorithms. PEP differs from classic cross-correlation techniques such as PIV in its ability to provide sparse velocity vectors at points coincident with particle barycentre positions, allowing a Lagrangian description of the velocity field and a better spatial resolution (≈ 0.03 mm2) than PIV. Results show that the PEP algorithm is able to identify a greater number of analogue earthquakes (≈ 20% more than the PIV algorithm), decreasing the minimum detectable magnitude from 6.6 to 4.5. Furthermore, earthquake source parameters (e.g., hypocentre position, rupture limits and slip distribution) are more accurately defined. The PEP algorithm is thus suitable for gaining new insights into the seismogenic process of the STF, extending the analysable magnitude range of analogue earthquakes and having implications for the applicability of scaling relationships, such as the Gutenberg-Richter law, to experimental results.

  17. Real-time earthquake monitoring for tsunami warning in the Indian Ocean and beyond

    NASA Astrophysics Data System (ADS)

    Hanka, W.; Saul, J.; Weber, B.; Becker, J.; Harjadi, P.; Fauzi; Gitews Seismology Group

    2010-12-01

    The Mw = 9.3 Sumatra earthquake of 26 December 2004 generated a tsunami that affected the entire Indian Ocean region and caused approximately 230,000 fatalities. In response to this tragedy the German government funded the German Indonesian Tsunami Early Warning System (GITEWS) project. The task of the GEOFON group of GFZ Potsdam was to develop and implement the seismological component. In this paper we describe the concept of the GITEWS earthquake monitoring system and report on its present status. The major challenge for earthquake monitoring within a tsunami warning system is to deliver rapid information about location, depth, size and possibly other source parameters. This is particularly true for coastlines adjacent to the potential source areas, such as the Sunda trench, where these parameters are required within a few minutes of the event in order to warn the population before the potential tsunami hits the neighbouring coastal areas. The key to a seismic monitoring system with warning times short enough for Indonesia is therefore a dense real-time seismic network across Indonesia, densified close to the Sunda trench. A substantial number of supplementary stations in other Indian Ocean rim countries are added to strengthen the teleseismic monitoring capabilities. The installation of the new GITEWS seismic network - consisting of 31 combined broadband and strong-motion stations, 21 of them in Indonesia - is almost completed. Real-time data collection uses a private VSAT communication system with hubs in Jakarta and Vienna. In addition, all available real-time data from the other seismic networks in Indonesia and other Indian Ocean rim countries are also acquired, directly by VSAT or via the Internet, at the Indonesian Tsunami Warning Centre in Jakarta, and the resulting "virtual" network of more than 230 stations can be used jointly for seismic data processing. 
The seismological processing software, part of the GITEWS tsunami control centre, is an enhanced version of the widely used SeisComP software and of the well-established GEOFON earthquake information system operated at GFZ in Potsdam (http://geofon.gfz-potsdam.de/db/eqinfo.php). This recently developed software package (SeisComP3) is reliable and fast and can provide fully automatic earthquake locations and magnitude estimates. It uses innovative visualization tools and offers manual correction and re-calculation, flexible configuration, support for distributed processing, and data and parameter exchange with external monitoring systems. SeisComP3 is used not only for tsunami warning in Indonesia but also in most other tsunami warning centres in the Indian Ocean and Euro-Mediterranean regions, and in many seismic services worldwide.

  18. The academic health center in complex humanitarian emergencies: lessons learned from the 2010 Haiti earthquake.

    PubMed

    Babcock, Christine; Theodosis, Christian; Bills, Corey; Kim, Jimin; Kinet, Melodie; Turner, Madeleine; Millis, Michael; Olopade, Olufunmilayo; Olopade, Christopher

    2012-11-01

    On January 12, 2010, a 7.0-magnitude earthquake struck Haiti. The event disrupted infrastructure and was marked by extreme morbidity and mortality. The global response to the disaster was rapid and immense, comprising multiple actors, including academic health centers (AHCs), that provided assistance in the field and from home. The authors retrospectively examine the multidisciplinary approach that the University of Chicago Medicine (UCM) applied to postearthquake Haiti, which included the application of institutional structure and strategy, systematic deployment of teams tailored to evolving needs, and the actual response and recovery. The university mobilized significant human and material resources for deployment within 48 hours and sustained the effort for over four months. In partnership with international and local nongovernmental organizations as well as other AHCs, the UCM operated one of the largest and most efficient acute field hospitals in the country. The UCM's efforts in postearthquake Haiti provide insight into the role AHCs can play, including their strengths and limitations, in complex disasters. AHCs can provide necessary intellectual and material resources as well as technical expertise, but the cost and speed required for responding to an emergency, along with ongoing domestic responsibilities, may limit the response of a large university and hospital system. The authors describe the strong institutional backing, the detailed predeployment planning and logistical support UCM provided, the engagement of faculty and staff with previous experience in complex humanitarian emergencies, and the help of volunteers fluent in the local language which, together, made UCM's mission in postearthquake Haiti successful. PMID:23018336

  19. Earthquake Monitoring: SeisComp3 at the Swiss National Seismic Network

    NASA Astrophysics Data System (ADS)

    Clinton, J. F.; Diehl, T.; Cauzzi, C.; Kaestli, P.

    2011-12-01

    The Swiss Seismological Service (SED) has an ongoing responsibility to improve the seismicity monitoring capability for Switzerland. This is a crucial issue for a country with low background seismicity but where a large M6+ earthquake is expected in the coming decades. With over 30 stations at a spacing of ~25 km, the SED operates one of the densest broadband networks in the world, which is complemented by ~50 real-time strong-motion stations. The strong-motion network is expected to grow by an additional ~80 stations over the next few years. Furthermore, the backbone of the network is complemented by broadband data from surrounding countries and by temporary sub-networks for local monitoring of microseismicity (e.g. at geothermal sites). The variety of seismic monitoring responsibilities, as well as the anticipated densification of our network, demands highly flexible processing software. We are transitioning all software to the SeisComP3 (SC3) framework. SC3 is a fully featured automated real-time earthquake monitoring software package developed by GeoForschungsZentrum Potsdam in collaboration with its commercial partner, gempa GmbH. It is open source at its core and is becoming a community-standard software for earthquake detection and waveform processing for regional and global networks across the globe. SC3 was originally developed for regional and global rapid monitoring of potentially tsunamigenic earthquakes. In order to fulfill the requirements of a local network recording moderate seismicity, the SED has tuned configurations and added several modules. In this contribution, we present our SC3 implementation strategy, focusing on the detection and identification of seismicity on different scales. We operate several parallel processing "pipelines" to detect and locate local, regional and global seismicity. Additional pipelines with lower detection thresholds can be defined to monitor seismicity within dense subnets of the network. 
To be consistent with existing processing procedures, the NonLinLoc algorithm was implemented for manual and automatic locations using 1D and 3D velocity models; plugins for improved automatic phase picking and Ml computation were developed; and the graphical user interface for manual review was extended (including pick uncertainty definition, first-motion focal mechanisms, interactive review of station magnitude waveforms, and full inclusion of strong-motion data). SC3 locations are fully compatible with those derived from the existing in-house processing tools and are stored in a database derived from the QuakeML data model. The database is shared with the SED alerting software, which merges origins from both SC3 and external sources in real time and handles the alerting procedure. With the monitoring software transitioned to SeisComP3, acquisition, archival and dissemination of SED waveform data now conform to the seedlink and ArcLink protocols, and continuous archives can be accessed via the SED and all EIDA (European Integrated Data Archives) websites. Further, a SC3 module for waveform parameterisation has been developed, allowing rapid computation of peak ground-motion values and other engineering parameters within minutes of a new event. An output of this module is USGS ShakeMap XML.

  20. Rapid monitoring in vaccination campaigns during emergencies: the post-earthquake campaign in Haiti

    PubMed Central

    Sugerman, David; Brennan, Muireann; Cadet, Jean Ronald; Ernsly, Jackson; Lacapère, François; Danovaro-Holliday, M Carolina; Mubalama, Jean-Claude; Nandy, Robin

    2013-01-01

    Problem The earthquake that struck Haiti in January 2010 caused 1.5 million people to be displaced to temporary camps. The Haitian Ministry of Public Health and Population and global immunization partners developed a plan to deliver vaccines to those residing in these camps. A strategy was needed to determine whether the immunization targets set for the campaign were achieved. Approach Following the vaccination campaign, staff from the Ministry of Public Health and Population interviewed convenience samples of households – in specific predetermined locations in each of the camps – regarding receipt of the emergency vaccinations. A camp was targeted for “mop-up vaccination” – i.e. repeat mass vaccination – if more than 25% of the children aged 9 months to 7 years in the sample were found not to have received the emergency vaccinations. Local setting Rapid monitoring was implemented in camps located in the Port-au-Prince metropolitan area. Camps that housed more than 5000 people were monitored first. Relevant changes By the end of March 2010, 72 (23%) of the 310 vaccinated camps had been monitored. Although 32 (44%) of the monitored camps were targeted for mop-up vaccination, only six of them had received such repeat mass vaccination when checked several weeks after monitoring. Lessons learnt Rapid monitoring was only marginally beneficial in achieving immunization targets in the temporary camps in Port-au-Prince. More research is needed to evaluate the utility of conventional rapid monitoring, as well as other strategies, during post-disaster vaccination campaigns that involve mobile populations, particularly when there is little capacity to conduct repeat mass vaccination. PMID:24347735
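The mop-up decision rule described under Approach reduces to a simple proportion test; a minimal sketch, where the function name and the sample counts are illustrative (only the 25% threshold comes from the text):

```python
def needs_mop_up(children_sampled, children_unvaccinated, threshold=0.25):
    """Flag a camp for repeat mass ('mop-up') vaccination when more than
    25% of the sampled children aged 9 months to 7 years were found not
    to have received the emergency vaccinations."""
    if children_sampled == 0:
        raise ValueError("empty sample")
    return children_unvaccinated / children_sampled > threshold

print(needs_mop_up(40, 12))  # 30% unvaccinated -> camp flagged
print(needs_mop_up(40, 8))   # 20% unvaccinated -> not flagged
```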

  1. Application for temperature and humidity monitoring of data center environment

    NASA Astrophysics Data System (ADS)

    Albert, Ş.; Truşcă, M. R. C.; Soran, M. L.

    2015-12-01

    Technology and computer science have developed rapidly in recent years. Most systems that use high technologies require special working conditions, so monitoring and control are very important. Temperature and humidity are important parameters in the operation of computer, industrial and research systems, and keeping them between certain values is important to ensure proper functioning. Usually, the temperature is maintained in the established range using an air conditioning system, but the humidity is affected. In the present work we developed an application based on a board with its own firmware, called "AVR_NET_IO", using an ATmega32 microcontroller, for temperature and humidity monitoring in the Data Center of INCDTIM. Temperature sensors were connected to this board to measure the temperature at different points inside and outside the Data Center. Humidity monitoring is performed using data from the integrated sensors of the air conditioning system, thus achieving a correlation between humidity and temperature variation. A software application (CM-1) was developed together with the hardware, which monitors and logs the temperature inside the Data Center and triggers an alarm when the temperature deviates by more than 3°C from the established limits.
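The alarm logic of an application like CM-1 can be sketched as a simple threshold check. The sensor names, limit band and function below are illustrative assumptions; only the 3°C tolerance comes from the text:

```python
# Illustrative limit bands per sensor (°C), not INCDTIM's actual config.
LIMITS_C = {"rack_row_1": (18.0, 24.0), "rack_row_2": (18.0, 24.0)}
TOLERANCE_C = 3.0  # alarm when the reading leaves the band by more than 3 °C

def check_alarm(sensor, reading_c):
    """Return True when the reading deviates from the established
    limits by more than the configured tolerance."""
    low, high = LIMITS_C[sensor]
    return reading_c < low - TOLERANCE_C or reading_c > high + TOLERANCE_C

print(check_alarm("rack_row_1", 28.5))  # 4.5 °C above the upper limit
print(check_alarm("rack_row_1", 23.0))  # within limits
```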

  2. Robust Satellite Techniques (RST) for monitoring earthquake prone areas by satellite TIR observations: The case of 1999 Chi-Chi earthquake (Taiwan)

    NASA Astrophysics Data System (ADS)

    Genzano, N.; Filizzola, C.; Paciello, R.; Pergola, N.; Tramutoli, V.

    2015-12-01

    For more than 13 years a multi-temporal data-analysis method, named Robust Satellite Techniques (RST), has been applied to satellite Thermal InfraRed (TIR) monitoring of seismically active regions. It gives a clear definition of a TIR anomaly within a validation/confutation scheme devoted to verifying whether detected anomalies can be associated with the time and location of the occurrence of major earthquakes. In this scheme, the confutation part (i.e. verifying that similar anomalies do not occur in the absence of significant seismic activity) assumes a role even more important than the usual validation component, which verifies the presence of anomalous signal transients before (or in association with) specific seismic events. Since 2001, the RST approach has been used to study tens of earthquakes with a wide range of magnitudes (from 4.0 to 7.9) that occurred on different continents and in various geo-tectonic settings. In this paper such long-term experience is exploited in order to give a quantitative definition of a significant sequence of TIR anomalies (SSTA) in terms of the required space-time continuity constraints (persistence), also identifying the different typologies of known spurious sequences of TIR anomalies that have to be excluded from the subsequent validation steps. On the same basis, also taking into account the physical models proposed to justify a correlation between TIR anomalies and earthquake occurrence, specific validation rules (in line with the ones used by the Collaboratory for the Study of Earthquake Predictability - CSEP - Project) have been defined to drive the validation process. 
In this work, such an approach is applied for the first time to a long-term dataset of night-time GMS-5/VISSR (Geostationary Meteorological Satellite/Visible and Infrared Spin-Scan Radiometer) TIR measurements, comparing SSTAs with earthquakes of M > 4 which occurred in a wide area around Taiwan in the month of September of different years (from 1995 to 2002). In this dataset the Chi-Chi earthquake (MW = 7.6), which occurred on September 20, 1999, represents the major, but not the only, event. The analysis shows that all identified SSTAs occur within the pre-fixed space-time window around earthquakes with M > 4. The false positive rate remains zero even if only earthquakes with M > 4.5 are considered. In the case of the Chi-Chi earthquake, 3 SSTAs were identified (all within the established space-time correlation window), one of them appearing about 2 weeks before the earthquake, very close to its epicentre and along the associated tectonic lineaments. The wide space-time window considered, together with the high seismicity of the area, certainly favoured the achieved results, so further analyses should be carried out using longer datasets and different geographic areas. However, also considering the coincidence with other (possible) precursor phenomena independently reported at the time of the Chi-Chi earthquake (particularly within the iSTEP project), the achieved results already seem sufficient (at least) to qualify TIR anomalies (identified by RST) among the parameters to be considered in the framework of a multi-parametric approach to a time-Dependent Assessment of Seismic Hazard (t-DASH).
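An RST-style TIR anomaly is defined relative to a multi-temporal reference built from observations of the same place and calendar period in previous years. A much-simplified sketch of such an index follows; the 2-sigma threshold and the brightness values are illustrative, not the paper's actual parameters:

```python
import statistics

def rst_index(current_value, historical_values):
    """Simplified RST-style anomaly index: excess of the current TIR
    signal over the multi-temporal mean, in units of the historical
    standard deviation at the same pixel and calendar period."""
    mu = statistics.mean(historical_values)
    sigma = statistics.stdev(historical_values)
    return (current_value - mu) / sigma

# Hypothetical September brightness temperatures (K) at one pixel
# from previous years of the reference dataset:
history = [291.0, 292.5, 290.8, 291.9, 292.2, 291.4]
index = rst_index(295.0, history)
print(index, index > 2.0)  # flagged as anomalous if above 2 sigma
```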

  3. Federal Radiological Monitoring and Assessment Center Overview of FRMAC Operations

    SciTech Connect

    1998-03-01

    In the event of a major radiological emergency, 17 federal agencies with various statutory responsibilities have agreed to coordinate their efforts at the emergency scene under the umbrella of the Federal Radiological Emergency Response Plan. This cooperative effort will ensure that all federal radiological assistance fully supports state and local efforts to protect the public. The mandated federal cooperation ensures that each agency can obtain the data critical to its specific responsibilities. This Overview of the Federal Radiological Monitoring and Assessment Center (FRMAC) describes FRMAC response activities during a major radiological emergency. It also describes the federal assets and subsequent operational activities which provide federal radiological monitoring and assessment of the off-site areas.

  4. Earthquake prediction

    NASA Technical Reports Server (NTRS)

    Turcotte, Donald L.

    1991-01-01

    The state of the art in earthquake prediction is discussed. Short-term prediction based on seismic precursors, changes in the ratio of compressional velocity to shear velocity, tilt and strain precursors, electromagnetic precursors, hydrologic phenomena, chemical monitors, and animal behavior is examined. Seismic hazard assessment is addressed, and the applications of dynamical systems to earthquake prediction are discussed.
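One of the precursors listed above, a change in the ratio of compressional to shear velocity, is classically estimated from a Wadati diagram: S-P times grow linearly with P travel time, with slope Vp/Vs - 1. A minimal sketch, with hypothetical arrival times:

```python
def vp_vs_ratio(tp_arrivals, ts_arrivals, origin_time):
    """Wadati-style estimate of Vp/Vs from P and S arrival times at
    several stations: the S-P time at each station is proportional to
    the P travel time, with slope (Vp/Vs - 1)."""
    pairs = [(tp - origin_time, ts - tp) for tp, ts in zip(tp_arrivals, ts_arrivals)]
    # least-squares slope through the origin: slope = sum(x*y) / sum(x*x)
    slope = sum(x * y for x, y in pairs) / sum(x * x for x, _ in pairs)
    return 1.0 + slope

tp = [10.0, 12.0, 15.0]    # hypothetical P arrivals (s after a reference time)
ts = [17.3, 20.76, 25.95]  # hypothetical S arrivals at the same stations
print(vp_vs_ratio(tp, ts, origin_time=0.0))  # close to the typical ~1.73
```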

  5. Towards real-time regional earthquake simulation I: real-time moment tensor monitoring (RMT) for regional events in Taiwan

    NASA Astrophysics Data System (ADS)

    Lee, Shiann-Jong; Liang, Wen-Tzong; Cheng, Hui-Wen; Tu, Feng-Shan; Ma, Kuo-Fong; Tsuruoka, Hiroshi; Kawakatsu, Hitoshi; Huang, Bor-Shouh; Liu, Chun-Chi

    2014-01-01

    We have developed a real-time moment tensor monitoring system (RMT) which takes advantage of a grid-based moment tensor inversion technique and real-time broad-band seismic recordings to automatically monitor earthquake activity in the vicinity of Taiwan. The centroid moment tensor (CMT) inversion technique and a grid search scheme are applied to obtain the earthquake source parameters, including the event origin time, hypocentral location, moment magnitude and focal mechanism. All of these source parameters can be determined simultaneously within 117 s of the occurrence of an earthquake. The monitoring area covers the entire island of Taiwan and the offshore region, from 119.3°E to 123.0°E and 21.0°N to 26.0°N, with depths from 6 to 136 km. A 3-D grid system is implemented in the monitoring area with a uniform horizontal interval of 0.1° and a vertical interval of 10 km. The inversion procedure is based on a 1-D Green's function database calculated by the frequency-wavenumber (fk) method. We compare our results with the Central Weather Bureau (CWB) catalogue for earthquakes that occurred between 2010 and 2012. The average differences in event origin time and hypocentral location are less than 2 s and 10 km, respectively. The focal mechanisms determined by RMT are also comparable with the Broadband Array in Taiwan for Seismology (BATS) CMT solutions. These results indicate that the RMT system is reliable and efficient for monitoring local seismic activity. In addition, the time needed to obtain all the point-source parameters is reduced substantially compared to routine earthquake reports. By connecting RMT with a real-time online earthquake simulation (ROS) system, all the source parameters will be forwarded to the ROS to make real-time earthquake simulation feasible. 
The RMT has operated offline (2010-2011) and online (since January 2012) at the Institute of Earth Sciences (IES), Academia Sinica (http://rmt.earth.sinica.edu.tw). The long-term goal of this system is to provide real-time source information for rapid seismic hazard assessment during large earthquakes.
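The grid-search idea behind RMT - evaluate the fit between observed waveforms and precomputed synthetics at every grid node, and keep the best-fitting node - can be illustrated with a toy example. The real system inverts for a full moment tensor from its Green's function database at each node; everything below (node coordinates, waveforms) is made up for illustration:

```python
import math

def grid_search_source(observed, synthetics_by_node):
    """Toy grid search: for each candidate (lon, lat, depth) node with a
    precomputed synthetic waveform, compute the L2 misfit to the
    observed waveform and return the best-fitting node."""
    best_node, best_misfit = None, math.inf
    for node, syn in synthetics_by_node.items():
        misfit = sum((o - s) ** 2 for o, s in zip(observed, syn))
        if misfit < best_misfit:
            best_node, best_misfit = node, misfit
    return best_node, best_misfit

obs = [0.0, 1.0, 0.5, -0.5]  # hypothetical observed samples
nodes = {
    (121.0, 23.5, 10): [0.0, 0.9, 0.6, -0.4],  # close match
    (121.1, 23.5, 20): [0.2, 0.1, 0.0, 0.3],   # poor match
}
print(grid_search_source(obs, nodes))  # the shallow node wins
```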

  6. Hydrochemical monitoring results in relation to the vogtland-nw bohemian earthquake swarm period 2000

    NASA Astrophysics Data System (ADS)

    Kämpf, H.; Bräuer, K.; Dulski, P.; Faber, E.; Koch, U.; Mrlina, J.; Strauch, G.; Weise, S. M.

    2003-04-01

    The Vogtland-NW Bohemian earthquake swarm area in Central Europe is characterised by carbon dioxide-rich mineral springs and mofettes. The August-December 2000 earthquake period was the strongest since the December 1985/86 swarms that occurred in the area of Novy Kostel, Czech Republic. Here, we present first results of long-term hydrochemical monitoring studies before, during and after the 2000 swarm period. The 2000 swarm lasted from August 28 until December 26 and consisted of altogether nine sub-swarm episodes, each lasting for several days. At the mineral spring Wettinquelle, Bad Brambach/Germany, the water chemistry and the isotope (D, 18O) composition have been monitored weekly and fortnightly, respectively, since May 2000. The Wettinquelle spring is located about 10 km from the epicentral area of Novy Kostel. The aim of our investigation was to look for seismically induced or seismically coupled changes in the chemical and isotope composition of the mineral water. We had to separate seismohydrological effects from seasonally and hydrologically caused changes. Seasonally caused shifts were found for water temperature and the alkali elements (Li, Na, K, Rb and Cs), as well as for discharge, conductivity, hydrogen carbonate concentration, and the concentrations of the alkaline earths (Ca, Mg, Sr). Strain-related anomalies which could influence the hydrogeochemistry of the mineral water seem to be visible in the iron concentration of the spring water and in the methane concentration of the free gas component, and probably caused changes in the groundwater level of well H3, located about 5 km SE of the Wettinquelle at Skalna.

  7. Federal Radiological Monitoring and Assessment Center Monitoring Manual Volume 2, Radiation Monitoring and Sampling

    SciTech Connect

    NSTec Aerial Measurement Systems

    2012-07-31

    The FRMAC Monitoring and Sampling Manual, Volume 2 provides standard operating procedures (SOPs) for field radiation monitoring and sample collection activities that are performed by the Monitoring group during a FRMAC response to a radiological emergency.

  8. Student-centered Experiments on Earthquake Occurrence Using the Seismic/Eruption Program

    NASA Astrophysics Data System (ADS)

    Barker, J. S.; Jones, A. L.; Hubenthal, M.

    2005-12-01

    Seismic/Eruption is a free Windows program that plots the locations of earthquakes and volcanic eruptions through time on maps of the world or of various geographical areas. The hypocenter database can be updated via the internet to include the NEIC catalog from 1960 to the present. Many teaching activities based on this program (e.g. Braile and Braile, 2001) can help students draw conclusions about the distribution and rate of occurrence of earthquakes. In this activity students, individually or in small groups, select a seismically active region of interest and make their own map. They select a time window, perhaps 20 years. By changing the minimum magnitude setting in Seismic/Eruption and replaying the plots, they observe first-hand that large earthquakes occur less often than smaller earthquakes. The total number of earthquakes plotted is easily read from a counter on the screen. Students compile a table of the number of earthquakes per year with magnitude greater than or equal to a given threshold, for a range of magnitude thresholds. These counts are then plotted on semi-log paper in the form of a Gutenberg-Richter plot. Connecting the points on the plot allows students to see a linear trend, and to think about why there may be departures from that trend for very small and very large magnitudes. If they assume earthquake occurrence is equally distributed in time, they can predict how often an earthquake of a given magnitude is likely to occur in their chosen region. They can also replay Seismic/Eruption to see whether that assumption is valid. Allowing students to interrogate the most accurate, complete and up-to-date earthquake catalog for a region of their own choosing gives them ownership of the experiment. Students may choose the area of a recent newsworthy earthquake (e.g. Sumatra), or their family's ancestral region, or an area they are studying in another class. 
Students should be encouraged to pose questions and hypotheses about earthquake occurrence, knowing that they have the data and a display tool at hand to answer those questions.
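The tabulation-and-fit exercise described above can be scripted directly; a sketch, assuming a synthetic catalog in place of the NEIC magnitudes the students would read off the Seismic/Eruption counter:

```python
import math

def gutenberg_richter_counts(magnitudes, thresholds):
    """Cumulative counts N(>=M) for a set of magnitude thresholds."""
    return [sum(1 for m in magnitudes if m >= t) for t in thresholds]

def b_value(magnitudes, thresholds):
    """Least-squares slope of log10 N(>=M) versus M; its negative is
    the Gutenberg-Richter b-value the students read off their plot."""
    counts = gutenberg_richter_counts(magnitudes, thresholds)
    xs, ys = zip(*[(t, math.log10(n)) for t, n in zip(thresholds, counts) if n > 0])
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return -slope

# Synthetic catalog: tenfold fewer events per unit magnitude (b ~ 1).
catalog = [4.1] * 1000 + [5.1] * 100 + [6.1] * 10 + [7.1]
print(gutenberg_richter_counts(catalog, [4.0, 5.0, 6.0, 7.0]))
print(b_value(catalog, [4.0, 5.0, 6.0, 7.0]))  # close to 1
```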

  9. Monitoring of the stress state variations of the Southern California for the purpose of earthquake prediction

    NASA Astrophysics Data System (ADS)

    Gokhberg, M.; Garagash, I.; Bondur, V.; Steblov, G. M.

    2014-12-01

    A three-dimensional geomechanical model of Southern California was developed, including mountain relief, fault tectonics and characteristic internal features such as the roof of the consolidated crust and the Moho surface. The initial stress state of the model is governed by gravitational forces and by horizontal tectonic motions estimated from GPS observations. The analysis shows that three-dimensional geomechanical models allow monitoring of changes in the stress state during the seismic process, in order to constrain the distribution of future places of increasing seismic activity. This investigation demonstrates one possible approach to monitoring upcoming seismicity over periods of days to weeks to months. Continuous analysis of the stress state was carried out during 2009-2014. Each new earthquake with M ~ 1 and above from the USGS catalog was considered as a new defect of the Earth's crust, which has some definite size and causes redistribution of the stress state. The overall calculation technique was based on a single damage function of the Earth's crust, recalculated every half month. As a result, every half month, in the upper crustal layers and partially in the middle layers, we revealed the locations of the maximal values of the stress-state parameters: elastic energy density, shear stress, and proximity of the crustal layers to their strength limit. All these parameters exhibit similar spatial and temporal distributions. As follows from the observations, all four strongest events with M ~ 5.5-7.2 that occurred in Southern California during the analyzed period were preceded by anomalies in these parameters, with lead times of weeks to months, within 10-50 km of the upcoming earthquake. After each event the stress-state source disappeared. The figure shows the migration of the maxima of the gradients of the stress-state variations (parameter D) in the vicinity of the epicenter of the April 4, 2010, M = 7.2 earthquake during the period January 1 to May 1, 2010. 
Grey lines show the major faults. In the table the values are sampled at 2-week intervals; "-" indicates time before the event, "+" indicates time after the event.
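The stress-state parameters tracked each half month lend themselves to a compact numerical illustration. The sketch below (Python; the material constants and the simple Mohr-Coulomb strength criterion are made-up illustrative choices, not the authors' actual model) computes the three quantities named in the abstract for a 2-D principal stress state:

```python
def stress_indicators(sigma1, sigma3, mu=0.6, cohesion=10.0, E=70e3, nu=0.25):
    """Illustrative 2-D indicators of the kind tracked in the abstract
    (all stresses in MPa; E, nu, mu, cohesion are assumed values):
    maximum shear stress, elastic strain-energy density (plane stress),
    and a Coulomb 'proximity to the strength limit' ratio."""
    tau_max = (sigma1 - sigma3) / 2.0            # maximum shear stress
    sigma_mean = (sigma1 + sigma3) / 2.0
    # Plane-stress elastic strain-energy density, MPa (= MJ/m^3)
    energy = (sigma1**2 + sigma3**2 - 2*nu*sigma1*sigma3) / (2*E)
    # Coulomb strength on the optimally oriented plane
    strength = cohesion + mu * sigma_mean
    proximity = tau_max / strength               # 1.0 = at the strength limit
    return tau_max, energy, proximity
```

Recomputing these fields after each new "defect" (earthquake) and mapping their maxima is, in essence, the half-monthly procedure the abstract describes.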

  10. Advanced earthquake monitoring system for U.S. Department of Veterans Affairs medical buildings--instrumentation

    USGS Publications Warehouse

    Kalkan, Erol; Banga, Krishna; Ulusoy, Hasan S.; Fletcher, Jon Peter B.; Leith, William S.; Reza, Shahneam; Cheng, Timothy

    2012-01-01

    In collaboration with the U.S. Department of Veterans Affairs (VA), the National Strong Motion Project (NSMP; http://nsmp.wr.usgs.gov/) of the U.S. Geological Survey has been installing sophisticated seismic systems that will monitor the structural integrity of 28 VA hospital buildings located in seismically active regions of the conterminous United States, Alaska, and Puerto Rico during earthquake shaking. These advanced monitoring systems, which combine the use of sensitive accelerometers and real-time computer calculations, are designed to determine the structural health of each hospital building rapidly after an event, helping the VA to ensure the safety of patients and staff. This report presents the instrumentation component of this project by providing details of each hospital building, including a summary of its structural, geotechnical, and seismic hazard information, as well as instrumentation objectives and design. The structural-health monitoring component of the project, including data retrieval and processing, damage detection and localization, automated alerting system, and finally data dissemination, will be presented in a separate report.

  11. The effects of educational program on health volunteers’ knowledge regarding their approach to earthquake in health centers in Tehran

    PubMed Central

    JOUHARI, ZAHRA; PIRASTEH, AFSHAR; GHASSEMI, GHOLAM REZA; BAZRAFKAN, LEILA

    2015-01-01

    Introduction People's mental, intellectual and physical non-readiness to confront an earthquake may result in disastrous outcomes. This research aimed to study the effects of a training intervention on health connectors' knowledge regarding their approach to earthquakes in health-training centers in the east of Tehran. Methods This semi-experimental study was designed and executed in 2011, using a questionnaire with items based on information from the Crisis Management Organization. After a pilot study and after establishing the validity and reliability of the questionnaire, we determined the sample size. The questionnaires were then completed, before and after the training program, by 82 health connectors at health-treatment centers in the east of Tehran. Finally, the collected data were analyzed in SPSS 14, using the paired-sample t-test and Pearson's correlation coefficient. Results The health connectors were women with a mean age of 43.43±8.51 years. The mean score of the connectors' knowledge before and after the training was 35.15±4.3 and 43.73±2.91 out of 48, respectively; the difference was statistically significant (p=0.001). Classes were the most important source of information for the health connectors. Conclusion People's knowledge of how to confront an earthquake can be increased by holding training courses and workshops. Such courses and workshops play an important role in knowledge transfer and in the readiness of health connectors. PMID:25927068
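The before/after comparison above rests on a paired-sample t-test. A minimal sketch of that statistic, computed on invented scores (not the study's data), using only the Python standard library:

```python
import math
import statistics

def paired_t(before, after):
    """Paired-sample t statistic: mean of the per-subject differences
    divided by its standard error. Scores below are made up."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)          # sample standard deviation (n-1)
    return mean_d / (sd_d / math.sqrt(n))

# Hypothetical knowledge scores (out of 48) for six participants
before = [34, 36, 31, 38, 35, 33]
after = [43, 45, 41, 46, 44, 42]
t = paired_t(before, after)                 # large t -> significant improvement
```

The p-value would then be read from the t distribution with n-1 degrees of freedom (SPSS does this step automatically).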

  12. Seismic ACROSS Transmitter Installed at Morimachi above the Subducting Philippine Sea Plate for the Test Monitoring of the Seismogenic Zone of Tokai Earthquake not yet to Occur

    NASA Astrophysics Data System (ADS)

    Kunitomo, T.; Kumazawa, M.; Masuda, T.; Morita, N.; Torii, T.; Ishikawa, Y.; Yoshikawa, S.; Katsumata, A.; Yoshida, Y.

    2008-12-01

    Here we report the first seismic monitoring system in active, constant operation for the wave-propagation characteristics of the tectonic region just above the subducting plate that drives coming catastrophic earthquakes. Development of such a system (ACROSS: Accurately Controlled, Routinely Operated Signal System) started in 1994 at Nagoya University and, since 1996, also at the Tono Geoscience Center (TGC) of JAEA, prompted by the Hyogoken Nanbu Earthquake (1995 Jan. 17, Mj = 7.3). ACROSS is a technology system, including a theory of signal and data processing, based on a new measurement concept: the Green function between a signal source and an observation site. The work done on the first-generation system is reported at IWAM04 and in a JAEA report (Kumazawa et al., 2007). The Meteorological Research Institute of JMA started a project of test monitoring of the Tokai area in 2004, in cooperation with Shizuoka University, to bring the seismic ACROSS into practical use for earthquake prediction research. The first target was set to the Tokai Earthquake, which has not yet taken place. The seismic ACROSS transmitter was designed to be appropriate for sensitive monitoring of the deep active fault zone, on the basis of the technology elements accumulated so far. The ground coupler (antenna) is a large steel-reinforced concrete block (over 20 m3) installed in the basement rocks in order to preserve stability. The eccentric moment of the rotary transmitter is 82 kgm at maximum, 10 times larger than that of the first generation. The carrier frequency of the FM signal for practical use can range from 3.5 to 15 Hz, and the signal phase is accurately controlled by a motor with a vector inverter synchronized to a GPS clock, with a precision of 10^-4 radian or better. 
By referring to the existing structure model of this area (Iidaka et al., 2003), the site of the transmitting station was chosen at Morimachi so as to be appropriate for detecting the wave reflected from the anticipated fault plane of the Tokai Earthquake, the boundary between the Eurasian lithosphere and the subducting Philippine Sea Plate. Several further trials of a new transmission protocol and of remote control are being made for the transmitter network of the next generation. The whole system appears to be working well, as reported by Yoshida et al. (2008, this meeting).
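The phase-controlled FM source signal described above can be illustrated in a few lines. In the sketch below (Python; the carrier, sweep width and modulation rate are made-up values, not the Morimachi settings) the phase is built as the integral of the instantaneous frequency, as in any FM transmission:

```python
import math

def fm_phase(t, f_carrier=10.0, f_mod=0.05, df=5.0):
    """Phase (radians) of an illustrative FM source signal whose
    instantaneous frequency sweeps sinusoidally by +/- df (Hz) around
    f_carrier at modulation rate f_mod. The phase is the time integral
    of 2*pi times the instantaneous frequency."""
    return 2 * math.pi * (f_carrier * t
                          - (df / (2 * math.pi * f_mod)) * math.cos(2 * math.pi * f_mod * t))

def fm_signal(t):
    """Unit-amplitude transmitted waveform at time t (seconds)."""
    return math.sin(fm_phase(t))
```

In the real system this phase is what the vector-inverter motor locks to the GPS clock; here it is just a function of time.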

  13. Development a Heuristic Method to Locate and Allocate the Medical Centers to Minimize the Earthquake Relief Operation Time

    PubMed Central

    AGHAMOHAMMADI, Hossein; SAADI MESGARI, Mohammad; MOLAEI, Damoon; AGHAMOHAMMADI, Hasan

    2013-01-01

    Background Location-allocation is a combinatorial optimization problem and is classified as NP-hard (Non-deterministic Polynomial-time hard). Solution of such a problem should therefore be shifted from exact to heuristic or meta-heuristic methods, owing to the complexity of the problem. Locating medical centers and allocating the injured of an earthquake to them is highly important in earthquake disaster management, so developing a proper method will reduce the time of the relief operation and consequently decrease the number of fatalities. Methods: This paper presents the development of a heuristic method, based on two nested genetic algorithms, to optimize this location-allocation problem using the capabilities of a Geographic Information System (GIS). In the proposed method, the outer genetic algorithm is applied to the location part of the problem and the inner genetic algorithm is used to optimize the resource allocation. Results: The final outcome of the implemented method includes the spatial locations of the new required medical centers. The method also calculates how many of the injured at each demand point should be taken to each of the existing and new medical centers. Conclusions: The results of the proposed method showed the high performance of the designed structure in solving a capacitated location-allocation problem that may arise in a disaster situation, when injured people have to be taken to medical centers in a reasonable time. PMID:23514709
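The nested-GA structure described above can be sketched compactly: an outer GA searches over subsets of candidate sites, and its fitness function runs an inner GA that allocates the injured to the open centers under capacity limits. The toy data, penalty weight and GA settings below are invented for illustration and are not taken from the paper:

```python
import random

def ga(init, mutate, fitness, pop_size, gens, seed):
    """Minimal genetic algorithm: mutation plus truncation selection."""
    rng = random.Random(seed)
    pop = [init(rng) for _ in range(pop_size)]
    for _ in range(gens):
        pop += [mutate(rng, ind) for ind in pop]
        pop.sort(key=fitness)          # keep the fittest half (elitism)
        pop = pop[:pop_size]
    return pop[0]

# Toy, hypothetical data: candidate center positions and demand points
# (position, number of injured) on a line; each center can treat 60 people.
sites = [0.0, 2.0, 5.0, 9.0]
demands = [(1.0, 30), (4.0, 50), (8.0, 20)]
CAPACITY = 60
N_OPEN = 2

def alloc_cost(open_sites):
    """Inner GA: assign every demand point to one open center, minimizing
    total person-distance with a stiff penalty for exceeding capacity."""
    def init(rng):
        return tuple(rng.randrange(len(open_sites)) for _ in demands)
    def mutate(rng, ind):
        if rng.random() < 0.5:         # occasional random restart keeps diversity
            return init(rng)
        ind = list(ind)
        ind[rng.randrange(len(ind))] = rng.randrange(len(open_sites))
        return tuple(ind)
    def fitness(ind):
        cost, load = 0.0, [0] * len(open_sites)
        for (pos, injured), c in zip(demands, ind):
            cost += injured * abs(pos - open_sites[c])
            load[c] += injured
        return cost + 10000 * sum(max(0, l - CAPACITY) for l in load)
    return fitness(ga(init, mutate, fitness, pop_size=16, gens=25, seed=1))

# Outer GA: choose which N_OPEN candidate sites to open.
def outer_fitness(subset):
    return alloc_cost([sites[i] for i in subset])

def outer_init(rng):
    return tuple(sorted(rng.sample(range(len(sites)), N_OPEN)))

best_sites = ga(outer_init, lambda rng, ind: outer_init(rng),
                outer_fitness, pop_size=8, gens=10, seed=2)
```

In the paper the inner optimizer handles real road distances and capacities from GIS layers; the nesting pattern, however, is exactly this: the outer fitness of a site subset is the optimized inner allocation cost.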

  14. Monitoring of fluid-rock interaction in active fault zones: a new method of earthquake prediction/forecasting?

    NASA Astrophysics Data System (ADS)

    Claesson, L.; Skelton, A.; Graham, C.; Dietl, C.; Morth, M.; Torssander, P.

    2003-12-01

    We propose a new method for earthquake forecasting based on the "prediction in hindsight" of a Mw 5.8 earthquake on Iceland on September 16, 2002. The "prediction in hindsight" is based on geochemical monitoring of geothermal water at site HU-01, located within the Tjörnes Fracture Zone, northern Iceland, before and after the earthquake. During the 4 weeks before the earthquake, exponential (<800%) increases in the concentrations of Cu, Zn and Fe in the fluid were measured, together with a linear increase of Na/Ca and a slight increase of ÎŽ18O. We relate the hydrogeochemical changes before the earthquake to an influx of fluid that had interacted with the host rock at higher temperatures, and suggest that fluid flow was facilitated by stress-induced modification of rock permeability, which enabled more rapid fluid-rock interaction. Stepwise increases (13-35%) in the concentrations of Ba, Ca, K, Li, Na, Rb, S, Si, Sr, Cl, Br and SO4, and negative shifts in ÎŽ18O and ÎŽD, were detected in the fluid immediately after the earthquake; we relate these to seismically induced source switching and a consequent influx of older (or purer) ice-age meteoric waters. The newly tapped source reservoir has a chemically and isotopically distinct ice-age meteoric water signature, the result of a longer residence time in the crust. The immediacy of these changes is consistent with experimentally derived timescales of fault sealing in response to coupled deformation and fluid flow, interpreted as source switching. These precursory changes may be used to "predict" the earthquake up to 2 weeks before it occurs.

  15. Products and Services Available from the Southern California Earthquake Data Center (SCEDC) and the Southern California Seismic Network (SCSN)

    NASA Astrophysics Data System (ADS)

    Yu, E.; Chen, S.; Chowdhury, F.; Bhaskaran, A.; Hutton, K.; Given, D.; Hauksson, E.; Clayton, R. W.

    2009-12-01

    The SCEDC archives continuous and triggered data from nearly 3000 data channels from 375 SCSN recorded stations. The SCSN and SCEDC process and archive an average of 12,000 earthquakes each year, contributing to the southern California earthquake catalog that spans from 1932 to present. The SCEDC provides public, searchable access to these earthquake parametric and waveform data through its website www.data.scec.org and through client applications such as STP, NETDC and DHI. New data products: ‱ The SCEDC is distributing synthetic waveform data from the 2008 ShakeOut scenario (Jones et al., USGS Open File Rep., 2008-1150; Graves et al., 2008, Geophys. Res. Lett.), an M 7.8 earthquake on the southern San Andreas fault. Users can download 40 sps velocity waveforms in SAC format from the SCEDC website. The SCEDC is also distributing synthetic GPS data (Crowell et al., 2009, Seismol. Res. Lett.) for this scenario. ‱ The SCEDC has added a new web page showing the latest tomographic model of Southern California, based on Tape et al. (2009, Science). New data services: ‱ The SCEDC is exporting data in QuakeML format, an XML format adopted by the Advanced National Seismic System (ANSS); these data will also be available as a web service. ‱ The SCEDC is exporting data in StationXML format, an XML format created by the SCEDC and adopted by ANSS to fully describe station metadata; these data will also be available as a web service. ‱ The stp 1.6 client can now access both the SCEDC and the Northern California Earthquake Data Center (NCEDC) earthquake and waveform archives. In progress: SCEDC to distribute 1 sps GPS data in miniSEED format. ‱ 
As part of a NASA Advanced Information Systems Technology project in collaboration with Jet Propulsion Laboratory and Scripps Institution of Oceanography, the SCEDC will receive real time 1 sps streams of GPS displacement solutions from the California Real Time Network (http://sopac.ucsd.edu/projects/realtime; Genrich and Bock, 2006, J. Geophys. Res.). These channels will be archived at the SCEDC as miniSEED waveforms, which then can be distributed to the user community via applications such as STP.

  16. Self-Powered WSN for Distributed Data Center Monitoring.

    PubMed

    Brunelli, Davide; Passerone, Roberto; Rizzon, Luca; Rossi, Maurizio; Sartori, Davide

    2016-01-01

    Monitoring environmental parameters in data centers is nowadays gathering increasing attention from industry, due to the need for high energy efficiency in cloud services. We present the design and characterization of an energy-neutral embedded wireless system, prototyped to perpetually monitor environmental parameters in servers and racks. It is powered by an energy harvesting module based on thermoelectric generators, which converts the heat dissipated by the servers. Starting from the empirical characterization of the energy harvester, we present a power conditioning circuit optimized for the specific application. The whole system has been enhanced with several sensors. An ultra-low-power microcontroller stacked over the energy harvesting module provides efficient power management. Performance has been assessed and compared with the analytical model for validation. PMID:26729135

  17. Self-Powered WSN for Distributed Data Center Monitoring

    PubMed Central

    Brunelli, Davide; Passerone, Roberto; Rizzon, Luca; Rossi, Maurizio; Sartori, Davide

    2016-01-01

    Monitoring environmental parameters in data centers is nowadays gathering increasing attention from industry, due to the need for high energy efficiency in cloud services. We present the design and characterization of an energy-neutral embedded wireless system, prototyped to perpetually monitor environmental parameters in servers and racks. It is powered by an energy harvesting module based on thermoelectric generators, which converts the heat dissipated by the servers. Starting from the empirical characterization of the energy harvester, we present a power conditioning circuit optimized for the specific application. The whole system has been enhanced with several sensors. An ultra-low-power microcontroller stacked over the energy harvesting module provides efficient power management. Performance has been assessed and compared with the analytical model for validation. PMID:26729135

  18. Early Results of Three-Year Monitoring of Red Wood Ants' Behavioral Changes and Their Possible Correlation with Earthquake Events.

    PubMed

    Berberich, Gabriele; Berberich, Martin; Grumpe, Arne; Wöhler, Christian; Schreiber, Ulrich

    2013-01-01

    Short-term earthquake predictions with an advance warning of several hours or days are currently not possible due to both incomplete understanding of the complex tectonic processes and inadequate observations. Abnormal animal behaviors before earthquakes have been reported previously, but create problems in monitoring and reliability. The situation is different with red wood ants (RWA; Formica rufa-group (Hymenoptera: Formicidae)). They have stationary mounds on tectonically active, gas-bearing fault systems. These faults may be potential earthquake areas. For three years (2009-2012), two red wood ant mounds (Formica rufa-group), located at the seismically active Neuwied Basin (Eifel, Germany), have been monitored 24/7 by high-resolution cameras with both a color and an infrared sensor. Early results show that ants have a well-identifiable standard daily routine. Correlation with local seismic events suggests changes in the ants' behavior hours before the earthquake: the nocturnal rest phase and daily activity are suppressed, and standard daily routine does not resume until the next day. At present, an automated image evaluation routine is being applied to the more than 45,000 hours of video streams. Based on this automated approach, a statistical analysis of the ants' behavior will be carried out. In addition, other parameters (climate, geotectonic and biological), which may influence behavior, will be included in the analysis. PMID:26487310

  19. Results of seismological monitoring in the Cascade Range 1962-1989: earthquakes, eruptions, avalanches and other curiosities

    USGS Publications Warehouse

    Weaver, C.S.; Norris, R.D.; Jonientz-Trisler, C.

    1990-01-01

    Modern monitoring of seismic activity at Cascade Range volcanoes began at Longmire on Mount Rainier in 1958. Since then, there has been an expansion of the regional seismic networks in Washington, northern Oregon and northern California. Now, the Cascade Range from Lassen Peak to Mount Shasta in the south and Newberry Volcano to Mount Baker in the north is being monitored for earthquakes as small as magnitude 2.0, and many of the stratovolcanoes are monitored for non-earthquake seismic activity. This monitoring has yielded three major observations. First, tectonic earthquakes are concentrated in two segments of the Cascade Range between Mount Rainier and Mount Hood and between Mount Shasta and Lassen Peak, whereas little seismicity occurs between Mount Hood and Mount Shasta. Second, the volcanic activity and associated phenomena at Mount St. Helens have produced intense and widely varied seismicity. And third, at the northern stratovolcanoes, signals generated by surficial events such as debris flows, icequakes, steam emissions, rockfalls and icefalls are seismically recorded. Such records have been used to alert authorities of dangerous events in progress. -Authors

  20. Magma Ascent to Submarine Volcanoes: Real-Time Monitoring by Means of Teleseismic Observations of Earthquake Swarms

    NASA Astrophysics Data System (ADS)

    Spicak, A.; Vanek, J.; Kuna, V. M.

    2013-12-01

    Earthquake swarm occurrence is among the reliable indicators of magmatic activity in the Earth's crust. Swarms beneath the submarine portions of volcanic arcs bring valuable information on the plumbing systems of this insufficiently understood environment and reveal recently active submarine volcanoes. Use of teleseismically recorded data (NEIC, GCMT Project) makes it possible to observe magmatic activity in almost real time. We analysed the seismicity pattern in two areas: the Andaman-Nicobar region in April 2012 and the southern Ryukyu in April 2013. In both regions, the swarms are situated 80-100 km above the Wadati-Benioff zone of the subducting slab. Foci of the swarm earthquakes delimit a seismogenic layer at depths between 9 and 35 km that should be formed by a brittle, fractured rock environment. The repeated occurrence of earthquakes clustered in swarms excludes large accumulations of melted rock in this layer; magma reservoirs should be situated at depths greater than 35 km. Upward magma migration from deeper magma reservoirs to shallow magma chambers or to the seafloor induces earthquake swarms by increasing tectonic stress and/or decreasing friction on faults. The frequency of earthquake swarm occurrence in the investigated areas makes a volcanic eruption at the seafloor probable. Moreover, the epicentral zones of the swarms often coincide with distinct elevations of the seafloor: seamounts and seamount ranges. The high accuracy of global seismological data also made it possible to observe the migration of earthquakes during individual swarms (Fig. 1), probably reflecting dike and/or sill propagation. Triggering of earthquake swarms by distant strong earthquakes was repeatedly observed in the Andaman-Nicobar region. The presented study documents the high accuracy of the hypocentral determinations published by the above-mentioned data centers and the usefulness of the EHB relocation procedure. 
Epicentral map of the October 2002 earthquake swarm in southern Ryukyu showing E-W migration of events during the swarm. The swarm occurred during 29 hours on October 23 - 25 in the magnitude range 4.0 - 5.2. Open circles - epicenters of all 54 events of the swarm; red circles - epicenters of events that occurred in a particular time interval of the swarm development: (a) - starting 3 hours; (b) - following 4 hours; (c) - final 22 hours.

  1. The Evolution of the Federal Monitoring and Assessment Center

    SciTech Connect

    NSTec Aerial Measurement System

    2012-07-31

    The Federal Radiological Monitoring and Assessment Center (FRMAC) is a federal emergency response asset whose assistance may be requested by the Department of Homeland Security (DHS), the Department of Defense (DoD), the Environmental Protection Agency (EPA), the Nuclear Regulatory Commission (NRC), and state and local agencies to respond to a nuclear or radiological incident. It is an interagency organization with representation from the Department of Energy’s National Nuclear Security Administration (DOE/NNSA), the Department of Defense (DoD), the Environmental Protection Agency (EPA), the Department of Health and Human Services (HHS), the Federal Bureau of Investigation (FBI), and other federal agencies. FRMAC, in its present form, was created in 1987 when the radiological support mission was assigned to the DOE’s Nevada Operations Office by DOE Headquarters. The FRMAC asset, including its predecessor entities, was created, grew, and evolved to function as a response to radiological incidents. Radiological emergency response exercises showed the need for a coordinated approach to managing federal emergency monitoring and assessment activities. The mission of FRMAC is to coordinate and manage all federal radiological environmental monitoring and assessment activities during a nuclear or radiological incident within the United States in support of state, local, and tribal governments, DHS, and the federal coordinating agency. Radiological emergency response professionals with the DOE’s national laboratories support the Radiological Assistance Program (RAP), the National Atmospheric Release Advisory Center (NARAC), the Aerial Measuring System (AMS), and the Radiation Emergency Assistance Center/Training Site (REAC/TS). 
These teams support the FRMAC to provide: atmospheric transport modeling; radiation monitoring; radiological analysis and data assessments; and medical advice for radiation injuries. In support of field operations, the FRMAC provides geographic information systems, communications, mechanical, electrical, logistics, and administrative support. The size of the FRMAC is tailored to the incident, and it comprises emergency response professionals drawn from across the federal government. State and local emergency response teams may also integrate their operations with FRMAC, but are not required to.

  2. Emergency radiological monitoring and analysis United States Federal Radiological Monitoring and Assessment Center

    SciTech Connect

    Thome, D.J.

    1994-09-01

    The United States Federal Radiological Emergency Response Plan (FRERP) provides the framework for integrating the various Federal agencies responding to a major radiological emergency. Following a major radiological incident the FRERP authorizes the creation of the Federal Radiological Monitoring and Assessment Center (FRMAC). The FRMAC is established to coordinate all Federal agencies involved in the monitoring and assessment of the off-site radiological conditions in support of the impacted states and the Lead Federal Agency (LFA). Within the FRMAC, the Monitoring and Analysis Division is responsible for coordinating all FRMAC assets involved in conducting a comprehensive program of environmental monitoring, sampling, radioanalysis and quality assurance. This program includes: (1) Aerial Radiological Monitoring - Fixed Wing and Helicopter, (2) Field Monitoring and Sampling, (3) Radioanalysis - Mobile and Fixed Laboratories, (4) Radiation Detection Instrumentation - Calibration and Maintenance, (5) Environmental Dosimetry, and (6) An integrated program of Quality Assurance. To assure consistency, completeness and the quality of the data produced, a methodology and procedures handbook is being developed. This paper discusses the structure, assets and operations of FRMAC monitoring and analysis and the content and preparation of this handbook.

  3. Earthquakes for Kids

    MedlinePLUS


  4. Products and Services Available from the Southern California Earthquake Data Center (SCEDC) and the Southern California Seismic Network (SCSN)

    NASA Astrophysics Data System (ADS)

    Yu, E.; Bhaskaran, A.; Chen, S.; Chowdhury, F. R.; Meisenhelter, S.; Hutton, K.; Given, D.; Hauksson, E.; Clayton, R. W.

    2010-12-01

    Currently the SCEDC archives continuous and triggered data from nearly 5000 data channels from 425 SCSN recorded stations, processing and archiving an average of 12,000 earthquakes each year. The SCEDC provides public access to these earthquake parametric and waveform data through its website www.data.scec.org and through client applications such as STP and DHI. This poster will describe the most significant developments at the SCEDC in the past year. Updated hardware: ‱ The SCEDC has more than doubled its waveform file storage capacity by migrating to 2 TB disks. New data holdings: ‱ Waveform data: Beginning Jan 1, 2010 the SCEDC began continuously archiving all high-sample-rate strong-motion channels. All seismic channels recorded by SCSN are now continuously archived and available at SCEDC. ‱ Portable data from the El Mayor-Cucapah 7.2 sequence: Seismic waveforms from portable stations installed by researchers (contributed by Elizabeth Cochran, Jamie Steidl, and Octavio Lazaro-Mancilla) have been added to the archive and are accessible through STP, either as continuous data or associated with events in the SCEDC earthquake catalog. This additional data will help SCSN analysts and researchers improve event locations from the sequence. ‱ Real-time GPS solutions from the El Mayor-Cucapah 7.2 event: Three-component 1 Hz seismograms of California Real Time Network (CRTN) GPS stations, from the April 4, 2010, magnitude 7.2 El Mayor-Cucapah earthquake, are available in SAC format at the SCEDC. These time series were created by Brendan Crowell, Yehuda Bock (the project PI), and Mindy Squibb at SOPAC using data from the CRTN. The El Mayor-Cucapah earthquake demonstrated definitively the power of real-time high-rate GPS data: they measure dynamic displacements directly, they do not clip, and they detect the permanent (coseismic) surface deformation. ‱ 
Triggered data from the Quake Catcher Network (QCN) and Community Seismic Network (CSN): The SCEDC, in cooperation with QCN and CSN, is exploring ways to archive and distribute data from high-density, low-cost networks. As a starting point the SCEDC will store a dataset from QCN and CSN and distribute it through a separate STP client. New archival methods: ‱ The SCEDC is exploring the feasibility of archiving and distributing waveform data using cloud computing such as Google Apps. A month of continuous data from the SCEDC archive will be stored in Google Apps and a client developed to access it in a manner similar to STP. XML formats: ‱ The SCEDC is now distributing earthquake parameter data through web services in QuakeML format. ‱ The SCEDC, in collaboration with the Northern California Earthquake Data Center (NCEDC) and USGS Golden, has reviewed and revised the StationXML format to produce version 2.0. The new version includes rules on extending the schema, the use of named complex types, and greater consistency in naming conventions. Based on this work we plan to develop readers and writers of the StationXML format.
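Because QuakeML is an XML schema, parametric data delivered in that format can be consumed with an ordinary XML parser. A minimal sketch using the Python standard library (the embedded fragment is hand-written for illustration and is not actual SCEDC output):

```python
import xml.etree.ElementTree as ET

# A minimal, hand-written QuakeML 1.2 fragment (illustrative only;
# real catalog responses carry many more elements and attributes).
QUAKEML = """<?xml version="1.0"?>
<q:quakeml xmlns:q="http://quakeml.org/xmlns/quakeml/1.2"
           xmlns="http://quakeml.org/xmlns/bed/1.2">
  <eventParameters publicID="smi:local/catalog">
    <event publicID="smi:local/evt1">
      <origin publicID="smi:local/orig1">
        <time><value>2010-04-04T22:40:42Z</value></time>
        <latitude><value>32.286</value></latitude>
        <longitude><value>-115.295</value></longitude>
      </origin>
      <magnitude publicID="smi:local/mag1">
        <mag><value>7.2</value></mag>
      </magnitude>
    </event>
  </eventParameters>
</q:quakeml>"""

NS = {"bed": "http://quakeml.org/xmlns/bed/1.2"}

def parse_events(xml_text):
    """Extract (time, latitude, longitude, magnitude) tuples
    from a QuakeML document."""
    root = ET.fromstring(xml_text)
    out = []
    for ev in root.iter("{http://quakeml.org/xmlns/bed/1.2}event"):
        origin = ev.find("bed:origin", NS)
        mag = ev.find("bed:magnitude/bed:mag/bed:value", NS)
        out.append((
            origin.find("bed:time/bed:value", NS).text,
            float(origin.find("bed:latitude/bed:value", NS).text),
            float(origin.find("bed:longitude/bed:value", NS).text),
            float(mag.text),
        ))
    return out
```

The same parsing pattern applies to any QuakeML feed; only the source of the XML text (file, web-service response) changes.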

  5. Federal Radiological Monitoring and Assessment Center Monitoring Manual Volume 1, Operations

    SciTech Connect

    NSTec Aerial Measurement Systems

    2012-07-31

    The Monitoring division is primarily responsible for the coordination and direction of: aerial measurements to delineate the footprint of radioactive contaminants that have been released into the environment; monitoring of radiation levels in the environment; sampling to determine the extent of contaminant deposition in soil, water, air and on vegetation; preliminary field analyses to quantify soil concentrations or depositions; and environmental and personal dosimetry for FRMAC field personnel, during a Consequence Management Response Team (CMRT) and Federal Radiological Monitoring and Assessment Center (FRMAC) response. Monitoring and sampling techniques used during CM/FRMAC operations are specifically selected for use during radiological emergencies where large numbers of measurements and samples must be acquired, analyzed, and interpreted in the shortest amount of time possible. In addition, techniques and procedures are flexible so that they can be used during a variety of different scenarios; e.g., accidents involving releases from nuclear reactors, contamination by nuclear waste, nuclear weapon accidents, space vehicle reentries, or contamination from a radiological dispersal device. The Monitoring division also provides technicians to support specific Health and Safety Division activities including: the operation of the Hotline; FRMAC facility surveys; assistance with health and safety at checkpoints; and assistance at population assembly areas which require support from the FRMAC. This volume covers deployment activities, initial FRMAC activities, development and implementation of the monitoring and assessment plan, the briefing of field teams, and the transfer of FRMAC to the EPA.

  6. Earthquake and ambient vibration monitoring of the steel-frame UCLA factor building

    USGS Publications Warehouse

    Kohler, M.D.; Davis, P.M.; Safak, E.

    2005-01-01

    Dynamic property measurements of the moment-resisting steel-frame University of California, Los Angeles, Factor building are being made to assess how forces are distributed over the building. Fourier amplitude spectra have been calculated from several intervals of ambient vibrations, a 24-hour period of strong winds, and from the 28 March 2003 Encino, California (ML = 2.9), the 3 September 2002 Yorba Linda, California (ML = 4.7), and the 3 November 2002 Central Alaska (Mw = 7.9) earthquakes. Measurements made from the ambient vibration records show that the first-mode frequency of horizontal vibration is between 0.55 and 0.6 Hz. The second horizontal mode has a frequency between 1.6 and 1.9 Hz. In contrast, the first-mode frequencies measured from earthquake data are about 0.05 to 0.1 Hz lower than those corresponding to ambient vibration recordings, indicating softening of the soil-structure system as amplitudes become larger. The frequencies revert to pre-earthquake levels within five minutes of the Yorba Linda earthquake. Shaking due to strong winds that occurred during the Encino earthquake dominates the frequency decrease, which correlates in time with the duration of the strong winds. The first shear wave recorded from the Encino and Yorba Linda earthquakes takes about 0.4 sec to travel up the 17-story building. © 2005, Earthquake Engineering Research Institute.
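The reported 0.4-s shear-wave travel time and the ~0.55-0.6 Hz first mode are mutually consistent under the standard uniform shear-beam approximation, in which the fundamental period is four times the one-way travel time. A back-of-envelope check (not part of the study's own analysis):

```python
def fundamental_frequency(travel_time_s):
    """Uniform shear-beam approximation: fundamental period T1 = 4 * t,
    so f1 = 1 / (4 * t), where t is the one-way shear-wave travel time
    up the building."""
    return 1.0 / (4.0 * travel_time_s)

f1 = fundamental_frequency(0.4)   # travel time reported for the Factor building
```

This gives 0.625 Hz, close to the 0.55-0.6 Hz measured from ambient vibrations; the small shortfall of the measured values is consistent with the soil-structure softening noted in the abstract.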

  7. Moving Toward Climate Data Integration: The Observing System Monitoring Center

    NASA Astrophysics Data System (ADS)

    O'Brien, K.; Hankin, S.; Habermann, T.; Kern, K.; Schweitzer, R.; Little, M.; Snowden, D.; Cartwright, J.; Larocque, J.; Li, J.; Malczyk, J.; Manke, A.

    2008-12-01

    Understanding climate variability requires the development, maintenance and evaluation of a sustained global climate observing system. The purpose of the Observing System Monitoring Center (OSMC), which is being funded by the National Oceanic and Atmospheric Administration's (NOAA) Office of Climate Observation (OCO), is to provide a tool that will assist managers and scientists with monitoring the performance of the global in-situ ocean observing system, identifying and correcting problems, and evaluating the adequacy of the observations in support of ocean/climate state estimation, forecasting and research. Initially, the sole source of ocean in-situ data being added to the OSMC database was from the subset of data which is distributed daily via the Global Telecommunications System (GTS). However, it has become clear that in order to maintain a complete record of the observations going into the climate data record, it is necessary to include observations that are collected but not distributed via the GTS system. The challenges of integrating such data into the OSMC parallel the challenges of integrating climate data for general use and discovery by those who would like to utilize the observations. In this presentation, we will be talking about our approach to integrating climate observations through the OSMC. The areas of integration under the OSMC include realtime data input from GTS for the management of in-situ platforms, integration of climate platform archives from Data Access Centers (DACs), and integration of climate products and data. The OSMC hopes to significantly advance climate services by making the data available through web services such as OPeNDAP and the Sensor Observation Service (SOS).

  8. Restoration of accelerator facilities damaged by Great East Japan Earthquake at Cyclotron and Radioisotope Center, Tohoku University.

    PubMed

    Wakui, Takashi; Itoh, Masatoshi; Shimada, Kenzi; Yoshida, Hidetomo P; Shinozuka, Tsutomu; Sakemi, Yasuhiro

    2014-01-01

The Cyclotron and Radioisotope Center (CYRIC) of Tohoku University is a joint-use institution for education and research in a wide variety of fields ranging from physics to medicine. Accelerator facilities at the CYRIC provide opportunities for implementing a broad research program, including medical research using positron emission tomography (PET), with accelerated ions and radioisotopes. During the Great East Japan Earthquake on March 11, 2011, no injuries occurred at the CYRIC and evacuation proceeded smoothly, thanks to anti-earthquake measures such as the renovation of the cyclotron building in 2009 (mainly for seismic strengthening), the fixation of shelves to prevent objects from falling, and the maintenance of an adequately wide evacuation route. The preparation of an emergency response manual was also helpful. However, the accelerator facilities were damaged by the strong shaking, which continued for a few minutes. For example, the two columns on which the 930 cyclotron was placed were damaged, leaving the cyclotron tilted. All elements of the beam transport lines deviated from the beam axis. Some peripheral devices in the HM12 cyclotron were broken. Two shielding doors fell from their carriages onto the floor and blocked the entrances to the rooms. Repair work on the accelerator facilities started at the end of July 2011. During the repair work, joint use of the accelerator facilities was suspended. After the repair work was completed, joint use resumed in October 2012, one and a half years after the earthquake. PMID:25030295

  9. Utilizing Changes in Repeating Earthquakes to Monitor Evolving Processes and Structure Before and During Volcanic Eruptions

    NASA Astrophysics Data System (ADS)

    Hotovec-Ellis, Alicia

Repeating earthquakes are two or more earthquakes that share the same source location and source mechanism, which results in highly similar waveforms when recorded at a seismic instrument. Repeating earthquakes have been observed in a wide variety of environments: from fault systems (such as the San Andreas fault and the Cascadia subduction zone) to hydrothermal areas and volcanoes. Volcano seismologists take particular interest in repeating earthquakes, as they have been observed at volcanoes across the entire range of eruptive style and are often a prominent feature of eruption seismicity. The behavior of repeating earthquakes sometimes changes with time, possibly reflecting subtle changes in the mechanism creating the earthquakes. In Chapter 1, we document an example of repeating earthquakes during the 2009 eruption of Redoubt volcano that became increasingly frequent with time, until they blended into harmonic tremor prior to several explosions. We interpreted the source of the earthquakes as stick-slip on a fault near the conduit that slipped increasingly often as each explosion neared, in response to the build-up of pressure in the system. The waveforms of repeating earthquakes may also change, even if the behavior does not. We can quantify changes in waveform using the technique of coda wave interferometry to differentiate between changes in source and medium. In Chapters 2 and 3, we document subtle changes in the coda of repeating earthquakes related to small changes in the near-surface velocity structure at Mount St. Helens before and during its eruption in 2004. Velocity changes have been observed prior to several volcanic eruptions and are thought to occur in response to volumetric strain and the opening or closing of cracks in the subsurface. We compared continuous records of velocity change against other geophysical data and found that velocities at Mount St. Helens change in response to snow loading, fluid saturation, shaking from large distant earthquakes, shallow pressurization, and possibly lava extrusion. Velocity changes at Mount St. Helens are a complex mix of many different effects, and other complementary data are required to interpret the signal.
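The coda wave interferometry mentioned above can be illustrated with the "stretching" variant: a small homogeneous velocity change dv/v rescales the lag axis of the coda, so the current waveform satisfies u_cur(t) ≈ u_ref(t·(1 + dv/v)). The sketch below is illustrative only, not the author's implementation; the function name, window choice and trial grid are assumptions.

```python
import numpy as np

def stretching_dvv(ref, cur, dt, window,
                   trial_eps=np.linspace(-0.01, 0.01, 201)):
    """Estimate the relative velocity change dv/v between a reference and a
    current coda waveform by the stretching method: resample `cur` onto trial
    stretched time axes and keep the stretch maximizing correlation with
    `ref` inside the coda `window` (sample indices i0:i1).
    Convention assumed here: u_cur(t) = u_ref(t * (1 + dv/v))."""
    i0, i1 = window
    t = np.arange(len(ref)) * dt
    best_cc, best_eps = -1.0, 0.0
    for eps in trial_eps:
        # stretched(t) = cur(t / (1 + eps)); best match to ref at eps = dv/v
        stretched = np.interp(t, t * (1.0 + eps), cur)
        a, b = ref[i0:i1], stretched[i0:i1]
        cc = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        if cc > best_cc:
            best_cc, best_eps = cc, eps
    return best_eps, best_cc
```

Because coda waves sample the medium many times, a late coda window makes this estimate sensitive to velocity changes of a fraction of a percent, which is the order of magnitude reported before eruptions.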

  10. The Irpinia Seismic Network: An Advanced Monitoring Infrastructure For Earthquake Early Warning in The Campania Region (Southern Italy)

    NASA Astrophysics Data System (ADS)

    Iannaccone, G.; Zollo, A.; Bobbio, A.; Cantore, L.; Convertito, V.; Elia, L.; Festa, G.; Lancieri, M.; Martino, C.; Romeo, A.; Satriano, C.; Vassallo, M.

    2007-12-01

A new seismic network, ISNet (Irpinia Seismic Network), is now operating in Southern Italy. It is conceived as the core infrastructure for an Earthquake Early Warning System (EEWS) under development in Southern Italy. It is primarily aimed at providing an alert for moderate to large earthquakes (M>4) to selected target sites in the Campania Region, and it also provides data for rapid computation of regional ground-shaking maps. ISNet is deployed over an area of about 100 × 70 km2 covering the Apenninic active seismic zone where most large earthquakes of the last centuries occurred, including the Ms=6.9, 1980 Irpinia earthquake. ISNet is composed of 29 seismic stations equipped with three-component accelerometers and velocimeters, aggregated into six smaller sub-nets. The sub-net stations are connected through real-time communication links to a central data-collector site (LCC, Local Control Center). The LCCs are linked among themselves and to a Network Control Center (NCC), located in the city of Naples 100 km away from the network center, with different types of transmission systems chosen according to their robustness and reliability. The network is designed to provide estimates of the location and size of a potentially destructive earthquake within a few seconds of earthquake detection, through an evolutionary and fully probabilistic approach. For the real-time location we developed a methodology which extends and generalizes that of Horiuchi et al. (2005) by a) starting the location procedure after only one station has triggered, b) using the Equal Differential Time (EDT) approach to incorporate both the triggered arrivals and the not-yet-triggered stations, c) estimating the hypocenter probabilistically as a pdf instead of as a point, and d) applying a full, non-linearized, global search for each update of the location estimate. Following an evolutionary approach, the method evaluates, at each time step, the EDT equations considering not only each pair of triggered stations, but also those pairs where only one station has triggered. The size of the earthquake is also evaluated by a real-time, evolutionary algorithm based on a magnitude predictive model and a Bayesian formulation. It is aimed at evaluating the conditional probability density function of magnitude as a function of ground-motion quantities measured on the early part of the acquired signals. The predictive models are empirical relationships which correlate the final event magnitude with the P-displacement amplitudes measured on the first 2-4 seconds of record after the first P arrival. The methods described above for rapidly estimating the event's location and magnitude are used to perform a real-time seismic hazard analysis, allowing computation of the probabilistic distribution, or hazard curve, of ground-motion intensity measures (IM) such as peak ground acceleration (PGA) or spectral acceleration (Sa) at selected sites of the Campania Region. We show the performance of the earthquake early warning system through applications to simulated large events and recorded low-magnitude earthquakes.
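The probabilistic EDT location described above can be sketched in a simplified, non-evolutionary form. This toy is not the ISNet implementation: it grid-searches a 2-D epicenter under a uniform-velocity assumption and builds a pdf from Gaussian EDT residuals over triggered station pairs only (the real method also exploits not-yet-triggered stations, a 3-D model, and depth); all parameter values are assumptions.

```python
import numpy as np
from itertools import combinations

def edt_locate(stations, picks, v=6.0, sigma=0.05, extent=(-50, 50), n=101):
    """Probabilistic epicenter estimate via Equal Differential Time (EDT).
    stations: (N, 2) array of x, y in km; picks: length-N P arrival times (s).
    Returns the maximum-likelihood epicenter and the normalized pdf over the
    grid. Uses a uniform P velocity v (km/s) as a toy travel-time model."""
    xs = np.linspace(extent[0], extent[1], n)
    X, Y = np.meshgrid(xs, xs, indexing="ij")
    # predicted travel time from every grid node to every station
    tt = np.stack([np.hypot(X - sx, Y - sy) / v for sx, sy in stations])
    score = np.zeros_like(X)
    for i, j in combinations(range(len(stations)), 2):
        # EDT residual: observed vs. predicted arrival-time difference;
        # note the (unknown) origin time cancels in the difference
        resid = (picks[i] - picks[j]) - (tt[i] - tt[j])
        score += np.exp(-resid**2 / (2 * sigma**2))
    pdf = score / score.sum()
    k = np.unravel_index(np.argmax(pdf), pdf.shape)
    return (X[k], Y[k]), pdf
```

The key property illustrated is that EDT needs no origin-time estimate, which is what lets a location pdf be updated from the very first triggers.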

  11. Logic-centered architecture for ubiquitous health monitoring.

    PubMed

    Lewandowski, Jacek; Arochena, Hisbel E; Naguib, Raouf N G; Chao, Kuo-Ming; Garcia-Perez, Alexeis

    2014-09-01

One of the key points to maintain and boost research and development in the area of smart wearable systems (SWS) is the development of integrated architectures for intelligent services, as well as wearable systems and devices for health and wellness management. This paper presents such a generic architecture for multiparametric, intelligent and ubiquitous wireless sensing platforms. It is a transparent, smartphone-based sensing framework with customizable wireless interfaces and plug-and-play capability to easily interconnect third-party sensor devices. It caters to wireless body, personal, and near-me area networks. A pivotal part of the platform is the integrated inference engine/runtime environment that allows the mobile device to serve as a user-adaptable personal health assistant. The novelty of this system lies in its rapid visual development and remote deployment model. The complementary visual Inference Engine Editor that comes with the package enables artificial intelligence specialists, alongside medical experts, to build data processing models by assembling different components and instantly deploying them (remotely) on patient mobile devices. In this paper, the new logic-centered software architecture for ubiquitous health monitoring applications is described, followed by a discussion of how it helps to shift focus from software and hardware development to the medical and health process-centered design of new SWS applications. PMID:25192566

  12. Comprehensive Nuclear-Test-Ban Treaty seismic monitoring: 2012 USNAS report and recent explosions, earthquakes, and other seismic sources

    SciTech Connect

    Richards, Paul G.

    2014-05-09

A comprehensive ban on nuclear explosive testing is briefly characterized as an arms control initiative related to the Non-Proliferation Treaty. The work of monitoring for nuclear explosions uses several technologies, of which the most important is seismology, a physics discipline that draws upon extensive and ever-growing assets to monitor for earthquakes and other ground-motion phenomena as well as for explosions. This paper outlines the basic methods of seismic monitoring within that wider context, and lists web-based and other resources for learning details. It also summarizes the main conclusions, concerning capability to monitor for test-ban treaty compliance, contained in a major study published in March 2012 by the US National Academy of Sciences.

  13. The Savannah River Technology Center environmental monitoring field test platform

    SciTech Connect

    Rossabi, J.

    1993-03-05

Nearly all industrial facilities have been responsible for introducing synthetic chemicals into the environment. The Savannah River Site is no exception. Several areas at the site have been contaminated by chlorinated volatile organic chemicals. Because of the persistence and refractory nature of these contaminants, a complete cleanup of the site will take many years. A major focus of the mission of the Environmental Sciences Section of the Savannah River Technology Center is to develop better, faster, and less expensive methods for characterizing, monitoring, and remediating the subsurface. These new methods can then be applied directly at the Savannah River Site and at other contaminated areas in the United States and throughout the world. The Environmental Sciences Section has hosted field testing of many different monitoring technologies over the past two years, primarily as a result of the Integrated Demonstration Program sponsored by the Department of Energy's Office of Technology Development. This paper provides an overview of some of the technologies that have been demonstrated at the site and briefly discusses the applicability of these techniques.

  14. Federal Radiological Monitoring and Assessment Center Analytical Response

    SciTech Connect

    E.C. Nielsen

    2003-04-01

The Federal Radiological Monitoring and Assessment Center (FRMAC) is authorized by the Federal Radiological Emergency Response Plan to coordinate all off-site radiological response assistance to state and local governments in the event of a major radiological emergency in the United States. The FRMAC is established by the U.S. Department of Energy, National Nuclear Security Administration, to coordinate all Federal assets involved in conducting a comprehensive program of radiological environmental monitoring, sampling, radioanalysis, quality assurance, and dose assessment. During an emergency response, the initial analytical data are provided by portable field instrumentation. As incident responders scale up their response based on the seriousness of the incident, local analytical assets and mobile laboratories add capability and capacity. During the intermediate phase of the response, data quality objectives and measurement quality objectives become more rigorous. These higher objectives require the use of larger laboratories, with greater capacity and enhanced capabilities. These laboratories may be geographically distant from the incident, which increases sample management challenges. This paper addresses emergency radioanalytical capability and capacity and its utilization during FRMAC operations.

  15. Detection and monitoring of earthquake precursors: TwinSat, a Russia-UK satellite project

    NASA Astrophysics Data System (ADS)

    Chmyrev, Vitaly; Smith, Alan; Kataria, Dhiren; Nesterov, Boris; Owen, Christopher; Sammonds, Peter; Sorokin, Valery; Vallianatos, Filippos

    2013-09-01

There is now a body of evidence indicating that coupling occurs among the lithosphere, atmosphere and ionosphere prior to earthquake events. Nevertheless, the physics of these phenomena and the possibilities of their use as part of an earthquake early warning system remain poorly understood. Proposed here is a programme to create a much greater understanding in this area through the deployment of a dedicated space asset along with coordinated ground stations, modelling and the creation of a highly accessible database. The space element would comprise two co-orbiting spacecraft (TwinSat), a microsatellite and a nanosatellite, each carrying a suite of science instruments appropriate to this study. Over a mission duration of 3 years, approximately 400 earthquakes in the range 6-6.9 on the Richter scale would be 'observed'. Such a programme is a prerequisite for an effective earthquake early warning system.

  16. Early Results of Three-Year Monitoring of Red Wood Ants’ Behavioral Changes and Their Possible Correlation with Earthquake Events

    PubMed Central

    Berberich, Gabriele; Berberich, Martin; Grumpe, Arne; Wöhler, Christian; Schreiber, Ulrich

    2013-01-01

Simple Summary For three years (2009–2012), two red wood ant mounds (Formica rufa-group), located at the seismically active Neuwied Basin (Eifel, Germany), have been monitored 24/7 by high-resolution cameras. Early results show that ants have a well-identifiable standard daily routine. Correlation with local seismic events suggests changes in the ants' behavior hours before the earthquake: the nocturnal rest phase and daily activity are suppressed, and the standard daily routine does not resume until the next day. At present, an automated image evaluation routine is being applied to the video streams. Based on this automated approach, a statistical analysis of the ant behavior will be carried out. Abstract Short-term earthquake predictions with an advance warning of several hours or days are currently not possible due to both incomplete understanding of the complex tectonic processes and inadequate observations. Abnormal animal behaviors before earthquakes have been reported previously, but such reports pose problems of monitoring and reliability. The situation is different with red wood ants (RWA; Formica rufa-group (Hymenoptera: Formicidae)). They have stationary mounds on tectonically active, gas-bearing fault systems. These faults may be potential earthquake areas. For three years (2009–2012), two red wood ant mounds (Formica rufa-group), located at the seismically active Neuwied Basin (Eifel, Germany), have been monitored 24/7 by high-resolution cameras with both a color and an infrared sensor. Early results show that ants have a well-identifiable standard daily routine. Correlation with local seismic events suggests changes in the ants' behavior hours before the earthquake: the nocturnal rest phase and daily activity are suppressed, and the standard daily routine does not resume until the next day. At present, an automated image evaluation routine is being applied to the more than 45,000 hours of video streams. Based on this automated approach, a statistical analysis of the ants' behavior will be carried out. In addition, other parameters (climatic, geotectonic and biological), which may influence behavior, will be included in the analysis. PMID:26487310

  17. The Development of Several Electromagnetic Monitoring Strategies and Algorithms for Validating Pre-Earthquake Electromagnetic Signals

    NASA Astrophysics Data System (ADS)

    Bleier, T. E.; Dunson, J. C.; Roth, S.; Mueller, S.; Lindholm, C.; Heraud, J. A.

    2012-12-01

QuakeFinder, a private research group in California, reports on the development of a 100+ station network consisting of 3-axis induction magnetometers and air conductivity sensors to collect and characterize pre-seismic electromagnetic (EM) signals. These signals are combined with daily infrared (IR) signals collected from the GOES weather satellite to compare and correlate with the ground EM signals, both from actual earthquakes and from boulder-stressing experiments. This presentation describes the efforts QuakeFinder has undertaken to automatically detect these pulse patterns using their historical data as a reference, and to develop other discriminative algorithms that can be used with the air conductivity sensors and the GOES IR instruments. The overall big-picture results of the QuakeFinder experiment are presented. In 2007, QuakeFinder discovered the occurrence of strong unipolar pulses in their magnetometer coil data that increased in tempo dramatically prior to the M5.1 earthquake at Alum Rock, California. Suggestions that these pulses might have been lightning or power-line arcing did not fit the data actually recorded, as was reported in Bleier [2009]. A second earthquake then occurred near the same site on January 7, 2010, as was reported in Dunson [2011], and the pulse counts again increased before the earthquake, similarly to the 2007 event. There were fewer pulses and their amplitudes were smaller, both consistent with the fact that the earthquake was smaller (M4.0 vs. M5.4) and farther away (7 km vs. 2 km). At the same time, similar effects were observed at the QuakeFinder Tacna, Peru site before the May 5, 2010 M6.2 earthquake and a cluster of several M4-5 earthquakes.

  18. ESTABLISHMENT OF THE WESTERN REGIONAL CENTER FOR BIOLOGICAL MONITORING AND ASSESSMENT OF FRESHWATER ECOSYSTEMS:

    EPA Science Inventory

    Initial Center Objectives 1. Coordinate the establishment of the Advisory Board for the newly formed Western Regional Center for Biological Monitoring and Assessment of Freshwater Ecosystems. The responsibility of the Advisory Board will be to set research, education, and outr...

  19. Unexpected changes in resistivity monitoring for earthquakes of the Longmen Shan in Sichuan, China, with a fixed Schlumberger sounding array

    NASA Astrophysics Data System (ADS)

    Lu, Jun; Xue, Shunzhang; Qian, Fuye; Zhao, Yulin; Guan, Huaping; Mao, Xianjin; Ruan, Aiguo; Yu, Surong; Xiao, Wujun

    2004-07-01

An electrical resistivity monitoring experiment using a fixed Schlumberger sounding array was conducted at Pixian in Sichuan Province, China, from January 1984 to June 1989, for the purposes of better understanding some geoelectrical phenomena and detecting earthquake-associated changes. Ancillary data on well water fluctuations and rainfall, along with the seismicity pattern, fault map, atmospheric humidity, and temperature, were collected. A very peculiar resistivity behavior was observed: the direction of the change in apparent resistivity for large electrode spacings was opposite to that of the resistivity change for the overburden. The geoelectrical structure beneath the monitoring array could be inferred from resistivity sounding data and available geological information. A method of data analysis applying singular value decomposition to the first-order sensitivity matrix yields the estimated change in true resistivity of the constituent medium in the generalized least-squares sense. The sensitivity analysis shows that, for the Pixian station, the abnormal annual variations in resistivity monitored with large electrode spacings are related to the multilayered resistivity section with a conductive substratum and are caused by resistivity changes in the overlying layer, because the sensitivity coefficient of the topmost layer for such a section is negative. The resistivity changes of the overlying layer can be attributed mainly to seasonal rainfall. During the experiment, no significant resistivity changes were correlated with the earthquakes of ML 5.4 or less in the Longmen Shan and adjacent regions. However, a detailed interpretation of the observed phenomenon helps improve our understanding of possible resistivity precursors to earthquakes.
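The SVD-based sensitivity analysis described above can be sketched as a truncated generalized least-squares inversion. The mechanics only are shown: observed apparent-resistivity changes d ≈ J m are inverted for layer changes m by pseudo-inverting the sensitivity (Jacobian) matrix J through its SVD. The matrix values below are hypothetical, not the Pixian values; the negative top-layer entry merely illustrates how apparent and true changes can move in opposite directions.

```python
import numpy as np

def svd_gls(J, d, rcond=1e-3):
    """Generalized least-squares solution m of d ~= J @ m via SVD,
    truncating singular values below rcond * s_max to stabilize
    the inversion against noise in the observations."""
    U, s, Vt = np.linalg.svd(J, full_matrices=False)
    s_inv = np.where(s > rcond * s[0], 1.0 / s, 0.0)
    return Vt.T @ (s_inv * (U.T @ d))
```

Truncating small singular values is what makes the estimate robust when some layer combinations are poorly resolved by the available electrode spacings.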

  20. First Results of 3 Year Monitoring of Red Wood Ants' Behavioural Changes and Their Possible Correlation with Earthquake Events

    NASA Astrophysics Data System (ADS)

    Berberich, Gabriele; Berberich, Martin; Grumpe, Arne; Wöhler, Christian; Schreiber, Ulrich

    2013-04-01

Short-term earthquake predictions with an advance warning of several hours or days cannot currently be performed reliably and remain limited to only a few minutes before the event. Abnormal animal behaviours prior to earthquakes have been reported previously, but their detection poses problems of monitoring and reliability. A different situation is encountered for red wood ants (RWA; Formica rufa-group (Hymenoptera: Formicidae)). They have stationary nest sites on tectonically active, gas-bearing fault systems. These faults may be potential earthquake areas and simultaneously act as information channels reaching deep into the crust. A particular advantage of monitoring RWA is their high sensitivity to environmental changes. Besides an evolutionarily developed, extremely strong temperature sensitivity of 0.25 K, they have chemoreceptors for the detection of CO2 concentrations and a sensitivity to electromagnetic fields. Changes of the electromagnetic field, as well as short-lived "thermal anomalies", are discussed as trigger mechanisms for bioanomalies of impending earthquakes. For 3 years, we have monitored two red wood ant mounds (Formica rufa-group), located at the seismically active Neuwied Basin (Eifel, Germany), 24/7 by high-resolution cameras equipped with a colour and an infrared sensor. In the Neuwied Basin, an average of about 100 earthquakes per year with magnitudes up to M 3.9 occur on different tectonic fault regimes (strike-slip faults and/or normal or thrust faults). The RWA mounds are located on two different fault regimes approximately 30 km apart. First results show that the ants have a well-identifiable standard daily routine. Correlation with local seismic events suggests changes in the ants' behaviour hours before the earthquake event: the nocturnal rest phase and daily activity are suppressed, and the standard daily routine does not resume until the next day. Additional parameters that might affect the ants' daily routine (including climate data, earth tides, lunar phases and biological parameters) are recorded and correlated with the analysed daily activity. Additionally, nest air measurements (CO2, helium, radon, H2S and CH4) are performed at intervals. At present, an automated image analysis routine is being applied to the more than 45,000 hours of acquired video stream data. It is a valuable tool to objectively identify and classify the ants' activity on top of the mounds and to examine possible correlations with earthquakes. Based on this automated approach, a statistical analysis of the ants' behaviour is intended. The investigation and results presented here are a first approach to a completely new research complex. The key question is whether the ants' behavioural changes and their correlation with earthquake events are statistically significant and whether detection by an automated system is possible. Long-term studies have to show whether confounding factors and climatic influences can be clearly distinguished. Although the first results suggest that it is promising to consolidate and extend the research to determine a pattern for exceptional situations, there is still a long way to go to a usable automated earthquake warning system. References: Berberich G (2010): Identifikation junger gasführender Störungszonen in der West- und Hocheifel mit Hilfe von Bioindikatoren. Dissertation, Essen, 293 pp. Berberich G, Klimetzek D, Wöhler C, and Grumpe A (2012): Statistical Correlation between Red Wood Ant Sites and Neotectonic Strike-Slip Faults. Geophysical Research Abstracts, Vol. 14, EGU2012-3518. Berberich G, Berberich M, Grumpe A, Wöhler C, and Schreiber U (2012): First Results of 3 Year Monitoring of Red Wood Ants' Behavioural Changes and Their Possible Correlation with Earthquake Events. Animals, ISSN 2076-2615, Special Issue "Biological Anomalies Prior to Earthquakes" (in prep.). Dologlou E (2010): Recent aspects on possible interrelation between precursory electric signals and anomalous bioeffects. Nat. Hazards Earth Syst. Sci., 10, 1951-1955. Kirchner W (2007): Die Ameisen - Biologie und Verhalten. Verlag C.H. Beck, 125 p. Hetz SK, Bradley TJ (2005): Insects breathe discontinuously to avoid oxygen toxicity. Nature 433. Ouzounov D, Freund F (2004): Mid-infrared emission prior to strong earthquakes analyzed by remote sensing data. Adv. Space Res., 33, 268-273. Weaver JC, Vaughan TE, Astumian RD (2000): Biological sensing of small field differences by magnetically sensitive chemical reactions. Nature, 405. Weaver JC (2002): Understanding conditions for which biological effects of nonionizing electromagnetic fields can be expected. Bioelectrochemistry, 56, 207-209.

  1. Real Earthquakes, Real Learning

    ERIC Educational Resources Information Center

    Schomburg, Aaron

    2003-01-01

    One teacher took her class on a year long earthquake expedition. The goal was to monitor the occurrences of real earthquakes during the year and mark their locations with push pins on a wall-sized world map in the hallway outside the science room. The purpose of the project was to create a detailed picture of the earthquakes that occurred…

  3. The Community Seismic Network and Quake-Catcher Network: Monitoring building response to earthquakes through community instrumentation

    NASA Astrophysics Data System (ADS)

    Cheng, M.; Kohler, M. D.; Heaton, T. H.; Clayton, R. W.; Chandy, M.; Cochran, E.; Lawrence, J. F.

    2013-12-01

    The Community Seismic Network (CSN) and Quake-Catcher Network (QCN) are dense networks of low-cost ($50) accelerometers that are deployed by community volunteers in their homes in California. In addition, many accelerometers are installed in public spaces associated with civic services, publicly-operated utilities, university campuses, and high-rise buildings. Both CSN and QCN consist of observation-based structural monitoring which is carried out using records from one to tens of stations in a single building. We have deployed about 150 accelerometers in a number of buildings ranging between five and 23 stories in the Los Angeles region. In addition to a USB-connected device which connects to the host's computer, we have developed a stand-alone sensor-plug-computer device that directly connects to the internet via Ethernet or WiFi. In the case of CSN, the sensors report data to the Google App Engine cloud computing service consisting of data centers geographically distributed across the continent. This robust infrastructure provides parallelism and redundancy during times of disaster that could affect hardware. The QCN sensors, however, are connected to netbooks with continuous data streaming in real-time via the distributed computing Berkeley Open Infrastructure for Network Computing software program to a server at Stanford University. In both networks, continuous and triggered data streams use a STA/LTA scheme to determine the occurrence of significant ground accelerations. Waveform data, as well as derived parameters such as peak ground acceleration, are then sent to the associated archives. Visualization models of the instrumented buildings' dynamic linear response have been constructed using Google SketchUp and MATLAB. When data are available from a limited number of accelerometers installed in high rises, the buildings are represented as simple shear beam or prismatic Timoshenko beam models with soil-structure interaction. 
Small-magnitude earthquake records are used to identify the first two pairs of horizontal vibrational frequencies, which are then used to compute the response on every floor of the building, constrained by the observed data. The approach has been applied to a CSN-instrumented 12-story reinforced concrete building near downtown Los Angeles. The frequencies were identified directly from spectra of the 8 August 2012 M4.5 Yorba Linda, California earthquake acceleration time series. When the basic dimensions and the first two frequencies are input into a prismatic Timoshenko beam model of the building, the model yields mode shapes that match well with densely recorded data. For the instrumented 12-story building, comparisons of the responses predicted on other floors using only the record from the 9th floor with the data actually recorded on those floors show that this method approximates the true response remarkably well.
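The STA/LTA triggering used by both networks can be sketched as follows: the ratio of a short-term average of signal energy to a long-term average rises sharply when a transient arrives. This is a generic illustration, not the CSN/QCN code; the window lengths and threshold are assumptions, not the networks' operational parameters.

```python
import numpy as np

def sta_lta_trigger(accel, fs, sta_win=1.0, lta_win=30.0, threshold=4.0):
    """Classic STA/LTA detector on an acceleration trace sampled at fs Hz.
    Returns (index of first trigger or None, full STA/LTA ratio trace)."""
    x = np.asarray(accel, dtype=float) ** 2            # instantaneous energy
    ns, nl = int(sta_win * fs), int(lta_win * fs)
    # causal running means via convolution (zero-padded at the start)
    sta = np.convolve(x, np.ones(ns) / ns, mode="full")[:len(x)]
    lta = np.convolve(x, np.ones(nl) / nl, mode="full")[:len(x)]
    ratio = np.divide(sta, lta, out=np.zeros_like(sta), where=lta > 0)
    ratio[:nl] = 0.0                                   # let the LTA window fill
    hits = np.flatnonzero(ratio > threshold)
    return (int(hits[0]) if hits.size else None), ratio
```

The short window reacts to the onset while the long window tracks the background level, so the same threshold works across quiet homes and noisier public installations.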

  4. Passive seismic monitoring of natural and induced earthquakes: case studies, future directions and socio-economic relevance

    USGS Publications Warehouse

    Bohnhoff, Marco; Dresen, Georg; Ellsworth, William L.; Ito, Hisao

    2010-01-01

    An important discovery in crustal mechanics has been that the Earth’s crust is commonly stressed close to failure, even in tectonically quiet areas. As a result, small natural or man-made perturbations to the local stress field may trigger earthquakes. To understand these processes, Passive Seismic Monitoring (PSM) with seismometer arrays is a widely used technique that has been successfully applied to study seismicity at different magnitude levels ranging from acoustic emissions generated in the laboratory under controlled conditions, to seismicity induced by hydraulic stimulations in geological reservoirs, and up to great earthquakes occurring along plate boundaries. In all these environments the appropriate deployment of seismic sensors, i.e., directly on the rock sample, at the earth’s surface or in boreholes close to the seismic sources allows for the detection and location of brittle failure processes at sufficiently low magnitude-detection threshold and with adequate spatial resolution for further analysis. One principal aim is to develop an improved understanding of the physical processes occurring at the seismic source and their relationship to the host geologic environment. In this paper we review selected case studies and future directions of PSM efforts across a wide range of scales and environments. These include induced failure within small rock samples, hydrocarbon reservoirs, and natural seismicity at convergent and transform plate boundaries. Each example represents a milestone with regard to bridging the gap between laboratory-scale experiments under controlled boundary conditions and large-scale field studies. The common motivation for all studies is to refine the understanding of how earthquakes nucleate, how they proceed and how they interact in space and time. This is of special relevance at the larger end of the magnitude scale, i.e., for large devastating earthquakes due to their severe socio-economic impact.

  5. Catalog of Earthquake Hypocenters at Alaskan Volcanoes: January 1 through December 31, 2007

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.

    2008-01-01

    Between January 1 and December 31, 2007, AVO located 6,664 earthquakes, of which 5,660 occurred within 20 kilometers of the 33 volcanoes monitored by the Alaska Volcano Observatory. Monitoring highlights in 2007 include the eruption of Pavlof Volcano; volcanic-tectonic earthquake swarms at the Augustine, Iliamna, and Little Sitkin volcanic centers; and the cessation of episodes of unrest at Fourpeaked Mountain, Mount Veniaminof, and the northern Atka Island volcanoes (Mount Kliuchef and Korovin Volcano). This catalog includes descriptions of: (1) locations of seismic instrumentation deployed during 2007; (2) earthquake detection, recording, analysis, and data archival systems; (3) seismic velocity models used for earthquake locations; (4) a summary of earthquakes located in 2007; and (5) an accompanying UNIX tar file with a summary of earthquake origin times, hypocenters, magnitudes, phase arrival times, location quality statistics, daily station usage statistics, and all files used to determine the earthquake locations in 2007.

  6. Martin Marietta Energy Systems, Inc. comprehensive earthquake management plan: Emergency Operations Center training manual

    SciTech Connect

    Not Available

    1990-02-28

    The objective of this training is to describe the responsibilities, resources, and goals of the Emergency Operations Center, and to enable personnel to evaluate and interpret this information so as to best direct and allocate emergency, plant, and other resources to protect life and the Paducah Gaseous Diffusion Plant.

  7. Drilling Across Active Faults in Deep Mines in South Africa for Monitoring Earthquake Processes in the Near-Field

    NASA Astrophysics Data System (ADS)

    Reches, Z.; Jordan, T. H.; Johnston, M. J.; Zoback, M.

    2005-12-01

    Deep mines can provide three-dimensional access to fault zones that are likely to be activated by the mining operations. To take advantage of this option, we ranked 12 active faults in South African mines according to dimensions, internal structure, accessibility, and likelihood of M>3.0 seismic events. The selected site is on the Pretorius fault, which is 10 km long with a throw of 30-60 m, and which is exposed at multiple levels in the TauTona and Mponeng gold mines, Western Deep Levels. The DAFSAM-NELSAM project (earthquakes.ou.edu) focuses on establishing a natural earthquake laboratory along the Pretorius fault at 3.5 km depth in TauTona mine. Work at the site started in January 2005 and has so far been devoted to site characterization, including 3D mapping and in-situ stress measurements, and to drilling short holes for accelerometers and seismometers. Cross-fault drilling will begin in September 2005 and will include four boreholes, each 40-60 m long, for the installation of creepmeters, strain meters, temperature sensors, acoustic emission transducers, and gas analyzers. The monitoring site is established where both sides of the Pretorius fault are accessible, and where deep-mine practice and numerical modeling predict a profound increase in seismic activity at the site during the next 2-4 years. The associated increase of shear stresses on the fault is expected to generate a few earthquakes of M>3.0 along segments of the Pretorius fault. We will present the features of the monitoring system and the main current results on fault characteristics and state of stress.

  8. Space Monitoring Data Center at Moscow State University

    NASA Astrophysics Data System (ADS)

    Kalegaev, Vladimir; Bobrovnikov, Sergey; Barinova, Vera; Myagkova, Irina; Shugay, Yulia; Barinov, Oleg; Dolenko, Sergey; Mukhametdinova, Ludmila; Shiroky, Vladimir

    The Space Monitoring Data Center of Moscow State University provides operational information on the radiation state of near-Earth space. The Internet portal http://swx.sinp.msu.ru/ gives real-time access to data characterizing the level of solar activity and the geomagnetic and radiation conditions in the magnetosphere and heliosphere. Operational data coming from space missions (ACE, GOES, ELECTRO-L1, Meteor-M1) at L1, LEO, and GEO, and from the Earth's surface, are used to represent the geomagnetic and radiation state of the near-Earth environment. An online database of measurements is also maintained to allow quick comparison between current conditions and conditions experienced in the past. Models of the space environment, running in autonomous mode, generalize the information obtained from observations to the whole magnetosphere. Interactive applications and operational forecasting services are built on these models. They automatically generate alerts when particle fluxes exceed threshold values, both for SEP and for relativistic electrons, using data from LEO orbits. Dedicated forecasting services give short-term forecasts of SEP penetration into the Earth's magnetosphere at low altitudes, as well as of relativistic electron fluxes at GEO. Velocities of recurrent high-speed solar wind streams at the Earth's orbit are predicted with a lead time of 3-4 days on the basis of automatic estimation of coronal hole areas detected in images of the Sun received from the SDO satellite. By means of a neural network approach, Dst and Kp indices are forecast online 0.5-1.5 hours ahead, based on the solar wind and interplanetary magnetic field measured by the ACE satellite. A visualization system represents experimental and model data in 2D and 3D.

  9. Managing Multi-center Flow Cytometry Data for Immune Monitoring

    PubMed Central

    White, Scott; Laske, Karoline; Welters, Marij JP; Bidmon, Nicole; van der Burg, Sjoerd H; Britten, Cedrik M; Enzor, Jennifer; Staats, Janet; Weinhold, Kent J; Gouttefangeas, Cécile; Chan, Cliburn

    2014-01-01

    With the recent results of promising cancer vaccines and immunotherapy [1-5], immune monitoring has become increasingly relevant for measuring treatment-induced effects on T cells, and an essential tool for shedding light on the mechanisms responsible for a successful treatment. Flow cytometry is the canonical multi-parameter assay for the fine characterization of single cells in solution, and is ubiquitously used in pre-clinical tumor immunology and in cancer immunotherapy trials. Current state-of-the-art polychromatic flow cytometry involves multi-step, multi-reagent assays followed by sample acquisition on sophisticated instruments capable of capturing up to 20 parameters per cell at a rate of tens of thousands of cells per second. Given the complexity of flow cytometry assays, reproducibility is a major concern, especially for multi-center studies. A promising approach for improving reproducibility is the use of automated analysis borrowing from statistics, machine learning, and information visualization [21-23], as these methods directly address the subjectivity, operator dependence, labor intensiveness, and low fidelity of manual analysis. However, it is quite time-consuming to investigate and test new automated analysis techniques on large data sets without some centralized information management system. For large-scale automated analysis to be practical, the presence of consistent and high-quality data linked to the raw FCS files is indispensable. In particular, the use of machine-readable standard vocabularies to characterize channel metadata is essential when constructing analytic pipelines to avoid errors in processing, analysis, and interpretation of results. For automation, this high-quality metadata needs to be programmatically accessible, implying the need for a consistent Application Programming Interface (API).
In this manuscript, we propose that upfront time spent normalizing flow cytometry data to conform to carefully designed data models enables automated analysis, potentially saving time in the long run. The ReFlow informatics framework was developed to address these data management challenges. PMID:26085786
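
    To make the metadata requirement concrete, the sketch below shows a minimal, hypothetical Python data model (not the actual ReFlow API) in which channel descriptors, patterned on the FCS standard's $PnN (detector) and $PnS (stain) keywords, are machine-readable, so a pipeline can validate a staining panel programmatically before any automated gating runs:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class ChannelMetadata:
    """Machine-readable channel description, mirroring the FCS
    $PnN (detector name) and $PnS (stain/marker) keywords."""
    detector: str  # e.g. "FL1-H" ($PnN)
    marker: str    # e.g. "CD8 FITC" ($PnS)


def validate_panel(channels, required_markers):
    """Fail fast if a required marker is missing, before any
    automated analysis pipeline runs on the FCS files."""
    present = {c.marker.split()[0] for c in channels}
    missing = set(required_markers) - present
    if missing:
        raise ValueError(f"panel missing markers: {sorted(missing)}")
    return True


panel = [ChannelMetadata("FL1-H", "CD8 FITC"),
         ChannelMetadata("FL2-H", "CD4 PE")]
print(validate_panel(panel, ["CD4", "CD8"]))
```

    The class and function names here are invented for illustration; the point is that once channel metadata is structured rather than free text, panel validation becomes a single automatable check instead of a manual review step.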

  10. Permeable Pavement Monitoring at the Edison Environmental Center Demonstration Site

    EPA Science Inventory

    The presentation covers the following monitoring objectives at the demonstration site at Edison, NJ: hydrologic performance, water quality performance, urban heat island effects, maintenance effects, and infiltration water parameters. There will be side-by-side monitoring of ...

  11. The continuous automatic monitoring network installed in Tuscany (Italy) since late 2002, to study earthquake precursory phenomena

    NASA Astrophysics Data System (ADS)

    Pierotti, Lisa; Cioni, Roberto

    2010-05-01

    In late 2002, a continuous automatic monitoring network (CAMN) was designed, built, and installed in Tuscany (Italy) in order to investigate and define the geochemical response of the aquifers to local seismic activity. The purpose of the investigation was to identify possible earthquake precursors. The CAMN consists of two groups of five measurement stations each. The first group was installed in the Serchio and Magra grabens (Garfagnana and Lunigiana Valleys, northern Tuscany), and the second in the area of Mt. Amiata (southern Tuscany), an extinct volcano. The Garfagnana, Lunigiana, and Mt. Amiata regions belong to the inner zone of the Northern Apennine fold-and-thrust belt. This zone has been involved in post-collision extensional tectonics since the Upper Miocene-Pliocene. The tectonic activity has produced horst and graben structures oriented from N-S to NW-SE, offset by NE-SW transfer systems. Both Garfagnana (Serchio graben) and Lunigiana (Magra graben) belong to the innermost sector of the belt, where the seismic sources responsible for the strongest earthquakes of the northern Apennines are located (e.g., the M=6.5 earthquake of September 1920). The extensional processes in southern Tuscany have been accompanied by magmatic activity since the Upper Miocene, producing the effusive and intrusive products traditionally attributed to the so-called Tuscan Magmatic Province. Mt. Amiata, whose magmatic activity ceased about 0.3 million years ago, belongs to the extensional Tyrrhenian sector, which is characterized by high heat flow and crustal thinning. The whole zone is characterized by widespread but moderate seismicity (the maximum recorded magnitude has been 5.1, with epicenter in Piancastagnaio, 1919). The extensional regime in both the Garfagnana-Lunigiana and the Mt. Amiata areas is confirmed by the focal mechanisms of recent earthquakes.
An essential phase of the monitoring activities has been the selection of suitable sites for the installation of the monitoring stations. This has been carried out on the basis of: (i) hydrogeologic and structural studies to assess the underground fluid circulation regime; (ii) a detailed geochemical study of all the natural manifestations present in the selected territories, such as cold and hot springs and gas emission zones; and (iii) logistical aspects. A detailed hydrogeochemical study was therefore performed in 2002. A total of 150 water points were sampled and analysed in the Garfagnana/Lunigiana area (NW Tuscany). Based on the results of this multidisciplinary study, five water points suitable for the installation of monitoring stations were selected: Bagni di Lucca (Bernabò spring), Gallicano (Capriz spring), and Pieve Fosciana (Prà di Lama spring) in Garfagnana; and Equi Terme (the main spring feeding the swimming pool of the thermal resort) and Villafranca (the well feeding the public swimming pool) in Lunigiana. In the Amiata area, 69 water points were sampled and analyzed in the preliminary campaign, and five sites were selected: Piancastagnaio, Santa Fiora, Pian dei Renai, and Bagnore, which are fed by the volcanic aquifer, and the Bagno Vignoni borehole, which is fed by the evaporite-carbonate aquifer. The installation and start-up of the monitoring systems began in November 2002 in the Garfagnana-Lunigiana area and in June 2003 in the Monte Amiata region. Since the day of installation, periodic water sampling and manual measurement of the main physical and physicochemical parameters have been carried out on a monthly basis. This activity has the double function of cross-checking the monitoring instrumentation and providing additional chemical and isotopic analyses.
The continuous automatic monitoring stations operate with flowing water (about 5 litres per minute) and record the following parameters: temperature (T), pH, electrical conductivity (EC), redox potential (ORP), and the content of CO2 and CH4 dissolved in the water. Data are acquired once per second; the mean, median, and variance of the samples collected over each 5-minute period are recorded in local removable non-volatile memory (a Compact Flash card). Data can be downloaded both on site and remotely, via a GSM/GPRS modem connected to the embedded PC. The results of seven years of continuous monitoring can be summarised as follows: (i) the monitoring stations made it possible to detect smaller variations of the measured parameters than equivalent commercial devices; (ii) the acquired data made it possible to identify the groundwater circulation patterns; (iii) in most locations, the observed trend of the acquired parameters is consistent with the periodic manual sampling results, and confirms the mixture of different water types determined by the hydrogeochemical model. The absence of seismic events of sufficient energy precluded the detection of anomalies, with two exceptions: the Equi Terme and Bagno Vignoni sites. At the Equi Terme station, an anomalous increase in the dissolved CO2 content was observed twelve days before a M=3.7 earthquake that occurred 3 km north of the monitoring station. At the Bagno Vignoni station, an anomalous decrease in the temperature and electrical conductivity signals was observed nine days before a M=3.3 earthquake that occurred 12 km east of the monitoring station. The CAMN has thus proved a suitable tool for investigating anomalous variations of the physical, physicochemical, and chemical parameters of aquifer systems as possible earthquake precursors.
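
    The recording scheme described above (1 Hz acquisition reduced to the mean, median, and variance of each 5-minute window) can be sketched as follows. This is an illustrative Python reduction, not the stations' embedded software:

```python
import statistics


def aggregate_window(samples):
    """Reduce one window of 1 Hz samples to the summary
    statistics the stations record."""
    return {
        "mean": statistics.mean(samples),
        "median": statistics.median(samples),
        "variance": statistics.variance(samples),
    }


def aggregate_stream(samples, window=300):
    """Split a 1 Hz stream into 5-minute (300-sample) windows
    and summarize each one."""
    return [aggregate_window(samples[i:i + window])
            for i in range(0, len(samples) - window + 1, window)]


# Ten minutes of synthetic pH readings fluctuating around 7.2.
readings = [7.2 + 0.01 * ((i * 37) % 11 - 5) for i in range(600)]
summary = aggregate_stream(readings)
print(len(summary), round(summary[0]["mean"], 2))
```

    Each dictionary corresponds to one 5-minute record written to the Compact Flash card; keeping the variance preserves a measure of short-term fluctuation that the mean alone would hide.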

  12. Earthquake Alert System feasibility study

    SciTech Connect

    Harben, P.E.

    1991-12-01

    An Earthquake Alert System (EAS) could give several seconds to several tens of seconds of warning before the strong motion from a large earthquake arrives. Such a system would include a large network of sensors distributed within an earthquake-prone region. The sensors closest to the epicenter of a particular earthquake would transmit data at the speed of light to a central processing center, which would broadcast an area-wide alarm in advance of the spreading elastic wave energy from the earthquake. This is possible because seismic energy travels slowly (3--6 km/s) compared to the speed of light. Utilities, public and private institutions, businesses, and the general public would benefit from an EAS. Although many earthquake protection systems exist that automatically shut down power, gas mains, etc. when ground motion at a facility reaches damaging levels, no EAS -- that is, a system that can provide warning in advance of elastic wave energy arriving at a facility -- has ever been developed in the United States. A recent study by the National Academy of Sciences (NRC, 1991) concludes that an EAS is technically feasible and strongly recommends installing a prototype system that makes use of existing microseismic stations as much as possible. The EAS concept discussed here consists of a distributed network of remote seismic stations that measure weak and strong earth motion and transmit the data in real time to a central facility. This facility processes the data and issues warning broadcasts in the form of information packets containing estimates of earthquake location, zero time (the time the earthquake began), magnitude, and reliability of the predictions. Users of the warning broadcasts have a dedicated receiver that monitors the warning broadcast frequency. Each user also has preprogrammed responses that are automatically executed when the warning information packets contain location and magnitude estimates above the facility's tolerance.
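
    The warning window rests on simple arithmetic: the alert propagates essentially at the speed of light, while the damaging S waves travel at only a few kilometers per second. A minimal sketch, with illustrative velocity and latency figures (not values from the report):

```python
def warning_time(epicentral_distance_km, s_speed_km_s=3.5,
                 system_latency_s=5.0):
    """Seconds of warning before strong (S-wave) shaking arrives.

    The broadcast is treated as instantaneous relative to the
    elastic waves; system_latency_s lumps together detection,
    processing, and broadcast delays. All figures here are
    assumptions for illustration.
    """
    s_arrival_s = epicentral_distance_km / s_speed_km_s
    return max(0.0, s_arrival_s - system_latency_s)


for d_km in (20, 50, 100, 200):
    print(f"{d_km:4d} km -> {warning_time(d_km):5.1f} s of warning")
```

    Sites very close to the epicenter get little or no warning because the system latency consumes the travel time, which is why an EAS chiefly protects facilities tens to hundreds of kilometers from the epicenter.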

  14. Monitoring Local and Teleseismic Earthquakes Off--Shore San Diego(California) During an OBSIP Test Deployment

    NASA Astrophysics Data System (ADS)

    Laske, G.; Babcock, J.; Hollinshead, C.; Georgieff, P.; Allmann, B.; Orcutt, J.

    2004-12-01

    The Scripps OBS (Ocean Bottom Seismometer) team is one of three groups that provide instrumentation for the US National OBS Instrument Pool (OBSIP). The compact active-source LC2000 instruments are being used successfully in numerous experiments, with excellent data quality and return rates. A set of five new passive seismic instruments was test-deployed from November 6, 2003 through January 8, 2004 in the San Diego Trough, about 1 km below the sea surface and about 40 km offshore San Diego, California. These instruments are equipped with a Nanometrics Trillium 40s 3-component seismometer and a Cox-Webb differential pressure gauge (DPG). We recorded more than 30 teleseismic earthquakes suitable for a long-period surface wave study. The vertical-component seismometer recordings are of excellent quality and are often superior to those from similar sensors on land (Guralp CMG-40T). The signal-to-noise ratio on the DPGs depends strongly on the water depth and was expected to be low for the test deployment. Nevertheless, the December 22, 2003 San Simeon, California earthquake was recorded with high fidelity, and non-seismogenic signals are extremely coherent down to very long periods. We also recorded numerous local earthquakes. Many of these occurred offshore, and the OBSs were the closest stations by many tens of kilometers. For example, a magnitude 3.0 earthquake on the Coronado Banks Fault was recorded at station SOL in La Jolla at about 30 km distance, with a signal-to-noise ratio too poor to pick the first arrival. The next closest stations were 60 km and 80 km away, while one of the OBSs was only 20 km away. The co-deployment of DPGs allowed us to observe the first P arrival very clearly. We also recorded numerous events that were not recorded on land. About six months later, on June 15, 2004, the greater San Diego area was struck by a magnitude 5.2 earthquake on the San Clemente Fault, about 40 km southwest of the OBS test deployment. Though no structural damage was reported, intensity 4 shaking occurred throughout the city, which prompted Amtrak and Sea World to shut down operations for inspections. These events are continuous reminders that significant seismic hazard is posed by activity along the poorly understood offshore faults in the California Borderland. Real-time seismic monitoring using cabled or moored seismic observatories is clearly needed.

  15. 88 hours: The U.S. Geological Survey National Earthquake Information Center response to the 11 March 2011 Mw 9.0 Tohoku earthquake

    USGS Publications Warehouse

    Hayes, G.P.; Earle, P.S.; Benz, H.M.; Wald, D.J.; Briggs, R.W.

    2011-01-01

    This article presents a timeline of NEIC response to a major global earthquake for the first time in a formal journal publication. We outline the key observations of the earthquake made by the NEIC and its partner agencies, discuss how these analyses evolved, and outline when and how this information was released to the public and to other internal and external parties. Our goal in the presentation of this material is to provide a detailed explanation of the issues faced in the response to a rare, giant earthquake. We envisage that the timeline format of this presentation can highlight technical and procedural successes and shortcomings, which may in turn help prompt research by our academic partners and further improvements to our future response efforts. We have shown how NEIC response efforts have significantly improved over the past six years since the great 2004 Sumatra-Andaman earthquake. We are optimistic that the research spawned from this disaster, and the unparalleled dense and diverse data sets that have been recorded, can lead to similar - and necessary - improvements in the future.

  16. The 2010 Mw 8.8 Maule megathrust earthquake of Central Chile, monitored by GPS.

    PubMed

    Vigny, C; Socquet, A; Peyrat, S; Ruegg, J-C; Métois, M; Madariaga, R; Morvan, S; Lancieri, M; Lacassin, R; Campos, J; Carrizo, D; Bejar-Pizarro, M; Barrientos, S; Armijo, R; Aranda, C; Valderas-Bermejo, M-C; Ortega, I; Bondoux, F; Baize, S; Lyon-Caen, H; Pavez, A; Vilotte, J P; Bevis, M; Brooks, B; Smalley, R; Parra, H; Baez, J-C; Blanco, M; Cimbaro, S; Kendrick, E

    2011-06-17

    Large earthquakes produce crustal deformation that can be quantified by geodetic measurements, allowing for the determination of the slip distribution on the fault. We used data from Global Positioning System (GPS) networks in Central Chile to infer the static deformation and the kinematics of the 2010 moment magnitude (Mw) 8.8 Maule megathrust earthquake. From elastic modeling, we found a total rupture length of ~500 kilometers where slip (up to 15 meters) concentrated on two main asperities situated on both sides of the epicenter. We found that rupture reached shallow depths, probably extending up to the trench. Resolvable afterslip occurred in regions of low coseismic slip. The low-frequency hypocenter is relocated 40 kilometers southwest of initial estimates. Rupture propagated bilaterally at about 3.1 kilometers per second, with possible but not fully resolved velocity variations. PMID:21527673

  17. Real-time particulate fallout contamination monitoring technology development at NASA Kennedy Space Center

    NASA Astrophysics Data System (ADS)

    Mogan, Paul A.; Schwindt, Chris J.

    1998-10-01

    Two separate real-time particulate fallout monitoring instruments have been developed by the Contamination Monitoring Laboratory at NASA John F. Kennedy Space Center. These instruments monitor particulate fallout contamination deposition rates in cleanrooms and allow certification of cleanliness levels as well as proactive protection of valuable flight hardware.

  18. Earthquake monitoring of eastern Washington: Annual technical report 1986 (July 1, 1985-June 30, 1986)

    SciTech Connect

    Not Available

    1986-10-01

    This report covers the operations and research performed for DOE by the University of Washington Geophysics Program on the seismicity and structure of eastern Washington and northeastern Oregon for the year July 1, 1985, to June 30, 1986. There are presently 111 stations in Washington and northeastern Oregon whose data are telemetered to the University for recording, analysis, and interpretation. Section I covers the operation of the network, including station maintenance, data processing, and telemetry problems in eastern Washington and the Washington-Oregon border area. A fairly detailed description of the switch to BPA microwave telemetry is included. The seismicity of the past year and a description of the catalog are covered in Section II. Section III describes our experiments with an earthquake location routine for eastern Washington using a velocity structure that includes a low-velocity zone. Section IV gives the detailed results of our re-calibration of the USGS amplitude magnitude scale used between 1969 and 1974 for magnitude determinations in eastern Washington. Section V explores techniques to improve timing accuracy and relative locations, using a well-recorded swarm of earthquakes in the Cascades. The appendix includes the catalog of earthquakes located in eastern Washington during this year and a table of station outages.

  19. Activity remotely triggered in volcanic and geothermal centers in California and Washington by the 3 November 2002 Mw=7.9 Alaska earthquake

    NASA Astrophysics Data System (ADS)

    Hill, D. P.; Prejean, S.; Oppenheimer, D.; Pitt, A. M.; S. D. Malone; Richards-Dinger, K.

    2002-12-01

    The M=7.9 Alaska earthquake of 3 November 2002 was followed by bursts of remotely triggered earthquakes at several volcanic and geothermal areas across the western United States at epicentral distances of 2,500 to 3,660 km. Husen et al. (this session) describe the triggered response for Yellowstone caldera, Wyoming. Here we highlight the triggered response for the Geysers geothermal field in northern California, Mammoth Mountain and Long Valley caldera in eastern California, the Coso geothermal field in southeastern California, and Mount Rainier in central Washington. The onset of triggered seismicity at each of these areas began 15 to 17 minutes after the Alaska earthquake, during the S-wave coda and the early phases of the Love and Rayleigh waves, with periods of 5 to 40 seconds and dynamic strains of a few microstrain. In each case, the seismicity was characterized by spasmodic bursts of small (M<2), brittle-failure earthquakes. The activity persisted for just a few minutes at Mount Rainier and Mammoth Mountain and roughly 30 minutes at the Geysers and Coso geothermal fields. Many of the triggered earthquakes at all of these sites were too small for reliable locations (magnitudes M<1), although their small S-P times indicate hypocentral locations within a few kilometers of the nearest seismic station. Borehole dilatometers in the vicinity of Mammoth Mountain recorded strain offsets on the order of 0.1 microstrain coincident in time with the triggered seismicity (Johnston et al., this session), and the water level in the 3-km-deep LVEW well in the center of Long Valley caldera dropped by ~13 cm during passage of the seismic wave train from the Alaska earthquake, followed by a gradual recovery. The Geysers, Coso, and Mount Rainier have no continuous, high-resolution strain instrumentation.
A larger earthquake swarm that began 23.5 hours later (21:38 UT on the 4th) in the south moat of Long Valley caldera, and that included nine M>2 earthquakes and one M=3.0 earthquake, may represent a delayed response to the Alaska earthquake.
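
    The hypocentral distances inferred above from small S-P times follow a standard rule of thumb: with typical crustal velocities, each second of S-minus-P delay corresponds to roughly 8 km of distance. A sketch assuming generic crustal velocities (not values from the study):

```python
def hypocentral_distance_km(sp_time_s, vp_km_s=6.0, vs_km_s=3.5):
    """Distance from the S-minus-P arrival-time difference:
    d = t_SP * Vp * Vs / (Vp - Vs).
    With Vp = 6.0 and Vs = 3.5 km/s this gives ~8.4 km per
    second of S-P time; both velocities are generic assumptions.
    """
    return sp_time_s * vp_km_s * vs_km_s / (vp_km_s - vs_km_s)


# An S-P time of ~0.5 s places the source within a few km of the
# recording station, consistent with the events described above.
print(round(hypocentral_distance_km(0.5), 1))
```

    This single-station estimate gives only a distance, not a direction; it is why very small S-P times constrain the sources to lie close to the nearest station even when the events are too small to locate formally.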

  20. A framework for rapid post-earthquake assessment of bridges and restoration of transportation network functionality using structural health monitoring

    NASA Astrophysics Data System (ADS)

    Omenzetter, Piotr; Ramhormozian, Shahab; Mangabhai, Poonam; Singh, Ravikash; Orense, Rolando

    2013-04-01

    Quick and reliable assessment of the condition of bridges in a transportation network after an earthquake can greatly assist immediate post-disaster response and long-term recovery. However, experience shows that available resources, such as qualified inspectors and engineers, will typically be stretched by such tasks. Structural health monitoring (SHM) systems can therefore make a real difference in this context. SHM, however, needs to be deployed in a strategic manner and integrated into the overall disaster response plans and actions to maximize its benefits. The first part of this study presents a framework for how this can be achieved. Since it will not be feasible, or indeed necessary, to use SHM on every bridge, bridges within individual networks must be prioritized for SHM deployment. A methodology for such prioritization, based on the structural and geotechnical seismic risks affecting bridges and their importance within a network, is proposed in the second part. An example application of the methodology to selected bridges in the medium-sized transportation network of Wellington, New Zealand is provided. The third part of the paper is concerned with using monitoring data for quick assessment of bridge condition and damage after an earthquake. Depending on the bridge risk profile, it is envisaged that data will be obtained either from local or national seismic monitoring arrays or from SHM systems installed on bridges. A method using artificial neural networks is proposed for inferring key ground motion parameters at an arbitrary bridge site from seismic-array data. The methodology is applied to seismic data collected in Christchurch, New Zealand. Finally, the paper outlines how such ground motion parameters can be used in bridge damage and condition assessment.

  1. Catalog of earthquake hypocenters at Alaskan Volcanoes: January 1 through December 31, 2010

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.; Searcy, Cheryl K.

    2011-01-01

    Between January 1 and December 31, 2010, the Alaska Volcano Observatory (AVO) located 3,405 earthquakes, of which 2,846 occurred within 20 kilometers of the 33 volcanoes with seismograph subnetworks. There was no significant seismic activity in 2010 at these monitored volcanic centers. Seismograph subnetworks with severe outages in 2009 were repaired in 2010 resulting in three volcanic centers (Aniakchak, Korovin, and Veniaminof) being relisted in the formal list of monitored volcanoes. This catalog includes locations and statistics of the earthquakes located in 2010 with the station parameters, velocity models, and other files used to locate these earthquakes.

  2. A summary of ground motion effects at SLAC (Stanford Linear Accelerator Center) resulting from the Oct 17th 1989 earthquake

    SciTech Connect

    Ruland, R.E.

    1990-08-01

    Ground motions resulting from the October 17th 1989 (Loma Prieta) earthquake are described and can be correlated with some geologic features of the SLAC site. Recent deformations of the linac are also related to slow motions observed over the past 20 years. Measured characteristics of the earthquake are listed. Some effects on machine components and detectors are noted. 18 refs., 16 figs.

  3. Analysis in natural time domain of geoelectric time series monitored prior to two strong earthquakes in Mexico

    NASA Astrophysics Data System (ADS)

    Ramírez-Rojas, A.; Flores-Marquez, L. E.

    2009-12-01

    The short-time prediction of seismic phenomena is currently an important problem in the scientific community. In particular, the electromagnetic processes associated with seismic events have attracted great interest since the VAN method was implemented. The most important features of this methodology are the seismic electric signals (SES) observed prior to strong earthquakes. SES have been observed in electromagnetic series linked to EQs in Greece, Japan, and Mexico. By means of the so-called natural time domain, introduced by Varotsos et al. (2001), signals of dichotomic nature observed in different systems, such as SES and ionic current fluctuations in membrane channels, can be characterized. In this work we analyze SES observed in geoelectric time series monitored in Guerrero, México. Our analysis concerns two strong earthquakes that occurred on October 24, 1993 (M=6.6) and September 14, 1995 (M=7.3). The time series for the first displayed a seismic electric signal six days before the main shock; for the second, the time series displayed dichotomous-like fluctuations some months before the EQ. We present the first results of the natural time domain analysis for the two cases, which seem to agree with the results reported by Varotsos. P. Varotsos, N. Sarlis, and E. Skordas, Practica of the Athens Academy 76, 388 (2001).
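For concreteness, the basic natural-time quantities used in this approach can be computed directly: the k-th of N events is assigned natural time chi_k = k/N and weight p_k = Q_k / sum(Q_n), where Q_k measures the event's energy, and the variance kappa_1 = <chi^2> - <chi>^2 serves as the key parameter (values near 0.070 have been reported for SES activities in the natural-time literature). A minimal sketch:

```python
import numpy as np

def kappa1(energies):
    """Natural-time variance kappa_1 = <chi^2> - <chi>^2, with chi_k = k/N
    and weights p_k proportional to the event energies Q_k."""
    q = np.asarray(energies, dtype=float)
    n = len(q)
    chi = np.arange(1, n + 1) / n          # natural time of each event
    p = q / q.sum()                        # normalized energy weights
    return float(np.sum(p * chi**2) - np.sum(p * chi)**2)

# With uniform energies, kappa_1 reduces to the plain variance of chi,
# which tends to 1/12 for large N.
k1_uniform = kappa1(np.ones(1000))
```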

  4. Improved earthquake monitoring in the central and eastern United States in support of seismic assessments for critical facilities

    USGS Publications Warehouse

    Leith, William S.; Benz, Harley M.; Herrmann, Robert B.

    2011-01-01

    Evaluation of seismic monitoring capabilities in the central and eastern United States for critical facilities - including nuclear power plants - focused on specific improvements to better understand the seismic hazards in the region. The report is not an assessment of seismic safety at nuclear plants. To accomplish the evaluation and suggest improvements using funding from the American Recovery and Reinvestment Act of 2009, the U.S. Geological Survey examined the addition of new strong-motion seismic stations in areas of seismic activity and near nuclear power-plant locations, along with integration of data from the Transportable Array of some 400 mobile seismic stations. Some 38 and 68 stations, respectively, were suggested for addition in active seismic zones and near power-plant locations. Expansion of databases for strong-motion and other earthquake source-characterization data was also evaluated. Recognizing pragmatic limitations of station deployment, augmentation of existing deployments improves source characterization by quantifying near-source attenuation in regions where larger earthquakes are expected; it also supports systematic data collection from existing networks. The report further describes modeling procedures and processing algorithms that, together with the additional stations and improved seismic databases, leverage the capabilities of existing and expanded seismic arrays.

  5. Reducing atmospheric noise in RST analysis of TIR satellite radiances for satellite monitoring of earthquake-prone areas

    NASA Astrophysics Data System (ADS)

    Lisi, Mariano; Filizzola, Carolina; Genzano, Nicola; Paciello, Rossana; Pergola, Nicola; Tramutoli, Valerio

    Space-time fluctuations of the Earth's emitted Thermal Infrared (TIR) radiation observed from satellite from months to weeks before an earthquake have been reported in several studies. Among others, a Robust Satellite data analysis Technique (RST) was proposed (and applied to different satellite sensors in various geo-tectonic contexts) to discriminate anomalous signal transients possibly associated with earthquake occurrence from normal TIR signal fluctuations due to other causes (e.g., the solar diurnal-annual cycle, meteorological conditions, changes in observational conditions, etc.). Variations in satellite view angle between passes (for polar satellites) and atmospheric water vapour fluctuations were recognized in the past as the main factors affecting residual signal variability, reducing the overall Signal-to-Noise (S/N) ratio and the potential of the RST-based approach for identifying seismically related thermal anomalies. In this paper we address both factors for the first time, applying the RST approach to geostationary satellites (which guarantee stable view angles) and using Land Surface Temperature (LST) data products (which are less affected by atmospheric water vapour variability) instead of the TIR radiances at the sensor. The first results, obtained for the Abruzzo earthquake (6 April 2009, MW = 6.3) by analyzing 6 years of SEVIRI (Spinning Enhanced Visible and Infrared Imager on board the geostationary Meteosat Second Generation satellite) LST products provided by EUMETSAT, seem to confirm the greater sensitivity of the proposed approach in detecting perturbations of the Earth's thermal emission a few days before the main shock. The results achieved in terms of increased S/N ratio (in validation) and reduced "false alarm" rate (in confutation) are discussed by comparing results obtained applying RST to LST products with those achieved by applying an identical RST analysis (using the same MSG-SEVIRI 2005-2010 data set) to the simple TIR radiances at the sensor.
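The core of an RST-style analysis can be sketched as follows: build pixel-wise reference mean and standard-deviation fields from homologous scenes of previous years, then normalize the current scene against them and flag excursions beyond a threshold. This is a simplified illustration on synthetic data, not the operational RST/RETIRA implementation; the 20x20 tile, six reference years, noise levels, and K = 4 threshold are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Six "years" of homologous LST scenes for a 20x20 pixel tile (kelvin).
reference_years = 290.0 + rng.normal(0.0, 1.5, size=(6, 20, 20))
mu = reference_years.mean(axis=0)               # pixel-wise reference mean
sigma = reference_years.std(axis=0, ddof=1)     # pixel-wise reference std

# Current scene: same background plus a localized warm perturbation.
current = 290.0 + rng.normal(0.0, 1.5, size=(20, 20))
current[8:11, 8:11] += 12.0

index = (current - mu) / sigma                  # normalized anomaly index
anomalous = index > 4.0                         # K = 4, an assumed threshold
```

In the operational technique the reference fields are built from many years of same-month, same-hour scenes, which is what makes the normalization robust to the diurnal and annual cycles mentioned above.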

  6. Medical Relief Response by Miyako Public Health Center after the Great East Japan Earthquake and Tsunami, 2011.

    PubMed

    Yanagihara, Hiroki

    2016-01-01

    Objectives: To improve disaster preparedness, we investigated the response of medical relief activities managed by Iwate Prefectural Miyako Public Health Center during the post-acute phase of the Great East Japan Earthquake and Tsunami of March 11, 2011. Methods: The study divided the post-disaster period into three approximate time segments: Period I (time of disaster through late March), Period II (mid-April), and Period III (end of May in Miyako City, early July in Yamada Town). We reviewed records of medical relief activities conducted by medical assistance teams (MATs) in Miyako City and Yamada Town. Results: Miyako Public Health Center organized meetings to coordinate medical relief activities from Period I to Period III. According to the demand for medical services and the recovery of local medical institutions (LMIs) in the affected area, MATs were deployed and active at evacuation centers in their assigned areas. The number of patients examined by MATs in Miyako rose to approximately 250 per day in Period I and decreased to 100 in Period III. In Yamada, however, the number surged to 700 in Period I, fell to 100 in Period II, and decreased to 50 in Period III. This difference can be partly explained as follows. In Miyako, most evacuees consulted LMIs that had restarted medical services after the disaster; the number of restarted LMIs had already reached 29 (94% of the total) in Period I. In Yamada, most evacuees who had consulted MATs in Period I had largely moved to restarted LMIs by Period II. During the same time, roles in medical service provision were divided and coordinated, with MATs mainly in charge of primary emergency triage, in response to the number of restarted LMIs, which reached 1 (20%) in Period I and 3 (60%) in Period II. By Period III, more than 80% of patients in Miyako had only slight illnesses, such as those requiring health guidance, and the number of people who underwent emergency medical transport returned to pre-disaster levels in both locations. These results suggest that evacuees' demand for medical services declined to a stable level early in Period III. These findings could justify supporting the recovery of local medical institutions earlier, so that medical relief activities can be concluded appropriately. Conclusion: This study offers useful perspectives on the response of medical relief activities during the post-acute phase after a disaster and shows the importance of establishing information management systems that apply these perspectives. PMID:26971453

  7. Earthquake monitoring of eastern and southern Washington: Annual technical report 1983

    SciTech Connect

    Not Available

    1983-09-01

    This report covers the operations and research performed for D.O.E. and the N.R.C. by the University of Washington Geophysics Program on the seismicity and structure of eastern and southern Washington and northern Oregon during the past year. There are presently 107 stations in Washington and northern Oregon whose data are telemetered to the University for recording, analysis, and interpretation. Section I covers the details of the operation of the network in eastern and southern Washington and northern Oregon. Details of the past year's seismicity are covered in Section II. The only seismicity of note during the past year was a M=3.8 felt earthquake in southeastern Washington. Section III covers our recent crustal structure work in eastern Washington; it involves a time-term analysis of the shallow crustal structure in and around the Pasco Basin. Section IV summarizes research using the borehole seismometer, gives a brief summary of theories dealing with the effects of cracks and pores on seismic velocities and attenuation, and lists all pertinent publications from 1950 to the present. The appendices contain the earthquake catalog for 1982-1983 and a monthly listing of the station 'up-times' for the eastern Washington network. 80 refs., 11 figs., 3 tabs.

  8. Advance earthquake warnings

    NASA Astrophysics Data System (ADS)

    Showstack, Randy

    Urban centers can receive a 15-to-30-second advance warning about impending earthquakes if several technical criteria are met, according to a study by Ta-liang (Leon) Teng of the Southern California Earthquake Center and Yih-Min Wu of the Taiwan Central Weather Bureau. The study is published in the Bulletin of the Seismological Society of America (vol. 92, no. 5, June 2002). Teng and Wu's system for developing early warnings for earthquakes relies on data from a "virtual sub-network" automatically established when an earthquake triggers a group of nearby seismic stations. That information can then be immediately distributed as an early warning.
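The size of such a warning window is simple arithmetic: the alert can go out once stations near the epicenter detect P waves and processing completes, while the slower, damaging S waves are still propagating toward the city. The velocities and delays below are generic textbook values, not figures from Teng and Wu's study.

```python
VP = 6.0   # P-wave speed, km/s (assumed)
VS = 3.5   # S-wave speed, km/s (assumed)

def warning_time(city_km, station_km, processing_s=5.0):
    """Seconds between alert issuance and S-wave arrival at the city."""
    t_alert = station_km / VP + processing_s   # P detection + processing
    t_s_city = city_km / VS                    # damaging waves reach city
    return t_s_city - t_alert

# A city 100 km from the epicenter, nearest station 20 km away:
t = warning_time(city_km=100.0, station_km=20.0)   # roughly 20 s of warning
```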

  9. On the Potential Uses of Static Offsets Derived From Low-Cost Community Instruments and Crowd-Sourcing for Earthquake Monitoring and Rapid Response

    NASA Astrophysics Data System (ADS)

    Minson, S. E.; Brooks, B. A.; Murray, J. R.; Iannucci, R. A.

    2013-12-01

    We explore the efficacy of low-cost community instruments (LCCIs) and crowd-sourcing to produce rapid estimates of earthquake magnitude and rupture characteristics, which can be used for earthquake loss reduction such as issuing tsunami warnings and guiding rapid response efforts. Real-time high-rate GPS data are just beginning to be incorporated into earthquake early warning (EEW) systems. These data are showing promising utility, including moment magnitude estimates that do not saturate for the largest earthquakes and real-time determination of the geometry and slip distribution of the earthquake rupture. However, building a network of scientific-quality real-time high-rate GPS stations requires substantial infrastructure investment, which is not practicable in many parts of the world. To expand the benefits of real-time geodetic monitoring globally, we consider the potential of pseudorange-based GPS locations, such as the real-time positioning done onboard cell phones or on LCCIs that could be distributed in the same way accelerometers are distributed as part of the Quake Catcher Network (QCN). While location information from LCCIs often has large uncertainties, their low cost means that large numbers of instruments can be deployed. A monitoring network that includes smartphones could collect data from potentially millions of instruments, and these observations could be averaged together to substantially decrease errors in estimated earthquake source parameters. While these data will be inferior to data recorded by scientific-grade seismometers and GPS instruments, community-based data collection (and possibly analysis) has very attractive features. This approach creates a system where every user can host an instrument or download a smartphone application that provides earthquake and tsunami warnings while also supplying the data on which the warning system operates. This symbiosis encourages people both to become users of the warning system and to contribute data to it. Further, there is some potential to take advantage of the LCCI hosts' computing and communications resources to perform some of the analysis required for the warning system. We will present examples of the type of data that might be observed by pseudorange-based positioning for both actual earthquakes and laboratory tests, as well as performance tests of potential earthquake source modeling derived from pseudorange data. A highlight of these performance tests is a case study of the 2011 Mw 9 Tohoku-oki earthquake.
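The averaging argument can be made concrete: the standard error of the mean of N independent measurements shrinks as sigma/sqrt(N), so meters-level per-device position noise can in principle yield centimeter-level static-offset estimates when thousands of devices contribute. A sketch under assumed noise levels (the 3 m per-device sigma and the 0.8 m offset are illustrative, not values from the study):

```python
import numpy as np

rng = np.random.default_rng(42)

true_offset = 0.8                 # "static offset" to recover, meters
sigma = 3.0                       # per-device position noise, meters (assumed)
n_devices = 10_000

# Each device reports the true offset corrupted by independent noise.
measurements = true_offset + rng.normal(0.0, sigma, size=n_devices)

estimate = measurements.mean()                  # crowd-sourced estimate
standard_error = sigma / np.sqrt(n_devices)     # 3 m / 100 = 0.03 m
```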

  10. Hatfield Marine Science Center Dynamic Revetment Project DSL permit #45455-FP, Monitoring Report February 2012

    EPA Science Inventory

    A Dynamic Revetment (gravel beach) was installed in November, 2011 on the shoreline along the northeastern edge of the Hatfield Marine Science Center (HMSC) to mitigate erosion that threatened HMSC critical infrastructure. Shoreline topographic and biological monitoring was init...

  11. Monitoring of the Permeable Pavement Demonstration Site at the Edison Environmental Center (Poster)

    EPA Science Inventory

    This is a poster on the permeable pavement parking lot at the Edison Environmental Center. The monitoring scheme for the project is discussed in-depth with graphics explaining the instrumentation installed at the site.

  12. Hatfield Marine Science Center Dynamic Revetment Project DSL permit #45455-FP, Monitoring Report February, 2013

    EPA Science Inventory

    A Dynamic Revetment (gravel beach) was installed in November, 2011 on the shoreline along the northeastern edge of the Hatfield Marine Science Center (HMSC) to mitigate erosion that threatened HMSC critical infrastructure. Shoreline topographic and biological monitoring was init...

  13. GREENHOUSE GAS (GHG) MITIGATION AND MONITORING TECHNOLOGY PERFORMANCE: ACTIVITIES OF THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper discusses greenhouse gas (GHG) mitigation and monitoring technology performance activities of the GHG Technology Verification Center. The Center is a public/private partnership between Southern Research Institute and the U.S. EPA's Office of Research and Development. It...

  14. (Stanford Linear Accelerator Center) annual environmental monitoring report, January--December 1989

    SciTech Connect

    Not Available

    1990-05-01

    This progress report discusses environmental monitoring activities at the Stanford Linear Accelerator Center for 1989. Topics include climate, site geology, site water usage, land use, demography, unusual events or releases, radioactive and nonradioactive releases, compliance summary, environmental nonradiological program information, environmental radiological program information, groundwater protection monitoring, and quality assurance. 5 figs., 7 tabs. (KJD)

  15. GONAF - A deep Geophysical Observatory at the North Anatolian Fault: Permanent downhole monitoring of a pending major earthquake

    NASA Astrophysics Data System (ADS)

    Bulut, Fatih; Bohnhoff, Marco; Dresen, Georg; Raub, Christina; Kilic, Tugbay; Kartal, Recai F.; Tuba Kadirioglu, F.; Nurlu, Murat; Ito, Hisao; Malin, Peter E.

    2014-05-01

    The North Anatolian Fault Zone (NAFZ) is a right-lateral transform plate boundary between the Anatolian plate and Eurasia, accommodating a relative plate motion of ~25 mm/yr. Almost the entire fault zone failed during the last century in a westward-migrating sequence of destructive earthquakes, leaving a very high probability of a forthcoming large event on the Sea of Marmara segments. This area has not hosted any M>7 earthquake since 1766. Therefore, monitoring the Sea of Marmara segments at a very low detection threshold is required to address how brittle deformation develops along a critically stressed fault segment prior to a potential failure. The GONAF-ICDP project was developed to build a downhole seismic network surrounding the Sea of Marmara segments of the NAFZ by deploying 300 m deep boreholes equipped with chains of sensitive seismographs. Natural and city-induced noise is attenuated through the unconsolidated subsurface formation, providing ideal boundary conditions for seismic monitoring within intact rock at greater depths. A typical GONAF borehole consists of a 1 Hz vertical sensor at every 75 m depth increment and a combination of 1 Hz, 2 Hz, and 15 Hz three-component sensors at 300 m depth. To date, three boreholes have been successfully completed in the Tuzla and Yalova-Çınarcık regions. The plan is to complete four more GONAF boreholes in 2014. Our preliminary results show that GONAF waveform recordings will lower the magnitude threshold to ~M -1 in the target area, providing better characterization of seismically active features in time and space.

  16. Earthquake Testing

    NASA Technical Reports Server (NTRS)

    1979-01-01

    During NASA's Apollo program, it was necessary to subject the mammoth Saturn V launch vehicle to extremely forceful vibrations to assure the moon booster's structural integrity in flight. Marshall Space Flight Center assigned vibration testing to a contractor, the Scientific Services and Systems Group of Wyle Laboratories, Norco, California. Wyle-3S, as the group is known, built a large facility at Huntsville, Alabama, and equipped it with an enormously forceful shock and vibration system to simulate the liftoff stresses the Saturn V would encounter. Saturn V is no longer in service, but Wyle-3S has found spinoff utility for its vibration facility. It is now being used to simulate earthquake effects on various kinds of equipment, principally equipment intended for use in nuclear power generation. Government regulations require that such equipment demonstrate its ability to survive earthquake conditions. In the upper left photo, Wyle-3S is preparing to conduct an earthquake test on a 25-ton diesel generator built by Atlas Polar Company, Ltd., Toronto, Canada, for emergency use in a Canadian nuclear power plant. Being readied for test in the lower left photo is a large circuit breaker to be used by Duke Power Company, Charlotte, North Carolina. Electro-hydraulic and electro-dynamic shakers in and around the pit simulate earthquake forces.

  17. Structural Health Monitoring Sensor Development at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Prosser, W. H.; Wu, M. C.; Allison, S. G.; DeHaven, S. L.; Ghoshal, A.

    2002-01-01

    NASA is applying considerable effort on the development of sensor technology for structural health monitoring (SHM). This research is targeted toward increasing the safety and reliability of aerospace vehicles, while reducing operating and maintenance costs. Research programs are focused on applications to both aircraft and space vehicles. Sensor technologies under development span a wide range including fiber-optic sensing, active and passive acoustic sensors, electromagnetic sensors, wireless sensing systems, MEMS, and nanosensors. Because of their numerous advantages for aerospace applications, fiber-optic sensors are one of the leading candidates and are the major focus of this presentation. In addition, recent advances in active and passive acoustic sensing will also be discussed.

  18. Source spectra, moment, and energy for recent Eastern Mediterranean earthquakes: calibration of International Monitoring System stations

    SciTech Connect

    Mayeda, K M; Hofstetter, A; Rodgers, A J; Walter, W R

    2000-07-26

    In the past several years there have been several large (Mw > 7.0) earthquakes in the eastern Mediterranean region (Gulf of Aqaba, Racha, Adana, etc.), many of which have had aftershock deployments by local seismological organizations. In addition to providing ground truth data (GT < 5 km) used in regional location calibration and validation, the waveform data can be used to help calibrate regional magnitudes, seismic discriminants, and velocity structure. For small regional events (mb < 4.5), a stable, accurate magnitude is essential for developing realistic detection threshold curves, proper magnitude and distance amplitude correction processing, formation of an Ms:mb discriminant, and accurate yield determination of clandestine nuclear explosions. Our approach provides stable source spectra from which Mw and mb can be obtained without regional magnitude biases. Once calibration corrections are obtained for earthquakes, the coda-derived source spectra exhibit strong depth-dependent spectral peaking when the same corrections are applied to explosions at the Nevada Test Site (Mayeda and Walter, 1996), chemical explosions in the recent ''Depth of Burial'' experiment in Kazakhstan (Myers et al., 1999), and the recent nuclear test in India. For events in the western U.S. we found that total seismic energy, E, scales as Mo^1.25, resulting in more radiated energy than would be expected under the assumptions of constant stress-drop scaling. Preliminary results for events in the Middle East region also show this behavior, which appears to result from an intermediate spectral fall-off (f^-1.5) for frequencies between ~0.1 and 0.8 Hz for the larger events. We developed a Seismic Analysis Code (SAC) coda processing command that reads an ASCII flat file containing calibration information specific to a station and surrounding region, then outputs coda-derived source spectra, a moment estimate, and an energy estimate.
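Once a coda-derived seismic moment M0 is in hand, the moment magnitude follows from the standard Hanks-Kanamori relation Mw = (2/3)(log10 M0 - 9.1) for M0 in N·m. This is the conventional relation, not a formula specific to this report:

```python
import math

def moment_magnitude(m0_newton_meters):
    """Hanks-Kanamori moment magnitude from seismic moment in N*m."""
    return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.1)

mw = moment_magnitude(1.0e19)   # M0 = 1e19 N*m gives Mw = 6.6
```

Because Mw is derived directly from M0, it does not saturate the way band-limited magnitudes such as mb do, which is why stable spectra-based moment estimates matter for the discriminants discussed above.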

  19. Earthquake monitoring of eastern and southern Washington: Annual technical report 1982

    SciTech Connect

    Not Available

    1982-09-01

    This report covers the operation and research performed for DOE and the NRC by the University of Washington Geophysics Program on the seismicity and structure of eastern and southern Washington during the past year. There are presently 114 stations in Washington and northern Oregon whose data are telemetered to the University for recording, analysis, and interpretation. Section I of this report covers the details of the operation of the network in eastern and southern Washington and northern Oregon. Details of the past year's seismicity are covered in Section II. There was little seismicity of note this past year anywhere in the state, in marked contrast to the previous two years. Section III covers some recent advances in our crustal structure studies. Besides summarizing some recent results of investigations of crustal and upper mantle structure of the north Cascade mountains, this section covers a study using laterally inhomogeneous velocity structures to model the transition between several tectonic provinces. Synthetic seismograms are calculated for comparison with observed record sections constructed from digital recordings of medium-sized earthquakes. Section IV summarizes research using a borehole seismometer and also describes a technique for recovering velocity, attenuation, and fracture porosity from a cross-hole seismic survey run by a subcontractor for Rockwell Inc. a few years ago. Section V is a preliminary report on a teleseismic P-wave delay study using the digital records of # teleseisms recorded at # stations of the statewide network. 14 refs., 19 figs., 5 tabs.

  20. Earthquake monitoring of eastern and southern Washington: Annual technical report 1984

    SciTech Connect

    Not Available

    1984-09-01

    This report covers the operations and research performed for DOE and NRC by the University of Washington Geophysics Program on the seismicity and structure of eastern and southern Washington and northern Oregon for the year July 1, 1983 to June 30, 1984. There are presently 104 stations in Washington and northern Oregon whose data are telemetered to the University for recording, analysis, and interpretation. Section I of this report covers the details of the operation of the network in eastern Washington and the Washington-Oregon border area. Details of the past year's seismicity and a description of the catalog are covered in Section II. Section III is a preliminary description of our data from the joint USGS-Rockwell-UW refraction experiment carried out this summer. Preliminary examination of these data indicates that useful interpretations will be possible. Section IV summarizes research in vertical seismic profiling, which will be useful for studying the velocity and attenuation structure of the basalts. The appendices include the earthquake catalog for 1983-1984 and a monthly listing of the station ''up-times'' for the eastern Washington network. 4 refs., 3 figs., 3 tabs.

  1. A cost effective wireless structural health monitoring network for buildings in earthquake zones

    NASA Astrophysics Data System (ADS)

    Pentaris, F. P.; Stonham, J.; Makris, J. P.

    2014-10-01

    The design, programming, and implementation of a cost-effective wireless structural health monitoring system (wSHMs) is presented, capable of monitoring seismic and/or man-made accelerations in buildings. The system operates as a sensor network exploiting commonly available internet connections to monitor the structural health of the buildings in which it is installed. A key feature of wSHMs is that it can be implemented in Wide Area Network mode to cover many remote structures and buildings on a metropolitan scale. Acceleration data can be sent in real time from dozens of buildings across a broad metropolitan area to a central database, where it is analyzed to detect possible structural damage or nonlinear characteristics and to raise alerts when specific structures are found unsuitable.

  2. Monitoring of movement of potential earthquake areas with precise distance measuring and leveling systems

    NASA Astrophysics Data System (ADS)

    Staples, Jack E.

    1986-11-01

    Whether monitoring crustal movements in localized volcanic areas, along known fault lines, or over large crustal-movement areas, the geodesist has been restricted by the measurement accuracy of the instruments used, the accumulation of errors, the lack of reliable air refraction information, and the problem of finding measurement procedures and mathematical solutions that ensure the inherent errors of the measurement-mathematical procedures do not exceed any conceivable ground movement. Recent technological advances have placed new instruments and systems at the geodesist's disposal, so that it is now feasible to measure and analyze these micro and macro crustal movements within the required accuracies. The paper describes three such systems: (1) the Wild Electronic Theodolite T-2000 with a highly precise distance-measurement instrument, the DI-4S, together with a data collector, the GRE-3, connected to a computer and a plotter to measure and analyze both micro and macro crustal movements; (2) the Wild NAK-2 level with an antimagnetic compensator, which increases accuracy in height/velocity monitoring of vertical crustal movements by virtually eliminating the influence of natural or man-made magnetic fields on the automatic level; (3) the use of analytical photogrammetry employing both terrestrial and aerial photography to monitor crustal movements. By taking advantage of these new instruments and systems, the scientist's capability to provide crustal movement data for use in the analysis and prediction of micro or macro crustal movement is greatly enhanced.

  3. Micro-earthquake monitoring and tri-axial drill-bit VSP in NEDO "Deep-seated geothermal reservoir survey" in Kakkonda, Japan

    SciTech Connect

    Takahashi, M.; Kondo, T.; Suzuki, I.

    1995-12-31

    New Energy and Industrial Technology Development Organization has been drilling well WD-1 and employing micro-earthquake monitoring and tri-axial drill-bit VSP as exploration techniques for the deep geothermal reservoir in the Kakkonda geothermal field, Japan. The results are as follows: (1) More than 1000 micro-earthquakes were observed from December 23, 1994 to July 1, 1995 in the Kakkonda geothermal field. Epicenters are distributed NW-SE from a macroscopic viewpoint; they lie in almost the same areas as the fractured zone in the shallow Kakkonda reservoir, as pointed out by Doi et al. (1988), and include three groups trending NE-SW. Hypocenter depths range from the ground surface to about -2.5 km relative to sea level and appear deeper in the western part. (2) Well WD-1 drilled into a swarm of micro-earthquakes at depths of 1200 to 2200 m and encountered many lost circulations at those depths; however, these earthquakes occurred before well WD-1 reached those depths. (3) The bottom boundary of the micro-earthquake distribution has a shape very similar to that of the top of the Kakkonda granite, though all of the micro-earthquakes plot 300 m shallower than the top of the granite. (4) The TAD VSP shows the possible existence of seismic reflectors at sea levels around -2.0, -2.2, and -2.6 km. These reflectors seem to correspond to the top of the Pre-Tertiary formation, the top of the Kakkonda granite, and reflectors within the Kakkonda granite.

  4. Academia Sinica, TW E-science to Assist Seismic Observations for Earthquake Research, Monitoring and Hazard Reduction Surrounding the South China Sea

    NASA Astrophysics Data System (ADS)

    Huang, Bor-Shouh; Liu, Chun-Chi; Yen, Eric; Liang, Wen-Tzong; Lin, Simon C.; Huang, Win-Gee; Lee, Shiann-Jong; Chen, Hsin-Yen

    Since the experience of the 2004 giant Sumatra earthquake, seismic and tsunami hazards have been considered important issues in the South China Sea and its surrounding region and have attracted many seismologists' interest. Currently, more than 25 broadband seismic instruments are operated by the Institute of Earth Sciences, Academia Sinica in northern Vietnam to study the geodynamic evolution of the Red River fracture zone; they have recently been redistributed to southern Vietnam to study the geodynamic evolution and deep structure of the South China Sea. Similar stations are planned for deployment in the Philippines in the near future. Under the plan, some high-quality stations may become permanent stations with added continuous GPS observations, with instruments maintained and operated by several cooperating institutes, for instance the Institute of Geophysics, Vietnamese Academy of Sciences and Technology in Vietnam and the Philippine Institute of Volcanology and Seismology in the Philippines. Finally, those stations are planned to be upgraded to real-time transmission stations for earthquake monitoring and tsunami warning. However, high-speed data transfer among different agencies is always a critical issue for successful network operation. By taking advantage of both the EGEE and EUAsiaGrid e-Infrastructures, the Academia Sinica Grid Computing Centre coordinates researchers from various Asian countries to construct a platform for high-performance data transfer and large parallel computation. Efforts from this data service and a newly built earthquake data centre for data management may greatly improve seismic network performance. Implementation of Grid infrastructure and e-science in this region may assist the development of earthquake research, monitoring, and natural hazard reduction. In the near future, we will continue to seek new cooperation from countries surrounding the South China Sea to install new seismic stations, build a complete seismic network of the South China Sea, and encourage studies in earthquake science and natural hazard reduction.

  5. Development of a component centered fault monitoring and diagnosis knowledge based system for space power system

    NASA Technical Reports Server (NTRS)

    Lee, S. C.; Lollar, Louis F.

    1988-01-01

    The overall approach currently being taken in the development of AMPERES (Autonomously Managed Power System Extendable Real-time Expert System), a knowledge-based expert system for fault monitoring and diagnosis of space power systems, is discussed. The system architecture, knowledge representation, and fault monitoring and diagnosis strategy are examined. A 'component-centered' approach developed in this project is described. Critical issues requiring further study are identified.

  6. Earthquake Monitoring at 9°50'N on the East Pacific Rise: Latest Results and Implications for Integrated Models

    NASA Astrophysics Data System (ADS)

    Doermann, L.; Waldhauser, F.; Tolstoy, M.

    2008-12-01

    Ocean bottom seismograph (OBS) data were recorded continuously between October 2003 and January 2007 at the Ridge 2000 Bull's Eye site at 9°50'N on the East Pacific Rise (EPR) using a 4 x 4 km array of up to 12 instruments with approximately annual turnaround. These data have provided exciting insights into fundamental processes at fast-spreading ridges, including volcanism and hydrothermal circulation. They are also providing critical linkages for understanding the geological, chemical and biological data at this site. Results from the first OBS deployment have shown that we are able to monitor microseismicity on a fine enough scale to image the fundamental structure of a hydrothermal circulation cell, and we have identified an on-axis down-flow zone and a hydrothermal cracking front overlying the axial magma chamber (Tolstoy et al., 2008). Our results show that hydrothermal circulation at the EPR is dominantly along-axis, with narrowly focused down-flow at small kinks in the axial summit trough (AST). There appear to be two distinct circulation cells within the 9°49'N-9°51'N area, and these correlate well with temperature, chemical and biological observations. The rate of seismic events recorded at the array was ~2 orders of magnitude higher than anticipated based on prior results from this area (>320,000 events recorded versus ~4,500 anticipated), and therefore the processing task is considerable. In addition to hand-picking phase arrival times from periods of particular interest, we are also working on improved automatic detection tools to speed up processing of data from the remaining years, and on the use of waveform cross-correlation to improve event locations. Preliminary results to date suggest that the basic structure imaged in the 2003-2004 earthquake data persists, with seismicity rates continuing to climb leading up to the January 2006 eruption. 
We will present the most recent earthquake locations and discuss how they fit into results from the 2003-2004 data, as well as the implications for integrated models at this site.
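The waveform cross-correlation step mentioned above can be illustrated with a minimal sketch: a differential arrival time between two similar waveforms is estimated as the lag that maximizes their normalized cross-correlation. The pulse shape, shift, and function name here are illustrative assumptions, not the authors' actual processing code.

```python
import numpy as np

def best_lag(a, b):
    """Lag (in samples) of waveform a relative to waveform b that maximizes
    their normalized cross-correlation; such differential times are the raw
    input to cross-correlation-based relocation."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    cc = np.correlate(a, b, mode="full")
    return int(np.argmax(cc)) - (len(b) - 1)

# Hypothetical data: the same synthetic pulse recorded 15 samples later.
t = np.linspace(0.0, 1.0, 200)
w = np.exp(-((t - 0.3) ** 2) / 0.002)   # Gaussian pulse near t = 0.3
w_shifted = np.roll(w, 15)              # identical pulse, delayed 15 samples

lag = best_lag(w_shifted, w)            # recovers the 15-sample delay
```

In practice the correlation peak would be interpolated to sub-sample precision before the delays are fed into a relocation scheme such as double-difference.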

  7. Real-time monitoring of earthquake-prone areas by RST analysis of satellite TIR radiances: results of continuous monitoring over the Italy and Turkey regions.

    NASA Astrophysics Data System (ADS)

    Tramutoli, V.; Filizzola, C.; Genzano, N.; Lisi, M.; Paciello, R.; Pergola, N.

    2012-04-01

    Meteorological satellites, offering global coverage, continuity of observations, and long-term time series (starting as much as 30 years ago), provide a unique possibility not only to learn from the past but also to guarantee continuous monitoring where other observation technologies are lacking because they are too expensive or (as in the case of earthquake precursor studies) considered useless by decision-makers. Space-time fluctuations of Earth's emitted thermal infrared (TIR) radiation have been observed from satellite months to weeks before earthquake occurrence. The general RST approach has been proposed (since 2001) in order to discriminate normal TIR signal fluctuations (i.e. those related to changing natural factors and/or observation conditions) from anomalous signal transients possibly associated with earthquake occurrence. Since then, several earthquakes that occurred in Europe, Africa, and America have been studied by analyzing decades of satellite observations, always using a validation/confutation approach in order to verify the presence/absence of anomalous space-time TIR transients in the presence/absence of significant seismic activity. In the framework of the PRE-EARTHQUAKES EU-FP7 Project (www.pre-earthquakes.org), starting from October 2010 (and still continuing), the RST approach has been applied to MSG/SEVIRI data to generate TIR anomaly maps over the Italian peninsula, continuously, for all the midnight slots. Since September 2011 the same monitoring activity (also still continuing) has been performed for the Turkey region. For the first time, such an analysis has been performed in real time, systematically analyzing TIR anomaly maps in order to identify, day by day, possibly significant (e.g. persistent in the space-time domain) thermal anomalies. 
During 2011, in only a very few cases (one in Italy in July and two in the Turkish region in September and November) did the day-by-day analysis highlight significant anomalies; in two cases these were communicated to the other PRE-EARTHQUAKES partners with a request for their attention. In this paper, the results of this analysis are presented; they seem to confirm results independently achieved by other authors applying a similar approach to EOS/MODIS data over the California region.
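The core of the RST discrimination step is a pixel-by-pixel comparison of the current TIR signal against its historical mean and variability for the same place and observation time. A minimal sketch, with illustrative array sizes, threshold, and function name (not the operational PRE-EARTHQUAKES code):

```python
import numpy as np

def rst_anomaly_index(current_tir, historical_stack):
    """Normalized TIR anomaly index in the spirit of the RST approach:
    deviation of the current signal from its multi-year mean, scaled by
    its historical variability, computed pixel by pixel."""
    mu = historical_stack.mean(axis=0)     # multi-year mean per pixel
    sigma = historical_stack.std(axis=0)   # multi-year std dev per pixel
    return (current_tir - mu) / sigma

# Hypothetical example: 10 years of midnight TIR images on a tiny 4x4 grid.
rng = np.random.default_rng(0)
history = rng.normal(290.0, 2.0, size=(10, 4, 4))   # brightness temps (K)
tonight = history.mean(axis=0) + 0.5                # mild regional warming
tonight[2, 2] += 8.0                                # one anomalously warm pixel

index = rst_anomaly_index(tonight, history)
significant = index > 2.0   # flag pixels more than 2 sigma above normal
```

A real implementation would also require spatial and temporal persistence of the flagged pixels before declaring an anomaly, as the abstract notes.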

  8. Exploring Earthquakes in Real-Time

    NASA Astrophysics Data System (ADS)

    Bravo, T. K.; Kafka, A. L.; Coleman, B.; Taber, J. J.

    2013-12-01

    Earthquakes capture the attention of students and inspire them to explore the Earth. Adding the ability to view and explore recordings of significant and newsworthy earthquakes in real time makes the subject even more compelling. To address this opportunity, the Incorporated Research Institutions for Seismology (IRIS), in collaboration with Moravian College, developed 'jAmaSeis', a cross-platform application that enables students to access real-time earthquake waveform data. Students can watch as the seismic waves are recorded on their computer, and can be among the first to analyze the data from an earthquake. jAmaSeis facilitates student-centered investigations of seismological concepts using either a low-cost educational seismograph or streamed data from other educational seismographs or from any seismic station that sends data to the IRIS Data Management System. After an earthquake, students can analyze the seismograms to determine characteristics of earthquakes such as time of occurrence, distance from the epicenter to the station, magnitude, and location. The software has been designed to provide graphical clues to guide students in the analysis and assist in their interpretations. Since jAmaSeis can simultaneously record up to three stations from anywhere on the planet, there are numerous opportunities for student-driven investigations. For example, students can explore differences in the seismograms recorded at different distances from an earthquake and compare waveforms from different azimuthal directions. Students can simultaneously monitor seismicity at a tectonic plate boundary and in the middle of the plate regardless of their school location. This can help students discover for themselves the ideas underlying seismic wave propagation, regional earthquake hazards, magnitude-frequency relationships, and the details of plate tectonics. The real-time nature of the data keeps the investigations dynamic and offers students countless opportunities to explore.
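One of the analyses described above, estimating the epicenter-to-station distance, is commonly taught using the S-minus-P arrival-time difference. A minimal sketch under assumed average crustal velocities (the function name and default values are illustrative, not part of jAmaSeis):

```python
def distance_from_sp_time(sp_seconds, vp=6.0, vs=3.5):
    """Epicentral distance (km) from the S-minus-P arrival-time difference,
    using the classroom relation d = dt * vp*vs / (vp - vs).
    The default P and S velocities (km/s) are illustrative crustal averages."""
    return sp_seconds * vp * vs / (vp - vs)

# With these velocities, a 30 s S-P delay puts the source 252 km away.
d = distance_from_sp_time(30.0)
```

Repeating this for three or more stations and drawing the distance circles on a map is exactly the triangulation exercise students perform to locate the epicenter.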

  9. Earthquake Facts

    MedlinePLUS

    ... most people lived in caves carved from soft rock. These dwellings collapsed during the earthquake, killing an ... Aristotle that soft ground shakes more than hard rock in an earthquake. The cause of earthquakes was ...

  10. Permeable Pavement Monitoring at the Edison Environmental Center Demonstration Site - Abstract

    EPA Science Inventory

    The EPA’s Urban Watershed Management Branch (UWMB) is monitoring an instrumented, working, 110-space pervious pavement parking lot at EPA’s Edison Environmental Center (EEC). Permeable pavement systems are classified as stormwater best management practices (BMPs) which reduce runo...

  12. Permeable Pavement Monitoring at the Edison Environmental Center Demonstration Site - presentation

    EPA Science Inventory

    The EPA’s Urban Watershed Management Branch has been monitoring an instrumented 110-space pervious pavement parking lot. The lot is used by EPA personnel and visitors to the Edison Environmental Center. The design includes 28-space rows of three permeable pavement types: asphal...

  13. Lessons learned from the introduction of autonomous monitoring to the EUVE science operations center

    NASA Technical Reports Server (NTRS)

    Lewis, M.; Girouard, F.; Kronberg, F.; Ringrose, P.; Abedini, A.; Biroscak, D.; Morgan, T.; Malina, R. F.

    1995-01-01

    The University of California at Berkeley's (UCB) Center for Extreme Ultraviolet Astrophysics (CEA), in conjunction with NASA's Ames Research Center (ARC), has implemented an autonomous monitoring system in the Extreme Ultraviolet Explorer (EUVE) science operations center (ESOC). The implementation was driven by a need to reduce operations costs and has allowed the ESOC to move from continuous, three-shift, human-tended monitoring of the science payload to a one-shift operation in which the off shifts are monitored by an autonomous anomaly detection system. This system includes Eworks, an artificial intelligence (AI) payload telemetry monitoring package based on RTworks, and Epage, an automatic paging system to notify ESOC personnel of detected anomalies. In this age of shrinking NASA budgets, the lessons learned on the EUVE project are useful to other NASA missions looking for ways to reduce their operations budgets. The process of knowledge capture, from the payload controllers for implementation in an expert system, is directly applicable to any mission considering a transition to autonomous monitoring in their control center. The collaboration with ARC demonstrates how a project with limited programming resources can expand the breadth of its goals without incurring the high cost of hiring additional, dedicated programmers. This dispersal of expertise across NASA centers allows future missions to easily access experts for collaborative efforts of their own. Even the criterion used to choose an expert system has widespread impacts on the implementation, including the completion time and the final cost. In this paper we discuss, from inception to completion, the areas where our experiences in moving from three shifts to one shift may offer insights for other NASA missions.

  14. Environmental assessment of the Carlsbad Environmental Monitoring and Research Center Facility

    SciTech Connect

    1995-10-01

    This Environmental Assessment has been prepared to determine if the Carlsbad Environmental Monitoring and Research Center (the Center), or its alternatives would have significant environmental impacts that must be analyzed in an Environmental Impact Statement. DOE's proposed action is to continue funding the Center. While DOE is not funding construction of the planned Center facility, operation of that facility is dependent upon continued funding. To implement the proposed action, the Center would initially construct a facility of approximately 2,300 square meters (25,000 square feet). The Phase 1 laboratory facilities and parking lot will occupy approximately 1.2 hectares (3 acres) of approximately 8.9 hectares (22 acres) of land which were donated to New Mexico State University (NMSU) for this purpose. The facility would contain laboratories to analyze chemical and radioactive materials typical of potential contaminants that could occur in the environment in the vicinity of the DOE Waste Isolation Pilot Plant (WIPP) site or other locations. The facility also would have bioassay facilities to measure radionuclide levels in the general population and in employees of the WIPP. Operation of the Center would meet the DOE requirement for independent monitoring and assessment of environmental impacts associated with the planned disposal of transuranic waste at the WIPP.

  15. Program Evaluation of Remote Heart Failure Monitoring: Healthcare Utilization Analysis in a Rural Regional Medical Center

    PubMed Central

    Keberlein, Pamela; Sorenson, Gigi; Mohler, Sailor; Tye, Blake; Ramirez, A. Susana; Carroll, Mark

    2015-01-01

    Abstract Background: Remote monitoring for heart failure (HF) has had mixed and heterogeneous effects across studies, necessitating further evaluation of remote monitoring systems within specific healthcare systems and their patient populations. “Care Beyond Walls and Wires,” a wireless remote monitoring program to facilitate patient and care team co-management of HF patients, served by a rural regional medical center, provided the opportunity to evaluate the effects of this program on healthcare utilization. Materials and Methods: Fifty HF patients admitted to Flagstaff Medical Center (Flagstaff, AZ) participated in the project. Many of these patients lived in underserved and rural communities, including Native American reservations. Enrolled patients received mobile, broadband-enabled remote monitoring devices. A matched cohort was identified for comparison. Results: HF patients enrolled in this program showed substantial and statistically significant reductions in healthcare utilization during the 6 months following enrollment, and these reductions were significantly greater compared with those who declined to participate but not when compared with a matched cohort. Conclusions: The findings from this project indicate that a remote HF monitoring program can be successfully implemented in a rural, underserved area. Reductions in healthcare utilization were observed among program participants, but reductions were also observed among a matched cohort, illustrating the need for rigorous assessment of the effects of HF remote monitoring programs in healthcare systems. PMID:25025239

  16. Response to the great East Japan earthquake of 2011 and the Fukushima nuclear crisis: the case of the Laboratory Animal Research Center at Fukushima Medical University.

    PubMed

    Katahira, Kiyoaki; Sekiguchi, Miho

    2013-01-01

    A magnitude 9.0 earthquake, the 2011 off the Pacific coast of Tohoku Earthquake, occurred on March 11, 2011, and the subsequent Fukushima Daiichi Nuclear Power Station (Fukushima NPS) accident raised radiation levels around the campus of Fukushima Medical University (FMU). FMU is located in Fukushima City, 57 km to the northwest of Fukushima NPS. Due to temporary failure of the steam boilers, the air conditioning system for the animal rooms, all autoclaves, and a cage washer could not be used at the Laboratory Animal Research Center (LARC) of FMU. The outside air temperature dropped to zero overnight, and the temperature inside the animal rooms fell to 10°C for several hours. We placed sterilized nesting materials inside all cages to encourage the rodents to build nests. The main water supply was cut off for 8 days in all, while the supply of steam and hot water remained unavailable for 12 days. It took 20 days to restore the air conditioning system to normal operation at the facility. We measured radiation levels in the animal rooms to confirm the safety of care staff and researchers. On April 21, May 9, and June 17, the average radiation levels at a central work table in the animal rooms with HEPA filters were 46.5, 44.4, and 43.4 cpm, respectively, which is equal to the background level of the equipment. We sincerely hope our experiences will be a useful reference regarding crisis management for the many institutes keeping laboratory animals. PMID:23615301

  17. Skewed orientation groups in scatter plots of earthquake fault plane solutions: Implications for extensional geometry at oceanic spreading centers

    NASA Astrophysics Data System (ADS)

    Lister, G. S.; Tkalčić, H.; McClusky, S.; Forster, M. A.

    2014-03-01

    Systematic analysis of earthquake focal solutions derived from centroid moment tensors shows well-defined orientation groups in scatter plots of fault plane normals and associated slip line vectors. Consideration of the geometry implied by these orientation groups can allow resolution of the ambiguity inherent in the choice as to which of the two conjugate fault plane solutions should apply, and in many cases the same classification can be applied to the entire orientation group. Scatter plots of data from normal-fault earthquakes on spreading ridges typically show orthogonal relations, but there are also many cases of a skew with respect to the great circles defined by faults on adjacent transform faults. This can be explained by finite rock strength in the adjacent transforms, which requires resolved shear stress to allow movement and thus requires rotation of the trajectories of the deviatoric stress axes: anticlockwise for right-lateral transforms and clockwise for left-lateral transforms. This asymmetry also requires the formation of tilt-block geometries reminiscent of Basin and Range style continental extension.
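The fault plane normals and slip line vectors plotted in such scatter plots can be derived from the strike, dip, and rake angles reported in moment tensor catalogs. A minimal sketch using the standard Aki & Richards conventions (north-east-down components; the function name is ours):

```python
import math

def fault_vectors(strike, dip, rake):
    """Unit fault-plane normal and slip vector (north, east, down components)
    from strike/dip/rake in degrees, per the Aki & Richards conventions."""
    phi, delta, lam = (math.radians(a) for a in (strike, dip, rake))
    n = (-math.sin(delta) * math.sin(phi),       # fault-plane normal
          math.sin(delta) * math.cos(phi),
         -math.cos(delta))
    s = (math.cos(lam) * math.cos(phi) + math.cos(delta) * math.sin(lam) * math.sin(phi),
         math.cos(lam) * math.sin(phi) - math.cos(delta) * math.sin(lam) * math.cos(phi),
         -math.sin(lam) * math.sin(delta))       # slip vector
    return n, s

# A pure normal fault striking north and dipping 45 degrees to the east:
n, s = fault_vectors(strike=0.0, dip=45.0, rake=-90.0)
dot = sum(a * b for a, b in zip(n, s))   # normal and slip are orthogonal
```

Plotting these unit vectors for every event in a catalog (e.g. on a stereonet) is what produces the orientation groups, and the skew, discussed in the abstract; the conjugate solution is obtained by exchanging the roles of n and s.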

  18. Source Process of the Mw 5.0 Au Sable Forks, New York, Earthquake Sequence from Local Aftershock Monitoring Network Data

    NASA Astrophysics Data System (ADS)

    Kim, W.; Seeber, L.; Armbruster, J. G.

    2002-12-01

    On April 20, 2002, a Mw 5 earthquake occurred near the town of Au Sable Forks, northeastern Adirondacks, New York. The quake caused moderate damage (MMI VII) around the epicentral area, and it was well recorded by over 50 broadband stations at distances of 70 to 2000 km in eastern North America. Regional broadband waveform data are used to determine the source mechanism and focal depth using a moment tensor inversion technique. The source mechanism indicates predominantly thrust faulting along a 45°-dipping fault plane striking due south. The mainshock was followed by at least three strong aftershocks with local magnitude (ML) greater than 3, and about 70 aftershocks were detected and located in the first three months by a 12-station portable seismographic network. The aftershock distribution clearly delineates the mainshock rupture along the westerly dipping fault plane at a depth of 11 to 12 km. Preliminary analysis of the aftershock waveform data indicates that the orientation of the P-axis rotated 90° from that of the mainshock, suggesting a complex source process for the earthquake sequence. We achieved an important milestone in monitoring earthquakes and evaluating their hazards through rapid cross-border (Canada-US) and cross-regional (Central US-Northeastern US) collaborative efforts: staff at Instrument Software Technology, Inc., near the epicentral area, joined Lamont-Doherty staff and deployed the first portable station in the epicentral area; CERI dispatched two of its technical staff to the epicentral area with four accelerometers and a broadband seismograph; the IRIS/PASSCAL facility shipped three digital seismographs and ancillary equipment within one day of the request; and the POLARIS Consortium, Canada, sent a field crew of three with a near-real-time, satellite-telemetry-based earthquake monitoring system. 
The POLARIS station, KSVO, powered by a solar panel and batteries, was already transmitting data to the central hub in London, Ontario, Canada within a day after the field crew arrived in the Au Sable Forks area. This collaboration allowed us to maximize the scarce resources available for monitoring this damaging earthquake and its aftershocks in the northeastern U.S.

  19. Advancing Research Methodology for Measuring & Monitoring Patient-centered Communication in Cancer Care

    Cancer.gov

    A critical step in facilitating the delivery of patient-centered communication (PCC) as part of routine cancer care delivery is creating a measurement and monitoring system that will allow for the ongoing assessment, tracking, and improvement of these six functions of patient-centered communication. To build the foundation of such a system and to advance research methodology in this area, the ORB has collaborated with the Agency for Healthcare Research and Quality (AHRQ) on a research project conducted within AHRQ's DEcIDE network.

  20. Products and Services Available from the Southern California Earthquake Data Center (SCEDC) and the Southern California Seismic Network (SCSN)

    NASA Astrophysics Data System (ADS)

    Chen, S. E.; Yu, E.; Bhaskaran, A.; Chowdhury, F. R.; Meisenhelter, S.; Hutton, K.; Given, D.; Hauksson, E.; Clayton, R. W.

    2011-12-01

    Currently, the SCEDC archives continuous and triggered data from nearly 8400 data channels from 425 SCSN recorded stations, processing and archiving an average of 6.4 TB of continuous waveforms and 12,000 earthquakes each year. The SCEDC provides public access to these earthquake parametric and waveform data through its website www.data.scec.org and through client applications such as STP and DHI. This poster will describe the most significant developments at the SCEDC during 2011.
    New website design:
    - The SCEDC has revamped its website. The changes make it easier for users to search the archive and discover updates and new content. These changes also improve our ability to manage and update the site.
    New data holdings:
    - Post-processing on the El Mayor Cucapah 7.2 sequence continues. To date, 11,847 events have been reviewed. Updates are available in the earthquake catalog immediately.
    - A double-difference catalog (Hauksson et al., 2011) spanning 1981 to 6/30/11 will be available for download at www.data.scec.org and available via STP.
    - A focal mechanism catalog determined by Yang et al. (2011) is available for distribution at www.data.scec.org.
    - Waveforms from Southern California NetQuake stations are now being stored in the SCEDC archive and are available via STP as event-associated waveforms. Amplitudes from these stations are also being stored in the archive and used by ShakeMap.
    - As part of a NASA/AIST project in collaboration with JPL and SIO, the SCEDC will receive real-time 1 sps streams of GPS displacement solutions from the California Real Time Network (http://sopac.ucsd.edu/projects/realtime; Genrich and Bock, 2006, J. Geophys. Res.). These channels will be archived at the SCEDC as miniSEED waveforms, which can then be distributed to the user community via applications such as STP.
    Improvements in the user tool STP:
    - STP sac output now includes picks from the SCSN.
    New archival methods:
    - The SCEDC is exploring the feasibility of archiving and distributing waveform data using cloud computing services such as Google Apps. A month of continuous data from the SCEDC archive will be stored in Google Apps, and a client will be developed to access it in a manner similar to STP. The data are stored in miniSEED format with gzip compression. Time gaps between time series were padded with null values, which substantially increases search efficiency by making the records uniform in length.
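The null-padding idea mentioned at the end of the abstract can be sketched as follows. This is a toy illustration of why uniform-length records make searching cheap (a client can locate any time window by arithmetic rather than scanning); the sentinel value, record sizes, and integer packing are assumptions, not the SCEDC's actual miniSEED handling:

```python
import gzip
import struct

SENTINEL = -(2 ** 31)   # assumed "null" fill value for missing samples

def pad_to_uniform(samples, expected_n, fill=SENTINEL):
    """Pad a gappy record with a sentinel value so every stored record
    holds exactly expected_n samples."""
    return samples + [fill] * (expected_n - len(samples))

def store(samples, expected_n):
    """Pack a padded record as little-endian int32 and gzip it,
    mirroring the gzip-compressed storage described in the abstract."""
    padded = pad_to_uniform(samples, expected_n)
    raw = struct.pack(f"<{len(padded)}i", *padded)
    return gzip.compress(raw)

# A record with 3 real samples stored in a uniform 5-sample slot:
record = store([10, 11, 12], expected_n=5)
restored = struct.unpack("<5i", gzip.decompress(record))
```

The sentinel samples compress to almost nothing under gzip, so the uniform layout costs little storage while keeping every record the same decoded length.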

  1. Environmental monitoring and research at the John F. Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Hall, C. R.; Hinkle, C. R.; Knott, W. M.; Summerfield, B. R.

    1992-01-01

    The Biomedical Operations and Research Office at the NASA John F. Kennedy Space Center has been supporting environmental monitoring and research since the mid-1970s. Program elements include monitoring of baseline conditions to document natural variability in the ecosystem, assessments of operations and construction of new facilities, and ecological research focusing on wildlife habitat associations. Information management is centered around development of a computerized geographic information system that incorporates remote sensing and digital image processing technologies along with traditional relational data base management capabilities. The proactive program is one in which the initiative is to anticipate potential environmental concerns before they occur and, by utilizing in-house expertise, develop impact minimization or mitigation strategies to reduce environmental risk.

  2. Environmental monitoring and research at the John F. Kennedy Space Center

    SciTech Connect

    Hall, C.R.; Hinkle, C.R.; Knott, W.M.; Summerfield, B.R.

    1992-08-01

    The Biomedical Operations and Research Office at the NASA John F. Kennedy Space Center has been supporting environmental monitoring and research since the mid-1970s. Program elements include monitoring of baseline conditions to document natural variability in the ecosystem, assessments of operations and construction of new facilities, and ecological research focusing on wildlife habitat associations. Information management is centered around development of a computerized geographic information system that incorporates remote sensing and digital image processing technologies along with traditional relational data base management capabilities. The proactive program is one in which the initiative is to anticipate potential environmental concerns before they occur and, by utilizing in-house expertise, develop impact minimization or mitigation strategies to reduce environmental risk.

  3. RAPID: Collaboration Results from Three NASA Centers in Commanding/Monitoring Lunar Assets

    NASA Technical Reports Server (NTRS)

    Torres, R. Jay; Allan, Mark; Hirsh, Robert; Wallick, Michael N.

    2009-01-01

    Three NASA centers are working together to address the challenge of operating robotic assets in support of human exploration of the Moon. This paper describes the combined work to date of the Ames Research Center (ARC), Jet Propulsion Laboratory (JPL) and Johnson Space Center (JSC) on a common support framework to control and monitor lunar robotic assets. We discuss how we have addressed specific challenges including time-delayed operations, and geographically distributed collaborative monitoring and control, to build an effective architecture for integrating a heterogeneous collection of robotic assets into a common work. We describe the design of the Robot Application Programming Interface Delegate (RAPID) architecture that effectively addresses the problem of interfacing a family of robots including the JSC Chariot, ARC K-10 and JPL ATHLETE rovers. We report on lessons learned from the June 2008 field test in which RAPID was used to monitor and control all of these assets. We conclude by discussing some future directions to extend the RAPID architecture to add further support for NASA's lunar exploration program.

  4. Communication infrastructure in a contact center for home care monitoring of chronic disease patients.

    PubMed Central

    Maglaveras, N.; Gogou, G.; Chouvarda, I.; Koutkias, V.; Lekka, I.; Giaglis, G.; Adamidis, D.; Karvounis, C.; Louridas, G.; Goulis, D.; Avramidis, A.; Balas, E. A.

    2002-01-01

    The Citizen Health System (CHS) is a European Commission (EC) funded project in the field of IST for Health. Its main goal is to develop a generic contact center which in its pilot stage can be used in the monitoring, treatment and management of chronically ill patients at home in Greece, Spain and Germany. Such contact centers, which can use any type of communication technology, and can provide timely and preventive prompting to the patients are envisaged in the future to evolve into well-being contact centers providing services to all citizens. In this paper, we present the structure of such a generic contact center and in particular the telecommunication infrastructure, the communication protocols and procedures, and finally the educational modules that are integrated into this contact center. We discuss the procedures followed for two target groups of patients where two randomized control clinical trials are under way, namely diabetic patients with obesity problems, and congestive heart failure patients. We present examples of the communication means between the contact center medical personnel and these patients, and elaborate on the educational issues involved. PMID:12463870

  5. The meteorological monitoring system for the Kennedy Space Center/Cape Canaveral Air Station

    NASA Technical Reports Server (NTRS)

    Dianic, Allan V.

    1994-01-01

    The Kennedy Space Center (KSC) and Cape Canaveral Air Station (CCAS) are involved in many weather-sensitive operations. Manned and unmanned vehicle launches, which occur several times each year, are obvious examples of operations whose success and safety are dependent upon favorable meteorological conditions. Other operations involving NASA, Air Force, and contractor personnel, including daily operations to maintain facilities, refurbish launch structures, prepare vehicles for launch, and handle hazardous materials, are less publicized but are no less weather-sensitive. The Meteorological Monitoring System (MMS) is a computer network which acquires, processes, disseminates, and monitors near-real-time and forecast meteorological information to assist operational personnel and weather forecasters with the task of minimizing the risk to personnel, materials, and the surrounding population. CLIPS has been integrated into the MMS to provide quality control analysis and data monitoring. This paper describes aspects of the MMS relevant to CLIPS, including requirements, actual implementation details, and results of performance testing.
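To give a flavor of the kind of quality-control rules such a monitoring system applies, here is a toy sketch in Python rather than CLIPS; the thresholds, rule set, and function name are illustrative assumptions, not the MMS rule base:

```python
def qc_flags(readings, lo, hi, max_step):
    """Toy rule-based quality control: flag a sample if it falls outside
    the physically plausible range [lo, hi], or if it jumps from the
    previous sample by more than max_step. (Illustrative only; the MMS
    expresses such rules in CLIPS, not Python.)"""
    flags = []
    prev = None
    for v in readings:
        out_of_range = not (lo <= v <= hi)
        big_jump = prev is not None and abs(v - prev) > max_step
        flags.append(out_of_range or big_jump)
        prev = v   # compare against the raw previous sample, flagged or not
    return flags

# Wind speeds (m/s): a negative value and a >20 m/s jump are both suspect.
flags = qc_flags([5.0, 6.0, -1.0, 46.0, 7.0], lo=0.0, hi=75.0, max_step=20.0)
```

A production system would distinguish rule types (range, rate-of-change, cross-sensor consistency) and attach severities, but the pattern of declarative checks over a data stream is the same.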

  6. Integration of user centered design in the development of health monitoring system for elderly.

    PubMed

    Jia, Guifeng; Zhou, Jie; Yang, Pan; Lin, Chengyu; Cao, Xia; Hu, Hua; Ning, Gangmin

    2013-01-01

    This paper presents a health monitoring system that incorporates the approach of user-centered design (UCD) to enhance system usability for the elderly. The system is designed for monitoring physiological signals related to cardiovascular diseases (CVD), including the electrocardiogram (ECG), pulse wave (PW), and body weight (BW). Ease of use and non-obtrusiveness are the two key design criteria. Our health monitoring system is designed on three levels: a personal medical device layer, a mobile application layer, and a remote central service layer. A chair-based apparatus was built for physiological signal acquisition, and a mobile application was developed for data delivery and health management. Finally, a usability evaluation was conducted and the system's usability was quantitatively analyzed with the System Usability Scale (SUS). The results demonstrate that the performance of the system is acceptable for the elderly and that the UCD principle is helpful for health system design. PMID:24110045
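The System Usability Scale score used in such evaluations is computed with a fixed, published formula over ten 1-5 Likert responses. A minimal sketch (the function and variable names are ours):

```python
def sus_score(responses):
    """System Usability Scale score from ten 1-5 Likert responses.
    Odd-numbered items contribute (response - 1), even-numbered items
    contribute (5 - response); the sum is scaled by 2.5 to give 0-100."""
    assert len(responses) == 10, "SUS requires exactly ten item responses"
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# All-neutral answers (all 3s) map to the midpoint score of 50.
score = sus_score([3] * 10)
```

Scores are then averaged across participants; a common rule of thumb treats results around 68 and above as acceptable usability.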

  7. Statistical analysis of the induced Basel 2006 earthquake sequence: introducing a probability-based monitoring approach for Enhanced Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Bachmann, C. E.; Wiemer, S.; Woessner, J.; Hainzl, S.

    2011-08-01

    Geothermal energy is becoming an important clean energy source, however, the stimulation of a reservoir for an Enhanced Geothermal System (EGS) is associated with seismic risk due to induced seismicity. Seismicity occurring due to the water injection at depth have to be well recorded and monitored. To mitigate the seismic risk of a damaging event, an appropriate alarm system needs to be in place for each individual experiment. In recent experiments, the so-called traffic-light alarm system, based on public response, local magnitude and peak ground velocity, was used. We aim to improve the pre-defined alarm system by introducing a probability-based approach; we retrospectively model the ongoing seismicity in real time with multiple statistical forecast models and then translate the forecast to seismic hazard in terms of probabilities of exceeding a ground motion intensity level. One class of models accounts for the water injection rate, the main parameter that can be controlled by the operators during an experiment. By translating the models into time-varying probabilities of exceeding various intensity levels, we provide tools which are well understood by the decision makers and can be used to determine thresholds non-exceedance during a reservoir stimulation; this, however, remains an entrepreneurial or political decision of the responsible project coordinators. We introduce forecast models based on the data set of an EGS experiment in the city of Basel. Between 2006 December 2 and 8, approximately 11 500 m3 of water was injected into a 5-km-deep well at high pressures. A six-sensor borehole array, was installed by the company Geothermal Explorers Limited (GEL) at depths between 300 and 2700 m around the well to monitor the induced seismicity. The network recorded approximately 11 200 events during the injection phase, more than 3500 of which were located. 
With the traffic-light system, actions were implemented after an ML 2.7 event: the water injection was reduced and then stopped after another ML 2.5 event. A few hours later, an earthquake with ML 3.4, felt within the city, occurred, which led to bleed-off of the well. A risk study was later issued with the outcome that the experiment could not be resumed. We analyse the statistical features of the sequence and show that the sequence is well modelled with the Omori-Utsu law following the termination of water injection. Based on this model, the sequence will last 31+29/-14 years to reach the background level. We introduce statistical models based on Reasenberg and Jones and Epidemic Type Aftershock Sequence (ETAS) models, commonly used to model aftershock sequences. We compare and test different model setups to simulate the sequences, varying the number of fixed and free parameters. For one class of the ETAS models, we account for the flow rate at the injection borehole. We test the models against the observed data with standard likelihood tests and find the ETAS model accounting for the flow rate to perform best. Such a model may in future serve as a valuable tool for designing probabilistic alarm systems for EGS experiments.
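The Omori-Utsu decay used above to estimate the duration of the sequence can be sketched numerically. The following is a minimal illustration with placeholder parameter values (not the values fitted to the Basel sequence) of how the modified Omori-Utsu rate yields the time needed to decay back to a background level:

```python
# Sketch: modified Omori-Utsu aftershock decay and the time needed for the
# rate to fall back to a background level. All parameter values below are
# illustrative placeholders, not the values fitted to the Basel sequence.

def omori_rate(t, K, c, p):
    """Aftershock rate lambda(t) = K / (c + t)**p, in events per day."""
    return K / (c + t) ** p

def time_to_background(K, c, p, mu):
    """Solve K / (c + t)**p = mu for t: the lapse time at which the
    decaying aftershock rate reaches the background rate mu."""
    return (K / mu) ** (1.0 / p) - c

# Hypothetical values: K = 200 events/day, c = 0.1 day, p = 1.1, mu = 0.01/day
t_bg = time_to_background(200.0, 0.1, 1.1, 0.01)
print(f"Rate reaches background after ~{t_bg / 365.25:.0f} years")
```

By construction, `omori_rate(time_to_background(K, c, p, mu), K, c, p)` returns exactly `mu`, which is a convenient self-check when fitting real sequences.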

  8. GPS Monitoring of Surface Change During and Following the Fortuitous Occurrence of the M(sub w) = 7.3 Landers Earthquake in our Network

    NASA Technical Reports Server (NTRS)

    Miller, M. Meghan

    1998-01-01

    Accomplishments: (1) Continues GPS monitoring of surface change during and following the fortuitous occurrence of the M(sub w) = 7.3 Landers earthquake in our network, in order to characterize earthquake dynamics and accelerated activity of related faults as far as hundreds of kilometers along strike. (2) Integrates the geodetic constraints into consistent kinematic descriptions of the deformation field that can in turn be used to characterize the processes that drive geodynamics, including seismic cycle dynamics. In 1991, we installed and occupied a high-precision GPS geodetic network to measure transform-related deformation that is partitioned from the Pacific - North America plate boundary northeastward through the Mojave Desert, via the Eastern California shear zone, to the Walker Lane. The onset of the M(sub w) = 7.3 June 28, 1992, Landers, California, earthquake sequence within this network poses unique opportunities for continued monitoring of regional surface deformation related to the culmination of a major seismic cycle, characterization of the dynamic behavior of continental lithosphere during the seismic sequence, and post-seismic transient deformation. During the last year, we have reprocessed all three previous epochs for which JPL fiducial-free point positioning products are available and are queued for the remaining needed products, completed two field campaigns monitoring approximately 20 sites (October 1995 and September 1996), begun modeling by development of a finite element mesh based on network station locations, and developed manuscripts dealing with both the Landers-related transient deformation at the latitude of Lone Pine and the velocity field of the whole experiment. We are currently deploying a 1997 observation campaign (June 1997). We use GPS geodetic studies to characterize deformation in the Mojave Desert region and related structural domains to the north, and geophysical modeling of lithospheric behavior. 
The modeling is constrained by our existing and continued GPS measurements, which will provide much needed data on far-field strain accumulation across the region and on the deformational response of continental lithosphere during and following a large earthquake, forming the basis for kinematic and dynamic modeling of secular and seismic-cycle deformation. GPS geodesy affords both regional coverage and high precision that uniquely bear on these problems.

  9. Earthquake prediction

    SciTech Connect

    Ma, Z.; Fu, Z.; Zhang, Y.; Wang, C.; Zhang, G.; Liu, D.

    1989-01-01

    Mainland China is situated at the eastern edge of the Eurasian seismic system and is the largest intra-continental region of shallow strong earthquakes in the world. Based on nine earthquakes with magnitudes ranging between 7.0 and 7.9, the book provides observational data and discusses successes and failures of earthquake prediction. Observations of various phenomena and seismic activities occurring before and after individual earthquakes led to the establishment of some general characteristics valid for earthquake prediction.

  10. Estimated airborne release of plutonium from the 102 Building at the General Electric Vallecitos Nuclear Center, Vallecitos, California, as a result of damage from severe wind and earthquake hazard

    SciTech Connect

    Mishima, J.; Ayer, J.E.; Hays, I.D.

    1980-12-01

    This report estimates the potential airborne releases of plutonium as a consequence of various severities of earthquake and wind hazard postulated for the 102 Building at the General Electric Vallecitos Nuclear Center in California. The releases are based on damage scenarios developed by other specialists. The hazard severities presented range up to a nominal velocity of 230 mph for wind hazard and are in excess of 0.8 g linear acceleration for earthquakes. The consequences of thrust faulting are considered. The approaches and factors used to estimate the releases are discussed. Release estimates range from 0.003 to 3 g Pu.

  11. Near real-time model to monitor SST anomalies related to undersea earthquakes and SW monsoon phenomena from TRMM-AQUA satellite data

    NASA Astrophysics Data System (ADS)

    Chakravarty, Subhas

    A near real-time interactive computer model has been developed to extract daily mean global Sea Surface Temperature (SST) values of 1440x720 pixels, each covering a 0.25° x 0.25° lat-long area, and SST anomalies from longer-period means pertaining to any required oceanic grid size of interest. The core MATLAB code uses the daily binary files (3-day aggregate values) of global SST data (derived from TRMM/TMI-AQUA/AMSRE satellite sensors) available on a near real-time basis through the REMSS/NASA website and converts these SSTs into global/regional maps and displays as well as digitised text data tables for further analysis. As demonstrated applications of the model, the SST data for the period 2003-2009 have been utilised to study (a) SST anomalies before, during and after the occurrence of two great under-sea earthquakes of 26 December 2004 and 28 March 2005 near the western coast of Sumatra and (b) variation of pixel numbers with SSTs between 27-31° C within (i) the Nino 4 region and (ii) a broader western Pacific region (say Nino-BP) affected by ENSO events before (January-May) and during (June-October) monsoon onset/progress. Preliminary results of these studies have been published (Chakravarty, The Open Oceanography Journal, 2009 and Chakravarty, IEEE Xplore, 2009). The results of the SST-earthquake analysis indicate a small but consistent warming of 0.2-0.3° C in the 2° x 2° grid area near the earthquake epicentre, starting from a week earlier to a week later, for the event of 26 December 2004. The changes observed in SST for the second earthquake are also indicated, but with less clarity owing to the mixing of land and ocean surfaces and hence the smaller number of SST pixels available within the 2° x 2° grid area near the corresponding epicentre. Similar analysis for the same period of non-earthquake years did not show any such SST anomalies. 
These results have far-reaching implications for using SST as a possible parameter to be monitored for signalling the occurrence of impending under-sea earthquakes, which sometimes lead to tsunamis. The results of the analysis for the ENSO-monsoon rainfall relation show that a time series of SST distribution within the Nino 4 or Nino-BP regions with a larger number of pixels with SSTs between 27-28° C is generally a favourable condition for normal rainfall. While both Nino 4 and Nino-BP provide similar results, the Nino-BP region is found to be more sensitive for such assessment of monitoring the trend of SW monsoon rainfall over India. This result has the potential to be used in the prognosis of the overall rainfall pattern of the monsoon season at weekly intervals, which may serve as vital information for Indian agricultural production. While simple geophysical models are able to explain the above correlations, more detailed modelling of the plate tectonics and heat fluxes (for undersea earthquakes) and ocean-cloud interaction/dynamics (for ENSO and monsoon rainfall patterns) would need to be undertaken.
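The anomaly extraction described above (box means on a 0.25° global grid compared against a longer-period mean) can be sketched as follows. The grid orientation, indexing convention, and synthetic data are assumptions for illustration; the original model is MATLAB rather than Python:

```python
import numpy as np

# Sketch: SST anomaly for a small lat-lon box from a stack of daily
# 0.25-degree global grids (1440 x 720 pixels), as described above.
# The grid orientation (first row at -90 lat, first column at 0 lon)
# and the synthetic data are assumptions for illustration.

def box_indices(lat_min, lat_max, lon_min, lon_max):
    """Map a lat-lon box to row/column slices of the 0.25-degree grid."""
    return (slice(int((lat_min + 90) * 4), int((lat_max + 90) * 4)),
            slice(int(lon_min * 4), int(lon_max * 4)))

def box_anomaly(daily_stack, day, rows, cols):
    """One day's box-mean SST minus the stack's longer-period box mean."""
    clim = daily_stack[:, rows, cols].mean()
    return daily_stack[day, rows, cols].mean() - clim

# Synthetic demo: 10 days of uniform 28 C water, +0.3 C warm patch on day 8
stack = np.full((10, 720, 1440), 28.0, dtype=np.float32)
rows, cols = box_indices(2.0, 4.0, 95.0, 97.0)  # 2 x 2 degree box off Sumatra
stack[8, rows, cols] += 0.3
print(round(float(box_anomaly(stack, 8, rows, cols)), 2))
```

For the synthetic patch the box anomaly is +0.3 °C minus that day's own contribution to the 10-day mean, i.e. 0.27 °C; with real data the reference mean would come from the archived daily files rather than the event window itself.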

  12. Wilson Corners SWMU 001 2014 Annual Long Term Monitoring Report Kennedy Space Center, Florida

    NASA Technical Reports Server (NTRS)

    Langenbach, James

    2015-01-01

    This document presents the findings of the 2014 Long Term Monitoring (LTM) that was completed at the Wilson Corners site, located at the National Aeronautics and Space Administration (NASA) John F. Kennedy Space Center (KSC), Florida. The goals of the 2014 annual LTM event were to evaluate the groundwater flow direction and gradient and to monitor the vertical and downgradient horizontal extent of the volatile organic compounds (VOCs) in groundwater at the site. The LTM activities consisted of an annual groundwater sampling event in December 2014, which included the collection of water levels from the LTM wells. During the annual groundwater sampling event, depth to groundwater was measured and VOC samples were collected using passive diffusion bags (PDBs) from 30 monitoring wells. In addition to the LTM sampling, additional assessment sampling was performed at the site using low-flow techniques based on previous LTM results and assessment activities. Assessment of monitoring well MW0052DD was performed by collecting VOC samples using low-flow techniques before and after purging 100 gallons from the well. Monitoring well MW0064 was sampled to supplement shallow VOC data north of Hot Spot 2 and east of Hot Spot 4. Monitoring well MW0089 was sampled due to its proximity to MW0090. MW0090 is screened in a deeper interval and had an unexpected detection of trichloroethene (TCE) during the 2013 LTM, which was corroborated during the March 2014 verification sampling. Monitoring well MW0130 was sampled to provide additional VOC data beneath the semi-confining clay layer in the Hot Spot 2 area.

  13. The Observing System Monitoring Center: an Emerging Source for Integrated In-Situ Ocean Data

    NASA Astrophysics Data System (ADS)

    Hankin, S.; Habermann, T.; Kern, K.; Little, M.; Mendelssohn, R.; Neufeld, D.; O'Brien, K.; Simons, B.

    2011-12-01

    The Observing System Monitoring Center (OSMC) was originally conceived as a tool to assist managers in monitoring the performance of the integrated global in-situ ocean observing system. For much of the past decade, the OSMC has been storing real-time data and metadata from ocean observation sources such as the Global Telecommunications System (GTS), the IOC sea level monitoring center, and others. The goal of the OSMC has been to maintain a record of all of the observations that represent the global climate data record. Though the initial purpose of the OSMC was mainly to track platform and observing subsystem performance, it has become clear that the data represented in the OSMC would be a valuable source for anyone interested in ocean processes. This presentation will discuss the implementation details involved in making the OSMC data available to the general public. We'll also discuss how we leveraged the NOAA-led Unified Access Framework (UAF), which defines a framework built upon community-accepted standards and conventions, to assist in the creation of the data services. By adhering to these well-known and widely used standards and conventions, we ensure that the OSMC data will be available to users through many popular tools, including both web-based services and desktop clients. Additionally, we will discuss the modernized OSMC suite of user interfaces, which is intended to provide access to both ocean data and platform metrics for people ranging from ocean novices to scientific experts.

  14. Appraising the Early-est earthquake monitoring system for tsunami alerting at the Italian candidate Tsunami Service Provider

    NASA Astrophysics Data System (ADS)

    Bernardi, F.; Lomax, A.; Michelini, A.; Lauciani, V.; Piatanesi, A.; Lorito, S.

    2015-04-01

    In this paper we present the procedure for earthquake location and characterization implemented in the Italian candidate Tsunami Service Provider at INGV in Rome. Following the ICG/NEAMTWS guidelines, the first tsunami warning messages are based only on seismic information, i.e. epicenter location, hypocenter depth and magnitude, which are automatically computed by the software Early-est. Early-est is a package for rapid location and seismic/tsunamigenic characterization of earthquakes. The Early-est software package operates on offline-event or continuous-realtime seismic waveform data to perform trace processing and picking, and, at a regular report interval, phase association, event detection, hypocenter location, and event characterization. In this paper we present the earthquake parameters computed by Early-est from the beginning of 2012 until the end of December 2014 at global scale for events with magnitude M ≥ 5.5, and the detection timeline. The earthquake parameters computed automatically by Early-est are compared with reference manually revised/verified catalogs. From our analysis, the epicenter location and hypocenter depth parameters do not differ significantly from the values in the reference catalogs. The epicenter coordinates generally differ by less than 20 ± 20 km from the reference epicenter coordinates; focal depths are less well constrained and generally differ by less than 0 ± 30 km. Early-est also provides mb, Mwp and Mwpd magnitude estimations. mb magnitudes are preferred for events with Mwp ≲ 5.8, while Mwpd estimations are valid for events with Mwp ≳ 7.2. The mb magnitudes show wide differences with respect to the reference catalogs; we thus apply a linear correction mbcorr = mb · 0.52 + 2.46, which results in an uncertainty of Δmb ≈ 0.0 ± 0.2 with respect to the reference catalogs. As expected, the Mwp values show distance dependency. Mwp values at stations with epicentral distance Δ ≲ 30° are significantly overestimated with respect to the CMT-global solutions, whereas Mwp values at stations with epicentral distance Δ ≳ 90° are slightly underestimated. We thus apply a 3rd-degree polynomial distance correction. After applying the distance correction, the Mwp provided by Early-est differs from CMT-global catalog values by about ΔMwp ≈ 0.0 ± 0.2. Early-est continuously acquires time series data and updates the earthquake source parameters. Our analysis shows that the epicenter coordinates and the magnitude values converge rather quickly toward the final values. Generally we can provide robust and reliable earthquake source parameters to compile a tsunami warning message within less than about 15 min after event origin time.
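The linear mb correction quoted above can be applied directly; a minimal sketch, with arbitrary example magnitudes:

```python
# Sketch: the linear mb correction reported above, mb_corr = mb * 0.52 + 2.46,
# applied to a few arbitrary automatic mb values.

def correct_mb(mb):
    """Align an automatic Early-est mb with the reference-catalog scale."""
    return mb * 0.52 + 2.46

for mb in (5.5, 6.0, 6.5):
    print(f"automatic mb {mb:.1f} -> corrected {correct_mb(mb):.2f}")
```

Per the analysis above, such a correction still leaves residuals of roughly ±0.2 magnitude units with respect to the reference catalogs.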

  15. An Evaluation of North Korea’s Nuclear Test by Belbasi Nuclear Tests Monitoring Center-KOERI

    NASA Astrophysics Data System (ADS)

    Necmioglu, O.; Meral Ozel, N.; Semin, K.

    2009-12-01

    Bogazici University and Kandilli Observatory and Earthquake Research Institute (KOERI) has been acting as the Turkish National Data Center (NDC) and has been responsible for the operation of the International Monitoring System (IMS) Primary Seismic Station (PS-43) under the Belbasi Nuclear Tests Monitoring Center, for the verification of compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), since February 2000. The NDC is responsible for operating two arrays which are part of the IMS, as well as for transmitting data from these stations to the International Data Centre (IDC) in Vienna. The Belbasi array was established in 1951 as a four-element (Benioff 1051) seismic array, part of the United States Atomic Energy Detection System (USAEDS). The Turkish General Staff (TGS) and the U.S. Air Force Technical Applications Center (AFTAC) jointly operated this short-period array under the Defense and Economic Cooperation Agreement (DECA). The station was upgraded and several seismometers were added to the array between 1951 and 1994, and the station code was later changed from BSRS (Belbasi Seismic Research Station) to BRTR-PS43. PS-43 is composed of two sub-arrays (Ankara and Keskin): a medium-period array with a ~40 km radius located in Ankara and a short-period array with a ~3 km radius located in Keskin. Each array has a broadband element located at the middle of its circular geometry. Short-period instruments are installed at a depth of 30 meters, while medium-period and broadband instruments are installed at a depth of 60 meters. On 25 May 2009, the Democratic People's Republic of Korea (DPRK) claimed that it had conducted a nuclear test. 
The corresponding seismic event was recorded by the IMS, and the IDC released a first automatic estimate of the time (00:54:43 GMT), location (41.2896°N, 129.0480°E) and magnitude (4.52 mb) of the event in less than two hours (USGS: 00:54:43 GMT; 41.306°N, 129.029°E; 4.7 mb). During our preliminary analysis of the 25 May 2009 DPRK event, we saw a very clear P arrival at 01:05:47 (GMT) at the BRTR SP array. The result of the f-k analysis performed in the Geotool software, installed at the NDC facilities in 2008 and currently in full use, also indicated that the arrival belongs to the DPRK event. When comparing our f-k results (calculated at 1-2 Hz) with the IDC-REB, however, we noticed that our calculation, and therefore the corresponding residuals (calculated with reference to REB residuals), compared favorably with the REB. The reasons for this discrepancy have been explored, and for the first time a comprehensive seismological analysis of a nuclear test has been conducted in Turkey. The CTBT has an important role in the implementation of the non-proliferation of nuclear weapons and is a key element in the pursuit of nuclear disarmament. In this study, we present the technical and scientific aspects of the 25 May 2009 DPRK event analysis, together with our involvement in CTBT(O) affairs, which we believe brings new dimensions to Turkey, especially in the area of geophysics.

  16. Lecture Demonstrations on Earthquakes for K-12 Teachers and Students

    NASA Astrophysics Data System (ADS)

    Dry, M. D.; Patterson, G. L.

    2005-12-01

    Since 1975, the Center for Earthquake Research and Information (CERI) at The University of Memphis has strived to satisfy its information transfer directives through diverse education and outreach efforts, providing technical and non-technical earthquake information to the general public, K-16 teachers and students, professional organizations, and state and federal organizations via all forms of written and electronic communication. Through these education and outreach efforts, CERI tries to increase earthquake hazard awareness to help limit future losses. In the past three years, education programs have reached over 20,000 K-16 students and teachers through in-service training workshops for teachers and earthquake/earth science lecture demonstrations for students. The presentations include an hour-long lecture demonstration featuring graphics and an informal question-and-answer format. Graphics used include seismic hazard maps, damage photos, plate tectonic maps, layers of the Earth, and more, all adapted for the audience. Throughout this presentation, manipulatives such as a Slinky, Silly Putty, a foam Earth with depth and temperature features, and Popsicle sticks are used to demonstrate seismic waves, the elasticity of the Earth, the Earth's layers and their features, and the brittleness of the crust. Toward the end, a demonstration featuring a portable shake table with a dollhouse mounted on it is used to illustrate earthquake-shaking effects. This presentation is also taken to schools when they are unable to visit CERI. Following this presentation, groups are then taken to the Public Earthquake Resource Center at CERI, a space featuring nine displays, seven of which are interactive. 
The interactive displays include a shake table and building blocks, a trench with paleoliquefaction features, computers with web access to seismology sites, a liquefaction model, an oscilloscope and attached geophone, a touch-screen monitor, and various manipulatives. CERI is also developing suitcase kits and activities for teachers to borrow and use in their classrooms. The suitcase kits include activities based on state learning standards, such as layers of the Earth and plate tectonics. Items included in the suitcase modules include a shake table and dollhouse, an oscilloscope and geophone, a resonance model, a Slinky, Silly Putty, Popsicle sticks, and other items. Almost all of the activities feature a lecture demonstration component. These projects would not be possible without leveraged funding from the Mid-America Earthquake Center (MAEC) and the Center for Earthquake Research and Information, with additional funding from the National Earthquake Hazards Reduction Program (NEHRP).

  17. A real-time navigation monitoring expert system for the Space Shuttle Mission Control Center

    NASA Technical Reports Server (NTRS)

    Wang, Lui; Fletcher, Malise

    1993-01-01

    The ONAV (Onboard Navigation) Expert System has been developed as a real-time console assistant for use by ONAV flight controllers in the Mission Control Center at the Johnson Space Center. This expert knowledge-based system is used to monitor the Space Shuttle onboard navigation system, detect faults, and advise flight operations personnel. This application is the first knowledge-based system to use both telemetry and trajectory data from the Mission Operations Computer (MOC). To arrive at this stage, from a prototype to a real-world application, the ONAV project has had to deal not only with AI issues but also with operating-environment issues. The AI issues included the maturity of AI languages and debugging tools, verification, and the availability, stability, and size of the expert pool. The environmental issues included real-time data acquisition, hardware suitability, and how to achieve acceptance by users and management.

  18. Tracing back ionospheric perturbations detected by GPS monitoring in the epicenter area after large earthquakes to their source mechanisms.

    NASA Astrophysics Data System (ADS)

    Artru, J.; Ji, C.; Ducic, V.; Kanamori, H.; Lognonné, P.; Murakami, M.

    2004-12-01

    Dense Global Positioning System (GPS) continuous networks allow detection and imaging of small-scale perturbations of the ionosphere through measurement of the Total Electron Content (TEC) along satellite-receiver rays. We present observations of ionospheric perturbations observed above the epicenter area during two large earthquakes: the M=8.2 Tokachi-Oki earthquake near Hokkaido (09/22/2003) and the M=6.5 San Simeon earthquake in Central California. In both cases, significant variations of TEC were observed, propagating up to several hundred kilometers from the epicenter and starting approximately 15 minutes after the event. Very similar in nature but with different amplitudes, these two observations can be interpreted as the signature in the ionosphere of an acoustic wave induced in the atmosphere by the ground displacement. Indeed, we compared these signals with results from finite-source inversions performed for the two events. Most similarities and differences between the two cases can be traced back to the seismic sources.

  19. Federal Radiological Monitoring and Assessment Center (FRMAC) overview of FRMAC operations

    SciTech Connect

    1996-02-01

    In the event of a major radiological emergency, 17 federal agencies with various statutory responsibilities have agreed to coordinate their efforts at the emergency scene under the umbrella of the Federal Radiological Emergency Response Plan (FRERP). This cooperative effort will assure the designated Lead Federal Agency (LFA) and the state(s) that all federal radiological assistance fully supports their efforts to protect the public. The mandated federal cooperation ensures that each agency can obtain the data critical to its specific responsibilities. This overview of Federal Radiological Monitoring and Assessment Center (FRMAC) operations describes the FRMAC response activities in a major radiological emergency. It also describes the federal assets and subsequent operational activities that provide federal radiological monitoring and assessment of the off-site areas. These off-site areas may include one or more affected states.

  20. Classification of Global Urban Centers Using ASTER Data: Preliminary Results From the Urban Environmental Monitoring Program

    NASA Astrophysics Data System (ADS)

    Stefanov, W. L.; Stefanov, W. L.; Christensen, P. R.

    2001-05-01

    Land cover and land use changes associated with urbanization are important drivers of global ecologic and climatic change. Quantification and monitoring of these changes are part of the primary mission of the ASTER instrument, and comprise the fundamental research objective of the Urban Environmental Monitoring (UEM) Program. The UEM program will acquire day/night, visible through thermal infrared ASTER data twice per year for 100 global urban centers over the duration of the mission (6 years). Data are currently available for a number of these urban centers and allow for initial comparison of global city structure using spatial variance texture analysis of the 15 m/pixel visible to near infrared ASTER bands. Variance texture analysis highlights changes in pixel edge density as recorded by sharp transitions from bright to dark pixels. In human-dominated landscapes these brightness variations correlate well with urbanized vs. natural land cover and are useful for characterizing the geographic extent and internal structure of cities. Variance texture analysis was performed on twelve urban centers (Albuquerque, Baghdad, Baltimore, Chongqing, Istanbul, Johannesburg, Lisbon, Madrid, Phoenix, Puebla, Riyadh, Vancouver) for which cloud-free daytime ASTER data are available. Image transects through each urban center produce texture profiles that correspond to urban density. These profiles can be used to classify cities into centralized (ex. Baltimore), decentralized (ex. Phoenix), or intermediate (ex. Madrid) structural types. Image texture is one of the primary data inputs (with vegetation indices and visible to thermal infrared image spectra) to a knowledge-based land cover classifier currently under development for application to ASTER UEM data as it is acquired. Collaboration with local investigators is sought to both verify the accuracy of the knowledge-based system and to develop more sophisticated classification models.
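The spatial variance texture operation described above can be sketched as a moving-window variance; the window size and the synthetic scene below are illustrative assumptions, not the UEM processing chain:

```python
import numpy as np

# Sketch: local variance texture of an image band, which highlights the high
# edge density of urbanized land cover relative to smoother natural cover.
# The 3x3 window and the synthetic scene are assumptions for illustration.

def variance_texture(band, half=1):
    """Variance in a (2*half+1)^2 moving window; edge pixels left at zero."""
    out = np.zeros_like(band, dtype=float)
    for i in range(half, band.shape[0] - half):
        for j in range(half, band.shape[1] - half):
            out[i, j] = band[i - half:i + half + 1, j - half:j + half + 1].var()
    return out

# Synthetic scene: smooth "natural" left half, checkerboard "urban" right half
scene = np.zeros((20, 20))
scene[:, 10:] = np.indices((20, 10)).sum(axis=0) % 2  # high edge density
tex = variance_texture(scene)
print(tex[:, 2:8].mean() < tex[:, 12:18].mean())  # urban side has higher texture
```

A transect of `tex` across such a scene gives the kind of texture profile the abstract describes for classifying centralized versus decentralized city structure.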

  1. Appraising the Early-est earthquake monitoring system for tsunami alerting at the Italian Candidate Tsunami Service Provider

    NASA Astrophysics Data System (ADS)

    Bernardi, F.; Lomax, A.; Michelini, A.; Lauciani, V.; Piatanesi, A.; Lorito, S.

    2015-09-01

    In this paper we present and discuss the performance of the procedure for earthquake location and characterization implemented in the Italian Candidate Tsunami Service Provider at the Istituto Nazionale di Geofisica e Vulcanologia (INGV) in Rome. Following the ICG/NEAMTWS guidelines, the first tsunami warning messages are based only on seismic information, i.e., epicenter location, hypocenter depth, and magnitude, which are automatically computed by the software Early-est. Early-est is a package for rapid location and seismic/tsunamigenic characterization of earthquakes. The Early-est software package operates using offline-event or continuous-real-time seismic waveform data to perform trace processing and picking, and, at a regular report interval, phase association, event detection, hypocenter location, and event characterization. Early-est also provides mb, Mwp, and Mwpd magnitude estimations. mb magnitudes are preferred for events with Mwp ≲ 5.8, while Mwpd estimations are valid for events with Mwp ≳ 7.2. In this paper we present the earthquake parameters computed by Early-est between the beginning of March 2012 and the end of December 2014 on a global scale for events with magnitude M ≥ 5.5, and we also present the detection timeline. We compare the earthquake parameters automatically computed by Early-est with the same parameters listed in reference catalogs. Such reference catalogs are manually revised/verified by scientists. The goal of this work is to test the accuracy and reliability of the fully automatic locations provided by Early-est. In our analysis, the epicenter location, hypocenter depth and magnitude parameters do not differ significantly from the values in the reference catalogs. Both mb and Mwp magnitudes show differences with respect to the reference catalogs. We thus derived correction functions in order to minimize the differences and correct biases between our values and the ones from the reference catalogs. 
Correction of the Mwp distance dependency is particularly relevant, since this magnitude refers to the larger and probably tsunamigenic earthquakes. Mwp values at stations with epicentral distance Δ ≲ 30° are significantly overestimated with respect to the CMT-global solutions, whereas Mwp values at stations with epicentral distance Δ ≳ 90° are slightly underestimated. After applying such a distance correction, the Mwp provided by Early-est differs from CMT-global catalog values by about ΔMwp ≈ 0.0 ± 0.2. Early-est continuously acquires time-series data and updates the earthquake source parameters. Our analysis shows that the epicenter coordinates and the magnitude values converge within less than 10 min (5 min in the Mediterranean region) toward the stable values. Our analysis shows that we can compute Mwp magnitudes that do not display the short-epicentral-distance overestimation, and we can provide robust and reliable earthquake source parameters to compile tsunami warning messages within less than 15 min after the event origin time.

  2. Seismotectonics of the May 19, 2011 Simav- Kutahya Earthquake Activity

    NASA Astrophysics Data System (ADS)

    Komec Mutlu, Ahu

    2014-05-01

    The aftershock sequence of the May 19, 2011 Simav earthquake (Mw = 5.8) is relocated with a new 1-D seismic velocity model, and focal mechanisms of the largest aftershocks are determined. The May 19, 2011 Simav-Kutahya earthquake occurred in the most seismically active region of western Turkey. During the six months after the mainshock, more than 5000 earthquakes were recorded, and aftershocks followed over a period of almost two years. In this study, more than 7600 aftershocks that occurred between 2011 and 2012 with magnitudes greater than 1.8 are relocated. Waveform data were collected by 13 three-component seismic stations from three different networks (Kandilli Observatory and Earthquake Research Institute (NEMC-National Earthquake Monitoring Center); the Prime Ministry Disaster and Emergency Management Presidency, Department of Earthquake; and Canakkale Onsekiz Mart University Geophysics Department). These seismic stations are deployed within 80 km epicentral distance in the Simav-Kutahya region. The average crustal velocity and average crustal thickness for the region are computed as 5.68 km/s and 37.6 km, respectively. The source mechanisms of fifty aftershocks with magnitudes greater than 4.0 are derived from first-motion P phases. Analysis of the focal mechanisms indicates mainly normal fault motions with oblique slip.
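As a worked example of what such average crustal parameters imply for phase identification at these epicentral distances, the sketch below computes the Pg/Pn crossover distance for a single-layer crust using the reported values (5.68 km/s, 37.6 km). The upper-mantle velocity of 8.0 km/s is an assumed value, not a result of the study:

```python
import math

# Sketch: Pg/Pn crossover distance for a single-layer crust over a mantle
# half-space, using the reported average crustal velocity (5.68 km/s) and
# thickness (37.6 km). The mantle velocity of 8.0 km/s is an assumption.

def crossover_distance(h, v_crust, v_mantle):
    """Epicentral distance beyond which the head wave (Pn) arrives before
    the direct crustal phase (Pg): x = 2h * sqrt((v2 + v1) / (v2 - v1))."""
    return 2.0 * h * math.sqrt((v_mantle + v_crust) / (v_mantle - v_crust))

x = crossover_distance(37.6, 5.68, 8.0)
print(f"Pn overtakes Pg beyond ~{x:.0f} km")
```

With these values the crossover falls well beyond the 80 km aperture of the network, consistent with locating the aftershocks using direct crustal phases.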

  3. Real-time prediction of earthquake ground motion using real-time monitoring, and improvement strategy of JMA EEW based on the lessons from M9 Tohoku Earthquake (Invited)

    NASA Astrophysics Data System (ADS)

    Hoshiba, M.

    2013-12-01

    In this presentation, a new approach to real-time prediction of seismic ground motion for Earthquake Early Warning (EEW) is explained, in which real-time monitoring is used but hypocentral location and magnitude are not required. The improvement strategy of the Japan Meteorological Agency (JMA) is also explained, based on the lessons learned from the 2011 Tohoku Earthquake (Mw 9.0). During the Tohoku Earthquake, the JMA EEW system issued warnings before the S-wave arrival and more than 15 s before the strong ground motion in the Tohoku district, so it worked as rapidly as designed. However, it under-predicted the seismic intensity in the Kanto district because of the very large extent of the fault rupture, and it issued some false alarms due to multiple simultaneous aftershocks. To address these problems, a new method of time-evolutional prediction is proposed that uses real-time monitoring of seismic wave propagation. This method makes it possible to predict ground motion without a hypocenter and magnitude. Effects of rupture directivity, source extent, and simultaneous multiple events are inherently included in this method. In the time-evolutional prediction, the future wavefield is predicted from the wavefield at a certain time, that is, u(x, t+Δt) = P(u(x, t)), where u is the wave motion at location x at lapse time t, and P is the prediction operator. Determining the detailed distribution of the current wavefield is a key step, so a dense seismic observation network is required. Here, the current wavefield, u(x, t), observed by real-time monitoring is used as the initial condition, and wave propagation is then predicted with the time-evolutional approach. The method is based on the following three techniques. To enhance the estimation of the current wavefield, data assimilation is applied. Data assimilation is a technique that produces an artificially denser network and is widely used in numerical weather forecasting and oceanography. 
Propagation is then predicted by applying P to the distribution of current wave motion, u(x, t), estimated with the data assimilation technique. For P, a finite-difference technique or a boundary integral equation method, such as the Kirchhoff integral, is used; the Kirchhoff integral is qualitatively approximated by Huygens' principle. Site amplification is an important factor in determining seismic ground motion, in addition to source and propagation factors. The site factor is usually frequency-dependent and should be corrected in real time for EEW. The frequency dependence is reproduced using a causal filter in the time domain, applying bilinear-transform and pre-warping techniques. Our final goal is the time-evolutional prediction of seismic waveforms. Instead of waveforms, prediction of seismic intensity is applied in a preliminary version of this method, in which real-time observations of seismic intensities are used. JMA intends to introduce the preliminary version into its system within a couple of years and to integrate it with the current method, which is based on the hypocenter and magnitude.
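
The one-step predictor u(x, t+Δt) = P(u(x, t)) can be illustrated with the simplest member of the finite-difference family: an explicit scheme for the 1-D scalar wave equation in a homogeneous medium. This is a toy stand-in for the 2-D/3-D operators and Kirchhoff integrals used operationally; grid spacing, velocity, and boundary treatment are illustrative assumptions:

```python
def fd_step(u_prev, u_curr, c, dx, dt):
    """One explicit finite-difference time step of the 1-D wave equation
    u_tt = c^2 u_xx, i.e. the operator P in u(x, t+dt) = P(u(x, t)).
    Requires the CFL condition c*dt/dx <= 1 for stability."""
    r2 = (c * dt / dx) ** 2
    u_next = [0.0] * len(u_curr)
    for i in range(1, len(u_curr) - 1):
        u_next[i] = (2.0 * u_curr[i] - u_prev[i]
                     + r2 * (u_curr[i + 1] - 2.0 * u_curr[i] + u_curr[i - 1]))
    return u_next  # endpoints held at zero (fixed boundaries)

def predict(u_prev, u_curr, c, dx, dt, n_steps):
    """Advance the assimilated wavefield n_steps into the future."""
    for _ in range(n_steps):
        u_prev, u_curr = u_curr, fd_step(u_prev, u_curr, c, dx, dt)
    return u_curr
```

In the scheme above, the two most recent wavefield snapshots (here, the output of the data-assimilation step) fully determine the future field, which is the essence of the time-evolutional approach.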

  4. Post disaster monitoring for the Great East Japan Earthquake with a new L-band airborne SAR "Pi-SAR-L2"

    NASA Astrophysics Data System (ADS)

    Kawano, Noriyuki

    2013-04-01

    A new L-band airborne SAR, the Polarimetric and Interferometric Synthetic Aperture Radar with L-band type-2 (Pi-SAR-L2), was developed in April 2012 by the Japan Aerospace Exploration Agency (JAXA). Pi-SAR-L2 operates at L-band with a bandwidth of 85 MHz (1,215-1,300 MHz) and a peak power of 3.5 kW, mounted on a Gulfstream II. Pi-SAR-L2 conducted its first acquisitions for calibration and validation in April 2012 over Tomakomai, Hokkaido, a test site equipped with corner reflectors. The Great East Japan Earthquake, with a magnitude of 9.0, occurred at 14:46 on 11 March 2011, and an enormous tsunami struck the Tohoku district after the earthquake, causing huge damage along the Tohoku coast. Pi-SAR-L2 acquired data over the post-disaster regions in Fukushima and Miyagi Prefectures along the coast on the way to Hokkaido in April 2012; some regions still remained flooded or covered with debris caused by the tsunami. We will present the Pi-SAR-L2 system and its specifications, and discuss monitoring of this damage.

  5. Photovoltaic Performance and Reliability Database: A Gateway to Experimental Data Monitoring Projects for PV at the Florida Solar Energy Center

    DOE Data Explorer

    This site is the gateway to experimental data monitoring projects for photovoltaics (PV) at the Florida Solar Energy Center. The website and the database were designed to facilitate and standardize the processes for archiving, analyzing, and accessing data collected from dozens of operational PV systems and test facilities monitored by FSEC's Photovoltaics and Distributed Generation Division. [copied from http://www.fsec.ucf.edu/en/research/photovoltaics/data_monitoring/index.htm]

  6. Earthquake monitoring of eastern Washington (Operation of a Hanford seismic network and related studies): Final technical report

    SciTech Connect

    Not Available

    1988-07-01

    This is the final report for the operations and research performed by the University of Washington Geophysics Program on the seismicity and structure of eastern Washington and northeastern Oregon. There have been an average of 64 seismic stations operating between 1975 and 1979 and 105 stations between 1980 and 1988 whose data are telemetered to the University for recording, analysis and interpretation. Since 1976, annual technical reports have been produced that summarize network operation, analysis of data, and research results. The reports include earthquakes that have occurred since 1969. 18 refs., 15 figs., 2 tabs.

  7. Data Management Coordinators Monitor STS-78 Mission at the Huntsville Operations Support Center

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Launched on June 20, 1996, the STS-78 mission's primary payload was the Life and Microgravity Spacelab (LMS), which was managed by the Marshall Space Flight Center (MSFC). During the 17-day space flight, the crew conducted a diverse slate of experiments divided into a mix of life science and microgravity investigations. In a manner very similar to future International Space Station operations, LMS researchers from the United States and their European counterparts shared resources such as crew time and equipment. Five space agencies (NASA/USA, European Space Agency/Europe (ESA), French Space Agency/France, Canadian Space Agency/Canada, and Italian Space Agency/Italy), along with research scientists from 10 countries, worked together on the design, development, and construction of the LMS. This photo shows Data Management Coordinators monitoring the progress of the mission at the Huntsville Operations Support Center (HOSC) Spacelab Payload Operations Control Center (SL POCC) at MSFC. Pictured are assistant mission scientist Dr. Dalle Kornfeld, Rick McConnel, and Ann Bathew.

  8. The Deep Impact Network Experiment Operations Center Monitor and Control System

    NASA Technical Reports Server (NTRS)

    Wang, Shin-Ywan (Cindy); Torgerson, J. Leigh; Schoolcraft, Joshua; Brenman, Yan

    2009-01-01

    The Interplanetary Overlay Network (ION) software at JPL is an implementation of Delay/Disruption Tolerant Networking (DTN) which has been proposed as an interplanetary protocol to support space communication. The JPL Deep Impact Network (DINET) is a technology development experiment intended to increase the technical readiness of the JPL implemented ION suite. The DINET Experiment Operations Center (EOC) developed by JPL's Protocol Technology Lab (PTL) was critical in accomplishing the experiment. EOC, containing all end nodes of simulated spaces and one administrative node, exercised publish and subscribe functions for payload data among all end nodes to verify the effectiveness of data exchange over ION protocol stacks. A Monitor and Control System was created and installed on the administrative node as a multi-tiered internet-based Web application to support the Deep Impact Network Experiment by allowing monitoring and analysis of the data delivery and statistics from ION. This Monitor and Control System includes the capability of receiving protocol status messages, classifying and storing status messages into a database from the ION simulation network, and providing web interfaces for viewing the live results in addition to interactive database queries.

  9. Cost-effective monitoring of ground motion related to earthquakes, landslides, or volcanic activity by joint use of a single-frequency GPS and a MEMS accelerometer

    NASA Astrophysics Data System (ADS)

    Tu, R.; Wang, R.; Ge, M.; Walter, T. R.; Ramatschi, M.; Milkereit, C.; Bindi, D.; Dahm, T.

    2013-08-01

    Detection and precise estimation of strong ground motion are crucial for rapid assessment and early warning of geohazards such as earthquakes, landslides, and volcanic activity. This challenging task can be accomplished by combining GPS and accelerometer measurements because of their complementary capabilities to resolve broadband ground motion signals. However, for implementing an operational monitoring network of such joint measurement systems, cost-effective techniques need to be developed and rigorously tested. We propose a new approach for joint processing of single-frequency GPS and MEMS (microelectromechanical systems) accelerometer data in real time. To demonstrate the performance of our method, we describe results from outdoor experiments under controlled conditions. For validation, we analyzed dual-frequency GPS data and images recorded by a video camera. The results of the different sensors agree very well, suggesting that real-time broadband information on ground motion can be provided by using single-frequency GPS and MEMS accelerometers.
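
The complementary character of the two sensors can be illustrated with a simple complementary filter: long-period displacement follows GPS, short-period displacement follows the integrated acceleration. This is a minimal sketch under idealized assumptions (collocated, time-aligned, bias-free samples), not the authors' real-time algorithm; the crossover frequency `fc` is an assumed parameter:

```python
import math

def complementary_fusion(gps_disp, acc, dt, fc=0.1):
    """Fuse collocated single-frequency GPS displacement samples (m) and
    MEMS accelerometer samples (m/s^2) into one broadband displacement
    trace.  `fc` (Hz) sets the crossover between the two sensors."""
    alpha = 1.0 / (1.0 + 2.0 * math.pi * fc * dt)  # first-order filter weight
    vel = 0.0
    fused_prev = gps_disp[0]
    out = []
    for g, a in zip(gps_disp, acc):
        vel += a * dt  # integrate acceleration once -> velocity
        # Propagate the previous fused displacement with the accelerometer-
        # derived velocity (high-pass path), then pull toward GPS (low-pass path).
        fused = alpha * (fused_prev + vel * dt) + (1.0 - alpha) * g
        fused_prev = fused
        out.append(fused)
    return out
```

Integrating the accelerometer alone would drift without bound; blending in the GPS solution at long periods bounds that drift, which is why the combination resolves broadband motion.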

  10. Rapid Estimates of Rupture Extent for Large Earthquakes Using Aftershocks

    NASA Astrophysics Data System (ADS)

    Polet, J.; Thio, H. K.; Kremer, M.

    2009-12-01

    The spatial distribution of aftershocks is closely linked to the rupture extent of the mainshock that preceded them, and a rapid analysis of aftershock patterns therefore has potential for use in near real-time estimates of earthquake impact. The correlation between aftershocks and slip distribution has frequently been used to estimate the fault dimensions of large historic earthquakes for which no, or insufficient, waveform data are available. With the advent of earthquake inversions that use seismic waveforms and geodetic data to constrain the slip distribution, the study of aftershocks has recently focused largely on enhancing our understanding of the underlying mechanisms in a broader earthquake mechanics/dynamics framework. However, in a near real-time earthquake monitoring environment, in which aftershocks of large earthquakes are routinely detected and located, these data may also be effective in determining a fast estimate of the mainshock rupture area, which would aid in the rapid assessment of the impact of the earthquake. We have analyzed a considerable number of large recent earthquakes and their aftershock sequences and have developed an effective algorithm that determines the rupture extent of a mainshock from its aftershock distribution in a fully automatic manner. The algorithm automatically removes outliers by spatial binning, and subsequently determines the best-fitting “strike” of the rupture and its length by projecting the aftershock epicenters onto a set of lines that cross the mainshock epicenter at incremental azimuths. For strike-slip or large dip-slip events, for which the surface projection of the rupture is rectilinear, the calculated strike correlates well with the strike of the fault, and the corresponding length, determined from the distribution of aftershocks projected onto the line, agrees well with the rupture length. 
In the case of a smaller dip-slip rupture with an aspect ratio closer to 1, the procedure gives a measure of the rupture extent and dimensions, but not necessarily the strike. We found that using standard earthquake catalogs, such as the National Earthquake Information Center catalog, we can constrain the rupture extent, rupture direction, and in many cases the type of faulting of the mainshock with the aftershocks that occur within the first hour after the mainshock. However, these data may not currently be available in near real-time. Since our results show that these early aftershock locations may be used to estimate first-order rupture parameters for large global earthquakes, near real-time availability of these data would be useful for fast earthquake damage assessment.
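
The projection step of such an algorithm can be sketched as follows, assuming the aftershock epicenters have already been converted to local Cartesian coordinates in km; the outlier removal by spatial binning described above is omitted for brevity:

```python
import math

def rupture_from_aftershocks(epicenters, mainshock, az_step_deg=5.0):
    """Estimate rupture strike (degrees from north) and length (km) by
    projecting aftershock epicenters onto lines through the mainshock
    epicenter at incremental azimuths and keeping the azimuth whose
    projections have the greatest spread.  A simplified sketch of the
    approach described above, not the published implementation."""
    mx, my = mainshock
    best = (None, -1.0)  # (azimuth_deg, spread_km)
    az = 0.0
    while az < 180.0:  # strike is only defined modulo 180 degrees
        rad = math.radians(az)
        ux, uy = math.sin(rad), math.cos(rad)  # unit vector along azimuth
        proj = [(x - mx) * ux + (y - my) * uy for x, y in epicenters]
        spread = max(proj) - min(proj)
        if spread > best[1]:
            best = (az, spread)
        az += az_step_deg
    return best  # (strike_deg, length_km)
```

For a rectilinear aftershock cloud the winning azimuth aligns with the fault trace, and the spread of projections along it approximates the rupture length, as the abstract describes.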

  11. Prompt Assessment of Global Earthquakes for Response (PAGER): An Automated System to Estimate Impact Following Significant Earthquakes Worldwide

    NASA Astrophysics Data System (ADS)

    Earle, P. S.; Wald, D. J.; Lastowka, L. A.; Quitoriano, V.; Donnelly, M. J.

    2004-12-01

    The US Geological Survey's National Earthquake Information Center (USGS/NEIC) is developing a system to rapidly assess societal impact immediately following significant global earthquakes. NEIC's near-real-time earthquake solutions are being monitored to automatically identify quakes that likely caused human suffering or damage to infrastructure, or that will attract significant media attention. Our goal is to help the USGS fulfill its mission to provide critical earthquake-related information to emergency response agencies, government agencies, the scientific community, the media, and the general public. Currently, it takes several hours to days for the media and other organizations to provide an assessment of a damaging earthquake. Our system, known as Prompt Assessment of Global Earthquakes for Response (PAGER), will estimate the severity of damage caused by an earthquake immediately following its location and magnitude estimation (minutes to an hour). PAGER will assess the situation based on estimated and any observed ground motions, the total population exposed to varying degrees of shaking, and the vulnerability of the affected region. We expect that an automated summary impact statement and associated alarms can be deployed within seconds of computing the ground-motion estimates, well before ground-truth damage estimates arrive. The USGS is collaborating with the US Agency for International Development (USAID) to develop a prototype system. The prototype will estimate ground motions using modifications to the methodology developed for ShakeMap, extended to the entire globe. Since strong-motion recordings will rarely be available for global earthquakes in real time, we will rely on predicted rather than observed ground motions. Site corrections will be approximated using a combination of elevation and topographic slope (see Wald et al., this meeting), and the exposed population will be determined using Oak Ridge National Lab's LandScan 2002 global population database. 
PAGER will be an iterative system, with new alarms issued as better estimates of magnitude, location, fault orientation, finite-fault effects, and felt reports become available. We will present details of the assessment algorithm and examples from the prototype system.

  12. Pain Reduction and Financial Incentives to Improve Glucose Monitoring Adherence in a Community Health Center

    PubMed Central

    Huntsman, Mary Ann H.; Olivares, Faith J.; Tran, Christina P.; Billimek, John; Hui, Elliot E.

    2014-01-01

    Self-monitoring of blood glucose is a critical component of diabetes management. However, patients often do not maintain the testing schedule recommended by their healthcare provider. Many barriers to testing have been cited, including cost and pain. We present a small pilot study to explore whether the use of financial incentives and pain-free lancets could improve adherence to glucose testing in a community health center patient population consisting largely of non-English speaking ethnic minorities with low health literacy. The proportion of patients lost to follow-up was 17%, suggesting that a larger scale study is feasible in this type of setting, but we found no preliminary evidence suggesting a positive effect on adherence by either financial incentives or pain-free lancets. Results from this pilot study will guide the design of larger-scale studies to evaluate approaches to overcome the variety of barriers to glucose testing that are present in disadvantaged patient populations. PMID:25486531

  13. Activation and implementation of a Federal Radiological Monitoring and Assessment Center

    SciTech Connect

    Doyle, J.F. III

    1989-01-01

    The Nevada Operations Office of the U.S. Department of Energy (DOE/NV) has been assigned the primary responsibility for responding to a major radiological emergency. The initial response to any radiological emergency, however, will probably be conducted under the DOE regional radiological assistance plan (RAP). If the dimensions of the crisis demand federal assistance, the following sequence of events may be anticipated: (1) DOE regional RAP response, (2) activation of the Federal Radiological Monitoring and Assistance Center (FRMAC) requested, (3) aerial measuring systems and DOE/NV advance party respond, (4) FRMAC activated, (5) FRMAC responds to state(s) and cognizant federal agency (CFA), and (6) management of FRMAC transferred to the Environmental Protection Agency (EPA). The paper discusses activation channels, authorization, notification, deployment, and interfaces.

  14. Atmospheric monitoring of a perfluorocarbon tracer at the 2009 ZERT Center experiment

    NASA Astrophysics Data System (ADS)

    Pekney, Natalie; Wells, Arthur; Rodney Diehl, J.; McNeil, Matthew; Lesko, Natalie; Armstrong, James; Ference, Robert

    2012-02-01

    Field experiments at Montana State University are conducted for the U.S. Department of Energy as part of the Zero Emissions Research and Technology Center (ZERT) to test and verify monitoring techniques for carbon capture and storage (CCS). A controlled release of CO2 with an added perfluorocarbon tracer was conducted in July 2009 in a multi-laboratory study of atmospheric transport and detection technologies. Tracer plume dispersion was measured under various meteorological conditions using a tethered balloon system with Multi-Tube Remote Samplers (MTRS) at elevations of 10 m, 20 m, and 40 m above ground level (AGL), as well as a ground-based portable tower with monitors containing sorbent material to collect the tracer at 1 m, 2 m, 3 m, and 4 m AGL. Researchers designed a horizontal grid of sampling locations centered on the tracer plume source, with the tower positioned at 10 m and 30 m in both upwind and downwind directions, and the MTRS spaced at 50 m and 90 m downwind and 90 m upwind. Tracer was consistently detected at elevated concentrations at downwind sampling locations. With very few exceptions, higher tracer concentrations correlated with lower elevations. Researchers observed no statistical difference between sampling at 50 m and 90 m downwind at the same elevation. The US EPA AERMOD model, applied using site-specific information, predicted transport and dispersion of the tracer; model results are compared to experimental data from the 2009 ZERT experiment. Successful characterization of the tracer plume simulated by the ZERT experiment is considered a step toward demonstrating the feasibility of remote sampling with unmanned aerial systems (UASs) at future sequestration sites.

  15. Incorporating Fundamentals of Climate Monitoring into Climate Indicators at the National Climatic Data Center

    NASA Astrophysics Data System (ADS)

    Arndt, D. S.

    2014-12-01

    In recent years, much attention has been dedicated to the development, testing, and implementation of climate indicators. Several Federal agencies and academic groups have commissioned suites of indicators drawing upon and aggregating information available across the spectrum of climate data stewards and providers. As a long-time participant in the applied climatology discipline, NOAA's National Climatic Data Center (NCDC) has generated climate indicators for several decades. Traditionally, these indicators were developed for sectors with long-standing relationships with, and needs of, the applied climatology field. They have recently been adopted and adapted to meet the needs of sectors with newfound sensitivities to climate and needs for climate data. Information and indices from NCDC have been prominent components of these indicator suites, and in some cases have been adopted in toto by these aggregators, often with improvements to the communicability and aesthetics of the indicators themselves. Across this history of supporting needs for indicators, NCDC climatologists developed a handful of practical approaches and philosophies that inform a successful climate monitoring product. This manuscript and presentation will demonstrate the utility of this set of practical applications that translate raw data into useful information.

  16. Satellite Geodetic Constraints On Earthquake Processes: Implications of the 1999 Turkish Earthquakes for Fault Mechanics and Seismic Hazards on the San Andreas Fault

    NASA Technical Reports Server (NTRS)

    Reilinger, Robert

    2005-01-01

    Our principal activities during the initial phase of this project include: 1) continued monitoring of postseismic deformation for the 1999 Izmit and Duzce, Turkey earthquakes from repeated GPS survey measurements and expansion of the Marmara Continuous GPS Network (MAGNET); 2) establishing three North Anatolian fault-crossing profiles (10 sites/profile) at locations that experienced major surface-fault earthquakes at different times in the past, to examine strain accumulation as a function of time in the earthquake cycle (2004); 3) repeat observations of selected sites in the fault-crossing profiles (2005); 4) repeat surveys of the Marmara GPS network to continue to monitor postseismic deformation; 5) refining block models for the Marmara Sea seismic gap area to better understand earthquake hazards in the Greater Istanbul area; and 6) continuing development of models for afterslip and distributed viscoelastic deformation for the earthquake cycle. We are keeping in close contact with MIT colleagues (Brad Hager and Eric Hetland), who are developing models for S. California and for the earthquake cycle in general (Hetland, 2006). In addition, our Turkish partners at the Marmara Research Center have undertaken repeat micro-gravity measurements at the MAGNET sites and have provided us estimates of gravity change during the period 2003-2005.

  17. The Swift X-ray monitoring campaign of the center of the Milky Way

    NASA Astrophysics Data System (ADS)

    Degenaar, N.; Wijnands, R.; Miller, J. M.; Reynolds, M. T.; Kennea, J.; Gehrels, N.

    2015-09-01

    In 2006 February, shortly after its launch, Swift began monitoring the center of the Milky Way with the on-board X-Ray Telescope using short 1-ks exposures performed every 1-4 days. Between 2006 and 2014, over 1200 observations have been obtained, accumulating to ≈1.3 Ms of exposure time. This has yielded a wealth of information about the long-term X-ray behavior of the supermassive black hole Sgr A* and numerous transient X-ray binaries that are located within the 25′ × 25′ region covered by the campaign. In this review we highlight the discoveries made during these first nine years, which include 1) the detection of seven bright X-ray flares from Sgr A*, 2) the discovery of the magnetar SGR J1745-29, 3) the first systematic analysis of the outburst light curves and energetics of the peculiar class of very-faint X-ray binaries, 4) the discovery of three new transient X-ray sources, 5) the exposure of low-level accretion in otherwise bright X-ray binaries, and 6) the identification of a candidate X-ray binary/millisecond radio pulsar transitional object. We also reflect on future science to be done by continuing this Swift legacy campaign, such as high-cadence monitoring to study how the interaction between the gaseous object 'G2' and Sgr A* plays out in the future.

  18. Earthquake Hazards.

    ERIC Educational Resources Information Center

    Donovan, Neville

    1979-01-01

    Provides a survey and a review of earthquake activity and global tectonics from the advancement of the theory of continental drift to the present. Topics include: an identification of the major seismic regions of the earth, seismic measurement techniques, seismic design criteria for buildings, and the prediction of earthquakes. (BT)

  19. Seismological investigation of earthquakes in the New Madrid Seismic Zone. Final report, September 1986--December 1992

    SciTech Connect

    Herrmann, R.B.; Nguyen, B.

    1993-08-01

    Earthquake activity in the New Madrid Seismic Zone has been monitored by regional seismic networks since 1975. During this time period, over 3,700 earthquakes have been located within the region bounded by latitudes 35°-39°N and longitudes 87°-92°W. Most of these earthquakes occur within a 1.5° × 2° zone centered on the Missouri Bootheel. Source parameters of larger earthquakes in the zone and in eastern North America are determined using surface-wave spectral amplitudes and broadband waveforms for the purpose of determining the focal mechanism, source depth, and seismic moment. Waveform modeling of broadband data is shown to be a powerful tool in defining these source parameters when used complementarily with regional seismic network data and, in addition, in verifying the correctness of previously published focal mechanism solutions.

  20. The Terminator Time in subionospheric VLF/LF diurnal variation as recorded by the Romanian VLF/LF radio monitoring system related to earthquake occurrence and volcano eruptions

    NASA Astrophysics Data System (ADS)

    Moldovan, I. A.; Moldovan, A. S.; Biagi, P. F.; Ionescu, C.; Schwingenschuh, K.; Boudjada, M. Y.

    2012-04-01

    The Romanian VLF/LF monitoring system, consisting of a radio receiver and the infrastructure necessary to record and transmit the collected data, is part of the European international network named INFREP. The intensities of the electromagnetic fields created by transmitters at a receiving site indicate the quality of propagation along the paths between the receivers and transmitters. Studying the ionosphere's influence on electromagnetic wave propagation along a given path is a method to reveal possible modifications of its lower structure and composition as earthquake precursors. The VLF/LF receiver installed in Romania was put into operation in February 2009 and already has 3 years of testing and operation, proving its utility in the forecast of some earthquakes and volcanic eruptions. At the same site as the VLF/LF receiver, we simultaneously monitor the vertical atmospheric electric field and various other meteorological parameters, such as temperature, pressure, and rainfall. Global magnetic conditions are characterized with the help of the daily geomagnetic index Kp. At a basic level, the adopted analysis consists of a simple statistical evaluation of the signals, comparing the instantaneous values to the trend of the signal. In this paper we pay attention to the terminator times in the subionospheric VLF/LF diurnal variation, which are defined as the times of minimum amplitude (or phase) around sunrise and sunset. These terminator times are found to shift significantly just around an earthquake. In the case of the Kobe earthquake, significant shifts were found in both the morning and evening terminator times, and the shifts were interpreted in terms of a lowering of the lower ionosphere using full-wave mode theory. A LabVIEW application that accesses the VLF/LF receiver through the internet was developed. 
This program opens the receiver's web page and automatically retrieves the list of data files to synchronize the user-side data with the receiver's data. Missing zipped files are also automatically downloaded. The application appends daily files into monthly and annual files and produces 3D color-coded maps of VLF and LF signal intensities versus the minute of the day and the day of the month, facilitating near real-time observation of VLF and LF electromagnetic wave propagation. This type of representation highlights the shift of the terminator time with the length of the solar day, improves the user's ability to detect possible propagation anomalies due to ionospheric conditions, and allows a quick visual inspection of unexpected behavior of transmission channels at different frequencies and paths. A noteworthy result was observed in the recordings made on the propagation path to Iceland (NRK, 37.5 kHz). Recordings were made once a minute over a period of 303 days. Propagation anomalies on the Icelandic channel, present in the range of 40-90 days, are considered precursory phenomena associated with the Eyjafjallajokull (Iceland) volcanic eruption that occurred in April-May 2010.
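
Extracting terminator times from a daily amplitude record amounts to locating the amplitude minima in windows around the nominal sunrise and sunset. A minimal sketch, with the window half-width and nominal minute-of-day values as illustrative assumptions:

```python
def terminator_times(amplitude, minutes_per_sample=1,
                     sunrise_min=360, sunset_min=1080, window_min=120):
    """Return (morning, evening) terminator times, in minutes of the day,
    for a one-day VLF/LF amplitude series.  Each terminator time is taken
    as the amplitude minimum within a +/- `window_min` window around the
    nominal sunrise/sunset minute.  Nominal times are assumed values; in
    practice they depend on season and on the transmitter-receiver path."""
    def minimum_near(center):
        lo = max(0, (center - window_min) // minutes_per_sample)
        hi = min(len(amplitude), (center + window_min) // minutes_per_sample)
        seg = amplitude[lo:hi]
        return (lo + seg.index(min(seg))) * minutes_per_sample
    return minimum_near(sunrise_min), minimum_near(sunset_min)
```

Tracking these two values day by day, as the maps described above make possible visually, is what reveals the anomalous terminator-time shifts discussed in the abstract.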

  1. Analog earthquakes

    SciTech Connect

    Hofmann, R.B.

    1995-09-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.

  2. 1964 Great Alaska Earthquake: a photographic tour of Anchorage, Alaska

    USGS Publications Warehouse

    Thoms, Evan E.; Haeussler, Peter J.; Anderson, Rebecca D.; McGimsey, Robert G.

    2014-01-01

    On March 27, 1964, at 5:36 p.m., a magnitude 9.2 earthquake, the largest recorded earthquake in U.S. history, struck southcentral Alaska (fig. 1). The Great Alaska Earthquake (also known as the Good Friday Earthquake) occurred at a pivotal time in the history of earth science, and helped lead to the acceptance of plate tectonic theory (Cox, 1973; Brocher and others, 2014). All large subduction zone earthquakes are understood through insights learned from the 1964 event, and observations and interpretations of the earthquake have influenced the design of infrastructure and seismic monitoring systems now in place. The earthquake caused extensive damage across the State, and triggered local tsunamis that devastated the Alaskan towns of Whittier, Valdez, and Seward. In Anchorage, the main cause of damage was ground shaking, which lasted approximately 4.5 minutes. Many buildings could not withstand this motion and were damaged or collapsed even though their foundations remained intact. More significantly, ground shaking triggered a number of landslides along coastal and drainage valley bluffs underlain by the Bootlegger Cove Formation, a composite of facies containing variably mixed gravel, sand, silt, and clay which were deposited over much of upper Cook Inlet during the Late Pleistocene (Ulery and others, 1983). Cyclic (or strain) softening of the more sensitive clay facies caused overlying blocks of soil to slide sideways along surfaces dipping by only a few degrees. This guide is the document version of an interactive web map that was created as part of the commemoration events for the 50th anniversary of the 1964 Great Alaska Earthquake. It is accessible at the U.S. Geological Survey (USGS) Alaska Science Center website: http://alaska.usgs.gov/announcements/news/1964Earthquake/. 
The website features a map display with suggested tour stops in Anchorage, historical photographs taken shortly after the earthquake, repeat photography of selected sites, scanned documents, and small-scale maps, as well as links to slideshows of additional photographs and Google Street Viewℱ scenes. Buildings in Anchorage that were severely damaged, sites of major landslides, and locations of post-earthquake engineering responses are highlighted. The web map can be used online as a virtual tour or in a physical self-guided tour using a web-enabled Global Positioning System (GPS) device. This publication serves the purpose of committing most of the content of the web map to a single distributable document. As such, some of the content differs from the online version.

  3. Rapid and robust characterization of the earthquake source for tsunami early-warning

    NASA Astrophysics Data System (ADS)

    Lomax, Anthony; Michelini, Alberto; Bernardi, Fabrizio; Lauciani, Valentino

    2015-04-01

Effective tsunami early-warning after an earthquake is difficult when the distances and tsunami travel-times between earthquake/tsunami source regions and coast lines at risk are small, especially since the density of seismic and other monitoring stations is very low in most regions of risk. For tsunami warning worldwide, seismic monitoring and analysis currently provide the majority of information available within the first tens of minutes after an earthquake. This information is used for direct tsunami hazard assessment, and as basic input to real-time tsunami hazard modeling. It is thus crucial that key earthquake parameters are determined as rapidly and reliably as possible, in a probabilistic, time-evolving manner, along with full uncertainties. Early-est (EArthquake Rapid Location sYstem with EStimation of Tsunamigenesis) is the module for rapid earthquake detection, location and analysis at the INGV tsunami alert center (CAT, "Centro di Allerta Tsunami"), part of the Italian candidate Tsunami Watch Provider. Here we present the information produced by Early-est within the first 10 min after an earthquake to characterize the location, depth, magnitude, mechanism and tsunami potential of an earthquake. We discuss key algorithms in Early-est that produce fully automatic, robust results and their uncertainties in the shortest possible time using sparse observations. For example, a broadband picker and a robust, probabilistic, global-search detector/associator/locator component of Early-est can detect and locate a seismic event with as few as 4 to 5 P onset observations. We also discuss how these algorithms may be further evolved to provide even earlier and more robust results. 
Finally, we illustrate how the real-time, evolutionary and probabilistic earthquake information produced by Early-est, along with prior and non-seismic information and later seismic information (e.g., full-waveform moment-tensors), may be used within time-evolving, decision and modeling systems for tsunami early warning.
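As a toy illustration of the kind of global-search location described in the abstract, the sketch below locates an event by grid search over a plane, assuming a constant P velocity and a handful of P onsets. The geometry, velocity, and station layout are invented for the example; Early-est's actual probabilistic locator is far more sophisticated.

```python
import numpy as np

def locate_grid_search(stations, arrivals, v=6.0, extent=100.0, step=1.0):
    """Toy 2D grid-search locator: find (x, y, t0) minimizing P travel-time residuals.
    stations: (N, 2) station coordinates in km; arrivals: (N,) P arrival times in s;
    v: assumed constant P velocity in km/s."""
    xs = np.arange(-extent, extent + step, step)
    best = None
    for x in xs:
        for y in xs:
            d = np.hypot(stations[:, 0] - x, stations[:, 1] - y)
            tt = d / v                       # predicted travel times
            t0 = np.mean(arrivals - tt)      # best origin time for this grid node
            rms = np.sqrt(np.mean((arrivals - (t0 + tt)) ** 2))
            if best is None or rms < best[0]:
                best = (rms, x, y, t0)
    return best  # (rms residual, x, y, origin time)

# Synthetic test: 5 stations, true event at (12, -7) km, origin time 3.0 s
stations = np.array([[50, 0], [-40, 30], [0, -60], [35, 45], [-25, -35]], float)
true_xy, t0_true = np.array([12.0, -7.0]), 3.0
obs = t0_true + np.hypot(*(stations - true_xy).T) / 6.0
rms, x, y, t0_est = locate_grid_search(stations, obs)
print(round(x), round(y))  # recovers the epicenter to grid resolution
```

With only five arrivals the minimum is still sharp here because the synthetic data are noise-free; real sparse-data location requires the probabilistic treatment the abstract describes.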

  4. Monitoring Stellar Orbits Around the Massive Black Hole in the Galactic Center

    NASA Astrophysics Data System (ADS)

    Gillessen, S.; Eisenhauer, F.; Trippe, S.; Alexander, T.; Genzel, R.; Martins, F.; Ott, T.

    2009-02-01

We present the results of 16 years of monitoring stellar orbits around the massive black hole in the center of the Milky Way, using high-resolution near-infrared techniques. This work refines our previous analysis mainly by greatly improving the definition of the coordinate system, which reaches a long-term astrometric accuracy of ≈300 ÎŒas, and by investigating in detail the individual systematic error contributions. The combination of a long-time baseline and the excellent astrometric accuracy of adaptive optics data allows us to determine orbits of 28 stars, including the star S2, which has completed a full revolution since our monitoring began. Our main results are: all stellar orbits are fit extremely well by a single-point-mass potential to within the astrometric uncertainties, which are now ≈6× better than in previous studies. The central object mass is (4.31 ± 0.06|stat ± 0.36|R_0) × 10^6 M_⊙, where the fractional statistical error of 1.5% is nearly independent of R_0, and the main uncertainty is due to the uncertainty in R_0. Our current best estimate for the distance to the Galactic center is R_0 = 8.33 ± 0.35 kpc. The dominant errors in this value are systematic. The mass scales with distance as (3.95 ± 0.06) × 10^6 (R_0/8 kpc)^2.19 M_⊙. The orientations of orbital angular momenta for stars in the central arcsecond are random. We identify six of the stars with orbital solutions as late-type stars, and six early-type stars as members of the clockwise-rotating disk system, as was previously proposed. We constrain the extended dark mass enclosed between the pericenter and apocenter of S2 to less than 0.066 of the mass of Sgr A*, at the 99% confidence level. This is two orders of magnitude larger than what one would expect from other theoretical and observational estimates.
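The quoted mass-distance scaling can be checked numerically: evaluating M = 3.95 × 10^6 (R_0/8 kpc)^2.19 M_⊙ at the best-fit distance R_0 = 8.33 kpc should reproduce the central mass quoted in the abstract. A minimal check:

```python
import math

def bh_mass(r0_kpc, m8=3.95e6, alpha=2.19):
    """Black-hole mass (in solar masses) from the abstract's scaling
    M = m8 * (R0 / 8 kpc)**alpha."""
    return m8 * (r0_kpc / 8.0) ** alpha

m = bh_mass(8.33)
print(f"{m/1e6:.2f}e6 Msun")  # consistent with the quoted (4.31 +/- ...) x 10^6 Msun
```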

  5. Interoperable Access to Near Real Time Ocean Observations with the Observing System Monitoring Center

    NASA Astrophysics Data System (ADS)

    O'Brien, K.; Hankin, S.; Mendelssohn, R.; Simons, R.; Smith, B.; Kern, K. J.

    2013-12-01

The Observing System Monitoring Center (OSMC), a project funded by the National Oceanic and Atmospheric Administration's Climate Observations Division (COD), exists to join the discrete 'networks' of in situ ocean observing platforms -- ships, surface floats, profiling floats, tide gauges, etc. -- into a single, integrated system. The OSMC is addressing this goal through capabilities in three areas focusing on the needs of specific user groups: 1) it provides real time monitoring of the integrated observing system assets to assist management in optimizing the cost-effectiveness of the system for the assessment of climate variables; 2) it makes the stream of real time data coming from the observing system available to scientific end users in an easy-to-use form; and 3) in the future, it will unify the delayed-mode data from platform-focused data assembly centers into a standards-based distributed system that is readily accessible to interested users from the science and education communities. In this presentation, we will be focusing on the efforts of the OSMC to provide interoperable access to the near real time data stream that is available via the Global Telecommunications System (GTS). This is a very rich data source, and includes data from nearly all of the oceanographic platforms that are actively observing. We will discuss how the data is being served out using a number of widely used 'web services' (including OPeNDAP and SOS) and downloadable file formats (KML, csv, xls, netCDF), so that it can be accessed in web browsers and popular desktop analysis tools. We will also be discussing our use of the Environmental Research Division's Data Access Program (ERDDAP), available from NOAA/NMFS, which has allowed us to achieve our goals of serving the near real time data. From an interoperability perspective, it's important to note that access to this stream of data is not just for humans, but also for machine-to-machine requests. 
We'll also delve into how we configured access to the near real time ocean observations in accordance with the Climate and Forecast (CF) metadata conventions describing the various 'feature types' associated with particular in situ observation types, or discrete sampling geometries (DSG). Wrapping up, we'll discuss some of the ways this data source is already being used.
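For readers unfamiliar with ERDDAP's request grammar, a tabledap query is simply a URL: dataset ID, response format, requested variables, and constraints. The sketch below assembles such a URL; the base address is NOAA's CoastWatch ERDDAP server, but the dataset ID and variable names are hypothetical placeholders, not datasets the OSMC necessarily serves.

```python
def tabledap_url(base, dataset_id, variables, constraints=(), fmt="csv"):
    """Build an ERDDAP tabledap request URL following ERDDAP's documented
    pattern: base/tabledap/datasetID.format?var1,var2&constraint1&constraint2"""
    query = ",".join(variables) + "".join("&" + c for c in constraints)
    return f"{base}/tabledap/{dataset_id}.{fmt}?{query}"

url = tabledap_url(
    "https://coastwatch.pfeg.noaa.gov/erddap",
    "exampleDataset",                          # hypothetical dataset ID
    ["time", "latitude", "longitude", "sst"],  # hypothetical variables
    ["time>=2013-01-01T00:00:00Z", "time<=2013-01-08T00:00:00Z"],
)
print(url)
```

Changing `fmt` to `"nc"`, `"kml"`, or `"xls"` selects the other file formats mentioned above; this uniform URL grammar is what makes the machine-to-machine access practical.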

  6. Broadband characteristics of earthquakes recorded during a dome-building eruption at Mount St. Helens, Washington, between October 2004 and May 2005: Chapter 5 in A volcano rekindled: the renewed eruption of Mount St. Helens, 2004-2006

    USGS Publications Warehouse

    Horton, Stephen P.; Norris, Robert D.; Moran, Seth C.

    2008-01-01

    From October 2004 to May 2005, the Center for Earthquake Research and Information of the University of Memphis operated two to six broadband seismometers within 5 to 20 km of Mount St. Helens to help monitor recent seismic and volcanic activity. Approximately 57,000 earthquakes identified during the 7-month deployment had a normal magnitude distribution with a mean magnitude of 1.78 and a standard deviation of 0.24 magnitude units. Both the mode and range of earthquake magnitude and the rate of activity varied during the deployment. We examined the time domain and spectral characteristics of two classes of events seen during dome building. These include volcano-tectonic earthquakes and lower-frequency events. Lower-frequency events are further classified into hybrid earthquakes, low-frequency earthquakes, and long-duration volcanic tremor. Hybrid and low-frequency earthquakes showed a continuum of characteristics that varied systematically with time. A progressive loss of high-frequency seismic energy occurred in earthquakes as magma approached and eventually reached the surface. The spectral shape of large and small earthquakes occurring within days of each other did not vary with magnitude. Volcanic tremor events and lower-frequency earthquakes displayed consistent spectral peaks, although higher frequencies were more favorably excited during tremor than earthquakes.
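Given the reported normal magnitude distribution (mean 1.78, standard deviation 0.24) for roughly 57,000 events, the expected count above any magnitude threshold follows from the normal upper tail. A quick check with an illustrative threshold of M 2.5:

```python
import math

def frac_above(m, mean=1.78, sd=0.24):
    """Upper-tail fraction of a normal magnitude distribution, using the
    mean and standard deviation reported for the Mount St. Helens events."""
    z = (m - mean) / sd
    return 0.5 * math.erfc(z / math.sqrt(2))

n_total = 57000
print(round(n_total * frac_above(2.5)))  # only ~77 of 57,000 events exceed M 2.5
```

This illustrates why the deployment's catalog is dominated by small events even at a vigorously erupting volcano; note that tectonic catalogs normally follow a Gutenberg-Richter (exponential) law rather than the normal distribution reported here.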

  7. Anthropogenic triggering of large earthquakes.

    PubMed

    Mulargia, Francesco; Bizzarri, Andrea

    2014-01-01

The physical mechanism of the anthropogenic triggering of large earthquakes on active faults is studied on the basis of experimental phenomenology, i.e., that earthquakes occur on active tectonic faults, that crustal stress values are those measured in situ and, on active faults, comply to the values of the stress drop measured for real earthquakes, that the static friction coefficients are those inferred on faults, and that the effective triggering stresses are those inferred for real earthquakes. Deriving the conditions for earthquake nucleation as a time-dependent solution of the Tresca-Von Mises criterion applied in the framework of poroelasticity yields that active faults can be triggered by fluid overpressures < 0.1 MPa. Comparing this with the deviatoric stresses at the depth of crustal hypocenters, which are of the order of 1-10 MPa, we find that injecting in the subsoil fluids at the pressures typical of oil and gas production and storage may trigger destructive earthquakes on active faults at a few tens of kilometers. Fluid pressure propagates as slow stress waves along geometric paths operating in a drained condition and can advance the natural occurrence of earthquakes by a substantial amount of time. Furthermore, it is illusory to control earthquake triggering by close monitoring of minor "foreshocks", since the induction may occur with a delay up to several years. PMID:25156190

  8. Anthropogenic Triggering of Large Earthquakes

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Bizzarri, Andrea

    2014-08-01

    The physical mechanism of the anthropogenic triggering of large earthquakes on active faults is studied on the basis of experimental phenomenology, i.e., that earthquakes occur on active tectonic faults, that crustal stress values are those measured in situ and, on active faults, comply to the values of the stress drop measured for real earthquakes, that the static friction coefficients are those inferred on faults, and that the effective triggering stresses are those inferred for real earthquakes. Deriving the conditions for earthquake nucleation as a time-dependent solution of the Tresca-Von Mises criterion applied in the framework of poroelasticity yields that active faults can be triggered by fluid overpressures < 0.1 MPa. Comparing this with the deviatoric stresses at the depth of crustal hypocenters, which are of the order of 1-10 MPa, we find that injecting in the subsoil fluids at the pressures typical of oil and gas production and storage may trigger destructive earthquakes on active faults at a few tens of kilometers. Fluid pressure propagates as slow stress waves along geometric paths operating in a drained condition and can advance the natural occurrence of earthquakes by a substantial amount of time. Furthermore, it is illusory to control earthquake triggering by close monitoring of minor ``foreshocks'', since the induction may occur with a delay up to several years.

  9. Anthropogenic Triggering of Large Earthquakes

    PubMed Central

    Mulargia, Francesco; Bizzarri, Andrea

    2014-01-01

The physical mechanism of the anthropogenic triggering of large earthquakes on active faults is studied on the basis of experimental phenomenology, i.e., that earthquakes occur on active tectonic faults, that crustal stress values are those measured in situ and, on active faults, comply to the values of the stress drop measured for real earthquakes, that the static friction coefficients are those inferred on faults, and that the effective triggering stresses are those inferred for real earthquakes. Deriving the conditions for earthquake nucleation as a time-dependent solution of the Tresca-Von Mises criterion applied in the framework of poroelasticity yields that active faults can be triggered by fluid overpressures < 0.1 MPa. Comparing this with the deviatoric stresses at the depth of crustal hypocenters, which are of the order of 1–10 MPa, we find that injecting in the subsoil fluids at the pressures typical of oil and gas production and storage may trigger destructive earthquakes on active faults at a few tens of kilometers. Fluid pressure propagates as slow stress waves along geometric paths operating in a drained condition and can advance the natural occurrence of earthquakes by a substantial amount of time. Furthermore, it is illusory to control earthquake triggering by close monitoring of minor “foreshocks”, since the induction may occur with a delay up to several years. PMID:25156190
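The triggering-threshold argument in the abstract above can be illustrated with a simple Coulomb-style failure criterion (a deliberate simplification; the paper itself works with a time-dependent Tresca-Von Mises criterion in a poroelastic framework). All stress values below are hypothetical round numbers for a fault already close to failure:

```python
def overpressure_to_failure(tau, sigma_n, p0, mu=0.6):
    """Additional fluid overpressure (MPa) needed to bring a fault to Coulomb
    failure, where slip occurs when tau >= mu * (sigma_n - p).
    tau: shear stress; sigma_n: total normal stress; p0: ambient pore pressure."""
    return (sigma_n - p0) - tau / mu

# Hypothetical near-critical fault (stresses in MPa):
dp = overpressure_to_failure(tau=5.95, sigma_n=40.0, p0=30.0, mu=0.6)
print(round(dp, 3))  # a fraction of 0.1 MPa, echoing the paper's order of magnitude
```

The point of the example is only that, for a fault near its static strength, the required overpressure is a small residual between two large quantities, so sub-0.1 MPa perturbations can plausibly tip the balance.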

  10. Multi-Year Combination of Tide Gauge Benchmark Monitoring (TIGA) Analysis Center Products

    NASA Astrophysics Data System (ADS)

    Hunegnaw, A.; Teferle, F. N.

    2014-12-01

In 2013 the International GNSS Service (IGS) Tide Gauge Benchmark Monitoring (TIGA) Working Group (WG) started their reprocessing campaign, which proposes to re-analyze all relevant Global Positioning System (GPS) observations from 1994 to 2013. This re-processed dataset will provide high quality estimates of land motions, enabling regional and global high-precision geophysical/geodetic studies. Several of the individual TIGA Analysis Centers (TACs) have completed processing the full history of GPS observations recorded by the IGS global network, as well as many other GPS stations at or close to tide gauges, which are available from the TIGA data centre at the University of La Rochelle (www.sonel.org). The TAC solutions contain a total of over 700 stations. Following the recent improvements in processing models and strategies, this is the first complete reprocessing attempt by the TIGA WG to provide homogeneous position time series. The TIGA Combination Centre (TCC) at the University of Luxembourg (UL) has computed a first multi-year weekly combined solution using two independent combination software packages: CATREF and GLOBK. These combinations allow an evaluation of any effects from the combination software and of the individual TAC contributions and their influences on the combined solution. In this study we will present the first UL TIGA multi-year combination results and discuss these in terms of geocentric sea level changes.

  11. Earthquake Analysis.

    ERIC Educational Resources Information Center

    Espinoza, Fernando

    2000-01-01

    Indicates the importance of the development of students' measurement and estimation skills. Analyzes earthquake data recorded at seismograph stations and explains how to read and modify the graphs. Presents an activity for student evaluation. (YDS)

  12. Rapid characterization of the 2015 Mw 7.8 Gorkha, Nepal, earthquake sequence and its seismotectonic context

    USGS Publications Warehouse

    Hayes, Gavin; Briggs, Richard; Barnhart, William D.; Yeck, William; McNamara, Daniel E.; Wald, David J.; Nealy, Jennifer; Benz, Harley M.; Gold, Ryan D.; Jaiswal, Kishor S.; Marano, Kristin; Earle, Paul; Hearne, Mike; Smoczyk, Gregory M.; Wald, Lisa A.; Samsonov, Sergey

    2015-01-01

Earthquake response and related information products are important for placing recent seismic events into context and particularly for understanding the impact earthquakes can have on the regional community and its infrastructure. These tools are even more useful if they are available quickly, ahead of detailed information from the areas affected by such earthquakes. Here we provide an overview of the response activities and related information products generated and provided by the U.S. Geological Survey National Earthquake Information Center in association with the 2015 M 7.8 Gorkha, Nepal, earthquake. This group monitors global earthquakes 24 hrs/day and 7 days/week to provide rapid information on the location and size of recent events and to characterize the source properties, tectonic setting, and potential fatalities and economic losses associated with significant earthquakes. We present the timeline over which these products became available, discuss what they tell us about the seismotectonics of the Gorkha earthquake and its aftershocks, and examine how their information is used today, and might be used in the future, to help mitigate the impact of such natural disasters.

  13. Impending ionospheric anomaly preceding the Iquique Mw8.2 earthquake in Chile on 2014 April 1

    NASA Astrophysics Data System (ADS)

    Guo, Jinyun; Li, Wang; Yu, Hongjuan; Liu, Zhimin; Zhao, Chunmei; Kong, Qiaoli

    2015-12-01

To investigate the coupling relationship between great earthquakes and the ionosphere, GPS-derived total electron contents (TECs) from the Center for Orbit Determination in Europe and foF2 data from the Space Weather Prediction Center were used to analyse ionospheric anomalies preceding the Iquique Mw8.2 earthquake in Chile on 2014 April 1. After eliminating the effects of solar and geomagnetic activities on the ionosphere with a sliding interquartile range using a 27-day window, the TEC analysis shows that negative anomalies occurred on the 15th day prior to the earthquake and positive anomalies appeared on the 5th day before it. The foF2 results for ionosonde stations Jicamarca, Concepcion and Ramey show that foF2 increased by 40, 50 and 45 per cent, respectively, on the 5th day before the earthquake. The anomalous TEC distribution indicates a widespread TEC decrement over the epicentre, lasting 6 hr, on the 15th day before the earthquake. On the 5th day before the earthquake, the TEC over the epicentre increased by as much as 15 TECu, and the duration exceeded 6 hr. The anomalies occurred on the side away from the equator. All TEC anomalies in these days lay within the bounds of the equatorial anomaly zone, which should be the focal area for monitoring ionospheric anomalies before strong earthquakes. The relationship between ionospheric anomalies and geomagnetic activity was examined by cross wavelet analysis, which implied that foF2 was not affected by magnetic activities on the 15th and 5th days prior to the earthquake, but that the TECs were partially affected by anomalous magnetic activity during some periods of the 5th day prior to the earthquake.
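The sliding interquartile-range screening named in the abstract can be sketched as follows. This is a minimal version assuming a trailing 27-day background window and the usual 1.5×IQR fences; the authors' exact windowing and thresholds may differ.

```python
import numpy as np

def iqr_anomalies(tec, window=27, k=1.5):
    """Flag TEC anomalies with a sliding interquartile range.
    tec: daily TEC series; window: background length in days; k: IQR multiplier.
    Returns +1 / -1 / 0 for positive / negative / no anomaly at each sample."""
    flags = np.zeros(len(tec), dtype=int)
    for i in range(window, len(tec)):
        past = tec[i - window:i]                # trailing 27-day background
        q1, q3 = np.percentile(past, [25, 75])
        iqr = q3 - q1
        if tec[i] > q3 + k * iqr:
            flags[i] = 1                        # positive anomaly
        elif tec[i] < q1 - k * iqr:
            flags[i] = -1                       # negative anomaly
    return flags

# Synthetic quiet background with one injected spike and one injected dip:
rng = np.random.default_rng(0)
tec = 20 + rng.normal(0, 0.5, 60)
tec[45] += 8.0   # positive anomaly
tec[55] -= 8.0   # negative anomaly
flags = iqr_anomalies(tec)
print(flags[45], flags[55])
```

Because the fences are set from the preceding quiet days, the method removes slow solar and geomagnetic modulation while still flagging day-scale departures of either sign.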

  14. Satellite relay telemetry of seismic data in earthquake prediction and control

    USGS Publications Warehouse

    Jackson, Wayne H.; Eaton, Jerry P.

    1971-01-01

The Satellite Telemetry Earthquake Monitoring Program was started in FY 1968 to evaluate the applicability of satellite relay telemetry in the collection of seismic data from a large number of dense seismograph clusters laid out along the major fault systems of western North America. Prototype clusters utilizing phone-line telemetry were then being installed by the National Center for Earthquake Research (NCER) in 3 regions along the San Andreas fault in central California; and the experience of installing and operating the clusters and of reducing and analyzing the seismic data from them was to provide the raw materials for evaluation in the satellite relay telemetry project.

  15. Deep earthquakes

    SciTech Connect

    Frohlich, C.

    1989-01-01

Earthquakes are often recorded at depths as great as 650 kilometers or more. These deep events mark regions where plates of the earth's surface are consumed in the mantle. But the earthquakes themselves present a conundrum: the high pressures and temperatures at such depths should keep rock from fracturing suddenly and generating a tremor. This paper reviews the research on this problem. Almost all deep earthquakes conform to the pattern described by Wadati, namely, they generally occur at the edge of a deep ocean and define an inclined zone extending from near the surface to a depth of 600 kilometers or more, known as the Wadati-Benioff zone. Several scenarios that have been proposed to explain the fracturing and slipping of rocks at these depths are described.

  16. Ground-water-level monitoring for earthquake prediction; a progress report based on data collected in Southern California, 1976-79

    USGS Publications Warehouse

    Moyle, W.R., Jr.

    1980-01-01

The U.S. Geological Survey is conducting a research program to determine if groundwater-level measurements can be used for earthquake prediction. Earlier studies suggest that water levels in wells may be responsive to small strains on the order of 10^-8 to 10^-10 (dimensionless). Water-level data being collected in the area of the southern California uplift show response to earthquakes and other natural and manmade effects. The data are presently (1979) being made ready for computer analysis. The completed analysis may indicate the presence of precursory earthquake information. (USGS)

  17. Groundwater monitoring program plan and conceptual site model for the Al-Tuwaitha Nuclear Research Center in Iraq.

    SciTech Connect

    Copland, John Robin; Cochran, John Russell

    2013-07-01

The Radiation Protection Center of the Iraqi Ministry of Environment is developing a groundwater monitoring program (GMP) for the Al-Tuwaitha Nuclear Research Center located near Baghdad, Iraq. The Al-Tuwaitha Nuclear Research Center was established in about 1960 and is currently being cleaned up and decommissioned by Iraq's Ministry of Science and Technology. This Groundwater Monitoring Program Plan (GMPP) and Conceptual Site Model (CSM) support the Radiation Protection Center by providing: a CSM describing the hydrogeologic regime and contaminant issues; recommendations for future groundwater characterization activities; and descriptions of the organizational elements of a groundwater monitoring program. The Conceptual Site Model identifies a number of potential sources of groundwater contamination at Al-Tuwaitha. The model also identifies two water-bearing zones (a shallow groundwater zone and a regional aquifer). The depth to the shallow groundwater zone varies from approximately 7 to 10 meters (m) across the facility. The shallow groundwater zone is composed of a layer of silty sand and fine sand that does not extend laterally across the entire facility. An approximately 4-m thick layer of clay underlies the shallow groundwater zone. The depth to the regional aquifer varies from approximately 14 to 17 m across the facility. The regional aquifer is composed of interfingering layers of silty sand, fine-grained sand, and medium-grained sand. Based on the limited analyses described in this report, there is no severe contamination of the groundwater at Al-Tuwaitha with radioactive constituents. However, significant data gaps exist, and this plan recommends installing additional groundwater monitoring wells and conducting additional types of radiological and chemical analyses.

  18. Cooperative Monitoring Center Occasional Paper/11: Cooperative Environmental Monitoring in the Coastal Regions of India and Pakistan

    SciTech Connect

    Rajen, Gauray

    1999-06-01

    The cessation of hostilities between India and Pakistan is an immediate need and of global concern, as these countries have tested nuclear devices, and have the capability to deploy nuclear weapons and long-range ballistic missiles. Cooperative monitoring projects among neighboring countries in South Asia could build regional confidence, and, through gradual improvements in relations, reduce the threat of war and the proliferation of weapons of mass destruction. This paper discusses monitoring the trans-border movement of flow and sediment in the Indian and Pakistani coastal areas. Through such a project, India and Pakistan could initiate greater cooperation, and engender movement towards the resolution of the Sir Creek territorial dispute in their coastal region. The Joint Working Groups dialogue being conducted by India and Pakistan provides a mechanism for promoting such a project. The proposed project also falls within a regional framework of cooperation agreed to by several South Asian countries. This framework has been codified in the South Asian Seas Action Plan, developed by Bangladesh, India, Maldives, Pakistan and Sri Lanka. This framework provides a useful starting point for Indian and Pakistani cooperative monitoring in their trans-border coastal area. The project discussed in this paper involves computer modeling, the placement of in situ sensors for remote data acquisition, and the development of joint reports. Preliminary computer modeling studies are presented in the paper. These results illustrate the cross-flow connections between Indian and Pakistani coastal regions and strengthen the argument for cooperation. Technologies and actions similar to those suggested for the coastal project are likely to be applied in future arms control and treaty verification agreements. The project, therefore, serves as a demonstration of cooperative monitoring technologies. 
The project will also increase people-to-people contacts among Indian and Pakistani policy makers and scientists. In the perceptions of the general public, the project will crystallize the idea that the two countries share ecosystems and natural resources, and have a vested interest in increased collaboration.

  19. Earthquake Education and Outreach in Haiti

    USGS Multimedia Gallery

    Following the devastating 2010 Haiti earthquake, the USGS has been helping with earthquake awareness and monitoring in the country, with continued support from the U.S. Agency for International Development (USAID). This assistance has helped the Bureau des Mines et de l'Energie (BME) in Port-au-Prin...

  20. Earthquakes in Arkansas and vicinity 1699-2010

    USGS Publications Warehouse

    Dart, Richard L.; Ausbrooks, Scott M.

    2011-01-01

This map summarizes approximately 300 years of earthquake activity in Arkansas. It is one in a series of similar State earthquake history maps. Work on the Arkansas map was done in collaboration with the Arkansas Geological Survey. The earthquake data plotted on the map are from several sources: the Arkansas Geological Survey, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Mississippi Department of Environmental Quality. In addition to earthquake locations, other materials presented include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Arkansas and parts of adjacent states. Arkansas has undergone a number of significant felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931, and a magnitude 4.3 earthquake in 1967. The map shows all historical and instrumentally located earthquakes in Arkansas and vicinity between 1811 and 2010. The largest historic earthquake in the vicinity of the State was an intensity XI event on December 16, 1811, the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region.

  1. A National Tracking Center for Monitoring Shipments of HEU, MOX, and Spent Nuclear Fuel: How do we implement?

    SciTech Connect

    Mark Schanfein

    2009-07-01

Nuclear material safeguards specialists and instrument developers at US Department of Energy (USDOE) National Laboratories in the United States, sponsored by the National Nuclear Security Administration (NNSA) Office of NA-24, have been developing devices to monitor shipments of UF6 cylinders and other radioactive materials. Tracking devices are being developed that are capable of monitoring shipments of valuable radioactive materials in real time, using the Global Positioning System (GPS). We envision that such devices will be extremely useful, if not essential, for monitoring the shipment of these important cargoes of nuclear material, including highly-enriched uranium (HEU), mixed plutonium/uranium oxide (MOX), spent nuclear fuel, and, potentially, other large radioactive sources. To ensure nuclear material security and safeguards, it is extremely important to track these materials because they contain so-called “direct-use material,” material that, if diverted and processed, could potentially be used to develop clandestine nuclear weapons. Large sources could also be used for a dirty bomb, known as a radioactive dispersal device (RDD). For that matter, any interdiction by an adversary, regardless of intent, demands a rapid response. To make the fullest use of such tracking devices, we propose a National Tracking Center. This paper describes the attributes of such a center and how it could ultimately be the prototype for an International Tracking Center, possibly to be based in Vienna at the International Atomic Energy Agency (IAEA).
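As a sketch of one building block such a tracking center would need, the code below computes great-circle distances between GPS fixes and raises a simple route-deviation flag. The waypoints and tolerance are invented for illustration; an operational system would use route corridors, dwell-time rules, and authenticated telemetry rather than this toy check.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two GPS fixes (mean Earth radius)."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def off_route(fix, waypoints, tolerance_km=25.0):
    """True if a GPS fix is farther than tolerance_km from every planned waypoint
    (a simplistic deviation alarm for a shipment tracking sketch)."""
    return all(haversine_km(*fix, *w) > tolerance_km for w in waypoints)

route = [(39.74, -105.0), (40.0, -104.0), (40.3, -103.0)]  # hypothetical waypoints
print(off_route((40.02, -104.03), route))   # a few km from a waypoint -> False
print(off_route((41.5, -104.0), route))     # well over 100 km off -> True
```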

  2. 77 FR 53225 - National Earthquake Prediction Evaluation Council (NEPEC)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-31

... Geological Survey National Earthquake Prediction Evaluation Council (NEPEC) AGENCY: Department of the... National Earthquake Prediction Evaluation Council (NEPEC) will hold a 1œ-day meeting on September 17 and 18, 2012, at the U.S. Geological Survey National Earthquake Information Center (NEIC),...

  3. Earthquake Shaking - Finding the "Hot Spots"

    USGS Publications Warehouse

    Field, Ned; Jones, Lucile; Jordan, Tom; Benthien, Mark; Wald, Lisa

    2001-01-01

    A new Southern California Earthquake Center study has quantified how local geologic conditions affect the shaking experienced in an earthquake. The important geologic factors at a site are softness of the rock or soil near the surface and thickness of the sediments above hard bedrock. Even when these 'site effects' are taken into account, however, each earthquake exhibits unique 'hotspots' of anomalously strong shaking. Better predictions of strong ground shaking will therefore require additional geologic data and more comprehensive computer simulations of individual earthquakes.

  4. CTEPP NC DATA COLLECTED ON FORM 05: CHILD DAY CARE CENTER PRE-MONITORING QUESTIONNAIRE

    EPA Science Inventory

    This data set contains data concerning the potential sources of pollutants at the day care center including the chemicals that have been applied in the past at the day care center by staff members or by commercial contractors. The day care teacher was asked questions related to t...

  5. CTEPP-OH DATA COLLECTED ON FORM 05: CHILD DAY CARE CENTER PRE-MONITORING QUESTIONNAIRE

    EPA Science Inventory

    This data set contains data for CTEPP-OH concerning the potential sources of pollutants at the day care center including the chemicals that have been applied in the past at the day care center by staff members or by commercial contractors. The day care teacher was asked questions...

  6. Aftershocks series monitoring of the September 18, 2004 M = 4.6 earthquake at the western Pyrenees: A case of reservoir-triggered seismicity?

    NASA Astrophysics Data System (ADS)

    Ruiz, M.; GaspĂ , O.; Gallart, J.; DĂ­az, J.; Pulgar, J. A.; GarcĂ­a-Sansegundo, J.; LĂłpez-FernĂĄndez, C.; GonzĂĄlez-Cortina, J. M.

    2006-10-01

    On September 18, 2004, a 4.6 mbLg earthquake was widely felt in the region around Pamplona, in the western Pyrenees. Preliminary locations reported an epicenter less than 20 km ESE of Pamplona, close to the Itoiz reservoir, which started impounding in January 2004. The area apparently lacks significant seismic activity in recent times. After the main shock, which was preceded by a series of foreshocks reaching magnitudes of 3.3 mbLg, a dense temporary network of 13 seismic stations was deployed to monitor the aftershock series and constrain the hypocentral pattern. Aftershock determinations obtained with a double-difference algorithm define a narrow epicentral zone of less than 10 km², oriented ESE-WNW. The events are mainly concentrated between 3 and 9 km depth. Focal solutions were computed for the main event and 12 aftershocks, including the largest secondary event of 3.8 mbLg. They show mainly normal faulting with some strike-slip component, with one of the nodal planes oriented NW-SE and dipping to the NE. Cross-correlation techniques, applied to detect and associate events with similar waveforms, yielded up to 33 families relating 67% of the 326 relocated aftershocks. The families show event clusters grouped by period and migrating from NW to SE. Interestingly, the narrow epicentral zone inferred here is located less than 4 km from the 111-m-high Itoiz dam. These hypocentral results, and the correlation observed between fluctuations of the reservoir water level and the seismic activity, favour the interpretation of this foreshock-aftershock series as a rapid-response case of reservoir-triggered seismicity, triggered by the first impoundment of the Itoiz reservoir. The region is folded and affected by shallow-dipping thrusts, and the Itoiz reservoir is located on the hangingwall of a low-angle, southward-verging thrust, which may be sensitive to water-level fluctuations. However, continued seismic monitoring in the coming years is needed in this area to support more reliable seismotectonic and hazard assessments.
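    The waveform cross-correlation technique that groups similar aftershocks into families can be sketched as follows. This is a minimal illustration, not the authors' code: the greedy grouping against the first event of each family and the 0.8 similarity threshold are assumptions.

```python
import numpy as np

def max_norm_xcorr(a, b):
    """Maximum normalized cross-correlation over all lags for equal-length waveforms."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return float(np.max(np.correlate(a, b, mode="full")))

def group_into_families(waveforms, threshold=0.8):
    """Greedy grouping: an event joins the first family whose master waveform it matches."""
    families = []  # each family is a list of event indices; families[i][0] is its master
    for i, w in enumerate(waveforms):
        for fam in families:
            if max_norm_xcorr(waveforms[fam[0]], w) >= threshold:
                fam.append(i)
                break
        else:
            families.append([i])  # no match: start a new family with this event
    return families
```

    Events in the same family then share near-identical source locations and mechanisms, which is what lets the migration of clusters through time be tracked.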

  7. The parkfield, california, earthquake prediction experiment.

    PubMed

    Bakun, W H; Lindh, A G

    1985-08-16

    Five moderate (magnitude 6) earthquakes with similar features have occurred on the Parkfield section of the San Andreas fault in central California since 1857. The next moderate Parkfield earthquake is expected to occur before 1993. The Parkfield prediction experiment is designed to monitor the details of the final stages of the earthquake preparation process; observations and reports of seismicity and aseismic slip associated with the last moderate Parkfield earthquake in 1966 constitute much of the basis of the design of the experiment. PMID:17739363

  8. Earthquake tectonics

    SciTech Connect

    Steward, R.F.

    1991-02-01

    Earthquakes release a tremendous amount of energy into the subsurface in the form of seismic waves. The seismic wave energy of the 1906 San Francisco (M = 8.2) earthquake was equivalent to over 8 billion tons of TNT (3.3 × 10Âč⁹ joules). Four basic wave types are propagated from seismic sources, two non-rotational and two rotational. As opposed to the non-rotational P and SH waves, the rotational compressional (RC) and rotational shear (RS) waves carry the bulk of the energy from a seismic source. RC wavefronts propagate in the subsurface and refract similarly to P waves, but are considerably slower. RC waves are critically refracted beneath the air-surface interface at velocities less than the velocity of sound in air, because they refract at the velocity of sound in air minus the retrograde particle velocity at the top of the wave. They propagate as tsunami waves in the open ocean, and produce loud sounds on land that are heard by humans and animals during earthquakes. The energy of the RS wave dwarfs that of the P, SH, and even the RC wave. The RS wave is the same as what is currently called the S wave in earthquake seismology, and produces both folding and strike-slip faulting at considerable distances from the epicenter. RC and RS waves, propagated during earthquakes from the Santa Ynez fault and a right-slip fault on trend with the Red Mountain fault, produced the Santa Ynez Mountains in California beginning in the middle Pliocene and continuing until the present.

  9. The EM Earthquake Precursor

    NASA Astrophysics Data System (ADS)

    Jones, K. B., II; Saxton, P. T.

    2013-12-01

    Many attempts have been made to determine a sound forecasting method for earthquakes and to warn the public in turn. Presently, the animal kingdom leads the precursor list, alluding to a transmission-related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After the 1989 Loma Prieta earthquake, American earthquake investigators predetermined magnetometer use and a minimum earthquake magnitude necessary for EM detection. This action was set in motion due to the extensive damage incurred and public outrage concerning earthquake forecasting; however, the magnetometers employed, grounded or buried, are completely subject to static and electric fields and have yet to correlate with an identifiable precursor. Secondly, there is neither a networked array for finding any epicentral locations, nor have there been any attempts to find even one. This methodology needs dismissal, because it is overly complicated, subject to continuous change, and provides no response time. As for the minimum magnitude threshold, which was set at M5, this is simply higher than what modern technological advances have gained. Detection can now be achieved at approximately M1, which greatly improves forecasting chances. A propagating precursor has now been detected in both the field and the laboratory. Field antenna testing conducted outside the NE Texas town of Timpson in February 2013 detected three strong EM sources along with numerous weaker signals. The antenna had mobility, and observations were noted for recurrence, duration, and frequency response. Next, two directional techniques were employed, resulting in three mapped potential epicenters. The remaining, weaker signals presented similar directionality results pointing to more epicentral locations. In addition, the directional results of the Timpson field tests led to the design and construction of a third prototype antenna. In a laboratory setting, experiments were created to fail igneous rock types within a custom-designed Faraday cage. An antenna emplaced within the cage detected EM emissions that were both reproducible and distinct, and the laboratory results paralleled the field results. With a viable system and continuous monitoring, a fracture cycle could be established and observed in real time. Sequentially, field data would be reviewed quickly for assessment, leading to a much-improved earthquake forecasting capability. The EM precursor determined by this method may surpass all prior precursor claims, and the general public will finally receive long-overdue forecasting.

  10. Cooperative Monitoring Center Occasional Paper/4: Missile Control in South Asia and the Role of Cooperative Monitoring Technology

    SciTech Connect

    Kamal, N.; Sawhney, P.

    1998-10-01

    The succession of nuclear tests by India and Pakistan in May 1998 has changed the nature of their missile rivalry, which is only one of numerous manifestations of their relationship as hardened adversaries, deeply sensitive to each other's existing and evolving defense capabilities. The political context surrounding this costly rivalry remains unmediated by arms control measures or by any nascent prospect of detente. As a parallel development, sensible voices in both countries will continue to talk of building mutual confidence through openness to avert accidents, misjudgments, and misinterpretations. To facilitate a future peace process, this paper offers possible suggestions for stabilization that could be applied to India's and Pakistan's missile situation. Appendices include descriptions of existing missile agreements that have contributed to better relations for other countries as well as a list of the cooperative monitoring technologies available to provide information useful in implementing subcontinent missile regimes.

  11. 2010 Chile Earthquake Aftershock Response

    NASA Astrophysics Data System (ADS)

    Barrientos, Sergio

    2010-05-01

    The Mw=8.8 earthquake off the coast of Chile on 27 February 2010 is the 5th largest megathrust earthquake ever to be recorded and provides an unprecedented opportunity to advance our understanding of megathrust earthquakes and associated phenomena. The 2010 Chile earthquake ruptured the Concepcion-Constitucion segment of the Nazca/South America plate boundary, south of the Central Chile region, and triggered a tsunami along the coast. Following the 2010 earthquake, a very energetic aftershock sequence is being observed over an area extending 600 km along strike, from Valparaiso to 150 km south of Concepcion. Within the first three weeks there were over 260 aftershocks with magnitude 5.0 or greater and 18 with magnitude 6.0 or greater (NEIC, USGS). The Concepcion-Constitucion segment lies immediately north of the rupture zone of the great magnitude-9.5 1960 Chile earthquake, and south of the 1906 and 1985 Valparaiso earthquakes. The last great subduction earthquake in the region dates back to the February 1835 event described by Darwin (1871). Since 1835, part of the region was affected in the north by the Talca earthquake of December 1928, interpreted as a shallow-dipping thrust event, and by the Chillan earthquake (Mw 7.9, January 1939), a slab-pull intermediate-depth earthquake. For the last 30 years, geodetic studies in this area were consistent with fully coupled elastic loading of the subduction interface at depth; this led to identifying the area as a mature seismic gap with potential for an earthquake of magnitude on the order of 8.5, or several earthquakes of lesser magnitude. What was less expected was the partial rupturing of the 1985 segment toward the north. Today, the 2010 earthquake raises some disturbing questions: Why and how did the rupture terminate where it did at the northern end? How did the 2010 earthquake load the adjacent segment to the north, and did the 1985 earthquake only partially rupture the plate interface, leaving asperities loaded since 1906? Since the number of M>7.0 aftershocks has been low, does the distribution of large-magnitude aftershocks differ from previous events of this size? What is the origin of the extensional-type aftershocks at shallow depths within the upper plate? The international seismological community (France, Germany, U.K., U.S.A.), in collaboration with the Chilean seismological community, responded with a total of 140 portable seismic stations to deploy to record aftershocks. Combined with the Chilean permanent seismic network, this results in 180 stations now operating in the area, recording continuously at 100 samples per second. The seismic equipment is a mix of accelerometers and short-period and broadband seismic sensors deployed along the entire length of the aftershock zone that will record the aftershock sequence for three to six months. The collected seismic data will be merged and archived to produce an international data set open to the entire seismological community immediately after archiving. Each international group will submit their data as soon as possible in standard (miniSEED) format with accompanying metadata to the IRIS DMC, where the data will be merged into a combined data set and made available to individuals and other data centers. This will be by far the best-recorded aftershock sequence of a large megathrust earthquake. This outstanding international collaboration will provide an open data set for this important earthquake as well as a model for future aftershock deployments around the world.

  12. The World Trade Center Attack: Similarities to the 1988 earthquake in Armenia: time to teach the public life-supporting first aid?

    PubMed Central

    Crippen, David

    2001-01-01

    On 7 December 1988, a severe earthquake hit Armenia, a former republic of the Soviet Union (USSR); on 11 September 2001, a manmade attack of similar impact hit New York City. These events share similar implications for the role of the uninjured survivor. With basic training, the uninjured survivors could save lives without tools or resuscitation equipment. This article makes the case for teaching life-supporting first aid to the public in the hope that one day, should another such incident occur, they would be able to preserve injured victims until formal rescue occurs. PMID:11737915

  13. Analysis of Instrumentation to Monitor the Hydrologic Performance of Green Infrastructure at the Edison Environmental Center

    EPA Science Inventory

    Infiltration is one of the primary functional mechanisms of green infrastructure stormwater controls, so this study explored selection and placement of embedded soil moisture and water level sensors to monitor surface infiltration and infiltration into the underlying soil for per...

  14. Cooperative Monitoring Center Occasional Paper/7: A Generic Model for Cooperative Border Security

    SciTech Connect

    Netzer, Colonel Gideon

    1999-03-01

    This paper presents a generic model for dealing with security problems along borders between countries. It presents descriptions and characteristics of various borders and identifies the threats to border security, while emphasizing cooperative monitoring solutions.

  15. Monitoring and Modeling of Ground Deformation at Three Sisters volcanic center, central Oregon Cascade Range, 1997-2009 (Invited)

    NASA Astrophysics Data System (ADS)

    Dzurisin, D.; Lisowski, M.; Wicks, C. W.

    2009-12-01

    Modeling of InSAR, GPS, and leveling data indicates that uplift of a broad area centered ~6 km west of the summit of South Sister volcano started in 1997 and is continuing at a declining rate. Surface displacements were measured every summer when possible since August 1992 with InSAR, annually since August 2001 using GPS and leveling surveys, and since May 2001 using continuous GPS. Our best-fit model to the deformation data is a vertical, prolate, spheroidal point-pressure source located 4.9-5.4 km below the surface. A more complicated source of this type that includes dip as a free model parameter does not improve the fit to data significantly, and other source types including tabular bodies (dike or sill) produce decidedly poorer results. The source inflation rate decreased exponentially during 2001-2006 with a 1/e decay time of 5.3 ± 1.1 years. The net increase in source volume from September 1997 to August 2006 was 36-42 × 10ⶠm³. A swarm of ~300 small (maximum magnitude 1.9) earthquakes occurred beneath the deforming area in March 2004; no other unusual seismicity has been noted. We attribute surface deformation to intrusion of magma, perhaps at the brittle-ductile transition in hot, thermally altered crust beneath the active Three Sisters volcanic center. Elastic models like those we investigated cannot distinguish between ongoing intrusion at a declining rate and viscoelastic response of the overlying crust and hydrothermal system to an intrusion that might have ended some time ago. Repeated gravity surveys that began in 2002 might help to resolve this ambiguity; gravity results through summer 2009 will be presented separately at this meeting. Similar deformation episodes in the past probably would have gone unnoticed if, as we suspect, most are caused by small intrusions that do not culminate in eruptions.
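    An exponentially decaying inflation rate with the reported 1/e decay time has a simple closed form for the cumulative volume change, which is what allows the net volume increase to be extrapolated. A sketch under those assumptions; `initial_rate` is a hypothetical free parameter, not a value given in the abstract.

```python
import math

TAU_YEARS = 5.3  # reported 1/e decay time for 2001-2006 (+/- 1.1 yr)

def inflation_rate(t_years, initial_rate):
    """Source volume rate under exponential decay: dV/dt = R0 * exp(-t / tau)."""
    return initial_rate * math.exp(-t_years / TAU_YEARS)

def cumulative_volume(t_years, initial_rate):
    """Integrated volume change from t = 0: V(t) = R0 * tau * (1 - exp(-t / tau))."""
    return initial_rate * TAU_YEARS * (1.0 - math.exp(-t_years / TAU_YEARS))
```

    As t grows large, `cumulative_volume` approaches the asymptote R0·τ, which is how a finite total intrusion volume is consistent with deformation that never quite stops.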

  16. Continuous Video Electroencephalographic (EEG) Monitoring for Electrographic Seizure Diagnosis in Neonates: A Single-Center Study.

    PubMed

    Wietstock, S O; Bonifacio, S L; Sullivan, J E; Nash, K B; Glass, H C

    2016-03-01

    The objective of this study was to determine the diagnostic yield of continuous video electroencephalographic (EEG) monitoring in critically ill neonates in the setting of a novel, university-based Neonatal Neurocritical Care Service. Patient demographic characteristics, indication for seizure monitoring, and presence of electrographic seizures were obtained by chart review. Among 595 patients cared for by the Neonatal Neurocritical Care Service, 400 (67%) received continuous video EEG. The median duration of continuous video EEG monitoring was 49 (interquartile range = 22-87) hours. Electrographic seizures were captured in 105 of 400 (26% of monitored patients) and of those, 25 of 105 (24%) had no clinical correlate. In addition, 52 of 400 subjects (13%) were monitored due to paroxysmal events concerning for seizures, but never had electrographic seizures. Continuous video EEG monitoring helped confirm or rule out ongoing seizures in more than one-third of the cases. This finding helps to support the use of continuous video EEG in critically ill neonates. PMID:26129976
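    The diagnostic-yield figures quoted in the abstract reduce to simple proportions of the monitored cohort. A sketch of that arithmetic; the function name and dictionary keys are illustrative only.

```python
def diagnostic_yield(n_monitored, n_with_seizures, n_subclinical):
    """Summary rates from continuous video EEG monitoring counts."""
    return {
        # fraction of monitored neonates with electrographic seizures
        "seizure_rate": n_with_seizures / n_monitored,
        # fraction of seizure patients whose seizures had no clinical correlate
        "subclinical_fraction": n_subclinical / n_with_seizures,
    }
```

    With the study's counts (400 monitored, 105 with seizures, 25 subclinical) this reproduces the reported 26% and 24%.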

  17. Great Sumatra Earthquake registers on electrostatic sensor

    NASA Astrophysics Data System (ADS)

    Röder, Helmut; Schuhmann, Wolfram; BĂŒttner, Ralf; Zimanowski, Bernard; Braun, Thomas; Boschi, Enzo

    Strong electrical signals corresponding to the Mw = 9.3 earthquake of 26 December 2004, which occurred at 0058:50.7 UTC off the west coast of northern Sumatra, Indonesia, were recorded by an electrostatic sensor (a device that detects short-term variations in Earth's electrostatic field) at a seismic station in Italy that had been installed to study the influence of local earthquakes on a new landslide monitoring system. Electrical signals arrived at the station practically instantaneously and were detected up to several hours before the onset of the Sumatra earthquake (Figure 1), as well as before local quakes. The corresponding seismic signals (P waves) arrived 740 seconds after the start of the earthquake. Because the electrical signals travel at the speed of light, electrical monitoring for the global detection of very strong earthquakes could be an important tool in significantly increasing the hazard alert window.
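    The alert window the abstract describes is just the difference between a speed-of-light travel time and a seismic travel time over the same path. A back-of-the-envelope sketch, not from the article: the 10 km/s average P velocity for a long teleseismic path is an assumption.

```python
C_KM_S = 299_792.458   # speed of light in km/s
P_WAVE_KM_S = 10.0     # assumed average P-wave velocity over a teleseismic path

def warning_window_s(distance_km, p_velocity_km_s=P_WAVE_KM_S):
    """Seconds between an EM arrival (~speed of light) and the P-wave arrival."""
    return distance_km / p_velocity_km_s - distance_km / C_KM_S
```

    The EM travel-time term is negligible (tens of milliseconds), so for a path of several thousand kilometers the window is essentially the full seismic travel time, consistent in order of magnitude with the 740-second P-wave delay reported.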

  18. Earthquakes in Mississippi and vicinity 1811-2010

    USGS Publications Warehouse

    Dart, Richard L.; Bograd, Michael B.E.

    2011-01-01

    This map summarizes two centuries of earthquake activity in Mississippi. Work on the Mississippi map was done in collaboration with the Mississippi Department of Environmental Quality, Office of Geology. The earthquake data plotted on the map are from several sources: the Mississippi Department of Environmental Quality, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Arkansas Geological Survey. In addition to earthquake locations, other materials include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Mississippi and parts of adjacent States. Mississippi has undergone a number of felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931, and a magnitude 4.3 earthquake in 1967. The map shows all historical and instrumentally located earthquakes in Mississippi and vicinity between 1811 and 2010. The largest historic earthquake in the vicinity of the State was an intensity XI event on December 16, 1811, the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region.

  19. Darwin's earthquake.

    PubMed

    Lee, Richard V

    2010-07-01

    Charles Darwin experienced a major earthquake in the Concepción-Valdivia region of Chile 175 years ago, in February 1835. His observations dramatically illustrated the geologic principles of James Hutton and Charles Lyell which maintained that the surface of the earth was subject to alterations by natural events, such as earthquakes, volcanoes, and the erosive action of wind and water, operating over very long periods of time. Changes in the land created new environments and fostered adaptations in life forms that could lead to the formation of new species. Without the demonstration of the accumulation of multiple crustal events over time in Chile, the biologic implications of the specific species of birds and tortoises found in the Galapagos Islands and the formulation of the concept of natural selection might have remained dormant. PMID:21038753

  20. Research on Earthquake Precursor in E-TEC: A Study on Land Surface Thermal Anomalies Using MODIS LST Product in Taiwan

    NASA Astrophysics Data System (ADS)

    Chang, W. Y.; Wu, M. C.

    2014-12-01

    Taiwan has been known as an excellent natural laboratory, characterized by a rapid active tectonic rate and dense seismicity. The Eastern Taiwan Earthquake Research Center (E-TEC) was established on 2013/09/24 at National Dong Hwa University and collaborates with the Central Weather Bureau (CWB), the National Center for Research on Earthquake Engineering (NCREE), the National Science and Technology Center for Disaster Reduction (NCDR), the Institute of Earth Science of Academia Sinica (IES, AS), and other institutions (NCU, NTU, CCU). It aims to provide an integrated platform for researchers to pursue new advances in earthquake precursors and early warning for seismic disaster prevention in eastern Taiwan, where temblors are frequent in the East Taiwan rift valley. E-TEC intends to integrate multi-disciplinary observations and is equipped with stations to monitor a wide array of potential quake precursors, including seismicity, GPS, strainmeters, ground water, geochemistry, gravity, electromagnetics, ionospheric density, thermal infrared remote sensing, gamma radiation, etc., and will maximize the value of the data for research aimed at predicting where and when the next devastating earthquake will strike Taiwan and at developing reliable earthquake prediction models. A preliminary study of an earthquake precursor using monthly Moderate Resolution Imaging Spectroradiometer (MODIS) Land Surface Temperature (LST) data before the 2013/03/27 Mw 6.2 Nantou earthquake in Taiwan is presented. Statistical analysis shows that the anomalous LST, exceeding one standard deviation, peaked on 2013/03/09, with few or no anomalies observed by 2013/03/16 before the main shock, consistent with phenomena observed by other researchers. This preliminary experimental result shows that thermal anomalies reveal the possibility of associating surface thermal phenomena with strong earthquakes before they occur.
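    The one-standard-deviation LST anomaly criterion can be sketched as a per-pixel test against a multi-year baseline. This is a minimal illustration, not the E-TEC processing chain; the stack layout and function name are assumptions.

```python
import numpy as np

def thermal_anomaly_mask(lst_stack, k=1.0):
    """
    lst_stack: array of shape (n_scenes, ny, nx) holding LST scenes for the
    same calendar month across several years. Returns a boolean mask marking,
    per scene, pixels that deviate from the per-pixel multi-year mean by more
    than k standard deviations.
    """
    mu = lst_stack.mean(axis=0)      # per-pixel climatological mean
    sigma = lst_stack.std(axis=0)    # per-pixel climatological spread
    return np.abs(lst_stack - mu) > k * sigma
```

    Counting flagged pixels per scene then gives a time series whose peaks can be compared against earthquake dates, as in the Nantou case study.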

  1. Advances in Earthquake Prediction Research and the June 2000 Earthquakes in Iceland

    NASA Astrophysics Data System (ADS)

    Stefansson, R.

    2006-12-01

    In June 2000, two earthquakes with magnitude 6.6 (Ms) occurred in the central part of the South Iceland seismic zone (SISZ). According to historical information, earthquakes in this region have in some cases caused the collapse of the majority of houses in areas encompassing 1,000 square kilometers of this relatively densely populated farming region. Because large earthquakes were expected to occur soon, much attention was given to preparedness in the region, and for the last two decades it has been the subject of multi-national, mainly European, co-operation in earthquake prediction research and in the development of a high-level micro-earthquake system: the SIL system. Despite intensive surface fissuring caused by the earthquakes and measured accelerations reaching 0.8 g, the earthquakes in 2000 caused no serious injuries and no structural collapse. The relatively minor destruction led to more optimism regarding the safety of living in the area, but also to some optimism about the significance of earthquake prediction research. Both earthquakes had a long-term prediction, and the second of the two had a short-term warning as to place, size, and immediacy. In this presentation, I will describe the warnings that were given ahead of the earthquakes. I will also reconsider these warnings in light of new results from multi-national earthquake prediction research in Iceland. This modeling work explains several observable patterns caused by crustal processes ahead of large earthquakes. Micro-seismic observations and modeling show that, in conditions prevailing in the Icelandic crust, fluids can be carried upward from the brittle-ductile boundary in response to strain, bringing high, near-lithostatic pore pressures into the brittle crust and preparing a region for the release of a large earthquake; monitoring this process will enable long- and short-term earthquake warnings.

  2. Establishment of Antakya Basin Strong Ground Motion Monitoring System

    NASA Astrophysics Data System (ADS)

    Durukal, E.; Özel, O.; Bikce, M.; Geneş, M. C.; Kaçın, S.; Erdik, M.; Safak, E.; Över, S.

    2009-04-01

    Turkey is located in one of the most active earthquake zones of the world. The cities located along the North Anatolian Fault (NAF) and the East Anatolian Fault (EAF) are exposed to significant earthquake hazard. The Hatay province, near the southern terminus of the EAF, has always experienced significant seismic activity, since it lies at the intersection of the northernmost segment of the Dead Sea Fault Zone, coming from the south, with the Cyprean Arc approaching from the south-west. Historical records extending over the last 2000 years indicate that Antakya, founded in the 3rd century B.C., is affected by intensity IX-X earthquakes every 150 years. The last destructive earthquake in the region occurred in 1872; destructive earthquakes similar to those of the past should be expected in the near future. The strong response of sedimentary basins to seismic waves was largely responsible for the damage produced by the devastating 1985 Michoacan earthquake, which severely damaged parts of Mexico City, and the 1988 Spitak earthquake, which destroyed most of Leninakan, Armenia. Much of this devastating response was explained by the conversion of seismic body waves to surface waves at the sediment/rock contacts of sedimentary basins. The "Antakya Basin Strong Ground Motion Monitoring System" was set up with the aims of monitoring the earthquake response of the Antakya Basin, contributing to our understanding of basin response, contributing to earthquake risk assessment of Antakya, monitoring regional earthquakes, and determining the effects of local and regional earthquakes on the urban environment of Antakya. The soil properties beneath the strong-motion stations (S-wave velocity structure and dominant soil frequency) are determined by array measurements involving broad-band seismometers. The strong-motion monitoring system consists of six instruments installed in small buildings. The stations form a straight line along the short axis of the Antakya basin, passing through the city center. They are equipped with acceleration sensors, GPS, and communication units, and operate in continuous recording mode. For on-line data transmission, the EDGE mode of available GSM systems is employed. In the array measurements for the determination of soil properties beneath the stations, two 4-seismometer sets were utilized. The system is the first monitoring installation in Turkey dedicated to understanding basin effects. The records obtained will allow visualization of the propagation of long-period ground motion in the basin and show the refraction of surface waves at the basin edge. The records will also serve to enhance our capacity to realistically synthesize strong ground motion in basin-type environments.
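    A common first-order link between the two soil properties the array measurements target, S-wave velocity structure and dominant soil frequency, is the quarter-wavelength rule f0 = Vs / (4H) for a single soft layer over bedrock. The rule itself is standard; its use here is our illustration, not the project's stated method.

```python
def fundamental_frequency_hz(vs_m_s, thickness_m):
    """Quarter-wavelength estimate of a soil layer's dominant frequency: f0 = Vs / (4 H)."""
    return vs_m_s / (4.0 * thickness_m)
```

    For example, a 50-m layer with Vs = 200 m/s resonates near 1 Hz, in the long-period band where basin amplification matters most for engineered structures.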

  3. Earthquake and tsunami forecasts: Relation of slow slip events to subsequent earthquake rupture

    PubMed Central

    Dixon, Timothy H.; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

    2014-01-01

    The 5 September 2012 Mw 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr–Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential. PMID:25404327
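    The Mohr-Coulomb failure stress change invoked above is conventionally computed as ΔCFS = Δτ + ÎŒâ€Č·Δσn. A sketch of that formula only; the sign convention (positive normal stress change = unclamping) and the ÎŒâ€Č = 0.4 default are common assumptions in the Coulomb-stress literature, not values from this paper.

```python
def coulomb_stress_change(delta_shear, delta_normal, mu_eff=0.4):
    """
    Change in Coulomb failure stress on a receiver fault:
        dCFS = d_tau + mu' * d_sigma_n
    delta_shear:  shear stress change in the slip direction (positive promotes slip)
    delta_normal: normal stress change (positive = unclamping)
    mu_eff:       effective friction coefficient (assumed default 0.4)
    Positive dCFS brings the fault closer to failure.
    """
    return delta_shear + mu_eff * delta_normal
```

    The paper's point is that the stress changes from the pre-event SSE were evidently too small a ΔCFS to trigger the mainshock on their own.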

  4. Earthquake and tsunami forecasts: relation of slow slip events to subsequent earthquake rupture.

    PubMed

    Dixon, Timothy H; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

    2014-12-01

    The 5 September 2012 Mw 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr-Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential. PMID:25404327

  5. Earthquake Information System

    NASA Technical Reports Server (NTRS)

    1991-01-01

    IAEMIS (Integrated Automated Emergency Management Information System) is the principal tool of an earthquake preparedness program developed by Martin Marietta and the Mid-America Remote Sensing Center (MARC). It is a two-component set of software, data and procedures to provide information enabling management personnel to make informed decisions in disaster situations. The NASA-developed program ELAS, originally used to analyze Landsat data, provides MARC with a spatially-oriented information management system. Additional MARC projects include land resources management, and development of socioeconomic data.

  6. Earthquakes and plate tectonics.

    USGS Publications Warehouse

    Spall, H.

    1982-01-01

    Earthquakes occur at three kinds of plate boundary: ocean ridges where the plates are pulled apart, margins where the plates scrape past one another, and margins where one plate is thrust under the other. Thus, we can predict the general regions on the earth's surface where we can expect large earthquakes in the future. We know that each year about 140 earthquakes of magnitude 6 or greater will occur within this area, which is 10% of the earth's surface. But on a worldwide basis we cannot say with much accuracy when these events will occur. The reason is that the processes in plate tectonics have been going on for millions of years. Averaged over this interval, plate motions amount to several mm per year. But at any instant in geologic time, for example the year 1982, we do not know exactly where we are in the worldwide cycle of strain build-up and strain release. Only by monitoring the stress and strain in small areas, for instance the San Andreas fault, in great detail can we hope to predict when renewed activity in that part of the plate tectonics arena is likely to take place. -from Author
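    The global rate quoted above (~140 earthquakes of magnitude 6 or greater per year) can be placed on the empirical Gutenberg-Richter relation, log10 N = a - b*M. A minimal sketch, assuming a typical global b-value of 1.0 (the b-value and the calibration are illustrative assumptions, not figures from the abstract):

```python
import math

# Gutenberg-Richter relation: log10 N = a - b*M, where N is the annual number
# of earthquakes with magnitude >= M. Calibrate `a` so that N(6) matches the
# ~140 events/yr quoted in the abstract, assuming b = 1.0 (illustrative).
B_VALUE = 1.0
A_VALUE = math.log10(140) + B_VALUE * 6.0

def annual_rate(magnitude, a=A_VALUE, b=B_VALUE):
    """Expected annual number of earthquakes with magnitude >= `magnitude`."""
    return 10 ** (a - b * magnitude)

print(f"N(M>=6) = {annual_rate(6.0):.0f}/yr")  # recovers the quoted ~140
print(f"N(M>=7) = {annual_rate(7.0):.0f}/yr")  # tenfold fewer when b = 1
```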

  7. Defeating Earthquakes

    NASA Astrophysics Data System (ADS)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question, how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake.
GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by our actions. Using these global datasets will help to make the model as uniform as possible. The model must be built by scientists in the affected countries with GEM's support, augmented by their insights and data. The model will launch in 2014; to succeed it must be open, international, independent, and continuously tested. But the mission of GEM is not just the likelihood of ground shaking, but also gauging the economic and social consequences of earthquakes, which greatly amplify the losses. For example, should the municipality of Istanbul retrofit schools, or increase its insurance reserves and recovery capacity? Should a homeowner in a high-risk area move or strengthen her building? This is why GEM is a public-private partnership. GEM's fourteen public sponsors and eight non-governmental organization members are standing for the developing world. To extend GEM into the financial world, we draw upon the expertise of companies. GEM's ten private sponsors have endorsed the acquisition of public knowledge over private gain. In a competitive world, this is a courageous act. GEM is but one link in a chain of preparedness: from earth science and engineering research, through groups like GEM, to mitigation, retrofit or relocate decisions, building codes and insurance, and finally to prepared hospitals, schools, and homes. But it is a link that our community can make strong.

  8. Patient-Centered Technological Assessment and Monitoring of Depression for Low-Income Patients

    PubMed Central

    Wu, Shinyi; Vidyanti, Irene; Liu, Pai; Hawkins, Caitlin; Ramirez, Magaly; Guterman, Jeffrey; Gross-Schulman, Sandra; Sklaroff, Laura Myerchin; Ell, Kathleen

    2014-01-01

    Depression is a significant challenge for ambulatory care because it worsens health status and outcomes, increases health care utilization and costs, and elevates suicide risk. An automatic telephonic assessment (ATA) system that links with tasks and alerts to providers may improve quality of depression care and increase provider productivity. We used the ATA system in a trial to assess and monitor depressive symptoms of 444 safety-net primary care patients with diabetes. We assessed system properties, evaluated preliminary clinical outcomes, and estimated cost savings. The ATA system is feasible, reliable, valid, safe, and likely cost-effective for depression screening and monitoring in a low-income primary care population. PMID:24525531

  9. Upgrading the Digital Electronics of the PEP-II Bunch Current Monitors at the Stanford Linear Accelerator Center

    SciTech Connect

    Kline, Josh; /SLAC

    2006-08-28

    The testing of the upgrade prototype for the bunch current monitors (BCMs) in the PEP-II storage rings at the Stanford Linear Accelerator Center (SLAC) is the topic of this paper. Bunch current monitors are used to measure the charge in the electron/positron bunches traveling in particle storage rings. The BCMs in the PEP-II storage rings need to be upgraded because components of the current system have failed and are known to be failure prone with age, and several of the integrated chips are no longer produced, making repairs difficult if not impossible. The main upgrade is replacing twelve old (1995) field programmable gate arrays (FPGAs) with a single Virtex II FPGA. The prototype was tested using computer synthesis tools, a commercial signal generator, and a fast pulse generator.

  10. User-centered development and testing of a monitoring system that provides feedback regarding physical functioning to elderly people

    PubMed Central

    Vermeulen, Joan; Neyens, Jacques CL; Spreeuwenberg, Marieke D; van Rossum, Erik; Sipers, Walther; Habets, Herbert; Hewson, David J; de Witte, Luc P

    2013-01-01

    Purpose To involve elderly people during the development of a mobile interface of a monitoring system that provides feedback to them regarding changes in physical functioning and to test the system in a pilot study. Methods and participants The iterative user-centered development process consisted of the following phases: (1) selection of user representatives; (2) analysis of users and their context; (3) identification of user requirements; (4) development of the interface; and (5) evaluation of the interface in the lab. Subsequently, the monitoring and feedback system was tested in a pilot study by five patients who were recruited via a geriatric outpatient clinic. Participants used a bathroom scale to monitor weight and balance, and a mobile phone to monitor physical activity on a daily basis for six weeks. Personalized feedback was provided via the interface of the mobile phone. Usability was evaluated on a scale from 1 to 7 using a modified version of the Post-Study System Usability Questionnaire (PSSUQ); higher scores indicated better usability. Interviews were conducted to gain insight into the experiences of the participants with the system. Results The developed interface uses colors, emoticons, and written and/or spoken text messages to provide daily feedback regarding (changes in) weight, balance, and physical activity. The participants rated the usability of the monitoring and feedback system with a mean score of 5.2 (standard deviation 0.90) on the modified PSSUQ. The interviews revealed that most participants liked using the system and appreciated that it signaled changes in their physical functioning. However, usability was negatively influenced by a few technical errors. Conclusion Involvement of elderly users during the development process resulted in an interface with good usability. However, the technical functioning of the monitoring system needs to be optimized before it can be used to support elderly people in their self-management. 
PMID:24039407

  11. MONITORING TOXIC ORGANIC GASES AND PARTICLES NEAR THE WORLD TRADE CENTER AFTER SEPTEMBER 11, 2001

    EPA Science Inventory

    The September 11, 2001 attack on the World Trade Center (WTC) resulted in an intense fire and the subsequent, complete collapse of the two main structures and adjacent buildings, as well as significant damage to many surrounding buildings within and around the WTC complex. Thi...

  12. CTEPP DATA COLLECTION FORM 05: CHILD DAY CARE CENTER PRE-MONITORING QUESTIONNAIRE

    EPA Science Inventory

    This data collection form is used to identify the potential sources of pollutants at the day care center. The day care teacher is asked questions related to the age of their day care building; age and frequency of cleaning carpets or rugs; types of heating and air conditioning de...

  13. Hatfield Marine Science Center Dynamic Revetment Project DSL Permit # 45455-FP. Monitoring Report. February, 2016

    EPA Science Inventory

    Stabilization of the Yaquina Bay shoreline along the northeastern edge of the Hatfield Marine Science Center (HMSC) campus became necessary to halt erosion that threatened both HMSC critical infrastructure (seawater storage tank) and public access to the HMSC Nature Trail. A Dyn...

  14. Hatfield Marine Science Center Dynamic Revetment Project DSL permit # 45455-FP, Monitoring Report February, 2015

    EPA Science Inventory

    Stabilization of the Yaquina Bay shoreline along the northeastern edge of the Hatfield Marine Science Center (HMSC) campus became necessary to halt erosion that threatened both HMSC critical infrastructure (seawater storage tank) and public access to the HMSC Nature Trail. A Dyn...

  15. Hatfield Marine Science Center Dynamic Revetment Project DSL Permit # 45455-FP. Monitoring Report. February, 2014.

    EPA Science Inventory

    Stabilization of the Yaquina Bay shoreline along the northeastern edge of the Hatfield Marine Science Center (HMSC) campus became necessary to halt erosion that threatened both HMSC critical infrastructure (seawater storage tank) and public access to the HMSC Nature Trail. A Dyn...

  16. CTEPP DATA COLLECTION FORM 07: CHILD DAY CARE CENTER POST-MONITORING

    EPA Science Inventory

    This data collection form is used to provide information on the child's daily activities and potential exposures to pollutants at their homes. It includes questions on chemicals applied and cigarettes smoked at the home over the 48-hr monitoring period. It also collects informati...

  17. CTEPP-OH DATA COLLECTED ON FORM 07: CHILD DAY CARE CENTER POST-MONITORING

    EPA Science Inventory

    This data set contains data concerning the child’s daily activities and potential exposures to pollutants at their homes for CTEPP-OH. It included questions on chemicals applied and cigarettes smoked at the home over the 48-h monitoring period. It also collected information on th...

  18. CTEPP NC DATA COLLECTED ON FORM 07: CHILD DAY CARE CENTER POST-MONITORING

    EPA Science Inventory

    This data set contains data concerning the child’s daily activities and potential exposures to pollutants at their homes. It included questions on chemicals applied and cigarettes smoked at the home over the 48-h monitoring period. It also collected information on the child’s han...

  19. Monitoring of the permeable pavement demonstration site at Edison Environmental Center

    EPA Science Inventory

    The EPA’s Urban Watershed Management Branch has installed an instrumented, working full-scale 110-space pervious pavement parking lot and has been monitoring several environmental stressors and runoff. This parking lot demonstration site has allowed the investigation of differenc...

  2. EVALUATION OF ENVIROSCAN CAPACITANCE PROBES FOR MONITORING SOIL MOISTURE IN CENTER PIVOT IRRIGATED POTATOES

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Careful irrigation scheduling is the key to providing adequate water to minimize potential leaching losses below the rootzone, while supplying adequate water to minimize negative effects of water stress. Capacitance probes were used for real-time continuous monitoring of soil moisture content at va...

  3. Computer systems for automatic earthquake detection

    USGS Publications Warehouse

    Stewart, S.W.

    1974-01-01

    U.S. Geological Survey seismologists in Menlo Park, California, are utilizing the speed, reliability, and efficiency of minicomputers to monitor seismograph stations and to automatically detect earthquakes. An earthquake detection computer system, believed to be the only one of its kind in operation, automatically reports about 90 percent of all local earthquakes recorded by a network of over 100 central California seismograph stations. The system also monitors the stations for signs of malfunction or abnormal operation. Before the automatic system was put in operation, all of the earthquakes recorded had to be detected by manually searching the records, a time-consuming process. With the automatic detection system, the stations are efficiently monitored continuously.
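    The abstract does not describe the detection algorithm itself; a classic approach in automatic earthquake detection is the short-term-average/long-term-average (STA/LTA) trigger, sketched below. The window lengths, threshold, and synthetic trace are purely illustrative, and this is not necessarily the algorithm the 1974 USGS system used:

```python
# Sketch of an STA/LTA trigger: flag samples where the ratio of a short-term
# average of |amplitude| to a long-term average exceeds a threshold, which
# marks the sudden energy increase of an arriving seismic phase.

def sta_lta_trigger(samples, sta_len=10, lta_len=100, threshold=4.0):
    """Return indices where the STA/LTA ratio exceeds `threshold`."""
    triggers = []
    for i in range(lta_len, len(samples)):
        sta = sum(abs(x) for x in samples[i - sta_len:i]) / sta_len
        lta = sum(abs(x) for x in samples[i - lta_len:i]) / lta_len
        if lta > 0 and sta / lta >= threshold:
            triggers.append(i)
    return triggers

# Quiet background noise followed by a burst of high amplitude:
trace = [0.1] * 200 + [5.0] * 30 + [0.1] * 50
print(sta_lta_trigger(trace)[0])  # first sample index that trips the trigger
```

    Production detectors (e.g., in the ObsPy library) use recursive averages and trigger on/off thresholds, but the ratio test above is the core idea.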

  4. Long Period Earthquakes Beneath California's Young and Restless Volcanoes

    NASA Astrophysics Data System (ADS)

    Pitt, A. M.; Dawson, P. B.; Shelly, D. R.; Hill, D. P.; Mangan, M.

    2013-12-01

    The newly established USGS California Volcano Observatory has the broad responsibility of monitoring and assessing hazards at California's potentially threatening volcanoes, most notably Mount Shasta, Medicine Lake, Clear Lake Volcanic Field, and Lassen Volcanic Center in northern California; and Long Valley Caldera, Mammoth Mountain, and Mono-Inyo Craters in east-central California. Volcanic eruptions occur in California about as frequently as the largest San Andreas Fault Zone earthquakes: more than ten eruptions have occurred in the last 1,000 years, most recently at Lassen Peak (1666 C.E. and 1914-1917 C.E.) and Mono-Inyo Craters (c. 1700 C.E.). The Long Valley region (Long Valley caldera and Mammoth Mountain) underwent several episodes of heightened unrest over the last three decades, including intense swarms of volcano-tectonic (VT) earthquakes, rapid caldera uplift, and hazardous CO2 emissions. Both Medicine Lake and Lassen are subsiding at appreciable rates, and along with Clear Lake, Long Valley Caldera, and Mammoth Mountain, sporadically experience long period (LP) earthquakes related to migration of magmatic or hydrothermal fluids. Worldwide, the last two decades have shown the importance of tracking LP earthquakes beneath young volcanic systems, as they often provide an indication of impending unrest or eruption. Herein we document the occurrence of LP earthquakes at several of California's young volcanoes, updating a previous study published in Pitt et al., 2002, SRL. All events were detected and located using data from stations within the Northern California Seismic Network (NCSN). Event detection was spatially and temporally uneven across the NCSN in the 1980s and 1990s, but additional stations, adoption of the Earthworm processing system, and heightened vigilance by seismologists have improved the catalog over the last decade.
LP earthquakes are now relatively well-recorded under Lassen (~150 events since 2000), Clear Lake (~60 events), Mammoth Mountain (~320 events), and Long Valley Caldera (~40 events). LP earthquakes are notably absent under Mount Shasta. With the exception of Long Valley Caldera, where LP earthquakes occur at depths of ≤5 km, hypocenters are generally between 15 and 25 km. The rates of LP occurrence over the last decade have been relatively steady within the study areas, except at Mammoth Mountain, where years of gradually declining LP activity abruptly increased after a swarm of unusually deep (20 km) VT earthquakes in October 2012. Epicenter locations relative to the sites of most recent volcanism vary across volcanic centers, but most LP earthquakes fall within 10 km of young vents. Source models for LP earthquakes often involve the resonance of fluid-filled cracks or nonlinear flow of fluids along irregular cracks (reviewed in Chouet and Matoza, 2013, JVGR). At mid-crustal depths the relevant fluids are likely to be low-viscosity basaltic melt and/or exsolved CO2-rich volatiles (Lassen, Clear Lake, Mammoth Mountain). In the shallow crust, however, hydrothermal waters/gases are likely involved in the generation of LP seismicity (Long Valley Caldera).

  5. RST (Robust Satellite Techniques) analysis for monitoring earth emitted radiation at the time of the Hector Mine 16th October 1999 earthquake

    NASA Astrophysics Data System (ADS)

    Lisi, M.; Filizzola, C.; Genzano, N.; Mazzeo, G.; Pergola, N.; Tramutoli, V.

    2009-12-01

    Several studies have been performed, in past years, reporting the appearance of space-time anomalies in TIR satellite imagery, from weeks to days, before severe earthquakes. Different authors, in order to explain the appearance of anomalously high TIR records near the place and time of earthquake occurrence, attributed their appearance to the increase of greenhouse gas (such as CO2, CH4, etc.) emission rates, to the modification of the ground water regime, and/or to the increase of convective heat flux. Among the others, a Robust Satellite data analysis Technique (RST), based on the RAT (Robust AVHRR Techniques, where AVHRR is the Advanced Very High Resolution Radiometer) approach, was proposed to investigate possible relations between earthquake occurrence and space-time fluctuations of Earth’s emitted TIR radiation observed from satellite. The RST analysis is based on a statistical definition of “TIR anomalies” allowing their identification even in very different natural (e.g., related to atmosphere and/or surface) and observational (e.g., related to time/season, but also to solar and satellite zenithal angles) conditions. The correlation analysis (in the space-time domain) with earthquake occurrence is always carried out using a validation/confutation approach, in order to verify the presence/absence of anomalous space-time TIR transients in the presence/absence of significant seismic activity. The RST approach has already been tested on tens of earthquakes that occurred on different continents (Europe, Asia, America, and Africa), in various geo-tectonic settings (compressive, extensional, and transcurrent) and with a wide range of magnitudes (from 4.0 to 7.9).
In this paper, the results of RST analysis performed over 7 years of TIR satellite records collected over the western part of the United States of America at the time of the Hector Mine earthquake (16th October 1999, M 7.1) are presented and compared with an identical (confutation) analysis performed in different years (characterized by the absence of earthquakes of similar magnitude over the same area), in order to verify the presence/absence of anomalous space-time TIR transients in both cases.
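    The statistical definition of a TIR anomaly in the RST literature is typically a normalized index comparing a current observation against multi-year reference statistics for the same pixel under comparable observation conditions. A simplified sketch follows; the brightness temperatures are hypothetical, and real RST processing adds cloud screening and more robust statistics:

```python
import statistics

# Simplified RST-style normalized TIR anomaly index for one pixel:
#   index = (T_current - mean_historical) / std_historical
# computed from multi-year records acquired under comparable observational
# conditions. Illustration only; not the authors' full processing chain.

def tir_anomaly_index(current, historical):
    """Return the normalized anomaly of `current` against `historical` records."""
    mu = statistics.mean(historical)
    sigma = statistics.stdev(historical)
    return (current - mu) / sigma

# Seven hypothetical years of brightness temperature (K) for one pixel:
history = [291.0, 292.5, 290.8, 291.7, 292.1, 291.3, 291.9]
index = tir_anomaly_index(295.0, history)
print(f"anomaly index = {index:.1f}")  # values above ~2-3 flag a candidate anomaly
```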

  6. Feasibility of breath monitoring in patients undergoing elective colonoscopy under propofol sedation: A single-center pilot study

    PubMed Central

    Anand, Gurpreet W; Heuss, Ludwig T

    2014-01-01

    AIM: To determine whether a newly developed respiratory rate monitor can practically and accurately monitor ventilation under propofol sedation in combination with standard monitoring. METHODS: Patients [American Society of Anesthesiologists (ASA) Classification I-III] scheduled for elective colonoscopy under propofol sedation were monitored with a new device that measures the respiratory rate based on humidity in expired air. Patients with clinically significant cardiac disorders or pulmonary disease and patients requiring emergency procedures were excluded from study participation. All of the patients also received standard monitoring with pulse oximetry. This was a single-center study conducted in a community hospital in Switzerland. After obtaining written informed consent from all subjects, 76 patients (51 females and 25 males) were monitored during colonoscopy under propofol sedation. The primary endpoint was the occurrence of any respiratory event (apnea or hypopnea). Apnea was defined as the cessation of breathing for a minimum of 10 s. Significant apnea was defined as the cessation of breathing for more than 30 s. Hypopnea was defined as a reduction in the respiratory rate below 6/min for a minimum of 10 s. Any cases of significant apnea triggered interventions by the endoscopy team. The interventions included withholding propofol, verbal stimulation of the patients, and increased oxygen supplementation or the chin lift maneuver. A secondary endpoint was the correlation of apnea or hypopnea with hypoxemia (measured as a decrease in SaO2 of at least 5% from baseline or less than 90%). RESULTS: At least one respiratory event was detected in thirty-seven patients (48.7%). In total, there were 73 respiratory events, ranging from one to six events in a single patient. Significant apnea (> 30 s) occurred in five patients (6%). Only one episode of apnea led to a relative SaO2 reduction (from 98% to 93%) after a 50 s lag time.
No event requiring assisted ventilation was recorded. Our analysis revealed that the total propofol dose was an independent risk factor for respiratory events (P = 0.01). Artifacts developed with the same frequency with the new device as with conventional pulse oximetry. Compared with pulse oximetry alone, this new monitoring device detected more respiratory events and may provide earlier warning of impending respiratory abnormalities. CONCLUSION: Apnea commonly occurs during endoscopy under sedation and may precede hypoxemia. We recommend this respiration rate monitor as an alternative to capnography to aid in detecting ventilatory problems. PMID:24634712
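    The event definitions stated in the abstract (apnea: cessation of breathing for at least 10 s; significant apnea: more than 30 s; hypopnea: rate below 6/min for at least 10 s) can be expressed as a small classifier. The function below simply restates those published thresholds; the monitoring device's internal logic is not described in the abstract:

```python
# Restatement of the study's event definitions as a classifier for one
# sustained episode of a given respiratory rate. Thresholds come from the
# abstract; the function itself is illustrative.

def classify_respiratory_event(rate_per_min, duration_s):
    """Classify a sustained episode by rate (breaths/min) and duration (s)."""
    if rate_per_min == 0:
        if duration_s > 30:
            return "significant apnea"
        if duration_s >= 10:
            return "apnea"
    elif rate_per_min < 6 and duration_s >= 10:
        return "hypopnea"
    return "no event"

print(classify_respiratory_event(0, 45))   # significant apnea
print(classify_respiratory_event(0, 15))   # apnea
print(classify_respiratory_event(4, 20))   # hypopnea
```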

  7. Monitoring postseismic deformation of the Mw=6.4 February 24, 2004 Al Hoceima (Morocco) earthquake using Multi-Temporal InSAR

    NASA Astrophysics Data System (ADS)

    Cetin, Esra; Cakir, Ziyadin; Meghraoui, Mustapha

    2014-05-01

    The Al Hoceima earthquakes of May 26, 1994 (Mw=6.0) and February 24, 2004 (Mw=6.4) are the largest seismic events that affected the northern part of Morocco in the last century. The Al Hoceima region is located in the east-west-trending imbricated thrust-and-fold system of the Rif Mountain range that results from the African-Eurasian convergence. The transpressive tectonics and existence of a complex fault network with thrust, normal, and strike-slip faulting in the Rif probably reflect the rapidly changing local tectonic regime with block rotations during the Neogene and Quaternary (Meghraoui et al., 1996). The 1994 and 2004 earthquake sequence occurred on conjugate strike-slip faults trending approximately NNE-SSW and NW-SE. The best coseismic model of the 2004 earthquake from InSAR suggests a curved right-lateral strike-slip fault about 21 km long and 16.5 km wide, dipping 87-88° eastward with a strike changing from N85°W in the south to N50°W in the north (Cakir et al., 2006). We study the postseismic deformation of the 2004 (Mw=6.4) Al Hoceima earthquake using the Multi-Temporal InSAR (MT-InSAR) technique. InSAR time series calculated from 14 ERS-2 SAR images reveal subtle ground movements in the Al Hoceima region between 2004 and 2010, where remarkable coseismic displacement was observed after the earthquake. We used the Stanford Method (STaMPS; Hooper, 2008) for analyzing the SAR data, which takes advantage of spatial correlation between pixels and does not use any temporal deformation model in the persistent scatterer identification step. MT-InSAR analysis shows cumulative line-of-sight (LOS) displacement of up to 4 cm of uplift and subsidence in the region of coseismic surface deformation. Preliminary analysis suggests that the postseismic deformation is likely associated with afterslip.

  8. Cooperative Monitoring Center Occasional Paper/9: De-Alerting Strategic Ballistic Missiles

    SciTech Connect

    Connell, Leonard W.; Edenburn, Michael W.; Fraley, Stanley K.; Trost, Lawrence C.

    1999-03-01

    This paper presents a framework for evaluating the technical merits of strategic ballistic missile de-alerting measures, and it uses the framework to evaluate a variety of possible measures for silo-based, land-mobile, and submarine-based missiles. De-alerting measures are defined for the purpose of this paper as reversible actions taken to increase the time or effort required to launch a strategic ballistic missile. The paper does not assess the desirability of pursuing a de-alerting program. Such an assessment is highly context dependent. The paper postulates that if de-alerting is desirable and is used as an arms control mechanism, de-alerting measures should satisfy specific criteria relating to force security, practicality, effectiveness, significant delay, and verifiability. Silo-launched missiles lend themselves most readily to de-alerting verification, because communications necessary for monitoring do not increase the vulnerability of the weapons by a significant amount. Land-mobile missile de-alerting measures would be more challenging to verify, because monitoring measures that disclose the launcher's location would potentially increase their vulnerability. Submarine-launched missile de-alerting measures would be extremely challenging if not impossible to monitor without increasing the submarine's vulnerability.

  9. X-ray Weekly Monitoring of the Galactic Center Sgr A* with Suzaku

    NASA Astrophysics Data System (ADS)

    Maeda, Yoshitomo; Nobukawa, Masayoshi; Hayashi, Takayuki; Iizuka, Ryo; Saitoh, Takayuki; Murakami, Hiroshi

    A small gas cloud, G2, is on an orbit that takes it almost straight into the supermassive black hole Sgr A* by spring 2014. This event gives us a rare opportunity to test mass feeding onto the black hole by gas. To catch a possible rise in mass accretion from the cloud, we have been performing biweekly monitoring of Sgr A* in autumn and spring of the 2013 fiscal year. The key feature of Suzaku is high-sensitivity, wide-band X-ray spectroscopy all in one observatory. It is characterized by a large effective area combined with low background and good energy resolution, in particular a good line spread function in the low-energy range. Since the flare events associated with the G2 approach are transient, the large effective area is a critical and powerful tool for hunting them. The first monitoring run, in autumn 2013, was successfully made. The X-rays from Sgr A* and its nearby emission were clearly resolved from the bright transient source AX J1745.6-2901. No very large flare from Sgr A* was found during the monitoring. We may also report the X-ray properties of two serendipitous sources, the neutron star binary AX J1745.6-2901 and the magnetar SGR J1745-29.

  10. Space weather monitoring by ground-based means carried out in Polar Geophysical Center at Arctic and Antarctic Research Institute

    NASA Astrophysics Data System (ADS)

    Janzhura, Alexander

    Real-time information on geophysical processes in polar regions is very important for space weather monitoring by ground-based means. Modern communication systems and computer technology make it possible to collect and process data from remote sites without significant delays. New acquisition equipment based on microprocessor modules, and reliable in harsh climatic conditions, was deployed at the Roshydromet networks of geophysical observations in the Arctic and is being deployed at observatories in the Antarctic. A contemporary system for on-line collection and transmission of geophysical data from the Arctic and Antarctic stations to AARI has been realized, and the Polar Geophysical Center (PGC) arranged at AARI ensures near-real-time processing and analysis of geophysical information from 11 stations in the Arctic and 5 stations in the Antarctic. Space weather monitoring by ground-based means is one of the main tasks facing the Polar Geophysical Center. As studies by Troshichev and Janzhura [2012] showed, the PC index characterizing polar cap magnetic activity appears to be an adequate indicator of the solar wind energy that enters the magnetosphere and the energy that accumulates in the magnetosphere. A great advantage of PC index application over other methods based on satellite data is the permanent on-line availability of information about magnetic activity in both the northern and southern polar caps. A special procedure agreed between the Arctic and Antarctic Research Institute (AARI) and the Space Institute of the Danish Technical University (DTUSpace) ensures calculation of the unified PC index in quasi-real time from magnetic data from the Thule and Vostok stations (see public site: http://pc-index.org). A method for estimating the AL and Dst indices (as indicators of the state of the disturbed magnetosphere) based on foregoing PC indices has been elaborated and tested at the Polar Geophysical Center. It is demonstrated that the PC index can be successfully used to monitor the state of the magnetosphere (space weather monitoring) and the readiness of the magnetosphere to produce a substorm or storm (space weather nowcasting).

  11. Earthquake Prognosis With Applied Microseism.

    NASA Astrophysics Data System (ADS)

    Ahmedov, N.; Nagiyev, A.

    Earthquakes are the most dangerous natural catastrophe in terms of casualties, amount of damage, areal coverage, and the difficulty of providing protective measures. The inability to forecast these events makes the situation worse: their buried focuses are invisible in the subsurface, they strike suddenly like thunder, and within tens of seconds they leave devastated areas and tens of thousands of casualties. Current earthquake forecasting is essentially ineffective. Microseism analysis is one possible way to forecast earthquakes. The small, up-going, low-amplitude, irregular oscillations observed on seismograms are referred to as microseisms. Unlike earthquakes themselves, they are continuous, that is, they have no origin coordinate on the time axis. Their occurrence is associated with breakers along shorelines, strong winds, hurricane patterns, and so on. J. J. Linch discovered a new tool for monitoring hurricane motion over the seas using microseisms recorded at ad hoc stations. By analogy with these observations, it became possible to monitor the formation of earthquake focuses based on the correlation between the low-frequency horizontal N-S and E-W channel components. Monitoring data on the microseism field and its preceding abnormal variations, derived from the "Cherepaha" 3M and 6/12 devices, reveal a systematic trend in the amplitude/frequency domain. This relationship, observed in a certain frequency range, made it possible to identify the formation of earthquake focuses relative to the monitoring station. This variation trend was observed during the Turkish and Iranian events of 1990, 1992, and 1997. It would be useful to verify these effects in other regions to further correlate the available data and work out common forecasting criteria.

  12. A smartphone application for earthquakes that matter!

    NASA Astrophysics Data System (ADS)

    Bossu, Rémy; Etivant, Caroline; Roussel, Fréderic; Mazet-Roux, Gilles; Steed, Robert

    2014-05-01

    Smartphone applications have swiftly become one of the most popular tools for rapid delivery of earthquake information to the public, some of them having been downloaded more than 1 million times! The advantages are obvious: wherever users are, they can be automatically informed when an earthquake has struck. By simply setting a magnitude threshold and an area of interest, there is no longer any need to browse the internet: the information reaches you automatically and instantaneously! One question remains: are the provided earthquake notifications always relevant for the public? Which earthquakes really matter to laypeople? One clue may be derived from newspaper reports showing that, a while after damaging earthquakes, many eyewitnesses scrap the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of the notifications relate to unfelt earthquakes, thereby only increasing anxiety among the population at each new update. Felt and damaging earthquakes are the ones that matter most to the public (and authorities). They are the ones of societal importance, even when of small magnitude. A smartphone application developed by the EMSC (Euro-Mediterranean Seismological Centre) with the financial support of the Fondation MAIF aims to provide suitable notifications by collating different information threads covering tsunamigenic, potentially damaging, and felt earthquakes. Tsunamigenic earthquakes are considered here to be those that are the subject of alert or information messages from the PTWC (Pacific Tsunami Warning Center), while potentially damaging earthquakes are identified through an automated system called EQIA (Earthquake Qualitative Impact Assessment), developed and operated at the EMSC.
This rapidly assesses earthquake impact by comparing the population exposed to each expected level of shaking intensity with empirical models of fatality losses calibrated on past earthquakes in each country. Non-seismic detections and macroseismic questionnaires collected online are combined to identify as many felt earthquakes as possible, regardless of their magnitude. Non-seismic detections include Twitter earthquake detections, developed by the US Geological Survey, in which the number of tweets containing the keyword "earthquake" is monitored in real time, and flashsourcing, developed by the EMSC, which detects traffic surges on its rapid earthquake information website caused by the natural convergence of eyewitnesses who rush to the Internet to investigate the cause of the shaking they have just felt. Altogether, we estimate the number of detected felt earthquakes at around 1,000 per year, compared with the 35,000 earthquakes annually reported by the EMSC! Felt events are already the subject of the web page "Latest significant earthquakes" on the EMSC website (http://www.emsc-csem.org/Earthquake/significant_earthquakes.php) and of a dedicated Twitter service, @LastQuake. We will present the identification process for the earthquakes that matter, the smartphone application itself (to be released in May), and its future evolutions.
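
The impact-assessment logic described above, exposure per intensity level combined with an empirical loss model, can be sketched as follows; the exposure figures and fatality-rate curve are purely illustrative, not the EMSC's calibrated EQIA model:

```python
# Hypothetical population exposed per MMI level for one event, and an
# illustrative country-level empirical fatality rate per exposed person.
exposure = {6: 500_000, 7: 120_000, 8: 15_000, 9: 1_000}
fatality_rate = {6: 0.0, 7: 1e-5, 8: 1e-4, 9: 1e-3}

# Expected fatalities = sum over intensity levels of exposure x rate.
expected = sum(exposure[mmi] * fatality_rate[mmi] for mmi in exposure)
print(round(expected, 1))  # → 3.7
```

A real system would derive the exposure column from a ShakeMap grid and population database, and use uncertainty bands rather than a point estimate.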

  13. Tohoku earthquake shook the ionosphere

    NASA Astrophysics Data System (ADS)

    Balcerak, Ernie

    2011-08-01

    The giant 11 March 2011 magnitude 9 Tohoku earthquake not only shook the Earth and caused devastating tsunamis but also rattled the ionosphere, according to a new study. The surface seismic waves and tsunamis triggered waves in the atmosphere. These atmospheric waves propagated upward into the ionosphere, creating ripples in ionized gas nearly 350 kilometers above the Earth. Liu et al. measured these disturbances, called seismotraveling ionospheric disturbances (STID), using GPS receivers in Japan. The first disturbance appeared as a disk-shaped increase in electron density in the ionosphere about 7 minutes after the earthquake. Sequences of concentric waves of increased electron density then traveled from the STID center. Similar ionospheric disturbances have been observed following other earthquakes, but these were the largest ever seen, the authors report. (Journal of Geophysical Research-Space Physics, doi:10.1029/2011JA016761, 2011)

  14. Identified EM Earthquake Precursors

    NASA Astrophysics Data System (ADS)

    Jones, Kenneth, II; Saxton, Patrick

    2014-05-01

    Many attempts have been made to determine a sound forecasting method for earthquakes and, in turn, warn the public. Presently, the animal kingdom leads the precursor list, alluding to a transmission-related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After a number of custom rock experiments, two hypotheses were formed which could answer the EM wave model. The first hypothesis concerned a sufficient and continuous electron movement, either by surface or penetrative flow, and the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate for creating strong EM fields, because rock has a very high electrical resistance, making it a high-quality insulator. Penetrative flow could not be corroborated either, because it was discovered that rock was absorbing and confining electrons to a very thin skin depth. Radio wave transmission and detection worked in every single test administered. This hypothesis was reviewed for propagating, long-wave generation with sufficient amplitude, and the capability of penetrating solid rock. Additionally, fracture spaces, either air- or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals have been detected in the field, occurring with associated phases, using custom-built loop antennae. Field testing was conducted in Southern California from 2006 to 2011, and outside the NE Texas town of Timpson in February 2013.
The antennae are mobile, and observations were noted for recurrence, duration, and frequency response. At the Southern California field sites, one loop antenna was positioned for omni-directional reception and also detected a strong first Schumann resonance; however, additional Schumann resonances were absent. At the Timpson, TX field sites, loop antennae were positioned for directional reception, due to earthquake-inducing hydraulic fracturing activity currently conducted by the oil and gas industry. Two strong signals, one moderately strong signal, and approximately 6-8 weaker signals were detected in the immediate vicinity. The three stronger signals were mapped by a biangulation technique, followed by a triangulation technique for confirmation. This was the first antenna mapping technique ever performed for determining possible earthquake epicenters. Six and a half months later, Timpson experienced two M4 (M4.1 and M4.3) earthquakes on September 2, 2013, followed by an M2.4 earthquake three days later, all occurring at a depth of five kilometers. The Timpson earthquake activity now has a cyclical rate, and a forecast was given to the proper authorities. As a result, the Southern California and Timpson, TX field results led to an improved design and construction of a third prototype antenna. With a loop antenna array, a viable communication system, and continuous monitoring, a full fracture cycle can be established and observed in real time. In addition, field data could be reviewed quickly for assessment, leading to a much-improved earthquake forecasting capability. The EM precursors determined by this method appear to surpass all prior precursor claims, and the general public will finally receive long overdue forecasting.
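
The biangulation step, intersecting two direction-of-arrival bearings from separate antenna sites, can be sketched in flat-plane geometry; the coordinates and bearings below are illustrative, and the authors' actual mapping procedure is not specified in the abstract:

```python
import math

def intersect_bearings(p1, b1, p2, b2):
    """Intersect two rays, each given by an (x, y) site and a bearing in
    degrees clockwise from north; returns the crossing point."""
    d1 = (math.sin(math.radians(b1)), math.cos(math.radians(b1)))
    d2 = (math.sin(math.radians(b2)), math.cos(math.radians(b2)))
    # Solve p1 + t*d1 = p2 + s*d2 for t by Cramer's rule.
    det = d1[0] * (-d2[1]) + d2[0] * d1[1]
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t = (rx * (-d2[1]) + d2[0] * ry) / det  # distance along ray 1
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Two sites 10 km apart with bearings converging on a common source.
x, y = intersect_bearings((0.0, 0.0), 45.0, (10.0, 0.0), 315.0)
print(round(x, 6), round(y, 6))  # → 5.0 5.0
```

A third bearing (triangulation) provides the consistency check: if its ray misses the first intersection point, the bearing estimates or the plane approximation are suspect.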

  15. Evidence for Ancient Mesoamerican Earthquakes

    NASA Astrophysics Data System (ADS)

    Kovach, R. L.; Garcia, B.

    2001-12-01

    Evidence for past earthquake damage at Mesoamerican ruins is often overlooked because of the invasive effects of tropical vegetation and is usually not considered as a causal factor when restoration and reconstruction of many archaeological sites are undertaken. Yet the proximity of many ruins to zones of seismic activity would argue otherwise. Clues as to the types of damage which should be sought were offered in September 1999 when the M = 7.5 Oaxaca earthquake struck the ruins of Monte Alban, Mexico, where archaeological renovations were underway. More than 20 structures were damaged, 5 of them seriously. Damage features noted were walls out of plumb; fractures in walls, floors, basal platforms and tableros; toppling of columns; and deformation, settling and tumbling of walls. A Modified Mercalli Intensity of VII (ground accelerations 18-34%g) occurred at the site. Within the diffuse landward extension of the Caribbean plate boundary zone, M = 7+ earthquakes occur with repeat times of hundreds of years, arguing that many Maya sites were subjected to earthquakes. Damage to re-erected and reinforced stelae, walls, and buildings was witnessed at Quirigua, Guatemala, during an expedition underway when the 1976 M = 7.5 Guatemala earthquake on the Motagua fault struck. Excavations also revealed evidence (domestic pottery vessels and the skeleton of a child crushed under fallen walls) of an ancient earthquake occurring about the time of the demise and abandonment of Quirigua in the late 9th century. Striking evidence for sudden earthquake building collapse at the end of the Maya Classic Period ~A.D. 889 was found at Benque Viejo (Xunantunich), Belize, located 210 km north of Quirigua. It is argued that a M = 7.5 to 7.9 earthquake at the end of the Maya Classic Period, centered in the vicinity of the Chixoy-Polochic and Motagua fault zones, could have produced the contemporaneous earthquake damage to the above sites. As a consequence, this earthquake may have accelerated the collapse of the hierarchical authority at these locations and may have contributed to the end of the Classic culture at other nearby sites in proximity to the Caribbean plate boundary zone.

  16. The Electronic Encyclopedia of Earthquakes

    NASA Astrophysics Data System (ADS)

    Benthien, M.; Marquis, J.; Jordan, T.

    2003-12-01

    The Electronic Encyclopedia of Earthquakes is a collaborative project of the Southern California Earthquake Center (SCEC), the Consortia of Universities for Research in Earthquake Engineering (CUREE) and the Incorporated Research Institutions for Seismology (IRIS). This digital library organizes earthquake information online as a partner with the NSF-funded National Science, Technology, Engineering and Mathematics (STEM) Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth Science, engineering, physics and mathematics. Although conceived primarily as an educational resource, the Encyclopedia is also a valuable portal to anyone seeking up-to-date earthquake information and authoritative technical sources. "E3" is a unique collaboration among earthquake scientists and engineers to articulate and document a common knowledge base with a shared terminology and conceptual framework. It is a platform for cross-training scientists and engineers in these complementary fields and will provide a basis for sustained communication and resource-building between major education and outreach activities. For example, the E3 collaborating organizations have leadership roles in the two largest earthquake engineering and earth science projects ever sponsored by NSF: the George E. Brown Network for Earthquake Engineering Simulation (CUREE) and the EarthScope Project (IRIS and SCEC). The E3 vocabulary and definitions are also being connected to a formal ontology under development by the SCEC/ITR project for knowledge management within the SCEC Collaboratory. The E3 development system is now fully operational, 165 entries are in the pipeline, and the development teams are capable of producing 20 new, fully reviewed encyclopedia entries each month. 
Over the next two years teams will complete 450 entries, which will populate the E3 collection to a level that fully spans earthquake science and engineering. Scientists, engineers, and educators who have suggestions for content to be included in the Encyclopedia can visit www.earthquake.info now to complete the "Suggest a Web Page" form.

  17. Monitoring the Implementation of Consultation Planning, Recording, and Summarizing in a Breast Care Center

    PubMed Central

    Belkora, Jeffrey K.; Loth, Meredith K.; Chen, Daniel F.; Chen, Jennifer Y.; Volz, Shelley; Esserman, Laura J.

    2008-01-01

    OBJECTIVE: We implemented and monitored a clinical service, Consultation Planning, Recording and Summarizing (CPRS), in which trained facilitators elicit patient questions for doctors and then audio-record and summarize the doctor-patient consultations. METHODS: We trained 8 schedulers to offer CPRS to breast cancer patients making treatment decisions, and trained 14 premedical interns to provide the service. We surveyed a convenience sample of patients regarding their self-efficacy and decisional conflict. We solicited feedback from physicians, schedulers, and CPRS staff on our implementation of CPRS. RESULTS: 278 patients used CPRS over the 22-month study period, a utilization rate of 32% of our capacity. Thirty-seven patients responded to surveys, providing pilot data showing improvements in self-efficacy and decisional conflict. Physicians, schedulers, and premedical interns recommended changes in the program's locations; delivery; products; and screening, recruitment and scheduling processes. CONCLUSION: Our monitoring of this implementation found elements of success while surfacing recommendations for improvement. PRACTICE IMPLICATIONS: We made changes based on study findings. We moved Consultation Planning to conference rooms or telephone sessions; shortened the documents produced by CPRS staff; diverted slack resources to increase recruitment efforts; and obtained a waiver of consent in order to streamline and improve ongoing evaluation. PMID:18755564

  18. Cooperative Monitoring Center Occasional Paper/8: Cooperative Border Security for Jordan: Assessment and Options

    SciTech Connect

    Qojas, M.

    1999-03-01

    This document is an analysis of options for unilateral and cooperative action to improve the security of Jordan's borders. Sections describe the current political, economic, and social interactions along Jordan's borders. Next, the document discusses border security strategy for cooperation among neighboring countries and the adoption of confidence-building measures. A practical cooperative monitoring system would consist of hardware for early warning, command and control, communications, and transportation. Technical solutions can expand opportunities for the detection and identification of intruders. Sensors (such as seismic, break-wire, pressure-sensing, etc.) can warn border security forces of intrusion and contribute to the identification of the intrusion and help formulate the response. This document describes conceptual options for cooperation, offering three scenarios that relate to three hypothetical levels (low, medium, and high) of cooperation. Potential cooperative efforts under a low cooperation scenario could include information exchanges on military equipment and schedules to prevent misunderstandings and the establishment of protocols for handling emergency situations or unusual circumstances. Measures under a medium cooperation scenario could include establishing joint monitoring groups for better communications, with hot lines and scheduled meetings. The high cooperation scenario describes coordinated responses, joint border patrols, and sharing border intrusion information. Finally, the document lists recommendations for organizational, technical, and operational initiatives that could be applicable to the current situation.

  19. Earthquake location determination using data from DOMERAPI and BMKG seismic networks: A preliminary result of DOMERAPI project

    NASA Astrophysics Data System (ADS)

    Ramdhan, Mohamad; Nugraha, Andri Dian; Widiyantoro, Sri; Métaxian, Jean-Philippe; Valencia, Ayunda Aulia

    2015-04-01

    The DOMERAPI project has been conducted to comprehensively study the internal structure of Merapi volcano, especially the deep structural features beneath the volcano. The DOMERAPI earthquake monitoring network consists of 46 broad-band seismometers installed around the Merapi volcano. Earthquake hypocenter determination is a very important step for further studies, such as hypocenter relocation and seismic tomographic imaging. Ray paths from earthquake events occurring outside the Merapi region can be utilized to delineate the deep magma structure. Earthquakes occurring outside the DOMERAPI seismic network will produce an azimuthal gap greater than 180°. Owing to this situation, the stations of the BMKG seismic network can be used jointly to minimize the azimuthal gap. We identified earthquake events manually and carefully, and then picked arrival times of P and S waves. The data from the DOMERAPI seismic network were combined with the BMKG data catalogue to determine earthquake events outside the Merapi region. For future work, we will also use the BPPTKG (Center for Research and Development of Geological Disaster Technology) data catalogue in order to study shallow structures beneath the Merapi volcano. The application of all data catalogues will provide good information as input for further advanced studies and volcano hazards mitigation.
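
The azimuthal-gap criterion mentioned above can be computed directly from station-to-epicenter azimuths; a minimal sketch with illustrative azimuth values:

```python
def azimuthal_gap(azimuths_deg):
    """Largest angular gap (degrees) between consecutive station-to-
    epicenter azimuths; an event outside a network typically has a
    gap exceeding 180 degrees."""
    az = sorted(a % 360.0 for a in azimuths_deg)
    gaps = [b - a for a, b in zip(az, az[1:])]
    gaps.append(360.0 - az[-1] + az[0])  # wrap-around gap
    return max(gaps)

# Stations all to one side of the epicenter: poorly constrained location.
print(azimuthal_gap([10.0, 40.0, 90.0, 150.0]))  # → 220.0
```

Adding stations on the opposite side of the event (here, the BMKG network) shrinks the wrap-around gap and stabilizes the hypocenter solution.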

  20. Earthquake location determination using data from DOMERAPI and BMKG seismic networks: A preliminary result of DOMERAPI project

    SciTech Connect

    Ramdhan, Mohamad; Nugraha, Andri Dian; Widiyantoro, Sri; Métaxian, Jean-Philippe; Valencia, Ayunda Aulia

    2015-04-24

    The DOMERAPI project has been conducted to comprehensively study the internal structure of Merapi volcano, especially the deep structural features beneath the volcano. The DOMERAPI earthquake monitoring network consists of 46 broad-band seismometers installed around the Merapi volcano. Earthquake hypocenter determination is a very important step for further studies, such as hypocenter relocation and seismic tomographic imaging. Ray paths from earthquake events occurring outside the Merapi region can be utilized to delineate the deep magma structure. Earthquakes occurring outside the DOMERAPI seismic network will produce an azimuthal gap greater than 180°. Owing to this situation, the stations of the BMKG seismic network can be used jointly to minimize the azimuthal gap. We identified earthquake events manually and carefully, and then picked arrival times of P and S waves. The data from the DOMERAPI seismic network were combined with the BMKG data catalogue to determine earthquake events outside the Merapi region. For future work, we will also use the BPPTKG (Center for Research and Development of Geological Disaster Technology) data catalogue in order to study shallow structures beneath the Merapi volcano. The application of all data catalogues will provide good information as input for further advanced studies and volcano hazards mitigation.

  1. Earthquake occurrence and effects.

    PubMed

    Adams, R D

    1990-01-01

    Although earthquakes are mainly concentrated in zones close to boundaries of tectonic plates of the Earth's lithosphere, infrequent events away from the main seismic regions can cause major disasters. The major cause of damage and injury following earthquakes is elastic vibration, rather than fault displacement. This vibration at a particular site will depend not only on the size and distance of the earthquake but also on the local soil conditions. Earthquake prediction is not yet generally fruitful in avoiding earthquake disasters, but much useful planning to reduce earthquake effects can be done by studying the general earthquake hazard in an area, and taking some simple precautions. PMID:2347628

  2. Nonextensive characteristics of earthquakes magnitude distribution in Javakheti region, Georgia

    NASA Astrophysics Data System (ADS)

    Chelidze, Tamaz; Matcharashvili, Teimuraz; Jorjiashvili, Nato; Javakhishvili, Zurab

    2010-05-01

    For the last several years, nonextensive statistical mechanics has been increasingly used to study a wide range of complex phenomena exhibiting a scale-free nature in different domains. It is assumed that nonextensivity concepts may provide a suitable framework to shed new light on features of the spatiotemporal and energetic behavior of seismic processes which are presently not fully understood. In the present research we studied the cumulative distribution of earthquake magnitudes in the Caucasus from both common and nonextensive statistical mechanics points of view. Data sets of earthquake magnitudes from 1960 to 1991 were compiled from databases of the Seismic Monitoring Center at Ilia State University in Georgia. The Javakheti Region in Southern Georgia was selected based on its geological structure and high seismic activity; the exact time interval was specified because of increased seismic activity in the Caucasus during that period. Together with common seismic characteristics such as the a and b values of the Gutenberg-Richter relationship, we evaluated nonextensive characteristics in the framework of the earthquake fragment-asperity interaction model; namely, the nonextensive parameter q and the energy density value a were calculated. All these characteristics were assessed for the whole observation period as well as for consecutive 10-year overlapping sliding windows. It was observed that the calculated nonextensive characteristics, both for the whole catalogue and for sliding windows (q = 1.6-1.83), are close to the range found earlier for other regions. At the same time we see that both a and q values vary over the investigated period for consecutive sliding windows. These changes are statistically significant and are evidently related to the earthquake generation process of the Javakheti region. Indeed, it was observed that the nonextensivity parameter increases with local seismic activity, which may point to an increase in the functional relationship between the above parameters prior to and during earthquake generation. At the same time the energy density value a, which is assumed to be related to spatial distribution, decreases after the strongest event of the considered time period. These results point to increased long-range correlations of the seismic process in the energetic and spatial domains prior to and during the strongest regional earthquakes. The results of the nonextensive analysis are in good accordance with the b-value analysis: after the strongest event the b value increases and a decreases, which is consistent with the physical meaning of these parameters. The results of our research support the assumption that nonextensive statistics can provide a new promising approach to earthquake distribution features in different domains.
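
The b value of the Gutenberg-Richter relationship mentioned above is commonly estimated with Aki's maximum-likelihood formula; a minimal sketch on synthetic magnitudes, with no magnitude-binning correction:

```python
import math

def b_value(magnitudes, mc):
    """Aki (1965) maximum-likelihood Gutenberg-Richter b value for
    events at or above the magnitude of completeness mc:
    b = log10(e) / (mean(M) - mc)."""
    m = [x for x in magnitudes if x >= mc]
    return math.log10(math.e) / (sum(m) / len(m) - mc)

# Illustrative catalogue slice with completeness magnitude 3.0.
print(round(b_value([3.0, 3.4, 3.8, 4.2, 3.1, 3.2], 3.0), 2))  # → 0.97
```

Sliding-window b-value estimates like those in the abstract apply the same formula to overlapping subsets of the catalogue, so window length trades temporal resolution against estimator variance.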

  3. Retrofitting Laboratory Fume Hoods With Face Velocity Monitors at NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Wagner, Ingrid E.; Bold, Margaret D.; Diamond, David B.; Kall, Phillip M.

    1997-01-01

    Extensive use of and reliance on laboratory fume hoods exists at LeRC for the control of chemical hazards (nearly 175 fume hoods). Flow-measuring devices are necessary to continually monitor hood performance. The flow-measuring device should be tied into an energy management control system to detect problems at a central location without relying on the users to convey information about a problem. Compatibility concerns and limitations should always be considered when choosing the most effective flow-measuring device for a particular situation. Good practice in initial hood design and placement will provide a system in which a flow-measuring device may be used to its full potential and effectiveness.
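
A flow-measuring device ultimately reports a face velocity, exhaust flow divided by the open sash area; a minimal sketch, with an illustrative 80-120 fpm acceptance band since the report does not state LeRC's setpoints:

```python
def face_velocity_ok(exhaust_cfm, sash_open_ft2, lo=80.0, hi=120.0):
    """Face velocity (ft/min) = exhaust flow (ft^3/min) / open sash
    area (ft^2); flag hoods outside the acceptance band."""
    v = exhaust_cfm / sash_open_ft2
    return v, lo <= v <= hi

print(face_velocity_ok(1000.0, 10.0))  # → (100.0, True)
```

A central energy management system would poll this check per hood and raise an alarm without waiting for users to report a problem.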

  4. Monitoring seismic wave velocity changes associated with the Mw 7.9 Wenchuan earthquake: increasing the temporal resolution using curvelet filters

    NASA Astrophysics Data System (ADS)

    Stehly, Laurent; Froment, Bérénice; Campillo, Michel; Liu, Qi Yuan; Chen, Jiu Hui

    2015-06-01

    The aim of this study is to improve the temporal resolution of seismic wave velocity variations measured using ambient noise correlations. We first reproduce the result obtained by Chen et al. using a network of 21 broad-band stations ideally located around the fault system activated during the Wenchuan earthquake. We measure a velocity drop of 0.07 per cent associated with the main shock, with a temporal resolution of 30 days. To determine whether this velocity drop is co-seismic or post-seismic, we attempt to increase the temporal resolution of our observations. By taking advantage of the properties of the curvelet transform, we increase the signal-to-noise ratio of the daily correlations computed between each station pair. It is then possible to measure the velocity drop associated with the Wenchuan earthquake with a temporal resolution of 1 day. This shows that the velocity drop started on 2008 May 12, the day of the earthquake, and the velocity reached its lowest value 2 days after the main shock. Moreover, there was a second velocity drop on 2008 May 27, which might relate to strong aftershocks.
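
The daily velocity measurement rests on comparing each day's noise correlation with a reference waveform. One common way to do this, the stretching method, is sketched below on a synthetic wavelet; this is a generic illustration, not necessarily the authors' exact implementation:

```python
import numpy as np

def stretch_dt_over_t(ref, cur, t, grid=np.linspace(-0.02, 0.02, 401)):
    """Grid-search the dilation eps of the time axis that best aligns
    the current correlation with the reference; eps estimates dt/t,
    and the relative velocity change is dv/v = -dt/t."""
    def cc(eps):
        return np.corrcoef(ref, np.interp(t * (1.0 + eps), t, cur))[0, 1]
    return float(grid[int(np.argmax([cc(e) for e in grid]))])

# Synthetic test: a 1 per cent slowdown dilates the waveform.
t = np.linspace(0.0, 10.0, 2001)
ref = np.exp(-(t - 5.0) ** 2 / 0.5)
cur = np.interp(t / 1.01, t, ref)   # arrivals delayed by 1 per cent
eps = stretch_dt_over_t(ref, cur, t)
print(-eps)                          # recovered dv/v, close to -0.01
```

Denoising the daily correlations (the paper's curvelet filtering) raises the correlation coefficient of this fit and thus the reliability of single-day eps estimates.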

  5. Investigation on the Possible Relationship between Magnetic Pulsations and Earthquakes

    NASA Astrophysics Data System (ADS)

    Jusoh, M.; Liu, H.; Yumoto, K.; Uozumi, T.; Takla, E. M.; Yousif Suliman, M. E.; Kawano, H.; Yoshikawa, A.; Asillam, M.; Hashim, M.

    2012-12-01

    The sun is the main source of energy in the solar system, and it plays a major role in affecting the ionosphere, the atmosphere and the Earth's surface. The connection between the solar wind and ground magnetic pulsations has been established empirically by several researchers (H. J. Singer et al., 1977; E. W. Greenstadt, 1979; I. A. Ansari, 2006, to name a few). In our preliminary statistical analysis of the relationship between solar and seismic activities (Jusoh and Yumoto, 2011; Jusoh et al., 2012), we observed a high possibility of solar-terrestrial coupling: earthquakes show a high tendency to occur during the lower phases of solar cycles, significantly related to solar wind parameters (i.e., solar wind dynamic pressure, speed and input energy). However, a clear coupling mechanism has not yet been established. To connect the solar impact with seismicity, we investigate ground magnetic pulsations as a possible connecting agent. In our analysis, the recorded ground magnetic pulsations are analyzed in different ultra-low-frequency ranges, Pc3 (22-100 mHz), Pc4 (6.7-22 mHz) and Pc5 (1.7-6.7 mHz), against the occurrence of local earthquake events in certain time periods. This analysis focuses on two major seismic regions: northern Japan (mid latitude) and northern Sumatra, Indonesia (low latitude). Solar wind parameters were obtained from the Goddard Space Flight Center, NASA, via the OMNIWeb Data Explorer and the Space Physics Data Facility. Earthquake events were extracted from the Advanced National Seismic System (ANSS) database. The localized Pc3-Pc5 magnetic pulsation data were extracted from Magnetic Data Acquisition System (MAGDAS)/Circum Pan Magnetic Network (CPMN) stations at Ashibetsu (Japan), for earthquakes monitored in northern Japan, and Langkawi (Malaysia), for earthquakes observed in northern Sumatra. This magnetometer array was established by the International Center for Space Weather Science and Education, Kyushu University, Japan. From the results, we observed significant correlations between ground magnetic pulsations and solar wind speed at different earthquake epicenter depths. The details of the analysis will be discussed in the presentation.
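
The Pc3-Pc5 bands quoted above are simple frequency windows; splitting a magnetometer trace into them can be sketched with a numpy-only FFT band-pass (a production pipeline would use a proper filter design):

```python
import numpy as np

PC_BANDS_MHZ = {"Pc3": (22.0, 100.0), "Pc4": (6.7, 22.0), "Pc5": (1.7, 6.7)}

def pc_band(trace, fs_hz, band):
    """Keep only the spectral components inside one ULF band (limits in
    mHz) of a ground-magnetometer trace sampled at fs_hz."""
    lo, hi = (f * 1e-3 for f in PC_BANDS_MHZ[band])
    spec = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fs_hz)
    spec[(freqs < lo) | (freqs > hi)] = 0.0
    return np.fft.irfft(spec, n=len(trace))

# A 50 mHz (Pc3) tone plus a 3 mHz (Pc5) tone, sampled at 1 Hz.
t = np.arange(2000.0)
x = np.sin(2 * np.pi * 0.050 * t) + np.sin(2 * np.pi * 0.003 * t)
pc3 = pc_band(x, 1.0, "Pc3")  # isolates the 50 mHz component
```

Band-limited amplitudes extracted this way are what get compared, window by window, against the earthquake catalogue.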

  6. Napa Earthquake impact on water systems

    NASA Astrophysics Data System (ADS)

    Wang, J.

    2014-12-01

    The South Napa earthquake occurred in Napa, California on August 24, 2014 at 3 a.m. local time, with a magnitude of 6.0. It was the largest earthquake in the San Francisco Bay Area since the 1989 Loma Prieta earthquake, and economic losses topped $1 billion. Wine makers cleaned up and estimated the damage to tourism; around 15,000 cases of cabernet poured into the garden at the Hess Collection. Earthquakes can raise water-pollution risks and could cause a water crisis. California has suffered water shortages in recent years, so understanding how to prevent groundwater and surface-water pollution from earthquakes would be helpful. This research gives a clear view of the drinking water system in California and pollution of river systems, as well as an estimate of earthquake impacts on water supply. The Sacramento-San Joaquin River Delta (close to Napa) is the center of the state's water distribution system, delivering fresh water to more than 25 million residents and 3 million acres of farmland. Delta water conveyed through a network of levees is crucial to Southern California. The drought has significantly curtailed water exports, and salt-water intrusion has reduced freshwater outflows. Strong shaking from a nearby earthquake can liquefy saturated, loose, sandy soils and could potentially damage major Delta levee systems near Napa. The Napa earthquake is a wake-up call for Southern California: a future event could damage the freshwater supply system.

  7. Launch Complex 39 Observation Gantry Area (SWMU# 107) Annual Long-Term Monitoring Report (Year 1) Kennedy Space Center, Florida

    NASA Technical Reports Server (NTRS)

    Johnson, Jill W.; Towns, Crystal

    2015-01-01

    This document has been prepared by Geosyntec Consultants, Inc. (Geosyntec) to present and discuss the findings of the 2014 and 2015 Long-Term Monitoring (LTM) activities that were completed at the Launch Complex 39 (LC39) Observation Gantry Area (OGA) located at the John F. Kennedy Space Center (KSC), Florida (Site). The remainder of this report includes: (i) a description of the Site location; (ii) summary of Site background and previous investigations; (iii) description of field activities completed as part of the annual LTM program at the Site; (iv) groundwater flow evaluation; (v) presentation and discussion of field and analytical results; and (vi) conclusions and recommendations. Applicable KSC Remediation Team (KSCRT) Meeting minutes are included in Attachment A. This Annual LTM Letter Report was prepared by Geosyntec Consultants (Geosyntec) for NASA under contract number NNK12CA13B, Delivery Order NNK13CA39T project number PCN ENV2188.

  8. Radioanalytical Data Quality Objectives and Measurement Quality Objectives during a Federal Radiological Monitoring and Assessment Center Response

    SciTech Connect

    E. C. Nielsen

    2006-01-01

    During the early and intermediate phases of a nuclear or radiological incident, the Federal Radiological Monitoring and Assessment Center (FRMAC) collects environmental samples that are analyzed by organizations with radioanalytical capability. Resources dedicated to quality assurance (QA) activities must be sufficient to assure that appropriate radioanalytical measurement quality objectives (MQOs) and assessment data quality objectives (DQOs) are met. As the emergency stabilizes, QA activities will evolve commensurate with the need to reach appropriate DQOs. The MQOs represent a compromise between precise analytical determinations and the timeliness necessary for emergency response activities. Minimum detectable concentration (MDC), lower limit of detection, and critical-level tests can all serve as measurements reflecting the MQOs. The relationship among protective action guides (PAGs), derived response levels (DRLs), and laboratory detection limits is described, along with the rationale used to determine the appropriate laboratory detection limit.
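
One widely used a-priori detection-limit formula (Currie, 1968) relates the MDC to the background counts; the sketch below is generic and simplified, not the specific FRMAC procedure, and the calibration factors shown are illustrative:

```python
import math

def mdc_bq_per_kg(background_counts, count_time_s, efficiency, mass_kg):
    """Currie-style minimum detectable concentration:
    MDC = (2.71 + 4.65*sqrt(B)) / (efficiency * count time * sample mass).
    A full treatment would also fold in chemical yield, branching ratio
    and decay corrections."""
    return (2.71 + 4.65 * math.sqrt(background_counts)) / (
        efficiency * count_time_s * mass_kg)

# Illustrative numbers: 100 background counts, 1000 s count, 30%
# detection efficiency, 0.5 kg sample.
print(round(mdc_bq_per_kg(100.0, 1000.0, 0.30, 0.50), 3))  # → 0.328
```

In an emergency-response context, the count time is the tunable knob: shorter counts raise the MDC, which is acceptable only while the DRLs remain comfortably above it.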

  9. PAGER - Rapid Assessment of an Earthquake's Impact

    USGS Publications Warehouse

    Earle, Paul; Wald, David J.

    2007-01-01

PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that rapidly assesses the number of people and regions exposed to severe shaking from an earthquake and informs emergency responders, government agencies, and the media of the scope of the potential disaster. PAGER monitors the U.S. Geological Survey's near real-time U.S. and global earthquake detections and automatically identifies events of societal importance, well in advance of ground-truth or news accounts.
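The core exposure calculation PAGER performs can be sketched as a tally of population against a gridded shaking estimate. The function below is a hypothetical illustration of that idea (names and binning are ours, not the operational PAGER code): each grid cell has an estimated Modified Mercalli Intensity (MMI) and a co-registered population count, and we sum population per integer intensity level.

```python
import numpy as np

def exposure_by_intensity(mmi, population, levels=range(4, 11)):
    """Sum population in cells whose estimated MMI falls in [level, level+1).

    Illustrative sketch of a PAGER-style exposure table; not the USGS code.
    """
    mmi = np.asarray(mmi, dtype=float)
    pop = np.asarray(population, dtype=float)
    return {lv: int(pop[(mmi >= lv) & (mmi < lv + 1)].sum()) for lv in levels}

# Example: four grid cells with shaking at MMI 4.2, 5.7, 5.9, and 8.1
table = exposure_by_intensity([4.2, 5.7, 5.9, 8.1],
                              [1000, 2000, 3000, 500])
```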

  10. Multiple asperity model for earthquake prediction

    USGS Publications Warehouse

    Wyss, M.; Johnston, A.C.; Klein, F.W.

    1981-01-01

Large earthquakes often occur as multiple ruptures reflecting strong variations of stress level along faults. The dense instrument networks with which the volcano Kilauea is monitored provided detailed data on changes of seismic velocity, strain accumulation and earthquake occurrence rate before the 1975 Hawaii 7.2-mag earthquake. During the ~4 yr of preparation time the mainshock source volume had separated into crustal volumes of high stress levels embedded in a larger low-stress volume, showing respectively high- and low-stress precursory anomalies. © 1981 Nature Publishing Group.

  11. The Cooperative Monitoring Center: Achieving cooperative security objectives through technical collaborations

    SciTech Connect

    Pregenzer, A.

    1996-08-01

The post-Cold War security environment poses both difficult challenges and encouraging opportunities. Some of the most difficult challenges are related to regional conflict and the proliferation of weapons of mass destruction. New and innovative approaches to preventing the proliferation of weapons of mass destruction are essential, and more effort must be focused on the underlying factors that motivate countries to seek such weapons. Historically the emphasis has been on denial: denying information, denying technology, and denying materials necessary to build such weapons. Though still important, those efforts are increasingly perceived to be insufficient, and initiatives that address underlying motivational factors are needed. On the opportunity side, efforts to establish regional dialogue and confidence-building measures are increasing in many areas. Such efforts can result in cooperative agreements on security issues such as border control, demilitarized zones, weapons delivery systems, weapons-of-mass-destruction-free zones, environmental agreements, and resource sharing. In some cases, implementing such cooperative agreements will mean acquiring, analyzing, and sharing large quantities of data and sensitive information. These arrangements for "cooperative monitoring" are becoming increasingly important to the security of individual countries, regions, and international institutions. However, many countries lack sufficient technical and institutional infrastructure to take full advantage of these opportunities. Constructing a peaceful twenty-first century will require that technology be brought to bear in the most productive and innovative ways to meet the challenges of proliferation and to maximize the opportunities for cooperation.

  12. Present Status of the Tsukuba Magnet Laboratory. A Report on the Aftereffects of the March 11, 2011 Earthquake

    NASA Astrophysics Data System (ADS)

    Nimori, Shigeki

    2014-10-01

The Tsukuba Magnet Laboratory (TML) is located 324 km from the epicenter of the magnitude 9.0 earthquake that struck Japan on Friday, March 11, 2011, and suffered a peak ground acceleration of 372 Gal. The large 930 and 1030 MHz nuclear magnetic resonance (NMR) magnets of TML were severely affected by the earthquake; the hybrid magnet and its control system were not significantly damaged. After the earthquake, serious electricity shortages occurred and our awareness of the importance of energy conservation increased. A control system for the hybrid magnet, offering sophisticated monitoring capability and detailed, rapid data recording, has been in development for several years and is now nearing completion. With the detailed data the new system provides, our ability to interpret measurements and identify difficulties in the acquisition of critical data is improving. We are now beginning to optimize operations to reduce electricity consumption and achieve higher-efficiency magnet operation.

  13. EARTHQUAKE TRIGGERING AND SPATIAL-TEMPORAL RELATIONS IN THE VICINITY OF YUCCA MOUNTAIN, NEVADA

    SciTech Connect

    na

    2001-02-08

It is well accepted that the 1992 M 5.6 Little Skull Mountain earthquake, the largest historical event to have occurred within 25 km of Yucca Mountain, Nevada, was triggered by the M 7.2 Landers earthquake that occurred the day before. On the premise that earthquakes can be triggered by applied stresses, we have examined the earthquake catalog from the Southern Great Basin Digital Seismic Network (SGBDSN) for other evidence of triggering by external and internal stresses. This catalog now comprises over 12,000 events, encompassing five years of consistent monitoring, and has a low threshold of completeness, varying from M 0 in the center of the network to M 1 at the fringes. We examined the SGBDSN catalog response to external stresses such as large signals propagating from teleseismic and regional earthquakes, microseismic storms, and earth tides. Results are generally negative. We also examined the interplay of earthquakes within the SGBDSN. The number of "foreshocks", as judged by most criteria, is significantly higher than the background seismicity rate. In order to establish this, we first removed aftershocks from the catalog with widely used methodology. The existence of SGBDSN foreshocks is supported by comparing actual statistics to those of a simulated catalog with uniformly distributed locations and Poisson-distributed times of occurrence. The probabilities of a given SGBDSN earthquake being followed by one of higher magnitude within a short time frame and a close distance are at least as high as those found with regional catalogs, whose completeness thresholds are two to three magnitude units higher than that of the SGBDSN catalog used here. The largest earthquake in the SGBDSN catalog, the M 4.7 event in Frenchman Flat on 01/27/1999, was preceded by a definite foreshock sequence. The largest event within 75 km of Yucca Mountain in historical time, the M 5.7 Scotty's Junction event of 08/01/1999, was also preceded by foreshocks.
The monitoring area of the SGBDSN has been in a long period of very low moment release rate since February of 1999. The seismicity catalog to date suggests that the next significant (M > 4) earthquake within the SGBDSN will be preceded by foreshocks.
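The simulated-catalog test described in this abstract can be sketched in a few lines: build a null catalog with uniformly distributed locations and Poisson (uniform-in-time) occurrence times, count "foreshock" pairs under a fixed space-time-magnitude rule, and compare that null distribution against the count from the real catalog. All parameters and window choices below are illustrative, not the SGBDSN values.

```python
import numpy as np

rng = np.random.default_rng(42)

def count_foreshocks(t, x, y, m, dt_days=3.0, dr_km=5.0):
    """Count events followed within dt_days and dr_km by a larger-magnitude
    event (illustrative foreshock criterion; windows are assumptions)."""
    n = 0
    for i in range(len(t)):
        later = (t > t[i]) & (t <= t[i] + dt_days)
        near = np.hypot(x - x[i], y - y[i]) <= dr_km
        if np.any(later & near & (m > m[i])):
            n += 1
    return n

def null_catalog(n_events=500, duration_days=5 * 365, box_km=100.0, b=1.0):
    """Null catalog: uniform locations, Poisson times, and Gutenberg-Richter
    magnitudes above M 0 (exponential in magnitude with b-value b)."""
    t = np.sort(rng.uniform(0.0, duration_days, n_events))
    x = rng.uniform(0.0, box_km, n_events)
    y = rng.uniform(0.0, box_km, n_events)
    m = rng.exponential(1.0 / (b * np.log(10)), n_events)
    return t, x, y, m

# Null distribution of foreshock counts from repeated simulations; an observed
# count well above this range would support genuine foreshock behavior.
null_counts = [count_foreshocks(*null_catalog()) for _ in range(20)]
```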

  14. Data Management and Site-Visit Monitoring of the Multi-Center Registry in the Korean Neonatal Network

    PubMed Central

    Choi, Chang Won

    2015-01-01

The Korean Neonatal Network (KNN), a nationwide prospective registry of very-low-birth-weight (VLBW, < 1,500 g at birth) infants, was launched in April 2013. Data management (DM) and site-visit monitoring (SVM) were crucial in ensuring the quality of the data collected from 55 participating hospitals across the country on 116 clinical variables. We describe the processes and results of DM and SVM performed during the establishment stage of the registry. The DM procedure included automated proof checks, electronic data validation, query creation, query resolution, and revalidation of the corrected data. SVM included SVM team organization, identification of unregistered cases, source document verification, and post-visit report production. By March 31, 2015, 4,063 VLBW infants had been registered and 1,693 queries had been produced; 1,629 of these queries were resolved and 64 remained unresolved. By November 28, 2014, 52 participating hospitals had been visited, with 136 site-visits completed since April 2013. Each participating hospital was visited biannually. DM and SVM were performed to ensure the quality of the data collected for the KNN registry. Our experience with DM and SVM can be applied to similar multi-center registries with large numbers of participating centers. PMID:26566353
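The DM loop described here (automated checks, query creation, resolution, revalidation) can be illustrated with a minimal range-check sketch. The field names and limits below are hypothetical, not the actual KNN case-report-form specification.

```python
# Hypothetical validation rules; field names and ranges are illustrative only.
RANGE_CHECKS = {
    "birth_weight_g": (200, 1499),    # VLBW registry: < 1,500 g at birth
    "gestational_age_wk": (20, 36),
}

def validate_record(record):
    """Return data queries for missing or out-of-range values.
    An empty list means the corrected record passes revalidation."""
    queries = []
    for field, (lo, hi) in RANGE_CHECKS.items():
        value = record.get(field)
        if value is None:
            queries.append(f"{field}: value missing")
        elif not lo <= value <= hi:
            queries.append(f"{field}: {value} outside [{lo}, {hi}]")
    return queries

# A record with an out-of-range weight generates one query for the site.
qs = validate_record({"birth_weight_g": 1600, "gestational_age_wk": 28})
```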

  15. Monitoring of fungal loads in seabird rehabilitation centers with comparisons to natural seabird environments in northern California.

    PubMed

    Burco, Julia D; Massey, J Gregory; Byrne, Barbara A; Tell, Lisa; Clemons, Karl V; Ziccardi, Michael H

    2014-03-01

Aspergillosis remains a major cause of mortality in captive and rehabilitated seabirds. To date, there has been poor documentation of fungal (particularly Aspergillus spp.) burdens in natural seabird loafing and roosting sites compared with fungal numbers in rehabilitation or captive settings and the various microenvironments that seabirds are exposed to during the rehabilitation process. This study compares the fungal, particularly Aspergillus spp., burdens potentially encountered by seabirds in natural and rehabilitation environments. Differences among the various microenvironments in the rehabilitation facility were evaluated to determine the risk of infection when seabirds are experiencing high stress and poor immune function. Aspergillus spp. counts were quantified in three wildlife rehabilitation centers and five natural seabird loafing and roosting sites in northern California using a handheld impact air sampler and a water filtration system. Air and water samples from select aquatic bird rehabilitation centers showed increased numbers of conidia of Aspergillus spp. and Aspergillus fumigatus compared with natural seabird environments in northern California, and several microenvironments in the rehabilitation facility had higher numbers of conidia of Aspergillus spp. These results suggest that periodic monitoring of the multiple local areas where birds spend time in a rehabilitation facility should be done to identify "high-risk" sites, where birds should spend minimal time, or sites that should be cleaned more frequently or given improved air flow to reduce exposure to fungal conidia. Overall, these results suggest that seabirds may be more likely to encounter Aspergillus spp. in various microenvironments in captivity than in their native habitats, which could increase their risk of developing disease when in a debilitated state. PMID:24712159

  16. Japanese earthquake predictability experiment with multiple runs before and after the 2011 Tohoku-oki earthquake

    NASA Astrophysics Data System (ADS)

    Hirata, N.; Tsuruoka, H.; Yokoi, S.

    2013-12-01

The current Japanese national earthquake prediction program emphasizes the importance of modeling as well as monitoring for the sound scientific development of earthquake prediction research. One major focus of the current program is to move toward creating testable earthquake forecast models. For this purpose, in 2009 we joined the Collaboratory for the Study of Earthquake Predictability (CSEP) and installed, through an international collaboration, the CSEP Testing Centre, an infrastructure to encourage researchers to develop testable models for Japan. We started the Japanese earthquake predictability experiment on November 1, 2009. The experiment consists of 12 categories, combining 4 testing classes with different time spans (1 day, 3 months, 1 year and 3 years) and 3 testing regions called 'All Japan,' 'Mainland,' and 'Kanto.' A total of 160 models, as of August 2013, have been submitted and are currently under the CSEP official suite of tests for evaluating forecast performance. We will present results of prospective forecasting and testing for periods before and after the 2011 Tohoku-oki earthquake. Because seismic activity changed dramatically after the 2011 event, model performance was strongly affected. In addition, because of completeness-magnitude problems in the authorized catalogue, most models did not pass the CSEP consistency tests. We will also discuss retrospective earthquake forecast experiments for aftershocks of the 2011 Tohoku-oki earthquake. Our aim is to describe what has turned out to be the first occasion for setting up a research environment for rigorous earthquake forecasting in Japan.

  17. Post-Sumatra Enhancements at the Pacific Tsunami Warning Center

    NASA Astrophysics Data System (ADS)

    McCreery, C.; Weinstein, S.; Becker, N.; Cessaro, R.; Hirshorn, B.; Fryer, G.; Hsu, V.; Sardina, V.; Koyanagi, S.; Shiro, B.; Wang, D.; Walsh, D.

    2007-12-01

Following the tragic Indian Ocean Tsunami of 2004, the Richard Hagemeyer Pacific Tsunami Warning Center (PTWC) has dramatically enhanced its capabilities. With improved communications, PTWC now ingests seismic data from almost all broadband stations of the Global Seismographic Network and will soon add many stations from the International Monitoring System. As data sources increase, PTWC's response time to any earthquake declines; for most earthquakes the center now issues an initial message in about 12 minutes, and with 24-hour staffing that performance is maintained around the clock. Direct measurement of tsunamis has been improved through communications upgrades to coastal tide gauges by NOAA and other collaborators in the Pacific Tsunami Warning System, and by the NOAA deployment of DART instruments throughout the world's oceans. In addition to providing warnings for the Pacific (with the exception of Alaska and the west coasts of the U.S. and Canada, which are the responsibility of the West Coast and Alaska Tsunami Warning Center), PTWC also operates as an interim warning center for the Indian Ocean (a task performed in collaboration with the Japan Meteorological Agency) and the Caribbean. PTWC also operates as a local warning center for the State of Hawaii, where the installation of new seismometers again means a continuous reduction in PTWC's response times. Initial assessments of local earthquakes are routinely accomplished in less than five minutes, and the first message for the Kiholo Bay Earthquake of 2006 was issued in only three minutes. With the development of the Hawaii Integrated Seismographic Network, in collaboration with the U.S. Geological Survey, the goal is to reduce the time for tsunami warnings to under two minutes for any earthquake in the Hawaiian Islands.

  18. Long-term monitoring of creep rate along the Hayward fault and evidence for a lasting creep response to 1989 Loma Prieta earthquake

    USGS Publications Warehouse

    Lienkaemper, J.J.; Galehouse, J.S.; Simpson, R.W.

    2001-01-01

    We present results from over 30 yr of precise surveys of creep along the Hayward fault. Along most of the fault, spatial variability in long-term creep rates is well determined by these data and can help constrain 3D-models of the depth of the creeping zone. However, creep at the south end of the fault stopped completely for more than 6 years after the M7 1989 Loma Prieta Earthquake (LPEQ), perhaps delayed by stress drop imposed by this event. With a decade of detailed data before LPEQ and a decade after it, we report that creep response to that event does indeed indicate the expected deficit in creep.

  19. The Seminole Serpent Warrior At Miramar, FL, Shows Settlement Locations Enabled Environmental Monitoring Reminiscent Of the Four-corners Kokopelli-like EMF Phenomena, and Related to Earthquakes, Tornados and Hurricanes.

    NASA Astrophysics Data System (ADS)

    Balam Matagamon, Chan; Pawa Matagamon, Sagamo

    2004-03-01

Certain Native Americans of the past seem to have correctly deduced that significant survival information for their tradition-respecting cultures resided in EMF-based phenomena that they were monitoring. This is based upon their myths and the place or cult-hero names they bequeathed us. The sites we have located in FL have been detectable by us visually, usually by faint blue light, or by the elicitation of pin-like prickings, by somewhat intense nervous-system response, by EMF interactions with aural electrochemical systems that can elicit tinnitus, and in other ways. In the northeast, Cautantowit served as a harbinger of Indian summer, and appears to be another alter ego of the EMF. The Miami, FL Tequesta site along the river clearly correlates with tornado, earthquake and hurricane locations. Sites like the Mohave Desert's giant man may have had similar significance.

  20. Tracking Earthquake Cascades

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2011-12-01

In assessing their risk to society, earthquakes are best characterized as cascades that can propagate from the natural environment into the socio-economic (built) environment. Strong earthquakes rarely occur as isolated events; they usually cluster in foreshock-mainshock-aftershock sequences, seismic swarms, and extended sequences of large earthquakes that propagate along major fault systems. These cascades are regulated by stress-mediated interactions among faults driven by tectonic loading. Within these cascades, each large event can itself cause a chain reaction in which the primary effects of faulting and ground shaking induce secondary effects, including tsunamis, landslides, and liquefaction, and set off destructive processes within the built environment, such as fires and radiation leakage from nuclear plants. Recent earthquakes have demonstrated how the socio-economic effects of large earthquakes can reverberate for many years. To reduce earthquake risk and improve the resiliency of communities to earthquake damage, society depends on five geotechnologies for tracking earthquake cascades: long-term probabilistic seismic hazard analysis (PSHA), short-term (operational) earthquake forecasting, earthquake early warning, tsunami warning, and the rapid production of post-event information for response and recovery (see figure). In this presentation, I describe how recent advances in earthquake system science are leading to improvements in this geotechnology pipeline. In particular, I will highlight the role of earthquake simulations in predicting strong ground motions and their secondary effects before and during earthquake cascades.

  1. Earthquake Education in Prime Time

    NASA Astrophysics Data System (ADS)

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in the spring 2001 season on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and the Department of Homeland Security's Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU), on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs.
We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and hazard response to create a program that is both educational and provides a public service. Seismic Sleuths and Written in Stone are the harbingers of a new genre of earthquake programs that are the antithesis of the 1974 film Earthquake and the 2004 miniseries 10.5. Film producers and those in the earthquake education community are demonstrating that it is possible to tell an exciting story, inspire awareness, and encourage empowerment without sensationalism.

  2. Increased Seismicity in the Tsaoling Reservoir Region After the 1999 Chi-Chi, Taiwan, Earthquake

    NASA Astrophysics Data System (ADS)

    Chang, K.; Chi, W.

    2006-12-01

The 1999 Mw7.6 Chi-Chi, Taiwan, earthquake triggered several large landslides. Among them, the Tsaoling landslide blocked the flow of the Ching-sui River. The stream water backed up behind the landslide deposit, forming a 4.6 million cubic-meter reservoir about 5 km long and 50 m deep. This reservoir was then filled by sediments about 4 years later, providing a rare opportunity to monitor possible reservoir-induced seismicity. From the earthquake catalog derived from the dense seismic network of the Central Weather Bureau of Taiwan, we selected 1666 earthquakes that occurred between the years 1997 and 2004 within a 10 km by 10 km rectangular region centered at 23.584763N and 120.661160E. We compared this catalog with another published catalog that relocated only the earthquakes in 1999 using the double-difference method. The double-difference catalog has fewer events, possibly due to the stricter criteria used in the relocation inversion; however, the overall results from the two catalogs are similar, suggesting that the catalog used for this study is of high quality. On average, only 0.6 earthquakes occurred per month prior to the Chi-Chi mainshock. In contrast, high seismicity, at an average rate of 27.8 events/month, persisted from the Chi-Chi mainshock until the reservoir was filled by sediments in 2003, after which the seismicity almost ceased for 3.5 weeks. Following that, the rainy season started, and the seismicity increased at a rate of 23.7 events/month. It is unusual for an Mw7.6 earthquake like the Chi-Chi earthquake to produce aftershocks continuously for more than 4 years, so we interpret some of these earthquakes as induced by the reservoir. There are 44 earthquakes shallower than 5 km located sparsely in this small region, but one cluster of about 16 earthquakes is located south of the reservoir.
Field mapping between 1999 and 2004 found that the river channel has cut through a shale unit overlying a south-dipping sandstone unit, providing a conduit for reservoir water to migrate to the south and possibly induce this cluster. Given the complicated geologic structures in this region, other vertical fluid conduits might have formed during the strong ground shaking of the Chi-Chi mainshock. In addition, during the five years after the Chi-Chi earthquake, shallow earthquakes in this region usually occurred during the rainy season. In sum, seismicity in the Tsaoling region increased after the Chi-Chi earthquake, and these earthquakes correlate spatially with the landslide-induced reservoir and temporally with precipitation. We interpret that some of these earthquakes were triggered by fluid-related processes.
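The before/after rate comparison in this abstract is simple catalog arithmetic: count events in a time window and divide by the window length in months. A minimal sketch, with synthetic origin times (in days relative to the mainshock) standing in for the CWB catalog:

```python
import numpy as np

def monthly_rate(event_days, t0, t1):
    """Average events per month (30.44-day months) for origin times in [t0, t1)."""
    e = np.asarray(event_days, dtype=float)
    n = int(((e >= t0) & (e < t1)).sum())
    return n / ((t1 - t0) / 30.44)

# Synthetic example tuned to reproduce the abstract's figures:
# 24 background events over ~40 months before the mainshock (day 0),
# 500 events over the ~18 months between the mainshock and reservoir infill.
pre = monthly_rate(np.linspace(-1200, -1, 24), -1200, 0)
post = monthly_rate(np.linspace(0, 547, 500), 0, 547.9)
```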

  3. Earthquakes: hydrogeochemical precursors

    USGS Publications Warehouse

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  4. Speeding earthquake disaster relief

    USGS Publications Warehouse

    Mortensen, Carl; Donlin, Carolyn; Page, Robert A.; Ward, Peter

    1995-01-01

    In coping with recent multibillion-dollar earthquake disasters, scientists and emergency managers have found new ways to speed and improve relief efforts. This progress is founded on the rapid availability of earthquake information from seismograph networks.

  5. Design and characterization of the beam monitor detectors of the Italian National Center of Oncological Hadron-therapy (CNAO)

    NASA Astrophysics Data System (ADS)

    Giordanengo, S.; Donetti, M.; Garella, M. A.; Marchetto, F.; Alampi, G.; Ansarinejad, A.; Monaco, V.; Mucchi, M.; Pecka, I. A.; Peroni, C.; Sacchi, R.; Scalise, M.; Tomba, C.; Cirio, R.

    2013-01-01

A new hadron-therapy facility implementing an active beam-scanning technique has been developed at the Italian National Center of Oncological Hadron-therapy (CNAO). This paper presents the design and characterization of the beam monitor detectors developed for the on-line monitoring and control of the dose delivered during a treatment at CNAO. The detectors are based on five parallel-plate transmission ionization chambers with either a single large electrode or electrodes segmented into 128 strips (strip chambers) or 32×32 pixels (pixel chamber). The detectors are arranged in two independent boxes with an active area larger than 200×200 mm2 and a total water-equivalent thickness along the beam path of about 0.9 mm. A custom 64-channel front-end chip converts the integrated ionization charge without dead time. The detectors were tested at the clinical proton beam facility of the Paul Scherrer Institut (PSI), which implements a spot-scanning technique, each spot being characterized by a predefined number of protons delivered with a pencil beam at a specified point of the irradiation field. The short-term instability, measured by delivering several identical spots within a time interval of a few tenths of seconds, is lower than 0.3%. The non-uniformity, measured by delivering sequences of spots at different points on the detector surface, is lower than 1% in the single-electrode chambers and lower than 1.5% in the strip and pixel chambers, reducing to less than 0.5% and 1%, respectively, in the restricted 100×100 mm2 central area of the detector.
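Instability and non-uniformity figures of this kind are relative spreads of repeated chamber readings. A minimal sketch of one plausible figure of merit (the paper's exact definition may differ):

```python
import numpy as np

def relative_spread_percent(readings):
    """Half the peak-to-peak spread of repeated readings, relative to the
    mean, in percent. One plausible instability/non-uniformity metric;
    not necessarily the definition used by the CNAO detector paper."""
    r = np.asarray(readings, dtype=float)
    return 100.0 * (r.max() - r.min()) / (2.0 * r.mean())

# e.g. identical spots read back as 99.5, 100.0, 100.5 -> 0.5% spread
spread = relative_spread_percent([99.5, 100.0, 100.5])
```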

  6. Lawrence Livermore National Laboratory earthquake safety program

    SciTech Connect

    Freeland, G.E.

    1984-08-21

    Within three minutes on the morning of January 24, 1980, an earthquake and three aftershocks, with Richter magnitudes of 5.8, 5.1, 4.0, and 4.2, respectively, struck the Livermore Valley. Two days later, a Richter magnitude 5.4 earthquake occurred, which had its epicenter about 4 miles northwest of the Lawrence Livermore National Laboratory (LLNL). Although no one at the Laboratory was seriously injured, these earthquakes caused considerable damage and disruption. Masonry and concrete structures cracked and broke, trailers shifted and fell off their pedestals, office ceilings and overhead lighting fell, and bookcases overturned. We suddenly found ourselves immersed in a site-wide program of repairing earthquake-damaged facilities, and protecting our many employees and the surrounding community from future earthquakes. Over the past four years, LLNL has spent approximately $10 million on its earthquake restoration effort for repairs and upgrades. The discussion in this paper centers upon the earthquake damage that occurred, our clean-up and restoration efforts, the seismic review of LLNL facilities, our site-specific seismic design criteria, computer-floor upgrades, ceiling-system upgrades, unique building seismic upgrades, geologic and seismologic studies, and seismic instrumentation. 9 references, 37 figures, 2 tables.

  7. Center for Integration of Natural Disaster Information

    USGS Publications Warehouse

    U.S. Geological Survey

    2001-01-01

    The U.S. Geological Survey's Center for Integration of Natural Disaster Information (CINDI) is a research and operational facility that explores methods for collecting, integrating, and communicating information about the risks posed by natural hazards and the effects of natural disasters. The U.S. Geological Survey (USGS) is mandated by the Robert Stafford Act to warn citizens of impending landslides, volcanic eruptions, and earthquakes. The USGS also coordinates with other Federal, State, and local disaster agencies to monitor threats to communities from floods, coastal storms, wildfires, geomagnetic storms, drought, and outbreaks of disease in wildlife populations.

  8. Real-Time Earthquake Analysis for Disaster Mitigation (READI) Network

    NASA Astrophysics Data System (ADS)

    Bock, Y.

    2014-12-01

    Real-time GNSS networks are making a significant impact on our ability to forecast, assess, and mitigate the effects of geological hazards. I describe the activities of the Real-time Earthquake Analysis for Disaster Mitigation (READI) working group. The group leverages 600+ real-time GPS stations in western North America operated by UNAVCO (PBO network), Central Washington University (PANGA), US Geological Survey & Scripps Institution of Oceanography (SCIGN project), UC Berkeley & US Geological Survey (BARD network), and the Pacific Geosciences Centre (WCDA project). Our goal is to demonstrate an earthquake and tsunami early warning system for western North America. Rapid response is particularly important for those coastal communities that are in the near-source region of large earthquakes and may have only minutes of warning time, and who today are not adequately covered by existing seismic and basin-wide ocean-buoy monitoring systems. The READI working group is performing comparisons of independent real time analyses of 1 Hz GPS data for station displacements and is participating in government-sponsored earthquake and tsunami exercises in the Western U.S. I describe a prototype seismogeodetic system using a cluster of southern California stations that includes GNSS tracking and collocation with MEMS accelerometers for real-time estimation of seismic velocity and displacement waveforms, which has advantages for improved earthquake early warning and tsunami forecasts compared to seismic-only or GPS-only methods. The READI working group's ultimate goal is to participate in an Indo-Pacific Tsunami early warning system that utilizes GNSS real-time displacements and ionospheric measurements along with seismic, near-shore buoys and ocean-bottom pressure sensors, where available, to rapidly estimate magnitude and finite fault slip models for large earthquakes, and then forecast tsunami source, energy scale, geographic extent, inundation and runup. 
This will require cooperation with other real-time efforts around the Pacific Rim in terms of data sharing, analysis centers, and advisory bulletins to the responsible government agencies. The IAG's Global Geodetic Observing System (GGOS), in particular its natural hazards theme, provides a natural umbrella for achieving this objective.
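One common way to combine collocated GNSS and accelerometer streams in seismogeodetic systems of this kind is a Kalman filter whose state is displacement and velocity: the high-rate accelerometer drives the prediction step and the lower-rate GPS displacement supplies the measurement update. The filter below is a generic textbook sketch with illustrative noise parameters, not the READI implementation.

```python
import numpy as np

def seismogeodetic_kf(acc, gps, dt=0.01, gps_dt=1.0, q=1e-3, r=1e-4):
    """Kalman filter over state [displacement, velocity]; acc is the 100-Hz
    accelerometer series, gps the 1-Hz displacement series. q and r are
    illustrative process/measurement noise levels, not calibrated values."""
    x = np.zeros(2)
    P = np.eye(2)
    F = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.array([0.5 * dt**2, dt])
    H = np.array([[1.0, 0.0]])
    Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
    step = int(round(gps_dt / dt))
    disp = []
    for k, a in enumerate(acc):
        x = F @ x + B * a                      # predict with accelerometer
        P = F @ P @ F.T + Q
        if k % step == 0 and k // step < len(gps):
            S = H @ P @ H.T + r                # update with GPS displacement
            K = (P @ H.T) / S
            x = x + (K * (gps[k // step] - H @ x)).ravel()
            P = (np.eye(2) - K @ H) @ P
        disp.append(x[0])
    return np.array(disp)

# sanity check: zero acceleration and zero GPS displacement stay at zero
d = seismogeodetic_kf(np.zeros(200), np.zeros(2))
```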

  9. Redefining Earthquakes and the Earthquake Machine

    ERIC Educational Resources Information Center

    Hubenthal, Michael; Braile, Larry; Taber, John

    2008-01-01

    The Earthquake Machine (EML), a mechanical model of stick-slip fault systems, can increase student engagement and facilitate opportunities to participate in the scientific process. This article introduces the EML model and an activity that challenges ninth-grade students' misconceptions about earthquakes. The activity emphasizes the role of models…

  11. Earthquake Prediction and Forecasting

    NASA Astrophysics Data System (ADS)

    Jackson, David D.

    Prospects for earthquake prediction and forecasting, and even their definitions, are actively debated. Here, "forecasting" means estimating the future earthquake rate as a function of location, time, and magnitude. Forecasting becomes "prediction" when we identify special conditions that make the immediate probability much higher than usual and high enough to justify exceptional action. Proposed precursors run from aeronomy to zoology, but no identified phenomenon consistently precedes earthquakes. The reported prediction of the 1975 Haicheng, China earthquake is often proclaimed as the most successful, but the success is questionable. An earthquake predicted to occur near Parkfield, California in 1988±5 years has not happened. Why is prediction so hard? Earthquakes start in a tiny volume deep within an opaque medium; we do not know their boundary conditions, initial conditions, or material properties well; and earthquake precursors, if any, hide amongst unrelated anomalies. Earthquakes cluster in space and time, and following a quake, earthquake probability spikes. Aftershocks illustrate this clustering, and later earthquakes may even surpass earlier ones in size. However, the main shock in a cluster usually comes first and causes the most damage. Specific models help reveal the physics and allow intelligent disaster response. Modeling stresses from past earthquakes may improve forecasts, but this approach has not yet been validated prospectively. Reliable prediction of individual quakes is not realistic in the foreseeable future, but probabilistic forecasting provides valuable information for reducing risk. Recent studies are also leading to exciting discoveries about earthquakes.
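
    The clustering behaviour noted above, a spike in earthquake probability immediately after a quake followed by decay, is conventionally described by the modified Omori (Omori-Utsu) law. A minimal sketch, with illustrative parameter values rather than values fit to any catalog:

```python
def omori_rate(t, K=100.0, c=0.1, p=1.1):
    """Modified Omori (Omori-Utsu) aftershock rate n(t) = K / (c + t)**p,
    with t in days since the mainshock.  K scales with mainshock
    productivity; c and p control the early plateau and decay speed."""
    return K / (c + t) ** p

# The rate is highest right after the mainshock and decays roughly as 1/t,
# which is why short-term forecast probabilities spike and then relax.
```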

  12. Earthquake and Schools. [Videotape].

    ERIC Educational Resources Information Center

    Federal Emergency Management Agency, Washington, DC.

    Designing schools to make them more earthquake resistant and protect children from the catastrophic collapse of the school building is discussed in this videotape. It reveals that 44 of the 50 U.S. states are vulnerable to earthquakes, but most schools are structurally unprepared to take on the stresses that earthquakes exert. The cost to the…

  13. Children's Ideas about Earthquakes

    ERIC Educational Resources Information Center

    Simsek, Canan Lacin

    2007-01-01

    Earthquakes, as natural disasters, are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced correspondingly. In particular, good training about earthquakes in primary schools is considered…


  14. A PDA-based dietary self-monitoring intervention to reduce sodium intake in an in-center hemodialysis patient

    PubMed Central

    Sevick, Mary Ann; Stone, Roslyn A; Novak, Matthew; Piraino, Beth; Snetselaar, Linda; Marsh, Rita M; Hall, Beth; Lash, Heather; Bernardini, Judith; Burke, Lora E

    2008-01-01

    Objective The purpose of the BalanceWise-hemodialysis study is to determine the efficacy of a dietary intervention to reduce dietary sodium intake in patients receiving maintenance, in-center hemodialysis (HD). Personal digital assistant (PDA)-based dietary self-monitoring is paired with behavioral counseling. The purpose of this report is to present a case study of one participant's progression through the intervention. Methods The PDA was individually programmed with the nutritional requirements of the participant. With 25 minutes of personalized instruction, the participant was able to enter his meals into the PDA using BalanceLog® software. Nutritional counseling was provided based on dietary sodium intake reports generated by BalanceLog®. Results At initiation of the study the participant required 4 HD treatments per week. The participant entered 342 meals over 16 weeks (~3 meals per day). BalanceLog® revealed that the participant consumed restaurant/fast food on a regular basis, and consumed significant amounts of corned beef as well as canned foods high in sodium. The study dietitian worked with the participant and his wife to identify food alternatives lower in sodium. Baseline sodium consumption was 4,692 mg, and decreased at a rate of 192 mg/week on average. After 11 weeks of intervention, interdialytic weight gains were reduced sufficiently to permit the participant to reduce HD treatments from 4 to 3 per week. Because of a low serum albumin at baseline (2.9 g/dL), the study dietitian encouraged the participant to increase his intake of high-quality protein. The serum albumin level at 16 weeks was unchanged (2.9 g/dL). Because of intense pruritus and a high baseline serum phosphorus (6.5 mg/dL), BalanceLog® electronic logs were reviewed to identify sources of dietary phosphorus and counsel the participant regarding food alternatives. At 16 weeks the participant's serum phosphorus fell to 5.5 mg/dL. Conclusions Self-monitoring rates were excellent. In an HD patient who was willing to self-monitor his dietary intake, BalanceLog® allowed the dietitian to target problematic foods and provide counseling that appeared to be effective in reducing sodium intake, reducing interdialytic weight gain, and alleviating hyperphosphatemia and hyperkalemia. Additional research is needed to evaluate the efficacy of the intervention. PMID:19920960

  15. Operational earthquake forecasting can enhance earthquake preparedness

    USGS Publications Warehouse

    Jordan, T.H.; Marzocchi, W.; Michael, A.J.; Gerstenberger, M.C.

    2014-01-01

    We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time-dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard—ground-motion exceedance probabilities as well as short-term rupture probabilities—in concert with the long-term forecasts of probabilistic seismic-hazard analysis (PSHA).

  16. Earthquake Warning Performance in Vallejo for the South Napa Earthquake

    NASA Astrophysics Data System (ADS)

    Wurman, G.; Price, M.

    2014-12-01

    In 2002 and 2003, Seismic Warning Systems, Inc. installed first-generation QuakeGuard™ earthquake warning devices at all eight fire stations in Vallejo, CA. These devices are designed to detect the P-wave of an earthquake and initiate predetermined protective actions if the impending shaking is estimated at approximately Modified Mercalli Intensity V or greater. At the Vallejo fire stations the devices were set up to sound an audio alert over the public address system and to command the equipment bay doors to open. In August 2014, after more than 11 years of operating in the fire stations with no false alarms, the five units that were still in use triggered correctly on the MW 6.0 South Napa earthquake, less than 16 km away. The audio alert sounded in all five stations, providing fire fighters with 1.5 to 2.5 seconds of warning before the arrival of the S-wave, and the equipment bay doors opened in three of the stations. In one station the doors were disconnected from the QuakeGuard device, and another station lost power before the doors opened completely. These problems highlight just a small portion of the complexity associated with realizing actionable earthquake warnings. The issues experienced in this earthquake have already been addressed in subsequent QuakeGuard product generations, with downstream connection monitoring and backup power for critical systems. The fact that the fire fighters in Vallejo were afforded even two seconds of warning at these epicentral distances results from the design of the QuakeGuard devices, which focuses on rapid false positive rejection and ground motion estimates. We discuss the performance of the ground motion estimation algorithms, with an emphasis on the accuracy and timeliness of the estimates at close epicentral distances.

  17. Supercomputing meets seismology in earthquake exhibit

    ScienceCinema

    Blackwell, Matt; Rodger, Arthur; Kennedy, Tom

    2014-07-22

    When the California Academy of Sciences created the "Earthquake: Evidence of a Restless Planet" exhibit, they called on Lawrence Livermore to help combine seismic research with the latest data-driven visualization techniques. The outcome is a series of striking visualizations of earthquakes, tsunamis and tectonic plate evolution. Seismic-wave research is a core competency at Livermore. While most often associated with earthquakes, the research has many other applications of national interest, such as nuclear explosion monitoring, explosion forensics, energy exploration, and seismic acoustics. For the Academy effort, Livermore researchers simulated the San Andreas and Hayward fault events at high resolutions. Such calculations require significant computational resources. To simulate the 1906 earthquake, for instance, visualizing 125 seconds of ground motion required over 1 billion grid points, 10,000 time steps, and 7.5 hours of processor time on 2,048 cores of Livermore's Sierra machine.

  18. Supercomputing meets seismology in earthquake exhibit

    SciTech Connect

    Blackwell, Matt; Rodger, Arthur; Kennedy, Tom

    2013-10-03

    When the California Academy of Sciences created the "Earthquake: Evidence of a Restless Planet" exhibit, they called on Lawrence Livermore to help combine seismic research with the latest data-driven visualization techniques. The outcome is a series of striking visualizations of earthquakes, tsunamis and tectonic plate evolution. Seismic-wave research is a core competency at Livermore. While most often associated with earthquakes, the research has many other applications of national interest, such as nuclear explosion monitoring, explosion forensics, energy exploration, and seismic acoustics. For the Academy effort, Livermore researchers simulated the San Andreas and Hayward fault events at high resolutions. Such calculations require significant computational resources. To simulate the 1906 earthquake, for instance, visualizing 125 seconds of ground motion required over 1 billion grid points, 10,000 time steps, and 7.5 hours of processor time on 2,048 cores of Livermore's Sierra machine.
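
    The figures quoted above support a quick consistency check. A hedged back-of-envelope using only the numbers stated in the abstract (1 billion grid points, 10,000 time steps, 125 s of simulated motion, 2,048 cores, 7.5 hours of wall time):

```python
# Figures as stated in the abstract.
grid_points = 1.0e9
time_steps = 10_000
simulated_s = 125.0
cores = 2048
wall_hours = 7.5

# Implied simulated time step: 125 s / 10,000 steps = 12.5 ms per step.
dt_ms = simulated_s / time_steps * 1e3

# Total grid-point updates and the implied update rate per core-second.
updates = grid_points * time_steps
per_core_s = updates / (cores * wall_hours * 3600.0)
```

    The implied throughput, a few hundred thousand grid-point updates per core-second, is a plausible order of magnitude for a finite-difference wave-propagation code, though the abstract does not report this figure directly.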

  19. Precursory signals around epicenters and local active faults prior to inland or coastal earthquakes

    NASA Astrophysics Data System (ADS)

    Valizadeh Alvan, Habibeh

    Although earthquakes are still considered unpredictable phenomena, scientific efforts during the past decade have revealed pronounced changes in the quality and quantity of some materials and natural phenomena on and above the earth's surface taking place before strong shaking. Pre-earthquake physical and chemical interactions in the ground may cause anomalies in temperature, surface latent heat flux (SLHF), relative humidity, upwelling index, and chlorophyll-a (Chl-a) concentration at the ground or sea surface. Earthquakes are triggered when the energy accumulated in rocks is released, causing rupture along faults. The main purpose of this study is to explore and demonstrate the possibility of changes in surface temperature or latent heat flux before, during, and after earthquakes. We expect that variations in these factors are accompanied by increases in Chl-a concentration at the sea surface and by upwelling events prior to coastal earthquakes. For monitoring changes in surface temperature we used NOAA-AVHRR data and microwave radiometers such as AMSR-E/Aqua. SLHF data and upwelling indices are provided by the National Centers for Environmental Prediction (NCEP) Reanalysis Project and the Pacific Fisheries Environmental Laboratory (PFEL), respectively. Chl-a concentration is available from the MODIS website. Our detailed analyses show significant increases in SLHF and in upwelling of nutrient-rich water prior to the main events, which we attribute to the rise in surface temperature and Chl-a concentration at that time. Meaningful increases in temperature, relative humidity, and SLHF variations from weeks before the earthquakes are revealed in epicentral areas and along local active faults. In addition, considerable anomalies in Chl-a concentration are attributed to the rise in upwelling index.

  20. Intracranial Pressure Monitoring in Severe Traumatic Brain Injury in Latin America: Process and Methods for a Multi-Center Randomized Controlled Trial

    PubMed Central

    Lujan, Silvia; Dikmen, Sureyya; Temkin, Nancy; Petroni, Gustavo; Pridgeon, Jim; Barber, Jason; Machamer, Joan; Cherner, Mariana; Chaddock, Kelley; Hendrix, Terence; Rondina, Carlos; Videtta, Walter; Celix, Juanita M.; Chesnut, Randall

    2012-01-01

    Abstract In patients with severe traumatic brain injury (TBI), the influence on important outcomes of the use of information from intracranial pressure (ICP) monitoring to direct treatment has never been tested in a randomized controlled trial (RCT). We are conducting an RCT in six trauma centers in Latin America to test this question. We hypothesize that patients randomized to ICP monitoring will have lower mortality and better outcomes at 6-months post-trauma than patients treated without ICP monitoring. We selected three centers in Bolivia to participate in the trial, based on (1) the absence of ICP monitoring, (2) adequate patient accession and data collection during the pilot phase, (3) preliminary institutional review board approval, and (4) the presence of equipoise about the value of ICP monitoring. We conducted extensive training of site personnel, and initiated the trial on September 1, 2008. Subsequently, we included three additional centers. A total of 176 patients were entered into the trial as of August 31, 2010. Current enrollment is 81% of that expected. The trial is expected to reach its enrollment goal of 324 patients by September of 2011. We are conducting a high-quality RCT to answer a question that is important globally. In addition, we are establishing the capacity to conduct strong research in Latin America, where TBI is a serious epidemic. Finally, we are demonstrating the feasibility and utility of international collaborations that share resources and unique patient populations to conduct strong research about global public health concerns. PMID:22435793

  1. Retrospective stress-forecasting of earthquakes

    NASA Astrophysics Data System (ADS)

    Gao, Yuan; Crampin, Stuart

    2015-04-01

    Observations of changes in azimuthally varying shear-wave splitting (SWS) above swarms of small earthquakes monitor stress-induced changes to the stress-aligned vertical microcracks pervading the upper crust, lower crust, and uppermost ~400 km of the mantle. (The microcracks are intergranular films of hydrolysed melt in the mantle.) Earthquakes release stress, and an appropriate amount of stress for the relevant magnitude must accumulate before each event. Iceland is on an extension of the Mid-Atlantic Ridge, where, uniquely, two transform zones run onshore. These onshore transform zones provide semi-continuous swarms of small earthquakes, making Iceland the only place worldwide where SWS can be routinely monitored. Elsewhere SWS must be monitored above temporally active, occasional swarms of small earthquakes, or in infrequent SKS and other teleseismic reflections from the mantle. Observations of changes in SWS time-delays are attributed to stress-induced changes in crack aspect-ratios, allowing stress-accumulation and stress-relaxation to be identified. Monitoring SWS in SW Iceland in 1998, stress-accumulation before an impending earthquake was recognised and emails were exchanged between the University of Edinburgh (EU) and the Iceland Meteorological Office (IMO). On 10th November 1998, EU emailed IMO that a M5 earthquake could occur soon on a seismically-active fault plane where seismicity was still continuing following a M5.1 earthquake six months earlier. Three days later, IMO emailed EU that a M5 earthquake had just occurred on the specified fault-plane. We suggest this is a successful earthquake stress-forecast, where we refer to the procedure as stress-forecasting earthquakes, as opposed to predicting or forecasting, to emphasise the different formalism.
Lack of funds has prevented us from monitoring SWS on Iceland seismograms; however, we have identified similar characteristic behaviour of SWS time-delays above swarms of small earthquakes, which has enabled us to retrospectively stress-forecast ~17 earthquakes ranging in magnitude from a M1.7 swarm event in N Iceland, to the 1999 M7.7 Chi-Chi earthquake in Taiwan, and the 2004 Mw9.2 Sumatra-Andaman Earthquake (SAE). Before SAE, the changes in SWS were observed at seismic stations in Iceland, at a distance of ~10,500 km (the width of the Eurasian Plate) from Indonesia, demonstrating the 'butterfly wings' sensitivity of the New Geophysics of a critically microcracked Earth. At that time, the sensitivity of the phenomena had not been recognised, and the SAE was not stress-forecast. These results have been published at various times, in various formats, in various journals. This presentation displays all the results in a normalised format that allows the similarities to be recognised, confirming that observations of SWS time-delays can stress-forecast the times, magnitudes, and in some circumstances fault-breaks, of impending earthquakes. Papers referring to these developments can be found at geos.ed.ac.uk/home/scrampin/opinion. Also see abstracts in EGU2015 Sessions: Crampin & Gao (SM1.1), Liu & Crampin (NH2.5), and Crampin & Gao (GD.1).
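
    Identifying stress-accumulation from increasing SWS time-delays is, at its simplest, trend estimation on a time series of delay measurements. A hedged sketch with synthetic data (the actual analysis, interpreting changes in crack aspect-ratios, is far more involved):

```python
import numpy as np

def delay_trend(days, delays_ms):
    """Least-squares slope of SWS time-delays in ms/day.  A sustained
    positive slope is the signature read as stress-accumulation; a
    sharp drop afterwards would mark stress-relaxation."""
    slope, _intercept = np.polyfit(days, delays_ms, 1)
    return slope
```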

  2. Development of a telecare system based on ZigBee mesh network for monitoring blood pressure of patients with hemodialysis in health care centers.

    PubMed

    Du, Yi-Chun; Lee, You-Yun; Lu, Yun-Yuan; Lin, Chia-Hung; Wu, Ming-Jei; Chen, Chung-Lin; Chen, Tainsong

    2011-10-01

    In Taiwan, the number of patients needing dialysis has increased rapidly in recent years. Because every hemodialysis session carries risk, physiological status must be monitored during the roughly 4-hour hemodialysis process, including blood pressure measurements every 30 minutes to 1 hour. An assisted blood pressure measurement system is therefore needed in dialysis care centers. Telecare systems (TCS) are regarded as an important technique in medical care. In this study, we utilized the ZigBee wireless technique to establish a mesh network that monitors blood pressure automatically and stores the data in a medical record system for display and further analysis. Moreover, when the blood pressure exceeds the normal range, the system can immediately send a warning signal to remind, or inform, the relatives and clinicians in the health care center through the personal handy-phone system (PHS). The proposed system provides an assisted device for monitoring patients' blood pressure during the hemodialysis process and saves medical manpower. PMID:20703683
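
    The alerting behaviour described, warning when a blood pressure reading falls outside the normal range, reduces to a simple range check at the monitoring node. The sketch below is illustrative only; the thresholds, field names, and alert wording are assumptions, not values from the paper:

```python
from dataclasses import dataclass

@dataclass
class BPReading:
    patient_id: str
    systolic: int    # mmHg
    diastolic: int   # mmHg

# Illustrative limits only; real clinical thresholds are patient-specific.
LIMITS = {"systolic": (90, 180), "diastolic": (60, 110)}

def check_reading(r: BPReading) -> list[str]:
    """Return one alert message per value outside the configured range;
    an empty list means no warning needs to be sent."""
    alerts = []
    for name, value in (("systolic", r.systolic), ("diastolic", r.diastolic)):
        lo, hi = LIMITS[name]
        if not lo <= value <= hi:
            alerts.append(f"{r.patient_id}: {name} {value} mmHg outside {lo}-{hi}")
    return alerts
```

    In the described system, a non-empty result would trigger the PHS notification to clinicians and relatives; here it is simply returned to the caller.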

  3. Engaging Students in Earthquake Science

    NASA Astrophysics Data System (ADS)

    Cooper, I. E.; Benthien, M.

    2004-12-01

    The Southern California Earthquake Center Communication, Education, and Outreach program (SCEC CEO) has been collaborating with the University of Southern California (USC) Joint Education Project (JEP) and the Education Consortium of Central Los Angeles (ECCLA) to work directly with the teachers and schools in the local community around USC. The community surrounding USC is 57% Hispanic (US Census, 2000) and 21% African American (US Census, 2000). Through the partnership with ECCLA, SCEC has created a three-week enrichment intersession program, targeting disadvantaged students at the fourth/fifth grade level, dedicated entirely to earthquakes. SCEC builds partnerships with the intersession teachers, working together to actively engage the students in learning about earthquakes. SCEC provides a support system for the teachers, supplying them with the necessary content background as well as classroom manipulatives. SCEC goes into the classrooms with guest speakers and takes the students out of the classroom on two field trips. There are four intersession programs each year. SCEC is also working with USC's Joint Education Project program. The JEP program has been recognized as one of the "oldest and best organized" Service-Learning programs in the country (TIME Magazine and the Princeton Review, 2000). Through this partnership SCEC is providing USC students with the necessary tools to go out to the local schools and teach students of all grade levels about earthquakes. SCEC works with the USC students to design engaging lesson plans that effectively convey content regarding earthquakes. USC students can check out hands-on/interactive materials to use in the classrooms from the SCEC Resource Library. In both these endeavors SCEC has expanded its outreach to the local community. SCEC is reaching over 200 minority children each year through these partnerships, and this number will increase as the programs grow.

  4. Crowdsourced earthquake early warning.

    PubMed

    Minson, Sarah E; Brooks, Benjamin A; Glennie, Craig L; Murray, Jessica R; Langbein, John O; Owen, Susan E; Heaton, Thomas H; Iannucci, Robert A; Hauser, Darren L

    2015-04-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California's Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing. PMID:26601167

  5. Crowdsourced earthquake early warning

    PubMed Central

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing. PMID:26601167
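
    The abstract does not spell out a detection algorithm, but a classic way to flag shaking onset on a noisy consumer-grade accelerometer is a short-term/long-term average (STA/LTA) trigger. The causal sketch below is a generic illustration under that assumption, not the authors' method:

```python
import numpy as np

def sta_lta(x, n_sta, n_lta):
    """Causal STA/LTA ratio of |x|: the mean over the last n_sta samples
    divided by the mean over the last n_lta samples.  Ratios well above 1
    flag an amplitude onset such as a P-wave arrival."""
    a = np.abs(x)
    c = np.concatenate([[0.0], np.cumsum(a)])   # prefix sums for fast windows
    ratio = np.empty(len(a))
    for i in range(len(a)):
        s0 = max(0, i + 1 - n_sta)
        l0 = max(0, i + 1 - n_lta)
        sta = (c[i + 1] - c[s0]) / (i + 1 - s0)
        lta = (c[i + 1] - c[l0]) / (i + 1 - l0)
        ratio[i] = sta / max(lta, 1e-12)        # guard against division by zero
    return ratio
```

    A crowdsourced system would additionally require aggregating many such triggers across devices to reject the false positives that any single phone (dropped, bumped, in a moving car) will produce.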

  6. The Geology of Earthquakes

    NASA Astrophysics Data System (ADS)

    Wallace, Robert E.

    The Geology of Earthquakes is a major contribution that brings together under one cover the many and complex elements of geology that are fundamental to earthquakes and seismology. Here are described and analyzed the basic causes of earthquakes, the resulting effects of earthquakes and faulting on the surface of the Earth, techniques of analyzing these effects, and engineering and public policy considerations for earthquake hazard mitigation. The three authors have played major roles in developing the fundamentals in both scientific and policy matters; thus they speak with an authority that few others could.

  7. Combining Real-time Seismic and Geodetic Data to Improve Rapid Earthquake Information

    NASA Astrophysics Data System (ADS)

    Murray, M. H.; Neuhauser, D. S.; Gee, L. S.; Dreger, D. S.; Basset, A.; Romanowicz, B.

    2002-12-01

    The Berkeley Seismological Laboratory operates seismic and geodetic stations in the San Francisco Bay area and northern California for earthquake and deformation monitoring. The seismic systems, part of the Berkeley Digital Seismic Network (BDSN), include strong motion and broadband sensors, and 24-bit dataloggers. The data from 20 GPS stations, part of the Bay Area Regional Deformation (BARD) network of more than 70 stations in northern California, are acquired in real-time. We have developed methods to acquire GPS data at 12 stations that are collocated with the seismic systems using the seismic dataloggers, which have large on-site data buffer and storage capabilities, merge it with the seismic data stream in MiniSeed format, and continuously stream both data types using reliable frame relay and/or radio modem telemetry. Currently, the seismic data are incorporated into the Rapid Earthquake Data Integration (REDI) project to provide notification of earthquake magnitude, location, moment tensor, and strong motion information for hazard mitigation and emergency response activities. The geodetic measurements can provide complementary constraints on earthquake faulting, including the location and extent of the rupture plane, unambiguous resolution of the nodal plane, and distribution of slip on the fault plane, which can be used, for example, to refine strong motion shake maps. We are developing methods to rapidly process the geodetic data to monitor transient deformation, such as coseismic station displacements, and for combining this information with the seismic observations to improve finite-fault characterization of large earthquakes. The GPS data are currently processed at hourly intervals with 2-cm precision in horizontal position, and we are beginning a pilot project in the Bay Area in collaboration with the California Spatial Reference Center to do epoch-by-epoch processing with greater precision.
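
    One minimal way to obtain the coseismic station displacements mentioned above is to difference mean positions over windows before and after the event time. This sketch is a deliberate simplification of real-time geodetic processing; the window length and inputs are illustrative:

```python
import numpy as np

def coseismic_offset(positions, t, t_eq, window=30.0):
    """Estimate a static coseismic offset for one position component
    (e.g. east, in m) as the mean over a window after the event minus
    the mean over a window before it.  `positions` and `t` are 1-D
    arrays; `t_eq` is the event epoch in the same time units as `t`."""
    pre = positions[(t >= t_eq - window) & (t < t_eq)]
    post = positions[(t > t_eq) & (t <= t_eq + window)]
    return post.mean() - pre.mean()
```

    In practice the windows must avoid the dynamic shaking immediately after origin time, and formal uncertainties from the position solutions would weight the averages.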

  8. ElarmS Earthquake Early Warning System Enhancements and Performance Evaluation

    NASA Astrophysics Data System (ADS)

    Henson, I. H.; Neuhauser, D. S.; Allen, R. M.

    2013-12-01

    ElarmS is an earthquake early warning system that contributes alerts to CISN ShakeAlert, a prototype end-to-end earthquake early warning system being developed and tested by the California Integrated Seismic Network (CISN). ElarmS is one of several systems based on independent methodologies that contribute to CISN ShakeAlert. The UC Berkeley ElarmS system consists of multiple continuous-waveform processors and trigger-association processors running at three geographical locations and communicating via the Apache ActiveMQ messaging system. Recent enhancements to the ElarmS system include reductions in trigger report times, reductions in trigger association and event alert times, and the development and testing of redundant processing and communication architectures. To enable redundant processing, ElarmS trigger associators handle duplicate trigger information arriving from duplicate waveform processors via different transmission paths. We have developed performance monitoring tools that report system component latencies and earthquake hypocenter parameter accuracy. Statistics for hypocenter and origin-time accuracy and alert latencies can be computed for different time periods, magnitude ranges, and geographic regions. Individual earthquake processing histories can be examined, including details of the transmission and processing latencies associated with individual P-wave triggers, from arrival detection through several stages of processing to the association with an individual earthquake alert. Detailed event information includes latencies associated with the transmission of individual waveform packets from station to processing centers, waveform processing queues, trigger message queues, trigger message transmissions, trigger association, and hypocenter location CPU times. Changes to the ElarmS algorithm and system architecture are frequently tested by running multiple versions of ElarmS simultaneously.
A web browser interface to the performance monitoring tools includes tabular, mapping, and statistical analysis graphical components (generated by the R-Statistics System) that make it easy to compare different development versions of ElarmS.
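
    Computing statistics "for different time periods, magnitude ranges and geographic regions" amounts to filtering events into groups and summarizing their latencies. A hypothetical sketch of the magnitude-range case (the event tuples are invented, not ElarmS data):

```python
from statistics import quantiles

# (magnitude, alert latency in seconds) -- illustrative values only.
events = [(3.2, 6.1), (4.1, 5.0), (4.8, 4.2), (5.5, 3.9), (3.7, 7.3), (4.4, 5.8)]

def latency_stats(events, m_lo, m_hi):
    """Median and 90th-percentile alert latency for events with
    m_lo <= M < m_hi.  Requires at least two events in the bin."""
    lat = sorted(l for m, l in events if m_lo <= m < m_hi)
    qs = quantiles(lat, n=10, method="inclusive")   # deciles of the bin
    return lat[len(lat) // 2], qs[8]                # median, p90
```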

  9. Associating an ionospheric parameter with major earthquake occurrence throughout the world

    NASA Astrophysics Data System (ADS)

    Ghosh, D.; Midya, S. K.

    2014-02-01

    With time, ionospheric variation analysis is gaining ground over lithospheric monitoring in providing precursors for earthquake forecasting. The current paper highlights the association of major (Ms ≥ 6.0) and medium (4.0 ≤ Ms < 6.0) earthquake occurrences throughout the world with different ranges of the Ionospheric Earthquake Parameter (IEP), where 'Ms' is earthquake magnitude on the Richter scale. From statistical and graphical analyses, it is concluded that the probability of earthquake occurrence is maximum when the defined parameter lies within the range of 0-75 (lower range). In the higher ranges, earthquake occurrence probability gradually decreases. A probable explanation is also suggested.
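
    The analysis above bins earthquake counts by IEP range (0-75 versus progressively higher ranges). A minimal sketch of such binning, with invented IEP values standing in for the paper's data:

```python
def bin_counts(values, edges):
    """Count how many values fall in each half-open bin
    [edges[i], edges[i+1]); values outside all bins are ignored."""
    counts = [0] * (len(edges) - 1)
    for v in values:
        for i in range(len(edges) - 1):
            if edges[i] <= v < edges[i + 1]:
                counts[i] += 1
                break
    return counts

# Illustrative IEP values for a set of events (not the paper's data):
iep = [12, 40, 66, 74, 80, 130, 200, 35, 55, 290]
counts = bin_counts(iep, [0, 75, 150, 225, 300])
```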

  10. Earthquake forecasting and warning

    SciTech Connect

    Rikitake, T.

    1983-01-01

    This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States is updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings on precursory phenomena are included. A summary of circumstances surrounding the 1975 Haicheng earthquake, the 1976 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all magnitudes ≥ 7.0) in China and the 1978 Izu-Oshima earthquake in Japan is presented. This book fails to comprehensively summarize recent advances in earthquake prediction research.

  11. Commensurability of earthquake occurrence

    NASA Astrophysics Data System (ADS)

    Hu, Hui; Han, Yanben; Su, Youjin; Wang, Rui

    2013-07-01

    In recent years, huge earthquakes have occurred frequently, striking many places around the globe without warning. Such frequent and exceptionally strong earthquake disasters remind us that we must strengthen research on the formation, mechanism, prediction, and forecasting of earthquakes, to advance the development of Earth science and the mitigation of seismic disasters. The commensurability of earthquake occurrence is studied here by means of the commensurability revealed by the Titius-Bode law. The results show that the earthquakes basically occur at the commensurable points of their respective time axes. This suggests that the occurrence of earthquakes is not accidental but shows certain patterns and inevitability, and that the commensurable value differs for earthquakes occurring in different areas.

  12. Continental dynamics and continental earthquakes

    NASA Astrophysics Data System (ADS)

    Zhang, Dong-Ning; Zhang, Guo-Min; Zhang, Pei-Zhen

    2003-09-01

    Two key research projects in the geoscience field in China since the IUGG meeting in Birmingham in 1999, the project of “East Asian Continental Geodynamics” and the project of “Mechanism and Prediction of Strong Continental Earthquakes”, are introduced in this paper. Some details of the two projects, such as their sub-projects and some initial published research results, are also given. Because of the large magnitude of the November 14, 2001 Kunlun Mountain Pass MS = 8.1 earthquake, the third part of this paper reviews initial research results from the aftershock monitoring and the multi-disciplinary field survey, and the impact and disaster of this earthquake on the construction site of the Qinghai-Xizang (Tibet) railway and other infrastructure.

  13. Monitoring

    DOEpatents

    Orr, Christopher Henry (Calderbridge, GB); Luff, Craig Janson (Calderbridge, GB); Dockray, Thomas (Calderbridge, GB); Macarthur, Duncan Whittemore (Los Alamos, NM)

    2004-11-23

    The invention provides apparatus and methods which facilitate movement of an instrument relative to an item or location being monitored, and/or of the item or location relative to the instrument, whilst successfully excluding extraneous ions from the detection location. Thus, ions generated by emissions from the item or location can successfully be monitored during movement. The technique employs sealing to exclude such ions, for instance through an electric field which attracts and discharges the ions prior to their entering the detection location, and/or a magnetic field configured to repel the ions away from the detection location.

  14. Research on earthquake prediction from infrared cloud images

    NASA Astrophysics Data System (ADS)

    Fan, Jing; Chen, Zhong; Yan, Liang; Gong, Jing; Wang, Dong

    2015-12-01

    In recent years, large earthquakes have occurred frequently all over the world. In the face of these inevitable natural disasters, earthquake prediction is particularly important for avoiding further loss of life and property. Many achievements in predicting earthquakes from remote sensing images have been obtained in the last few decades, but the traditional prediction methods cannot forecast the epicenter location accurately and automatically. In order to solve this problem, a new earthquake prediction method based on extracting the texture and frequency of appearance of the earthquake cloud is proposed in this paper. First, the infrared cloud images are enhanced. Second, a texture feature vector is extracted for each pixel. The pixels are then classified and grouped into several small suspected areas. Finally, the suspected areas are tracked to estimate the possible epicenter location. An inversion experiment on the Ludian earthquake shows that this approach can forecast the seismic center feasibly and accurately.
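
    The per-pixel texture step described in this abstract can be sketched with a simple local-statistics feature. The code below is an illustrative stand-in for the paper's (unspecified) feature vector, not a reproduction of it: it uses local mean and variance over a sliding window, with the window size and variance threshold chosen arbitrarily.

```python
import numpy as np

def texture_features(img, win=7):
    """Per-pixel local mean and variance over a win x win neighborhood,
    a simple stand-in for a texture feature vector."""
    pad = win // 2
    padded = np.pad(img.astype(float), pad, mode="reflect")
    h, w = img.shape
    # Stack every shifted view of the neighborhood, then reduce.
    stack = np.empty((win * win, h, w))
    k = 0
    for dy in range(win):
        for dx in range(win):
            stack[k] = padded[dy:dy + h, dx:dx + w]
            k += 1
    return stack.mean(axis=0), stack.var(axis=0)

def suspected_mask(img, var_thresh=50.0):
    """Flag pixels whose local variance exceeds a threshold as
    'suspected' textured-cloud pixels (threshold is illustrative)."""
    _, var = texture_features(img)
    return var > var_thresh
```

    Connected groups of flagged pixels would then correspond to the "small suspected areas" tracked in the final step.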

  15. Seismic Monitoring in Haiti

    USGS Multimedia Gallery

    Following the devastating 2010 Haiti earthquake, the USGS has been helping with earthquake awareness and monitoring in the country, with continued support from the U.S. Agency for International Development (USAID). This assistance has helped the Bureau des Mines et de l'Energie (BME) in Port-au-Prin...

  16. Earthquake Hazard in the Heart of the Homeland

    USGS Publications Warehouse

    Gomberg, Joan; Schweig, Eugene

    2007-01-01

    Evidence that earthquakes threaten the Mississippi, Ohio, and Wabash River valleys of the Central United States abounds. In fact, several of the largest historical earthquakes to strike the continental United States occurred in the winter of 1811-1812 along the New Madrid seismic zone, which stretches from just west of Memphis, Tenn., into southern Illinois. Several times in the past century, moderate earthquakes have been widely felt in the Wabash Valley seismic zone along the southern border of Illinois and Indiana. Throughout the region, between 150 and 200 earthquakes are recorded annually by a network of monitoring instruments, although most are too small to be felt by people. Geologic evidence for prehistoric earthquakes throughout the region has been mounting since the late 1970s. But how significant is the threat? How likely are large earthquakes and, more importantly, what is the chance that the shaking they cause will be damaging?

  17. Seismotectonic constraints at the western edge of the Pyrenees: aftershock series monitoring of the 2002 February 21, 4.1 Lg earthquake

    NASA Astrophysics Data System (ADS)

    Ruiz, M.; DĂ­az, J.; Gallart, J.; Pulgar, J. A.; GonzĂĄlez-Cortina, J. M.; LĂłpez, C.

    2006-07-01

    Seismic data recorded by a temporary network deployed at the western edge of the Pyrenees are used to study the aftershock series following a magnitude 4.1 earthquake that took place on 2002 February 21, to the NW of the city of Pamplona. Aftershock determinations showed events distributed between 1 and 4 km depth in a small, E-W-oriented active area of about 4 km2 delineating the southern sector of the Aralar thrust unit. This seismogenic feature is supported by focal solutions showing a consistent E-W nodal plane with normal faulting following the main strike-slip rupture. The Aralar structure with its shallow activity may be interpreted as a conjugate system of the nearby NE-SW deep-seated Pamplona active fault. Cross-correlation techniques and relative location of event clusters further constrained the epicentral domain to 2 km long and 1 km wide. The statistical relations and parameters established indicate a rather low b-value of 0.8 for the Gutenberg-Richter distribution, denoting a region of concentrated seismicity, and a p-value of 0.9 for the Omori law, corresponding to a slow decay of the aftershock activity in this area. More than 100 aftershocks were accurately located in this high-resolution experiment, whereas only 13 of them could be catalogued by the permanent agencies in the same period, owing to their much sparser station distribution. The results underline the importance of using dense temporary networks to infer relevant seismotectonic and hazard constraints.
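
    The two statistical parameters quoted here, the Gutenberg-Richter b-value and the Omori-law decay exponent, have standard closed-form tools. A minimal sketch, using the Aki (1965) maximum-likelihood b-value estimator for continuous magnitudes (no binning correction) and the modified Omori rate; the completeness magnitude `mc` is an assumed input:

```python
import numpy as np

def b_value_mle(mags, mc):
    """Aki maximum-likelihood estimate of the Gutenberg-Richter b-value
    for magnitudes at or above the completeness magnitude mc."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - mc)

def omori_rate(t, k, c, p):
    """Modified Omori law: aftershock rate n(t) = k / (t + c)**p."""
    return k / (t + c) ** p
```

    A low b-value, as reported for this series, means large events are relatively common in the sample; a p near 0.9 means the aftershock rate decays slightly more slowly than the canonical 1/t.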

  18. Research on the Relation between Anomalous Infrasonic waves and several Earthquakes

    NASA Astrophysics Data System (ADS)

    Zhang, B.

    2013-12-01

    It is well known that earthquakes generate infrasound signals that are often detected by infrasound monitoring systems. Some observations suggest that infrasound with a typical frequency of a few Hz can be generated by the vibrating ground surface and propagate to distances of a few thousand kilometers from an earthquake epicenter. In order to receive anomalous infrasonic waves before earthquakes, we built three infrasonic monitoring stations in Beijing, with atmospheric pressure observed in parallel at the same time. At first, two infrasonic monitoring instruments were placed at the same station; the data observed from them correlated very well, indicating that the performance of the instruments is good. After half a year, the three instruments were placed at different stations. Large amounts of data have been acquired, and anomalous signals have been found before earthquakes such as the Lushan 7.0, Okhotsk 8.0 and Nantou 6.7 earthquakes, appearing about 7-8 days before each event. Moreover, co-seismic infrasonic waves, similar to seismic waves, have been received, so we can determine where an earthquake happened from the co-seismic infrasonic waves. Using this method, we can infer where the next earthquake may happen from the anomalous signals. We developed an infrasound generation model for a so-called slow earthquake to show that this kind of earthquake can generate the long-period acoustic-gravity waves often observed several days prior to strong earthquakes. With this model the atmospheric pressure perturbations generated by a slow earthquake were calculated, and the occurrence of low frequencies and high amplitudes in the observed signal was explained. The consistency between the simulation results and the observational data indicates that slow earthquakes may be a possible source of the atmospheric pressure oscillations observed prior to strong earthquakes.

  19. Post-earthquake building safety assessments for the Canterbury Earthquakes

    USGS Publications Warehouse

    Marshall, J.; Barnes, J.; Gould, N.; Jaiswal, K.; Lizundia, B.; Swanson, David A.; Turner, F.

    2012-01-01

    This paper explores the post-earthquake building assessment program utilized in Christchurch, New Zealand following the Canterbury Sequence of earthquakes, which began with the magnitude (Mw) 7.1 Darfield event in September 2010. The aftershocks or triggered events, two of which exceeded Mw 6.0, continued with events in February and June 2011 that caused the greatest amount of damage. More than 70,000 building safety assessments were completed following the February event. The timeline and assessment procedures are discussed, including the use of rapid response teams, the selection of indicator buildings to monitor damage following aftershocks, risk assessments for demolition of red-tagged buildings, the use of task forces to address management of the heavily damaged downtown area, and the process of demolition. From the post-event safety assessment program that operated throughout the Canterbury Sequence of earthquakes, many important lessons can be learned that will benefit future responses to natural hazards with the potential to damage structures.

  20. Earthquake detection through computationally efficient similarity search.

    PubMed

    Yoon, Clara E; O'Reilly, Ossian; Bergen, Karianne J; Beroza, Gregory C

    2015-12-01

    Seismology is experiencing rapid growth in the quantity of data, which has outpaced the development of processing algorithms. Earthquake detection-identification of seismic events in continuous data-is a fundamental operation for observational seismology. We developed an efficient method to detect earthquakes using waveform similarity that overcomes the disadvantages of existing detection methods. Our method, called Fingerprint And Similarity Thresholding (FAST), can analyze a week of continuous seismic waveform data in less than 2 hours, or 140 times faster than autocorrelation. FAST adapts a data mining algorithm, originally designed to identify similar audio clips within large databases; it first creates compact "fingerprints" of waveforms by extracting key discriminative features, then groups similar fingerprints together within a database to facilitate fast, scalable search for similar fingerprint pairs, and finally generates a list of earthquake detections. FAST detected most (21 of 24) cataloged earthquakes and 68 uncataloged earthquakes in 1 week of continuous data from a station located near the Calaveras Fault in central California, achieving detection performance comparable to that of autocorrelation, with some additional false detections. FAST is expected to realize its full potential when applied to extremely long duration data sets over a distributed network of seismic stations. The widespread application of FAST has the potential to aid in the discovery of unexpected seismic signals, improve seismic monitoring, and promote a greater understanding of a variety of earthquake processes. PMID:26665176
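
    The fingerprint-and-group pipeline can be illustrated with a toy version. This sketch is not the published FAST implementation: it uses a crude Fourier-sign fingerprint and exact-match bucketing in place of FAST's wavelet-based fingerprints and locality-sensitive hashing, so it only catches near-identical windows, but the structure (fingerprint, group, report repeated buckets) is the same.

```python
import numpy as np
from collections import defaultdict

def fingerprint(window, n_bits=16):
    """Coarse binary fingerprint: signs of successive differences of the
    first n_bits+1 Fourier magnitudes (a toy stand-in for FAST's
    discriminative wavelet features)."""
    spec = np.abs(np.fft.rfft(window))[: n_bits + 1]
    bits = (np.diff(spec) > 0).astype(np.uint8)
    return bits.tobytes()

def similar_pairs(trace, win=128, step=64):
    """Bucket sliding windows by fingerprint; windows sharing a bucket
    are candidate similar events (near-duplicate waveforms)."""
    buckets = defaultdict(list)
    for start in range(0, len(trace) - win + 1, step):
        buckets[fingerprint(trace[start:start + win])].append(start)
    return [idx for idx in buckets.values() if len(idx) > 1]
```

    The efficiency argument carries over: hashing makes the search roughly linear in the number of windows, whereas autocorrelation compares every window against every other.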

  2. Oscillating brittle and viscous behavior through the earthquake cycle in the Red River Shear Zone: Monitoring flips between reaction and textural softening and hardening

    NASA Astrophysics Data System (ADS)

    Wintsch, Robert P.; Yeh, Meng-Wan

    2013-03-01

    Microstructures associated with cataclasites and mylonites in the Red River shear zone in the Diancang Shan block, Yunnan Province, China show evidence for both reaction hardening and softening at lower greenschist facies metamorphic conditions. The earliest fault-rocks derived from Triassic porphyritic orthogneiss protoliths are cataclasites. Brittle fractures and crushed grains are cemented by newly precipitated quartz. These cataclasites are subsequently overprinted by mylonitic fabrics. Truncations and embayments of relic feldspars and biotites show that these protolith minerals have been dissolved and incompletely replaced by muscovite, chlorite, and quartz. Both K-feldspar and plagioclase porphyroclasts are truncated by muscovite alone, suggesting locally metasomatic reactions of the form: 3K-feldspar + 2H+ = muscovite + 6SiO2(aq) + 2K+. Such reactions produce muscovite folia and fish, and quartz bands and ribbons. Muscovite and quartz are much weaker than the reactant feldspars and these reactions result in reaction softening. Moreover, the muscovite tends to align in contiguous bands that constitute textural softening. These mineral and textural modifications occurred at constant temperature and drove the transition from brittle to viscous deformation and the shift in deformation mechanism from cataclasis to dissolution-precipitation and reaction creep. These mylonitic rocks so produced are cut by K-feldspar veins that interrupt the mylonitic fabric. The veins add K-feldspar to the assemblage and these structures constitute both reaction and textural hardening. Finally these veins are boudinaged by continued viscous deformation in the mylonitic matrix, thus defining a late ductile strain event. Together these overprinting textures and microstructures demonstrate several oscillations between brittle and viscous deformation, all at lower greenschist facies conditions where only frictional behavior is predicted by experiments. 
The overlap of the depths of greenschist facies conditions with the base of the crustal seismic zone suggests that the implied oscillations in strain rate may have been related to the earthquake cycle.

  3. Unexpectedly frequent occurrence of very small repeating earthquakes (-5.1 ≀ Mw ≀ -3.6) in a South African gold mine: Implications for monitoring intraplate faults

    NASA Astrophysics Data System (ADS)

    Naoi, Makoto; Nakatani, Masao; Igarashi, Toshihiro; Otsuki, Kenshiro; Yabe, Yasuo; Kgarume, Thabang; Murakami, Osamu; Masakale, Thabang; Ribeiro, Luiz; Ward, Anthony; Moriya, Hirokazu; Kawakata, Hironori; Nakao, Shigeru; Durrheim, Raymond; Ogasawara, Hiroshi

    2015-12-01

    We observed very small repeating earthquakes with -5.1 ≀ Mw ≀ -3.6 on a geological fault at 1 km depth in a gold mine in South Africa. Of the 851 acoustic emissions that occurred on the fault during the 2 month analysis period, 45% were identified as repeaters on the basis of waveform similarity and relative locations. They occurred steadily at the same location with similar magnitudes, analogous to repeaters at plate boundaries, suggesting that they are repeat ruptures of the same asperity loaded by the surrounding aseismic slip (background creep). Application of the Nadeau and Johnson (1998) empirical formula (NJ formula), which relates the amount of background creep and repeater activity and is well established for plate boundary faults, to the present case yielded an impossibly large estimate of the background creep. This means that the presently studied repeaters were produced more efficiently, for a given amount of background creep, than expected from the NJ formula. When combined with an independently estimated average stress drop of 16 MPa, which is not particularly high, it suggests that the small asperities of the presently studied repeaters had a high seismic coupling (almost unity), in contrast to one physical interpretation of the plate boundary repeaters. The productivity of such repeaters, per unit background creep, is expected to increase strongly as smaller repeaters are considered (∝ Mo^(-1/3), as opposed to the Mo^(-1/6) of the NJ formula), which may be usable to estimate very slow creep that may occur on intraplate faults.

  4. Uplift and Subsidence Associated with the Great Aceh-Andaman Earthquake of 2004

    NASA Technical Reports Server (NTRS)

    2006-01-01

    The magnitude 9.2 Indian Ocean earthquake of December 26, 2004, produced broad regions of uplift and subsidence. In order to define the lateral extent and the downdip limit of rupture, scientists from Caltech, Pasadena, Calif.; NASA's Jet Propulsion Laboratory, Pasadena, Calif.; Scripps Institution of Oceanography, La Jolla, Calif.; the U.S. Geological Survey, Pasadena, Calif.; and the Research Center for Geotechnology, Indonesian Institute of Sciences, Bandung, Indonesia; first needed to define the pivot line separating those regions. Interpretation of satellite imagery and a tidal model were among the key tools used to do this.

    These pre-Sumatra earthquake (a) and post-Sumatra earthquake (b) images of North Sentinel Island in the Indian Ocean, acquired from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) instrument on NASA's Terra spacecraft, show emergence of the coral reef surrounding the island following the earthquake. The tide was 30 plus or minus 14 centimeters lower in the pre-earthquake image (acquired November 21, 2000) than in the post-earthquake image (acquired February 20, 2005), requiring a minimum of 30 centimeters of uplift at this locality. Observations from an Indian Coast Guard helicopter on the northwest coast of the island suggest that the actual uplift is on the order of 1 to 2 meters at this site.

    In figures (c) and (d), pre-earthquake and post-earthquake ASTER images of a small island off the northwest coast of Rutland Island, 38 kilometers east of North Sentinel Island, show submergence of the coral reef surrounding the island. The tide was higher in the pre-earthquake image (acquired January 1, 2004) than in the post-earthquake image (acquired February 4, 2005), requiring subsidence at this locality. The pivot line must run between North Sentinel and Rutland islands. Note that the scale for the North Sentinel Island images differs from that for the Rutland Island images.

    The tidal model used for this study was based on data from JPL's Topex/Poseidon satellite. The model was used to determine the relative sea surface height at each location at the time each image was acquired, a critical component used to quantify the deformation.

    The scientists' method of using satellite imagery to recognize changes in elevation relative to sea surface height and of using a tidal model to place quantitative bounds on coseismic uplift or subsidence is a novel approach that can be adapted to other forms of remote sensing and can be applied to other subduction zones in tropical regions.
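
    The quantitative bound described above is simple arithmetic: if a reef that was submerged before the earthquake is exposed afterwards even though the tide was higher in the post-event image, the land must have risen by at least the tidal difference. A minimal sketch (sea-surface heights, in meters, taken from the tidal model at the two acquisition times):

```python
def uplift_lower_bound(tide_pre_m, tide_post_m):
    """Minimum coseismic uplift implied when a formerly submerged
    surface is emerged in the post-event image despite a higher tide.
    Tide heights come from a tidal model at the two acquisition times."""
    return max(tide_post_m - tide_pre_m, 0.0)
```

    For North Sentinel Island, the pre-event tide was about 0.30 m lower than the post-event tide, giving the minimum uplift of 30 cm quoted in the text; the helicopter observations then tighten the actual value to 1-2 m.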

    ASTER is one of five Earth-observing instruments launched December 18, 1999, on NASA's Terra satellite. The instrument was built by Japan's Ministry of Economy, Trade and Industry. A joint U.S./Japan science team is responsible for validation and calibration of the instrument and the data products.

    The broad spectral coverage and high spectral resolution of ASTER provide scientists in numerous disciplines with critical information for surface mapping and monitoring of dynamic conditions and temporal change. Example applications are: monitoring glacial advances and retreats; monitoring potentially active volcanoes; identifying crop stress; determining cloud morphology and physical properties; wetlands evaluation; thermal pollution monitoring; coral reef degradation; surface temperature mapping of soils and geology; and measuring surface heat balance.

    The U.S. science team is located at NASA's Jet Propulsion Laboratory, Pasadena, Calif. The Terra mission is part of NASA's Science Mission Directorate.

  5. Benefits of Earthquake Early Warning to Large Municipalities (Invited)

    NASA Astrophysics Data System (ADS)

    Featherstone, J.

    2013-12-01

    The City of Los Angeles has been involved in testing the Caltech ShakeAlert earthquake early warning (EQEW) system since February 2012. This system accesses a network of seismic monitors installed throughout California, analyzes and processes the seismic information, and transmits a warning (audible and visual) when an earthquake occurs. In late 2011, the City of Los Angeles Emergency Management Department (EMD) was approached by Caltech regarding EQEW and immediately recognized the value of the system. Simultaneously, EMD was finalizing a report by a multi-disciplinary team that visited Japan in December 2011, which spoke to the effectiveness of EQEW during the March 11, 2011 earthquake that struck that country. Information collected by the team confirmed that the EQEW systems proved very effective in alerting the population to the impending earthquake. The EQEW system in Japan is also tied to mechanical safeguards, such as the stopping of high-speed trains. For a city of the size and complexity of Los Angeles, the implementation of a reliable EQEW system will save lives, reduce losses, ensure effective and rapid emergency response, and greatly enhance the region's ability to recover from a damaging earthquake. The current ShakeAlert system is being tested at several governmental organizations and private businesses in the region. EMD, in cooperation with Caltech, identified several locations internal to the City where the system would have an immediate benefit. These include the staff offices within EMD, the Los Angeles Police Department's Real Time Analysis and Critical Response Division (24-hour crime center), and the Los Angeles Fire Department's Metropolitan Fire Communications (911 dispatch). All three of these agencies routinely manage the collaboration and coordination of citywide emergency information and response during times of crisis. Having these three key public safety offices connected and included in the early testing of an EQEW system will help shape the EQEW policy that will determine the seismic safety of millions of Californians in the years to come.

  6. A continuation of base-line studies for environmentally monitoring Space Transportation Systems at John F. Kennedy Space Center. Volume 2: Chemical studies of rainfall and soil analysis

    NASA Technical Reports Server (NTRS)

    Madsen, B. C.

    1980-01-01

    The results of a study which was designed to monitor, characterize, and evaluate the chemical composition of precipitation (rain) which fell at the Kennedy Space Center, Florida (KSC) during the period July 1977 to March 1979 are reported. Results which were obtained from a soil sampling and associated chemical analysis are discussed. The purpose of these studies was to determine the environmental perturbations which might be caused by NASA space activities.

  7. Earthquakes and Plate Boundaries

    ERIC Educational Resources Information Center

    Lowman, Paul; And Others

    1978-01-01

    Contains the contents of the Student Investigation booklet of a Crustal Evolution Education Project (CEEP) instructional modules on earthquakes. Includes objectives, procedures, illustrations, worksheets, and summary questions. (MA)

  8. Are Earthquake Magnitudes Clustered?

    SciTech Connect

    Davidsen, Joern; Green, Adam

    2011-03-11

    The question of earthquake predictability is a long-standing and important challenge. Recent results [Phys. Rev. Lett. 98, 098501 (2007); ibid. 100, 038501 (2008)] have suggested that earthquake magnitudes are clustered, thus indicating that they are not independent, in contrast to what is typically assumed. Here, we present evidence that the observed magnitude correlations are to a large extent, if not entirely, an artifact due to the incompleteness of earthquake catalogs and the well-known modified Omori law. The latter leads to variations in the frequency-magnitude distribution if the distribution is constrained to those earthquakes that are close in space and time to the directly following event.
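
    The artifact mechanism described here can be reproduced in a toy simulation (not the paper's actual analysis): draw independent Gutenberg-Richter magnitudes, then mimic short-term catalog incompleteness by raising the detection threshold for the few events following each large shock. The recorded magnitudes become positively correlated even though the underlying magnitudes are independent. All thresholds and window lengths below are arbitrary illustration values.

```python
import numpy as np

def simulate_catalog(n=200000, b=1.0, m0=2.0, seed=0):
    """Independent Gutenberg-Richter magnitudes above m0."""
    rng = np.random.default_rng(seed)
    return m0 + rng.exponential(1.0 / (b * np.log(10)), n)

def apply_incompleteness(mags, big=4.0, blind=20, mc_raised=3.0, m0=2.0):
    """After each event >= big, raise the completeness threshold for the
    next `blind` events (a crude stand-in for short-term aftershock
    incompleteness). Return only the recorded (detected) magnitudes."""
    recorded, countdown = [], 0
    for m in mags:
        mc = mc_raised if countdown > 0 else m0
        countdown = blind if m >= big else max(countdown - 1, 0)
        if m >= mc:
            recorded.append(m)
    return np.array(recorded)

def successive_corr(mags):
    """Pearson correlation between successive magnitudes."""
    return np.corrcoef(mags[:-1], mags[1:])[0, 1]
```

    Because small events are missing right after large ones, consecutive recorded magnitudes in those intervals are both drawn from an elevated floor, producing apparent clustering with no physical dependence.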

  9. Istanbul Earthquake Early Warning and Rapid Response System

    NASA Astrophysics Data System (ADS)

    Erdik, M. O.; Fahjan, Y.; Ozel, O.; Alcik, H.; Aydin, M.; Gul, M.

    2003-12-01

    As part of the preparations for a future earthquake in Istanbul, a rapid response and early warning system is in operation in the metropolitan area. For the early warning system, ten strong motion stations were installed as close as possible to the fault zone. Continuous on-line data from these stations via digital radio modem provide early warning for potentially disastrous earthquakes. Considering the complexity of fault rupture and the short fault distances involved, a simple and robust early warning algorithm, based on the exceedance of specified time-domain amplitude threshold levels, is implemented. The band-pass filtered accelerations and the cumulative absolute velocity (CAV) are compared with specified threshold levels. When any acceleration or CAV (on any channel) at a given station exceeds its specific threshold value, it is considered a vote. Whenever there are two station votes within a selectable time interval after the first vote, the first alarm is declared. In order to specify appropriate threshold levels, a data set of near-field strong ground motion records from Turkey and the world has been analyzed, and correlations among these thresholds in terms of epicentral distance and earthquake magnitude have been studied. The encrypted early warning signals will be communicated to the respective end users by UHF systems through a "service provider" company. The users of the early warning signal will be power and gas companies, nuclear research facilities, critical chemical factories, the subway system and several high-rise buildings. Depending on the location of the earthquake (the initiation of fault rupture) and the recipient facility, the alarm time can be as high as about 8 s.
    For the rapid response system, one hundred 18-bit-resolution strong motion accelerometers were placed at quasi-free-field locations (basements of small buildings) in the populated areas of the city, within an area of approximately 50 x 30 km, to constitute a network that will enable early damage assessment and rapid response information after a damaging earthquake. Early response information is achieved through fast acquisition and analysis of processed data obtained from the network. The stations are interrogated on a regular basis by the main data center. After being triggered by an earthquake, each station processes the streaming strong motion data to yield the spectral accelerations at specific periods, 12Hz filtered PGA and PGV, and sends these parameters in the form of SMS messages every 20 s directly to the main data center through a designated GSM network and through a microwave system. A shake map and damage distribution map (using aggregate building inventories and fragility curves) will be automatically generated using the algorithm developed for this purpose. The loss assessment studies are complemented by a large citywide digital database on the topography, geology, soil conditions, building, infrastructure and lifeline inventory. The shake and damage maps will be conveyed to the governor's and mayor's offices and to fire, police and army headquarters within 3 minutes using radio modem and GPRS communication. An additional forty strong motion recorders were placed on important structures in several interconnected clusters to monitor the health of these structures after a damaging earthquake.
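
    The CAV-plus-threshold voting logic described for the early warning algorithm can be sketched as follows. The thresholds, sampling interval, and vote window here are illustrative placeholders, not the Istanbul system's operational values.

```python
import numpy as np

def cumulative_absolute_velocity(acc, dt):
    """CAV: running time-integral of |acceleration| (rectangle rule)."""
    return np.cumsum(np.abs(acc)) * dt

def station_vote(acc, dt, pga_thresh, cav_thresh):
    """A station 'votes' when either the instantaneous acceleration or
    the CAV exceeds its threshold (illustrative threshold values)."""
    cav = cumulative_absolute_velocity(acc, dt)
    return bool(np.any(np.abs(acc) >= pga_thresh) or cav[-1] >= cav_thresh)

def alarm(vote_times, window):
    """Declare an alarm when a second station vote arrives within
    `window` seconds of an earlier vote."""
    t = sorted(vote_times)
    return any(t[i + 1] - t[i] <= window for i in range(len(t) - 1))
```

    Requiring two stations to agree within a short interval is what makes the scheme robust: a spike on a single channel (traffic, telemetry glitch) cannot trigger an alarm on its own.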

  10. The land subsidence of the Venice historical center: twenty years of monitoring by SAR-based interferometry

    NASA Astrophysics Data System (ADS)

    Tosi, L.; Strozzi, T.; Teatini, P.

    2012-12-01

    The subsidence of Venice, one of the most beautiful and famous cities in the world, is well known not by reason of the magnitude of the ground movement, which amounts to less than 15 cm over the last century, but because it has seriously compromised the safety of the city, given its small elevation above the sea. The lowering of Venice is still a subject of debate today, attracting wide press coverage every time a scientific paper is published on the topic. Over the last two decades, satellites instrumented with SAR sensors have provided excellent data for detecting land displacements by interferometric processing. In particular, the accuracy achieved by Persistent Scatterer Interferometry (PSI) and the impressive number of detected measurement points have progressively reduced the use of traditional in situ measurements, i.e. leveling surveys, for monitoring land displacements in Venice. In fact, the intensive urban development makes the historical center an optimal site for PSI. On the other hand, the correct interpretation of the PSI outcomes, which provide the relative movements of single churches, palaces and bridges with millimetric precision and metric spatial resolution, requires a deep knowledge of the city and its subsoil, due to the peculiarity of this urban area developed over the centuries within the sea. We investigate the movements of Venice by Interferometric Point Target Analysis (IPTA) over the last 20 years, using SAR acquisitions from the ERS-1/2, ENVISAT, TerraSAR-X, and Cosmo-SkyMed satellites. The density of detected scatterers is one order of magnitude larger with the newest very-high-resolution X-band sensors on TerraSAR-X and Cosmo-SkyMed, but by reason of the longer observation period the accuracy of the mean displacement rate from the C-band ERS and ENVISAT data is higher. The IPTA results have been calibrated using leveling and permanent GPS stations to correct the so-called flattening problem, i.e. the slight phase tilt resulting from inaccuracy in the estimation of the orbital baseline due to imperfect knowledge of the satellite positions. The comparison between the measurements covering the period from 1992 to 2011 confirms the substantial stability of the city as a whole, with a subsidence rate averaging 1 mm/yr. However, the PSI measurements also provide evidence of local zones and single structures that are subsiding at faster rates, due to the heterogeneous nature of the upper Holocene lagoon subsoil, the different loads and foundations of the historical palaces, and restoration works along the canals.

  11. Where was the 1898 Mare Island Earthquake? Insights from the 2014 South Napa Earthquake

    NASA Astrophysics Data System (ADS)

    Hough, S. E.

    2014-12-01

    The 2014 South Napa earthquake provides an opportunity to reconsider the Mare Island earthquake of 31 March 1898, which caused severe damage to buildings at a Navy yard on the island. Revisiting archival accounts of the 1898 earthquake, I estimate a lower intensity magnitude, 5.8, than the value in the current Uniform California Earthquake Rupture Forecast (UCERF) catalog (6.4). However, I note that intensity magnitude can differ from Mw by upwards of half a unit depending on stress drop, which for a historical earthquake is unknowable. In the aftermath of the 2014 earthquake, there has been speculation that the apparently severe effects on Mare Island in 1898 were due to the vulnerability of local structures. No surface rupture has ever been identified from the 1898 event, which is commonly associated with the Hayward-Rodgers Creek fault system, some 10 km west of Mare Island (e.g., Parsons et al., 2003). Reconsideration of detailed archival accounts of the 1898 earthquake, together with a comparison of the intensity distributions for the two earthquakes, points to genuinely severe, likely near-field ground motions on Mare Island. The 2014 earthquake did cause significant damage to older brick buildings on Mare Island, but the level of damage does not match the severity of documented damage in 1898. The high-intensity fields for the two earthquakes are, moreover, spatially shifted, with the centroid of the 2014 distribution near the town of Napa and that of the 1898 distribution near Mare Island, east of the Hayward-Rodgers Creek system. I conclude that the 1898 Mare Island earthquake was centered on or near Mare Island, possibly involving rupture of one or both strands of the Franklin fault, a low-slip-rate fault sub-parallel to the Rodgers Creek fault to the west and the West Napa fault to the east. I estimate Mw 5.8 assuming an average stress drop; the data are also consistent with Mw 6.4 if the stress drop was a factor of ~3 lower than average for California earthquakes. 
I present intensity observations from the 2014 South Napa earthquake that suggest that it may have been a low stress drop event.

  12. First Results of the Regional Earthquake Likelihood Models Experiment

    USGS Publications Warehouse

    Schorlemmer, D.; Zechar, J.D.; Werner, M.J.; Field, E.H.; Jackson, D.D.; Jordan, T.H.

    2010-01-01

    The ability to successfully predict the future behavior of a system is a strong indication that the system is well understood. Certainly many details of the earthquake system remain obscure, but several hypotheses related to earthquake occurrence and seismic hazard have been proffered, and predicting earthquake behavior is a worthy goal and demanded by society. Along these lines, one of the primary objectives of the Regional Earthquake Likelihood Models (RELM) working group was to formalize earthquake occurrence hypotheses in the form of prospective earthquake rate forecasts in California. RELM members, working in small research groups, developed more than a dozen 5-year forecasts; they also outlined a performance evaluation method and provided a conceptual description of a Testing Center in which to perform predictability experiments. Subsequently, researchers working within the Collaboratory for the Study of Earthquake Predictability (CSEP) have begun implementing Testing Centers in different locations worldwide, and the RELM predictability experiment, a truly prospective earthquake prediction effort, is underway within the U.S. branch of CSEP. The experiment, designed to compare time-invariant 5-year earthquake rate forecasts, is now approximately halfway to its completion. In this paper, we describe the models under evaluation and present, for the first time, preliminary results of this unique experiment. While these results are preliminary (the forecasts were designed for a 5-year application period), we find interesting results: most of the models are consistent with the observations, and one model forecasts the distribution of earthquakes best. We discuss the observed sample of target earthquakes in the context of historical seismicity within the testing region, highlight potential pitfalls of the current tests, and suggest plans for future revisions to experiments such as this one.

  13. Earthquake activity in Oklahoma

    SciTech Connect

    Luza, K.V.; Lawson, J.E. Jr. )

    1989-08-01

    Oklahoma is one of the most seismically active areas in the southern Mid-Continent. From 1897 to 1988, over 700 earthquakes are known to have occurred in Oklahoma. The earliest documented Oklahoma earthquake took place on December 2, 1897, near Jefferson, in Grant County. The largest known Oklahoma earthquake happened near El Reno on April 9, 1952. This magnitude 5.5 (mb) earthquake was felt from Austin, Texas, to Des Moines, Iowa, and covered a felt area of approximately 362,000 km². Prior to 1962, all earthquakes in Oklahoma (59) were known either from historical accounts or from seismograph stations outside the state. Over half of these events were located in Canadian County. In late 1961, the first seismographs were installed in Oklahoma. From 1962 through 1976, 70 additional earthquakes were added to the earthquake database. In 1977, a statewide network of seven semipermanent and three radio-telemetry seismograph stations was installed. The additional stations have improved earthquake detection and location in the state of Oklahoma. From 1977 to 1988, over 570 additional earthquakes were located in Oklahoma, mostly of magnitudes less than 2.5. Most of these events occurred on the eastern margin of the Anadarko basin along a zone 135 km long by 40 km wide that extends from Canadian County to the southern edge of Garvin County. Another general area of earthquake activity lies along and north of the Ouachita Mountains in the Arkoma basin. A few earthquakes have occurred in the shelves that border the Arkoma and Anadarko basins.

  14. Historical earthquakes in Libya

    NASA Astrophysics Data System (ADS)

    Suleiman, A. S.

    2003-04-01

    As a result of the relative motion of the African and European plates, Libya, located at the north central margin of the African continent, has experienced considerable intraplate tectonism, particularly in its northern coastal regions. In this study I present a reevaluation of the seismicity of Libya with special focus on the historical seismicity. Data on historical seismicity are of crucial importance for seismic hazard assessment in Libya. The earliest records of earthquakes in Libya date back to the Roman period, when two large earthquakes (262 A.D. and 365 A.D.) destroyed most of the temples and public buildings of Cyrene. The earthquakes that affected Libya in the Middle Ages include the 704 A.D. earthquake of Sabha (southern Libya), which reportedly destroyed several towns and villages. In 1183 A.D., a powerful earthquake destroyed Tripoli, killing 20,000 people. Mild tremors were felt in Tripoli in 1803, 1811, and 1903 A.D. The Hun Graben area has been the site of several earthquakes through history; on April 19, 1935, a large earthquake (mb = 7.1) struck this area, followed by a very large number of aftershocks, including two of magnitudes 6.0 and 6.5 on the Richter scale. In 1941 a major earthquake of magnitude 5.6 hit the Hun Graben area. In 1939 an earthquake of magnitude 5.6 occurred in the Gulf of Sirt area, followed by a number of aftershocks. Reinterpretation and improvement of the source quality for selected earthquakes will be presented. The present study aims to focus on investigating the original sources of information and on developing a historical earthquake database.

  15. Predicting Earthquake Response of Civil Structures from Ambient Noise

    NASA Astrophysics Data System (ADS)

    Prieto, G.; Lawrence, J. F.; Chung, A. I.; Kohler, M. D.

    2009-12-01

    Increased monitoring of civil structures for response to earthquake motions is fundamental for reducing seismic hazard. Seismic monitoring is difficult because typically only a few useful, intermediate to large earthquakes occur per decade near instrumented structures. Here we demonstrate that the impulse response function (IRF) of a multi-story building can be generated from ambient noise. Estimated shear-wave velocity, attenuation values, and resonance frequencies from the IRFs agree with previous estimates for the instrumented UCLA Factor building. The accuracy of the approach is demonstrated by predicting the Factor building’s response to an M4.2 earthquake. The methodology described here allows for rapid non-invasive determination of structural parameters from the IRFs within days and could be used as a new tool for state-of-health monitoring of civil structures (buildings, bridges, etc.) before and/or after major earthquakes.

  16. OMG Earthquake! Can Twitter improve earthquake response?

    NASA Astrophysics Data System (ADS)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 and 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information such as epicenter, magnitude, and strong-motion recordings. Without quantitative data, prioritization of response measures, including building and infrastructure inspection, is not possible. The main advantage of Twitter is speed, especially in sparsely instrumented areas. A Twitter-based system could potentially provide a quick notification that there was a possible event and that seismographically derived information will follow. If you are interested in learning more, follow @USGSted on Twitter.
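The detection idea described above, a tweet rate rising far above a background of less than 1 per hour, can be sketched as a sliding-window threshold. This is an illustrative sketch only; the window length, threshold, and function name are my assumptions, not the USGS implementation:

```python
from datetime import datetime, timedelta

def detect_event(tweet_times, window=timedelta(minutes=1), threshold=20):
    """Flag a possible earthquake when the count of 'earthquake' tweets
    inside a sliding time window exceeds a threshold far above the
    background rate. Returns the first trigger time, or None."""
    times = sorted(tweet_times)
    start = 0
    for i, t in enumerate(times):
        # advance the window's trailing edge
        while times[start] < t - window:
            start += 1
        if i - start + 1 >= threshold:
            return t
    return None
```

With a background of under one tweet per hour and a burst of ~150 per minute, almost any threshold of a few tens of tweets per minute separates the two regimes cleanly.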

  17. Waveform Cross-Correlation for Improved North Texas Earthquake Locations

    NASA Astrophysics Data System (ADS)

    Phillips, M.; DeShon, H. R.; Oldham, H. R.; Hayward, C.

    2014-12-01

    In November 2013, a sequence of earthquakes began in Reno and Azle, TX, two communities located northwest of Fort Worth in an area of active oil and gas extraction. Only one felt earthquake had been reported within the area before the occurrence of probable injection-induced earthquakes at the Dallas-Fort Worth airport in 2008. The USGS National Earthquake Information Center (NEIC) has reported 27 felt earthquakes in the Reno-Azle area through January 28, 2014. A temporary seismic network was installed beginning in December 2013 to acquire data to improve location and magnitude estimates and characterize the earthquake sequence. Here, we present high-resolution relative earthquake locations derived using differential time data from waveform cross-correlation. Cross-correlation is computed using the GISMO software suite, and event relocation is done using double-difference relocation techniques. Waveform cross-correlation of the local data indicates high (>70%) similarity between 4 major swarms of events lasting between 18 and 24 hours. These swarms are temporal zones of high event frequency; 1.4% of the time series data accounts for 42.1% of the identified local earthquakes. Local earthquakes are occurring along the Newark East Fault System, a NE-SW striking normal fault system previously thought inactive, at depths between 2 and 8 km in the Ellenburger limestone formation and underlying Precambrian basement. Data analysis is ongoing, and continued characterization of the associated fault will provide improved location estimates.
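The differential-time measurement underlying such relocations can be sketched with generic normalized cross-correlation (this is not the GISMO implementation; the function name and sign convention are mine):

```python
import numpy as np

def differential_time(w1, w2, dt):
    """Differential arrival time between two event waveforms recorded at a
    common station, from the peak of their normalized cross-correlation.
    Returns (shift in seconds by which w2 lags w1, peak correlation)."""
    a = (w1 - w1.mean()) / (w1.std() * len(w1))
    b = (w2 - w2.mean()) / w2.std()
    cc = np.correlate(a, b, mode="full")
    idx = int(np.argmax(cc))
    lag = idx - (len(w2) - 1)   # lag in samples (numpy 'full' convention)
    return -lag * dt, float(cc[idx])
```

In double-difference relocation, such sub-sample-accurate differential times from many station/event pairs replace (or supplement) catalog picks, collapsing diffuse hypocenter clouds onto the causative fault.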

  18. Cooperative Monitoring Center Occasional Paper/13: Cooperative monitoring for confidence building: A case study of the Sino-Indian border areas

    SciTech Connect

    SIDHU,WAHEGURU PAL SINGH; YUAN,JING-DONG; BIRINGER,KENT L.

    1999-08-01

    This occasional paper identifies applicable cooperative monitoring techniques and develops models for possible application in the context of the border between China and India. The 1993 and 1996 Sino-Indian agreements on maintaining peace and tranquility along the Line of Actual Control (LAC) and establishing certain confidence building measures (CBMs), including force reductions and limitations on military exercises along their common border, are used to examine the application of technically based cooperative monitoring in both strengthening the existing terms of the agreements and also enhancing trust. The paper also aims to further the understanding of how and under what conditions technology-based tools can assist in implementing existing agreements on arms control and confidence building. The authors explore how cooperative monitoring techniques can facilitate effective implementation of arms control agreements and CBMs between states and contribute to greater security and stability in bilateral, regional, and global contexts.

  19. Earthquakes and Schools

    ERIC Educational Resources Information Center

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.


  20. Catalog of Earthquake Hypocenters at Alaskan Volcanoes: January 1 through December 31, 2006

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.; Searcy, Cheryl

    2008-01-01

    Between January 1 and December 31, 2006, AVO located 8,666 earthquakes, of which 7,783 occurred on or near the 33 volcanoes monitored within Alaska. Monitoring highlights in 2006 include: an eruption of Augustine Volcano, a volcanic-tectonic earthquake swarm at Mount Martin, elevated seismicity and volcanic unrest at Fourpeaked Mountain, and elevated seismicity and low-level tremor at Mount Veniaminof and Korovin Volcano. A new seismic subnetwork was installed on Fourpeaked Mountain. This catalog includes: (1) descriptions and locations of seismic instrumentation deployed in the field during 2006, (2) a description of earthquake detection, recording, analysis, and data archival systems, (3) a description of seismic velocity models used for earthquake locations, (4) a summary of earthquakes located in 2006, and (5) an accompanying UNIX tar-file with a summary of earthquake origin times, hypocenters, magnitudes, phase arrival times, location quality statistics, daily station usage statistics, and all files used to determine the earthquake locations in 2006.

  1. Earthquake sound perception

    NASA Astrophysics Data System (ADS)

    Tosi, Patrizia; Sbarra, Paola; De Rubeis, Valerio

    2012-12-01

    Sound is an effect produced by almost all earthquakes. Using a web-based questionnaire on earthquake effects that included questions relating to seismic sound, we collected 77,000 responses for recent shallow Italian earthquakes. An analysis of audibility attenuation indicated that the decrease in the percentage of respondents hearing the sound was proportional to the logarithm of the epicentral distance and linearly dependent on earthquake magnitude, in accordance with the behavior of ground displacement. Although this result is based on Italian data, qualitative agreement with theoretical displacement behavior and with a similar study based on French seismicity suggests wider validity. We also found that, for a given earthquake magnitude, audibility increased together with the observed macroseismic intensity, leading to the possibility of accounting for sound audibility in intensity assessment. Magnitude influenced this behavior, making small events easier to recognize, as suggested by their frequency content.
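The functional form described above (audibility linear in magnitude, decreasing with the logarithm of epicentral distance) can be written down as a sketch. The coefficients below are illustrative placeholders only, not the fitted values from the study:

```python
import math

def audibility_percent(magnitude, epicentral_km, a=25.0, b=30.0, c=20.0):
    """Percentage of respondents hearing earthquake sound, following the
    form reported in the abstract: linear in magnitude, proportional to
    -log10(epicentral distance). Coefficients a, b, c are hypothetical."""
    p = a * magnitude - b * math.log10(epicentral_km) - c
    return min(100.0, max(0.0, p))   # clamp to a valid percentage
```

Any calibrated version of this relation would be fit to questionnaire data; the sketch only encodes the reported magnitude and distance dependence.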

  2. On numerical earthquake prediction

    NASA Astrophysics Data System (ADS)

    Shi, Yaolin; Zhang, Bei; Zhang, Siqi; Zhang, Huai

    2014-06-01

    Can earthquakes be predicted? How should people overcome the difficulties encountered in the study of earthquake prediction? This issue can take inspiration from the experience of weather forecasting. Although weather forecasting took about half a century to advance from empirical to numerical forecasting, it has achieved significant success. A consensus has been reached among the Chinese seismological community that earthquake prediction must also develop from empirical forecasting to physical prediction. However, it is seldom mentioned that physical prediction is characterized by quantitative numerical predictions based on physical laws. This article discusses five key components for numerical earthquake prediction and their current status. We conclude that numerical earthquake prediction should now be put on the planning agenda and its roadmap designed, seismic stations should be deployed and observations made according to the needs of numerical prediction, and theoretical research should be carried out.

  3. Demand surge following earthquakes

    USGS Publications Warehouse

    Olsen, Anna H.

    2012-01-01

    Demand surge is understood to be a socio-economic phenomenon where repair costs for the same damage are higher after large- versus small-scale natural disasters. It has reportedly increased monetary losses by 20 to 50%. In previous work, a model for the increased costs of reconstruction labor and materials was developed for hurricanes in the Southeast United States. The model showed that labor cost increases, rather than the material component, drove the total repair cost increases, and this finding could be extended to earthquakes. A study of past large-scale disasters suggested that there may be additional explanations for demand surge. Two such explanations specific to earthquakes are the exclusion of insurance coverage for earthquake damage and possible concurrent causation of damage from an earthquake followed by fire or tsunami. Additional research into these aspects might provide a better explanation for increased monetary losses after large- vs. small-scale earthquakes.

  4. Disruption of groundwater systems by earthquakes

    NASA Astrophysics Data System (ADS)

    Liao, Xin; Wang, Chi-Yuen; Liu, Chun-Ping

    2015-11-01

    Earthquakes are known to enhance permeability at great distances, and this phenomenon may also disrupt groundwater systems by breaching the barrier between different reservoirs. Here we analyze the tidal response of water level in a deep (~4 km) well before and after the 2008 M7.9 Wenchuan earthquake to show that the earthquake not only changed the permeability but also altered the poroelastic properties of the groundwater system. Based on lithologic well logs and experimental data for rock properties, we interpret the change to reflect a coseismic breaching of aquitards bounding the aquifer, due perhaps to clearing of preexisting cracks and creation of new cracks, to depths of several kilometers. This may cause mixing of groundwater from previously isolated reservoirs and impact the safety of groundwater supplies and underground waste repositories. The method demonstrated here may hold promise for monitoring aquitard breaching by both natural and anthropogenic processes.

  5. Crowd-Sourced Global Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Minson, S. E.; Brooks, B. A.; Glennie, C. L.; Murray, J. R.; Langbein, J. O.; Owen, S. E.; Iannucci, B. A.; Hauser, D. L.

    2014-12-01

    Although earthquake early warning (EEW) has shown great promise for reducing loss of life and property, it has only been implemented in a few regions due, in part, to the prohibitive cost of building the required dense seismic and geodetic networks. However, many cars and consumer smartphones, tablets, laptops, and similar devices contain low-cost versions of the same sensors used for earthquake monitoring. If a workable EEW system could be implemented based on either crowd-sourced observations from consumer devices or very inexpensive networks of instruments built from consumer-quality sensors, EEW coverage could potentially be expanded worldwide. Controlled tests of several accelerometers and global navigation satellite system (GNSS) receivers typically found in consumer devices show that, while they are significantly noisier than scientific-grade instruments, they are still accurate enough to capture displacements from moderate and large magnitude earthquakes. The accuracy of these sensors varies greatly depending on the type of data collected. Raw coarse acquisition (C/A) code GPS data are relatively noisy. These observations have a surface displacement detection threshold approaching ~1 m and would thus only be useful in large Mw 8+ earthquakes. However, incorporating either satellite-based differential corrections or using a Kalman filter to combine the raw GNSS data with low-cost acceleration data (such as from a smartphone) decreases the noise dramatically. These approaches allow detection thresholds as low as 5 cm, potentially enabling accurate warnings for earthquakes as small as Mw 6.5. Simulated performance tests show that, with data contributed from only a very small fraction of the population, a crowd-sourced EEW system would be capable of warning San Francisco and San Jose of a Mw 7 rupture on California's Hayward fault and could have accurately issued both earthquake and tsunami warnings for the 2011 Mw 9 Tohoku-oki, Japan earthquake.
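The GNSS-plus-accelerometer fusion mentioned above can be illustrated with a minimal one-dimensional Kalman filter: the accelerometer drives the state prediction and the noisy GNSS position is the measurement. The state model and noise levels here are simplifying assumptions for illustration, not the authors' processing chain:

```python
import numpy as np

def fuse_gnss_accel(gnss_pos, accel, dt, sigma_gnss=1.0, sigma_acc=0.05):
    """Minimal 1-D Kalman filter fusing noisy GNSS positions with
    accelerometer input to estimate displacement. State x = [pos, vel]."""
    F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity dynamics
    B = np.array([0.5 * dt ** 2, dt])          # acceleration control input
    H = np.array([[1.0, 0.0]])                 # GNSS measures position only
    Q = sigma_acc ** 2 * np.outer(B, B)        # process noise from accel error
    R = np.array([[sigma_gnss ** 2]])          # GNSS measurement noise
    x, P = np.zeros(2), np.eye(2)
    est = []
    for z, a in zip(gnss_pos, accel):
        # predict with the accelerometer as control input
        x = F @ x + B * a
        P = F @ P @ F.T + Q
        # update with the GNSS position
        y = z - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        est.append(x[0])
    return np.array(est)
```

The point of the fusion is that high-rate, low-noise (but drifting) acceleration constrains the short-period motion while the unbiased (but noisy) GNSS positions anchor the long-period displacement, which is why the combination lowers the detection threshold well below that of raw C/A-code GPS alone.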

  6. Earthquake-volcano interaction imaged by coda wave interferometry

    NASA Astrophysics Data System (ADS)

    Battaglia, Jean; Métaxian, Jean-Philippe; Garaebiti, Esline

    2012-06-01

    Large earthquakes are often assumed to influence the eruptive activity of volcanoes. A major challenge to better understanding the causal relationship between these phenomena is to detect and image, in detail, all induced changes, including subtle, non-eruptive responses. We show that coda wave interferometry can be used to image such earthquake-induced responses, as recorded at Yasur volcano (Vanuatu) following a magnitude 7.3 earthquake which occurred 80 km from its summit. We use repeating long-period events to show that the earthquake caused a sudden seismic velocity drop, followed by a slow partial recovery process. The spatial distribution of the response amplitude indicates an effect centered on the volcano. Our result demonstrates that, even if no major change in eruptive activity is observed, volcanoes will be affected by the propagation of large-amplitude seismic waves through their structure, suggesting that earthquake-volcano interaction is likely a more common phenomenon than previously believed.

  7. An appraisal of aftershocks behavior for large earthquakes in Persia

    NASA Astrophysics Data System (ADS)

    Nemati, Majid

    2014-01-01

    This study focuses on the distribution of aftershocks in both location and magnitude for recent earthquakes in Iran. Forty-three earthquakes are investigated, using data from the global International Seismological Center (ISC) seismic catalogue and from the regional earthquake catalogue of the Institute of Geophysics, University of Tehran (IGUT), for 1961-2006 and 2006-2012, respectively. We only consider earthquakes with magnitude greater than 5.0. The majority of these events are intracontinental, occurring over four seismotectonic provinces across Iran. Processing the aftershock sequences reported by both catalogues, with a cut-off magnitude of 2.5 and a sequence duration of 70 days, leads us to define the horizontal area (A) occupied by the aftershocks as a function of mainshock magnitude (M) for Persian earthquakes: ISC: Log10(A) = 0.45MS + 0.23; IGUT: Log10(A) = 0.25MN + 1.7.
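The two scaling relations above can be applied directly to estimate an expected aftershock-zone area from a mainshock magnitude. A minimal sketch (the function and argument names are mine, not the paper's):

```python
def aftershock_area_km2(magnitude: float, catalogue: str = "ISC") -> float:
    """Aftershock-zone area A (km^2) from mainshock magnitude, using the
    empirical relations quoted in the abstract:
      ISC:  log10(A) = 0.45 * Ms + 0.23
      IGUT: log10(A) = 0.25 * Mn + 1.7
    """
    if catalogue == "ISC":
        return 10 ** (0.45 * magnitude + 0.23)
    if catalogue == "IGUT":
        return 10 ** (0.25 * magnitude + 1.7)
    raise ValueError("catalogue must be 'ISC' or 'IGUT'")

# e.g., a Ms 6.0 mainshock under the ISC relation:
# 10 ** (0.45 * 6.0 + 0.23) ≈ 851 km^2
```

Note that the two relations use different magnitude scales (Ms vs. Mn), so the same numerical magnitude yields different areas under each catalogue.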

  8. Earthquake-induced water-level fluctuations at Yucca Mountain, Nevada, June 1992

    SciTech Connect

    O'Brien, G.M.

    1993-07-01

    This report presents earthquake-induced water-level and fluid-pressure data for wells in the Yucca Mountain area, Nevada, during June 1992. Three earthquakes occurred which caused significant water-level and fluid-pressure responses in wells. Wells USW H-5 and USW H-6 are continuously monitored to detect short-term responses caused by earthquakes. Two wells, monitored hourly, had significant, longer-term responses in water level following the earthquakes. On June 28, 1992, a 7.5-magnitude earthquake occurred near Landers, California, causing an estimated maximum water-level change of 90 centimeters in well USW H-5. Three hours later a 6.6-magnitude earthquake occurred near Big Bear Lake, California; the maximum water-level fluctuation was 20 centimeters in well USW H-5. A 5.6-magnitude earthquake occurred at Little Skull Mountain, Nevada, on June 29, approximately 23 kilometers from Yucca Mountain. The maximum estimated short-term water-level fluctuation from the Little Skull Mountain earthquake was 40 centimeters in well USW H-5. The water level in well UE-25p #1, monitored hourly, decreased approximately 50 centimeters over 3 days following the Little Skull Mountain earthquake. The water level in UE-25p #1 returned to pre-earthquake levels in approximately 6 months. The water level in the lower interval of well USW H-3 increased 28 centimeters following the Little Skull Mountain earthquake. The Landers and Little Skull Mountain earthquakes caused responses in 17 intervals of 14 hourly monitored wells; however, most responses were small and of short duration. For several days following the major earthquakes, many smaller-magnitude aftershocks occurred, causing measurable responses in the continuously monitored wells.

  9. Phase Transformations and Earthquakes

    NASA Astrophysics Data System (ADS)

    Green, H. W.

    2011-12-01

    Phase transformations have been cited as responsible for, or at least involved in, "deep" earthquakes for many decades (although the concept of "deep" has varied). In 1945, PW Bridgman laid out in detail the string of events/conditions that would have to be achieved for a solid/solid transformation to lead to a faulting instability, although he expressed pessimism that the full set of requirements would be simultaneously achieved in nature. Raleigh and Paterson (1965) demonstrated faulting during dehydration of serpentine under stress and suggested dehydration embrittlement as the cause of intermediate-depth earthquakes. Griggs and Baker (1969) produced a thermal runaway model of a shear zone under constant stress, culminating in melting, and proposed such a runaway as the origin of deep earthquakes. The discovery of plate tectonics in the late 1960s established the conditions (subduction) under which Bridgman's requirements for earthquake runaway in a polymorphic transformation could be possible in nature, and Green and Burnley (1989) found such an instability during the transformation of metastable olivine to spinel. Recent seismic correlation of intermediate-depth-earthquake hypocenters with predicted conditions of dehydration of antigorite serpentine, and the discovery of metastable olivine in 4 subduction zones, suggest strongly that dehydration embrittlement and transformation-induced faulting are the underlying mechanisms of intermediate and deep earthquakes, respectively. The results of recent high-speed friction experiments and analysis of natural fault zones suggest that similar processes likely occur commonly during many shallow earthquakes after initiation by frictional failure.

  10. Harnessing the Collective Power of Eyewitnesses for Improved Earthquake Information

    NASA Astrophysics Data System (ADS)

    Bossu, R.; Lefebvre, S.; Mazet-Roux, G.; Steed, R.

    2013-12-01

    The Euro-Med Seismological Centre (EMSC) operates the second most visited global earthquake information website (www.emsc-csem.org), which attracts 2 million visits a month from about 200 different countries. We collect information about earthquakes' effects from eyewitnesses, such as online questionnaires and geolocated pictures, to rapidly constrain impact scenarios. At the beginning, the collection was purely intended to address a scientific issue: the rapid evaluation of an earthquake's impact. However, it rapidly became apparent that understanding eyewitnesses' expectations and motivations in the immediate aftermath of an earthquake was essential to optimise this data collection. Crowdsourcing information on earthquakes' effects does not apply to a pre-existing community: by definition, eyewitnesses only exist once the earthquake has struck. We developed a strategy on social networks (Facebook, Google+, Twitter...) to interface with spontaneously emerging online communities of eyewitnesses. The basic idea is to create a positive feedback loop: attract eyewitnesses and engage with them by providing expected earthquake information and services, collect their observations, and collate them for improved earthquake information services to attract more witnesses. We will present recent examples to illustrate how important the use of social networks is for engaging with eyewitnesses, especially in regions of low seismic activity where people are unaware of existing Internet resources dealing with earthquakes. A second type of information collated in our information services is derived from the real-time analysis of the traffic on our website in the first minutes following an earthquake occurrence, an approach named flashsourcing. We show, using the example of the Mineral, Virginia earthquake, that the arrival times of eyewitnesses at our website follow the propagation of the generated seismic waves and, therefore, that eyewitnesses can be considered as ground motion sensors. 
Flashsourcing discriminates felt earthquakes within, on average, 90 s of their occurrence, and can map, in certain cases, the damaged areas. Thanks to the flashsourced and crowdsourced information, we developed an innovative Twitter earthquake information service (currently under test and to be opened by November) which intends to offer notifications only for earthquakes that matter to the public. It provides timely information for felt and damaging earthquakes regardless of their magnitude, and heads-up alerts for seismologists. In conclusion, the experience developed at the EMSC demonstrates the benefit of involving eyewitnesses in earthquake surveillance. The data collected directly and indirectly from eyewitnesses complement information derived from monitoring networks and contribute to improved services. By increasing interaction between science and society, it opens new opportunities for raising awareness of seismic hazard.

  11. A search for paleoliquefaction and evidence bearing on the recurrence behavior of the great 1811-12 New Madrid earthquakes

    USGS Publications Warehouse

    Wesnousky, S.G.; Leffler, L.M.

    1994-01-01

    In the winter of 1811-12, three of the largest historic earthquakes in the United States occurred near New Madrid, Missouri. Seismicity continues to the present day throughout a tightly clustered pattern of epicenters centered on the bootheel of Missouri, including parts of northeastern Arkansas, northwestern Tennessee, western Kentucky, and southern Illinois. In 1990, the New Madrid seismic zone/central United States became the first seismically active region east of the Rocky Mountains to be designated a priority research area within the National Earthquake Hazards Reduction Program (NEHRP). This professional paper is a collection of papers, some published separately, presenting results of the newly intensified research program in this area. Major components of this research program include tectonic framework studies, seismicity and deformation monitoring and modeling, improved seismic hazard and risk assessments, and cooperative hazard mitigation studies.

  12. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    NASA Astrophysics Data System (ADS)

    Saragoni, G. Rodolfo

    2008-07-01

    The recent commemoration of the centennial of the 1906 San Francisco and Valparaiso earthquakes has given the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus the 1908 Messina-Reggio Calabria earthquake, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of some still-standing buildings that survived centennial earthquakes represents a challenge to better understand the limitations of the earthquake design methods currently in use. Of the three centennial earthquakes considered, only the 1906 Valparaiso earthquake has been repeated, as the 1985 Central Chile Ms = 7.8 earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is carried out in the neighborhood of Valparaiso harbor. In this study the only three centennial buildings of 3 stories that survived both earthquakes almost undamaged were identified. Since accelerograms for the 1985 earthquake were recorded both at El Almendral soil conditions and on rock, the vulnerability analysis of these buildings is done considering instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. Therefore, it is recommended that more suitable instrumental parameters, such as the destructiveness potential factor, be used in the future to describe earthquake demand.

  13. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    SciTech Connect

    Saragoni, G. Rodolfo

    2008-07-08

    The recent commemoration of the centennial of the 1906 San Francisco and Valparaiso earthquakes has given the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus the 1908 Messina-Reggio Calabria earthquake, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of some still-standing buildings that survived centennial earthquakes represents a challenge to better understand the limitations of the earthquake design methods currently in use. Of the three centennial earthquakes considered, only the 1906 Valparaiso earthquake has been repeated, as the 1985 Central Chile Ms = 7.8 earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is carried out in the neighborhood of Valparaiso harbor. In this study the only three centennial buildings of 3 stories that survived both earthquakes almost undamaged were identified. Since accelerograms for the 1985 earthquake were recorded both at El Almendral soil conditions and on rock, the vulnerability analysis of these buildings is done considering instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. Therefore, it is recommended that more suitable instrumental parameters, such as the destructiveness potential factor, be used in the future to describe earthquake demand.

  14. Implications for earthquake risk reduction in the United States from the Kocaeli, Turkey, earthquake of August 17, 1999

    USGS Publications Warehouse

    U.S. Geological Survey

    2000-01-01

    This report documents implications for earthquake risk reduction in the United States. The magnitude 7.4 earthquake caused 17,127 deaths, 43,953 injuries, and displaced more than 250,000 people from their homes. The report warns that similar disasters are possible in the United States, where earthquakes of comparable size could strike the heart of American urban areas. Another concern described in the report is the delayed emergency response caused by the inadequate seismic monitoring system in Turkey, a problem that contrasts sharply with the rapid assessment of and response to the September 1999 Chi-Chi earthquake in Taiwan. Additionally, the experience in Turkey suggests that techniques for forecasting earthquakes may be improving.

  15. Magnitude and location of historical earthquakes in Japan and implications for the 1855 Ansei Edo earthquake

    USGS Publications Warehouse

    Bakun, W.H.

    2005-01-01

    Japan Meteorological Agency (JMA) intensity assignments IJMA are used to derive intensity attenuation models suitable for estimating the location and an intensity magnitude Mjma for historical earthquakes in Japan. The intensity for shallow crustal earthquakes on Honshu is IJMA = -1.89 + 1.42MJMA - 0.00887Δh - 1.66 log Δh, where MJMA is the JMA magnitude, Δh = (Δ² + h²)^(1/2), and Δ and h are epicentral distance and focal depth (km), respectively. Four earthquakes located near the Japan Trench were used to develop a subducting-plate intensity attenuation model in which intensity is IJMA = -8.33 + 2.19MJMA - 0.00550Δh - 1.14 log Δh. The IJMA assignments for the MJMA 7.9 great 1923 Kanto earthquake on the Philippine Sea-Eurasian plate interface are consistent with the subducting-plate model. Using the subducting-plate model and 226 IJMA IV-VI assignments, the location of the intensity center is 25 km north of the epicenter, Mjma is 7.7, and MJMA is 7.3-8.0 at the 1σ confidence level. Intensity assignments and reported aftershock activity for the enigmatic 11 November 1855 Ansei Edo earthquake are consistent with an MJMA 7.2 Philippine Sea-Eurasian interplate source or a Philippine Sea intraslab source at about 30 km depth. If the 1855 earthquake was a Philippine Sea-Eurasian interplate event, the intensity center was adjacent to and downdip of the rupture area of the great 1923 Kanto earthquake, suggesting that the 1855 and 1923 events ruptured adjoining sections of the Philippine Sea-Eurasian plate interface.
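The two attenuation relations quoted in this abstract are simple closed-form expressions, so predicted intensity at a site can be sketched directly. The coefficients below are the ones stated in the abstract; the function names and the example distances are illustrative, not part of the original work.

```python
import math

def intensity_honshu(m_jma, delta_km, depth_km):
    """Predicted JMA intensity for shallow crustal earthquakes on Honshu:
    IJMA = -1.89 + 1.42*M - 0.00887*Dh - 1.66*log10(Dh),
    where Dh = sqrt(delta^2 + h^2) is the slant distance in km."""
    dh = math.sqrt(delta_km ** 2 + depth_km ** 2)
    return -1.89 + 1.42 * m_jma - 0.00887 * dh - 1.66 * math.log10(dh)

def intensity_subducting(m_jma, delta_km, depth_km):
    """Subducting-plate model derived from four Japan Trench earthquakes:
    IJMA = -8.33 + 2.19*M - 0.00550*Dh - 1.14*log10(Dh)."""
    dh = math.sqrt(delta_km ** 2 + depth_km ** 2)
    return -8.33 + 2.19 * m_jma - 0.00550 * dh - 1.14 * math.log10(dh)
```

Both models predict intensity that decays with slant distance, so inverting a set of observed IJMA assignments for the best-fitting epicenter and magnitude is what yields the "intensity center" and Mjma estimates discussed above.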

  16. Earthquakes and emergence

    NASA Astrophysics Data System (ADS)

    Earthquakes and emerging infections may not have a direct cause and effect relationship like tax evasion and jail, but new evidence suggests that there may be a link between the two human health hazards. Various media accounts have cited a massive 1993 earthquake in Maharashtra as a potential catalyst of the recent outbreak of plague in India that has claimed more than 50 lives and alarmed the world. The hypothesis is that the earthquake may have uprooted underground rat populations that carry the fleas infected with the bacterium that causes bubonic plague and can lead to the pneumonic form of the disease that is spread through the air.

  17. An Atlas of ShakeMaps for Selected Global Earthquakes

    USGS Publications Warehouse

    Allen, Trevor I.; Wald, David J.; Hotovec, Alicia J.; Lin, Kuo-Wan; Earle, Paul; Marano, Kristin D.

    2008-01-01

    An atlas of maps of peak ground motions and intensity 'ShakeMaps' has been developed for almost 5,000 recent and historical global earthquakes. These maps are produced using established ShakeMap methodology (Wald and others, 1999c; Wald and others, 2005) and constraints from macroseismic intensity data, instrumental ground motions, regional topographically-based site amplifications, and published earthquake-rupture models. Applying the ShakeMap methodology allows a consistent approach to combine point observations with ground-motion predictions to produce descriptions of peak ground motions and intensity for each event. We also calculate an estimated ground-motion uncertainty grid for each earthquake. The Atlas of ShakeMaps provides a consistent and quantitative description of the distribution and intensity of shaking for recent global earthquakes (1973-2007) as well as selected historic events. As such, the Atlas was developed specifically for calibrating global earthquake loss estimation methodologies to be used in the U.S. Geological Survey Prompt Assessment of Global Earthquakes for Response (PAGER) Project. PAGER will employ these loss models to rapidly estimate the impact of global earthquakes as part of the USGS National Earthquake Information Center's earthquake-response protocol. The development of the Atlas of ShakeMaps has also led to several key improvements to the Global ShakeMap system. The key upgrades include: addition of uncertainties in the ground motion mapping, introduction of modern ground-motion prediction equations, improved estimates of global seismic-site conditions (VS30), and improved definition of stable continental region polygons. Finally, we have merged all of the ShakeMaps in the Atlas to provide a global perspective of earthquake ground shaking for the past 35 years, allowing comparison with probabilistic hazard maps. 
The online Atlas and supporting databases can be found at http://earthquake.usgs.gov/eqcenter/shakemap/atlas.php/.

  18. Patient experiences with self-monitoring renal function after renal transplantation: results from a single-center prospective pilot study

    PubMed Central

    van Lint, Céline L; van der Boog, Paul JM; Wang, Wenxin; Brinkman, Willem-Paul; Rövekamp, Ton JM; Neerincx, Mark A; Rabelink, Ton J; van Dijk, Sandra

    2015-01-01

    Background After a kidney transplantation, patients have to visit the hospital often to monitor for early signs of graft rejection. Self-monitoring of creatinine in addition to blood pressure at home could alleviate the burden of frequent outpatient visits, but only if patients are willing to self-monitor and if they adhere to the self-monitoring measurement regimen. A prospective pilot study was conducted to assess patients’ experiences and satisfaction. Materials and methods For 3 months after transplantation, 30 patients registered self-measured creatinine and blood pressure values in an online record to which their physician had access. Patients completed a questionnaire at baseline and follow-up to assess satisfaction, attitude, self-efficacy regarding self-monitoring, worries, and physician support. Adherence was studied by comparing the number of registered measurements with the number requested. Results Patients were highly motivated to self-monitor kidney function, and reported high levels of general satisfaction. Level of satisfaction was positively related to perceived support from physicians (P<0.01), level of self-efficacy (P<0.01), and amount of trust in the accuracy of the creatinine meter (P<0.01). The use of both the creatinine and blood pressure meter was considered pleasant and useful, despite the level of trust in the accuracy of the creatinine device being relatively low. Trust in the accuracy of the creatinine device appeared to be related to the level of variation in subsequent measurement results, with more variation being related to lower levels of trust. Protocol adherence was generally very high, although the range of adherence levels was large and increased over time. Conclusion Patients’ high levels of satisfaction suggest that at-home monitoring of creatinine and blood pressure after transplantation offers a promising strategy. 
Important prerequisites for safe implementation in transplant care seem to be support from physicians and patients’ confidence in both their own self-monitoring skills and the accuracy of the devices used. PMID:26673985

  19. Earthquakes in the Central United States, 1699-2010

    USGS Publications Warehouse

    Dart, Richard L.; Volpi, Christina M.

    2010-01-01

    This publication is an update of an earlier report, U.S. Geological Survey (USGS) Geologic Investigation I-2812 by Wheeler and others (2003), titled "Earthquakes in the Central United States - 1699-2002." Like the original poster, the center of the updated poster is a map showing the pattern of earthquake locations in the most seismically active part of the central United States. Arrayed around the map are short explanatory texts and graphics, which describe the distribution of historical earthquakes and the effects of the most notable of them. The updated poster contains additional, post-2002, earthquake data: 38 earthquakes covering the time interval from January 2003 to June 2010, including the Mount Carmel, Illinois, earthquake of 2008. The USGS Preliminary Determination of Epicenters (PDE) was the source of these additional data. Like the I-2812 poster, this poster was prepared for a nontechnical audience and designed to inform the general public of the widespread occurrence of felt and damaging earthquakes in the Central United States. Accordingly, the poster should not be used to assess earthquake hazard in small areas or at individual locations.

  20. An application of earthquake prediction algorithm M8 in eastern Anatolia at the approach of the 2011 Van earthquake

    NASA Astrophysics Data System (ADS)

    Mojarab, Masoud; Kossobokov, Vladimir; Memarian, Hossein; Zare, Mehdi

    2015-07-01

    On 23 October 2011, an M7.3 earthquake near the Turkish city of Van killed more than 600 people, injured over 4000, and left about 60,000 homeless. It demolished hundreds of buildings and caused great damage to thousands of others in Van, Ercis, Muradiye, and Çaldıran. The earthquake's epicenter is located about 70 km from that of a preceding M7.3 earthquake that occurred in November 1976, destroyed several villages near the Turkey-Iran border, and killed thousands of people. This study, by means of a retrospective application of the M8 algorithm, checks whether the 2011 Van earthquake could have been predicted. The algorithm is based on pattern recognition of Times of Increased Probability (TIP) of a target earthquake from the transient seismic sequence at lower magnitude ranges in a Circle of Investigation (CI). Specifically, we applied a modified M8 algorithm adjusted to a rather low level of earthquake detection in the region, following three different approaches to determining seismic transients. In the first approach, CI centers are distributed on intersections of morphostructural lineaments recognized as prone to magnitude 7+ earthquakes. In the second approach, centers of CIs are distributed on local extremes of the seismic density distribution, and in the third approach, CI centers are distributed uniformly on the nodes of a 1°×1° grid. According to the results of the M8 algorithm application, the 2011 Van earthquake could have been predicted under any of the three approaches. We note that it is possible to consider the intersection of TIPs instead of their union to improve the certainty of the prediction results. Our study confirms the applicability of a modified version of the M8 algorithm for predicting earthquakes on the Iranian-Turkish plateau, as well as for mitigating damage in seismic events, in which pattern recognition algorithms may play an important role.
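The third placement strategy in this abstract, distributing Circle-of-Investigation centers on the nodes of a regular latitude-longitude grid, can be sketched in a few lines. This is only an illustration of the grid layout; the actual M8 algorithm then analyzes the seismicity inside each circle, which is well beyond this snippet.

```python
def grid_ci_centers(lat_min, lat_max, lon_min, lon_max, step=1.0):
    """Place Circle-of-Investigation centers uniformly on the nodes of a
    step x step degree grid covering the study region (third approach
    described in the abstract). Returns a list of (lat, lon) tuples."""
    centers = []
    lat = lat_min
    while lat <= lat_max + 1e-9:  # tolerance for float accumulation
        lon = lon_min
        while lon <= lon_max + 1e-9:
            centers.append((lat, lon))
            lon += step
        lat += step
    return centers
```

For a region such as eastern Anatolia, a 1°×1° grid over a few degrees of latitude and longitude yields a few dozen candidate centers, each of which would anchor one CI for the TIP diagnosis.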

  1. Earthquakes and faults in southern California (1970-2010)

    USGS Publications Warehouse

    Sleeter, Benjamin M.; Calzia, James P.; Walter, Stephen R.

    2012-01-01

    The map depicts both active and inactive faults and earthquakes of magnitude 1.5 to 7.3 in southern California (1970-2010). The bathymetry was generated from digital files from the California Department of Fish and Game, Marine Region, Coastal Bathymetry Project. Elevation data are from the U.S. Geological Survey National Elevation Database. The Landsat satellite image is from fourteen Landsat 5 Thematic Mapper scenes collected between 2009 and 2010. Fault data are reproduced with permission from 2006 California Geological Survey and U.S. Geological Survey data. The earthquake data are from the U.S. Geological Survey National Earthquake Information Center.

  2. Business Activity Monitoring: Real-Time Group Goals and Feedback Using an Overhead Scoreboard in a Distribution Center

    ERIC Educational Resources Information Center

    Goomas, David T.; Smith, Stuart M.; Ludwig, Timothy D.

    2011-01-01

    Companies operating large industrial settings often find delivering timely and accurate feedback to employees to be one of the toughest challenges they face in implementing performance management programs. In this report, an overhead scoreboard at a retailer's distribution center informed teams of order selectors as to how many tasks were…

  3. Satellite Relay Telemetry of Seismic Data in Earthquake Prediction and Control

    NASA Technical Reports Server (NTRS)

    Jackson, W. H.; Eaton, J. P.

    1971-01-01

    The Satellite Telemetry Earthquake Monitoring Program was started to evaluate the applicability of satellite relay telemetry in the collection of seismic data from a large number of dense seismograph clusters laid out along the major fault systems of western North America. Prototype clusters utilizing phone-line telemetry were then being installed by the National Center for Earthquake Research in 3 regions along the San Andreas fault in central California; and the experience of installing and operating the clusters and in reducing and analyzing the seismic data from them was to provide the raw materials for evaluation in the satellite relay telemetry project. The principal advantages of the satellite relay system over commercial telephone or microwave systems were: (1) it could be made less prone to massive failure during a major earthquake; (2) it could be extended readily into undeveloped regions; and (3) it could provide flexible, uniform communications over large sections of major global tectonic zones. Fundamental characteristics of a communications system to cope with the large volume of raw data collected by a short-period seismograph network are discussed.

  4. Shaken, not stirred: a serendipitous study of ants and earthquakes.

    PubMed

    Lighton, John R B; Duncan, Frances D

    2005-08-01

    There is anecdotal evidence for profound behavioral changes prior to and during earthquakes in many organisms, including arthropods such as ants. Behavioral or physiological analysis has often, in light of these reports, been proposed as a means of earthquake prediction. We report here a serendipitous study of the effect of the powerful Landers earthquake in the Mojave Desert, USA (Richter magnitude 7.4) on ant trail dynamics and aerobic catabolism in the desert harvester ant Messor pergandei. We monitored trail traffic rates to and from the colony, trail speed, worker mass distributions, rates of aerobic catabolism and temperature at ant height before and during the earthquake, and for 3 days after the earthquake. Contrary to anecdotal reports of earthquake effects on ant behavior, the Landers earthquake had no effect on any measured aspect of the physiology or behavior of M. pergandei. We conclude that anecdotal accounts of the effects of earthquakes or their precursors on insect behavior should be interpreted with caution. PMID:16081608

  5. GEM - The Global Earthquake Model

    NASA Astrophysics Data System (ADS)

    Smolka, A.

    2009-04-01

    Over 500,000 people died in the last decade