Science.gov

Sample records for earthquake monitoring center

  1. Comprehensive Seismic Monitoring for Emergency Response and Hazards Assessment: Recent Developments at the USGS National Earthquake Information Center

    NASA Astrophysics Data System (ADS)

    Buland, R. P.; Guy, M.; Kragness, D.; Patton, J.; Erickson, B.; Morrison, M.; Bryon, C.; Ketchum, D.; Benz, H.

    2009-12-01

    The USGS National Earthquake Information Center (NEIC) has put into operation a new generation of seismic acquisition, processing and distribution subsystems that seamlessly integrate regional, national and global seismic network data for routine monitoring of earthquake activity and response to large, damaging earthquakes. The system, Bulletin Hydra, was designed to meet Advanced National Seismic System (ANSS) design goals to handle thousands of channels of real-time seismic data, compute and distribute time-critical seismic information for emergency response applications, and manage the integration of contributed earthquake products and information, arriving from near-real-time up to six weeks after an event. Bulletin Hydra is able to meet these goals due to a modular, scalable, and flexible architecture that supports on-the-fly consumption of new data, readily allows for the addition of new scientific processing modules, and provides distributed client workflow management displays. Through the Edge subsystem, Bulletin Hydra accepts waveforms in half a dozen formats. In addition, Bulletin Hydra accepts contributed seismic information including hypocenters, magnitudes, moment tensors, unassociated and associated picks, and amplitudes in a variety of formats including earthworm import/export pairs and EIDS. Bulletin Hydra has state-driven algorithms for computing all IASPEI standard magnitudes (e.g., mb, mb_BB, ML, mb_LG, Ms_20, and Ms_BB) as well as Md and Ms(VMAX), moment tensor algorithms for modeling different portions of the wave field at different distances (e.g., teleseismic body-wave, centroid, and regional moment tensors), and broadband depth. All contributed and derived data are centrally managed in an Oracle database. To improve on single-station observations, Bulletin Hydra also performs continuous real-time beamforming of high-frequency arrays. Finally, workflow management displays are used to assist NEIC analysts in their day-to-day duties.
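
    As a small, hedged illustration of one of the IASPEI standard magnitudes listed above (and not a reproduction of NEIC's actual code), the sketch below implements the classic Prague formula commonly used for Ms_20; the function name and example values are illustrative.

      # Illustrative sketch, not NEIC's implementation: the Prague formula for the
      # IASPEI Ms_20 surface-wave magnitude (amplitude in micrometers near 20 s period,
      # epicentral distance in degrees, valid roughly 20-160 degrees).
      import math

      def ms_20(amplitude_um: float, period_s: float, distance_deg: float) -> float:
          return math.log10(amplitude_um / period_s) + 1.66 * math.log10(distance_deg) + 3.3

      # Example: 10 micrometers of ground amplitude at 20 s period, 60 degrees away
      print(round(ms_20(10.0, 20.0, 60.0), 2))   # ~5.95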

  2. NEIC - the National Earthquake Information Center

    USGS Publications Warehouse

    Masse, R.P.; Needham, R.E.

    1989-01-01

    The National Earthquake Information Center of the US Geological Survey has three main missions. First, the NEIC determines, as rapidly and as accurately as possible, the location and size of all destructive earthquakes that occur worldwide. Second, the NEIC collects and provides to scientists and to the public an extensive seismic database that serves as a solid foundation for scientific research. Third, the NEIC pursues an active research program to improve its ability to locate earthquakes and to understand the earthquake mechanism. These efforts are all aimed at mitigating the risks of earthquakes to mankind. -from Authors

  3. NEIC; the National Earthquake Information Center

    USGS Publications Warehouse

    Masse, R.P.; Needham, R.E.

    1989-01-01

    At least 9,500 people were killed, 30,000 were injured, and 100,000 were left homeless by this earthquake. According to some unconfirmed reports, the death toll may have been as high as 35,000. This earthquake is estimated to have seriously affected an area of 825,000 square kilometers, caused between 3 and 4 billion dollars in damage, and been felt by 20 million people.

  4. An overview of the National Earthquake Information Center acquisition software system, Edge/Continuous Waveform Buffer

    USGS Publications Warehouse

    Patton, John M.; Ketchum, David C.; Guy, Michelle R.

    2015-01-01

    This document provides an overview of the capabilities, design, and use cases of the data acquisition and archiving subsystem at the U.S. Geological Survey National Earthquake Information Center. The Edge and Continuous Waveform Buffer software supports the National Earthquake Information Center’s worldwide earthquake monitoring mission in direct station data acquisition, data import, short- and long-term data archiving, data distribution, query services, and playback, among other capabilities. The software design and architecture can be configured to support acquisition and (or) archiving use cases. The software continues to be developed in order to expand the acquisition, storage, and distribution capabilities.

  5. An overview of the National Earthquake Information Center acquisition software system, Edge/Continuous Waveform Buffer

    USGS Publications Warehouse

    Patton, John M.; Ketchum, David C.; Guy, Michelle R.

    2015-11-02

    This document provides an overview of the capabilities, design, and use cases of the data acquisition and archiving subsystem at the U.S. Geological Survey National Earthquake Information Center. The Edge and Continuous Waveform Buffer software supports the National Earthquake Information Center’s worldwide earthquake monitoring mission in direct station data acquisition, data import, short- and long-term data archiving, data distribution, query services, and playback, among other capabilities. The software design and architecture can be configured to support acquisition and (or) archiving use cases. The software continues to be developed in order to expand the acquisition, storage, and distribution capabilities.

  6. Earthquake Observation through Groundwater Monitoring in South Korea

    NASA Astrophysics Data System (ADS)

    Piao, J.; Woo, N. C.

    2014-12-01

    Previous research has shown that the influence of some earthquakes can be detected by groundwater monitoring; in some countries, groundwater monitoring is even used as an important tool for identifying earthquake precursors and as a prediction measure. In this study we therefore attempt to capture the anomalous changes in groundwater produced by earthquakes in Korea using the National Groundwater Monitoring Network (NGMN). To observe earthquake impacts on groundwater more effectively, we selected 28 stations of the National Groundwater Monitoring Network located in the five earthquake-prone zones of South Korea, and we searched for responses to eight earthquakes with M ≥ 2.5 that occurred in the vicinity of these zones in 2012. So far, we have examined the groundwater monitoring data (water level, temperature and electrical conductivity), which have only been corrected for barometric pressure changes. We found 29 anomalous changes, confirming that groundwater monitoring data can provide valuable information on earthquake effects. To isolate the effect of an earthquake in the mixed water-level signal, other signals must be separated from the original data. Periodic signals will be separated from the original data using the Fast Fourier Transform (FFT); after that we will attempt to separate the precipitation effect and determine whether the anomalies were generated by earthquakes or not.
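
    The FFT separation step described at the end of this abstract can be sketched as follows; this is a generic, hedged example with synthetic data, and the tidal periods and peak-removal choices are assumptions rather than the study's actual processing.

      # Hedged sketch: remove the strongest periodic (e.g., tidal) components from an
      # hourly groundwater-level series with an FFT so residual anomalies stand out.
      import numpy as np

      def remove_periodic(levels, top_k=3):
          """Zero the top_k strongest non-zero-frequency components; return the residual."""
          spec = np.fft.rfft(levels - levels.mean())
          peaks = np.argsort(np.abs(spec[1:]))[-top_k:] + 1   # skip the DC term
          spec[peaks] = 0.0
          return np.fft.irfft(spec, n=len(levels))

      # Synthetic example: semidiurnal (12.42 h) and diurnal (24 h) tides plus noise
      t = np.arange(0, 24 * 30, 1.0)                      # 30 days of hourly samples
      levels = 0.05 * np.sin(2 * np.pi * t / 12.42) + 0.03 * np.sin(2 * np.pi * t / 24.0)
      levels = levels + 0.005 * np.random.randn(t.size)
      residual = remove_periodic(levels)
      print(residual.std())                               # variance left after de-tiding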

  7. Twitter earthquake detection: Earthquake monitoring in a social world

    USGS Publications Warehouse

    Earle, Paul; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
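
    The short-term-average, long-term-average detector described above can be sketched in a few lines; the window lengths and threshold below are illustrative assumptions rather than the USGS-tuned values, and the tweet counts are synthetic.

      # Minimal STA/LTA sketch on a per-second count of tweets containing "earthquake".
      import numpy as np

      def sta_lta_triggers(counts, sta_len=60, lta_len=900, threshold=5.0):
          """Return sample indices where the STA/LTA ratio exceeds the threshold."""
          sta = np.convolve(counts, np.ones(sta_len) / sta_len, mode="same")
          lta = np.convolve(counts, np.ones(lta_len) / lta_len, mode="same")
          return np.flatnonzero(sta / np.maximum(lta, 1e-6) > threshold)

      # Synthetic example: Poisson background chatter with a tweet burst at t = 5000 s
      counts = np.random.poisson(0.2, 10_000).astype(float)
      counts[5000:5120] += np.random.poisson(8.0, 120)
      print(sta_lta_triggers(counts)[:5])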

  8. Remote sensing hazard monitoring and assessment in Yushu earthquake disaster

    NASA Astrophysics Data System (ADS)

    Wen, Qi; Xu, Feng; Chen, Shirong

    2011-12-01

    The 2010 Yushu earthquake of magnitude 7.1 caused enormous losses of life and property in China. The National Disaster Reduction Center of China carried out disaster assessment using remote sensing images and field investigation. A preliminary judgment of the disaster scope and damage extent was obtained by change detection. The built-up area of the hard-hit Jiegu town was then partitioned into a three-level grid in airborne remote sensing images according to street, type of use and structure, and about 685 grid cells were numbered. Expert hazard-assessment groups were sent to carry out field investigations grid by grid. The scope of housing damage and the extent of loss were then redefined by integrating the field investigation data with information reported by local governments. Although remote sensing technology has played an important role in monitoring and assessing major disasters, the degree of automation of the disaster-information extraction workflow, the three-dimensional disaster monitoring mode, and the bidirectional feedback mechanism between products and services still need further improvement.

  9. Monitoring the Earthquake source process in North America

    USGS Publications Warehouse

    Herrmann, Robert B.; Benz, H.; Ammon, C.J.

    2011-01-01

    With the implementation of the USGS National Earthquake Information Center Prompt Assessment of Global Earthquakes for Response system (PAGER), rapid determination of earthquake moment magnitude is essential, especially for earthquakes that are felt within the contiguous United States. We report an implementation of moment tensor processing for application to broad, seismically active areas of North America. This effort focuses on the selection of regional crustal velocity models, codification of data quality tests, and the development of procedures for rapid computation of the seismic moment tensor. We systematically apply these techniques to earthquakes with reported magnitude greater than 3.5 in continental North America that are not associated with a tectonic plate boundary. Using the 0.02-0.10 Hz passband, we can usually determine, with few exceptions, moment tensor solutions for earthquakes with Mw as small as 3.7. The threshold is significantly influenced by the density of stations, the location of the earthquake relative to the seismic stations and, of course, the signal-to-noise ratio. With the existing permanent broadband stations in North America operated for rapid earthquake response, the seismic moment tensor of most earthquakes that are Mw 4 or larger can be routinely computed. As expected, the nonuniform spatial pattern of these solutions reflects the seismicity pattern. However, the orientation of the direction of maximum compressive stress and the predominant style of faulting are spatially coherent across large regions of the continent.

  10. The USGS National Earthquake Information Center's Response to the Wenchuan, China Earthquake

    NASA Astrophysics Data System (ADS)

    Earle, P. S.; Wald, D. J.; Benz, H.; Sipkin, S.; Dewey, J.; Allen, T.; Jaiswal, K.; Buland, R.; Choy, G.; Hayes, G.; Hutko, A.

    2008-12-01

    Immediately after detecting the May 12th, 2008 Mw 7.9 Wenchuan Earthquake, the USGS National Earthquake Information Center (NEIC) began a coordinated effort to understand and communicate the earthquake's seismological characteristics, tectonic context, and humanitarian impact. NEIC's initial estimates of magnitude and location were distributed within 30 minutes of the quake by e-mail and text message to 70,000 users via the Earthquake Notification System. The release of these basic parameters automatically triggered the generation of more sophisticated derivative products that were used by relief and government agencies to plan their humanitarian response to the disaster. Body-wave and centroid moment tensors identified the earthquake's mechanism. Predictive ShakeMaps provided the first estimates of the geographic extent and amplitude of shaking. The initial automated population exposure estimate generated and distributed by the Prompt Assessment of Global Earthquakes for Response (PAGER) system stated that 1.2 million people were exposed to severe-to-extreme shaking (Modified Mercalli Intensity VIII or greater), indicating a large-scale disaster had occurred. NEIC's modeling of the mainshock and aftershocks was continuously refined and expanded. The length and orientation of the fault were determined from aftershocks, finite-fault models, and back-projection source imaging. Firsthand accounts of shaking intensity were collected and mapped by the "Did You Feel It" system. These results were used to refine our ShakeMaps and PAGER exposure estimates providing a more accurate assessment of the extent and enormity of the disaster. The products were organized and distributed in an event-specific summary poster and via the USGS Earthquake Program web pages where they were viewed by millions and reproduced by major media outlets (over 1/2 billion hits were served that month). Rather than just a point showing magnitude and epicenter, several of the media's schematic maps

  11. Southern California Earthquake Center (SCEC) Summer Internship Programs

    NASA Astrophysics Data System (ADS)

    Benthien, M. L.; Perry, S.; Jordan, T. H.

    2004-12-01

    For the eleventh consecutive year, the Southern California Earthquake Center (SCEC) coordinated undergraduate research experiences in summer 2004, allowing 35 students with a broad array of backgrounds and interests to work with the world's preeminent earthquake scientists and specialists. Students participate in interdisciplinary, system-level earthquake science and information technology research, and several group activities throughout the summer. Funding for student stipends and activities is made possible by the NSF Research Experiences for Undergraduates (REU) program. SCEC coordinates two intern programs: The SCEC Summer Undergraduate Research Experience (SCEC/SURE) and the SCEC Undergraduate Summer in Earthquake Information Technology (SCEC/USEIT). SCEC/SURE interns work one-on-one with SCEC scientists at their institutions on a variety of earthquake science research projects. The goals of the program are to expand student participation in the earth sciences and related disciplines, encourage students to consider careers in research and education, and to increase diversity of students and researchers in the earth sciences. 13 students participated in this program in 2004. SCEC/USEIT is an NSF REU site that brings undergraduate students from across the country to the University of Southern California each summer. SCEC/USEIT interns interact in a team-oriented research environment and are mentored by some of the nation's most distinguished geoscience and computer science researchers. The goals of the program are to allow undergraduates to use advanced tools of information technology to solve problems in earthquake research; close the gap between computer science and geoscience; and engage non-geoscience majors in the application of earth science to the practical problems of reducing earthquake risk. SCEC/USEIT summer research goals are structured around a grand challenge problem in earthquake information technology. For the past three years the students have

  12. Recent improvements in earthquake and tsunami monitoring in the Caribbean

    NASA Astrophysics Data System (ADS)

    Gee, L.; Green, D.; McNamara, D.; Whitmore, P.; Weaver, J.; Huang, P.; Benz, H.

    2007-12-01

    Following the catastrophic loss of life from the December 26, 2004, Sumatra-Andaman Islands earthquake and tsunami, the U.S. Government appropriated funds to improve monitoring along a major portion of vulnerable coastal regions in the Caribbean Sea, the Gulf of Mexico, and the Atlantic Ocean. Partners in this project include the United States Geological Survey (USGS), the National Oceanic and Atmospheric Administration (NOAA), the Puerto Rico Seismic Network (PRSN), the Seismic Research Unit of the University of the West Indies, and other collaborating institutions in the Caribbean region. As part of this effort, the USGS is coordinating with Caribbean host nations to design and deploy nine new broadband and strong-motion seismic stations. The instrumentation consists of an STS-2 seismometer, an Episensor accelerometer, and a Q330 high-resolution digitizer. Six stations are currently transmitting data to the USGS National Earthquake Information Center, where the data are redistributed to NOAA's Tsunami Warning Centers, regional monitoring partners, and the IRIS Data Management Center. Operating stations include: Isla Barro Colorado, Panama; Gun Hill, Barbados; Grenville, Grenada; Guantanamo Bay, Cuba; Sabaneta Dam, Dominican Republic; and Tegucigalpa, Honduras. Three additional stations in Barbuda, Grand Turk, and Jamaica will be completed during the fall of 2007. These nine stations are affiliates of the Global Seismographic Network (GSN) and complement existing GSN stations as well as regional stations. The new seismic stations improve azimuthal coverage, increase network density, and provide on-scale recording throughout the region. Complementary to this network, NOAA has placed Deep-ocean Assessment and Reporting of Tsunami (DART) stations at sites in regions with a history of generating destructive tsunamis. Recently, NOAA completed deployment of 7 DART stations off the coasts of Montauk Pt, NY; Charleston, SC; Miami, FL; San Juan, Puerto Rico; New

  13. Northern California Earthquake Data Center: Data Sets and Data Services

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Allen, R. M.; Zuzlewski, S.

    2015-12-01

    The Northern California Earthquake Data Center (NCEDC) provides a permanent archive and real-time data distribution services for a unique and comprehensive set of seismological and geophysical data encompassing northern and central California. We provide access to over 85 terabytes of continuous and event-based time series data from broadband, short-period, strong-motion, and strain sensors as well as continuous and campaign GPS data at both standard and high sample rates. The Northern California Seismic System (NCSS), operated by UC Berkeley and USGS Menlo Park, has recorded over 900,000 events from 1984 to the present, and the NCEDC serves catalog, parametric information, moment tensors and first-motion mechanisms, and time series data for these events. We also serve event catalogs, parametric information, and event waveforms for DOE enhanced geothermal system monitoring in northern California and Nevada. The NCEDC provides several ways for users to access these data. The most recent development is web services, which provide interactive, command-line, or program-based workflow access to data. Web services use well-established server and client protocols and a RESTful software architecture that allow users to easily submit queries and receive the requested data in real time rather than through batch or email-based requests. Data are returned to the user in an appropriate format such as XML, RESP, simple text, or miniSEED, depending on the service and selected output format. The NCEDC supports all FDSN-defined web services as well as a number of IRIS-defined and NCEDC-defined services. We also continue to support older email-based and browser-based access to data. NCEDC data and web services can be found at http://www.ncedc.org and http://service.ncedc.org.
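
    Since the abstract notes that the NCEDC supports the FDSN-defined web services, a request can be sketched as below. The endpoint path follows the generic FDSN dataselect specification, and the network, station, and channel codes are placeholders, so treat this as an assumed example rather than verified NCEDC usage.

      # Hedged example: fetch miniSEED waveforms via an FDSN-style dataselect query.
      import requests

      params = {
          "net": "BK", "sta": "CMB", "loc": "00", "cha": "BHZ",   # placeholder codes
          "starttime": "2015-01-01T00:00:00",
          "endtime": "2015-01-01T00:10:00",
      }
      resp = requests.get("http://service.ncedc.org/fdsnws/dataselect/1/query",
                          params=params, timeout=60)
      if resp.ok:
          with open("waveform.mseed", "wb") as f:
              f.write(resp.content)                               # raw miniSEED bytes
      else:
          print("request failed:", resp.status_code)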

  14. Long Baseline Tilt Meter Array to Monitor Cascadia's Slow Earthquakes

    NASA Astrophysics Data System (ADS)

    Suszek, N.; Bilham, R.; Flake, R.; Melbourne, T. I.; Miller, M.

    2004-12-01

    Five biaxial Michelson tilt meters are currently being installed in the Puget Lowlands near Seattle to monitor dynamic tilt changes accompanying episodic slow earthquakes that occur at 20-40 km depth. Each tilt meter consists of a 1-2 m deep, 500-m-long, 15-cm-diameter, horizontal, half-filled water pipe, terminated by float sensors with sub-micron water-level resolution, similar to those that have operated unattended for the past decade within the Long Valley caldera. The sensors measure water height relative to the base of a pile driven to 10 m depth. A wide-body LVDT attached to this pile outside the reservoir senses the motion of the core attached to the float within. The voltage indicating the position of the core is sampled 16 times a second and digitally filtered before transmission via radio modem for storage as 1-minute samples in a remote computer. The computer gathers 16-bit water height, vault temperature, air pressure and various housekeeping data once per minute using remote telemetry. In the first of the tilt meters, installed in 2004, float sensors at each end of the pipe, plus one at its center, permit us to examine tilt-signal coherence and local noise. Each adjacent pair of sensors has a tilt resolution of 2e-9 and a range of 8 microradians. We anticipate tilt signals with durations of 0.3-30 days and amplitudes of less than 0.1 microradian associated with slow earthquakes. Anticipated noise levels in the tilt meters are 10-1000 times lower than these expected signals, similar to or better than signal-to-noise levels from planned strain meters of the PBO array.
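
    The quoted 2e-9 tilt resolution follows directly from the geometry: tilt is the differential water-height change divided by the baseline length, so sub-micron level resolution over a 500 m pipe gives a few nanoradians. A one-line check, using values taken from the abstract:

      # Tilt of a long-baseline water-pipe tiltmeter = differential height / baseline.
      def tilt_radians(delta_height_m, baseline_m):
          return delta_height_m / baseline_m

      print(tilt_radians(1e-6, 500.0))   # 2e-09 rad for 1 micron over a 500 m pipe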

  15. Real-time earthquake monitoring: Early warning and rapid response

    NASA Technical Reports Server (NTRS)

    1991-01-01

    A panel was established to investigate the subject of real-time earthquake monitoring (RTEM) and suggest recommendations on the feasibility of using a real-time earthquake warning system to mitigate earthquake damage in regions of the United States. The findings of the investigation and the related recommendations are described in this report. A brief review of existing real-time seismic systems is presented with particular emphasis given to the current California seismic networks. Specific applications of a real-time monitoring system are discussed along with issues related to system deployment and technical feasibility. In addition, several non-technical considerations are addressed including cost-benefit analysis, public perceptions, safety, and liability.

  16. Lessons Learned from Creating the Public Earthquake Resource Center at CERI

    NASA Astrophysics Data System (ADS)

    Patterson, G. L.; Michelle, D.; Johnston, A.

    2004-12-01

    The Center for Earthquake Research and Information (CERI) at the University of Memphis opened the Public Earthquake Resource Center (PERC) in May 2004. The PERC is an interactive display area that was designed to increase awareness of seismology, Earth science, earthquake hazards, and earthquake engineering among the general public and K-12 teachers and students. Funding for the PERC is provided by the US Geological Survey, the NSF-funded Mid-America Earthquake Center, and the University of Memphis, with input from the Incorporated Research Institutions for Seismology. Additional space at the facility houses local offices of the US Geological Survey. PERC exhibits are housed in a remodeled residential structure at CERI that was donated by the University of Memphis and the State of Tennessee. Exhibits were designed and built by CERI and US Geological Survey staff and faculty with the help of experienced museum display subcontractors. The 600-square-foot display area interactively introduces the basic concepts of seismology, real-time seismic information, seismic network operations, paleoseismology, building response, and historical earthquakes. Display components include three 22" flat-screen monitors, a touch-sensitive monitor, three helicorder elements, an oscilloscope, an AS-1 seismometer, a life-sized liquefaction trench, a liquefaction shake table, and a building-response shake table. All displays include custom graphics, text, and handouts. The PERC website at www.ceri.memphis.edu/perc also provides useful information such as tour scheduling, an "ask a geologist" feature, and links to other institutions, and will soon include a virtual tour of the facility. Special consideration was given to addressing state science standards for teaching and learning in the design of the displays and handouts. We feel this consideration is pivotal to the success of any grass-roots Earth science education and outreach program and represents a valuable lesson that has been learned at CERI over the last several

  17. Utilizing online monitoring of water wells for detecting earthquake precursors

    NASA Astrophysics Data System (ADS)

    Reuveni, Y.; Anker, Y.; Inbar, N.; Yellin-Dror, A.; Guttman, J.; Flexer, A.

    2015-12-01

    Groundwater reaction to earthquakes is well known and documented, mostly as changes in water levels or spring discharge, but also as changes in groundwater chemistry. During 2004, groundwater-level undulations preceded a series of moderate (ML~5) earthquakes that occurred along the Dead Sea Rift System (DSRS). In order to validate these preliminary observations, monitoring of several observation wells was initiated. The monitoring and telemetry infrastructure, as well as the wells themselves, were allocated specifically for the research by the Israeli National Water Company (Mekorot Ltd.). After several earthquake events were missed because of insufficient sampling frequency, and after insufficient storage capacity caused loss of data, it was decided to establish an independent monitoring system. This current stage of the research commenced in 2011 and has just recently become fully operative. At present there are four observation wells located along major faults adjacent to the DSRS. The wells must be inactive and have a confined production layer. The wells are equipped with sensors for groundwater level, water conductivity and groundwater temperature measurements. The data acquisition and transfer resolution is one minute, and the dataset is transferred through a GPRS network to a central database server. Since the start of the present research stage, most of the earthquakes recorded in the vicinity of the DSRS were smaller than ML 5, with groundwater responses only after the ground movement. Nonetheless, distant earthquakes occurring as far as 300 km away along a fault adjacent to the DSRS (ML~3) were noticed at the observation wells. A recent precursory recurrence was followed by an ML 5.5 earthquake with an epicenter near the eastern shore of the Red Sea, about 400 km south of the wells that registered the quake. In both wells, anomalies in water levels and conductivity were found a few hours before the quake, although any single anomaly cannot

  18. An academic center's delivery of care after the Haitian earthquake.

    PubMed

    Jaffer, Amir K; Campo, Rafael E; Gaski, Greg; Reyes, Mario; Gebhard, Ralf; Ginzburg, Enrique; Kolber, Michael A; Macdonald, John; Falcone, Steven; Green, Barth A; Barreras-Pagan, Lazara; O'Neill, William W

    2010-08-17

    The Miller School of Medicine of the University of Miami and Project Medishare, an affiliated not-for-profit organization, provided a large-scale relief effort in Haiti after the earthquake of 12 January 2010. Their experience demonstrates that academic medical centers in proximity to natural disasters can help deliver effective medical care through a coordinated process involving mobilization of their own resources, establishment of focused management teams at home and on the ground with formal organizational oversight, and partnership with governmental and nongovernmental relief agencies. Proximity to the disaster area allows for prompt arrival of medical personnel and equipment. The recruitment and organized deployment of large numbers of local and national volunteers are indispensable parts of this effort. Multidisciplinary teams on short rotations can form the core of the medical response.

  19. Helping safeguard Veterans Affairs' hospital buildings by advanced earthquake monitoring

    USGS Publications Warehouse

    Kalkan, Erol; Banga, Krishna; Ulusoy, Hasan S.; Fletcher, Jon Peter B.; Leith, William S.; Blair, James L.

    2012-01-01

    In collaboration with the U.S. Department of Veterans Affairs (VA), the National Strong Motion Project of the U.S. Geological Survey has recently installed sophisticated seismic systems that will monitor the structural integrity of hospital buildings during earthquake shaking. The new systems have been installed at more than 20 VA medical campuses across the country. These monitoring systems, which combine sensitive accelerometers and real-time computer calculations, are capable of determining the structural health of each structure rapidly after an event, helping to ensure the safety of patients and staff.

  20. Integrating geomatics and structural investigation in post-earthquake monitoring of ancient monumental Buildings

    NASA Astrophysics Data System (ADS)

    Dominici, Donatella; Galeota, Dante; Gregori, Amedeo; Rosciano, Elisa; Alicandro, Maria; Elaiopoulos, Michail

    2014-06-01

    The old city center of L’Aquila is rich in historical buildings of considerable merit. On April 6th 2009 a devastating earthquake caused significant structural damages, affecting especially historical and monumental masonry buildings. The results of a study carried out on a monumental building, former headquarters of the University of L’Aquila (The Camponeschi building, XVI century) are presented in this paper. The building is situated in the heart of the old city center and was seriously damaged by the earthquake. Preliminary visual damage analysis carried out immediately after the quake, clearly evidenced the building’s complexity, raising the need for direct and indirect investigation on the structure. Several non-destructive test methods were then performed in situ to better characterize the masonry typology and the damage distribution, as well. Subsequently, a number of representative control points were identified on the building’s facades to represent, by their motion over time, the evolution of the structural displacements and deformations. In particular, a surveying network consisting of 27 different points was established. A robotic total station mounted on top of a concrete pillar was used for periodically monitoring the surveying control network. Stability of the pillar was checked through a GNSS static survey repeated before any set of measurements. The present study evidences the interesting possibilities of combining geomatics with structural investigation during post-earthquake monitoring of ancient monumental buildings.

  1. High resolution strain sensor for earthquake precursor observation and earthquake monitoring

    NASA Astrophysics Data System (ADS)

    Zhang, Wentao; Huang, Wenzhu; Li, Li; Liu, Wenyi; Li, Fang

    2016-05-01

    We propose a high-resolution static-strain sensor based on an FBG Fabry-Perot interferometer (FBG-FP) and a wavelet-domain cross-correlation algorithm. The sensor is used for crustal deformation measurement, which plays an important role in earthquake precursor observation. The Pound-Drever-Hall (PDH) technique based on a narrow-linewidth tunable fiber laser is used to interrogate the FBG-FPs, and a demodulation algorithm based on wavelet-domain cross-correlation is used to calculate the wavelength difference. The FBG-FP sensor head is fixed on two steel alloy rods installed in the bedrock, and a reference FBG-FP is placed nearby in a strain-free state to compensate for environmental temperature fluctuations. A static-strain resolution of 1.6 nε can be achieved. As a result, clear solid-tide signals and seismic signals can be recorded, which suggests that the proposed strain sensor can be applied to earthquake precursor observation and earthquake monitoring.
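
    The demodulation idea can be illustrated with a plain cross-correlation between a measured spectrum and the reference spectrum; the paper performs this in the wavelet domain, so the sketch below is a simplified stand-in and all numbers are synthetic.

      # Estimate the wavelength shift of a measured FBG-FP spectrum relative to a
      # reference spectrum from the lag of the peak cross-correlation (simplified:
      # plain cross-correlation instead of the wavelet-domain version in the paper).
      import numpy as np

      def spectral_shift_pm(ref, meas, d_lambda_pm):
          ref0, meas0 = ref - ref.mean(), meas - meas.mean()
          lag = np.argmax(np.correlate(meas0, ref0, mode="full")) - (len(ref) - 1)
          return lag * d_lambda_pm

      # Synthetic resonance profiles sampled every 0.1 pm, the measured one shifted by 2 pm
      wl = np.arange(-500, 500) * 0.1
      ref = np.exp(-(wl / 20.0) ** 2)
      meas = np.exp(-((wl - 2.0) / 20.0) ** 2)
      print(spectral_shift_pm(ref, meas, d_lambda_pm=0.1))   # ~2.0 pm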

  2. CISN ShakeAlert Earthquake Early Warning System Monitoring Tools

    NASA Astrophysics Data System (ADS)

    Henson, I. H.; Allen, R. M.; Neuhauser, D. S.

    2015-12-01

    CISN ShakeAlert is a prototype earthquake early warning system being developed and tested by the California Integrated Seismic Network. The system has recently been expanded to support redundant data processing and communications. It now runs on six machines at three locations with ten Apache ActiveMQ message brokers linking together 18 waveform processors, 12 event association processes and 4 Decision Module alert processes. The system ingests waveform data from about 500 stations and generates many thousands of triggers per day, from which a small portion produce earthquake alerts. We have developed interactive web browser system-monitoring tools that display near real time state-of-health and performance information. This includes station availability, trigger statistics, communication and alert latencies. Connections to regional earthquake catalogs provide a rapid assessment of the Decision Module hypocenter accuracy. Historical performance can be evaluated, including statistics for hypocenter and origin time accuracy and alert time latencies for different time periods, magnitude ranges and geographic regions. For the ElarmS event associator, individual earthquake processing histories can be examined, including details of the transmission and processing latencies associated with individual P-wave triggers. Individual station trigger and latency statistics are available. Detailed information about the ElarmS trigger association process for both alerted events and rejected events is also available. The Google Web Toolkit and Map API have been used to develop interactive web pages that link tabular and geographic information. Statistical analysis is provided by the R-Statistics System linked to a PostgreSQL database.
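
    The alert-latency statistics mentioned above reduce to simple differences between catalog origin times and first-alert times; the sketch below shows the computation on made-up timestamps (the real system reads them from its PostgreSQL database).

      # Illustrative alert-latency summary from synthetic origin and alert times (seconds).
      import numpy as np

      origin_times = np.array([0.0, 3600.0, 7200.0])   # catalog origin times
      alert_times = np.array([6.2, 3608.9, 7204.5])    # time of first alert for each event

      latencies = alert_times - origin_times
      print("median alert latency: %.1f s" % np.median(latencies))
      print("90th percentile:      %.1f s" % np.percentile(latencies, 90))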

  3. Collaborative Projects at the Northern California Earthquake Data Center (NCEDC)

    NASA Astrophysics Data System (ADS)

    Neuhauser, D.; Oppenheimer, D.; Zuzlewski, S.; Gee, L.; Murray, M.; Bassett, A.; Prescott, W.; Romanowicz, B.

    2001-12-01

    SELECT command, to perform queries on the GDVs, and developed a program which converts the MSQL to an SQL request. MSQL2SQL converts the MSQL command into a parse tree, and defines an API allowing each datacenter to traverse the parse tree and revise it to produce a data center-specific SQL request. The NCEDC converted the IRIS SeismiQuery program to use the GDVs and MSQL, installed it at the NCEDC, and distributed the software to IRIS, SCEC-DC, and other interested parties. The resulting program should be much easier to install and support at other data centers. The NCEDC is also working on several data center integration projects in order to provide users with seamless access to data. The NCEDC is collaborating with IRIS on the NETDC project and with UNAVCO on the GPS Seamless Archive Centers initiative. Through the newly formed California Integrated Seismic Network, we are working with the SCEC-DC to provide unified access to California earthquake data.

  4. Archiving and Distributing Seismic Data at the Southern California Earthquake Data Center (SCEDC)

    NASA Astrophysics Data System (ADS)

    Appel, V. L.

    2002-12-01

    The Southern California Earthquake Data Center (SCEDC) archives and provides public access to earthquake parametric and waveform data gathered by the Southern California Seismic Network and since January 1, 2001, the TriNet seismic network, southern California's earthquake monitoring network. The parametric data in the archive includes earthquake locations, magnitudes, moment-tensor solutions and phase picks. The SCEDC waveform archive prior to TriNet consists primarily of short-period, 100-samples-per-second waveforms from the SCSN. The addition of the TriNet array added continuous recordings of 155 broadband stations (20 samples per second or less), and triggered seismograms from 200 accelerometers and 200 short-period instruments. Since the Data Center and TriNet use the same Oracle database system, new earthquake data are available to the seismological community in near real-time. Primary access to the database and waveforms is through the Seismogram Transfer Program (STP) interface. The interface enables users to search the database for earthquake information, phase picks, and continuous and triggered waveform data. Output is available in SAC, miniSEED, and other formats. Both the raw counts format (V0) and the gain-corrected format (V1) of COSMOS (Consortium of Organizations for Strong-Motion Observation Systems) are now supported by STP. EQQuest is an interface to prepackaged waveform data sets for select earthquakes in Southern California stored at the SCEDC. Waveform data for large-magnitude events have been prepared and new data sets will be available for download in near real-time following major events. The parametric data from 1981 to present has been loaded into the Oracle 9.2.0.1 database system and the waveforms for that time period have been converted to mSEED format and are accessible through the STP interface. The DISC optical-disk system (the "jukebox") that currently serves as the mass-storage for the SCEDC is in the process of being replaced

  5. Earthquakes

    MedlinePlus

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  6. Evaluation of the completeness and accuracy of an earthquake catalogue based on hydroacoustic monitoring

    NASA Astrophysics Data System (ADS)

    Willemann, R. J.

    2002-12-01

    NOAA's Pacific Marine Environmental Laboratory (PMEL) produces a catalogue of Pacific Ocean earthquakes based on hydroacoustic monitoring from April 1996 onward. The International Seismological Centre (ISC) worked without reference to the PMEL catalogue for earthquakes through April 2000, so the ISC and PMEL catalogues are independent until then. The PMEL catalogue includes many more intraplate and mid-ocean-ridge earthquakes; in some areas it contains more than 20 times as many earthquakes as the ISC catalogue. In some areas the ISC earthquakes are nearly a strict subset of the PMEL earthquakes, but elsewhere many ISC earthquakes are not in the PMEL catalogue. Along the Pacific-Antarctic plate boundary (45°-70°S, 110°-180°W), for example, the PMEL catalogue misses many ISC earthquakes, including a few MW(Harvard)>5 crustal earthquakes. Near the Cocos Ridge (2°-7°N, 81°-88°W), many of the earthquakes in each catalogue have no corresponding earthquake in the other. Among earthquakes that are in both catalogues, location differences may be much greater than the formal location uncertainties. But formal errors are known to underestimate true location errors, so studying the seismic arrival-time residuals with respect to the hydroacoustic origins, and the hydroacoustic arrival-time residuals with respect to the seismic origins, provides a more rigorous evaluation of the intrinsic differences between these two monitoring technologies.
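
    Comparing two catalogues as described above first requires deciding which events correspond; a common, simple approach (an assumption here, not necessarily the author's procedure) is to pair events whose origin times and epicentres agree within chosen tolerances.

      # Sketch: associate events between two catalogues by time and distance tolerances.
      import math

      def gc_distance_km(lat1, lon1, lat2, lon2):
          """Great-circle (haversine) distance in km."""
          p1, p2 = math.radians(lat1), math.radians(lat2)
          dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
          a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
          return 2 * 6371.0 * math.asin(math.sqrt(a))

      def match(cat_a, cat_b, dt_max_s=60.0, dx_max_km=100.0):
          """Return (i, j) index pairs of events that agree in origin time and location."""
          return [(i, j)
                  for i, (ta, lata, lona) in enumerate(cat_a)
                  for j, (tb, latb, lonb) in enumerate(cat_b)
                  if abs(ta - tb) <= dt_max_s
                  and gc_distance_km(lata, lona, latb, lonb) <= dx_max_km]

      # Tiny illustrative catalogues: (origin time in s, latitude, longitude)
      isc_like = [(0.0, -55.0, -129.0), (5000.0, 4.5, -84.0)]
      pmel_like = [(12.0, -55.3, -128.6), (9000.0, 5.0, -85.0)]
      print(match(isc_like, pmel_like))   # [(0, 0)]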

  7. Romanian Data Center: A modern way for seismic monitoring

    NASA Astrophysics Data System (ADS)

    Neagoe, Cristian; Marius Manea, Liviu; Ionescu, Constantin

    2014-05-01

    The main seismic survey of Romania is performed by the National Institute for Earth Physics (NIEP), which operates a real-time digital seismic network. The NIEP real-time network currently consists of 102 stations and two seismic arrays equipped with different high-quality digitizers (Kinemetrics K2, Quanterra Q330, Quanterra Q330HR, PS6-26, Basalt), broadband and short-period seismometers (CMG3ESP, CMG40T, KS2000, KS54000, CMG3T, STS2, SH-1, S13, Mark L4C, Ranger, GS21, Mark L22) and acceleration sensors (Kinemetrics EpiSensors). The data are transmitted to the National Data Center (NDC) and the Eforie Nord (EFOR) Seismic Observatory. EFOR is the back-up for the NDC and also a monitoring center for Black Sea tsunami events. NIEP is a data acquisition node for the seismic network of Moldova (FDSN code MD), composed of five seismic stations. NIEP has installed eight seismic stations equipped with broadband sensors and EpiSensors in the northern part of Bulgaria, and nine accelerometers (EpiSensors) in nine districts along the Danube River. All the data are acquired at NIEP for the Early Warning System and for primary estimation of earthquake parameters. Real-time (RT) acquisition and data exchange are done with Antelope software and SeedLink (from SeisComP3). Real-time data communication is ensured by different types of transmission: GPRS, satellite, radio, Internet and a dedicated line provided by a governmental network. For data processing and analysis at the two data centers, Antelope 5.2 is used, running on three workstations: one on a CentOS platform and two on MacOS. A SeisComP3 server also stands as a back-up for Antelope 5.2. Both acquisition and analysis of seismic data produce information about local and global parameters of earthquakes. In addition, Antelope is used for manual processing (event association, calculation of magnitude, creating a database, sending seismic bulletins, calculation of PGA and PGV, etc.), generating

  8. Earthquakes.

    ERIC Educational Resources Information Center

    Walter, Edward J.

    1977-01-01

    Presents an analysis of the causes of earthquakes. Topics discussed include (1) geological and seismological factors that determine the effect of a particular earthquake on a given structure; (2) description of some large earthquakes such as the San Francisco quake; and (3) prediction of earthquakes. (HM)

  9. Earthquakes.

    ERIC Educational Resources Information Center

    Pakiser, Louis C.

    One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake…

  10. Quantifying 10 years of improved earthquake-monitoring performance in the Caribbean region

    USGS Publications Warehouse

    McNamara, Daniel E.; Hillebrandt-Andrade, Christa; Saurel, Jean-Marie; Huerfano-Moreno, V.; Lynch, Lloyd

    2015-01-01

    Over 75 tsunamis have been documented in the Caribbean and adjacent regions during the past 500 years. Since 1500, at least 4484 people are reported to have perished in these killer waves. Hundreds of thousands are currently threatened along the Caribbean coastlines. Were a great tsunamigenic earthquake to occur in the Caribbean region today, the effects would potentially be catastrophic due to an increasingly vulnerable region that has seen significant population increases in the past 40–50 years and currently hosts an estimated 500,000 daily beach visitors from North America and Europe, a majority of whom are not likely aware of tsunami and earthquake hazards. Following the magnitude 9.1 Sumatra–Andaman Islands earthquake of 26 December 2004, the United Nations Educational, Scientific and Cultural Organization (UNESCO) Intergovernmental Coordination Group (ICG) for the Tsunami and other Coastal Hazards Early Warning System for the Caribbean and Adjacent Regions (CARIBE‐EWS) was established and developed minimum performance standards for the detection and analysis of earthquakes. In this study, we model earthquake‐magnitude detection threshold and P‐wave detection time and demonstrate that the requirements established by the UNESCO ICG CARIBE‐EWS are met with 100% of the network operating. We demonstrate that earthquake‐monitoring performance in the Caribbean Sea region has improved significantly in the past decade as the number of real‐time seismic stations available to the National Oceanic and Atmospheric Administration tsunami warning centers have increased. We also identify weaknesses in the current international network and provide guidance for selecting the optimal distribution of seismic stations contributed from existing real‐time broadband national networks in the region.

  11. Monitoring of ULF (ultra-low-frequency) Geomagnetic Variations Associated with Earthquakes

    PubMed Central

    Hayakawa, Masashi; Hattori, Katsumi; Ohta, Kenji

    2007-01-01

    ULF (ultra-low-frequency) electromagnetic emission has recently been recognized as one of the most promising candidates for short-term earthquake prediction. This paper reviews previous convincing evidence of the presence of ULF emissions before a few large earthquakes. We then present our network of ULF monitoring stations in the Tokyo area by describing our ULF magnetic sensors, and finally we present a few of the latest results on seismogenic electromagnetic emissions for recent large earthquakes obtained with the use of sophisticated signal processing.

  12. Earthquakes

    ERIC Educational Resources Information Center

    Roper, Paul J.; Roper, Jere Gerard

    1974-01-01

    Describes the causes and effects of earthquakes, defines the meaning of magnitude (measured on the Richter Magnitude Scale) and intensity (measured on a modified Mercalli Intensity Scale) and discusses earthquake prediction and control. (JR)

  13. Real-time earthquake monitoring using a search engine method.

    PubMed

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-01-01

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake's parameters in <1 s after receiving the long-period surface wave data. PMID:25472861

  14. Real-time earthquake monitoring using a search engine method.

    PubMed

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-12-04

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake's parameters in <1 s after receiving the long-period surface wave data.

  15. The Development of an Earthquake Preparedness Plan for a Child Care Center in a Geologically Hazardous Region.

    ERIC Educational Resources Information Center

    Wokurka, Linda

    The director of a child care center at a community college in California developed an earthquake preparedness plan for the center which met state and local requirements for earthquake preparedness at schools. The plan consisted of: (1) the identification and reduction of nonstructural hazards in classrooms, office, and staff rooms; (2) storage of…

  16. Real-time earthquake monitoring using a search engine method

    NASA Astrophysics Data System (ADS)

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-12-01

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake’s parameters in <1 s after receiving the long-period surface wave data.

  17. Real-time earthquake monitoring using a search engine method

    PubMed Central

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-01-01

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake’s parameters in <1 s after receiving the long-period surface wave data. PMID:25472861
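
    The matching criterion behind the search-engine method in the records above can be sketched as a brute-force comparison against a database of pre-computed synthetics; the actual papers use a fast approximate search over a very large database, so this is only a conceptual stand-in with toy data.

      # Conceptual sketch: return the source parameters of the database synthetic that
      # best correlates with the observed long-period waveform (brute force, toy data).
      import numpy as np

      def best_match(observed, database):
          """database: list of (params_dict, waveform) sampled like `observed`."""
          obs = (observed - observed.mean()) / (observed.std() + 1e-12)
          best, best_cc = None, -np.inf
          for params, wf in database:
              syn = (wf - wf.mean()) / (wf.std() + 1e-12)
              cc = float(np.dot(obs, syn)) / len(obs)   # zero-lag normalized correlation
              if cc > best_cc:
                  best, best_cc = params, cc
          return best, best_cc

      t = np.linspace(0.0, 100.0, 1000)
      db = [({"strike": 30, "dip": 60, "Mw": 5.5}, np.sin(0.2 * t)),
            ({"strike": 120, "dip": 45, "Mw": 5.9}, np.sin(0.35 * t) * np.exp(-t / 80))]
      obs = np.sin(0.35 * t) * np.exp(-t / 80) + 0.05 * np.random.randn(t.size)
      print(best_match(obs, db))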

  18. Tectonic earthquakes in Greenland: An overview of the monitoring achievements during the last decades

    NASA Astrophysics Data System (ADS)

    Voss, P.

    2011-12-01

    Seismic monitoring in Greenland has been greatly improved during the last decades and has provided new insight into the earthquake activity. Results from the earthquake monitoring in Greenland during this period are presented; they show a large increase in the number of detected earthquakes, an improved detection threshold, new areas of high seismicity, several earthquake clusters, and seismicity below the ice cap. Despite the improved monitoring, event detection is still performed manually by analyzing all of the real-time data. With a station separation of around 400 km, many earthquakes are detected at only one or two stations, which makes automatic detection very difficult; however, improved instrumentation has enabled the use of a single-station location technique. Results from, and challenges of, this method are presented. Seismic monitoring has developed from only three seismic stations in Greenland in the 1970s to 17 permanent stations and a similar number of temporary stations today, all equipped with broadband sensors; 12 of the permanent stations transmit data in real time. The most recent major improvement of the seismic monitoring comes from the Greenland Ice Sheet Monitoring Network (GLISN, http://glisn.info). The primary goal of GLISN is to provide broadband seismic data for the detection of glacial earthquakes. GLISN will be fully implemented when the Iridium real-time data transfer is in operation at five stations, hopefully by mid-2012.
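
    The single-station location technique mentioned above rests on a simple relation: the S-minus-P arrival-time difference gives the epicentral distance for assumed average velocities, while the P-wave polarization (not shown) gives the back-azimuth. The velocities below are generic crustal values, not the ones calibrated for Greenland.

      # Epicentral distance from the S-P time for constant P and S velocities (km/s).
      def sp_distance_km(t_s_minus_p, vp=6.0, vs=3.5):
          return t_s_minus_p * (vp * vs) / (vp - vs)

      print(round(sp_distance_km(20.0), 1))   # ~168 km for a 20 s S-P time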

  19. Korea Integrated Seismic System tool(KISStool) for seismic monitoring and data sharing at the local data center

    NASA Astrophysics Data System (ADS)

    Park, J.; Chi, H. C.; Lim, I.; Jeong, B.

    2011-12-01

    The Korea Integrated Seismic System (KISS) is a backbone seismic network that distributes seismic data to different organizations in near-real time in Korea. The association of earthquake monitoring institutes has shared seismic data through the KISS since 2003. Local data centers operating several remote stations need to send their free-field seismic data to NEMA (National Emergency Management Agency) under the Korean law on countermeasures against earthquake hazards. An efficient tool is therefore very important for local data centers that want to rapidly detect local seismic intensity and to transfer seismic event information, including PGA, PGV, the dominant frequency of the P-wave, raw data, and so on, to the national data center. We developed the KISStool (Korea Integrated Seismic System tool) for easy and convenient operation of seismic networks at local data centers. The KISStool can monitor real-time waveforms by clicking a station icon on the Google map, and real-time variations of PGA, PGV, and other data by opening the bar-type monitoring section. With the KISStool, any local data center can transfer event information to NEMA, KMA (Korea Meteorological Administration) or other institutes through the KISS using UDP or TCP/IP protocols. The KISStool is one of the most efficient methods to monitor and transfer earthquake event information at local data centers in Korea. KIGAM will provide the KISStool not only to members of the monitoring association but also to local governments.
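
    The UDP transfer of event information described above can be sketched as below; the message fields, host, and port are hypothetical placeholders, not the actual KISStool protocol.

      # Hedged sketch: send basic event parameters to a data-center endpoint over UDP.
      import json
      import socket

      event_msg = {
          "station": "LOCAL01",              # hypothetical station code
          "pga_gal": 12.3,
          "pgv_cm_s": 0.8,
          "dominant_p_freq_hz": 4.2,
          "origin_time": "2011-06-01T02:34:56Z",
      }

      sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
      sock.sendto(json.dumps(event_msg).encode("utf-8"),
                  ("127.0.0.1", 9999))       # placeholder destination host and port
      sock.close()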

  20. Real-Time GPS Monitoring for Earthquake Rapid Assessment in the San Francisco Bay Area

    NASA Astrophysics Data System (ADS)

    Guillemot, C.; Langbein, J. O.; Murray, J. R.

    2012-12-01

    The U.S. Geological Survey Earthquake Science Center has deployed a network of eight real-time Global Positioning System (GPS) stations in the San Francisco Bay area and is implementing software applications to continuously evaluate the status of the deformation within the network. Real-time monitoring of the station positions is expected to provide valuable information for rapidly estimating source parameters should a large earthquake occur in the San Francisco Bay area. Because earthquake response applications require robust data access, as a first step we have developed a suite of web-based applications which are now routinely used to monitor the network's operational status and data streaming performance. The web tools provide continuously updated displays of important telemetry parameters such as data latency and receive rates, as well as source voltage and temperature information within each instrument enclosure. Automated software on the backend uses the streaming performance data to mitigate the impact of outages, radio interference and bandwidth congestion on deformation monitoring operations. A separate set of software applications manages the recovery of lost data due to faulty communication links. Displacement estimates are computed in real-time for various combinations of USGS, Plate Boundary Observatory (PBO) and Bay Area Regional Deformation (BARD) network stations. We are currently comparing results from two software packages (one commercial and one open-source) used to process 1-Hz data on the fly and produce estimates of differential positions. The continuous monitoring of telemetry makes it possible to tune the network to minimize the impact of transient interruptions of the data flow, from one or more stations, on the estimated positions. Ongoing work is focused on using data streaming performance history to optimize the quality of the position, reduce drift and outliers by switching to the best set of stations within the network, and
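
    The data-latency checks described above amount to tracking the age of the last sample received from each station and flagging streams that fall behind; the station names, timestamps, and threshold below are made up for illustration.

      # Minimal telemetry check: flag GPS streams whose data latency exceeds a limit.
      import time

      last_sample_epoch = {                  # station -> epoch time of last 1-Hz sample
          "STA1": time.time() - 2.1,
          "STA2": time.time() - 0.8,
          "STA3": time.time() - 45.0,        # this stream looks stalled
      }

      LATENCY_LIMIT_S = 10.0
      for station, t_last in sorted(last_sample_epoch.items()):
          latency = time.time() - t_last
          status = "OK" if latency < LATENCY_LIMIT_S else "STALE"
          print(f"{station}: latency {latency:5.1f} s  [{status}]")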

  1. Remote monitoring of the earthquake cycle using satellite radar interferometry.

    PubMed

    Wright, Tim J

    2002-12-15

    The earthquake cycle is poorly understood. Earthquakes continue to occur on previously unrecognized faults. Earthquake prediction seems impossible. These remain the facts despite nearly 100 years of intensive study since the earthquake cycle was first conceptualized. Using data acquired from satellites in orbit 800 km above the Earth, a new technique, radar interferometry (InSAR), has the potential to solve these problems. For the first time, detailed maps of the warping of the Earth's surface during the earthquake cycle can be obtained with a spatial resolution of a few tens of metres and a precision of a few millimetres. InSAR does not need equipment on the ground or expensive field campaigns, so it can gather crucial data on earthquakes and the seismic cycle from some of the remotest areas of the planet. In this article, I review some of the remarkable observations of the earthquake cycle already made using radar interferometry and speculate on breakthroughs that are tantalizingly close. PMID:12626271
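
    For reference, the millimetre-level sensitivity quoted above follows from the standard repeat-pass InSAR relation between the unwrapped phase change and the line-of-sight displacement (the sign convention varies by processor):

```latex
% Repeat-pass InSAR: line-of-sight displacement from unwrapped phase change.
\Delta d_{\mathrm{LOS}} = -\frac{\lambda}{4\pi}\,\Delta\phi ,
\qquad
\Delta\phi = 2\pi \;\Rightarrow\; |\Delta d_{\mathrm{LOS}}| = \frac{\lambda}{2}
\approx 2.8\ \mathrm{cm}
\quad (\text{C-band},\ \lambda \approx 5.6\ \mathrm{cm}).
```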

  2. A Potential of Borehole Strainmeters for Continuous Monitoring of Stress Change Associated with Earthquakes

    NASA Astrophysics Data System (ADS)

    Soh, Inho; Chang, Chandong

    2016-04-01

    We conducted Coulomb stress transfer modeling to estimate the stress changes at the strainmeter site using various earthquake source parameters (earthquake magnitude, fault length, and slip displacement); the modeled transferred stresses are similar to, or an order of magnitude lower than, those determined from the strainmeter data. Our study demonstrates the strong applicability of strainmeter data for continuous stress monitoring, with a special emphasis on earthquakes.
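
    For context, the quantity evaluated in such Coulomb stress-transfer models is usually the Coulomb failure stress change resolved on a receiver fault; the standard definition is given below, where the value of the effective friction coefficient is a common assumption rather than one taken from this study.

```latex
% Coulomb failure stress change on a receiver fault (standard definition):
\Delta \mathrm{CFS} = \Delta\tau + \mu' \,\Delta\sigma_n ,
% \Delta\tau:      shear stress change in the slip direction,
% \Delta\sigma_n:  normal stress change (positive for unclamping),
% \mu':            effective friction coefficient, commonly assumed \mu' \approx 0.4.
```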

  4. A hospital as victim and responder: the Sepulveda VA Medical Center and the Northridge earthquake.

    PubMed

    Chavez, C W; Binder, B

    1996-01-01

    Many hospital emergency plans focus on the hospital as a disaster responder, with a fully operational medical facility, able to receive and treat mass casualties from a clearly defined accident scene. However, hospitals need to prepare a response for extreme casualty events such as earthquakes, tornadoes, or hurricanes. This article describes the planning, mitigation, response, and recovery of a major medical-surgical center thrust into a victim-responder role following the devastating Northridge earthquake. The subsequent evacuation and care of patients, treatment of casualties, incident command, prior education and training, and recovery issues are addressed.

  5. Potential utilization of the NASA/George C. Marshall Space Flight Center in earthquake engineering research

    NASA Technical Reports Server (NTRS)

    Scholl, R. E. (Editor)

    1979-01-01

    Earthquake engineering research capabilities of the National Aeronautics and Space Administration (NASA) facilities at George C. Marshall Space Flight Center (MSFC), Alabama, were evaluated. The results indicate that the NASA/MSFC facilities and supporting capabilities offer unique opportunities for conducting earthquake engineering research. Specific features that are particularly attractive for large scale static and dynamic testing of natural and man-made structures include the following: large physical dimensions of buildings and test bays; high loading capacity; wide range and large number of test equipment and instrumentation devices; multichannel data acquisition and processing systems; technical expertise for conducting large-scale static and dynamic testing; sophisticated techniques for systems dynamics analysis, simulation, and control; and capability for managing large-size and technologically complex programs. Potential uses of the facilities for near and long term test programs to supplement current earthquake research activities are suggested.

  6. Seismic monitoring at Cascade Volcanic Centers, 2004 - status and recommendations

    USGS Publications Warehouse

    Moran, Seth C.

    2004-01-01

    The purpose of this report is to assess the current (May 2004) status of seismic monitoring networks at the 13 major Cascade volcanic centers. Included in this assessment are descriptions of each network, analyses of the ability of each network to detect and to locate seismic activity, identification of specific weaknesses in each network, and a prioritized list of those networks that are most in need of additional seismic stations. At the outset it should be recognized that no Cascade volcanic center currently has an adequate seismic network relative to modern-day networks at Usu Volcano (Japan) or Etna and Stromboli volcanoes (Italy). For a system the size of Three Sisters, for example, a modern-day, cutting-edge seismic network would ideally consist of a minimum of 10 to 12 short-period three-component seismometers (for determining particle motions, reliable S-wave picks, moment tensor inversions, fault-plane solutions, and other important seismic parameters) and 7 to 10 broadband sensors (which, amongst other considerations, enable detection and location of very long period (VLP) and other low-frequency events, moment tensor inversions, and, because of their wide dynamic range, on-scale recording of large-amplitude events). Such a dense, multicomponent seismic network would give the ability to, for example, detect in near-real-time earthquake migrations over a distance of ~0.5 km or less, locate tremor sources, determine the nature of a seismic source (that is, pure shear, implosive, explosive), provide on-scale recordings of very small and very large-amplitude seismic signals, and detect localized changes in seismic stress tensor orientations caused by movement of magma bodies. However, given that programmatic resources are currently limited, installation of such networks at this time is unrealistic. Instead, this report focuses on identifying what additional stations are needed to guarantee that anomalous seismicity associated with volcanic unrest will be

  7. Effects of a major earthquake on calls to regional poison control centers.

    PubMed

    Nathan, A R; Olson, K R; Everson, G W; Kearney, T E; Blanc, P D

    1992-03-01

    We retrospectively evaluated the effect of the Loma Prieta earthquake on calls to 2 designated regional poison control centers (San Francisco and Santa Clara) in the area. In the immediate 12 hours after the earthquake, there was an initial drop (31%) in call volume, related to telephone system overload and other technical problems. Calls from Bay Area counties outside of San Francisco and Santa Clara decreased more dramatically than those from within the host counties where the poison control centers are located. In the next 2 days, each poison control center then handled a 27% increase in call volume. Requests for information regarding safety of water supplies and other environmental concerns were significantly increased. The number of cases of actual poisoning exposure decreased, particularly poison and drug ingestions in children. Most calls directly related to the earthquake included spills and leaks of hazardous materials and questions about water and food safety. Regional poison control centers play an essential role in the emergency medical response to major disasters and are critically dependent on an operational telephone system.

  8. 88 hours: the U.S. Geological Survey National Earthquake Information Center response to the March 11, 2011 Mw 9.0 Tohoku earthquake

    USGS Publications Warehouse

    Wald, David J.; Hayes, Gavin P.; Benz, Harley M.; Earle, Paul; Briggs, Richard W.

    2011-01-01

    The M 9.0 11 March 2011 Tohoku, Japan, earthquake and associated tsunami near the east coast of the island of Honshu caused tens of thousands of deaths and potentially over one trillion dollars in damage, resulting in one of the worst natural disasters ever recorded. The U.S. Geological Survey National Earthquake Information Center (USGS NEIC), through its responsibility to respond to all significant global earthquakes as part of the National Earthquake Hazards Reduction Program, quickly produced and distributed a suite of earthquake information products to inform emergency responders, the public, the media, and the academic community of the earthquake's potential impact and to provide scientific background for the interpretation of the event's tectonic context and potential for future hazard. Here we present a timeline of the NEIC response to this devastating earthquake in the context of rapidly evolving information emanating from the global earthquake-response community. The timeline includes both internal and publicly distributed products, the relative timing of which highlights the inherent tradeoffs between the requirement to provide timely alerts and the necessity for accurate, authoritative information. The timeline also documents the iterative and evolutionary nature of the standard products produced by the NEIC and includes a behind-the-scenes look at the decisions, data, and analysis tools that drive our rapid product distribution.

  9. Earthquake!

    ERIC Educational Resources Information Center

    Hernandez, Hildo

    2000-01-01

    Examines the types of damage experienced by California State University at Northridge during the 1994 earthquake and discusses the lessons learned in handling this emergency. The problem of loose asbestos is addressed. (GR)

  10. Earthquakes

    USGS Publications Warehouse

    Shedlock, Kaye M.; Pakiser, Louis Charles

    1998-01-01

    One of the most frightening and destructive phenomena of nature is a severe earthquake and its terrible aftereffects. An earthquake is a sudden movement of the Earth, caused by the abrupt release of strain that has accumulated over a long time. For hundreds of millions of years, the forces of plate tectonics have shaped the Earth as the huge plates that form the Earth's surface slowly move over, under, and past each other. Sometimes the movement is gradual. At other times, the plates are locked together, unable to release the accumulating energy. When the accumulated energy grows strong enough, the plates break free. If the earthquake occurs in a populated area, it may cause many deaths and injuries and extensive property damage. Today we are challenging the assumption that earthquakes must present an uncontrollable and unpredictable hazard to life and property. Scientists have begun to estimate the locations and likelihoods of future damaging earthquakes. Sites of greatest hazard are being identified, and definite progress is being made in designing structures that will withstand the effects of earthquakes.

  11. The Pacific Tsunami Warning Center's Response to the Tohoku Earthquake and Tsunami

    NASA Astrophysics Data System (ADS)

    Weinstein, S. A.; Becker, N. C.; Shiro, B.; Koyanagi, K. K.; Sardina, V.; Walsh, D.; Wang, D.; McCreery, C. S.; Fryer, G. J.; Cessaro, R. K.; Hirshorn, B. F.; Hsu, V.

    2011-12-01

    The largest Pacific basin earthquake in 47 years, and also the largest magnitude earthquake since the Sumatra 2004 earthquake, struck off of the east coast of the Tohoku region of Honshu, Japan at 5:46 UTC on 11 March 2011. The Tohoku earthquake (Mw 9.0) generated a massive tsunami with runups of up to 40m along the Tohoku coast. The tsunami waves crossed the Pacific Ocean causing significant damage as far away as Hawaii, California, and Chile, thereby becoming the largest, most destructive tsunami in the Pacific Basin since 1960. Triggers on the seismic stations at Erimo, Hokkaido (ERM) and Matsushiro, Honshu (MAJO), alerted Pacific Tsunami Warning Center (PTWC) scientists 90 seconds after the earthquake began. Four minutes after its origin, and about one minute after the earthquake's rupture ended, PTWC issued an observatory message reporting a preliminary magnitude of 7.5. Eight minutes after origin time, the Japan Meteorological Agency (JMA) issued its first international tsunami message in its capacity as the Northwest Pacific Tsunami Advisory Center. In accordance with international tsunami warning system protocols, PTWC then followed with its first international tsunami warning message using JMA's earthquake parameters, including an Mw of 7.8. Additional Mwp, mantle wave, and W-phase magnitude estimations based on the analysis of later-arriving seismic data at PTWC revealed that the earthquake magnitude reached at least 8.8, and that a destructive tsunami would likely be crossing the Pacific Ocean. The earthquake damaged the nearest coastal sea-level station located 90 km from the epicenter in Ofunato, Japan. The NOAA DART sensor situated 600 km off the coast of Sendai, Japan, at a depth of 5.6 km recorded a tsunami wave amplitude of nearly two meters, making it by far the largest tsunami wave ever recorded by a DART sensor. Thirty minutes later, a coastal sea-level station at Hanasaki, Japan, 600 km from the epicenter, recorded a tsunami wave amplitude of

  12. Real-Time Data Processing Systems and Products at the Alaska Earthquake Information Center

    NASA Astrophysics Data System (ADS)

    Ruppert, N. A.; Hansen, R. A.

    2007-05-01

    The Alaska Earthquake Information Center (AEIC) receives data from over 400 seismic sites located within the state boundaries and the surrounding regions and serves as a regional data center. In 2007, the AEIC reported ~20,000 seismic events, the largest being an M6.6 event in the Andreanof Islands. The real-time earthquake detection and data processing systems at AEIC are based on the Antelope system from BRTT, Inc. This modular and extensible processing platform provides an integrated system, complete from data acquisition to catalog production. Multiple additional modules constructed with the Antelope toolbox have been developed to fit the particular needs of the AEIC. Real-time earthquake locations and magnitudes are determined within 2-5 minutes of the event occurrence. AEIC maintains a 24/7 seismologist-on-duty schedule. Earthquake alarms are based on the real-time earthquake detections. Significant events are reviewed by the seismologist on duty within 30 minutes of the occurrence, with information releases issued for significant events. This information is disseminated immediately via the AEIC website, the ANSS website via QDDS submissions, through e-mail, cell phone and pager notifications, and via fax broadcasts and recorded voice-mail messages. In addition, automatic regional moment tensors are determined for events with M>=4.0. This information is posted on the public website. ShakeMaps are calculated in real time, with the information currently accessible via a password-protected website. AEIC is designing an alarm system targeted for critical lifeline operations in Alaska. AEIC maintains an extensive computer network to provide adequate support for data processing and archival. For real-time processing, AEIC operates two identical, interoperable computer systems in parallel.

  13. The April 18, 2008 Illinois earthquake: an ANSS monitoring success

    USGS Publications Warehouse

    Herrmann, Robert B.; Withers, M.; Benz, H.

    2008-01-01

    The largest-magnitude earthquake in the past 20 years struck near Mt. Carmel in southeastern Illinois on Friday morning, 18 April 2008 at 09:36:59 UTC (04:37 CDT). The Mw 5.2 earthquake was felt over an area that spanned Chicago and Atlanta, with about 40,000 reports submitted to the U.S. Geological Survey (USGS) “Did You Feel It?” system. There were at least six felt aftershocks greater than magnitude 3 and 20 aftershocks with magnitudes greater than 2 located by regional and national seismic networks. Portable instrumentation was deployed by researchers of the University of Memphis and Indiana University (the first portable station was installed at about 23:00 UTC on 18 April). The portable seismographs were deployed both to capture near-source, high-frequency ground motions for significant aftershocks and to better understand structure along the active fault. The previous similar-size earthquake within the Wabash Valley seismic zone (WVSZ) of southeastern Illinois and southwestern Indiana was a magnitude 5.0 in June 1987. The seismicity associated with the WVSZ is thought to occur in a complex horst and graben system of Precambrian igneous and metamorphic units at depths between 12 and 20 km. Paleoliquefaction evidence suggests several major shaking events have occurred within the past 12,000 years (Munson et al. 1997).

  14. Earthquake Education and Public Information Centers: A Collaboration Between the Earthquake Country Alliance and Free-Choice Learning Institutions in California

    NASA Astrophysics Data System (ADS)

    Degroot, R. M.; Springer, K.; Brooks, C. J.; Schuman, L.; Dalton, D.; Benthien, M. L.

    2009-12-01

    In 1999 the Southern California Earthquake Center initiated an effort to expand its reach to multiple target audiences through the development of an interpretive trail on the San Andreas fault at Wallace Creek and an earthquake exhibit at Fingerprints Youth Museum in Hemet. These projects and involvement with the San Bernardino County Museum in Redlands beginning in 2007 led to the creation of Earthquake Education and Public Information Centers (EPIcenters) in 2008. The impetus for the development of the network was to broaden participation in The Great Southern California ShakeOut. By 2009 it had grown more comprehensive in scope, evolving into a statewide network. EPIcenters constitute a variety of free-choice learning institutions, representing museums, science centers, libraries, universities, parks, and other places visited by a variety of audiences including families, seniors, and school groups. They share a commitment to demonstrating and encouraging earthquake preparedness. EPIcenters coordinate Earthquake Country Alliance activities in their county or region, lead presentations or organize events in their communities, or in other ways demonstrate leadership in earthquake education and risk reduction. The San Bernardino County Museum (Southern California) and The Tech Museum of Innovation (Northern California) serve as EPIcenter regional coordinating institutions. They interact with over thirty institutional partners who have implemented a variety of activities from displays and talks to earthquake exhibitions. While many activities are focused on the time leading up to and just after the ShakeOut, most EPIcenter members conduct activities year round. Network members at Kidspace Museum in Pasadena and San Diego Natural History Museum have formed EPIcenter focus groups on early childhood education and safety and security. This presentation highlights the development of the EPIcenter network, synergistic activities resulting from this

  15. Hydra—The National Earthquake Information Center’s 24/7 seismic monitoring, analysis, catalog production, quality analysis, and special studies tool suite

    USGS Publications Warehouse

    Patton, John M.; Guy, Michelle R.; Benz, Harley M.; Buland, Raymond P.; Erickson, Brian K.; Kragness, David S.

    2016-08-18

    This report provides an overview of the capabilities and design of Hydra, the global seismic monitoring and analysis system used for earthquake response and catalog production at the U.S. Geological Survey National Earthquake Information Center (NEIC). Hydra supports the NEIC’s worldwide earthquake monitoring mission in areas such as seismic event detection, seismic data insertion and storage, seismic data processing and analysis, and seismic data output. The Hydra system automatically identifies seismic phase arrival times and detects the occurrence of earthquakes in near-real time. The system integrates and inserts parametric and waveform seismic data into discrete events in a database for analysis. Hydra computes seismic event parameters, including locations, multiple magnitudes, moment tensors, and depth estimates. Hydra supports the NEIC’s 24/7 analyst staff with a suite of seismic analysis graphical user interfaces. In addition to the NEIC’s monitoring needs, the system supports the processing of aftershock and temporary deployment data, and supports the NEIC’s quality assurance procedures. The Hydra system continues to be developed to expand its seismic analysis and monitoring capabilities.
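
    Hydra's pickers and associators are not specified in detail here; as a generic illustration of the automatic phase-arrival identification such systems rely on, the sketch below implements a simple STA/LTA detector with NumPy.

```python
# Generic STA/LTA (short-term average / long-term average) phase detector sketch,
# a common approach to automatic arrival identification; not Hydra's actual algorithm.
import numpy as np

def sta_lta_triggers(trace, fs, sta_win=1.0, lta_win=30.0, threshold=4.0):
    """Return sample indices where the STA/LTA ratio first crosses the threshold."""
    energy = np.asarray(trace, dtype=float) ** 2
    nsta, nlta = int(sta_win * fs), int(lta_win * fs)
    csum = np.concatenate(([0.0], np.cumsum(energy)))
    sta = (csum[nsta:] - csum[:-nsta]) / nsta   # short-term average ending at each sample
    lta = (csum[nlta:] - csum[:-nlta]) / nlta   # long-term average ending at each sample
    ratio = sta[nlta - nsta:] / np.maximum(lta, 1e-20)
    onsets = (ratio[1:] >= threshold) & (ratio[:-1] < threshold)
    return np.flatnonzero(onsets) + nlta        # approximate trigger samples in the trace
```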

  16. The Northern California Earthquake Data Center: Seismic and Geophysical Data for Northern California and Beyond

    NASA Astrophysics Data System (ADS)

    Neuhauser, D.; Klein, F.; Zuzlewski, S.; Gee, L.; Oppenheimer, D.; Romanowicz, B.

    2004-12-01

    The Northern California Earthquake Data Center (NCEDC) is an archive and distribution center for geophysical data for networks in northern and central California. The NCEDC provides timeseries data from seismic, strain, electromagnetic, creep, tilt, and environmental sensors, as well as continuous and campaign GPS data in raw and RINEX formats. The NCEDC has a wide variety of interfaces for data retrieval. Timeseries data are available via a web interface and standard queued request methods such as NetDC (developed in collaboration with the IRIS DMC and other international data centers), BREQ_FAST, and EVT_FAST. Interactive data retrieval methods include STP, developed by the SCEDC, and FISSURES DHI (Data Handling Interface), an object-oriented interface developed by IRIS. The Sandia MATSEIS system is being adapted to use the FISSURES DHI interface to provide an enhanced GUI-based seismic analysis system for MATLAB. Northern California and prototype ANSS worldwide earthquake catalogs are searchable from web interfaces, and supporting phase and amplitude data can be retrieved when available. Future data sets planned for the NCEDC are seismic and strain data from the EarthScope Plate Boundary Observatory (PBO) and SAFOD. The NCEDC is a joint project of the UC Berkeley Seismological Laboratory and USGS Menlo Park.
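
    A present-day equivalent of such a waveform request can be made against the NCEDC's FDSN web services with ObsPy (these services postdate this abstract, which describes the earlier NetDC/BREQ_FAST/EVT_FAST/STP tools); the station and event time below are only an example.

```python
# Example of retrieving NCEDC timeseries via FDSN web services with ObsPy.
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("NCEDC")
t0 = UTCDateTime("2014-08-24T10:20:44")            # 2014 South Napa earthquake origin time
st = client.get_waveforms(network="BK", station="CMB", location="*",
                          channel="BHZ", starttime=t0, endtime=t0 + 300)
st.plot()                                          # quick look at the retrieved trace
```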

  17. Implications of the World Trade Center Health Program (WTCHP) for the Public Health Response to the Great East Japan Earthquake

    PubMed Central

    CRANE, Michael A.; CHO, Hyunje G.; LANDRIGAN, Phillip J.

    2013-01-01

    The attacks on the World Trade Center (WTC) on September 11, 2001 resulted in a serious burden of physical and mental illness for the 50,000 rescue workers that responded to 9/11 as well as the 400,000 residents and workers in the surrounding areas of New York City. The Zadroga Act of 2010 established the WTC Health Program (WTCHP) to provide monitoring and treatment of WTC exposure-related conditions and health surveillance for the responder and survivor populations. Several reports have highlighted the applicability of insights gained from the WTCHP to the public health response to the Great East Japan Earthquake. Optimal exposure monitoring processes and attention to the welfare of vulnerable exposed sub-groups are critical aspects of the response to both incidents. The ongoing mental health care concerns of 9/11 patients accentuate the need for accessible and appropriately skilled mental health care in Fukushima. Active efforts to demonstrate transparency and to promote community involvement in the public health response will be highly important in establishing successful long-term monitoring and treatment programs for the exposed populations in Fukushima. PMID:24317449

  18. Earthquakes for Kids

    MedlinePlus

    Links to earthquake resources for children, including earthquake topics for education, an earthquake glossary for kids, preparedness information, Google Earth/KML files, and earthquake summary posters.

  19. Application of collocated GPS and seismic sensors to earthquake monitoring and early warning.

    PubMed

    Li, Xingxing; Zhang, Xiaohong; Guo, Bofeng

    2013-10-24

    We explore the use of collocated GPS and seismic sensors for earthquake monitoring and early warning. The GPS and seismic data collected during the 2011 Tohoku-Oki (Japan) and the 2010 El Mayor-Cucapah (Mexico) earthquakes are analyzed by using a tightly-coupled integration. The performance of the integrated results is validated by both time and frequency domain analysis. We detect the P-wave arrival and observe small-scale features of the movement from the integrated results and locate the epicenter. Meanwhile, permanent offsets are extracted from the integrated displacements highly accurately and used for reliable fault slip inversion and magnitude estimation.
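
    The tightly-coupled integration itself is not reproduced here; as a simplified illustration of combining the two sensor types, the sketch below implements a small, loosely coupled 1-D Kalman filter that propagates displacement and velocity with accelerometer samples and updates with lower-rate GPS displacements, under assumed noise levels.

```python
# Loosely coupled GPS/accelerometer fusion sketch: a 1-D Kalman filter driven by
# accelerometer samples and corrected by GPS displacements. A simplified stand-in
# for the tightly coupled integration described above; noise levels are assumptions.
import numpy as np

def fuse(acc, gps, dt, q_acc=1e-2, r_gps=1e-4):
    """acc: accelerations (m/s^2) at every step; gps: dict {step_index: displacement (m)}."""
    x = np.zeros(2)                         # state: [displacement, velocity]
    P = np.eye(2)
    F = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.array([0.5 * dt**2, dt])
    Q = q_acc * np.outer(B, B)              # process noise driven by accelerometer error
    H = np.array([[1.0, 0.0]])
    fused = []
    for k, a in enumerate(acc):
        x = F @ x + B * a                   # predict with the accelerometer sample
        P = F @ P @ F.T + Q
        if k in gps:                        # update when a GPS displacement is available
            y = gps[k] - H @ x
            S = H @ P @ H.T + r_gps
            K = (P @ H.T) / S
            x = x + (K * y).ravel()
            P = (np.eye(2) - K @ H) @ P
        fused.append(x[0])
    return np.array(fused)                  # fused displacement time series
```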

  20. Basin-centered asperities in great subduction zone earthquakes: A link between slip, subsidence, and subduction erosion?

    USGS Publications Warehouse

    Wells, R.E.; Blakely, R.J.; Sugiyama, Y.; Scholl, D. W.; Dinterman, P.A.

    2003-01-01

    Published areas of high coseismic slip, or asperities, for 29 of the largest Circum-Pacific megathrust earthquakes are compared to forearc structure revealed by satellite free-air gravity, bathymetry, and seismic profiling. On average, 71% of an earthquake's seismic moment and 79% of its asperity area occur beneath the prominent gravity low outlining the deep-sea terrace; 57% of an earthquake's asperity area, on average, occurs beneath the forearc basins that lie within the deep-sea terrace. In SW Japan, slip in the 1923, 1944, 1946, and 1968 earthquakes was largely centered beneath five forearc basins whose landward edge overlies the 350°C isotherm on the plate boundary, the inferred downdip limit of the locked zone. Basin-centered coseismic slip also occurred along the Aleutian, Mexico, Peru, and Chile subduction zones but was ambiguous for the great 1964 Alaska earthquake. Beneath intrabasin structural highs, seismic slip tends to be lower, possibly due to higher temperatures and fluid pressures. Kilometers of late Cenozoic subsidence and crustal thinning above some of the source zones are indicated by seismic profiling and drilling and are thought to be caused by basal subduction erosion. The deep-sea terraces and basins may evolve not just by growth of the outer arc high but also by interseismic subsidence not recovered during earthquakes. Basin-centered asperities could indicate a link between subsidence, subduction erosion, and seismogenesis. Whatever the cause, forearc basins may be useful indicators of long-term seismic moment release. The source zone for Cascadia's 1700 A.D. earthquake contains five large, basin-centered gravity lows that may indicate potential asperities at depth. The gravity gradient marking the inferred downdip limit to large coseismic slip lies offshore, except in northwestern Washington, where the low extends landward beneath the coast. Transverse gravity highs between the basins suggest that the margin is seismically segmented and

  1. Disasters; the 2010 Haitian earthquake and the evacuation of burn victims to US burn centers.

    PubMed

    Kearns, Randy D; Holmes, James H; Skarote, Mary Beth; Cairns, Charles B; Strickland, Samantha Cooksey; Smith, Howard G; Cairns, Bruce A

    2014-09-01

    Response to the 2010 Haitian earthquake included an array of diverse yet critical actions. This paper will briefly review the evacuation of a small group of patients with burns to burn centers in the southeastern United States (US). This particular evacuation brought together for the first time plans, groups, and organizations that had previously only exercised this process. The response to the Haitian earthquake was a glimpse at what the international community working together can do to help others, and relieve suffering following a catastrophic disaster. The international response was substantial. This paper will trace one evacuation, one day for one unique group of patients with burns to burn centers in the US and review the lessons learned from this process. The patient population with burns being evacuated from Haiti was very small compared to the overall operation. Nevertheless, the outcomes included a better understanding of how a larger event could challenge the limited resources for all involved. This paper includes aspects of the patient movement, the logistics needed, and briefly discusses reimbursement for the care provided. PMID:24411582

  3. Novel Algorithms Enabling Rapid, Real-Time Earthquake Monitoring and Tsunami Early Warning Worldwide

    NASA Astrophysics Data System (ADS)

    Lomax, A.; Michelini, A.

    2012-12-01

    We have recently introduced new methods to rapidly determine the tsunami potential and magnitude of large earthquakes (e.g., Lomax and Michelini, 2009ab, 2011, 2012). To validate these methods we have implemented them along with other new algorithms within the Early-est earthquake monitor at INGV-Rome (http://early-est.rm.ingv.it, http://early-est.alomax.net). Early-est is a lightweight software package for real-time earthquake monitoring (including phase picking, phase association and event detection, location, magnitude determination, first-motion mechanism determination, ...), and for tsunami early warning based on discriminants for earthquake tsunami potential. In a simulation using archived broadband seismograms for the devastating M9, 2011 Tohoku earthquake and tsunami, Early-est determines: the epicenter within 3 min after the event origin time, discriminants showing very high tsunami potential within 5-7 min, and magnitude Mwpd(RT) 9.0-9.2 and a correct shallow-thrusting mechanism within 8 min. Real-time monitoring with Early-est gives similar results for most large earthquakes using currently available, real-time seismogram data. Here we summarize some of the key algorithms within Early-est that enable rapid, real-time earthquake monitoring and tsunami early warning worldwide: >>> FilterPicker - a general purpose, broad-band, phase detector and picker (http://alomax.net/FilterPicker); >>> Robust, simultaneous association and location using a probabilistic, global-search; >>> Period-duration discriminants TdT0 and TdT50Ex for tsunami potential available within 5 min; >>> Mwpd(RT) magnitude for very large earthquakes available within 10 min; >>> Waveform P polarities determined on broad-band displacement traces, focal mechanisms obtained with the HASH program (Hardebeck and Shearer, 2002); >>> SeisGramWeb - a portable-device ready seismogram viewer using web-services in a browser (http://alomax.net/webtools/sgweb/info.html). References (see also: http
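
    The Early-est algorithms themselves are not reproduced here; as a toy illustration of the global-search location idea listed above, the sketch below grid-searches an epicenter from P arrival times under a flat-Earth, constant-velocity assumption.

```python
# Toy grid-search epicenter estimate from P arrival times, illustrating the
# general idea of a global-search locator (not the Early-est algorithm).
# Assumes a flat Earth and a constant apparent P velocity, which is only
# reasonable for small, local networks.
import numpy as np

def grid_search_epicenter(stations_xy, p_times, v_p=6.0, extent=100.0, step=1.0):
    """stations_xy: (n, 2) coordinates in km; p_times: (n,) arrival times in s.
    Returns the best-fitting (x, y, origin_time)."""
    xs = np.arange(-extent, extent + step, step)
    best_node, best_rms = None, np.inf
    for x in xs:
        for y in xs:
            dist = np.hypot(stations_xy[:, 0] - x, stations_xy[:, 1] - y)
            travel = dist / v_p
            t0 = np.median(p_times - travel)          # origin time best fitting this node
            rms = np.sqrt(np.mean((p_times - travel - t0) ** 2))
            if rms < best_rms:
                best_node, best_rms = (x, y, t0), rms
    return best_node
```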

  4. Volcano and Earthquake Monitoring Plan for the Yellowstone Volcano Observatory, 2006-2015

    USGS Publications Warehouse

    ,

    2006-01-01

    To provide Yellowstone National Park (YNP) and its surrounding communities with a modern, comprehensive system for volcano and earthquake monitoring, the Yellowstone Volcano Observatory (YVO) has developed a monitoring plan for the period 2006-2015. Such a plan is needed so that YVO can provide timely information during seismic, volcanic, and hydrothermal crises and can anticipate hazardous events before they occur. The monitoring network will also provide high-quality data for scientific study and interpretation of one of the largest active volcanic systems in the world. Among the needs of the observatory are to upgrade its seismograph network to modern standards and to add five new seismograph stations in areas of the park that currently lack adequate station density. In cooperation with the National Science Foundation (NSF) and its Plate Boundary Observatory Program (PBO), YVO seeks to install five borehole strainmeters and two tiltmeters to measure crustal movements. The boreholes would be located in developed areas close to existing infrastructure and away from sensitive geothermal features. In conjunction with the park's geothermal monitoring program, installation of new stream gages, and gas-measuring instruments will allow YVO to compare geophysical phenomena, such as earthquakes and ground motions, to hydrothermal events, such as anomalous water and gas discharge. In addition, YVO seeks to characterize the behavior of geyser basins, both to detect any precursors to hydrothermal explosions and to monitor earthquakes related to fluid movements that are difficult to detect with the current monitoring system. Finally, a monitoring network consists not solely of instruments, but requires also a secure system for real-time transmission of data. The current telemetry system is vulnerable to failures that could jeopardize data transmission out of Yellowstone. Future advances in monitoring technologies must be accompanied by improvements in the infrastructure for

  5. New approach for earthquake/tsunami monitoring using dense GPS networks.

    PubMed

    Li, Xingxing; Ge, Maorong; Zhang, Yong; Wang, Rongjiang; Xu, Peiliang; Wickert, Jens; Schuh, Harald

    2013-01-01

    In recent years, increasing numbers of high-rate GPS stations have been installed around the world and set up to provide data in real time. These networks provide a great opportunity to quickly capture surface displacements, which makes them important as potential constituents of earthquake/tsunami monitoring and warning systems. Appropriate real-time GPS data analysis with sufficient accuracy for this purpose is a main focus of current GPS research. In this paper we propose an augmented point positioning method for GPS-based hazard monitoring, which can achieve fast or even instantaneous precise positioning without relying on data from a specific reference station. The proposed method overcomes the limitations of the most commonly used GPS processing approaches, relative positioning and global precise point positioning. The advantages of the proposed approach are demonstrated using GPS data recorded during the 2011 Tohoku-Oki earthquake in Japan.

  6. Quantifying 10 years of Improvements in Earthquake and Tsunami Monitoring in the Caribbean and Adjacent Regions

    NASA Astrophysics Data System (ADS)

    von Hillebrandt-Andrade, C.; Huerfano Moreno, V. A.; McNamara, D. E.; Saurel, J. M.

    2014-12-01

    The magnitude-9.3 Sumatra-Andaman Islands earthquake of December 26, 2004, increased global awareness of the destructive hazard posed by earthquakes and tsunamis. Post-event assessments of global coastline vulnerability highlighted the Caribbean as a region of high hazard and risk that was poorly monitored. Nearly 100 tsunamis have been reported for the Caribbean and adjacent regions in the past 500 years, and they continue to pose a threat to its nations, coastal areas along the Gulf of Mexico, and the Atlantic seaboard of North and South America. Significant efforts to improve monitoring capabilities have been undertaken since this time, including an expansion of the United States Geological Survey (USGS) Global Seismographic Network (GSN) (McNamara et al., 2006) and establishment of the United Nations Educational, Scientific and Cultural Organization (UNESCO) Intergovernmental Coordination Group (ICG) for the Tsunami and other Coastal Hazards Warning System for the Caribbean and Adjacent Regions (CARIBE EWS). The minimum performance standards it recommended for initial earthquake locations include: 1) earthquake detection within 1 minute, 2) a minimum magnitude threshold of M4.5, and 3) an initial hypocenter error of <30 km. In this study, we assess current compliance with the performance standards and model improvements in earthquake and tsunami monitoring capabilities in the Caribbean region since the first meeting of the UNESCO ICG-Caribe EWS in 2006. The three measures of network capability modeled in this study are: 1) minimum Mw detection threshold; 2) P-wave detection time of an automatic processing system; and 3) theoretical earthquake location uncertainty. By modeling these three measures of seismic network capability, we can optimize the distribution of ICG-Caribe EWS seismic stations and select an international network that will be contributed from existing real-time broadband national networks in the region. Sea level monitoring improvements both offshore and
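
    One of the three capability measures listed above, P-wave detection time, can be approximated as the travel time to the Nth-closest station plus a processing delay; the sketch below does exactly that under a constant-velocity assumption, which is far simpler than the modeling used in the study.

```python
# Rough sketch of a detection-time measure: the P travel time to the Nth-closest
# station plus a fixed processing delay. The constant velocity and delay value
# are simplifying assumptions, not the study's actual model.
import numpy as np

def detection_time_s(event_xy_km, stations_xy_km, n_required=4, v_p_km_s=8.0, delay_s=5.0):
    """Time until n_required stations have a P arrival, plus processing delay."""
    d = np.hypot(stations_xy_km[:, 0] - event_xy_km[0],
                 stations_xy_km[:, 1] - event_xy_km[1])
    travel = np.sort(d / v_p_km_s)
    return travel[n_required - 1] + delay_s
```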

  7. Network of seismo-geochemical monitoring observatories for earthquake prediction research in India

    NASA Astrophysics Data System (ADS)

    Chaudhuri, Hirok; Barman, Chiranjib; Iyengar, A.; Ghose, Debasis; Sen, Prasanta; Sinha, Bikash

    2013-08-01

    This paper briefly reviews the research carried out to develop multi-parametric gas-geochemical monitoring facilities dedicated to earthquake prediction research in India, through a network of seismo-geochemical monitoring observatories installed in different regions of the country. In an attempt to detect earthquake precursors, the concentrations of helium, argon, nitrogen, methane, radon-222 (222Rn), polonium-218 (218Po), and polonium-214 (214Po) emanating from hydrothermal systems are monitored continuously, around the clock, at these observatories. In this paper, we make a cross-correlation study of a number of geochemical anomalies recorded at these observatories. With the data received from each of the observatories we carry out a time series analysis to relate the anomalies to earthquake magnitude and epicentral distance through statistical methods and empirical formulations that relate the area of influence to earthquake size. Application of linear and nonlinear statistical techniques to the recorded geochemical data sets reveals a clear signature of long-range correlation in the data.
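
    As a minimal illustration of the cross-correlation analysis mentioned above, the sketch below computes the normalized cross-correlation between two geochemical time series as a function of lag; the series pairing is illustrative.

```python
# Normalized cross-correlation between two geochemical time series (e.g., radon
# and helium concentrations) as a function of lag; a minimal illustration of the
# cross-correlation analysis described above.
import numpy as np

def normalized_xcorr(a, b, max_lag):
    """Return (lags, correlation) for lags in [-max_lag, max_lag]."""
    a = (np.asarray(a, float) - np.mean(a)) / np.std(a)
    b = (np.asarray(b, float) - np.mean(b)) / np.std(b)
    n = len(a)
    lags = np.arange(-max_lag, max_lag + 1)
    cc = [np.sum(a[max(0, -k):n - max(0, k)] * b[max(0, k):n - max(0, -k)]) / n
          for k in lags]
    return lags, np.array(cc)
```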

  8. [High Resolution Remote Sensing Monitoring and Assessment of Secondary Geological Disasters Triggered by the Lushan Earthquake].

    PubMed

    Wang, Fu-tao; Wang, Shi-xin; Zhou, Yi; Wang, Li-tao; Yan, Fu-li; Li, Wen-jun; Liu, Xiong-fei

    2016-01-01

    The secondary geological disasters triggered by the Lushan earthquake on April 20, 2013, such as landslides, collapses, and debris flows, caused great casualties and losses. We monitored the number and spatial distribution of the secondary geological disasters in the earthquake-hit area from airborne remote sensing images covering about 3,100 km2. The results showed that Lushan County, Baoxing County, and Tianquan County were most severely affected, with 164, 126, and 71 secondary geological disasters in these regions, respectively. Moreover, we analyzed the relationship between the distribution of the secondary geological disasters, the geological structure, and seismic intensity. The results indicate that there were 4 high-hazard zones in the monitored area: one concentrated within six kilometers of the epicenter, and the others distributed along the two main fault zones of the Longmen Mountains. More than 97% of the secondary geological disasters occurred in zones with a seismic intensity of VII to IX, slopes between 25° and 50°, and altitudes between 800 and 2,000 m. Finally, preliminary suggestions were proposed for the rehabilitation and reconstruction planning after the Lushan earthquake. According to the analysis results, airborne and spaceborne remote sensing can be used accurately and effectively, in almost real time, to monitor and assess secondary geological disasters, providing a scientific basis and decision-making support for government emergency command and post-disaster reconstruction. PMID:27228764

  10. Grand Canyon Monitoring and Research Center

    USGS Publications Warehouse

    Hamill, John F.

    2009-01-01

    The Grand Canyon of the Colorado River, one of the world's most spectacular gorges, is a premier U.S. National Park and a World Heritage Site. The canyon supports a diverse array of distinctive plants and animals and contains cultural resources significant to the region's Native Americans. About 15 miles upstream of Grand Canyon National Park sits Glen Canyon Dam, completed in 1963, which created Lake Powell. The dam provides hydroelectric power for 200 wholesale customers in six western States, but it has also altered the Colorado River's flow, temperature, and sediment-carrying capacity. Over time this has resulted in beach erosion, invasion and expansion of nonnative species, and losses of native fish. Public concern about the effects of Glen Canyon Dam operations prompted the passage of the Grand Canyon Protection Act of 1992, which directs the Secretary of the Interior to operate the dam 'to protect, mitigate adverse impacts to, and improve values for which Grand Canyon National Park and Glen Canyon National Recreation Area were established...' This legislation also required the creation of a long-term monitoring and research program to provide information that could inform decisions related to dam operations and protection of downstream resources.

  11. Real-time seismic monitoring of the integrated cape girardeau bridge array and recorded earthquake response

    USGS Publications Warehouse

    Celebi, M.

    2006-01-01

    This paper introduces the state of the art, real-time and broad-band seismic monitoring network implemented for the 1206 m [3956 ft] long, cable-stayed Bill Emerson Memorial Bridge in Cape Girardeau (MO), a new Mississippi River crossing, approximately 80 km from the epicentral region of the 1811-1812 New Madrid earthquakes. The bridge was designed for a strong earthquake (magnitude 7.5 or greater) during the design life of the bridge. The monitoring network comprises a total of 84 channels of accelerometers deployed on the superstructure, pier foundations and at surface and downhole free-field arrays of the bridge. The paper also presents the high quality response data obtained from the network. Such data is aimed to be used by the owner, researchers and engineers to assess the performance of the bridge, to check design parameters, including the comparison of dynamic characteristics with actual response, and to better design future similar bridges. Preliminary analyses of ambient and low amplitude small earthquake data reveal specific response characteristics of the bridge and the free-field. There is evidence of coherent tower, cable, deck interaction that sometimes results in amplified ambient motions. Motions at the lowest tri-axial downhole accelerometers on both MO and IL sides are practically free from any feedback from the bridge. Motions at the mid-level and surface downhole accelerometers are influenced significantly by feedback due to amplified ambient motions of the bridge. Copyright ASCE 2006.

  12. Viscoelastic solutions to tectonic problems of extinct spreading centers, earthquake triggering, and subduction zone dynamics

    NASA Astrophysics Data System (ADS)

    Freed, Andrew Mark

    This dissertation uses a finite element technique to explore the role of viscoelastic behavior in a wide range of plate tectonic processes. We consider problems associated with spreading centers, earthquake triggering, and subduction zone dynamics. We simulated the evolution of a slow-spreading center upon cessation of active spreading in order to predict long-term changes in the axial valley morphology. Results suggest that the axial valley created at a slow-spreading center persists because the crust is too strong to deform ductilely and because no effective mechanism exists to reverse the topography created by rift-bounding normal faults. These results suggest that the persistence of axial valleys at extinct spreading centers is consistent with a lithospheric stretching model based on dynamic forces for active slow-spreading ridges. In our study of earthquake triggering, results suggest that if a ductile lower crust or upper mantle flows viscously following a thrust event, relaxation may cause a transfer of stress to the upper crust. Under certain conditions this may lead to further increases and a lateral expansion of high Coulomb stresses along the base of the upper crust. Analysis of experimentally determined non-Newtonian flow laws suggests that wet granitic, quartz, and feldspar aggregates may yield a viscosity on the order of 10^19 Pa s. The calculated rate of stress transfer from a viscous lower crust or upper mantle to the upper crust becomes faster with increasing values of the power-law exponent and in the presence of a regional compressive strain rate. In our study of subduction zone dynamics, we model the density and strength structures that drive the Nazca and South American plates. Results suggest that chemical buoyancy and phase changes associated with a cool subducting slab strongly influence the magnitude of driving forces, and the downgoing slab is weaker than would be expected based solely on temperature. Additionally

  13. Two Decades of Seismic Monitoring by WEBNET: Disclosing a Lifecycle of an Earthquake Swarm Zone

    NASA Astrophysics Data System (ADS)

    Fischer, T.; Horalek, J.; Cermakova, H.; Michalek, J.; Doubravova, J.; Bouskova, A.; Bachura, M.

    2014-12-01

    The area of West Bohemia/Vogtland in the western Eger Rift is typified by earthquake swarm activity with maximum magnitudes not exceeding ML 5. The seismicity is dominated by the area near Novy Kostel, where earthquakes cluster along a narrow and steeply dipping focal zone of 8 km length that strikes about N-S in the depth range 7-11 km. Detailed seismic monitoring has been carried out by the WEBNET seismic network since 1992. During that period, earthquake swarms with several mainshocks exceeding magnitude ML 3 took place in 2000, 2008, and 2011. These swarms were characterized by an episodic character in which the activity of individual episodes overlapped in time and space. Interestingly, the rate of activity increased with each subsequent swarm, the 2000 swarm being the slowest and the 2011 swarm the most rapid. In 2014 the character of the seismicity changed from swarm-like activity to mainshock-aftershock activity. Three mainshocks have already occurred since May 2014: the ML 3.6 event of May 24, the ML 4.5 event of May 31, and the ML 3.5 event of August 3. All of these events were followed by a short aftershock sequence of one to four days' duration. All three events exceeded the following aftershocks by more than one magnitude level, and none of these mainshocks were preceded by foreshocks, which differentiates this activity from the preceding swarm seismicity. Interestingly, the hypocenters of the mentioned earthquake swarms and mainshock-aftershock sequences share a common fault zone and overlap significantly. We present a detailed analysis of precise hypocenter locations and statistical characteristics of the activity in order to find the origin of the different behaviors of the seismic activity, which result in either earthquake swarms or mainshock-aftershock sequences.

  14. Cloud-based systems for monitoring earthquakes and other environmental quantities

    NASA Astrophysics Data System (ADS)

    Clayton, R. W.; Olson, M.; Liu, A.; Chandy, M.; Bunn, J.; Guy, R.

    2013-12-01

    There are many advantages to using a cloud-based system to record and analyze environmental quantities such as earthquakes, radiation, various gases, dust, and meteorological parameters. These advantages include robustness and dynamic scalability, as well as reduced costs. In this paper, we present our experiences over the last three years in developing a cloud-based earthquake monitoring system (the Community Seismic Network). This network consists of over 600 sensors (accelerometers) in the Southern California region that send data directly to the Google App Engine, where the data are analyzed. The system is capable of handling many other types of sensor data and generating a situation-awareness analysis as a product. Other advantages of the cloud-based system are integration with other peer networks and the ability to deploy anywhere in the world without having to build additional computing infrastructure.
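
    As a sketch of the sensor-to-cloud reporting pattern described above, the example below posts a peak-acceleration report to a cloud endpoint over HTTP; the URL and message format are hypothetical, not the Community Seismic Network's actual API.

```python
# Sketch of a sensor client reporting peak acceleration to a cloud endpoint,
# in the spirit of the Community Seismic Network architecture described above.
# The URL and message format are hypothetical, not the CSN's actual API.
import json
import urllib.request

def report_pick(endpoint_url, sensor_id, timestamp_iso, peak_accel_g):
    """POST a small JSON pick report and return the HTTP status code."""
    payload = json.dumps({
        "sensor_id": sensor_id,
        "time": timestamp_iso,
        "peak_accel_g": peak_accel_g,
    }).encode("utf-8")
    req = urllib.request.Request(endpoint_url, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=10) as resp:   # POST because data is supplied
        return resp.status

# report_pick("https://csn.example.org/api/picks", "sensor-0042",
#             "2013-12-01T12:00:00Z", 0.012)
```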

  15. Recorded earthquake responses from the integrated seismic monitoring network of the Atwood Building, Anchorage, Alaska

    USGS Publications Warehouse

    Celebi, M.

    2006-01-01

    An integrated seismic monitoring system with a total of 53 channels of accelerometers is now operating in and at the nearby free-field site of the 20-story steel-framed Atwood Building in highly seismic Anchorage, Alaska. The building has a single-story basement and a reinforced concrete foundation without piles. The monitoring system comprises a 32-channel structural array and a 21-channel site array. Accelerometers are deployed on 10 levels of the building to assess translational, torsional, and rocking motions, interstory drift (displacement) between selected pairs of adjacent floors, and average drift between floors. The site array, located approximately a city block from the building, comprises seven triaxial accelerometers, one at the surface and six in boreholes ranging in depths from 15 to 200 feet (~5-60 meters). The arrays have already recorded low-amplitude shaking responses of the building and the site caused by numerous earthquakes at distances ranging from tens to a couple of hundred kilometers. Data from an earthquake that occurred 186 km away traces the propagation of waves from the deepest borehole to the roof of the building in approximately 0.5 seconds. Fundamental structural frequencies [0.58 Hz (NS) and 0.47 Hz (EW)], low damping percentages (2-4%), mode coupling, and beating effects are identified. The fundamental site frequency at approximately 1.5 Hz is close to the second modal frequencies (1.83 Hz NS and 1.43 Hz EW) of the building, which may cause resonance of the building. Additional earthquakes prove repeatability of these characteristics; however, stronger shaking may alter these conclusions. © 2006, Earthquake Engineering Research Institute.
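
    As a minimal illustration of the interstory drift quantity mentioned above, the sketch below differences the displacement records of two adjacent instrumented floors and returns the peak drift ratio; how the displacements are derived from the recorded accelerations is not addressed here.

```python
# Peak interstory drift ratio from displacement records of two adjacent floors.
# Obtaining the displacements (e.g., by band-pass filtering and double
# integration of the recorded accelerations) is outside this sketch.
import numpy as np

def peak_drift_ratio(disp_upper_cm, disp_lower_cm, story_height_cm):
    """Return the dimensionless peak drift ratio between two adjacent floors."""
    drift = np.asarray(disp_upper_cm, float) - np.asarray(disp_lower_cm, float)
    return np.max(np.abs(drift)) / story_height_cm
```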

  16. An Architecture for Continuous Data Quality Monitoring in Medical Centers.

    PubMed

    Endler, Gregor; Schwab, Peter K; Wahl, Andreas M; Tenschert, Johannes; Lenz, Richard

    2015-01-01

    In the medical domain, data quality is very important. Since requirements and data change frequently, continuous and sustainable monitoring and improvement of data quality is necessary. Working together with managers of medical centers, we developed an architecture for a data quality monitoring system. The architecture enables domain experts to adapt the system during runtime to match their specifications using a built-in rule system. It also allows arbitrarily complex analyses to be integrated into the monitoring cycle. We evaluate our architecture by matching its components to the well-known data quality methodology TDQM.
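
    As a sketch of a runtime-adaptable rule system of the kind described above, the example below lets named data quality checks be registered and applied to records; the field names and rules are illustrative, not those of the actual system.

```python
# Minimal sketch of a pluggable data quality rule system: named checks are
# registered at runtime and each record is evaluated against all of them.
# Field names and rule thresholds are illustrative assumptions.
RULES = {}

def rule(name):
    """Decorator that registers a data quality check under a human-readable name."""
    def register(fn):
        RULES[name] = fn
        return fn
    return register

@rule("weight_in_range")
def weight_in_range(record):
    return record.get("weight_kg") is not None and 1.0 <= record["weight_kg"] <= 400.0

@rule("has_patient_id")
def has_patient_id(record):
    return bool(record.get("patient_id"))

def check(record):
    """Return the names of all rules the record violates."""
    return [name for name, fn in RULES.items() if not fn(record)]

print(check({"patient_id": "A-17", "weight_kg": 5000}))   # -> ['weight_in_range']
```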

  17. Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT): Towards the Next Generation of Internship

    NASA Astrophysics Data System (ADS)

    Perry, S.; Benthien, M.; Jordan, T. H.

    2005-12-01

    The SCEC/UseIT internship program is training the next generation of earthquake scientist, with methods that can be adapted to other disciplines. UseIT interns work collaboratively, in multi-disciplinary teams, conducting computer science research that is needed by earthquake scientists. Since 2002, the UseIT program has welcomed 64 students, in some two dozen majors, at all class levels, from schools around the nation. Each summer''s work is posed as a ``Grand Challenge.'' The students then organize themselves into project teams, decide how to proceed, and pool their diverse talents and backgrounds. They have traditional mentors, who provide advice and encouragement, but they also mentor one another, and this has proved to be a powerful relationship. Most begin with fear that their Grand Challenge is impossible, and end with excitement and pride about what they have accomplished. The 22 UseIT interns in summer, 2005, were primarily computer science and engineering majors, with others in geology, mathematics, English, digital media design, physics, history, and cinema. The 2005 Grand Challenge was to "build an earthquake monitoring system" to aid scientists who must visualize rapidly evolving earthquake sequences and convey information to emergency personnel and the public. Most UseIT interns were engaged in software engineering, bringing new datasets and functionality to SCEC-VDO (Virtual Display of Objects), a 3D visualization software that was prototyped by interns last year, using Java3D and an extensible, plug-in architecture based on the Eclipse Integrated Development Environment. Other UseIT interns used SCEC-VDO to make animated movies, and experimented with imagery in order to communicate concepts and events in earthquake science. One movie-making project included the creation of an assessment to test the effectiveness of the movie''s educational message. Finally, one intern created an interactive, multimedia presentation of the UseIT program.

  18. Continued Swift/XRT monitoring observations of the Galactic center

    NASA Astrophysics Data System (ADS)

    Degenaar, N.; Wijnands, R.; Reynolds, M. T.; Miller, J. M.; Kennea, J. A.; Gehrels, N.; Haggard, D.; Ponti, G.; Burrows, D. N.

    2014-02-01

    We report on continued X-ray monitoring observations of the Galactic center with the Swift/XRT (ATel #5847). Between 2014 February 2 and 6 the XRT count rate at the position of Sgr A* and the nearby transient magnetar SGR J1745-29 varied between ~1E-2 and 3E-2 counts s^-1.

  19. Real-time monitoring of fine-scale changes in fault and earthquake properties

    NASA Astrophysics Data System (ADS)

    Waldhauser, F.; Schaff, D. P.

    2014-12-01

    The high-resolution back-processing and re-analysis of long-term seismic archives has generated new data that provide insight into the fine-scale structures of active faults and the seismogenic processes that control them. Such high-precision studies are typically carried out retroactively, for a specific time period and/or fault of interest. For the last 5 years we have been operating a real-time system, DD-RT, that uses waveform cross-correlation and double-difference algorithms to automatically compute high-precision (tens to hundreds of meters) locations of new earthquakes recorded by the Northern California Seismic System. These locations are computed relative to a high-resolution, 30-year-long background archive that includes over half a million earthquakes, 20 million seismograms, and 1.7 billion correlation measurements. In this paper we present results from using the DD-RT system and its relational database to monitor changes in earthquake and fault properties at the scale of individual events. We developed baseline characteristics for repeating earthquakes, fore- and aftershock sequences, and fault zone properties, against which we evaluate new events in near real time. We developed these baseline characteristics from a comprehensive analysis of the double-difference archive, and developed real-time modules that plug into the DD-RT system for monitoring deviations from these baselines. For example, we defined baseline characteristics for 8,500 repeating earthquake sequences, including more than 25,000 events, that were found in an extensive search across Northern California. Precise measurements of relative hypocenter positions, differential magnitudes, and waveform similarity are used to automatically associate new member events to existing sequences. This allows us to monitor changes relative to baseline parameters such as recurrence intervals and their coefficient of variation (CV). Alerts on such changes are especially important for large sequences of
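
    A minimal sketch, not the DD-RT modules, of one baseline mentioned above: recurrence-interval statistics for a repeating-earthquake sequence and a simple check of whether a new event deviates from them. The sequence times and the tolerance are illustrative.

        import numpy as np

        def recurrence_baseline(origin_times):
            """Mean recurrence interval and coefficient of variation (CV) for a sequence."""
            intervals = np.diff(np.sort(origin_times))      # time between successive repeats
            return intervals.mean(), intervals.std() / intervals.mean()

        def deviates(new_time, origin_times, tolerance=0.5):
            """Flag the new event if its interval differs from the baseline mean by more
            than `tolerance` (fractional); the threshold is illustrative only."""
            mean_dt, cv = recurrence_baseline(origin_times)
            last = max(origin_times)
            return abs((new_time - last) - mean_dt) / mean_dt > tolerance

        # Hypothetical sequence repeating roughly every two years (times in days)
        times = np.array([0.0, 710.0, 1440.0, 2160.0])
        print(recurrence_baseline(times))
        print(deviates(2500.0, times))   # True: the new interval is much shorter than the baseline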

  20. New Continuous Timeseries Data at the Northern California Earthquake Data Center

    NASA Astrophysics Data System (ADS)

    Neuhauser, D. S.; Dietz, L.; Zuzlewski, S.; Kohler, W.; Gee, L.; Oppenheimer, D.; Romanowicz, B.

    2005-12-01

    The Northern California Earthquake Data Center (NCEDC) is an archive and distribution center for geophysical data for networks in northern and central California. Recent discovery of non-volcanic tremors in northern and central California has sparked user interest in access to a wider range of continuous seismic data in the region. The NCEDC has responded by expanding its archiving and distribution to all new available continuous data from northern California seismic networks (the USGS NCSN, the UC Berkeley BDSN, the Parkfield HRSN borehole network, and local USArray stations) at all available sample rates, to provide access to all recent real-time timeseries data, and to restore from tape and archive all NCSN continuous data from 2001-present. All new continuous timeseries data will also be available in near-real-time from the NCEDC via the DART (Data Available in Real Time) system, which allows users to directly download daily Telemetry MiniSEED files or to extract and retrieve the timeseries of their selection. The NCEDC will continue to create and distribute event waveform collections for all events detected by the Northern California Seismic System (NCSS), the northern California component of the California Integrated Seismic Network (CISN). All new continuous and event timeseries will be archived in daily intervals and are accessible via the same data request tools (NetDC, BREQ_FAST, EVT_FAST, FISSURES/DHI, STP) as previously archived waveform data. The NCEDC is a joint project of the UC Berkeley Seismological Laboratory and USGS Menlo Park.
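
    A hedged sketch of one way such archived continuous data can be retrieved programmatically; it assumes the ObsPy FDSN client and the NCEDC web services, which postdate the request tools named above, and the network/station/channel codes and time window are examples only.

        from obspy import UTCDateTime
        from obspy.clients.fdsn import Client

        # Example request for one day of continuous broadband data from a BDSN station.
        client = Client("NCEDC")
        stream = client.get_waveforms(network="BK", station="BKS", location="*",
                                      channel="BHZ",
                                      starttime=UTCDateTime("2005-10-01"),
                                      endtime=UTCDateTime("2005-10-02"))
        print(stream)    # ObsPy Stream assembled from archived MiniSEED segments
        stream.plot()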

  1. USGS contributions to earthquake and tsunami monitoring in the Caribbean Region

    NASA Astrophysics Data System (ADS)

    McNamara, D.; Caribbean Project Team, U.; Partners, C.

    2007-05-01

    USGS Caribbean Project Team: Lind Gee, Gary Gyure, John Derr, Jack Odum, John McMillan, David Carver, Jim Allen, Susan Rhea, Don Anderson, Harley Benz. Caribbean Partners: Christa von Hillebrandt-Andrade - PRSN, Juan Payero - ISU-UASD, DR, Eduardo Camacho - UPAN, Panama, Lloyd Lynch - SRU, Gonzalo Cruz - UNAH, Honduras, Margaret Wiggins-Grandison - Jamaica, Judy Thomas - CERO Barbados, Sylvan McIntyre - NADMA Grenada, E. Bermingham - STRI. The magnitude-9 Sumatra-Andaman Islands earthquake of December 26, 2004, increased global awareness of the destructive hazard posed by earthquakes and tsunamis. In response to this tragedy, the US government undertook a collaborative project to improve earthquake and tsunami monitoring along a major portion of vulnerable coastal regions in the Caribbean Sea, the Gulf of Mexico, and the Atlantic Ocean. Seismically active areas of the Caribbean Sea region pose a tsunami risk for Caribbean islands, coastal areas along the Gulf of Mexico, and the Atlantic seaboard of North America. Nearly 100 tsunamis have been reported for the Caribbean region in the past 500 years, including 14 tsunamis reported in Puerto Rico and the U.S. Virgin Islands. Partners in this project include the United States Geological Survey (USGS), the Smithsonian Institution, the National Oceanic and Atmospheric Administration (NOAA), and several partner institutions in the Caribbean region. This presentation focuses on the deployment of nine broadband seismic stations, affiliated with the Global Seismograph Network (GSN), to monitor earthquake activity in the Caribbean region. By the end of 2006, five stations were transmitting data to the USGS National Earthquake Information Service (NEIS) and regional partners through the Puerto Rico Seismic Network (PRSN) Earthworm systems. The following stations are currently operating: SDDR - Sabaneta Dam, Dominican Republic; BBGH - Gun Hill, Barbados; GRGR - Grenville, Grenada; BCIP - Barro Colorado, Panama; TGUH - Tegucigalpa

  2. Data and Visualizations in the Southern California Earthquake Center's Fault Information System

    NASA Astrophysics Data System (ADS)

    Perry, S.

    2003-12-01

    The Southern California Earthquake Center's Fault Information System (FIS) provides a single point of access to fault-related data and models from multiple databases and datasets. The FIS is built of computer code, metadata and Web interfaces based on Web services technology, which enables queries and data interchange irrespective of computer software or platform. Currently we have working prototypes of programmatic and browser-based access. The first generation FIS may be searched and downloaded live, by automated processes, as well as interactively, by humans using a browser. Users get ascii data in plain text or encoded in XML. Via the Earthquake Information Technology (EIT) Interns (Juve and others, this meeting), we are also testing the effectiveness of querying multiple databases using a fault database ontology. For more than a decade, the California Geological Survey (CGS), SCEC, and the U. S. Geological Survey (USGS) have put considerable, shared resources into compiling and assessing published fault data, then providing the data on the Web. Several databases now exist, with different formats, datasets, purposes, and users, in various stages of completion. When fault databases were first envisioned, the full power of today's internet was not yet recognized, and the databases became the Web equivalents of review papers, where one could read an overview summation of a fault, then copy and paste pertinent data. Today, numerous researchers also require rapid queries and downloads of data. Consequently, the first components of the FIS are MySQL databases that deliver numeric values from earlier, text-based databases. Another essential service provided by the FIS is visualizations of fault representations such as those in SCEC's Community Fault Model. The long term goal is to provide a standardized, open-source, platform-independent visualization technique. Currently, the FIS makes available fault model viewing software for users with access to Matlab or Java3D

  3. The Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) Internship Program

    NASA Astrophysics Data System (ADS)

    Perry, S.; Jordan, T.

    2006-12-01

    Our undergraduate research program, SCEC/UseIT, an NSF Research Experience for Undergraduates site, provides software for earthquake researchers and educators, movies for outreach, and ways to strengthen the technical career pipeline. SCEC/UseIT motivates diverse undergraduates towards science and engineering careers through team-based research in the exciting field of earthquake information technology. UseIT provides the cross-training in computer science/information technology (CS/IT) and geoscience needed to make fundamental progress in earthquake system science. Our high and increasing participation of women and minority students is crucial given the nation's precipitous enrollment declines in CS/IT undergraduate degree programs, especially among women. UseIT also casts a "wider, farther" recruitment net that targets scholars interested in creative work but not traditionally attracted to summer science internships. Since 2002, SCEC/UseIT has challenged 79 students in three dozen majors from as many schools with difficult, real-world problems that require collaborative, interdisciplinary solutions. Interns design and engineer open-source software, creating increasingly sophisticated visualization tools (see "SCEC-VDO," session IN11), which are employed by SCEC researchers, in new curricula at the University of Southern California, and by outreach specialists who make animated movies for the public and the media. SCEC-VDO would be a valuable tool for research-oriented professional development programs.

  4. Southern California Earthquake Center - SCEC1: Final Report Summary Alternative Earthquake Source Characterization for the Los Angeles Region

    SciTech Connect

    Foxall, B

    2003-02-26

    The objective of my research has been to synthesize current understanding of the tectonics and faults of the Los Angeles Basin and surrounding region to quantify uncertainty in the characterization of earthquake sources used for geologically- and geodetically-based regional earthquake likelihood models. This work has focused on capturing epistemic uncertainty, i.e., uncertainty stemming from ignorance of the true characteristics of the active faults in the region and of the tectonic forces that drive them. In the present context, epistemic uncertainty has two components: first, the uncertainty in source geometrical and occurrence-rate parameters deduced from the limited geological, geophysical and geodetic observations available; and second, uncertainties that result from fundamentally different interpretations of regional tectonic deformation and faulting. Characterization of the large number of active and potentially active faults that need to be included in estimating earthquake occurrence likelihoods for the Los Angeles region requires synthesis and evaluation of large amounts of data and numerous interpretations. This was accomplished primarily through a series of carefully facilitated workshops, smaller meetings involving key researchers, and email groups. The workshops and meetings were made possible by the unique logistical and financial resources available through SCEC, and proved to be extremely effective forums for the exchange and critical debate of data and interpretations that are essential in constructing fully representative source models. The main product from this work is a complete source model that characterizes all known or potentially active faults in the greater Los Angeles region, which includes the continental borderland as far south as San Diego, the Ventura Basin, and the Santa Barbara Channel. The model constitutes a series of maps and representative cross-sections that define alternative fault geometries, and a table containing fault

  5. UNLV’s environmentally friendly Science and Engineering Building is monitored for earthquake shaking

    USGS Publications Warehouse

    Kalkan, Erol; Savage, Woody; Reza, Shahneam; Knight, Eric; Tian, Ying

    2013-01-01

    The University of Nevada Las Vegas’ (UNLV) Science and Engineering Building is at the cutting edge of environmentally friendly design. As the result of a recent effort by the U.S. Geological Survey’s National Strong Motion Project in cooperation with UNLV, the building is now also in the forefront of buildings installed with structural monitoring systems to measure response during earthquakes. This is particularly important because this is the first such building in Las Vegas. The seismic instrumentation will provide essential data to better understand the structural performance of buildings, especially in this seismically active region.

  6. Role of WEGENER (World Earthquake GEodesy Network for Environmental Hazard Research) in monitoring natural hazards (Invited)

    NASA Astrophysics Data System (ADS)

    Ozener, H.; Zerbini, S.; Bastos, M. L.; Becker, M. H.; Meghraoui, M.; Reilinger, R. E.

    2013-12-01

    WEGENER was originally the acronym for Working Group of European Geoscientists for the Establishment of Networks for Earth-science Research. It was founded in March 1981 in response to an appeal delivered at the Journées Luxembourgeoises de Geodynamique in December 1980 to respond with a coordinated European proposal to a NASA Announcement of Opportunity inviting participation in the Crustal Dynamics and Earthquake Research Program. During the past 33 years, WEGENER has kept close contact with the agencies and institutions responsible for the development and maintenance of the global space geodetic networks, with the aim of making them aware of the scientific needs and outcomes of the project that might influence general science policy trends. WEGENER served as Inter-commission Project 3.2, between Commission 1 and Commission 3, of the International Association of Geodesy (IAG) until 2012. Since then, the WEGENER project has become Sub-commission 3.5 of IAG Commission 3, namely Tectonics and Earthquake Geodesy. In this presentation, we briefly review the accomplishments of WEGENER as originally conceived and outline and justify the new focus of the WEGENER consortium. The remarkable and rapid evolution of the present state of global geodetic monitoring in regard to the precision of positioning capabilities (and hence deformation) and global coverage, the development of InSAR for monitoring strain with unprecedented spatial resolution, and continuing and planned data from highly precise satellite gravity and altimetry missions, encourage us to shift principal attention from mainly monitoring capabilities by a combination of space and terrestrial geodetic techniques to applying existing observational methodologies to the critical geophysical phenomena that threaten our planet and society. Our new focus includes developing an improved physical basis to mitigate earthquake, tsunami, and volcanic risks, and the effects of natural and

  7. Search For Earthquake Precursors In The Data of Multidisciplinary Monitoring of Geophysical and Biological Parameters

    NASA Astrophysics Data System (ADS)

    Sidorin, A. Ya.

    Short-term variations in the set of geophysical and biological parameters that are monitored at the Garm research site over a long time are considered in relation to an earthquake with M=5.3. We used daily average data of electrical resistivity, electrotelluric field, electrochemical potential, and water conductivity, and hourly average data of the electrical activity of weakly electric fish. All the geoelectrical parameters monitored directly in the epicentral zone are found to change within two weeks before the earthquake. No changes were revealed at an epicentral distance of 16 km. This work was supported by the Russian Foundation for Basic Research, grant No. 01-05-65503.

  8. Change of permeability caused by 2011 Tohoku earthquake detected from pore pressure monitoring

    NASA Astrophysics Data System (ADS)

    Kinoshita, C.; Kano, Y.; Ito, H.

    2013-12-01

    Earthquake-induced groundwater changes, both pre- and co-seismic, have long been reported (e.g., Roeloffs, 1996). For example, water inflow into an observation tunnel at Rokko changed at the time of the 1995 Kobe earthquake (Fujimori et al., 1995), and groundwater levels fluctuated during the 1964 Alaska earthquake (M8.6) (Coble, 1967) and the 1999 Taiwan Chi-Chi earthquake (M7.6) (Chia et al., 2001). Shaking by seismic waves and crack formation by crustal deformation have been proposed as causes, but the mechanism is controversial. We have been monitoring pore pressure since 2005 to measure stress changes at the Kamioka mine, Gifu Prefecture, central Japan. Barometric pressure and strain are observed to correct the pore pressure data. In general, pore pressure changes are associated with meteorological effects, Earth tides and crustal deformation. Increases in pore pressure depend on precipitation that infiltrates the ground; snow effects are larger than those of ordinary rainfall because the observation site receives heavy snow in winter, and snowmelt infiltration raises pore pressure from March to April every year. When the 2011 Tohoku earthquake (M9.0) occurred, pore pressure decreased markedly because permeability increased with crustal deformation in the Kamioka region. We therefore estimated the hydraulic diffusivity before and after the earthquake from the pore pressure response to crustal deformation, analyzing three frequency bands separately. The first is the high-frequency band, i.e., the seismic response; the second is the response to Earth tides; the third is the barometric response, at lower frequencies than the other two bands. In the high-frequency band, we confirmed that deformation occurred under undrained conditions and estimated the bulk modulus from pore pressure and strain data. Next, the tidal response was extracted from the pore pressure, using three-month windows of pore pressure, barometric pressure and strain data. Time window
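
    A minimal sketch, not the authors' processing, of the kind of barometric correction implied above: regress the pore-pressure series against barometric pressure and subtract the fitted response. The series below are synthetic, and the single-coefficient model is a simplification.

        import numpy as np

        def remove_barometric_effect(pore_pressure, barometric):
            """Least-squares estimate of the barometric efficiency and the corrected
            pore-pressure series (both series demeaned first)."""
            p = pore_pressure - pore_pressure.mean()
            b = barometric - barometric.mean()
            efficiency = np.dot(b, p) / np.dot(b, b)    # slope of pore pressure against barometric pressure
            return efficiency, p - efficiency * b

        # Hypothetical hourly series: pore pressure responding to barometric loading plus noise
        rng = np.random.default_rng(0)
        baro = 10.0 * np.sin(np.linspace(0, 20, 2000)) + rng.normal(0, 1, 2000)
        pore = 0.4 * baro + rng.normal(0, 0.5, 2000)
        eff, corrected = remove_barometric_effect(pore, baro)
        print(round(eff, 2))   # approximately 0.4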

  9. Application for temperature and humidity monitoring of data center environment

    NASA Astrophysics Data System (ADS)

    Albert, Ş.; Truşcǎ, M. R. C.; Soran, M. L.

    2015-12-01

    Technology and computer science have developed rapidly in recent years. Most high-technology systems require special working conditions, so monitoring and control are very important. Temperature and humidity are important parameters in the operation of computing, industrial and research systems, and keeping them within certain ranges is essential for proper functioning. Usually, the temperature is maintained in the established range using an air conditioning system, but the humidity is affected. In the present work we developed an application based on a board with its own firmware, called "AVR_NET_IO", using an ATmega32 microcontroller, for temperature and humidity monitoring in the Data Center of INCDTIM. Temperature sensors connected to this board measure the temperature at different points inside and outside the Data Center. Humidity monitoring is performed using data from the integrated sensors of the air conditioning system, allowing humidity and temperature variations to be correlated. A software application (CM-1) was developed together with the hardware; it monitors and logs the temperature inside the Data Center and triggers an alarm when the temperature departs from the established limits by more than 3°C.
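
    A minimal sketch of the alarm rule described above (a reading outside the allowed band by more than 3°C). This is not the CM-1 application; the sensor name, limits and margin are illustrative.

        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class Threshold:
            low: float            # lower temperature limit, degrees C
            high: float           # upper temperature limit, degrees C
            margin: float = 3.0   # alarm only when a limit is exceeded by this much

        def check_reading(sensor_id: str, temp_c: float, limits: Threshold) -> Optional[str]:
            """Return an alarm message when a reading leaves the allowed band by more
            than the margin, otherwise None."""
            if temp_c > limits.high + limits.margin:
                return f"ALARM {sensor_id}: {temp_c:.1f} C is above limit {limits.high:.1f} C"
            if temp_c < limits.low - limits.margin:
                return f"ALARM {sensor_id}: {temp_c:.1f} C is below limit {limits.low:.1f} C"
            return None

        print(check_reading("rack-03", 29.5, Threshold(low=18.0, high=25.0)))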

  10. Postseismic Deformation after the 1964 Great Alaskan Earthquake: Collaborative Research with Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Freymueller, Jeffrey T.

    1999-01-01

    The purpose of this project was to carry out GPS observations on the Kenai Peninsula, southern Alaska, in order to study the postseismic and contemporary deformation following the 1964 Alaska earthquake. All of the research supported in this grant was carried out in collaboration with Dr. Steven Cohen of Goddard Space Flight Center. The research funding from this grant primarily supported GPS fieldwork, along with the acquisition of computer equipment to allow analysis and modeling of the GPS data. A minor amount of salary support was provided by the PI, but the great majority of the salary support was provided by the Geophysical Institute. After the expiration of this grant, additional funding was obtained from the National Science Foundation to continue the work. This grant supported GPS field campaigns in August 1995, June 1996, May-June and September 1997, and May-June 1998. We initially began the work by surveying leveling benchmarks on the Kenai peninsula that had been surveyed after the 1964 earthquake. Changes in height from the 1964 leveling data to the 1995+ GPS data, corrected for the geoid-ellipsoid separation, give the total elevation change since the earthquake. Beginning in 1995, we also identified or established sites that were suitable for long-term surveying using GPS. In the subsequent annual GPS campaigns, we made regular measurements at these GPS marks, and steadily enhanced our set of points for which cumulative postseismic uplift data were available. From 4 years of Global Positioning System (GPS) measurements, we find significant spatial variations in present-day deformation between the eastern and western Kenai peninsula, Alaska. Sites in the eastern Kenai peninsula and Prince William Sound move to the NNW relative to North America, in the direction of Pacific-North America relative plate motion. Velocities decrease in magnitude from nearly the full plate rate in southern Prince William Sound to about 30 mm/yr at Seward and to about 5 mm
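
    A minimal illustration of the height comparison described above: GPS yields ellipsoidal heights, leveling yields orthometric heights, and the geoid-ellipsoid separation N links the two (H is approximately h - N), so the elevation change since 1964 is the difference of orthometric heights. The numbers below are hypothetical, not project data.

        def elevation_change(h_gps_ellipsoidal, geoid_separation, H_leveling_1964):
            """Uplift (m) since the 1964 survey: convert the GPS ellipsoidal height to an
            orthometric height (H = h - N) and difference it against the 1964 leveling height."""
            H_gps = h_gps_ellipsoidal - geoid_separation
            return H_gps - H_leveling_1964

        # Hypothetical benchmark: 52.10 m ellipsoidal height, 8.90 m geoid separation,
        # 42.85 m orthometric height from the post-1964 leveling
        print(elevation_change(52.10, 8.90, 42.85))   # +0.35 m of uplift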

  11. Federal Radiological Monitoring and Assessment Center Overview of FRMAC Operations

    SciTech Connect

    1998-03-01

    In the event of a major radiological emergency, 17 federal agencies with various statutory responsibilities have agreed to coordinate their efforts at the emergency scene under the umbrella of the Federal Radiological Emergency Response Plan. This cooperative effort will ensure that all federal radiological assistance fully supports efforts to protect the public. The mandated federal cooperation ensures that each agency can obtain the data critical to its specific responsibilities. This overview of the Federal Radiological Monitoring and Assessment Center (FRMAC) describes the FRMAC response activities to a major radiological emergency. It also describes the federal assets and subsequent operational activities which provide federal radiological monitoring and assessment of the off-site areas.

  12. The academic health center in complex humanitarian emergencies: lessons learned from the 2010 Haiti earthquake.

    PubMed

    Babcock, Christine; Theodosis, Christian; Bills, Corey; Kim, Jimin; Kinet, Melodie; Turner, Madeleine; Millis, Michael; Olopade, Olufunmilayo; Olopade, Christopher

    2012-11-01

    On January 12, 2010, a 7.0-magnitude earthquake struck Haiti. The event disrupted infrastructure and was marked by extreme morbidity and mortality. The global response to the disaster was rapid and immense, comprising multiple actors, including academic health centers (AHCs), that provided assistance in the field and from home. The authors retrospectively examine the multidisciplinary approach that the University of Chicago Medicine (UCM) applied to postearthquake Haiti, which included the application of institutional structure and strategy, systematic deployment of teams tailored to evolving needs, and the actual response and recovery. The university mobilized significant human and material resources for deployment within 48 hours and sustained the effort for over four months. In partnership with international and local nongovernmental organizations as well as other AHCs, the UCM operated one of the largest and most efficient acute field hospitals in the country. The UCM's efforts in postearthquake Haiti provide insight into the role AHCs can play, including their strengths and limitations, in complex disasters. AHCs can provide necessary intellectual and material resources as well as technical expertise, but the cost and speed required for responding to an emergency, and ongoing domestic responsibilities, may limit the response of a large university and hospital system. The authors describe the strong institutional backing, the detailed predeployment planning and logistical support UCM provided, the engagement of faculty and staff who had previous experience in complex humanitarian emergencies, and the help of volunteers fluent in the local language which, together, made UCM's mission in postearthquake Haiti successful.

  13. Analogue models of subduction megathrust earthquakes: improving rheology and monitoring technique

    NASA Astrophysics Data System (ADS)

    Brizzi, Silvia; Corbi, Fabio; Funiciello, Francesca; Moroni, Monica

    2015-04-01

    duration and rupture width. Experimental monitoring has been performed by means of both PEP and PIV (i.e., Particle Image Velocimetry) algorithms. PEP differs from classic cross-correlation techniques (i.e., PIV) in its ability to provide sparse velocity vectors at points coincident with particle barycentre positions, allowing a Lagrangian description of the velocity field and a better spatial resolution (i.e., ≈0.03 mm²) with respect to PIV. Results show that the PEP algorithm is able to identify a greater number of analogue earthquakes (i.e., ≈20% more than the PIV algorithm), decreasing the minimum detectable magnitude from 6.6 to 4.5. Furthermore, earthquake source parameters (e.g., hypocentre position, rupture limits and slip distribution) are more accurately defined. The PEP algorithm is thus suitable for potentially gaining new insights into the seismogenic process of STF, by extending the analysable magnitude range of analogue earthquakes, with implications for the applicability of scaling relationships, such as the Gutenberg-Richter law, to experimental results.

  14. Federal Radiological Monitoring and Assessment Center Monitoring Manual Volume 2, Radiation Monitoring and Sampling

    SciTech Connect

    NSTec Aerial Measurement Systems

    2012-07-31

    The FRMAC Monitoring and Sampling Manual, Volume 2 provides standard operating procedures (SOPs) for field radiation monitoring and sample collection activities that are performed by the Monitoring group during a FRMAC response to a radiological emergency.

  15. Real-time earthquake monitoring for tsunami warning in the Indian Ocean and beyond

    NASA Astrophysics Data System (ADS)

    Hanka, W.; Saul, J.; Weber, B.; Becker, J.; Harjadi, P.; Fauzi; Gitews Seismology Group

    2010-12-01

    The Mw = 9.3 Sumatra earthquake of 26 December 2004 generated a tsunami that affected the entire Indian Ocean region and caused approximately 230,000 fatalities. In response to this tragedy the German government funded the German Indonesian Tsunami Early Warning System (GITEWS) Project. The task of the GEOFON group of GFZ Potsdam was to develop and implement the seismological component. In this paper we describe the concept of the GITEWS earthquake monitoring system and report on its present status. The major challenge for earthquake monitoring within a tsunami warning system is to deliver rapid information about the location, depth, size and possibly other source parameters. This is particularly true for coastlines adjacent to the potential source areas, such as the Sunda trench, where these parameters are required within a few minutes after the event in order to be able to warn the population before the potential tsunami hits the neighbouring coastal areas. Therefore, the key to a seismic monitoring system with warning times short enough for Indonesia is a dense real-time seismic network across Indonesia, with densifications close to the Sunda trench. A substantial number of supplementary stations in other Indian Ocean rim countries are added to strengthen the teleseismic monitoring capabilities. The installation of the new GITEWS seismic network - consisting of 31 combined broadband and strong-motion stations, 21 of them in Indonesia - is almost complete. The real-time data collection uses a private VSAT communication system with hubs in Jakarta and Vienna. In addition, all available seismic real-time data from the other seismic networks in Indonesia and other Indian Ocean rim countries are also acquired directly by VSAT or by Internet at the Indonesian Tsunami Warning Centre in Jakarta, and the resulting "virtual" network of more than 230 stations can jointly be used for seismic data processing. The seismological processing software as part

  16. Microplate boundaries as obstacles to pre-earthquake strain transfer in Western Turkey: Inferences from continuous geochemical monitoring

    NASA Astrophysics Data System (ADS)

    İnan, Sedat; Pabuçcu, Zümer; Kulak, Furkan; Ergintav, Semih; Tatar, Orhan; Altunel, Erhan; Akyüz, Serdar; Tan, Onur; Seyis, Cemil; Çakmak, Rahşan; Saatçılar, Ruhi; Eyidoğan, Haluk

    2012-04-01

    Warm and hot spring water chemistry changes, as well as soil gas radon release patterns, have been monitored in Western Turkey alongside regional seismicity, providing a multi-disciplinary approach. From January 2009 to May 2011, 33 earthquakes with ML between 4.0 and 6.0 occurred in this seismically very active region; the ML 6.0 earthquake occurred on 19 May 2011 in the town of Simav, Kütahya Province, at a location midway between the dense multidisciplinary monitoring networks of the Marmara Region (MR) and the Aegean Extensional Province (AEP). We previously reported noteworthy precursory anomalies prior to several earthquakes (ML ⩾ 4) in the MR and AEP, but no precursory anomaly was detected prior to the ML 6.0 Simav event of 19 May 2011. Although these networks operate within the theoretical strain radii of this earthquake (Dobrovolsky et al., 1979), no reliable anomalies were found. Geodetic studies based on GPS data have identified crustal blocks in this region. The epicentral area of the Simav event is located within a block tectonically separated from the AEP and MR. Thus, we speculate that pre-earthquake strain accumulation within the Simav block did not effectively transfer to the adjacent blocks where the MR and AEP networks are located, thereby providing an explanation for the absence of detectable anomalies. Moreover, prior to some earthquakes, quadrant features of geochemical transients have been found, suggesting that soil radon anomalies appear in the compressional quadrant(s) of the pre-earthquake strain distribution.
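
    The "theoretical strain radius" invoked above is commonly computed from Dobrovolsky et al. (1979) as R = 10^(0.43*M) km; a one-line check, under that assumption, for the ML 6.0 Simav event:

        def dobrovolsky_radius_km(magnitude):
            """Earthquake preparation-zone radius of Dobrovolsky et al. (1979): R = 10**(0.43*M) km."""
            return 10 ** (0.43 * magnitude)

        print(round(dobrovolsky_radius_km(6.0)))   # ~380 km, consistent with both networks lying inside it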

  17. The effects of educational program on health volunteers’ knowledge regarding their approach to earthquake in health centers in Tehran

    PubMed Central

    JOUHARI, ZAHRA; PIRASTEH, AFSHAR; GHASSEMI, GHOLAM REZA; BAZRAFKAN, LEILA

    2015-01-01

    Introduction: People's mental, intellectual and physical unreadiness to confront an earthquake may result in disastrous outcomes. This research aimed to study the effects of a training intervention on health connectors' knowledge regarding their approach to earthquakes in health-training centers in eastern Tehran. Methods: This semi-experimental study was designed and executed in 2011, using a questionnaire with items based on information from the Crisis Management Organization. After a pilot study to establish the validity and reliability of the questionnaire, we determined the sample size. The questionnaires were then completed before and after the training program by 82 health connectors at health-treatment centers in eastern Tehran. Finally, the collected data were analyzed with SPSS 14, using the paired-sample t-test and Pearson's correlation coefficient. Results: The health connectors were women with a mean age of 43.43±8.51 years. The mean score of the connectors' knowledge before and after the training was 35.15±4.3 and 43.73±2.91 out of 48, respectively. The difference was statistically significant (p=0.001). Classes were the most important source of information for the health connectors. Conclusion: People's knowledge of how to confront an earthquake can be increased by holding training courses and workshops. Such training courses and workshops have an important role in data transfer and the readiness of health connectors. PMID:25927068

  18. Biosecurity and Health Monitoring at the Zebrafish International Resource Center.

    PubMed

    Murray, Katrina N; Varga, Zoltán M; Kent, Michael L

    2016-07-01

    The Zebrafish International Resource Center (ZIRC) is a repository and distribution center for mutant, transgenic, and wild-type zebrafish. In recent years annual imports of new zebrafish lines to ZIRC have increased tremendously. In addition, after 15 years of research, we have identified some of the most virulent pathogens affecting zebrafish that should be avoided in large production facilities, such as ZIRC. Therefore, while importing a high volume of new lines we prioritize safeguarding the health of our in-house fish colony. Here, we describe the biosecurity and health-monitoring program implemented at ZIRC. This strategy was designed to prevent introduction of new zebrafish pathogens, minimize pathogens already present in the facility, and ensure a healthy zebrafish colony for in-house uses and shipment to customers.

  19. Self-Powered WSN for Distributed Data Center Monitoring

    PubMed Central

    Brunelli, Davide; Passerone, Roberto; Rizzon, Luca; Rossi, Maurizio; Sartori, Davide

    2016-01-01

    Monitoring environmental parameters in data centers is attracting increasing attention from industry, owing to the need for high energy efficiency in cloud services. We present the design and characterization of an energy-neutral embedded wireless system, prototyped to perpetually monitor environmental parameters in servers and racks. It is powered by an energy harvesting module based on thermoelectric generators, which converts the heat dissipated by the servers. Starting from the empirical characterization of the energy harvester, we present a power conditioning circuit optimized for the specific application. The whole system has been enhanced with several sensors. An ultra-low-power microcontroller stacked on the energy harvesting module provides efficient power management. Performance has been assessed and compared with the analytical model for validation. PMID:26729135

  2. Towards real-time regional earthquake simulation I: real-time moment tensor monitoring (RMT) for regional events in Taiwan

    NASA Astrophysics Data System (ADS)

    Lee, Shiann-Jong; Liang, Wen-Tzong; Cheng, Hui-Wen; Tu, Feng-Shan; Ma, Kuo-Fong; Tsuruoka, Hiroshi; Kawakatsu, Hitoshi; Huang, Bor-Shouh; Liu, Chun-Chi

    2014-01-01

    We have developed a real-time moment tensor monitoring system (RMT) which takes advantage of a grid-based moment tensor inversion technique and real-time broad-band seismic recordings to automatically monitor earthquake activity in the vicinity of Taiwan. The centroid moment tensor (CMT) inversion technique and a grid search scheme are applied to obtain the earthquake source parameters, including the event origin time, hypocentral location, moment magnitude and focal mechanism. All of these source parameters can be determined simultaneously within 117 s after the occurrence of an earthquake. The monitoring area covers the entire island of Taiwan and the offshore region, from 119.3°E to 123.0°E and 21.0°N to 26.0°N, with depths from 6 to 136 km. A 3-D grid system is implemented in the monitoring area with a uniform horizontal interval of 0.1° and a vertical interval of 10 km. The inversion procedure is based on a 1-D Green's function database calculated by the frequency-wavenumber (fk) method. We compare our results with the Central Weather Bureau (CWB) catalogue data for earthquakes that occurred between 2010 and 2012. The average differences in event origin time and hypocentral location are less than 2 s and 10 km, respectively. The focal mechanisms determined by RMT are also comparable with the Broadband Array in Taiwan for Seismology (BATS) CMT solutions. These results indicate that the RMT system is practical and efficient for monitoring local seismic activity. In addition, the time needed to obtain all the point-source parameters is reduced substantially compared to routine earthquake reports. By connecting RMT with a real-time online earthquake simulation (ROS) system, all the source parameters can be forwarded to the ROS to make real-time earthquake simulation feasible. The RMT has operated offline (2010-2011) and online (from January 2012 to the present) at the Institute of Earth Sciences (IES), Academia Sinica
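
    A minimal sketch, not the RMT code, of the grid-search idea described above: at each candidate grid node, precomputed Green's functions are fit to the observed waveforms by least squares, and the node with the smallest misfit is taken as the source. The arrays, node names and Green's function store below are hypothetical.

        import numpy as np

        def grid_search_source(observed, greens_by_node):
            """observed: (n_channels, n_samples) waveform array.
            greens_by_node: dict mapping node id -> (n_channels, n_samples, 6) elementary
            moment-tensor Green's functions. Returns (best node, moment-tensor vector, misfit)."""
            best = None
            d = observed.reshape(-1)
            for node, G in greens_by_node.items():
                A = G.reshape(-1, 6)                          # design matrix: 6 moment-tensor components
                m, *_ = np.linalg.lstsq(A, d, rcond=None)     # least-squares moment tensor at this node
                misfit = np.linalg.norm(d - A @ m) / np.linalg.norm(d)
                if best is None or misfit < best[2]:
                    best = (node, m, misfit)
            return best

        # Tiny synthetic test: two nodes, the data generated from node "B"
        rng = np.random.default_rng(1)
        greens = {k: rng.normal(size=(3, 200, 6)) for k in ("A", "B")}
        true_m = np.array([1.0, -0.5, -0.5, 0.2, 0.0, 0.3])
        obs = (greens["B"].reshape(-1, 6) @ true_m).reshape(3, 200)
        node, m_est, mis = grid_search_source(obs, greens)
        print(node, round(mis, 3))   # expected "B" with near-zero misfit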

  3. Long-term blood pressure changes induced by the 2009 L'Aquila earthquake: assessment by 24 h ambulatory monitoring.

    PubMed

    Giorgini, Paolo; Striuli, Rinaldo; Petrarca, Marco; Petrazzi, Luisa; Pasqualetti, Paolo; Properzi, Giuliana; Desideri, Giovambattista; Omboni, Stefano; Parati, Gianfranco; Ferri, Claudio

    2013-09-01

    An increased rate of cardiovascular and cerebrovascular events has been described during and immediately after earthquakes. In this regard, few data are available on long-term blood pressure control in hypertensive outpatients after an earthquake. We evaluated the long-term effects of the April 2009 L'Aquila earthquake on blood pressure levels, as detected by 24 h ambulatory blood pressure monitoring. Before/after (mean±s.d. 6.9±4.5/14.2±5.1 months, respectively) the earthquake, the available 24 h ambulatory blood pressure monitoring data for the same patients were extracted from our database. Quake-related daily life discomforts were evaluated through interviews. We enrolled 47 patients (25 female, age 52±14 years), divided into three groups according to antihypertensive therapy changes after versus before the earthquake: unchanged therapy (n=24), increased therapy (n=17) and reduced therapy (n=6). Compared with before the quake, in the unchanged therapy group marked increases in 24 h (P=0.004), daytime (P=0.01) and nighttime (P=0.02) systolic blood pressure were observed after the quake. Corresponding changes in 24 h (P=0.005), daytime (P=0.01) and nighttime (P=0.009) diastolic blood pressure were observed. Daily life discomforts were reported more frequently in the unchanged therapy and increased therapy groups than the reduced therapy group (P=0.025 and P=0.018, respectively). In conclusion, this study shows that patients with unchanged therapy display marked blood pressure increments up to more than 1 year after an earthquake, as well as long-term quake-related discomfort. Our data suggest that particular attention to blood pressure levels and adequate therapy modifications should be considered after an earthquake, not only early after the event but also months later.

  4. Advanced earthquake monitoring system for U.S. Department of Veterans Affairs medical buildings--instrumentation

    USGS Publications Warehouse

    Kalkan, Erol; Banga, Krishna; Ulusoy, Hasan S.; Fletcher, Jon Peter B.; Leith, William S.; Reza, Shahneam; Cheng, Timothy

    2012-01-01

    In collaboration with the U.S. Department of Veterans Affairs (VA), the National Strong Motion Project (NSMP; http://nsmp.wr.usgs.gov/) of the U.S. Geological Survey has been installing sophisticated seismic systems that will monitor the structural integrity of 28 VA hospital buildings located in seismically active regions of the conterminous United States, Alaska, and Puerto Rico during earthquake shaking. These advanced monitoring systems, which combine the use of sensitive accelerometers and real-time computer calculations, are designed to determine the structural health of each hospital building rapidly after an event, helping the VA to ensure the safety of patients and staff. This report presents the instrumentation component of this project by providing details of each hospital building, including a summary of its structural, geotechnical, and seismic hazard information, as well as instrumentation objectives and design. The structural-health monitoring component of the project, including data retrieval and processing, damage detection and localization, automated alerting system, and finally data dissemination, will be presented in a separate report.

  5. Monitoring of the stress state variations of the Southern California for the purpose of earthquake prediction

    NASA Astrophysics Data System (ADS)

    Gokhberg, M.; Garagash, I.; Bondur, V.; Steblov, G. M.

    2014-12-01

    A three-dimensional geomechanical model of Southern California was developed, including mountain relief, fault tectonics, and characteristic internal features such as the roof of the consolidated crust and the Moho surface. The initial stress state of the model is governed by gravitational forces and by horizontal tectonic motions estimated from GPS observations. The analysis shows that the three-dimensional geomechanical model allows monitoring of changes in the stress state during the seismic process, in order to constrain where seismic activity is likely to increase. This investigation demonstrates one possible approach to monitoring upcoming seismicity over periods of days to weeks to months. Continuous analysis of the stress state was carried out during 2009-2014. Each new earthquake with M~1 and above from the USGS catalog was treated as a new defect of the Earth's crust, with a definite size, that causes redistribution of the stress state. The overall calculation technique was based on a single damage function of the Earth's crust, recalculated every half month. As a result, every half month we identified, in the upper crustal layers and partially in the middle layers, the locations of the maximum values of the stress-state parameters: elastic energy density, shear stress, and the proximity of the crustal layers to their strength limit. All these parameters exhibit similar spatial and temporal distributions. As the observations show, all four of the strongest events (M ~ 5.5-7.2) that occurred in Southern California during the analyzed period were preceded by anomalies of these parameters, with lead times of weeks to months, within 10-50 km of the upcoming earthquake. After each event the stress-state anomaly disappeared. The figure shows the migration of the maxima of the gradients of the stress-state variations (parameter D) in the vicinity of the epicenter of the 4 April 2010 M=7.2 earthquake during the period 1 January 2010 to 1 May 2010. Grey lines

  6. Seismic ACROSS Transmitter Installed at Morimachi above the Subducting Philippine Sea Plate for the Test Monitoring of the Seismogenic Zone of Tokai Earthquake not yet to Occur

    NASA Astrophysics Data System (ADS)

    Kunitomo, T.; Kumazawa, M.; Masuda, T.; Morita, N.; Torii, T.; Ishikawa, Y.; Yoshikawa, S.; Katsumata, A.; Yoshida, Y.

    2008-12-01

    Here we report on the first seismic monitoring system in active and continuous operation for measuring wave-propagation characteristics in the tectonic region just above the subducting plate that will drive coming catastrophic earthquakes. Development of such a system (ACROSS, an acronym for Accurately Controlled, Routinely Operated Signal System) started in 1994 at Nagoya University and, since 1996, also at the Tono Geoscience Center (TGC) of JAEA, spurred by the Hyogoken Nanbu earthquake (17 January 1995, Mj=7.3). ACROSS is a technology system, including the theory of signal and data processing, based on a new concept of measurement methodology for the Green's function between a signal source and an observation site. The work done on the first-generation system is reported at IWAM04 and in a JAEA report (Kumazawa et al., 2007). The Meteorological Research Institute of JMA started a project of test monitoring of the Tokai area in 2004, in cooperation with Shizuoka University, to bring the seismic ACROSS into practical use for earthquake prediction research. The first target is the anticipated Tokai earthquake, which has not yet occurred. The seismic ACROSS transmitter was designed to be appropriate for sensitive monitoring of the deep active fault zone, on the basis of the technology elements accumulated so far. The ground coupler (antenna) is a large steel-reinforced concrete block (over 20 m³) installed in the basement rock in order to preserve stability. The eccentric moment of the rotary transmitter is 82 kg·m at maximum, 10 times larger than that of the first generation. The carrier frequency of the FM signal for practical use can range from 3.5 to 15 Hz, and the signal phase is accurately controlled by a motor with a vector inverter synchronized with a GPS clock, to a precision of 10^-4 radian or better. Referring to the existing structure model of this area (Iidaka et al., 2003), the site of the transmitting station was chosen at Morimachi so as to be appropriate for detecting the

  7. Earthquake prediction

    NASA Technical Reports Server (NTRS)

    Turcotte, Donald L.

    1991-01-01

    The state of the art in earthquake prediction is discussed. Short-term prediction based on seismic precursors, changes in the ratio of compressional velocity to shear velocity, tilt and strain precursors, electromagnetic precursors, hydrologic phenomena, chemical monitors, and animal behavior is examined. Seismic hazard assessment is addressed, and the applications of dynamical systems to earthquake prediction are discussed.

  8. The Evolution of the Federal Monitoring and Assessment Center

    SciTech Connect

    NSTec Aerial Measurement System

    2012-07-31

    The Federal Radiological Monitoring and Assessment Center (FRMAC) is a federal emergency response asset whose assistance may be requested by the Department of Homeland Security (DHS), the Department of Defense (DoD), the Environmental Protection Agency (EPA), the Nuclear Regulatory Commission (NRC), and state and local agencies to respond to a nuclear or radiological incident. It is an interagency organization with representation from the Department of Energy's National Nuclear Security Administration (DOE/NNSA), the Department of Defense (DoD), the Environmental Protection Agency (EPA), the Department of Health and Human Services (HHS), the Federal Bureau of Investigation (FBI), and other federal agencies. FRMAC, in its present form, was created in 1987 when the radiological support mission was assigned to the DOE's Nevada Operations Office by DOE Headquarters. The FRMAC asset, including its predecessor entities, was created, grew, and evolved to function as a response to radiological incidents. Radiological emergency response exercises showed the need for a coordinated approach to managing federal emergency monitoring and assessment activities. The mission of FRMAC is to coordinate and manage all federal radiological environmental monitoring and assessment activities during a nuclear or radiological incident within the United States in support of state, local, and tribal governments, DHS, and the federal coordinating agency. Radiological emergency response professionals with the DOE's national laboratories support the Radiological Assistance Program (RAP), the National Atmospheric Release Advisory Center (NARAC), the Aerial Measuring System (AMS), and the Radiation Emergency Assistance Center/Training Site (REAC/TS). These teams support the FRMAC to provide: atmospheric transport modeling; radiation monitoring; radiological analysis and data assessments; and medical advice for radiation injuries. In support of field operations, the FRMAC provides geographic

  9. Emergency radiological monitoring and analysis: Federal Radiological Monitoring and Assessment Center

    SciTech Connect

    Thome, D.J.

    1995-10-01

    The US Federal Radiological Emergency Response Plan (FRERP) provides the framework for integrating the various Federal agencies responding to a major radiological emergency. The FRERP authorizes the creation of the Federal Radiological Monitoring and Assessment Center (FRMAC), which is established to coordinate all Federal agencies involved in the monitoring and assessment of the off-site radiological conditions in support of the impacted State(s) and the Lead Federal Agency (LFA). Within the FRMAC, the Monitoring and Analysis Division (M&A) is responsible for coordinating all FRMAC assets involved in conducting a comprehensive program of environmental monitoring, sampling, radioanalysis, and quality assurance. To assure consistency, completeness, and the quality of the data produced, a methodology and procedures manual is being developed. This paper discusses the structure, assets, and operations of the FRMAC M&A and the content and preparation of the manual.

  10. A new era for low frequency Galactic center transient monitoring

    NASA Astrophysics Data System (ADS)

    Kassim, N. E.; Hyman, S. D.; Intema, H.; Lazio, T. J. W.

    2014-05-01

    An upgrade of the low frequency observing system of the VLA developed by NRL and NRAO, called low band (LB), will open a new era of Galactic center (GC) transient monitoring. Our previous searches using the VLA and GMRT have revealed a modest number of radio-selected transients, but have been severely limited by sensitivity and observing time. The new LB system, currently accessing the 236-492 MHz frequency range, promises ≥5× improved sensitivity over the legacy VLA system. The new system is emerging from commissioning in time to catch any enhanced sub-GHz emission from the G2 cloud event, and we review existing limits based on recent observations. We also describe a proposed 24/7 commensal system, called the LOw Band Observatory (LOBO). LOBO offers over 100 VLA GC monitoring hours per year, possibly revealing new transients and helping validate ASTRO2010's anticipation of a new era of transient radio astronomy. A funded LOBO pathfinder called the VLA Low Frequency Ionosphere and Transient Experiment (VLITE) is under development. Finally, we consider the impact of LB and LOBO on our GC monitoring program.

  11. Emergency radiological monitoring and analysis United States Federal Radiological Monitoring and Assessment Center

    SciTech Connect

    Thome, D.J.

    1994-09-01

    The United States Federal Radiological Emergency Response Plan (FRERP) provides the framework for integrating the various Federal agencies responding to a major radiological emergency. Following a major radiological incident the FRERP authorizes the creation of the Federal Radiological Monitoring and Assessment Center (FRMAC). The FRMAC is established to coordinate all Federal agencies involved in the monitoring and assessment of the off-site radiological conditions in support of the impacted states and the Lead Federal Agency (LFA). Within the FRMAC, the Monitoring and Analysis Division is responsible for coordinating all FRMAC assets involved in conducting a comprehensive program of environmental monitoring, sampling, radioanalysis and quality assurance. This program includes: (1) Aerial Radiological Monitoring - Fixed Wing and Helicopter, (2) Field Monitoring and Sampling, (3) Radioanalysis - Mobile and Fixed Laboratories, (4) Radiation Detection Instrumentation - Calibration and Maintenance, (5) Environmental Dosimetry, and (6) An integrated program of Quality Assurance. To assure consistency, completeness and the quality of the data produced, a methodology and procedures handbook is being developed. This paper discusses the structure, assets and operations of FRMAC monitoring and analysis and the content and preparation of this handbook.

  12. MUG-OBS - Multiparameter Geophysical Ocean Bottom System : a new instrumental approach to monitor earthquakes.

    NASA Astrophysics Data System (ADS)

    hello, yann; Charvis, Philippe; Yegikyan, Manuk; verfaillie, Romain; Rivet, Diane

    2016-04-01

    Real-time monitoring of seismic activity is a major issue for early warning of earthquakes and tsunamis. It can be done using regional-scale wired nodes, such as NEPTUNE in Canada and in the U.S., or DONET in Japan. Another approach to monitoring seismic activity at sea is to repeatedly deploy OBS arrays, as in the amphibious Cascadia Initiative (four 1-year deployments), the Japanese Pacific Array (broadband OBSs, an "ocean-bottom broadband dispersion survey", with 2-year autonomy), the Obsismer program in the French Lesser Antilles (eight 6-month deployments) and the Osisec program in Ecuador (four 6-month deployments). These autonomous OBSs are self-recovered or recovered using an ROV. Such systems are costly, including ship time, and the OBSs must be recovered before work on the data can begin. As a recent alternative, we developed an ocean bottom system with 3-4 years of autonomy and 9 channels, allowing the acquisition of different seismic and environmental parameters. MUG-OBS is a free-falling instrument rated down to 6000 m. The installation of the sensors is monitored by acoustic commands from the surface, and a health bulletin with data checks is retrieved acoustically during installation. The major innovation is that the data can be recovered on demand at any time (routinely every 6 months, or after a crisis) using one of the six data shuttles, released by acoustic command from the surface, during a one-day cruise on a fast boat of opportunity. Since the sensors stay at the same location for 3 years, it is a perfect tool to monitor large seismic events, background seismic activity and aftershock distributions. Clock drift measurement and GPS localization are automatic when a shuttle reaches the surface. For remote areas, shuttles are released automatically and a seismic event bulletin is transmitted. Selected data can be recovered by two-way Iridium satellite communication. After a period of 3 years the main station is self-recovered by

  13. The IPOC Creepmeter Array in N-Chile: Monitoring Slip Accumulation Triggered By Local or Remote Earthquakes

    NASA Astrophysics Data System (ADS)

    Victor, P.; Schurr, B.; Oncken, O.; Sobiesiak, M.; Gonzalez, G.

    2014-12-01

    The Atacama Fault System (AFS) is an active trench-parallel fault system located above the down-dip end of coupling of the north Chilean subduction zone. About three M=7 earthquakes in the past 10 ky have been documented in the paleoseismological record, demonstrating the potential for large events in the future. To investigate the current surface creep rate and to deduce the mode of strain accumulation, we deployed an array of 11 creepmeters along four branches of the AFS. This array monitors the interaction of earthquake activity on the subduction zone with a trench-parallel fault in the overriding forearc. The displacement across the fault is monitored continuously at 2 samples/min with a resolution of 1 μm. Collocated seismometers record the seismicity at two of the creepmeters, whereas control of the regional seismicity is provided by the IPOC seismological networks. Continuous time series of the creepmeter stations since 2009 show that the shallow segments of the fault do not creep permanently. Instead, the accumulation of permanent deformation occurs by triggered slip, recorded as well-defined steps caused by local or remote earthquakes. The 2014 Mw=8.2 Pisagua earthquake, located close to the creepmeter array, triggered large displacement events at all stations. Another event recorded at all stations was the 2010 Mw=8.8 Maule earthquake, located 1500 km south of the array. All stations showed a triggered displacement event 6-8 min after the origin time of the main shock, at the same time as the arrival of the surface waves recorded at nearby IPOC stations. This points to a dynamic triggering process caused by transient stresses during passage of the surface waves. Investigation of seismic events with magnitudes <6 shows displacement events triggered during P- and S-wave passage, pointing to static as well as dynamic stress changes for proximal events. Analyzing the causative earthquakes, we find that the most effective way to trigger displacement events on the AFS are
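
    The triggered slip described above appears in the creepmeter records as discrete displacement steps superimposed on slow background creep. A minimal sketch of flagging such steps in a 2 samples/min series follows; the detect_steps helper, its window length and threshold, and the synthetic record are illustrative assumptions, not the IPOC project's actual processing.

    import numpy as np

    def detect_steps(displacement_um, dt_s=30.0, window_s=300.0, threshold_um=5.0):
        """Return sample indices where displacement changes by more than
        threshold_um micrometers within a window_s-second window."""
        n = int(window_s / dt_s)
        change = displacement_um[n:] - displacement_um[:-n]
        return np.flatnonzero(np.abs(change) > threshold_um) + n

    # Synthetic one-day record: slow background creep plus a 20 micrometer step.
    t = np.arange(0, 86400, 30.0)                    # 2 samples/min
    d = 0.0001 * t + np.random.randn(t.size) * 0.5   # displacement in micrometers
    d[t > 40000] += 20.0                             # triggered slip event
    print(detect_steps(d)[:3] * 30.0, "seconds into the day")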

  14. Products and Services Available from the Southern California Earthquake Data Center (SCEDC) and the Southern California Seismic Network (SCSN)

    NASA Astrophysics Data System (ADS)

    Yu, E.; Chen, S.; Chowdhury, F.; Bhaskaran, A.; Hutton, K.; Given, D.; Hauksson, E.; Clayton, R. W.

    2009-12-01

    The SCEDC archives continuous and triggered data from nearly 3000 data channels from 375 SCSN recorded stations. The SCSN and SCEDC process and archive an average of 12,000 earthquakes each year, contributing to the southern California earthquake catalog that spans from 1932 to present. The SCEDC provides public, searchable access to these earthquake parametric and waveform data through its website www.data.scec.org and through client applications such as STP, NETDC and DHI. New data products: ● The SCEDC is distributing synthetic waveform data from the 2008 ShakeOut scenario (Jones et al., USGS Open File Rep. 2008-1150; Graves et al., 2008, Geophys. Res. Lett.). This is an M 7.8 earthquake on the southern San Andreas fault. Users will be able to download 40 sps velocity waveforms in SAC format from the SCEDC website. The SCEDC is also distributing synthetic GPS data (Crowell et al., 2009, Seismol. Res. Lett.) for this scenario as well. ● The SCEDC has added a new web page to show the latest tomographic model of southern California. This model is based on Tape et al. (2009, Science). New data services: ● The SCEDC is exporting data in QuakeML format, an XML format that has been adopted by the Advanced National Seismic System (ANSS). These data will also be available as a web service. ● The SCEDC is exporting data in StationXML format, an XML format created by the SCEDC and adopted by ANSS to fully describe station metadata. These data will also be available as a web service. ● The STP 1.6 client can now access both the SCEDC and the Northern California Earthquake Data Center (NCEDC) earthquake and waveform archives. In progress - SCEDC to distribute 1 sps GPS data in miniSEED format: ● As part of a NASA Advanced Information Systems Technology project in collaboration with the Jet Propulsion Laboratory and the Scripps Institution of Oceanography, the SCEDC will receive real-time 1 sps streams of GPS displacement solutions from the California
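
    One way to consume a QuakeML web service like the one described above is through an FDSN-style client. The sketch below uses ObsPy's FDSN client with the "SCEDC" data-center key; treat the key, the date window, and the availability of the service as assumptions rather than guarantees from this abstract.

    from obspy import UTCDateTime
    from obspy.clients.fdsn import Client

    client = Client("SCEDC")                      # assumed FDSN data-center key
    catalog = client.get_events(                  # event parameters served as QuakeML
        starttime=UTCDateTime("2009-01-01"),
        endtime=UTCDateTime("2009-02-01"),
        minmagnitude=3.0,
    )
    for event in catalog:
        origin = event.preferred_origin() or event.origins[0]
        magnitude = event.preferred_magnitude() or event.magnitudes[0]
        print(origin.time, origin.latitude, origin.longitude, magnitude.mag)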

  15. GNSS-monitoring of Natural Hazards: Ionospheric Detection of Earthquakes and Volcano Eruptions

    NASA Astrophysics Data System (ADS)

    Shults, K.; Astafyeva, E.; Lognonne, P. H.

    2015-12-01

    During the last few decades, earthquakes have been reported by many researchers as sources of strong perturbations in the ionosphere, and in the last few years seismo-ionosphere coupling has been discussed more and more (e.g., Calais and Minster, 1998, Phys. Earth Planet. Inter., 105, 167-181; Afraimovich et al., 2010, Earth, Planets, Space, V.62, No.11, 899-904; Rolland et al., 2011, Earth Planets Space, 63, 853-857). Co-volcanic ionospheric perturbations have come under scientific scrutiny only in recent years, but observations have already shown that the mass and energy injections of volcanic activity can also excite oscillations in the ionosphere (Heki, 2006, Geophys. Res. Lett., 33, L14303; Dautermann et al., 2009, J. Geophys. Res., 114, B02202). The ionospheric perturbations are induced by acoustic and gravity waves generated in the neutral atmosphere by the seismic source or volcanic eruption. The upward-propagating vibrations of the atmosphere interact with the plasma in the ionosphere through particle collisions and excite variations of electron density detectable with dual-frequency receivers of the Global Navigation Satellite System (GNSS). In addition to observations of co-seismic ionospheric disturbances (CID), ionospheric GNSS measurements have recently proved useful for obtaining ionospheric images of the seismic fault, providing information on its parameters and localization (Astafyeva et al., 2011, Geophys. Res. Lett., 38, L22104). This work describes how GNSS signals can be used for monitoring of natural hazards, using the examples of the 9 March 2011 M7.3 Tohoku foreshock and the April 2015 M7.8 Nepal earthquake, as well as the April 2015 Calbuco volcano eruptions. We also show that the use of high-resolution GNSS data can aid in plotting ionospheric images of the seismic fault.
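
    The dual-frequency detection principle described above rests on the geometry-free (L1 - L2) carrier-phase combination, which isolates the ionospheric contribution. The sketch below converts that combination to relative slant TEC; the GPS L1/L2 constants are standard, but the function and variable names are illustrative, and only TEC variations are meaningful because carrier-phase ambiguities are left unresolved.

    import numpy as np

    F1, F2 = 1575.42e6, 1227.60e6             # GPS L1 and L2 frequencies (Hz)
    K = 40.3                                  # ionospheric refraction constant (SI units)

    def relative_stec(l1_m, l2_m):
        """l1_m, l2_m: carrier-phase observables converted to meters.
        Returns slant TEC variation in TEC units (1 TECU = 1e16 el/m^2)."""
        geometry_free = np.asarray(l1_m) - np.asarray(l2_m)
        stec = geometry_free * (F1**2 * F2**2) / (K * (F1**2 - F2**2))
        return (stec - stec[0]) / 1e16        # remove the (ambiguous) constant offset

    # For GPS L1/L2 this works out to roughly 9.5 TECU per meter of (L1 - L2).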

  16. Statistical monitoring of aftershock sequences: a case study of the 2015 Mw7.8 Gorkha, Nepal, earthquake

    NASA Astrophysics Data System (ADS)

    Ogata, Yosihiko; Tsuruoka, Hiroshi

    2016-03-01

    Early forecasting of aftershocks has become realistic and practical because of real-time detection of hypocenters. This study illustrates a statistical procedure for monitoring aftershock sequences to detect anomalies, in order to increase the probability gain of forecasting a significantly large aftershock or even an earthquake larger than the main shock. In particular, a significant lowering (relative quiescence) of aftershock activity below the level predicted by the Omori-Utsu formula or the epidemic-type aftershock sequence (ETAS) model is sometimes followed by a large earthquake in a neighboring region. As an example, we detected a significant lowering relative to the modeled rate beginning approximately 1.7 days after the main shock in the aftershock sequence of the Mw7.8 Gorkha, Nepal, earthquake of April 25, 2015. The relative quiescence lasted until the May 12, 2015, M7.3 Kodari earthquake that occurred at the eastern end of the primary aftershock zone. Space-time plots, including those in transformed time, can indicate the local places where aftershock activity is lowered (the seismicity shadow). Thus, the relative quiescence can be hypothesized to be related to stress shadowing caused by probable slow slips. In addition, the aftershock productivity of the M7.3 Kodari earthquake is approximately twice as large as that of the M7.8 main shock.
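
    The comparison against the Omori-Utsu prediction can be summarized with the rate n(t) = K / (t + c)^p and its time integral. The sketch below contrasts an observed cumulative aftershock count with the count expected from that formula; the parameter values and the stand-in catalog are placeholders, not fitted Gorkha values.

    import numpy as np

    def omori_utsu_rate(t, K, c, p):
        """Aftershock rate (events per day) at time t (days) after the main shock."""
        return K / (t + c) ** p

    def expected_count(t, K, c, p):
        """Analytic integral of the Omori-Utsu rate from 0 to t (valid for p != 1)."""
        return K * (c ** (1 - p) - (t + c) ** (1 - p)) / (p - 1)

    K, c, p = 120.0, 0.05, 1.1                                # illustrative parameters
    observed_times = np.sort(np.random.uniform(0, 10, 500))   # stand-in catalog (days)

    t_grid = np.linspace(0.01, 10, 200)
    predicted = expected_count(t_grid, K, c, p)
    observed = np.searchsorted(observed_times, t_grid)

    # Relative quiescence would appear as observed counts falling persistently
    # below the predicted curve (about 1.7 days after the main shock in this study).
    print("maximum count deficit:", (predicted - observed).max())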

  17. Instability prediction by monitoring center of pressure during standing.

    PubMed

    Tortolero, Xavier; Masani, Kei; Thrasher, Timothy A; Popovic, Milos R

    2006-01-01

    Incorporating an instability predictor into a portable sensor has a number of clinically relevant applications. This study investigated the feasibility of developing a real-time assessment tool to predict stepping during standing by monitoring Center of Pressure (COP) measurements. Forward and backward perturbations were applied to 16 able-bodied subjects using a pulley system attached to the subjects' waist. A linear relationship was found between the peak COP velocity (COPv) and the peak COP position caused by the perturbations. As the peak COPv occurs considerably before the peak COP, the peak COP estimated from the peak COPv using a regression equation may serve as an instability predictor. By comparing stepping thresholds with the estimated peak COP, we found that the stepping predictor successfully predicted instability (stepping) earlier than predictors using the actual COP. The results show that the proposed model is a viable solution for predicting stepping, and that it is feasible to incorporate the model into a neuroprosthesis system for standing. PMID:17947140
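
    The predictor described above reduces to a linear regression of peak COP excursion on peak COP velocity, followed by a threshold comparison. A minimal sketch is given below; every number in it is an illustrative placeholder rather than data from the study.

    import numpy as np

    peak_cop_velocity = np.array([0.10, 0.15, 0.22, 0.30, 0.41])    # m/s (placeholders)
    peak_cop_position = np.array([0.02, 0.03, 0.045, 0.06, 0.085])  # m (placeholders)

    # Fit the linear relationship between peak COP velocity and peak COP position.
    slope, intercept = np.polyfit(peak_cop_velocity, peak_cop_position, 1)

    def predicted_peak_cop(cop_velocity_peak):
        return slope * cop_velocity_peak + intercept

    stepping_threshold = 0.07                          # m (placeholder)
    # Flag instability as soon as the velocity peak implies the threshold will be crossed.
    print(predicted_peak_cop(0.38) > stepping_threshold)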

  18. Federal Radiological Monitoring and Assessment Center Monitoring Manual Volume 1, Operations

    SciTech Connect

    NSTec Aerial Measurement Systems

    2012-07-31

    The Monitoring Division is primarily responsible for the coordination and direction of: aerial measurements to delineate the footprint of radioactive contaminants that have been released into the environment; monitoring of radiation levels in the environment; sampling to determine the extent of contaminant deposition in soil, water, air and on vegetation; preliminary field analyses to quantify soil concentrations or depositions; and environmental and personal dosimetry for FRMAC field personnel during a Consequence Management Response Team (CMRT) and Federal Radiological Monitoring and Assessment Center (FRMAC) response. Monitoring and sampling techniques used during CM/FRMAC operations are specifically selected for use during radiological emergencies where large numbers of measurements and samples must be acquired, analyzed, and interpreted in the shortest amount of time possible. In addition, techniques and procedures are flexible so that they can be used during a variety of different scenarios, e.g., accidents involving releases from nuclear reactors, contamination by nuclear waste, nuclear weapon accidents, space vehicle reentries, or contamination from a radiological dispersal device. The Monitoring Division also provides technicians to support specific Health and Safety Division activities, including: operation of the Hotline; FRMAC facility surveys; assistance with health and safety at checkpoints; and assistance at population assembly areas which require support from the FRMAC. This volume covers deployment activities, initial FRMAC activities, development and implementation of the monitoring and assessment plan, the briefing of field teams, and the transfer of FRMAC to the EPA.

  19. First Results of 3 Year Monitoring of Red Wood Ants' Behavioural Changes and Their Possible Correlation with Earthquake Events

    NASA Astrophysics Data System (ADS)

    Berberich, Gabriele; Berberich, Martin; Grumpe, Arne; Wöhler, Christian; Schreiber, Ulrich

    2013-04-01

    Short-term earthquake predictions with an advance warning of several hours or days can currently not be performed reliably and remain limited to only a few minutes before the event. Abnormal animal behaviours prior to earthquakes have been reported previously, but their detection creates problems in monitoring and reliability. A different situation is encountered for red wood ants (RWA; Formica rufa-group (Hymenoptera: Formicidae)). They have stationary nest sites on tectonically active, gas-bearing fault systems. These faults may be potential earthquake areas and are simultaneously information channels reaching deep into the crust. A particular advantage of monitoring RWA is their high sensitivity to environmental changes. Besides an evolutionarily developed, extremely strong temperature sensitivity of 0.25 K, they have chemoreceptors for the detection of CO2 concentrations and a sensitivity to electromagnetic fields. Changes in the electromagnetic field and short-lived "thermal anomalies" have been discussed as trigger mechanisms for bioanomalies of impending earthquakes. For 3 years, we have monitored two Red Wood Ant mounds (Formica rufa-group), located at the seismically active Neuwied Basin (Eifel, Germany), 24/7 by high-resolution cameras equipped with a colour and an infrared sensor. In the Neuwied Basin, an average of about 100 earthquakes per year with magnitudes up to M 3.9 occur, located on different tectonic fault regimes (strike-slip faults and/or normal or thrust faults). The RWA mounds are located on two different fault regimes approximately 30 km apart. First results show that the ants have a well-identifiable standard daily routine. Correlation with local seismic events suggests changes in the ants' behaviour hours before the earthquake event: the nocturnal rest phase and daily activity are suppressed, and the standard daily routine does not resume until the next day. Additional parameters that might have an effect on the ants' daily routine

  20. Results of seismological monitoring in the Cascade Range 1962-1989: earthquakes, eruptions, avalanches and other curiosities

    USGS Publications Warehouse

    Weaver, C.S.; Norris, R.D.; Jonientz-Trisler, C.

    1990-01-01

    Modern monitoring of seismic activity at Cascade Range volcanoes began at Longmire on Mount Rainier in 1958. Since then, there has been an expansion of the regional seismic networks in Washington, northern Oregon and northern California. Now, the Cascade Range from Lassen Peak to Mount Shasta in the south and Newberry Volcano to Mount Baker in the north is being monitored for earthquakes as small as magnitude 2.0, and many of the stratovolcanoes are monitored for non-earthquake seismic activity. This monitoring has yielded three major observations. First, tectonic earthquakes are concentrated in two segments of the Cascade Range between Mount Rainier and Mount Hood and between Mount Shasta and Lassen Peak, whereas little seismicity occurs between Mount Hood and Mount Shasta. Second, the volcanic activity and associated phenomena at Mount St. Helens have produced intense and widely varied seismicity. And third, at the northern stratovolcanoes, signals generated by surficial events such as debris flows, icequakes, steam emissions, rockfalls and icefalls are seismically recorded. Such records have been used to alert authorities of dangerous events in progress. -Authors

  1. Logic-centered architecture for ubiquitous health monitoring.

    PubMed

    Lewandowski, Jacek; Arochena, Hisbel E; Naguib, Raouf N G; Chao, Kuo-Ming; Garcia-Perez, Alexeis

    2014-09-01

    One of the key points to maintain and boost research and development in the area of smart wearable systems (SWS) is the development of integrated architectures for intelligent services, as well as wearable systems and devices for health and wellness management. This paper presents such a generic architecture for multiparametric, intelligent and ubiquitous wireless sensing platforms. It is a transparent, smartphone-based sensing framework with customizable wireless interfaces and plug'n'play capability to easily interconnect third-party sensor devices. It caters to wireless body, personal, and near-me area networks. A pivotal part of the platform is the integrated inference engine/runtime environment that allows the mobile device to serve as a user-adaptable personal health assistant. The novelty of this system lies in its rapid visual development and remote deployment model. The complementary visual Inference Engine Editor that comes with the package enables artificial intelligence specialists, alongside medical experts, to build data processing models by assembling different components and instantly deploying them (remotely) on patient mobile devices. In this paper, the new logic-centered software architecture for ubiquitous health monitoring applications is described, followed by a discussion of how it helps to shift the focus from software and hardware development to medical and health process-centered design of new SWS applications.

  2. The Savannah River Technology Center environmental monitoring field test platform

    SciTech Connect

    Rossabi, J.

    1993-03-05

    Nearly all industrial facilities have been responsible for introducing synthetic chemicals into the environment. The Savannah River Site is no exception. Several areas at the site have been contaminated by chlorinated volatile organic chemicals. Because of the persistence and refractory nature of these contaminants, a complete cleanup of the site will take many years. A major focus of the mission of the Environmental Sciences Section of the Savannah River Technology Center is to develop better, faster, and less expensive methods for characterizing, monitoring, and remediating the subsurface. These new methods can then be applied directly at the Savannah River Site and at other contaminated areas in the United States and throughout the world. The Environmental Sciences Section has hosted field testing of many different monitoring technologies over the past two years, primarily as a result of the Integrated Demonstration Program sponsored by the Department of Energy's Office of Technology Development. This paper provides an overview of some of the technologies that have been demonstrated at the site and briefly discusses the applicability of these techniques.

  3. Federal Radiological Monitoring and Assessment Center Analytical Response

    SciTech Connect

    E.C. Nielsen

    2003-04-01

    The Federal Radiological Monitoring and Assessment Center (FRMAC) is authorized by the Federal Radiological Emergency Response Plan to coordinate all off-site radiological response assistance to state and local governments in the event of a major radiological emergency in the United States. The FRMAC is established by the U.S. Department of Energy, National Nuclear Security Administration, to coordinate all Federal assets involved in conducting a comprehensive program of radiological environmental monitoring, sampling, radioanalysis, quality assurance, and dose assessment. During an emergency response, the initial analytical data are provided by portable field instrumentation. As incident responders scale up their response based on the seriousness of the incident, local analytical assets and mobile laboratories add capability and capacity. During the intermediate phase of the response, data quality objectives and measurement quality objectives are more rigorous. These higher objectives will require the use of larger laboratories with greater capacity and enhanced capabilities. These labs may be geographically distant from the incident, which will increase sample management challenges. This paper addresses emergency radioanalytical capability and capacity and their utilization during FRMAC operations.

  4. Monitoring seismic velocity changes caused by the 2014 Northern Aegean earthquake using continuous ambient noise records

    NASA Astrophysics Data System (ADS)

    Evangelidis, Christos; Daskalakis, Emmanouil; Tsogka, Chrysoula

    2016-04-01

    The 24 May 2014 Northern Aegean earthquake (Mw 6.9), an event on the Northern Aegean Trough (NAT), ruptured two different fault segments with a total rupture length of ~100 km. On the second, delayed segment, the rupture propagated eastward from the hypocenter for ~65 km at a supershear velocity (5.5 km/s). Low aftershock seismicity on the supershear segment implies a simple and linear fault geometry there. An effort to monitor temporal seismic velocity changes across the ruptured area of the Northern Aegean earthquake is underway. In recent years, neighboring broadband seismic stations near active faults have been successfully used to detect such changes. The cross-correlation functions (CCF) of ambient noise records between stations yield the corresponding travel times for those inter-station paths. Moreover, the auto-correlation functions (ACF) at each station produce the seismic response for a coincident source and receiver position. Possible temporal changes of the measured travel times from CCFs and ACFs correspond to seismic velocity changes. Initially, we investigate the characteristics and sources of the ambient seismic noise as recorded at permanent seismic stations installed around the NAT on the surrounding islands and in mainland Greece and Turkey. The microseismic noise levels show a clear seasonal variation at all stations. The noise levels across the double-frequency band (DF; period range 4-8 s) reflect the local sea-weather conditions within a range of a few hundred kilometers. Three years of continuous seismic records framing the main shock have been analysed from ~15 stations within a radius of 100 km from the epicentre. We observe a clear decrease of seismic velocities most likely corresponding to the co-seismic shaking. The spatial variation of this velocity drop is imaged from all inter-station paths that correspond to CCF measurements and for station sites that correspond to ACF measurements. Thus, we explore a possible correlation between co
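
    A common way to turn the CCF/ACF travel-time comparison described above into a dv/v estimate is the stretching technique: grid-search the dilation of the lapse-time axis that best maps the current correlation function onto a reference. The sketch below is a generic version of that idea, not the authors' code; window choices, sign conventions, and the synthetic test are assumptions.

    import numpy as np

    def stretching_dvv(reference, current, dt, eps_range=0.05, n_eps=201):
        """Grid-search the stretch factor mapping the current CCF onto the reference.
        With this construction the best-fitting eps is the dv/v estimate:
        a velocity drop delays (stretches) the coda and yields a negative eps."""
        t = np.arange(len(reference)) * dt
        best_eps, best_cc = 0.0, -np.inf
        for eps in np.linspace(-eps_range, eps_range, n_eps):
            stretched = np.interp(t, t * (1.0 + eps), current)
            cc = np.corrcoef(reference, stretched)[0, 1]
            if cc > best_cc:
                best_eps, best_cc = eps, cc
        return best_eps, best_cc

    # Synthetic check: delay the coda by 0.5 percent, i.e. dv/v = -0.005.
    dt = 0.05
    t = np.arange(0, 120, dt)
    reference = np.sin(2 * np.pi * 0.2 * t) * np.exp(-t / 60.0)
    current = np.interp(t, t * 1.005, reference)
    print(stretching_dvv(reference, current, dt))      # approximately (-0.005, ~1.0)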

  5. The study of key issues about integration of GNSS and strong-motion records for real-time earthquake monitoring

    NASA Astrophysics Data System (ADS)

    Tu, Rui; Zhang, Pengfei; Zhang, Rui; Liu, Jinhai

    2016-08-01

    This paper studies the key issues in the integration of GNSS and strong-motion records for real-time earthquake monitoring. The validations show that the consistency of the coordinate systems must be considered first, to exclude the systematic bias between GNSS and strong-motion records. A GNSS sampling rate of about 1-5 Hz is suggested, and the strong-motion baseline shift should be modeled with a larger dynamic noise because its variation is very rapid. The initialization time for solving the baseline shift is less than one minute, and the ambiguity resolution strategy does not greatly improve the solution. Data quality is very important for the solution; we advise using multi-frequency and multi-system observations. These ideas provide an important guide for real-time earthquake monitoring and early warning by the tight integration of GNSS and strong-motion records.
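
    The paper investigates a tight integration; a simpler way to see why the baseline shift needs its own (large) process noise is the generic loose combination sketched below, in which a Kalman filter propagates displacement and velocity with the accelerometer and corrects them with lower-rate GNSS displacement while estimating a slowly varying acceleration bias. This is a textbook formulation under assumed noise levels, not the authors' algorithm.

    import numpy as np

    def fuse(acc, acc_dt, gnss_disp, gnss_dt, q_bias=1e-4, r_gnss=1e-4):
        """acc: acceleration samples (m/s^2); gnss_disp: displacement samples (m)."""
        x = np.zeros(3)                        # state: [displacement, velocity, bias]
        P = np.eye(3)
        H = np.array([[1.0, 0.0, 0.0]])        # GNSS observes displacement only
        gnss_every = int(round(gnss_dt / acc_dt))
        out = []
        for k, a in enumerate(acc):
            dt = acc_dt
            F = np.array([[1.0, dt, -0.5 * dt**2],
                          [0.0, 1.0, -dt],
                          [0.0, 0.0, 1.0]])
            x = F @ x + np.array([0.5 * dt**2, dt, 0.0]) * a      # bias-corrected integration
            P = F @ P @ F.T + np.diag([1e-8, 1e-6, q_bias * dt])  # assumed process noise
            if k % gnss_every == 0 and k // gnss_every < len(gnss_disp):
                z = gnss_disp[k // gnss_every]
                S = H @ P @ H.T + r_gnss
                K = P @ H.T / S
                x = x + (K * (z - H @ x)).ravel()
                P = (np.eye(3) - K @ H) @ P
            out.append(x[0])
        return np.array(out)                   # fused displacement at the accelerometer rate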

  6. Products and Services Available from the Southern California Earthquake Data Center (SCEDC) and the Southern California Seismic Network (SCSN)

    NASA Astrophysics Data System (ADS)

    Yu, E.; Bhaskaran, A.; Chen, S.; Chowdhury, F. R.; Meisenhelter, S.; Hutton, K.; Given, D.; Hauksson, E.; Clayton, R. W.

    2010-12-01

    cooperation with QCN and CSN is exploring ways to archive and distribute data from high-density, low-cost networks. As a starting point, the SCEDC will store a dataset from QCN and CSN and distribute it through a separate STP client. New archival methods: ● The SCEDC is exploring the feasibility of archiving and distributing waveform data using cloud computing such as Google Apps. A month of continuous data from the SCEDC archive will be stored in Google Apps and a client developed to access it in a manner similar to STP. XML formats: ● The SCEDC is now distributing earthquake parameter data through web services in QuakeML format. ● The SCEDC, in collaboration with the Northern California Earthquake Data Center (NCEDC) and USGS Golden, has reviewed and revised the StationXML format to produce version 2.0. The new version includes rules on extending the schema, the use of named complex types, and greater consistency in naming conventions. Based on this work we plan to develop readers and writers of the StationXML format.
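
    Station metadata distributed in StationXML can be consumed directly by common toolkits. The sketch below reads a StationXML file with ObsPy and lists station coordinates; the file name is a placeholder, and nothing here is specific to the version 2.0 revisions described above.

    from obspy import read_inventory

    inventory = read_inventory("scedc_stations.xml")   # placeholder file name
    for network in inventory:
        for station in network:
            print(network.code, station.code, station.latitude, station.longitude)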

  7. Restoration of accelerator facilities damaged by Great East Japan Earthquake at Cyclotron and Radioisotope Center, Tohoku University.

    PubMed

    Wakui, Takashi; Itoh, Masatoshi; Shimada, Kenzi; Yoshida, Hidetomo P; Shinozuka, Tsutomu; Sakemi, Yasuhiro

    2014-01-01

    The Cyclotron and Radioisotope Center (CYRIC) of Tohoku University is a joint-use institution for education and research in a wide variety of fields ranging from physics to medicine. Accelerator facilities at the CYRIC provide opportunities for implementing a broad research program, including medical research using positron emission tomography (PET), with accelerated ions and radioisotopes. During the Great East Japan Earthquake on March 11, 2011, no human injuries occurred and a smooth evacuation was carried out at the CYRIC, thanks to anti-earthquake measures such as the renovation of the cyclotron building in 2009 (mainly to provide seismic strengthening), the fixation of shelves to prevent objects from falling, and the securing of the width of the evacuation route. The preparation of an emergency response manual was also helpful. However, the accelerator facilities were damaged because of strong shaking that continued for a few minutes. For example, two columns on which a 930 cyclotron was placed were damaged, and the 930 cyclotron was thereby inclined. All the elements of the beam transport lines were deviated from the beam axis. Some peripheral devices in a HM12 cyclotron were broken. Two shielding doors fell from their carriages onto the floor and blocked the entrances to the rooms. The repair work on the accelerator facilities was started at the end of July 2011. During the repair work, the joint use of the accelerator facilities was suspended. After the repair work was completed, the joint use was re-started in October 2012, one and a half years after the earthquake.

  8. The response of academic medical centers to the 2010 Haiti earthquake: the Mount Sinai School of Medicine experience.

    PubMed

    Ripp, Jonathan A; Bork, Jacqueline; Koncicki, Holly; Asgary, Ramin

    2012-01-01

    On January 12, 2010, Haiti was struck by a 7.0 earthquake which left the country in a state of devastation. In the aftermath, there was an enormous relief effort in which academic medical centers (AMC) played an important role. We offer a retrospective on the AMC response through the Mount Sinai School of Medicine (MSSM) experience. Over the course of the year that followed the earthquake, MSSM conducted five service trips in conjunction with two well-established groups which have provided service to the Haitian people for over 15 years. MSSM volunteer personnel included nurses, resident and attending physicians, and specialty fellows who provided expertise in critical care, emergency medicine, wound care, infectious diseases and chronic disease management of adults and children. Challenges faced included stressful and potentially hazardous working conditions, provision of care with limited resources and cultural and language barriers. The success of the MSSM response was due largely to the strength of its human resources and the relationship forged with effective relief organizations. These service missions fulfilled the institution's commitment to social responsibility and provided a valuable training opportunity in advocacy. For other AMCs seeking to respond in future emergencies, we suggest early identification of a partner with field experience, recruitment of administrative and faculty support across the institution, significant pre-departure orientation and utilization of volunteers to fundraise and advocate. Through this process, AMCs can play an important role in disaster response.

  9. Earthquake and ambient vibration monitoring of the steel-frame UCLA factor building

    USGS Publications Warehouse

    Kohler, M.D.; Davis, P.M.; Safak, E.

    2005-01-01

    Dynamic property measurements of the moment-resisting steel-frame University of California, Los Angeles, Factor building are being made to assess how forces are distributed over the building. Fourier amplitude spectra have been calculated from several intervals of ambient vibrations, a 24-hour period of strong winds, and from the 28 March 2003 Encino, California (ML = 2.9), the 3 September 2002 Yorba Linda, California (ML = 4.7), and the 3 November 2002 Central Alaska (Mw = 7.9) earthquakes. Measurements made from the ambient vibration records show that the first-mode frequency of horizontal vibration is between 0.55 and 0.6 Hz. The second horizontal mode has a frequency between 1.6 and 1.9 Hz. In contrast, the first-mode frequencies measured from earthquake data are about 0.05 to 0.1 Hz lower than those corresponding to ambient vibration recordings, indicating softening of the soil-structure system as amplitudes become larger. The frequencies revert to pre-earthquake levels within five minutes of the Yorba Linda earthquake. Shaking due to strong winds that occurred during the Encino earthquake dominates the frequency decrease, which correlates in time with the duration of the strong winds. The first shear wave recorded from the Encino and Yorba Linda earthquakes takes about 0.4 sec to travel up the 17-story building. © 2005, Earthquake Engineering Research Institute.
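
    The modal frequencies quoted above come from peaks in Fourier amplitude spectra of the recorded accelerations. A minimal sketch of that measurement is given below; the sampling rate, record, and search band are synthetic stand-ins, not the Factor-building data.

    import numpy as np

    fs = 100.0                                    # assumed sampling rate (Hz)
    t = np.arange(0, 600, 1.0 / fs)               # ten minutes of data
    # Synthetic ambient record: a 0.57 Hz fundamental mode plus broadband noise.
    accel = 0.01 * np.sin(2 * np.pi * 0.57 * t) + 0.005 * np.random.randn(t.size)

    spectrum = np.abs(np.fft.rfft(accel * np.hanning(accel.size)))
    freqs = np.fft.rfftfreq(accel.size, d=1.0 / fs)

    band = (freqs > 0.3) & (freqs < 1.0)          # window around the expected first mode
    f1 = freqs[band][np.argmax(spectrum[band])]
    print("estimated first-mode frequency: %.2f Hz" % f1)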

  10. Connecting slow earthquakes to huge earthquakes.

    PubMed

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes.

  11. Connecting slow earthquakes to huge earthquakes

    NASA Astrophysics Data System (ADS)

    Obara, Kazushige; Kato, Aitaro

    2016-07-01

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes.

  12. Connecting slow earthquakes to huge earthquakes.

    PubMed

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. PMID:27418504

  13. Comprehensive Nuclear-Test-Ban Treaty seismic monitoring: 2012 USNAS report and recent explosions, earthquakes, and other seismic sources

    SciTech Connect

    Richards, Paul G.

    2014-05-09

    A comprehensive ban on nuclear explosive testing is briefly characterized as an arms control initiative related to the Non-Proliferation Treaty. The work of monitoring for nuclear explosions uses several technologies, of which the most important is seismology, a physics discipline that draws upon extensive and ever-growing assets to monitor for earthquakes and other ground-motion phenomena as well as for explosions. This paper outlines the basic methods of seismic monitoring within that wider context, and lists web-based and other resources for learning details. It also summarizes the main conclusions, concerning the capability to monitor for test-ban treaty compliance, contained in a major study published in March 2012 by the US National Academy of Sciences.

  14. Comprehensive Nuclear-Test-Ban Treaty seismic monitoring: 2012 USNAS report and recent explosions, earthquakes, and other seismic sources

    NASA Astrophysics Data System (ADS)

    Richards, Paul G.

    2014-05-01

    A comprehensive ban on nuclear explosive testing is briefly characterized as an arms control initiative related to the Non-Proliferation Treaty. The work of monitoring for nuclear explosions uses several technologies, of which the most important is seismology, a physics discipline that draws upon extensive and ever-growing assets to monitor for earthquakes and other ground-motion phenomena as well as for explosions. This paper outlines the basic methods of seismic monitoring within that wider context, and lists web-based and other resources for learning details. It also summarizes the main conclusions, concerning the capability to monitor for test-ban treaty compliance, contained in a major study published in March 2012 by the US National Academy of Sciences.

  15. Citizen Monitoring during Hazards: The Case of Fukushima Radiation after the 2011 Japanese Earthquake

    NASA Astrophysics Data System (ADS)

    Hultquist, C.; Cervone, G.

    2015-12-01

    Citizen-led movements producing scientific environmental information are increasingly common during hazards. After the Japanese earthquake-triggered tsunami in 2011, the government produced airborne remote sensing data of the radiation levels after the Fukushima nuclear reactor failures. Advances in technology enabled citizens to monitor radiation with innovative mobile devices built from components bought on the Internet. The citizen-led Safecast project measured on-ground levels of radiation in the Fukushima prefecture, totaling 14 million entries to date in Japan. This non-authoritative citizen science collection, which recorded radiation levels at specific coordinates and times, is available online, yet the reliability and validity of the data had not been assessed. The nuclear incident provided a case for assessment with comparable dimensions of citizen science and authoritative data. To perform a comparison of the datasets, standardization was required. The sensors were calibrated scientifically, but the data were collected using different units of measure. Radiation decays over time, so temporal interpolation was necessary to compare measurements as if they were from the same time frame. Finally, GPS-located points were selected within an overlapping spatial extent of 500 meters. This study spatially analyzes and statistically compares citizen-volunteered and government-generated radiation data. Quantitative measures are used to assess the similarity and difference in the datasets. Radiation measurements from the same geographic extents show similar spatial variations, which suggests that citizen science data can be comparable with government-generated measurements. Validation of Safecast demonstrates that we can infer scientific data from unstructured, non-vested data. Citizen science can provide real-time data for situational awareness, which is crucial for decision making during disasters. This project provides a methodology for comparing datasets of radiological measurements
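
    The temporal normalization step mentioned above can be illustrated with a single-nuclide decay correction to a common reference time. The sketch below is only that simplification (one dominant nuclide with a known half-life); the actual Safecast/government comparison will have used a more careful treatment, and all numbers here are illustrative.

    import math

    def decay_correct(rate, t_measured_days, t_reference_days, half_life_days):
        """Scale a measured rate to its equivalent value at the reference time,
        assuming simple exponential decay of a single nuclide."""
        lam = math.log(2.0) / half_life_days
        return rate * math.exp(lam * (t_measured_days - t_reference_days))

    # Example: a reading taken 60 days after the reference epoch, assuming a
    # Cs-134-dominated signal (half-life about 2.07 years).
    print(decay_correct(2.5, 60.0, 0.0, 2.07 * 365.25))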

  16. Space Monitoring Data Center at Moscow State University

    NASA Astrophysics Data System (ADS)

    Kalegaev, Vladimir; Bobrovnikov, Sergey; Barinova, Vera; Myagkova, Irina; Shugay, Yulia; Barinov, Oleg; Dolenko, Sergey; Mukhametdinova, Ludmila; Shiroky, Vladimir

    The Space Monitoring Data Center of Moscow State University provides operational information on the radiation state of near-Earth space. The Internet portal http://swx.sinp.msu.ru/ gives access to current data characterizing the level of solar activity and the geomagnetic and radiation conditions in the magnetosphere and heliosphere in real time. Operational data coming from space missions (ACE, GOES, ELECTRO-L1, Meteor-M1) at L1, LEO and GEO and from the Earth's surface are used to represent the geomagnetic and radiation state of the near-Earth environment. An online database of measurements is also maintained to allow quick comparison between current conditions and conditions experienced in the past. Models of the space environment, working in autonomous mode, are used to generalize the information obtained from observations to the whole magnetosphere. Interactive applications and operational forecasting services are built on the basis of these models. They automatically generate alerts on particle flux enhancements above threshold values, both for SEP and for relativistic electrons, using data from LEO orbits. Special forecasting services give short-term forecasts of SEP penetration into the Earth's magnetosphere at low altitudes, as well as of relativistic electron fluxes at GEO. Velocities of recurrent high-speed solar wind streams at the Earth's orbit are predicted with a lead time of 3-4 days on the basis of automatic estimation of the coronal hole areas detected in images of the Sun received from the SDO satellite. By means of a neural network approach, online forecasting of the Dst and Kp indices 0.5-1.5 hours ahead, based on the solar wind and interplanetary magnetic field measured by the ACE satellite, is carried out. The visualization system allows experimental and modeling data to be represented in 2D and 3D.

  17. Managing Multi-center Flow Cytometry Data for Immune Monitoring

    PubMed Central

    White, Scott; Laske, Karoline; Welters, Marij JP; Bidmon, Nicole; van der Burg, Sjoerd H; Britten, Cedrik M; Enzor, Jennifer; Staats, Janet; Weinhold, Kent J; Gouttefangeas, Cécile; Chan, Cliburn

    2014-01-01

    With the recent results of promising cancer vaccines and immunotherapy [1-5], immune monitoring has become increasingly relevant for measuring treatment-induced effects on T cells, and an essential tool for shedding light on the mechanisms responsible for a successful treatment. Flow cytometry is the canonical multi-parameter assay for the fine characterization of single cells in solution, and is ubiquitously used in pre-clinical tumor immunology and in cancer immunotherapy trials. Current state-of-the-art polychromatic flow cytometry involves multi-step, multi-reagent assays followed by sample acquisition on sophisticated instruments capable of capturing up to 20 parameters per cell at a rate of tens of thousands of cells per second. Given the complexity of flow cytometry assays, reproducibility is a major concern, especially for multi-center studies. A promising approach for improving reproducibility is the use of automated analysis borrowing from statistics, machine learning and information visualization [21-23], as these methods directly address the subjectivity, operator dependence, labor intensiveness and low fidelity of manual analysis. However, it is quite time-consuming to investigate and test new automated analysis techniques on large data sets without some centralized information management system. For large-scale automated analysis to be practical, the presence of consistent and high-quality data linked to the raw FCS files is indispensable. In particular, the use of machine-readable standard vocabularies to characterize channel metadata is essential when constructing analytic pipelines to avoid errors in the processing, analysis and interpretation of results. For automation, this high-quality metadata needs to be programmatically accessible, implying the need for a consistent Application Programming Interface (API). In this manuscript, we propose that upfront time spent normalizing flow cytometry data to conform to carefully designed data models enables

  18. Permeable Pavement Monitoring at the Edison Environmental Center Demonstration Site

    EPA Science Inventory

    The presentation covers the following monitoring objectives at the demonstration site at Edison, NJ: Hydrologic performance, water quality performance, urban heat island effects, maintenance effects and infiltration water parameters. There will be a side by side monitoring of ...

  19. Activity of Very-low-frequency Earthquakes in Japan Monitored by a Sensitive Accelerometer Network

    NASA Astrophysics Data System (ADS)

    Asano, Y.; Ito, Y.; Obara, K.

    2006-12-01

    We developed an automatic detection method for very-low-frequency (VLF) earthquakes based on semblance analysis, in order to determine the spatio-temporal distribution of VLF earthquakes in Japan. Seismograms observed by a sensitive accelerometer network (Hi-net Tilt) with a station separation of about 20 km were analyzed in this study. A band-pass filter with a pass-band of 0.02-0.05 Hz was applied to the original seismograms, and the filtered seismograms were re-sampled at a sampling frequency of 1 Hz. We composed 110 arrays with apertures of about 50-100 km all over Japan. Re-sampled seismograms observed at the stations of each array were analyzed to evaluate the semblance coefficient and to estimate the azimuths and apparent slownesses of propagating seismic waves in the plane-wave approximation. The window length and time step of the moving time windows used to evaluate the semblance coefficient were 30 s and 15 s, respectively. The azimuth and apparent slowness corresponding to the maximum semblance coefficient in each time step can be estimated by a grid-search algorithm for each array. If the propagating seismic waves are radiated from a hypocenter, the azimuths observed at each array are expected to be consistent with the epicenter location. Therefore we can estimate the epicenter location that explains these azimuths at each array. Inner products of the observed and model-predicted unit vectors corresponding to the azimuths of the incident wave are averaged over all arrays with weighting factors, and the epicenter location is estimated by maximizing the averaged value. We estimated epicenters for coherent wave arrivals detected by the semblance analysis. After detection and epicenter determination for coherent events, events corresponding to ordinary earthquakes listed in the event catalogue of the Hi-net routine processing were excluded, and the rest were identified as VLF earthquakes with no distinct high-frequency radiation. We analyzed continuous waveform data from 2003
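
    The semblance coefficient at the core of the detection scheme above is the beam power divided by N times the summed single-trace power within the window. A minimal sketch is given below; the alignment for each trial azimuth/slowness pair (the grid search described in the abstract) is assumed to have been applied already.

    import numpy as np

    def semblance(aligned):
        """aligned: array of shape (n_stations, n_samples), each trace already
        time-shifted for one trial plane wave (azimuth, apparent slowness)."""
        beam_power = np.sum(aligned.sum(axis=0) ** 2)
        trace_power = aligned.shape[0] * np.sum(aligned ** 2)
        return beam_power / trace_power

    # A grid search would evaluate semblance() for every trial azimuth/slowness
    # shift of the 30 s window and keep the pair that maximizes it.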

  20. Early Results of Three-Year Monitoring of Red Wood Ants’ Behavioral Changes and Their Possible Correlation with Earthquake Events

    PubMed Central

    Berberich, Gabriele; Berberich, Martin; Grumpe, Arne; Wöhler, Christian; Schreiber, Ulrich

    2013-01-01

    Simple Summary For three years (2009–2012), two red wood ant mounds (Formica rufa-group), located at the seismically active Neuwied Basin (Eifel, Germany), have been monitored 24/7 by high-resolution cameras. Early results show that ants have a well-identifiable standard daily routine. Correlation with local seismic events suggests changes in the ants’ behavior hours before the earthquake: the nocturnal rest phase and daily activity are suppressed, and standard daily routine does not resume until the next day. At present, an automated image evaluation routine is being applied to the video streams. Based on this automated approach, a statistical analysis of the ant behavior will be carried out. Abstract Short-term earthquake predictions with an advance warning of several hours or days are currently not possible due to both incomplete understanding of the complex tectonic processes and inadequate observations. Abnormal animal behaviors before earthquakes have been reported previously, but create problems in monitoring and reliability. The situation is different with red wood ants (RWA; Formica rufa-group (Hymenoptera: Formicidae)). They have stationary mounds on tectonically active, gas-bearing fault systems. These faults may be potential earthquake areas. For three years (2009–2012), two red wood ant mounds (Formica rufa-group), located at the seismically active Neuwied Basin (Eifel, Germany), have been monitored 24/7 by high-resolution cameras with both a color and an infrared sensor. Early results show that ants have a well-identifiable standard daily routine. Correlation with local seismic events suggests changes in the ants’ behavior hours before the earthquake: the nocturnal rest phase and daily activity are suppressed, and standard daily routine does not resume until the next day. At present, an automated image evaluation routine is being applied to the more than 45,000 hours of video streams. Based on this automated approach, a statistical analysis of

  1. The Development of Several Electromagnetic Monitoring Strategies and Algorithms for Validating Pre-Earthquake Electromagnetic Signals

    NASA Astrophysics Data System (ADS)

    Bleier, T. E.; Dunson, J. C.; Roth, S.; Mueller, S.; Lindholm, C.; Heraud, J. A.

    2012-12-01

    QuakeFinder, a private research group in California, reports on the development of a 100+ station network consisting of 3-axis induction magnetometers and air conductivity sensors to collect and characterize pre-seismic electromagnetic (EM) signals. These signals are combined with daily infrared (IR) signals collected from the GOES weather satellite IR instrument to compare and correlate with the ground EM signals, both from actual earthquakes and from boulder-stressing experiments. This presentation describes the efforts QuakeFinder has undertaken to automatically detect these pulse patterns using their historical data as a reference, and to develop other discriminative algorithms that can be used with the air conductivity sensors and the IR instruments on the GOES satellites. The overall big-picture results of the QuakeFinder experiment are presented. In 2007, QuakeFinder discovered the occurrence of strong uni-polar pulses in their magnetometer coil data that increased in tempo dramatically prior to the M5.1 earthquake at Alum Rock, California. Suggestions that these pulses might have been lightning or power-line arcing did not fit the data actually recorded, as reported in Bleier [2009]. A second earthquake then occurred near the same site on January 7, 2010, as reported in Dunson [2011], and the pattern of pulse-count increases before the earthquake was similar to the 2007 event. There were fewer pulses, and their magnitude was smaller, both consistent with the fact that the earthquake was smaller (M4.0 vs. M5.4) and farther away (7 km vs. 2 km). At the same time, similar effects were observed at the QuakeFinder Tacna, Peru site before the May 5th, 2010 M6.2 earthquake and a cluster of several M4-5 earthquakes.

  2. Real-time particulate fallout contamination monitoring technology development at NASA Kennedy Space Center

    NASA Astrophysics Data System (ADS)

    Mogan, Paul A.; Schwindt, Chris J.

    1998-10-01

    Two separate real-time particulate fallout monitoring instruments have been developed by the Contamination Monitoring Laboratory at NASA John F. Kennedy Space Center. These instruments monitor particulate fallout contamination deposition rates in cleanrooms and allow certification of cleanliness levels as well as proactive protection of valuable flight hardware.

  3. The Community Seismic Network and Quake-Catcher Network: Monitoring building response to earthquakes through community instrumentation

    NASA Astrophysics Data System (ADS)

    Cheng, M.; Kohler, M. D.; Heaton, T. H.; Clayton, R. W.; Chandy, M.; Cochran, E.; Lawrence, J. F.

    2013-12-01

    The Community Seismic Network (CSN) and Quake-Catcher Network (QCN) are dense networks of low-cost ($50) accelerometers that are deployed by community volunteers in their homes in California. In addition, many accelerometers are installed in public spaces associated with civic services, publicly operated utilities, university campuses, and high-rise buildings. Both CSN and QCN carry out observation-based structural monitoring using records from one to tens of stations in a single building. We have deployed about 150 accelerometers in a number of buildings ranging between five and 23 stories in the Los Angeles region. In addition to a USB-connected device which connects to the host's computer, we have developed a stand-alone sensor-plug-computer device that connects directly to the internet via Ethernet or WiFi. In the case of CSN, the sensors report data to the Google App Engine cloud computing service, consisting of data centers geographically distributed across the continent. This robust infrastructure provides parallelism and redundancy during times of disaster that could affect hardware. The QCN sensors, however, are connected to netbooks and stream data continuously in real time, via the Berkeley Open Infrastructure for Network Computing distributed computing software, to a server at Stanford University. In both networks, continuous and triggered data streams use an STA/LTA scheme to determine the occurrence of significant ground accelerations. Waveform data, as well as derived parameters such as peak ground acceleration, are then sent to the associated archives. Visualization models of the instrumented buildings' dynamic linear response have been constructed using Google SketchUp and MATLAB. When data are available from a limited number of accelerometers installed in high rises, the buildings are represented as simple shear beam or prismatic Timoshenko beam models with soil-structure interaction. Small-magnitude earthquake records
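
    The STA/LTA trigger mentioned above can be sketched with standard tools; below, ObsPy's classic STA/LTA characteristic function is applied to a synthetic acceleration trace. The sample rate, window lengths, and thresholds are illustrative assumptions, not the networks' operational settings.

    import numpy as np
    from obspy.signal.trigger import classic_sta_lta, trigger_onset

    fs = 50.0                                      # assumed sample rate (Hz)
    accel = np.random.randn(int(600 * fs)) * 1e-3  # ten minutes of background noise
    accel[15000:15200] += 0.05                     # synthetic transient near t = 300 s

    cft = classic_sta_lta(accel, int(1 * fs), int(30 * fs))   # 1 s STA over 30 s LTA
    triggers = trigger_onset(cft, 4.0, 1.5)        # on/off thresholds
    print(triggers)                                # list of [on_sample, off_sample] pairs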

  4. First Results of 3 Year Monitoring of Red Wood Ants' Behavioural Changes and Their Possible Correlation with Earthquake Events

    NASA Astrophysics Data System (ADS)

    Berberich, Gabriele; Berberich, Martin; Grumpe, Arne; Wöhler, Christian; Schreiber, Ulrich

    2013-04-01

    Short-term earthquake predictions with an advance warning of several hours or days can currently not be performed reliably and remain limited to only a few minutes before the event. Abnormal animal behaviours prior to earthquakes have been reported previously, but their detection creates problems in monitoring and reliability. A different situation is encountered for red wood ants (RWA; Formica rufa-group (Hymenoptera: Formicidae)). They have stationary nest sites on tectonically active, gas-bearing fault systems. These faults may be potential earthquake areas and are simultaneously information channels reaching deep into the crust. A particular advantage of monitoring RWA is their high sensitivity to environmental changes. Besides an evolutionarily developed, extremely strong temperature sensitivity of 0.25 K, they have chemoreceptors for the detection of CO2 concentrations and a sensitivity to electromagnetic fields. Changes in the electromagnetic field and short-lived "thermal anomalies" have been discussed as trigger mechanisms for bioanomalies of impending earthquakes. For 3 years, we have monitored two Red Wood Ant mounds (Formica rufa-group), located at the seismically active Neuwied Basin (Eifel, Germany), 24/7 by high-resolution cameras equipped with a colour and an infrared sensor. In the Neuwied Basin, an average of about 100 earthquakes per year with magnitudes up to M 3.9 occur, located on different tectonic fault regimes (strike-slip faults and/or normal or thrust faults). The RWA mounds are located on two different fault regimes approximately 30 km apart. First results show that the ants have a well-identifiable standard daily routine. Correlation with local seismic events suggests changes in the ants' behaviour hours before the earthquake event: the nocturnal rest phase and daily activity are suppressed, and the standard daily routine does not resume until the next day. Additional parameters that might have an effect on the ants' daily routine

  5. Long-term Ocean Bottom Monitoring for Shallow Slow Earthquakes in the Hyuga-nada, Nankai Subduction Zone

    NASA Astrophysics Data System (ADS)

    Yamashita, Y.; Shinohara, M.; Yamada, T.; Nakahigashi, K.; Shiobara, H.; Mochizuki, K.; Maeda, T.; Obara, K.

    2015-12-01

    The Hyuga-nada region, near the western end of the Nankai Trough in Japan, is one of the most active areas of shallow slow earthquakes in the world. Recently, ocean-bottom observation of offshore seismicity near the trench succeeded in detecting shallow tremor. The observed traces contained a complete episode lasting for one month, exhibiting migration properties similar to those of deep tremor [Yamashita et al., 2015]. This activity was associated with shallow very-low-frequency earthquake (VLFE) activity documented by the land-based broadband seismic network. The coincidence between the tremor and VLFE activities and the similarity of their migration patterns show a strong resemblance to episodic tremor and slip episodes; this similarity suggests that the tremor activity at the shallow plate boundary may also be coupled with VLFEs and short-term slow slip events in this area. It is important to clarify the seismicity, including slow earthquakes, in order to understand the slip behavior at the shallow plate boundary and to improve assessments of the possibility of a tsunamigenic megathrust earthquake that is anticipated to occur along the Nankai Trough. Motivated by these issues, we started long-term ocean-bottom monitoring in this area in May 2014 using 3 broadband and 7 short-period seismometers. In January 2015, we replaced the instruments and obtained the first data, which include minor shallow tremor and VLFE activity on June 1-3, 2014. Preliminary results of the data processing show that the shallow tremor activity occurred in the northwestern part of the 2013 activity. The location corresponds to the point where the tremors stopped migrating further north and turned sharply eastward during the 2013 activity. On the other hand, clear tremor migration was not found in the 2014 activity. This local activity may imply that regional or small-scale heterogeneous structures such as a subducting seamount affect the activity pattern. During the 2014 observation, many ordinary earthquakes also

  6. Passive seismic monitoring of natural and induced earthquakes: case studies, future directions and socio-economic relevance

    USGS Publications Warehouse

    Bohnhoff, Marco; Dresen, Georg; Ellsworth, William L.; Ito, Hisao; Cloetingh, Sierd; Negendank, Jörg

    2010-01-01

    An important discovery in crustal mechanics has been that the Earth’s crust is commonly stressed close to failure, even in tectonically quiet areas. As a result, small natural or man-made perturbations to the local stress field may trigger earthquakes. To understand these processes, Passive Seismic Monitoring (PSM) with seismometer arrays is a widely used technique that has been successfully applied to study seismicity at different magnitude levels ranging from acoustic emissions generated in the laboratory under controlled conditions, to seismicity induced by hydraulic stimulations in geological reservoirs, and up to great earthquakes occurring along plate boundaries. In all these environments the appropriate deployment of seismic sensors, i.e., directly on the rock sample, at the earth’s surface or in boreholes close to the seismic sources allows for the detection and location of brittle failure processes at sufficiently low magnitude-detection threshold and with adequate spatial resolution for further analysis. One principal aim is to develop an improved understanding of the physical processes occurring at the seismic source and their relationship to the host geologic environment. In this paper we review selected case studies and future directions of PSM efforts across a wide range of scales and environments. These include induced failure within small rock samples, hydrocarbon reservoirs, and natural seismicity at convergent and transform plate boundaries. Each example represents a milestone with regard to bridging the gap between laboratory-scale experiments under controlled boundary conditions and large-scale field studies. The common motivation for all studies is to refine the understanding of how earthquakes nucleate, how they proceed and how they interact in space and time. This is of special relevance at the larger end of the magnitude scale, i.e., for large devastating earthquakes due to their severe socio-economic impact.

  7. Catalog of Earthquake Hypocenters at Alaskan Volcanoes: January 1 through December 31, 2007

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.

    2008-01-01

    Between January 1 and December 31, 2007, AVO located 6,664 earthquakes, of which 5,660 occurred within 20 kilometers of the 33 volcanoes monitored by the Alaska Volcano Observatory. Monitoring highlights in 2007 include: the eruption of Pavlof Volcano; volcanic-tectonic earthquake swarms at the Augustine, Iliamna, and Little Sitkin volcanic centers; and the cessation of episodes of unrest at Fourpeaked Mountain, Mount Veniaminof, and the northern Atka Island volcanoes (Mount Kliuchef and Korovin Volcano). This catalog includes descriptions of: (1) locations of seismic instrumentation deployed during 2007; (2) earthquake detection, recording, analysis, and data archival systems; (3) seismic velocity models used for earthquake locations; (4) a summary of earthquakes located in 2007; and (5) an accompanying UNIX tar-file with a summary of earthquake origin times, hypocenters, magnitudes, phase arrival times, location quality statistics, daily station usage statistics, and all files used to determine the earthquake locations in 2007.

  9. Real Earthquakes, Real Learning

    ERIC Educational Resources Information Center

    Schomburg, Aaron

    2003-01-01

    One teacher took her class on a year-long earthquake expedition. The goal was to monitor the occurrences of real earthquakes during the year and mark their locations with push pins on a wall-sized world map in the hallway outside the science room. The purpose of the project was to create a detailed picture of the earthquakes that occurred…

  10. (abstract) GPS Monitoring of Crustal Deformation and the Earthquake Cycle in Costa Rica

    NASA Technical Reports Server (NTRS)

    Lundgren, Paul R.

    1994-01-01

    This paper will discuss the objectives, approach, and anticipated results of a study of earthquakes in Costa Rica. GPS measurements will be taken and field surveys will be made. Assessments of seismic strain accumulation and post-seismic deformation will be made in an effort to understand the effect these processes have on regional tectonic models.

  11. Earthquakes & Volcanoes, Volume 21, Number 1, 1989: Featuring the U.S. Geological Survey's National Earthquake Information Center in Golden, Colorado, USA

    USGS Publications Warehouse

    ,; Spall, Henry; Schnabel, Diane C.

    1989-01-01

    Earthquakes and Volcanoes is published bimonthly by the U.S. Geological Survey to provide current information on earthquakes and seismology, volcanoes, and related natural hazards of interest to both generalized and specialized readers. The Secretary of the Interior has determined that the publication of this periodical is necessary in the transaction of the public business required by law of this Department. Use of funds for printing this periodical has been approved by the Office of Management and Budget through June 30, 1989. Any use of trade, product, or firm names is for descriptive purposes only and does not imply endorsement by the U.S. Government.

  12. The continuous automatic monitoring network installed in Tuscany (Italy) since late 2002, to study earthquake precursory phenomena

    NASA Astrophysics Data System (ADS)

    Pierotti, Lisa; Cioni, Roberto

    2010-05-01

    Since late 2002, a continuous automatic monitoring network (CAMN) has been designed, built and installed in Tuscany (Italy) in order to investigate and define the geochemical response of the aquifers to the local seismic activity. The purpose of the investigation was to identify possible earthquake precursors. The CAMN consists of two groups of five measurement stations each. The first group has been installed in the Serchio and Magra grabens (Garfagnana and Lunigiana Valleys, Northern Tuscany), while the second one is in the area of Mt. Amiata (Southern Tuscany), an extinct volcano. The Garfagnana, Lunigiana and Mt. Amiata regions belong to the inner zone of the Northern Apennine fold-and-thrust belt. This zone has been involved in post-collision extensional tectonics since the Upper Miocene-Pliocene. Such tectonic activity has produced horst and graben structures oriented from N-S to NW-SE that are offset by NE-SW transfer systems. Both Garfagnana (Serchio graben) and Lunigiana (Magra graben) belong to the innermost sector of the belt, where the seismic sources responsible for the strongest earthquakes of the northern Apennines are located (e.g. the M=6.5 earthquake of September 1920). The extensional processes in southern Tuscany have been accompanied by magmatic activity since the Upper Miocene, developing effusive and intrusive products traditionally attributed to the so-called Tuscan Magmatic Province. Mt. Amiata, whose magmatic activity ceased about 0.3 million years ago, belongs to the extensional Tyrrhenian sector, which is characterized by high heat flow and crustal thinning. The whole zone is characterized by widespread but moderate seismicity (the maximum recorded magnitude has been 5.1, with epicentre in Piancastagnaio, 1919). The extensional regime in both the Garfagnana-Lunigiana and Mt. Amiata areas is confirmed by the focal mechanisms of recent earthquakes. An essential phase of the monitoring activities has been the selection of suitable sites for the installation of

  13. Monitoring of earthquake precursors by multi-parameter stations in Eskisehir region (Turkey)

    NASA Astrophysics Data System (ADS)

    Yuce, G.; Ugurluoglu, D. Y.; Adar, N.; Yalcin, T.; Yaltirak, C.; Streil, T.; Oeserd, V. O.

    2010-04-01

    The objective of this study was to investigate the geochemical and hydrogeological effects of earthquakes on fluids in aquifers, particularly in a seismically active area such as Eskisehir (Turkey), where the Thrace-Eskisehir Fault Zone stretches across the region. The study area is also close to the North Anatolian Fault Zone, which generates devastating earthquakes such as the ones experienced in 1999 that reactivated the Thrace-Eskisehir Fault. In the studied area, Rn and CO2 gas concentrations, redox potential, electrical conductivity, pH, water level, water temperature, and climatic parameters were continuously measured at five stations for about a year. Based on the data gathered from the stations, some ambiguous anomalies in the geochemical parameters and in the Rn concentration of groundwater were observed as precursors several days prior to an earthquake. According to the mid-term observations of this study, well-water level changes were found to be a good indicator for seismic estimation in the area, as they comprise naturally filtered anomalies reflecting only the changes due to earthquakes. The results also suggest that changes in well-water level and gas-water chemistry need to be interpreted together for more accurate estimates. For the studied area, shallow earthquakes with epicentral distances of <30 km from the observation stations have more influence on the hydrochemical parameters of groundwater and on well-water level changes. Although some hydrochemical anomalies were observed in the area, further observations are required before they can be identified as precursors.

  14. A survey conducted immediately after the 2011 Great East Japan Earthquake: evaluation of infectious risks associated with sanitary conditions in evacuation centers.

    PubMed

    Tokuda, Koichi; Kunishima, Hiroyuki; Gu, Yoshiaki; Endo, Shiro; Hatta, Masumitsu; Kanamori, Hajime; Aoyagi, Tetsuji; Ishibashi, Noriomi; Inomata, Shinya; Yano, Hisakazu; Kitagawa, Miho; Kaku, Mitsuo

    2014-08-01

    In cooperation with the Miyagi prefectural government, we conducted a survey of the management of sanitation at evacuation centers and the health of the evacuees by visiting 324 evacuation centers two weeks after the 2011 Great East Japan Earthquake. The facilities often used as evacuation centers were community centers (36%), schools (32.7%) and nursing homes (10.2%). It was more difficult to maintain a distance of at least 1 m between evacuees at the evacuation centers with a larger number of residents. At evacuation centers where the water supply was not restored, hygienic handling of food and the hand hygiene of the cooks were less than adequate. Among evacuation centers with ≤50 evacuees, there was a significant difference in the prevalence rate of digestive symptoms between the centers with and without persons in charge of health matters (0.3% vs. 2.1%, respectively, p < 0.001). The following three factors had an important influence on the level of sanitation at evacuation centers and the health of evacuees: 1) the size of the evacuation center, 2) the status of the water supply, and 3) the allocation of persons in charge of health matters. Given that adjusting the number of evacuees to fit the size of the evacuation center and prompt restoration of the water supply are difficult to achieve immediately after an earthquake, promptly placing persons in charge of health matters at evacuation centers is a practicable and effective measure, and allocation of at least one such person per 50 evacuees is desirable.

  15. 88 hours: The U.S. Geological Survey National Earthquake Information Center response to the 11 March 2011 Mw 9.0 Tohoku earthquake

    USGS Publications Warehouse

    Hayes, G.P.; Earle, P.S.; Benz, H.M.; Wald, D.J.; Briggs, R.W.

    2011-01-01

    This article presents a timeline of the NEIC response to a major global earthquake for the first time in a formal journal publication. We outline the key observations of the earthquake made by the NEIC and its partner agencies, discuss how these analyses evolved, and outline when and how this information was released to the public and to other internal and external parties. Our goal in the presentation of this material is to provide a detailed explanation of the issues faced in the response to a rare, giant earthquake. We envisage that the timeline format of this presentation can highlight technical and procedural successes and shortcomings, which may in turn help prompt research by our academic partners and further improvements to our future response efforts. We have shown how NEIC response efforts have significantly improved over the past six years since the great 2004 Sumatra-Andaman earthquake. We are optimistic that the research spawned from this disaster, and the unparalleled dense and diverse data sets that have been recorded, can lead to similar, and necessary, improvements in the future.

  16. Monitoring of the Permeable Pavement Demonstration Site at the Edison Environmental Center (Poster)

    EPA Science Inventory

    This is a poster on the permeable pavement parking lot at the Edison Environmental Center. The monitoring scheme for the project is discussed in-depth with graphics explaining the instrumentation installed at the site.

  17. Hatfield Marine Science Center Dynamic Revetment Project DSL permit #45455-FP, Monitoring Report February 2012

    EPA Science Inventory

    A Dynamic Revetment (gravel beach) was installed in November, 2011 on the shoreline along the northeastern edge of the Hatfield Marine Science Center (HMSC) to mitigate erosion that threatened HMSC critical infrastructure. Shoreline topographic and biological monitoring was init...

  18. Hatfield Marine Science Center Dynamic Revetment Project DSL permit #45455-FP, Monitoring Report February, 2013

    EPA Science Inventory

    A Dynamic Revetment (gravel beach) was installed in November, 2011 on the shoreline along the northeastern edge of the Hatfield Marine Science Center (HMSC) to mitigate erosion that threatened HMSC critical infrastructure. Shoreline topographic and biological monitoring was init...

  19. The 2010 Mw 8.8 Maule megathrust earthquake of Central Chile, monitored by GPS.

    PubMed

    Vigny, C; Socquet, A; Peyrat, S; Ruegg, J-C; Métois, M; Madariaga, R; Morvan, S; Lancieri, M; Lacassin, R; Campos, J; Carrizo, D; Bejar-Pizarro, M; Barrientos, S; Armijo, R; Aranda, C; Valderas-Bermejo, M-C; Ortega, I; Bondoux, F; Baize, S; Lyon-Caen, H; Pavez, A; Vilotte, J P; Bevis, M; Brooks, B; Smalley, R; Parra, H; Baez, J-C; Blanco, M; Cimbaro, S; Kendrick, E

    2011-06-17

    Large earthquakes produce crustal deformation that can be quantified by geodetic measurements, allowing for the determination of the slip distribution on the fault. We used data from Global Positioning System (GPS) networks in Central Chile to infer the static deformation and the kinematics of the 2010 moment magnitude (Mw) 8.8 Maule megathrust earthquake. From elastic modeling, we found a total rupture length of ~500 kilometers where slip (up to 15 meters) concentrated on two main asperities situated on both sides of the epicenter. We found that rupture reached shallow depths, probably extending up to the trench. Resolvable afterslip occurred in regions of low coseismic slip. The low-frequency hypocenter is relocated 40 kilometers southwest of initial estimates. Rupture propagated bilaterally at about 3.1 kilometers per second, with possible but not fully resolved velocity variations. PMID:21527673
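
    The inversion step described above — static GPS offsets related to slip on discretized fault patches through elastic Green's functions — reduces to a regularized linear least-squares problem. The sketch below illustrates only that step; the Green's-function matrix, patch count, slip values and noise levels are placeholder assumptions, not those of the Maule study, where the Green's functions would come from an elastic dislocation model.

      # Minimal sketch: damped least-squares inversion of static GPS offsets for
      # fault slip, given a precomputed elastic Green's-function matrix G.
      # All numerical values here are hypothetical.
      import numpy as np

      rng = np.random.default_rng(0)

      n_patches = 20          # fault subdivided into slip patches
      n_obs = 30              # GPS displacement components (E, N, U at ~10 sites)

      # G[i, j] = displacement at observation i per unit slip on patch j.
      G = rng.normal(scale=0.01, size=(n_obs, n_patches))

      true_slip = np.clip(rng.normal(5.0, 4.0, n_patches), 0.0, None)   # metres
      d = G @ true_slip + rng.normal(scale=0.002, size=n_obs)           # offsets + noise

      # Minimise ||G m - d||^2 + lam^2 ||m||^2 (simple damping as regularization).
      lam = 0.01
      A = np.vstack([G, lam * np.eye(n_patches)])
      b = np.concatenate([d, np.zeros(n_patches)])
      slip_est, *_ = np.linalg.lstsq(A, b, rcond=None)

      print("max estimated slip (m):", round(slip_est.max(), 2))

    In practice the regularization (smoothing, positivity constraints) and the fault discretization are chosen to stabilize the solution against the limited station coverage.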

  20. A robust satellite technique for monitoring seismically active areas: The case of Bhuj Gujarat earthquake

    NASA Astrophysics Data System (ADS)

    Genzano, N.; Aliano, C.; Filizzola, C.; Pergola, N.; Tramutoli, V.

    2007-02-01

    A robust satellite data analysis technique (RAT) has recently been proposed as a suitable tool for satellite TIR surveys in seismically active regions and has already been successfully tested on different earthquakes (of both high and medium-low magnitude). In this paper, the efficiency and the potential of the RAT technique are tested when it is applied to a wide area with extremely variable topography, land cover and climatic characteristics (the whole Indian subcontinent). The Bhuj (Gujarat) earthquake, which occurred on 26 January 2001 with MS ~ 7.9, was considered as a test case in the validation phase, while a relatively unperturbed period (no earthquakes with MS ≥ 5 in the same region and period) was analyzed for confutation purposes. To this aim, 6 years of Meteosat-5 TIR observations were processed to characterize the TIR signal behaviour at each specific observation time and location. The anomalous TIR values detected by RAT were evaluated in terms of time-space persistence in order to establish the existence of genuinely significant anomalous transients. The results indicate that the studied area was affected by significant positive thermal anomalies, identified at different intensity levels not far from the Gujarat coast (since 15 January, but with clearer evidence on 22 January) and near the epicentral area (mainly on 21 January). On 25 January (1 day before the Gujarat earthquake), significant TIR anomalies appeared over the northern Indian subcontinent, showing a remarkable coincidence with the principal tectonic lineaments of the region (the Himalayan thrust boundary). On the other hand, the results of the confutation analysis indicate that no meaningful TIR anomalies appear in the absence of seismic events with MS ≥ 5.
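
    The core of a RAT-type analysis is a pixel-by-pixel comparison of the current TIR value with the multi-year mean and standard deviation computed at the same location and acquisition time, followed by persistence checks on the flagged pixels. The sketch below shows only that normalization step, with a synthetic image stack standing in for the Meteosat-5 archive; the threshold and image sizes are illustrative assumptions.

      # Minimal sketch of a RAT-style normalised TIR anomaly index: compare each
      # pixel of a new image with the multi-year mean and standard deviation at
      # the same location and acquisition time. Data are synthetic.
      import numpy as np

      rng = np.random.default_rng(1)

      years, ny, nx = 6, 50, 50
      stack = rng.normal(290.0, 2.0, size=(years, ny, nx))   # brightness temperatures (K)

      mean_field = stack.mean(axis=0)
      std_field = stack.std(axis=0, ddof=1)

      current = rng.normal(290.0, 2.0, size=(ny, nx))
      current[20:25, 30:35] += 6.0                            # injected warm patch for illustration

      index = (current - mean_field) / std_field              # normalised anomaly
      anomalous = index > 3.0                                  # space-time persistence checks follow

      print("pixels above threshold:", int(anomalous.sum()))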

  1. NCEER seminars on earthquakes

    USGS Publications Warehouse

    Pantelic, J.

    1987-01-01

    In May of 1986, the National Center for Earthquake Engineering Research (NCEER) in Buffalo, New York, held the first seminar in its new monthly forum called Seminars on Earthquakes. The Center's purpose in initiating the seminars was to educate the audience about earthquakes, to facilitate cooperation between the NCEER and visiting researchers, and to enable visiting speakers to learn more about the NCEER   

  2. A summary of ground motion effects at SLAC (Stanford Linear Accelerator Center) resulting from the Oct 17th 1989 earthquake

    SciTech Connect

    Ruland, R.E.

    1990-08-01

    Ground motions resulting from the October 17th 1989 (Loma Prieta) earthquake are described and can be correlated with some geologic features of the SLAC site. Recent deformations of the linac are also related to slow motions observed over the past 20 years. Measured characteristics of the earthquake are listed. Some effects on machine components and detectors are noted. 18 refs., 16 figs.

  3. Catalog of earthquake hypocenters at Alaskan volcanoes: January 1 through December 31, 2010

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.; Searcy, Cheryl K.

    2011-01-01

    Between January 1 and December 31, 2010, the Alaska Volcano Observatory (AVO) located 3,405 earthquakes, of which 2,846 occurred within 20 kilometers of the 33 volcanoes with seismograph subnetworks. There was no significant seismic activity in 2010 at these monitored volcanic centers. Seismograph subnetworks with severe outages in 2009 were repaired in 2010 resulting in three volcanic centers (Aniakchak, Korovin, and Veniaminof) being relisted in the formal list of monitored volcanoes. This catalog includes locations and statistics of the earthquakes located in 2010 with the station parameters, velocity models, and other files used to locate these earthquakes.

  4. GREENHOUSE GAS (GHG) MITIGATION AND MONITORING TECHNOLOGY PERFORMANCE: ACTIVITIES OF THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper discusses greenhouse gas (GHG) mitigation and monitoring technology performance activities of the GHG Technology Verification Center. The Center is a public/private partnership between Southern Research Institute and the U.S. EPA's Office of Research and Development. It...

  5. A framework for rapid post-earthquake assessment of bridges and restoration of transportation network functionality using structural health monitoring

    NASA Astrophysics Data System (ADS)

    Omenzetter, Piotr; Ramhormozian, Shahab; Mangabhai, Poonam; Singh, Ravikash; Orense, Rolando

    2013-04-01

    Quick and reliable assessment of the condition of bridges in a transportation network after an earthquake can greatly assist immediate post-disaster response and long-term recovery. However, experience shows that available resources, such as qualified inspectors and engineers, will typically be stretched for such tasks. Structural health monitoring (SHM) systems can therefore make a real difference in this context. SHM, however, needs to be deployed in a strategic manner and integrated into the overall disaster response plans and actions to maximize its benefits. This study presents, in its first part, a framework of how this can be achieved. Since it will not be feasible, or indeed necessary, to use SHM on every bridge, it is necessary to prioritize bridges within individual networks for SHM deployment. A methodology for such prioritization based on structural and geotechnical seismic risks affecting bridges and their importance within a network is proposed in the second part. An example applying the methodology to selected bridges in the medium-sized transportation network of Wellington, New Zealand, is provided. The third part of the paper is concerned with using monitoring data for quick assessment of bridge condition and damage after an earthquake. Depending on the bridge risk profile, it is envisaged that data will be obtained from either local or national seismic monitoring arrays or SHM systems installed on bridges. A method using artificial neural networks is proposed to infer key ground motion parameters at an arbitrary bridge site from seismic array data. The methodology is applied to seismic data collected in Christchurch, New Zealand. Finally, how such ground motion parameters can be used in bridge damage and condition assessment is outlined.
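
    As a rough illustration of the third part — inferring a ground-motion parameter at a bridge site from array recordings with an artificial neural network — the sketch below fits a small regression network to synthetic event data. The network size, the choice of features (log station PGA), and the data are assumptions made for illustration, not the configuration or dataset used in the study.

      # Minimal sketch: regress a ground-motion parameter (here PGA) at a target
      # site from PGA values recorded by nearby array stations, using a small
      # neural network. The synthetic "events" stand in for a real catalogue.
      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(2)

      n_events, n_stations = 400, 8
      station_pga = rng.lognormal(mean=-2.0, sigma=0.6, size=(n_events, n_stations))
      # Hypothetical target-site PGA: weighted combination of station PGAs plus noise.
      weights = rng.uniform(0.05, 0.2, n_stations)
      site_pga = station_pga @ weights * rng.lognormal(0.0, 0.1, n_events)

      X_train, X_test, y_train, y_test = train_test_split(
          np.log(station_pga), np.log(site_pga), random_state=0)

      model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
      model.fit(X_train, y_train)
      print("R^2 on held-out events:", round(model.score(X_test, y_test), 3))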

  6. (Stanford Linear Accelerator Center) annual environmental monitoring report, January--December 1989

    SciTech Connect

    Not Available

    1990-05-01

    This progress report discusses environmental monitoring activities at the Stanford Linear Accelerator Center for 1989. Topics include climate, site geology, site water usage, land use, demography, unusual events or releases, radioactive and nonradioactive releases, compliance summary, environmental nonradiological program information, environmental radiological program information, groundwater protection monitoring and quality assurance. 5 figs., 7 tabs. (KJD)

  7. Analysis in natural time domain of geoelectric time series monitored prior two strong earthquakes occurred in Mexico

    NASA Astrophysics Data System (ADS)

    Ramírez-Rojas, A.; Flores-Marquez, L. E.

    2009-12-01

    The short-time prediction of seismic phenomena is currently an important problem for the scientific community. In particular, the electromagnetic processes associated with seismic events have attracted great interest since the VAN method was implemented. The most important features of this methodology are the seismic electric signals (SES) observed prior to strong earthquakes. SES have been observed in electromagnetic series linked to EQs in Greece, Japan and Mexico. By means of the so-called natural time domain, introduced by Varotsos et al. (2001), it is possible to characterize signals of dichotomic nature observed in different systems, such as SES and ionic current fluctuations in membrane channels. In this work we analyze SES observed in geoelectric time series monitored in Guerrero, México. Our analysis concerns two strong earthquakes that occurred on October 24, 1993 (M=6.6) and September 14, 1995 (M=7.3). The time series of the first one displayed a seismic electric signal six days before the main shock, and for the second case the time series displayed dichotomous-like fluctuations some months before the EQ. In this work we present the first results of the analysis in the natural time domain for the two cases, which seem to agree with the results reported by Varotsos. P. Varotsos, N. Sarlis, and E. Skordas, Practica of the Athens Academy 76, 388 (2001).
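
    In the natural time domain of Varotsos et al. (2001), the k-th of N pulses in a dichotomous signal is read at the "time" chi_k = k/N and weighted by its normalized energy or duration p_k, and the series is summarized by the variance kappa_1 = <chi^2> - <chi>^2. The sketch below computes kappa_1 for a synthetic pulse train; the pulse durations are invented for illustration, not the Guerrero data.

      # Minimal sketch of natural-time analysis for a dichotomous (SES-like) signal:
      # the k-th pulse maps to chi_k = k/N, weighted by its normalised duration p_k,
      # and the series is characterised by kappa_1 = <chi^2> - <chi>^2.
      import numpy as np

      rng = np.random.default_rng(3)

      durations = rng.exponential(scale=1.0, size=25)   # hypothetical pulse durations Q_k
      N = durations.size

      p = durations / durations.sum()                   # p_k: normalised "energy" of pulse k
      chi = np.arange(1, N + 1) / N                     # natural time chi_k = k/N

      kappa_1 = np.sum(p * chi**2) - np.sum(p * chi)**2
      print("kappa_1 =", round(kappa_1, 4))             # ~0.070 is the value reported for SES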

  8. Improved earthquake monitoring in the central and eastern United States in support of seismic assessments for critical facilities

    USGS Publications Warehouse

    Leith, William S.; Benz, Harley M.; Herrmann, Robert B.

    2011-01-01

    Evaluation of seismic monitoring capabilities in the central and eastern United States for critical facilities - including nuclear power plants - focused on specific improvements to better understand the seismic hazards in the region. The report is not an assessment of seismic safety at nuclear plants. To accomplish the evaluation and to provide suggestions for improvements using funding from the American Recovery and Reinvestment Act of 2009, the U.S. Geological Survey examined the addition of new strong-motion seismic stations in areas of seismic activity and of new seismic stations near nuclear power-plant locations, along with integration of data from the Transportable Array of some 400 mobile seismic stations. Some 38 and 68 stations, respectively, were suggested for addition in active seismic zones and near power-plant locations. Expansion of databases for strong-motion and other earthquake source-characterization data was also evaluated. Recognizing pragmatic limitations of station deployment, augmentation of existing deployments provides improvements in source characterization by quantifying near-source attenuation in regions where larger earthquakes are expected. That augmentation also supports systematic data collection from existing networks. The report further describes the application of modeling procedures and processing algorithms, with the additional stations and the improved seismic databases, to leverage the capabilities of existing and expanded seismic arrays.

  9. Structural Health Monitoring Sensor Development at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Prosser, W. H.; Wu, M. C.; Allison, S. G.; DeHaven, S. L.; Ghoshal, A.

    2002-01-01

    NASA is applying considerable effort on the development of sensor technology for structural health monitoring (SHM). This research is targeted toward increasing the safety and reliability of aerospace vehicles, while reducing operating and maintenance costs. Research programs are focused on applications to both aircraft and space vehicles. Sensor technologies under development span a wide range including fiber-optic sensing, active and passive acoustic sensors, electromagnetic sensors, wireless sensing systems, MEMS, and nanosensors. Because of their numerous advantages for aerospace applications, fiber-optic sensors are one of the leading candidates and are the major focus of this presentation. In addition, recent advances in active and passive acoustic sensing will also be discussed.

  10. Monitoring the Corniglio Landslide (Parma, Italy) before and after the M=5.4 earthquake of December 2008

    NASA Astrophysics Data System (ADS)

    Virdis, S.; Guastaldi, E.; Rindinella, A.; Disperati, L.; Ciulli, A.

    2009-04-01

    In this work we present the results of monitoring the Corniglio landslide (CL), a large landslide located in the Northern Apennines, by integrating traditional geomorphological and geological surveys, digital photogrammetry, GPS and geostatistics. The CL spreads over an area of about 3 km x 1 km, close to the Corniglio village (Parma, Italy). We propose a new kinematic framework for the CL as a Deep-Seated Gravitational Slope Deformation (DSGSD). Surveys were carried out in six periods: July and September 2006, March and August 2007, July 2008 (after a M=4 earthquake on 28 December 2007, 10 km from Corniglio), and finally January 2009 (after several earthquakes occurred in the last days of December 2008, with magnitudes from 4 to 5.4 and epicentres located less than 30 km from Corniglio). Geological survey, interpretation of orthophotographs from 1976, 1988, 1994, 1996, 1998 and 2005, and satellite imagery from 2003 were integrated to analyse the state of activity of the landslide from 1976 to 2009, quantifying the ground displacement vectors. An RTK GPS survey was periodically carried out in order to locate the crown of the main landslide scarp and to identify reactivation of the CL after the earthquakes at the end of December 2008. Then, kriged multitemporal maps representing the azimuth and modulus of the ground displacement vectors were built by evaluating the displacement with time of homologous ground targets on the multitemporal remotely sensed images. Measurement of ground deformations was performed on imagery for the periods December 1994 to July 1996 and October to November 1996, as well as for the recurrent activity from October 1998 to 2003. In some sectors of the main body of the landslide we estimated a total of 70 m of ground displacement. The fieldwork results and photogeologic interpretation performed along the Bratica valley, to the east of the CL, suggest that the occurrence of rigid behaviour lithotypes (Mt. Caio calcareous

  11. Research on geo-electrical resistivity observation system specially used for earthquake monitoring in China

    NASA Astrophysics Data System (ADS)

    Zhao, Jialiu; Wang, Lanwei; Qian, Jiadong

    2011-12-01

    This paper deals with the design and development of an observational system for geo-electrical resistivity, based on the need to explore temporal variations of the electrical properties of Earth media at the fixed points of the networks, variations which may be associated with earthquake preparation. The observation system is characterized by high measurement accuracy, long-term operational stability and a high level of rejection of environmental interference. It consists of three main parts: the configuration system, the measurement system, and the calibration and inspection system.

  12. Radon progeny monitoring at the Eastern North Atlantic (ENA), Graciosa Island ARM facility and a potential earthquake precursory signal

    NASA Astrophysics Data System (ADS)

    Barbosa, Susana; Mendes, Virgilio B.; Azevedo, Eduardo B.

    2016-04-01

    Radon has been considered a promising earthquake precursor, the main rationale being an expected increase in radon exhalation from soil and rocks due to stress associated with the preparatory stages of an earthquake. However, the precursory nature of radon is far from being convincingly demonstrated so far. A major hindrance is the many meteorological and geophysical factors driving radon temporal variability, including the geophysical parameters influencing its emanation (grain size, moisture content, temperature) as well as the meteorological factors (atmospheric pressure, moisture, temperature, winds) influencing its mobility. Despite these challenges, radon remains one of the strongest candidates as a potential earthquake precursor, and it is of crucial importance to investigate the many factors driving its variability and its potential association with seismic events. Continuous monitoring of radon progeny is performed at the Eastern North Atlantic (ENA) facility located on Graciosa island (Azores, 39N; 28W), a fixed site of the Atmospheric Radiation Measurement (ARM) programme, established and supported by the Department of Energy (DOE) of the United States of America with the collaboration of the local government and the University of the Azores. The Azores archipelago is associated with a complex geodynamic setting at the Azores triple junction, where the American, Eurasian and African lithospheric plates meet, resulting in significant seismic and volcanic activity. A considerable advantage of the monitoring site is the availability of a comprehensive dataset of concurrent meteorological observations performed at the ENA facility and freely available from the ARM data archive, enabling a detailed analysis of the environmental factors influencing the temporal variability of radon's progeny. Gamma radiation has been measured continuously every 15 minutes since May 2015. The time series of gamma radiation counts is dominated by sharp peaks lasting a few hours and

  13. Seismic Monitoring and Post-Seismic Investigations following the 12 January 2010 Mw 7.0 Haiti Earthquake (Invited)

    NASA Astrophysics Data System (ADS)

    Altidor, J.; Dieuseul, A.; Ellsworth, W. L.; Given, D. D.; Hough, S. E.; Janvier, M. G.; Maharrey, J. Z.; Meremonte, M. E.; Mildor, B. S.; Prepetit, C.; Yong, A.

    2010-12-01

    We report on ongoing efforts to establish seismic monitoring in Haiti. Following the devastating M7.0 Haiti earthquake of 12 January 2010, the Bureau des Mines et de l’Energie worked with the U.S. Geological Survey and other scientific institutions to investigate the earthquake and to better assess hazard from future earthquakes. We deployed several types of portable instruments to record aftershocks: strong-motion instruments within Port-au-Prince to investigate the variability of shaking due to local geological conditions, and a combination of weak-motion, strong-motion, and broadband instruments around the Enriquillo-Plaintain Garden fault (EPGF), primarily to improve aftershock locations and to lower the magnitude threshold of aftershock recording. A total of twenty instruments were deployed, including eight RefTek instruments and nine strong-motion (K2) accelerometers deployed in Port-au-Prince in collaboration with the USGS, and three additional broadband stations deployed in the epicentral region in collaboration with the University of Nice. Five K2s have remained in operation in Port-au-Prince since late June; in late June two instruments were installed in Cap-Haitien and Port de Paix in northern Haiti to provide monitoring of the Septentrional fault. A permanent strong-motion (NetQuakes) instrument was deployed in late June at the US Embassy. Five additional NetQuakes instruments will be deployed by the BME in late 2010/early 2011. Additionally, the BME has collaborated with other scientific institutions, including Columbia University, the Institut Géophysique du Globe, University of Nice, the University of Texas at Austin, and Purdue University, to conduct other types of investigations. These studies include, for example, sampling of uplifted corals to establish a chronology of prior events in the region of the Enriquillo-Plantain Garden fault, surveys of geotechnical properties to develop microzonation maps of metropolitan Port-au-Prince, surveys of

  14. On the Potential Uses of Static Offsets Derived From Low-Cost Community Instruments and Crowd-Sourcing for Earthquake Monitoring and Rapid Response

    NASA Astrophysics Data System (ADS)

    Minson, S. E.; Brooks, B. A.; Murray, J. R.; Iannucci, R. A.

    2013-12-01

    We explore the efficacy of low-cost community instruments (LCCIs) and crowd-sourcing for producing rapid estimates of earthquake magnitude and rupture characteristics that can be used for earthquake loss reduction, such as issuing tsunami warnings and guiding rapid response efforts. Real-time high-rate GPS data are just beginning to be incorporated into earthquake early warning (EEW) systems. These data are showing promising utility, including moment magnitude estimates that do not saturate for the largest earthquakes and real-time determination of the geometry and slip distribution of the earthquake rupture. However, building a network of scientific-quality real-time high-rate GPS stations requires substantial infrastructure investment, which is not practicable in many parts of the world. To expand the benefits of real-time geodetic monitoring globally, we consider the potential of pseudorange-based GPS locations, such as the real-time positioning done onboard cell phones or on LCCIs that could be distributed in the same way accelerometers are distributed as part of the Quake Catcher Network (QCN). While location information from LCCIs often has large uncertainties, the low cost means that large numbers of instruments can be deployed. A monitoring network that includes smartphones could collect data from potentially millions of instruments. These observations could be averaged together to substantially decrease the errors associated with estimated earthquake source parameters. While these data will be inferior to data recorded by scientific-grade seismometers and GPS instruments, there are features of community-based data collection (and possibly analysis) that are very attractive. This approach creates a system where every user can host an instrument or download an application to their smartphone that both provides them with earthquake and tsunami warnings and provides the data on which the warning system operates. This symbiosis helps to encourage
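
    The key statistical argument — that averaging many noisy, independent offset estimates shrinks the error of the mean roughly as 1/sqrt(N) — can be illustrated with a few lines of simulation. The offset magnitude and per-device error used below are illustrative assumptions, not measured values from the study.

      # Minimal sketch: independent errors average down roughly as 1/sqrt(N), so many
      # metre-level smartphone/LCCI offset estimates can yield a decimetre-level mean.
      import numpy as np

      rng = np.random.default_rng(4)

      true_offset_m = 0.8          # hypothetical coseismic static offset
      sigma_single_m = 3.0         # per-device positioning error (illustrative)

      for n_devices in (10, 100, 1000, 10000):
          obs = true_offset_m + rng.normal(0.0, sigma_single_m, n_devices)
          print(f"N={n_devices:6d}  mean={obs.mean():6.2f} m  "
                f"expected std of mean={sigma_single_m / np.sqrt(n_devices):5.2f} m")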

  15. Monitoring shallow resistivity changes prior to the 12 May 2008 M 8.0 Wenchuan earthquake on the Longmen Shan tectonic zone, China

    NASA Astrophysics Data System (ADS)

    Lu, Jun; Xie, Tao; Li, Mei; Wang, Yali; Ren, Yuexia; Gao, Shude; Wang, Lanwei; Zhao, Jialiu

    2016-04-01

    An active source measurement of shallow resistivity using fixed-electrode quasi-Schlumberger arrays has been conducted at Pixian, Jiangyou and Wudu stations on the Longmen Shan tectonic zone in western China, with the hope of detecting earthquake-associated changes. For the duration of the monitoring experiment, a gradual decrease of apparent resistivity of up to 6.7% several years prior to the 12 May 2008 M 8.0 Wenchuan earthquake had been recorded clearly at Pixian station, approximately 35 km from the epicenter. The change of apparent resistivity was monitored with a fixed Schlumberger array of AB/MN spacings of 736 m/226 m in the direction of N57.5°E, giving precisions in measured daily averages of 0.16% or less. A coseismic resistivity drop of up to 5.3% was observed at Jiangyou station, using a Schlumberger array of AB/MN spacings of 710 m/90 m in the direction of N10°E. No fluctuation of resistivity was detected at Wudu station at the time of the Wenchuan mainshock. While the focus of this paper is on monitoring or tracking resistivity variations prior to, during, and after the Wenchuan earthquake, we also aim to compare resistivity records of the Wenchuan earthquake to those of the M 7.8 Tangshan and M 7.2 Songpan earthquakes of 1976. Attempts to explain the observed resistivity variations have been made. The results show that the resistivity variations observed at all three stations are in approximate agreement with resistivity-stress behavior deduced from in situ experiments, focal mechanisms, a simplified dynamical model, static stress analyses, and field investigations from along the Longmen Shan fault zone.
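
    For a symmetric Schlumberger array such as those described, the apparent resistivity follows from the injected current, the measured potential difference, and a purely geometric factor: rho_a = K * dV / I, with K = pi * (L^2 - l^2) / (2 * l), where L = AB/2 and l = MN/2. The sketch below evaluates this for the Pixian spacings quoted above; the current and voltage values are hypothetical, chosen only to show how a 6.7% resistivity change would enter the calculation.

      # Minimal sketch: apparent resistivity for a symmetric Schlumberger array,
      # rho_a = K * dV / I, with geometric factor K = pi * (L**2 - l**2) / (2 * l).
      # Spacings follow the Pixian array quoted above; dV and I are hypothetical.
      import math

      AB, MN = 736.0, 226.0          # electrode spacings in metres
      L, l = AB / 2.0, MN / 2.0

      K = math.pi * (L**2 - l**2) / (2.0 * l)      # geometric factor (m)

      I = 2.0                        # injected current (A), illustrative
      dV = 0.05                      # measured potential difference (V), illustrative

      rho_a = K * dV / I             # apparent resistivity (ohm-m)
      print(f"K = {K:.1f} m, rho_a = {rho_a:.1f} ohm-m")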

  16. Earthquake Testing

    NASA Technical Reports Server (NTRS)

    1979-01-01

    During NASA's Apollo program, it was necessary to subject the mammoth Saturn V launch vehicle to extremely forceful vibrations to assure the moon booster's structural integrity in flight. Marshall Space Flight Center assigned vibration testing to a contractor, the Scientific Services and Systems Group of Wyle Laboratories, Norco, California. Wyle-3S, as the group is known, built a large facility at Huntsville, Alabama, and equipped it with an enormously forceful shock and vibration system to simulate the liftoff stresses the Saturn V would encounter. Saturn V is no longer in service, but Wyle-3S has found spinoff utility for its vibration facility. It is now being used to simulate earthquake effects on various kinds of equipment, principally equipment intended for use in nuclear power generation. Government regulations require that such equipment demonstrate its ability to survive earthquake conditions. In the upper left photo, Wyle-3S is preparing to conduct an earthquake test on a 25-ton diesel generator built by Atlas Polar Company, Ltd., Toronto, Canada, for emergency use in a Canadian nuclear power plant. Being readied for test in the lower left photo is a large circuit breaker to be used by Duke Power Company, Charlotte, North Carolina. Electro-hydraulic and electro-dynamic shakers in and around the pit simulate earthquake forces.

  17. Permeable Pavement Monitoring at the Edison Environmental Center Demonstration Site - presentation

    EPA Science Inventory

    The EPA’s Urban Watershed Management Branch has been monitoring an instrumented 110-space pervious pavement parking lot. The lot is used by EPA personnel and visitors to the Edison Environmental Center. The design includes 28-space rows of three permeable pavement types: asphal...

  18. Permeable Pavement Monitoring at the Edison Environmental Center Demonstration Site - Abstract

    EPA Science Inventory

    The EPA’s Urban Watershed Management Branch (UWMB) is monitoring an instrumented, working, 110-space pervious pavement parking at EPA’s Edison Environmental Center (EEC). Permeable pavement systems are classified as stormwater best management practices (BMPs) which reduce runo...

  19. Lessons learned from the introduction of autonomous monitoring to the EUVE science operations center

    NASA Technical Reports Server (NTRS)

    Lewis, M.; Girouard, F.; Kronberg, F.; Ringrose, P.; Abedini, A.; Biroscak, D.; Morgan, T.; Malina, R. F.

    1995-01-01

    The University of California at Berkeley's (UCB) Center for Extreme Ultraviolet Astrophysics (CEA), in conjunction with NASA's Ames Research Center (ARC), has implemented an autonomous monitoring system in the Extreme Ultraviolet Explorer (EUVE) science operations center (ESOC). The implementation was driven by a need to reduce operations costs and has allowed the ESOC to move from continuous, three-shift, human-tended monitoring of the science payload to a one-shift operation in which the off shifts are monitored by an autonomous anomaly detection system. This system includes Eworks, an artificial intelligence (AI) payload telemetry monitoring package based on RTworks, and Epage, an automatic paging system to notify ESOC personnel of detected anomalies. In this age of shrinking NASA budgets, the lessons learned on the EUVE project are useful to other NASA missions looking for ways to reduce their operations budgets. The process of knowledge capture, from the payload controllers for implementation in an expert system, is directly applicable to any mission considering a transition to autonomous monitoring in their control center. The collaboration with ARC demonstrates how a project with limited programming resources can expand the breadth of its goals without incurring the high cost of hiring additional, dedicated programmers. This dispersal of expertise across NASA centers allows future missions to easily access experts for collaborative efforts of their own. Even the criterion used to choose an expert system has widespread impacts on the implementation, including the completion time and the final cost. In this paper we discuss, from inception to completion, the areas where our experiences in moving from three shifts to one shift may offer insights for other NASA missions.

  20. Environmental assessment of the Carlsbad Environmental Monitoring and Research Center Facility

    SciTech Connect

    1995-10-01

    This Environmental Assessment has been prepared to determine if the Carlsbad Environmental Monitoring and Research Center (the Center), or its alternatives would have significant environmental impacts that must be analyzed in an Environmental Impact Statement. DOE's proposed action is to continue funding the Center. While DOE is not funding construction of the planned Center facility, operation of that facility is dependent upon continued funding. To implement the proposed action, the Center would initially construct a facility of approximately 2,300 square meters (25,000 square feet). The Phase 1 laboratory facilities and parking lot will occupy approximately 1.2 hectares (3 acres) of approximately 8.9 hectares (22 acres) of land which were donated to New Mexico State University (NMSU) for this purpose. The facility would contain laboratories to analyze chemical and radioactive materials typical of potential contaminants that could occur in the environment in the vicinity of the DOE Waste Isolation Pilot Plant (WIPP) site or other locations. The facility also would have bioassay facilities to measure radionuclide levels in the general population and in employees of the WIPP. Operation of the Center would meet the DOE requirement for independent monitoring and assessment of environmental impacts associated with the planned disposal of transuranic waste at the WIPP.

  1. Program Evaluation of Remote Heart Failure Monitoring: Healthcare Utilization Analysis in a Rural Regional Medical Center

    PubMed Central

    Keberlein, Pamela; Sorenson, Gigi; Mohler, Sailor; Tye, Blake; Ramirez, A. Susana; Carroll, Mark

    2015-01-01

    Background: Remote monitoring for heart failure (HF) has had mixed and heterogeneous effects across studies, necessitating further evaluation of remote monitoring systems within specific healthcare systems and their patient populations. “Care Beyond Walls and Wires,” a wireless remote monitoring program to facilitate patient and care team co-management of HF patients, served by a rural regional medical center, provided the opportunity to evaluate the effects of this program on healthcare utilization. Materials and Methods: Fifty HF patients admitted to Flagstaff Medical Center (Flagstaff, AZ) participated in the project. Many of these patients lived in underserved and rural communities, including Native American reservations. Enrolled patients received mobile, broadband-enabled remote monitoring devices. A matched cohort was identified for comparison. Results: HF patients enrolled in this program showed substantial and statistically significant reductions in healthcare utilization during the 6 months following enrollment, and these reductions were significantly greater compared with those who declined to participate but not when compared with a matched cohort. Conclusions: The findings from this project indicate that a remote HF monitoring program can be successfully implemented in a rural, underserved area. Reductions in healthcare utilization were observed among program participants, but reductions were also observed among a matched cohort, illustrating the need for rigorous assessment of the effects of HF remote monitoring programs in healthcare systems. PMID:25025239

  2. Real-time Implementation of the Waveloc Technique for Monitoring Earthquake Swarms

    NASA Astrophysics Data System (ADS)

    Maggi, A.; Langet, N.; Michelini, A.

    2013-12-01

    Monitoring regions with high swarm-type seismicity (e.g. volcanoes, certain tectonic regions) is a challenge for the traditional pick-associate-locate type algorithms that form the basis of most seismicity monitoring software. Over the past few years, new approaches that avoid the association phase by direct migration of some characteristic function of the recorded seismograms have started to be implemented, and have shown great promise (see related abstract on the Waveloc method applied to Piton de la Fournaise volcano). Implementing such methods in real-time is an essential step in proving their usefulness and robustness in swarm-monitoring situations. Here we describe the work in progress on adapting the Waveloc migration technique to real-time operation. The resulting software package, RT-Waveloc, is currently in the prototype stage, and we hope to have a version that can be distributed to the scientific community for beta-testing within a year. The development of RT-Waveloc is financed by the EU NERA project.
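
    The migration idea sketched in the abstract — back-shift a characteristic function from each station by the travel time to a trial source, stack, and take the maximum of the stack over trial locations and origin times — can be written compactly. The example below is a toy 2-D grid search with synthetic pulses and a constant velocity; it illustrates the principle only and is not the Waveloc or RT-Waveloc code.

      # Minimal sketch of migration-based detection/location: back-shift each
      # station's characteristic function by the travel time from a trial source,
      # stack, and take the maximum over trial locations and origin times.
      import numpy as np

      rng = np.random.default_rng(5)

      dt, nt = 0.01, 4000                      # 40 s of "characteristic function" at 100 Hz
      t = np.arange(nt) * dt
      v = 3.0                                   # assumed propagation velocity (km/s)

      stations = np.array([[0.0, 0.0], [20.0, 0.0], [0.0, 20.0], [20.0, 20.0], [10.0, 25.0]])  # km
      true_src, true_t0 = np.array([12.0, 7.0]), 10.0

      def travel_time(src, sta):
          return np.linalg.norm(src - sta) / v

      # Synthetic characteristic functions: a Gaussian pulse at each arrival time plus noise.
      cf = rng.normal(0.0, 0.05, size=(len(stations), nt))
      for i, sta in enumerate(stations):
          cf[i] += np.exp(-0.5 * ((t - true_t0 - travel_time(true_src, sta)) / 0.3) ** 2)

      # Grid search: stack the back-shifted characteristic functions.
      xs = ys = np.arange(0.0, 26.0, 1.0)
      best = (-np.inf, None, None)
      for x in xs:
          for y in ys:
              src = np.array([x, y])
              shifts = [int(round(travel_time(src, sta) / dt)) for sta in stations]
              n = nt - max(shifts)
              stack = sum(cf[i, s:s + n] for i, s in enumerate(shifts))
              k = int(np.argmax(stack))
              if stack[k] > best[0]:
                  best = (stack[k], (x, y), k * dt)

      print("located at", best[1], "origin time ~", round(best[2], 2), "s")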

  3. Earthquake history of Oregon

    USGS Publications Warehouse

    von Hake, C. A.

    1976-01-01

    Although situated between two States (California and Washington) that have has many violent earthquakes, Oregon is noticeably less active seismically. the greatest damage experienced resulted from a major shock near Olympia, Wash., in 1949. During the short history record available (since 1841), 34 earthquakes of intensity V, Modified Mercalli Scale, or greater have centered within Oregon or near its borders. Only 13 of the earthquakes had an intensity above V, and many of the shocks were local. However, a 1936 earthquake in the eastern Oregon-Washington region caused extensive damage and was felt over an area of 272,000 square kilometers. 

  4. Real-Time seismic waveforms monitoring with BeiDou Navigation Satellite System (BDS) observations for the 2015 Mw 7.8 Nepal earthquake

    NASA Astrophysics Data System (ADS)

    Geng, T.

    2015-12-01

    Nowadays more and more high-rate Global Navigation Satellite System (GNSS) data become available in real time, providing more opportunities to monitor seismic waveforms. China's GNSS, the BeiDou Navigation Satellite System (BDS), already satisfies the requirement of stand-alone precise positioning in the Asia-Pacific region with 14 in-orbit satellites, which suggests that BDS could be applied to high-precision earthquake monitoring in the same way as GPS. In the present paper, real-time monitoring of seismic waveforms using BDS measurements is assessed. We investigate a so-called "variometric" approach to measure real-time seismic waveforms with high-rate BDS observations. This approach is based on a time-difference technique and standard broadcast products that are routinely available in real time. The 1 Hz BDS data recorded by the BeiDou Experimental Tracking Stations (BETS) during the 2015 Mw 7.8 Nepal earthquake are analyzed. The results indicate that the accuracies of velocity estimation from BDS are 2-3 mm/s in the horizontal components and 8-9 mm/s in the vertical component, which is consistent with GPS. The seismic velocity waveforms during the earthquake show good agreement between BDS and GPS. Moreover, the displacement waveforms are reconstructed by integration of the velocity time series with trend removal. Displacement waveforms with an accuracy of 1-2 cm are obtained by comparison with post-processed GPS precise point positioning (PPP).
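
    The last processing step described — integrating the variometric velocity series and removing the trend that small velocity biases accumulate — is shown in the sketch below with a synthetic 1 Hz velocity waveform; the signal shape, bias and noise levels are assumptions for illustration, not BETS observations.

      # Minimal sketch: recover a displacement waveform from an epoch-by-epoch
      # velocity series (as produced by the variometric approach) by numerical
      # integration and removal of the linear drift that integrated biases produce.
      import numpy as np

      rng = np.random.default_rng(6)

      dt = 1.0                                   # 1 Hz sampling
      t = np.arange(0.0, 300.0, dt)

      # Synthetic "true" ground velocity: a damped oscillation starting at t = 100 s,
      # plus a small constant bias and noise mimicking variometric estimates (m/s).
      sig = np.exp(-(t - 100.0) / 40.0) * np.sin(2 * np.pi * 0.1 * (t - 100.0))
      true_vel = 0.05 * np.where(t > 100.0, sig, 0.0)
      vel = true_vel + 0.002 + rng.normal(0.0, 0.003, t.size)

      disp_raw = np.cumsum(vel) * dt             # integrate velocity -> displacement

      # Remove the linear trend accumulated by the bias (as the abstract describes).
      coeffs = np.polyfit(t, disp_raw, 1)
      disp = disp_raw - np.polyval(coeffs, t)

      print("peak displacement (m):", round(np.abs(disp).max(), 3))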

  5. [Computerized monitoring system in the operating center with UNIX and X-window].

    PubMed

    Tanaka, Y; Hashimoto, S; Chihara, E; Kinoshita, T; Hirose, M; Nakagawa, M; Murakami, T

    1992-01-01

    We previously reported the fully automated data logging system in the operating center. Recently, we revised the system using a highly integrated operating system, UNIX, instead of OS/9. With this multi-task, multi-window (X-window) system, we can monitor all 12 rooms in the operating center at once. The system in the operating center consists of 2 computers, a SONY NEWS1450 (UNIX workstation) and a Sord M223 (CP/M, data logger). On the bitmapped display of the workstation, using X-window, the data from all the operating rooms can be visualized. Furthermore, 2 other minicomputers (a Fujitsu A50 in the conference room and an A60 in the ICU) and a workstation (Sun3-80 in the ICU) were connected via Ethernet. With the remote login function (NFS), we can easily obtain data during an operation from outside the operating center. This system works automatically and needs no routine maintenance.

  6. A Novel Cloud-Based Service Robotics Application to Data Center Environmental Monitoring.

    PubMed

    Russo, Ludovico Orlando; Rosa, Stefano; Maggiora, Marcello; Bona, Basilio

    2016-01-01

    This work presents a robotic application aimed at performing environmental monitoring in data centers. Due to the high energy density managed in data centers, environmental monitoring is crucial for controlling air temperature and humidity throughout the whole environment, in order to improve power efficiency, avoid hardware failures and maximize the life cycle of IT devices. State of the art solutions for data center monitoring are nowadays based on environmental sensor networks, which continuously collect temperature and humidity data. These solutions are still expensive and do not scale well in large environments. This paper presents an alternative to environmental sensor networks that relies on autonomous mobile robots equipped with environmental sensors. The robots are controlled by a centralized cloud robotics platform that enables autonomous navigation and provides a remote client user interface for system management. From the user point of view, our solution simulates an environmental sensor network. The system can easily be reconfigured in order to adapt to management requirements and changes in the layout of the data center. For this reason, it is called the virtual sensor network. This paper discusses the implementation choices with regards to the particular requirements of the application and presents and discusses data collected during a long-term experiment in a real scenario. PMID:27509505

  7. A Novel Cloud-Based Service Robotics Application to Data Center Environmental Monitoring.

    PubMed

    Russo, Ludovico Orlando; Rosa, Stefano; Maggiora, Marcello; Bona, Basilio

    2016-08-08

    This work presents a robotic application aimed at performing environmental monitoring in data centers. Due to the high energy density managed in data centers, environmental monitoring is crucial for controlling air temperature and humidity throughout the whole environment, in order to improve power efficiency, avoid hardware failures and maximize the life cycle of IT devices. State of the art solutions for data center monitoring are nowadays based on environmental sensor networks, which continuously collect temperature and humidity data. These solutions are still expensive and do not scale well in large environments. This paper presents an alternative to environmental sensor networks that relies on autonomous mobile robots equipped with environmental sensors. The robots are controlled by a centralized cloud robotics platform that enables autonomous navigation and provides a remote client user interface for system management. From the user point of view, our solution simulates an environmental sensor network. The system can easily be reconfigured in order to adapt to management requirements and changes in the layout of the data center. For this reason, it is called the virtual sensor network. This paper discusses the implementation choices with regards to the particular requirements of the application and presents and discusses data collected during a long-term experiment in a real scenario.

  8. A Novel Cloud-Based Service Robotics Application to Data Center Environmental Monitoring

    PubMed Central

    Russo, Ludovico Orlando; Rosa, Stefano; Maggiora, Marcello; Bona, Basilio

    2016-01-01

    This work presents a robotic application aimed at performing environmental monitoring in data centers. Due to the high energy density managed in data centers, environmental monitoring is crucial for controlling air temperature and humidity throughout the whole environment, in order to improve power efficiency, avoid hardware failures and maximize the life cycle of IT devices. State of the art solutions for data center monitoring are nowadays based on environmental sensor networks, which continuously collect temperature and humidity data. These solutions are still expensive and do not scale well in large environments. This paper presents an alternative to environmental sensor networks that relies on autonomous mobile robots equipped with environmental sensors. The robots are controlled by a centralized cloud robotics platform that enables autonomous navigation and provides a remote client user interface for system management. From the user point of view, our solution simulates an environmental sensor network. The system can easily be reconfigured in order to adapt to management requirements and changes in the layout of the data center. For this reason, it is called the virtual sensor network. This paper discusses the implementation choices with regards to the particular requirements of the application and presents and discusses data collected during a long-term experiment in a real scenario. PMID:27509505

  9. Monitoring the turbidity and surface temperature changes and effects of the 17 August 1999 earthquake in the Izmit Gulf, Turkey by the Landsat TM/ETM data.

    PubMed

    Tüfekçi, Kenan; Akman, A Unal

    2005-09-01

    The temporal changes in turbidity and surface temperature and the effects of the 17 August 1999 earthquake in the Izmit Gulf, Turkey, have been investigated using Landsat TM/ETM data. The gulf lies in the Mediterranean-Black Sea transition climatic zone and is partially surrounded by green vegetation cover and by degraded and densely urbanized-industrialized areas. Landsat TM/ETM data acquired in 1990-1999 confirm an increase in turbidity. Turbidity is always low in the southern part and high in the northern part of the gulf, because the more urbanized and industrialized areas are located in the northern part. The Landsat-7 ETM data acquired in the same year (1999) show seasonal changes in turbidity. Moreover, two high-turbidity and surface-temperature anomalies, one parallel to the 17 August 1999 earthquake surface rupture (east-west) and the other oriented northwest-southeast, were mapped from Landsat-5 TM data acquired the day after the earthquake (18 August 1999) at the east end of the gulf. Because the turbidity implies sea-bottom movement, it is possible that a second rupture in the northwest-southeast direction occurred at the sea bottom during the earthquake. The distribution of the seismicity centers and the orientation of the lineaments in the area support this finding.

  10. MUG-OBS - Multiparameter Geophysical Ocean Bottom System : a new instrumental approach to monitor earthquakes.

    NASA Astrophysics Data System (ADS)

    Hello, Y.; Yegikyan, M.; Charvis, P.; Verfaillie, R.; Philippe, O.

    2015-12-01

    There have been several attempts to monitor real-time seismic activity using regional-scale wired nodes, such as Neptune in Canada and in the U.S., Antares in France and DONET in Japan. On the other hand, there are also initiatives deploying OBS arrays repeatedly, such as the amphibious Cascadia Initiative (four 1-year deployments), the Japanese Pacific Array (broadband OBSs, "ocean-bottom broadband dispersion survey", with 2-year autonomy), the Obsismer program in the French Lesser Antilles (eight 6-month deployments) and the Osisec program in Ecuador (four 6-month deployments). These OBSs are autonomous; they are self-recovered or recovered using an ROV. These systems are costly, including ship time, and require recovering the OBS before work on the data can start. Among the most recent alternatives, we developed an OBS with 3-year autonomy equipped with a Nanometrics Trillium 120 s seismometer, a triaxial accelerometer, a differential and an absolute pressure gauge, and a hydrophone. MUG-OBS is a free-falling instrument rated down to 6000 m. The installation of the sensor is monitored by acoustic commands from the surface, and a health bulletin with data checking is recovered acoustically during the installation. The major innovation is that it is possible to recover the data any time on demand (regularly every 6 months or after a seismic crisis) using one of the 6 data shuttles released from the surface by acoustic command during a one-day cruise on a fast boat of opportunity. Since the sensors stay at the same location for 3 years (when an OBS is redeployed on the same site, it will not land in the same place), it is a perfect tool to monitor slow seismic events, background seismic activity and aftershock distribution. Clock drift measurement and GPS localization are automatic when the shuttle reaches the surface. A new version is being developed; for remote areas, shuttles are released automatically and a seismic events bulletin is transmitted. Selected data can be recovered by two-way Iridium

  11. Exploring Earthquakes in Real-Time

    NASA Astrophysics Data System (ADS)

    Bravo, T. K.; Kafka, A. L.; Coleman, B.; Taber, J. J.

    2013-12-01

    Earthquakes capture the attention of students and inspire them to explore the Earth. Adding the ability to view and explore recordings of significant and newsworthy earthquakes in real time makes the subject even more compelling. To address this opportunity, the Incorporated Research Institutions for Seismology (IRIS), in collaboration with Moravian College, developed 'jAmaSeis', a cross-platform application that enables students to access real-time earthquake waveform data. Students can watch as the seismic waves are recorded on their computer, and can be among the first to analyze the data from an earthquake. jAmaSeis facilitates student-centered investigations of seismological concepts using either a low-cost educational seismograph or streamed data from other educational seismographs or from any seismic station that sends data to the IRIS Data Management System. After an earthquake, students can analyze the seismograms to determine characteristics of earthquakes such as time of occurrence, distance from the epicenter to the station, magnitude, and location. The software has been designed to provide graphical clues to guide students in the analysis and assist in their interpretations. Since jAmaSeis can simultaneously record up to three stations from anywhere on the planet, there are numerous opportunities for student-driven investigations. For example, students can explore differences in the seismograms recorded at different distances from an earthquake and compare waveforms from different azimuthal directions. Students can simultaneously monitor seismicity at a tectonic plate boundary and in the middle of the plate regardless of their school location. This can help students discover for themselves the ideas underlying seismic wave propagation, regional earthquake hazards, magnitude-frequency relationships, and the details of plate tectonics. The real-time nature of the data keeps the investigations dynamic, and offers students countless opportunities to explore.
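
    One of the student analyses mentioned above, estimating the distance from a station to the epicenter, can be illustrated with the classic single-station S-minus-P calculation. The wave speeds below are generic crustal averages assumed for illustration; they are not values prescribed by jAmaSeis.

```python
# Classic single-station estimate: the S-P arrival-time difference scales with distance.
VP_KM_S = 6.0   # assumed average P-wave speed (km/s)
VS_KM_S = 3.5   # assumed average S-wave speed (km/s)

def epicentral_distance_km(sp_seconds: float) -> float:
    """Distance implied by an S-P time: d = dt * Vp * Vs / (Vp - Vs)."""
    return sp_seconds * (VP_KM_S * VS_KM_S) / (VP_KM_S - VS_KM_S)

if __name__ == "__main__":
    sp_time = 40.0  # seconds between the P and S arrivals picked on the seismogram
    print(f"S-P = {sp_time:.0f} s  ->  roughly {epicentral_distance_km(sp_time):.0f} km from the epicenter")
```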

  12. Environmental monitoring and research at the John F. Kennedy Space Center.

    PubMed

    Hall, C R; Hinkle, C R; Knott, W M; Summerfield, B R

    1992-08-01

    The Biomedical Operations and Research Office at the NASA John F. Kennedy Space Center has been supporting environmental monitoring and research since the mid-1970s. Program elements include monitoring of baseline conditions to document natural variability in the ecosystem, assessments of operations and construction of new facilities, and ecological research focusing on wildlife habitat associations. Information management is centered around development of a computerized geographic information system that incorporates remote sensing and digital image processing technologies along with traditional relational data base management capabilities. The proactive program is one in which the initiative is to anticipate potential environmental concerns before they occur and, by utilizing in-house expertise, develop impact minimization or mitigation strategies to reduce environmental risk.

  13. Environmental monitoring and research at the John F. Kennedy Space Center

    SciTech Connect

    Hall, C.R.; Hinkle, C.R.; Knott, W.M.; Summerfield, B.R.

    1992-08-01

    The Biomedical Operations and Research Office at the NASA John F. Kennedy Space Center has been supporting environmental monitoring and research since the mid-1970s. Program elements include monitoring of baseline conditions to document natural variability in the ecosystem, assessments of operations and construction of new facilities, and ecological research focusing on wildlife habitat associations. Information management is centered around development of a computerized geographic information system that incorporates remote sensing and digital image processing technologies along with traditional relational data base management capabilities. The proactive program is one in which the initiative is to anticipate potential environmental concerns before they occur and, by utilizing in-house expertise, develop impact minimization or mitigation strategies to reduce environmental risk.

  14. Environmental monitoring and research at the John F. Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Hall, C. R.; Hinkle, C. R.; Knott, W. M.; Summerfield, B. R.

    1992-01-01

    The Biomedical Operations and Research Office at the NASA John F. Kennedy Space Center has been supporting environmental monitoring and research since the mid-1970s. Program elements include monitoring of baseline conditions to document natural variability in the ecosystem, assessments of operations and construction of new facilities, and ecological research focusing on wildlife habitat associations. Information management is centered around development of a computerized geographic information system that incorporates remote sensing and digital image processing technologies along with traditional relational data base management capabilities. The proactive program is one in which the initiative is to anticipate potential environmental concerns before they occur and, by utilizing in-house expertise, develop impact minimization or mitigation strategies to reduce environmental risk.

  15. Communication infrastructure in a contact center for home care monitoring of chronic disease patients.

    PubMed Central

    Maglaveras, N.; Gogou, G.; Chouvarda, I.; Koutkias, V.; Lekka, I.; Giaglis, G.; Adamidis, D.; Karvounis, C.; Louridas, G.; Goulis, D.; Avramidis, A.; Balas, E. A.

    2002-01-01

    The Citizen Health System (CHS) is a European Commission (EC) funded project in the field of IST for Health. Its main goal is to develop a generic contact center which in its pilot stage can be used in the monitoring, treatment and management of chronically ill patients at home in Greece, Spain and Germany. Such contact centers, which can use any type of communication technology, and can provide timely and preventive prompting to the patients are envisaged in the future to evolve into well-being contact centers providing services to all citizens. In this paper, we present the structure of such a generic contact center and in particular the telecommunication infrastructure, the communication protocols and procedures, and finally the educational modules that are integrated into this contact center. We discuss the procedures followed for two target groups of patients where two randomized control clinical trials are under way, namely diabetic patients with obesity problems, and congestive heart failure patients. We present examples of the communication means between the contact center medical personnel and these patients, and elaborate on the educational issues involved. PMID:12463870

  16. Revisiting Notable Earthquakes and Seismic Patterns of the Past Decade in Alaska

    NASA Astrophysics Data System (ADS)

    Ruppert, N. A.; Macpherson, K. A.; Holtkamp, S. G.

    2015-12-01

    Alaska, the most seismically active region of the United States, has produced five earthquakes with magnitudes greater than seven since 2005. The 2007 M7.2 and 2013 M7.0 Andreanof Islands earthquakes were representative of the most common source of significant seismic activity in the region, the Alaska-Aleutian megathrust. The 2013 M7.5 Craig earthquake, a strike-slip event on the Queen Charlotte fault, occurred along the transform plate boundary in southeast Alaska. The largest earthquake of the past decade, the 2014 M7.9 Little Sitkin event in the western Aleutians, occurred at an intermediate depth and ruptured along a gently dipping fault through nearly the entire thickness of the subducted Pacific plate. Along with these major earthquakes, the Alaska Earthquake Center reported over 250,000 seismic events in the state over the last decade, and its earthquake catalog surpassed 500,000 events in mid-2015. Improvements in monitoring networks and processing techniques allowed an unprecedented glimpse into earthquake patterns in Alaska. Some notable recent earthquake sequences include the 2008 Kasatochi eruption, the 2006-2008 M6+ crustal earthquakes in the central and western Aleutians, the 2010 and 2015 Bering Sea earthquakes, the 2014 Noatak swarm, and the 2014 Minto earthquake sequence. In 2013, the Earthscope USArray project made its way into Alaska. There are now almost 40 new Transportable Array stations in Alaska along with over 20 upgraded sites. This project is changing the earthquake-monitoring scene in Alaska, lowering the magnitude of completeness across large, newly instrumented parts of the state.

  17. What kind of disturbances did March 11, 2011 Tohoku Earthquake and Tsunamis leave continental margin ecosystems? : Lessons from five years monitoring research

    NASA Astrophysics Data System (ADS)

    Kitazato, Hiroshi; Kijima, Akihiro; Kogure, Kazuhiro; Hara, Motoyuki; Nagata, Toshi; Fujikura, Kasunori; Sonoda, Akira

    2016-04-01

    On March 11, 2011, a huge earthquake of M9.0 took place in the Japan Trench area off northeast Japan. Vigorous disturbance of marine environments and ecosystems occurred in coastal areas, where huge tsunamis swept sediments and organisms away from the coast into deeper waters. The distribution patterns of sediments and organisms in coves and bays changed strongly after the tsunamis, and the marine ecosystems of northeast Japan were heavily disturbed and damaged. Scientists from Tohoku University, the University of Tokyo and JAMSTEC have started to monitor how much the marine ecosystem was disturbed and how it may recover. A research team named Tohoku Ecosystem-Associated Marine Sciences has carried out continuous research on these marine ecosystems as a ten-year monitoring project funded by MEXT, Japan, since 2011. By 2016, five years had passed since the earthquake and tsunami occurred. What has happened to the marine ecosystems of the Tohoku area during these years? Water-column ecosystems recover relatively easily from disturbances. Seaweed communities were strongly damaged, but they are gradually recovering. Sediment communities have not yet recovered, as the sediment distribution differs from that before the earthquake and tsunamis. The greatest difficulties are the scars in people's minds. We, as scientists, try to share scientific activities and results with local people, including fishermen and local governments, for a better understanding of both oceanic conditions and fishery resources. Disaster risk reduction should be accelerated together with the resilience of community structure, but mental resilience is the most effective way to restore human activities in the damaged areas.

  18. Using RST approach and EOS-MODIS radiances for monitoring seismically active regions: a study on the 6 April 2009 Abruzzo earthquake

    NASA Astrophysics Data System (ADS)

    Pergola, N.; Aliano, C.; Coviello, I.; Filizzola, C.; Genzano, N.; Lacava, T.; Lisi, M.; Mazzeo, G.; Tramutoli, V.

    2010-02-01

    In the last few years, Robust Satellite data analysis Techniques (RST) have been proposed and successfully applied for monitoring major natural and environmental risks. Among the various fields of application, RST analysis has been used as a suitable tool for satellite TIR surveys in seismically active regions, devoted to detecting and monitoring thermal anomalies possibly related to earthquake occurrence. In this work, RST has been applied, for the first time, to thermal infrared observations collected by MODIS (Moderate Resolution Imaging Spectroradiometer) - the sensor onboard EOS (Earth Observing System) satellites - in the case of the Abruzzo (Italy) earthquake that occurred on 6 April 2009 (ML~5.8). The first results, shown in this work, seem to confirm the sensitivity of the proposed approach in detecting perturbations of the Earth's thermal emission field a few days before the event. The reliability of these results, based on the analysis of 10 years of MODIS observations, seems to be supported by the results achieved by analyzing the same area under similar observation conditions but in seismically unperturbed periods (no earthquakes with ML≥5), which will also be presented.
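
    RST-type analyses rest on comparing the current TIR signal at each pixel with a multi-year reference built from scenes of the same place acquired under comparable observation conditions, and flagging pixels whose normalized departure from that reference is large. The sketch below shows that kind of normalization on a synthetic image stack; the array shapes, injected anomaly and 3-sigma threshold are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rst_like_index(current: np.ndarray, history: np.ndarray) -> np.ndarray:
    """Pixel-wise normalized TIR index: (current - historical mean) / historical std.

    `history` has shape (n_years, ny, nx): one co-located scene per year,
    acquired under comparable observation conditions.
    """
    mu = history.mean(axis=0)
    sigma = history.std(axis=0)
    sigma[sigma == 0] = np.nan                                 # guard against division by zero
    return (current - mu) / sigma

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    history = rng.normal(290.0, 2.0, size=(10, 50, 50))        # 10 years of synthetic TIR scenes (K)
    current = history.mean(axis=0) + rng.normal(0.0, 2.0, size=(50, 50))
    current[20:25, 20:25] += 6.0                               # injected warm patch
    index = rst_like_index(current, history)
    print("pixels exceeding 3 sigma:", int((index > 3.0).sum()))
```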

  19. RAPID: Collaboration Results from Three NASA Centers in Commanding/Monitoring Lunar Assets

    NASA Technical Reports Server (NTRS)

    Torres, R. Jay; Allan, Mark; Hirsh, Robert; Wallick, Michael N.

    2009-01-01

    Three NASA centers are working together to address the challenge of operating robotic assets in support of human exploration of the Moon. This paper describes the combined work to date of the Ames Research Center (ARC), Jet Propulsion Laboratory (JPL) and Johnson Space Center (JSC) on a common support framework to control and monitor lunar robotic assets. We discuss how we have addressed specific challenges, including time-delayed operations and geographically distributed collaborative monitoring and control, to build an effective architecture for integrating a heterogeneous collection of robotic assets into a common framework. We describe the design of the Robot Application Programming Interface Delegate (RAPID) architecture that effectively addresses the problem of interfacing a family of robots including the JSC Chariot, ARC K-10 and JPL ATHLETE rovers. We report on lessons learned from the June 2008 field test in which RAPID was used to monitor and control all of these assets. We conclude by discussing some future directions to extend the RAPID architecture to add further support for NASA's lunar exploration program.

  20. Signal Coherence and Improved Bandwidth in Kilometer-Scale Water-Pipe Tilt-Meters for Monitoring Slow Earthquakes

    NASA Astrophysics Data System (ADS)

    Bilham, R.; Suszek, N.; Flake, R.; Szeliga, W.; Melbourne, T.

    2005-12-01

    Slow earthquakes have been detected by GPS networks in numerous subduction zones, but the signals are frequently close to detection levels. Although strain-meters and tilt-meters possess a thousandfold higher resolution (~1 nstrain & 1 nrad), noise levels in these instruments tend to be site specific, and it is sometimes considered necessary to install clusters to distinguish tectonic signal from local noise. This approach to strain measurement can more than double the cost of initial installation. We report here first results from a half-km-long water-pipe tiltmeter in which a test for signal coherence is an inherent product of the geometry of the instrument. An appealing feature of water-pipe tiltmeters is that they cost 25% less than a borehole strain-meter, attain good long-term stability within days of installation, and, unlike the decade-longevity of borehole systems, have an indefinite life span. In a Michelson tilt-meter, tilt of the earth's surface is manifest as a rise in water level at one end of the pipe and an equal and opposite reduction in water level at the other. In newly installed tiltmeters in the Cascadia region we have introduced a central transducer that effectively provides two 250-m-long independent measures of tilt in each 500 m long pipe, and hence a measure of signal coherence for little extra cost. Data from each sensor are telemetered via radio modem to a remote computer at rates of 1-6 samples/minute. Initial results from four 500 m long water pipes installed in the Cascadia region reveal that a secular drift level of better than 0.1 microradian/yr is established within a week of installation and that the two half-tiltmeters track each other closely at all periods. Noise levels are frequency dependent and vary from 0.2 nrad at hourly periods to 100 nrad at yearly periods. Atmospheric and aperiodic ocean loading appears to be the largest source of noise at periods of several days to weeks in the bandwidth where slow earthquakes are
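
    The geometry described in this abstract, with level sensors at both ends plus a central transducer, means each half of the pipe yields an independent tilt estimate, and agreement between the two halves provides the built-in coherence check. The short sketch below illustrates that idea on synthetic level records; the sensor noise level and signal amplitude are invented for illustration, not taken from the instrument.

```python
import numpy as np

HALF_LENGTH_M = 250.0  # each half of the 500 m pipe, split by the central transducer

def tilt_rad(level_far_m: np.ndarray, level_near_m: np.ndarray) -> np.ndarray:
    """Tilt of one pipe half: differential water-level change divided by the half length."""
    return (level_far_m - level_near_m) / HALF_LENGTH_M

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    t_min = np.arange(0, 7 * 24 * 60, 10.0)                    # one week at 10-minute samples
    true_tilt = 1e-7 * np.sin(2 * np.pi * t_min / (24 * 60))   # 100 nrad daily tilt signal (assumed)
    sensor_noise = 1e-6                                        # 1 micron level-sensor noise (assumed)
    end_a = HALF_LENGTH_M * true_tilt + rng.normal(0.0, sensor_noise, t_min.size)   # level at one end (m)
    center = rng.normal(0.0, sensor_noise, t_min.size)                              # central transducer (m)
    end_b = -HALF_LENGTH_M * true_tilt + rng.normal(0.0, sensor_noise, t_min.size)  # opposite end (m)

    half_1 = tilt_rad(end_a, center)
    half_2 = tilt_rad(center, end_b)
    r = np.corrcoef(half_1, half_2)[0, 1]
    print(f"correlation between the two half-tiltmeter records: {r:.3f}")
```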

  1. The meteorological monitoring system for the Kennedy Space Center/Cape Canaveral Air Station

    NASA Technical Reports Server (NTRS)

    Dianic, Allan V.

    1994-01-01

    The Kennedy Space Center (KSC) and Cape Canaveral Air Station (CCAS) are involved in many weather-sensitive operations. Manned and unmanned vehicle launches, which occur several times each year, are obvious example of operations whose success and safety are dependent upon favorable meteorological conditions. Other operations involving NASA, Air Force, and contractor personnel, including daily operations to maintain facilities, refurbish launch structures, prepare vehicles for launch, and handle hazardous materials, are less publicized but are no less weather-sensitive. The Meteorological Monitoring System (MMS) is a computer network which acquires, processes, disseminates, and monitors near real-time and forecast meteorological information to assist operational personnel and weather forecasters with the task of minimizing the risk to personnel, materials, and the surrounding population. CLIPS has been integrated into the MMS to provide quality control analysis and data monitoring. This paper describes aspects of the MMS relevant to CLIPS including requirements, actual implementation details, and results of performance testing.

  2. Wilson Corners SWMU 001 2015 Annual Long Term Monitoring Report Kennedy Space Center, Florida

    NASA Technical Reports Server (NTRS)

    Lawson, Emily M.

    2016-01-01

    This document presents the findings of the 2015 Long Term Monitoring (LTM) that was completed at the Wilson Corners site, located at the National Aeronautics and Space Administration John F. Kennedy Space Center, Florida. The objectives of the 2015 LTM event were to evaluate the groundwater flow direction and gradient, to monitor the vertical and horizontal extent of the volatile organic compounds (VOCs; including the upgradient and sidegradient extents, which are monitored every five years), and to monitor select locations internal to the dissolved groundwater plume. The 2015 LTM event included several upgradient and sidegradient monitoring wells that are not sampled annually to verify the extent of VOCs in this portion of the site. The December 2015 LTM groundwater sampling event included depth to groundwater measurements, 40 VOC samples collected using passive diffusion bags, and one VOC sample collected using low-flow techniques. Additionally, monitoring well MW0052DD was overdrilled and abandoned using rotasonic drilling techniques. The following conclusions can be made based on the 2015 LTM results: groundwater flow is generally to the west with northwest and southwest flow components from the water table to approximately 55 feet below land surface (ft BLS); peripheral monitoring wells generally delineate VOCs to groundwater cleanup target levels (GCTLs), except for monitoring wells MW0088, MW0090, MW0095, and NPSHMW0039, which had vinyl chloride (VC) concentrations near the GCTL, and MW0062, which had trichloroethene (TCE), cis-1,2-dichloroethene (cDCE), and VC concentrations above natural attenuation default concentrations (NADCs); VOCs in interior downgradient wells generally fluctuate within historic ranges, except for monitoring wells in the north-northwest portion of the site, which have increasing VC concentrations indicating potential plume migration and expansion; historically, the vertical extents of the VOCs were delineated by monitoring wells

  3. The large (M>5) co-eruptive earthquakes in Bárðarbunga caldera as observed by an accelerometer and cGPS in the caldera center

    NASA Astrophysics Data System (ADS)

    Hjörleifsdóttir, Vala; Jónsdóttir, Kristín; Geirsson, Halldór; Rodrigo Rodríguez-Cardozo, Félix; Iglesias, Arturo; Parks, Michelle; Ófeigsson, Benedikt; Vogfjord, Kristín; Dumont, Stephanie; Magnússon, Eyjólfur; Spaans, Karsten; Bagnardi, Marco; Hensch, Martin; Heimann, Sebastian; Cesca, Simone; Tumi Guðmundsson, Magnús; Hooper, Andrew; Sigmundsson, Freysteinn

    2016-04-01

    The 2014-2015 eruptive episode in Holuhraun, northern Iceland, was accompanied by almost 70 meters of caldera subsidence in the ice-covered Bárðarbunga volcano. During the subsidence, over seventy earthquakes of magnitude 5 to 5.7 occurred on the caldera rim, many of them with an unusual moment tensor (large non-double-couple component), indicating that they do not involve slip on a planar fault. Non-double-couple moment tensors are principally found in volcanoes in eruption (Shuler et al 2013), and several mechanisms for generating them have been proposed, such as: slip on a ring-fault (Nettles & Ekström, 1998); a closing crack or sill (Kanamori et al 1993, Riel et al 2014); or a combination of both (Heimann et al, submitted). Thus, by what processes the seismic signal is related to the caldera subsidence is still under debate. During the caldera subsidence, a high-rate (20 Hz) GPS station and an accelerometer were installed on top of the ice, near the center of the 7x11 km caldera. The GPS station started recording about three weeks into the caldera collapse and recorded over 35 m of subsidence, and several co-seismic steps of up to 40 cm in the vertical component. The size of the co-seismic steps diminished with time during the eruption. In addition to the steps, seismic waves are clearly seen in the high-rate GPS data at the caldera station. The accelerometer was installed more than two months after the start of the eruption and recorded intermittently due to unfavorable conditions on top of the ice sheet. However, more than 80 events of magnitude M 1-4.3 were observed on the accelerometer, providing important observations of S-P times. Furthermore, the deformation of the glacier surface induced by some of the largest earthquakes was captured by 1-day COSMO-SkyMed interferograms, providing further constraints on the earthquake process. In this presentation we analyze the signals from the two instruments, together with InSAR interferograms as well as other available data

  4. Evaluating the Imbalance Between Increasing Hemodialysis Patients and Medical Staff Shortage After the Great East Japan Earthquake: Report From a Hemodialysis Center Near the Fukushima Nuclear Power Plants.

    PubMed

    Koshiba, Takaaki; Nishiuchi, Takamitsu; Akaihata, Hidenori; Haga, Nobuhiro; Kojima, Yoshiyuki; Kubo, Hajime; Kasahara, Masato; Hayashi, Masayuki

    2016-04-01

    The Great East Japan Earthquake in 2011 caused an unprecedented imbalance between an increasing number of hemodialysis patients and medical staff shortage in the Sousou area, the site of the Fukushima nuclear power plants. In 2014, capacity of our hemodialysis center reached a critical limit due to such an imbalance. We attempted to evaluate the effort of medical staff to clarify to what extent their burden had increased post-disaster. The ratio of total dialysis sessions over total working days of medical staff was determined as an approximate indicator of effort per month. The mean value of each year was compared. Despite fluctuations of the ratio, the mean value did not differ from 2010 to 2013. However, the ratio steadily increased in 2014, and there was a significant increase in the mean value. This proposed indicator of the effort of medical staff appears to reflect what we experienced, although its validity must be carefully examined in future studies.
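
    The workload indicator proposed in this report is simple arithmetic: for each month, divide the total number of dialysis sessions by the total number of staff working days, then compare the yearly means of that ratio. A toy illustration follows; the session and staffing numbers are invented, not the center's data.

```python
from statistics import mean

# (sessions, staff working days) per month; invented numbers for illustration only
pre_disaster_months = [(850, 300), (870, 305), (860, 298)]
post_disaster_months = [(1050, 260), (1100, 255), (1080, 250)]

def monthly_ratios(months):
    """Workload indicator: total dialysis sessions divided by total staff working days."""
    return [sessions / workdays for sessions, workdays in months]

print("mean ratio, reference period:", round(mean(monthly_ratios(pre_disaster_months)), 2))
print("mean ratio, strained period: ", round(mean(monthly_ratios(post_disaster_months)), 2))
```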

  5. Evaluating the Imbalance Between Increasing Hemodialysis Patients and Medical Staff Shortage After the Great East Japan Earthquake: Report From a Hemodialysis Center Near the Fukushima Nuclear Power Plants.

    PubMed

    Koshiba, Takaaki; Nishiuchi, Takamitsu; Akaihata, Hidenori; Haga, Nobuhiro; Kojima, Yoshiyuki; Kubo, Hajime; Kasahara, Masato; Hayashi, Masayuki

    2016-04-01

    The Great East Japan Earthquake in 2011 caused an unprecedented imbalance between an increasing number of hemodialysis patients and medical staff shortage in the Sousou area, the site of the Fukushima nuclear power plants. In 2014, capacity of our hemodialysis center reached a critical limit due to such an imbalance. We attempted to evaluate the effort of medical staff to clarify to what extent their burden had increased post-disaster. The ratio of total dialysis sessions over total working days of medical staff was determined as an approximate indicator of effort per month. The mean value of each year was compared. Despite fluctuations of the ratio, the mean value did not differ from 2010 to 2013. However, the ratio steadily increased in 2014, and there was a significant increase in the mean value. This proposed indicator of the effort of medical staff appears to reflect what we experienced, although its validity must be carefully examined in future studies. PMID:26935477

  6. Source Process of the Mw 5.0 Au Sable Forks, New York, Earthquake Sequence from Local Aftershock Monitoring Network Data

    NASA Astrophysics Data System (ADS)

    Kim, W.; Seeber, L.; Armbruster, J. G.

    2002-12-01

    On April 20, 2002, a Mw 5 earthquake occurred near the town of Au Sable Forks, northeastern Adirondacks, New York. The quake caused moderate damage (MMI VII) around the epicentral area and was well recorded by over 50 broadband stations at distances of 70 to 2000 km in eastern North America. Regional broadband waveform data are used to determine the source mechanism and focal depth using a moment tensor inversion technique. The source mechanism indicates predominantly thrust faulting along a 45°-dipping fault plane striking due south. The mainshock was followed by at least three strong aftershocks with local magnitude (ML) greater than 3, and about 70 aftershocks were detected and located in the first three months by a 12-station portable seismographic network. The aftershock distribution clearly delineates the mainshock rupture on the westerly dipping fault plane at a depth of 11 to 12 km. Preliminary analysis of the aftershock waveform data indicates that the orientation of the P-axis rotated 90° from that of the mainshock, suggesting a complex source process for the earthquake sequence. We achieved an important milestone in monitoring earthquakes and evaluating their hazards through rapid cross-border (Canada-US) and cross-regional (Central US-Northeastern US) collaborative efforts. Staff at Instrument Software Technology, Inc. near the epicentral area joined Lamont-Doherty staff and deployed the first portable station in the epicentral area; CERI dispatched two of their technical staff to the epicentral area with four accelerometers and a broadband seismograph; the IRIS/PASSCAL facility shipped three digital seismographs and ancillary equipment within one day of the request; and the POLARIS Consortium, Canada, sent a field crew of three with a near real-time, satellite-telemetry-based earthquake monitoring system. The POLARIS station, KSVO, powered by a solar panel and batteries, was already transmitting data to the central hub in London, Ontario, Canada within

  7. Products and Services Available from the Southern California Earthquake Data Center (SCEDC) and the Southern California Seismic Network (SCSN)

    NASA Astrophysics Data System (ADS)

    Chen, S. E.; Yu, E.; Bhaskaran, A.; Chowdhury, F. R.; Meisenhelter, S.; Hutton, K.; Given, D.; Hauksson, E.; Clayton, R. W.

    2011-12-01

    Currently, the SCEDC archives continuous and triggered data from nearly 8400 data channels from 425 SCSN recorded stations, processing and archiving an average of 6.4 TB of continuous waveforms and 12,000 earthquakes each year. The SCEDC provides public access to these earthquake parametric and waveform data through its website www.data.scec.org and through client applications such as STP and DHI. This poster will describe the most significant developments at the SCEDC during 2011. New website design: the SCEDC has revamped its website. The changes make it easier for users to search the archive and discover updates and new content, and they also improve our ability to manage and update the site. New data holdings: post-processing on the El Mayor Cucapah 7.2 sequence continues; to date 11847 events have been reviewed, and updates are available in the earthquake catalog immediately. A double-difference catalog (Hauksson et al. 2011) spanning 1981 to 6/30/11 will be available for download at www.data.scec.org and via STP. A focal mechanism catalog determined by Yang et al. 2011 is available for distribution at www.data.scec.org. Waveforms from Southern California NetQuake stations are now being stored in the SCEDC archive and are available via STP as event-associated waveforms; amplitudes from these stations are also being stored in the archive and used by ShakeMap. As part of a NASA/AIST project in collaboration with JPL and SIO, the SCEDC will receive real-time 1 sps streams of GPS displacement solutions from the California Real Time Network (http://sopac.ucsd.edu/projects/realtime; Genrich and Bock, 2006, J. Geophys. Res.); these channels will be archived at the SCEDC as miniSEED waveforms, which can then be distributed to the user community via applications such as STP. Improvements in the user tool STP: STP sac output now includes picks from the SCSN. New archival methods: the SCEDC is exploring the feasibility of archiving and distributing

  8. Wilson Corners SWMU 001 2014 Annual Long Term Monitoring Report Kennedy Space Center, Florida

    NASA Technical Reports Server (NTRS)

    Langenbach, James

    2015-01-01

    This document presents the findings of the 2014 Long Term Monitoring (LTM) that was completed at the Wilson Corners site, located at the National Aeronautics and Space Administration (NASA) John F. Kennedy Space Center (KSC), Florida. The goals of the 2014 annual LTM event were to evaluate the groundwater flow direction and gradient and to monitor the vertical and downgradient horizontal extent of the volatile organic compounds (VOCs) in groundwater at the site. The LTM activities consisted of an annual groundwater sampling event in December 2014, which included the collection of water levels from the LTM wells. During the annual groundwater sampling event, depth to groundwater was measured and VOC samples were collected using passive diffusion bags (PDBs) from 30 monitoring wells. In addition to the LTM sampling, additional assessment sampling was performed at the site using low-flow techniques based on previous LTM results and assessment activities. Assessment of monitoring well MW0052DD was performed by collecting VOC samples using low-flow techniques before and after purging 100 gallons from the well. Monitoring well MW0064 was sampled to supplement shallow VOC data north of Hot Spot 2 and east of Hot Spot 4. Monitoring well MW0089 was sampled due to its proximity to MW0090. MW0090 is screened in a deeper interval and had an unexpected detection of trichloroethene (TCE) during the 2013 LTM, which was corroborated during the March 2014 verification sampling. Monitoring well MW0130 was sampled to provide additional VOC data beneath the semi-confining clay layer in the Hot Spot 2 area.

  9. Earthquake history of Mississippi

    USGS Publications Warehouse

    von Hake, C. A.

    1974-01-01

    Since its admission into the Union in 1817, Mississippi has had only four earthquakes of intensity V or greater within its borders. Although the number of earthquakes known to have been centered within Mississippi's boundaries is small, the State has been affected by numerous shocks located in neighboring States. In 1811 and 1812, a series of great earthquakes near the New Madrid, Missouri, area was felt in Mississippi as far south as the gulf coast. The New Madrid series caused the banks of the Mississippi River to cave in as far as Vicksburg, more than 300 miles from the epicentral region. As a result of this great earthquake series, the northwest corner of Mississippi is in seismic risk zone 3, the highest risk zone. Except for the New Madrid series, effects in Mississippi from earthquakes located outside of the State have been less than intensity V.

  10. Earthquake history of Pennsylvania

    USGS Publications Warehouse

    von Hake, C. A.

    1976-01-01

    Records of early earthquakes in the Northeastern United States provide limited information on effects in Pennsylvania until 1737, 55 years after the first permanent settlement was established. A very severe earthquake that centered in the St. Lawrence River region in 1663 may have been felt in Pennsylvania, but historical accounts are not definite. Likewise, a damaging shock at Newbury, Mass., in 1727 probably affected towns in Pennsylvania. A strong earthquake on December 18, 1737, toppled chimneys at New York City and was reported felt at Boston, Mass., Philadelphia, Pa., and New Castle, Del. Other shocks with origins outside the State were felt in 1758, 1783, and 1791. Since 1800, when two earthquakes (March 17 and November 29) were reported as "severe" at Philadelphia, 16 tremors of intensity V or greater (Modified Mercalli Scale) have originated within the State. On November 11 and 14, 1840, severe earthquakes at Philadelphia were accompanied by a great and unusual swell on the Delaware River.

  11. An Evaluation of North Korea’s Nuclear Test by Belbasi Nuclear Tests Monitoring Center-KOERI

    NASA Astrophysics Data System (ADS)

    Necmioglu, O.; Meral Ozel, N.; Semin, K.

    2009-12-01

    Bogazici University's Kandilli Observatory and Earthquake Research Institute (KOERI) acts as the Turkish National Data Center (NDC) and has been responsible for the operation of the International Monitoring System (IMS) Primary Seismic Station (PS-43) under the Belbasi Nuclear Tests Monitoring Center for the verification of compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT) since February 2000. The NDC is responsible for operating two arrays which are part of the IMS, as well as for transmitting data from these stations to the International Data Centre (IDC) in Vienna. The Belbasi array was established in 1951 as a four-element (Benioff 1051) seismic array, part of the United States Atomic Energy Detection System (USAEDS). The Turkish General Staff (TGS) and the U.S. Air Force Technical Applications Center (AFTAC) jointly operated this short-period array under the Defense and Economic Cooperation Agreement (DECA). The station was upgraded and several seismometers were added to the array between 1951 and 1994, and the station code was later changed from BSRS (Belbasi Seismic Research Station) to BRTR-PS43. PS-43 is composed of two sub-arrays (Ankara and Keskin): the medium-period array, with a ~40 km radius, located in Ankara, and the short-period array, with a ~3 km radius, located in Keskin. Each array has a broadband element located at the middle of the circular geometry. Short-period instruments are installed at a depth of 30 meters below the surface, while medium-period and broadband instruments are installed at a depth of 60 meters below the surface. On 25 May 2009, the Democratic People’s Republic of Korea (DPRK) claimed that it had conducted a nuclear test. The corresponding seismic event was recorded by the IMS, and the IDC released its first automatic estimate of the time (00:54:43 GMT), location (41.2896°N and 129.0480°E) and magnitude (4.52 mb) of the event in less than two hours (USGS: 00:54:43 GMT; 41.306°N, 129.029°E; 4.7 mb). During our preliminary analysis of the 25th May 2009 DPRK

  12. A real-time navigation monitoring expert system for the Space Shuttle Mission Control Center

    NASA Technical Reports Server (NTRS)

    Wang, Lui; Fletcher, Malise

    1993-01-01

    The ONAV (Onboard Navigation) Expert System has been developed as a real-time console assistant for use by ONAV flight controllers in the Mission Control Center at the Johnson Space Center. This expert, knowledge-based system is used to monitor the Space Shuttle onboard navigation system, detect faults, and advise flight operations personnel. This application is the first knowledge-based system to use both telemetry and trajectory data from the Mission Operations Computer (MOC). To arrive at this stage, from prototype to real-world application, the ONAV project has had to deal with not only AI issues but also operating-environment issues. The AI issues included the maturity of AI languages and debugging tools, verification, and the availability, stability and size of the expert pool. The environmental issues included real-time data acquisition, hardware suitability, and how to achieve acceptance by users and management.

  13. Robust Satellite Techniques (RST) for monitoring Earthquake active regions: the case of Abruzzo April 6th 2009 event. (Invited)

    NASA Astrophysics Data System (ADS)

    Pergola, N.; Aliano, C.; Corrado, R.; Genzano, N.; Lisi, M.; Mazzeo, G.; Tramutoli, V.; Coviello, I.; Filizzola, C.; Lacava, T.; Paciello, R.

    2009-12-01

    Space-time fluctuations of the Earth's emitted Thermal Infrared (TIR) radiation have been observed from satellites months to weeks before earthquake occurrence. The general RST approach has been proposed in order to discriminate normal TIR signal fluctuations (i.e., those related to changes in natural factors and/or observation conditions) from anomalous signal transients possibly associated with earthquake occurrence. In this work, the same approach is applied to the case of the Abruzzo event of April 6th, 2009 (M=6.3) by using, for the first time: the geostationary satellite MSG, which offers improved spatial and temporal resolution; contemporary observations recorded by 3 independent satellite systems (NOAA-AVHRR, EOS-MODIS, MSG-SEVIRI); and a very long historical data set made of 30 years of satellite records. Preliminary results will be discussed, also considering physical models that could explain the apparent correlation between anomalous space-time TIR transients and earthquake occurrence.

  14. Launch Complex 39A, SWMU 008, Operations, Maintenance, and Monitoring Report, Kennedy Space Center, FL

    NASA Technical Reports Server (NTRS)

    Wilson, Deborah M.

    2016-01-01

    This Operations, Maintenance, and Monitoring Report (OMMR) presents the findings, observations, and results from Year 1 operation of the air sparging (AS) groundwater interim measure (IM) for High-Concentration Plumes (HCPs) and Low-Concentration Plumes (LCPs) within the perimeter fence line at Launch Complex 39A (LC39A) located at Kennedy Space Center (KSC), Florida. The objective of the LC39A groundwater IM is to actively decrease concentrations of trichloroethene (TCE), cis-1,2-dichloroethene (cDCE), and vinyl chloride (VC) in groundwater in the HCP and LCP within the pad perimeter fence line via AS to levels less than Florida Department of Environmental Protection (FDEP) Groundwater Cleanup Target Levels (GCTLs). The objective was developed because LC39A is currently being leased to Space Exploration Technologies (SpaceX), and the original IM for monitored natural attenuation (MNA) over an extended period of time was not suitable for future planned site use.

  15. Federal Radiological Monitoring and Assessment Center (FRMAC) overview of FRMAC operations

    SciTech Connect

    1996-02-01

    In the event of a major radiological emergency, 17 federal agencies with various statutory responsibilities have agreed to coordinate their efforts at the emergency scene under the umbrella of the Federal Radiological Emergency Response plan (FRERP). This cooperative effort will assure the designated Lead Federal Agency (LFA) and the state(s) that all federal radiological assistance fully supports their efforts to protect the public. The mandated federal cooperation ensures that each agency can obtain the data critical to its specific responsibilities. This Overview of the Federal Radiological Monitoring and Assessment Center (FRMAC) Operations describes the FRMAC response activities to a major radiological emergency. It also describes the federal assets and subsequent operational activities which provide federal radiological monitoring and assessment of the off-site areas. These off-site areas may include one or more affected states.

  16. Utility of video-EEG monitoring in a tertiary care epilepsy center.

    PubMed

    Kumar-Pelayo, M; Oller-Cramsie, M; Mihu, N; Harden, C

    2013-09-01

    Our video-EEG monitoring (VEEG) unit is part of a typical metropolitan tertiary care center that services a diverse patient population. We aimed to determine if the specific clinical reason for inpatient VEEG was actually resolved. Our method was to retrospectively determine the stated goal of inpatient VEEG and to analyze the outcome of one hundred consecutive adult patients admitted for VEEG. The reason for admission fit into one of four categories: 1) to characterize paroxysmal events as either epileptic or nonepileptic, 2) to localize epileptic foci, 3) to characterize the epilepsy syndrome, and 4) to attempt safe antiepileptic drug adjustment. We found that VEEG was successful in accomplishing the goal of admission in 77% of cases. The remaining 23% failed primarily due to lack of typical events during monitoring. Furthermore, of the overall study cohort, VEEG outcomes altered medical management in 53% and surgery was pursued in 5%.

  17. Data sets for snow cover monitoring and modelling from the National Snow and Ice Data Center

    NASA Astrophysics Data System (ADS)

    Holm, M.; Daniels, K.; Scott, D.; McLean, B.; Weaver, R.

    2003-04-01

    A wide range of snow cover monitoring and modelling data sets are pending or are currently available from the National Snow and Ice Data Center (NSIDC). In-situ observations support validation experiments that enhance the accuracy of remote sensing data. In addition, remote sensing data are available in near-real time, providing coarse-resolution snow monitoring capability. Time series data beginning in 1966 are valuable for modelling efforts. NSIDC holdings include SMMR and SSM/I snow cover data, MODIS snow cover extent products, in-situ and satellite data collected for NASA's recent Cold Land Processes Experiment, and soon-to-be-released AMSR-E passive microwave products. The AMSR-E and MODIS sensors are part of NASA's Earth Observing System flying on the Terra and Aqua satellites. Characteristics of these NSIDC-held data sets, appropriateness of products for specific applications, and data set access and availability will be presented.

  18. Health monitoring and disease prevention at the Zebrafish International Resource Center.

    PubMed

    Varga, Z M; Murray, K N

    2016-01-01

    In this chapter we review the components of the fish health program at the Zebrafish International Resource Center. We describe health-monitoring strategies to assess individual and colony health, practices to prevent the spread of pathogens within the fish colony, and a biosecurity program designed to prevent entry of new fish pathogens. While this program is designed for a facility on a recirculating water system with expectations of high volumes of import and export, many of the components can be directly applied or modified for application in facilities of different sizes and with other programmatic goals.

  19. Real time electromagnetic monitoring system used for short-term earthquakes forecast related to the seismic-active Vrancea zone

    NASA Astrophysics Data System (ADS)

    Stanica, Dumitru; Armand Stanica, Dragos

    2016-04-01

    The existence of pre-seismic electromagnetic signals related to earthquakes is still under scientific debate and requires new, reliable information about their possible inter-relationship. In this paper, to obtain new insights into the seismically active Vrancea zone (Romania), 3-D magnetotelluric imaging has been used to strengthen the connection between the geodynamic model and a possible generation mechanism of the intermediate-depth earthquakes. Consequently, it is considered that before earthquake initiation, due to the torsion effect, a high stress is reached inside the seismogenic volume, which may generate dehydration and rupture processes in the rocks, associated with fluid migration through the lithospheric fault system, leading to resistivity changes. These changes have been investigated by using ULF electromagnetic data recorded in real time at the Geodynamic Observatory Provita de Sus (GOPS), placed on the Carpathian Electrical Conductivity Anomaly (CECA) about 100 km from the seismically active Vrancea zone. The daily mean distribution of the normalized function Bzn(f) = Bz(f)/Bperp(f) (where Bz is the vertical component of the geomagnetic field, Bperp is the geomagnetic component perpendicular to strike, and f is frequency in Hz) and its standard deviation are obtained using an FFT band-pass filter analysis in the ULF range 0.001 Hz to 0.0083 Hz, for which a 2-D geoelectrical structure under GOPS has been identified. To provide reliable information for anticipating the likely occurrence of an earthquake of Mw higher than 4, a statistical analysis based on a standardized random variable equation has been used to identify anomalous intervals in the new time series (Bzn*) carried out over a span of three years (2013-2015). The final conclusion is that Bzn* shows a significant anomalous effect some days (or weeks) before an impending earthquake and should be used for short-term earthquake forecasting.
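
    The processing chain sketched in this abstract reduces to two steps: form the daily mean of the normalized function Bzn(f) = Bz(f)/Bperp(f) over the stated ULF band, then flag days on which a standardized (z-score-like) variable Bzn* departs strongly from its recent behaviour. The minimal sketch below follows that chain on synthetic numbers; the band edges are those quoted in the abstract, while the running-window length and threshold are assumptions, not the authors' values.

```python
import numpy as np

def normalized_bzn(bz_spec: np.ndarray, bperp_spec: np.ndarray, freqs: np.ndarray,
                   fmin: float = 0.001, fmax: float = 0.0083) -> float:
    """Daily mean of Bzn(f) = Bz(f)/Bperp(f), restricted to the ULF band [fmin, fmax] Hz."""
    band = (freqs >= fmin) & (freqs <= fmax)
    return float(np.mean(np.abs(bz_spec[band]) / np.abs(bperp_spec[band])))

def flag_anomalies(daily_bzn: np.ndarray, window: int = 30, threshold: float = 3.0) -> np.ndarray:
    """Standardized variable Bzn* = (value - running mean) / running std; flag exceedances."""
    z = np.zeros_like(daily_bzn)
    for i in range(window, daily_bzn.size):
        ref = daily_bzn[i - window:i]
        z[i] = (daily_bzn[i] - ref.mean()) / ref.std()
    return z > threshold

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    # daily mean Bzn from synthetic spectra (in practice: FFTs of the recorded components)
    freqs = np.linspace(0.0005, 0.02, 400)
    bz = rng.normal(1.0, 0.1, freqs.size)
    bperp = rng.normal(2.0, 0.1, freqs.size)
    print("example daily Bzn:", round(normalized_bzn(bz, bperp, freqs), 3))

    series = rng.normal(1.0, 0.05, 365)    # one year of synthetic daily mean Bzn values
    series[200:203] += 0.3                 # short excursion standing in for a pre-seismic anomaly
    print("flagged days:", np.flatnonzero(flag_anomalies(series)))
```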

  20. Facial Nerve Monitoring During Parotidectomy: A Two-Center Retrospective Study

    PubMed Central

    Régloix, Stanislas Ballivet-de; Grinholtz-Haddad, Julia; Maurin, Olga; Genestier, Louise; Lisan, Quentin; Pons, Yoann

    2016-01-01

    Introduction: We present a retrospective two-center study series and discussion of the current literature to assess the benefits of facial nerve monitoring during parotidectomy. Materials and Methods: From 2007 to 2012, 128 parotidectomies were performed in 125 patients. Of these, 47 procedures were performed without facial nerve monitoring (group 1) and 81 with facial nerve monitoring (group 2). The primary endpoint was the House-Brackmann classification at 1 month and 6 months. Facial palsy was determined when the House-Brackmann grade was 3 or higher. Results: In group 1, 15 facial palsies were noted; 8 were transient and 7 were definitive. In group 2, 19 facial palsies were noted; 12 were transient and 7 were definitive. At both one and six months after parotidectomy, the rate of facial palsy in reoperation cases was significantly higher in group 1 than in group 2. Conclusion: Facial nerve monitoring is a simple, effective adjunct method that is available to surgeons to assist with the functional preservation of the facial nerve during parotid surgery. Although it does not improve the facial prognosis in first-line surgery, it does improve the facial prognosis in reoperations. PMID:27602336

  1. Using Ambient Seismic Noise to Monitor Post-Seismic Relaxation After the 2010 Mw 7.1 Darfield Earthquake, New Zealand

    NASA Astrophysics Data System (ADS)

    Savage, M. K.; Heckels, R.; Townend, J.

    2015-12-01

    Quantifying seismic velocity changes following large earthquakes can provide insights into the crustal response of the earth. The use of ambient seismic noise to monitor these changes is becoming increasingly widespread. Cross-correlations of long-duration ambient noise records can be used to give stable impulse response functions without the need for repeated seismic events. Temporal velocity changes were detected in the four months following the September 2010 Mw 7.1 Darfield event in the South Island, New Zealand, using temporary seismic networks originally deployed to record aftershocks in the region. The arrays consisted of stations lying on and surrounding the fault, with a maximum inter-station distance of 156 km. The 2010-2011 Canterbury earthquake sequence occurred largely on previously unknown and buried faults. The Darfield earthquake was the first and largest in a sequence of events that hit the region, rupturing the Greendale Fault with a surface rupture of nearly 30 km. The sequence also included the Mw 6.3 February 2011 Christchurch event, which caused widespread damage throughout the city and resulted in almost 200 deaths. Nine-component, day-long Green's functions were computed for frequencies between 0.1 and 1.0 Hz from full-waveform seismic data spanning the period from immediately after the 4 September 2010 earthquake until mid-January 2011. Using the moving-window cross-spectral method, stacks of daily functions covering the study period (reference functions) were compared to consecutive 10-day stacks of cross-correlations to measure time delays between them. These were then inverted for seismic velocity changes with respect to the reference functions. Over the study period an increase in seismic velocity of 0.25% ± 0.02% was determined proximal to the Greendale Fault. These results are similar to studies in other regions, and we attribute the changes to post-seismic relaxation through crack healing of the Greendale Fault and throughout the region.
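
    The monitoring idea used here rests on a simple relation: if the medium's velocity changes homogeneously by dv/v, a feature at lag time t in the reference correlation function shifts by dt = -(dv/v) * t, so the slope of measured delays versus lag time gives the velocity change. The sketch below measures the window delays with a plain cross-correlation peak instead of the cross-spectral phase, so it illustrates the principle rather than reproducing the authors' processing; all numbers are synthetic.

```python
import numpy as np

def window_delay(ref: np.ndarray, cur: np.ndarray, dt: float) -> float:
    """Delay of `cur` relative to `ref` from the peak of their cross-correlation."""
    cc = np.correlate(cur - cur.mean(), ref - ref.mean(), mode="full")
    lag = np.argmax(cc) - (len(ref) - 1)
    return lag * dt

def dv_over_v(reference: np.ndarray, current: np.ndarray, dt: float,
              win_len: int = 200, step: int = 100) -> float:
    """Fit dt_i = -(dv/v) * t_i over sliding lag windows of the correlation functions."""
    centers, delays = [], []
    for start in range(0, len(reference) - win_len, step):
        sl = slice(start, start + win_len)
        centers.append((start + win_len / 2) * dt)
        delays.append(window_delay(reference[sl], current[sl], dt))
    slope = np.polyfit(centers, delays, 1)[0]
    return -slope

if __name__ == "__main__":
    fs = 20.0                          # sampling rate (Hz)
    t = np.arange(0, 120, 1 / fs)      # a 2-minute synthetic coda
    ref = np.sin(2 * np.pi * 0.5 * t) * np.exp(-t / 60)
    cur = np.sin(2 * np.pi * 0.5 * t * 1.0025) * np.exp(-t / 60)   # same path, medium 0.25% faster
    print(f"recovered dv/v ~ {100 * dv_over_v(ref, cur, 1 / fs):.2f} %")
```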

  2. Monitoring, sampling and analysis of fine particulates -- Results and experiences from DOE's Federal Energy Technology Center

    SciTech Connect

    White, C.M.; Anderson, R.; Martello, D.; Rohar, P.; George, E.; Irdi, G.; Veloski, G.; Tamilia, J.; Lynn, R.; Waldner, K.; Hickey, R.; Feeley, T.; Casuccio, G.S.; Schlaegle, S.F.; Doerr, A.

    1999-07-01

    The overall goal of the DOE fine particulate program is to ensure that the best science and technology is available for any regulatory decision-making related to the health and environmental impacts of ambient fine particulate matter and regional haze. Interest primarily lies in the particulate fraction having aerodynamic diameters of 2.5 microns and less (PM2.5). Particulates of this size are the focus of the newly established National Ambient Air Quality Standards. As such, the Federal Energy Technology Center (FETC) is establishing a fine particulate sampling station at the Center's Pittsburgh site located in South Park Township, PA. This sampling station is one of a group of stations scattered throughout Pennsylvania, West Virginia, and Ohio that constitute the Upper Ohio River Valley Project. The station is equipped with a full complement of fine particulate and gaseous monitors including the following: (1) R and P Sequential FRM sampler, (2) Grimm PM2.5 continuous sampler, (3) TSI Dustrak PM2.5 continuous sampler, (4) R and P TEOM equipped with an AccuSampler, (5) Andersen speciation sampler, (6) MetOne speciation sampler, (7) EcoChem continuous PAH monitor, (8) Total peroxide monitor that employs the Greg Kok method, (9) Burkard 7 day pollen and mold spore sampler, (10) Continuous gas monitors for O{sub 3}, SO{sub 2}, NH{sub 3}, CO, H{sub 2}S, NO{sub y}, NO{sub x}, and (11) Meteorological instruments. The presentation will describe the initial results for the summer 1999 season from the above instruments. The chemical analysis of the aqueous extracts of the FRM filters will be discussed, including the anions present as determined by ion chromatography, and the metals present.

  3. Testing hypotheses of earthquake occurrence

    NASA Astrophysics Data System (ADS)

    Kagan, Y. Y.; Jackson, D. D.; Schorlemmer, D.; Gerstenberger, M.

    2003-12-01

    We present a relatively straightforward likelihood method for testing those earthquake hypotheses that can be stated as vectors of earthquake rate density in defined bins of area, magnitude, and time. We illustrate the method as it will be applied to the Regional Earthquake Likelihood Models (RELM) project of the Southern California Earthquake Center (SCEC). Several earthquake forecast models are being developed as part of this project, and additional contributed forecasts are welcome. Various models are based on fault geometry and slip rates, seismicity, geodetic strain, and stress interactions. We would test models in pairs, requiring that both forecasts in a pair be defined over the same set of bins. Thus we offer a standard "menu" of bins and ground rules to encourage standardization. One menu category includes five-year forecasts of magnitude 5.0 and larger. Forecasts would be in the form of a vector of yearly earthquake rates on a 0.05 degree grid at the beginning of the test. Focal mechanism forecasts, when available, would also be archived and used in the tests. The five-year forecast category may be appropriate for testing hypotheses of stress shadows from large earthquakes. Interim progress will be evaluated yearly, but final conclusions would be made on the basis of cumulative five-year performance. The second category includes forecasts of earthquakes above magnitude 4.0 on a 0.05 degree grid, evaluated and renewed daily. Final evaluation would be based on cumulative performance over five years. Other types of forecasts with different magnitude, space, and time sampling are welcome and will be tested against other models with shared characteristics. All earthquakes would be counted, and no attempt made to separate foreshocks, main shocks, and aftershocks. Earthquakes would be considered as point sources located at the hypocenter. For each pair of forecasts, we plan to compute alpha, the probability that the first would be wrongly rejected in favor of
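
    For forecasts expressed as Poisson rates per bin, the pairwise comparison described above boils down to evaluating the log-likelihood of the observed bin counts under each rate vector and examining the difference. The sketch below does that bookkeeping for two invented forecasts defined on a shared set of bins; it illustrates the likelihood calculation only, not the RELM testing procedure itself.

```python
import numpy as np
from scipy.stats import poisson

def log_likelihood(rates: np.ndarray, counts: np.ndarray) -> float:
    """Log-likelihood of observed bin counts under independent Poisson rates."""
    return float(poisson.logpmf(counts, rates).sum())

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    bins = 100                                              # shared space-magnitude bins
    forecast_a = rng.uniform(0.01, 0.2, size=bins)          # expected counts per bin, model A (invented)
    forecast_b = forecast_a * rng.uniform(0.5, 2.0, bins)   # a competing forecast on the same bins
    observed = rng.poisson(forecast_a)                      # synthetic "catalog" drawn from model A
    ll_a = log_likelihood(forecast_a, observed)
    ll_b = log_likelihood(forecast_b, observed)
    print(f"log-likelihood A: {ll_a:.1f}   B: {ll_b:.1f}   (higher favors that forecast)")
```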

  4. Earthquake history of Tennessee

    USGS Publications Warehouse

    von Hake, C. A.

    1977-01-01

    The western part of the State was shaken strongly by the New Madrid, Mo., earthquakes of 1811-12 and by earthquakes in 1843 and 1895. The area has also experienced minor shocks. Additional activity has occurred in the eastern part of the State, near the North Carolina border. Forty shocks of intensity V (Modified Mercalli scale) or greater have been cataloged as occurring within the State. Many other earthquakes centered in bordering States have affected points in Tennessee. The following summary covers only those shocks of intensity VI or greater.

  5. Statistical analysis of the induced Basel 2006 earthquake sequence: introducing a probability-based monitoring approach for Enhanced Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Bachmann, C. E.; Wiemer, S.; Woessner, J.; Hainzl, S.

    2011-08-01

    Geothermal energy is becoming an important clean energy source; however, the stimulation of a reservoir for an Enhanced Geothermal System (EGS) is associated with seismic risk due to induced seismicity. Seismicity occurring due to the water injection at depth has to be well recorded and monitored. To mitigate the seismic risk of a damaging event, an appropriate alarm system needs to be in place for each individual experiment. In recent experiments, the so-called traffic-light alarm system, based on public response, local magnitude and peak ground velocity, was used. We aim to improve the pre-defined alarm system by introducing a probability-based approach; we retrospectively model the ongoing seismicity in real time with multiple statistical forecast models and then translate the forecast to seismic hazard in terms of probabilities of exceeding a ground motion intensity level. One class of models accounts for the water injection rate, the main parameter that can be controlled by the operators during an experiment. By translating the models into time-varying probabilities of exceeding various intensity levels, we provide tools which are well understood by the decision makers and can be used to determine non-exceedance thresholds during a reservoir stimulation; this, however, remains an entrepreneurial or political decision of the responsible project coordinators. We introduce forecast models based on the data set of an EGS experiment in the city of Basel. Between 2006 December 2 and 8, approximately 11 500 m3 of water was injected into a 5-km-deep well at high pressures. A six-sensor borehole array was installed by the company Geothermal Explorers Limited (GEL) at depths between 300 and 2700 m around the well to monitor the induced seismicity. The network recorded approximately 11 200 events during the injection phase, more than 3500 of which were located. With the traffic-light system, actions were implemented after an ML 2.7 event, the water injection was
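
    The translation step at the heart of the probability-based alarm is from a forecast event rate to a probability of exceeding a ground-motion level during the forecast window: with a Poisson rate of triggering events and a per-event exceedance probability p, the probability of at least one exceedance is 1 - exp(-rate * p). The numbers and the decision threshold in the sketch below are invented for illustration, not values from the Basel experiment.

```python
import math

def prob_exceedance(event_rate: float, p_exceed_per_event: float) -> float:
    """Probability of at least one exceedance in the window: 1 - exp(-rate * p)."""
    return 1.0 - math.exp(-event_rate * p_exceed_per_event)

if __name__ == "__main__":
    rate = 4.0        # forecast number of triggering events in the next window (invented)
    p_event = 0.03    # chance that one such event exceeds the intensity level at the site (invented)
    p = prob_exceedance(rate, p_event)
    print(f"P(exceed intensity level in window) = {p:.1%}")
    if p > 0.05:      # illustrative decision threshold, not the Basel traffic-light value
        print("-> raise alarm level / reduce injection rate")
```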

  6. Response to the great East Japan earthquake of 2011 and the Fukushima nuclear crisis: the case of the Laboratory Animal Research Center at Fukushima Medical University.

    PubMed

    Katahira, Kiyoaki; Sekiguchi, Miho

    2013-01-01

    A magnitude 9.0 great earthquake, the 2011 off the Pacific coast of Tohoku Earthquake, occurred on March 11, 2011, and the subsequent Fukushima Daiichi Nuclear Power Station (Fukushima NPS) accidents raised radiation levels around the campus of Fukushima Medical University (FMU). FMU is located in Fukushima City, 57 km to the northwest of Fukushima NPS. Due to temporary failure of the steam boilers, the air conditioning system for the animal rooms, all autoclaves, and a cage washer could not be used at the Laboratory Animal Research Center (LARC) of FMU. The outside air temperature dropped to 0°C overnight, and the temperature inside the animal rooms fell to 10°C for several hours. We placed sterilized nesting materials inside all cages to encourage rodents to create nests. The main water supply was cut off for 8 days in all, while the supply of steam and hot water remained unavailable for 12 days. It took 20 days to restore the air conditioning system to normal operation at the facility. We measured radiation levels in the animal rooms to confirm the safety of care staff and researchers. On April 21, May 9, and June 17, the average radiation levels at a central work table in the animal rooms with HEPA filters were 46.5, 44.4, and 43.4 cpm, respectively, which is equal to the background level of the equipment. We sincerely hope our experiences will be a useful reference regarding crisis management for many institutes keeping laboratory animals. PMID:23615301

  7. GPS Monitoring of Surface Change During and Following the Fortuitous Occurrence of the M(sub w) = 7.3 Landers Earthquake in our Network

    NASA Technical Reports Server (NTRS)

    Miller, M. Meghan

    1998-01-01

    Accomplishments: (1) Continues GPS monitoring of surface change during and following the fortuitous occurrence of the M(sub w) = 7.3 Landers earthquake in our network, in order to characterize earthquake dynamics and accelerated activity of related faults as far as 100's of kilometers along strike. (2) Integrates the geodetic constraints into consistent kinematic descriptions of the deformation field that can in turn be used to characterize the processes that drive geodynamics, including seismic cycle dynamics. In 1991, we installed and occupied a high precision GPS geodetic network to measure transform-related deformation that is partitioned from the Pacific - North America plate boundary northeastward through the Mojave Desert, via the Eastern California shear zone to the Walker Lane. The onset of the M(sub w) = 7.3 June 28, 1992, Landers, California, earthquake sequence within this network poses unique opportunities for continued monitoring of regional surface deformation related to the culmination of a major seismic cycle, characterization of the dynamic behavior of continental lithosphere during the seismic sequence, and post-seismic transient deformation. During the last year, we have reprocessed all three previous epochs for which JPL fiducial-free point positioning products are available and are queued for the remaining needed products, completed two field campaigns monitoring approx. 20 sites (October 1995 and September 1996), begun modeling by development of a finite element mesh based on network station locations, and developed manuscripts dealing with both the Landers-related transient deformation at the latitude of Lone Pine and the velocity field of the whole experiment. We are currently deploying a 1997 observation campaign (June 1997). We use GPS geodetic studies to characterize deformation in the Mojave Desert region and related structural domains to the north, and geophysical modeling of lithospheric behavior. The modeling is constrained by our

  8. Near real-time model to monitor SST anomalies related to undersea earthquakes and SW monsoon phenomena from TRMM-AQUA satellite data

    NASA Astrophysics Data System (ADS)

    Chakravarty, Subhas

    A near real-time interactive computer model has been developed to extract daily mean global Sea Surface Temperature (SST) values of 1440x720 pixels, each one covering a 0.25° x 0.25° lat-long area, and SST anomalies from longer period means pertaining to any required oceanic grid size of interest. The core MATLAB code uses the daily binary files (3-day aggregate values) of global SST data (derived from TRMM/TMI-AQUA/AMSRE satellite sensors) available on a near real-time basis through the REMSS/NASA website and converts these SSTs into global/regional maps and displays as well as digitised text data tables for further analysis. As demonstrated applications of the model, the SST data for the period 2003-2009 have been utilised to study (a) SST anomalies before, during and after the occurrence of two great under-sea earthquakes of 26 December 2004 and 28 March 2005 near the western coast of Sumatra and (b) variation of pixel numbers with SSTs between 27-31°C within (i) the Nino 4 region and (ii) a broader western Pacific region (say Nino-BP) affected by ENSO events before (January-May) and during (June-October) Monsoon onset/progress. Preliminary results of these studies have been published (Chakravarty, The Open Oceanography Journal, 2009 and Chakravarty, IEEE Xplore, 2009). The results of the SST-earthquake analysis indicate a small but consistent warming of 0.2-0.3°C in the 2° x 2° grid area near the earthquake epicentre, starting from a week before to a week after the event of 26 December 2004. The changes observed in SST for the second earthquake are also indicated but with less clarity, owing to the mixing of land and ocean surfaces and hence the smaller number of SST pixels available within the 2° x 2° grid area near the corresponding epicentre. Similar analysis for the same period of non-earthquake years did not show any such SST anomalies. These results have far reaching implications to use SST as a possible parameter to be monitored for signalling occurrence of
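
    The extraction step described above can be sketched as follows, assuming a flat binary file holding a 720 x 1440 float32 grid at 0.25-degree resolution (the actual REMSS byte-scaled format differs, so the reader would need to be adapted); the function averages SST in a 2 x 2 degree box around a chosen epicentre so that an anomaly can be formed against a multi-year mean computed the same way.

      import numpy as np

      NLAT, NLON = 720, 1440   # 0.25-degree global grid, as in the abstract

      def load_sst(path):
          # Assumed layout: flat float32 array ordered latitude x longitude, with
          # cell centres starting at (-89.875N, 0.125E); adapt to the real file spec.
          sst = np.fromfile(path, dtype=np.float32).reshape(NLAT, NLON)
          return np.where(sst < -3.0, np.nan, sst)   # mask land/ice/missing data

      def box_mean_sst(sst, lat0, lon0, half_width_deg=1.0):
          # Mean SST in a (2*half_width) x (2*half_width) degree box, e.g. the
          # 2 x 2 degree area around an earthquake epicentre used in the abstract.
          lats = -89.875 + 0.25 * np.arange(NLAT)
          lons = 0.125 + 0.25 * np.arange(NLON)
          ii = np.where(np.abs(lats - lat0) <= half_width_deg)[0]
          jj = np.where(np.abs(((lons - lon0 + 180.0) % 360.0) - 180.0) <= half_width_deg)[0]
          return np.nanmean(sst[np.ix_(ii, jj)])

      # Hypothetical usage near the 26 December 2004 epicentre (about 3.3N, 95.9E):
      # daily = box_mean_sst(load_sst("sst_20041220.bin"), 3.3, 95.9)
      # anomaly = daily - multi_year_mean_for_that_calendar_day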

  9. Photovoltaic Performance and Reliability Database: A Gateway to Experimental Data Monitoring Projects for PV at the Florida Solar Energy Center

    DOE Data Explorer

    This site is the gateway to experimental data monitoring projects for photovoltaics (PV) at the Florida Solar Energy Center. The website and the database were designed to facilitate and standardize the processes for archiving, analyzing and accessing data collected from dozens of operational PV systems and test facilities monitored by FSEC's Photovoltaics and Distributed Generation Division. [copied from http://www.fsec.ucf.edu/en/research/photovoltaics/data_monitoring/index.htm]

  10. Episodic tremor triggers small earthquakes

    NASA Astrophysics Data System (ADS)

    Balcerak, Ernie

    2011-08-01

    It has been suggested that episodic tremor and slip (ETS), the weak shaking not associated with measurable earthquakes, could trigger nearby earthquakes. However, this had not been confirmed until recently. Vidale et al. monitored seismicity in the 4-month period around a 16-day episode of episodic tremor and slip in March 2010 in the Cascadia region. They observed five small earthquakes within the subducting slab during the ETS episode. They found that the timing and locations of earthquakes near the tremor suggest that the tremor and earthquakes are related. Furthermore, they observed that the rate of earthquakes across the area was several times higher within 2 days of tremor activity than at other times, adding to evidence of a connection between tremor and earthquakes. (Geochemistry, Geophysics, Geosystems, doi:10.1029/2011GC003559, 2011)

  11. Data Management Coordinators Monitor STS-78 Mission at the Huntsville Operations Support Center

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Launched on June 20, 1996, the STS-78 mission's primary payload was the Life and Microgravity Spacelab (LMS), which was managed by the Marshall Space Flight Center (MSFC). During the 17 day space flight, the crew conducted a diverse slate of experiments divided into a mix of life science and microgravity investigations. In a manner very similar to future International Space Station operations, LMS researchers from the United States and their European counterparts shared resources such as crew time and equipment. Five space agencies (NASA/USA, European Space Agency/Europe (ESA), French Space Agency/France, Canadian Space Agency /Canada, and Italian Space Agency/Italy) along with research scientists from 10 countries worked together on the design, development and construction of the LMS. This photo represents Data Management Coordinators monitoring the progress of the mission at the Huntsville Operations Support Center (HOSC) Spacelab Payload Operations Control Center (SL POCC) at MSFC. Pictured are assistant mission scientist Dr. Dalle Kornfeld, Rick McConnel, and Ann Bathew.

  12. The Deep Impact Network Experiment Operations Center Monitor and Control System

    NASA Technical Reports Server (NTRS)

    Wang, Shin-Ywan (Cindy); Torgerson, J. Leigh; Schoolcraft, Joshua; Brenman, Yan

    2009-01-01

    The Interplanetary Overlay Network (ION) software at JPL is an implementation of Delay/Disruption Tolerant Networking (DTN) which has been proposed as an interplanetary protocol to support space communication. The JPL Deep Impact Network (DINET) is a technology development experiment intended to increase the technical readiness of the JPL implemented ION suite. The DINET Experiment Operations Center (EOC) developed by JPL's Protocol Technology Lab (PTL) was critical in accomplishing the experiment. EOC, containing all end nodes of simulated spaces and one administrative node, exercised publish and subscribe functions for payload data among all end nodes to verify the effectiveness of data exchange over ION protocol stacks. A Monitor and Control System was created and installed on the administrative node as a multi-tiered internet-based Web application to support the Deep Impact Network Experiment by allowing monitoring and analysis of the data delivery and statistics from ION. This Monitor and Control System includes the capability of receiving protocol status messages, classifying and storing status messages into a database from the ION simulation network, and providing web interfaces for viewing the live results in addition to interactive database queries.
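
    A minimal sketch of the receive-classify-store pipeline described above: one status message is parsed, labeled by type, and written to a small database that a web tier could later query. The message format, field names, and types here are hypothetical, not the actual ION/DINET status schema.

      import json
      import sqlite3

      db = sqlite3.connect("status_messages.db")
      db.execute("""CREATE TABLE IF NOT EXISTS status (
                        received_at TEXT, node TEXT, msg_type TEXT, payload TEXT)""")

      def ingest(raw_line):
          # Assumed JSON-per-line feed of protocol status messages.
          msg = json.loads(raw_line)
          msg_type = msg.get("type", "unknown")   # e.g. a bundle sent/received/expired label
          db.execute("INSERT INTO status VALUES (datetime('now'), ?, ?, ?)",
                     (msg.get("node", "?"), msg_type, raw_line))
          db.commit()

      ingest('{"node": "lander-sim", "type": "bundle_received", "bytes": 512}')
      print(db.execute("SELECT msg_type, COUNT(*) FROM status GROUP BY msg_type").fetchall())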

  13. Temporal Evolution of Effective Upper Mantle Viscosity from Postseismic Response to the 2006-2007 Great Kuril Earthquakes: Four Years of GPS Monitoring

    NASA Astrophysics Data System (ADS)

    Kogan, M. G.; Vasilenko, N. F.; Frolov, D. I.; Freymueller, J. T.; Prytkov, A. S.

    2012-12-01

    Transient surface deformation was still observed by GPS 40 years after two giant (M ~9) megathrust earthquakes in the 20th century: the 1960 Chile and the 1964 Alaska events [Hu et al., 2004; Suito and Freymueller, 2009]. The postseismic signal was attributed to viscoelastic relaxation in a Maxwell mantle wedge with constant viscosity on the order of 10^19 Pa s. In contrast, postseismic deformation for 3-4 years after the 2002 M 7.9 Denali and the 1997 M 7.6 Manyi, Tibet earthquakes requires much lower Maxwell viscosity on the order of 10^17 - 10^18 Pa s [Freed et al., 2006; Ryder et al., 2007; Biggs et al., 2009]. These early postseismic GPS and InSAR time series also suggest an increase in viscosity with time, which would be inconsistent with a uniform Maxwell viscosity. Here we analyze surface deformation following the doublet of the 2006-2007 M > 8 Kuril megathrust earthquakes using 4 years of postseismic continuous GPS time series on the Kuril GPS Array. We split the time series into four annual intervals starting at epoch 2007.5, i.e., about 7 months after the 2006 earthquake, and search for the best-fitting Maxwell viscosity year by year, after accounting for afterslip and the background interseismic strain signal. Earlier we showed that the contribution of afterslip to the Kuril postseismic displacement has been small since about epoch 2007.5 [Kogan et al., 2011]. The background interseismic strain signal was not measured on the central Kurils at the stations showing the largest postseismic motion because observations started several months after the earthquakes. From analysis of trench-parallel gravity anomalies, Song and Simons [2003] proposed weak interseismic locking at the subduction interface in the central Kurils. If this hypothesis holds, we can expect small interseismic velocities at the sites affected by postseismic deformation. We tested three simple variants of corrections for interseismic motion of these sites, ranging from 0 to the mean velocity at the
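
    As a simplified illustration of the year-by-year fitting described above (the authors model a viscoelastic mantle wedge; this sketch only fits a single exponential relaxation to a displacement time series and converts the relaxation time to an effective viscosity via eta ~ mu*tau, a conversion whose published conventions can differ by a factor of 2):

      import numpy as np
      from scipy.optimize import curve_fit

      def postseismic_exp(t_years, amp, tau_years, trend):
          # Exponential relaxation plus a linear interseismic trend (illustrative form).
          return amp * (1.0 - np.exp(-t_years / tau_years)) + trend * t_years

      def effective_maxwell_viscosity(tau_years, shear_modulus_pa=6.5e10):
          # eta ~ mu * tau; published conventions sometimes include a factor of 2.
          return shear_modulus_pa * tau_years * 365.25 * 86400.0

      # Hypothetical displacements (metres) at epochs (years) after 2007.5:
      t_obs = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
      d_obs = np.array([0.020, 0.050, 0.070, 0.085, 0.095])
      popt, _ = curve_fit(postseismic_exp, t_obs, d_obs, p0=[0.1, 0.5, 0.0])
      print(f"tau = {popt[1]:.2f} yr -> eta ~ {effective_maxwell_viscosity(popt[1]):.2e} Pa s")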

  14. Science center capabilities to monitor and investigate Michigan’s water resources, 2016

    USGS Publications Warehouse

    Giesen, Julia A.; Givens, Carrie E.

    2016-09-06

    Michigan faces many challenges related to water resources, including flooding, drought, water-quality degradation and impairment, varying water availability, watershed-management issues, stormwater management, aquatic-ecosystem impairment, and invasive species. Michigan’s water resources include approximately 36,000 miles of streams, over 11,000 inland lakes, 3,000 miles of shoreline along the Great Lakes (MDEQ, 2016), and groundwater aquifers throughout the State. The U.S. Geological Survey (USGS) works in cooperation with local, State, and other Federal agencies, as well as tribes and universities, to provide scientific information used to manage the water resources of Michigan. To effectively assess water resources, the USGS uses standardized methods to operate streamgages, water-quality stations, and groundwater stations. The USGS also monitors water quality in lakes and reservoirs, makes periodic measurements along rivers and streams, and maintains all monitoring data in a national, quality-assured, hydrologic database. The USGS in Michigan investigates the occurrence, distribution, quantity, movement, and chemical and biological quality of surface water and groundwater statewide. Water-resource monitoring and scientific investigations are conducted statewide by USGS hydrologists, hydrologic technicians, biologists, and microbiologists who have expertise in data collection as well as various scientific specialties. A support staff consisting of computer-operations and administrative personnel provides the USGS the functionality to move science forward. Funding for USGS activities in Michigan comes from local and State agencies, other Federal agencies, direct Federal appropriations, and through the USGS Cooperative Matching Funds, which allows the USGS to partially match funding provided by local and State partners. This fact sheet provides an overview of the USGS current (2016) capabilities to monitor and study Michigan’s vast water resources. More

  15. Development of Distributed Research Center for monitoring and projecting regional climatic and environmental changes: first results

    NASA Astrophysics Data System (ADS)

    Gordov, Evgeny; Shiklomanov, Alexander; Okladinikov, Igor; Prusevich, Alex; Titov, Alexander

    2016-04-01

    We report the description and first results of the cooperative project "Development of Distributed Research Center for monitoring and projecting of regional climatic and environmental changes" recently started by SCERT IMCES and ESRC UNH. The project is aimed at developing a hardware and software platform prototype of a Distributed Research Center (DRC) for monitoring and projecting regional climatic and environmental changes over areas of mutual interest, and at demonstrating the benefits of such collaboration, which complements skills and regional knowledge across the northern extratropics. In the framework of the project, innovative approaches to "cloud" processing and analysis of large geospatial datasets will be developed on the technical platforms of two leading U.S. and Russian institutions involved in research on climate change and its consequences. The anticipated results will create a pathway for the development and deployment of thematic international virtual research centers focused on interdisciplinary environmental studies by international research teams. The DRC under development will comprise the best features and functionality of the information-computational systems RIMS (http://rims.unh.edu) and CLIMATE (http://climate.scert.ru/), earlier developed by the cooperating teams and widely used in Northern Eurasia environmental studies. The project includes several major directions of research (Tasks) listed below. 1. Development of the architecture and definition of the major hardware and software components of the DRC for monitoring and projecting regional environmental changes. 2. Development of an information database and computing software suite for distributed processing and analysis of large geospatial data hosted at ESRC and IMCES SB RAS. 3. Development of a geoportal, thematic web client and web services providing international research teams with access to "cloud" computing resources at the DRC; two options will be executed: access through a basic graphical web browser and

  16. Lecture Demonstrations on Earthquakes for K-12 Teachers and Students

    NASA Astrophysics Data System (ADS)

    Dry, M. D.; Patterson, G. L.

    2005-12-01

    attached geophone, a touch-screen monitor, and various manipulatives. CERI is also developing suitcase kits and activities for teachers to borrow and use in their classrooms. The suitcase kits include activities based on state learning standards, such as layers of the Earth and plate tectonics. Items included in the suitcase modules include a shake table and dollhouse, an oscilloscope and geophone, a resonance model, a Slinky, Silly putty, Popsicle sticks, and other items. Almost all of the activities feature a lecture demonstration component. These projects would not be possible without leveraged funding from the Mid-America Earthquake Center (MAEC) and the Center for Earthquake Research and Information, with additional funding from the National Earthquake Hazards Reduction Program (NEHRP).

  17. Test data from the chloride-monitor well at Sun City Center, Hillsborough County, Florida

    USGS Publications Warehouse

    Sinclair, William C.

    1979-01-01

    A test well drilled for Southwest Florida Water Management District at Sun City Center in Hillsborough County, will serve to monitor the interface between freshwater in the aquifer and the underlying chloride water. The sulfate content of the water in the aquifer at the well site exceeds 250 mg/L below a depth of about 700 feet. Wells for domestic and public supply in the area bottom at less than 500 feet and are separated from the sulfate water by about 100 feet of poorly-permeable limestone. The freshwater-chloride water interface is quite sharp and occurs at a depth of 1,410 feet. The chloride water is similar in composition to seawater but nearly twice as saline. (Woodard-USGS).

  18. Pain reduction and financial incentives to improve glucose monitoring adherence in a community health center.

    PubMed

    Huntsman, Mary Ann H; Olivares, Faith J; Tran, Christina P; Billimek, John; Hui, Elliot E

    2014-01-01

    Self-monitoring of blood glucose is a critical component of diabetes management. However, patients often do not maintain the testing schedule recommended by their healthcare provider. Many barriers to testing have been cited, including cost and pain. We present a small pilot study to explore whether the use of financial incentives and pain-free lancets could improve adherence to glucose testing in a community health center patient population consisting largely of non-English speaking ethnic minorities with low health literacy. The proportion of patients lost to follow-up was 17%, suggesting that a larger scale study is feasible in this type of setting, but we found no preliminary evidence suggesting a positive effect on adherence by either financial incentives or pain-free lancets. Results from this pilot study will guide the design of larger-scale studies to evaluate approaches to overcome the variety of barriers to glucose testing that are present in disadvantaged patient populations. PMID:25486531

  19. Activation and implementation of a Federal Radiological Monitoring and Assessment Center

    SciTech Connect

    Doyle, J.F. III

    1989-01-01

    The Nevada Operations Office of the U.S. Department of Energy (DOE/NV) has been assigned the primary responsibility for responding to a major radiological emergency. The initial response to any radiological emergency, however, will probably be conducted under the DOE regional radiological assistance plan (RAP). If the dimensions of the crisis demand federal assistance, the following sequence of events may be anticipated: (1) DOE regional RAP response, (2) activation of the Federal Radiological Monitoring and Assessment Center (FRMAC) requested, (3) aerial measuring systems and DOE/NV advance party respond, (4) FRMAC activated, (5) FRMAC responds to state(s) and cognizant federal agency (CFA), and (6) management of FRMAC transferred to the Environmental Protection Agency (EPA). The paper discusses activation channels, authorization, notification, deployment, and interfaces.

  20. Seismically active area monitoring by robust TIR satellite techniques: a sensitivity analysis on low magnitude earthquakes in Greece and Turkey

    NASA Astrophysics Data System (ADS)

    Corrado, R.; Caputo, R.; Filizzola, C.; Pergola, N.; Pietrapertosa, C.; Tramutoli, V.

    2005-01-01

    Space-time TIR anomalies, observed from months to weeks before earthquake occurrence, have been suggested by several authors as pre-seismic signals. Up to now, such a claimed connection of TIR emission with seismic activity has been considered with some caution by the scientific community, mainly because of the insufficiency of the validation data-sets and the scarce importance attached by those authors to other causes (e.g. meteorological) that, rather than seismic activity, could be responsible for the observed TIR signal fluctuations. A robust satellite data analysis technique (RAT) has recently been proposed which, thanks to a well-founded definition of TIR anomaly, seems to be able to identify anomalous space-time TIR signal transients even in very variable observational (satellite view angle, land topography and coverage, etc.) and natural (e.g. meteorological) conditions. Its possible application to satellite TIR surveys in seismically active regions has already been tested in the case of several earthquakes (Irpinia: 23 November 1980, Athens: 7 September 1999, Izmit: 17 August 1999) of magnitude higher than 5.5 by using a validation/confutation approach, devoted to verifying the presence/absence of anomalous space-time TIR transients in the presence/absence of seismic activity. In these cases, a magnitude threshold (generally M<5) was arbitrarily chosen in order to identify seismically unperturbed periods for confutation purposes. In this work, 9 medium-low magnitude (4<M<5) earthquakes which occurred in Greece and Turkey have been analyzed in order to verify if, even in these cases, anomalous TIR transients can be observed. The analysis, which was performed using 8 years of Meteosat TIR observations, demonstrated that anomalous TIR transients can be observed even in the presence of medium-low magnitude earthquakes (4<M<5). As far as earthquake occurrence is concerned, such a result suggests
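
    The core of a robust TIR anomaly definition of the kind referred to above can be sketched as a pixel-wise normalized excess relative to the multi-year mean and standard deviation of homologous scenes (same month and acquisition time); this simplification is not the published RAT/RETIRA algorithm, and all names here are illustrative.

      import numpy as np

      def tir_anomaly_index(current_scene, reference_scenes):
          # Pixel-wise normalized excess of the current TIR scene relative to the
          # multi-year mean and standard deviation of homologous reference scenes;
          # a simplification, not the published RAT/RETIRA formulation.
          ref = np.asarray(reference_scenes, dtype=float)   # shape (n_years, ny, nx)
          mean = np.nanmean(ref, axis=0)
          std = np.nanstd(ref, axis=0)
          with np.errstate(invalid="ignore", divide="ignore"):
              return (np.asarray(current_scene, dtype=float) - mean) / std

      # Pixels whose index stays above roughly 2-3 over space and time would be
      # flagged as candidate anomalous transients, then screened against
      # meteorological and observational causes before any seismic interpretation.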

  1. Appraising the Early-est earthquake monitoring system for tsunami alerting at the Italian Candidate Tsunami Service Provider

    NASA Astrophysics Data System (ADS)

    Bernardi, F.; Lomax, A.; Michelini, A.; Lauciani, V.; Piatanesi, A.; Lorito, S.

    2015-09-01

    In this paper we present and discuss the performance of the procedure for earthquake location and characterization implemented in the Italian Candidate Tsunami Service Provider at the Istituto Nazionale di Geofisica e Vulcanologia (INGV) in Rome. Following the ICG/NEAMTWS guidelines, the first tsunami warning messages are based only on seismic information, i.e., epicenter location, hypocenter depth, and magnitude, which are automatically computed by the software Early-est. Early-est is a package for rapid location and seismic/tsunamigenic characterization of earthquakes. The Early-est software package operates using offline-event or continuous-real-time seismic waveform data to perform trace processing and picking, and, at a regular report interval, phase association, event detection, hypocenter location, and event characterization. Early-est also provides mb, Mwp, and Mwpd magnitude estimations. mb magnitudes are preferred for events with Mwp ≲ 5.8, while Mwpd estimations are valid for events with Mwp ≳ 7.2. In this paper we present the earthquake parameters computed by Early-est between the beginning of March 2012 and the end of December 2014 on a global scale for events with magnitude M ≥ 5.5, and we also present the detection timeline. We compare the earthquake parameters automatically computed by Early-est with the same parameters listed in reference catalogs. Such reference catalogs are manually revised/verified by scientists. The goal of this work is to test the accuracy and reliability of the fully automatic locations provided by Early-est. In our analysis, the epicenter location, hypocenter depth and magnitude parameters do not differ significantly from the values in the reference catalogs. Both mb and Mwp magnitudes show differences to the reference catalogs. We thus derived correction functions in order to minimize the differences and correct biases between our values and the ones from the reference catalogs. Correction of the Mwp
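
    The correction functions mentioned above amount to calibrating the automatic magnitudes against the manually revised reference values; a minimal sketch, assuming a simple linear relation and hypothetical paired values, is:

      import numpy as np

      def magnitude_correction(auto_mag, ref_mag):
          # Least-squares linear mapping from automatic magnitudes (e.g. Early-est
          # Mwp) onto reference-catalog values: ref ~ a * auto + b.
          a, b = np.polyfit(np.asarray(auto_mag, float), np.asarray(ref_mag, float), deg=1)
          return a, b

      # Hypothetical paired values (automatic vs. manually revised catalog):
      auto = [5.6, 6.1, 6.4, 7.0, 7.5, 8.1]
      ref = [5.8, 6.2, 6.4, 6.9, 7.6, 8.0]
      a, b = magnitude_correction(auto, ref)
      print(f"corrected magnitude = {a:.2f} * M_auto + {b:.2f}")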

  2. Seismotectonics of the May 19, 2011 Simav- Kutahya Earthquake Activity

    NASA Astrophysics Data System (ADS)

    Komec Mutlu, Ahu

    2014-05-01

    The aftershock sequence of the May 19, 2011 Simav earthquake (Mw = 5.8) was relocated with a new 1-D seismic velocity model, and focal mechanisms of the largest aftershocks were determined. The May 19, 2011 Simav-Kutahya earthquake occurred in the most seismically active region of western Turkey. During the six months after the mainshock, more than 5000 earthquakes were recorded, and aftershocks followed over a period of almost two years. In this study, more than 7600 aftershocks with magnitudes greater than 1.8 that occurred between 2011 and 2012 were relocated. Waveform data were collected by 13 three-component seismic stations from three different networks (Kandilli Observatory and Earthquake Research Institute (NEMC-National Earthquake Monitoring Center), the Prime Ministry Disaster and Emergency Management Presidency, Department of Earthquake, and Canakkale Onsekiz Mart University Geophysics Department). These seismic stations were deployed within 80 km epicentral distance in the Simav-Kutahya region. The average crustal velocity and average crustal thickness for the region were computed as 5.68 km/s and 37.6 km, respectively. The source mechanisms of fifty aftershocks with magnitudes greater than 4.0 were derived from P-wave first motions. Analysis of the focal mechanisms indicates mainly normal faulting with oblique slip.

  3. Incorporating Fundamentals of Climate Monitoring into Climate Indicators at the National Climatic Data Center

    NASA Astrophysics Data System (ADS)

    Arndt, D. S.

    2014-12-01

    In recent years, much attention has been dedicated to the development, testing and implementation of climate indicators. Several Federal agencies and academic groups have commissioned suites of indicators drawing upon and aggregating information available across the spectrum of climate data stewards and providers. As a long-time participant in the applied climatology discipline, NOAA's National Climatic Data Center (NCDC) has generated climate indicators for several decades. Traditionally, these indicators were developed for sectors with long-standing relationships with, and needs of, the applied climatology field. These have recently been adopted and adapted to meet the needs of sectors that have newfound sensitivities to climate and needs for climate data. Information and indices from NOAA's National Climatic Data Center have been prominent components of these indicator suites, and in some cases have been drafted in toto by these aggregators, often with improvements to the communicability and aesthetics of the indicators themselves. Across this history of supporting needs for indicators, NCDC climatologists developed a handful of practical approaches and philosophies that inform a successful climate monitoring product. This manuscript and presentation will demonstrate the utility of this set of practical applications that translate raw data into useful information.

  4. Insights into the origins of drumbeat earthquakes, periodic low frequency seismicity, and plug degradation from multi-instrument monitoring at Tungurahua volcano, Ecuador, April 2015

    NASA Astrophysics Data System (ADS)

    Bell, Andrew; Hernandez, Stephen; Gaunt, Elizabeth; Mothes, Patricia; Hidalgo, Silvana; Ruiz, Mario

    2016-04-01

    Highly-periodic repeating 'drumbeat' earthquakes have been reported from several andesitic and dacitic volcanoes. Physical models for the origin of drumbeat earthquakes incorporate, to different extents, the incremental upward movement of viscous magma. However, the roles played by stick-slip friction, brittle failure, and fluid flow, and the relations between drumbeat earthquakes and other low-frequency seismic signals, remain controversial. Here we report the results of analysis of three weeks of geophysical data recorded during an unrest episode at Tungurahua, an andesitic stratovolcano in Ecuador, during April 2015, by the monitoring network of the Instituto Geofisico of Ecuador. Combined seismic, geodetic, infrasound, and gas monitoring has provided new insights into the origins of periodic low-frequency seismic signals, conduit processes, and the nature of current unrest. Over the three-week period, the relative seismic amplitude (RSAM) correlated closely with short-term deformation rates and gas fluxes. However, the characteristics of the seismic signals, as recorded at a short-period station closest to the summit crater, changed considerably with time. Initially high RSAM and gas fluxes, with modest ash emissions, were associated with continuous and 'pulsed' tremor signals (amplitude modulated, with 30-100 second periods). As activity levels decreased over several days, tremor episodes became increasingly intermittent, and short-lived bursts of low-frequency earthquakes with quasiperiodic inter-event times were observed. Following one day of quiescence, the onset of pronounced low frequency drumbeat earthquakes signalled the resumption of elevated unrest, initially with mean inter-event times of 32 seconds, and later increasing to 74 seconds and longer, with periodicity progressively breaking down over several days. A reduction in RSAM was then followed by one week of persistent, quasiperiodic, longer-duration emergent low-frequency pulses, including
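
    RSAM, used above as the activity measure, is essentially the mean absolute amplitude of a demeaned seismic trace in consecutive fixed-length windows; a minimal sketch (the window length and sampling rate are illustrative) is:

      import numpy as np

      def rsam(trace, sampling_rate_hz, window_s=600.0):
          # Mean absolute amplitude of a demeaned trace in consecutive fixed-length
          # windows (a window of roughly 10 minutes is a common choice).
          x = np.abs(np.asarray(trace, dtype=float) - np.mean(trace))
          n = int(window_s * sampling_rate_hz)
          n_windows = len(x) // n
          return x[: n_windows * n].reshape(n_windows, n).mean(axis=1)

      # Hypothetical usage with a 100 Hz velocity trace stored in an array `data`:
      # rsam_values = rsam(data, sampling_rate_hz=100.0)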

  5. Interoperable Access to Near Real Time Ocean Observations with the Observing System Monitoring Center

    NASA Astrophysics Data System (ADS)

    O'Brien, K.; Hankin, S.; Mendelssohn, R.; Simons, R.; Smith, B.; Kern, K. J.

    2013-12-01

    The Observing System Monitoring Center (OSMC), a project funded by the National Oceanic and Atmospheric Administration's Climate Observations Division (COD), exists to join the discrete 'networks' of in situ ocean observing platforms -- ships, surface floats, profiling floats, tide gauges, etc. -- into a single, integrated system. The OSMC is addressing this goal through capabilities in three areas focusing on the needs of specific user groups: 1) it provides real time monitoring of the integrated observing system assets to assist management in optimizing the cost-effectiveness of the system for the assessment of climate variables; 2) it makes the stream of real time data coming from the observing system available to scientific end users in an easy-to-use form; and 3) in the future, it will unify the delayed-mode data from platform-focused data assembly centers into a standards-based distributed system that is readily accessible to interested users from the science and education communities. In this presentation, we will be focusing on the efforts of the OSMC to provide interoperable access to the near real time data stream that is available via the Global Telecommunications System (GTS). This is a very rich data source, and includes data from nearly all of the oceanographic platforms that are actively observing. We will discuss how the data are being served out using a number of widely used 'web services' (including OPeNDAP and SOS) and downloadable file formats (KML, csv, xls, netCDF), so that they can be accessed in web browsers and popular desktop analysis tools. We will also be discussing our use of the Environmental Research Division's Data Access Program (ERDDAP), available from NOAA/NMFS, which has allowed us to achieve our goals of serving the near real time data. From an interoperability perspective, it's important to note that access to this stream of data is not just for humans, but also for machine-to-machine requests. We'll also delve into how we
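
    ERDDAP's "tabledap" services, mentioned above, let a dataset be subset through a URL query and returned as CSV (among other formats); the sketch below only shows the general request form, with a placeholder server and dataset identifier rather than the actual OSMC endpoints.

      # Placeholder server and dataset identifier, not the actual OSMC endpoints.
      base = "https://example-erddap-server/erddap/tabledap/example_osmc_realtime.csv"
      # Request a few variables with a time constraint (">=" percent-encoded as %3E=).
      url = base + "?platform_code,time,latitude,longitude,sst&time%3E=2013-08-01T00:00:00Z"

      # The same URL form serves browsers and machine-to-machine clients alike, e.g.:
      #   import urllib.request
      #   rows = urllib.request.urlopen(url).read().decode("utf-8").splitlines()
      print(url)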

  6. Therapeutic drug monitoring of mexiletine at a large academic medical center

    PubMed Central

    Nei, Scott D; Danelich, Ilya M; Lose, Jennifer M; Leung, Lydia Yuk Ting; Asirvatham, Samuel J; McLeod, Christopher J

    2016-01-01

    Introduction: The therapeutic trough range for mexiletine (0.8–2 mcg/mL) was largely established in the setting of arrhythmia prophylaxis following myocardial infarction. Objective: Describe the usage patterns of serum mexiletine concentrations and the impact of these concentrations on mexiletine dosing in modern practice for ventricular arrhythmia treatment. Methods: A single-center, retrospective analysis was conducted using the electronic medical record to identify serum mexiletine concentrations drawn between December 2004 and December 2014. The primary endpoint was the incidence of mexiletine concentrations drawn as troughs. Secondary outcomes included the incidence of mexiletine concentrations that prompted a dose change, association between adverse events and elevated concentrations, and association between baseline characteristics and mexiletine concentrations. Results: A total of 237 individual concentrations were included for analysis, with 109 (46.0%) drawn appropriately as trough concentrations. Only 31 (13.1%) of the 237 concentrations drawn prompted a dose change. Mexiletine was primarily used for the treatment of ventricular arrhythmias (96.2%), and 108 (45.6%) concentrations were drawn in an effort to assess efficacy. The median concentration was statistically different between patients with and without an adverse event (0.8 vs 0.7 mcg/mL, respectively; p = 0.017), but this may not be clinically significant. Patients with hepatic dysfunction had higher median concentrations compared to those without hepatic dysfunction (1.30 vs 1.07 mcg/mL; p = 0.01). Conclusion: Mexiletine concentrations are often drawn at inappropriate times and seldom influence a dose change. This study suggests that routine monitoring of mexiletine concentrations may not be necessary; however, therapeutic drug monitoring may be considered in patients with hepatic dysfunction or to confirm mexiletine absorption in patients where this represents a concern. PMID

  7. Chlorine dioxide: a new agent for dialysis monitor disinfection in a pediatric center.

    PubMed

    Palo, T D; Atti, M; Bellantuono, R; Giordano, M; Caringella, D A

    1997-01-01

    In order to evaluate the bacterial and endotoxin contamination in the dialysis fluids of our pediatric center and the effectiveness of chlorine dioxide (CD) compared with a conventional method, (1) deionized water, (2) dialysate fluid, (3) basic concentrate, and (4) acid concentrate were tested in 4 dialysis machines. Monitor sterilization was performed using CD in protocol A and sodium hypochlorite/acetic acid in protocol B. Once every 2 weeks the deionized water set of distribution was routinely disinfected with peracetic acid. Each protocol lasted 1 month, and the samples were taken, under aseptic conditions, on the 15th, 22nd and 27th day. All samples, at all stages of the study, showed an endotoxin concentration below the limits recommended by the Canadian Standard Association. Fifty-nine out of 72 samples in A and 62 out of 72 samples in B showed a bacterial count within the range recommended by the Association for the Advancement of Medical Instrumentation. The data show that both protocols produced the same results. However, protocol A is to be preferred for its simultaneous disinfecting-cleaning and descaling activity, which proves time-saving. PMID:9262845

  8. Operational control of radiation conditions in Space Monitoring Data Center of Moscow State University

    NASA Astrophysics Data System (ADS)

    Kalegaev, Vladimir; Shugay, Yulia; Bobrovnikov, Sergey; Kuznetsov, Nikolay; Barinova, Vera; Myagkova, Irina; Panasyuk, Mikhail

    2016-07-01

    The Space Monitoring Data Center (SMDC) of Moscow State University provides mission support for Russian satellites and operational analysis of radiation conditions in space. The SMDC web sites (http://smdc.sinp.msu.ru/ and http://swx.sinp.msu.ru/) give access to current data on the level of solar activity and on the geomagnetic and radiation state of the Earth's magnetosphere and heliosphere in near-real time. For data analysis, online models of space environment factors have been implemented. Interactive services allow one to retrieve and analyze data at a given time moment. Forecasting applications, including forecasts of solar wind parameters and of geomagnetic and radiation conditions, have been developed. Radiation dose and SEE rate control are of particular importance in practical satellite operation. Satellites are always under the influence of high-energy particle fluxes during their orbital flight. The three main sources of particle fluxes, namely the Earth's radiation belts, galactic cosmic rays, and solar energetic particles (SEP), are taken into account by the SMDC operational services to estimate the radiation dose caused by high-energy particles to a satellite at LEO orbits. The ISO 15039 and AP8/AE8 physical models are used to estimate the effects of galactic cosmic rays and radiation belt particle fluxes. Data from geosynchronous satellites (GOES or Electro-L1) allow reconstruction of the SEP flux spectra at a given low Earth orbit, taking into account the geomagnetic cut-off, which depends on the geomagnetic activity level.

  9. Multi-Year Combination of Tide Gauge Benchmark Monitoring (TIGA) Analysis Center Products

    NASA Astrophysics Data System (ADS)

    Hunegnaw, A.; Teferle, F. N.

    2014-12-01

    In 2013 the International GNSS Service (IGS) Tide Gauge Benchmark Monitoring (TIGA) Working Group (WG) started their reprocessing campaign, which proposes to re-analyze all relevant Global Positioning System (GPS) observations from 1994 to 2013. This re-processed dataset will provide high quality estimates of land motions, enabling regional and global high-precision geophysical/geodetic studies. Several of the individual TIGA Analysis Centers (TACs) have completed processing the full history of GPS observations recorded by the IGS global network, as well as many other GPS stations at or close to tide gauges, which are available from the TIGA data centre at the University of La Rochelle (www.sonel.org). The TAC solutions contain a total of over 700 stations. Following the recent improvements in processing models and strategies, this is the first complete reprocessing attempt by the TIGA WG to provide homogeneous position time series. The TIGA Combination Centre (TCC) at the University of Luxembourg (UL) has computed a first multi-year weekly combined solution using two independent combination software packages: CATREF and GLOBK. These combinations allow an evaluation of any effects from the combination software and of the individual TAC contributions and their influences on the combined solution. In this study we will present the first UL TIGA multi-year combination results and discuss these in terms of geocentric sea level changes.

  10. Long term Combination of Tide Gauge Benchmark Monitoring (TIGA) Analysis Center Products

    NASA Astrophysics Data System (ADS)

    Teferle, F. N.; Hunegnaw, A.

    2015-12-01

    The International GNSS Service (IGS) Tide Gauge Benchmark Monitoring (TIGA) Working Group (WG) has recently finalized their reprocessing campaign, using all relevant Global Positioning System (GPS) observations from 1995 to 2014. This re-processed dataset will provide high quality estimates of land motions, enabling regional and global high-precision geophysical/geodetic studies. Several of the individual TIGA Analysis Centers (TACs) have completed processing the full history of GPS observations recorded by the IGS global network, as well as many other GPS stations at or close to tide gauges, which are available from the TIGA data centre at the University of La Rochelle (www.sonel.org). The TAC solutions contain a total of over 700 stations. Following the recent improvements in processing models and strategies, this is the first complete reprocessing attempt by the TIGA WG to provide homogeneous position time series. The TIGA Combination Centre (TCC) at the University of Luxembourg (UL) has computed a first multi-year weekly combined solution using two independent combination software packages: CATREF and GLOBK. These combinations allow an evaluation of any effects from the combination software and of the individual TAC contributions and their influences on the combined solution. In this study we will present the first UL TIGA multi-year combination results and discuss these in terms of geocentric sea level changes.

  11. Network-based real-time radiation monitoring system in Synchrotron Radiation Research Center.

    PubMed

    Sheu, R J; Wang, J P; Chen, C R; Liu, J; Chang, F D; Jiang, S H

    2003-10-01

    The real-time radiation monitoring system (RMS) in the Synchrotron Radiation Research Center (SRRC) has been upgraded significantly during the past years. The new framework of the RMS is built on the popular network technology, including Ethernet hardware connections and Web-based software interfaces. It features virtually no distance limitations, flexible and scalable equipment connections, faster response time, remote diagnosis, easy maintenance, as well as many graphic user interface software tools. This paper briefly describes the radiation environment in SRRC and presents the system configuration, basic functions, and some operational results of this real-time RMS. Besides the control of radiation exposures, it has been demonstrated that a variety of valuable information or correlations could be extracted from the measured radiation levels delivered by the RMS, including the changes of operating conditions, beam loss pattern, radiation skyshine, and so on. The real-time RMS can be conveniently accessed either using the dedicated client program or World Wide Web interface. The address of the Web site is http://www-rms.srrc.gov.tw.

  12. Satellite Geodetic Constraints On Earthquake Processes: Implications of the 1999 Turkish Earthquakes for Fault Mechanics and Seismic Hazards on the San Andreas Fault

    NASA Technical Reports Server (NTRS)

    Reilinger, Robert

    2005-01-01

    Our principal activities during the initial phase of this project include: 1) Continued monitoring of postseismic deformation for the 1999 Izmit and Duzce, Turkey earthquakes from repeated GPS survey measurements and expansion of the Marmara Continuous GPS Network (MAGNET), 2) Establishing three North Anatolian fault crossing profiles (10 sites/profile) at locations that experienced major surface-fault earthquakes at different times in the past to examine strain accumulation as a function of time in the earthquake cycle (2004), 3) Repeat observations of selected sites in the fault-crossing profiles (2005), 4) Repeat surveys of the Marmara GPS network to continue to monitor postseismic deformation, 5) Refining block models for the Marmara Sea seismic gap area to better understand earthquake hazards in the Greater Istanbul area, 6) Continuing development of models for afterslip and distributed viscoelastic deformation for the earthquake cycle. We are keeping close contact with MIT colleagues (Brad Hager and Eric Hetland) who are developing models for S. California and for the earthquake cycle in general (Hetland, 2006). In addition, our Turkish partners at the Marmara Research Center have undertaken repeat micro-gravity measurements at the MAGNET sites and have provided us estimates of gravity change during the period 2003 - 2005.

  13. Using graphics and expert system technologies to support satellite monitoring at the NASA Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Hughes, Peter M.; Shirah, Gregory W.; Luczak, Edward C.

    1994-01-01

    At NASA's Goddard Space Flight Center, fault-isolation expert systems have been developed to support data monitoring and fault detection tasks in satellite control centers. Based on the lessons learned during these efforts in expert system automation, a new domain-specific expert system development tool named the Generic Spacecraft Analysts Assistant (GenSAA), was developed to facilitate the rapid development and reuse of real-time expert systems to serve as fault-isolation assistants for spacecraft analysts. This paper describes GenSAA's capabilities and how it is supporting monitoring functions of current and future NASA missions for a variety of satellite monitoring applications ranging from subsystem health and safety to spacecraft attitude. Finally, this paper addresses efforts to generalize GenSAA's data interface for more widespread usage throughout the space and commercial industry.

  14. Hidden earthquakes

    SciTech Connect

    Stein, R.S.; Yeats, R.S.

    1989-06-01

    Seismologists generally look for earthquakes to happen along visible fault lines, e.g., the San Andreas fault. The authors maintain that another source of dangerous quakes has been overlooked: the release of stress along a fault that is hidden under a fold in the earth's crust. The paper describes the differences between an earthquake which occurs on a visible fault and one which occurs under an anticline and warns that Los Angeles' greatest earthquake threat may come from a small quake originating under downtown Los Angeles, rather than a larger earthquake which occurs 50 miles away at the San Andreas fault.

  15. The Terminator Time in subionospheric VLF/LF diurnal variation as recorded by the Romanian VLF/LF radio monitoring system related to earthquake occurrence and volcano eruptions

    NASA Astrophysics Data System (ADS)

    Moldovan, I. A.; Moldovan, A. S.; Biagi, P. F.; Ionescu, C.; Schwingenschuh, K.; Boudjada, M. Y.

    2012-04-01

    The Romanian VLF/LF monitoring system, consisting of a radio receiver and the infrastructure necessary to record and transmit the collected data, is part of the European international network named INFREP. Information on the intensities of the electromagnetic fields created by transmitters at a receiving site indicates the quality of the propagation along the paths between the receivers and transmitters. Studying the ionosphere's influence on electromagnetic wave propagation along a certain path is a method to reveal possible modifications of its lower structure and composition as earthquake precursors. The VLF/LF receiver installed in Romania was put into operation in February 2009 and has already had 3 years of testing and operation, proving its utility in the forecast of some earthquakes or volcanic eruptions. At the same site as the VLF/LF receiver, we simultaneously monitor the vertical atmospheric electric field and various other meteorological parameters such as temperature, pressure and rainfall. The global magnetic conditions are characterized with the help of the daily geomagnetic index Kp. At a basic level, the adopted analysis consists of a simple statistical evaluation of the signals by comparing the instantaneous values to the trend of the signal. In this paper we pay attention to the terminator times in the subionospheric VLF/LF diurnal variation, which are defined as the times of minimum in amplitude (or phase) around sunrise and sunset. These terminator times are found to shift significantly just around the earthquake. In the case of the Kobe earthquake, significant shifts were found in both the morning and evening terminator times, and the authors interpreted the shift in terminator time in terms of the lowering of the lower ionosphere by using the full-wave mode theory. A LabVIEW application which accesses the VLF/LF receiver through the internet was developed. This program opens the receiver's web-page and automatically retrieves the list of data
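
    Terminator times of the kind analysed above can be picked from one day of received-amplitude data as the times of minimum amplitude near the nominal sunrise and sunset hours; the sketch below assumes evenly sampled data covering the day, and the window half-width is illustrative.

      import numpy as np

      def terminator_times(hours, amplitude, sunrise_h, sunset_h, half_window_h=1.5):
          # Times of minimum received amplitude near the nominal sunrise and sunset
          # hours; the window half-width would be tuned for each propagation path.
          t = np.asarray(hours, dtype=float)
          a = np.asarray(amplitude, dtype=float)

          def minimum_near(center_h):
              idx = np.where(np.abs(t - center_h) <= half_window_h)[0]
              return t[idx[np.argmin(a[idx])]]

          return minimum_near(sunrise_h), minimum_near(sunset_h)

      # Usage with one day of data sampled every minute (arrays t_hours, amp_db):
      # tm, te = terminator_times(t_hours, amp_db, sunrise_h=5.8, sunset_h=18.4)
      # Day-to-day shifts of tm and te beyond their normal scatter are the quantity
      # inspected for possible precursory behaviour.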

  16. Earthquakes and the urban environment. Volume II

    SciTech Connect

    Berlin, G.L.

    1980-01-01

    Because of the complex nature of earthquake effects, current investigations encompass many disciplines, including those of both the physical and social sciences. Research activities center on such diversified topics as earthquake mechanics, earthquake prediction and control, the prompt and accurate detection of tsunamis (seismic sea waves), earthquake-resistant construction, seismic building code improvements, land use zoning, earthquake risk and hazard perception, disaster preparedness, plus the study of the concerns and fears of people who have experienced the effects of an earthquake. This monograph attempts to amalgamate recent research input comprising the vivifying components of urban seismology at a level useful to those having an interest in the earthquake and its effects upon an urban environment. Volume 2 contains chapters on earthquake prediction, control, building design and building response.

  17. Earthquakes and the urban environment. Volume I

    SciTech Connect

    Berlin, G.L.

    1980-01-01

    Because of the complex nature of earthquake effects, current investigations encompass many disciplines, including those of both the physical and social sciences. Research activities center on such diversified topics as earthquake mechanics, earthquake prediction and control, the prompt and accurate detection of tsunamis (seismic sea waves), earthquake-resistant construction, seismic building code improvements, land use zoning, earthquake risk and hazard perception, disaster preparedness, plus the study of the concerns and fears of people who have experienced the effects of an earthquake. This monograph attempts to amalgamate recent research input comprising the vivifying components of urban seismology at a level useful to those having an interest in the earthquake and its effects upon an urban environment. Volume 1 contains chapters on earthquake parameters and hazards.

  18. Postseismic deformation monitoring and modeling of the 2008 Mw7.9 Wenchuan, China earthquake, constrained using GPS and InSAR measurements

    NASA Astrophysics Data System (ADS)

    Shen, Z.; Wang, M.; Sun, J.; Liao, H.; Zhang, P.; Hao, M.; Wan, Y.; Burgmann, R.; Zeng, Y.; Gan, W.; Tao, W.; Wang, Q.; Wang, K.; Wang, Y.; Chen, W.; Wang, F.; Xue, L.

    2009-12-01

    We synthesize GPS data observed from a continuous array and InSAR data from the JAXA ALOS and ESA Envisat satellites to monitor postseismic deformation of the 2008 Mw7.9 Wenchuan, China earthquake. The GPS array is composed of three networks, totaling about 50 stations. Two of the networks, spanning the fault rupture zone, were installed after the quake, one by the China Earthquake Administration and Peking University and the other by the Sichuan Bureau of Surveying and Mapping. Another network was installed on the footwall side of the rupture prior to the quake by the Sichuan Seismological Bureau. Some of the sites use pillar-type monuments on hilltops, and others are on the tops of concrete buildings with bedrock foundations. Preliminary analysis of the postseismic data reveals that: (a) Postseismic deformation decays with time, and the temporal behavior of early deformation can be modeled by a logarithmic function with a time constant of 8 days (between 4 and 15 days at 95% confidence). (b) Postseismic deformation mimics the coseismic deformation pattern across much of the region except in the near field on the footwall side of the fault, where postseismic displacements reverse their coseismic motion directions and point to the southeast. This observation suggests a deformation source of afterslip on the lower part of a listric fault, whose dip angle becomes progressively shallower with increasing depth. (c) Vertical postseismic deformation shows continuous uplift of the hanging-wall block and subsidence of the footwall block in the near field, suggesting that afterslip, not lower crust/upper mantle viscous relaxation, is the dominant mechanism for immediate postseismic deformation after the quake. We are modeling the postseismic deformation data using a model incorporating both afterslip and visco-elastic relaxation in the lithosphere. The fault geometry is provided by inversion of coseismic deformation data (Wan et al., this meeting), and layered
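
    The logarithmic decay mentioned above is commonly written d(t) = A ln(1 + t/tau); a minimal sketch of fitting it to a displacement time series (the values below are illustrative, not the Wenchuan data) is:

      import numpy as np
      from scipy.optimize import curve_fit

      def log_decay(t_days, amplitude, tau_days):
          # Logarithmic postseismic decay of the afterslip type: d(t) = A*ln(1 + t/tau).
          return amplitude * np.log(1.0 + t_days / tau_days)

      # Illustrative displacements (metres) at days after the mainshock:
      t = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0, 160.0])
      d = np.array([0.011, 0.024, 0.041, 0.063, 0.090, 0.120, 0.152])
      popt, pcov = curve_fit(log_decay, t, d, p0=[0.05, 8.0])
      tau_err = np.sqrt(np.diag(pcov))[1]
      print(f"tau = {popt[1]:.1f} +/- {tau_err:.1f} days")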

  19. 1964 Great Alaska Earthquake: a photographic tour of Anchorage, Alaska

    USGS Publications Warehouse

    Thoms, Evan E.; Haeussler, Peter J.; Anderson, Rebecca D.; McGimsey, Robert G.

    2014-01-01

    On March 27, 1964, at 5:36 p.m., a magnitude 9.2 earthquake, the largest recorded earthquake in U.S. history, struck southcentral Alaska (fig. 1). The Great Alaska Earthquake (also known as the Good Friday Earthquake) occurred at a pivotal time in the history of earth science, and helped lead to the acceptance of plate tectonic theory (Cox, 1973; Brocher and others, 2014). All large subduction zone earthquakes are understood through insights learned from the 1964 event, and observations and interpretations of the earthquake have influenced the design of infrastructure and seismic monitoring systems now in place. The earthquake caused extensive damage across the State, and triggered local tsunamis that devastated the Alaskan towns of Whittier, Valdez, and Seward. In Anchorage, the main cause of damage was ground shaking, which lasted approximately 4.5 minutes. Many buildings could not withstand this motion and were damaged or collapsed even though their foundations remained intact. More significantly, ground shaking triggered a number of landslides along coastal and drainage valley bluffs underlain by the Bootlegger Cove Formation, a composite of facies containing variably mixed gravel, sand, silt, and clay which were deposited over much of upper Cook Inlet during the Late Pleistocene (Ulery and others, 1983). Cyclic (or strain) softening of the more sensitive clay facies caused overlying blocks of soil to slide sideways along surfaces dipping by only a few degrees. This guide is the document version of an interactive web map that was created as part of the commemoration events for the 50th anniversary of the 1964 Great Alaska Earthquake. It is accessible at the U.S. Geological Survey (USGS) Alaska Science Center website: http://alaska.usgs.gov/announcements/news/1964Earthquake/. The website features a map display with suggested tour stops in Anchorage, historical photographs taken shortly after the earthquake, repeat photography of selected sites, scanned documents

  20. Combination of Tide Gauge Benchmark Monitoring (TIGA) Analysis Center from repro2 solutions

    NASA Astrophysics Data System (ADS)

    Hunegnaw, Addisu; Teferle, Felix Norman

    2016-04-01

    Recently the International GNSS Service (IGS) Tide Gauge Benchmark Monitoring (TIGA) Working Group (WG) has completed its repro2 solutions by re-analyzing the full history of all relevant Global Positioning System (GPS) observations from 1995 to 2015. This re-processed data set will provide high-quality estimates of vertical land movements for more than 500 stations, enabling regional and global high-precision geophysical/geodetic studies. All the TIGA Analysis Centres (TACs) have processed the observations recorded by GPS stations at or close to tide gauges, which are available from the TIGA Data Center at the University of La Rochelle (www.sonel.org), in addition to those of the global IGS core network used for its reference frame implementations. Following the recent improvements in processing models and strategies (http://acc.igs.org/reprocess2.html), this is the first complete re-processing attempt by the TIGA WG to provide homogeneous position time series relevant to sea level changes. In this study we report on a first multi-year daily combined solution from the TIGA Combination Centre (TCC) at the University of Luxembourg (UL) with respect to the latest International Terrestrial Reference Frame (ITRF2014). Using two independent combination software packages, CATREF and GLOBK, we have computed a first daily combined solution from the TAC solutions already available to the TIGA WG. These combinations allow an evaluation of any effects from the combination software and of the individual TAC parameters and their influences on the combined solution with respect to the latest ITRF2014. Some results of the UL TIGA multi-year combinations in terms of geocentric sea level changes will be presented and discussed.

  1. Update NEMC Database using ArcGIS Software and Example of Simav-Kutahya earthquake sequences

    NASA Astrophysics Data System (ADS)

    Altuncu Poyraz, S.; Kalafat, D.; Kekovali, K.

    2011-12-01

    In this study, a total of 144,043 earthquakes with magnitudes 2.0≤M≤7.9 that occurred in Turkey between 1900 and 2011, taken from the Kandilli Observatory and Earthquake Research Institute National Earthquake Monitoring Center (KOERI-NEMC) seismic catalog, were used. The database includes not only the coordinates, dates, magnitudes, and depths of these earthquakes but also the locations, installation information, field studies, geology, and technical properties of 154 seismic stations. Additionally, 1,063 historical earthquakes were included in the database. Source parameters of 738 earthquakes with M≥4.0 that occurred between 1938 and 2008 were added to the database, and source parameters of a further 103 earthquakes (M≥4.5) have been calculated since 2008. To test the database's querying, visualization, and analysis capabilities, the aftershock sequence of the 19 May 2011 Simav-Kutahya earthquake was selected and added to the database. The Simav earthquake (western Anatolia), with magnitude Ml=5.9, occurred at 23:15 local time and is investigated here in terms of accurate event locations and the source properties of the largest events. The aftershock distribution of the Simav earthquake shows the activation of a 17-km-long zone extending in depth between 5 and 10 km. To contribute to a better understanding of the neotectonics of this region, we analyzed the earthquakes using the KOERI (Kandilli Observatory and Earthquake Research Institute) seismic stations along with stations operated by other institutions that successfully recorded the Simav seismic activity in 2011. Source mechanisms of 19 earthquakes with magnitudes 3.8≤Ml<6.0 were calculated by means of the Regional Moment Tensor (RMT) inversion technique. The mechanism solutions show the presence of east-west-trending normal faults in the region; as a result, an extensional regime dominates the study area. The aim of this study is to store and compile earthquake

  2. Earthquake Hazards.

    ERIC Educational Resources Information Center

    Donovan, Neville

    1979-01-01

    Provides a survey and a review of earthquake activity and global tectonics from the advancement of the theory of continental drift to the present. Topics include: an identification of the major seismic regions of the earth, seismic measurement techniques, seismic design criteria for buildings, and the prediction of earthquakes. (BT)

  3. Broadband characteristics of earthquakes recorded during a dome-building eruption at Mount St. Helens, Washington, between October 2004 and May 2005: Chapter 5 in A volcano rekindled: the renewed eruption of Mount St. Helens, 2004-2006

    USGS Publications Warehouse

    Horton, Stephen P.; Norris, Robert D.; Moran, Seth C.; Sherrod, David R.; Scott, William E.; Stauffer, Peter H.

    2008-01-01

    From October 2004 to May 2005, the Center for Earthquake Research and Information of the University of Memphis operated two to six broadband seismometers within 5 to 20 km of Mount St. Helens to help monitor recent seismic and volcanic activity. Approximately 57,000 earthquakes identified during the 7-month deployment had a normal magnitude distribution with a mean magnitude of 1.78 and a standard deviation of 0.24 magnitude units. Both the mode and range of earthquake magnitude and the rate of activity varied during the deployment. We examined the time domain and spectral characteristics of two classes of events seen during dome building. These include volcano-tectonic earthquakes and lower-frequency events. Lower-frequency events are further classified into hybrid earthquakes, low-frequency earthquakes, and long-duration volcanic tremor. Hybrid and low-frequency earthquakes showed a continuum of characteristics that varied systematically with time. A progressive loss of high-frequency seismic energy occurred in earthquakes as magma approached and eventually reached the surface. The spectral shape of large and small earthquakes occurring within days of each other did not vary with magnitude. Volcanic tremor events and lower-frequency earthquakes displayed consistent spectral peaks, although higher frequencies were more favorably excited during tremor than earthquakes.

  4. Analog earthquakes

    SciTech Connect

    Hofmann, R.B.

    1995-09-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.

  5. Automatic Earthquake Shear Stress Measurement Method Developed for Accurate Time- Prediction Analysis of Forthcoming Major Earthquakes Along Shallow Active Faults

    NASA Astrophysics Data System (ADS)

    Serata, S.

    2006-12-01

    The Serata Stressmeter has been developed to measure and monitor earthquake shear stress build-up along shallow active faults. The development work done over the past 25 years has established the Stressmeter as an automatic stress measurement system for studying the timing of forthcoming major earthquakes, in support of current earthquake prediction studies based on statistical analysis of seismological observations. In early 1982, a series of major man-made earthquakes (magnitude 4.5-5.0) suddenly occurred in an area over a deep underground potash mine in Saskatchewan, Canada. By measuring the underground stress conditions of the mine, the direct cause of the earthquakes was identified, and the cause was successfully eliminated by controlling the stress condition of the mine. The Japanese government was interested in this development, and the Stressmeter was introduced into the Japanese government research program for earthquake stress studies. In Japan the Stressmeter was first utilized for direct measurement of the intrinsic lateral tectonic stress gradient G. The measurement, conducted at the Mt. Fuji Underground Research Center of the Japanese government, disclosed constant natural gradients of maximum and minimum lateral stresses in excellent agreement with the theoretical value, i.e., G = 0.25. The conventional methods of overcoring, hydrofracturing, and deformation, which were introduced to compete with the Serata method, all failed, demonstrating the fundamental difficulties of those methods. The intrinsic lateral stress gradient determined by the Stressmeter for the Japanese government was found to be consistent with all the other measurements made by the Stressmeter in Japan. The stress measurement results obtained by the major international stress measurement work in the Hot Dry Rock Projects conducted in the USA, England, and Germany are in good agreement with the Stressmeter results obtained in Japan. Based on this broad agreement, a solid geomechanical

  6. Rapid characterization of the 2015 Mw 7.8 Gorkha, Nepal, earthquake sequence and its seismotectonic context

    USGS Publications Warehouse

    Hayes, Gavin; Briggs, Richard; Barnhart, William D.; Yeck, William; McNamara, Daniel E.; Wald, David J.; Nealy, Jennifer; Benz, Harley M.; Gold, Ryan D.; Jaiswal, Kishor S.; Marano, Kristin; Earle, Paul; Hearne, Mike; Smoczyk, Gregory M.; Wald, Lisa A.; Samsonov, Sergey

    2015-01-01

    Earthquake response and related information products are important for placing recent seismic events into context and particularly for understanding the impact earthquakes can have on the regional community and its infrastructure. These tools are even more useful if they are available quickly, ahead of detailed information from the areas affected by such earthquakes. Here we provide an overview of the response activities and related information products generated and provided by the U.S. Geological Survey National Earthquake Information Center in association with the 2015 M 7.8 Gorkha, Nepal, earthquake. This group monitors global earthquakes 24 hours a day, 7 days a week to provide rapid information on the location and size of recent events and to characterize the source properties, tectonic setting, and potential fatalities and economic losses associated with significant earthquakes. We present the timeline over which these products became available, discuss what they tell us about the seismotectonics of the Gorkha earthquake and its aftershocks, and examine how their information is used today, and might be used in the future, to help mitigate the impact of such natural disasters.

  7. Groundwater monitoring program plan and conceptual site model for the Al-Tuwaitha Nuclear Research Center in Iraq.

    SciTech Connect

    Copland, John Robin; Cochran, John Russell

    2013-07-01

    The Radiation Protection Center of the Iraqi Ministry of Environment is developing a groundwater monitoring program (GMP) for the Al-Tuwaitha Nuclear Research Center located near Baghdad, Iraq. The Al-Tuwaitha Nuclear Research Center was established in about 1960 and is currently being cleaned up and decommissioned by Iraq's Ministry of Science and Technology. This Groundwater Monitoring Program Plan (GMPP) and Conceptual Site Model (CSM) support the Radiation Protection Center by providing a CSM describing the hydrogeologic regime and contaminant issues, recommendations for future groundwater characterization activities, and descriptions of the organizational elements of a groundwater monitoring program. The Conceptual Site Model identifies a number of potential sources of groundwater contamination at Al-Tuwaitha. The model also identifies two water-bearing zones (a shallow groundwater zone and a regional aquifer). The depth to the shallow groundwater zone varies from approximately 7 to 10 meters (m) across the facility. The shallow groundwater zone is composed of a layer of silty sand and fine sand that does not extend laterally across the entire facility. An approximately 4-m-thick layer of clay underlies the shallow groundwater zone. The depth to the regional aquifer varies from approximately 14 to 17 m across the facility. The regional aquifer is composed of interfingering layers of silty sand, fine-grained sand, and medium-grained sand. Based on the limited analyses described in this report, there is no severe contamination of the groundwater at Al-Tuwaitha with radioactive constituents. However, significant data gaps exist, and this plan recommends the installation of additional groundwater monitoring wells and additional types of radiological and chemical analyses.

  8. Cooperative Monitoring Center Occasional Paper/11: Cooperative Environmental Monitoring in the Coastal Regions of India and Pakistan

    SciTech Connect

    Rajen, Gauray

    1999-06-01

    The cessation of hostilities between India and Pakistan is an immediate need and of global concern, as these countries have tested nuclear devices and have the capability to deploy nuclear weapons and long-range ballistic missiles. Cooperative monitoring projects among neighboring countries in South Asia could build regional confidence and, through gradual improvements in relations, reduce the threat of war and the proliferation of weapons of mass destruction. This paper discusses monitoring the trans-border movement of flow and sediment in the Indian and Pakistani coastal areas. Through such a project, India and Pakistan could initiate greater cooperation and move toward resolution of the Sir Creek territorial dispute in their coastal region. The Joint Working Groups dialogue being conducted by India and Pakistan provides a mechanism for promoting such a project. The proposed project also falls within a regional framework of cooperation agreed to by several South Asian countries, codified in the South Asian Seas Action Plan developed by Bangladesh, India, the Maldives, Pakistan, and Sri Lanka. This framework provides a useful starting point for Indian and Pakistani cooperative monitoring in their trans-border coastal area. The project discussed in this paper involves computer modeling, the placement of in situ sensors for remote data acquisition, and the development of joint reports. Preliminary computer modeling studies are presented in the paper. These results illustrate the cross-flow connections between Indian and Pakistani coastal regions and strengthen the argument for cooperation. Technologies and actions similar to those suggested for the coastal project are likely to be applied in future arms control and treaty verification agreements. The project therefore serves as a demonstration of cooperative monitoring technologies. The project will also increase people-to-people contacts among Indian and Pakistani policy

  9. Earthquake history of South Carolina

    USGS Publications Warehouse

    von Hake, C. A.

    1976-01-01

    An estimated $23 million in damage was caused by one of the great earthquakes in United States history in 1886. Charleston, S.C., and nearby cities suffered most of the damage, although points as far as 160 km away were strongly shaken. Many of the 20 earthquakes of intensity V or greater (Modified Mercalli scale) that centered within South Carolina occurred near Charleston. A 1924 shock in the western part of the State was felt over 145,000 km². Several earthquakes outside the State borders were felt strongly in South Carolina.

  10. Earthquake Monitoring at 9° 50'N on the East Pacific Rise RIDGE 2000 Integrated Studies Site

    NASA Astrophysics Data System (ADS)

    Tolstoy, M.; Waldhauser, F.; Kim, W.

    2004-12-01

    In the fall of 2003, nine ocean bottom seismometers (OBSs) were deployed from the R/V Keldysh within the `bull's-eye' region of the R2K ISS at 9° 49'N - 9° 51'N on the East Pacific Rise as part of the Ridge 2000 Integrated Studies Site. These instruments were recovered using the R/V Atlantis in April 2004, and twelve more were deployed to take their place for a second year of monitoring (with three years total planned). During the turn-around cruise, two short temporary deployments (~4-8 days) of an additional 3 OBSs each were carried out to provide very dense instrument spacing (a few 100 m) around specific vents where in situ chemical monitoring was taking place (Luther et al.). Good data were collected on seven of the nine long-deployment OBSs and on six short-deployment OBSs. We will present early results from analysis of these data, including an estimate of the level of activity observed throughout the seven-month period of the first deployment, and preliminary epicenters. Data will also be shown from the short temporary deployments. Early analysis indicates an event rate of ~8 events per day for events whose arrivals are apparent on at least three instruments and that may therefore be expected to be located. Also notable in these data are pulses and prolonged periods of what appears to be tremor. This tremor is not generally coherent or synchronous from station to station and is therefore likely a very localized phenomenon associated with hydrothermal fluid flow. The exceptionally well characterized and monitored seafloor at this site will allow for unprecedented correlation of observed seismic activity with local biological, geological, geochemical, and hydrothermal monitoring. In addition, past and future detailed geophysical imaging of this area will provide an excellent context for observed faulting and fracturing.

  11. A National Tracking Center for Monitoring Shipments of HEU, MOX, and Spent Nuclear Fuel: How do we implement?

    SciTech Connect

    Mark Schanfein

    2009-07-01

    Nuclear material safeguards specialists and instrument developers at US Department of Energy (USDOE) National Laboratories in the United States, sponsored by the National Nuclear Security Administration (NNSA) Office of NA-24, have been developing devices to monitor shipments of UF6 cylinders and other radioactive materials. Tracking devices are being developed that are capable of monitoring shipments of valuable radioactive materials in real time, using the Global Positioning System (GPS). We envision that such devices will be extremely useful, if not essential, for monitoring the shipment of these important cargoes of nuclear material, including highly-enriched uranium (HEU), mixed plutonium/uranium oxide (MOX), spent nuclear fuel, and, potentially, other large radioactive sources. To ensure nuclear material security and safeguards, it is extremely important to track these materials because they contain so-called “direct-use material”, which, if diverted and processed, could potentially be used to develop clandestine nuclear weapons. Large sources could also be used for a dirty bomb, also known as a radioactive dispersal device (RDD). For that matter, any interdiction by an adversary, regardless of intent, demands a rapid response. To make the fullest use of such tracking devices, we propose a National Tracking Center. This paper describes what the attributes of such a center would be and how it could ultimately be the prototype for an International Tracking Center, possibly to be based in Vienna, at the International Atomic Energy Agency (IAEA).

  12. Anthropogenic triggering of large earthquakes.

    PubMed

    Mulargia, Francesco; Bizzarri, Andrea

    2014-01-01

    The physical mechanism of the anthropogenic triggering of large earthquakes on active faults is studied on the basis of experimental phenomenology, i.e., that earthquakes occur on active tectonic faults, that crustal stress values are those measured in situ and, on active faults, comply with the stress-drop values measured for real earthquakes, that the static friction coefficients are those inferred on faults, and that the effective triggering stresses are those inferred for real earthquakes. Deriving the conditions for earthquake nucleation as a time-dependent solution of the Tresca-Von Mises criterion applied in the framework of poroelasticity yields that active faults can be triggered by fluid overpressures < 0.1 MPa. Comparing this with the deviatoric stresses at the depth of crustal hypocenters, which are of the order of 1-10 MPa, we find that injecting fluids into the subsoil at the pressures typical of oil and gas production and storage may trigger destructive earthquakes on active faults within a few tens of kilometers. Fluid pressure propagates as slow stress waves along geometric paths operating in a drained condition and can advance the natural occurrence of earthquakes by a substantial amount of time. Furthermore, it is illusory to control earthquake triggering by close monitoring of minor "foreshocks", since the induction may occur with a delay of up to several years.
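
    As a rough numerical illustration of the sub-0.1-MPa overpressure figure quoted above, the sketch below substitutes a static Coulomb effective-stress check for the paper's time-dependent Tresca-Von Mises/poroelastic derivation. The friction coefficient, stress values, and function name are illustrative assumptions, not values taken from the study.

    ```python
    # Simplified illustration only: the study uses a time-dependent Tresca-Von Mises
    # criterion in poroelasticity; here a static Coulomb effective-stress check is
    # substituted to show the order of magnitude involved. All numbers are illustrative.
    def pressure_to_failure(tau_mpa, sigma_n_mpa, mu=0.6, cohesion_mpa=0.0):
        """Fluid overpressure (MPa) needed to bring a fault to Coulomb failure:
        failure when tau >= cohesion + mu * (sigma_n - p), i.e. p >= sigma_n - (tau - c)/mu."""
        return max(0.0, sigma_n_mpa - (tau_mpa - cohesion_mpa) / mu)

    if __name__ == "__main__":
        # A nearly critically stressed fault: 60 MPa normal stress, shear stress just
        # below the frictional limit (0.6 * 60 = 36 MPa).
        print(f"{pressure_to_failure(tau_mpa=35.95, sigma_n_mpa=60.0):.3f} MPa overpressure")
        # -> about 0.08 MPa, i.e. less than 0.1 MPa, of the same order as the abstract's figure
    ```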

  14. Anthropogenic Triggering of Large Earthquakes

    PubMed Central

    Mulargia, Francesco; Bizzarri, Andrea

    2014-01-01

    The physical mechanism of the anthropogenic triggering of large earthquakes on active faults is studied on the basis of experimental phenomenology, i.e., that earthquakes occur on active tectonic faults, that crustal stress values are those measured in situ and, on active faults, comply with the stress-drop values measured for real earthquakes, that the static friction coefficients are those inferred on faults, and that the effective triggering stresses are those inferred for real earthquakes. Deriving the conditions for earthquake nucleation as a time-dependent solution of the Tresca-Von Mises criterion applied in the framework of poroelasticity yields that active faults can be triggered by fluid overpressures < 0.1 MPa. Comparing this with the deviatoric stresses at the depth of crustal hypocenters, which are of the order of 1–10 MPa, we find that injecting fluids into the subsoil at the pressures typical of oil and gas production and storage may trigger destructive earthquakes on active faults within a few tens of kilometers. Fluid pressure propagates as slow stress waves along geometric paths operating in a drained condition and can advance the natural occurrence of earthquakes by a substantial amount of time. Furthermore, it is illusory to control earthquake triggering by close monitoring of minor “foreshocks”, since the induction may occur with a delay of up to several years. PMID:25156190

  15. U.S. Tsunami Information Technology (TIM) Modernization: Performance Assessment of Tsunamigenic Earthquake Discrimination System

    NASA Astrophysics Data System (ADS)

    Hagerty, M. T.; Lomax, A.; Hellman, S. B.; Whitmore, P.; Weinstein, S.; Hirshorn, B. F.; Knight, W. R.

    2015-12-01

    Tsunami warning centers must rapidly decide whether an earthquake is likely to generate a destructive tsunami in order to issue a tsunami warning quickly after a large event. For very large events (Mw > 8 or so), magnitude and location alone are sufficient to warrant an alert. However, for events of smaller magnitude (e.g., Mw ~ 7.5), particularly for so-called "tsunami earthquakes", magnitude alone is insufficient to issue an alert, and other measurements must be rapidly made and used to assess tsunamigenic potential. The Tsunami Information Technology Modernization (TIM) is a National Oceanic and Atmospheric Administration (NOAA) project to update and standardize the earthquake and tsunami monitoring systems currently employed at the U.S. Tsunami Warning Centers in Ewa Beach, Hawaii (PTWC) and Palmer, Alaska (NTWC). We (ISTI) are responsible for implementing the seismic monitoring components in this new system, including real-time seismic data collection and seismic processing. The seismic data processor includes a variety of methods aimed at real-time discrimination of tsunamigenic events, including: Mwp, Me, slowness (Theta), W-phase, mantle magnitude (Mm), array processing, and finite-fault inversion. In addition, it contains the ability to designate earthquake scenarios and play the resulting synthetic seismograms through the processing system. Thus, it is also a convenient tool that integrates research and monitoring and may be used to calibrate and tune the real-time monitoring system. Here we show results of the automated processing system for a large dataset of subduction zone earthquakes containing recent tsunami earthquakes, examine the accuracy of the various discrimination methods, and discuss issues related to their successful real-time application.
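
    One of the discriminants named above, the energy-to-moment slowness parameter Theta = log10(E_R/M_0), can be illustrated with a minimal sketch. The threshold and the example energy and moment values below are assumptions for illustration only and do not represent the TIM system's implementation.

    ```python
    # Minimal sketch of one discriminant mentioned above: the slowness parameter
    # Theta = log10(E_R / M_0). The threshold and example values are illustrative
    # assumptions; this is not the TIM system's code.
    import math

    def theta(radiated_energy_j, seismic_moment_nm):
        """Energy-to-moment slowness parameter."""
        return math.log10(radiated_energy_j / seismic_moment_nm)

    def looks_slow(radiated_energy_j, seismic_moment_nm, threshold=-5.7):
        """Flag events whose rupture is anomalously slow (a tsunami-earthquake trait)."""
        return theta(radiated_energy_j, seismic_moment_nm) <= threshold

    if __name__ == "__main__":
        # Hypothetical Mw ~7.5 event (M0 ~ 2.2e20 N*m): ordinary vs energy-deficient rupture
        m0 = 2.2e20
        for e_r in (2e15, 5e13):
            print(f"E_R={e_r:.1e} J -> Theta={theta(e_r, m0):+.2f}, "
                  f"slow rupture: {looks_slow(e_r, m0)}")
    ```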

  16. CTEPP-OH DATA COLLECTED ON FORM 05: CHILD DAY CARE CENTER PRE-MONITORING QUESTIONNAIRE

    EPA Science Inventory

    This data set contains data for CTEPP-OH concerning the potential sources of pollutants at the day care center including the chemicals that have been applied in the past at the day care center by staff members or by commercial contractors. The day care teacher was asked questions...

  17. CTEPP NC DATA COLLECTED ON FORM 05: CHILD DAY CARE CENTER PRE-MONITORING QUESTIONNAIRE

    EPA Science Inventory

    This data set contains data concerning the potential sources of pollutants at the day care center including the chemicals that have been applied in the past at the day care center by staff members or by commercial contractors. The day care teacher was asked questions related to t...

  18. Satellite relay telemetry of seismic data in earthquake prediction and control

    USGS Publications Warehouse

    Jackson, Wayne H.; Eaton, Jerry P.

    1971-01-01

    The Satellite Telemetry Earthquake Monitoring Program was started in FY 1968 to evaluate the applicability of satellite relay telemetry to the collection of seismic data from a large number of dense seismograph clusters laid out along the major fault systems of western North America. Prototype clusters utilizing phone-line telemetry were then being installed by the National Center for Earthquake Research (NCER) in 3 regions along the San Andreas fault in central California, and the experience gained in installing and operating the clusters and in reducing and analyzing the seismic data from them was to provide the raw material for evaluation in the satellite relay telemetry project.

  19. Catalog of earthquake hypocenters at Alaskan volcanoes: January 1 through December 31, 2004

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.; Tytgat, Guy; Estes, Steve; Prejean, Stephanie; Sanchez, John J.; Sanches, Rebecca; McNutt, Stephen R.; Paskievitch, John

    2005-01-01

    The Alaska Volcano Observatory (AVO), a cooperative program of the U.S. Geological Survey, the Geophysical Institute of the University of Alaska Fairbanks, and the Alaska Division of Geological and Geophysical Surveys, has maintained seismic monitoring networks at historically active volcanoes in Alaska since 1988. The primary objectives of the seismic program are the real-time seismic monitoring of active, potentially hazardous, Alaskan volcanoes and the investigation of seismic processes associated with active volcanism. This catalog presents the calculated earthquake hypocenter and phase arrival data, and changes in the seismic monitoring program, for the period January 1 through December 31, 2004. The monitored volcanoes include Mount Wrangell, Mount Spurr, Redoubt Volcano, Iliamna Volcano, Augustine Volcano, the Katmai volcanic cluster (Snowy Mountain, Mount Griggs, Mount Katmai, Novarupta, Trident Volcano, Mount Mageik, Mount Martin), Mount Peulik, Aniakchak Crater, Mount Veniaminof, Pavlof Volcano, Mount Dutton, Isanotski Peaks, Shishaldin Volcano, Fisher Caldera, Westdahl Peak, Akutan Peak, Makushin Volcano, Okmok Caldera, Great Sitkin Volcano, Kanaga Volcano, Tanaga Volcano, and Mount Gareloi. Over the past year, formal monitoring of Okmok, Tanaga, and Gareloi was announced following an extended period of monitoring to determine the background seismicity at each volcanic center. The seismicity at Mount Peulik was still being studied at the end of 2004, and Mount Peulik has yet to be added to the list of monitored volcanoes in the AVO weekly update. AVO located 6928 earthquakes in 2004. Monitoring highlights in 2004 include: (1) an earthquake swarm at Westdahl Peak in January; (2) an increase in seismicity at Mount Spurr starting in February and continuing through the end of the year into 2005; (3) low-level tremor and low-frequency events related to intermittent ash and steam emissions at Mount Veniaminof between April and October; and (4) low-level tremor at Shishaldin Volcano between April and

  20. Earthquakes and the urban environment. Volume III

    SciTech Connect

    Berlin, G.L.

    1980-01-01

    Because of the complex nature of earthquake effects, current investigations encompass many disciplines, including those of both the physical and social sciences. Research activities center on such diversified topics as earthquake mechanics, earthquake prediction and control, the prompt and accurate detection of tsunamis (seismic sea waves), earthquake-resistant construction, seismic building code improvements, land use zoning, earthquake risk and hazard perception, disaster preparedness, plus the study of the concerns and fears of people who have experienced the effects of an earthquake. This monograph attempts to amalgamate recent research input comprising the vivifying components of urban seismology at a level useful to those having an interest in the earthquake and its effects upon an urban environment. Volume 3 contains chapters on seismic planning, social aspects and future prospects.

  1. Earthquake Analysis.

    ERIC Educational Resources Information Center

    Espinoza, Fernando

    2000-01-01

    Indicates the importance of the development of students' measurement and estimation skills. Analyzes earthquake data recorded at seismograph stations and explains how to read and modify the graphs. Presents an activity for student evaluation. (YDS)

  2. Earthquake watch

    USGS Publications Warehouse

    Hill, M.

    1976-01-01

     When the time comes that earthquakes can be predicted accurately, what shall we do with the knowledge? This was the theme of a November 1975 conference on earthquake warning and response held in San Francisco, called by Assistant Secretary of the Interior Jack W. Carlson. Invited were officials of State and local governments from Alaska, California, Hawaii, Idaho, Montana, Nevada, Utah, Washington, and Wyoming and representatives of the news media.

  3. Cooperative Monitoring Center Occasional Paper/4: Missile Control in South Asia and the Role of Cooperative Monitoring Technology

    SciTech Connect

    Kamal, N.; Sawhney, P.

    1998-10-01

    The succession of nuclear tests by India and Pakistan in May 1998 has changed the nature of their missile rivalry, which is only one of numerous manifestations of their relationship as hardened adversaries, deeply sensitive to each other's existing and evolving defense capabilities. The political context surrounding this costly rivalry remains unmediated by arms control measures or by any nascent prospect of detente. As a parallel development, sensible voices in both countries will continue to talk of building mutual confidence through openness to avert accidents, misjudgments, and misinterpretations. To facilitate a future peace process, this paper offers possible suggestions for stabilization that could be applied to India's and Pakistan's missile situation. Appendices include descriptions of existing missile agreements that have contributed to better relations for other countries as well as a list of the cooperative monitoring technologies available to provide information useful in implementing subcontinent missile regimes.

  4. Investigation of the TEC Changes in the vicinity of the Earthquake Preparation Zone

    NASA Astrophysics Data System (ADS)

    Ulukavak, Mustafa; Yalcinkaya, Mualla

    2016-04-01

    Recently, investigation of ionospheric anomalies preceding earthquakes has attracted considerable attention. Total Electron Content (TEC) data are used to monitor changes in the ionosphere, and researchers examine TEC changes before strong earthquakes to identify possible anomalies. In this study, GPS-TEC variations obtained from GNSS stations in the vicinity of the earthquake preparation zone were investigated. The Nidra earthquake (M6.5), which occurred in north-western Greece on November 17, 2015 (38.755°N, 20.552°E), was selected for this study. First, the equation proposed by Dobrovolsky et al. (1979) was used to calculate the radius of the earthquake preparation zone, and International GNSS Service (IGS) stations in the region were classified with respect to this radius. The observation data of each station were obtained from the Crustal Dynamics Data Information System (CDDIS) archive to estimate GPS-TEC variations between 16 October 2015 and 16 December 2015. Global Ionosphere Map (GIM) products obtained from the IGS were used to check the robustness of the GPS-TEC variations. Possible anomalies were analyzed for each GNSS station using the 15-day moving median method. To separate pre-earthquake ionospheric anomalies from space weather effects, we also investigated three indices (Kp, F10.7, and Dst) related to space weather conditions over the same period. Solar and geomagnetic indices were obtained from the National Oceanic and Atmospheric Administration (NOAA), the Canadian Space Weather Forecast Centre (CSWFC), and the Data Analysis Center for Geomagnetism and Space Magnetism, Graduate School of Science, Kyoto University (WDC). This study aims at investigating the possible effects of the earthquake on the TEC variations.
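
    A minimal sketch of two of the steps described above: the Dobrovolsky et al. (1979) preparation-zone radius, R = 10^(0.43 M) km, and a 15-day moving-median screen for TEC anomalies. The ±1.5·IQR bound, the trailing-window handling, and the synthetic data are illustrative assumptions, not necessarily the thresholds used in the study.

    ```python
    # Hedged sketch: Dobrovolsky preparation-zone radius and a 15-day running-median
    # screen for TEC anomalies. Bounds and data are illustrative assumptions.
    import numpy as np

    def dobrovolsky_radius_km(magnitude):
        """Strain (preparation-zone) radius, R = 10**(0.43*M) km (Dobrovolsky et al., 1979)."""
        return 10.0 ** (0.43 * magnitude)

    def tec_anomalies(tec, window=15, k=1.5):
        """Flag days whose TEC falls outside median +- k*IQR of the trailing window."""
        tec = np.asarray(tec, dtype=float)
        flags = np.zeros(tec.size, dtype=bool)
        for i in range(window, tec.size):
            w = tec[i - window:i]
            med = np.median(w)
            iqr = np.percentile(w, 75) - np.percentile(w, 25)
            flags[i] = abs(tec[i] - med) > k * iqr
        return flags

    if __name__ == "__main__":
        print(f"M6.5 preparation-zone radius ~ {dobrovolsky_radius_km(6.5):.0f} km")
        rng = np.random.default_rng(1)
        tec = 20 + 2 * rng.standard_normal(60)   # synthetic daily TEC values (TECU)
        tec[45] += 12                            # synthetic positive anomaly
        print("anomalous days:", np.nonzero(tec_anomalies(tec))[0])
    ```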

  5. Analysis of Instrumentation to Monitor the Hydrologic Performance of Green Infrastructure at the Edison Environmental Center

    EPA Science Inventory

    Infiltration is one of the primary functional mechanisms of green infrastructure stormwater controls, so this study explored selection and placement of embedded soil moisture and water level sensors to monitor surface infiltration and infiltration into the underlying soil for per...

  6. Cooperative Monitoring Center Occasional Paper/7: A Generic Model for Cooperative Border Security

    SciTech Connect

    Netzer, Colonel Gideon

    1999-03-01

    This paper presents a generic model for dealing with security problems along borders between countries. It presents descriptions and characteristics of various borders and identifies the threats to border security, while emphasizing cooperative monitoring solutions.

  7. Patient-Centered Technological Assessment and Monitoring of Depression for Low-Income Patients

    PubMed Central

    Wu, Shinyi; Vidyanti, Irene; Liu, Pai; Hawkins, Caitlin; Ramirez, Magaly; Guterman, Jeffrey; Gross-Schulman, Sandra; Sklaroff, Laura Myerchin; Ell, Kathleen

    2014-01-01

    Depression is a significant challenge for ambulatory care because it worsens health status and outcomes, increases health care utilization and costs, and elevates suicide risk. An automatic telephonic assessment (ATA) system that links with tasks and alerts to providers may improve the quality of depression care and increase provider productivity. We used the ATA system in a trial to assess and monitor depressive symptoms of 444 safety-net primary care patients with diabetes. We assessed system properties, evaluated preliminary clinical outcomes, and estimated cost savings. The ATA system is feasible, reliable, valid, safe, and likely cost-effective for depression screening and monitoring in a low-income primary care population. PMID:24525531

  8. Earthquakes in Arkansas and vicinity 1699-2010

    USGS Publications Warehouse

    Dart, Richard L.; Ausbrooks, Scott M.

    2011-01-01

    This map summarizes approximately 300 years of earthquake activity in Arkansas. It is one in a series of similar State earthquake history maps. Work on the Arkansas map was done in collaboration with the Arkansas Geological Survey. The earthquake data plotted on the map are from several sources: the Arkansas Geological Survey, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Mississippi Department of Environmental Quality. In addition to earthquake locations, other materials presented include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Arkansas and parts of adjacent states. Arkansas has undergone a number of significant felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931, and a magnitude 4.3 earthquake in 1967. The map shows all historical and instrumentally located earthquakes in Arkansas and vicinity between 1811 and 2010. The largest historic earthquake in the vicinity of the State was an intensity XI event, on December 16, 1811; the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region.

  9. Upgrading the Digital Electronics of the PEP-II Bunch Current Monitors at the Stanford Linear Accelerator Center

    SciTech Connect

    Kline, Josh; /SLAC

    2006-08-28

    The testing of the upgrade prototype for the bunch current monitors (BCMs) in the PEP-II storage rings at the Stanford Linear Accelerator Center (SLAC) is the topic of this paper. Bunch current monitors are used to measure the charge in the electron/positron bunches traveling in particle storage rings. The BCMs in the PEP-II storage rings need to be upgraded because components of the current system have failed and are known to be failure-prone with age, and several of the integrated circuits are no longer produced, making repairs difficult if not impossible. The main upgrade is replacing twelve old (1995) field programmable gate arrays (FPGAs) with a single Virtex II FPGA. The prototype was tested using computer synthesis tools, a commercial signal generator, and a fast pulse generator.

  10. Hatfield Marine Science Center Dynamic Revetment Project DSL permit # 45455-FP, Monitoring Report February, 2015

    EPA Science Inventory

    Stabilization of the Yaquina Bay shoreline along the northeastern edge of the Hatfield Marine Science Center (HMSC) campus became necessary to halt erosion that threatened both HMSC critical infrastructure (seawater storage tank) and public access to the HMSC Nature Trail. A Dyn...

  11. Hatfield Marine Science Center Dynamic Revetment Project DSL Permit # 45455-FP. Monitoring Report. February, 2014.

    EPA Science Inventory

    Stabilization of the Yaquina Bay shoreline along the northeastern edge of the Hatfield Marine Science Center (HMSC) campus became necessary to halt erosion that threatened both HMSC critical infrastructure (seawater storage tank) and public access to the HMSC Nature Trail. A Dyn...

  12. Hatfield Marine Science Center Dynamic Revetment Project DSL Permit # 45455-FP. Monitoring Report. February, 2016

    EPA Science Inventory

    Stabilization of the Yaquina Bay shoreline along the northeastern edge of the Hatfield Marine Science Center (HMSC) campus became necessary to halt erosion that threatened both HMSC critical infrastructure (seawater storage tank) and public access to the HMSC Nature Trail. A Dyn...

  13. CTEPP DATA COLLECTION FORM 05: CHILD DAY CARE CENTER PRE-MONITORING QUESTIONNAIRE

    EPA Science Inventory

    This data collection form is used to identify the potential sources of pollutants at the day care center. The day care teacher is asked questions related to the age of their day care building; age and frequency of cleaning carpets or rugs; types of heating and air conditioning de...

  14. MONITORING TOXIC ORGANIC GASES AND PARTICLES NEAR THE WORLD TRADE CENTER AFTER SEPTEMBER 11, 2001

    EPA Science Inventory

    The September 11, 2001 attack on the World Trade Center (WTC) resulted in an intense fire and the subsequent, complete collapse of the two main structures and adjacent buildings, as well as significant damage to many surrounding buildings within and around the WTC complex. Thi...

  15. 77 FR 53225 - National Earthquake Prediction Evaluation Council (NEPEC)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-31

    ... Geological Survey National Earthquake Prediction Evaluation Council (NEPEC) AGENCY: Department of the... National Earthquake Prediction Evaluation Council (NEPEC) will hold a 1-1/2 day meeting on September 17 and 18, 2012, at the U.S. Geological Survey National Earthquake Information Center (NEIC),...

  16. User-centered development and testing of a monitoring system that provides feedback regarding physical functioning to elderly people

    PubMed Central

    Vermeulen, Joan; Neyens, Jacques CL; Spreeuwenberg, Marieke D; van Rossum, Erik; Sipers, Walther; Habets, Herbert; Hewson, David J; de Witte, Luc P

    2013-01-01

    Purpose To involve elderly people during the development of a mobile interface of a monitoring system that provides feedback to them regarding changes in physical functioning and to test the system in a pilot study. Methods and participants The iterative user-centered development process consisted of the following phases: (1) selection of user representatives; (2) analysis of users and their context; (3) identification of user requirements; (4) development of the interface; and (5) evaluation of the interface in the lab. Subsequently, the monitoring and feedback system was tested in a pilot study by five patients who were recruited via a geriatric outpatient clinic. Participants used a bathroom scale to monitor weight and balance, and a mobile phone to monitor physical activity on a daily basis for six weeks. Personalized feedback was provided via the interface of the mobile phone. Usability was evaluated on a scale from 1 to 7 using a modified version of the Post-Study System Usability Questionnaire (PSSUQ); higher scores indicated better usability. Interviews were conducted to gain insight into the experiences of the participants with the system. Results The developed interface uses colors, emoticons, and written and/or spoken text messages to provide daily feedback regarding (changes in) weight, balance, and physical activity. The participants rated the usability of the monitoring and feedback system with a mean score of 5.2 (standard deviation 0.90) on the modified PSSUQ. The interviews revealed that most participants liked using the system and appreciated that it signaled changes in their physical functioning. However, usability was negatively influenced by a few technical errors. Conclusion Involvement of elderly users during the development process resulted in an interface with good usability. However, the technical functioning of the monitoring system needs to be optimized before it can be used to support elderly people in their self-management. PMID

  17. Discrimination of quarry blasts and earthquakes in the vicinity of Istanbul using soft computing techniques

    NASA Astrophysics Data System (ADS)

    Yıldırım, Eray; Gülbağ, Ali; Horasan, Gündüz; Doğan, Emrah

    2011-09-01

    The purpose of this article is to demonstrate the use of feedforward neural networks (FFNNs), adaptive neural fuzzy inference systems (ANFIS), and probabilistic neural networks (PNNs) to discriminate between earthquakes and quarry blasts in Istanbul and vicinity (the Marmara region). The tectonically active Marmara region is affected by the Thrace-Eskişehir fault zone and especially the North Anatolian fault zone (NAFZ). Local MARNET stations, which were established in 1976 and are operated by the Kandilli Observatory and Earthquake Research Institute (KOERI), record not only earthquakes that occur in the region, but also quarry blasts. There are a few quarry-blasting areas in the Gaziosmanpaşa, Çatalca, Ömerli, and Hereke regions. Analytical methods were applied to a set of 175 seismic events (2001-2004) recorded by the stations of the local seismic network (ISK, HRT, and CTT stations) operated by the KOERI National Earthquake Monitoring Center (NEMC). Out of a total of 175 records, 148 are related to quarry blasts and 27 to earthquakes. The data sets were divided into training and testing sets for each region. In all the models developed, the input vectors consist of the peak amplitude ratio (S/P ratio) and the complexity value, and the output is a determination of either earthquake or quarry blast. The success of the developed models on regional test data varies between 97.67% and 100%.
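
    A minimal sketch of the classification setup described above: a feed-forward neural network trained on the two features the study uses (S/P peak-amplitude ratio and complexity). The synthetic feature distributions, the direction of their separation, and the network size are illustrative assumptions, not the MARNET data or the authors' model configurations.

    ```python
    # Illustrative sketch only: a feed-forward neural network on two features
    # (S/P peak-amplitude ratio, complexity) labeling events as quarry blast or
    # earthquake. Feature values are synthetic; the separation direction between
    # the classes is arbitrary here and not taken from the study.
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)
    n = 150
    # Columns: [S/P ratio, complexity]; two synthetic, partially overlapping classes
    blasts = np.column_stack([rng.normal(3.0, 0.6, n), rng.normal(0.8, 0.3, n)])
    quakes = np.column_stack([rng.normal(1.2, 0.4, n), rng.normal(2.5, 0.7, n)])
    X = np.vstack([blasts, quakes])
    y = np.array([1] * n + [0] * n)  # 1 = quarry blast, 0 = earthquake

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
    clf.fit(X_train, y_train)
    print(f"test accuracy: {clf.score(X_test, y_test):.2%}")
    ```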

  18. Monitoring of the permeable pavement demonstration site at Edison Environmental Center

    EPA Science Inventory

    The EPA’s Urban Watershed Management Branch has installed an instrumented, working full-scale 110-space pervious pavement parking lot and has been monitoring several environmental stressors and runoff. This parking lot demonstration site has allowed the investigation of differenc...

  19. CTEPP-OH DATA COLLECTED ON FORM 07: CHILD DAY CARE CENTER POST-MONITORING

    EPA Science Inventory

    This data set contains data concerning the child’s daily activities and potential exposures to pollutants at their homes for CTEPP-OH. It included questions on chemicals applied and cigarettes smoked at the home over the 48-h monitoring period. It also collected information on th...

  20. CTEPP DATA COLLECTION FORM 07: CHILD DAY CARE CENTER POST-MONITORING

    EPA Science Inventory

    This data collection form is used to provide information on the child's daily activities and potential exposures to pollutants at their homes. It includes questions on chemicals applied and cigarettes smoked at the home over the 48-hr monitoring period. It also collects informati...

  1. CTEPP NC DATA COLLECTED ON FORM 07: CHILD DAY CARE CENTER POST-MONITORING

    EPA Science Inventory

    This data set contains data concerning the child’s daily activities and potential exposures to pollutants at their homes. It included questions on chemicals applied and cigarettes smoked at the home over the 48-h monitoring period. It also collected information on the child’s han...

  2. Aftershocks series monitoring of the September 18, 2004 M = 4.6 earthquake at the western Pyrenees: A case of reservoir-triggered seismicity?

    NASA Astrophysics Data System (ADS)

    Ruiz, M.; Gaspà, O.; Gallart, J.; Díaz, J.; Pulgar, J. A.; García-Sansegundo, J.; López-Fernández, C.; González-Cortina, J. M.

    2006-10-01

    On September 18, 2004, a 4.6 mbLg earthquake was widely felt in the region around Pamplona, in the western Pyrenees. Preliminary locations reported an epicenter less than 20 km ESE of Pamplona and close to the Itoiz reservoir, which began impounding in January 2004. The area apparently lacks significant seismic activity in recent times. After the main shock, which was preceded by a series of foreshocks reaching magnitudes of 3.3 mbLg, a dense temporary network of 13 seismic stations was deployed to monitor the aftershock series and constrain the hypocentral pattern. Aftershock determinations obtained with a double-difference algorithm define a narrow, ESE-WNW-oriented epicentral zone of less than 10 km², with events mainly concentrated between 3 and 9 km depth. Focal solutions were computed for the main event and 12 aftershocks, including the largest secondary event of 3.8 mbLg. They show mainly normal faulting with some strike-slip component and one of the nodal planes oriented NW-SE and dipping to the NE. Cross-correlation techniques applied to detect and associate events with similar waveforms yielded up to 33 families relating 67% of the 326 relocated aftershocks. The families show event clusters grouped by periods and migrating from NW to SE. Interestingly, the narrow epicentral zone inferred here is located less than 4 km from the 111-m-high Itoiz dam. These hypocentral results, and the correlation observed between fluctuations of the reservoir water level and the seismic activity, favour interpreting this foreshock-aftershock series as a rapid-response case of reservoir-triggered seismicity induced by the first impoundment of the Itoiz reservoir. The region is folded and affected by shallow-dipping thrusts, and the Itoiz reservoir is located on the hangingwall of a low-angle, southward-verging thrust, which might be sensitive to water-level fluctuations. However, continued seismic monitoring in the coming years is mandatory in
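
    A minimal sketch of grouping aftershocks into waveform-similarity families by cross-correlation, as described above. The single-station setup, the greedy leader-style grouping, the 0.8 threshold, and the synthetic traces are illustrative assumptions rather than the authors' processing.

    ```python
    # Minimal sketch of grouping events into multiplet "families" by normalized
    # waveform cross-correlation. Traces, threshold, and grouping scheme are
    # illustrative assumptions.
    import numpy as np

    def max_norm_xcorr(a, b):
        """Maximum normalized cross-correlation between two equal-length traces."""
        a = a - a.mean()
        b = b - b.mean()
        a /= np.linalg.norm(a) + 1e-12
        b /= np.linalg.norm(b) + 1e-12
        return float(np.max(np.correlate(a, b, mode="full")))

    def group_families(traces, threshold=0.8):
        """Greedy grouping: an event joins the first family whose reference trace
        correlates above the threshold; otherwise it starts a new family."""
        families = []  # list of (reference_trace, [event indices])
        for i, tr in enumerate(traces):
            for ref, members in families:
                if max_norm_xcorr(ref, tr) >= threshold:
                    members.append(i)
                    break
            else:
                families.append((tr, [i]))
        return [members for _, members in families]

    if __name__ == "__main__":
        rng = np.random.default_rng(3)
        t = np.linspace(0, 1, 200)
        template = np.sin(2 * np.pi * 8 * t) * np.exp(-5 * t)
        # Three noisy repeats of the template (one family) plus two unrelated events
        traces = [template + 0.05 * rng.standard_normal(t.size) for _ in range(3)]
        traces += [rng.standard_normal(t.size) for _ in range(2)]
        print(group_families(traces))  # e.g. [[0, 1, 2], [3], [4]]
    ```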

  3. Earthquake Shaking - Finding the "Hot Spots"

    USGS Publications Warehouse

    Field, Ned; Jones, Lucile; Jordan, Tom; Benthien, Mark; Wald, Lisa

    2001-01-01

    A new Southern California Earthquake Center study has quantified how local geologic conditions affect the shaking experienced in an earthquake. The important geologic factors at a site are softness of the rock or soil near the surface and thickness of the sediments above hard bedrock. Even when these 'site effects' are taken into account, however, each earthquake exhibits unique 'hotspots' of anomalously strong shaking. Better predictions of strong ground shaking will therefore require additional geologic data and more comprehensive computer simulations of individual earthquakes.

  4. Cooperative Monitoring Center Occasional Paper/9: De-Alerting Strategic Ballistic Missiles

    SciTech Connect

    Connell, Leonard W.; Edenburn, Michael W.; Fraley, Stanley K.; Trost, Lawrence C.

    1999-03-01

    This paper presents a framework for evaluating the technical merits of strategic ballistic missile de-alerting measures, and it uses the framework to evaluate a variety of possible measures for silo-based, land-mobile, and submarine-based missiles. De-alerting measures are defined for the purpose of this paper as reversible actions taken to increase the time or effort required to launch a strategic ballistic missile. The paper does not assess the desirability of pursuing a de-alerting program; such an assessment is highly context dependent. The paper postulates that if de-alerting is desirable and is used as an arms control mechanism, de-alerting measures should satisfy specific criteria relating to force security, practicality, effectiveness, significant delay, and verifiability. Silo-launched missiles lend themselves most readily to de-alerting verification, because communications necessary for monitoring do not increase the vulnerability of the weapons by a significant amount. Land-mobile missile de-alerting measures would be more challenging to verify, because monitoring measures that disclose the launcher's location would potentially increase their vulnerability. Submarine-launched missile de-alerting measures would be extremely challenging, if not impossible, to monitor without increasing the submarine's vulnerability.

  5. X-ray Weekly Monitoring of the Galactic Center Sgr A* with Suzaku

    NASA Astrophysics Data System (ADS)

    Maeda, Yoshitomo; Nobukawa, Masayoshi; Hayashi, Takayuki; Iizuka, Ryo; Saitoh, Takayuki; Murakami, Hiroshi

    A small gas cloud, G2, is on an orbit that will bring it almost straight into the supermassive black hole Sgr A* by spring 2014. This event gives us a rare opportunity to test mass feeding onto the black hole by a gas cloud. To catch a possible rise of the mass accretion rate from the cloud, we have been performing bi-weekly monitoring of Sgr A* in autumn and spring of the 2013 fiscal year. The key feature of Suzaku is high-sensitivity, wide-band X-ray spectroscopy in a single observatory, characterized by a large effective area combined with low background and good energy resolution, in particular a good line spread function in the low-energy range. Since any flare event associated with the G2 approach would be transient, the large effective area is a critical and powerful tool for hunting such events. The first monitoring in autumn 2013 was successfully made. The X-rays from Sgr A* and its nearby emission were clearly resolved from the bright transient source AX J1745.6-2901. No very large flare from Sgr A* was found during the monitoring. We also may report the X-ray properties of two serendipitous sources, the neutron star binary AX J1745.6-2901 and the magnetar SGR J1745-29.

  6. Space weather monitoring by ground-based means carried out in Polar Geophysical Center at Arctic and Antarctic Research Institute

    NASA Astrophysics Data System (ADS)

    Janzhura, Alexander

    Real-time information on geophysical processes in polar regions is very important for Space Weather monitoring by ground-based means. Modern communication systems and computer technology make it possible to collect and process data from remote sites without significant delays. New acquisition equipment based on microprocessor modules and reliable under harsh climatic conditions was deployed at the Roshydromet networks of geophysical observations in the Arctic and is being deployed at observatories in the Antarctic. A contemporary system for on-line collection and transmission of geophysical data from the Arctic and Antarctic stations to AARI has been realized, and the Polar Geophysical Center (PGC) arranged at AARI ensures near-real-time processing and analysis of geophysical information from 11 stations in the Arctic and 5 stations in the Antarctic. Space weather monitoring by ground-based means is one of the main tasks of the Polar Geophysical Center. As studies by Troshichev and Janzhura [2012] showed, the PC index characterizing polar cap magnetic activity is an adequate indicator of the solar wind energy that enters the magnetosphere and of the energy accumulating in the magnetosphere. A great advantage of the PC index over other methods based on satellite data is the permanent on-line availability of information about magnetic activity in both the northern and southern polar caps. A special procedure agreed between the Arctic and Antarctic Research Institute (AARI) and the Space Institute of the Danish Technical University (DTUSpace) ensures calculation of the unified PC index in quasi-real time from magnetic data of the Thule and Vostok stations (see public site: http://pc-index.org). A method for estimating the AL and Dst indices (as indicators of the state of the disturbed magnetosphere) based on foregoing PC indices has been developed and tested at the Polar Geophysical Center. It is

  7. The EM Earthquake Precursor

    NASA Astrophysics Data System (ADS)

    Jones, K. B., II; Saxton, P. T.

    2013-12-01

    directional techniques were employed, resulting in three mapped, potential epicenters. The remaining, weaker signals presented similar directionality results to more epicentral locations. In addition, the directional results of the Timpson field tests lead to the design and construction of a third prototype antenna. In a laboratory setting, experiments were created to fail igneous rock types within a custom-designed Faraday Cage. An antenna emplaced within the cage detected EM emissions, which were both reproducible and distinct, and the laboratory results paralleled field results. With a viable system and continuous monitoring, a fracture cycle could be established and observed in real-time. Sequentially, field data would be reviewed quickly for assessment; thus, leading to a much improved earthquake forecasting capability. The EM precursor determined by this method may surpass all prior precursor claims, and the general public will finally receive long overdue forecasting.

  8. 4 Earthquake: Major offshore earthquakes recall the Aztec myth

    USGS Publications Warehouse

    ,

    1970-01-01

    Long before the sun clears the eastern mountains on April 29, 1970, the savanna highlands of Chiapas tremble from a magnitude 6.7 earthquake centered off the Pacific coast near Mexico’s southern border. Then, for a few hours, the Isthmus of Tehuantepec is quiet.

  9. Research on Earthquake Precursor in E-TEC: A Study on Land Surface Thermal Anomalies Using MODIS LST Product in Taiwan

    NASA Astrophysics Data System (ADS)

    Chang, W. Y.; Wu, M. C.

    2014-12-01

    Taiwan is known as an excellent natural laboratory, characterized by rapid active tectonic rates and dense seismicity. The Eastern Taiwan Earthquake Research Center (E-TEC) was established on 2013/09/24 at National Dong Hwa University and collaborates with the Central Weather Bureau (CWB), the National Center for Research on Earthquake Engineering (NCREE), the National Science and Technology Center for Disaster Reduction (NCDR), the Institute of Earth Sciences of Academia Sinica (IES, AS), and other institutions (NCU, NTU, CCU). It aims to provide an integrated platform for researchers to pursue new advances in earthquake precursors and early warning for seismic disaster prevention in eastern Taiwan, where frequent temblors are most common in the East Taiwan rift valley. E-TEC intends to integrate multi-disciplinary observations and is equipped with stations to monitor a wide array of potential quake precursors, including seismicity, GPS, strain-meter, ground water, geochemistry, gravity, electromagnetic, ionospheric density, thermal infrared remote sensing, and gamma radiation, in order to maximize the value of the data for research aimed at predicting where and when the next devastating earthquake will strike Taiwan and at developing reliable earthquake prediction models. A preliminary study on earthquake precursors using monthly Moderate Resolution Imaging Spectroradiometer (MODIS) Land Surface Temperature (LST) data before the 2013/03/27 Mw6.2 Nantou earthquake in Taiwan is presented. The statistical analysis shows that the peak of the anomalous LST, exceeding one standard deviation, appeared on 2013/03/09 and that few or no anomalies were observed on 2013/03/16 before the main shock, consistent with phenomena observed by other researchers. This preliminary experimental result suggests the possibility of associating surface thermal phenomena with impending strong earthquakes.
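
    A minimal sketch of the kind of LST screening described above: comparing a monthly scene against a multi-year climatology for the same calendar month and flagging pixels more than one standard deviation warm. Only the one-sigma rule follows the abstract; the array shapes, function name, and data are illustrative assumptions, not MODIS products.

    ```python
    # Hedged sketch of monthly LST anomaly screening against a multi-year climatology.
    # Data are synthetic; thresholding at mean + 1 sigma mirrors the abstract's rule.
    import numpy as np

    def lst_anomaly_mask(current_scene, climatology_stack, n_sigma=1.0):
        """climatology_stack: (years, ny, nx) scenes of the same calendar month.
        Returns a boolean mask of pixels exceeding mean + n_sigma * std."""
        mean = np.nanmean(climatology_stack, axis=0)
        std = np.nanstd(climatology_stack, axis=0)
        return current_scene > mean + n_sigma * std

    if __name__ == "__main__":
        rng = np.random.default_rng(7)
        clim = 295 + 2 * rng.standard_normal((10, 50, 50))   # ten prior years of one month, Kelvin
        scene = 295 + 2 * rng.standard_normal((50, 50))
        scene[20:25, 20:25] += 6                              # synthetic thermal anomaly
        mask = lst_anomaly_mask(scene, clim)
        print(f"anomalous pixels: {mask.sum()} of {mask.size}")
    ```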

  10. Earthquakes in Mississippi and vicinity 1811-2010

    USGS Publications Warehouse

    Dart, Richard L.; Bograd, Michael B.E.

    2011-01-01

    This map summarizes two centuries of earthquake activity in Mississippi. Work on the Mississippi map was done in collaboration with the Mississippi Department of Environmental Quality, Office of Geology. The earthquake data plotted on the map are from several sources: the Mississippi Department of Environmental Quality, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Arkansas Geological Survey. In addition to earthquake locations, other materials include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Mississippi and parts of adjacent States. Mississippi has undergone a number of felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931, and a magnitude 4.3 earthquake in 1967. The map shows all historical and instrumentally located earthquakes in Mississippi and vicinity between 1811 and 2010. The largest historic earthquake in the vicinity of the State was an intensity XI event on December 16, 1811, the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region.

  11. 2010 Chile Earthquake Aftershock Response

    NASA Astrophysics Data System (ADS)

    Barientos, Sergio

    2010-05-01

    1906? Since the number of M>7.0 aftershocks has been low, does the distribution of large-magnitude aftershocks differ from previous events of this size? What is the origin of the extensional-type aftershocks at shallow depths within the upper plate? The international seismological community (France, Germany, U.K., U.S.A.), in collaboration with the Chilean seismological community, responded with a total of 140 portable seismic stations deployed to record aftershocks. Combined with the Chilean permanent seismic network in the area, this results in 180 stations now in operation recording continuously at 100 samples per second. The seismic equipment is a mix of accelerometers and short-period and broadband seismic sensors deployed along the entire length of the aftershock zone that will record the aftershock sequence for three to six months. The collected seismic data will be merged and archived to produce an international data set open to the entire seismological community immediately after archiving. Each international group will submit its data as soon as possible in standard (miniSEED) format with accompanying metadata to the IRIS DMC, where the data will be merged into a combined data set and made available to individuals and other data centers. This will be by far the best-recorded aftershock sequence of a large megathrust earthquake. This outstanding international collaboration will provide an open data set for this important earthquake as well as a model for future aftershock deployments around the world.

  12. The wireless networking system of Earthquake precursor mobile field observation

    NASA Astrophysics Data System (ADS)

    Wang, C.; Teng, Y.; Wang, X.; Fan, X.; Wang, X.

    2012-12-01

    A mobile field observation network can record and transmit large amounts of data reliably and in real time, strengthening physical-signal observations in specific regions and periods and thereby improving monitoring capacity and the ability to track anomalies. Because current earthquake precursor observation points are numerous and widely scattered, the networking technology is based on the McWILL broadband wireless access system: the communication system for earthquake precursor mobile field observation transmits large amounts of data reliably and in real time from the measuring points to the monitoring center through the connections among the instruments, the wireless access system, the broadband wireless access system and the precursor mobile observation management center system, thereby implementing remote instrument monitoring and data transmission. At present, this mobile field observation networking technology has been applied to fluxgate magnetometer array geomagnetic observations at Tianzhu, Xichang and Xinjiang; it allows real-time monitoring of the working status of observational instruments deployed over a large area during the last two to three years of large-scale field operation. It can therefore provide geomagnetic field data for locally refined regions and high-quality observational data for impending-earthquake tracking and forecasting. Although wireless networking technology is well suited to mobile field observation, with the advantages of simple and flexible networking, packet loss can occur when transmitting large amounts of observational data because of relatively weak wireless signals and narrow bandwidth. For high-sampling-rate instruments, this project uses data compression and effectively solves the problem of packet loss in data transmission; control commands, status data and observational data are transmitted with different priorities and means, which keeps the packet loss rate within
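
    The abstract describes compressing high-rate observational data and giving control commands higher transmission priority. The sketch below illustrates that general idea with a priority queue and zlib compression; the message classes, priority values and queue structure are assumptions for illustration and are not taken from the project's software.

        # Illustrative sketch: prioritized, compressed message dispatch (not the project's actual code).
        import heapq
        import itertools
        import zlib

        PRIORITY = {"control": 0, "status": 1, "observation": 2}   # lower value = sent first (assumed scheme)
        _counter = itertools.count()                                # tie-breaker keeps FIFO order per priority
        queue = []

        def enqueue(kind, payload: bytes):
            """Queue a message; bulky observational data are compressed before sending."""
            if kind == "observation":
                payload = zlib.compress(payload)    # shrink high-sample-rate data for narrow-bandwidth links
            heapq.heappush(queue, (PRIORITY[kind], next(_counter), kind, payload))

        def next_message():
            """Pop the highest-priority message (control commands go out before bulk data)."""
            if queue:
                _, _, kind, payload = heapq.heappop(queue)
                return kind, payload
            return None

        enqueue("observation", b"\x00\x01" * 5000)   # hypothetical magnetometer samples
        enqueue("control", b"RESTART_SENSOR")
        print(next_message()[0])                     # -> 'control'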

  13. Cooperative Monitoring Center Occasional Paper/8: Cooperative Border Security for Jordan: Assessment and Options

    SciTech Connect

    Qojas, M.

    1999-03-01

    This document is an analysis of options for unilateral and cooperative action to improve the security of Jordan's borders. Sections describe the current political, economic, and social interactions along Jordan's borders. Next, the document discusses border security strategy for cooperation among neighboring countries and the adoption of confidence-building measures. A practical cooperative monitoring system would consist of hardware for early warning, command and control, communications, and transportation. Technical solutions can expand opportunities for the detection and identification of intruders. Sensors (such as seismic, break-wire, pressure-sensing, etc.) can warn border security forces of intrusion and contribute to the identification of the intrusion and help formulate the response. This document describes conceptual options for cooperation, offering three scenarios that relate to three hypothetical levels (low, medium, and high) of cooperation. Potential cooperative efforts under a low cooperation scenario could include information exchanges on military equipment and schedules to prevent misunderstandings and the establishment of protocols for handling emergency situations or unusual circumstances. Measures under a medium cooperation scenario could include establishing joint monitoring groups for better communications, with hot lines and scheduled meetings. The high cooperation scenario describes coordinated responses, joint border patrols, and sharing border intrusion information. Finally, the document lists recommendations for organizational, technical, and operational initiatives that could be applicable to the current situation.

  14. Streamflow, groundwater, and water-quality monitoring by USGS Nevada Water Science Center

    USGS Publications Warehouse

    Gipson, Marsha L.; Schmidt, Kurtiss

    2013-01-01

    The U.S. Geological Survey (USGS) has monitored and assessed the quantity and quality of our Nation's streams and aquifers since its inception in 1879. Today, the USGS provides hydrologic information to aid in the evaluation of the availability and suitability of water for public and domestic supply, agriculture, aquatic ecosystems, mining, and energy development. Although the USGS has no responsibility for the regulation of water resources, the USGS hydrologic data complement much of the data collected by state, county, and municipal agencies, tribal nations, U.S. District Court Water Masters, and other federal agencies such as the Environmental Protection Agency, which focuses on monitoring for regulatory compliance. The USGS continues its mission to provide timely and relevant water-resources data and information that are available to water-resource managers, non-profit organizations, industry, academia, and the public. Data collected by the USGS provide the science needed for informed decision-making related to resource management and restoration, assessment of flood and drought hazards, ecosystem health, and effects on water resources from land-use changes.

  15. Selected natural attenuation monitoring data, Operable Unit 1, Naval Undersea Warfare Center, Division Keyport, Washington, June 2001

    USGS Publications Warehouse

    Dinicola, Richard S.

    2003-01-01

    Previous investigations have shown that natural attenuation and biodegradation of chlorinated volatile organic compounds (CVOCs) are substantial in shallow ground water beneath the 9-acre former landfill at Operable Unit 1 (OU 1), Naval Undersea Warfare Center (NUWC), Division Keyport, Washington. The U.S. Geological Survey (USGS) has continued to monitor ground-water geochemistry to assure that conditions remain favorable for contaminant biodegradation. This report presents the ground-water geochemical and selected CVOC data collected at OU 1 by the USGS during June 11-14, 2001 in support of the long-term monitoring for natural attenuation. Overall, the June 2001 data indicate that redox conditions in the upper aquifer remain favorable for reductive dechlorination of CVOCs because strongly reducing conditions persisted beneath much of the former landfill. Redox conditions in the intermediate aquifer down gradient of the landfill appear to have become more favorable for reductive dechlorination because June 2001 dissolved hydrogen concentrations indicated strongly reducing conditions there for the first time. Although changes in redox conditions were observed at certain wells during 2001, a longer monitoring period is needed to ascertain if phytoremediation activities are affecting the ground-water chemistry. A minor change to future monitoring is proposed. Filtered organic carbon (previously referred to as dissolved, and defined as that which passes through a 0.45-micrometer membrane filter) should be analyzed in the future rather than unfiltered (previously referred to as total) organic carbon because the filtered analysis may be a better measure of bioavailable organic carbon. Unfiltered and filtered organic carbon data were collected during June 2001 for comparison. Filtered organic carbon data collected in the future could be reasonably compared with historical unfiltered organic carbon data by multiplying the historical data by a factor of about 0.9.

  16. America's faulty earthquake plans

    SciTech Connect

    Rosen, J

    1989-10-01

    In this article, the author discusses the likelihood of major earthquakes in both the western and eastern United States, as well as the level of preparedness of each region of the U.S. for a major earthquake. Current technology in both earthquake-resistant design and earthquake detection is described. Governmental programs for earthquake hazard reduction are outlined and critiqued.

  17. A patient centered system for decubitus prevention based on nutrition, drinking, physical activity and sleep monitoring.

    PubMed

    Falgenhauer, Markus; Zöscher, Sebastian; Morak, Jürgen; Schneider, Cornelia; Gugerell, Monika; Liebhart, Walter; Hayn, Dieter

    2013-01-01

    State-of-the-art decubitus prevention focuses mainly on special decubitus mattresses, which are indicated for extremely high risk only, while other risk factors such as nutrition or physical activity are hardly considered. Therefore, a monitoring system for decubitus prevention for persons with medium risk has been developed. The system consisted of an unobtrusive sensor system and a tablet for manual input of decubitus-relevant data concerning nutrition, drinking behavior and physical activity. The system was tested in a feasibility study. Results indicate that the system is usable and can provide useful information for decubitus prevention. Future work will include a field study, evaluating the system in a long-term study. PMID:23920807

  18. Retrofitting Laboratory Fume Hoods With Face Velocity Monitors at NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Wagner, Ingrid E.; Bold, Margaret D.; Diamond, David B.; Kall, Phillip M.

    1997-01-01

    Extensive use and reliance on laboratory fume hoods exist at LeRC for the control of chemical hazards (nearly 175 fume hoods). Flow-measuring devices are necessary to continually monitor hood performance. The flow-measuring device should be tied into an energy management control system to detect problems at a central location without relying on the users to convey information about a problem. Compatibility concerns and limitations should always be considered when choosing the most effective flow-measuring device for a particular situation. Good practice in initial hood design and placement will provide a system in which a flow-measuring device may be used to its full potential and effectiveness.

  19. Earthquake and tsunami forecasts: relation of slow slip events to subsequent earthquake rupture.

    PubMed

    Dixon, Timothy H; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

    2014-12-01

    The 5 September 2012 M(w) 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr-Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential.

  20. Earthquake and tsunami forecasts: Relation of slow slip events to subsequent earthquake rupture

    PubMed Central

    Dixon, Timothy H.; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

    2014-01-01

    The 5 September 2012 Mw 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr–Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential. PMID:25404327
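
    The two versions of this abstract both refer to changes in Mohr–Coulomb (Coulomb) failure stress as a possible, though here insufficient, trigger. As an illustration of the metric, the sketch below uses a commonly cited form of the Coulomb failure stress change; the friction coefficient and stress values are typical assumed numbers, not values from the paper.

        # Sketch of a generic Coulomb failure stress change calculation (not the paper's own computation).
        def coulomb_stress_change(d_shear_kpa, d_normal_kpa, effective_friction=0.4):
            """delta_CFS = delta_tau + mu' * delta_sigma_n, with the shear stress change resolved
            in the slip direction and the normal stress change positive for unclamping."""
            return d_shear_kpa + effective_friction * d_normal_kpa

        # Hypothetical numbers: a few kPa of stress change from a modest nearby slow slip event.
        print(coulomb_stress_change(d_shear_kpa=3.0, d_normal_kpa=-1.5))   # -> 2.4 (kPa)

    Static-triggering studies commonly treat changes of roughly 10 kPa (0.1 bar) or more as potentially significant, which is consistent with the authors' conclusion that smaller changes were probably insufficient to trigger the mainshock.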

  1. Selected natural attenuation monitoring data, Operable Unit 1, Naval Undersea Warfare Center, Division Keyport, Washington, June 2002

    USGS Publications Warehouse

    Dinicola, Richard S.

    2004-01-01

    Previous investigations indicated that natural attenuation and biodegradation of chlorinated volatile organic compounds (CVOCs) are substantial in shallow ground water beneath the 9-acre former landfill at Operable Unit 1 (OU 1), Naval Undersea Warfare Center (NUWC), Division Keyport, Washington. The U.S. Geological Survey (USGS) has continued to monitor ground-water geochemistry to assure that conditions remain favorable for contaminant biodegradation. This report presents the geochemical and selected CVOC data for ground water at OU 1, collected by the USGS during June 10-14, 2002, in support of long-term monitoring for natural attenuation. Overall, the geochemical data for June 2002 indicate that redox conditions in the upper-aquifer water remain favorable for reductive dechlorination of chlorinated VOCs because strongly reducing conditions persisted beneath much of the former landfill. Redox conditions in the intermediate aquifer downgradient of the landfill also remained favorable for reductive dechlorination, although the 2002 dissolved hydrogen (H2) concentration from well MW1-28 is questionable. Changes in redox conditions were observed at certain wells during 2002, but a longer monitoring period and more thorough interpretation are needed to ascertain if phytoremediation activities are affecting redox conditions and if biodegradation processes are changing over time. The Navy intends to complete a more thorough interpretation in preparation for the 5-year review of OU 1 scheduled for 2004. There were a few substantial differences between the 2002 concentrations and previously observed concentrations of volatile organic compounds. Total CVOC concentrations in 2002 samples decreased substantially in all piezometers sampled in the northern plantation, and the largest percentages of decrease were for the compounds trichloroethene (TCE) and cis-1,2-dichloroethene (cis-DCE). Changes in total CVOC concentrations in the southern plantation were less consistent

  2. Estimation of Future Earthquake Losses in California

    NASA Astrophysics Data System (ADS)

    Rowshandel, B.; Wills, C. J.; Cao, T.; Reichle, M.; Branum, D.

    2003-12-01

    Recent developments in earthquake hazards and damage modeling, computing, and data management and processing have made it possible to develop estimates of the levels of damage from earthquakes that may be expected in the future in California. These developments have been mostly published in the open literature and provide an opportunity to estimate the levels of earthquake damage Californians can expect to suffer during the next several decades. Within the past 30 years, earthquake losses have increased dramatically, mostly because our exposure to earthquake hazards has increased. All but four of the recent damaging earthquakes have occurred distant from California's major population centers. Two, the Loma Prieta earthquake and the San Fernando earthquake, occurred on the edges of major populated areas. Loma Prieta caused significant damage in nearby Santa Cruz and in the more distant, heavily populated, San Francisco Bay area. The 1971 San Fernando earthquake had an epicenter in the lightly populated San Gabriel Mountains, but caused slightly over 2 billion dollars in damage in the Los Angeles area. As urban areas continue to expand, the population and infrastructure at risk increase. When earthquakes occur closer to populated areas, damage is more significant. The relatively minor Whittier Narrows earthquake of 1987 caused over 500 million dollars in damage because it occurred in the Los Angeles metropolitan area, not at its fringes. The Northridge earthquake had fault rupture directly beneath the San Fernando Valley, and caused about 46 billion dollars in damage. This vast increase in damage relative to the San Fernando earthquake reflected both the location of the earthquake directly beneath the populated area and the 23 years of continued development and resulting greater exposure to potential damage. We have calculated losses from potential future earthquakes, both as scenarios of potential earthquakes and as annualized losses considering all the potential
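
    Annualized loss estimates of the kind described above combine scenario losses with the annual rates at which those scenarios occur; the bookkeeping is sketched below. The scenario names, loss values, and rates are invented for illustration and are not taken from the study.

        # Minimal sketch of an annualized earthquake loss calculation with made-up numbers.
        scenarios = [
            # (name, estimated loss in billions of dollars, mean annual rate of occurrence)
            ("Scenario A", 46.0, 0.002),
            ("Scenario B", 2.1, 0.010),
            ("Scenario C", 0.5, 0.050),
        ]

        # The annualized (expected yearly) loss is the rate-weighted sum of scenario losses.
        annualized_loss = sum(loss * rate for _, loss, rate in scenarios)
        print(f"Annualized loss: ${annualized_loss:.3f} billion per year")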

  3. Darwin's earthquake.

    PubMed

    Lee, Richard V

    2010-07-01

    Charles Darwin experienced a major earthquake in the Concepción-Valdivia region of Chile 175 years ago, in February 1835. His observations dramatically illustrated the geologic principles of James Hutton and Charles Lyell which maintained that the surface of the earth was subject to alterations by natural events, such as earthquakes, volcanoes, and the erosive action of wind and water, operating over very long periods of time. Changes in the land created new environments and fostered adaptations in life forms that could lead to the formation of new species. Without the demonstration of the accumulation of multiple crustal events over time in Chile, the biologic implications of the specific species of birds and tortoises found in the Galapagos Islands and the formulation of the concept of natural selection might have remained dormant.

  4. Earthquake Information System

    NASA Technical Reports Server (NTRS)

    1991-01-01

    IAEMIS (Integrated Automated Emergency Management Information System) is the principal tool of an earthquake preparedness program developed by Martin Marietta and the Mid-America Remote Sensing Center (MARC). It is a two-component set of software, data and procedures to provide information enabling management personnel to make informed decisions in disaster situations. The NASA-developed program ELAS, originally used to analyze Landsat data, provides MARC with a spatially-oriented information management system. Additional MARC projects include land resources management, and development of socioeconomic data.

  5. Radioanalytical Data Quality Objectives and Measurement Quality Objectives during a Federal Radiological Monitoring and Assessment Center Response

    SciTech Connect

    E. C. Nielsen

    2006-01-01

    During the early and intermediate phases of a nuclear or radiological incident, the Federal Radiological Monitoring and Assessment Center (FRMAC) collects environmental samples that are analyzed by organizations with radioanalytical capability. Resources dedicated to quality assurance (QA) activities must be sufficient to assure that appropriate radioanalytical measurement quality objectives (MQOs) and assessment data quality objectives (DQOs) are met. As the emergency stabilizes, QA activities will evolve commensurate with the need to reach appropriate DQOs. The MQOs represent a compromise between precise analytical determinations and the timeliness necessary for emergency response activities. Minimum detectable concentration (MDC), lower limit of detection, and critical level tests can all serve as measurements reflecting the MQOs. The relationship among protective action guides (PAGs), derived response levels (DRLs), and laboratory detection limits is described. The rationale used to determine the appropriate laboratory detection limit is described.
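
    The minimum detectable concentration (MDC) mentioned above is often evaluated with Currie-style detection limits; the sketch below uses that conventional formulation purely as an illustration, with invented counting parameters, and is not drawn from FRMAC procedures.

        # Illustrative Currie-style detection limit and MDC calculation (generic formulation, invented inputs).
        import math

        def mdc_bq_per_liter(background_counts, count_time_s, efficiency, sample_volume_l, chem_yield=1.0):
            """MDC = L_D / (efficiency * count time * sample volume * yield),
            with L_D = 2.71 + 4.65 * sqrt(B) counts for a paired-blank measurement."""
            l_d = 2.71 + 4.65 * math.sqrt(background_counts)
            return l_d / (efficiency * count_time_s * sample_volume_l * chem_yield)

        # Hypothetical measurement: 400 background counts, 600 s count, 30% efficiency, 0.5 L sample.
        print(f"MDC ~ {mdc_bq_per_liter(400, 600, 0.30, 0.5):.2f} Bq/L")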

  6. Monitoring poison control center data to detect health hazards during hurricane season--Florida, 2003-2005.

    PubMed

    2006-04-21

    Eight hurricanes made landfall in Florida from August 13, 2004, through October 24, 2005. Each hurricane caused flooding and widespread power outages. In the fall of 2004, the Florida Department of Health (FDOH) began retrospectively reviewing data collected by the Florida Poison Information Center Network (FPICN) during the 2004 hurricane season. During the 2005 hurricane season, FDOH, in consultation with FPICN, initiated daily monitoring of FPICN records of exposures that might reflect storm-related health hazards. Analysis of these data determined that 28 carbon monoxide (CO) exposures were reported to FPICN in the 2 days after Hurricane Katrina made its August 25, 2005, landfall in Florida, en route to a second landfall on the Gulf Coast. Data on CO and other exposures were used to develop and distribute public health prevention messages to Florida communities affected by hurricanes.

  7. Launch Complex 39 Observation Gantry Area (SWMU# 107) Annual Long-Term Monitoring Report (Year 1) Kennedy Space Center, Florida

    NASA Technical Reports Server (NTRS)

    Johnson, Jill W.; Towns, Crystal

    2015-01-01

    This document has been prepared by Geosyntec Consultants, Inc. (Geosyntec) to present and discuss the findings of the 2014 and 2015 Long-Term Monitoring (LTM) activities that were completed at the Launch Complex 39 (LC39) Observation Gantry Area (OGA) located at the John F. Kennedy Space Center (KSC), Florida (Site). The remainder of this report includes: (i) a description of the Site location; (ii) summary of Site background and previous investigations; (iii) description of field activities completed as part of the annual LTM program at the Site; (iv) groundwater flow evaluation; (v) presentation and discussion of field and analytical results; and (vi) conclusions and recommendations. Applicable KSC Remediation Team (KSCRT) Meeting minutes are included in Attachment A. This Annual LTM Letter Report was prepared by Geosyntec Consultants (Geosyntec) for NASA under contract number NNK12CA13B, Delivery Order NNK13CA39T project number PCN ENV2188.

  8. Earthquakes and plate tectonics.

    USGS Publications Warehouse

    Spall, H.

    1982-01-01

    Earthquakes occur at the following three kinds of plate boundary: ocean ridges where the plates are pulled apart, margins where the plates scrape past one another, and margins where one plate is thrust under the other. Thus, we can predict the general regions on the earth's surface where we can expect large earthquakes in the future. We know that each year about 140 earthquakes of magnitude 6 or greater will occur within this area, which is 10% of the earth's surface. But on a worldwide basis we cannot say with much accuracy when these events will occur. The reason is that the processes in plate tectonics have been going on for millions of years. Averaged over this interval, plate motions amount to several mm per year. But at any instant in geologic time, for example the year 1982, we do not know exactly where we are in the worldwide cycle of strain build-up and strain release. Only by monitoring the stress and strain in small areas, for instance the San Andreas fault, in great detail can we hope to predict when renewed activity in that part of the plate tectonics arena is likely to take place. -from Author

  9. The Cooperative Monitoring Center: Achieving cooperative security objectives through technical collaborations

    SciTech Connect

    Pregenzer, A.

    1996-08-01

    The post cold war security environment poses both difficult challenges and encouraging opportunities. Some of the most difficult challenges are related to regional conflict and the proliferation of weapons of mass destruction. New and innovative approaches to prevent the proliferation of weapons of mass destruction are essential. More effort must be focused on underlying factors that motivate countries to seek weapons of mass destruction. Historically the emphasis has been on denial: denying information, denying technology, and denying materials necessary to build such weapons. Though still important, those efforts are increasingly perceived to be insufficient, and initiatives that address underlying motivational factors are needed. On the opportunity side, efforts to establish regional dialogue and confidence-building measures are increasing in many areas. Such efforts can result in cooperative agreements on security issues such as border control, demilitarized zones, weapons delivery systems, weapons of mass destruction free zones, environmental agreements, and resource sharing. In some cases, implementing such cooperative agreements will mean acquiring, analyzing, and sharing large quantities of data and sensitive information. These arrangements for "cooperative monitoring" are becoming increasingly important to the security of individual countries, regions, and international institutions. However, many countries lack sufficient technical and institutional infrastructure to take full advantage of these opportunities. Constructing a peaceful twenty-first century will require that technology is brought to bear in the most productive and innovative ways to meet the challenges of proliferation and to maximize the opportunities for cooperation.

  10. Long Period Earthquakes Beneath California's Young and Restless Volcanoes

    NASA Astrophysics Data System (ADS)

    Pitt, A. M.; Dawson, P. B.; Shelly, D. R.; Hill, D. P.; Mangan, M.

    2013-12-01

    The newly established USGS California Volcano Observatory has the broad responsibility of monitoring and assessing hazards at California's potentially threatening volcanoes, most notably Mount Shasta, Medicine Lake, Clear Lake Volcanic Field, and Lassen Volcanic Center in northern California; and Long Valley Caldera, Mammoth Mountain, and Mono-Inyo Craters in east-central California. Volcanic eruptions occur in California about as frequently as the largest San Andreas Fault Zone earthquakes-more than ten eruptions have occurred in the last 1,000 years, most recently at Lassen Peak (1666 C.E. and 1914-1917 C.E.) and Mono-Inyo Craters (c. 1700 C.E.). The Long Valley region (Long Valley caldera and Mammoth Mountain) underwent several episodes of heightened unrest over the last three decades, including intense swarms of volcano-tectonic (VT) earthquakes, rapid caldera uplift, and hazardous CO2 emissions. Both Medicine Lake and Lassen are subsiding at appreciable rates, and along with Clear Lake, Long Valley Caldera, and Mammoth Mountain, sporadically experience long period (LP) earthquakes related to migration of magmatic or hydrothermal fluids. Worldwide, the last two decades have shown the importance of tracking LP earthquakes beneath young volcanic systems, as they often provide indication of impending unrest or eruption. Herein we document the occurrence of LP earthquakes at several of California's young volcanoes, updating a previous study published in Pitt et al., 2002, SRL. All events were detected and located using data from stations within the Northern California Seismic Network (NCSN). Event detection was spatially and temporally uneven across the NCSN in the 1980s and 1990s, but additional stations, adoption of the Earthworm processing system, and heightened vigilance by seismologists have improved the catalog over the last decade. LP earthquakes are now relatively well-recorded under Lassen (~150 events since 2000), Clear Lake (~60 events), Mammoth Mountain

  11. RST (Robust Satellite Techniques) analysis for monitoring earth emitted radiation at the time of the Hector Mine 16th October 1999 earthquake

    NASA Astrophysics Data System (ADS)

    Lisi, M.; Filizzola, C.; Genzano, N.; Mazzeo, G.; Pergola, N.; Tramutoli, V.

    2009-12-01

    Several studies have been performed, in the past years, reporting the appearance of space-time anomalies in TIR satellite imagery, from weeks to days, before severe earthquakes. Different authors, in order to explain the appearance of anomalously high TIR records near the place and the time of earthquake occurrence, attributed their appearance to the increase of green-house gas (such as CO2, CH4, etc.) emission rates, to the modification of the ground water regime and/or to the increase of convective heat flux. Among the others, a Robust Satellite data analysis Technique (RST), based on the RAT - Robust AVHRR (Advanced Very High Resolution Radiometer) Techniques - approach, was proposed to investigate possible relations between earthquake occurrence and space-time fluctuations of Earth’s emitted TIR radiation observed from satellite. The RST analysis is based on a statistical definition of “TIR anomalies” allowing their identification even in very different natural (e.g. related to atmosphere and/or surface) and observational (e.g. related to time/season, but also to solar and satellite zenithal angles) conditions. The correlation analysis (in the space-time domain) with earthquake occurrence is always carried out by using a validation/confutation approach, in order to verify the presence/absence of anomalous space-time TIR transients in the presence/absence of significant seismic activity. The RST approach was already tested in the case of tens of earthquakes that occurred in different continents (Europe, Asia, America and Africa), in various geo-tectonic settings (compressive, extensional and transcurrent) and with a wide range of magnitudes (from 4.0 to 7.9). In this paper, the results of the RST analysis performed over 7 years of TIR satellite records collected over the western part of the United States of America at the time of the Hector Mine earthquake (16th October 1999, M 7.1) are presented and compared with an identical analysis (confutation) performed in

  12. The electronic behavior of a photosynthetic reaction center monitored by conductive atomic force microscopy.

    PubMed

    Mikayama, Takeshi; Iida, Kouji; Suemori, Yoshiharu; Dewa, Takehisa; Miyashita, Tokuji; Nango, Mamoru; Gardiner, Alastair T; Cogdell, Richard J

    2009-01-01

    The conductivity of a photosynthetic reaction center (RC) from Rhodobacter sphaeroides was measured with conductive atomic force microscopy (CAFM) on SAM-modified Au(111) substrates. 2-mercaptoethanol (2ME), 2-mercaptoacetic acid (MAC), 2-mercaptopyridine (2MP) and 4-mercaptopyridine (4MP) were prepared as SAM materials to investigate the stability and morphology of RCs on the substrate by using near-IR absorption spectroscopy and AFM, respectively. The clear presence of the three well-known RC near-IR absorption peaks indicates that the RCs were native on the SAM-modified Au(111). Dense grains with diameters of 5-20 nm, corresponding to mixtures of single RCs up to aggregates of 10, were observed in topographs of RCs adsorbed on all the different SAM-modified Au(111) substrates. The sizes of the currents obtained from the RC using a bare conductive cantilever followed the order of SAM molecules: 2MP > 2ME > 4MP > MAC. A clear rectification of this current was observed for the modification of the Au(111) substrate with the pi-conjugated thiol, 2MP, indicating that 2MP was effective both in promoting the specific orientation of the RCs on the electrode and in electron injection into the RC. Cyclic voltammetry measurements indicate that 2MP is a better mediator for electron transfer between the quinone and the substrate. The current with the 2MP-modified cantilever was twice as high as that obtained with the Au-coated one alone, indicating that 2MP has an important role in lowering the electron injection barrier between the special-pair side of the RC and the gold electrode.

  13. Biological monitoring of mercury exposure in individuals referred to a toxicological center in Venezuela.

    PubMed

    Rojas, Maritza; Seijas, David; Agreda, Olga; Rodríguez, Maritza

    2006-02-01

    People in developing countries are often considered at greater risk of mercury (Hg) poisoning due to a variety of factors, including a lack of awareness regarding their occupational risks. Individuals requiring urine mercury (U-Hg) analysis at the Center for Toxicological Investigations of the University of Carabobo (CITUC) between 1998 and 2002 were studied to identify demographic characteristics associated with U-Hg levels. The studied population included individuals with a history of exposure (or related exposures) to Hg processes and was comprised of 1159 individuals (65 children, 1094 adults) aged 0.58-79 years (mean 36.63+/-12.4). Children's geometric mean U-Hg level was 2.73 microg/g creatinine (Ct) and adults' was 2.55 microg/g Ct. The most frequent adult occupations were shipyard workers (35.47%), dentists (23.5%), lab technicians (11.43%), dental employees (10.42%) and miners (10.2%). Chemical laboratory technicians had the highest mean U-Hg (4.46 microg/g Ct). Mean U-Hg levels in female adults (3.45 microg/g Ct) were statistically higher than levels in male adults (2.15 microg/g Ct). Two of the 172 women of reproductive age had U-Hg levels higher than 78 microg/g Ct. Individuals from Falcon State were found to have the highest mean U-Hg (4.53 microg/g Ct). U-Hg levels higher than permissible limits were found in only 2 states (Carabobo and Bolivar), with a total of 24 cases. Although the results of this investigation were highly variable, the findings can be used to examine circumstances that influence mercury toxicity trends, and possibly in future studies working to identify Hg exposures. PMID:16399001

  14. Data Management and Site-Visit Monitoring of the Multi-Center Registry in the Korean Neonatal Network

    PubMed Central

    Choi, Chang Won

    2015-01-01

    The Korean Neonatal Network (KNN), a nationwide prospective registry of very-low-birth-weight (VLBW, < 1,500 g at birth) infants, was launched in April 2013. Data management (DM) and site-visit monitoring (SVM) were crucial in ensuring the quality of the data collected from 55 participating hospitals across the country on 116 clinical variables. We describe the processes and results of DM and SVM performed during the establishment stage of the registry. The DM procedure included automated proof checks, electronic data validation, query creation, query resolution, and revalidation of the corrected data. SVM included SVM team organization, identification of unregistered cases, source document verification, and post-visit report production. By March 31, 2015, 4,063 VLBW infants were registered and 1,693 queries were produced. Of these, 1,629 queries were resolved and 64 queries remain unresolved. By November 28, 2014, 52 participating hospitals were visited, with 136 site-visits completed since April 2013. Each participating hospital was visited biannually. DM and SVM were performed to ensure the quality of the data collected for the KNN registry. Our experience with DM and SVM can be applied for similar multi-center registries with large numbers of participating centers. PMID:26566353

  15. Computer systems for automatic earthquake detection

    USGS Publications Warehouse

    Stewart, S.W.

    1974-01-01

    U.S. Geological Survey seismologists in Menlo Park, California, are utilizing the speed, reliability, and efficiency of minicomputers to monitor seismograph stations and to automatically detect earthquakes. An earthquake detection computer system, believed to be the only one of its kind in operation, automatically reports about 90 percent of all local earthquakes recorded by a network of over 100 central California seismograph stations. The system also monitors the stations for signs of malfunction or abnormal operation. Before the automatic system was put in operation, all of the earthquakes recorded had to be detected by manually searching the records, a time-consuming process. With the automatic detection system, the stations are efficiently monitored continuously.
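
    The summary above does not describe the detection algorithm itself; a widely used approach for automatic event detection on continuous seismograms is a short-term-average/long-term-average (STA/LTA) trigger, sketched below with assumed window lengths, threshold, and synthetic data rather than the USGS system's actual parameters.

        # Generic STA/LTA trigger sketch (illustrative; not the USGS system's algorithm).
        import numpy as np

        def sta_lta(trace, fs, sta_s=1.0, lta_s=30.0):
            """Return the STA/LTA ratio of the squared signal (a simple characteristic function)."""
            cf = trace.astype(float) ** 2
            sta = np.convolve(cf, np.ones(int(sta_s * fs)) / int(sta_s * fs), mode="same")
            lta = np.convolve(cf, np.ones(int(lta_s * fs)) / int(lta_s * fs), mode="same")
            return sta / np.maximum(lta, 1e-12)

        # Synthetic data: one minute of noise at 100 samples/s with a burst of "signal" in the middle.
        rng = np.random.default_rng(0)
        trace = rng.normal(0.0, 1.0, 60 * 100)
        trace[3000:3200] += rng.normal(0.0, 8.0, 200)
        ratio = sta_lta(trace, fs=100)
        print("Triggered:", bool((ratio > 4.0).any()))   # a threshold around 4 is a common rule of thumb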

  16. Defeating Earthquakes

    NASA Astrophysics Data System (ADS)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question, how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake. GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  17. A Broadly-Based Training Program in Volcano Hazards Monitoring at the Center for the Study of Active Volcanoes

    NASA Astrophysics Data System (ADS)

    Thomas, D. M.; Bevens, D.

    2015-12-01

    The Center for the Study of Active Volcanoes, in cooperation with the USGS Volcano Hazards Program at HVO and CVO, offers a broadly based volcano hazards training program targeted toward scientists and technicians from developing nations. The program has been offered for 25 years and provides a hands-on introduction to a broad suite of volcano monitoring techniques, rather than detailed training with just one. The course content has evolved over the life of the program as the needs of the trainees have changed: initially emphasizing very basic monitoring techniques (e.g. precise leveling, interpretation of seismic drum records, etc.) but, as the level of sophistication of the trainees has increased, training in more advanced technologies has been added. Currently, topics of primary emphasis have included volcano seismology and seismic networks; acquisition and modeling of geodetic data; methods of analysis and monitoring of gas geochemistry; interpretation of volcanic deposits and landforms; training in LAHARZ, GIS mapping of lahar risks; and response to and management of volcanic crises. The course also provides training on public outreach, based on CSAV's Hawaii-specific hazards outreach programs, and volcano preparedness and interactions with the media during volcanic crises. It is an intensive eight-week course with instruction and field activities underway 6 days per week; it is now offered in two locations, Hawaii Island, for six weeks, and the Cascades volcanoes of the Pacific Northwest, for two weeks, to enable trainees to experience field conditions in both basaltic and continental volcanic environments. The survival of the program for more than two decades demonstrates that a need for such training exists, and there has been interaction with and contribution to the program by the research community; however, broader engagement with the latter continues to present challenges. Some of the reasons for this will be discussed.

  18. A smartphone application for earthquakes that matter!

    NASA Astrophysics Data System (ADS)

    Bossu, Rémy; Etivant, Caroline; Roussel, Fréderic; Mazet-Roux, Gilles; Steed, Robert

    2014-05-01

    level of shaking intensity with empirical models of fatality losses calibrated on past earthquakes in each country. Non-seismic detections and macroseismic questionnaires collected online are combined to identify as many of the felt earthquakes as possible, regardless of their magnitude. Non-seismic detections include Twitter earthquake detections, developed by the US Geological Survey, where the number of tweets containing the keyword "earthquake" is monitored in real time, and flashsourcing, developed by the EMSC, which detects traffic surges on its rapid earthquake information website caused by the natural convergence of eyewitnesses who rush to the Internet to investigate the cause of the shaking that they have just felt. Altogether, we estimate that the number of detected felt earthquakes is around 1,000 per year, compared with the 35,000 earthquakes annually reported by the EMSC! Felt events are already the subject of the web page "Latest significant earthquakes" on the EMSC website (http://www.emsc-csem.org/Earthquake/significant_earthquakes.php) and of a dedicated Twitter service @LastQuake. We will present the identification process of the earthquakes that matter, the smartphone application itself (to be released in May) and its future evolutions.
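
    Flashsourcing, as described above, flags felt earthquakes from sudden surges of visitors to the EMSC website; the sketch below shows one simple way such a surge could be detected from per-minute hit counts. The baseline window, threshold, and counts are assumptions for illustration, not EMSC's implementation.

        # Illustrative traffic-surge ("flashsourcing"-style) detector with invented numbers.
        import statistics

        def surge_detected(hits_per_minute, baseline_window=60, n_sigma=5.0):
            """Flag a surge when the latest minute exceeds the recent baseline by n_sigma standard deviations."""
            baseline = hits_per_minute[-(baseline_window + 1):-1]
            mean = statistics.fmean(baseline)
            sigma = statistics.pstdev(baseline) or 1.0
            return hits_per_minute[-1] > mean + n_sigma * sigma

        # Hypothetical website hit counts: a quiet hour, then eyewitnesses rush to the site.
        hits = [120 + (i % 7) for i in range(60)] + [950]
        print(surge_detected(hits))   # -> True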

  19. Near Real-Time Determination of Earthquake Source Parameters for Tsunami Early Warning from Geodetic Observations

    NASA Astrophysics Data System (ADS)

    Manneela, Sunanda; Srinivasa Kumar, T.; Nayak, Shailesh R.

    2016-06-01

    Characterizing the tsunami source immediately after an earthquake is the most critical component of tsunami early warning, as not every earthquake generates a tsunami. After a major undersea earthquake, it is very important to determine whether or not it has actually triggered the deadly wave. Near real-time observations from near-field networks such as strong motion and Global Positioning System (GPS) stations allow rapid determination of the fault geometry. Here we present the complete processing chain of the Indian Tsunami Early Warning System (ITEWS), starting from acquisition of raw geodetic data, through processing and inversion, to simulating the situation as it would be at the warning center during any major earthquake. We determine the earthquake moment magnitude and generate the centroid moment tensor solution using a novel approach; these are the key elements for tsunami early warning. Though the well-established seismic monitoring network, numerical modeling, and dissemination system are currently capable of providing tsunami warnings to most of the countries in and around the Indian Ocean, this study highlights the critical role of geodetic observations in determining the tsunami source for high-quality forecasting.
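
    The moment magnitude mentioned above is conventionally obtained from the seismic moment, which a geodetically derived fault model supplies through the rigidity, rupture area, and average slip. The sketch below applies that standard relation to invented fault dimensions; the numbers are not ITEWS values.

        # Standard moment-magnitude calculation from a simple rectangular fault model (invented numbers).
        import math

        def moment_magnitude(length_km, width_km, avg_slip_m, rigidity_pa=3.0e10):
            """Seismic moment M0 = mu * A * D (N m); moment magnitude Mw = (2/3) * (log10(M0) - 9.1)."""
            area_m2 = (length_km * 1e3) * (width_km * 1e3)
            m0 = rigidity_pa * area_m2 * avg_slip_m
            return (2.0 / 3.0) * (math.log10(m0) - 9.1)

        # Hypothetical rupture: 200 km x 80 km with 5 m of average slip.
        print(f"Mw ~ {moment_magnitude(200, 80, 5.0):.1f}")   # -> Mw ~ 8.2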

  20. Earthquake location determination using data from DOMERAPI and BMKG seismic networks: A preliminary result of DOMERAPI project

    SciTech Connect

    Ramdhan, Mohamad; Nugraha, Andri Dian; Widiyantoro, Sri; Métaxian, Jean-Philippe; Valencia, Ayunda Aulia

    2015-04-24

    The DOMERAPI project has been conducted to comprehensively study the internal structure of Merapi volcano, especially the deep structural features beneath the volcano. The DOMERAPI earthquake monitoring network consists of 46 broad-band seismometers installed around the Merapi volcano. Earthquake hypocenter determination is a very important step for further studies, such as hypocenter relocation and seismic tomographic imaging. Ray paths from earthquake events occurring outside the Merapi region can be utilized to delineate the deep magma structure. Earthquakes occurring outside the DOMERAPI seismic network will produce an azimuthal gap greater than 180°. Owing to this situation, stations from the BMKG seismic network can be used jointly to minimize the azimuthal gap. We identified earthquake events manually and carefully, and then picked arrival times of P and S waves. The data from the DOMERAPI seismic network were combined with the BMKG data catalogue to determine earthquake events outside the Merapi region. For future work, we will also use the BPPTKG (Center for Research and Development of Geological Disaster Technology) data catalogue in order to study shallow structures beneath the Merapi volcano. The application of all data catalogues will provide good information as input for further advanced studies and volcano hazards mitigation.
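
    The azimuthal gap mentioned above is the largest angle between the azimuths of adjacent recording stations as seen from the epicenter; values above 180° indicate a one-sided, poorly constrained geometry. The sketch below computes it for a hypothetical station set; the coordinates are invented and the flat-earth azimuths are a simplification.

        # Compute the azimuthal gap of a station distribution around an epicenter (invented geometry).
        import math

        def azimuthal_gap(epicenter, stations):
            """Largest angular gap (degrees) between consecutive station azimuths seen from the epicenter."""
            lat0, lon0 = epicenter
            az = sorted(
                math.degrees(math.atan2(lon - lon0, lat - lat0)) % 360.0   # flat-earth azimuth, fine for a sketch
                for lat, lon in stations
            )
            gaps = [(az[(i + 1) % len(az)] - az[i]) % 360.0 for i in range(len(az))]
            return max(gaps)

        # Hypothetical network clustered south of the epicenter -> large gap to the north.
        stations = [(-7.9, 110.3), (-7.8, 110.5), (-7.7, 110.4), (-7.75, 110.2)]
        print(f"Azimuthal gap: {azimuthal_gap((-7.5, 110.4), stations):.0f} deg")   # ~300 deg, i.e. > 180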

  1. Evidence for Ancient Mesoamerican Earthquakes

    NASA Astrophysics Data System (ADS)

    Kovach, R. L.; Garcia, B.

    2001-12-01

    Evidence for past earthquake damage at Mesoamerican ruins is often overlooked because of the invasive effects of tropical vegetation and is usually not considered as a causal factor when restoration and reconstruction of many archaeological sites are undertaken. Yet the proximity of many ruins to zones of seismic activity would argue otherwise. Clues as to the types of damage which should be sought were offered in September 1999 when the M = 7.5 Oaxaca earthquake struck the ruins of Monte Alban, Mexico, where archaeological renovations were underway. More than 20 structures were damaged, 5 of them seriously. Damage features noted were walls out of plumb, fractures in walls, floors, basal platforms and tableros, toppling of columns, and deformation, settling and tumbling of walls. A Modified Mercalli Intensity of VII (ground accelerations 18-34 %g) occurred at the site. Within the diffuse landward extension of the Caribbean plate boundary zone, M = 7+ earthquakes occur with repeat times of hundreds of years, arguing that many Maya sites were subjected to earthquakes. Damage to re-erected and reinforced stelae, walls, and buildings was witnessed at Quirigua, Guatemala, during an expedition underway when the 1976 M = 7.5 Guatemala earthquake on the Motagua fault struck. Excavations also revealed evidence (domestic pottery vessels and the skeleton of a child crushed under fallen walls) of an ancient earthquake occurring about the time of the demise and abandonment of Quirigua in the late 9th century. Striking evidence for sudden earthquake building collapse at the end of the Mayan Classic Period ~A.D. 889 was found at Benque Viejo (Xunantunich), Belize, located 210 north of Quirigua. It is argued that a M = 7.5 to 7.9 earthquake at the end of the Maya Classic period centered in the vicinity of the Chixoy-Polochic and Motagua fault zones could have produced the contemporaneous earthquake damage to the above sites. As a consequence, this earthquake may have accelerated the

  2. The Electronic Encyclopedia of Earthquakes

    NASA Astrophysics Data System (ADS)

    Benthien, M.; Marquis, J.; Jordan, T.

    2003-12-01

    The Electronic Encyclopedia of Earthquakes is a collaborative project of the Southern California Earthquake Center (SCEC), the Consortia of Universities for Research in Earthquake Engineering (CUREE) and the Incorporated Research Institutions for Seismology (IRIS). This digital library organizes earthquake information online as a partner with the NSF-funded National Science, Technology, Engineering and Mathematics (STEM) Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth Science, engineering, physics and mathematics. Although conceived primarily as an educational resource, the Encyclopedia is also a valuable portal to anyone seeking up-to-date earthquake information and authoritative technical sources. "E3" is a unique collaboration among earthquake scientists and engineers to articulate and document a common knowledge base with a shared terminology and conceptual framework. It is a platform for cross-training scientists and engineers in these complementary fields and will provide a basis for sustained communication and resource-building between major education and outreach activities. For example, the E3 collaborating organizations have leadership roles in the two largest earthquake engineering and earth science projects ever sponsored by NSF: the George E. Brown Network for Earthquake Engineering Simulation (CUREE) and the EarthScope Project (IRIS and SCEC). The E3 vocabulary and definitions are also being connected to a formal ontology under development by the SCEC/ITR project for knowledge management within the SCEC Collaboratory. The E3 development system is now fully operational, 165 entries are in the pipeline, and the development teams are capable of producing 20 new, fully reviewed encyclopedia entries each month. Over the next two years teams will

  3. Identified EM Earthquake Precursors

    NASA Astrophysics Data System (ADS)

    Jones, Kenneth, II; Saxton, Patrick

    2014-05-01

    recurrence, duration, and frequency response. At the Southern California field sites, one loop antenna was positioned for omni-directional reception and also detected a strong First Schumann Resonance; however, additional Schumann Resonances were absent. At the Timpson, TX field sites, loop antennae were positioned for directional reception, due to earthquake-inducing hydraulic fracturing activity currently conducted by the oil and gas industry. Two strong signals, one moderately strong signal, and approximately 6-8 weaker signals were detected in the immediate vicinity. The three stronger signals were mapped by a biangulation technique, followed by a triangulation technique for confirmation. This was the first antenna mapping technique ever performed for determining possible earthquake epicenters. Six and a half months later, Timpson experienced two M4 (M4.1 and M4.3) earthquakes on September 2, 2013, followed by a M2.4 earthquake three days later, all occurring at a depth of five kilometers. The Timpson earthquake activity now has a cyclical rate, and a forecast was given to the proper authorities. As a result, the Southern California and Timpson, TX field results led to an improved design and construction of a third prototype antenna. With a loop antenna array, a viable communication system, and continuous monitoring, a full fracture cycle can be established and observed in real time. In addition, field data could be reviewed quickly for assessment and lead to a much improved earthquake forecasting capability. The EM precursors determined by this method appear to surpass all prior precursor claims, and the general public will finally receive long overdue forecasting.
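
    The biangulation/triangulation described above amounts to intersecting bearing lines from directional antennas; the sketch below intersects two bearings on a flat plane to estimate a source location. The station positions and bearings are invented, and a real survey would also need to propagate bearing uncertainty.

        # Intersect two direction-finding bearings to estimate a source location (flat-plane sketch, invented data).
        import math

        def intersect_bearings(p1, brg1_deg, p2, brg2_deg):
            """Each bearing is degrees clockwise from north; returns the (x, y) intersection point."""
            (x1, y1), (x2, y2) = p1, p2
            d1 = (math.sin(math.radians(brg1_deg)), math.cos(math.radians(brg1_deg)))
            d2 = (math.sin(math.radians(brg2_deg)), math.cos(math.radians(brg2_deg)))
            denom = d1[0] * d2[1] - d1[1] * d2[0]
            if abs(denom) < 1e-12:
                raise ValueError("bearings are parallel; no unique intersection")
            t = ((x2 - x1) * d2[1] - (y2 - y1) * d2[0]) / denom
            return x1 + t * d1[0], y1 + t * d1[1]

        # Two hypothetical antenna sites (km coordinates) with bearings toward a suspected epicenter.
        print(intersect_bearings((0.0, 0.0), 45.0, (10.0, 0.0), 315.0))   # ~ (5.0, 5.0)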

  4. Anomalous Schumann resonance observed in China, possibly associated with Honshu, Japan Earthquake

    NASA Astrophysics Data System (ADS)

    Ouyang, X. Y.; Zhang, X. M.; Shen, X. H.; Miao, Y. Q.

    2012-04-01

    Schumann resonance (hereafter SR) occurs in the cavity between the Earth and the ionosphere and is generated by global lightning activity [1]. Some recent publications have shown that anomalous SR phenomena may occur before major earthquakes [2-4]. Considering the good prospects for the application of SR to earthquake monitoring, we have established four observatories in Yunnan province, a region with frequent seismicity in southwestern China. Our instruments provide three components of the magnetic field in the 0-30 Hz band, namely BNS (north-south component), BEW (east-west component) and BV (vertical component), sampled at 100 Hz. In this research, we use high-quality data recorded at the Yongsheng observatory (geographic coordinates: 26.7° N, 100.77° E) to analyze SR phenomena and search for anomalous effects possibly related to the Ms 9.0 earthquake (epicenter: 38.297° N, 142.372° E) near the east coast of Honshu, Japan, on 11 March 2011. We select the data from 15 days before and after the earthquake. SR in BNS and SR in BEW differ in their background characteristics. The frequencies of the four SR modes in BNS are generally higher than those in BEW. The amplitude of SR in BNS is strong at around 05:00 LT, 15:00 LT and 23:00 LT, while the amplitude of SR in BEW is intense only around 16:00 LT, corresponding to about 08:00 UT. Because the American, African and Asian thunderstorm centers play their dominant roles in the intervals of 21:00 UT±1h, 15:00 UT±1h and 08:00 UT±1h, respectively [1, 3], SR in BEW is most sensitive to signals from the Asian center, whereas SR in BNS responds well to all three centers. SR in BNS and SR in BEW also present different features with respect to anomalous effects related to earthquakes. The BEW component gives a clear picture of anomalous SR phenomena, characterized by an increase in the amplitude of the four SR modes and an increase in the frequency of the first SR mode several days before the earthquake. The amplitude of four SR
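
    Extracting the amplitude and frequency of the first few SR modes from 100 Hz magnetic-field records, as described above, is typically done with spectral analysis; the sketch below uses Welch's method on a synthetic signal. The window length, synthetic mode frequencies, and peak-search bands are assumptions for illustration, not the observatory's processing.

        # Sketch: estimate Schumann-resonance mode peaks from a magnetic-field record (synthetic data).
        import numpy as np
        from scipy.signal import welch

        fs = 100.0                                        # sampling frequency of the records (Hz)
        t = np.arange(0, 600, 1 / fs)                     # ten synthetic minutes
        rng = np.random.default_rng(1)
        # Synthetic B-field: first four SR modes (~7.8, 14, 20, 26 Hz) plus noise.
        signal = sum(a * np.sin(2 * np.pi * f * t) for a, f in [(1.0, 7.8), (0.6, 14.1), (0.4, 20.3), (0.3, 26.4)])
        signal += rng.normal(0, 0.5, t.size)

        freqs, psd = welch(signal, fs=fs, nperseg=4096)
        for lo, hi in [(6, 10), (12, 17), (18, 23), (24, 29)]:        # rough search bands, one per mode
            band = (freqs >= lo) & (freqs <= hi)
            f_peak = freqs[band][np.argmax(psd[band])]
            print(f"Mode peak near {f_peak:.2f} Hz, power {psd[band].max():.3f}")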

  5. Real Time Earthquake Information System in Japan

    NASA Astrophysics Data System (ADS)

    Doi, K.; Kato, T.

    2003-12-01

    monitors earthquake data and analyzes earthquake activity and tsunami occurrence around the clock on a real-time basis. In addition, JMA has for a decade been developing a Nowcast Earthquake Information system that can notify its users of the occurrence of an earthquake before strong ground motion arrives. The Earthquake Research Institute of the University of Tokyo, in collaboration with JMA, is preparing a demonstration experiment on better utilization of Nowcast Earthquake Information to apply practical measures that reduce earthquake disasters caused by strong ground motion.

  6. Design and characterization of the beam monitor detectors of the Italian National Center of Oncological Hadron-therapy (CNAO)

    NASA Astrophysics Data System (ADS)

    Giordanengo, S.; Donetti, M.; Garella, M. A.; Marchetto, F.; Alampi, G.; Ansarinejad, A.; Monaco, V.; Mucchi, M.; Pecka, I. A.; Peroni, C.; Sacchi, R.; Scalise, M.; Tomba, C.; Cirio, R.

    2013-01-01

    A new hadron-therapy facility implementing an active beam-scanning technique has been developed at the Italian National Center of Oncological Hadron-therapy (CNAO). This paper presents the design and characterization of the beam monitor detectors developed for the on-line monitoring and control of the dose delivered during a treatment at CNAO. The detectors are based on five parallel-plate transmission ionization chambers with either a single large electrode or electrodes segmented into 128 strips (strip chambers) or 32×32 pixels (pixel chamber). The detectors are arranged in two independent boxes with an active area larger than 200×200 mm2 and a total water-equivalent thickness along the beam path of about 0.9 mm. A custom 64-channel front-end chip converts the integrated ionization charge without dead time. The detectors were tested at the clinical proton beam facility of the Paul Scherrer Institut (PSI), which implements a spot-scanning technique, each spot being characterized by a predefined number of protons delivered with a pencil beam to a specified point of the irradiation field. The short-term instability, measured by delivering several identical spots within a time interval of a few tenths of a second, is found to be lower than 0.3%. The non-uniformity, measured by delivering sequences of spots at different points on the detector surface, is lower than 1% in the single-electrode chambers and lower than 1.5% in the strip and pixel chambers, reducing to less than 0.5% and 1%, respectively, in the restricted 100×100 mm2 central area of the detector.

  7. Selected natural attenuation monitoring data, Operable Unit 1, Naval Undersea Warfare Center, Division Keyport, Washington, June 2003

    USGS Publications Warehouse

    Dinicola, Richard S.; Huffman, R.L.

    2004-01-01

    Previous investigations have shown that natural attenuation and biodegradation of chlorinated volatile organic compounds (CVOCs) are substantial in shallow ground water beneath the 9-acre former landfill at Operable Unit 1 (OU 1), Naval Undersea Warfare Center (NUWC), Division Keyport, Washington. This report presents the ground-water geochemical and selected CVOC data collected at OU 1 by the U.S. Geological Survey (USGS) during June 17-20, 2003 in support of long-term monitoring for natural attenuation. Strongly reducing conditions favorable for reductive dechlorination of CVOCs were found in fewer upper-aquifer wells during June 2003 than were found during sampling periods in 2001 and 2002. Redox conditions in water from the intermediate aquifer just downgradient from the landfill remained somewhat favorable for reductive dechlorination. As was noted in previous monitoring reports, the changes in redox conditions observed at individual wells have not been consistent or substantial throughout either the upper or the intermediate aquifers. Compared to 2002 data, total CVOC concentrations in June 2003 were nearly unchanged in all northern plantation piezometers sampled, although the concentrations were historically low at two of those sites. Total CVOC concentrations decreased consistently in the southern plantation samples. Historically low total CVOC concentrations were observed in three of the piezometers sampled, and a two order-of-magnitude decrease in total CVOCs was observed at one of those sites. The observed decreases in CVOC concentrations appear to be in contrast with the 2003 redox data that suggested less favorable conditions for reductive dechlorination. The Navy and USGS plan to do more extensive data-collection and interpretation during 2004 to better understand and document possible changes in redox conditions and contaminant biodegradation.

  8. Earthquake occurrence and effects.

    PubMed

    Adams, R D

    1990-01-01

    Although earthquakes are mainly concentrated in zones close to boundaries of tectonic plates of the Earth's lithosphere, infrequent events away from the main seismic regions can cause major disasters. The major cause of damage and injury following earthquakes is elastic vibration, rather than fault displacement. This vibration at a particular site will depend not only on the size and distance of the earthquake but also on the local soil conditions. Earthquake prediction is not yet generally fruitful in avoiding earthquake disasters, but much useful planning to reduce earthquake effects can be done by studying the general earthquake hazard in an area, and taking some simple precautions. PMID:2347628

  9. Napa Earthquake impact on water systems

    NASA Astrophysics Data System (ADS)

    Wang, J.

    2014-12-01

    The South Napa earthquake occurred in Napa, California, on August 24, 2014, at 3 a.m. local time, with a magnitude of 6.0. It was the largest earthquake in the San Francisco Bay Area since the 1989 Loma Prieta earthquake. Economic losses topped $1 billion. Winemakers were cleaning up and estimating the damage to tourism; around 15,000 cases of cabernet poured out into the garden at the Hess Collection. Earthquakes can raise water-pollution risks and could cause a water crisis. California has suffered water shortages in recent years, so understanding how to prevent earthquake-related pollution of groundwater and surface water would be valuable. This research gives a clear view of the drinking-water system in California and of pollution of river systems, as well as an estimate of earthquake impacts on the water supply. The Sacramento-San Joaquin River Delta (close to Napa) is the center of the state's water-distribution system, delivering fresh water to more than 25 million residents and 3 million acres of farmland. Delta water conveyed through a network of levees is crucial to Southern California. The drought has significantly curtailed water exports, and saltwater intrusion has reduced freshwater outflows. Strong shaking from a nearby earthquake can cause liquefaction of saturated, loose, sandy soils and could potentially damage major Delta levee systems near Napa. The Napa earthquake is a wake-up call for Southern California: it could potentially damage the freshwater supply system.

  10. EARTHQUAKE TRIGGERING AND SPATIAL-TEMPORAL RELATIONS IN THE VICINITY OF YUCCA MOUNTAIN, NEVADA

    SciTech Connect

    na

    2001-02-08

    It is well accepted that the 1992 M 5.6 Little Skull Mountain earthquake, the largest historical event to have occurred within 25 km of Yucca Mountain, Nevada, was triggered by the M 7.2 Landers earthquake that occurred the day before. On the premise that earthquakes can be triggered by applied stresses, we have examined the earthquake catalog from the Southern Great Basin Digital Seismic Network (SGBDSN) for other evidence of triggering by external and internal stresses. This catalog now comprises over 12,000 events, encompassing five years of consistent monitoring, and has a low threshold of completeness, varying from M 0 in the center of the network to M 1 at the fringes. We examined the SGBDSN catalog response to external stresses such as large signals propagating from teleseismic and regional earthquakes, microseismic storms, and earth tides. Results are generally negative. We also examined the interplay of earthquakes within the SGBDSN. The number of "foreshocks", as judged by most criteria, is significantly higher than the background seismicity rate. In order to establish this, we first removed aftershocks from the catalog using widely used methodology. The existence of SGBDSN foreshocks is supported by comparing actual statistics to those of a simulated catalog with uniformly distributed locations and Poisson-distributed times of occurrence. The probabilities of a given SGBDSN earthquake being followed by one having a higher magnitude within a short time frame and within a close distance are at least as high as those found with regional catalogs. These catalogs have completeness thresholds two to three units higher in magnitude than the SGBDSN catalog used here. The largest earthquake in the SGBDSN catalog, the M 4.7 event in Frenchman Flat on 01/27/1999, was preceded by a definite foreshock sequence. The largest event within 75 km of Yucca Mountain in historical time, the M 5.7 Scotty's Junction event of 08/01/1999, was also preceded by foreshocks. The
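
    A minimal sketch of the kind of comparison described above between observed foreshock statistics and a simulated catalog with uniformly distributed locations and Poisson-distributed occurrence times. The foreshock definition (a larger event within a chosen time and distance window), the window sizes, and the toy catalogs are illustrative assumptions, not the SGBDSN analysis itself.

      import numpy as np

      def count_foreshocks(t, x, y, m, dt_days=3.0, dist_km=10.0):
          """Count events followed within dt_days and dist_km by a larger-magnitude event."""
          n = 0
          for i in range(len(t)):
              later = (t > t[i]) & (t <= t[i] + dt_days)
              close = np.hypot(x - x[i], y - y[i]) <= dist_km
              if np.any(later & close & (m > m[i])):
                  n += 1
          return n

      def simulated_count(n_events, span_days, box_km, mags, rng, n_sim=200, **kw):
          """Average foreshock count for catalogs with Poisson times and uniform locations."""
          counts = []
          for _ in range(n_sim):
              t = np.sort(rng.uniform(0, span_days, n_events))  # Poisson process <=> uniform order statistics
              x, y = rng.uniform(0, box_km, (2, n_events))
              m = rng.permutation(mags)                         # reuse the observed magnitude distribution
              counts.append(count_foreshocks(t, x, y, m, **kw))
          return float(np.mean(counts))

      # Hypothetical comparison with a toy "observed" catalog:
      rng = np.random.default_rng(1)
      n, span, box = 500, 5 * 365.0, 80.0
      mags = rng.exponential(0.5, n)                            # stand-in for a G-R magnitude distribution
      t_obs = np.sort(rng.uniform(0, span, n))
      x_obs, y_obs = rng.uniform(0, box, (2, n))
      print("observed:", count_foreshocks(t_obs, x_obs, y_obs, mags),
            "simulated mean:", simulated_count(n, span, box, mags, rng))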

  11. Center for Integration of Natural Disaster Information

    USGS Publications Warehouse

    U.S. Geological Survey

    2001-01-01

    The U.S. Geological Survey's Center for Integration of Natural Disaster Information (CINDI) is a research and operational facility that explores methods for collecting, integrating, and communicating information about the risks posed by natural hazards and the effects of natural disasters. The U.S. Geological Survey (USGS) is mandated by the Robert Stafford Act to warn citizens of impending landslides, volcanic eruptions, and earthquakes. The USGS also coordinates with other Federal, State, and local disaster agencies to monitor threats to communities from floods, coastal storms, wildfires, geomagnetic storms, drought, and outbreaks of disease in wildlife populations.

  12. PAGER - Rapid Assessment of an Earthquake's Impact

    USGS Publications Warehouse

    Earle, Paul; Wald, David J.

    2007-01-01

    PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that rapidly assesses the number of people and regions exposed to severe shaking by an earthquake and informs emergency responders, government agencies, and the media of the scope of the potential disaster. PAGER monitors the U.S. Geological Survey's near real-time U.S. and global earthquake detections and automatically identifies events that are of societal importance, well in advance of ground-truth or news accounts.
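
    A minimal sketch of the core exposure calculation a PAGER-like system performs once a shaking-intensity grid is available: population is summed within intensity bins on a co-registered grid. The grids and bins below are illustrative assumptions, not the actual PAGER implementation.

      import numpy as np

      def exposure_by_intensity(mmi_grid, pop_grid, bins=(4, 5, 6, 7, 8, 9, 10)):
          """Sum population within each Modified Mercalli Intensity (MMI) bin.

          mmi_grid : 2-D array of estimated MMI values (e.g. from a ShakeMap-like product).
          pop_grid : 2-D array of population counts on the same grid.
          Returns a dict mapping intensity-bin labels to exposed population.
          """
          exposure = {}
          for lo, hi in zip(bins[:-1], bins[1:]):
              mask = (mmi_grid >= lo) & (mmi_grid < hi)
              exposure[f"MMI {lo}-{hi}"] = float(pop_grid[mask].sum())
          return exposure

      # Hypothetical toy grids: a radially decaying intensity field over a uniform population.
      ny, nx = 200, 200
      yy, xx = np.mgrid[0:ny, 0:nx]
      dist = np.hypot(yy - ny / 2, xx - nx / 2)        # distance from a toy epicenter (grid cells)
      mmi = np.clip(9.0 - 0.04 * dist, 1.0, 9.0)       # crude attenuation with distance
      pop = np.full((ny, nx), 50.0)                    # 50 people per cell, everywhere
      print(exposure_by_intensity(mmi, pop))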

  13. Intracranial Pressure Monitoring in Severe Traumatic Brain Injury in Latin America: Process and Methods for a Multi-Center Randomized Controlled Trial

    PubMed Central

    Lujan, Silvia; Dikmen, Sureyya; Temkin, Nancy; Petroni, Gustavo; Pridgeon, Jim; Barber, Jason; Machamer, Joan; Cherner, Mariana; Chaddock, Kelley; Hendrix, Terence; Rondina, Carlos; Videtta, Walter; Celix, Juanita M.; Chesnut, Randall

    2012-01-01

    Abstract In patients with severe traumatic brain injury (TBI), the influence on important outcomes of the use of information from intracranial pressure (ICP) monitoring to direct treatment has never been tested in a randomized controlled trial (RCT). We are conducting an RCT in six trauma centers in Latin America to test this question. We hypothesize that patients randomized to ICP monitoring will have lower mortality and better outcomes at 6-months post-trauma than patients treated without ICP monitoring. We selected three centers in Bolivia to participate in the trial, based on (1) the absence of ICP monitoring, (2) adequate patient accession and data collection during the pilot phase, (3) preliminary institutional review board approval, and (4) the presence of equipoise about the value of ICP monitoring. We conducted extensive training of site personnel, and initiated the trial on September 1, 2008. Subsequently, we included three additional centers. A total of 176 patients were entered into the trial as of August 31, 2010. Current enrollment is 81% of that expected. The trial is expected to reach its enrollment goal of 324 patients by September of 2011. We are conducting a high-quality RCT to answer a question that is important globally. In addition, we are establishing the capacity to conduct strong research in Latin America, where TBI is a serious epidemic. Finally, we are demonstrating the feasibility and utility of international collaborations that share resources and unique patient populations to conduct strong research about global public health concerns. PMID:22435793

  14. Toward real-time regional earthquake simulation of Taiwan earthquakes

    NASA Astrophysics Data System (ADS)

    Lee, S.; Liu, Q.; Tromp, J.; Komatitsch, D.; Liang, W.; Huang, B.

    2013-12-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 minutes after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 minutes for a 70 sec ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  15. Long-term monitoring of creep rate along the Hayward fault and evidence for a lasting creep response to 1989 Loma Prieta earthquake

    USGS Publications Warehouse

    Lienkaemper, J.J.; Galehouse, J.S.; Simpson, R.W.

    2001-01-01

    We present results from over 30 yr of precise surveys of creep along the Hayward fault. Along most of the fault, spatial variability in long-term creep rates is well determined by these data and can help constrain 3D-models of the depth of the creeping zone. However, creep at the south end of the fault stopped completely for more than 6 years after the M7 1989 Loma Prieta Earthquake (LPEQ), perhaps delayed by stress drop imposed by this event. With a decade of detailed data before LPEQ and a decade after it, we report that creep response to that event does indeed indicate the expected deficit in creep.

  16. The Seminole Serpent Warrior At Miramar, FL, Shows Settlement Locations Enabled Environmental Monitoring Reminiscent Of the Four-corners Kokopelli-like EMF Phenomena, and Related to Earthquakes, Tornados and Hurricanes.

    NASA Astrophysics Data System (ADS)

    Balam Matagamon, Chan; Pawa Matagamon, Sagamo

    2004-03-01

    Certain Native Americans of the past seem to have correctly deduced that significant survival information for their tradition-respecting cultures resided in EMF-based phenomena that they were monitoring. This is based upon their myths and the place or cult-hero names they bequeathed us. The sites we have located in FL have been detectable by us visually, usually by faint blue light, or by the elicitation of pin-like prickings, by somewhat intense nervous-system response, by EMF interactions with aural electrochemical systems that can elicit tinnitus, and other ways. In the northeast, Cautantowit served as a harbinger of Indian summer, and appears to be another alter ego of the EMF. The Miami, FL Tequesta site along the river clearly correlates with tornado, earthquake and hurricane locations. Sites like the Mohave Desert's giant man may have had similar significance.

  17. 76 FR 61115 - Migrant and Seasonal Farmworkers (MSFWs) Monitoring Report and One-Stop Career Center Complaint...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-03

    ...-Stop Career Center Complaint/Referral Record: Comments Agency: Employment and Training Administration... 8429, One-Stop Career Center Complaint/ Referral Record, to March 1, 2015. The changes incorporated to... effectiveness of SWA service delivery to MSFWs. The ETA Form 8429, One-Stop Career Center...

  18. Monitoring

    DOEpatents

    Orr, Christopher Henry; Luff, Craig Janson; Dockray, Thomas; Macarthur, Duncan Whittemore

    2004-11-23

    The invention provides apparatus and methods which facilitate movement of an instrument relative to an item or location being monitored and/or the item or location relative to the instrument, whilst successfully excluding extraneous ions from the detection location. Thus, ions generated by emissions from the item or location can successfully be monitored during movement. The technique employs sealing to exclude such ions, for instance, through an electric field which attracts and discharges the ions prior to their entering the detection location and/or a magnetic field configured to repel the ions away from the detection location.

  19. Selected Natural Attenuation Monitoring Data, Operable Unit 1, Naval Undersea Warfare Center, Division Keyport, Washington, 2007 and 2008

    USGS Publications Warehouse

    Dinicola, R.S.; Huffman, R.L.

    2009-01-01

    Previous investigations indicate that natural attenuation and biodegradation of chlorinated volatile organic compounds (VOCs) are substantial in groundwater beneath the 9-acre former landfill at Operable Unit 1 (OU 1), Naval Undersea Warfare Center, Division Keyport, Washington. Phytoremediation combined with on-going natural attenuation processes was the preferred remedy selected by the Navy, as specified in the Record of Decision for the site. The Navy planted two hybrid poplar plantations on the landfill in spring 1999 to remove and to control the migration of chlorinated VOCs in shallow groundwater. The U.S. Geological Survey (USGS) has continued to monitor groundwater geochemistry to ensure that conditions remain favorable for contaminant biodegradation as specified in the Record of Decision. In this report are groundwater geochemical and selected VOC data collected at OU 1 by the USGS during June 18-21, 2007, and June 16-18, 2008, in support of long-term monitoring for natural attenuation. For 2007 and 2008, strongly reducing conditions (sulfate reduction and methanogenesis) most favorable for reductive dechlorination of VOCs were inferred for 9 of 16 upper-aquifer wells and piezometers in the northern and southern phytoremediation plantations. Predominant redox conditions in groundwater from the intermediate aquifer just downgradient from the landfill remained mildly reducing and somewhat favorable for reductive dechlorination of VOCs. Dissolved hydrogen (H2) concentrations measured in the upper aquifer during 2007 and 2008 generally have been lower than H2 concentrations measured before 2002. However, widespread and relatively high methane and sulfide concentrations indicate that the lower H2 concentrations measured do not support a trend from strongly to mildly reducing redox conditions because no widespread changes in groundwater redox conditions were identified that should result in less favorable conditions for the reductive dechlorination of the

  20. Earthquake Education in Prime Time

    NASA Astrophysics Data System (ADS)

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

    Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in the spring season 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and The Department of Homeland Security Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU) on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs. We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and

  1. Temporal-Spatial Pattern of Pre-earthquake Signatures in Atmosphere and Ionosphere Associated with Major Earthquakes in Greece.

    NASA Astrophysics Data System (ADS)

    Calderon, I. S.; Ouzounov, D.; Anagnostopoulos, G. C.; Pulinets, S. A.; Davidenko, D.; Karastathis, V. K.; Kafatos, M.

    2015-12-01

    We are conducting validation studies on atmosphere/ionosphere phenomena preceding major earthquakes in Greece in the last decade, in particular the largest (M6.9) earthquakes that occurred on May 24, 2014 in the Aegean Sea and on February 14, 2008 in South West Peloponisos (Methoni). Our approach is based on simultaneously monitoring a series of different physical parameters from space: outgoing long-wavelength radiation (OLR) at the top of the atmosphere, electron density variations in the ionosphere via GPS Total Electron Content (GPS/TEC), and ULF radiation and radiation belt electron precipitation (RBEP) accompanied by VLF wave activity in the topside ionosphere. In particular, we analyzed prospectively and retrospectively the temporal and spatial variations of various parameters characterizing the state of the atmosphere and ionosphere several days before the two M6.9 earthquakes. Concerning the Methoni EQ, DEMETER data confirm an almost standard profile before large EQs, with TEC, ULF, VLF and RBEP activity preceding the EQ occurrence by some (four) days and silence on the day of the EQ; furthermore, during the period before the EQ, a progressive concentration of ULF emission centers around the future epicenter was confirmed. Concerning the recent Greek EQ of May 24, 2014, a thermal anomaly was discovered 30 days in advance and a TEC anomaly 38 hours in advance. The spatial characteristics of the pre-earthquake anomalous behavior were associated with the epicentral region. Our analysis of simultaneous space measurements before the great EQs suggests that they follow a general temporal-spatial pattern, which has been seen in other large EQs worldwide.

  2. Real-Time Earthquake Analysis for Disaster Mitigation (READI) Network

    NASA Astrophysics Data System (ADS)

    Bock, Y.

    2014-12-01

    Real-time GNSS networks are making a significant impact on our ability to forecast, assess, and mitigate the effects of geological hazards. I describe the activities of the Real-time Earthquake Analysis for Disaster Mitigation (READI) working group. The group leverages 600+ real-time GPS stations in western North America operated by UNAVCO (PBO network), Central Washington University (PANGA), US Geological Survey & Scripps Institution of Oceanography (SCIGN project), UC Berkeley & US Geological Survey (BARD network), and the Pacific Geosciences Centre (WCDA project). Our goal is to demonstrate an earthquake and tsunami early warning system for western North America. Rapid response is particularly important for those coastal communities that are in the near-source region of large earthquakes and may have only minutes of warning time, and who today are not adequately covered by existing seismic and basin-wide ocean-buoy monitoring systems. The READI working group is performing comparisons of independent real time analyses of 1 Hz GPS data for station displacements and is participating in government-sponsored earthquake and tsunami exercises in the Western U.S. I describe a prototype seismogeodetic system using a cluster of southern California stations that includes GNSS tracking and collocation with MEMS accelerometers for real-time estimation of seismic velocity and displacement waveforms, which has advantages for improved earthquake early warning and tsunami forecasts compared to seismic-only or GPS-only methods. The READI working group's ultimate goal is to participate in an Indo-Pacific Tsunami early warning system that utilizes GNSS real-time displacements and ionospheric measurements along with seismic, near-shore buoys and ocean-bottom pressure sensors, where available, to rapidly estimate magnitude and finite fault slip models for large earthquakes, and then forecast tsunami source, energy scale, geographic extent, inundation and runup. This will require

  3. The size of earthquakes

    USGS Publications Warehouse

    Kanamori, H.

    1980-01-01

    How we should measure the size of an earthquake has been historically a very important, as well as a very difficult, seismological problem. For example, figure 1 shows the loss of life caused by earthquakes in recent times and clearly demonstrates that 1976 was the worst year for earthquake casualties in the 20th century. However, the damage caused by an earthquake is due not only to its physical size but also to other factors such as where and when it occurs; thus, figure 1 is not necessarily an accurate measure of the "size" of earthquakes in 1976. The point is that the physical process underlying an earthquake is highly complex; we therefore cannot express every detail of an earthquake by a simple, straightforward parameter. Indeed, it would be very convenient if we could find a single number that represents the overall physical size of an earthquake. This was in fact the concept behind the Richter magnitude scale introduced in 1935.
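
    The abstract stops at the Richter scale; as a worked illustration of the "single number" idea, the sketch below uses the now-standard moment-magnitude relation Mw = (log10 M0 - 9.1) / 1.5 with M0 in newton-meters, which is not part of this abstract but is a well-established later refinement of the same concept. The sample moment values are illustrative.

      import math

      def moment_magnitude(m0_newton_meters):
          """Moment magnitude Mw from seismic moment M0 (N*m): Mw = (log10 M0 - 9.1) / 1.5."""
          return (math.log10(m0_newton_meters) - 9.1) / 1.5

      def seismic_moment(mw):
          """Inverse relation: seismic moment (N*m) from Mw."""
          return 10 ** (1.5 * mw + 9.1)

      # Illustrative values: a great earthquake, and the moment ratio per unit of magnitude.
      print(moment_magnitude(4.0e22))                    # ~9.0, roughly a great subduction earthquake
      print(seismic_moment(6.0) / seismic_moment(5.0))   # each unit of Mw is ~31.6x more moment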

  4. Earthquakes: Predicting the unpredictable?

    USGS Publications Warehouse

    Hough, S.E.

    2005-01-01

    The earthquake prediction pendulum has swung from optimism in the 1970s to rather extreme pessimism in the 1990s. Earlier work revealed evidence of possible earthquake precursors: physical changes in the planet that signal that a large earthquake is on the way. Some respected earthquake scientists argued that earthquakes are fundamentally unpredictable. The fate of the Parkfield prediction experiment appeared to support their arguments: a moderate earthquake had been predicted along a specified segment of the central San Andreas fault within five years of 1988, but had failed to materialize on schedule. At some point, however, the pendulum began to swing back. Reputable scientists began using the "P-word" not only in polite company, but also at meetings and even in print. If the optimism regarding earthquake prediction can be attributed to any single cause, it might be scientists' burgeoning understanding of the earthquake cycle.

  5. Earthquakes: hydrogeochemical precursors

    USGS Publications Warehouse

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  6. Estimating earthquake potential

    USGS Publications Warehouse

    Page, R.A.

    1980-01-01

    The hazards to life and property from earthquakes can be minimized in three ways. First, structures can be designed and built to resist the effects of earthquakes. Second, the location of structures and human activities can be chosen to avoid or to limit the use of areas known to be subject to serious earthquake hazards. Third, preparations for an earthquake in response to a prediction or warning can reduce the loss of life and damage to property as well as promote a rapid recovery from the disaster. The success of the first two strategies, earthquake engineering and land use planning, depends on being able to reliably estimate the earthquake potential. The key considerations in defining the potential of a region are the location, size, and character of future earthquakes and frequency of their occurrence. Both historic seismicity of the region and the geologic record are considered in evaluating earthquake potential. 

  7. Crustal earthquake triggering by modern great earthquakes on subduction zone thrusts

    NASA Astrophysics Data System (ADS)

    Gomberg, Joan; Sherrod, Brian

    2014-02-01

    Among the many questions raised by the recent abundance of great (M > 8.0) subduction thrust earthquakes is their potential to trigger damaging earthquakes on crustal faults within the overriding plate and beneath many of the world's densely populated urban centers. We take advantage of the coincident abundance of great earthquakes globally and instrumental observations since 1960 to assess this triggering potential by analyzing centroids and focal mechanisms from the centroid moment tensor catalog for events starting in 1976 and published reports about the M9.5 1960 Chile and M9.2 1964 Alaska earthquake sequences. We find clear increases in the rates of crustal earthquakes in the overriding plate within days following all subduction thrust earthquakes of M > 8.6, within about ±10° of the triggering event centroid latitude and longitude. This result is consistent with dynamic triggering of more distant increases of shallow seismicity rates at distances beyond ±10°, suggesting that dynamic triggering may be important within the near field too. Crustal earthquake rate increases may also follow smaller M > 7.5 subduction thrust events, but because activity typically occurs offshore in the immediate vicinity of the triggering rupture plane, it cannot be unambiguously attributed to sources within the overriding plate. These observations are easily explained in the context of existing earthquake scaling laws.

  8. 76 FR 69761 - National Earthquake Prediction Evaluation Council (NEPEC)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-09

    ... briefings on lessons learned from the 2010 Chile and 2011 Japan subduction earthquakes, monitoring and....S. Geological Survey National Earthquake Prediction Evaluation Council (NEPEC) AGENCY: U.S. Geological Survey. ACTION: Notice of Meeting. SUMMARY: Pursuant to Public Law 96-472, the National...

  9. Supercomputing meets seismology in earthquake exhibit

    SciTech Connect

    Blackwell, Matt; Rodger, Arthur; Kennedy, Tom

    2013-10-03

    When the California Academy of Sciences created the "Earthquake: Evidence of a Restless Planet" exhibit, they called on Lawrence Livermore to help combine seismic research with the latest data-driven visualization techniques. The outcome is a series of striking visualizations of earthquakes, tsunamis and tectonic plate evolution. Seismic-wave research is a core competency at Livermore. While most often associated with earthquakes, the research has many other applications of national interest, such as nuclear explosion monitoring, explosion forensics, energy exploration, and seismic acoustics. For the Academy effort, Livermore researchers simulated the San Andreas and Hayward fault events at high resolutions. Such calculations require significant computational resources. To simulate the 1906 earthquake, for instance, visualizing 125 seconds of ground motion required over 1 billion grid points, 10,000 time steps, and 7.5 hours of processor time on 2,048 cores of Livermore's Sierra machine.

  10. Supercomputing meets seismology in earthquake exhibit

    ScienceCinema

    Blackwell, Matt; Rodger, Arthur; Kennedy, Tom

    2016-07-12

    When the California Academy of Sciences created the "Earthquake: Evidence of a Restless Planet" exhibit, they called on Lawrence Livermore to help combine seismic research with the latest data-driven visualization techniques. The outcome is a series of striking visualizations of earthquakes, tsunamis and tectonic plate evolution. Seismic-wave research is a core competency at Livermore. While most often associated with earthquakes, the research has many other applications of national interest, such as nuclear explosion monitoring, explosion forensics, energy exploration, and seismic acoustics. For the Academy effort, Livermore researchers simulated the San Andreas and Hayward fault events at high resolutions. Such calculations require significant computational resources. To simulate the 1906 earthquake, for instance, visualizing 125 seconds of ground motion required over 1 billion grid points, 10,000 time steps, and 7.5 hours of processor time on 2,048 cores of Livermore's Sierra machine.

  11. A continuation of base-line studies for environmentally monitoring Space Transportation Systems at John F. Kennedy Space Center. Volume 2: Chemical studies of rainfall and soil analysis

    NASA Technical Reports Server (NTRS)

    Madsen, B. C.

    1980-01-01

    The results of a study which was designed to monitor, characterize, and evaluate the chemical composition of precipitation (rain) which fell at the Kennedy Space Center, Florida (KSC) during the period July 1977 to March 1979 are reported. Results which were obtained from a soil sampling and associated chemical analysis are discussed. The purpose of these studies was to determine the environmental perturbations which might be caused by NASA space activities.

  12. Earthquake Warning Performance in Vallejo for the South Napa Earthquake

    NASA Astrophysics Data System (ADS)

    Wurman, G.; Price, M.

    2014-12-01

    In 2002 and 2003, Seismic Warning Systems, Inc. installed first-generation QuakeGuard™ earthquake warning devices at all eight fire stations in Vallejo, CA. These devices are designed to detect the P-wave of an earthquake and initiate predetermined protective actions if the impending shaking is estimated at approximately Modified Mercalli Intensity V or greater. At the Vallejo fire stations the devices were set up to sound an audio alert over the public address system and to command the equipment bay doors to open. In August 2014, after more than 11 years of operating in the fire stations with no false alarms, the five units that were still in use triggered correctly on the Mw 6.0 South Napa earthquake, less than 16 km away. The audio alert sounded in all five stations, providing fire fighters with 1.5 to 2.5 seconds of warning before the arrival of the S-wave, and the equipment bay doors opened in three of the stations. In one station the doors were disconnected from the QuakeGuard device, and another station lost power before the doors opened completely. These problems highlight just a small portion of the complexity associated with realizing actionable earthquake warnings. The issues experienced in this earthquake have already been addressed in subsequent QuakeGuard product generations, with downstream connection monitoring and backup power for critical systems. The fact that the fire fighters in Vallejo were afforded even two seconds of warning at these epicentral distances results from the design of the QuakeGuard devices, which focuses on rapid false positive rejection and ground motion estimates. We discuss the performance of the ground motion estimation algorithms, with an emphasis on the accuracy and timeliness of the estimates at close epicentral distances.
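
    A back-of-the-envelope sketch of why a second or two of warning is plausible at roughly 16 km epicentral distance: the available warning time is approximately the S-wave travel time minus the P-wave travel time minus the detection and alerting latency. The wave speeds and latency below are generic assumptions, not measured properties of the Vallejo installation.

      def warning_time_s(distance_km, vp_km_s=6.0, vs_km_s=3.5, detect_latency_s=0.5):
          """Approximate warning time before S-wave arrival for a P-wave-triggered on-site alert.

          distance_km      : hypocentral distance (km)
          vp_km_s, vs_km_s : assumed average P- and S-wave speeds in the shallow crust
          detect_latency_s : assumed time to detect the P-wave and issue the alert
          """
          t_p = distance_km / vp_km_s
          t_s = distance_km / vs_km_s
          return t_s - t_p - detect_latency_s

      # Roughly the South Napa geometry described above (~16 km):
      print(round(warning_time_s(16.0), 2))   # ~1.4 s with these assumed values, the same order as the 1.5-2.5 s reported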

  13. Selected Natural Attenuation Monitoring Data, Operable Unit 1, Naval Undersea Warfare Center, Division Keyport, Washington, June 2006

    USGS Publications Warehouse

    Dinicola, R.S.; Huffman, R.L.

    2007-01-01

    Previous investigations have shown that natural attenuation and biodegradation of chlorinated volatile organic compounds (VOCs) are substantial in shallow ground water beneath the 9-acre former landfill at Operable Unit 1 (OU 1), Naval Undersea Warfare Center, Division Keyport, Washington. The U.S. Geological Survey (USGS) has continued to monitor ground-water geochemistry to assure that conditions remain favorable for contaminant biodegradation. This report presents ground-water geochemical and selected VOC data collected at OU 1 by the USGS during June 12-14, 2006, in support of long-term monitoring for natural attenuation. For June 2006, the strongly reducing conditions (sulfate reduction and methanogenesis) most favorable for reductive dechlorination of VOCs were inferred for 5 of 15 upper-aquifer sites in the northern and southern phytoremediation plantations. Predominant redox conditions in ground water from the intermediate aquifer just downgradient from the landfill remained mildly reducing and somewhat favorable for reductive dechlorination. Since about 2003, measured dissolved hydrogen concentrations in the upper aquifer generally have been lower than those previously measured, although methane and sulfide have continued to be detected throughout the upper aquifer beneath the landfill. Overall, no widespread changes in ground-water redox conditions were measured that should result in either more or less efficient biodegradation of chlorinated VOCs. For the northern plantation in 2006, chlorinated VOC concentrations at piezometers P1-3 and P1-4 were lower than previously measured, and trichloroethene (TCE), cis-1,2-dichloroethene (cis-DCE), or vinyl chloride (VC) were not detected at piezometers P1-1 and P1-5. The steady decrease in contaminant concentrations and the continued detection of the reductive dechlorination end-products ethene and ethane have been consistent throughout the upper aquifer beneath the northern plantation. For the southern

  14. Selected Natural Attenuation Monitoring Data, Operable Unit 1, Naval Undersea Warfare Center, Division Keyport, Washington, June 2005

    USGS Publications Warehouse

    Dinicola, Richard S.; Huffman, R.L.

    2006-01-01

    Previous investigations have shown that natural attenuation and biodegradation of chlorinated volatile organic compounds (VOCs) are substantial in shallow ground water beneath the 9-acre former landfill at Operable Unit 1 (OU-1), Naval Undersea Warfare Center, Division Keyport, Washington. The U.S. Geological Survey (USGS) has continued to monitor ground-water geochemistry to assure that conditions remain favorable for contaminant biodegradation. This report presents the ground-water geochemical and selected VOC data collected at OU-1 by the USGS during June 21-24, 2005, in support of long-term monitoring for natural attenuation. For June 2005, the strongly reducing conditions (sulfate reduction and methanogenesis) most favorable for reductive dechlorination of chlorinated VOCs were detected in fewer upper-aquifer wells than were detected during 2004. Redox conditions in ground water from the intermediate aquifer just downgradient of the landfill remained somewhat favorable for reductive dechlorination. Overall, the changes in redox conditions observed at individual wells have not been consistent or substantial throughout either the upper or the intermediate aquifers. In apparent contrast to changes in redox conditions, the chlorinated VOC concentrations were lower than previously measured in many of the piezometers in the northern phytoremediation plantation. The decrease in contaminant concentrations beneath the northern plantation and the end-product (ethane and ethene) evidence for reductive dechlorination are consistent with 2000-04 results. In the southern phytoremediation plantation, changes in chlorinated VOC concentrations were variable. Most notable was a substantial decrease in the sum of trichloroethene, cis-1,2-dichloroethene, and vinyl chloride concentrations at piezometer P1-9 from 75,000 to 1,000 micrograms per liter between 2004 and 2005. The high concentrations of the reductive dechlorination end-products ethane and ethene measured at the most

  15. Redefining Earthquakes and the Earthquake Machine

    ERIC Educational Resources Information Center

    Hubenthal, Michael; Braile, Larry; Taber, John

    2008-01-01

    The Earthquake Machine (EML), a mechanical model of stick-slip fault systems, can increase student engagement and facilitate opportunities to participate in the scientific process. This article introduces the EML model and an activity that challenges ninth-grade students' misconceptions about earthquakes. The activity emphasizes the role of models…

  16. Earthquakes, October 1975

    USGS Publications Warehouse

    Person, W.J.

    1976-01-01

    October was an active month seismically, although there were no damaging earthquakes in the United States. Several States experienced earthquakes that were felt sharply. There were four major earthquakes in other parts of the world, including a magnitude 7.4 in the Philippine Islands that killed one person.

  17. Children's Ideas about Earthquakes

    ERIC Educational Resources Information Center

    Simsek, Canan Lacin

    2007-01-01

    Earthquakes, a natural disaster, are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced correspondingly. In particular, good training about earthquakes in primary schools is considered…

  18. Earthquake and Schools. [Videotape].

    ERIC Educational Resources Information Center

    Federal Emergency Management Agency, Washington, DC.

    Designing schools to make them more earthquake-resistant and protect children from the catastrophic collapse of the school building is discussed in this videotape. It reveals that 44 of the 50 U.S. states are vulnerable to earthquakes, but most schools are structurally unprepared to take on the stresses that earthquakes exert. The cost to the…

  19. School Safety and Earthquakes.

    ERIC Educational Resources Information Center

    Dwelley, Laura; Tucker, Brian; Fernandez, Jeanette

    1997-01-01

    A recent assessment of earthquake risk to Quito, Ecuador, concluded that many of its public schools are vulnerable to collapse during major earthquakes. A subsequent examination of 60 buildings identified 15 high-risk buildings. These schools were retrofitted to meet standards that would prevent injury even during Quito's largest earthquakes. US…

  20. Retrospective stress-forecasting of earthquakes

    NASA Astrophysics Data System (ADS)

    Gao, Yuan; Crampin, Stuart

    2015-04-01

    Observations of changes in azimuthally varying shear-wave splitting (SWS) above swarms of small earthquakes monitor stress-induced changes to the stress-aligned vertical microcracks pervading the upper crust, lower crust, and uppermost ~400 km of the mantle. (The microcracks are intergranular films of hydrolysed melt in the mantle.) Earthquakes release stress, and an appropriate amount of stress for the relevant magnitude must accumulate before each event. Iceland is on an extension of the Mid-Atlantic Ridge, where, uniquely, two transform zones run onshore. These onshore transform zones provide semi-continuous swarms of small earthquakes, which are the only place worldwide where SWS can be routinely monitored. Elsewhere SWS must be monitored above temporally-active occasional swarms of small earthquakes, or in infrequent SKS and other teleseismic reflections from the mantle. Observations of changes in SWS time-delays are attributed to stress-induced changes in crack aspect-ratios, allowing stress-accumulation and stress-relaxation to be identified. While monitoring SWS in SW Iceland in 1988, stress-accumulation before an impending earthquake was recognised, and emails were exchanged between the University of Edinburgh (EU) and the Iceland Meteorological Office (IMO). On 10th November 1988, EU emailed IMO that a M5 earthquake could occur soon on a seismically-active fault plane where seismicity was still continuing following a M5.1 earthquake six months earlier. Three days later, IMO emailed EU that a M5 earthquake had just occurred on the specified fault-plane. We suggest this is a successful earthquake stress-forecast, where we refer to the procedure as stress-forecasting earthquakes, as opposed to predicting or forecasting, to emphasise the different formalism. Lack of funds has prevented us monitoring SWS on Iceland seismograms; however, we have identified similar characteristic behaviour of SWS time-delays above swarms of small earthquakes which have enabled us to
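
    A minimal sketch of measuring a shear-wave splitting time delay, the quantity monitored in the study above: the horizontal components are rotated into an assumed fast/slow frame and the delay is taken as the cross-correlation lag that best aligns them. Operational SWS analyses also grid-search the fast azimuth and window the shear-wave arrival; the fast direction, signals, and parameters here are synthetic assumptions.

      import numpy as np

      def splitting_delay(north, east, fast_azimuth_deg, fs, max_delay_s=1.0):
          """Crude SWS delay estimate: rotate horizontals into assumed fast/slow directions
          and take the cross-correlation lag that best aligns them (fast azimuth assumed known)."""
          phi = np.radians(fast_azimuth_deg)
          fast = north * np.cos(phi) + east * np.sin(phi)
          slow = -north * np.sin(phi) + east * np.cos(phi)
          max_lag = int(max_delay_s * fs)
          lags = np.arange(1, max_lag + 1)
          cc = [np.corrcoef(fast[:-k], slow[k:])[0, 1] for k in lags]   # slow lags behind fast
          return lags[int(np.argmax(cc))] / fs

      # Synthetic example: a pulse on the fast axis arriving 0.12 s earlier than on the slow axis.
      fs = 100.0
      t = np.arange(0, 5, 1 / fs)
      pulse = lambda t0: np.exp(-((t - t0) ** 2) / (2 * 0.05 ** 2))
      phi = np.radians(30.0)                           # assumed fast direction, degrees east of north
      fast_sig, slow_sig = pulse(2.0), pulse(2.12)
      north = fast_sig * np.cos(phi) - slow_sig * np.sin(phi)
      east = fast_sig * np.sin(phi) + slow_sig * np.cos(phi)
      print(splitting_delay(north, east, 30.0, fs))    # ~0.12 s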

  1. Lessons in bridge damage learned from the Wenchuan earthquake

    NASA Astrophysics Data System (ADS)

    Yen, W. Phillip; Chen, Genda; Yashinski, Mark; Hashash, Youssef; Holub, Curtis; Wang, Kehai; Guo, Xiaodong

    2009-06-01

    A strong earthquake occurred in Wenchuan County, Sichuan Province, China, on May 12, 2008. Shortly after the earthquake, the Turner-Fairbank Highway Research Center of the Federal Highway Administration, in partnership with the Research Institute of Highways, the Ministry of Communication of China, led a reconnaissance team to conduct a post-earthquake bridge performance investigation of the transportation system in the earthquake affected areas. The U.S. transportation system reconnaissance team visited the area during July 20-24, 2008. This paper presents the findings and lessons learned by the team.

  2. Operational earthquake forecasting can enhance earthquake preparedness

    USGS Publications Warehouse

    Jordan, T.H.; Marzocchi, W.; Michael, A.J.; Gerstenberger, M.C.

    2014-01-01

    We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time‐dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard—ground‐motion exceedance probabilities as well as short‐term rupture probabilities—in concert with the long‐term forecasts of probabilistic seismic‐hazard analysis (PSHA).
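
    A minimal sketch of turning a time-dependent earthquake rate into the kind of probability statement OEF disseminates, assuming a Poisson occurrence model over the forecast window; the rates are illustrative, not an actual forecast.

      import math

      def exceedance_probability(rate_per_day, window_days):
          """Probability of at least one event in the window, assuming Poisson occurrence:
          P = 1 - exp(-rate * window)."""
          return 1.0 - math.exp(-rate_per_day * window_days)

      # Illustrative: a background rate versus a rate elevated 100x during nearby seismic activity.
      background = 1.0e-5   # assumed daily rate of a damaging event in some region
      elevated = 1.0e-3     # assumed rate shortly after a nearby moderate earthquake
      for label, r in [("background", background), ("elevated", elevated)]:
          print(f"{label}: {exceedance_probability(r, 7):.4%} chance in the next 7 days")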

  3. Crowdsourced earthquake early warning.

    PubMed

    Minson, Sarah E; Brooks, Benjamin A; Glennie, Craig L; Murray, Jessica R; Langbein, John O; Owen, Susan E; Heaton, Thomas H; Iannucci, Robert A; Hauser, Darren L

    2015-04-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California's Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing. PMID:26601167

  4. Crowdsourced earthquake early warning

    PubMed Central

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing. PMID:26601167

  5. Crowdsourced earthquake early warning.

    PubMed

    Minson, Sarah E; Brooks, Benjamin A; Glennie, Craig L; Murray, Jessica R; Langbein, John O; Owen, Susan E; Heaton, Thomas H; Iannucci, Robert A; Hauser, Darren L

    2015-04-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California's Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

  6. Cooperative Monitoring Center Occasional Paper/13: Cooperative monitoring for confidence building: A case study of the Sino-Indian border areas

    SciTech Connect

    SIDHU,WAHEGURU PAL SINGH; YUAN,JING-DONG; BIRINGER,KENT L.

    1999-08-01

    This occasional paper identifies applicable cooperative monitoring techniques and develops models for possible application in the context of the border between China and India. The 1993 and 1996 Sino-Indian agreements on maintaining peace and tranquility along the Line of Actual Control (LAC) and establishing certain confidence building measures (CBMs), including force reductions and limitation on military exercises along their common border, are used to examine the application of technically based cooperative monitoring in both strengthening the existing terms of the agreements and also enhancing trust. The paper also aims to further the understanding of how and under what conditions technology-based tools can assist in implementing existing agreements on arms control and confidence building. The authors explore how cooperative monitoring techniques can facilitate effective implementation of arms control agreements and CBMs between states and contribute to greater security and stability in bilateral, regional, and global contexts.

  7. Earthquake forecasting and warning

    SciTech Connect

    Rikitake, T.

    1983-01-01

    This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States is updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings of precursory phenomena are included. A summary of circumstances surrounding the 1975 Haicheng earthquake, the 1976 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all magnitudes ≥ 7.0) in China and the 1978 Izu-Oshima earthquake in Japan is presented. This book fails to comprehensively summarize recent advances in earthquake prediction research.

  8. Earthquake Hazard in the Heart of the Homeland

    USGS Publications Warehouse

    Gomberg, Joan; Schweig, Eugene

    2007-01-01

    Evidence that earthquakes threaten the Mississippi, Ohio, and Wabash River valleys of the Central United States abounds. In fact, several of the largest historical earthquakes to strike the continental United States occurred in the winter of 1811-1812 along the New Madrid seismic zone, which stretches from just west of Memphis, Tenn., into southern Illinois. Several times in the past century, moderate earthquakes have been widely felt in the Wabash Valley seismic zone along the southern border of Illinois and Indiana. Throughout the region, between 150 and 200 earthquakes are recorded annually by a network of monitoring instruments, although most are too small to be felt by people. Geologic evidence for prehistoric earthquakes throughout the region has been mounting since the late 1970s. But how significant is the threat? How likely are large earthquakes and, more importantly, what is the chance that the shaking they cause will be damaging?

  9. The 2015 Mw 7.1 earthquake on the Charlie-Gibbs transform fault: Repeating earthquakes and multimodal slip on a slow oceanic transform

    NASA Astrophysics Data System (ADS)

    Aderhold, K.; Abercrombie, R. E.

    2016-06-01

    The 2015 Mw 7.1 earthquake on the Charlie-Gibbs transform fault along the Mid-Atlantic Ridge is the latest in a series of seven large earthquakes since 1923. We propose that these earthquakes form a pair of quasi-repeating sequences with the largest magnitudes and longest repeat times for such sequences observed to date. We model teleseismic body waves and find that the 2015 earthquake ruptured a distinct segment of the transform from the previous 1998 earthquake. The two events display similarities to earthquakes in 1974 and 1967, respectively. We observe large oceanic transform earthquakes to exhibit characteristic slip behavior, initiating with small slip near the ridge, and propagating unilaterally to significant slip asperities nearer the center of the transform. These slip distributions combined with apparent segmentation support multimode slip behavior with fault slip accommodated both seismically during large earthquakes and aseismically in between.

  10. Oscillating brittle and viscous behavior through the earthquake cycle in the Red River Shear Zone: Monitoring flips between reaction and textural softening and hardening

    NASA Astrophysics Data System (ADS)

    Wintsch, Robert P.; Yeh, Meng-Wan

    2013-03-01

    greenschist facies conditions with the base of the crustal seismic zone suggests that the implied oscillations in strain rate may have been related to the earthquake cycle.

  11. Unexpectedly frequent occurrence of very small repeating earthquakes (-5.1 ≤ Mw ≤ -3.6) in a South African gold mine: Implications for monitoring intraplate faults

    NASA Astrophysics Data System (ADS)

    Naoi, Makoto; Nakatani, Masao; Igarashi, Toshihiro; Otsuki, Kenshiro; Yabe, Yasuo; Kgarume, Thabang; Murakami, Osamu; Masakale, Thabang; Ribeiro, Luiz; Ward, Anthony; Moriya, Hirokazu; Kawakata, Hironori; Nakao, Shigeru; Durrheim, Raymond; Ogasawara, Hiroshi

    2015-12-01

    We observed very small repeating earthquakes with -5.1 ≤ Mw ≤ -3.6 on a geological fault at 1 km depth in a gold mine in South Africa. Of the 851 acoustic emissions that occurred on the fault during the 2 month analysis period, 45% were identified as repeaters on the basis of waveform similarity and relative locations. They occurred steadily at the same location with similar magnitudes, analogous to repeaters at plate boundaries, suggesting that they are repeat ruptures of the same asperity loaded by the surrounding aseismic slip (background creep). Application of the Nadeau and Johnson (1998) empirical formula (NJ formula), which relates the amount of background creep and repeater activity and is well established for plate boundary faults, to the present case yielded an impossibly large estimate of the background creep. This means that the presently studied repeaters were produced more efficiently, for a given amount of background creep, than expected from the NJ formula. When combined with an independently estimated average stress drop of 16 MPa, which is not particularly high, it suggests that the small asperities of the presently studied repeaters had a high seismic coupling (almost unity), in contrast to one physical interpretation of the plate boundary repeaters. The productivity of such repeaters, per unit background creep, is expected to increase strongly as smaller repeaters are considered (∝ Mo^-1/3 as opposed to Mo^-1/6 of the NJ formula), which may be usable to estimate very slow creep that may occur on intraplate faults.
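
    A minimal sketch of the creep estimate the abstract refers to: slip per repeat is estimated from seismic moment, and the implied background creep rate is slip per repeat divided by the recurrence interval. The Nadeau-Johnson coefficients in the comment are the commonly cited empirical values, and the shear modulus, constant-stress-drop alternative, recurrence interval, and example magnitude are assumptions for illustration only, not the authors' calculation.

      import math

      MU = 3.0e10   # assumed crustal shear modulus (Pa)

      def slip_nj_cm(m0_dyne_cm):
          """Slip per repeat (cm) from the Nadeau-Johnson empirical relation d = 10**-2.36 * M0**0.17,
          with M0 in dyne*cm (an empirical fit to plate-boundary repeaters; treat as an assumption here)."""
          return 10 ** (-2.36) * m0_dyne_cm ** 0.17

      def slip_constant_stress_drop_cm(m0_newton_m, stress_drop_pa=16.0e6):
          """Slip per repeat (cm) for a fully coupled circular asperity with constant stress drop:
          d = M0**(1/3) * (16*dsigma/7)**(2/3) / (mu*pi)."""
          d_m = m0_newton_m ** (1 / 3) * (16 * stress_drop_pa / 7) ** (2 / 3) / (MU * math.pi)
          return 100.0 * d_m

      # Illustrative Mw -3.6 repeater recurring every ~10 days:
      m0_nm = 10 ** (1.5 * (-3.6) + 9.1)   # seismic moment (N*m) from the standard Mw relation
      m0_dyne_cm = m0_nm * 1.0e7
      recurrence_days = 10.0
      for name, d_cm in [("Nadeau-Johnson", slip_nj_cm(m0_dyne_cm)),
                         ("constant stress drop", slip_constant_stress_drop_cm(m0_nm))]:
          creep_cm_per_yr = d_cm * 365.0 / recurrence_days
          print(f"{name}: slip/repeat ~{d_cm:.3g} cm -> implied background creep ~{creep_cm_per_yr:.3g} cm/yr")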

  12. Uplift and Subsidence Associated with the Great Aceh-Andaman Earthquake of 2004

    NASA Technical Reports Server (NTRS)

    2006-01-01

    The magnitude 9.2 Indian Ocean earthquake of December 26, 2004, produced broad regions of uplift and subsidence. In order to define the lateral extent and the downdip limit of rupture, scientists from Caltech, Pasadena, Calif.; NASA's Jet Propulsion Laboratory, Pasadena, Calif.; Scripps Institution of Oceanography, La Jolla, Calif.; the U.S. Geological Survey, Pasadena, Calif.; and the Research Center for Geotechnology, Indonesian Institute of Sciences, Bandung, Indonesia; first needed to define the pivot line separating those regions. Interpretation of satellite imagery and a tidal model were among the key tools used to do this.

    These pre-Sumatra earthquake (a) and post-Sumatra earthquake (b) images of North Sentinel Island in the Indian Ocean, acquired from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) instrument on NASA's Terra spacecraft, show emergence of the coral reef surrounding the island following the earthquake. The tide was 30 plus or minus 14 centimeters lower in the pre-earthquake image (acquired November 21, 2000) than in the post-earthquake image (acquired February 20, 2005), requiring a minimum of 30 centimeters of uplift at this locality. Observations from an Indian Coast Guard helicopter on the northwest coast of the island suggest that the actual uplift is on the order of 1 to 2 meters at this site.

    In figures (c) and (d), pre-earthquake and post-earthquake ASTER images of a small island off the northwest coast of Rutland Island, 38 kilometers east of North Sentinel Island, show submergence of the coral reef surrounding the island. The tide was higher in the pre-earthquake image (acquired January 1, 2004) than in the post-earthquake image (acquired February 4, 2005), requiring subsidence at this locality. The pivot line must run between North Sentinel and Rutland islands. Note that the scale for the North Sentinel Island images differs from that for the Rutland Island images.

    The tidal model used

  13. Seismic quiescence precursors to two M7 earthquakes on Sakhalin Island, measured by two methods

    NASA Astrophysics Data System (ADS)

    Wyss, Max; Sobolev, Gennady; Clippard, James D.

    2004-08-01

    Two large earthquakes occurred during the last decade on Sakhalin Island, the Mw 7.6 Neftegorskoe earthquake of 27 May 1995 and the Mw 6.8 Uglegorskoe earthquake of 4 August 2000, in the north and south of the island, respectively. Only about five seismograph stations record earthquakes along the 1000 km, mostly strike-slip plate boundary that transects the island from north to south. In spite of that, it was possible to investigate seismicity patterns of the last two to three decades quantitatively. We found that in, and surrounding, their source volumes, both of these main shocks were preceded by periods of pronounced seismic quiescence, which lasted 2.5 ± 0.5 years. The distances over which the production of earthquakes was reduced reached several hundred kilometers. The probability that these periods of anomalously low seismicity occurred by chance is estimated to be about 1% to 2%. These conclusions were reached independently by the application of two methods, which are based on different approaches. The RTL-algorithm measures the level of seismic activity in moving time windows by counting the number of earthquakes, weighted by their size, and inversely weighted by their distance, in time and space from the point of observation. The Z-mapping approach measures the difference of the seismicity rate, within moving time windows, from the background rate by the standard deviate Z. This generates an array of comparisons that covers all of the available time and space, and that can be searched for all anomalous departures from the normal seismicity rate. The RTL-analysis was based on the original catalog with K-classes measuring the earthquake sizes; the Z-mapping was based on the catalog with K transformed into magnitudes. The RTL-analysis started with data from 1980; the Z-mapping technique used the data from 1974 on. In both methods, cylindrical volumes, centered at the respective epicenters, were sampled. The Z-mapping technique additionally investigated the
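
    As a minimal illustration of the standard deviate Z used in the Z-mapping approach (a common formulation of the rate-comparison statistic; the authors' exact binning, window lengths, and sampling volumes are not reproduced here), in Python:

        import numpy as np

        def z_value(background_rates, window_rates):
            # Standard deviate Z comparing the mean seismicity rate inside a
            # moving window with the background rate; positive Z indicates a
            # rate decrease (quiescence) within the window.
            r1 = np.asarray(background_rates, dtype=float)
            r2 = np.asarray(window_rates, dtype=float)
            return (r1.mean() - r2.mean()) / np.sqrt(
                r1.var(ddof=1) / len(r1) + r2.var(ddof=1) / len(r2))

        # toy example: monthly earthquake counts before and inside a candidate
        # quiescence window (values are illustrative only)
        rng = np.random.default_rng(1)
        background = rng.poisson(5.0, size=120)   # ~10 years at the normal rate
        quiet      = rng.poisson(2.0, size=30)    # 2.5 years at a reduced rate
        print("Z =", round(z_value(background, quiet), 2))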

  14. Missing great earthquakes

    USGS Publications Warehouse

    Hough, Susan E.

    2013-01-01

    The occurrence of three earthquakes with moment magnitude (Mw) greater than 8.8 and six earthquakes larger than Mw 8.5, since 2004, has raised interest in the long-term global rate of great earthquakes. Past studies have focused on the analysis of earthquakes since 1900, which roughly marks the start of the instrumental era in seismology. Before this time, the catalog is less complete and magnitude estimates are more uncertain. Yet substantial information is available for earthquakes before 1900, and the catalog of historical events is being used increasingly to improve hazard assessment. Here I consider the catalog of historical earthquakes and show that approximately half of all Mw ≥ 8.5 earthquakes are likely missing or underestimated in the 19th century. I further present a reconsideration of the felt effects of the 8 February 1843, Lesser Antilles earthquake, including a first thorough assessment of felt reports from the United States, and show it is an example of a known historical earthquake that was significantly larger than initially estimated. The results suggest that incorporation of best available catalogs of historical earthquakes will likely lead to a significant underestimation of seismic hazard and/or the maximum possible magnitude in many regions, including parts of the Caribbean.

  15. Earthquake detection through computationally efficient similarity search.

    PubMed

    Yoon, Clara E; O'Reilly, Ossian; Bergen, Karianne J; Beroza, Gregory C

    2015-12-01

    Seismology is experiencing rapid growth in the quantity of data, which has outpaced the development of processing algorithms. Earthquake detection-identification of seismic events in continuous data-is a fundamental operation for observational seismology. We developed an efficient method to detect earthquakes using waveform similarity that overcomes the disadvantages of existing detection methods. Our method, called Fingerprint And Similarity Thresholding (FAST), can analyze a week of continuous seismic waveform data in less than 2 hours, or 140 times faster than autocorrelation. FAST adapts a data mining algorithm, originally designed to identify similar audio clips within large databases; it first creates compact "fingerprints" of waveforms by extracting key discriminative features, then groups similar fingerprints together within a database to facilitate fast, scalable search for similar fingerprint pairs, and finally generates a list of earthquake detections. FAST detected most (21 of 24) cataloged earthquakes and 68 uncataloged earthquakes in 1 week of continuous data from a station located near the Calaveras Fault in central California, achieving detection performance comparable to that of autocorrelation, with some additional false detections. FAST is expected to realize its full potential when applied to extremely long duration data sets over a distributed network of seismic stations. The widespread application of FAST has the potential to aid in the discovery of unexpected seismic signals, improve seismic monitoring, and promote a greater understanding of a variety of earthquake processes. PMID:26665176
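
    A heavily simplified sketch of the fingerprint-and-threshold idea, in Python (illustrative only: the published FAST implementation uses spectral-image/wavelet fingerprints and locality-sensitive hashing rather than the FFT features and brute-force pair search shown here):

        import numpy as np

        def fingerprints(trace, fs, win_s=10.0, step_s=1.0, nkeep=32):
            # Slide a window over the trace, take the magnitude spectrum, and
            # keep a binary fingerprint marking the strongest spectral bins.
            win, step = int(win_s * fs), int(step_s * fs)
            fps, times = [], []
            for i0 in range(0, len(trace) - win, step):
                spec = np.abs(np.fft.rfft(np.asarray(trace[i0:i0 + win], dtype=float)))
                fp = np.zeros(spec.size, dtype=bool)
                fp[np.argsort(spec)[-nkeep:]] = True     # top-nkeep bins "on"
                fps.append(fp)
                times.append(i0 / fs)
            return np.array(fps), np.array(times)

        def similar_pairs(fps, times, jaccard_min=0.5, min_sep_s=20.0):
            # Brute-force search for fingerprint pairs whose Jaccard similarity
            # exceeds a threshold (FAST replaces this quadratic step with
            # hashing so the search stays near-linear in data length).
            hits = []
            for i in range(len(fps)):
                for j in range(i + 1, len(fps)):
                    if times[j] - times[i] < min_sep_s:
                        continue                         # skip overlapping windows
                    inter = np.logical_and(fps[i], fps[j]).sum()
                    union = np.logical_or(fps[i], fps[j]).sum()
                    if inter / union >= jaccard_min:
                        hits.append((times[i], times[j], inter / union))
            return hits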

  16. Earthquake detection through computationally efficient similarity search

    PubMed Central

    Yoon, Clara E.; O’Reilly, Ossian; Bergen, Karianne J.; Beroza, Gregory C.

    2015-01-01

    Seismology is experiencing rapid growth in the quantity of data, which has outpaced the development of processing algorithms. Earthquake detection—identification of seismic events in continuous data—is a fundamental operation for observational seismology. We developed an efficient method to detect earthquakes using waveform similarity that overcomes the disadvantages of existing detection methods. Our method, called Fingerprint And Similarity Thresholding (FAST), can analyze a week of continuous seismic waveform data in less than 2 hours, or 140 times faster than autocorrelation. FAST adapts a data mining algorithm, originally designed to identify similar audio clips within large databases; it first creates compact “fingerprints” of waveforms by extracting key discriminative features, then groups similar fingerprints together within a database to facilitate fast, scalable search for similar fingerprint pairs, and finally generates a list of earthquake detections. FAST detected most (21 of 24) cataloged earthquakes and 68 uncataloged earthquakes in 1 week of continuous data from a station located near the Calaveras Fault in central California, achieving detection performance comparable to that of autocorrelation, with some additional false detections. FAST is expected to realize its full potential when applied to extremely long duration data sets over a distributed network of seismic stations. The widespread application of FAST has the potential to aid in the discovery of unexpected seismic signals, improve seismic monitoring, and promote a greater understanding of a variety of earthquake processes. PMID:26665176

  18. Post-earthquake building safety assessments for the Canterbury Earthquakes

    USGS Publications Warehouse

    Marshall, J.; Barnes, J.; Gould, N.; Jaiswal, K.; Lizundia, B.; Swanson, David A.; Turner, F.

    2012-01-01

    This paper explores the post-earthquake building assessment program that was utilized in Christchurch, New Zealand following the Canterbury Sequence of earthquakes beginning with the Magnitude (Mw) 7.1 Darfield event in September 2010. The aftershocks or triggered events, two of which exceeded Mw 6.0, continued with events in February and June 2011, causing the greatest amount of damage. More than 70,000 building safety assessments were completed following the February event. The timeline and assessment procedures will be discussed, including the use of rapid response teams, selection of indicator buildings to monitor damage following aftershocks, risk assessments for demolition of red-tagged buildings, the use of task forces to address management of the heavily damaged downtown area, and the process of demolition. Through the post-event safety assessment program that occurred throughout the Canterbury Sequence of earthquakes, many important lessons can be learned that will benefit future response to natural hazards that have the potential to damage structures.

  19. Benefits of Earthquake Early Warning to Large Municipalities (Invited)

    NASA Astrophysics Data System (ADS)

    Featherstone, J.

    2013-12-01

    The City of Los Angeles has been involved in the testing of the Cal Tech Shake Alert, Earthquake Early Warning (EQEW) system, since February 2012. This system accesses a network of seismic monitors installed throughout California. The system analyzes and processes seismic information, and transmits a warning (audible and visual) when an earthquake occurs. In late 2011, the City of Los Angeles Emergency Management Department (EMD) was approached by Cal Tech regarding EQEW, and immediately recognized the value of the system. Simultaneously, EMD was finalizing a report by a multi-discipline team that visited Japan in December 2011, which spoke to the effectiveness of EQEW during the March 11, 2011 earthquake that struck that country. Information collected by the team confirmed that the EQEW systems proved to be very effective in alerting the population of the impending earthquake. The EQEW in Japan is also tied to mechanical safeguards, such as the stopping of high-speed trains. For a city the size and complexity of Los Angeles, the implementation of a reliable EQEW system will save lives, reduce losses, ensure effective and rapid emergency response, and greatly enhance the ability of the region to recover from a damaging earthquake. The current Shake Alert system is being tested at several governmental organizations and private businesses in the region. EMD, in cooperation with Cal Tech, identified several locations internal to the City where the system would have an immediate benefit. These include the staff offices within EMD, the Los Angeles Police Department's Real Time Analysis and Critical Response Division (24 hour crime center), and the Los Angeles Fire Department's Metropolitan Fire Communications (911 Dispatch). All three of these agencies routinely manage the collaboration and coordination of citywide emergency information and response during times of crisis. Having these three key public safety offices connected and included in the

  20. Mexican Earthquakes and Tsunamis Catalog Reviewed

    NASA Astrophysics Data System (ADS)

    Ramirez-Herrera, M. T.; Castillo-Aja, R.

    2015-12-01

    Today the availability of information on the internet makes online catalogs easy to access for both scholars and the general public. The catalog in the "Significant Earthquake Database", managed by NOAA's National Center for Environmental Information (NCEI, formerly NCDC), provides access to tabular and cartographic data on the earthquakes and tsunamis contained in the database. The NCEI catalog is the product of compiling previously existing catalogs, historical sources, newspapers, and scientific articles. Because the NCEI catalog has global coverage, the information is not homogeneous. The existence of historical information depends on the presence of people in the places where a disaster occurred and on the description being preserved in documents or oral tradition. In the case of instrumental data, availability depends on the distribution and quality of seismic stations. Therefore, the information for the first half of the 20th century can be improved by careful analysis of the available sources and by searching for and resolving inconsistencies. This study presents our progress in upgrading and refining the earthquake and tsunami catalog of Mexico from 1500 CE to the present, in both table and map form. Data analysis allowed us to identify the following sources of error in the location of epicenters in existing catalogs: incorrect coordinate entry; erroneous or mistaken place names; data too general to locate the epicenter precisely, mainly for older earthquakes; and inconsistency between the earthquake and the reported tsunami, for example an epicenter located too far inland for an event reported as tsunamigenic. The process of completing the catalogs depends directly on the availability of information; as new archives are opened for inspection, there are more opportunities to complete the history of large earthquakes and tsunamis in Mexico. Here, we also present new earthquake and

  1. Lightning Activities and Earthquakes

    NASA Astrophysics Data System (ADS)

    Liu, Jann-Yenq

    2016-04-01

    Lightning activity is one of the key parameters for understanding the atmospheric electric fields and/or currents near the Earth's surface, as well as the lithosphere-atmosphere coupling during the earthquake preparation period. In this study, to see whether lightning activity is related to earthquakes, we statistically examine lightning activity 30 days before and after 78 land and 230 sea M>5.0 earthquakes in Taiwan during the 12-year period 1993-2004. Lightning activity versus the location, depth, and magnitude of earthquakes is investigated. Results show that lightning activity tends to appear around the forthcoming epicenter and is significantly enhanced a few days, especially 17-19 days, before M>6.0 shallow (depth D < 20 km) land earthquakes. Moreover, the size of the area around the epicenter showing statistically significant enhancement of lightning activity is proportional to the earthquake magnitude.

  2. Patient experiences with self-monitoring renal function after renal transplantation: results from a single-center prospective pilot study

    PubMed Central

    van Lint, Céline L; van der Boog, Paul JM; Wang, Wenxin; Brinkman, Willem-Paul; Rövekamp, Ton JM; Neerincx, Mark A; Rabelink, Ton J; van Dijk, Sandra

    2015-01-01

    Background After a kidney transplantation, patients have to visit the hospital often to monitor for early signs of graft rejection. Self-monitoring of creatinine in addition to blood pressure at home could alleviate the burden of frequent outpatient visits, but only if patients are willing to self-monitor and if they adhere to the self-monitoring measurement regimen. A prospective pilot study was conducted to assess patients’ experiences and satisfaction. Materials and methods For 3 months after transplantation, 30 patients registered self-measured creatinine and blood pressure values in an online record to which their physician had access. Patients completed a questionnaire at baseline and follow-up to assess satisfaction, attitude, self-efficacy regarding self-monitoring, worries, and physician support. Adherence was studied by comparing the number of registered measurements with the number requested. Results Patients were highly motivated to self-monitor kidney function, and reported high levels of general satisfaction. Level of satisfaction was positively related to perceived support from physicians (P<0.01), level of self-efficacy (P<0.01), and amount of trust in the accuracy of the creatinine meter (P<0.01). The use of both the creatinine and blood pressure meters was considered pleasant and useful, despite the level of trust in the accuracy of the creatinine device being relatively low. Trust in the accuracy of the creatinine device appeared to be related to the level of variation in subsequent measurement results, with more variation being related to lower levels of trust. Protocol adherence was generally very high, although the range of adherence levels was large and increased over time. Conclusion Patients’ high levels of satisfaction suggest that at-home monitoring of creatinine and blood pressure after transplantation offers a promising strategy. Important prerequisites for safe implementation in transplant care seem to be support from physicians

  3. Earthquake at 40 feet

    USGS Publications Warehouse

    Miller, G. J.

    1976-01-01

    The earthquake that struck the island of Guam on November 1, 1975, at 11:17 a.m. had many unique aspects, not the least of which was the experience of an earthquake of 6.25 Richter magnitude while at 40 feet. My wife Bonnie, a fellow diver, Greg Guzman, and I were diving at Gabgab Beach in the outer harbor of Apra Harbor, engaged in underwater photography when the earthquake struck.

  4. Business Activity Monitoring: Real-Time Group Goals and Feedback Using an Overhead Scoreboard in a Distribution Center

    ERIC Educational Resources Information Center

    Goomas, David T.; Smith, Stuart M.; Ludwig, Timothy D.

    2011-01-01

    Companies operating large industrial settings often find delivering timely and accurate feedback to employees to be one of the toughest challenges they face in implementing performance management programs. In this report, an overhead scoreboard at a retailer's distribution center informed teams of order selectors as to how many tasks were…

  5. Where was the 1898 Mare Island Earthquake? Insights from the 2014 South Napa Earthquake

    NASA Astrophysics Data System (ADS)

    Hough, S. E.

    2014-12-01

    The 2014 South Napa earthquake provides an opportunity to reconsider the Mare Island earthquake of 31 March 1898, which caused severe damage to buildings at a Navy yard on the island. Revisiting archival accounts of the 1898 earthquake, I estimate a lower intensity magnitude, 5.8, than the value in the current Uniform California Earthquake Rupture Forecast (UCERF) catalog (6.4). However, I note that intensity magnitude can differ from Mw by upwards of half a unit depending on stress drop, which for a historical earthquake is unknowable. In the aftermath of the 2014 earthquake, there has been speculation that the apparently severe effects on Mare Island in 1898 were due to the vulnerability of local structures. No surface rupture has ever been identified from the 1898 event, which is commonly associated with the Hayward-Rodgers Creek fault system, some 10 km west of Mare Island (e.g., Parsons et al., 2003). Reconsideration of detailed archival accounts of the 1898 earthquake, together with a comparison of the intensity distributions for the two earthquakes, points to genuinely severe, likely near-field ground motions on Mare Island. The 2014 earthquake did cause significant damage to older brick buildings on Mare Island, but the level of damage does not match the severity of documented damage in 1898. The high-intensity fields for the two earthquakes are, moreover, spatially shifted, with the centroid of the 2014 distribution near the town of Napa and that of the 1898 distribution near Mare Island, east of the Hayward-Rodgers Creek system. I conclude that the 1898 Mare Island earthquake was centered on or near Mare Island, possibly involving rupture of one or both strands of the Franklin fault, a low-slip-rate fault sub-parallel to the Rodgers Creek fault to the west and the West Napa fault to the east. I estimate Mw 5.8 assuming an average stress drop; data are also consistent with Mw 6.4 if stress drop was a factor of ≈3 lower than average for California earthquakes. I

  6. Earthquakes, November-December 1973

    USGS Publications Warehouse

    Person, W.J.

    1974-01-01

    Other parts of the world suffered fatalities and significant damage from earthquakes. In Iran, an earthquake killed one person, injured many, and destroyed a number of homes. Earthquake fatalities also occurred in the Azores and in Algeria. 

  7. Are Earthquake Magnitudes Clustered?

    SciTech Connect

    Davidsen, Joern; Green, Adam

    2011-03-11

    The question of earthquake predictability is a long-standing and important challenge. Recent results [Phys. Rev. Lett. 98, 098501 (2007); ibid.100, 038501 (2008)] have suggested that earthquake magnitudes are clustered, thus indicating that they are not independent in contrast to what is typically assumed. Here, we present evidence that the observed magnitude correlations are to a large extent, if not entirely, an artifact due to the incompleteness of earthquake catalogs and the well-known modified Omori law. The latter leads to variations in the frequency-magnitude distribution if the distribution is constrained to those earthquakes that are close in space and time to the directly following event.

  8. Earthquakes and Plate Boundaries

    ERIC Educational Resources Information Center

    Lowman, Paul; And Others

    1978-01-01

    Contains the contents of the Student Investigation booklet of a Crustal Evolution Education Project (CEEP) instructional modules on earthquakes. Includes objectives, procedures, illustrations, worksheets, and summary questions. (MA)

  9. First Results of the Regional Earthquake Likelihood Models Experiment

    USGS Publications Warehouse

    Schorlemmer, D.; Zechar, J.D.; Werner, M.J.; Field, E.H.; Jackson, D.D.; Jordan, T.H.

    2010-01-01

    The ability to successfully predict the future behavior of a system is a strong indication that the system is well understood. Certainly many details of the earthquake system remain obscure, but several hypotheses related to earthquake occurrence and seismic hazard have been proffered, and predicting earthquake behavior is a worthy goal that is demanded by society. Along these lines, one of the primary objectives of the Regional Earthquake Likelihood Models (RELM) working group was to formalize earthquake occurrence hypotheses in the form of prospective earthquake rate forecasts in California. RELM members, working in small research groups, developed more than a dozen 5-year forecasts; they also outlined a performance evaluation method and provided a conceptual description of a Testing Center in which to perform predictability experiments. Subsequently, researchers working within the Collaboratory for the Study of Earthquake Predictability (CSEP) have begun implementing Testing Centers in different locations worldwide, and the RELM predictability experiment, a truly prospective earthquake prediction effort, is underway within the U.S. branch of CSEP. The experiment, designed to compare time-invariant 5-year earthquake rate forecasts, is now approximately halfway to its completion. In this paper, we describe the models under evaluation and present, for the first time, preliminary results of this unique experiment. While these results are preliminary (the forecasts were meant for an application of 5 years), we find interesting results: most of the models are consistent with the observations, and one model forecasts the distribution of earthquakes best. We discuss the observed sample of target earthquakes in the context of historical seismicity within the testing region, highlight potential pitfalls of the current tests, and suggest plans for future revisions to experiments such as this one. © 2010 The Author(s).

  10. AMBIENT AIR MONITORING AT GROUND ZERO AND LOWER MANHATTAN FOLLOWING THE COLLAPSE OF THE WORLD TRADE CENTER

    EPA Science Inventory

    The U.S. EPA National Exposure Research Laboratory (NERL) collaborated with EPA's Regional offices to establish a monitoring network to characterize ambient air concentrations of particulate matter (PM) and air toxics in lower Manhattan following the collapse of the World Trade...

  11. Electric field and ion density anomalies in the mid latitude ionosphere: Possible connection with earthquakes?

    NASA Astrophysics Data System (ADS)

    Gousheva, M. N.; Glavcheva, R. P.; Danov, D. L.; Hristov, P. L.; Kirov, B. B.; Georgieva, K. Y.

    2008-07-01

    The problem of earthquake prediction has stimulated the search for a correlation between seismic activity and ionospheric anomalies. We found observational evidence of possible earthquake effects in the near-equatorial and low latitude ionosphere; such ionospheric anomalies have previously been reported by Gousheva et al. [Gousheva, M., Glavcheva, R., Danov, D., Angelov P., Hristov, P., Influence of earthquakes on the electric field disturbances in the ionosphere on board of the Intercosmos-Bulgaria-1300 satellite. Compt. Rend. Acad. Bulg. Sci. 58 (8) 911-916, 2005a; Gousheva, M., Glavcheva, R., Danov, D., Angelov, P., Hristov, P., Kirov, B., Georgieva, K., Observation from the Intercosmos-Bulgaria-1300 satellite of anomalies associated with seismic activity. In: Poster Proceeding of 2nd International Conference on Recent Advances in Space Technologies: Space in the Service of Society, RAST '2005, June 9-11, Istanbul, Turkey, pp. 119-123, 2005b; Gousheva, M., Glavcheva, R., Danov, D., Angelov, P., Hristov, P., Kirov, B., Georgieva, K., Satellite monitoring of anomalous effects in the ionosphere probably related to strong earthquakes. Adv. Space Res. 37 (4), 660-665, 2006]. This paper presents new results from observations of the quasi-static electric field and ion density on board the INTERCOSMOS-BULGARIA-1300 satellite in the mid-latitude ionosphere above sources of moderate earthquakes. Data from the INTERCOSMOS-BULGARIA-1300 satellite and seismic data (World Data Center, Denver, Colorado, USA) for magnetically quiet and medium quiet days are juxtaposed in the time-space domain. For satellite orbits in the period 15.09-01.10.1981, an increase in the horizontal and vertical components of the quasi-static electric field and fluctuations of the ion density are observed over zones of forthcoming seismic events. Some similar post-seismic effects are also observed. The emphasis of this paper is on the anomalies that are specific to the mid-latitude ionosphere. The obtained results contain

  12. Waveform Cross-Correlation for Improved North Texas Earthquake Locations

    NASA Astrophysics Data System (ADS)

    Phillips, M.; DeShon, H. R.; Oldham, H. R.; Hayward, C.

    2014-12-01

    In November 2013, a sequence of earthquakes began in Reno and Azle, TX, two communities located northwest of Fort Worth in an area of active oil and gas extraction. Only one felt earthquake had been reported within the area before the occurrence of probable injection-induced earthquakes at the Dallas-Fort Worth airport in 2008. The USGS National Earthquake Information Center (NEIC) has reported 27 felt earthquakes in the Reno-Azle area through January 28, 2014. A temporary seismic network was installed beginning in December 2013 to acquire data to improve location and magnitude estimates and characterize the earthquake sequence. Here, we present high-resolution relative earthquake locations derived using differential time data from waveform cross-correlation. Cross-correlation is computed using the GISMO software suite, and event relocation is done using double-difference relocation techniques. Waveform cross-correlation of the local data indicates high (>70%) similarity between 4 major swarms of events lasting between 18 and 24 hours. These swarms are temporal zones of high event frequency; 1.4% of the time series data accounts for 42.1% of the identified local earthquakes. Local earthquakes are occurring along the Newark East Fault System, a NE-SW-striking normal fault system previously thought to be inactive, at depths between 2 and 8 km in the Ellenburger limestone formation and the underlying Precambrian basement. Data analysis is ongoing, and continued characterization of the associated fault will provide improved location estimates.
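
    As a sketch of the differential-time measurement that underlies this kind of relocation (a generic normalized cross-correlation in Python/NumPy, not the GISMO or double-difference codes used in the study; equal-length, detrended windows are assumed):

        import numpy as np

        def cc_differential_time(trace_a, trace_b, fs):
            # Lag of the peak normalized cross-correlation between two event
            # waveforms recorded at the same station; a positive lag means
            # trace_a arrives later than trace_b.
            a = np.asarray(trace_a, dtype=float)
            b = np.asarray(trace_b, dtype=float)
            a = (a - a.mean()) / a.std()
            b = (b - b.mean()) / b.std()
            cc = np.correlate(a, b, mode="full") / len(a)
            lag = int(np.argmax(cc)) - (len(b) - 1)
            return lag / fs, float(cc.max())   # (seconds, peak correlation)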

  13. Catalog of Earthquake Hypocenters at Alaskan Volcanoes: January 1 through December 31, 2006

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.; Searcy, Cheryl

    2008-01-01

    Between January 1 and December 31, 2006, AVO located 8,666 earthquakes of which 7,783 occurred on or near the 33 volcanoes monitored within Alaska. Monitoring highlights in 2006 include: an eruption of Augustine Volcano, a volcanic-tectonic earthquake swarm at Mount Martin, elevated seismicity and volcanic unrest at Fourpeaked Mountain, and elevated seismicity and low-level tremor at Mount Veniaminof and Korovin Volcano. A new seismic subnetwork was installed on Fourpeaked Mountain. This catalog includes: (1) descriptions and locations of seismic instrumentation deployed in the field during 2006, (2) a description of earthquake detection, recording, analysis, and data archival systems, (3) a description of seismic velocity models used for earthquake locations, (4) a summary of earthquakes located in 2006, and (5) an accompanying UNIX tar-file with a summary of earthquake origin times, hypocenters, magnitudes, phase arrival times, location quality statistics, daily station usage statistics, and all files used to determine the earthquake locations in 2006.

  15. OMG Earthquake! Can Twitter improve earthquake response?

    NASA Astrophysics Data System (ADS)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 and 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information
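
    A toy version of the keyword-rate detection described above, in Python (illustrative only; it assumes tweets have already been gathered as (utc_time, text, lat, lon) tuples with datetime timestamps, and the threshold value is a placeholder, not the USGS system's setting):

        from collections import Counter

        def tweet_rate_alarms(tweets, keyword="earthquake", per_minute_threshold=50):
            # Count keyword-bearing tweets per minute and flag minutes whose
            # count far exceeds the background rate (reported above as less
            # than one per hour).
            counts = Counter()
            for utc_time, text, lat, lon in tweets:
                if keyword.lower() in text.lower():
                    counts[utc_time.replace(second=0, microsecond=0)] += 1
            return [(minute, n) for minute, n in sorted(counts.items())
                    if n >= per_minute_threshold]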

  17. Development of Diagnostic Reference Levels Using a Real-Time Radiation Dose Monitoring System at a Cardiovascular Center in Korea.

    PubMed

    Kim, Jungsu; Seo, Deoknam; Choi, Inseok; Nam, Sora; Yoon, Yongsu; Kim, Hyunji; Her, Jae; Han, Seonggyu; Kwon, Soonmu; Park, Hunsik; Yang, Dongheon; Kim, Jungmin

    2015-12-01

    Digital cardiovascular angiography accounts for a major portion of the radiation dose among the examinations performed at cardiovascular centres. However, dose-related information is neither monitored nor recorded systematically. This report concerns the construction of a radiation dose monitoring system based on digital imaging and communications in medicine (DICOM) data and its use at the cardiovascular centre of the University Hospitals in Korea. The dose information was analysed according to DICOM standards for a series of procedures, and the formulation of diagnostic reference levels (DRLs) at our cardiovascular centre represents the first of its kind in Korea. We determined a dose area product (DAP) DRL for coronary angiography of 75.6 Gy cm² and a fluoroscopic time DRL of 318.0 s. The DAP DRL for percutaneous transluminal coronary intervention was 213.3 Gy cm², and the DRL for fluoroscopic time was 1207.5 s. PMID:25700616
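
    Diagnostic reference levels are conventionally set at the 75th percentile of the observed dose distribution for a procedure; whether this study used exactly that convention is not stated in the abstract, so the following one-liner in Python is a generic illustration, not the authors' method:

        import numpy as np

        def diagnostic_reference_level(dap_values_gy_cm2):
            # Conventional DRL choice: 75th percentile of observed dose-area products
            return float(np.percentile(np.asarray(dap_values_gy_cm2, dtype=float), 75))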

  18. Center of Excellence for Applied Mathematical and Statistical Research in support of development of multicrop production monitoring capability

    NASA Technical Reports Server (NTRS)

    Woodward, W. A.; Gray, H. L.

    1983-01-01

    Efforts in support of the development of multicrop production monitoring capability are reported. In particular, segment level proportion estimation techniques based upon a mixture model were investigated. Efforts have dealt primarily with evaluation of current techniques and development of alternative ones. A comparison of techniques is provided on both simulated and LANDSAT data along with an analysis of the quality of profile variables obtained from LANDSAT data.

  19. Earthquakes and Schools

    ERIC Educational Resources Information Center

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  20. Earthquake research in China

    USGS Publications Warehouse

    Raleigh, B.

    1977-01-01

    The prediction of the Haicheng earthquake was an extraordinary achievement by the geophysical workers of the People's Republic of China, whose national program in earthquake research was less than 10 years old at the time. To study the background to this prediction, a delegation of 10 U.S. scientists, which I led, visited China in June 1976.

  1. Can we control earthquakes?

    USGS Publications Warehouse

    Raleigh, B.

    1977-01-01

    In 1966, it was discovered that high pressure injection of industrial waste fluids into the subsurface near Denver, Colo., was triggering earthquakes. While this was disturbing at the time, it was also exciting because there was immediate speculation that here at last was a mechanism to control earthquakes.  

  2. a Real-Time Earthquake Moment Tensor Scanning Code for the Antelope System (brtt, Inc)

    NASA Astrophysics Data System (ADS)

    Macpherson, K. A.; Ruppert, N. A.; Freymueller, J. T.; Lindquist, K.; Harvey, D.; Dreger, D. S.; Lombard, P. N.; Guilhem, A.

    2015-12-01

    While all seismic observatories routinely determine hypocentral location and local magnitude within a few minutes of an earthquake's occurrence, the ability to estimate seismic moment and sense of slip in a similar time frame is less widespread. This is unfortunate, because moment and mechanism are critical parameters for rapid hazard assessment; for larger events, moment magnitude is more reliable due to the tendency of local magnitude to saturate, and certain mechanisms such as off-shore thrust events might indicate earthquakes with tsunamigenic potential. In order to increase access to this capability, we have developed a continuous moment tensor scanning code for Antelope, the ubiquitous open-architecture seismic acquisition and processing software in use around the world. The scanning code, which uses an algorithm that has previously been employed for real-time monitoring at the University of California, Berkeley, is able to produce full moment tensor solutions for moderate events from regional seismic data. The algorithm monitors a grid of potential sources by continuously cross-correlating pre-computed synthetic seismograms with long-period recordings from a sparse network of broad-band stations. The code package consists of 3 modules. One module is used to create a monitoring grid by constructing source-receiver geometry, calling a frequency-wavenumber code to produce synthetics, and computing the generalized linear inverse of the array of synthetics. There is a real-time scanning module that correlates streaming data with pre-inverted synthetics, monitors the variance reduction, and writes the moment tensor solution to a database if an earthquake detection occurs. Finally, there is an 'off-line' module that is very similar to the real-time scanner, with the exception that it utilizes pre-recorded data stored in Antelope databases and is useful for testing purposes or for quickly producing moment tensor catalogs for long time series. The code is open source
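
    A schematic of the pre-inverted-synthetics scanning step, in Python/NumPy (a sketch of the general approach described above, not the brtt/Antelope or Berkeley code; G is assumed to hold the elementary moment tensor synthetics for one grid source, concatenated over stations and components, and the trigger level is a placeholder):

        import numpy as np

        def precompute_inverse(G):
            # Generalized (Moore-Penrose) inverse of the synthetics matrix,
            # computed once per grid source before real-time scanning begins.
            return np.linalg.pinv(G)

        def scan_window(G, G_pinv, d, vr_trigger=0.6):
            # Least-squares moment tensor for the current long-period data
            # window d, plus its variance reduction; a detection is declared
            # when the variance reduction exceeds the trigger level.
            m = G_pinv @ d
            resid = d - G @ m
            vr = 1.0 - float(resid @ resid) / float(d @ d)
            return m, vr, vr >= vr_trigger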

  3. Earthquake-explosion discrimination using diffusion maps

    NASA Astrophysics Data System (ADS)

    Rabin, N.; Bregman, Y.; Lindenbaum, O.; Ben-Horin, Y.; Averbuch, A.

    2016-09-01

    Discrimination between earthquakes and explosions is an essential component of nuclear test monitoring, and it is also important for maintaining the quality of earthquake catalogs. Currently used discrimination methods provide a partial solution to the problem. In this work, we apply advanced machine learning methods, in particular diffusion maps, for modeling and discriminating between seismic signals. Diffusion maps enable us to construct a geometric representation that captures the intrinsic structure of the seismograms. The diffusion maps are applied after a pre-processing step in which seismograms are converted to normalized sonograms. The constructed low-dimensional model is used for automatic earthquake-explosion discrimination of data collected at single seismic stations. We demonstrate our approach on a data set comprising seismic events from the Dead Sea area. The diffusion-based algorithm provides a correct discrimination rate higher than 90%.
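
    A bare-bones diffusion-map embedding in Python/NumPy (a generic sketch of the technique, not the authors' pipeline; X is assumed to hold one flattened, normalized sonogram per row, and the kernel-scale heuristic is an assumption):

        import numpy as np

        def diffusion_map(X, n_coords=2, t=1, eps=None):
            # Gaussian affinities -> row-stochastic Markov matrix -> leading
            # nontrivial eigenvectors scaled by their eigenvalues.
            D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
            if eps is None:
                eps = np.median(D2)              # common kernel-scale heuristic
            K = np.exp(-D2 / eps)
            P = K / K.sum(axis=1, keepdims=True)
            vals, vecs = np.linalg.eig(P)
            order = np.argsort(-vals.real)
            vals, vecs = vals.real[order], vecs.real[:, order]
            return vecs[:, 1:n_coords + 1] * vals[1:n_coords + 1] ** t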

  4. 2003-2004 Campaign GPS Geodetic Monitoring of Surface Deformation Proximal to Volcanic Centers, Commonwealth of Dominica, Lesser Antilles.

    NASA Astrophysics Data System (ADS)

    Davidson, R. T.; Turner, H. L.; Blessing, B. C.; Parra, J.; Fitzgibbon, K.; Jansma, P.; Mattioli, G.

    2004-12-01

    The Commonwealth of Dominica, located midway along the Lesser Antilles island arc, is home to several (at least eight) potentially active volcanic centers. Spurred by recent seismic crises on the island - in the south from 1998-2000 and in the north in 2003 - twelve GPS monuments were installed in two field campaigns in 2001 and 2003. All twelve sites, along with five of six newly installed sites, were occupied continuously for ~2.5 or more UTC days in 2004 using Ashtech Z-12 dual-frequency, code-phase receivers and choke ring antennas to assess the highly complex and possibly interconnected volcanic systems of Dominica. We examine data from the 2003-2004 epochs because of the highly variable, shallow seismicity preceding this period. In this way, the changes that occurred can potentially be isolated without earlier observations influencing the results. Although only two epochs have been included, data quality and reliability can be established from sites distant from volcanic centers, as such sites show consistent velocities from all three epochs of observation over the 2001-2004 period. Between 2003 and 2004, multiple sites show velocities that are inconsistent with a simple tectonic interpretation of elastic strain accumulation along the plate interface. Sites located in the vicinity of the volcanic centers in the south-central part of the island are moving faster than the three-epoch 2001-2004 average velocity, which is approximately 7 mm/year. The four sites at which greater movement has been noted have velocities ranging from approximately 10 to 27 mm/year. We note that the largest surface deformation signal is seen in the south during the same period when the shallow seismicity was at a maximum in the north of the island. While the spatial distribution of sites remains sparse and the velocities relatively imprecise, the preliminary results may indicate shallow magmatic emplacement, geothermal fluctuations, or structural instability in that part

  5. Estimating Temperature Retrieval Accuracy Associated With Thermal Band Spatial Resolution Requirements for Center Pivot Irrigation Monitoring and Management

    NASA Technical Reports Server (NTRS)

    Ryan, Robert E.; Irons, James; Spruce, Joseph P.; Underwood, Lauren W.; Pagnutti, Mary

    2006-01-01

    This study explores the use of synthetic thermal center pivot irrigation scenes to estimate temperature retrieval accuracy for thermal remote sensed data, such as data acquired from current and proposed Landsat-like thermal systems. Center pivot irrigation is a common practice in the western United States and in other parts of the world where water resources are scarce. Wide-area ET (evapotranspiration) estimates and reliable water management decisions depend on accurate temperature retrieval from remotely sensed data. Spatial resolution, sensor noise, and the temperature step between a field and its surrounding area impose limits on the ability to retrieve temperature information. Spatial resolution is an interrelationship between GSD (ground sample distance) and a measure of image sharpness, such as edge response or edge slope. Edge response and edge slope are intuitive, direct measures of spatial resolution that are easier to visualize and estimate than the more common Modulation Transfer Function or Point Spread Function. For these reasons, recent data specifications, such as those for the LDCM (Landsat Data Continuity Mission), have used GSD and edge response to specify spatial resolution. For this study, we have defined a 400-800 m diameter center pivot irrigation area with a large 25 K temperature step associated with a 300 K well-watered field surrounded by an infinite 325 K dry area. In this context, we defined the benchmark problem as an easily modeled, highly common stressing case. By parametrically varying GSD (30-240 m) and edge slope, we determined the number of pixels and the field area fraction that meet a given temperature accuracy estimate for 400-m, 600-m, and 800-m diameter field sizes. Results of this project will help assess the utility of proposed specifications for the LDCM and other future thermal remote sensing missions and for water resource management.
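
    A sketch of the kind of synthetic-scene experiment described above, in Python (the parameter values and the Gaussian point spread function are assumptions for illustration, not the study's exact model or tolerances):

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def usable_pixel_fraction(field_diam_m=600.0, t_field=300.0, t_bg=325.0,
                                  gsd_m=60.0, psf_sigma_m=40.0, tol_k=1.0, fine_m=5.0):
            # Circular "well-watered" field in a dry background, blurred by a
            # Gaussian PSF and sampled at the ground sample distance; returns
            # the fraction of in-field samples within tol_k of the true field
            # temperature.
            half = int(1.5 * field_diam_m / fine_m)
            y, x = np.mgrid[-half:half, -half:half] * fine_m
            scene = np.where(x**2 + y**2 <= (field_diam_m / 2.0)**2, t_field, t_bg)
            blurred = gaussian_filter(scene, sigma=psf_sigma_m / fine_m)
            step = max(1, int(round(gsd_m / fine_m)))
            samples = blurred[::step, ::step]
            in_field = (x[::step, ::step]**2 + y[::step, ::step]**2) <= (field_diam_m / 2.0)**2
            good = np.abs(samples[in_field] - t_field) <= tol_k
            return float(good.mean()) if good.size else 0.0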

  6. An assessment of seismic monitoring in the United States; requirement for an Advanced National Seismic System

    USGS Publications Warehouse

    ,

    1999-01-01

    This report assesses the status, needs, and associated costs of seismic monitoring in the United States. It sets down the requirement for an effective, national seismic monitoring strategy and an advanced system linking national, regional, and urban monitoring networks. Modernized seismic monitoring can provide alerts of imminent strong earthquake shaking; rapid assessment of distribution and severity of earthquake shaking (for use in emergency response); warnings of a possible tsunami from an offshore earthquake; warnings of volcanic eruptions; information for correctly characterizing earthquake hazards and for improving building codes; and data on response of buildings and structures during earthquakes, for safe, cost-effective design, engineering, and construction practices in earthquake-prone regions.

  7. The magnitude distribution of earthquakes near Southern California faults

    USGS Publications Warehouse

    Page, M.T.; Alderson, D.; Doyle, J.

    2011-01-01

    We investigate seismicity near faults in the Southern California Earthquake Center Community Fault Model. We search for anomalously large events that might be signs of a characteristic earthquake distribution. We find that seismicity near major fault zones in Southern California is well modeled by a Gutenberg-Richter distribution, with no evidence of characteristic earthquakes within the resolution limits of the modern instrumental catalog. However, the b value of the locally observed magnitude distribution is found to depend on distance to the nearest mapped fault segment, which suggests that earthquakes nucleating near major faults are likely to have larger magnitudes relative to earthquakes nucleating far from major faults. Copyright 2011 by the American Geophysical Union.
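
    For reference, the Gutenberg-Richter relation is log10 N(>=M) = a - bM, and a standard maximum-likelihood estimate of the b value (Aki/Utsu, with a half-bin correction) can be written in a few lines of Python; this is a generic estimator shown for illustration, not the paper's full statistical treatment:

        import numpy as np

        def b_value_mle(magnitudes, m_c, dm=0.1):
            # Aki/Utsu maximum-likelihood b-value for events at or above the
            # completeness magnitude m_c, with Utsu's half-bin correction dm/2.
            m = np.asarray(magnitudes, dtype=float)
            m = m[m >= m_c]
            return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))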

  8. Radiation Environment at LEO in the frame of Space Monitoring Data Center at Moscow State University - recent, current and future missions

    NASA Astrophysics Data System (ADS)

    Myagkova, Irina; Kalegaev, Vladimir; Panasyuk, Mikhail; Svertilov, Sergey; Bogomolov, Vitaly; Bogomolov, Andrey; Barinova, Vera; Barinov, Oleg; Bobrovnikov, Sergey; Dolenko, Sergey; Mukhametdinova, Ludmila; Shiroky, Vladimir; Shugay, Julia

    2016-04-01

    The radiation environment of near-Earth space is one of the most important factors of space weather. The Space Monitoring Data Center of Moscow State University provides operational control of radiation conditions at Low Earth Orbit (LEO) using data from recent (Vernov, CORONAS series), current (Meteor-M, Electro-L series) and future (Lomonosov) space missions. The Internet portal of the Space Monitoring Data Center of the Skobeltsyn Institute of Nuclear Physics of Lomonosov Moscow State University (SINP MSU), http://swx.sinp.msu.ru/, provides the ability to monitor and analyze space radiation conditions in real time, together with geomagnetic and solar activity, including the hard X-ray and gamma-ray emission of solar flares. Operational data obtained from space missions at L1, GEO and LEO and from ground-based magnetic stations are used to represent the radiation and geomagnetic state of the near-Earth environment. Models of the space environment that use space measurements from different orbits have been created. Interactive analysis and operational neural network forecast services are based on these models. These systems can automatically generate alerts when particle fluxes rise above threshold values, both for SEP events and for relativistic electrons of the Earth's outer radiation belt, using data from GEO and LEO as input. As an example of LEO data, we consider data from the Vernov mission, which was launched into a solar-synchronous orbit (altitude 640-830 km, inclination 98.4°, orbital period about 100 min) on July 8, 2014 and began returning scientific data on July 20, 2014. The Vernov mission has provided studies of relativistic electron precipitation from the Earth's radiation belt and its possible connection with atmospheric transient luminous events, as well as measurements of solar hard X-ray and gamma-ray emission. Monitoring of the radiation and electromagnetic environment in near-Earth space, which is very important for space weather studies, was also realised

  9. Modeling earthquake dynamics

    NASA Astrophysics Data System (ADS)

    Charpentier, Arthur; Durand, Marilou

    2015-07-01

    In this paper, we investigate questions arising in Parsons and Geist (Bull Seismol Soc Am 102:1-11, 2012). Pseudo-causal models connecting magnitudes and waiting times are considered through generalized regression. We use conditional models (magnitude given previous waiting time, and conversely) as an extension of the joint distribution model described in Nikoloulopoulos and Karlis (Environmetrics 19:251-269, 2008). On the one hand, we fit a Pareto distribution for earthquake magnitudes, where the tail index is a function of the waiting time following the previous earthquake; on the other hand, waiting times are modeled using a Gamma or a Weibull distribution, where the parameters are functions of the magnitude of the previous earthquake. We use those two models, alternately, to generate the dynamics of earthquake occurrence and to estimate the probability of occurrence of several earthquakes within a year or a decade.
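
    A toy generator in the spirit of the conditional models described above, in Python (the link functions and parameter values below are invented for illustration; they are not the fitted models from the paper):

        import numpy as np

        def simulate_sequence(n_events, m_min=4.0, seed=0):
            # Magnitudes: Pareto with a tail index that depends on the previous
            # waiting time.  Waiting times: Gamma with parameters that depend
            # on the previous magnitude.
            rng = np.random.default_rng(seed)
            mags, waits = [], []
            m_prev, w_prev = m_min, 1.0
            for _ in range(n_events):
                alpha = 1.0 + 0.5 * np.log1p(w_prev)                 # hypothetical link
                m = m_min * (1.0 - rng.random()) ** (-1.0 / alpha)   # Pareto draw
                shape = 1.0 + 0.2 * (m_prev - m_min)                 # hypothetical links
                scale = 30.0 / (1.0 + m_prev - m_min)
                w = rng.gamma(shape, scale)                          # waiting time (days)
                mags.append(m)
                waits.append(w)
                m_prev, w_prev = m, w
            return np.array(mags), np.array(waits)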

  10. Demand surge following earthquakes

    USGS Publications Warehouse

    Olsen, Anna H.

    2012-01-01

    Demand surge is understood to be a socio-economic phenomenon where repair costs for the same damage are higher after large- versus small-scale natural disasters. It has reportedly increased monetary losses by 20 to 50%. In previous work, a model for the increased costs of reconstruction labor and materials was developed for hurricanes in the Southeast United States. The model showed that labor cost increases, rather than the material component, drove the total repair cost increases, and this finding could be extended to earthquakes. A study of past large-scale disasters suggested that there may be additional explanations for demand surge. Two such explanations specific to earthquakes are the exclusion of insurance coverage for earthquake damage and possible concurrent causation of damage from an earthquake followed by fire or tsunami. Additional research into these aspects might provide a better explanation for increased monetary losses after large- vs. small-scale earthquakes.

  11. A search for paleoliquefaction and evidence bearing on the recurrence behavior of the great 1811-12 New Madrid earthquakes

    USGS Publications Warehouse

    Wesnousky, S.G.; Leffler, L.M.

    1994-01-01

    In the winter of 1811-12, three of the largest historic earthquakes in the United States occurred near New Madrid, Missouri. Seismicity continues to the present day throughout a tightly clustered pattern of epicenters centered on the bootheel of Missouri, including parts of northeastern Arkansas, northwestern Tennessee, western Kentucky, and southern Illinois. In 1990, the New Madrid seismic zone/central United States became the first seismically active region east of the Rocky Mountains to be designated a priority research area within the National Earthquake Hazards Reduction Program (NEHRP). This professional paper is a collection of papers, some published separately, presenting results of the newly intensified research program in this area. Major components of this research program include tectonic framework studies, seismicity and deformation monitoring and modeling, improved seismic hazard and risk assessments, and cooperative hazard mitigation studies.

  12. Earthquake-induced water-level fluctuations at Yucca Mountain, Nevada, June 1992

    SciTech Connect

    O'Brien, G.M.

    1993-07-01

    This report presents earthquake-induced water-level and fluid-pressure data for wells in the Yucca Mountain area, Nevada, during June 1992. Three earthquakes occurred that caused significant water-level and fluid-pressure responses in wells. Wells USW H-5 and USW H-6 are continuously monitored to detect short-term responses caused by earthquakes. Two wells, monitored hourly, had significant, longer-term responses in water level following the earthquakes. On June 28, 1992, a 7.5-magnitude earthquake occurred near Landers, California, causing an estimated maximum water-level change of 90 centimeters in well USW H-5. Three hours later a 6.6-magnitude earthquake occurred near Big Bear Lake, California; the maximum water-level fluctuation was 20 centimeters in well USW H-5. A 5.6-magnitude earthquake occurred at Little Skull Mountain, Nevada, on June 29, approximately 23 kilometers from Yucca Mountain. The maximum estimated short-term water-level fluctuation from the Little Skull Mountain earthquake was 40 centimeters in well USW H-5. The water level in well UE-25p #1, monitored hourly, decreased approximately 50 centimeters over 3 days following the Little Skull Mountain earthquake. The water level in UE-25p #1 returned to pre-earthquake levels in approximately 6 months. The water level in the lower interval of well USW H-3 increased 28 centimeters following the Little Skull Mountain earthquake. The Landers and Little Skull Mountain earthquakes caused responses in 17 intervals of 14 hourly monitored wells; however, most responses were small and of short duration. For several days following the major earthquakes, many smaller-magnitude aftershocks occurred, causing measurable responses in the continuously monitored wells.

  13. Crowd-Sourced Global Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Minson, S. E.; Brooks, B. A.; Glennie, C. L.; Murray, J. R.; Langbein, J. O.; Owen, S. E.; Iannucci, B. A.; Hauser, D. L.

    2014-12-01

    Although earthquake early warning (EEW) has shown great promise for reducing loss of life and property, it has only been implemented in a few regions due, in part, to the prohibitive cost of building the required dense seismic and geodetic networks. However, many cars and consumer smartphones, tablets, laptops, and similar devices contain low-cost versions of the same sensors used for earthquake monitoring. If a workable EEW system could be implemented based on either crowd-sourced observations from consumer devices or very inexpensive networks of instruments built from consumer-quality sensors, EEW coverage could potentially be expanded worldwide. Controlled tests of several accelerometers and global navigation satellite system (GNSS) receivers typically found in consumer devices show that, while they are significantly noisier than scientific-grade instruments, they are still accurate enough to capture displacements from moderate and large magnitude earthquakes. The accuracy of these sensors varies greatly depending on the type of data collected. Raw coarse acquisition (C/A) code GPS data are relatively noisy. These observations have a surface displacement detection threshold approaching ~1 m and would thus only be useful in large Mw 8+ earthquakes. However, incorporating either satellite-based differential corrections or using a Kalman filter to combine the raw GNSS data with low-cost acceleration data (such as from a smartphone) decreases the noise dramatically. These approaches allow detection thresholds as low as 5 cm, potentially enabling accurate warnings for earthquakes as small as Mw 6.5. Simulated performance tests show that, with data contributed from only a very small fraction of the population, a crowd-sourced EEW system would be capable of warning San Francisco and San Jose of a Mw 7 rupture on California's Hayward fault and could have accurately issued both earthquake and tsunami warnings for the 2011 Mw 9 Tohoku-oki, Japan earthquake.
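
    The sensor-fusion step described above, combining raw GNSS displacements with low-cost acceleration data, can be illustrated with a simple Kalman filter in which the accelerometer drives the process model and GNSS displacements provide measurement updates. The noise levels, sampling rates and toy data below are assumptions for illustration, not calibrated values for any real consumer device or for the authors' system.

```python
# Minimal Kalman-filter sketch: consumer-grade accelerometer data act as the
# process model and noisy GNSS displacements provide measurement updates.

import numpy as np

def fuse(acc, gnss, dt, sigma_acc=0.1, sigma_gnss=0.5):
    """Return fused displacement estimates, one per accelerometer sample.

    acc  : acceleration samples (m/s^2), high rate
    gnss : displacement samples (m) at the same epochs, NaN where unavailable
    dt   : sample interval (s)
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity state transition
    B = np.array([0.5 * dt**2, dt])         # how acceleration enters the state
    H = np.array([[1.0, 0.0]])              # GNSS observes displacement only
    Q = np.outer(B, B) * sigma_acc**2       # process noise from accelerometer noise
    R = np.array([[sigma_gnss**2]])

    x = np.zeros(2)                         # state: [displacement, velocity]
    P = np.eye(2)
    out = []
    for a, d in zip(acc, gnss):
        # Predict using the accelerometer as a control input.
        x = F @ x + B * a
        P = F @ P @ F.T + Q
        # Update with a GNSS displacement when one is available.
        if not np.isnan(d):
            y = d - H @ x
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ y
            P = (np.eye(2) - K @ H) @ P
        out.append(x[0])
    return np.array(out)

# Toy example: a 10 cm displacement step seen through noisy sensors, 1 Hz GNSS.
dt, n = 0.01, 2000
true_disp = np.where(np.arange(n) * dt > 10.0, 0.10, 0.0)
acc = np.gradient(np.gradient(true_disp, dt), dt) + np.random.normal(0, 0.1, n)
gnss = np.where(np.arange(n) % 100 == 0,
                true_disp + np.random.normal(0, 0.5, n), np.nan)
fused = fuse(acc, gnss, dt)
print("final fused displacement (m):", round(fused[-1], 3))
```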

  14. Mono Lake earthquake of October 23, 1990

    SciTech Connect

    McNutt, S.; Bryant, W.; Wilson, R.

    1991-02-01

    On October 23, 1990, a moderate earthquake of local magnitude (ML) 5.7 shook the Mono Lake area, a region known for its recent volcanic and tectonic activity. The earthquake was centered approximately 5 miles north of Lee Vining and 16 miles southeast of Bridgeport, near Black Point, an isolated flat-topped hill on the north shore of Mono Lake. Shaking from the earthquake was felt at approximately Modified Mercalli Intensity VI in the local area and weakly throughout much of north central California, as far west as Sacramento and the San Francisco Bay area. This article summarizes the seismological features of the earthquake and relates the findings made during a surface fault rupture investigation of the epicentral area by Division of Mines and Geology (DMG) geologists. To demonstrate how this earthquake fits into the regional tectonic setting, the character of this event is compared to that of other noteworthy seismic events that have occurred over the last 12 years.

  15. Converter Compressor Building, SWMU 089, Hot Spot Areas 1, 2, and 5 Operations, Maintenance, and Monitoring Report, Kennedy Space Center, Florida

    NASA Technical Reports Server (NTRS)

    Wilson, Deborah M.

    2015-01-01

    This Operations, Maintenance, and Monitoring Report (OMMR) presents the findings, observations, and results from operation of the air sparging (AS) interim measure (IM) for Hot Spot (HS) Areas 1, 2, and 5 at the Converter Compressor Building (CCB) located at Kennedy Space Center (KSC), Florida. The objective of the IM at CCB HS Areas 1, 2, and 5 is to decrease concentrations of volatile organic compounds (VOCs) in groundwater in the treatment zones via AS to levels that will enable a transition to a monitored natural attenuation (MNA) phase. This OMMR presents system operations and maintenance (O&M) information and performance monitoring results since full-scale O&M began in June 2014 (2 months after initial system startup in April 2014), including quarterly performance monitoring events in July and October 2014 and January and May 2015. Based on the results to date, the AS system is operating as designed and is meeting the performance criteria and IM objective. The performance monitoring network is adequately constructed for assessment of IM performance at CCB HS Areas 1, 2, and 5. At the March 2014 KSC Remediation Team (KSCRT) Meeting, team consensus was reached on the design prepared for expansion of the system to treat the HS 4 area, and at the November 2014 KSCRT Meeting, team consensus was reached that HS 3 was adequately delineated horizontally and vertically and that AS should be selected as the remedial approach for HS 3. At the July 2015 KSCRT Meeting, team consensus was reached to continue IM operations in all zones until the HS 3 and 4 systems are operational and, once the HS 3 and 4 zones are operational, to discontinue operations in the HS 1, 2, and 5 zones where concentrations are less than GCTLs, in order to observe whether rebound occurs. Team consensus was also reached to continue quarterly performance monitoring to determine whether operational zones achieve GCTLs and to continue annual IGWM of CCB-MW0012, CCB-MW0013, and CCB-MW0056, located south of the treatment area. The

  16. Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake

    NASA Astrophysics Data System (ADS)

    Durukal, E.; Sesetyan, K.; Erdik, M.

    2009-04-01

    The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake with an annual probability of occurrence of about 2%. This paper dwells on the expected building losses in terms of probable maximum and average annualized losses and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The TCIP system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and with approximately one-half of all policies in highly earthquake-prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, an inadequate premium structure and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering the building losses incurred in Istanbul in the event of a large earthquake. The annualized earthquake losses in Istanbul are between 140-300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses that need to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification to the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, purchase of larger re-insurance covers and development of a claim processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. Under such a model, losses would not be indemnified but would instead be calculated directly on the basis of indexed ground-motion levels and damage. The immediate improvement of a parametric insurance model over the existing one will be the elimination of the claim processing

  17. The USGS Earthquake Notification Service (ENS): Customizable notifications of earthquakes around the globe

    USGS Publications Warehouse

    Wald, Lisa A.; Wald, David J.; Schwarz, Stan; Presgrave, Bruce; Earle, Paul; Martinez, Eric; Oppenheimer, David

    2008-01-01

    At the beginning of 2006, the U.S. Geological Survey (USGS) Earthquake Hazards Program (EHP) introduced a new automated Earthquake Notification Service (ENS) to take the place of the National Earthquake Information Center (NEIC) "Bigquake" system and the various other individual EHP e-mail list-servers for separate regions in the United States. These included northern California, southern California, and the central and eastern United States. ENS is a "one-stop shopping" system that allows Internet users to subscribe to flexible and customizable notifications for earthquakes anywhere in the world. The customization capability allows users to define the what (magnitude threshold), the when (day and night thresholds), and the where (specific regions) for their notifications. Customization is achieved by employing a per-user based request profile, allowing the notifications to be tailored for each individual's requirements. Such earthquake-parameter-specific custom delivery was not possible with simple e-mail list-servers. Now that event and user profiles are in a structured query language (SQL) database, additional flexibility is possible. At the time of this writing, ENS had more than 114,000 subscribers, with more than 200,000 separate user profiles. On a typical day, more than 188,000 messages get sent to a variety of widely distributed users for a wide range of earthquake locations and magnitudes. The purpose of this article is to describe how ENS works, highlight the features it offers, and summarize plans for future developments.
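
    A hypothetical sketch of the per-user request-profile matching that a service like ENS performs is shown below: each profile carries a magnitude threshold, separate day and night thresholds, and a region, and an incoming event is tested against every profile. The field names and data structures are illustrative assumptions, not the actual ENS schema.

```python
# Illustrative per-user profile matching: region, day/night magnitude
# thresholds, and an event tested against every stored profile.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class Profile:
    email: str
    region: tuple          # (lat_min, lat_max, lon_min, lon_max)
    day_mag: float         # magnitude threshold, 06:00-22:00 local
    night_mag: float       # magnitude threshold, 22:00-06:00 local

@dataclass
class Event:
    lat: float
    lon: float
    mag: float
    local_time: datetime

def should_notify(profile: Profile, event: Event) -> bool:
    lat_min, lat_max, lon_min, lon_max = profile.region
    in_region = lat_min <= event.lat <= lat_max and lon_min <= event.lon <= lon_max
    is_day = 6 <= event.local_time.hour < 22
    threshold = profile.day_mag if is_day else profile.night_mag
    return in_region and event.mag >= threshold

profiles = [Profile("user@example.org", (32.0, 42.0, -125.0, -114.0), 4.0, 5.5)]
quake = Event(37.8, -122.3, 5.0, datetime(2008, 1, 1, 14, 30))
print([p.email for p in profiles if should_notify(p, quake)])
```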

  18. Simulating Earthquakes for Science and Society: Earthquake Visualizations Ideal for use in Science Communication and Education

    NASA Astrophysics Data System (ADS)

    de Groot, R.

    2008-12-01

    The Southern California Earthquake Center (SCEC) has been developing groundbreaking computer modeling capabilities for studying earthquakes. These visualizations were initially shared within the scientific community but have recently gained visibility via television news coverage in Southern California. Computers have opened up a whole new world for scientists working with large data sets, and students can benefit from the same opportunities (Libarkin & Brick, 2002). For example, The Great Southern California ShakeOut was based on a potential magnitude 7.8 earthquake on the southern San Andreas fault. The visualization created for the ShakeOut was a key scientific and communication tool for the earthquake drill. This presentation will also feature SCEC Virtual Display of Objects visualization software developed by SCEC Undergraduate Studies in Earthquake Information Technology interns. According to Gordin and Pea (1995), theoretically visualization should make science accessible, provide means for authentic inquiry, and lay the groundwork to understand and critique scientific issues. This presentation will discuss how the new SCEC visualizations and other earthquake imagery achieve these results, how they fit within the context of major themes and study areas in science communication, and how the efficacy of these tools can be improved.

  19. Cooperative Monitoring Center Occasional Paper/16: The Potential of Technology for the Control of Small Weapons: Applications in Developing Countries

    SciTech Connect

    ALTMANN, JURGEN

    2000-07-01

    For improving the control of small arms, technology provides many possibilities. Present and future technical means are described in several areas. With the help of sensors deployed on the ground or on board aircraft, larger areas can be monitored. Using tags, seals, and locks, important objects and installations can be safeguarded better. With modern data processing and communication systems, more information can be available, and it can be more speedily processed. Together with navigation and transport equipment, action can be taken faster and at greater range. Particular considerations are presented for cargo control at roads, seaports, and airports, for monitoring designated lines, and for the control of legal arms. By starting at a modest level, costs can be kept low, which would aid developing countries. From the menu of technologies available, systems need to be designed for the intended application and with an understanding of the local conditions. It is recommended that states start with short-term steps, such as acquiring more and better radio transceivers, vehicles, small aircraft, and personal computers. For the medium term, states should begin with experiments and field testing of technologies such as tags, sensors, and digital communication equipment.

  20. Emergency response and medical rescue in the worst hit Mianyang areas after the Wenchuan earthquake.

    PubMed

    Lei, Bai Ling; Zhou, Yun; Zhu, Ying; Huang, Xuan Yin; Han, Si Run; Ma, Qiang; He, Jing; Li, Yong Qing

    2008-11-01

    The 12 May 2008 earthquake caused damage to 88% of the health systems in the worst-hit areas of Mianyang, with 326 casualties and a direct economic loss of RMB 3124 billion. Within 30 minutes of the earthquake, the Mianyang headquarters for earthquake disaster relief and the Mianyang public health headquarters for medical rescue and treatment were organized. Five medical teams were sent to Beichuan County, the worst-hit Mianyang area, four hours after the earthquake. A total of 22,947 wounded and sick people were delivered to local hospitals after simple triage and rapid treatment through three stations. By 30 June, the Mianyang medical organization had received 379,600 people and admitted 21,628 inpatients, including 2772 severely wounded (146 of whom had limbs amputated and 846 of whom died in hospital). Since 17 May, 3381 wounded had been transferred to 14 provincial and city-level hospitals across China. On 20 June, the Mianyang Rehabilitation Center for wounded and sick people was established and received 156 rehabilitation inpatients. Together with the medical team for psychological intervention, they provided psychological support for over 70,000 people. Within two hours of the earthquake, the Mianyang Organization for Health and Epidemic Control and Prevention launched the emergency response plan for major natural disasters. The organization sent emergency teams for disease prevention and control and completed disinfection and burial of corpses and disposal of carcasses, monitoring of water quality and epidemics, disinfection of environmental ruins, epidemic control in resettled areas, precautions against secondary disasters caused by the earthquake, and large-scale health education. The emergency command system for medical rescue and disease control and prevention in the Mianyang areas integrated resources, carried out unified command, and responded rapidly. Furthermore, the headquarters of medical relief co-ordinated and united the governmental and

  1. Changes in permeability caused by earthquakes

    NASA Astrophysics Data System (ADS)

    Manga, Michael; Wang, Chi-Yuen; Shi, Zheming

    2016-04-01

    Earthquakes induce a range of hydrological responses, including changes in streamflow and changes in the water level in wells. Here we show that many of these responses are caused by changes in permeability produced by the passage of seismic waves. First we analyze streams that were dry or nearly dry before the 2014 M6 Napa, California, earthquake but started to flow after the earthquake. We show that the new flows were meteoric in origin and originated in the nearby mountains. Responses are not correlated with the sign of static strains, implying that seismic waves liberated this water, presumably by changing permeability. We also analyze a large network of wells in China that responded to 4 large earthquakes. We monitor permeability changes through their effect on the water-level response to solid Earth tides. We find that when earthquakes produce sustained changes in water level, permeability also changes. Wells with water-level changes that last for only days show no evidence for changes in aquifer permeability.
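
    The tidal-response analysis mentioned above can be sketched as a least-squares fit of the amplitude and phase of the M2 tidal constituent in a well water-level record; changes in the fitted phase and amplitude through time are the quantities commonly tracked as a permeability proxy. The record below is synthetic, and the single-constituent fit is a simplification of what real data require (detrending, barometric correction, additional constituents).

```python
# Least-squares fit of the M2 tidal constituent in an hourly water-level record.
# Synthetic data; a simplified stand-in for full tidal-response analysis.

import numpy as np

M2_PERIOD_HOURS = 12.4206012  # principal lunar semidiurnal tide

def fit_m2(t_hours, water_level):
    """Fit A*cos(wt) + B*sin(wt) + C; return amplitude and phase (degrees)."""
    w = 2.0 * np.pi / M2_PERIOD_HOURS
    G = np.column_stack([np.cos(w * t_hours), np.sin(w * t_hours),
                         np.ones_like(t_hours)])
    (a, b, _), *_ = np.linalg.lstsq(G, water_level, rcond=None)
    return np.hypot(a, b), np.degrees(np.arctan2(b, a))

# Synthetic 30-day hourly record with a known M2 signal plus noise.
t = np.arange(0.0, 30 * 24.0, 1.0)
truth_amp, truth_phase = 0.02, 35.0     # metres, degrees
w = 2.0 * np.pi / M2_PERIOD_HOURS
level = truth_amp * np.cos(w * t - np.radians(truth_phase)) \
        + np.random.normal(0, 0.003, t.size)
amp, phase = fit_m2(t, level)
print(f"fitted M2 amplitude {amp:.3f} m, phase {phase:.1f} deg")
```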

  2. The mass balance of earthquakes and earthquake sequences

    NASA Astrophysics Data System (ADS)

    Marc, O.; Hovius, N.; Meunier, P.

    2016-04-01

    Large, compressional earthquakes cause surface uplift as well as widespread mass wasting. Knowledge of their trade-off is fragmentary. Combining a seismologically consistent model of earthquake-triggered landsliding and an analytical solution of coseismic surface displacement, we assess how the mass balance of single earthquakes and earthquake sequences depends on fault size and other geophysical parameters. We find that intermediate-size earthquakes (Mw 6-7.3) may cause more erosion than uplift, controlled primarily by seismic source depth and landscape steepness, and less so by fault dip and rake. Such earthquakes can limit topographic growth, but our model indicates that both smaller and larger earthquakes (Mw < 6, Mw > 7.3) systematically cause mountain building. Earthquake sequences with a Gutenberg-Richter distribution have a greater tendency to lead to predominant erosion than repeating earthquakes of the same magnitude, unless a fault can produce earthquakes of Mw > 8.
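
    The contrast between Gutenberg-Richter sequences and repeating characteristic earthquakes can be illustrated with a simple sampling experiment: draw magnitudes from a truncated Gutenberg-Richter law and ask how much of the total seismic moment is released in the Mw 6-7.3 band that the study identifies as erosion-dominated. The b-value, magnitude bounds and band limits below are assumptions for illustration, not the study's parameters.

```python
# Moment released in the erosion-dominated Mw 6-7.3 band for a Gutenberg-
# Richter sequence versus a fault repeating a single characteristic magnitude.

import numpy as np

rng = np.random.default_rng(1)

def gr_sample(n, m_min=5.0, m_max=8.0, b=1.0):
    """Sample magnitudes from a doubly truncated Gutenberg-Richter law."""
    u = rng.random(n)
    beta = b * np.log(10.0)
    c = 1.0 - np.exp(-beta * (m_max - m_min))
    return m_min - np.log(1.0 - u * c) / beta

def moment(mw):
    """Seismic moment (N m) from moment magnitude."""
    return 10.0 ** (1.5 * mw + 9.1)

mags = gr_sample(200_000)
m0 = moment(mags)
band = (mags >= 6.0) & (mags <= 7.3)
print("GR sequence: fraction of moment released at Mw 6-7.3:",
      round(m0[band].sum() / m0.sum(), 2))
print("Repeating Mw 7.8 events: fraction of moment at Mw 6-7.3: 0.0")
```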

  3. Harnessing the Collective Power of Eyewitnesses for Improved Earthquake Information

    NASA Astrophysics Data System (ADS)

    Bossu, R.; Lefebvre, S.; Mazet-Roux, G.; Steed, R.

    2013-12-01

    earthquakes within, on average, 90 s of their occurrence, and can map, in certain cases, the damaged areas. Thanks to the flashsourced and crowdsourced information, we developed an innovative Twitter earthquake information service (currently under test and to be opened by November) which is intended to offer notifications only for earthquakes that matter to the public. It provides timely information for felt and damaging earthquakes regardless of their magnitude, and a heads-up for seismologists. In conclusion, the experience developed at the EMSC demonstrates the benefit of involving eyewitnesses in earthquake surveillance. The data collected directly and indirectly from eyewitnesses complement information derived from monitoring networks and contribute to improved services. By increasing interaction between science and society, it opens new opportunities for raising awareness of seismic hazard.

  4. The Himalayan Seismogenic Zone: A New Frontier for Earthquake Research

    NASA Astrophysics Data System (ADS)

    Brown, Larry; Hubbard, Judith; Karplus, Marianne; Klemperer, Simon; Sato, Hiroshi

    2016-04-01

    significance of blind splay faulting in accommodating slip? m) Do lithologic contrasts juxtaposed across the continental seismogenic zone play a role in the rheological behavior of the SZ in the same manner as proposed for the ocean SZ? Major differences in the study of the continental vs oceanic seismogenic zone include the fact that Himalayan structures are open to: a) direct geological observation via field mapping, b) dense and wide-aperture monitoring of surface strain via GPS and InSAR, c) extensive sampling of geofluids via surface flows and shallow drill holes, d) cost-effective deployment of long-term geophysical arrays (e.g. seismic and MT) designed to detect subtle variations in physical properties within the seismogenic zone, and ultimately, e) a fixed platform for deep drilling of past and future rupture zones. It remains to be established whether the Himalayan seismogenic zone has the potential for earthquakes of the greatest magnitudes (e.g. 9.0+). However, there is no question that future ruptures in this system represent a serious threat to major population centers (megacities) in the Indian subcontinent. For this reason alone the HSZ is deserving of a major new international, multidisciplinary effort.

  5. Magnitude and location of historical earthquakes in Japan and implications for the 1855 Ansei Edo earthquake

    USGS Publications Warehouse

    Bakun, W.H.

    2005-01-01

    Japan Meteorological Agency (JMA) intensity assignments IJMA are used to derive intensity attenuation models suitable for estimating the location and an intensity magnitude Mjma for historical earthquakes in Japan. The intensity for shallow crustal earthquakes on Honshu is equal to -1.89 + 1.42MJMA - 0.00887Δh - 1.66 log Δh, where MJMA is the JMA magnitude, Δh = (Δ² + h²)^1/2, and Δ and h are epicentral distance and focal depth (km), respectively. Four earthquakes located near the Japan Trench were used to develop a subducting-plate intensity attenuation model where intensity is equal to -8.33 + 2.19MJMA - 0.00550Δh - 1.14 log Δh. The IJMA assignments for the MJMA 7.9 great 1923 Kanto earthquake on the Philippine Sea-Eurasian plate interface are consistent with the subducting-plate model. Using the subducting-plate model and 226 IJMA IV-VI assignments, the location of the intensity center is 25 km north of the epicenter, Mjma is 7.7, and MJMA is 7.3-8.0 at the 1σ confidence level. Intensity assignments and reported aftershock activity for the enigmatic 11 November 1855 Ansei Edo earthquake are consistent with an MJMA 7.2 Philippine Sea-Eurasian interplate source or Philippine Sea intraslab source at about 30 km depth. If the 1855 earthquake was a Philippine Sea-Eurasian interplate event, the intensity center was adjacent to and downdip of the rupture area of the great 1923 Kanto earthquake, suggesting that the 1855 and 1923 events ruptured adjoining sections of the Philippine Sea-Eurasian plate interface.
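
    The two attenuation relations quoted above can be evaluated directly, as in the sketch below; the coefficients are those given in the abstract, and a base-10 logarithm is assumed since the abstract does not state the base.

```python
# Direct evaluation of the two intensity-attenuation relations quoted above
# (shallow crustal Honshu and subducting-plate models). Base-10 log assumed.

import math

def delta_h(delta_km, h_km):
    """Slant distance (km) from epicentral distance and focal depth."""
    return math.hypot(delta_km, h_km)

def ijma_crustal(m_jma, delta_km, h_km):
    dh = delta_h(delta_km, h_km)
    return -1.89 + 1.42 * m_jma - 0.00887 * dh - 1.66 * math.log10(dh)

def ijma_subducting(m_jma, delta_km, h_km):
    dh = delta_h(delta_km, h_km)
    return -8.33 + 2.19 * m_jma - 0.00550 * dh - 1.14 * math.log10(dh)

# Example: predicted intensity 50 km from a shallow (h = 10 km) M_JMA 7.0 event.
print(round(ijma_crustal(7.0, 50.0, 10.0), 1))
```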

  6. Implications for earthquake risk reduction in the United States from the Kocaeli, Turkey, earthquake of August 17, 1999

    USGS Publications Warehouse

    ,

    2000-01-01

    This report documents implications for earthquake risk reduction in the U.S. The magnitude 7.4 earthquake caused 17,127 deaths, 43,953 injuries, and displaced more than 250,000 people from their homes. The report warns that similar disasters are possible in the United States where earthquakes of comparable size strike the heart of American urban areas. Another concern described in the report is the delayed emergency response that was caused by the inadequate seismic monitoring system in Turkey, a problem that contrasts sharply with rapid assessment and response to the September Chi-Chi earthquake in Taiwan. Additionally, the experience in Turkey suggests that techniques for forecasting earthquakes may be improving.

  7. Earthquakes; January-February 1982

    USGS Publications Warehouse

    Person, W.J.

    1982-01-01

    In the United States, a number of earthquakes occurred, but only minor damage was reported. Arkansas experienced a swarm of earthquakes beginning on January 12. Canada experienced one of its strongest earthquakes in a number of years on January 9; this earthquake caused slight damage in Maine. 

  8. Earthquakes, November-December 1975

    USGS Publications Warehouse

    Person, W.J.

    1976-01-01

    Hawaii experienced its strongest earthquake in more than a century. The magnitude 7.2 earthquake on November 29 killed at least 2 and injured about 35. These were the first deaths from an earthquake in the United States since the San Fernando earthquake of February 1971. 

  9. Earthquakes, September-October 1986

    USGS Publications Warehouse

    Person, W.J.

    1987-01-01

    There was one great earthquake (8.0 and above) during this reporting period, in the South Pacific in the Kermadec Islands. There were no major earthquakes (7.0-7.9), but earthquake-related deaths were reported in Greece and in El Salvador. There were no destructive earthquakes in the United States.

  10. Earthquakes; July-August, 1978

    USGS Publications Warehouse

    Person, W.J.

    1979-01-01

    Earthquake activity during this period was about normal. Deaths from earthquakes were reported from Greece and Guatemala. Three major earthquakes (magnitude 7.0-7.9) occurred in Taiwan, Chile, and Costa Rica. In the United States, the most significant earthquake was a magnitude 5.6 on August 13 in southern California. 

  11. Early non-invasive cardiac output monitoring in hemodynamically unstable intensive care patients: A multi-center randomized controlled trial

    PubMed Central

    2011-01-01

    Introduction Acute hemodynamic instability increases morbidity and mortality. We investigated whether early non-invasive cardiac output monitoring enhances hemodynamic stabilization and improves outcome. Methods A multicenter, randomized controlled trial was conducted in three European university hospital intensive care units in 2006 and 2007. A total of 388 hemodynamically unstable patients identified during their first six hours in the intensive care unit (ICU) were randomized to receive either non-invasive cardiac output monitoring for 24 hrs (minimally invasive cardiac output/MICO group; n = 201) or usual care (control group; n = 187). The main outcome measure was the proportion of patients achieving hemodynamic stability within six hours of starting the study. Results The number of hemodynamic instability criteria at baseline (MICO group mean 2.0 (SD 1.0), control group 1.8 (1.0); P = .06) and severity of illness (SAPS II score; MICO group 48 (18), control group 48 (15); P = .86) were similar. At 6 hrs, 45 patients (22%) in the MICO group and 52 patients (28%) in the control group were hemodynamically stable (mean difference 5%; 95% confidence interval of the difference -3 to 14%; P = .24). Hemodynamic support with fluids and vasoactive drugs, and pulmonary artery catheter use (MICO group: 19%, control group: 26%; P = .11) were similar in the two groups. The median length of ICU stay was 2.0 (interquartile range 1.2 to 4.6) days in the MICO group and 2.5 (1.1 to 5.0) days in the control group (P = .38). The hospital mortality was 26% in the MICO group and 21% in the control group (P = .34). Conclusions Minimally invasive cardiac output monitoring added to usual care does not facilitate early hemodynamic stabilization in the ICU, nor does it alter the hemodynamic support or outcome. Our results emphasize the need to evaluate technologies used to measure stroke volume and cardiac output--especially their impact on the process of care--before any large
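
    As a quick check of the headline comparison reported above (45/201 versus 52/187 patients stable at six hours), a standard normal-approximation confidence interval for the difference in proportions reproduces the quoted values; the original trial may have used a different procedure, so this is only a back-of-the-envelope sketch.

```python
# Back-of-the-envelope check of the reported risk difference and 95% CI
# using a normal-approximation interval for a difference in proportions.

import math

def prop_diff_ci(x1, n1, x2, n2, z=1.96):
    p1, p2 = x1 / n1, x2 / n2
    diff = p2 - p1
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, (diff - z * se, diff + z * se)

diff, (lo, hi) = prop_diff_ci(45, 201, 52, 187)   # MICO vs control
print(f"difference {diff:.1%}, 95% CI ({lo:.1%}, {hi:.1%})")
```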

  12. Simulating Earthquakes for Science and Society: New Earthquake Visualizations Ideal for Use in Science Communication

    NASA Astrophysics Data System (ADS)

    de Groot, R. M.; Benthien, M. L.

    2006-12-01

    The Southern California Earthquake Center (SCEC) has been developing groundbreaking computer modeling capabilities for studying earthquakes. These visualizations were initially shared within the scientific community but have recently gained visibility via television news coverage in Southern California. These types of visualizations are becoming pervasive in the teaching and learning of concepts related to earth science. Computers have opened up a whole new world for scientists working with large data sets, and students can benefit from the same opportunities (Libarkin & Brick, 2002). Earthquakes are ideal candidates for visualization products: they cannot be predicted, are completed in a matter of seconds, occur deep in the earth, and the time between events can be on a geologic time scale. For example, the southern part of the San Andreas fault has not seen a major earthquake since about 1690, setting the stage for an earthquake as large as magnitude 7.7 -- the "big one." Since no one has experienced such an earthquake, visualizations can help people understand the scale of such an event. Accordingly, SCEC has developed a revolutionary simulation of this earthquake, with breathtaking visualizations that are now being distributed. According to Gordin and Pea (1995), visualization should theoretically make science accessible, provide means for authentic inquiry, and lay the groundwork to understand and critique scientific issues. This presentation will discuss how the new SCEC visualizations and other earthquake imagery achieve these results, how they fit within the context of major themes and study areas in science communication, and how the efficacy of these tools can be improved.

  13. An Atlas of ShakeMaps for Selected Global Earthquakes

    USGS Publications Warehouse

    Allen, Trevor I.; Wald, David J.; Hotovec, Alicia J.; Lin, Kuo-Wan; Earle, Paul; Marano, Kristin D.

    2008-01-01

    An atlas of maps of peak ground motions and intensity 'ShakeMaps' has been developed for almost 5,000 recent and historical global earthquakes. These maps are produced using established ShakeMap methodology (Wald and others, 1999c; Wald and others, 2005) and constraints from macroseismic intensity data, instrumental ground motions, regional topographically-based site amplifications, and published earthquake-rupture models. Applying the ShakeMap methodology allows a consistent approach to combine point observations with ground-motion predictions to produce descriptions of peak ground motions and intensity for each event. We also calculate an estimated ground-motion uncertainty grid for each earthquake. The Atlas of ShakeMaps provides a consistent and quantitative description of the distribution and intensity of shaking for recent global earthquakes (1973-2007) as well as selected historic events. As such, the Atlas was developed specifically for calibrating global earthquake loss estimation methodologies to be used in the U.S. Geological Survey Prompt Assessment of Global Earthquakes for Response (PAGER) Project. PAGER will employ these loss models to rapidly estimate the impact of global earthquakes as part of the USGS National Earthquake Information Center's earthquake-response protocol. The development of the Atlas of ShakeMaps has also led to several key improvements to the Global ShakeMap system. The key upgrades include: addition of uncertainties in the ground motion mapping, introduction of modern ground-motion prediction equations, improved estimates of global seismic-site conditions (VS30), and improved definition of stable continental region polygons. Finally, we have merged all of the ShakeMaps in the Atlas to provide a global perspective of earthquake ground shaking for the past 35 years, allowing comparison with probabilistic hazard maps. The online Atlas and supporting databases can be found at http://earthquake.usgs.gov/eqcenter/shakemap/atlas.php/.

  14. GLASS 2.0: An Operational, Multimodal, Bayesian Earthquake Data Association Engine

    NASA Astrophysics Data System (ADS)

    Benz, H.; Johnson, C. E.; Patton, J. M.; McMahon, N. D.; Earle, P. S.

    2015-12-01

    The legacy approach to automated detection and determination of hypocenters relies on arrival-time stacking algorithms. Examples of such algorithms are the associator Binder, which has been in continuous use in many USGS-supported regional seismic networks since the 1980s, and its spherical-earth successor, GLASS 1.0, in service at the USGS National Earthquake Information Center for over 10 years. The principal shortcomings of the legacy approach are 1) it can only use phase arrival times, 2) it does not adequately address the problems of extreme variations in station density worldwide, 3) it cannot incorporate multiple phase models or statistical attributes of phases with distance, and 4) it cannot incorporate noise-model attributes of individual stations. Previously we introduced a theoretical framework for a new associator using a Bayesian kernel-stacking approach to approximate a joint probability density function for hypocenter localization. More recently we added station- and phase-specific Bayesian constraints to the association process. GLASS 2.0 incorporates a multiplicity of earthquake-related data including phase arrival times, back-azimuth and slowness information from array beamforming, arrival times from waveform cross-correlation processing, and geographic constraints from real-time social media reports of ground shaking. We demonstrate its application by modeling an aftershock sequence using dozens of stations that recorded tens of thousands of earthquakes over a period of one month. We also demonstrate GLASS 2.0 performance regionally and teleseismically using the globally distributed real-time monitoring system at the NEIC.
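
    A toy version of the kernel-stacking idea is sketched below: each pick is back-projected to an implied origin time at every node of a coarse 2-D grid, Gaussian kernels of the origin-time residuals are stacked, and the peak of the stack gives a hypocenter candidate. The constant velocity, flat geometry and kernel width are gross simplifications of what GLASS 2.0 actually does.

```python
# Toy kernel-stacking associator on a 2-D grid with a constant velocity model.

import numpy as np

V_P = 6.0        # assumed constant P velocity, km/s
SIGMA = 1.0      # kernel width, s

def associate(stations_xy, pick_times, grid_x, grid_y):
    """Return (stack value, grid node, origin time) maximizing the kernel stack."""
    best = (-np.inf, None, None)
    for gx in grid_x:
        for gy in grid_y:
            dist = np.hypot(stations_xy[:, 0] - gx, stations_xy[:, 1] - gy)
            t0 = pick_times - dist / V_P          # implied origin times
            t_ref = np.median(t0)                 # candidate origin time
            stack = np.sum(np.exp(-0.5 * ((t0 - t_ref) / SIGMA) ** 2))
            if stack > best[0]:
                best = (stack, (gx, gy), t_ref)
    return best

# Synthetic test: event at (40, 60) km, origin time 0 s, five stations.
rng = np.random.default_rng(2)
stations = rng.uniform(0, 100, size=(5, 2))
true_xy = np.array([40.0, 60.0])
picks = np.hypot(*(stations - true_xy).T) / V_P + rng.normal(0, 0.1, 5)
score, node, t0 = associate(stations, picks,
                            np.arange(0, 101, 5.0), np.arange(0, 101, 5.0))
print("best node:", node, "origin time ~", round(t0, 2), "s, stack:", round(score, 2))
```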

  15. Satellite Relay Telemetry of Seismic Data in Earthquake Prediction and Control

    NASA Technical Reports Server (NTRS)

    Jackson, W. H.; Eaton, J. P.

    1971-01-01

    The Satellite Telemetry Earthquake Monitoring Program was started to evaluate the applicability of satellite relay telemetry in the collection of seismic data from a large number of dense seismograph clusters laid out along the major fault systems of western North America. Prototype clusters utilizing phone-line telemetry were then being installed by the National Center for Earthquake Research in three regions along the San Andreas fault in central California, and the experience of installing and operating the clusters and of reducing and analyzing the seismic data from them was to provide the raw material for evaluation in the satellite relay telemetry project. The principal advantages of the satellite relay system over commercial telephone or microwave systems were: (1) it could be made less prone to massive failure during a major earthquake; (2) it could be extended readily into undeveloped regions; and (3) it could provide flexible, uniform communications over large sections of major global tectonic zones. Fundamental characteristics of a communications system to cope with the large volume of raw data collected by a short-period seismograph network are discussed.

  16. Inversion for slip distribution for the 2012 Costa Rica earthquake

    NASA Astrophysics Data System (ADS)

    McCormack, K. A.; Hesse, M. A.; Stadler, G.

    2014-12-01

    On 5 September 2012, a major megathrust earthquake (Mw=7.6) ruptured the plate interface beneath the Nicoya Peninsula, Costa Rica. This event was centered 12 km offshore of the central Nicoya coast, at a depth of 18 km. The maximum slip exceeded 2 meters, and the rupture spread outward along the plate interface to encompass 3000 km2 of the Nicoya seismogenic zone. More than 1700 aftershocks were recorded within the first 5 days. These aftershocks outlined two distinct rupture patches: one centered on the central coast and the other beneath the southern tip of the peninsula. We formulate a Bayesian inverse problem to infer the coseismic slip on the fault plane from instantaneous surface displacements and changes in well heads, in order to image the remaining "locked" patch that has been inferred previously. We compute the maximum a posteriori (MAP) estimate of the posterior slip distribution on the fault, and use a local Gaussian approximation around the MAP point to characterize the uncertainty. The elastic deformation is computed using a finite element method that allows for the spatial variation of elastic properties that has been observed in the crust overlying the seismogenic zone. We solve the optimization problem using gradients obtained from adjoints. The linearity of the inverse problem allows for the efficient solution of the optimal experimental design problem for the placement of the GPS stations to monitor the remaining locked patch. In the future, the results obtained here will provide the initial condition for a time-dependent poroelastic model of fault slip and fluid migration due to overpressure caused by a megathrust earthquake. This will provide constraints on the crustal permeability structure in a tectonically active region.
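
    For a linear forward model with Gaussian data and prior covariances, the MAP slip estimate and its uncertainty have the closed form used in the sketch below. The Green's function matrix here is random filler standing in for the finite-element elastic kernels of the study, and the positivity constraints and adjoint-based solver described in the abstract are omitted.

```python
# Linear-Gaussian Bayesian slip inversion: closed-form MAP estimate and
# posterior covariance for d = G m with Gaussian data and prior covariances.

import numpy as np

rng = np.random.default_rng(3)

n_obs, n_patches = 60, 20                     # observations, fault patches
G = rng.normal(size=(n_obs, n_patches))       # placeholder elastic Green's functions
m_true = np.maximum(rng.normal(1.0, 0.8, n_patches), 0.0)   # "true" slip (m)
d = G @ m_true + rng.normal(0, 0.05, n_obs)   # synthetic observations

Cd_inv = np.eye(n_obs) / 0.05**2              # data precision
Cm_inv = np.eye(n_patches) / 1.0**2           # prior precision (zero-mean prior)

# Posterior covariance and MAP estimate for the linear-Gaussian problem.
post_cov = np.linalg.inv(G.T @ Cd_inv @ G + Cm_inv)
m_map = post_cov @ (G.T @ Cd_inv @ d)
sigma = np.sqrt(np.diag(post_cov))

print("max MAP slip (m):", round(m_map.max(), 2),
      "+/-", round(sigma[np.argmax(m_map)], 2))
```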

  17. Evaluation of real-time tsunami earthquake discriminants

    NASA Astrophysics Data System (ADS)

    Hagerty, M. T.; Hirshorn, B. F.; Weinstein, S.; Knight, W. R.; Whitmore, P.

    2014-12-01

    Tsunami earthquakes generate a disproportionately large tsunami for their seismic moment. For a tsunami warning center, they are especially difficult to detect in real time since magnitude alone is insufficient to issue an alert. Recently, several methods have been developed to identify tsunami earthquakes, including various energy magnitude estimates (e.g., MED, Lomax et al, 2007), the theta discriminant (Newman & Okal, 1998), RTerg (Newman & Convers, 2010), TACER (Convers & Newman, 2013), mHFER (Hara, 2007), and rupture duration, TR (Lomax & Michelini, 2009 & 2010). Each method makes particular assumptions about the rupture process and subsequent tsunami generation that lead to the use of different algorithms to estimate the radiated seismic energy and/or rupture duration. However, the various methods essentially compare these estimates to the long-period seismic moment in order to identify the unusually long durations, slow ruptures, and small stress drops that characterize tsunami earthquakes. We test the tsunami earthquake discriminants on a dataset of subduction zone earthquakes containing several tsunami earthquakes, with the goal of determining which methods (or combination of methods) are well suited to real-time implementation at the U.S. tsunami warning centers in Hawaii and Alaska. Of particular interest is the ability of each method to correctly identify known tsunami earthquakes with a minimum of false positives and with a minimum of a priori assumptions about any individual event that might bias a real-time detection system.
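
    The energy-to-moment comparison common to the methods listed above can be illustrated with the theta parameter of Newman & Okal, the logarithm of the ratio of radiated energy to seismic moment; ordinary events cluster near theta of roughly -4.9, and energy-deficient (slow) ruptures fall well below it. The cutoff and the example numbers below are illustrative choices, not operational settings of the warning centers.

```python
# Energy-to-moment discriminant: theta = log10(E_R / M_0), with a simple cutoff
# to flag unusually slow, energy-deficient ruptures. Cutoff is illustrative.

import math

def theta(radiated_energy_j, seismic_moment_nm):
    """Theta = log10(E_R / M_0); ordinary events cluster near roughly -4.9."""
    return math.log10(radiated_energy_j / seismic_moment_nm)

def is_slow_rupture(radiated_energy_j, seismic_moment_nm, cutoff=-5.5):
    """Flag a possible tsunami earthquake when theta falls below the cutoff."""
    return theta(radiated_energy_j, seismic_moment_nm) < cutoff

# Example with round numbers: an Mw ~7.7 event (M0 ~ 4e20 N m).
m0 = 4.0e20
print("ordinary event:", round(theta(5.0e15, m0), 2), is_slow_rupture(5.0e15, m0))
print("energy-deficient event:", round(theta(5.0e14, m0), 2), is_slow_rupture(5.0e14, m0))
```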

  18. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    SciTech Connect

    Saragoni, G. Rodolfo

    2008-07-08

    The recent commemoration of the centennial of the San Francisco and Valparaiso 1906 earthquakes has given the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus the Messina-Reggio Calabria earthquake of 1908, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of some still-standing buildings that survived these centennial earthquakes represents a challenge to better understand the limitations of the earthquake design methods in use. Of the three centennial earthquakes considered, only the 1906 Valparaiso earthquake has been repeated, as the 1985 Central Chile Ms = 7.8 earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is done in the neighborhood of Valparaiso harbor. In this study the only three centennial buildings of 3 stories that survived both earthquakes almost undamaged were identified. Since accelerograms for the 1985 earthquake were recorded both at El Almendral soil conditions and on rock, the vulnerability analysis of these buildings is done considering instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. Therefore, it is recommended that more suitable instrumental parameters, such as the destructiveness potential factor, be used in the future to describe earthquake demand.

  19. Phase Transformations and Earthquakes

    NASA Astrophysics Data System (ADS)

    Green, H. W.

    2011-12-01

    Phase transformations have been cited as responsible for, or at least involved in, "deep" earthquakes for many decades (although the concept of "deep" has varied). In 1945, PW Bridgman laid out in detail the string of events/conditions that would have to be achieved for a solid-solid transformation to lead to a faulting instability, although he expressed pessimism that the full set of requirements would be simultaneously achieved in nature. Raleigh and Paterson (1965) demonstrated faulting during dehydration of serpentine under stress and suggested dehydration embrittlement as the cause of intermediate-depth earthquakes. Griggs and Baker (1969) produced a thermal runaway model of a shear zone under constant stress, culminating in melting, and proposed such a runaway as the origin of deep earthquakes. The discovery of Plate Tectonics in the late 1960s established the conditions (subduction) under which Bridgman's requirements for earthquake runaway in a polymorphic transformation could be met in nature, and Green and Burnley (1989) found such an instability during the transformation of metastable olivine to spinel. Recent seismic correlation of intermediate-depth-earthquake hypocenters with predicted conditions of dehydration of antigorite serpentine, and the discovery of metastable olivine in four subduction zones, strongly suggest that dehydration embrittlement and transformation-induced faulting are the underlying mechanisms of intermediate and deep earthquakes, respectively. The results of recent high-speed friction experiments and analysis of natural fault zones suggest that it is likely that similar processes occur commonly during many shallow earthquakes after initiation by frictional failure.

  20. Earthquakes and faults in southern California (1970-2010)

    USGS Publications Warehouse

    Sleeter, Benjamin M.; Calzia, James P.; Walter, Stephen R.

    2012-01-01

    The map depicts both active and inactive faults and earthquakes of magnitude 1.5 to 7.3 in southern California (1970-2010). The bathymetry was generated from digital files from the California Department of Fish and Game, Marine Region, Coastal Bathymetry Project. Elevation data are from the U.S. Geological Survey National Elevation Database. The Landsat satellite image is from fourteen Landsat 5 Thematic Mapper scenes collected between 2009 and 2010. Fault data are reproduced with permission from 2006 California Geological Survey and U.S. Geological Survey data. The earthquake data are from the U.S. Geological Survey National Earthquake Information Center.

  1. Earthquakes in the Central United States, 1699-2010

    USGS Publications Warehouse

    Dart, Richard L.; Volpi, Christina M.

    2010-01-01

    This publication is an update of an earlier report, U.S. Geological Survey (USGS) Geologic Investigation I-2812 by Wheeler and others (2003), titled "Earthquakes in the Central United States, 1699-2002." Like the original poster, the center of the updated poster is a map showing the pattern of earthquake locations in the most seismically active part of the central United States. Arrayed around the map are short explanatory texts and graphics, which describe the distribution of historical earthquakes and the effects of the most notable of them. The updated poster contains additional, post-2002, earthquake data. These are 38 earthquakes covering the time interval from January 2003 to June 2010, including the Mount Carmel, Illinois, earthquake of 2008. The USGS Preliminary Determination of Epicenters (PDE) was the source of these additional data. Like the I-2812 poster, this poster was prepared for a nontechnical audience and designed to inform the general public as to the widespread occurrence of felt and damaging earthquakes in the Central United States. Accordingly, the poster should not be used to assess earthquake hazard in small areas or at individual locations.

  2. Sun, Moon and Earthquakes

    NASA Astrophysics Data System (ADS)

    Kolvankar, V. G.

    2013-12-01

    During a study conducted to find the effect of Earth tides on the occurrence of earthquakes, for small areas [typically 1000 km x 1000 km] of high-seismicity regions, it was noticed that the Sun's position in terms of universal time [GMT] shows links to the sum of EMD [longitude of earthquake location - longitude of the Moon's footprint on Earth] and SEM [Sun-Earth-Moon angle]. This paper provides the details of this relationship after studying earthquake data for over forty high-seismicity regions of the world. It was found that over 98% of the earthquakes for these different regions, examined for the period 1973-2008, show a direct relationship between the Sun's position [GMT] and [EMD+SEM]. As the time changes from 00-24 hours, the factor [EMD+SEM] changes through 360 degrees, and plotting these two variables for earthquakes from different small regions reveals a simple 45-degree straight-line relationship between them. This relationship was tested for all earthquakes and earthquake sequences of magnitude 2.0 and above. This study conclusively proves how the Sun and the Moon govern all earthquakes. Fig. 12 [A+B]: the left-hand figure provides a 24-hour plot for forty consecutive days including the main event (00:58:23 on 26.12.2004, Lat. +3.30, Long. +95.980, Mb 9.0, EQ count 376); the right-hand figure provides an earthquake plot of (EMD+SEM) vs GMT timings for the same data. All 376 events including the main event faithfully follow the straight-line curve.

  3. Landslides caused by earthquakes.

    USGS Publications Warehouse

    Keefer, D.K.

    1984-01-01

    Data from 40 historical world-wide earthquakes were studied to determine the characteristics, geologic environments, and hazards of landslides caused by seismic events. This sample was supplemented with intensity data from several hundred US earthquakes to study relations between landslide distribution and seismic parameters. Correlations between magnitude (M) and landslide distribution show that the maximum area likely to be affected by landslides in a seismic event increases from approximately 0 at M = 4.0 to 500 000 km2 at M = 9.2. Each type of earthquake-induced landslide occurs in a particular suite of geologic environments. -from Author

  4. Earthquake engineering in Peru

    USGS Publications Warehouse

    Vargas, N.J

    1983-01-01

    During the last decade, earthquake engineering research in Peru has been carried out at the Catholic University of Peru and at the Universidad Nacional de Ingeniería (UNI). The Geophysical Institute (IGP), under the auspices of the Organization of American States (OAS), has initiated in Peru other efforts in regional seismic hazard assessment programs with direct impact on the earthquake engineering program. Further details on these programs have been reported by L. Ocola in the Earthquake Information Bulletin, January-February 1982, vol. 14, no. 1, pp. 33-38. 

  5. Earthquakes and emergence

    NASA Astrophysics Data System (ADS)

    Earthquakes and emerging infections may not have a direct cause and effect relationship like tax evasion and jail, but new evidence suggests that there may be a link between the two human health hazards. Various media accounts have cited a massive 1993 earthquake in Maharashtra as a potential catalyst of the recent outbreak of plague in India that has claimed more than 50 lives and alarmed the world. The hypothesis is that the earthquake may have uprooted underground rat populations that carry the fleas infected with the bacterium that causes bubonic plague and can lead to the pneumonic form of the disease that is spread through the air.

  6. Correlating precursory declines in groundwater radon with earthquake magnitude.

    PubMed

    Kuo, T

    2014-01-01

    Both studies at the Antung hot spring in eastern Taiwan and at the Paihe spring in southern Taiwan confirm that groundwater radon can be a consistent tracer for strain changes in the crust preceding an earthquake when observed in a low-porosity fractured aquifer surrounded by a ductile formation. Recurrent anomalous declines in groundwater radon were observed at the Antung D1 monitoring well in eastern Taiwan prior to the five earthquakes of magnitude (Mw) 6.8, 6.1, 5.9, 5.4, and 5.0 that occurred on December 10, 2003; April 1, 2006; April 15, 2006; February 17, 2008; and July 12, 2011, respectively. For earthquakes occurring on the Longitudinal Valley fault in eastern Taiwan, the observed radon minima decrease as the earthquake magnitude increases. The above correlation has proven to be useful for early warning of large local earthquakes. In southern Taiwan, anomalous radon declines prior to the 2010 Mw 6.3 Jiasian, 2012 Mw 5.9 Wutai, and 2012 ML 5.4 Kaohsiung earthquakes were also recorded at the Paihe spring. For earthquakes occurring on different faults in southern Taiwan, a correlation between the observed radon minima and the earthquake magnitude is not yet possible.

  7. The ARIA project: Advanced Rapid Imaging and Analysis for Natural Hazard Monitoring and Response

    NASA Astrophysics Data System (ADS)

    Owen, S. E.; Webb, F.; Simons, M.; Rosen, P. A.; Cruz, J.; Yun, S.; Fielding, E. J.; Moore, A. W.; Hua, H.; Agram, P.; Lundgren, P.

    2012-12-01

    ARIA is a joint JPL/Caltech coordinated effort to automate geodetic imaging capabilities for hazard response and societal benefit. Over the past decade, space-based geodetic measurements such as InSAR and GPS have provided new assessment capabilities and situational awareness on the size and location of earthquakes following seismic disasters and on volcanic eruptions following magmatic events. Geodetic imaging's unique ability to capture surface deformation at high spatial and temporal resolution allows us to resolve the fault geometry and distribution of slip associated with any given earthquake in correspondingly high spatial and temporal detail. In addition, remote sensing with radar provides change detection and damage assessment capabilities for earthquakes, floods and other disasters that can image even at night or through clouds. These data sets are still essentially hand-crafted, and thus are not generated rapidly and reliably enough for informing decision-making agencies and the public following an earthquake. We are building an end-to-end prototype geodetic imaging data system that would form the foundation for an envisioned operational hazard response center integrating InSAR, GPS, seismology, and modeling to deliver monitoring, actionable science, and situational awareness products. This prototype exploits state-of-the-art analysis algorithms from technologists and scientists. These algorithms enable the delivery of actionable products from larger data sets with enhanced modeling and interpretation, and the development of next-generation techniques. We are collaborating with USGS scientists in both the earthquake and volcano science programs for our initial data product infusion. We present our progress to date on development of the prototype data system and demonstration data products, and example responses we have run, such as generating products for the 2011 M9.0 Tohoku-oki and M6.3 Christchurch earthquakes, the 2011 M7.1 Van earthquake, and several simulated

  8. The BEYOND center of excellence for the effective exploitation of satellite time series towards natural disasters monitoring and assessment

    NASA Astrophysics Data System (ADS)

    Kontoes, Charalampos; Papoutsis, Ioannis; Amiridis, Vassilis; Balasis, George; Keramitsoglou, Iphigenia; Herekakis, Themistocles; Christia, Eleni

    2014-05-01

    The BEYOND project (2013-2016, 2.3 Meuro), funded under the FP7-REGPOT scheme, is an initiative that aims to build a Centre of Excellence for Earth Observation (EO) based monitoring of natural disasters in south-eastern Europe (http://beyond-eocenter.eu/), established at the National Observatory of Athens (NOA). The project focuses on capacity building on top of the existing infrastructure, aiming at unlocking the institute's potential through systematic interaction with high-profile partners across Europe, and at consolidating state-of-the-art equipment and technological know-how that will allow sustainable cutting-edge interdisciplinary research to take place with an impact on regional and European socioeconomic welfare. The vision is to set up innovative integrated observational solutions that allow a multitude of space-borne and ground-based monitoring networks to operate in a complementary and cooperative manner, to create archives and databases of long series of observations and higher-level products, and to make these available for exploitation with the involvement of stakeholders. In BEYOND, critical infrastructural components are being procured to foster access, use, retrieval and analysis of long EO data series and products. In this framework NOA has initiated activities for the development, installation and operation of important acquisition facilities and hardware modules, including space-based observational infrastructure such as the X-/L-band acquisition station for receiving EOS Aqua/Terra, NPP, JPSS, NOAA, Metop, and Feng Yun data in real time, the setting up of an ESA mirror site of the Sentinel missions to be operable from 2014 onwards, an advanced portable Raman lidar station, a spectrometer facility, and several ground magnetometer stations. All these are expected to work in synergy with the existing capacity resources and observational networks, including the MSG/SEVIRI acquisition station and nationwide seismographic, GPS, meteorological and atmospheric networks. The

  9. The SCEC-USGS Dynamic Earthquake Rupture Code Comparison Exercise - Simulations of Large Earthquakes and Strong Ground Motions

    NASA Astrophysics Data System (ADS)

    Harris, R.

    2015-12-01

    I summarize the progress of the Southern California Earthquake Center (SCEC) and U.S. Geological Survey (USGS) Dynamic Rupture Code Comparison Group, which examines whether the results produced by multiple researchers' earthquake simulation codes agree with each other when computing benchmark scenarios of dynamically propagating earthquake ruptures. These types of computer simulations have no analytical solutions with which to compare, so we use qualitative and quantitative inter-code comparisons to check that they are operating satisfactorily. To date we have tested the codes against benchmark exercises that incorporate a range of features, including single and multiple planar faults, single rough faults, slip-weakening, rate-state, and thermal-pressurization friction, elastic and visco-plastic off-fault behavior, complete stress drops that lead to extreme ground motion, heterogeneous initial stresses, and heterogeneous material (rock) structure. Our goal is reproducibility, and we focus on the types of earthquake-simulation assumptions that have been or will be used in basic studies of earthquake physics, or in direct applications to specific earthquake hazard problems. Our group's goal is to make sure that when our earthquake-simulation codes simulate these types of earthquake scenarios, along with the resulting simulated strong ground shaking, the codes are operating as expected. For more introductory information about our group and our work, please see our group's overview papers, Harris et al., Seismological Research Letters, 2009, and Harris et al., Seismological Research Letters, 2011, along with our website, scecdata.usc.edu/cvws.

  10. GEM - The Global Earthquake Model

    NASA Astrophysics Data System (ADS)

    Smolka, A.

    2009-04-01

    Over 500,000 people died in the last decade due to earthquakes and tsunamis, mostly in the developing world, where the risk is increasing due to rapid population growth. In many seismic regions, no hazard and risk models exist, and even where models do exist, they are intelligible only to experts, or available only for commercial purposes. The Global Earthquake Model (GEM) answers the need for an openly accessible risk management tool. GEM is an internationally sanctioned public-private partnership initiated by the Organisation for Economic Cooperation and Development (OECD), which will establish an authoritative standard for calculating and communicating earthquake hazard and risk, and will be designed to serve as the critical instrument to support decisions and actions that reduce earthquake losses worldwide. GEM will integrate developments at the forefront of scientific and engineering knowledge of earthquakes, at global, regional and local scale. The work is organized in three modules: hazard, risk, and socio-economic impact. The hazard module calculates probabilities of earthquake occurrence and resulting shaking at any given location. The risk module calculates fatalities, injuries, and damage based on expected shaking, building vulnerability, and the distribution of population and of exposed values and facilities. The socio-economic impact module delivers tools for making educated decisions to mitigate and manage risk. GEM will be a versatile online tool, with open source code and a map-based graphical interface. The underlying data will be open wherever possible, and its modular input and output will be adapted to multiple user groups: scientists and engineers, risk managers and decision makers in the public and private sectors, and the public at large. GEM will be the first global model for seismic risk assessment at a national and regional scale, and aims to achieve broad scientific participation and independence. Its development will occur in a
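
    The hazard-to-risk chain described above can be sketched as combining a shaking estimate at a site, a vulnerability curve giving mean damage ratio as a function of shaking, and the exposed value, to obtain an expected loss. The curve and the numbers below are invented for illustration and do not come from GEM.

```python
# Schematic hazard -> vulnerability -> exposure -> expected loss calculation.
# Vulnerability curve and asset values are invented for illustration.

def mean_damage_ratio(intensity_mmi, curve):
    """Piecewise-linear interpolation of a vulnerability curve keyed by MMI."""
    pts = sorted(curve.items())
    if intensity_mmi <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if intensity_mmi <= x1:
            return y0 + (y1 - y0) * (intensity_mmi - x0) / (x1 - x0)
    return pts[-1][1]

# Hypothetical vulnerability curve for one building class and a small exposure set.
curve = {6.0: 0.02, 7.0: 0.10, 8.0: 0.30, 9.0: 0.60}
assets = [  # (site shaking in MMI, replacement value)
    (7.5, 2_000_000.0),
    (6.2, 1_500_000.0),
    (8.4, 3_000_000.0),
]
expected_loss = sum(mean_damage_ratio(mmi, curve) * value for mmi, value in assets)
print(f"expected loss: {expected_loss:,.0f}")
```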

  11. "Not just eliminating the mosquito but draining the swamp": A critical geopolitics of Turkish Monitoring Center for Drugs and Drug Addiction and Turkey's approach to illicit drugs.

    PubMed

    Evered, Kyle T; Evered, Emine Ö

    2016-07-01

    In the 1970s, Turkey ceased to be a significant producer state of illicit drugs, but it continued to serve as a key route for the trade of drugs between East and West. Over the past decade, however, authorities identified two concerns beyond its continued transit-state status. These reported problems entail both new modes of production and a rising incidence of drug abuse within the nation-state - particularly among its youth. Amid these developments, new law enforcement institutions emerged and acquired European sponsorship, leading to the establishment of TUBİM (the Turkish Monitoring Center for Drugs and Drug Addiction). Coordinating with and reporting to the European Union agency EMCDDA (the European Monitoring Center for Drugs and Drug Addiction), TUBİM's primary assigned duties entail the collection and analysis of data on drug abuse, trafficking, and prevention; the geographic identification of sites of concern (e.g., consumption, drug-related crimes, and people undergoing treatment); and the production of annual national reports. In this article, we examine the geopolitical origins of TUBİM as Turkey's central apparatus for confronting drug problems and its role as a vehicle for policy development, interpretation, and enforcement. In doing so, we emphasize the political and spatial dimensions inherent to the country's institutional and policy-driven approaches to contending with drug-related problems, and we assess how this line of attack reveals particular ambiguities in mission when evaluated at world-regional, national, and local scales. In sum, we assess how Turkey's new institutional and legislative landscapes condition the state's engagements with drug use, matters of users' health, and policy implementation at local scales and amid ongoing political developments. PMID:27267659

  12. "Not just eliminating the mosquito but draining the swamp": A critical geopolitics of Turkish Monitoring Center for Drugs and Drug Addiction and Turkey's approach to illicit drugs.

    PubMed

    Evered, Kyle T; Evered, Emine Ö

    2016-07-01

    In the 1970s, Turkey ceased to be a significant producer state of illicit drugs, but it continued to serve as a key route for the trade of drugs between East and West. Over the past decade, however, authorities identified two concerns beyond its continued transit state status. These reported problems entail both new modes of production and a rising incidence of drug abuse within the nation-state - particularly among its youth. Amid these developments, new law enforcement institutions emerged and acquired European sponsorship, leading to the establishment of TUBİM (the Turkish Monitoring Center for Drugs and Drug Addiction). Coordinating with and reporting to the European Union agency EMCDDA (the European Monitoring Center for Drugs and Drug Addiction), TUBİM's primary assigned duties entail the collection and analysis of data on drug abuse, trafficking, and prevention, the geographic identification of sites of concern (e.g. consumption, drug-related crimes, and peoples undergoing treatment), and the production of annual national reports. In this article, we examine the geopolitical origins of TUBİM as Turkey's central apparatus for confronting drug problems and its role as a vehicle for policy development, interpretation, and enforcement. In doing so, we emphasize the political and spatial dimensions inherent to the country's institutional and policy-driven approaches to contend with drug-related problems, and we assess how this line of attack reveals particular ambiguities in mission when evaluated from scales at world regional, national, and local levels. In sum, we assess how Turkey's new institutional and legislative landscapes condition the state's engagements with drug use, matters of user's health, and policy implementation at local scales and amid ongoing political developments. PMID:27267659

  13. Earthquake source properties and wave propagation in Eastern North America

    NASA Astrophysics Data System (ADS)

    Magalhaes de Matos Viegas Fernandes, Gisela Sofia

    The study of intraplate earthquakes is fundamental for the understanding of the physics of faulting, seismic hazard assessment, and nuclear monitoring, but well recorded moderate to large intraplate earthquakes are scarce. I use the best recorded earthquake in Eastern North America (ENA), the Mw 5.0, 20 April 2002, Au Sable Forks, NY, earthquake, and its aftershock sequence to investigate wave propagation and earthquake source properties in ENA. The Au Sable Forks epicenter is located near the boundary of two distinct geological provinces: the Appalachian (New England) and the Grenville (New York). Existing regional one-dimensional (1D) crustal models were derived from seismic surveys or from sparse ground-motion recordings of regional moderate earthquakes. I obtain improved 1D crustal models for these two provinces by forward modeling, for the first time, high-quality, multi-path ground motions of a moderate earthquake in ENA. Using Au Sable Forks earthquake records at 16 stations (epicentral distances < 400 km) at intermediate frequencies (< 1 Hz), I generate synthetic seismograms using the frequency-wavenumber method. The new models improve the fit of synthetics to data at all 6 stations in the Grenville province and at 5 of the 10 stations in the Appalachian province. I identify complex wave paths along the boundary between the provinces and 3% azimuthal anisotropy in the Appalachian crust. It is unknown how much earthquake source properties depend on the tectonic setting in which the earthquakes occur. Debate exists regarding the invariance of stress drop with earthquake size in ENA, and whether earthquakes in intraplate regions have higher stress drops than those in more tectonically active regions. I estimate source parameters for 22 earthquakes (M1-M5) of the Au Sable Forks sequence, using two alternative methods: a direct wave method (Empirical Green's Function) and a coda wave method (Coda Ratio) applied for the first time to small magnitude earthquakes. Both
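
    For readers unfamiliar with how corner frequencies translate into stress drops, the sketch below applies the standard Brune-model relations that are commonly paired with EGF corner-frequency measurements; the constant k, the shear-wave speed, and the example numbers are assumptions, not values from this study.

```python
"""Minimal sketch of Brune-model source-parameter relations commonly used
once a corner frequency has been measured (e.g., from EGF spectral ratios).
The constant k = 0.37, the shear-wave speed, and the example values are assumptions."""
import math

def source_radius(corner_freq_hz, beta_m_s=3500.0, k=0.37):
    # Brune-style circular source radius from the corner frequency: r = k * beta / fc.
    return k * beta_m_s / corner_freq_hz

def stress_drop(moment_n_m, radius_m):
    # Circular-crack stress drop: delta_sigma = (7/16) * M0 / r^3, in Pa.
    return 7.0 * moment_n_m / (16.0 * radius_m ** 3)

if __name__ == "__main__":
    m0 = 3.5e16      # placeholder seismic moment (N m), roughly Mw 5.0
    fc = 1.2         # placeholder corner frequency (Hz)
    r = source_radius(fc)
    print(f"radius ~ {r:.0f} m, stress drop ~ {stress_drop(m0, r) / 1e6:.1f} MPa")
```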

  14. Toward real-time regional earthquake simulation II: Real-time Online earthquake Simulation (ROS) of Taiwan earthquakes

    NASA Astrophysics Data System (ADS)

    Lee, Shiann-Jong; Liu, Qinya; Tromp, Jeroen; Komatitsch, Dimitri; Liang, Wen-Tzong; Huang, Bor-Shouh

    2014-06-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses centroid moment tensor solutions of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point-source parameters, including the event origin time, hypocentral location, moment magnitude, and focal mechanism, within 2 min after the occurrence of an earthquake. All of the source parameters are then automatically forwarded to the ROS to perform an earthquake simulation based on a spectral-element method (SEM). A new island-wide, high-resolution SEM mesh model is developed for the whole of Taiwan in this study. We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve the topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible: users can conduct their own earthquake simulations by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and a ShakeMap are produced during the simulation. The time needed for one event is roughly 3 min for a 70 s ground-motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground-motion prediction in real time.
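
    As a hedged illustration of the automated hand-off the abstract describes, the sketch below packages RMT-style point-source parameters into a request for an online simulation service. The field names, JSON layout, and example values are assumptions; the actual ROS interface is accessed through the webpage cited above.

```python
"""Hedged sketch of the kind of point-source payload an RMT-style monitor
might forward to an online simulation service such as ROS. Field names,
the JSON layout, and example values are assumptions for illustration."""
from dataclasses import dataclass, asdict
import json

@dataclass
class PointSource:
    origin_time_utc: str   # e.g. "2014-06-01T12:34:56Z"
    latitude: float
    longitude: float
    depth_km: float
    moment_magnitude: float
    strike: float          # focal mechanism angles, in degrees
    dip: float
    rake: float

def build_simulation_request(src: PointSource, duration_s: float = 70.0) -> str:
    # Package the source parameters and the requested ground-motion duration
    # as JSON, ready to hand off to a simulation front end.
    return json.dumps({"source": asdict(src), "duration_s": duration_s})

if __name__ == "__main__":
    src = PointSource("2014-06-01T12:34:56Z", 23.9, 121.0, 15.0, 6.1, 30.0, 45.0, 90.0)
    print(build_simulation_request(src))
```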

  15. Nonlinear processes in earthquakes

    SciTech Connect

    Jones, E.M.; Frohlich, C.

    1998-12-31

    This is the final report of a one-year, Laboratory Directed Research and Development (LDRD) project at Los Alamos National Laboratory (LANL). Three-dimensional, elastic-wave-propagation calculations were performed to define the effects of near-source geologic structure on the degree to which seismic signals produced by earthquakes resemble "non-double-couple" sources. Signals from sources embedded in a subducting slab showed significant phase and amplitude differences compared with a "no-slab" case. Modifications to the LANL elastic-wave propagation code enabled improved simulations of path effects on earthquake and explosion signals. These simulations demonstrate that near-source, shallow, low-velocity basins can introduce earthquake-like features into explosion signatures through conversion of compressive (P-wave) energy to shear (S- and R-wave) modes. Earthquake sources simulated to date do not show significant modifications.

  16. Earthquake resistant design

    SciTech Connect

    Dowrick, D.J.

    1988-01-01

    The author discusses recent advances in earthquake-resistant design. This book covers the entire design process, from aspects of loading to details of construction. Early chapters offer a broad theoretical background; later chapters provide rigorous coverage of practical aspects.

  17. The Centers for Disease Control and Prevention's public health response to monitoring Tdap safety in pregnant women in the United States

    PubMed Central

    Moro, Pedro L; McNeil, Michael M; Sukumaran, Lakshmi; Broder, Karen R

    2015-01-01

    In 2010, in response to a widespread pertussis outbreak and neonatal deaths, California became the first state to recommend routine administration of tetanus toxoid, reduced diphtheria toxoid, and acellular pertussis (Tdap) vaccine during pregnancy. In 2011, the Advisory Committee on Immunization Practices (ACIP) followed with a similar recommendation for Tdap vaccination during pregnancy for previously unvaccinated women. In 2012, this recommendation was expanded to include Tdap vaccination of every pregnant woman during each pregnancy. These recommendations were based on urgent public health needs and available evidence on the safety of other inactivated vaccines during pregnancy. However, there were limited data on the safety of Tdap during pregnancy. In response to the new ACIP recommendations, the Centers for Disease Control and Prevention (CDC) implemented ongoing collaborative studies to evaluate whether vaccination with Tdap during pregnancy adversely affects the health of mothers and their offspring and provide the committee with regular updates. The current commentary describes the public health actions taken by CDC to respond to the ACIP recommendation to study and monitor the safety of Tdap vaccines in pregnant women and describes the current state of knowledge on the safety of Tdap vaccines in pregnant women. Data from the various monitoring activities support the safety of Tdap use during pregnancy. PMID:26378718

  18. The State of the Colorado River Ecosystem in Grand Canyon: A Report of the Grand Canyon Monitoring and Research Center 1991-2004

    USGS Publications Warehouse

    Gloss, Steven P.; Lovich, Jeffrey E.; Melis, Theodore S.

    2005-01-01

    This report is an important milestone in the effort by the Secretary of the Interior to implement the Grand Canyon Protection Act of 1992 (GCPA; title XVIII, secs. 1801-1809, of Public Law 102-575), the most recent authorizing legislation for Federal efforts to protect resources downstream from Glen Canyon Dam. The chapters that follow are intended to provide decision makers and the American public with relevant scientific information about the status and recent trends of the natural, cultural, and recreational resources of those portions of Grand Canyon National Park and Glen Canyon National Recreation Area affected by Glen Canyon Dam operations. Glen Canyon Dam is one of the last major dams that was built on the Colorado River and is located just south of the Arizona-Utah border in the lower reaches of Glen Canyon National Recreation Area, approximately 15 mi (24 km) upriver from Grand Canyon National Park (fig. 1). The information presented here is a product of the Glen Canyon Dam Adaptive Management Program (GCDAMP), a federally authorized initiative to ensure that the primary mandate of the GCPA is met through advances in information and resource management. The U.S. Geological Survey's (USGS) Grand Canyon Monitoring and Research Center (GCMRC) has responsibility for the scientific monitoring and research efforts for the program, including the preparation of reports such as this one.

  20. TLC-Asthma: An Integrated Information System for Patient-centered Monitoring, Case Management, and Point-of-Care Decision Support

    PubMed Central

    Adams, William G.; Fuhlbrigge, Anne L.; Miller, Charles W.; Panek, Celeste G.; Gi, Yangsoon; Loane, Kathleen C.; Madden, Nancy E.; Plunkett, Anne M.; Friedman, Robert H.

    2003-01-01

    A great deal of successful work has been done in the area of EMR development, implementation, and evaluation. Less work has been done in the area of automated systems for patients. Efforts to link data at multiple levels – the patient, the case manager, and the clinician – have been rudimentary to date. In this paper we present a model information system that integrates patient health information across multiple domains to support the monitoring and care of children with persistent asthma. The system has been developed for use in a multi-specialty group practice and includes three primary components: 1) a patient-centered telephone-linked communication system; 2) a web-based alert reporting and nurse case-management system; and 3) EMR-based provider communication to support clinical decision making at the point of care. The system offers a model for a new level of connectivity for health information that supports customized monitoring, IT-enabled nurse case-managers, and the delivery of longitudinal data to clinicians to support the care of children with persistent asthma. Systems like the one described are well-suited, perhaps essential, technologies for the care of children and adults with chronic conditions such as asthma. PMID:14728122
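
    A minimal sketch of the hand-off implied by the three components is given below: a patient report collected by the telephone-linked system generates an alert for the nurse case-management queue. The field names and thresholds are illustrative assumptions only, not the system's actual rules.

```python
"""Hedged sketch of the data hand-off implied by the three-component design:
a telephone-linked patient report generating an alert for the nurse
case-management queue. Field names and thresholds are illustrative only."""
from dataclasses import dataclass

@dataclass
class AsthmaReport:
    patient_id: str
    nighttime_awakenings_per_week: int
    rescue_inhaler_uses_per_week: int

def needs_alert(report: AsthmaReport) -> bool:
    # Placeholder rule: frequent nighttime symptoms or heavy rescue-inhaler
    # use triggers a case-manager alert on the web-based reporting component.
    return (report.nighttime_awakenings_per_week >= 2
            or report.rescue_inhaler_uses_per_week > 4)

if __name__ == "__main__":
    r = AsthmaReport("patient-001", nighttime_awakenings_per_week=3,
                     rescue_inhaler_uses_per_week=2)
    print("alert case manager" if needs_alert(r) else "routine monitoring")
```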

  1. The New Madrid earthquakes

    SciTech Connect

    Obermeier, S.F.

    1989-01-01

    Two interpreted 1811-12 epicenters generally agree well with zones of seismicity defined by modern, small earthquakes. Bounds on accelerations are placed at the limits of sand blows, generated by the 1811-12 earthquakes in the St. Francis Basin. Conclusions show how the topstratum thickness, sand size of the substratum, and thickness of alluvium affected the distribution of sand blows in the St. Francis Basin.

  2. Earthquake education in California

    USGS Publications Warehouse

    MacCabe, M. P.

    1980-01-01

    In a survey of community response to the earthquake threat in southern California, Ralph Turner and his colleagues in the Department of Sociology at the University of California, Los Angeles, found that the public very definitely wants to be educated about the kinds of problems and hazards they can expect during and after a damaging earthquake; and they also want to know how they can prepare themselves to minimize their vulnerability. Decisionmakers, too, are recognizing this new wave of public concern. 

  3. Injection-induced earthquakes

    USGS Publications Warehouse

    Ellsworth, William L.

    2013-01-01

    Earthquakes in unusual locations have become an important topic of discussion in both North America and Europe, owing to the concern that industrial activity could cause damaging earthquakes. It has long been understood that earthquakes can be induced by impoundment of reservoirs, surface and underground mining, withdrawal of fluids and gas from the subsurface, and injection of fluids into underground formations. Injection-induced earthquakes have, in particular, become a focus of discussion as the application of hydraulic fracturing to tight shale formations is enabling the production of oil and gas from previously unproductive formations. Earthquakes can be induced as part of the process to stimulate the production from tight shale formations, or by disposal of wastewater associated with stimulation and production. Here, I review recent seismic activity that may be associated with industrial activity, with a focus on the disposal of wastewater by injection in deep wells; assess the scientific understanding of induced earthquakes; and discuss the key scientific challenges to be met for assessing this hazard.

  4. Protocolized fluid therapy in brain-dead donors: The multi-center randomized MOnIToR trial

    PubMed Central

    Al-Khafaji, Ali; Elder, Michele; Lebovitz, Daniel J; Murugan, Raghavan; Souter, Michael; Stuart, Susan; Wahed, Abdus S.; Keebler, Ben; Dils, Dorrie; Mitchell, Stephanie; Shutterly, Kurt; Wilkerson, Dawn; Pearse, Rupert; Kellum, John A

    2015-01-01

    BACKGROUND Critical shortages of organs for transplantation jeopardize many lives. Observational data suggest that better fluid management for deceased organ donors could increase organ recovery. We conducted the first large multi-center randomized trial in brain-dead donors to determine whether protocolized fluid therapy increases the number of organs transplanted. METHODS We randomly assigned donors to either protocolized or usual care in eight organ procurement organizations. A “protocol-guided fluid therapy” algorithm targeting cardiac index, mean arterial pressure, and pulse pressure variation was used. Our primary outcome was the number of organs transplanted per donor, and our primary analysis was intention-to-treat. Secondary analyses included: 1) a modified intention-to-treat analysis in which only subjects able to receive the intervention were included, and 2) twelve-month survival in transplant recipients. The study was stopped early. RESULTS We enrolled 556 donors: 279 received protocolized care and 277 usual care. Groups had similar characteristics at baseline. The study protocol could be implemented in 76% of subjects randomized to the intervention. There was no significant difference in the mean number of organs transplanted per donor: 3.39 organs per donor (95% CI: 3.14-3.63) with protocolized care, compared to 3.29 (95% CI: 3.04-3.54) with usual care (mean difference, 0.10; 95% CI: -0.25 to 0.45; p=0.56). In the modified intention-to-treat analysis the mean number of organs increased (3.52 organs per donor, 95% CI: 3.23-3.80), but the difference was not statistically significant (mean difference, 0.23; 95% CI: -0.15 to 0.61; p=0.23). Among the 1,430 recipients of organs from study subjects with data available, 56 deaths (7.8%) occurred in the protocolized care arm and 56 (7.9%) in the usual care arm in the first year (hazard ratio: 0.97, p=0.86). CONCLUSIONS In brain-dead organ donors, protocol-guided fluid therapy compared to usual care may not increase the number of organs transplanted per donor. PMID:25583616

  5. Real-time Tsunami Warning Operations at the NOAA West Coast/Alaska Tsunami Warning Center

    NASA Astrophysics Data System (ADS)

    Whitmore, P.; Huang, P.; Crowley, H.; Ferris, J.; Hale, D.; Knight, W.; Medbery, A.; Nyland, D.; Preller, C.; Turner, B.; Urban, G.

    2007-12-01

    The West Coast/Alaska Tsunami Warning Center (WCATWC) in Palmer, Alaska, and the Pacific Tsunami Warning Center (PTWC) in Ewa Beach, Hawaii, provide tsunami warning services for a large portion of the world's coasts. The WCATWC has primary responsibility for providing tsunami detection, warnings, and forecasts to Canada, Puerto Rico, the Virgin Islands, and all U.S. states except Hawaii. The WCATWC also acts as back-up for the PTWC, requiring the center to constantly monitor global tsunami activity by rapidly detecting and evaluating earthquakes for their tsunamigenic potential. The Centers' goals are to issue initial messages as quickly as possible to alert those near the source to any potential danger, and to follow that with a reasonable forecast of impact level. With these goals in mind, a watchstander's initial action is based entirely on estimates of tsunami potential from the earthquake's source parameters. The course of action for the first message is determined primarily by the earthquake's magnitude, location, tsunami history, tsunami travel time, estimated threat based on pre-computed models, and pre-set criteria. Supplemental messages, if necessary, are based on wave observations and forecasts generated from hydrodynamic models (which are calibrated with near-real-time observations). In April 2006, the WCATWC increased staff levels so that the Center can be staffed 24/7 with two watchstanders. Since then, the Center's response time for events within the primary area of responsibility has decreased to less than 5 minutes. To illustrate the WCATWC's real-time tsunami warning operational environment, tsunami warning operation timelines for several tsunamigenic earthquakes - including the September 12 southern Sumatra 8.4 and the January 13 Kuril Island 8.1 earthquakes - are provided. The timelines highlight the key parameters and observations that guide tsunami warning operations, chronicling the event through: 1) initial alarm, 2
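
    To illustrate the kind of parameter-driven first-message logic described above, the sketch below maps magnitude, depth, and location to an initial message category. The thresholds and categories are invented placeholders and do not reflect the WCATWC's actual pre-set criteria.

```python
"""Illustrative sketch of magnitude/location-based first-message logic of the
kind described for tsunami warning centers. The thresholds and categories
below are invented placeholders, not the WCATWC's actual criteria."""

def initial_message(magnitude: float, depth_km: float, offshore: bool) -> str:
    # Deep, inland, or small events are unlikely to be tsunamigenic.
    if not offshore or depth_km > 100.0 or magnitude < 6.5:
        return "information statement"
    if magnitude < 7.0:
        return "tsunami advisory (local)"
    if magnitude < 7.5:
        return "tsunami watch (regional)"
    return "tsunami warning (regional to Pacific-wide)"

if __name__ == "__main__":
    print(initial_message(8.1, 30.0, offshore=True))   # -> tsunami warning
    print(initial_message(6.2, 10.0, offshore=True))   # -> information statement
```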

  6. Catalog of earthquake hypocenters at Alaskan volcanoes: January 1 through December 31, 2008

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.

    2009-01-01

    Between January 1 and December 31, 2008, the Alaska Volcano Observatory (AVO) located 7,097 earthquakes, of which 5,318 occurred within 20 kilometers of the 33 volcanoes monitored by the AVO. Monitoring highlights in 2008 include the eruptions of Okmok Caldera and Kasatochi Volcano, as well as increased unrest at Mount Veniaminof and Redoubt Volcano. This catalog includes descriptions of: (1) the locations of seismic instrumentation deployed during 2008; (2) the earthquake detection, recording, analysis, and data archival systems; (3) the seismic velocity models used for earthquake locations; (4) a summary of the earthquakes located in 2008; and (5) an accompanying UNIX tar file with a summary of earthquake origin times, hypocenters, magnitudes, phase arrival times, location quality statistics, daily station usage statistics, and all files used to determine the earthquake locations in 2008.
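
    As an illustration of the 20-kilometer proximity count reported above, the sketch below filters a toy event list against example volcano coordinates with a haversine distance test; the coordinates and events are placeholders, and the real catalog is distributed in the accompanying tar file.

```python
"""Toy sketch of the kind of proximity filter behind counting catalog events
within 20 km of monitored volcanoes. Coordinates and events are invented
examples, not entries from the AVO catalog described above."""
import math

def distance_km(lat1, lon1, lat2, lon2):
    # Great-circle (haversine) distance in kilometers.
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi, dlmb = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

volcanoes = {"Okmok": (53.43, -168.13), "Redoubt": (60.49, -152.74)}  # example subset
events = [(53.40, -168.20, 2.1), (58.00, -155.00, 1.4)]               # lat, lon, magnitude

near = [e for e in events
        if any(distance_km(e[0], e[1], v[0], v[1]) <= 20.0 for v in volcanoes.values())]
print(f"{len(near)} of {len(events)} example events fall within 20 km of a volcano")
```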

  7. Charles Darwin's earthquake reports

    NASA Astrophysics Data System (ADS)

    Galiev, Shamil

    2010-05-01

    The year 2009 marked both the 200th anniversary of Darwin's birth and 170 years since the publication of his book Journal of Researches. During the voyage Darwin landed at Valdivia and Concepcion, Chile, just before, during, and after a great earthquake, which demolished hundreds of buildings, killing and injuring many people. The land moved in waves, was lifted and cracked, volcanoes awoke, and giant ocean waves attacked the coast. Darwin was the first geologist to observe and describe the effects of a great earthquake during and immediately after the event. These effects are sometimes repeated during severe earthquakes; but great earthquakes, like Chile 1835, and giant earthquakes, like Chile 1960, are rare and remain completely unpredictable. This is one of the few areas of science where experts remain largely in the dark. Darwin suggested that the effects were a result of ‘…the rending of strata, at a point not very deep below the surface of the earth…' and ‘…when the crust yields to the tension, caused by its gradual elevation, there is a jar at the moment of rupture, and a greater movement...'. Darwin formulated big ideas about the Earth's evolution and its dynamics. These ideas set the tone for the theory of plate tectonics to come. However, plate tectonics does not completely explain why earthquakes occur within plates. Darwin emphasised that there are different kinds of earthquakes: ‘...I confine the foregoing observations to the earthquakes on the coast of South America, or to similar ones, which seem generally to have been accompanied by elevation of the land. But, as we know that subsidence has gone on in other quarters of the world, fissures must there have been formed, and therefore earthquakes...' (we